Dataset Preview
The full dataset viewer is not available. Only a preview of the rows is shown.
The dataset generation failed
Error code:   DatasetGenerationError
Exception:    TypeError
Message:      Couldn't cast array of type
struct<content_hash: string, timestamp: string, source: string, line_count: int64, max_line_length: int64, avg_line_length: double, alnum_prop: double, repo_name: string, id: string, size: string, binary: bool, copies: string, ref: string, path: string, mode: string, license: string, language: list<item: struct<name: string, bytes: string>>, symlink_target: string>
to
{'content_hash': Value(dtype='string', id=None), 'timestamp': Value(dtype='string', id=None), 'source': Value(dtype='string', id=None), 'line_count': Value(dtype='int64', id=None), 'max_line_length': Value(dtype='int64', id=None), 'avg_line_length': Value(dtype='float64', id=None), 'alnum_prop': Value(dtype='float64', id=None), 'repo_name': Value(dtype='string', id=None), 'id': Value(dtype='string', id=None), 'size': Value(dtype='string', id=None), 'binary': Value(dtype='bool', id=None), 'copies': Value(dtype='string', id=None), 'ref': Value(dtype='string', id=None), 'path': Value(dtype='string', id=None), 'mode': Value(dtype='string', id=None), 'license': Value(dtype='string', id=None), 'language': [{'name': Value(dtype='string', id=None), 'bytes': Value(dtype='string', id=None)}]}
Traceback:    Traceback (most recent call last):
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 2011, in _prepare_split_single
                  writer.write_table(table)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/arrow_writer.py", line 585, in write_table
                  pa_table = table_cast(pa_table, self._schema)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py", line 2302, in table_cast
                  return cast_table_to_schema(table, schema)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py", line 2261, in cast_table_to_schema
                  arrays = [cast_array_to_feature(table[name], feature) for name, feature in features.items()]
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py", line 2261, in <listcomp>
                  arrays = [cast_array_to_feature(table[name], feature) for name, feature in features.items()]
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py", line 1802, in wrapper
                  return pa.chunked_array([func(chunk, *args, **kwargs) for chunk in array.chunks])
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py", line 1802, in <listcomp>
                  return pa.chunked_array([func(chunk, *args, **kwargs) for chunk in array.chunks])
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py", line 2122, in cast_array_to_feature
                  raise TypeError(f"Couldn't cast array of type\n{_short_str(array.type)}\nto\n{_short_str(feature)}")
              TypeError: Couldn't cast array of type
              struct<content_hash: string, timestamp: string, source: string, line_count: int64, max_line_length: int64, avg_line_length: double, alnum_prop: double, repo_name: string, id: string, size: string, binary: bool, copies: string, ref: string, path: string, mode: string, license: string, language: list<item: struct<name: string, bytes: string>>, symlink_target: string>
              to
              {'content_hash': Value(dtype='string', id=None), 'timestamp': Value(dtype='string', id=None), 'source': Value(dtype='string', id=None), 'line_count': Value(dtype='int64', id=None), 'max_line_length': Value(dtype='int64', id=None), 'avg_line_length': Value(dtype='float64', id=None), 'alnum_prop': Value(dtype='float64', id=None), 'repo_name': Value(dtype='string', id=None), 'id': Value(dtype='string', id=None), 'size': Value(dtype='string', id=None), 'binary': Value(dtype='bool', id=None), 'copies': Value(dtype='string', id=None), 'ref': Value(dtype='string', id=None), 'path': Value(dtype='string', id=None), 'mode': Value(dtype='string', id=None), 'license': Value(dtype='string', id=None), 'language': [{'name': Value(dtype='string', id=None), 'bytes': Value(dtype='string', id=None)}]}
              
              The above exception was the direct cause of the following exception:
              
              Traceback (most recent call last):
                File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 1529, in compute_config_parquet_and_info_response
                  parquet_operations = convert_to_parquet(builder)
                File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 1154, in convert_to_parquet
                  builder.download_and_prepare(
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1027, in download_and_prepare
                  self._download_and_prepare(
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1122, in _download_and_prepare
                  self._prepare_split(split_generator, **prepare_split_kwargs)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1882, in _prepare_split
                  for job_id, done, content in self._prepare_split_single(
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 2038, in _prepare_split_single
                  raise DatasetGenerationError("An error occurred while generating the dataset") from e
              datasets.exceptions.DatasetGenerationError: An error occurred while generating the dataset
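The cast failure above can be narrowed down by comparing the two schemas the error prints: the struct in the actual Arrow data carries a `symlink_target` field that the declared features do not, and `datasets` refuses to cast a struct with an unexpected field. A minimal sketch of that diagnosis, with both field lists transcribed from the error message above:

```python
# Field names of the struct found in the data, as printed in the TypeError.
actual_fields = {
    "content_hash", "timestamp", "source", "line_count", "max_line_length",
    "avg_line_length", "alnum_prop", "repo_name", "id", "size", "binary",
    "copies", "ref", "path", "mode", "license", "language", "symlink_target",
}

# Field names of the declared/inferred features, from the target schema
# printed directly below it.
declared_fields = {
    "content_hash", "timestamp", "source", "line_count", "max_line_length",
    "avg_line_length", "alnum_prop", "repo_name", "id", "size", "binary",
    "copies", "ref", "path", "mode", "license", "language",
}

# Fields present in the data but absent from the declared schema are what
# make the cast fail.
print(sorted(actual_fields - declared_fields))  # → ['symlink_target']
```

Because `datasets` infers the feature schema from the first files it reads, data files whose `meta` struct gains an extra key later in the dataset fail to cast like this; declaring the full feature set (including `symlink_target`) in the dataset card, or dropping the extra key from the offending files, is the usual remedy.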


Columns: text (string), meta (dict)
<?php

namespace Kunstmaan\TranslatorBundle\Model\Translation;

/**
 * Defines a new translation - bridge between controller and service layer
 */
class NewTranslation
{
    /**
     * An array with all translations, key = locale, value = translation
     * @var array
     */
    protected $locales = array();

    /**
     * Keyword of the new translations
     * @var string
     */
    protected $keyword;

    /**
     * Domain name of the new translations
     * @var string
     */
    protected $domain;

    public function getLocales()
    {
        return $this->locales;
    }

    public function setLocales($locales)
    {
        $this->locales = $locales;
    }

    public function getKeyword()
    {
        return $this->keyword;
    }

    public function setKeyword($keyword)
    {
        $this->keyword = $keyword;
    }

    public function getDomain()
    {
        return $this->domain;
    }

    public function setDomain($domain)
    {
        $this->domain = $domain;
    }
}
{ "content_hash": "7b564a0f45c2e0d2ed61d191a0789c46", "timestamp": "", "source": "github", "line_count": 57, "max_line_length": 74, "avg_line_length": 17.649122807017545, "alnum_prop": 0.5785288270377733, "repo_name": "kln3wrld/KunstmaanBundlesCMS", "id": "84418ced575aa8eb0bc1dfe56907fd65edfc08d3", "size": "1006", "binary": false, "copies": "33", "ref": "refs/heads/master", "path": "src/Kunstmaan/TranslatorBundle/Model/Translation/NewTranslation.php", "mode": "33188", "license": "mit", "language": [ { "name": "Batchfile", "bytes": "130" }, { "name": "CSS", "bytes": "546285" }, { "name": "Cucumber", "bytes": "21755" }, { "name": "HTML", "bytes": "641660" }, { "name": "JavaScript", "bytes": "6365118" }, { "name": "PHP", "bytes": "2331223" }, { "name": "PowerShell", "bytes": "161" }, { "name": "Ruby", "bytes": "76" }, { "name": "Shell", "bytes": "4014" } ] }
<?php

class Google_Service_Compute_AddressesScopedListWarning extends Google_Collection
{
  protected $collection_key = 'data';
  public $code;
  protected $dataType = 'Google_Service_Compute_AddressesScopedListWarningData';
  protected $dataDataType = 'array';
  public $message;

  public function setCode($code)
  {
    $this->code = $code;
  }
  public function getCode()
  {
    return $this->code;
  }
  public function setData($data)
  {
    $this->data = $data;
  }
  public function getData()
  {
    return $this->data;
  }
  public function setMessage($message)
  {
    $this->message = $message;
  }
  public function getMessage()
  {
    return $this->message;
  }
}
{ "content_hash": "41a667465ce808a9d6427a4b894b960d", "timestamp": "", "source": "github", "line_count": 36, "max_line_length": 81, "avg_line_length": 18.944444444444443, "alnum_prop": 0.6642228739002932, "repo_name": "dwivivagoal/KuizMilioner", "id": "ccfad568b01af10b6f815d0427944ff3b64cfae2", "size": "1272", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "application/libraries/php-google-sdk/google/apiclient-services/src/Google/Service/Compute/AddressesScopedListWarning.php", "mode": "33188", "license": "mit", "language": [ { "name": "CSS", "bytes": "6534560" }, { "name": "CoffeeScript", "bytes": "83631" }, { "name": "HTML", "bytes": "3080360" }, { "name": "Hack", "bytes": "1703568" }, { "name": "JavaScript", "bytes": "18658301" }, { "name": "Makefile", "bytes": "952" }, { "name": "PHP", "bytes": "23648205" }, { "name": "Shell", "bytes": "1628" } ] }
import {empty, interval, Observable, of} from 'rxjs'; import {buffer, bufferCount, bufferTime, bufferToggle, bufferWhen} from 'rxjs/operators'; import {asyncTest} from '../test-util'; xdescribe('Observable.buffer', () => { let log: any[]; let observable1: Observable<any>; beforeEach(() => { log = []; }); it('buffer func callback should run in the correct zone', asyncTest((done: any) => { const constructorZone1: Zone = Zone.current.fork({name: 'Constructor Zone1'}); const subscriptionZone: Zone = Zone.current.fork({name: 'Subscription Zone'}); observable1 = constructorZone1.run(() => { const source = interval(350); const iv = interval(100); return iv.pipe(buffer(source)); }); subscriptionZone.run(() => { const subscriber = observable1.subscribe( (result: any) => { expect(Zone.current.name).toEqual(subscriptionZone.name); log.push(result); if (result[0] >= 3) { subscriber.unsubscribe(); } }, () => { fail('should not call error'); }, () => { log.push('completed'); expect(Zone.current.name).toEqual(subscriptionZone.name); expect(log).toEqual([[0, 1, 2], [3, 4, 5], 'completed']); done(); }); }); expect(log).toEqual([]); }, Zone.root)); it('bufferCount func callback should run in the correct zone', asyncTest((done: any) => { const constructorZone1: Zone = Zone.current.fork({name: 'Constructor Zone1'}); const subscriptionZone: Zone = Zone.current.fork({name: 'Subscription Zone'}); observable1 = constructorZone1.run(() => { const iv = interval(100); return iv.pipe(bufferCount(3)); }); subscriptionZone.run(() => { const subscriber = observable1.subscribe( (result: any) => { expect(Zone.current.name).toEqual(subscriptionZone.name); log.push(result); if (result[0] >= 3) { subscriber.unsubscribe(); } }, () => { fail('should not call error'); }, () => { log.push('completed'); expect(Zone.current.name).toEqual(subscriptionZone.name); expect(log).toEqual([[0, 1, 2], [3, 4, 5], 'completed']); done(); }); }); expect(log).toEqual([]); }, Zone.root)); it('bufferTime func callback 
should run in the correct zone', asyncTest((done: any) => { const constructorZone1: Zone = Zone.current.fork({name: 'Constructor Zone1'}); const subscriptionZone: Zone = Zone.current.fork({name: 'Subscription Zone'}); observable1 = constructorZone1.run(() => { const iv = interval(100); return iv.pipe(bufferTime(350)); }); subscriptionZone.run(() => { const subscriber = observable1.subscribe( (result: any) => { expect(Zone.current.name).toEqual(subscriptionZone.name); log.push(result); if (result[0] >= 3) { subscriber.unsubscribe(); } }, () => { fail('should not call error'); }, () => { log.push('completed'); expect(Zone.current.name).toEqual(subscriptionZone.name); expect(log).toEqual([[0, 1, 2], [3, 4, 5], 'completed']); done(); }); }); expect(log).toEqual([]); }, Zone.root)); it('bufferToggle func callback should run in the correct zone', asyncTest((done: any) => { const constructorZone1: Zone = Zone.current.fork({name: 'Constructor Zone1'}); const subscriptionZone: Zone = Zone.current.fork({name: 'Subscription Zone'}); observable1 = constructorZone1.run(() => { const source = interval(10); const opening = interval(25); const closingSelector = (v: any) => { expect(Zone.current.name).toEqual(constructorZone1.name); return v % 2 === 0 ? 
of(v) : empty(); }; return source.pipe(bufferToggle(opening, closingSelector)); }); let i = 0; subscriptionZone.run(() => { const subscriber = observable1.subscribe( (result: any) => { expect(Zone.current.name).toEqual(subscriptionZone.name); log.push(result); subscriber.unsubscribe(); }, () => { fail('should not call error'); }, () => { log.push('completed'); expect(Zone.current.name).toEqual(subscriptionZone.name); expect(log).toEqual([[], 'completed']); done(); }); }); expect(log).toEqual([]); }, Zone.root)); it('bufferWhen func callback should run in the correct zone', asyncTest((done: any) => { const constructorZone1: Zone = Zone.current.fork({name: 'Constructor Zone1'}); const subscriptionZone: Zone = Zone.current.fork({name: 'Subscription Zone'}); observable1 = constructorZone1.run(() => { const source = interval(100); return source.pipe(bufferWhen(() => { expect(Zone.current.name).toEqual(constructorZone1.name); return interval(220); })); }); let i = 0; subscriptionZone.run(() => { const subscriber = observable1.subscribe( (result: any) => { expect(Zone.current.name).toEqual(subscriptionZone.name); log.push(result); if (i++ >= 3) { subscriber.unsubscribe(); } }, () => { fail('should not call error'); }, () => { log.push('completed'); expect(Zone.current.name).toEqual(subscriptionZone.name); expect(log).toEqual([[0, 1], [2, 3], [4, 5], [6, 7], 'completed']); done(); }); }); expect(log).toEqual([]); }, Zone.root)); });
{ "content_hash": "e847c12c424a5a3a54b3ca0c79eb35e4", "timestamp": "", "source": "github", "line_count": 179, "max_line_length": 92, "avg_line_length": 35.58659217877095, "alnum_prop": 0.5054945054945055, "repo_name": "matsko/angular", "id": "bf2be46e6d4ab5e5e2b092897eb1ca4a5cebe57d", "size": "6572", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "packages/zone.js/test/rxjs/rxjs.Observable.buffer.spec.ts", "mode": "33188", "license": "mit", "language": [ { "name": "CSS", "bytes": "346628" }, { "name": "Dockerfile", "bytes": "11884" }, { "name": "HTML", "bytes": "487784" }, { "name": "JSONiq", "bytes": "619" }, { "name": "JavaScript", "bytes": "2490486" }, { "name": "PHP", "bytes": "7222" }, { "name": "PowerShell", "bytes": "3909" }, { "name": "Shell", "bytes": "98819" }, { "name": "Starlark", "bytes": "453276" }, { "name": "TypeScript", "bytes": "22544767" } ] }
reset vsh
flash0:/vsh/module/vshmain.prx
{ "content_hash": "304106d447e9dba4d635b8ca12d3993a", "timestamp": "", "source": "github", "line_count": 2, "max_line_length": 30, "avg_line_length": 20.5, "alnum_prop": 0.8048780487804879, "repo_name": "173210/psplinkusb", "id": "038203a5e41fefdd8f3ab7df595eefe2a0f23f49", "size": "105", "binary": false, "copies": "4", "ref": "refs/heads/master", "path": "scripts/loadvsh.sh", "mode": "33188", "license": "bsd-3-clause", "language": [ { "name": "Assembly", "bytes": "118394" }, { "name": "C", "bytes": "465093" }, { "name": "C++", "bytes": "165730" }, { "name": "Objective-C", "bytes": "451" }, { "name": "Shell", "bytes": "226" } ] }
package de.plushnikov.builder.importbuilder;

import de.plushnikov.builder.BuilderExample;
import de.plushnikov.builder.BuilderExample.BuilderExampleBuilder;
import de.plushnikov.builder.importbuilder.otherpackage.Builder2Import;
import static de.plushnikov.builder.importbuilder.otherpackage.Builder2Import.Builder2ImportBuilder;
import de.plushnikov.builder.simple.BuilderSimple;
import de.plushnikov.builder.simple.BuilderSimple.BuilderSimpleBuilder;

public class TestImportingBuilderClass {

  public static void main(String[] args) {
    BuilderSimpleBuilder builderSimple = BuilderSimple.builder();
    BuilderSimple simple = builderSimple.myInt(1).build();
    System.out.println(simple);

    BuilderExampleBuilder builderExampleBuilder = BuilderExample.builder();
    Builder2ImportBuilder builder2ImportBuilder = Builder2Import.builder();
  }
}
{ "content_hash": "316df4638aaee25c7c5fc155cb966b96", "timestamp": "", "source": "github", "line_count": 21, "max_line_length": 100, "avg_line_length": 40.80952380952381, "alnum_prop": 0.8296382730455076, "repo_name": "mplushnikov/lombok-intellij-plugin", "id": "65dd005d00a9f14183a306239e2d5dbbb47d96b4", "size": "857", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "test-manual/src/main/java/de/plushnikov/builder/importbuilder/TestImportingBuilderClass.java", "mode": "33188", "license": "bsd-3-clause", "language": [ { "name": "HTML", "bytes": "33025" }, { "name": "Java", "bytes": "2152851" }, { "name": "Lex", "bytes": "1624" } ] }
package main import ( "bytes" "fmt" "io" "io/ioutil" "log" "reflect" "strconv" "strings" "unicode" "golang.org/x/text/internal/gen" "golang.org/x/text/internal/triegen" "golang.org/x/text/internal/ucd" "golang.org/x/text/unicode/norm" ) func main() { gen.Init() genTables() genTablesTest() gen.Repackage("gen_trieval.go", "trieval.go", "cases") } // runeInfo contains all information for a rune that we care about for casing // operations. type runeInfo struct { Rune rune entry info // trie value for this rune. CaseMode info // Simple case mappings. Simple [1 + maxCaseMode][]rune // Special casing HasSpecial bool Conditional bool Special [1 + maxCaseMode][]rune // Folding FoldSimple rune FoldSpecial rune FoldFull []rune // TODO: FC_NFKC, or equivalent data. // Properties SoftDotted bool CaseIgnorable bool Cased bool DecomposeGreek bool BreakType string BreakCat breakCategory // We care mostly about 0, Above, and IotaSubscript. CCC byte } type breakCategory int const ( breakBreak breakCategory = iota breakLetter breakMid ) // mapping returns the case mapping for the given case type. 
func (r *runeInfo) mapping(c info) string { if r.HasSpecial { return string(r.Special[c]) } if len(r.Simple[c]) != 0 { return string(r.Simple[c]) } return string(r.Rune) } func parse(file string, f func(p *ucd.Parser)) { ucd.Parse(gen.OpenUCDFile(file), f) } func parseUCD() []runeInfo { chars := make([]runeInfo, unicode.MaxRune) get := func(r rune) *runeInfo { c := &chars[r] c.Rune = r return c } parse("UnicodeData.txt", func(p *ucd.Parser) { ri := get(p.Rune(0)) ri.CCC = byte(p.Int(ucd.CanonicalCombiningClass)) ri.Simple[cLower] = p.Runes(ucd.SimpleLowercaseMapping) ri.Simple[cUpper] = p.Runes(ucd.SimpleUppercaseMapping) ri.Simple[cTitle] = p.Runes(ucd.SimpleTitlecaseMapping) if p.String(ucd.GeneralCategory) == "Lt" { ri.CaseMode = cTitle } }) // <code>; <property> parse("PropList.txt", func(p *ucd.Parser) { if p.String(1) == "Soft_Dotted" { chars[p.Rune(0)].SoftDotted = true } }) // <code>; <word break type> parse("DerivedCoreProperties.txt", func(p *ucd.Parser) { ri := get(p.Rune(0)) switch p.String(1) { case "Case_Ignorable": ri.CaseIgnorable = true case "Cased": ri.Cased = true case "Lowercase": ri.CaseMode = cLower case "Uppercase": ri.CaseMode = cUpper } }) // <code>; <lower> ; <title> ; <upper> ; (<condition_list> ;)? parse("SpecialCasing.txt", func(p *ucd.Parser) { // We drop all conditional special casing and deal with them manually in // the language-specific case mappers. Rune 0x03A3 is the only one with // a conditional formatting that is not language-specific. However, // dealing with this letter is tricky, especially in a streaming // context, so we deal with it in the Caser for Greek specifically. ri := get(p.Rune(0)) if p.String(4) == "" { ri.HasSpecial = true ri.Special[cLower] = p.Runes(1) ri.Special[cTitle] = p.Runes(2) ri.Special[cUpper] = p.Runes(3) } else { ri.Conditional = true } }) // TODO: Use text breaking according to UAX #29. 
// <code>; <word break type> parse("auxiliary/WordBreakProperty.txt", func(p *ucd.Parser) { ri := get(p.Rune(0)) ri.BreakType = p.String(1) // We collapse the word breaking properties onto the categories we need. switch p.String(1) { // TODO: officially we need to canonicalize. case "MidLetter", "MidNumLet", "Single_Quote": ri.BreakCat = breakMid if !ri.CaseIgnorable { // finalSigma relies on the fact that all breakMid runes are // also a Case_Ignorable. Revisit this code when this changes. log.Fatalf("Rune %U, which has a break category mid, is not a case ignorable", ri) } case "ALetter", "Hebrew_Letter", "Numeric", "Extend", "ExtendNumLet", "Format", "ZWJ": ri.BreakCat = breakLetter } }) // <code>; <type>; <mapping> parse("CaseFolding.txt", func(p *ucd.Parser) { ri := get(p.Rune(0)) switch p.String(1) { case "C": ri.FoldSimple = p.Rune(2) ri.FoldFull = p.Runes(2) case "S": ri.FoldSimple = p.Rune(2) case "T": ri.FoldSpecial = p.Rune(2) case "F": ri.FoldFull = p.Runes(2) default: log.Fatalf("%U: unknown type: %s", p.Rune(0), p.String(1)) } }) return chars } func genTables() { chars := parseUCD() verifyProperties(chars) t := triegen.NewTrie("case") for i := range chars { c := &chars[i] makeEntry(c) t.Insert(rune(i), uint64(c.entry)) } w := gen.NewCodeWriter() defer w.WriteVersionedGoFile("tables.go", "cases") gen.WriteUnicodeVersion(w) // TODO: write CLDR version after adding a mechanism to detect that the // tables on which the manually created locale-sensitive casing code is // based hasn't changed. w.WriteVar("xorData", string(xorData)) w.WriteVar("exceptions", string(exceptionData)) sz, err := t.Gen(w, triegen.Compact(&sparseCompacter{})) if err != nil { log.Fatal(err) } w.Size += sz } func makeEntry(ri *runeInfo) { if ri.CaseIgnorable { if ri.Cased { ri.entry = cIgnorableCased } else { ri.entry = cIgnorableUncased } } else { ri.entry = ri.CaseMode } // TODO: handle soft-dotted. 
ccc := cccOther switch ri.CCC { case 0: // Not_Reordered ccc = cccZero case above: // Above ccc = cccAbove } switch ri.BreakCat { case breakBreak: ccc = cccBreak case breakMid: ri.entry |= isMidBit } ri.entry |= ccc if ri.CaseMode == cUncased { return } // Need to do something special. if ri.CaseMode == cTitle || ri.HasSpecial || ri.mapping(cTitle) != ri.mapping(cUpper) { makeException(ri) return } if f := string(ri.FoldFull); len(f) > 0 && f != ri.mapping(cUpper) && f != ri.mapping(cLower) { makeException(ri) return } // Rune is either lowercase or uppercase. orig := string(ri.Rune) mapped := "" if ri.CaseMode == cUpper { mapped = ri.mapping(cLower) } else { mapped = ri.mapping(cUpper) } if len(orig) != len(mapped) { makeException(ri) return } if string(ri.FoldFull) == ri.mapping(cUpper) { ri.entry |= inverseFoldBit } n := len(orig) // Create per-byte XOR mask. var b []byte for i := 0; i < n; i++ { b = append(b, orig[i]^mapped[i]) } // Remove leading 0 bytes, but keep at least one byte. for ; len(b) > 1 && b[0] == 0; b = b[1:] { } if len(b) == 1 && b[0]&0xc0 == 0 { ri.entry |= info(b[0]) << xorShift return } key := string(b) x, ok := xorCache[key] if !ok { xorData = append(xorData, 0) // for detecting start of sequence xorData = append(xorData, b...) x = len(xorData) - 1 xorCache[key] = x } ri.entry |= info(x<<xorShift) | xorIndexBit } var xorCache = map[string]int{} // xorData contains byte-wise XOR data for the least significant bytes of a // UTF-8 encoded rune. An index points to the last byte. The sequence starts // with a zero terminator. var xorData = []byte{} // See the comments in gen_trieval.go re "the exceptions slice". var exceptionData = []byte{0} // makeException encodes case mappings that cannot be expressed in a simple // XOR diff. func makeException(ri *runeInfo) { ccc := ri.entry & cccMask // Set exception bit and retain case type. 
ri.entry &= 0x0007 ri.entry |= exceptionBit if len(exceptionData) >= 1<<numExceptionBits { log.Fatalf("%U:exceptionData too large %x > %d bits", ri.Rune, len(exceptionData), numExceptionBits) } // Set the offset in the exceptionData array. ri.entry |= info(len(exceptionData) << exceptionShift) orig := string(ri.Rune) tc := ri.mapping(cTitle) uc := ri.mapping(cUpper) lc := ri.mapping(cLower) ff := string(ri.FoldFull) // addString sets the length of a string and adds it to the expansions array. addString := func(s string, b *byte) { if len(s) == 0 { // Zero-length mappings exist, but only for conditional casing, // which we are representing outside of this table. log.Fatalf("%U: has zero-length mapping.", ri.Rune) } *b <<= 3 if s != orig { n := len(s) if n > 7 { log.Fatalf("%U: mapping larger than 7 (%d)", ri.Rune, n) } *b |= byte(n) exceptionData = append(exceptionData, s...) } } // byte 0: exceptionData = append(exceptionData, byte(ccc)|byte(len(ff))) // byte 1: p := len(exceptionData) exceptionData = append(exceptionData, 0) if len(ff) > 7 { // May be zero-length. log.Fatalf("%U: fold string larger than 7 (%d)", ri.Rune, len(ff)) } exceptionData = append(exceptionData, ff...) ct := ri.CaseMode if ct != cLower { addString(lc, &exceptionData[p]) } if ct != cUpper { addString(uc, &exceptionData[p]) } if ct != cTitle { // If title is the same as upper, we set it to the original string so // that it will be marked as not present. This implies title case is // the same as upper case. if tc == uc { tc = orig } addString(tc, &exceptionData[p]) } } // sparseCompacter is a trie value block Compacter. There are many cases where // successive runes alternate between lower- and upper-case. This Compacter // exploits this by adding a special case type where the case value is obtained // from or-ing it with the least-significant bit of the rune, creating large // ranges of equal case values that compress well. 
type sparseCompacter struct { sparseBlocks [][]uint16 sparseOffsets []uint16 sparseCount int } // makeSparse returns the number of elements that compact block would contain // as well as the modified values. func makeSparse(vals []uint64) ([]uint16, int) { // Copy the values. values := make([]uint16, len(vals)) for i, v := range vals { values[i] = uint16(v) } alt := func(i int, v uint16) uint16 { if cm := info(v & fullCasedMask); cm == cUpper || cm == cLower { // Convert cLower or cUpper to cXORCase value, which has the form 11x. xor := v xor &^= 1 xor |= uint16(i&1) ^ (v & 1) xor |= 0x4 return xor } return v } var count int var previous uint16 for i, v := range values { if v != 0 { // Try if the unmodified value is equal to the previous. if v == previous { continue } // Try if the xor-ed value is equal to the previous value. a := alt(i, v) if a == previous { values[i] = a continue } // This is a new value. count++ // Use the xor-ed value if it will be identical to the next value. if p := i + 1; p < len(values) && alt(p, values[p]) == a { values[i] = a v = a } } previous = v } return values, count } func (s *sparseCompacter) Size(v []uint64) (int, bool) { _, n := makeSparse(v) // We limit using this method to having 16 entries. if n > 16 { return 0, false } return 2 + int(reflect.TypeOf(valueRange{}).Size())*n, true } func (s *sparseCompacter) Store(v []uint64) uint32 { h := uint32(len(s.sparseOffsets)) values, sz := makeSparse(v) s.sparseBlocks = append(s.sparseBlocks, values) s.sparseOffsets = append(s.sparseOffsets, uint16(s.sparseCount)) s.sparseCount += sz return h } func (s *sparseCompacter) Handler() string { // The sparse global variable and its lookup method is defined in gen_trieval.go. return "sparse.lookup" } func (s *sparseCompacter) Print(w io.Writer) (retErr error) { p := func(format string, args ...interface{}) { _, err := fmt.Fprintf(w, format, args...) 
if retErr == nil && err != nil { retErr = err } } ls := len(s.sparseBlocks) if ls == len(s.sparseOffsets) { s.sparseOffsets = append(s.sparseOffsets, uint16(s.sparseCount)) } p("// sparseOffsets: %d entries, %d bytes\n", ls+1, (ls+1)*2) p("var sparseOffsets = %#v\n\n", s.sparseOffsets) ns := s.sparseCount p("// sparseValues: %d entries, %d bytes\n", ns, ns*4) p("var sparseValues = [%d]valueRange {", ns) for i, values := range s.sparseBlocks { p("\n// Block %#x, offset %#x", i, s.sparseOffsets[i]) var v uint16 for i, nv := range values { if nv != v { if v != 0 { p(",hi:%#02x},", 0x80+i-1) } if nv != 0 { p("\n{value:%#04x,lo:%#02x", nv, 0x80+i) } } v = nv } if v != 0 { p(",hi:%#02x},", 0x80+len(values)-1) } } p("\n}\n\n") return } // verifyProperties that properties of the runes that are relied upon in the // implementation. Each property is marked with an identifier that is referred // to in the places where it is used. func verifyProperties(chars []runeInfo) { for i, c := range chars { r := rune(i) // Rune properties. // A.1: modifier never changes on lowercase. [ltLower] if c.CCC > 0 && unicode.ToLower(r) != r { log.Fatalf("%U: non-starter changes when lowercased", r) } // A.2: properties of decompositions starting with I or J. [ltLower] d := norm.NFD.PropertiesString(string(r)).Decomposition() if len(d) > 0 { if d[0] == 'I' || d[0] == 'J' { // A.2.1: we expect at least an ASCII character and a modifier. if len(d) < 3 { log.Fatalf("%U: length of decomposition was %d; want >= 3", r, len(d)) } // All subsequent runes are modifiers and all have the same CCC. runes := []rune(string(d[1:])) ccc := chars[runes[0]].CCC for _, mr := range runes[1:] { mc := chars[mr] // A.2.2: all modifiers have a CCC of Above or less. if ccc == 0 || ccc > above { log.Fatalf("%U: CCC of successive rune (%U) was %d; want (0,230]", r, mr, ccc) } // A.2.3: a sequence of modifiers all have the same CCC. 
					if mc.CCC != ccc {
						log.Fatalf("%U: CCC of follow-up modifier (%U) was %d; want %d", r, mr, mc.CCC, ccc)
					}
					// A.2.4: for each trailing r, r in [0x300, 0x311] <=> CCC == Above.
					if (ccc == above) != (0x300 <= mr && mr <= 0x311) {
						log.Fatalf("%U: modifier %U in [U+0300, U+0311] != ccc(%U) == 230", r, mr, mr)
					}
					if i += len(string(mr)); i >= len(d) {
						break
					}
				}
			}
		}
		// A.3: no U+0307 in decomposition of Soft-Dotted rune. [ltUpper]
		if unicode.Is(unicode.Soft_Dotted, r) && strings.Contains(string(d), "\u0307") {
			log.Fatalf("%U: decomposition of soft-dotted rune may not contain U+0307", r)
		}
		// A.4: only rune U+0345 may be of CCC Iota_Subscript. [elUpper]
		if c.CCC == iotaSubscript && r != 0x0345 {
			log.Fatalf("%U: only rune U+0345 may have CCC Iota_Subscript", r)
		}
		// A.5: soft-dotted runes do not have exceptions.
		if c.SoftDotted && c.entry&exceptionBit != 0 {
			log.Fatalf("%U: soft-dotted has exception", r)
		}
		// A.6: Greek decomposition. [elUpper]
		if unicode.Is(unicode.Greek, r) {
			if b := norm.NFD.PropertiesString(string(r)).Decomposition(); b != nil {
				runes := []rune(string(b))
				// A.6.1: If a Greek rune decomposes and the first rune of the
				// decomposition is greater than U+00FF, the rune is always
				// great and not a modifier.
				if f := runes[0]; unicode.IsMark(f) || f > 0xFF && !unicode.Is(unicode.Greek, f) {
					log.Fatalf("%U: expected first rune of Greek decomposition to be letter, found %U", r, f)
				}
				// A.6.2: Any follow-up rune in a Greek decomposition is a
				// modifier of which the first should be gobbled in
				// decomposition.
				for _, m := range runes[1:] {
					switch m {
					case 0x0313, 0x0314, 0x0301, 0x0300, 0x0306, 0x0342, 0x0308, 0x0304, 0x345:
					default:
						log.Fatalf("%U: modifier %U is outside of expected Greek modifier set", r, m)
					}
				}
			}
		}

		// Breaking properties.

		// B.1: all runes with CCC > 0 are of break type Extend.
		if c.CCC > 0 && c.BreakType != "Extend" {
			log.Fatalf("%U: CCC == %d, but got break type %s; want Extend", r, c.CCC, c.BreakType)
		}
		// B.2: all cased runes with c.CCC == 0 are of break type ALetter.
		if c.CCC == 0 && c.Cased && c.BreakType != "ALetter" {
			log.Fatalf("%U: cased, but got break type %s; want ALetter", r, c.BreakType)
		}
		// B.3: letter category.
		if c.CCC == 0 && c.BreakCat != breakBreak && !c.CaseIgnorable {
			if c.BreakCat != breakLetter {
				log.Fatalf("%U: check for letter break type gave %d; want %d", r, c.BreakCat, breakLetter)
			}
		}
	}
}

func genTablesTest() {
	w := &bytes.Buffer{}

	fmt.Fprintln(w, "var (")
	printProperties(w, "DerivedCoreProperties.txt", "Case_Ignorable", verifyIgnore)

	// We discard the output as we know we have perfect functions. We run them
	// just to verify the properties are correct.
	n := printProperties(ioutil.Discard, "DerivedCoreProperties.txt", "Cased", verifyCased)
	n += printProperties(ioutil.Discard, "DerivedCoreProperties.txt", "Lowercase", verifyLower)
	n += printProperties(ioutil.Discard, "DerivedCoreProperties.txt", "Uppercase", verifyUpper)
	if n > 0 {
		log.Fatalf("One of the discarded properties does not have a perfect filter.")
	}

	// <code>; <lower> ; <title> ; <upper> ; (<condition_list> ;)?
	fmt.Fprintln(w, "\tspecial = map[rune]struct{ toLower, toTitle, toUpper string }{")
	parse("SpecialCasing.txt", func(p *ucd.Parser) {
		// Skip conditional entries.
		if p.String(4) != "" {
			return
		}
		r := p.Rune(0)
		fmt.Fprintf(w, "\t\t0x%04x: {%q, %q, %q},\n",
			r, string(p.Runes(1)), string(p.Runes(2)), string(p.Runes(3)))
	})
	fmt.Fprint(w, "\t}\n\n")

	// <code>; <type>; <runes>
	table := map[rune]struct{ simple, full, special string }{}
	parse("CaseFolding.txt", func(p *ucd.Parser) {
		r := p.Rune(0)
		t := p.String(1)
		v := string(p.Runes(2))
		if t != "T" && v == string(unicode.ToLower(r)) {
			return
		}
		x := table[r]
		switch t {
		case "C":
			x.full = v
			x.simple = v
		case "S":
			x.simple = v
		case "F":
			x.full = v
		case "T":
			x.special = v
		}
		table[r] = x
	})
	fmt.Fprintln(w, "\tfoldMap = map[rune]struct{ simple, full, special string }{")
	for r := rune(0); r < 0x10FFFF; r++ {
		x, ok := table[r]
		if !ok {
			continue
		}
		fmt.Fprintf(w, "\t\t0x%04x: {%q, %q, %q},\n", r, x.simple, x.full, x.special)
	}
	fmt.Fprint(w, "\t}\n\n")

	// Break property
	notBreak := map[rune]bool{}
	parse("auxiliary/WordBreakProperty.txt", func(p *ucd.Parser) {
		switch p.String(1) {
		case "Extend", "Format", "MidLetter", "MidNumLet", "Single_Quote",
			"ALetter", "Hebrew_Letter", "Numeric", "ExtendNumLet", "ZWJ":
			notBreak[p.Rune(0)] = true
		}
	})

	fmt.Fprintln(w, "\tbreakProp = []struct{ lo, hi rune }{")
	inBreak := false
	for r := rune(0); r <= lastRuneForTesting; r++ {
		if isBreak := !notBreak[r]; isBreak != inBreak {
			if isBreak {
				fmt.Fprintf(w, "\t\t{0x%x, ", r)
			} else {
				fmt.Fprintf(w, "0x%x},\n", r-1)
			}
			inBreak = isBreak
		}
	}
	if inBreak {
		fmt.Fprintf(w, "0x%x},\n", lastRuneForTesting)
	}
	fmt.Fprint(w, "\t}\n\n")

	// Word break test
	// Filter out all samples that do not contain cased characters.
	cased := map[rune]bool{}
	parse("DerivedCoreProperties.txt", func(p *ucd.Parser) {
		if p.String(1) == "Cased" {
			cased[p.Rune(0)] = true
		}
	})

	fmt.Fprintln(w, "\tbreakTest = []string{")
	parse("auxiliary/WordBreakTest.txt", func(p *ucd.Parser) {
		c := strings.Split(p.String(0), " ")

		const sep = '|'
		numCased := 0
		test := ""
		for ; len(c) >= 2; c = c[2:] {
			if c[0] == "÷" && test != "" {
				test += string(sep)
			}
			i, err := strconv.ParseUint(c[1], 16, 32)
			r := rune(i)
			if err != nil {
				log.Fatalf("Invalid rune %q.", c[1])
			}
			if r == sep {
				log.Fatalf("Separator %q not allowed in test data. Pick another one.", sep)
			}
			if cased[r] {
				numCased++
			}
			test += string(r)
		}
		if numCased > 1 {
			fmt.Fprintf(w, "\t\t%q,\n", test)
		}
	})
	fmt.Fprintln(w, "\t}")

	fmt.Fprintln(w, ")")

	gen.WriteVersionedGoFile("tables_test.go", "cases", w.Bytes())
}

// These functions are just used for verification that their definitions have
// not changed in the Unicode Standard.

func verifyCased(r rune) bool {
	return verifyLower(r) || verifyUpper(r) || unicode.IsTitle(r)
}

func verifyLower(r rune) bool {
	return unicode.IsLower(r) || unicode.Is(unicode.Other_Lowercase, r)
}

func verifyUpper(r rune) bool {
	return unicode.IsUpper(r) || unicode.Is(unicode.Other_Uppercase, r)
}

// verifyIgnore is an approximation of the Case_Ignorable property using the
// core unicode package. It is used to reduce the size of the test data.
func verifyIgnore(r rune) bool {
	props := []*unicode.RangeTable{
		unicode.Mn, unicode.Me, unicode.Cf, unicode.Lm, unicode.Sk,
	}
	for _, p := range props {
		if unicode.Is(p, r) {
			return true
		}
	}
	return false
}

// printProperties prints tables of rune properties from the given UCD file.
// A filter func f can be given to exclude certain values. A rune r will have
// the indicated property if it is in the generated table or if f(r).
func printProperties(w io.Writer, file, property string, f func(r rune) bool) int {
	verify := map[rune]bool{}
	n := 0
	varNameParts := strings.Split(property, "_")
	varNameParts[0] = strings.ToLower(varNameParts[0])
	fmt.Fprintf(w, "\t%s = map[rune]bool{\n", strings.Join(varNameParts, ""))
	parse(file, func(p *ucd.Parser) {
		if p.String(1) == property {
			r := p.Rune(0)
			verify[r] = true
			if !f(r) {
				n++
				fmt.Fprintf(w, "\t\t0x%.4x: true,\n", r)
			}
		}
	})
	fmt.Fprint(w, "\t}\n\n")

	// Verify that f is correct, that is, it represents a subset of the property.
	for r := rune(0); r <= lastRuneForTesting; r++ {
		if !verify[r] && f(r) {
			log.Fatalf("Incorrect filter func for property %q.", property)
		}
	}
	return n
}

// The newCaseTrie, sparseValues and sparseOffsets definitions below are
// placeholders referred to by gen_trieval.go. The real definitions are
// generated by this program and written to tables.go.
func newCaseTrie(int) int { return 0 }

var (
	sparseValues  [0]valueRange
	sparseOffsets [0]uint16
)
{ "content_hash": "c2dfed5bd36f90775b0b6d88c97a4592", "timestamp": "", "source": "github", "line_count": 828, "max_line_length": 102, "avg_line_length": 26.035024154589372, "alnum_prop": 0.6350141485364382, "repo_name": "mantzas/golinear", "id": "1cfe1c0201f721b5360036f7ed7f85cd6d9d8446", "size": "22067", "binary": false, "copies": "94", "ref": "refs/heads/master", "path": "vendor/golang.org/x/text/cases/gen.go", "mode": "33188", "license": "apache-2.0", "language": [ { "name": "Go", "bytes": "26977" } ] }
package org.sleuthkit.autopsy.coreutils;

/**
 *
 * @author jwallace
 */
public class JLnkParserException extends Exception {

    /**
     * Constructs an instance of
     * <code>JLnkParserException</code> caused by the given exception.
     *
     * @param cause the exception that caused this one.
     */
    public JLnkParserException(Exception cause) {
        super(cause);
    }
}
{ "content_hash": "a853aae0977ee29ee0e8b83be92f694f", "timestamp": "", "source": "github", "line_count": 19, "max_line_length": 70, "avg_line_length": 19.68421052631579, "alnum_prop": 0.6470588235294118, "repo_name": "sidheshenator/autopsy", "id": "f9e034df429d4d342d517caaaf789316590cffc8", "size": "1055", "binary": false, "copies": "1", "ref": "refs/heads/develop", "path": "Core/src/org/sleuthkit/autopsy/coreutils/JLnkParserException.java", "mode": "33261", "license": "apache-2.0", "language": [ { "name": "Batchfile", "bytes": "5199" }, { "name": "CSS", "bytes": "1672" }, { "name": "HTML", "bytes": "7669" }, { "name": "Java", "bytes": "5438469" }, { "name": "Perl", "bytes": "1145199" }, { "name": "Python", "bytes": "169290" } ] }
<!DOCTYPE html>
<!--
Copyright (c) 2012 Intel Corporation.

Redistribution and use in source and binary forms, with or without modification,
are permitted provided that the following conditions are met:

* Redistributions of works must retain the original copyright notice, this list
  of conditions and the following disclaimer.
* Redistributions in binary form must reproduce the original copyright notice,
  this list of conditions and the following disclaimer in the documentation
  and/or other materials provided with the distribution.
* Neither the name of Intel Corporation nor the names of its contributors may
  be used to endorse or promote products derived from this work without
  specific prior written permission.

THIS SOFTWARE IS PROVIDED BY INTEL CORPORATION "AS IS" AND ANY EXPRESS OR
IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO
EVENT SHALL INTEL CORPORATION BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR
BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER
IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
POSSIBILITY OF SUCH DAMAGE.

Authors:
        Liu, Jinfeng <jinfengx.liu@intel.com>

-->
<html>
  <head>
    <title>CSS3 MultiColumn Test: CSS3Multicolumn_break-after_always</title>
    <link rel="author" title="Intel" href="http://www.intel.com/" />
    <link rel="help" href="http://www.w3.org/TR/2011/CR-css3-multicol-20110412/#break-before-break-after-break-inside" />
    <meta name="assert" content="Check if break-after attribute value is 'always'" />
    <script type="text/javascript" src="../resources/testharness.js"></script>
    <script type="text/javascript" src="../resources/testharnessreport.js"></script>
    <script type="text/javascript" src="support/support.js"></script>
    <style>
      #test {
        height: 30px;
        width: 80px;
        padding: 5px;
        border: 5px solid black;
        margin: 5px;
        background: blue;
      }
    </style>
  </head>
  <body>
    <div id="log"></div>
    <div id="test"></div>
    <script type="text/javascript">
      var div = document.querySelector("#test");
      var t = async_test(document.title, {timeout: 500});
      t.step(function () {
        div.style[headProp("column-break-after")] = "always";
        div.style[headProp("width")] = "800px";
        div.style[headProp("column-width")] = "200px";
        div.style[headProp("column-rule-color")] = "yellow";
        div.style[headProp("column-rule-width")] = "4px";
        div.style[headProp("column-rule-style")] = "dotted";
        var propvalue = GetCurrentStyle("columnBreakAfter");
        var prop = propvalue.indexOf("always") != -1;
        assert_true(prop, "The element break-after test");
      });
      t.done();
    </script>
  </body>
</html>
{ "content_hash": "51c7ac8b465d2ab55ea8a87f63a491c0", "timestamp": "", "source": "github", "line_count": 73, "max_line_length": 121, "avg_line_length": 43.76712328767123, "alnum_prop": 0.6826291079812207, "repo_name": "crosswalk-project/crosswalk-test-suite", "id": "703048dad4053eb248aa445cc46e78f40a07347a", "size": "3195", "binary": false, "copies": "30", "ref": "refs/heads/master", "path": "webapi/tct-multicolumn-css3-tests/multicolumn/CSS3Multicolumn_break-after_always.html", "mode": "33188", "license": "bsd-3-clause", "language": [ { "name": "C", "bytes": "2738" }, { "name": "C#", "bytes": "1437" }, { "name": "CSS", "bytes": "63576" }, { "name": "Cucumber", "bytes": "133383" }, { "name": "GLSL", "bytes": "2187925" }, { "name": "HTML", "bytes": "23702581" }, { "name": "Java", "bytes": "1755638" }, { "name": "JavaScript", "bytes": "3166019" }, { "name": "Makefile", "bytes": "1044" }, { "name": "PHP", "bytes": "37474" }, { "name": "Python", "bytes": "1882174" }, { "name": "Shell", "bytes": "614247" } ] }
<?php
defined('C5_EXECUTE') or die("Access Denied.");

class Concrete5_Controller_Block_DashboardAppStatus extends BlockController {

    protected $btCacheBlockRecord = true;
    protected $btCacheBlockOutput = true;
    protected $btCacheBlockOutputOnPost = true;
    protected $btCacheBlockOutputForRegisteredUsers = true;
    protected $btCacheBlockOutputLifetime = 86400; // check every day
    protected $btIsInternal = true;

    public function getBlockTypeDescription() {
        return t("Displays update and welcome back information on your dashboard.");
    }

    public function getBlockTypeName() {
        return t("Dashboard App Status");
    }

    public function view() {
        Loader::library('update');
        $this->set('latest_version', Update::getLatestAvailableVersionNumber());
        $tp = new TaskPermission();
        $updates = 0;
        $local = array();
        $remote = array();
        if ($tp->canInstallPackages()) {
            $local = Package::getLocalUpgradeablePackages();
            $remote = Package::getRemotelyUpgradeablePackages();
        }

        // now we strip out any dupes for the total
        $updates = 0;
        $localHandles = array();
        foreach($local as $_pkg) {
            $updates++;
            $localHandles[] = $_pkg->getPackageHandle();
        }
        foreach($remote as $_pkg) {
            if (!in_array($_pkg->getPackageHandle(), $localHandles)) {
                $updates++;
            }
        }
        $this->set('updates', $updates);
    }
}
{ "content_hash": "b19f10cba68999fb377cd31b045dec92", "timestamp": "", "source": "github", "line_count": 49, "max_line_length": 79, "avg_line_length": 28.26530612244898, "alnum_prop": 0.6700361010830325, "repo_name": "mchakon/afc", "id": "f2449908eaa4472b9ec80c935235688294c263cf", "size": "1786", "binary": false, "copies": "43", "ref": "refs/heads/master", "path": "concrete/core/controllers/blocks/dashboard_app_status.php", "mode": "33188", "license": "mit", "language": [ { "name": "ASP", "bytes": "3034" }, { "name": "ActionScript", "bytes": "172889" }, { "name": "JavaScript", "bytes": "565453" }, { "name": "PHP", "bytes": "9680579" }, { "name": "Perl", "bytes": "44239" }, { "name": "XSLT", "bytes": "28086" } ] }
namespace realm {

template <typename ValueType, typename ContextType>
void Object::set_property_value(ContextType& ctx, StringData prop_name, ValueType value, bool try_update)
{
    verify_attached();
    m_realm->verify_in_write();
    auto& property = property_for_name(prop_name);

    // Modifying primary keys is allowed in migrations to make it possible to
    // add a new primary key to a type (or change the property type), but it
    // is otherwise considered the immutable identity of the row
    if (property.is_primary && !m_realm->is_in_migration())
        throw std::logic_error("Cannot modify primary key after creation");

    set_property_value_impl(ctx, property, value, try_update);
}

template <typename ValueType, typename ContextType>
ValueType Object::get_property_value(ContextType& ctx, StringData prop_name)
{
    return get_property_value_impl<ValueType>(ctx, property_for_name(prop_name));
}

template <typename ValueType, typename ContextType>
void Object::set_property_value_impl(ContextType& ctx, const Property &property,
                                     ValueType value, bool try_update, bool is_default)
{
    ctx.will_change(*this, property);

    auto& table = *m_row.get_table();
    size_t col = property.table_column;
    size_t row = m_row.get_index();
    if (is_nullable(property.type) && ctx.is_null(value)) {
        if (property.type == PropertyType::Object) {
            if (!is_default)
                table.nullify_link(col, row);
        }
        else {
            table.set_null(col, row, is_default);
        }
        ctx.did_change();
        return;
    }

    if (is_array(property.type)) {
        if (property.type == PropertyType::LinkingObjects)
            throw ReadOnlyPropertyException(m_object_schema->name, property.name);

        REALM_ASSERT(property.type == PropertyType::Object);
        List list(m_realm, m_row.get_linklist(col));
        list.remove_all();
        if (!ctx.is_null(value)) {
            ContextType child_ctx(ctx, property);
            ctx.enumerate_list(value, [&](auto&& element) {
                list.add(child_ctx, element, try_update);
            });
        }
        ctx.did_change();
        return;
    }

    switch (property.type & ~PropertyType::Flags) {
        case PropertyType::Bool:
            table.set(col, row, ctx.template unbox<bool>(value), is_default);
            break;
        case PropertyType::Int:
            table.set(col, row, ctx.template unbox<int64_t>(value), is_default);
            break;
        case PropertyType::Float:
            table.set(col, row, ctx.template unbox<float>(value), is_default);
            break;
        case PropertyType::Double:
            table.set(col, row, ctx.template unbox<double>(value), is_default);
            break;
        case PropertyType::String:
            table.set(col, row, ctx.template unbox<StringData>(value), is_default);
            break;
        case PropertyType::Data:
            table.set(col, row, ctx.template unbox<BinaryData>(value), is_default);
            break;
        case PropertyType::Any:
            throw std::logic_error("not supported");
        case PropertyType::Date:
            table.set(col, row, ctx.template unbox<Timestamp>(value), is_default);
            break;
        case PropertyType::Object: {
            ContextType child_ctx(ctx, property);
            auto link = child_ctx.template unbox<RowExpr>(value, true, try_update);
            table.set_link(col, row, link.get_index(), is_default);
            break;
        }
        default:
            REALM_COMPILER_HINT_UNREACHABLE();
    }
    ctx.did_change();
}

template <typename ValueType, typename ContextType>
ValueType Object::get_property_value_impl(ContextType& ctx, const Property &property)
{
    verify_attached();

    size_t column = property.table_column;
    if (is_nullable(property.type) && m_row.is_null(column)) {
        return ctx.null_value();
    }
    if (is_array(property.type) && property.type != PropertyType::LinkingObjects) {
        REALM_ASSERT(property.type == PropertyType::Object);
        return ctx.box(List(m_realm, m_row.get_linklist(column)));
    }

    switch (property.type & ~PropertyType::Flags) {
        case PropertyType::Bool:   return ctx.box(m_row.get_bool(column));
        case PropertyType::Int:    return ctx.box(m_row.get_int(column));
        case PropertyType::Float:  return ctx.box(m_row.get_float(column));
        case PropertyType::Double: return ctx.box(m_row.get_double(column));
        case PropertyType::String: return ctx.box(m_row.get_string(column));
        case PropertyType::Data:   return ctx.box(m_row.get_binary(column));
        case PropertyType::Date:   return ctx.box(m_row.get_timestamp(column));
        case PropertyType::Any:    return ctx.box(m_row.get_mixed(column));
        case PropertyType::Object: {
            auto linkObjectSchema = m_realm->schema().find(property.object_type);
            TableRef table = ObjectStore::table_for_object_type(m_realm->read_group(), property.object_type);
            return ctx.box(Object(m_realm, *linkObjectSchema, table->get(m_row.get_link(column))));
        }
        case PropertyType::LinkingObjects: {
            auto target_object_schema = m_realm->schema().find(property.object_type);
            auto link_property = target_object_schema->property_for_name(property.link_origin_property_name);
            TableRef table = ObjectStore::table_for_object_type(m_realm->read_group(), target_object_schema->name);
            auto tv = m_row.get_table()->get_backlink_view(m_row.get_index(), table.get(), link_property->table_column);
            return ctx.box(Results(m_realm, std::move(tv)));
        }
        default: REALM_UNREACHABLE();
    }
}

template<typename ValueType, typename ContextType>
Object Object::create(ContextType& ctx, std::shared_ptr<Realm> const& realm, StringData object_type,
                      ValueType value, bool try_update, Row* out_row)
{
    auto object_schema = realm->schema().find(object_type);
    REALM_ASSERT(object_schema != realm->schema().end());
    return create(ctx, realm, *object_schema, value, try_update, out_row);
}

template<typename ValueType, typename ContextType>
Object Object::create(ContextType& ctx, std::shared_ptr<Realm> const& realm, ObjectSchema const& object_schema,
                      ValueType value, bool try_update, Row* out_row)
{
    realm->verify_in_write();

    // get or create our accessor
    bool created = false;

    // try to get existing row if updating
    size_t row_index = realm::not_found;
    TableRef table = ObjectStore::table_for_object_type(realm->read_group(), object_schema.name);

    bool skip_primary = true;
    if (auto primary_prop = object_schema.primary_key_property()) {
        // search for existing object based on primary key type
        auto primary_value = ctx.value_for_property(value, primary_prop->name,
                                                    primary_prop - &object_schema.persisted_properties[0]);
        if (!primary_value)
            primary_value = ctx.default_value_for_property(object_schema, primary_prop->name);
        if (!primary_value) {
            if (!is_nullable(primary_prop->type))
                throw MissingPropertyValueException(object_schema.name, primary_prop->name);
            primary_value = ctx.null_value();
        }
        row_index = get_for_primary_key_impl(ctx, *table, *primary_prop, *primary_value);

        if (row_index == realm::not_found) {
            created = true;
            if (primary_prop->type == PropertyType::Int) {
#if REALM_HAVE_SYNC_STABLE_IDS
                row_index = sync::create_object_with_primary_key(realm->read_group(), *table,
                                                                 ctx.template unbox<util::Optional<int64_t>>(*primary_value));
#else
                row_index = table->add_empty_row();
                if (ctx.is_null(*primary_value))
                    table->set_null_unique(primary_prop->table_column, row_index);
                else
                    table->set_unique(primary_prop->table_column, row_index,
                                      ctx.template unbox<int64_t>(*primary_value));
#endif // REALM_HAVE_SYNC_STABLE_IDS
            }
            else if (primary_prop->type == PropertyType::String) {
                auto value = ctx.template unbox<StringData>(*primary_value);
#if REALM_HAVE_SYNC_STABLE_IDS
                row_index = sync::create_object_with_primary_key(realm->read_group(), *table, value);
#else
                row_index = table->add_empty_row();
                table->set_unique(primary_prop->table_column, row_index, value);
#endif // REALM_HAVE_SYNC_STABLE_IDS
            }
            else {
                REALM_TERMINATE("Unsupported primary key type.");
            }
        }
        else if (!try_update) {
            if (realm->is_in_migration()) {
                // Creating objects with duplicate primary keys is allowed in migrations
                // as long as there are no duplicates at the end, as adding an entirely
                // new column which is the PK will inherently result in duplicates at first
                row_index = table->add_empty_row();
                created = true;
                skip_primary = false;
            }
            else {
                throw std::logic_error(util::format("Attempting to create an object of type '%1' with an existing primary key value '%2'.",
                                                    object_schema.name, ctx.print(*primary_value)));
            }
        }
    }
    else {
#if REALM_HAVE_SYNC_STABLE_IDS
        row_index = sync::create_object(realm->read_group(), *table);
#else
        row_index = table->add_empty_row();
#endif // REALM_HAVE_SYNC_STABLE_IDS
        created = true;
    }

    // populate
    Object object(realm, object_schema, table->get(row_index));
    if (out_row)
        *out_row = object.row();
    for (size_t i = 0; i < object_schema.persisted_properties.size(); ++i) {
        auto& prop = object_schema.persisted_properties[i];
        if (skip_primary && prop.is_primary)
            continue;

        auto v = ctx.value_for_property(value, prop.name, i);
        if (!created && !v)
            continue;

        bool is_default = false;
        if (!v) {
            v = ctx.default_value_for_property(object_schema, prop.name);
            is_default = true;
        }
        if ((!v || ctx.is_null(*v)) && !is_nullable(prop.type) && !is_array(prop.type)) {
            if (prop.is_primary || !ctx.allow_missing(value))
                throw MissingPropertyValueException(object_schema.name, prop.name);
        }
        if (v)
            object.set_property_value_impl(ctx, prop, *v, try_update, is_default);
    }
    return object;
}

template<typename ValueType, typename ContextType>
Object Object::get_for_primary_key(ContextType& ctx, std::shared_ptr<Realm> const& realm,
                                   StringData object_type, ValueType primary_value)
{
    auto object_schema = realm->schema().find(object_type);
    REALM_ASSERT(object_schema != realm->schema().end());
    return get_for_primary_key(ctx, realm, *object_schema, primary_value);
}

template<typename ValueType, typename ContextType>
Object Object::get_for_primary_key(ContextType& ctx, std::shared_ptr<Realm> const& realm,
                                   const ObjectSchema &object_schema, ValueType primary_value)
{
    auto primary_prop = object_schema.primary_key_property();
    if (!primary_prop) {
        throw MissingPrimaryKeyException(object_schema.name);
    }

    auto table = ObjectStore::table_for_object_type(realm->read_group(), object_schema.name);
    if (!table)
        return Object(realm, object_schema, RowExpr());
    auto row_index = get_for_primary_key_impl(ctx, *table, *primary_prop, primary_value);

    return Object(realm, object_schema, row_index == realm::not_found ? Row() : Row(table->get(row_index)));
}

template<typename ValueType, typename ContextType>
size_t Object::get_for_primary_key_impl(ContextType& ctx, Table const& table,
                                        const Property &primary_prop, ValueType primary_value)
{
    bool is_null = ctx.is_null(primary_value);
    if (is_null && !is_nullable(primary_prop.type))
        throw std::logic_error("Invalid null value for non-nullable primary key.");
    if (primary_prop.type == PropertyType::String) {
        return table.find_first(primary_prop.table_column,
                                ctx.template unbox<StringData>(primary_value));
    }
    if (is_nullable(primary_prop.type))
        return table.find_first(primary_prop.table_column,
                                ctx.template unbox<util::Optional<int64_t>>(primary_value));
    return table.find_first(primary_prop.table_column, ctx.template unbox<int64_t>(primary_value));
}

} // namespace realm

#endif // REALM_OS_OBJECT_ACCESSOR_HPP
{ "content_hash": "dc10372e08c990e8cd88e92958f1dcb1", "timestamp": "", "source": "github", "line_count": 302, "max_line_length": 155, "avg_line_length": 43.235099337748345, "alnum_prop": 0.6079497587500957, "repo_name": "followmoe/EMA02", "id": "08be944631838f72d9d916ab6a36949d048333e6", "size": "14298", "binary": false, "copies": "14", "ref": "refs/heads/master", "path": "Pods/Realm/include/object_accessor.hpp", "mode": "33188", "license": "mit", "language": [ { "name": "Ruby", "bytes": "89" }, { "name": "Swift", "bytes": "22658" } ] }
package com.github.dozermapper.core;

import java.util.Random;

import com.github.dozermapper.core.config.SettingsDefaults;
import com.github.dozermapper.core.config.SettingsKeys;
import com.github.dozermapper.core.util.DozerConstants;

import org.junit.Before;

public abstract class AbstractDozerTest {

    private static Random rand = new Random(System.currentTimeMillis());

    @Before
    public void setUp() throws Exception {
        System.setProperty("log4j.debug", "true");
        System.setProperty(DozerConstants.DEBUG_SYS_PROP, "true");
        System.setProperty(SettingsKeys.CONFIG_FILE_SYS_PROP, SettingsDefaults.LEGACY_PROPERTIES_FILE);
    }

    protected String getRandomString() {
        return String.valueOf(rand.nextInt());
    }
}
{ "content_hash": "06216368f96324c870c1156aea49d233", "timestamp": "", "source": "github", "line_count": 26, "max_line_length": 103, "avg_line_length": 29.26923076923077, "alnum_prop": 0.7437582128777924, "repo_name": "garethahealy/dozer", "id": "963dcb973ad81ac1ec8a26d7194987a3835377d8", "size": "1363", "binary": false, "copies": "2", "ref": "refs/heads/master", "path": "core/src/test/java/com/github/dozermapper/core/AbstractDozerTest.java", "mode": "33188", "license": "apache-2.0", "language": [ { "name": "Java", "bytes": "2318876" }, { "name": "Shell", "bytes": "1703" } ] }
'use strict'

let component = require('omniscient')
let R = require('ramda')
let S = require('underscore.string.fp')
let h = require('react-hyperscript')
let immutable = require('immutable')
let logger = require('@arve.knudsen/js-logger').get('userProfile')

let ajax = require('../../ajax')
let {nbsp,} = require('../../specialChars')
let VCard = require('./vcard')
let {convertMarkdown,} = require('../../markdown')
let datetime = require('../../datetime')
let {ProjectPlans,} = require('./projectPlans')

if (__IS_BROWSER__) {
  require('./userProfile.styl')
}

let About = component('About', (user) => {
  return h('div', [
    h('h1', `About ${user.name}`),
    convertMarkdown(user.about),
  ])
})

let Projects = component('Projects', (user) => {
  let {username,} = user
  return !R.isEmpty(user.projects) ?
    h('table#user-projects', [
      h('thead', [
        h('tr', [
          h('th', 'ID'),
          h('th', 'Name'),
          h('th', 'Created'),
        ]),
      ]),
      h('tbody', R.map((project) => {
        let {projectId, title,} = project
        let createdStr = datetime.displayDateTextual(project.created)
        return h('tr', [
          h('td', [
            h('a.user-project', {href: `/u/${username}/${projectId}`,}, projectId),
          ]),
          h('td', [
            h('a.user-project', {href: `/u/${username}/${projectId}`,}, title),
          ]),
          h('td', [
            h('a.user-project', {href: `/u/${username}/${projectId}`,}, createdStr),
          ]),
        ])
      }, user.projects)),
    ]) : h('em', 'No projects.')
})

let SoundCloudUpload = component('SoundCloudUpload', {
  componentDidMount: function () {
    let upload = this.props.upload
    logger.debug(`SoundCloud upload did mount`, upload)
    let uploadElem = this.refs.soundCloudUpload
    logger.debug('Got SoundCloud upload element:', uploadElem)
    uploadElem.innerHTML = upload.html
  },
}, () => {
  return h('.soundcloud-upload', {ref: `soundCloudUpload`,})
})

let Media = component('Media', (user) => {
  let soundCloud = user.soundCloud || {}
  return h('div', !R.isEmpty(user.soundCloudUploads) ? [
    h('h1#soundcloud-header', [
      h('span.icon-soundcloud', 'SoundCloud'),
    ]),
    h('p', `${S.words(user.name)[0]}'s sounds on SoundCloud`),
    h('ul#soundcloud-uploads', R.map((upload) => {
      return h('li', [SoundCloudUpload({upload,}),])
    }, soundCloud.uploads)),
  ] : null)
})

let Workshops = component('Workshops', (user) => {
  return convertMarkdown(user.workshopsInfo)
})

let isActiveTab = (tabName, cursor) => {
  return cursor.cursor('userProfile').get('activeTab') === tabName
}

class UserTab {
  constructor(title, icon, enabled) {
    enabled = enabled == null ? true : enabled
    this.title = title
    this.icon = icon
    this.enabled = enabled
    this.name = title.toLowerCase()
    this.url = `#${title.toLowerCase()}`
  }

  getClasses(activeTab) {
    let classes
    if (this.name === activeTab) {
      logger.debug(`${this.name} is active tab`)
      classes = ['active',]
    } else {
      classes = []
    }
    return S.join(' ', R.concat(classes, !this.enabled ? ['disabled',] : []))
  }
}

module.exports = {
  createState: () => {
    return immutable.fromJS({
      activeTab: 'projects',
    })
  },
  loadData: (cursor, params) => {
    return ajax.getJson(`/api/users/${params.user}`, null, {cursor,})
      .then((user) => {
        // logger.debug(`Loading user JSON succeeded:`, user)
        return {
          userProfile: {
            user: user,
            activeTab: 'projects',
          },
        }
      }, (error) => {
        logger.warn(`Loading user JSON failed:`, error)
        throw error
      })
  },
  render: (cursor) => {
    let profileCursor = cursor.cursor('userProfile')
    let user = profileCursor.get('user').toJS()
    let currentHash = cursor.cursor('router').get('currentHash')
    let soundCloud = user.soundCloud || {}
    let profileTabs = [
      new UserTab('Projects'),
      new UserTab('Plans'),
      new UserTab('About'),
      new UserTab('Media', null, !R.isEmpty(soundCloud.uploads || [])),
      new UserTab('Workshops', null, !S.isBlank(user.workshopsInfo)),
    ]
    let activeTab = R.contains(currentHash, R.map((tab) => {
      return tab.name
    }, profileTabs)) ? currentHash : 'projects'
    logger.debug(`Rendering profile of user '${user.username}', active tab '${activeTab}'`)
    logger.debug(`State:`, profileCursor.toJS())
    let tabContents
    if (activeTab === 'about') {
      tabContents = About(user)
    } else if (activeTab === 'projects') {
      tabContents = Projects(user)
    } else if (activeTab === 'plans') {
      tabContents = ProjectPlans({user, cursor,})
    } else if (activeTab === 'media') {
      tabContents = Media(user)
    } else if (activeTab === 'workshops') {
      tabContents = Workshops(user)
    }
    return h('#user-pad', [
      h('.pure-g', [
        h('.pure-u-1-4', [
          VCard({cursor, user,}),
        ]),
        h('.pure-u-3-4', [
          h('ul.tabs', {role: 'tablist',}, R.map((profileTab) => {
            return h(`li.${S.join('.', profileTab.getClasses(activeTab))}`, [
              profileTab.enabled ?
                h('a', {
                  role: 'tab',
                  href: profileTab.url,
                }, [
                  profileTab.icon != null ? h(`span.icon-${profileTab.icon}`, nbsp) : null,
                  h('span', profileTab.title),
                ]) :
                h('div', [
                  profileTab.icon != null ? h(`span.icon-${profileTab.icon}`, nbsp) : null,
                  h('span', profileTab.title),
                ]),
            ])
          }, profileTabs)),
          h('#tab-contents', [tabContents,]),
        ]),
      ]),
    ])
  },
}
{ "content_hash": "5e2080c0fd8f44e70058533c80557d8f", "timestamp": "", "source": "github", "line_count": 187, "max_line_length": 91, "avg_line_length": 30.56149732620321, "alnum_prop": 0.5557305336832896, "repo_name": "muzhack/musitechhub", "id": "44209c21e77e8133bfbadc3688ef7e0bebe5dd4f", "size": "5715", "binary": false, "copies": "2", "ref": "refs/heads/master", "path": "app/views/userProfile/userProfile.js", "mode": "33188", "license": "mit", "language": [ { "name": "CSS", "bytes": "23036" }, { "name": "HTML", "bytes": "589" }, { "name": "JavaScript", "bytes": "197116" }, { "name": "Python", "bytes": "11012" }, { "name": "Shell", "bytes": "743" } ] }
'------------------------------------------------------------------------------
' <auto-generated>
'     This code was generated by a tool.
'     Runtime Version:4.0.30319.34209
'
'     Changes to this file may cause incorrect behavior and will be lost if
'     the code is regenerated.
' </auto-generated>
'------------------------------------------------------------------------------

Option Strict On
Option Explicit On


Namespace My

    <Global.System.Runtime.CompilerServices.CompilerGeneratedAttribute(), _
     Global.System.CodeDom.Compiler.GeneratedCodeAttribute("Microsoft.VisualStudio.Editors.SettingsDesigner.SettingsSingleFileGenerator", "12.0.0.0"), _
     Global.System.ComponentModel.EditorBrowsableAttribute(Global.System.ComponentModel.EditorBrowsableState.Advanced)> _
    Partial Friend NotInheritable Class MySettings
        Inherits Global.System.Configuration.ApplicationSettingsBase

        Private Shared defaultInstance As MySettings = CType(Global.System.Configuration.ApplicationSettingsBase.Synchronized(New MySettings()), MySettings)

#Region "My.Settings Auto-Save Functionality"
#If _MyType = "WindowsForms" Then
        Private Shared addedHandler As Boolean

        Private Shared addedHandlerLockObject As New Object

        <Global.System.Diagnostics.DebuggerNonUserCodeAttribute(), Global.System.ComponentModel.EditorBrowsableAttribute(Global.System.ComponentModel.EditorBrowsableState.Advanced)> _
        Private Shared Sub AutoSaveSettings(ByVal sender As Global.System.Object, ByVal e As Global.System.EventArgs)
            If My.Application.SaveMySettingsOnExit Then
                My.Settings.Save()
            End If
        End Sub
#End If
#End Region

        Public Shared ReadOnly Property [Default]() As MySettings
            Get

#If _MyType = "WindowsForms" Then
                If Not addedHandler Then
                    SyncLock addedHandlerLockObject
                        If Not addedHandler Then
                            AddHandler My.Application.Shutdown, AddressOf AutoSaveSettings
                            addedHandler = True
                        End If
                    End SyncLock
                End If
#End If
                Return defaultInstance
            End Get
        End Property
    End Class
End Namespace

Namespace My

    <Global.Microsoft.VisualBasic.HideModuleNameAttribute(), _
     Global.System.Diagnostics.DebuggerNonUserCodeAttribute(), _
     Global.System.Runtime.CompilerServices.CompilerGeneratedAttribute()> _
    Friend Module MySettingsProperty

        <Global.System.ComponentModel.Design.HelpKeywordAttribute("My.Settings")> _
        Friend ReadOnly Property Settings() As Global.AddJPIPLayer.My.MySettings
            Get
                Return Global.AddJPIPLayer.My.MySettings.Default
            End Get
        End Property
    End Module
End Namespace
{
  "content_hash": "c96f4a90e52e614886a8dadd7994cb10",
  "timestamp": "",
  "source": "github",
  "line_count": 73,
  "max_line_length": 179,
  "avg_line_length": 39.89041095890411,
  "alnum_prop": 0.6493818681318682,
  "repo_name": "Esri/arcobjects-sdk-community-samples",
  "id": "3abdbb6e21326c3258b3548ee565a1c16fdeee9d",
  "size": "2914",
  "binary": false,
  "copies": "1",
  "ref": "refs/heads/master",
  "path": "Net/Raster/PublishMap/VBNET/My Project/Settings.Designer.vb",
  "mode": "33188",
  "license": "apache-2.0",
  "language": [
    { "name": "C", "bytes": "17067" },
    { "name": "C#", "bytes": "7769223" },
    { "name": "C++", "bytes": "454756" },
    { "name": "HTML", "bytes": "6209" },
    { "name": "JavaScript", "bytes": "3064" },
    { "name": "Makefile", "bytes": "570" },
    { "name": "Objective-C", "bytes": "2417" },
    { "name": "Visual Basic .NET", "bytes": "5250403" },
    { "name": "XSLT", "bytes": "73678" }
  ]
}
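The rows that follow contain NanoHTTPD, a single-file embeddable Java HTTP server; among other things, its Response class frames output bodies with HTTP/1.1 chunked transfer encoding (each write becomes a hex-length line, the data, and a CRLF, terminated by a zero-length chunk). As a standalone illustration of that framing, here is a minimal sketch; the class name `ChunkedFramingSketch` is mine, not NanoHTTPD's, and this is a simplified stand-in rather than the library's actual implementation:

```java
import java.io.ByteArrayOutputStream;
import java.io.FilterOutputStream;
import java.io.IOException;
import java.io.OutputStream;

// Minimal sketch of HTTP/1.1 chunked transfer framing (RFC 7230 section 4.1):
// each write is emitted as "<hex length>\r\n<data>\r\n", and finish()
// terminates the body with the zero-length chunk "0\r\n\r\n".
public class ChunkedFramingSketch extends FilterOutputStream {

    public ChunkedFramingSketch(OutputStream out) {
        super(out);
    }

    @Override
    public void write(int b) throws IOException {
        write(new byte[] { (byte) b }, 0, 1);
    }

    @Override
    public void write(byte[] b, int off, int len) throws IOException {
        if (len == 0) {
            return; // a zero-length chunk would terminate the body early
        }
        out.write(String.format("%x\r\n", len).getBytes("US-ASCII"));
        out.write(b, off, len);
        out.write("\r\n".getBytes("US-ASCII"));
    }

    public void finish() throws IOException {
        out.write("0\r\n\r\n".getBytes("US-ASCII"));
    }

    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        ChunkedFramingSketch chunked = new ChunkedFramingSketch(sink);
        chunked.write("hello".getBytes("US-ASCII"));
        chunked.finish();
        // "|" substituted for CRLF for readability; prints 5|hello|0||
        System.out.println(sink.toString("US-ASCII").replace("\r\n", "|"));
    }
}
```

Note that `FilterOutputStream.write(byte[])` delegates to the three-argument overload, so overriding only `write(byte[], int, int)` (plus the single-byte form) is enough to frame every write.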
package fi.iki.elonen;

import java.io.*;
import java.net.InetAddress;
import java.net.InetSocketAddress;
import java.net.ServerSocket;
import java.net.Socket;
import java.net.SocketException;
import java.net.SocketTimeoutException;
import java.net.URLDecoder;
import java.nio.ByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.charset.Charset;
import java.security.KeyStore;
import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.Calendar;
import java.util.Collections;
import java.util.Date;
import java.util.HashMap;
import java.util.Iterator;
import java.util.List;
import java.util.Locale;
import java.util.Map;
import java.util.StringTokenizer;
import java.util.TimeZone;
import java.util.logging.Level;
import java.util.logging.Logger;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import java.util.zip.GZIPOutputStream;

import javax.net.ssl.KeyManager;
import javax.net.ssl.KeyManagerFactory;
import javax.net.ssl.SSLContext;
import javax.net.ssl.SSLServerSocket;
import javax.net.ssl.SSLServerSocketFactory;
import javax.net.ssl.TrustManagerFactory;

import fi.iki.elonen.NanoHTTPD.Response.IStatus;
import fi.iki.elonen.NanoHTTPD.Response.Status;

/**
 * A simple, tiny, nicely embeddable HTTP server in Java
 * <p/>
 * <p/>
 * NanoHTTPD
 * <p>
 * Copyright (c) 2012-2013 by Paul S. Hawke, 2001,2005-2013 by Jarno Elonen,
 * 2010 by Konstantinos Togias
 * </p>
 * <p/>
 * <p/>
 * <b>Features + limitations: </b>
 * <ul>
 * <p/>
 * <li>Only one Java file</li>
 * <li>Java 5 compatible</li>
 * <li>Released as open source, Modified BSD licence</li>
 * <li>No fixed config files, logging, authorization etc.
(Implement yourself if you need them.)</li>
 * <li>Supports parameter parsing of GET and POST methods (+ rudimentary PUT
 * support in 1.25)</li>
 * <li>Supports both dynamic content and file serving</li>
 * <li>Supports file upload (since version 1.2, 2010)</li>
 * <li>Supports partial content (streaming)</li>
 * <li>Supports ETags</li>
 * <li>Never caches anything</li>
 * <li>Doesn't limit bandwidth, request time or simultaneous connections</li>
 * <li>Default code serves files and shows all HTTP parameters and headers</li>
 * <li>File server supports directory listing, index.html and index.htm</li>
 * <li>File server supports partial content (streaming)</li>
 * <li>File server supports ETags</li>
 * <li>File server does the 301 redirection trick for directories without '/'</li>
 * <li>File server supports simple skipping for files (continue download)</li>
 * <li>File server serves also very long files without memory overhead</li>
 * <li>Contains a built-in list of most common MIME types</li>
 * <li>All header names are converted to lower case so they don't vary between
 * browsers/clients</li>
 * <p/>
 * </ul>
 * <p/>
 * <p/>
 * <b>How to use: </b>
 * <ul>
 * <p/>
 * <li>Subclass and implement serve() and embed to your own program</li>
 * <p/>
 * </ul>
 * <p/>
 * See the separate "LICENSE.md" file for the distribution license (Modified BSD
 * licence)
 */
public abstract class NanoHTTPD {

    /**
     * Pluggable strategy for asynchronously executing requests.
     */
    public interface AsyncRunner {

        void closeAll();

        void closed(ClientHandler clientHandler);

        void exec(ClientHandler code);
    }

    /**
     * The runnable that will be used for every new client connection.
     */
    public class ClientHandler implements Runnable {

        private final InputStream inputStream;

        private final Socket acceptSocket;

        private ClientHandler(InputStream inputStream, Socket acceptSocket) {
            this.inputStream = inputStream;
            this.acceptSocket = acceptSocket;
        }

        public void close() {
            safeClose(this.inputStream);
            safeClose(this.acceptSocket);
        }

        @Override
        public void run() {
            OutputStream outputStream = null;
            try {
                outputStream = this.acceptSocket.getOutputStream();
                TempFileManager tempFileManager = NanoHTTPD.this.tempFileManagerFactory.create();
                HTTPSession session = new HTTPSession(tempFileManager, this.inputStream, outputStream, this.acceptSocket.getInetAddress());
                while (!this.acceptSocket.isClosed()) {
                    session.execute();
                }
            } catch (Exception e) {
                // When the socket is closed by the client, we throw our own
                // SocketException to break the "keep alive" loop above. If the
                // exception was anything other than the expected SocketException
                // OR a SocketTimeoutException, print the stacktrace
                if (!(e instanceof SocketException && "NanoHttpd Shutdown".equals(e.getMessage())) && !(e instanceof SocketTimeoutException)) {
                    NanoHTTPD.LOG.log(Level.FINE, "Communication with the client broken", e);
                }
            } finally {
                safeClose(outputStream);
                safeClose(this.inputStream);
                safeClose(this.acceptSocket);
                NanoHTTPD.this.asyncRunner.closed(this);
            }
        }
    }

    public static class Cookie {

        public static String getHTTPTime(int days) {
            Calendar calendar = Calendar.getInstance();
            SimpleDateFormat dateFormat = new SimpleDateFormat("EEE, dd MMM yyyy HH:mm:ss z", Locale.US);
            dateFormat.setTimeZone(TimeZone.getTimeZone("GMT"));
            calendar.add(Calendar.DAY_OF_MONTH, days);
            return dateFormat.format(calendar.getTime());
        }

        private final String n, v, e;

        public Cookie(String name, String value) {
            this(name, value, 30);
        }

        public Cookie(String name, String value, int numDays) {
            this.n = name;
            this.v = value;
            this.e = getHTTPTime(numDays);
        }

        public Cookie(String name, String value, String expires) {
            this.n
                    = name;
            this.v = value;
            this.e = expires;
        }

        public String getHTTPHeader() {
            String fmt = "%s=%s; expires=%s";
            return String.format(fmt, this.n, this.v, this.e);
        }
    }

    /**
     * Provides rudimentary support for cookies. Doesn't support 'path',
     * 'secure' nor 'httpOnly'. Feel free to improve it and/or add unsupported
     * features.
     *
     * @author LordFokas
     */
    public class CookieHandler implements Iterable<String> {

        private final HashMap<String, String> cookies = new HashMap<String, String>();

        private final ArrayList<Cookie> queue = new ArrayList<Cookie>();

        public CookieHandler(Map<String, String> httpHeaders) {
            String raw = httpHeaders.get("cookie");
            if (raw != null) {
                String[] tokens = raw.split(";");
                for (String token : tokens) {
                    String[] data = token.trim().split("=");
                    if (data.length == 2) {
                        this.cookies.put(data[0], data[1]);
                    }
                }
            }
        }

        /**
         * Set a cookie with an expiration date from a month ago, effectively
         * deleting it on the client side.
         *
         * @param name
         *            The cookie name.
         */
        public void delete(String name) {
            set(name, "-delete-", -30);
        }

        @Override
        public Iterator<String> iterator() {
            return this.cookies.keySet().iterator();
        }

        /**
         * Read a cookie from the HTTP Headers.
         *
         * @param name
         *            The cookie's name.
         * @return The cookie's value if it exists, null otherwise.
         */
        public String read(String name) {
            return this.cookies.get(name);
        }

        public void set(Cookie cookie) {
            this.queue.add(cookie);
        }

        /**
         * Sets a cookie.
         *
         * @param name
         *            The cookie's name.
         * @param value
         *            The cookie's value.
         * @param expires
         *            How many days until the cookie expires.
         */
        public void set(String name, String value, int expires) {
            this.queue.add(new Cookie(name, value, Cookie.getHTTPTime(expires)));
        }

        /**
         * Internally used by the webserver to add all queued cookies into the
         * Response's HTTP Headers.
         *
         * @param response
         *            The Response object to which headers the queued cookies
         *            will be added.
         */
        public void unloadQueue(Response response) {
            for (Cookie cookie : this.queue) {
                response.addHeader("Set-Cookie", cookie.getHTTPHeader());
            }
        }
    }

    /**
     * Default threading strategy for NanoHTTPD.
     * <p/>
     * <p>
     * By default, the server spawns a new Thread for every incoming request.
     * These are set to <i>daemon</i> status, and named according to the request
     * number. The name is useful when profiling the application.
     * </p>
     */
    public static class DefaultAsyncRunner implements AsyncRunner {

        private long requestCount;

        private final List<ClientHandler> running = Collections.synchronizedList(new ArrayList<NanoHTTPD.ClientHandler>());

        /**
         * @return a list with currently running clients.
         */
        public List<ClientHandler> getRunning() {
            return running;
        }

        @Override
        public void closeAll() {
            // copy of the list for concurrency
            for (ClientHandler clientHandler : new ArrayList<ClientHandler>(this.running)) {
                clientHandler.close();
            }
        }

        @Override
        public void closed(ClientHandler clientHandler) {
            this.running.remove(clientHandler);
        }

        @Override
        public void exec(ClientHandler clientHandler) {
            ++this.requestCount;
            Thread t = new Thread(clientHandler);
            t.setDaemon(true);
            t.setName("NanoHttpd Request Processor (#" + this.requestCount + ")");
            this.running.add(clientHandler);
            t.start();
        }
    }

    /**
     * Default strategy for creating and cleaning up temporary files.
     * <p/>
     * <p>
     * By default, files are created by <code>File.createTempFile()</code> in
     * the directory specified.
     * </p>
     */
    public static class DefaultTempFile implements TempFile {

        private final File file;

        private final OutputStream fstream;

        public DefaultTempFile(String tempdir) throws IOException {
            this.file = File.createTempFile("NanoHTTPD-", "", new File(tempdir));
            this.fstream = new FileOutputStream(this.file);
        }

        @Override
        public void delete() throws Exception {
            safeClose(this.fstream);
            if (!this.file.delete()) {
                throw new Exception("could not delete temporary file");
            }
        }

        @Override
        public String getName() {
            return this.file.getAbsolutePath();
        }

        @Override
        public OutputStream open() throws Exception {
            return this.fstream;
        }
    }

    /**
     * Default strategy for creating and cleaning up temporary files.
     * <p/>
     * <p>
     * This class stores its files in the standard location (that is, wherever
     * <code>java.io.tmpdir</code> points to). Files are added to an internal
     * list, and deleted when no longer needed (that is, when
     * <code>clear()</code> is invoked at the end of processing a request).
     * </p>
     */
    public static class DefaultTempFileManager implements TempFileManager {

        private final String tmpdir;

        private final List<TempFile> tempFiles;

        public DefaultTempFileManager() {
            this.tmpdir = System.getProperty("java.io.tmpdir");
            this.tempFiles = new ArrayList<TempFile>();
        }

        @Override
        public void clear() {
            for (TempFile file : this.tempFiles) {
                try {
                    file.delete();
                } catch (Exception ignored) {
                    NanoHTTPD.LOG.log(Level.WARNING, "could not delete file ", ignored);
                }
            }
            this.tempFiles.clear();
        }

        @Override
        public TempFile createTempFile() throws Exception {
            DefaultTempFile tempFile = new DefaultTempFile(this.tmpdir);
            this.tempFiles.add(tempFile);
            return tempFile;
        }
    }

    /**
     * Default strategy for creating and cleaning up temporary files.
     */
    private class DefaultTempFileManagerFactory implements TempFileManagerFactory {

        @Override
        public TempFileManager create() {
            return new DefaultTempFileManager();
        }
    }

    private static final String CONTENT_DISPOSITION_REGEX = "([ |\t]*Content-Disposition[ |\t]*:)(.*)";

    private static final Pattern CONTENT_DISPOSITION_PATTERN = Pattern.compile(CONTENT_DISPOSITION_REGEX, Pattern.CASE_INSENSITIVE);

    private static final String CONTENT_TYPE_REGEX = "([ |\t]*content-type[ |\t]*:)(.*)";

    private static final Pattern CONTENT_TYPE_PATTERN = Pattern.compile(CONTENT_TYPE_REGEX, Pattern.CASE_INSENSITIVE);

    private static final String CONTENT_DISPOSITION_ATTRIBUTE_REGEX = "[ |\t]*([a-zA-Z]*)[ |\t]*=[ |\t]*['|\"]([^\"^']*)['|\"]";

    private static final Pattern CONTENT_DISPOSITION_ATTRIBUTE_PATTERN = Pattern.compile(CONTENT_DISPOSITION_ATTRIBUTE_REGEX);

    protected class HTTPSession implements IHTTPSession {

        public static final int BUFSIZE = 8192;

        private final TempFileManager tempFileManager;

        private final OutputStream outputStream;

        private final PushbackInputStream inputStream;

        private int splitbyte;

        private int rlen;

        private String uri;

        private Method method;

        private Map<String, String> parms;

        private Map<String, String> headers;

        private CookieHandler cookies;

        private String queryParameterString;

        private String remoteIp;

        private String protocolVersion;

        public HTTPSession(TempFileManager tempFileManager, InputStream inputStream, OutputStream outputStream) {
            this.tempFileManager = tempFileManager;
            this.inputStream = new PushbackInputStream(inputStream, HTTPSession.BUFSIZE);
            this.outputStream = outputStream;
        }

        public HTTPSession(TempFileManager tempFileManager, InputStream inputStream, OutputStream outputStream, InetAddress inetAddress) {
            this.tempFileManager = tempFileManager;
            this.inputStream = new PushbackInputStream(inputStream, HTTPSession.BUFSIZE);
            this.outputStream = outputStream;
            this.remoteIp = inetAddress.isLoopbackAddress() || inetAddress.isAnyLocalAddress() ?
                    "127.0.0.1" : inetAddress.getHostAddress().toString();
            this.headers = new HashMap<String, String>();
        }

        /**
         * Decodes the sent headers and loads the data into Key/value pairs
         */
        private void decodeHeader(BufferedReader in, Map<String, String> pre, Map<String, String> parms, Map<String, String> headers) throws ResponseException {
            try {
                // Read the request line
                String inLine = in.readLine();
                if (inLine == null) {
                    return;
                }

                StringTokenizer st = new StringTokenizer(inLine);
                if (!st.hasMoreTokens()) {
                    throw new ResponseException(Response.Status.BAD_REQUEST, "BAD REQUEST: Syntax error. Usage: GET /example/file.html");
                }

                pre.put("method", st.nextToken());

                if (!st.hasMoreTokens()) {
                    throw new ResponseException(Response.Status.BAD_REQUEST, "BAD REQUEST: Missing URI. Usage: GET /example/file.html");
                }

                String uri = st.nextToken();

                // Decode parameters from the URI
                int qmi = uri.indexOf('?');
                if (qmi >= 0) {
                    decodeParms(uri.substring(qmi + 1), parms);
                    uri = decodePercent(uri.substring(0, qmi));
                } else {
                    uri = decodePercent(uri);
                }

                // If there's another token, it's the protocol version,
                // followed by HTTP headers.
                // NOTE: this now forces header names lower case since they are
                // case insensitive and vary by client.
                if (st.hasMoreTokens()) {
                    protocolVersion = st.nextToken();
                } else {
                    protocolVersion = "HTTP/1.1";
                    NanoHTTPD.LOG.log(Level.FINE, "no protocol version specified, strange. Assuming HTTP/1.1.");
                }
                String line = in.readLine();
                while (line != null && line.trim().length() > 0) {
                    int p = line.indexOf(':');
                    if (p >= 0) {
                        headers.put(line.substring(0, p).trim().toLowerCase(Locale.US), line.substring(p + 1).trim());
                    }
                    line = in.readLine();
                }

                pre.put("uri", uri);
            } catch (IOException ioe) {
                throw new ResponseException(Response.Status.INTERNAL_ERROR, "SERVER INTERNAL ERROR: IOException: " + ioe.getMessage(), ioe);
            }
        }

        /**
         * Decodes the Multipart Body data and put it into Key/Value pairs.
         */
        private void decodeMultipartFormData(String boundary, ByteBuffer fbuf, Map<String, String> parms, Map<String, String> files) throws ResponseException {
            try {
                int[] boundary_idxs = getBoundaryPositions(fbuf, boundary.getBytes());
                if (boundary_idxs.length < 2) {
                    throw new ResponseException(Response.Status.BAD_REQUEST, "BAD REQUEST: Content type is multipart/form-data but contains less than two boundary strings.");
                }

                final int MAX_HEADER_SIZE = 1024;
                byte[] part_header_buff = new byte[MAX_HEADER_SIZE];
                for (int bi = 0; bi < boundary_idxs.length - 1; bi++) {
                    fbuf.position(boundary_idxs[bi]);
                    int len = (fbuf.remaining() < MAX_HEADER_SIZE) ? fbuf.remaining() : MAX_HEADER_SIZE;
                    fbuf.get(part_header_buff, 0, len);
                    ByteArrayInputStream bais = new ByteArrayInputStream(part_header_buff, 0, len);
                    BufferedReader in = new BufferedReader(new InputStreamReader(bais, Charset.forName("US-ASCII")));

                    // First line is boundary string
                    String mpline = in.readLine();
                    if (!mpline.contains(boundary)) {
                        throw new ResponseException(Response.Status.BAD_REQUEST, "BAD REQUEST: Content type is multipart/form-data but chunk does not start with boundary.");
                    }

                    String part_name = null, file_name = null, content_type = null;
                    // Parse the rest of the header lines
                    mpline = in.readLine();
                    while (mpline != null && mpline.trim().length() > 0) {
                        Matcher matcher = CONTENT_DISPOSITION_PATTERN.matcher(mpline);
                        if (matcher.matches()) {
                            String attributeString = matcher.group(2);
                            matcher = CONTENT_DISPOSITION_ATTRIBUTE_PATTERN.matcher(attributeString);
                            while (matcher.find()) {
                                String key = matcher.group(1);
                                if (key.equalsIgnoreCase("name")) {
                                    part_name = matcher.group(2);
                                } else if (key.equalsIgnoreCase("filename")) {
                                    file_name = matcher.group(2);
                                }
                            }
                        }
                        matcher = CONTENT_TYPE_PATTERN.matcher(mpline);
                        if (matcher.matches()) {
                            content_type = matcher.group(2).trim();
                        }
                        mpline = in.readLine();
                    }

                    // Read the part data
                    int part_header_len = len - (int) in.skip(MAX_HEADER_SIZE);
                    if (part_header_len >= len - 4) {
                        throw new ResponseException(Response.Status.INTERNAL_ERROR, "Multipart header size exceeds MAX_HEADER_SIZE.");
                    }
                    int part_data_start = boundary_idxs[bi] + part_header_len;
                    int part_data_end = boundary_idxs[bi + 1] - 4;

                    fbuf.position(part_data_start);
                    if (content_type == null) {
                        // Read the part into a string
                        byte[] data_bytes = new byte[part_data_end - part_data_start];
                        fbuf.get(data_bytes);
                        parms.put(part_name, new String(data_bytes));
                    } else {
                        // Read it into a file
                        String path = saveTmpFile(fbuf, part_data_start, part_data_end - part_data_start);
                        if (!files.containsKey(part_name)) {
                            files.put(part_name, path);
                        } else {
                            int count = 2;
                            while (files.containsKey(part_name + count)) {
                                count++;
                            }
                            files.put(part_name + count, path);
                        }
                        parms.put(part_name, file_name);
                    }
                }
            } catch (ResponseException re) {
                throw re;
            } catch (Exception e) {
                throw new ResponseException(Response.Status.INTERNAL_ERROR, e.toString());
            }
        }

        /**
         * Decodes parameters in percent-encoded URI-format ( e.g.
         * "name=Jack%20Daniels&pass=Single%20Malt" ) and adds them to given
         * Map. NOTE: this doesn't support multiple identical keys due to the
         * simplicity of Map.
         */
        private void decodeParms(String parms, Map<String, String> p) {
            if (parms == null) {
                this.queryParameterString = "";
                return;
            }

            this.queryParameterString = parms;
            StringTokenizer st = new StringTokenizer(parms, "&");
            while (st.hasMoreTokens()) {
                String e = st.nextToken();
                int sep = e.indexOf('=');
                if (sep >= 0) {
                    p.put(decodePercent(e.substring(0, sep)).trim(), decodePercent(e.substring(sep + 1)));
                } else {
                    p.put(decodePercent(e).trim(), "");
                }
            }
        }

        @Override
        public void execute() throws IOException {
            Response r = null;
            try {
                // Read the first 8192 bytes.
                // The full header should fit in here.
                // Apache's default header limit is 8KB.
                // Do NOT assume that a single read will get the entire header
                // at once!
                byte[] buf = new byte[HTTPSession.BUFSIZE];
                this.splitbyte = 0;
                this.rlen = 0;

                int read = -1;
                try {
                    read = this.inputStream.read(buf, 0, HTTPSession.BUFSIZE);
                } catch (Exception e) {
                    safeClose(this.inputStream);
                    safeClose(this.outputStream);
                    throw new SocketException("NanoHttpd Shutdown");
                }
                if (read == -1) {
                    // socket has been closed
                    safeClose(this.inputStream);
                    safeClose(this.outputStream);
                    throw new SocketException("NanoHttpd Shutdown");
                }
                while (read > 0) {
                    this.rlen += read;
                    this.splitbyte = findHeaderEnd(buf, this.rlen);
                    if (this.splitbyte > 0) {
                        break;
                    }
                    read = this.inputStream.read(buf, this.rlen, HTTPSession.BUFSIZE - this.rlen);
                }

                if (this.splitbyte < this.rlen) {
                    this.inputStream.unread(buf, this.splitbyte, this.rlen - this.splitbyte);
                }

                this.parms = new HashMap<String, String>();
                if (null == this.headers) {
                    this.headers = new HashMap<String, String>();
                } else {
                    this.headers.clear();
                }

                if (null != this.remoteIp) {
                    this.headers.put("remote-addr", this.remoteIp);
                    this.headers.put("http-client-ip", this.remoteIp);
                }

                // Create a BufferedReader for parsing the header.
                BufferedReader hin = new BufferedReader(new InputStreamReader(new ByteArrayInputStream(buf, 0, this.rlen)));

                // Decode the header into parms and header java properties
                Map<String, String> pre = new HashMap<String, String>();
                decodeHeader(hin, pre, this.parms, this.headers);

                this.method = Method.lookup(pre.get("method"));
                if (this.method == null) {
                    throw new ResponseException(Response.Status.BAD_REQUEST, "BAD REQUEST: Syntax error.");
                }

                this.uri = pre.get("uri");

                this.cookies = new CookieHandler(this.headers);

                String connection = this.headers.get("connection");
                boolean keepAlive = protocolVersion.equals("HTTP/1.1") && (connection == null || !connection.matches("(?i).*close.*"));

                // Ok, now do the serve()
                r = serve(this);
                if (r == null) {
                    throw new ResponseException(Response.Status.INTERNAL_ERROR, "SERVER INTERNAL ERROR: Serve() returned a null response.");
                } else {
                    String acceptEncoding = this.headers.get("accept-encoding");
                    this.cookies.unloadQueue(r);
                    r.setRequestMethod(this.method);
                    r.setGzipEncoding(useGzipWhenAccepted(r) && acceptEncoding != null && acceptEncoding.contains("gzip"));
                    r.setKeepAlive(keepAlive);
                    r.send(this.outputStream);
                }
                if (!keepAlive || "close".equalsIgnoreCase(r.getHeader("connection"))) {
                    throw new SocketException("NanoHttpd Shutdown");
                }
            } catch (SocketException e) {
                // throw it out to close socket object (finalAccept)
                throw e;
            } catch (SocketTimeoutException ste) {
                // treat socket timeouts the same way we treat socket exceptions
                // i.e. close the stream & finalAccept object by throwing the
                // exception up the call stack.
                throw ste;
            } catch (IOException ioe) {
                Response resp = newFixedLengthResponse(Response.Status.INTERNAL_ERROR, NanoHTTPD.MIME_PLAINTEXT, "SERVER INTERNAL ERROR: IOException: " + ioe.getMessage());
                resp.send(this.outputStream);
                safeClose(this.outputStream);
            } catch (ResponseException re) {
                Response resp = newFixedLengthResponse(re.getStatus(), NanoHTTPD.MIME_PLAINTEXT, re.getMessage());
                resp.send(this.outputStream);
                safeClose(this.outputStream);
            } finally {
                safeClose(r);
                this.tempFileManager.clear();
            }
        }

        /**
         * Find byte index separating header from body. It must be the last byte
         * of the first two sequential new lines.
         */
        private int findHeaderEnd(final byte[] buf, int rlen) {
            int splitbyte = 0;
            while (splitbyte + 3 < rlen) {
                if (buf[splitbyte] == '\r' && buf[splitbyte + 1] == '\n' && buf[splitbyte + 2] == '\r' && buf[splitbyte + 3] == '\n') {
                    return splitbyte + 4;
                }
                splitbyte++;
            }
            return 0;
        }

        /**
         * Find the byte positions where multipart boundaries start. This reads
         * a large block at a time and uses a temporary buffer to optimize
         * (memory mapped) file access.
         */
        private int[] getBoundaryPositions(ByteBuffer b, byte[] boundary) {
            int[] res = new int[0];
            if (b.remaining() < boundary.length) {
                return res;
            }

            int search_window_pos = 0;
            byte[] search_window = new byte[4 * 1024 + boundary.length];

            int first_fill = (b.remaining() < search_window.length) ?
                    b.remaining() : search_window.length;
            b.get(search_window, 0, first_fill);
            int new_bytes = first_fill - boundary.length;

            do {
                // Search the search_window
                for (int j = 0; j < new_bytes; j++) {
                    for (int i = 0; i < boundary.length; i++) {
                        if (search_window[j + i] != boundary[i])
                            break;
                        if (i == boundary.length - 1) {
                            // Match found, add it to results
                            int[] new_res = new int[res.length + 1];
                            System.arraycopy(res, 0, new_res, 0, res.length);
                            new_res[res.length] = search_window_pos + j;
                            res = new_res;
                        }
                    }
                }
                search_window_pos += new_bytes;

                // Copy the end of the buffer to the start
                System.arraycopy(search_window, search_window.length - boundary.length, search_window, 0, boundary.length);

                // Refill search_window
                new_bytes = search_window.length - boundary.length;
                new_bytes = (b.remaining() < new_bytes) ? b.remaining() : new_bytes;
                b.get(search_window, boundary.length, new_bytes);
            } while (new_bytes > 0);
            return res;
        }

        @Override
        public CookieHandler getCookies() {
            return this.cookies;
        }

        @Override
        public final Map<String, String> getHeaders() {
            return this.headers;
        }

        @Override
        public final InputStream getInputStream() {
            return this.inputStream;
        }

        @Override
        public final Method getMethod() {
            return this.method;
        }

        @Override
        public final Map<String, String> getParms() {
            return this.parms;
        }

        @Override
        public String getQueryParameterString() {
            return this.queryParameterString;
        }

        private RandomAccessFile getTmpBucket() {
            try {
                TempFile tempFile = this.tempFileManager.createTempFile();
                return new RandomAccessFile(tempFile.getName(), "rw");
            } catch (Exception e) {
                throw new Error(e); // we won't recover, so throw an error
            }
        }

        @Override
        public final String getUri() {
            return this.uri;
        }

        @Override
        public void parseBody(Map<String, String> files) throws IOException, ResponseException {
            final int REQUEST_BUFFER_LEN = 512;
            final int MEMORY_STORE_LIMIT = 1024;
            RandomAccessFile randomAccessFile = null;
            try {
                long size;
                if (this.headers.containsKey("content-length")) {
                    size =
                            Integer.parseInt(this.headers.get("content-length"));
                } else if (this.splitbyte < this.rlen) {
                    size = this.rlen - this.splitbyte;
                } else {
                    size = 0;
                }

                ByteArrayOutputStream baos = null;
                DataOutput request_data_output = null;

                // Store the request in memory or a file, depending on size
                if (size < MEMORY_STORE_LIMIT) {
                    baos = new ByteArrayOutputStream();
                    request_data_output = new DataOutputStream(baos);
                } else {
                    randomAccessFile = getTmpBucket();
                    request_data_output = randomAccessFile;
                }

                // Read all the body and write it to request_data_output
                byte[] buf = new byte[REQUEST_BUFFER_LEN];
                while (this.rlen >= 0 && size > 0) {
                    this.rlen = this.inputStream.read(buf, 0, (int) Math.min(size, REQUEST_BUFFER_LEN));
                    size -= this.rlen;
                    if (this.rlen > 0) {
                        request_data_output.write(buf, 0, this.rlen);
                    }
                }

                ByteBuffer fbuf = null;
                if (baos != null) {
                    fbuf = ByteBuffer.wrap(baos.toByteArray(), 0, baos.size());
                } else {
                    fbuf = randomAccessFile.getChannel().map(FileChannel.MapMode.READ_ONLY, 0, randomAccessFile.length());
                    randomAccessFile.seek(0);
                }

                // If the method is POST, there may be parameters
                // in data section, too, read it:
                if (Method.POST.equals(this.method)) {
                    String contentType = "";
                    String contentTypeHeader = this.headers.get("content-type");

                    StringTokenizer st = null;
                    if (contentTypeHeader != null) {
                        st = new StringTokenizer(contentTypeHeader, ",; ");
                        if (st.hasMoreTokens()) {
                            contentType = st.nextToken();
                        }
                    }

                    if ("multipart/form-data".equalsIgnoreCase(contentType)) {
                        // Handle multipart/form-data
                        if (!st.hasMoreTokens()) {
                            throw new ResponseException(Response.Status.BAD_REQUEST, "BAD REQUEST: Content type is multipart/form-data but boundary missing. Usage: GET /example/file.html");
                        }

                        String boundaryStartString = "boundary=";
                        int boundaryContentStart = contentTypeHeader.indexOf(boundaryStartString) + boundaryStartString.length();
                        String boundary = contentTypeHeader.substring(boundaryContentStart, contentTypeHeader.length());
                        if (boundary.startsWith("\"") && boundary.endsWith("\"")) {
                            boundary = boundary.substring(1, boundary.length() - 1);
                        }

                        decodeMultipartFormData(boundary, fbuf, this.parms, files);
                    } else {
                        byte[] postBytes = new byte[fbuf.remaining()];
                        fbuf.get(postBytes);
                        String postLine = new String(postBytes).trim();
                        // Handle application/x-www-form-urlencoded
                        if ("application/x-www-form-urlencoded".equalsIgnoreCase(contentType)) {
                            decodeParms(postLine, this.parms);
                        } else if (postLine.length() != 0) {
                            // Special case for raw POST data => create a
                            // special files entry "postData" with raw content data
                            files.put("postData", postLine);
                        }
                    }
                } else if (Method.PUT.equals(this.method)) {
                    files.put("content", saveTmpFile(fbuf, 0, fbuf.limit()));
                }
            } finally {
                safeClose(randomAccessFile);
            }
        }

        /**
         * Retrieves the content of a sent file and saves it to a temporary
         * file. The full path to the saved file is returned.
         */
        private String saveTmpFile(ByteBuffer b, int offset, int len) {
            String path = "";
            if (len > 0) {
                FileOutputStream fileOutputStream = null;
                try {
                    TempFile tempFile = this.tempFileManager.createTempFile();
                    ByteBuffer src = b.duplicate();
                    fileOutputStream = new FileOutputStream(tempFile.getName());
                    FileChannel dest = fileOutputStream.getChannel();
                    src.position(offset).limit(offset + len);
                    dest.write(src.slice());
                    path = tempFile.getName();
                } catch (Exception e) { // Catch exception if any
                    throw new Error(e); // we won't recover, so throw an error
                } finally {
                    safeClose(fileOutputStream);
                }
            }
            return path;
        }
    }

    /**
     * Handles one session, i.e. parses the HTTP request and returns the
     * response.
     */
    public interface IHTTPSession {

        void execute() throws IOException;

        CookieHandler getCookies();

        Map<String, String> getHeaders();

        InputStream getInputStream();

        Method getMethod();

        Map<String, String> getParms();

        String getQueryParameterString();

        /**
         * @return the path part of the URL.
         */
        String getUri();

        /**
         * Adds the files in the request body to the files map.
         *
         * @param files
         *            map to modify
         */
        void parseBody(Map<String, String> files) throws IOException, ResponseException;
    }

    /**
     * HTTP Request methods, with the ability to decode a <code>String</code>
     * back to its enum value.
     */
    public enum Method {
        GET,
        PUT,
        POST,
        DELETE,
        HEAD,
        OPTIONS,
        TRACE,
        CONNECT,
        PATCH;

        static Method lookup(String method) {
            for (Method m : Method.values()) {
                if (m.toString().equalsIgnoreCase(method)) {
                    return m;
                }
            }
            return null;
        }
    }

    /**
     * HTTP response. Return one of these from serve().
     */
    public static class Response implements Closeable {

        public interface IStatus {

            String getDescription();

            int getRequestStatus();
        }

        /**
         * Some HTTP response status codes
         */
        public enum Status implements IStatus {
            SWITCH_PROTOCOL(101, "Switching Protocols"),
            OK(200, "OK"),
            CREATED(201, "Created"),
            ACCEPTED(202, "Accepted"),
            NO_CONTENT(204, "No Content"),
            PARTIAL_CONTENT(206, "Partial Content"),
            REDIRECT(301, "Moved Permanently"),
            NOT_MODIFIED(304, "Not Modified"),
            BAD_REQUEST(400, "Bad Request"),
            UNAUTHORIZED(401, "Unauthorized"),
            FORBIDDEN(403, "Forbidden"),
            NOT_FOUND(404, "Not Found"),
            METHOD_NOT_ALLOWED(405, "Method Not Allowed"),
            REQUEST_TIMEOUT(408, "Request Timeout"),
            RANGE_NOT_SATISFIABLE(416, "Requested Range Not Satisfiable"),
            INTERNAL_ERROR(500, "Internal Server Error"),
            UNSUPPORTED_HTTP_VERSION(505, "HTTP Version Not Supported");

            private final int requestStatus;

            private final String description;

            Status(int requestStatus, String description) {
                this.requestStatus = requestStatus;
                this.description = description;
            }

            @Override
            public String getDescription() {
                return "" + this.requestStatus + " " +
                        this.description;
            }

            @Override
            public int getRequestStatus() {
                return this.requestStatus;
            }
        }

        /**
         * Output stream that will automatically send every write to the wrapped
         * OutputStream according to chunked transfer:
         * http://www.w3.org/Protocols/rfc2616/rfc2616-sec3.html#sec3.6.1
         */
        private static class ChunkedOutputStream extends FilterOutputStream {

            public ChunkedOutputStream(OutputStream out) {
                super(out);
            }

            @Override
            public void write(int b) throws IOException {
                byte[] data = {
                    (byte) b
                };
                write(data, 0, 1);
            }

            @Override
            public void write(byte[] b) throws IOException {
                write(b, 0, b.length);
            }

            @Override
            public void write(byte[] b, int off, int len) throws IOException {
                if (len == 0)
                    return;
                out.write(String.format("%x\r\n", len).getBytes());
                out.write(b, off, len);
                out.write("\r\n".getBytes());
            }

            public void finish() throws IOException {
                out.write("0\r\n\r\n".getBytes());
            }
        }

        /**
         * HTTP status code after processing, e.g. "200 OK", Status.OK
         */
        private IStatus status;

        /**
         * MIME type of content, e.g. "text/html"
         */
        private String mimeType;

        /**
         * Data of the response, may be null.
         */
        private InputStream data;

        private long contentLength;

        /**
         * Headers for the HTTP response. Use addHeader() to add lines.
         */
        private final Map<String, String> header = new HashMap<String, String>();

        /**
         * The request method that spawned this response.
         */
        private Method requestMethod;

        /**
         * Use chunkedTransfer
         */
        private boolean chunkedTransfer;

        private boolean encodeAsGzip;

        private boolean keepAlive;

        /**
         * Creates a fixed length response if totalBytes>=0, otherwise chunked.
*/ protected Response(IStatus status, String mimeType, InputStream data, long totalBytes) { this.status = status; this.mimeType = mimeType; if (data == null) { this.data = new ByteArrayInputStream(new byte[0]); this.contentLength = 0L; } else { this.data = data; this.contentLength = totalBytes; } this.chunkedTransfer = this.contentLength < 0; keepAlive = true; } @Override public void close() throws IOException { if (this.data != null) { this.data.close(); } } /** * Adds given line to the header. */ public void addHeader(String name, String value) { this.header.put(name, value); } public InputStream getData() { return this.data; } public String getHeader(String name) { for (String headerName : header.keySet()) { if (headerName.equalsIgnoreCase(name)) { return header.get(headerName); } } return null; } public String getMimeType() { return this.mimeType; } public Method getRequestMethod() { return this.requestMethod; } public IStatus getStatus() { return this.status; } public void setGzipEncoding(boolean encodeAsGzip) { this.encodeAsGzip = encodeAsGzip; } public void setKeepAlive(boolean useKeepAlive) { this.keepAlive = useKeepAlive; } private boolean headerAlreadySent(Map<String, String> header, String name) { boolean alreadySent = false; for (String headerName : header.keySet()) { alreadySent |= headerName.equalsIgnoreCase(name); } return alreadySent; } /** * Sends given response to the socket. 
*/ protected void send(OutputStream outputStream) { String mime = this.mimeType; SimpleDateFormat gmtFrmt = new SimpleDateFormat("E, d MMM yyyy HH:mm:ss 'GMT'", Locale.US); gmtFrmt.setTimeZone(TimeZone.getTimeZone("GMT")); try { if (this.status == null) { throw new Error("sendResponse(): Status can't be null."); } PrintWriter pw = new PrintWriter(new BufferedWriter(new OutputStreamWriter(outputStream, "UTF-8")), false); pw.print("HTTP/1.1 " + this.status.getDescription() + " \r\n"); if (mime != null) { pw.print("Content-Type: " + mime + "\r\n"); } if (this.header == null || this.header.get("Date") == null) { pw.print("Date: " + gmtFrmt.format(new Date()) + "\r\n"); } if (this.header != null) { for (String key : this.header.keySet()) { String value = this.header.get(key); pw.print(key + ": " + value + "\r\n"); } } if (!headerAlreadySent(header, "connection")) { pw.print("Connection: " + (this.keepAlive ? "keep-alive" : "close") + "\r\n"); } if (headerAlreadySent(this.header, "content-length")) { encodeAsGzip = false; } if (encodeAsGzip) { pw.print("Content-Encoding: gzip\r\n"); setChunkedTransfer(true); } long pending = this.data != null ? 
this.contentLength : 0; if (this.requestMethod != Method.HEAD && this.chunkedTransfer) { pw.print("Transfer-Encoding: chunked\r\n"); } else if (!encodeAsGzip) { pending = sendContentLengthHeaderIfNotAlreadyPresent(pw, this.header, pending); } pw.print("\r\n"); pw.flush(); sendBodyWithCorrectTransferAndEncoding(outputStream, pending); outputStream.flush(); safeClose(this.data); } catch (IOException ioe) { NanoHTTPD.LOG.log(Level.SEVERE, "Could not send response to the client", ioe); } } private void sendBodyWithCorrectTransferAndEncoding(OutputStream outputStream, long pending) throws IOException { if (this.requestMethod != Method.HEAD && this.chunkedTransfer) { ChunkedOutputStream chunkedOutputStream = new ChunkedOutputStream(outputStream); sendBodyWithCorrectEncoding(chunkedOutputStream, -1); chunkedOutputStream.finish(); } else { sendBodyWithCorrectEncoding(outputStream, pending); } } private void sendBodyWithCorrectEncoding(OutputStream outputStream, long pending) throws IOException { if (encodeAsGzip) { GZIPOutputStream gzipOutputStream = new GZIPOutputStream(outputStream); sendBody(gzipOutputStream, -1); gzipOutputStream.finish(); } else { sendBody(outputStream, pending); } } /** * Sends the body to the specified OutputStream. The pending parameter * limits the maximum amounts of bytes sent unless it is -1, in which * case everything is sent. * * @param outputStream * the OutputStream to send data to * @param pending * -1 to send everything, otherwise sets a max limit to the * number of bytes sent * @throws IOException * if something goes wrong while sending the data. */ private void sendBody(OutputStream outputStream, long pending) throws IOException { long BUFFER_SIZE = 16 * 1024; byte[] buff = new byte[(int) BUFFER_SIZE]; boolean sendEverything = pending == -1; while (pending > 0 || sendEverything) { long bytesToRead = sendEverything ? 
BUFFER_SIZE : Math.min(pending, BUFFER_SIZE); int read = this.data.read(buff, 0, (int) bytesToRead); if (read <= 0) { break; } outputStream.write(buff, 0, read); if (!sendEverything) { pending -= read; } } } protected long sendContentLengthHeaderIfNotAlreadyPresent(PrintWriter pw, Map<String, String> header, long size) { for (String headerName : header.keySet()) { if (headerName.equalsIgnoreCase("content-length")) { try { return Long.parseLong(header.get(headerName)); } catch (NumberFormatException ex) { return size; } } } pw.print("Content-Length: " + size + "\r\n"); return size; } public void setChunkedTransfer(boolean chunkedTransfer) { this.chunkedTransfer = chunkedTransfer; } public void setData(InputStream data) { this.data = data; } public void setMimeType(String mimeType) { this.mimeType = mimeType; } public void setRequestMethod(Method requestMethod) { this.requestMethod = requestMethod; } public void setStatus(IStatus status) { this.status = status; } } public static final class ResponseException extends Exception { private static final long serialVersionUID = 6569838532917408380L; private final Response.Status status; public ResponseException(Response.Status status, String message) { super(message); this.status = status; } public ResponseException(Response.Status status, String message, Exception e) { super(message, e); this.status = status; } public Response.Status getStatus() { return this.status; } } /** * The runnable that will be used for the main listening thread. */ public class ServerRunnable implements Runnable { private final int timeout; private IOException bindException; private boolean hasBinded = false; private ServerRunnable(int timeout) { this.timeout = timeout; } @Override public void run() { try { myServerSocket.bind(hostname != null ? 
new InetSocketAddress(hostname, myPort) : new InetSocketAddress(myPort)); hasBinded = true; } catch (IOException e) { this.bindException = e; return; } do { try { final Socket finalAccept = NanoHTTPD.this.myServerSocket.accept(); if (this.timeout > 0) { finalAccept.setSoTimeout(this.timeout); } final InputStream inputStream = finalAccept.getInputStream(); NanoHTTPD.this.asyncRunner.exec(createClientHandler(finalAccept, inputStream)); } catch (IOException e) { NanoHTTPD.LOG.log(Level.FINE, "Communication with the client broken", e); } } while (!NanoHTTPD.this.myServerSocket.isClosed()); } } /** * A temp file. * <p/> * <p> * Temp files are responsible for managing the actual temporary storage and * cleaning themselves up when no longer needed. * </p> */ public interface TempFile { void delete() throws Exception; String getName(); OutputStream open() throws Exception; } /** * Temp file manager. * <p/> * <p> * Temp file managers are created 1-to-1 with incoming requests, to create * and cleanup temporary files created as a result of handling the request. * </p> */ public interface TempFileManager { void clear(); TempFile createTempFile() throws Exception; } /** * Factory to create temp file managers. */ public interface TempFileManagerFactory { TempFileManager create(); } /** * Maximum time to wait on Socket.getInputStream().read() (in milliseconds) * This is required as the Keep-Alive HTTP connections would otherwise block * the socket reading thread forever (or as long the browser is open). */ public static final int SOCKET_READ_TIMEOUT = 5000; /** * Common MIME type for dynamic content: plain text */ public static final String MIME_PLAINTEXT = "text/plain"; /** * Common MIME type for dynamic content: html */ public static final String MIME_HTML = "text/html"; /** * Pseudo-Parameter to use to store the actual query string in the * parameters map for later re-processing. 
*/ private static final String QUERY_STRING_PARAMETER = "NanoHttpd.QUERY_STRING"; /** * logger to log to. */ private static final Logger LOG = Logger.getLogger(NanoHTTPD.class.getName()); /** * Creates an SSLSocketFactory for HTTPS. Pass a loaded KeyStore and an * array of loaded KeyManagers. These objects must be properly * loaded/initialized by the caller. */ public static SSLServerSocketFactory makeSSLSocketFactory(KeyStore loadedKeyStore, KeyManager[] keyManagers) throws IOException { SSLServerSocketFactory res = null; try { TrustManagerFactory trustManagerFactory = TrustManagerFactory.getInstance(TrustManagerFactory.getDefaultAlgorithm()); trustManagerFactory.init(loadedKeyStore); SSLContext ctx = SSLContext.getInstance("TLS"); ctx.init(keyManagers, trustManagerFactory.getTrustManagers(), null); res = ctx.getServerSocketFactory(); } catch (Exception e) { throw new IOException(e.getMessage()); } return res; } /** * Creates an SSLSocketFactory for HTTPS. Pass a loaded KeyStore and a * loaded KeyManagerFactory. These objects must be properly loaded/initialized * by the caller. */ public static SSLServerSocketFactory makeSSLSocketFactory(KeyStore loadedKeyStore, KeyManagerFactory loadedKeyFactory) throws IOException { SSLServerSocketFactory res = null; try { TrustManagerFactory trustManagerFactory = TrustManagerFactory.getInstance(TrustManagerFactory.getDefaultAlgorithm()); trustManagerFactory.init(loadedKeyStore); SSLContext ctx = SSLContext.getInstance("TLS"); ctx.init(loadedKeyFactory.getKeyManagers(), trustManagerFactory.getTrustManagers(), null); res = ctx.getServerSocketFactory(); } catch (Exception e) { throw new IOException(e.getMessage()); } return res; } /** * Creates an SSLSocketFactory for HTTPS. 
Pass a KeyStore resource with your * certificate and passphrase */ public static SSLServerSocketFactory makeSSLSocketFactory(String keyAndTrustStoreClasspathPath, char[] passphrase) throws IOException { SSLServerSocketFactory res = null; try { KeyStore keystore = KeyStore.getInstance(KeyStore.getDefaultType()); InputStream keystoreStream = NanoHTTPD.class.getResourceAsStream(keyAndTrustStoreClasspathPath); keystore.load(keystoreStream, passphrase); TrustManagerFactory trustManagerFactory = TrustManagerFactory.getInstance(TrustManagerFactory.getDefaultAlgorithm()); trustManagerFactory.init(keystore); KeyManagerFactory keyManagerFactory = KeyManagerFactory.getInstance(KeyManagerFactory.getDefaultAlgorithm()); keyManagerFactory.init(keystore, passphrase); SSLContext ctx = SSLContext.getInstance("TLS"); ctx.init(keyManagerFactory.getKeyManagers(), trustManagerFactory.getTrustManagers(), null); res = ctx.getServerSocketFactory(); } catch (Exception e) { throw new IOException(e.getMessage()); } return res; } private static final void safeClose(Object closeable) { try { if (closeable != null) { if (closeable instanceof Closeable) { ((Closeable) closeable).close(); } else if (closeable instanceof Socket) { ((Socket) closeable).close(); } else if (closeable instanceof ServerSocket) { ((ServerSocket) closeable).close(); } else { throw new IllegalArgumentException("Unknown object to close"); } } } catch (IOException e) { NanoHTTPD.LOG.log(Level.SEVERE, "Could not close", e); } } private final String hostname; private final int myPort; private ServerSocket myServerSocket; private SSLServerSocketFactory sslServerSocketFactory; private Thread myThread; /** * Pluggable strategy for asynchronously executing requests. */ protected AsyncRunner asyncRunner; /** * Pluggable strategy for creating and cleaning up temporary files. */ private TempFileManagerFactory tempFileManagerFactory; /** * Constructs an HTTP server on given port. 
*/ public NanoHTTPD(int port) { this(null, port); } // ------------------------------------------------------------------------------- // // // // Threading Strategy. // // ------------------------------------------------------------------------------- // // /** * Constructs an HTTP server on given hostname and port. */ public NanoHTTPD(String hostname, int port) { this.hostname = hostname; this.myPort = port; setTempFileManagerFactory(new DefaultTempFileManagerFactory()); setAsyncRunner(new DefaultAsyncRunner()); } /** * Forcibly closes all connections that are open. */ public synchronized void closeAllConnections() { stop(); } /** * Creates an instance of the client handler; subclasses can return a subclass * of ClientHandler. * * @param finalAccept * the socket the client is connected to * @param inputStream * the input stream * @return the client handler */ protected ClientHandler createClientHandler(final Socket finalAccept, final InputStream inputStream) { return new ClientHandler(inputStream, finalAccept); } /** * Instantiate the server runnable; can be overridden by subclasses to * provide a subclass of the ServerRunnable. * * @param timeout * the socket timeout to use. * @return the server runnable. */ protected ServerRunnable createServerRunnable(final int timeout) { return new ServerRunnable(timeout); } /** * Decode parameters from a URL, handling the case where a single parameter * name might have been supplied several times, by returning lists of values. * In general these lists will contain a single element. * * @param parms * original <b>NanoHTTPD</b> parameter values, as passed to the * <code>serve()</code> method. * @return a map of <code>String</code> (parameter name) to * <code>List&lt;String&gt;</code> (a list of the values supplied). 
*/ protected Map<String, List<String>> decodeParameters(Map<String, String> parms) { return this.decodeParameters(parms.get(NanoHTTPD.QUERY_STRING_PARAMETER)); } // ------------------------------------------------------------------------------- // // /** * Decode parameters from a URL, handling the case where a single parameter * name might have been supplied several times, by returning lists of values. * In general these lists will contain a single element. * * @param queryString * a query string pulled from the URL. * @return a map of <code>String</code> (parameter name) to * <code>List&lt;String&gt;</code> (a list of the values supplied). */ protected Map<String, List<String>> decodeParameters(String queryString) { Map<String, List<String>> parms = new HashMap<String, List<String>>(); if (queryString != null) { StringTokenizer st = new StringTokenizer(queryString, "&"); while (st.hasMoreTokens()) { String e = st.nextToken(); int sep = e.indexOf('='); String propertyName = sep >= 0 ? decodePercent(e.substring(0, sep)).trim() : decodePercent(e).trim(); if (!parms.containsKey(propertyName)) { parms.put(propertyName, new ArrayList<String>()); } String propertyValue = sep >= 0 ? decodePercent(e.substring(sep + 1)) : null; if (propertyValue != null) { parms.get(propertyName).add(propertyValue); } } } return parms; } /** * Decode percent encoded <code>String</code> values. * * @param str * the percent encoded <code>String</code> * @return expanded form of the input, for example "foo%20bar" becomes * "foo bar" */ protected String decodePercent(String str) { String decoded = null; try { decoded = URLDecoder.decode(str, "UTF8"); } catch (UnsupportedEncodingException ignored) { NanoHTTPD.LOG.log(Level.WARNING, "Encoding not supported, ignored", ignored); } return decoded; } /** * @return true if the gzip compression should be used if the client * accepts it. By default this option is on for text content and off * for everything else. 
*/ protected boolean useGzipWhenAccepted(Response r) { return r.getMimeType() != null && r.getMimeType().toLowerCase().contains("text/"); } public final int getListeningPort() { return this.myServerSocket == null ? -1 : this.myServerSocket.getLocalPort(); } public final boolean isAlive() { return wasStarted() && !this.myServerSocket.isClosed() && this.myThread.isAlive(); } /** * Call before start() to serve over HTTPS instead of HTTP */ public void makeSecure(SSLServerSocketFactory sslServerSocketFactory) { this.sslServerSocketFactory = sslServerSocketFactory; } /** * Create a response with unknown length (using HTTP 1.1 chunking). */ public Response newChunkedResponse(IStatus status, String mimeType, InputStream data) { return new Response(status, mimeType, data, -1); } /** * Create a response with known length. */ public Response newFixedLengthResponse(IStatus status, String mimeType, InputStream data, long totalBytes) { return new Response(status, mimeType, data, totalBytes); } /** * Create a text response with known length. */ public Response newFixedLengthResponse(IStatus status, String mimeType, String txt) { if (txt == null) { return newFixedLengthResponse(status, mimeType, new ByteArrayInputStream(new byte[0]), 0); } else { byte[] bytes; try { bytes = txt.getBytes("UTF-8"); } catch (UnsupportedEncodingException e) { NanoHTTPD.LOG.log(Level.SEVERE, "encoding problem, responding nothing", e); bytes = new byte[0]; } return newFixedLengthResponse(status, mimeType, new ByteArrayInputStream(bytes), bytes.length); } } /** * Create a text response with known length. */ public Response newFixedLengthResponse(String msg) { return newFixedLengthResponse(Status.OK, NanoHTTPD.MIME_HTML, msg); } /** * Override this to customize the server. * <p/> * <p/> * (By default, this returns a 404 "Not Found" plain text error response.) 
* * @param session * The HTTP session * @return HTTP response, see class Response for details */ public Response serve(IHTTPSession session) { Map<String, String> files = new HashMap<String, String>(); Method method = session.getMethod(); if (Method.PUT.equals(method) || Method.POST.equals(method)) { try { session.parseBody(files); } catch (IOException ioe) { return newFixedLengthResponse(Response.Status.INTERNAL_ERROR, NanoHTTPD.MIME_PLAINTEXT, "SERVER INTERNAL ERROR: IOException: " + ioe.getMessage()); } catch (ResponseException re) { return newFixedLengthResponse(re.getStatus(), NanoHTTPD.MIME_PLAINTEXT, re.getMessage()); } } Map<String, String> parms = session.getParms(); parms.put(NanoHTTPD.QUERY_STRING_PARAMETER, session.getQueryParameterString()); return serve(session.getUri(), method, session.getHeaders(), parms, files); } /** * Override this to customize the server. * <p/> * <p/> * (By default, this returns a 404 "Not Found" plain text error response.) * * @param uri * Percent-decoded URI without parameters, for example * "/index.cgi" * @param method * "GET", "POST" etc. * @param parms * Parsed, percent decoded parameters from URI and, in case of * POST, data. * @param headers * Header entries, percent decoded * @return HTTP response, see class Response for details */ @Deprecated public Response serve(String uri, Method method, Map<String, String> headers, Map<String, String> parms, Map<String, String> files) { return newFixedLengthResponse(Response.Status.NOT_FOUND, NanoHTTPD.MIME_PLAINTEXT, "Not Found"); } /** * Pluggable strategy for asynchronously executing requests. * * @param asyncRunner * new strategy for handling threads. */ public void setAsyncRunner(AsyncRunner asyncRunner) { this.asyncRunner = asyncRunner; } /** * Pluggable strategy for creating and cleaning up temporary files. * * @param tempFileManagerFactory * new strategy for handling temp files. 
*/ public void setTempFileManagerFactory(TempFileManagerFactory tempFileManagerFactory) { this.tempFileManagerFactory = tempFileManagerFactory; } /** * Start the server. * * @throws IOException * if the socket is in use. */ public void start() throws IOException { start(NanoHTTPD.SOCKET_READ_TIMEOUT); } /** * Start the server. * * @param timeout * timeout to use for socket connections. * @throws IOException * if the socket is in use. */ public void start(final int timeout) throws IOException { if (this.sslServerSocketFactory != null) { SSLServerSocket ss = (SSLServerSocket) this.sslServerSocketFactory.createServerSocket(); ss.setNeedClientAuth(false); this.myServerSocket = ss; } else { this.myServerSocket = new ServerSocket(); } this.myServerSocket.setReuseAddress(true); ServerRunnable serverRunnable = createServerRunnable(timeout); this.myThread = new Thread(serverRunnable); this.myThread.setDaemon(true); this.myThread.setName("NanoHttpd Main Listener"); this.myThread.start(); while (!serverRunnable.hasBinded && serverRunnable.bindException == null) { try { Thread.sleep(10L); } catch (Throwable e) { // on android this may not be allowed, that's why we // catch throwable the wait should be very short because we are // just waiting for the bind of the socket } } if (serverRunnable.bindException != null) { throw serverRunnable.bindException; } } /** * Stop the server. */ public void stop() { try { safeClose(this.myServerSocket); this.asyncRunner.closeAll(); if (this.myThread != null) { this.myThread.join(); } } catch (Exception e) { NanoHTTPD.LOG.log(Level.SEVERE, "Could not stop all connections", e); } } public final boolean wasStarted() { return this.myServerSocket != null && this.myThread != null; } }
{ "content_hash": "1782c9d68c3f770924fe142860a6e0b8", "timestamp": "", "source": "github", "line_count": 1930, "max_line_length": 174, "avg_line_length": 36.766321243523315, "alnum_prop": 0.547781113037106, "repo_name": "biswajitind/SCMF", "id": "647f58b4b85e269dde9fd9f5c0a0b2abff17882b", "size": "72556", "binary": false, "copies": "4", "ref": "refs/heads/master", "path": "core/src/main/java/fi/iki/elonen/NanoHTTPD.java", "mode": "33188", "license": "bsd-3-clause", "language": [ { "name": "HTML", "bytes": "7786" }, { "name": "Java", "bytes": "264719" } ] }
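The `ChunkedOutputStream` inside the NanoHTTPD source above frames every write according to HTTP/1.1 chunked transfer coding: each chunk is the payload length in hex, CRLF, the payload bytes, CRLF, and `finish()` emits the terminating zero-length chunk. A minimal standalone sketch of that framing (the `ChunkedFramer`/`ChunkedDemo` names are hypothetical, not part of NanoHTTPD):

```java
import java.io.ByteArrayOutputStream;
import java.io.FilterOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;

// Mirrors NanoHTTPD's ChunkedOutputStream: every write becomes
// "<hex-size>\r\n<data>\r\n", and finish() writes the last-chunk marker.
class ChunkedFramer extends FilterOutputStream {
    ChunkedFramer(OutputStream out) { super(out); }

    @Override
    public void write(int b) throws IOException {
        byte[] data = { (byte) b };
        write(data, 0, 1);
    }

    @Override
    public void write(byte[] b, int off, int len) throws IOException {
        if (len == 0) return; // a zero-length chunk would signal end of body
        out.write(String.format("%x\r\n", len).getBytes(StandardCharsets.US_ASCII));
        out.write(b, off, len);
        out.write("\r\n".getBytes(StandardCharsets.US_ASCII));
    }

    void finish() throws IOException {
        out.write("0\r\n\r\n".getBytes(StandardCharsets.US_ASCII));
    }
}

public class ChunkedDemo {
    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        ChunkedFramer framer = new ChunkedFramer(sink);
        byte[] body = "Hello".getBytes(StandardCharsets.US_ASCII);
        framer.write(body, 0, body.length);
        framer.finish();
        // 5 payload bytes -> "5\r\nHello\r\n", then the "0\r\n\r\n" terminator
        System.out.println(sink.toString("US-ASCII").replace("\r\n", "\\r\\n"));
    }
}
```

This is why the server's `send()` sets `Transfer-Encoding: chunked` only when the content length is unknown: the framing itself delimits the body, so no `Content-Length` header is needed.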
<!DOCTYPE html> <html lang="en"> <head> <meta charset="utf-8"> <meta name="viewport" content="width=device-width, initial-scale=1"> <title>rupicola: Not compatible 👼</title> <link rel="shortcut icon" type="image/png" href="../../../../../favicon.png" /> <link href="../../../../../bootstrap.min.css" rel="stylesheet"> <link href="../../../../../bootstrap-custom.css" rel="stylesheet"> <link href="//maxcdn.bootstrapcdn.com/font-awesome/4.2.0/css/font-awesome.min.css" rel="stylesheet"> <script src="../../../../../moment.min.js"></script> <!-- HTML5 Shim and Respond.js IE8 support of HTML5 elements and media queries --> <!-- WARNING: Respond.js doesn't work if you view the page via file:// --> <!--[if lt IE 9]> <script src="https://oss.maxcdn.com/html5shiv/3.7.2/html5shiv.min.js"></script> <script src="https://oss.maxcdn.com/respond/1.4.2/respond.min.js"></script> <![endif]--> </head> <body> <div class="container"> <div class="navbar navbar-default" role="navigation"> <div class="container-fluid"> <div class="navbar-header"> <a class="navbar-brand" href="../../../../.."><i class="fa fa-lg fa-flag-checkered"></i> Coq bench</a> </div> <div id="navbar" class="collapse navbar-collapse"> <ul class="nav navbar-nav"> <li><a href="../..">clean / released</a></li> <li class="active"><a href="">8.6 / rupicola - 0.0.6</a></li> </ul> </div> </div> </div> <div class="article"> <div class="row"> <div class="col-md-12"> <a href="../..">« Up</a> <h1> rupicola <small> 0.0.6 <span class="label label-info">Not compatible 👼</span> </small> </h1> <p>📅 <em><script>document.write(moment("2022-11-21 00:31:44 +0000", "YYYY-MM-DD HH:mm:ss Z").fromNow());</script> (2022-11-21 00:31:44 UTC)</em><p> <h2>Context</h2> <pre># Packages matching: installed # Name # Installed # Synopsis base-bigarray base base-num base Num library distributed with the OCaml compiler base-ocamlbuild base OCamlbuild binary and libraries distributed with the OCaml compiler base-threads base base-unix base camlp5 7.14 
Preprocessor-pretty-printer of OCaml conf-findutils 1 Virtual package relying on findutils conf-perl 2 Virtual package relying on perl coq 8.6 Formal proof management system num 0 The Num library for arbitrary-precision integer and rational arithmetic ocaml 4.02.3 The OCaml compiler (virtual package) ocaml-base-compiler 4.02.3 Official 4.02.3 release ocaml-config 1 OCaml Switch Configuration ocamlfind 1.9.5 A library manager for OCaml # opam file: opam-version: &quot;2.0&quot; authors: [ &quot;Clément Pit-Claudel &lt;clement.pitclaudel@live.com&gt;&quot; &quot;Jade Philipoom&quot; &quot;Dustin Jamner&quot; &quot;Andres Erbsen&quot; &quot;Adam Chlipala&quot; ] maintainer: &quot;Jason Gross &lt;jgross@mit.edu&gt;&quot; homepage: &quot;https://github.com/mit-plv/rupicola&quot; bug-reports: &quot;https://github.com/mit-plv/rupicola/issues&quot; license: &quot;MIT&quot; build: [ [make &quot;-j%{jobs}%&quot; &quot;EXTERNAL_DEPENDENCIES=1&quot; &quot;all&quot;] ] install: [make &quot;EXTERNAL_DEPENDENCIES=1&quot; &quot;install&quot;] depends: [ &quot;conf-findutils&quot; {build} &quot;coq&quot; {&gt;= &quot;8.15~&quot;} &quot;coq-bedrock2&quot; {&gt;= &quot;0.0.2&quot; &amp; &lt;= &quot;0.0.4&quot;} ] dev-repo: &quot;git+https://github.com/mit-plv/rupicola.git&quot; synopsis: &quot;Gallina to imperative code compilation, currently in design phase&quot; tags: [&quot;logpath:Rupicola&quot;] url { src: &quot;https://github.com/mit-plv/rupicola/archive/refs/tags/v0.0.6.tar.gz&quot; checksum: &quot;sha512=eca8735a2d741d8a759f46992b66cd1c1d00d228e7ac3cc6d8e4a6b740ee650485cf241336ea7e383f8a5f928e42d7212690c6f5f7d2121e9f1a3037369f0e90&quot; } </pre> <h2>Lint</h2> <dl class="dl-horizontal"> <dt>Command</dt> <dd><code>true</code></dd> <dt>Return code</dt> <dd>0</dd> </dl> <h2>Dry install 🏜️</h2> <p>Dry install with the current Coq version:</p> <dl class="dl-horizontal"> <dt>Command</dt> <dd><code>opam install -y --show-action coq-rupicola.0.0.6 coq.8.6</code></dd> <dt>Return 
code</dt> <dd>5120</dd> <dt>Output</dt> <dd><pre>[NOTE] Package coq is already installed (current version is 8.6). The following dependencies couldn&#39;t be met: - coq-rupicola -&gt; coq &gt;= 8.15~ -&gt; ocaml &gt;= 4.05.0 base of this switch (use `--unlock-base&#39; to force) No solution found, exiting </pre></dd> </dl> <p>Dry install without Coq/switch base, to test if the problem was incompatibility with the current Coq/OCaml version:</p> <dl class="dl-horizontal"> <dt>Command</dt> <dd><code>opam remove -y coq; opam install -y --show-action --unlock-base coq-rupicola.0.0.6</code></dd> <dt>Return code</dt> <dd>0</dd> </dl> <h2>Install dependencies</h2> <dl class="dl-horizontal"> <dt>Command</dt> <dd><code>true</code></dd> <dt>Return code</dt> <dd>0</dd> <dt>Duration</dt> <dd>0 s</dd> </dl> <h2>Install 🚀</h2> <dl class="dl-horizontal"> <dt>Command</dt> <dd><code>true</code></dd> <dt>Return code</dt> <dd>0</dd> <dt>Duration</dt> <dd>0 s</dd> </dl> <h2>Installation size</h2> <p>No files were installed.</p> <h2>Uninstall 🧹</h2> <dl class="dl-horizontal"> <dt>Command</dt> <dd><code>true</code></dd> <dt>Return code</dt> <dd>0</dd> <dt>Missing removes</dt> <dd> none </dd> <dt>Wrong removes</dt> <dd> none </dd> </dl> </div> </div> </div> <hr/> <div class="footer"> <p class="text-center"> Sources are on <a href="https://github.com/coq-bench">GitHub</a> © Guillaume Claret 🐣 </p> </div> </div> <script src="https://ajax.googleapis.com/ajax/libs/jquery/1.11.1/jquery.min.js"></script> <script src="../../../../../bootstrap.min.js"></script> </body> </html>
{ "content_hash": "811ca4f0fa3653b6223870be6f09ad98", "timestamp": "", "source": "github", "line_count": 170, "max_line_length": 159, "avg_line_length": 41.50588235294118, "alnum_prop": 0.5446428571428571, "repo_name": "coq-bench/coq-bench.github.io", "id": "f2b96bf70737f4960fe0a8c9999e71374738dd39", "size": "7082", "binary": false, "copies": "1", "ref": "refs/heads/master", "path": "clean/Linux-x86_64-4.02.3-2.0.6/released/8.6/rupicola/0.0.6.html", "mode": "33188", "license": "mit", "language": [] }
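The coq-bench report above fails dependency resolution because the installed `coq 8.6` does not satisfy `coq-rupicola`'s constraint `coq >= 8.15~`: package versions compare numerically per dot-separated component (6 < 15), not as strings. A simplified sketch of that ordering (real opam version ordering, including the `~` suffix, is richer; `VersionDemo` is a hypothetical name):

```java
// Component-wise numeric version comparison: "8.6" is OLDER than "8.15",
// even though plain string comparison would claim the opposite.
public class VersionDemo {
    static int compare(String a, String b) {
        String[] xs = a.split("\\."), ys = b.split("\\.");
        int n = Math.max(xs.length, ys.length);
        for (int i = 0; i < n; i++) {
            // Missing components count as 0, so "8" == "8.0".
            int x = i < xs.length ? Integer.parseInt(xs[i]) : 0;
            int y = i < ys.length ? Integer.parseInt(ys[i]) : 0;
            if (x != y) return Integer.compare(x, y);
        }
        return 0;
    }

    public static void main(String[] args) {
        System.out.println(compare("8.6", "8.15") < 0);   // numeric: 8.6 < 8.15
        System.out.println("8.6".compareTo("8.15") < 0);  // lexicographic order misleads
    }
}
```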
using namespace llvm; namespace { typedef SmallString<COFF::NameSize> name; enum AuxiliaryType { ATFunctionDefinition, ATbfAndefSymbol, ATWeakExternal, ATFile, ATSectionDefinition }; struct AuxSymbol { AuxiliaryType AuxType; COFF::Auxiliary Aux; }; class COFFSymbol; class COFFSection; class COFFSymbol { public: COFF::symbol Data; typedef SmallVector<AuxSymbol, 1> AuxiliarySymbols; name Name; int Index; AuxiliarySymbols Aux; COFFSymbol *Other; COFFSection *Section; int Relocations; MCSymbolData const *MCData; COFFSymbol(StringRef name); size_t size() const; void set_name_offset(uint32_t Offset); bool should_keep() const; }; // This class contains staging data for a COFF relocation entry. struct COFFRelocation { COFF::relocation Data; COFFSymbol *Symb; COFFRelocation() : Symb(NULL) {} static size_t size() { return COFF::RelocationSize; } }; typedef std::vector<COFFRelocation> relocations; class COFFSection { public: COFF::section Header; std::string Name; int Number; MCSectionData const *MCData; COFFSymbol *Symbol; relocations Relocations; COFFSection(StringRef name); static size_t size(); }; // This class holds the COFF string table. class StringTable { typedef StringMap<size_t> map; map Map; void update_length(); public: std::vector<char> Data; StringTable(); size_t size() const; size_t insert(StringRef String); }; class WinCOFFObjectWriter : public MCObjectWriter { public: typedef std::vector<COFFSymbol*> symbols; typedef std::vector<COFFSection*> sections; typedef DenseMap<MCSymbol const *, COFFSymbol *> symbol_map; typedef DenseMap<MCSection const *, COFFSection *> section_map; llvm::OwningPtr<MCWinCOFFObjectTargetWriter> TargetObjectWriter; // Root level file contents. COFF::header Header; sections Sections; symbols Symbols; StringTable Strings; // Maps used during object file creation. 
section_map SectionMap; symbol_map SymbolMap; WinCOFFObjectWriter(MCWinCOFFObjectTargetWriter *MOTW, raw_ostream &OS); virtual ~WinCOFFObjectWriter(); COFFSymbol *createSymbol(StringRef Name); COFFSymbol *GetOrCreateCOFFSymbol(const MCSymbol * Symbol); COFFSection *createSection(StringRef Name); template <typename object_t, typename list_t> object_t *createCOFFEntity(StringRef Name, list_t &List); void DefineSection(MCSectionData const &SectionData); void DefineSymbol(MCSymbolData const &SymbolData, MCAssembler &Assembler, const MCAsmLayout &Layout); void MakeSymbolReal(COFFSymbol &S, size_t Index); void MakeSectionReal(COFFSection &S, size_t Number); bool ExportSymbol(MCSymbolData const &SymbolData, MCAssembler &Asm); bool IsPhysicalSection(COFFSection *S); // Entity writing methods. void WriteFileHeader(const COFF::header &Header); void WriteSymbol(const COFFSymbol *S); void WriteAuxiliarySymbols(const COFFSymbol::AuxiliarySymbols &S); void WriteSectionHeader(const COFF::section &S); void WriteRelocation(const COFF::relocation &R); // MCObjectWriter interface implementation. 
void ExecutePostLayoutBinding(MCAssembler &Asm, const MCAsmLayout &Layout); void RecordRelocation(const MCAssembler &Asm, const MCAsmLayout &Layout, const MCFragment *Fragment, const MCFixup &Fixup, MCValue Target, uint64_t &FixedValue); void WriteObject(MCAssembler &Asm, const MCAsmLayout &Layout); }; } static inline void write_uint32_le(void *Data, uint32_t const &Value) { uint8_t *Ptr = reinterpret_cast<uint8_t *>(Data); Ptr[0] = (Value & 0x000000FF) >> 0; Ptr[1] = (Value & 0x0000FF00) >> 8; Ptr[2] = (Value & 0x00FF0000) >> 16; Ptr[3] = (Value & 0xFF000000) >> 24; } //------------------------------------------------------------------------------ // Symbol class implementation COFFSymbol::COFFSymbol(StringRef name) : Name(name.begin(), name.end()) , Other(NULL) , Section(NULL) , Relocations(0) , MCData(NULL) { memset(&Data, 0, sizeof(Data)); } size_t COFFSymbol::size() const { return COFF::SymbolSize + (Data.NumberOfAuxSymbols * COFF::SymbolSize); } // In the case that the name does not fit within 8 bytes, the offset // into the string table is stored in the last 4 bytes instead, leaving // the first 4 bytes as 0. 
void COFFSymbol::set_name_offset(uint32_t Offset) { write_uint32_le(Data.Name + 0, 0); write_uint32_le(Data.Name + 4, Offset); } /// logic to decide if the symbol should be reported in the symbol table bool COFFSymbol::should_keep() const { // no section means it's external, keep it if (Section == NULL) return true; // if it has relocations pointing at it, keep it if (Relocations > 0) { assert(Section->Number != -1 && "Sections with relocations must be real!"); return true; } // if the section it's in is being dropped, drop it if (Section->Number == -1) return false; // if it is the section symbol, keep it if (Section->Symbol == this) return true; // if it's temporary, drop it if (MCData && MCData->getSymbol().isTemporary()) return false; // otherwise, keep it return true; } //------------------------------------------------------------------------------ // Section class implementation COFFSection::COFFSection(StringRef name) : Name(name) , MCData(NULL) , Symbol(NULL) { memset(&Header, 0, sizeof(Header)); } size_t COFFSection::size() { return COFF::SectionSize; } //------------------------------------------------------------------------------ // StringTable class implementation /// Write the length of the string table into Data. /// The length of the string table includes the uint32 length header. void StringTable::update_length() { write_uint32_le(&Data.front(), Data.size()); } StringTable::StringTable() { // The string table data begins with the length of the entire string table // including the length header. Allocate space for this header. Data.resize(4); update_length(); } size_t StringTable::size() const { return Data.size(); } /// Add String to the table iff it is not already there. /// @returns the index into the string table where the string is now located. size_t StringTable::insert(StringRef String) { map::iterator i = Map.find(String); if (i != Map.end()) return i->second; size_t Offset = Data.size(); // Insert string data into string table. 
Data.insert(Data.end(), String.begin(), String.end()); Data.push_back('\0'); // Put a reference to it in the map. Map[String] = Offset; // Update the internal length field. update_length(); return Offset; } //------------------------------------------------------------------------------ // WinCOFFObjectWriter class implementation WinCOFFObjectWriter::WinCOFFObjectWriter(MCWinCOFFObjectTargetWriter *MOTW, raw_ostream &OS) : MCObjectWriter(OS, true) , TargetObjectWriter(MOTW) { memset(&Header, 0, sizeof(Header)); Header.Machine = TargetObjectWriter->getMachine(); } WinCOFFObjectWriter::~WinCOFFObjectWriter() { for (symbols::iterator I = Symbols.begin(), E = Symbols.end(); I != E; ++I) delete *I; for (sections::iterator I = Sections.begin(), E = Sections.end(); I != E; ++I) delete *I; } COFFSymbol *WinCOFFObjectWriter::createSymbol(StringRef Name) { return createCOFFEntity<COFFSymbol>(Name, Symbols); } COFFSymbol *WinCOFFObjectWriter::GetOrCreateCOFFSymbol(const MCSymbol * Symbol){ symbol_map::iterator i = SymbolMap.find(Symbol); if (i != SymbolMap.end()) return i->second; COFFSymbol *RetSymbol = createCOFFEntity<COFFSymbol>(Symbol->getName(), Symbols); SymbolMap[Symbol] = RetSymbol; return RetSymbol; } COFFSection *WinCOFFObjectWriter::createSection(StringRef Name) { return createCOFFEntity<COFFSection>(Name, Sections); } /// A template used to lookup or create a symbol/section, and initialize it if /// needed. template <typename object_t, typename list_t> object_t *WinCOFFObjectWriter::createCOFFEntity(StringRef Name, list_t &List) { object_t *Object = new object_t(Name); List.push_back(Object); return Object; } /// This function takes a section data object from the assembler /// and creates the associated COFF section staging object. 
void WinCOFFObjectWriter::DefineSection(MCSectionData const &SectionData) { assert(SectionData.getSection().getVariant() == MCSection::SV_COFF && "Got non COFF section in the COFF backend!"); // FIXME: Not sure how to verify this (at least in a debug build). MCSectionCOFF const &Sec = static_cast<MCSectionCOFF const &>(SectionData.getSection()); COFFSection *coff_section = createSection(Sec.getSectionName()); COFFSymbol *coff_symbol = createSymbol(Sec.getSectionName()); coff_section->Symbol = coff_symbol; coff_symbol->Section = coff_section; coff_symbol->Data.StorageClass = COFF::IMAGE_SYM_CLASS_STATIC; // In this case the auxiliary symbol is a Section Definition. coff_symbol->Aux.resize(1); memset(&coff_symbol->Aux[0], 0, sizeof(coff_symbol->Aux[0])); coff_symbol->Aux[0].AuxType = ATSectionDefinition; coff_symbol->Aux[0].Aux.SectionDefinition.Selection = Sec.getSelection(); coff_section->Header.Characteristics = Sec.getCharacteristics(); uint32_t &Characteristics = coff_section->Header.Characteristics; switch (SectionData.getAlignment()) { case 1: Characteristics |= COFF::IMAGE_SCN_ALIGN_1BYTES; break; case 2: Characteristics |= COFF::IMAGE_SCN_ALIGN_2BYTES; break; case 4: Characteristics |= COFF::IMAGE_SCN_ALIGN_4BYTES; break; case 8: Characteristics |= COFF::IMAGE_SCN_ALIGN_8BYTES; break; case 16: Characteristics |= COFF::IMAGE_SCN_ALIGN_16BYTES; break; case 32: Characteristics |= COFF::IMAGE_SCN_ALIGN_32BYTES; break; case 64: Characteristics |= COFF::IMAGE_SCN_ALIGN_64BYTES; break; case 128: Characteristics |= COFF::IMAGE_SCN_ALIGN_128BYTES; break; case 256: Characteristics |= COFF::IMAGE_SCN_ALIGN_256BYTES; break; case 512: Characteristics |= COFF::IMAGE_SCN_ALIGN_512BYTES; break; case 1024: Characteristics |= COFF::IMAGE_SCN_ALIGN_1024BYTES; break; case 2048: Characteristics |= COFF::IMAGE_SCN_ALIGN_2048BYTES; break; case 4096: Characteristics |= COFF::IMAGE_SCN_ALIGN_4096BYTES; break; case 8192: Characteristics |= COFF::IMAGE_SCN_ALIGN_8192BYTES; break; 
default: llvm_unreachable("unsupported section alignment"); } // Bind internal COFF section to MC section. coff_section->MCData = &SectionData; SectionMap[&SectionData.getSection()] = coff_section; } /// This function takes a section data object from the assembler /// and creates the associated COFF symbol staging object. void WinCOFFObjectWriter::DefineSymbol(MCSymbolData const &SymbolData, MCAssembler &Assembler, const MCAsmLayout &Layout) { MCSymbol const &Symbol = SymbolData.getSymbol(); COFFSymbol *coff_symbol = GetOrCreateCOFFSymbol(&Symbol); SymbolMap[&Symbol] = coff_symbol; if (SymbolData.getFlags() & COFF::SF_WeakExternal) { coff_symbol->Data.StorageClass = COFF::IMAGE_SYM_CLASS_WEAK_EXTERNAL; if (Symbol.isVariable()) { const MCSymbolRefExpr *SymRef = dyn_cast<MCSymbolRefExpr>(Symbol.getVariableValue()); if (!SymRef) report_fatal_error("Weak externals may only alias symbols"); coff_symbol->Other = GetOrCreateCOFFSymbol(&SymRef->getSymbol()); } else { std::string WeakName = std::string(".weak.") + Symbol.getName().str() + ".default"; COFFSymbol *WeakDefault = createSymbol(WeakName); WeakDefault->Data.SectionNumber = COFF::IMAGE_SYM_ABSOLUTE; WeakDefault->Data.StorageClass = COFF::IMAGE_SYM_CLASS_EXTERNAL; WeakDefault->Data.Type = 0; WeakDefault->Data.Value = 0; coff_symbol->Other = WeakDefault; } // Setup the Weak External auxiliary symbol. 
coff_symbol->Aux.resize(1); memset(&coff_symbol->Aux[0], 0, sizeof(coff_symbol->Aux[0])); coff_symbol->Aux[0].AuxType = ATWeakExternal; coff_symbol->Aux[0].Aux.WeakExternal.TagIndex = 0; coff_symbol->Aux[0].Aux.WeakExternal.Characteristics = COFF::IMAGE_WEAK_EXTERN_SEARCH_LIBRARY; coff_symbol->MCData = &SymbolData; } else { const MCSymbolData &ResSymData = Assembler.getSymbolData(Symbol.AliasedSymbol()); if (Symbol.isVariable()) { int64_t Addr; if (Symbol.getVariableValue()->EvaluateAsAbsolute(Addr, Layout)) coff_symbol->Data.Value = Addr; } coff_symbol->Data.Type = (ResSymData.getFlags() & 0x0000FFFF) >> 0; coff_symbol->Data.StorageClass = (ResSymData.getFlags() & 0x00FF0000) >> 16; // If no storage class was specified in the streamer, define it here. if (coff_symbol->Data.StorageClass == 0) { bool external = ResSymData.isExternal() || (ResSymData.Fragment == NULL); coff_symbol->Data.StorageClass = external ? COFF::IMAGE_SYM_CLASS_EXTERNAL : COFF::IMAGE_SYM_CLASS_STATIC; } if (Symbol.isAbsolute() || Symbol.AliasedSymbol().isVariable()) coff_symbol->Data.SectionNumber = COFF::IMAGE_SYM_ABSOLUTE; else if (ResSymData.Fragment != NULL) coff_symbol->Section = SectionMap[&ResSymData.Fragment->getParent()->getSection()]; coff_symbol->MCData = &ResSymData; } } /// making a section real involves assigned it a number and putting /// name into the string table if needed void WinCOFFObjectWriter::MakeSectionReal(COFFSection &S, size_t Number) { if (S.Name.size() > COFF::NameSize) { const unsigned Max6DecimalSize = 999999; const unsigned Max7DecimalSize = 9999999; uint64_t StringTableEntry = Strings.insert(S.Name.c_str()); if (StringTableEntry <= Max6DecimalSize) { std::sprintf(S.Header.Name, "/%d", unsigned(StringTableEntry)); } else if (StringTableEntry <= Max7DecimalSize) { // With seven digits, we have to skip the terminating null. Because // sprintf always appends it, we use a larger temporary buffer. 
char buffer[9] = { }; std::sprintf(buffer, "/%d", unsigned(StringTableEntry)); std::memcpy(S.Header.Name, buffer, 8); } else { report_fatal_error("COFF string table is greater than 9,999,999 bytes."); } } else std::memcpy(S.Header.Name, S.Name.c_str(), S.Name.size()); S.Number = Number; S.Symbol->Data.SectionNumber = S.Number; S.Symbol->Aux[0].Aux.SectionDefinition.Number = S.Number; } void WinCOFFObjectWriter::MakeSymbolReal(COFFSymbol &S, size_t Index) { if (S.Name.size() > COFF::NameSize) { size_t StringTableEntry = Strings.insert(S.Name.c_str()); S.set_name_offset(StringTableEntry); } else std::memcpy(S.Data.Name, S.Name.c_str(), S.Name.size()); S.Index = Index; } bool WinCOFFObjectWriter::ExportSymbol(MCSymbolData const &SymbolData, MCAssembler &Asm) { // This doesn't seem to be right. Strings referred to from the .data section // need symbols so they can be linked to code in the .text section right? // return Asm.isSymbolLinkerVisible (&SymbolData); // For now, all non-variable symbols are exported, // the linker will sort the rest out for us. 
return SymbolData.isExternal() || !SymbolData.getSymbol().isVariable(); } bool WinCOFFObjectWriter::IsPhysicalSection(COFFSection *S) { return (S->Header.Characteristics & COFF::IMAGE_SCN_CNT_UNINITIALIZED_DATA) == 0; } //------------------------------------------------------------------------------ // entity writing methods void WinCOFFObjectWriter::WriteFileHeader(const COFF::header &Header) { WriteLE16(Header.Machine); WriteLE16(Header.NumberOfSections); WriteLE32(Header.TimeDateStamp); WriteLE32(Header.PointerToSymbolTable); WriteLE32(Header.NumberOfSymbols); WriteLE16(Header.SizeOfOptionalHeader); WriteLE16(Header.Characteristics); } void WinCOFFObjectWriter::WriteSymbol(const COFFSymbol *S) { WriteBytes(StringRef(S->Data.Name, COFF::NameSize)); WriteLE32(S->Data.Value); WriteLE16(S->Data.SectionNumber); WriteLE16(S->Data.Type); Write8(S->Data.StorageClass); Write8(S->Data.NumberOfAuxSymbols); WriteAuxiliarySymbols(S->Aux); } void WinCOFFObjectWriter::WriteAuxiliarySymbols( const COFFSymbol::AuxiliarySymbols &S) { for(COFFSymbol::AuxiliarySymbols::const_iterator i = S.begin(), e = S.end(); i != e; ++i) { switch(i->AuxType) { case ATFunctionDefinition: WriteLE32(i->Aux.FunctionDefinition.TagIndex); WriteLE32(i->Aux.FunctionDefinition.TotalSize); WriteLE32(i->Aux.FunctionDefinition.PointerToLinenumber); WriteLE32(i->Aux.FunctionDefinition.PointerToNextFunction); WriteZeros(sizeof(i->Aux.FunctionDefinition.unused)); break; case ATbfAndefSymbol: WriteZeros(sizeof(i->Aux.bfAndefSymbol.unused1)); WriteLE16(i->Aux.bfAndefSymbol.Linenumber); WriteZeros(sizeof(i->Aux.bfAndefSymbol.unused2)); WriteLE32(i->Aux.bfAndefSymbol.PointerToNextFunction); WriteZeros(sizeof(i->Aux.bfAndefSymbol.unused3)); break; case ATWeakExternal: WriteLE32(i->Aux.WeakExternal.TagIndex); WriteLE32(i->Aux.WeakExternal.Characteristics); WriteZeros(sizeof(i->Aux.WeakExternal.unused)); break; case ATFile: WriteBytes(StringRef(reinterpret_cast<const char *>(i->Aux.File.FileName), 
sizeof(i->Aux.File.FileName))); break; case ATSectionDefinition: WriteLE32(i->Aux.SectionDefinition.Length); WriteLE16(i->Aux.SectionDefinition.NumberOfRelocations); WriteLE16(i->Aux.SectionDefinition.NumberOfLinenumbers); WriteLE32(i->Aux.SectionDefinition.CheckSum); WriteLE16(i->Aux.SectionDefinition.Number); Write8(i->Aux.SectionDefinition.Selection); WriteZeros(sizeof(i->Aux.SectionDefinition.unused)); break; } } } void WinCOFFObjectWriter::WriteSectionHeader(const COFF::section &S) { WriteBytes(StringRef(S.Name, COFF::NameSize)); WriteLE32(S.VirtualSize); WriteLE32(S.VirtualAddress); WriteLE32(S.SizeOfRawData); WriteLE32(S.PointerToRawData); WriteLE32(S.PointerToRelocations); WriteLE32(S.PointerToLineNumbers); WriteLE16(S.NumberOfRelocations); WriteLE16(S.NumberOfLineNumbers); WriteLE32(S.Characteristics); } void WinCOFFObjectWriter::WriteRelocation(const COFF::relocation &R) { WriteLE32(R.VirtualAddress); WriteLE32(R.SymbolTableIndex); WriteLE16(R.Type); } //////////////////////////////////////////////////////////////////////////////// // MCObjectWriter interface implementations void WinCOFFObjectWriter::ExecutePostLayoutBinding(MCAssembler &Asm, const MCAsmLayout &Layout) { // "Define" each section & symbol. This creates section & symbol // entries in the staging area. 
for (MCAssembler::const_iterator i = Asm.begin(), e = Asm.end(); i != e; i++) DefineSection(*i); for (MCAssembler::const_symbol_iterator i = Asm.symbol_begin(), e = Asm.symbol_end(); i != e; i++) { if (ExportSymbol(*i, Asm)) { DefineSymbol(*i, Asm, Layout); } } } void WinCOFFObjectWriter::RecordRelocation(const MCAssembler &Asm, const MCAsmLayout &Layout, const MCFragment *Fragment, const MCFixup &Fixup, MCValue Target, uint64_t &FixedValue) { assert(Target.getSymA() != NULL && "Relocation must reference a symbol!"); const MCSymbol &Symbol = Target.getSymA()->getSymbol(); const MCSymbol &A = Symbol.AliasedSymbol(); MCSymbolData &A_SD = Asm.getSymbolData(A); MCSectionData const *SectionData = Fragment->getParent(); // Mark this symbol as requiring an entry in the symbol table. assert(SectionMap.find(&SectionData->getSection()) != SectionMap.end() && "Section must already have been defined in ExecutePostLayoutBinding!"); assert(SymbolMap.find(&A_SD.getSymbol()) != SymbolMap.end() && "Symbol must already have been defined in ExecutePostLayoutBinding!"); COFFSection *coff_section = SectionMap[&SectionData->getSection()]; COFFSymbol *coff_symbol = SymbolMap[&A_SD.getSymbol()]; const MCSymbolRefExpr *SymA = Target.getSymA(); const MCSymbolRefExpr *SymB = Target.getSymB(); const bool CrossSection = SymB && &SymA->getSymbol().getSection() != &SymB->getSymbol().getSection(); if (Target.getSymB()) { const MCSymbol *B = &Target.getSymB()->getSymbol(); MCSymbolData &B_SD = Asm.getSymbolData(*B); // Offset of the symbol in the section int64_t a = Layout.getSymbolOffset(&B_SD); // Ofeset of the relocation in the section int64_t b = Layout.getFragmentOffset(Fragment) + Fixup.getOffset(); FixedValue = b - a; // In the case where we have SymbA and SymB, we just need to store the delta // between the two symbols. Update FixedValue to account for the delta, and // skip recording the relocation. 
if (!CrossSection) return; } else { FixedValue = Target.getConstant(); } COFFRelocation Reloc; Reloc.Data.SymbolTableIndex = 0; Reloc.Data.VirtualAddress = Layout.getFragmentOffset(Fragment); // Turn relocations for temporary symbols into section relocations. if (coff_symbol->MCData->getSymbol().isTemporary() || CrossSection) { Reloc.Symb = coff_symbol->Section->Symbol; FixedValue += Layout.getFragmentOffset(coff_symbol->MCData->Fragment) + coff_symbol->MCData->getOffset(); } else Reloc.Symb = coff_symbol; ++Reloc.Symb->Relocations; Reloc.Data.VirtualAddress += Fixup.getOffset(); Reloc.Data.Type = TargetObjectWriter->getRelocType(Target, Fixup, CrossSection); // FIXME: Can anyone explain what this does other than adjust for the size // of the offset? if (Reloc.Data.Type == COFF::IMAGE_REL_AMD64_REL32 || Reloc.Data.Type == COFF::IMAGE_REL_I386_REL32) FixedValue += 4; coff_section->Relocations.push_back(Reloc); } void WinCOFFObjectWriter::WriteObject(MCAssembler &Asm, const MCAsmLayout &Layout) { // Assign symbol and section indexes and offsets. Header.NumberOfSections = 0; DenseMap<COFFSection *, uint16_t> SectionIndices; for (sections::iterator i = Sections.begin(), e = Sections.end(); i != e; i++) { if (Layout.getSectionAddressSize((*i)->MCData) > 0) { size_t Number = ++Header.NumberOfSections; SectionIndices[*i] = Number; MakeSectionReal(**i, Number); } else { (*i)->Number = -1; } } Header.NumberOfSymbols = 0; for (symbols::iterator i = Symbols.begin(), e = Symbols.end(); i != e; i++) { COFFSymbol *coff_symbol = *i; MCSymbolData const *SymbolData = coff_symbol->MCData; // Update section number & offset for symbols that have them. 
if ((SymbolData != NULL) && (SymbolData->Fragment != NULL)) { assert(coff_symbol->Section != NULL); coff_symbol->Data.SectionNumber = coff_symbol->Section->Number; coff_symbol->Data.Value = Layout.getFragmentOffset(SymbolData->Fragment) + SymbolData->Offset; } if (coff_symbol->should_keep()) { MakeSymbolReal(*coff_symbol, Header.NumberOfSymbols++); // Update auxiliary symbol info. coff_symbol->Data.NumberOfAuxSymbols = coff_symbol->Aux.size(); Header.NumberOfSymbols += coff_symbol->Data.NumberOfAuxSymbols; } else coff_symbol->Index = -1; } // Fixup weak external references. for (symbols::iterator i = Symbols.begin(), e = Symbols.end(); i != e; i++) { COFFSymbol *coff_symbol = *i; if (coff_symbol->Other != NULL) { assert(coff_symbol->Index != -1); assert(coff_symbol->Aux.size() == 1 && "Symbol must contain one aux symbol!"); assert(coff_symbol->Aux[0].AuxType == ATWeakExternal && "Symbol's aux symbol must be a Weak External!"); coff_symbol->Aux[0].Aux.WeakExternal.TagIndex = coff_symbol->Other->Index; } } // Fixup associative COMDAT sections. for (sections::iterator i = Sections.begin(), e = Sections.end(); i != e; i++) { if ((*i)->Symbol->Aux[0].Aux.SectionDefinition.Selection != COFF::IMAGE_COMDAT_SELECT_ASSOCIATIVE) continue; const MCSectionCOFF &MCSec = static_cast<const MCSectionCOFF &>( (*i)->MCData->getSection()); COFFSection *Assoc = SectionMap.lookup(MCSec.getAssocSection()); if (!Assoc) { report_fatal_error(Twine("Missing associated COMDAT section ") + MCSec.getAssocSection()->getSectionName() + " for section " + MCSec.getSectionName()); } // Skip this section if the associated section is unused. if (Assoc->Number == -1) continue; (*i)->Symbol->Aux[0].Aux.SectionDefinition.Number = SectionIndices[Assoc]; } // Assign file offsets to COFF object file structures. 
unsigned offset = 0; offset += COFF::HeaderSize; offset += COFF::SectionSize * Header.NumberOfSections; for (MCAssembler::const_iterator i = Asm.begin(), e = Asm.end(); i != e; i++) { COFFSection *Sec = SectionMap[&i->getSection()]; if (Sec->Number == -1) continue; Sec->Header.SizeOfRawData = Layout.getSectionAddressSize(i); if (IsPhysicalSection(Sec)) { Sec->Header.PointerToRawData = offset; offset += Sec->Header.SizeOfRawData; } if (Sec->Relocations.size() > 0) { bool RelocationsOverflow = Sec->Relocations.size() >= 0xffff; if (RelocationsOverflow) { // Signal overflow by setting NumberOfSections to max value. Actual // size is found in reloc #0. Microsoft tools understand this. Sec->Header.NumberOfRelocations = 0xffff; } else { Sec->Header.NumberOfRelocations = Sec->Relocations.size(); } Sec->Header.PointerToRelocations = offset; if (RelocationsOverflow) { // Reloc #0 will contain actual count, so make room for it. offset += COFF::RelocationSize; } offset += COFF::RelocationSize * Sec->Relocations.size(); for (relocations::iterator cr = Sec->Relocations.begin(), er = Sec->Relocations.end(); cr != er; ++cr) { assert((*cr).Symb->Index != -1); (*cr).Data.SymbolTableIndex = (*cr).Symb->Index; } } assert(Sec->Symbol->Aux.size() == 1 && "Section's symbol must have one aux!"); AuxSymbol &Aux = Sec->Symbol->Aux[0]; assert(Aux.AuxType == ATSectionDefinition && "Section's symbol's aux symbol must be a Section Definition!"); Aux.Aux.SectionDefinition.Length = Sec->Header.SizeOfRawData; Aux.Aux.SectionDefinition.NumberOfRelocations = Sec->Header.NumberOfRelocations; Aux.Aux.SectionDefinition.NumberOfLinenumbers = Sec->Header.NumberOfLineNumbers; } Header.PointerToSymbolTable = offset; Header.TimeDateStamp = sys::TimeValue::now().toEpochTime(); // Write it all to disk... 
WriteFileHeader(Header); { sections::iterator i, ie; MCAssembler::const_iterator j, je; for (i = Sections.begin(), ie = Sections.end(); i != ie; i++) if ((*i)->Number != -1) { if ((*i)->Relocations.size() >= 0xffff) { (*i)->Header.Characteristics |= COFF::IMAGE_SCN_LNK_NRELOC_OVFL; } WriteSectionHeader((*i)->Header); } for (i = Sections.begin(), ie = Sections.end(), j = Asm.begin(), je = Asm.end(); (i != ie) && (j != je); ++i, ++j) { if ((*i)->Number == -1) continue; if ((*i)->Header.PointerToRawData != 0) { assert(OS.tell() == (*i)->Header.PointerToRawData && "Section::PointerToRawData is insane!"); Asm.writeSectionData(j, Layout); } if ((*i)->Relocations.size() > 0) { assert(OS.tell() == (*i)->Header.PointerToRelocations && "Section::PointerToRelocations is insane!"); if ((*i)->Relocations.size() >= 0xffff) { // In case of overflow, write actual relocation count as first // relocation. Including the synthetic reloc itself (+ 1). COFF::relocation r; r.VirtualAddress = (*i)->Relocations.size() + 1; r.SymbolTableIndex = 0; r.Type = 0; WriteRelocation(r); } for (relocations::const_iterator k = (*i)->Relocations.begin(), ke = (*i)->Relocations.end(); k != ke; k++) { WriteRelocation(k->Data); } } else assert((*i)->Header.PointerToRelocations == 0 && "Section::PointerToRelocations is insane!"); } } assert(OS.tell() == Header.PointerToSymbolTable && "Header::PointerToSymbolTable is insane!"); for (symbols::iterator i = Symbols.begin(), e = Symbols.end(); i != e; i++) if ((*i)->Index != -1) WriteSymbol(*i); OS.write((char const *)&Strings.Data.front(), Strings.Data.size()); } MCWinCOFFObjectTargetWriter::MCWinCOFFObjectTargetWriter(unsigned Machine_) : Machine(Machine_) { } // Pin the vtable to this file. 
void MCWinCOFFObjectTargetWriter::anchor() {} //------------------------------------------------------------------------------ // WinCOFFObjectWriter factory function namespace llvm { MCObjectWriter *createWinCOFFObjectWriter(MCWinCOFFObjectTargetWriter *MOTW, raw_ostream &OS) { return new WinCOFFObjectWriter(MOTW, OS); } }
{ "content_hash": "964dc216b3c0edaf5fd7b913fa15a36d", "timestamp": "", "source": "github", "line_count": 893, "max_line_length": 80, "avg_line_length": 34.15901455767077, "alnum_prop": 0.6406700760555992, "repo_name": "hoangt/goblin-core", "id": "d9ca86d8af93d08cf41351f14bf5c23becd6ae3a", "size": "31693", "binary": false, "copies": "10", "ref": "refs/heads/master", "path": "llvm/3.4.2/llvm-3.4.2.src/lib/MC/WinCOFFObjectWriter.cpp", "mode": "33188", "license": "bsd-3-clause", "language": [ { "name": "AppleScript", "bytes": "1429" }, { "name": "Assembly", "bytes": "37219664" }, { "name": "Awk", "bytes": "1296" }, { "name": "Bison", "bytes": "769886" }, { "name": "C", "bytes": "121618095" }, { "name": "C#", "bytes": "12418" }, { "name": "C++", "bytes": "125510142" }, { "name": "CMake", "bytes": "708668" }, { "name": "CSS", "bytes": "43924" }, { "name": "Cuda", "bytes": "12393" }, { "name": "D", "bytes": "23091496" }, { "name": "DTrace", "bytes": "8533449" }, { "name": "E", "bytes": "3290" }, { "name": "Eiffel", "bytes": "2314" }, { "name": "Elixir", "bytes": "314" }, { "name": "Emacs Lisp", "bytes": "41146" }, { "name": "FORTRAN", "bytes": "377751" }, { "name": "Forth", "bytes": "4188" }, { "name": "GAP", "bytes": "21991" }, { "name": "GDScript", "bytes": "54941" }, { "name": "Gnuplot", "bytes": "446" }, { "name": "Groff", "bytes": "940592" }, { "name": "HTML", "bytes": "1118040" }, { "name": "JavaScript", "bytes": "24233" }, { "name": "LLVM", "bytes": "48362057" }, { "name": "M", "bytes": "2548" }, { "name": "Makefile", "bytes": "5469249" }, { "name": "Mathematica", "bytes": "5497" }, { "name": "Matlab", "bytes": "54444" }, { "name": "Mercury", "bytes": "1222" }, { "name": "Nemerle", "bytes": "141" }, { "name": "OCaml", "bytes": "748821" }, { "name": "Objective-C", "bytes": "4996482" }, { "name": "Objective-C++", "bytes": "1419213" }, { "name": "Perl", "bytes": "974117" }, { "name": "Perl6", "bytes": "80156" }, { "name": "Pure Data", "bytes": "22171" }, { "name": "Python", "bytes": "1375992" }, { "name": "R", "bytes": "627855" }, { "name": "Rebol", "bytes": "51929" }, { "name": "Scheme", "bytes": "4296232" }, { "name": "Shell", "bytes": "2237613" }, { "name": "Standard ML", "bytes": "5682" }, { "name": "SuperCollider", "bytes": "734239" }, { "name": "Tcl", "bytes": "2234" }, { "name": "TeX", "bytes": "601780" }, { "name": "VimL", "bytes": "26411" } ] }
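The C++ row above encodes COFF fields byte-by-byte: write_uint32_le emits a uint32 in little-endian order, and COFFSymbol::set_name_offset stores a long symbol name as four zero bytes followed by the string-table offset. The sketch below re-implements that byte layout in Python purely for illustration; the function names mirror the C++ sample, and long_name_field is a hypothetical name introduced here, not part of the dataset row.

def write_uint32_le(value: int) -> bytes:
    # Mirror of the C++ write_uint32_le helper: four little-endian bytes.
    return bytes((value >> shift) & 0xFF for shift in (0, 8, 16, 24))

def long_name_field(string_table_offset: int) -> bytes:
    # COFF symbol name field for names longer than 8 bytes: the first four
    # bytes stay zero and the string-table offset fills the last four, as
    # in COFFSymbol::set_name_offset in the sample above.
    return write_uint32_le(0) + write_uint32_le(string_table_offset)

# 0x11223344 is laid out least-significant byte first.
assert write_uint32_le(0x11223344) == b"\x44\x33\x22\x11"
# A long name at string-table offset 16: four zeros, then 0x10 little-endian.
assert long_name_field(16) == b"\x00\x00\x00\x00\x10\x00\x00\x00"

The same layout is what MakeSymbolReal relies on when a name exceeds COFF::NameSize.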
<resources>
    <!-- Base application theme. -->
    <style name="AppTheme" parent="Theme.AppCompat.Light.DarkActionBar">
        <!-- Customize your theme here. -->
        <item name="colorPrimary">@color/colorPrimary</item>
        <item name="colorPrimaryDark">@color/colorPrimaryDark</item>
        <item name="colorAccent">@color/colorAccent</item>
    </style>
</resources>
{ "content_hash": "69af672042bf86b8041eb376ed4ecda3", "timestamp": "", "source": "github", "line_count": 10, "max_line_length": 72, "avg_line_length": 38.1, "alnum_prop": 0.6535433070866141, "repo_name": "rickgit/Test", "id": "fac92916801e0cbbd570ef81122ed49e0c944ccb", "size": "381", "binary": false, "copies": "54", "ref": "refs/heads/master", "path": "app_ndk/src/main/res/values/styles.xml", "mode": "33188", "license": "apache-2.0", "language": [ { "name": "Assembly", "bytes": "32239" }, { "name": "Batchfile", "bytes": "615" }, { "name": "C", "bytes": "12241963" }, { "name": "C++", "bytes": "2655688" }, { "name": "CMake", "bytes": "145060" }, { "name": "Groovy", "bytes": "1885" }, { "name": "Java", "bytes": "1011960" }, { "name": "Kotlin", "bytes": "49459" }, { "name": "M4", "bytes": "24713" }, { "name": "Makefile", "bytes": "8022" }, { "name": "Metal", "bytes": "3796" }, { "name": "Objective-C", "bytes": "573643" }, { "name": "Perl", "bytes": "22722" }, { "name": "Python", "bytes": "2615" }, { "name": "Shell", "bytes": "4853" } ] }
require "spec_helper"

describe "association assignment from nested attributes" do
  before do
    define_model("Post", title: :string) do
      has_many :comments
      accepts_nested_attributes_for :comments
    end

    define_model("Comment", post_id: :integer, body: :text) do
      belongs_to :post
    end

    FactoryGirl.define do
      factory :post do
        comments_attributes {
          [FactoryGirl.attributes_for(:comment), FactoryGirl.attributes_for(:comment)]
        }
      end

      factory :comment do
        sequence(:body) { |n| "Body #{n}" }
      end
    end
  end

  it "assigns the correct amount of comments" do
    expect(FactoryGirl.create(:post).comments.count).to eq 2
  end

  it "assigns the correct amount of comments when overridden" do
    expect(FactoryGirl.create(:post, :comments_attributes => [FactoryGirl.attributes_for(:comment)]).comments.count).to eq 1
  end
end
{ "content_hash": "2ac80218639c7395c50f7aeefe170d12", "timestamp": "", "source": "github", "line_count": 32, "max_line_length": 124, "avg_line_length": 27.96875, "alnum_prop": 0.6782122905027933, "repo_name": "keeperhood/factory_girl", "id": "44dea106928f5fd15c0a4d37c1c2498fe327e7fd", "size": "895", "binary": false, "copies": "55", "ref": "refs/heads/master", "path": "spec/acceptance/nested_attributes_spec.rb", "mode": "33188", "license": "mit", "language": [ { "name": "Cucumber", "bytes": "2256" }, { "name": "Ruby", "bytes": "211777" } ] }
End of preview.

No dataset card yet

Downloads last month: 7