Applied improvisation
Applied improvisation is the application of improvisational methods in fields such as consulting, facilitation (of workshops, team trainings, meetings, conferences…), teaching, coaching, research, the generation and evolution of ideas and designs, theatrical training and performance, medical and therapeutic settings, and social work.
History
Shamans have been part of human cultures since the Stone Age. They told stories and initiated and facilitated rituals consisting of storytelling, dance and the embodiment of past or future events and challenges. They applied basic principles of improvisation to help people collaborate, understand what was happening around them and cope with daily life.
Sanskrit theatre, which originated around 1000 BC, is a very early form of theatre. The vidusaka was part of the cast and the plot, yet at the same time stepped outside it to make comments that were mostly improvised. He can also be seen as the first appearance of a clown.
Starting around 600 BC, rhapsodes – only men – travelled around Greece. In an improvised way they combined the epics of e.g. Homer with myths, tales, jokes and songs, and adapted this mixture spontaneously to local expectations and situations. In many respects they built upon the traditions of the shaman and the trickster.
In the fifth century BC, an intense examination of improvisation in rhetoric began in Greece. Schools were established to teach how to improvise a speech, a skill of great relevance as an instrument of democracy and civic education.
The fourth century BC in Greece marks an important milestone in the history of theatre: the roots of drama and comedy were established, among them the phlyax play, a burlesque drama performed on wooden stages in which much of the content was improvised. Improvisation was also applied in the Atellan Farces of ancient Rome.
The performances of pantomimes were always a combination of stock and improvised moves, dances and acrobatic movements – in ancient Rome this was also one of the origins of improvised dancing. Their performances were influenced by the traditions of the Etruscans. They also performed in public spaces, travelling around the country.
From the 12th to the 14th century Minnesang was very popular. The musicians visited castles and cities, using stock texts and music and adding a great deal of improvisation, inspired by historical events, social developments and the reactions of the audience. Both Commedia dell'Arte and Minnesang can also be seen as contributions to adult and civic education at a time when literacy was not common and access to information often difficult.
Also in the 12th century the jester emerged, whose task was to use improvisational methods to call attention to urgent political and social matters. He was allowed to criticise the decisions of rulers. In many ways he embodied the practice of the trickster.
In 16th-century Italy, Commedia dell'Arte revived the tradition of the Atellan Farces: groups of actors also travelled through parts of Europe. As inspiration for the improvised parts of their plays they drew on political events and current social issues, on what they experienced and heard in other cities, and on the wishes and reactions of the audience.
In the 17th century the pedagogue John Amos Comenius was one of the first to emphasize the significance of play and games as a method of learning and teaching, referring back to the works of Plato. Games often involve stepping into different personas in the blink of an eye and pretending to be in various settings and genres. They usually have a certain structure and also invite players to play with and evolve that structure – improvisation is therefore an important aspect of games. Games, in turn, are often starting points for improvisation and theatrical approaches. Pedagogues such as Pestalozzi, Fröbel and Montessori likewise applied games and role play in education.
Around 1920 the social worker Neva Boyd collected descriptions of various games and added inventions of her own. She used them in Chicago to teach language skills, problem-solving, self-confidence and social skills, and she also helped people cope with the effects of the Great Depression. Boyd worked together with the theatre academic and educator Viola Spolin, who used and evolved her games, which "…were meant to promote creative expression through self discovery and personal experiences between children with extremely diverse backgrounds" (LaPolice, 2012, p. 26). Spolin emphasized that these games and exercises developed key competencies such as self-efficacy, creativity, the ability to collaborate and the ability to tackle problems. Spolin also applied the games and exercises in workshops for actors of the Compass Theater in Chicago, the first improvisational theatre group.
Also in the 1920s, Jakob L. Moreno summarized existing ideas and founded the Stegreiftheater (Theater of Spontaneity) in Vienna: those attending were actors as well as audience, and the shift between the two often took place within an improvisational exercise or scene. He later drew on these experiences in developing psychodrama, a method in which improvisational approaches are an important aspect.
Starting in the 1950s, the educator and playwright Keith Johnstone developed various improvisational games in Britain and Canada. They were used not only to train actors and prepare performances of improvisational theatre groups, but also in workshops and projects with school and university teachers and with trainers in juvenile and adult education. Del Close also played an important role in the development of improvisational theatre, developing long-form games among other contributions.
Also in the 1950s, Augusto Boal began his work: he developed various improvisational methods and applied them, for example, in the Theatre of the Oppressed, in which actors and audience frequently switch roles. Boal's methods are also applied in civic education, in the training of teachers across all fields and in politics.
In 1975 Jonathan Fox and Jo Salas founded the first Playback Theatre: members of the audience tell stories from their lives, which actors then transform into theatrical sequences using improvised speech, song and dance.
In the late 1990s the performative turn began in social research. Performance – including any use of improvisational approaches – was no longer considered only a process on a theatre stage before a rather passive audience, but a comprehensive principle for researching and understanding human action. Researchers such as Shaun McNiff began to apply improvisational methods in arts-based research projects.
In 2002 the Applied Improvisation Network was founded, a non-profit organization of people using improvisational methods in various fields.
Examples of the implementation of applied improvisation
In addition to the training of actors and preparation for improvisational theatre, applied improvisation is used, for example:
In consulting and corporate training, applied improvisation is used in team and sales training, in workshops on presentation skills, resilience and leadership, and with people responsible for innovation.
Applied improvisation is also often used in methods like design thinking and in service design projects; its methods are therefore also researched and applied in UX design. It is used in evolving engineering practices as well.
Another field for applied improvisation is the education and advanced training of people working in social work, medical and health care, and medicine. A newer approach applies improvisation to disaster readiness and response. In addition to psychodrama, improvisational methods are also used in other therapeutic settings and in drama therapy.
In education, improvisational approaches are used in all fields of teacher training, in designing and evolving didactic concepts, and as a method of student-centred learning.
As mentioned in the history section, improvisational methods have also been applied as research tools since the performative turn. There are several projects connected to applied drama in which improvisation is an important aspect, as well as examples of use in the health and life sciences, in evolving didactics and in design research.
The Applied Improvisation Network
The Applied Improvisation Network (AIN) is a global community of over 5,000 members online and numerous local groups. Practitioners of applied improvisation facilitate workshops for individuals or organisations, introducing them to these principles and tools via solo, paired or group exercises, activities and games.
The members of the AIN are business professionals and academics who use improvisation tools, experience, and theory for human development and training in communities and organizations. The network includes consultants, managers, trainers, coaches, facilitators, performers and academics.
The AIN was founded in 2002 by Paul Z Jackson, Michael Rosenburg and Alain Rostain. Since then the number of local groups and online communities has grown year by year, with a series of conferences, regional events and roadshows in North America, Europe, Japan and Australia.
Further reading
Boal, Augusto. 1993. Theatre of the Oppressed. Theatre Communications Group.
DesMaisons, Ted. 2014. Applied Improvisation Definition Generator
Landgraf, Edgar. 2014. Improvisation as Art: Conceptual Challenges, Historical Perspectives. Bloomsbury Academic.
Sawyer, Keith. 2011. Structure and Improvisation in Creative Teaching. Cambridge University Press.
References
External links
Applied Improvisation Network
List of literature
List of medical improv literature
Analysis of the origins of black carbon and carbon monoxide transported to Beijing, Tianjin, and Hebei in China.
A novel back-trajectory approach was adopted to determine the origins of black carbon (BC) and carbon monoxide (CO) transported to Beijing, Tianjin and Hebei. Results showed that the transport efficiency was controlled mainly by mid-latitude westerlies in winter, the South Asian monsoon in summer and prevailing westerly and northwesterly winds in spring and autumn. Hebei was identified as the most important source region of both BC (respectively accounting for 55% and 49%) and CO (39% and 38%) transported to Beijing and Tianjin. Inner Mongolia contributed more to the effective emission intensity (EEI) in winter than in summer for both BC and CO transported to Beijing and Tianjin. Shandong was responsible for higher EEI in summer than in winter. The six provinces making the greatest contributions to BC transported to Hebei were Shandong (19%), Shanxi (19%), Inner Mongolia (17%), Beijing (11%), Henan (11%), and Tianjin (10%), whereas those making the greatest contributions to CO transported to Hebei were Shandong (20%), Inner Mongolia (10%), Tianjin (9%), Henan (9%), Shanxi (9%), and Beijing (8%). In summary, Hebei, Inner Mongolia, Shandong, Tianjin and Shanxi were determined as the dominant source regions of not only BC but also CO transported to Beijing. Hebei, Shandong, Beijing, Inner Mongolia, Henan, Liaoning and Shanxi were relatively important source regions for Tianjin. Shandong, Shanxi, Inner Mongolia, Beijing, Henan, Tianjin, Liaoning, Jiangsu and Anhui were the main source regions for Hebei. Residential and industrial sectors were the dominant sectors for BC and CO transported to the receptors, respectively. These results are consistent with the results of previous studies. Finally, comparing the observed ΔBC/ΔCO ratio with the enhancement ratio of the EEI of BC to that of CO (ΔEEIBC/ΔEEICO) at the Miyun site, we further confirmed that the EEI can be used to represent the amounts of BC and CO reaching receptors.
Anesthesia and epidermolysis bullosa.
Patients with epidermolysis bullosa (EB) may present for anesthesia with an unrelated surgical condition or, more commonly, for diagnostic or therapeutic procedures. Children in particular may require frequent anesthetics. Safe and effective management of anesthesia presents a significant challenge, and although there is little rigorous evidence available to aid decision-making, in this article the elements of current good anesthesia care in EB are summarized.
278 U.S. 282 (1929)
BOTANY WORSTED MILLS
v.
UNITED STATES.
No. 31.
Supreme Court of United States.
Submitted April 23, 1928.
Argued November 20, 1928.
Decided January 2, 1929.
CERTIORARI TO THE COURT OF CLAIMS.
*283 Mr. Nathan A. Smyth for petitioner.
Solicitor General Mitchell for the United States.
*284 A brief on behalf of Mr. A.G. Lacy, as amicus curiae, was filed by special leave of Court on motion of the Solicitor General.
MR. JUSTICE SANFORD delivered the opinion of the Court.
The Botany Worsted Mills, a New Jersey corporation engaged in the manufacture of woolen and worsted fabrics, made a return of its net income for the taxable year 1917 under the Revenue Act of 1916[1] and the War Revenue Act of 1917.[2] By § 12(a) of the Revenue Act it was provided that in ascertaining the net income of a corporation organized in the United States there should be deducted from its gross income all "the ordinary and necessary expenses paid within the year in the maintenance and operation of its business and properties." Under this provision the Mills deducted amounts aggregating $1,565,739.39 paid as compensation to the members of its board of directors, in addition to salaries of $9,000 each. It paid an income tax computed in accordance with this return. Thereafter, in 1920, the Commissioner of Internal Revenue assessed an additional income tax against it. Of this, $450,994.06 was attributable to his disallowance of $783,656.06 of the deduction claimed as compensation paid to the directors, on the ground that the total amount paid as compensation was unreasonable and the remainder of the deduction as allowed represented fair and reasonable compensation. The Mills, after paying the additional tax, filed a claim for refund of this $450,994.06. The claim was disallowed; and the Mills thereafter, in September 1924, by a petition in the Court of Claims sought to recover this sum from the United States, with *285 interest, alleging that the disallowance of part of the compensation paid the directors was illegal.[3] After a hearing on the merits the court, upon its findings of fact, dismissed the petition upon the ground that the additional tax was imposed under an agreement of settlement which prevented a recovery. 63 C. Cls. 405. And this writ of certiorari was granted.
The first question presented is whether the Mills is precluded from recovering the amount claimed by reason of a settlement.
Sec. 3229 of the Revised Statutes,[4] provides that: "The Commissioner of Internal Revenue, with the advice and consent of the Secretary of the Treasury, may compromise any civil or criminal case arising under the internal-revenue laws instead of commencing suit thereon; and, with the advice and consent of the said Secretary and the recommendation of the Attorney-General, he may compromise any such case after a suit thereon has been commenced. Whenever a compromise is made in any case there shall be placed on file in the office of the Commissioner the opinion of the Solicitor of Internal Revenue, . . . with his reasons therefor, with a statement of *286 the amount of tax assessed, . . . and the amount actually paid in accordance with the terms of the compromise."[5]
The Government did not claim that there had been a compromise under this statute, but contended in the Court of Claims that, irrespective thereof, an agreement of settlement had been entered into between the Mills and the Commissioner under which the Mills had accepted the partial disallowance as to the compensation paid the directors, and had also received concessions as to other disputed items the benefit of which it still enjoyed, and was therefore estopped from seeking a recovery.
As to this matter the findings of fact show that after the Mills had paid the amount of the tax shown by its original return, an investigation of its books disclosed to the Commissioner the necessity of making an additional assessment, to be determined by the settlement of questions relating to the compensation (or, as it was termed, bonus) paid to the directors, depreciation charged off on its books, and reserves charged to expenses. After much correspondence and numerous conferences extending over several months between the attorney and assistant treasurer of the Mills and the chief of the special audit section of the Bureau of Internal Revenue and other of his official associates, a compromise was agreed to as to all the differences, by which the amounts to be allowed as reasonable compensation to the directors and as depreciation were agreed upon, and the claim as to reserve was allowed. Thereupon the Mills prepared and filed an amended return based upon the figures agreed upon in the conferences, with documentary evidence which it had *287 agreed to furnish; and the additional assessment was made in accordance with this return.[6]
The court, in sustaining the Government's contention, said: "With the payment of the tax under the circumstances surrounding this case the agreement, which is mentioned in the record as a `gentleman's agreement,' became in legal effect an executed contract of settlement"; and that, as the Mills was seeking to recover on account of the particular item which it regarded as unfavorable to its interests, and at the same time hold to the advantage derived from the settlement of other items in dispute involved in the same general settlement, it should not be allowed a recovery.
The Mills contends that the Commissioner had not been given, at the time in question, any authority, either in express terms or by implication, to compromise tax cases except as provided in § 3229; that this statute in granting such authority under specific limitations as to the method to be pursued, negatived his authority to effect a valid and binding agreement in any other way; that as the Government could not have been estopped by the unauthorized transactions of its officials, the Mills likewise could not be estopped thereby; and further, that the findings are insufficient to establish an estoppel.
The Government does not here challenge any of these contentions. In the brief for the United States filed in this Court the Solicitor General states that the question whether such an informal adjustment of taxes as was made in this case is binding on the taxpayer, is submitted for decision in deference to the opinion of the Court of Claims and the importance of the question, but no argument is made in support of the Government's previous contention that the Mills was estopped from questioning *288 the settlement. And, on the contrary, it is stated that "Before and since the date of the alleged settlement in this case Congress has evidently proceeded on the theory that no adjustment of a tax controversy between representatives of the Bureau of Internal Revenue and a taxpayer is binding unless made with the formalities and with the approval of the officials prescribed by statute. The authority of officers of the United States to compromise claims on behalf of or against the United States is strictly limited. . . The statutes which authorize conclusive agreements and settlements to be made in particular ways and with the approval of designated officers raise the inference that adjustments or settlements made in other ways are not binding." And further, that "No ground for the United States to claim estoppel is disclosed in the findings."
Independently of these concessions, we are of the opinion that the informal settlement made in this case did not constitute a binding agreement. Sec. 3229 authorizes the Commissioner of Internal Revenue to compromise tax claims before suit, with the advice and consent of the Secretary of the Treasury, and requires that an opinion of the Solicitor of Internal Revenue setting forth the compromise be filed in the Commissioner's office. Here the attempted settlement was made by subordinate officials in the Bureau of Internal Revenue. And although it may have been ratified by the Commissioner in making the additional assessment based thereon, it does not appear that it was assented to by the Secretary, or that the opinion of the Solicitor was filed in the Commissioner's office.
We think that Congress intended by the statute to prescribe the exclusive method by which tax cases could be compromised, requiring therefor the concurrence of the Commissioner and the Secretary, and prescribing the formality with which, as a matter of public concern, it should be attested in the files of the Commissioner's office; *289 and did not intend to intrust the final settlement of such matters to the informal action of subordinate officials in the Bureau. When a statute limits a thing to be done in a particular mode, it includes the negative of any other mode. Raleigh, etc. R.R. Co. v. Reid, 13 Wall. 269, 270; Scott v. Ford, 52 Ore. 288, 296.
It is plain that no compromise is authorized by this statute which is not assented to by the Secretary of the Treasury. Leach v. Nichols (C.C.A.) 23 F. (2d) 275, 277. For this reason, if for no other the informal agreement made in this case did not constitute a settlement which in itself was binding upon the Government or the Mills. And, without determining whether such an agreement, though not binding in itself, may when executed become, under some circumstances, binding on the parties by estoppel, it suffices to say that here the findings disclose no adequate ground for any claim of estoppel by the United States.
We therefore conclude that the Mills was not precluded by the settlement from recovering any portion of the tax to which it may otherwise have been entitled.
This brings us to the question whether on the findings of fact the Mills is entitled to recover the portion of the additional tax attributable to the disallowance of $783,656.06 of the amount paid to the directors which it had claimed as a deduction.[7]
Under § 12(a) of the Revenue Act of 1916 the Mills was not entitled to this deduction unless the amount paid constituted a part of its "ordinary and necessary expenses" in the maintenance and operation of its business and properties. And in this suit the burden of establishing *290 that fact rested upon it, in order to show that it was entitled to the deduction which the Commissioner had disallowed, and that the additional tax was to that extent illegally assessed. The Court of Claims, however, made no finding that the amount disallowed by the Commissioner constituted a part of the ordinary and necessary expenses of the Mills. The findings are silent as to this ultimate fact essential to a recovery by the Mills and only show certain circumstantial facts relating to the payment made to the board of directors.
Where the Court of Claims does not make a finding upon the ultimate question of fact upon which the rights of the parties depend, but merely makes findings as to subsidiary circumstantial facts which bear upon it, such findings will not support a judgment unless the circumstantial facts as found are such that the ultimate fact follows from them as a necessary inference and may be held to result as a conclusion of law. See United States v. Pugh, 99 U.S. 265, 269; Winton v. Amos, 255 U.S. 373, 395.
The findings show that for many years it has been the practice of many corporations engaged in the woolen manufacturing business to base the compensation of the directors and executive officers upon a percentage of profits. Upon the organization of the Mills in 1890 the stockholders adopted a by-law providing that at the close of the business year the net profits should be distributed by paying a dividend of 6 per cent to stockholders and applying the balance remaining as follows: (a) placing 5 per cent in a reserve fund; (b) paying 25 per cent "As a bonus to the board of directors"; and (c) paying 70 per cent as additional dividend to the stockholders. The stockholders amended this by-law in 1903 by increasing the bonus of the board of directors to 40 per cent; in 1905, by providing, instead of a "bonus," that "compensation" *291 equal to 40 per cent should be "paid to the board of directors for their services"; and in 1908, by reducing such compensation to 32 per cent [that is 30.08 per cent of the net profits.] This by-law remained in force until after the taxable year 1917; and during the entire period "compensation" was paid to the directors in accordance therewith. From the outset the determination of the total amount of profit and of the aggregate amount payable to the board of directors was made by the board itself; and it likewise determined the basis of the apportionment among the several directors of the aggregate amount payable to the board as a whole. No contract was made with any director as to what his compensation should be other than such as was implied from his election and service as a member of the board in accordance with the by-law and the customary practices of the company, which each knew. At all times each director also held a position as an executive officer or manager of a department of the Mills.
The gross assets of the Mills increased from $1,114,149.63 in 1890 to $28,893,777.12 in 1917; and its net assets, including reserves, from $37,136.35 to $10,999,862.48. Its net income increased from $784,334.44 in 1910 to $7,953,512.80 in 1917; and the amount paid the directors in pursuance of the by-law increased, with some fluctuations, from $268,444.19 in 1910, to $400,935.18 in 1915, $693,617.16 in 1916, and $1,565,739.39 in 1917.[8] In 1917 there were ten members of the board, so that if the total amount had been apportioned ratably, each would have received $156,573.93. And in that year each member of the board, in addition to the part of the aggregate in fact apportioned to him individually, also received a salary of $9,000.
*292 The findings do not show the nature or extent of the services rendered by the board of directors or its individual members, either as directors, executive officers or department managers; the amounts apportioned and paid to each director; the basis of apportionment, whether the nature and extent of their individual services, the amount of their stockholdings, or otherwise; the value of their services; or the reasonableness of the purported compensation.
We do not find it necessary to determine here whether the amounts paid by a corporation to its officers as compensation for their services cannot be allowed as "ordinary and necessary expenses" within the meaning of § 12 (a), merely because, and to the extent that, as compensation, they are unreasonable in amount.[9] However this may be, it is clear that extraordinary, unusual and extravagant amounts paid by a corporation to its officers in the guise and form of compensation for their services, but having no substantial relation to the measure of their services and being utterly disproportioned to their value, are not in reality payment for services, and cannot be regarded as "ordinary and necessary expenses" within the meaning of the section; and that such amounts do not become part of the "ordinary and necessary expenses" merely because the payments are made in accordance with an agreement between the corporation and its officers. Even if binding upon the parties, such an agreement does not change the character of the purported compensation or constitute it, as against the Government, an ordinary and necessary expense. Compare 20 Treas. Dec., Int. Rev., 330; Jacobs & Davies v. Anderson (C.C.A.), 228 Fed. 505, 506; *293 United States v. Philadelphia Knitting Mills Co. (C.C.A.), 273 Fed. 657, 658; and Becker Bros. v. United States (C.C.A.), 7 F. (2d) 3, 6.
In the light of this principle it is clear that the findings do not show, as a matter of necessary inference resulting as a conclusion of law, that the amount paid the directors in excess of the $782,083.33 allowed by the Commissioner,[10] constituted part of the ordinary and necessary expenses of the Mills. On the contrary, as this amount so greatly exceeded the amounts which, as a matter of common knowledge, are usually paid to directors for their attendance at meetings of the board and the discharge of their customary duties, and was much greater than the amounts that had been paid in prior years,[11] and as there is no showing as to the amounts paid the individual directors, in addition to the salaries of $9,000 which each received presumably for his service as an executive officer or department manager or as to the nature, extent or value of their services, the findings raise a strong inference that the unusual and extraordinary amount paid to the directors was not in fact compensation for their services, but merely a distribution of a fixed percentage of the net profits that had no relation to the services rendered.
Therefore, as the Mills has not sustained the burden of showing that the amount disallowed by the Commissioner was in fact part of its ordinary and necessary expenses, the judgment must, for this reason, be
Affirmed.
MR. JUSTICE HOLMES agrees with the result.
NOTES
[1] 39 Stat. 756, c. 463.
[2] 40 Stat. 300, c. 63.
[3] Sec. 3226 of the Revised Statutes had been previously amended by § 1318 of the Revenue Act of 1921, 42 Stat. 227, 314, c. 136, so as to provide that no suit or proceeding should be maintained in any court for the recovery of any internal-revenue tax alleged to have been erroneously or illegally assessed or collected until a claim for refund or credit had been duly filed with the Commissioner of Internal Revenue; and further amended by § 1014(a) of the Revenue Act of 1924, 43 Stat. 253, 343, c. 234, so as to provide that such suit or proceeding might be maintained, whether or not such tax had been paid under protest or duress. And the right of the Mills to maintain this suit, although the tax had not been paid under protest or duress, is not questioned by the Government.
[4] U.S.C., Tit. 26, § 158.
[5] Since the date of the settlement here involved §§ 1312 and 1313 of the Revenue Act of 1921, § 1006 of the Revenue Act of 1924, and § 1106(b) of the Revenue Act of 1926 have dealt specifically with agreements in writing made by a taxpayer and the Commissioner, with the approval of the Secretary, that the previous determination and assessment of a tax shall be final and conclusive.
[6] The findings indicate inferentially that some tax claims of the Mills for two other years were also included in the settlement; but the precise facts do not appear.
[7] This is claimed in the brief filed for the Mills; and in the oral argument its counsel specifically stated that the Mills relied on the sufficiency of the findings and made no request that the case be remanded to the Court of Claims for additional findings, as the Solicitor General had suggested.
[8] The figures for some other years are also given in tabulated statements included in the findings.
[9] Later, by § 214(a) of the Revenue Act of 1918, 40 Stat. 1057, c. 18, it was specifically provided that the "ordinary and necessary expenses" should include "a reasonable allowance for salaries or other compensation for personal services actually rendered."
[10] The amount allowed, it may be noted, was, in itself, $481,934.02 more than the average of the amounts that had been paid in the seven years immediately preceding, and $88,466.17 more than the greatest amount that had been paid in any one year.
[11] See note 10, supra.
Superpower (disambiguation)
Superpower, a state with the ability to influence events and project power on a worldwide scale
Second Superpower, a term used to conceptualize a global civil society as a counterpoint to the United States of America
Energy superpower
Potential superpowers
Superpower may also refer to:
Superpower (ability), extraordinary powers mostly possessed by fictional characters, commonly in American comic books
Tetration, a mathematical operation of iterated exponentiation, for which "superpower" and "hyperpower" are used as synonyms
Music
SuperPower, a musician who was featured on Three 6 Mafia's hit "Lolli Lolli"
"Super Powers", a song by The Dismemberment Plan from their 2001 album Change
"Superpower" (song), a song by Beyoncé featuring Frank Ocean from her self-titled album (2013)
Sports and games
The Super Powers, a tag team in the NWA's Jim Crockett Promotions in the 1980s
Super Cup (rugby union), an annual international Rugby Union competition, originally called Super Powers Cup
Superpower (board game), a 1986 political strategy game
SuperPower, a 2002 political simulation computer game
SuperPower 2, a 2004 strategic wargame game
Super PLAY, a video game magazine named Super POWER from 1993 to 1996
TV
The Superpower, a 1983 Hong Kong TV series
The Legendary Super Powers Show, an alternate title for a later season of Super Friends
"Super Powers" (Homeland), a 2015 episode of the TV series Homeland
Other
Super Power Building, also known as the SP Building, a Church of Scientology high-rise complex in Clearwater, Florida
Super Powers Collection, a line of action figures based on DC Comics superheroes and supervillains, created in the 1980s by Kenner Toys
Project Superpowers, a comic book from Dynamite Entertainment
Superpower (horse) | {
"pile_set_name": "Wikipedia (en)"
} |
Description:
Jemma looks hot in her tight fitting pants and heels. But she just wants to get comfortable, so she slowly strips out of her clothes and then pulls on her hairy pussy now that she's comfortable! | {
"pile_set_name": "OpenWebText2"
} |
Q:
Count Based on 2 date Ranges for 2 different columns
Having an issue writing a report.
I am trying to count the number of enquiries issued and passed; they both have a datetime field.
The problem I'm having is that the results are coming back incorrect.
Running this code gives me 126 for passed
SELECT COUNT(*) AS Passed FROM BPS.dbo.tbl_Profile AS p
Inner Join BPS.dbo.tbl_Profile_Mortgage AS pm
ON p.Id = pm.FK_ProfileId
WHERE p.CaseTypeId IN (1,2,9,15) AND CONVERT(DATE,pm.DatePassed,103) BETWEEN @Start AND @End
When I run the issued query I get 223
SELECT COUNT(*) AS Issued FROM BPS.dbo.tbl_Profile AS p
Inner Join BPS.dbo.tbl_Profile_Mortgage AS pm
ON p.Id = pm.FK_ProfileId
WHERE p.CaseTypeId IN (1,2,9,15) AND CONVERT(DATE,pm.DateAppIssued,103) BETWEEN @Start AND @End
These figures are correct, so I put it in one query like so.
SELECT COUNT(pm.DateAppIssued) AS Issued,COUNT(pm.DatePassed) AS Passed FROM BPS.dbo.tbl_Profile AS p
Inner Join BPS.dbo.tbl_Profile_Mortgage AS pm
ON p.Id = pm.FK_ProfileId
WHERE p.CaseTypeId IN (1,2,9,15)
AND (CONVERT(DATE,pm.DateAppIssued,103) BETWEEN @Start AND @End
OR CONVERT(DATE,pm.DatePassed,103) BETWEEN @Start AND @End)
This gives me Issued 265 and passed 185
I have tried many different variations but still can't get the correct figures.
I hope I have explained this well enough; any help would be much appreciated.
Rusty
A:
Because you have both conditions in the WHERE clause joined with OR, you are seeing a different result. Use them in the aggregation itself.
SELECT
COUNT(case when CONVERT(DATE,pm.DateAppIssued,103) BETWEEN @Start AND @End then 1 end) AS Issued,
COUNT(case when CONVERT(DATE,pm.DatePassed,103) BETWEEN @Start AND @End then 1 end) AS Passed
FROM BPS.dbo.tbl_Profile AS p
Inner Join BPS.dbo.tbl_Profile_Mortgage AS pm ON p.Id = pm.FK_ProfileId
WHERE p.CaseTypeId IN (1,2,9,15)
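The conditional-aggregation pattern in this answer can be sanity-checked outside SQL Server. Below is a minimal sketch using Python's built-in sqlite3; the table, column names, and dates are made up for illustration and do not come from the question's schema.

```python
import sqlite3

# In-memory stand-in for the mortgage table; names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE mortgage (date_issued TEXT, date_passed TEXT)")
rows = [
    ("2020-01-05", "2020-01-20"),  # issued and passed inside the window
    ("2020-01-10", "2020-03-01"),  # issued inside, passed outside
    ("2019-12-01", "2020-01-15"),  # issued outside, passed inside
    ("2019-11-01", "2019-11-20"),  # both outside -> counted in neither
]
conn.executemany("INSERT INTO mortgage VALUES (?, ?)", rows)

start, end = "2020-01-01", "2020-01-31"
# Conditional aggregation: each COUNT only sees rows matching its own date
# window (CASE yields NULL otherwise, and COUNT ignores NULLs), so the two
# totals are independent instead of being mixed by an OR in the WHERE clause.
issued, passed = conn.execute(
    """
    SELECT
      COUNT(CASE WHEN date_issued BETWEEN ? AND ? THEN 1 END),
      COUNT(CASE WHEN date_passed BETWEEN ? AND ? THEN 1 END)
    FROM mortgage
    """,
    (start, end, start, end),
).fetchone()
print(issued, passed)  # -> 2 2
```

With an OR in the WHERE clause, row 2 and row 3 would each be counted toward both totals, which is exactly the inflation the question observed.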
| {
"pile_set_name": "StackExchange"
} |
Q:
How do I change this code into recursion in Java?
public long weightedSum(int[] a, int[] b, int n) {
long value = 0;
long sum = 0;
for (int i = 0; i < a.length; i++) {
value = a[i] * b[i];
sum = sum + value;
}
return sum;
}
The method takes two 1D integer arrays and an integer n as parameters and returns the sum of the products of the first n elements of the two arrays. For example, given the following two 1D arrays:
int[] arr1 = {1, 2, 3, 4, 5};
int[] arr2 = {6, 7, 8, 9, 10};
using the first 4 elements of arr1 and arr2 gives 1 * 6 + 2 * 7 + 3 * 8 + 4 * 9 = 80 as the result.
A:
public static long weightedSum(int[] a, int[] b, int n) {
if (n == 0)
return 0;
else
return a[n - 1] * b[n - 1] + weightedSum(a, b, n - 1);
}
Output :
int[] arr1 = { 1, 2, 3, 4, 5 };
int[] arr2 = { 6, 7, 8, 9, 10 };
System.out.println(weightedSum(arr1, arr2, 1)); // output : 6
System.out.println(weightedSum(arr1, arr2, 2)); // output : 20
System.out.println(weightedSum(arr1, arr2, 3)); // output : 44
System.out.println(weightedSum(arr1, arr2, 4)); // output : 80
System.out.println(weightedSum(arr1, arr2, 5)); // output : 130
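As a quick cross-check of the outputs listed above, the same recursion transcribed into Python (a direct port of the answer's Java method, using the arrays from the question):

```python
# Direct Python port of the recursive Java method from the answer above.
# Base case: summing zero elements gives 0; otherwise add the product of
# the n-th pair and recurse on the first n - 1 elements.
def weighted_sum(a, b, n):
    if n == 0:
        return 0
    return a[n - 1] * b[n - 1] + weighted_sum(a, b, n - 1)

arr1 = [1, 2, 3, 4, 5]
arr2 = [6, 7, 8, 9, 10]
print(weighted_sum(arr1, arr2, 4))  # -> 80 (1*6 + 2*7 + 3*8 + 4*9)
print(weighted_sum(arr1, arr2, 5))  # -> 130
```

Note that, unlike the original iterative version (which always loops over the whole array), both recursions genuinely stop after the first n elements.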
| {
"pile_set_name": "StackExchange"
} |
Cage Warriors 64 took place at The Forum in London, and the organization’s 2014 debut card featured a host of local talent. In the headliner, fast-rising bantamweight Cory Tait took on fellow Englishman James Pennington. | {
"pile_set_name": "Pile-CC"
} |
import $ from 'jquery';
import env from '../core/env';
import lists from '../core/lists';
import dom from '../core/dom';
export default class TablePopover {
constructor(context) {
this.context = context;
this.ui = $.summernote.ui;
this.options = context.options;
this.events = {
'summernote.mousedown': (we, e) => {
this.update(e.target);
},
'summernote.keyup summernote.scroll summernote.change': () => {
this.update();
},
'summernote.disable': () => {
this.hide();
}
};
}
shouldInitialize() {
return !lists.isEmpty(this.options.popover.table);
}
initialize() {
this.$popover = this.ui.popover({
className: 'note-table-popover'
}).render().appendTo(this.options.container);
const $content = this.$popover.find('.popover-content,.note-popover-content');
this.context.invoke('buttons.build', $content, this.options.popover.table);
// [workaround] Disable Firefox's default table editor
if (env.isFF) {
document.execCommand('enableInlineTableEditing', false, false);
}
}
destroy() {
this.$popover.remove();
}
update(target) {
if (this.context.isDisabled()) {
return false;
}
const isCell = dom.isCell(target);
if (isCell) {
const pos = dom.posFromPlaceholder(target);
this.$popover.css({
display: 'block',
left: pos.left,
top: pos.top
});
} else {
this.hide();
}
return isCell;
}
hide() {
this.$popover.hide();
}
}
| {
"pile_set_name": "Github"
} |
Q:
Separate node js query results into objects?
I have a node query that is returning results. However, the handlebars helpers on my view are treating the entire array as one result:
{{#if donations}}
{{#each donations}}
{{this}} is printing the entire set of donations!
{{/each}}
{{/if}}
The {{this}} prints the entire set of the array instead of just printing one object on the web page:
{ owner: '5a6d4c99d6320c05223e0cdb', _id: 5a930456d5ff0809409d15d8, name: 'test', price: 24, description: 'please donate to our cause', enddate: null, __v: 0 },{ owner: '5a6d4c99d6320c05223e0cdb', _id: 5a9a0b601be9210796083c9b, name: 'please donate again', price: 150, description: 'medical condition', enddate: null, __v: 0 },
Here's the relevant router code:
// GET user by username
router.get('/:username', function(req, res) {
//var id = req.params.id;
var username = req.params.username;
User.getUserByUsername(username,function(err, user){
const vm = user;
var id = vm.id;
Donation.getDonationsByUserId(id, function(err, donations){
const vd = { donations };
//console.log(vd);
res.render('user', {donations: vd, user: vm});
})
});
});
And here's the model function it's referencing:
module.exports.getDonationsByUserId = function(id, callback){
return Donation.find({owner: id}, function(err, donations) {
//ar query = {owner: id};
return callback(err, donations);
});
}
How can I iterate through each individual object result?
Thanks!
A:
Considering your JSON structure as below,
{
"donations": [
{
"owner": "5a6d4c99d6320c05223e0cdb",
"_id": "5a930456d5ff0809409d15d8",
"name": "test",
"price": 24,
"description": "please donate to our cause",
"enddate": null,
"__v": 0
},
{
"owner": "5a6d4c99d6320c05223e0cdb",
"_id": "5a9a0b601be9210796083c9b",
"name": "please donate again",
"price": 150,
"description": "medical condition",
"enddate": null,
"__v": 0
}
]
}
You may use the below template structure.
{{#if donations}}
{{#each donations}}
{{#each this}}
{{this}}
{{/each}}
is printing the entire set of donations!
{{/each}}
{{/if}}
Which prints the output as,
5a6d4c99d6320c05223e0cdb
5a930456d5ff0809409d15d8
test
24
please donate to our cause
0
is printing the entire set of donations!
5a6d4c99d6320c05223e0cdb
5a9a0b601be9210796083c9b
please donate again
150
medical condition
0
is printing the entire set of donations!
Tested using http://tryhandlebarsjs.com
Hope this helps.
| {
"pile_set_name": "StackExchange"
} |
// Copyright (C) 2017-2019 Jonathan Müller <jonathanmueller.dev@gmail.com>
// This file is subject to the license terms in the LICENSE file
// found in the top-level directory of this distribution.
#include <cppast/cpp_template_parameter.hpp>
#include <cppast/cpp_entity_kind.hpp>
using namespace cppast;
const char* cppast::to_string(cpp_template_keyword kw) noexcept
{
switch (kw)
{
case cpp_template_keyword::keyword_class:
return "class";
case cpp_template_keyword::keyword_typename:
return "typename";
}
return "should not get here";
}
std::unique_ptr<cpp_template_type_parameter> cpp_template_type_parameter::build(
const cpp_entity_index& idx, cpp_entity_id id, std::string name, cpp_template_keyword kw,
bool variadic, std::unique_ptr<cpp_type> default_type)
{
std::unique_ptr<cpp_template_type_parameter> result(
new cpp_template_type_parameter(std::move(name), kw, variadic, std::move(default_type)));
idx.register_definition(std::move(id), type_safe::cref(*result));
return result;
}
cpp_entity_kind cpp_template_type_parameter::kind() noexcept
{
return cpp_entity_kind::template_type_parameter_t;
}
cpp_entity_kind cpp_template_type_parameter::do_get_entity_kind() const noexcept
{
return kind();
}
bool detail::cpp_template_parameter_ref_predicate::operator()(const cpp_entity& e)
{
return e.kind() == cpp_entity_kind::template_type_parameter_t;
}
std::unique_ptr<cpp_non_type_template_parameter> cpp_non_type_template_parameter::build(
const cpp_entity_index& idx, cpp_entity_id id, std::string name, std::unique_ptr<cpp_type> type,
bool is_variadic, std::unique_ptr<cpp_expression> default_value)
{
std::unique_ptr<cpp_non_type_template_parameter> result(
new cpp_non_type_template_parameter(std::move(name), std::move(type), is_variadic,
std::move(default_value)));
idx.register_definition(std::move(id), type_safe::cref(*result));
return result;
}
cpp_entity_kind cpp_non_type_template_parameter::kind() noexcept
{
return cpp_entity_kind::non_type_template_parameter_t;
}
cpp_entity_kind cpp_non_type_template_parameter::do_get_entity_kind() const noexcept
{
return kind();
}
bool detail::cpp_template_ref_predicate::operator()(const cpp_entity& e)
{
return is_template(e.kind()) || e.kind() == cpp_entity_kind::template_template_parameter_t;
}
cpp_entity_kind cpp_template_template_parameter::kind() noexcept
{
return cpp_entity_kind::template_template_parameter_t;
}
cpp_entity_kind cpp_template_template_parameter::do_get_entity_kind() const noexcept
{
return kind();
}
| {
"pile_set_name": "Github"
} |
Some people think that rock climbing walls are fun, but to me it sounds as mundane as a game of golf. In one, you hit a ball, go find the ball and hit it again. In the other, you just… climb. It doesn’t sound like a great time to me. But these Star Wars rock climbing accessories might help make it more fun.
They come from UK-based Hang Fast Adventure Structures. Instead of grasping onto a rock shaped thing, you can now hoist yourself up using the Death Star or R2-D2’s head. If I had a whole wall of Droid heads and Death Stars, I might actually want to climb it.
They also have some triangular pieces, one of which has the Millennium Falcon on it, but it doesn't look so hot in that shape. A Star Destroyer would have been better. If you are a rock climber who is also a Star Wars nerd, you might want to check them out here.
[via Geeks Are Sexy] | {
"pile_set_name": "OpenWebText2"
} |
[Comparative study of the state of health and nutrition of young children from rural and urban areas of the Uzbek SSR].
The study of physical development, including caliperometric investigations, was conducted in 2,620 infants. The food ingredients and caloric content of the daily ration were studied in 267 infants with premorbid signs. The parameters of physical development (growth, body mass, chest circumference, and percentage standards of fat content) in infants can be recommended as reference values for the region studied. The physical development of the infants depended on the proper organization of nutritionally complete feeding, in both quantity and quality. | {
"pile_set_name": "PubMed Abstracts"
} |
Use of nutritional supplements among Mexican women and the estimated impact on dietary intakes below the EAR and above the UL.
To describe supplement use practices among non-pregnant, non-lactating Mexican women (12-49 y) and estimate their impact on the proportion of women with intakes below the Estimated Average Requirement (EAR) and above the Upper Limit (UL) using data from a national probabilistic nutrition survey in Mexico (1999). Information was collected by questionnaire on the frequency and duration of supplement use in the previous 6 months (n = 17,794). Dietary intakes by 24-hour recall were determined in a representative sub-sample (n = 2,599). Frequency of use and available information on the nutrient content of supplements was used to estimate daily equivalent intakes. 17.6% of women reported to have used supplements. The majority of these took supplements once daily (71%) and for < or =2 months (75%). While nutrient intakes from diet alone did not differ between users and non-users, the proportion with intakes of Vitamins A, B(6), B(12), and C, folate, iron, and zinc < EAR were significantly greater among the supplement non-users when intakes from supplements were also considered. The proportion of women with intakes > UL was greater among supplement users than non-users for iron, folate and Vitamin B(6). Supplement use contributes to the adequacy of nutrient intakes but may also increase the possible risk of toxic intakes of some nutrients among Mexican women. | {
"pile_set_name": "PubMed Abstracts"
} |
Q:
RowStyles and Rows doesn't match in TableLayoutPanel?
We are hiding controls in a TableLayoutPanel. We have been using the following code for a while for hiding a row that shouldn't be visible.
int controlRow = m_panel.GetPositionFromControl(control).Row;
m_panel.RowStyles[controlRow].SizeType = SizeType.Absolute;
m_panel.RowStyles[controlRow].Height = 0;
Now we have been adding more rows, and all of a sudden we have problems with the indexes.
There are fewer RowStyles than Rows.
Is there something fishy going on, or have I misunderstood how the TableLayoutPanel works?
A:
I've tried digging into the issue. The problem is that the rows were not added in the correct way. To add rows correctly, you have to ensure that the value of RowCount and the number of RowStyles are equal. You can see this in Form1.Designer.cs, in the autogenerated code for the tableLayoutPanel. So you should do something like this:
//add a new row; newRowStyle is whatever style the new row needs, e.g. new RowStyle(SizeType.AutoSize)
tableLayoutPanel.RowCount++;
tableLayoutPanel.RowStyles.Add(newRowStyle);
The mismatch does not in fact cause a very serious problem. When RowStyles.Count is larger than the actual RowCount, only the first RowCount styles are used to style the rows; the rest can be seen as a reserve. When RowStyles.Count is smaller than the actual RowCount, some rows will have no style and may be collapsed. Either way, using the code above to add a new row will help you avoid the issue. The point is to keep the number of rows and the number of RowStyles equal.
| {
"pile_set_name": "StackExchange"
} |
i can't get around digital downloads, would rather just buy something physical and tangible.
And i'm going to look at it this way...if a band's releases are for sale digitally but not physically...this is not a band worth hearing, especially when so many amazing bands are putting out physical albums still, and even more so there's enough physical music currently in existence that i wouldn't even be able to listen to it all, so there will always be another band to get.
I buy mp3s from bands if 1) I don't know their output well and don't feel like paying good money to something I may not love or 2) it's unavailable for purchase from another format.
So, I actually appreciate albums available via bandcamp and frequently throw a couple of bucks to the band to d/l the album, even if a physical copy is available. I have limited space in my apartment and I only want hard copies if I feel the release is essential or I can't get it any other way._________________Old dudes watched nothing but Mr. Rogers and Sesame Street. That's how you start a proper path to misanthropy and hate. -- Goldenbull
As someone who has never downloaded a single piece of music form the intra-webbers, I've found Bandcamp to be great. I tend to buy most releases that include an immediate download as well as physical copy and it's an added plus that the music goes right to the artist.
And i'm going to look at it this way...if a band's releases are for sale digitally but not physically...this is not a band worth hearing, especially when so many amazing bands are putting out physical albums still, and even more so there's enough physical music currently in existence that i wouldn't even be able to listen to it all, so there will always be another band to get.
This is more true in underground metal, where bands and fans tend to be fetishistic about physical releases, than in many other genres. Digital is where things have been heading for a long time now and some other genres of music have just beaten us there. I have bought very few metal releases as digital-only, but some other stuff I listen to, quite a bit more of it will never see a physical release - especially other niche genres._________________http://unspeakableaxerecords.com
It amazes me to see stuff on Amazon where the album on MP3 costs £7.99 and the cd can be bought "new and used from" £2 or something ridiculous
THIS.
I found mp3s of the band of a friend at amazon last week (that price for the full album). He wasn't informed that the label (metalhit if I recall correctly) would not only make physical copies but also upload them tracks as paid mp3.
Funny he has the album listed for free on bandcamp, and the CD can be purchased for 5 usd He won't take any actions about this though, because he couldn't care less about how his music is spread (and also he thinks if anyone is idiotic enough to pay for that, he deserves to be charged), but I still think is unfair for the artist._________________
It works like that because most people don't give a shit about a full album and would rather download single songs, so at the end it's still cheaper to pay 1 or 2 bucks for the songs they care about than 5 for the whole album.
It amazes me to see stuff on Amazon where the album on MP3 costs £7.99 and the cd can be bought "new and used from" £2 or something ridiculous
THIS.
I found mp3s of the band of a friend at amazon last week (that price for the full album). He wasn't informed that the label (metalhit if I recall correctly) would not only make physical copies but also upload them tracks as paid mp3.
Funny he has the album listed for free on bandcamp, and the CD can be purchased for 5 usd He won't take any actions about this though, because he couldn't care less about how his music is spread (and also he thinks if anyone is idiotic enough to pay for that, he deserves to be charged), but I still think is unfair for the artist.
It amazes me to see stuff on Amazon where the album on MP3 costs £7.99 and the cd can be bought "new and used from" £2 or something ridiculous
THIS.
I found mp3s of the band of a friend at amazon last week (that price for the full album). He wasn't informed that the label (metalhit if I recall correctly) would not only make physical copies but also upload them tracks as paid mp3.
Funny he has the album listed for free on bandcamp, and the CD can be purchased for 5 usd He won't take any actions about this though, because he couldn't care less about how his music is spread (and also he thinks if anyone is idiotic enough to pay for that, he deserves to be charged), but I still think is unfair for the artist.
Did you Make up that story? Digital distribution is what metalhit do.
True story, I was told they arranged for a cd version. He never got a copy either. Feel free to believe whatever you want _________________
It amazes me to see stuff on Amazon where the album on MP3 costs £7.99 and the cd can be bought "new and used from" £2 or something ridiculous
THIS.
I found mp3s of the band of a friend at amazon last week (that price for the full album). He wasn't informed that the label (metalhit if I recall correctly) would not only make physical copies but also upload them tracks as paid mp3.
Funny he has the album listed for free on bandcamp, and the CD can be purchased for 5 usd He won't take any actions about this though, because he couldn't care less about how his music is spread (and also he thinks if anyone is idiotic enough to pay for that, he deserves to be charged), but I still think is unfair for the artist.
Did you Make up that story? Digital distribution is what metalhit do.
True story, I was told they arranged for a cd version. He never got a copy either. Feel free to believe whatever you want
So in conclusion, he thought that Metalhit would do CDs only, whereas they did digital too. Furthermore they are selling the CD for 5 usd and haven't given him any, but he doesn't really care about that and won't take action.
Maybe it's just being stubborn but a digital copy of an album means absolutely nothing to me. I can find a digital copy of almost anything I'd want in under 10 minutes for free, why would I voluntarily pay for that? I buy plenty of physical releases and rarely if ever dowload anything these days, but there's simply no way I'll spend money for a digital album.
In my opinion, the legitimate digital release market is merely a reaction to the success and prevalence of its illegitimate counterpart rather than a symptom of a supposed trajectory. Eh, it's definitely debatable.
Maybe it's just being stubborn but a digital copy of an album means absolutely nothing to me. I can find a digital copy of almost anything I'd want in under 10 minutes for free, why would I voluntarily pay for that?
1. To support the band/label
2. For convenience (with so many legal options, it's now easier to buy than to illegally download, in many cases; at the very least you're a lot less likely to waste time clicking on dead links, and/or end up with a virus or adware)
That's why most people do it. Personally my emphasis is on #1. Like I said I prefer hard copies, but if I can't get that I'd rather put a few bucks in the musician's pocket than steal his stuff. However, I do believe that digital releases should be priced accordingly; if they cost the same as a new CD I'm not buying, 9 times out of 10. There's so much less cost and overhead, it's just not right to charge the same amount.
astralvesl wrote:
In my opinion, the legitimate digital release market is merely a reaction to the success and prevalence of its illegitimate counterpart rather than a symptom of a supposed trajectory.
The truth is actually both, I think. It was DEFINITELY a reaction, but on the other hand, that illegitimate channel was doing such a good job of choking the life out of the record business because younger listeners were fine with downloading music and not owning a CD. The illegal option created a market direction which legitimate channels are now reacting to._________________http://unspeakableaxerecords.com | {
"pile_set_name": "Pile-CC"
} |
Rune Grahn (66), principal of Skedsmo Upper Secondary School (Skedsmo VGS), resigned with immediate effect on 8 April, just six months before he was due to retire. The reason was scathing criticism from the County Audit Office, which has investigated the Skedsmo school's cooperation project in Russia, the Moscow School.
The Moscow School is financed over the national budget with 1.3 million kroner. In addition, Skedsmo VGS, which has been responsible for the accounts, has contributed 850,000 kroner from its own budget. That is money that should have benefited the Skedsmo pupils.
The County Audit Office believes that the Russian principal, Evgeny Karpov, has used large parts of these funds to enrich himself.
Millions of kroner have been sent to the school in Russia over several years, without Skedsmo VGS receiving any proper documentation of what the money was spent on. The documentation that has been produced shows that large sums went on alcohol, hotel stays, restaurant visits, plane tickets and petrol.
The former principal of Skedsmo VGS, Rune Grahn, does not wish to comment on the case.
The Russian principal has also stepped down from his position.
"Full review" County Executive Tron Bamrud believes the entire Moscow School should be wound up, and is now announcing a full review of all international projects linked to the upper secondary schools in the district.
"My assessment is that the Moscow School should be wound up. In light of the Moscow case, we will probably have to look at all the international projects," County Executive Tron Bamrud tells Dagbladet.
The Akershus and Østfold county councils' report on the Moscow School project was completed on 15 April. It concludes that the Skedsmo VGS principal's exercise of authority in connection with the Moscow School "may verge on gross dereliction of duty".
"Vodka and perfume" Among the findings in the report:
• The Russian principal set the salaries for his staff himself. The best-paid teacher is his own wife, who for her part-time position was paid up to three times as much as other full-time employees.
• Receipts for 2010 show that money was spent on considerable quantities of groceries, wine, vodka, beer, restaurant visits, hotel stays, petrol, a vacuum cleaner, a drill, DVDs, etc.
The auditors criticised the spending in 2011, but the Russian principal's spending spree continued:
• When they checked again in 2012, they found that the Norwegian public funds had been spent on, among other things, plane tickets and hotel stays for the principal and his family, restaurant and café bills with large amounts of alcohol, petrol, cigarettes, chocolate and theatre tickets.
In addition, the county auditors write, money was spent on dog equipment, a dress, champagne, perfume and chocolates.
Criticism is also directed at the county executive, the county director and the assistant county director for allowing the financial mess in Moscow to go on for so long. The latter has visited the Moscow School several times.
"Considering a police report" On Monday the case came before the control committee.
There, the chair of the control committee, Siri Baastad, proposed that the county executive carry out a review of the Moscow School all the way back to its start in 1994.
The control committee has asked the county executive to "implement the personnel consequences that will be a natural result of the irregularities uncovered and the report's conclusions".
They also ask the county executive to assess whether there are grounds for reporting individuals, or the case as a whole, to the police.
The case will be put before the county council on 6 May.
"Cash couriers" The Russian principal, Evgeny Karpov, served as principal from 1995 until he stepped down recently.
The county auditors' review is limited to the last three years only.
But the risk of misappropriation stretches far back: the county auditors describe how, for years, the method of transferring money to the Moscow School was to have cash couriers from Norway carry large amounts of cash, which the principal put in his personal safe at the school.
The couriers are said to have typically carried 20,000 to 40,000 euros in cash on their trips to the Moscow School, the report states.
When, under pressure from the county council, the school began sending money via bank in 2011, the Russian principal responded by withdrawing large amounts of cash after each transfer. He is said to have kept these in a safe at the school to which only he had access.
"pile_set_name": "OpenWebText2"
} |
“I thought I couldn’t do it,” Maria admitted. “Before I joined BIG, I didn’t understand how I could actually do it.” To a prospective applicant, Barton International Group often seems quite intimidating. Maria, for her part, was sure that she did not have adequate experience or understanding of the business world to perform successfully in BIG’s professional setting. However, once she joined Barton International Group, Maria quickly learned that, even with her lack of experience, everyone in the group was there to help her gain the knowledge and experience that she needed.
Being an international student, Maria was unable to work outside of campus – making BIG the perfect opportunity to “get the same experience” as the students who were able to work off campus. She first discovered the group in her junior year, and proceeded to join in the fall of that same year. Prior to this discovery, she hadn’t found any other organization that would allow her to work with actual companies while still being a student, and thought BIG would be “a great experience” and “a great part of [her] resume.” Although she joined late into her college career, everyone was welcoming and helped her to become acclimated to the new environment.
Maria has spent her time in Barton International Group in the Business Development Department, where she contacts potential clients, attends meetings with company executives, and frequents networking events. She is currently working on a project with NIAR, which has so far given her “the experience of working in a fast paced environment”, but also allowed her to learn “how long it takes to understand a company.” Through her work with market research on this project, she told me that it is really “not as easy as everyone thinks.”
Like many other members of BIG, Maria wasn’t an incredibly confident person prior to joining the organization. Moreover, she initially felt isolated as a result of being the only international student of the group. However, through her work within the group as well as the connections she made during her years in Barton International Group, Maria gained “the confidence [she] needed to launch her career.” “Nothing is too big for me now,” she told me. “I’m ready for a big career.”
After she graduates, Maria will return to her home in Bolivia for a period of time before beginning to search for a job – preferably one where she can do market research or manage a company’s social media. However, this should be relatively simple considering the skills and confidence she learned from being in Barton International Group.
When the interview ended, I asked Maria if she had any final comments to make. She sat back and thought for a moment before grinning and enthusiastically blurting out “join BIG!” – one final promotion for the place that molded her into the person she is now.
Barton International Group is a student run consulting organization who strives to develop educational enrichment opportunities for our members and provide our clients with exceptional, diverse, and creative business solutions. | {
"pile_set_name": "Pile-CC"
} |
Potent and selective Kunitz domain inhibitors of plasma kallikrein designed by phage display.
Phage displaying APPI Kunitz domain libraries have been used to design potent and selective active site inhibitors of human plasma kallikrein, a serine protease that plays an important role in both inflammation and coagulation. Selected clones from two Kunitz domain libraries randomized at or near the binding loop (positions 11-13, 15-19, and 34) were sequenced following five rounds of selection on immobilized plasma kallikrein. Invariant preferences for Arg at position 15 and His at position 18 were found, whereas His, Ala, Ala, and Pro were highly preferred residues at positions 13, 16, 17, and 19, respectively. At position 11 Pro, Asp, and Glu were favored, while hydrophobic residues were preferred at position 34. Selected variants, purified by trypsin affinity chromatography and reverse phase high performance liquid chromatography, potently inhibited plasma kallikrein, with apparent equilibrium dissociation constants (Ki*) ranging from approximately 75 to 300 pM. From sequence and activity data, consensus mutants were constructed by site directed mutagenesis. One such mutant, KALI-DY, which differed from APPI at 6 key residues (T11D, P13H, M17A, I18H, S19P, and F34Y), inhibited plasma kallikrein with a Ki* = 15 +/- 14 pM, representing an increase in binding affinity of more than 10,000-fold compared to APPI. Similar to APPI, the variants also inhibited Factor XIa with high affinity, with Ki* values ranging from approximately 0.3 to 15 nM; KALI-DY inhibited Factor XIa with a Ki* = 8.2 +/- 3.5 nM. KALI-DY did not inhibit plasmin, thrombin, Factor Xa, Factor XIIa, activated protein C, or tissue factor. Factor VIIa. Consistent with the protease specificity profile, KALI-DY did not prolong the clotting time in a prothrombin time assay, but did prolong the clotting time in an activated partial thromboplastin time assay > 3.5-fold at 1 microM. | {
"pile_set_name": "PubMed Abstracts"
} |
Sandy Jones
Sandy Jones (born August 16, 1943) is an American pregnancy and parenting expert.
Career
Over 200 of Jones' articles on consumer issues and parenting have been published in national publications including Family Circle, Redbook, American Baby, and Working Mother. She has served as a columnist for Parents, Parenting and Woman's World. As a speaker, Jones has made presentations for the national conferences of La Leche League International and has made a presentation at the National Association for the Education of Young Children conference. In addition, she has lectured to parenting groups and professionals working with parents across the nation including her unique "Empowerment for Mothers" seminar presented to hundreds of mothers in 10 states.
Her newest book, co-authored with her daughter, Marcie Jones, Great Expectations: Your All-in-One Resource for Pregnancy & Childbirth was published by Sterling Publishing. Comforting Your Crying Baby: Why Your Baby is Crying and What to Do About It was published by Innova Publications.
References
Category:1943 births
Category:Living people
Category:American family and parenting writers
"pile_set_name": "Wikipedia (en)"
} |
//
// Generated by class-dump 3.5 (64 bit) (Debug version compiled Oct 15 2018 10:31:50).
//
// class-dump is Copyright (C) 1997-1998, 2000-2001, 2004-2015 by Steve Nygard.
//
#import <AddressBookCore/ABCNPropertyDescription.h>
#import <AddressBookCore/ABCNAbstractPropertyDescription-Protocol.h>
@class NSString;
@interface ABCNIdentifierDescription : ABCNPropertyDescription <ABCNAbstractPropertyDescription>
{
}
- (void)decodeUsingCoder:(id)arg1 contact:(id)arg2;
- (void)encodeUsingCoder:(id)arg1 contact:(id)arg2;
- (void)copyFromContact:(id)arg1 to:(id)arg2;
- (BOOL)isEqualForContact:(id)arg1 other:(id)arg2;
- (id)init;
// Remaining properties
@property(readonly, copy) NSString *debugDescription;
@property(readonly, copy) NSString *description;
@property(readonly) unsigned long long hash;
@property(readonly) Class superclass;
@end
Q:
Backbone + JQuery Plugin won't work (Written in CoffeeScript)
Ok, I have a simple JQuery plugin that when applied to a div will neatly load all the images within that div using a preloader image. This works fine.
However, in Backbone I'm appending data from JSON to an element after the page has been loaded. Therefore the code below won't work.
$(document).ready( ->
$("#theList").preloader()
)
So I need to put the "$("#theList").preloader()" in my render function, after my list is generated..
class HomeView extends Backbone.View
  constructor: ->
    super

  initialize: ->
    @isLoading = false

  events: {
    "click #nextPage": "nextPage"
    "click #prevPage": "prevPage"
  }

  template: _.template($('#home').html())
  theListTemplate: _.template($('#theList').html())

  render: ->
    $(@el).append(@template)
    @loadResults()
    $("#theList").preloader() #<----------------- DOESN'T WORK NOW

  loadResults: () ->
    @isLoading = true
    Properties.fetch({
      success: (data) =>
        # Append properties to list
        $('#theList').append(@theListTemplate({data: data.models, _:_})).listview('refresh')
      error: ->
        @isLoading = false
        alert('Unable to load information')
    })
However, now that the line of code is within a Backbone View/Model/Controller or whatever, it doesn't work.
For reference I load my application like so..
$(document).ready( ->
console.log('document ready')
app.AppRouter = new AppRouter()
Backbone.history.start()
)
Any help would be much appreciated! Thanks.
A:
Assuming the preloader isn't intended to operate on the nodes added in the loadResults#fetch#success (since the fetch has not returned by the time you invoke the preloader), I suspect the issue is that, during the execution of the render() function, the view's el is not part of the DOM yet.
If you invoke HomeView like
myHomeView = new HomeView()
$('some selector').append(myHomeView.render().el)
the HomeView's el has not yet been added to the DOM; it's in a detached document.
Therefore the jQuery selector $("#theList") returns an empty result, since it's searching the DOM. This is easily verified by either console.log'ing the result of that selector, or by putting a breakpoint in using a debugger and testing in the console.
Thankfully, the fix is easy. You need to reference the detached document by scoping the selector to the view, either using the jQuery reference scoped to the view:
@$("#theList").preloader()
or, by doing it yourself
$(@el).find("#theList").preloader()
/* ***** BEGIN LICENSE BLOCK *****
* Version: MPL 1.1/GPL 2.0/LGPL 2.1
*
* The contents of this file are subject to the Mozilla Public License Version
* 1.1 (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
* http://www.mozilla.org/MPL/
*
* Software distributed under the License is distributed on an "AS IS" basis,
* WITHOUT WARRANTY OF ANY KIND, either express or implied. See the License
* for the specific language governing rights and limitations under the
* License.
*
* The Original Code is part of dcm4che, an implementation of DICOM(TM) in
* Java(TM), hosted at https://github.com/dcm4che.
*
* The Initial Developer of the Original Code is
* Agfa Healthcare.
* Portions created by the Initial Developer are Copyright (C) 2012
* the Initial Developer. All Rights Reserved.
*
* Contributor(s):
* See @authors listed below
*
* Alternatively, the contents of this file may be used under the terms of
* either the GNU General Public License Version 2 or later (the "GPL"), or
* the GNU Lesser General Public License Version 2.1 or later (the "LGPL"),
* in which case the provisions of the GPL or the LGPL are applicable instead
* of those above. If you wish to allow use of your version of this file only
* under the terms of either the GPL or the LGPL, and not to allow others to
* use your version of this file under the terms of the MPL, indicate your
* decision by deleting the provisions above and replace them with the notice
* and other provisions required by the GPL or the LGPL. If you do not delete
* the provisions above, a recipient may use your version of this file under
* the terms of any one of the MPL, the GPL or the LGPL.
*
* ***** END LICENSE BLOCK ***** */
package org.dcm4che3.net;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
/**
* @author Gunter Zeilinger <gunterze@gmail.com>
*
*/
public class DeviceService implements DeviceServiceInterface
{
protected Device device;
protected ExecutorService executor;
protected ScheduledExecutorService scheduledExecutor;
protected void init(Device device) {
setDevice(device);
}
public void setDevice(Device device) {
this.device = device;
}
public Device getDevice() {
return device;
}
public boolean isRunning() {
return executor != null;
}
public void start() throws Exception {
if (device == null)
throw new IllegalStateException("Not initialized");
if (executor != null)
throw new IllegalStateException("Already started");
executor = executerService();
scheduledExecutor = scheduledExecuterService();
try {
device.setExecutor(executor);
device.setScheduledExecutor(scheduledExecutor);
device.bindConnections();
} catch (Exception e) {
stop();
throw e;
}
}
public void stop() {
if (device != null)
device.unbindConnections();
if (scheduledExecutor != null)
scheduledExecutor.shutdown();
if (executor != null)
executor.shutdown();
executor = null;
scheduledExecutor = null;
}
protected ExecutorService executerService() {
return Executors.newCachedThreadPool();
}
protected ScheduledExecutorService scheduledExecuterService() {
return Executors.newSingleThreadScheduledExecutor();
}
}
(AP Photo/Mary Altaffer)
On Thursday, the National Academies of Sciences, Engineering, and Medicine will release its report on “The Economic and Fiscal Consequences of Immigration.” According to the report, first generation immigrants as a group increase the nation’s fiscal deficit. In other words, the government benefits they receive exceed the taxes paid.
The National Academies’ report provides 75-year fiscal projections for new immigrants and their descendants. The fiscal impact varies greatly according to the education level of the immigrant. Low-skill immigrants are shown to impose substantial fiscal costs that extend far into the future. The future government benefits they will receive greatly exceed the taxes they will pay.
On average, a nonelderly adult immigrant without a high school diploma entering the U.S. will create a net fiscal cost (benefits received will exceed taxes paid) in both the current generation and second generation. The average net present value of the fiscal cost of such an immigrant is estimated at $231,000, a cost that must be paid by U.S. taxpayers.
The concept of “net present value” is complex: it places a much lower value on future expenditures than on current expenditures.
One way to grasp net present value is that it represents the total amount of money that government would have to raise today and put in a bank account earning interest at 3 percent above the inflation rate in order to cover future costs.
Thus, as each adult immigrant without a high school diploma enters the country, the government would need to immediately put aside and invest $231,000 to cover the future net fiscal cost (total benefits minus total taxes) of that immigrant.
Converting a net present value figure into future outlays requires information on the exact distribution of costs over time. That data is not provided by the National Academies.
However, a rough estimate of the future net outlays to be paid by taxpayers (in constant 2012 dollars) for immigrants without a high school diploma appears to be around $640,000 per immigrant over 75 years. The average fiscal loss is around $7,551 per year (in constant 2012 dollars).
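The conversion from a net-present-value figure to a stream of annual outlays can be sketched with the standard annuity formula. The 3 percent real discount rate, the 75-year horizon, and the $231,000 NPV come from the article; the assumption of a constant annual outlay is mine, made only to illustrate the arithmetic. The report's actual cost stream is not flat, which is why the article's per-year and total figures differ somewhat from the ones this simplified sketch produces.

```python
RATE = 0.03    # discount rate: 3% above inflation (from the article)
YEARS = 75     # projection horizon used in the report

def present_value(annual_outlay, rate=RATE, years=YEARS):
    """Discounted sum of a constant annual outlay: sum of c / (1+r)^t."""
    return sum(annual_outlay / (1 + rate) ** t for t in range(1, years + 1))

# Invert the annuity formula to find the constant outlay whose NPV is $231,000:
#   annuity factor = (1 - (1+r)^-n) / r
npv = 231_000
annuity_factor = (1 - (1 + RATE) ** -YEARS) / RATE
annual_outlay = npv / annuity_factor   # roughly $7,800/year under the flat-stream assumption

# Sanity check: discounting that constant stream recovers the original NPV.
assert abs(present_value(annual_outlay) - npv) < 0.01
```

Under this deliberately simplified flat-stream assumption the implied annual cost lands near the article's $7,551-per-year figure, illustrating why the total undiscounted outlay over 75 years is several times larger than the NPV itself.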
Slightly more than 4 million adult immigrants without a high school diploma have entered the U.S. since 2000 and continue to reside here. According to the estimates in the National Academies report, the net present value of the future fiscal costs of those immigrants is $920 billion.
This means government would have to immediately raise taxes by $920 billion and put that sum into a bank account earning 3 percent plus inflation per year to cover the future fiscal losses that will be generated by those immigrants.
To cover the future cost, each taxpaying U.S. household, on average, would have to pay an immediate lump sum of over $10,000. Costs would go up in the future as more than 200,000 additional adult immigrants without a high school diploma arrive in the country each year.
Again, converting a net present value figure into future outlays requires information on the exact timing of future costs that are not provided by the National Academies. However, a rough estimate of the future net outlays (benefits minus taxes) for the 4 million adult immigrants without a high school degree who have entered the U.S. since 2000 is perhaps $2.6 trillion.
One might argue that these estimates are exaggerated because many immigrants may return to their country of origin. But the report estimates already have a re-emigration rate of 31 percent built in.
A surge of low-skill immigrant workers may push down wages and thereby reduce consumer costs. But the National Academies report indicates such consumer gains would be modest, and if the wages of less-educated immigrants are driven down, the wages of less-educated U.S. workers will fall as well. Any consumer gains would come at the cost of wage losses for the most vulnerable American workers.
One might also argue that it is misleading to assign the costs of government "public goods" such as defense and interest on the national debt to recent immigrants. But the National Academies estimates exclude such public goods costs.
Advocates of ongoing, massive low-skill immigration have suggested that low-skill immigrants generate large-scale economic externalities that benefit U.S. workers. The National Academies report finds minimal evidence of such effects.
The continuing inflow of low-skill immigrants into the U.S. creates large fiscal burdens for U.S. taxpayers in both the present and the future.
Robert Rector, a leading authority on poverty, welfare programs and immigration in America for three decades, is The Heritage Foundation’s senior research fellow in domestic policy.
Jamie Bryan Hall is a senior policy analyst in the Center for Data Analysis at The Heritage Foundation. His research focuses on immigration and other issues in support of the Institute for Family, Community, and Opportunity.
"pile_set_name": "OpenWebText2"
} |
Effects of digitalis on atrial vulnerability.
The effects of digitalis on vulnerability to atrial fibrillation and flutter were assessed in man, using the model of repetitive atrial firing initiated by post-drive atrial extrastimulation. Nine patients without heart failure or significant mitral valve disease were tested before and 30 minutes after the administration of 0.01 mg/kg ouabain. When repetitive firing was manifested by flutter, neither the flutter cycle length nor the interval from the initiating beat to the first flutter beat was consistently altered by ouabain. Repetitive firing was found at the atrial site with the shortest functional refractory period. The vulnerable zone bordered this refractory period. The functional refractory period was lengthened after ouabain, from 231 +/- 13 to 246 +/- 15 msec (mean +/- standard error of the mean) (P less than 0.025). Partly because of prolonged refractoriness, the vulnerable zone was curtailed by ouabain, from 32.2 +/- 5.7 to 9.4 +/- 4.6 msec (P less than 0.001). This result suggests a protective effect of digitalis against atrial fibrillation and flutter independent of its hemodynamic actions.
"pile_set_name": "PubMed Abstracts"
} |
Q:
img slider in JavaScript
Hi everyone!
I'm writing code for a slider inside a specific div.
I can't get the image to change. What could be the problem in this code?
I've tried 3 variants and none of them work. What could the problem be? I'm not getting any errors in the console or in the IDE!
var images = [
"BackgroundSlideImages/italy-1587287.jpg",
"BackgroundSlideImages/stream-1149882_1920.jpg",
"BackgroundSlideImages/thimble-1597514_1920.jpg",
"BackgroundSlideImages/mountain-1543308.jpg",
"BackgroundSlideImages/maligne-river-1485060_1920.jpg",
"BackgroundSlideImages/camping-1289930_1920.jpg",
"BackgroundSlideImages/beach-1092734_1920.jpg",
"BackgroundSlideImages/malaysia-911580.jpg",
"BackgroundSlideImages/night-839807.jpg",
"BackgroundSlideImages/sunset-1373171.jpg",
"BackgroundSlideImages/cycling-1533265_1920.jpg",
"BackgroundSlideImages/swim-422546_1920.jpg",
"BackgroundSlideImages/sea-79606_1920.jpg",
"BackgroundSlideImages/paragliding-1219999_1280.jpg",
"BackgroundSlideImages/ski-247787_1920.jpg",
"BackgroundSlideImages/snowboarder-1261844_1920.jpg",
"BackgroundSlideImages/water-fight-442257_1920.jpg",
"BackgroundSlideImages/family-591581_1920.jpg",
"BackgroundSlideImages/pregnant-422982_1920.jpg"
];
var imageCounter = 0;
var element = $("<img src="+images[imageCounter]>+">");
var image = $(".content");
image.append(element);
setInterval(function () {
var img = imageCounter++;
image.fadeOut(1000,function () {
$("img").attr("src",images[img]);
image.fadeIn(1000);
});
if (img == images.length){
img = 0 ;
$("img").attr("src",images[img]);
}
},10000); // Doesn't work. What can I do differently?
=================================================================
var imageCounter = 0;
//var element = $("<img src="+images[imageCounter]>+">");
var image = $("#img-page");
// var image = $(".content");
//image.append(element);
setInterval(function () {
var img = imageCounter++;
image.fadeOut(1000,function () {
image.src = images[img];
image.fadeIn(1000);
});
if (img == images.length){
img = 0 ;
image.src = images[img];
}
},10000); // This one doesn't work either!
=======================================================================
var imageCounter = 0;
var element = $("<img src="+images[imageCounter]>+">");
var image = $(".content");
image.append(element);
setInterval(function () {
var img = imageCounter++;
image.fadeOut(1000,function () {
element.src = images[img];
image.fadeIn(1000);
});
if (img == images.length){
img = 0 ;
element.src = images[img];
}
},10000); // This variant doesn't work either!
=======================================================================
.content{
border: solid;
color: brown;
min-width: 100%;
max-width: 100%;
}
.content > img {
width: 99%;
max-width: 99%;
min-width: 99%;
min-height: 775px;
max-height: 775px;
}
<div class="content">
the image should go here; I create the element dynamically
<div class="home-content" id = "home-page">
Home
</div>
</div>
A:
This isn't the right way to do it, but....
var images = [
"http://www.evilbeetgossip.com/wp-content/uploads/2010/08/ladygaga_i-d.jpg",
"http://www.evilbeetgossip.com/wp-content/uploads/2010/08/gaga_id_11.jpg",
"http://www.evilbeetgossip.com/wp-content/uploads/2010/08/gagaid.jpg",
"http://www.evilbeetgossip.com/wp-content/uploads/2010/08/gaga_id_10.jpg"
];
var imageCounter = 0;
var element = $("<img src=" + images[imageCounter] + ">");
var image = $(".content");
image.append(element);
setInterval(function() {
imageCounter++;
image.fadeOut(500, function() {
$("img").attr("src", images[imageCounter]);
image.fadeIn(500);
});
if (imageCounter == images.length) {
imageCounter = 0;
image.fadeOut(500, function() {
$("img").attr("src", images[imageCounter]);
image.fadeIn(500);
});
}
}, 1500);
.content {
border: solid;
color: brown;
max-width: 300px;
}
img{
max-width: 300px;
max-height: 500px;
}
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>
<div class="content">
<div class="home-content" id="home-page">
</div>
</div>
<?php
/**
* Zend Framework
*
* LICENSE
*
* This source file is subject to the new BSD license that is bundled
* with this package in the file LICENSE.txt.
* It is also available through the world-wide-web at this URL:
* http://framework.zend.com/license/new-bsd
* If you did not receive a copy of the license and are unable to
* obtain it through the world-wide-web, please send an email
* to license@zend.com so we can send you a copy immediately.
*
* @category Zend
* @package Zend_Pdf
* @copyright Copyright (c) 2005-2010 Zend Technologies USA Inc. (http://www.zend.com)
* @license http://framework.zend.com/license/new-bsd New BSD License
* @version $Id: Png.php 22655 2010-07-22 18:47:20Z mabe $
*/
/** Internally used classes */
require_once 'Zend/Pdf/Element/Array.php';
require_once 'Zend/Pdf/Element/Dictionary.php';
require_once 'Zend/Pdf/Element/Name.php';
require_once 'Zend/Pdf/Element/Numeric.php';
require_once 'Zend/Pdf/Element/String/Binary.php';
/** Zend_Pdf_Resource_Image */
require_once 'Zend/Pdf/Resource/Image.php';
/**
* PNG image
*
* @package Zend_Pdf
* @copyright Copyright (c) 2005-2010 Zend Technologies USA Inc. (http://www.zend.com)
* @license http://framework.zend.com/license/new-bsd New BSD License
*/
class Zend_Pdf_Resource_Image_Png extends Zend_Pdf_Resource_Image
{
const PNG_COMPRESSION_DEFAULT_STRATEGY = 0;
const PNG_COMPRESSION_FILTERED = 1;
const PNG_COMPRESSION_HUFFMAN_ONLY = 2;
const PNG_COMPRESSION_RLE = 3;
const PNG_FILTER_NONE = 0;
const PNG_FILTER_SUB = 1;
const PNG_FILTER_UP = 2;
const PNG_FILTER_AVERAGE = 3;
const PNG_FILTER_PAETH = 4;
const PNG_INTERLACING_DISABLED = 0;
const PNG_INTERLACING_ENABLED = 1;
const PNG_CHANNEL_GRAY = 0;
const PNG_CHANNEL_RGB = 2;
const PNG_CHANNEL_INDEXED = 3;
const PNG_CHANNEL_GRAY_ALPHA = 4;
const PNG_CHANNEL_RGB_ALPHA = 6;
protected $_width;
protected $_height;
protected $_imageProperties;
/**
* Object constructor
*
* @param string $imageFileName
* @throws Zend_Pdf_Exception
* @todo Add compression conversions to support compression strategies other than PNG_COMPRESSION_DEFAULT_STRATEGY.
* @todo Add pre-compression filtering.
* @todo Add interlaced image handling.
* @todo Add support for 16-bit images. Requires PDF version bump to 1.5 at least.
* @todo Add processing for all PNG chunks defined in the spec. gAMA etc.
* @todo Fix tRNS chunk support for Indexed Images to a SMask.
*/
public function __construct($imageFileName)
{
if (($imageFile = @fopen($imageFileName, 'rb')) === false ) {
require_once 'Zend/Pdf/Exception.php';
throw new Zend_Pdf_Exception( "Can not open '$imageFileName' file for reading." );
}
parent::__construct();
//Check if the file is a PNG
fseek($imageFile, 1, SEEK_CUR); //First signature byte (%)
if ('PNG' != fread($imageFile, 3)) {
require_once 'Zend/Pdf/Exception.php';
throw new Zend_Pdf_Exception('Image is not a PNG');
}
fseek($imageFile, 12, SEEK_CUR); //Signature bytes (includes the IHDR chunk). IHDR is processed linearly because it doesn't contain a variable chunk length
$wtmp = unpack('Ni',fread($imageFile, 4)); //Unpack a 4-Byte Long
$width = $wtmp['i'];
$htmp = unpack('Ni',fread($imageFile, 4));
$height = $htmp['i'];
$bits = ord(fread($imageFile, 1)); //Higher than 8 bit depths are only supported in later versions of PDF.
$color = ord(fread($imageFile, 1));
$compression = ord(fread($imageFile, 1));
$prefilter = ord(fread($imageFile,1));
if (($interlacing = ord(fread($imageFile,1))) != Zend_Pdf_Resource_Image_Png::PNG_INTERLACING_DISABLED) {
require_once 'Zend/Pdf/Exception.php';
throw new Zend_Pdf_Exception( "Only non-interlaced images are currently supported." );
}
$this->_width = $width;
$this->_height = $height;
$this->_imageProperties = array();
$this->_imageProperties['bitDepth'] = $bits;
$this->_imageProperties['pngColorType'] = $color;
$this->_imageProperties['pngFilterType'] = $prefilter;
$this->_imageProperties['pngCompressionType'] = $compression;
$this->_imageProperties['pngInterlacingType'] = $interlacing;
fseek($imageFile, 4, SEEK_CUR); //4 Byte Ending Sequence
$imageData = '';
/*
* The following loop processes PNG chunks. 4 Byte Longs are packed first give the chunk length
* followed by the chunk signature, a four-byte code. IDAT and IEND are mandatory in any PNG.
*/
while(($chunkLengthBytes = fread($imageFile, 4)) !== false) {
$chunkLengthtmp = unpack('Ni', $chunkLengthBytes);
$chunkLength = $chunkLengthtmp['i'];
$chunkType = fread($imageFile, 4);
switch($chunkType) {
case 'IDAT': //Image Data
/*
* Reads the actual image data from the PNG file. Since we know at this point that the compression
* strategy is the default strategy, we also know that this data is Zip compressed. We will either copy
* the data directly to the PDF and provide the correct FlateDecode predictor, or decompress the data
* decode the filters and output the data as a raw pixel map.
*/
$imageData .= fread($imageFile, $chunkLength);
fseek($imageFile, 4, SEEK_CUR);
break;
case 'PLTE': //Palette
$paletteData = fread($imageFile, $chunkLength);
fseek($imageFile, 4, SEEK_CUR);
break;
case 'tRNS': //Basic (non-alpha channel) transparency.
$trnsData = fread($imageFile, $chunkLength);
switch ($color) {
case Zend_Pdf_Resource_Image_Png::PNG_CHANNEL_GRAY:
$baseColor = ord(substr($trnsData, 1, 1));
$transparencyData = array(new Zend_Pdf_Element_Numeric($baseColor),
new Zend_Pdf_Element_Numeric($baseColor));
break;
case Zend_Pdf_Resource_Image_Png::PNG_CHANNEL_RGB:
$red = ord(substr($trnsData,1,1));
$green = ord(substr($trnsData,3,1));
$blue = ord(substr($trnsData,5,1));
$transparencyData = array(new Zend_Pdf_Element_Numeric($red),
new Zend_Pdf_Element_Numeric($red),
new Zend_Pdf_Element_Numeric($green),
new Zend_Pdf_Element_Numeric($green),
new Zend_Pdf_Element_Numeric($blue),
new Zend_Pdf_Element_Numeric($blue));
break;
case Zend_Pdf_Resource_Image_Png::PNG_CHANNEL_INDEXED:
//Find the first transparent color in the index, we will mask that. (This is a bit of a hack. This should be a SMask and mask all entries values).
if(($trnsIdx = strpos($trnsData, "\0")) !== false) {
$transparencyData = array(new Zend_Pdf_Element_Numeric($trnsIdx),
new Zend_Pdf_Element_Numeric($trnsIdx));
}
break;
case Zend_Pdf_Resource_Image_Png::PNG_CHANNEL_GRAY_ALPHA:
// Fall through to the next case
case Zend_Pdf_Resource_Image_Png::PNG_CHANNEL_RGB_ALPHA:
require_once 'Zend/Pdf/Exception.php';
throw new Zend_Pdf_Exception( "tRNS chunk illegal for Alpha Channel Images" );
break;
}
fseek($imageFile, 4, SEEK_CUR); //4 Byte Ending Sequence
break;
case 'IEND';
break 2; //End the loop too
default:
fseek($imageFile, $chunkLength + 4, SEEK_CUR); //Skip the section
break;
}
}
fclose($imageFile);
$compressed = true;
$imageDataTmp = '';
$smaskData = '';
switch ($color) {
case Zend_Pdf_Resource_Image_Png::PNG_CHANNEL_RGB:
$colorSpace = new Zend_Pdf_Element_Name('DeviceRGB');
break;
case Zend_Pdf_Resource_Image_Png::PNG_CHANNEL_GRAY:
$colorSpace = new Zend_Pdf_Element_Name('DeviceGray');
break;
case Zend_Pdf_Resource_Image_Png::PNG_CHANNEL_INDEXED:
if(empty($paletteData)) {
require_once 'Zend/Pdf/Exception.php';
throw new Zend_Pdf_Exception( "PNG Corruption: No palette data read for indexed type PNG." );
}
$colorSpace = new Zend_Pdf_Element_Array();
$colorSpace->items[] = new Zend_Pdf_Element_Name('Indexed');
$colorSpace->items[] = new Zend_Pdf_Element_Name('DeviceRGB');
$colorSpace->items[] = new Zend_Pdf_Element_Numeric((strlen($paletteData)/3-1));
$paletteObject = $this->_objectFactory->newObject(new Zend_Pdf_Element_String_Binary($paletteData));
$colorSpace->items[] = $paletteObject;
break;
case Zend_Pdf_Resource_Image_Png::PNG_CHANNEL_GRAY_ALPHA:
/*
* To decode PNG's with alpha data we must create two images from one. One image will contain the Gray data
* the other will contain the Gray transparency overlay data. The former will become the object data and the latter
* will become the Shadow Mask (SMask).
*/
if($bits > 8) {
require_once 'Zend/Pdf/Exception.php';
throw new Zend_Pdf_Exception("Alpha PNGs with bit depth > 8 are not yet supported");
}
$colorSpace = new Zend_Pdf_Element_Name('DeviceGray');
require_once 'Zend/Pdf/ElementFactory.php';
$decodingObjFactory = Zend_Pdf_ElementFactory::createFactory(1);
$decodingStream = $decodingObjFactory->newStreamObject($imageData);
$decodingStream->dictionary->Filter = new Zend_Pdf_Element_Name('FlateDecode');
$decodingStream->dictionary->DecodeParms = new Zend_Pdf_Element_Dictionary();
$decodingStream->dictionary->DecodeParms->Predictor = new Zend_Pdf_Element_Numeric(15);
$decodingStream->dictionary->DecodeParms->Columns = new Zend_Pdf_Element_Numeric($width);
$decodingStream->dictionary->DecodeParms->Colors = new Zend_Pdf_Element_Numeric(2); //GreyAlpha
$decodingStream->dictionary->DecodeParms->BitsPerComponent = new Zend_Pdf_Element_Numeric($bits);
$decodingStream->skipFilters();
$pngDataRawDecoded = $decodingStream->value;
//Iterate every pixel and copy out gray data and alpha channel (this will be slow)
for($pixel = 0, $pixelcount = ($width * $height); $pixel < $pixelcount; $pixel++) {
$imageDataTmp .= $pngDataRawDecoded[($pixel*2)];
$smaskData .= $pngDataRawDecoded[($pixel*2)+1];
}
$compressed = false;
$imageData = $imageDataTmp; //Overwrite image data with the gray channel without alpha
break;
case Zend_Pdf_Resource_Image_Png::PNG_CHANNEL_RGB_ALPHA:
/*
* To decode PNG's with alpha data we must create two images from one. One image will contain the RGB data
* the other will contain the Gray transparency overlay data. The former will become the object data and the latter
* will become the Shadow Mask (SMask).
*/
if($bits > 8) {
require_once 'Zend/Pdf/Exception.php';
throw new Zend_Pdf_Exception("Alpha PNGs with bit depth > 8 are not yet supported");
}
$colorSpace = new Zend_Pdf_Element_Name('DeviceRGB');
require_once 'Zend/Pdf/ElementFactory.php';
$decodingObjFactory = Zend_Pdf_ElementFactory::createFactory(1);
$decodingStream = $decodingObjFactory->newStreamObject($imageData);
$decodingStream->dictionary->Filter = new Zend_Pdf_Element_Name('FlateDecode');
$decodingStream->dictionary->DecodeParms = new Zend_Pdf_Element_Dictionary();
$decodingStream->dictionary->DecodeParms->Predictor = new Zend_Pdf_Element_Numeric(15);
$decodingStream->dictionary->DecodeParms->Columns = new Zend_Pdf_Element_Numeric($width);
$decodingStream->dictionary->DecodeParms->Colors = new Zend_Pdf_Element_Numeric(4); //RGBA
$decodingStream->dictionary->DecodeParms->BitsPerComponent = new Zend_Pdf_Element_Numeric($bits);
$decodingStream->skipFilters();
$pngDataRawDecoded = $decodingStream->value;
//Iterate every pixel and copy out rgb data and alpha channel (this will be slow)
for($pixel = 0, $pixelcount = ($width * $height); $pixel < $pixelcount; $pixel++) {
$imageDataTmp .= $pngDataRawDecoded[($pixel*4)+0] . $pngDataRawDecoded[($pixel*4)+1] . $pngDataRawDecoded[($pixel*4)+2];
$smaskData .= $pngDataRawDecoded[($pixel*4)+3];
}
$compressed = false;
$imageData = $imageDataTmp; //Overwrite image data with the RGB channel without alpha
break;
default:
require_once 'Zend/Pdf/Exception.php';
throw new Zend_Pdf_Exception( "PNG Corruption: Invalid color space." );
}
if(empty($imageData)) {
require_once 'Zend/Pdf/Exception.php';
throw new Zend_Pdf_Exception( "Corrupt PNG Image. Mandatory IDAT chunk not found." );
}
$imageDictionary = $this->_resource->dictionary;
if(!empty($smaskData)) {
/*
* Includes the Alpha transparency data as a Gray Image, then assigns the image as the Shadow Mask for the main image data.
*/
$smaskStream = $this->_objectFactory->newStreamObject($smaskData);
$smaskStream->dictionary->Type = new Zend_Pdf_Element_Name('XObject');
$smaskStream->dictionary->Subtype = new Zend_Pdf_Element_Name('Image');
$smaskStream->dictionary->Width = new Zend_Pdf_Element_Numeric($width);
$smaskStream->dictionary->Height = new Zend_Pdf_Element_Numeric($height);
$smaskStream->dictionary->ColorSpace = new Zend_Pdf_Element_Name('DeviceGray');
$smaskStream->dictionary->BitsPerComponent = new Zend_Pdf_Element_Numeric($bits);
$imageDictionary->SMask = $smaskStream;
// Encode stream with FlateDecode filter
$smaskStreamDecodeParms = array();
$smaskStreamDecodeParms['Predictor'] = new Zend_Pdf_Element_Numeric(15);
$smaskStreamDecodeParms['Columns'] = new Zend_Pdf_Element_Numeric($width);
$smaskStreamDecodeParms['Colors'] = new Zend_Pdf_Element_Numeric(1);
$smaskStreamDecodeParms['BitsPerComponent'] = new Zend_Pdf_Element_Numeric(8);
$smaskStream->dictionary->DecodeParms = new Zend_Pdf_Element_Dictionary($smaskStreamDecodeParms);
$smaskStream->dictionary->Filter = new Zend_Pdf_Element_Name('FlateDecode');
}
if(!empty($transparencyData)) {
//This is experimental and not properly tested.
$imageDictionary->Mask = new Zend_Pdf_Element_Array($transparencyData);
}
$imageDictionary->Width = new Zend_Pdf_Element_Numeric($width);
$imageDictionary->Height = new Zend_Pdf_Element_Numeric($height);
$imageDictionary->ColorSpace = $colorSpace;
$imageDictionary->BitsPerComponent = new Zend_Pdf_Element_Numeric($bits);
$imageDictionary->Filter = new Zend_Pdf_Element_Name('FlateDecode');
$decodeParms = array();
$decodeParms['Predictor'] = new Zend_Pdf_Element_Numeric(15); // Optimal prediction
$decodeParms['Columns'] = new Zend_Pdf_Element_Numeric($width);
$decodeParms['Colors'] = new Zend_Pdf_Element_Numeric((($color==Zend_Pdf_Resource_Image_Png::PNG_CHANNEL_RGB || $color==Zend_Pdf_Resource_Image_Png::PNG_CHANNEL_RGB_ALPHA)?(3):(1)));
$decodeParms['BitsPerComponent'] = new Zend_Pdf_Element_Numeric($bits);
$imageDictionary->DecodeParms = new Zend_Pdf_Element_Dictionary($decodeParms);
//Include only the image IDAT section data.
$this->_resource->value = $imageData;
//Skip double compression
if ($compressed) {
$this->_resource->skipFilters();
}
}
/**
* Image width
*/
public function getPixelWidth() {
return $this->_width;
}
/**
* Image height
*/
public function getPixelHeight() {
return $this->_height;
}
/**
* Image properties
*/
public function getProperties() {
return $this->_imageProperties;
}
}
The Metropolitan Branch Trail in Washington D.C. (Photo credit to Mixed Ginger)
Several businesses, nonprofits, and universities came together in 2017 to collaborate on a trail project that will greatly improve access in the downtown St. Louis area. The exact path of the Chouteau Greenway trail(s) has yet to be officially declared, but it is likely to run from Forest Park, through the heart of the city, all the way to the Arch. This project will greatly improve access, sustainability, and health while fostering a greater sense of pride and community. While the metro is a green solution for some parts of the city, it is only one tool in St. Louis' fairly empty toolbox. With this trail, people from all walks of life will be able to actively commute to work and play areas.
The downtown area has begun to improve over the years and become more of a destination, but the issue is just that: it is a destination for the suburbs. With the trail there will be a greater incentive for people to migrate back into the city, because being able to walk, run, skate, or bike to work is not only green, healthy, and quick but also trendy.
Great Rivers Greenway is the nonprofit that is spearheading this effort, in addition to building a number of other trails throughout the city, and has currently narrowed the 124 proposals down to a final four. Throughout the whole process, Great Rivers Greenway and its partners have put effort into hearing community voices. You can become more a part of the community by getting involved in this thoughtful and needed project. The next input period will be in April; we'll be sure to keep you posted.
If you’re interested in the economic, health, and community benefits of trails I suggest clicking through this website. I’ll warn you now though that it’s easy to go down a rabbit hole and lose a day.
If you have any thoughts on the article or suggestions for future articles, leave a comment below or shoot us an email at moderndope@gmail.com.
The Wonder Years' Danica McKellar Is Teaching Her Baby The Quadratic Formula
The Wonder Years' Danica McKellar really likes math. She found her niche after ending her stint on the popular coming-of-age show, eventually publishing several math books. Her most recent title, Girls Get Curves: Geometry Takes Shape, was written to help young women get pumped about shapes and numbers. Now a new mommy, McKellar is hoping to inspire a love of math in her own child.
McKellar recently came out to talk about her 23-month-old boy's current math abilities. Apparently, little Draco likes to name geometric shapes and count the toes on his feet, although, as McKellar recently told People, his numbers don't always add up.
“We’re working on some things. He knows his colors and his shapes. He still gets seven when he counts the toes on one foot — he really likes seven right now!”
Next on her list, however, the mom plans to work with Draco until he has some formulas down pat. If she manages what she is attempting, her little tyke will be way ahead of the linear curve.
“He likes it when I recite Pi. And I’m trying to teach him the quadratic formula, he hasn’t quite gotten it yet. It’s sung to the tune of ‘Pop Goes the Weasel’.”
McKellar most recently made headlines when she filed for divorce from her long-term partner, Mike Verta. Despite the recent stressors, it is good to see the 37-year-old mom so keenly devoted to her little one. Pop Blend would like to wish her the best in her quadratic equation efforts. A mom with brains and the willingness to put in time certainly deserves it.
#include <rose.h>

class SgVarSubstCopy : public SgCopyHelp
{
    SgVariableSymbol *sourceSymbol;
    SgExpression *replacementExpression;

public:
    SgVarSubstCopy(SgVariableSymbol *sourceSymbol, SgExpression *replacementExpression)
        : sourceSymbol(sourceSymbol), replacementExpression(replacementExpression)
    {}

    SgNode *copyAst(const SgNode *n)
    {
        if (const SgVarRefExp *vr = isSgVarRefExp(const_cast<SgNode *>(n)))
        {
            if (vr->get_symbol() == sourceSymbol)
            {
                return replacementExpression->copy(*this);
            }
        }
        return n->copy(*this);
    }
};

#define SgNULL_FILE Sg_File_Info::generateDefaultFileInfoForTransformationNode()

int main(int argc, char **argv)
{
    // Initialize and check compatibility. See Rose::initialize
    ROSE_INITIALIZE;

    // Build the AST used by ROSE
    SgProject *project = frontend(argc, argv);

    // Find the exampleFunction function
    SgGlobal *global = project->get_file(0).get_globalScope();
    SgFunctionSymbol *exampleFunctionSym = global->lookup_function_symbol("exampleFunction");
    ROSE_ASSERT(exampleFunctionSym != NULL);
    SgFunctionDeclaration *exampleFunctionDecl = exampleFunctionSym->get_declaration();
    ROSE_ASSERT(exampleFunctionDecl != NULL);
    SgFunctionDefinition *exampleFunctionDef = exampleFunctionDecl->get_definition();
    ROSE_ASSERT(exampleFunctionDef != NULL);

    // Find its first parameter
    SgInitializedName *firstParamName = exampleFunctionDecl->get_args().front();
    ROSE_ASSERT(firstParamName != NULL);
    SgVariableSymbol *firstParamSym = exampleFunctionDef->lookup_var_symbol(firstParamName->get_name());
    ROSE_ASSERT(firstParamSym != NULL);

    // Construct the expression to substitute for
    SgIntVal *twenty = new SgIntVal(SgNULL_FILE, 20);

    // Create our copy help mechanism with the required parameters
    SgVarSubstCopy ourCopyHelp(firstParamSym, twenty);

    // Do the copy
    SgNode *exampleFunctionDeclCopyNode = exampleFunctionDecl->copy(ourCopyHelp);
    ROSE_ASSERT(exampleFunctionDeclCopyNode != NULL);
    SgFunctionDeclaration *exampleFunctionDeclCopy = isSgFunctionDeclaration(exampleFunctionDeclCopyNode);
    ROSE_ASSERT(exampleFunctionDeclCopy != NULL);

    // Change the name of the new function
    exampleFunctionDeclCopy->set_name("exampleFunction20");
    global->get_symbol_table()->insert(exampleFunctionDeclCopy->get_name(), new SgFunctionSymbol(exampleFunctionDeclCopy));

    // Add function to global scope
    global->append_declaration(exampleFunctionDeclCopy);

    // Generate source code from AST and call the vendor's compiler
    return backend(project);
}
lolsanrolls submitted to thisisthinprivilege:
Thin privilege is being able to buy all types of food in your weekly shop and not have to be concerned with strangers judging you for your purchases.
Each week I buy all our groceries and put all the “junk” food (which is generally for my thin partner) at the start of the counter and ensure that all the fruit and veges are placed at the end. I do this so the junk gets bagged up fast and the customer behind me will be less likely to say something shitty if all they see is veges.
If I do not do this, if I put the veges up front or put the junk food wherever on the counter and am shopping alone (fat phobes are less likely to be assholes to me when I’m with someone thin), without a doubt the customer behind me will make a comment or pull a face and look me up and down.
“Where are your greens?!” -already bagged up you shithead
“That’s a lot of food for one person”- i buy for 2, thanks
“I hope you don’t eat all that in one week”- we might, or we might not. it certainly won’t make a difference to you.
“That’s a lot of bread. You know carbs aren’t that good for you?”- piss off, i love bread. Did you know that diet coke you’re buying is way worse for you?
“I’m just trying to help, Dear. You look like you need it”- i am an independent human being who doesn’t need your passive-aggressive attempt to shame me.
EVERYTHING a fat person does in public is judged negatively. And if we fight back or tell them to mind their own business we become the assholes! I’m so sick of passive aggressive nosy people. Mind your own selves, people!
Description
All that's needed in a gym or travel bag. This shower gel gently cleanses the skin and hair, leaving the pleasantly aromatic and spicy fragrance of Eau de L’OCCITAN. No alcohol, No animal ingredients (vegan), No BHA, No BHT, No EDTA, No formol or formaldehyde, No mineral oil (oil-free), No parabens, No phenoxyethanol, No phthalates, No SLES, No SLS, No synthetic colorants, No triclosan.
A reader called our attention to a bug in Revolut's system that may affect many people. Using the service is not entirely painless (whose is?), and this particular bug stems from an inaccuracy in the app's contact-list matching feature.
Our reader registered for the service three weeks ago and started testing what Revolut can do. He agreed to let the app go through his phone's contact list and pick out the contacts who also use the service. He was surprised to see that one of his acquaintances (let's call her Kati) appeared on the list, even though she does not usually go in for such high-tech tools. He figured he would test it and sent Kati 15 forints. The transfer went through without a problem, but he still called Kati to ask whether the money had arrived on Revolut.
What is Revolut?
– Kati asked back.
Prompted by that question, our reader checked the transaction details, where even the recipient's name no longer matched: instead of Kati, the money had gone to a certain Jolán. The picture shows clearly that the two people's phone numbers are nearly identical; practically only the country code differs (Jolán has a UK number, which starts with +44). This means that Revolut does not account for similar digit combinations occurring among phone numbers in different countries, which is exactly what country codes in the international numbering system are meant to resolve. That is precisely what Revolut fails to check.
Photo: Reader's photo / Index
Our reader dutifully wrote to Revolut's customer service to say the bug should be fixed. He went through several rounds and chatted with several agents, but the outcome was always that he should not worry, they would recover his 15 forints.
This phone-number mix-up is also why some people who try to register get the message that their number is already in use.
What happens in those cases is that someone in another country, behind a different country code, has the same phone number as ours.
We also contacted Revolut through the in-app chat. The agent wrote that they know about the problem and are working on a fix. Until the bug is fixed, they suggest that everyone ask the recipient beforehand whether they have Revolut. In an unlucky case, the money may go not to the person we meant to pay but to someone else whose phone number resembles our recipient's.
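The failure mode described above can be sketched in a few lines. This is a hypothetical illustration, not Revolut's actual code: matching only the national part of a number collides across countries, while comparing full E.164 numbers (country code included) does not.

```python
# Hypothetical sketch of the contact-matching bug. Names and number
# formats are illustrative; trunk-prefix handling is deliberately omitted.

def to_e164(number, default_country_code="+36"):
    """Normalize a number to a simplified E.164-style form: numbers with a
    '+' or '00' prefix keep their country code, bare national numbers get
    the default country code prepended."""
    digits = "".join(ch for ch in number if ch.isdigit() or ch == "+")
    if digits.startswith("+"):
        return digits
    if digits.startswith("00"):   # international dialing prefix
        return "+" + digits[2:]
    return default_country_code + digits

def naive_match(a, b):
    """The buggy comparison: drop the 2-digit country code, compare the rest."""
    return to_e164(a)[3:] == to_e164(b)[3:]

kati = "+36 20 123 4567"    # Hungarian number (illustrative)
jolan = "+44 20 123 4567"   # UK number with the same national digits

print(naive_match(kati, jolan))         # True  - the bug: they "match"
print(to_e164(kati) == to_e164(jolan))  # False - country codes differ
```

Comparing the fully normalized numbers is exactly the distinction the international numbering system's country codes exist to provide.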
Kazuo Yamada
Kazuo Yamada (1912 – 13 August 1991) was a Japanese conductor and composer.
Biography
Yamada was born in Tokyo in 1912. He began his studies at Gakushuin and then the Tokyo University of the Arts (formerly the Tokyo Music School), where he studied piano with Leo Sirota and Paul Weingarten and composition with Klaus Pringsheim, graduating at the top of his class. As a composer he formed the orchestra 'Promethée'. In 1937 he was awarded first prize by the Japan Broadcasting Corporation for his symphonic music works, and in 1938 he also received an award from the New Symphony Orchestra for his symphonic poem 'Songs that youth can sing', as well as the Weingarten Award for the symphonic work 'Kiso'.
He studied conducting technique under the tutelage of Józef Rosenstock and made his conducting debut with the New Symphony Orchestra in 1940. In 1942 he was appointed chief conductor of the Japan Symphony Orchestra, reorganized from the New Symphony Orchestra (now the NHK Symphony Orchestra), and over the next 13 years contributed greatly to the improvement of the orchestra.
He was awarded the Mainichi Music Award in 1949 for the Japan premiere of the opera Hänsel und Gretel, sponsored by NHK.
He contributed immeasurably to many Japan premieres of works such as Symphonies No. 2 and No. 8 (Mahler), Symphony No. 5 (Shostakovich), Don Quixote (R. Strauss), Six Pieces for Orchestra (Webern), The Rite of Spring (Stravinsky), Le roi David (Honegger) and Sept haïkaï (Messiaen).
Yamada's activities overseas gradually began to take off from 1955, when he guest-conducted European orchestras such as the USSR Symphony Orchestra, the Slovak Philharmonic, and the Dresden Philharmonic, as well as orchestras in North and South America and South Africa, and he moreover recorded with the Vienna Symphony Orchestra in 1988.
In 1976, Yamada was awarded the Medal with Purple Ribbon (for artistic excellence) by the Government of Japan, and in 1978 he firmly established his reputation for excellence in the 'World of Kazuo Yamada' series of performances. He was further awarded the Minister of Education Award for Fine Arts in 1979, the Order of the Rising Sun, Gold Rays with Rosette, Fourth Class in 1984, and the Japan Art Academy Award in 1986.
He held a number of key posts, such as Chief Resident Conductor & Artistic Advisor of the Kyoto Symphony Orchestra, Artistic Director of the Gunma Symphony Orchestra, Honorary Conductor of the Japan Shinsei Symphony Orchestra, Music Director of the Nissho Chorus, Chairman of the Japan Mahler Society, and Professor at the Tokyo University of the Arts, and gave stellar performances as a guest conductor in Japan as well as overseas.
He was appointed Music Director of the Kanagawa Philharmonic in 1991, a post he held until his sudden death on 13 August of the same year, at 78 years of age.
A live recording of Yamada conducting Symphony No. 9 (Mahler) with the New Japan Philharmonic was released in 2011, a quarter-century after the 1986 performance, which was awarded the Grand Prize for recording in the Arts Festival of the Japan Ministry for Cultural Affairs.
Major works
Violin Sonata
Sonata for Cello Solo
String Quartet No. 1
Notturno for flute and piano
From Soshigaya for voice and piano
Footnotes
External links
Tribute (in Japanese)
Category:1912 births
Category:1991 deaths
Category:20th-century classical composers
Category:20th-century conductors (music)
Category:20th-century Japanese composers
Category:Japanese classical composers
Category:Japanese conductors (music)
Category:Japanese male classical composers
Category:20th-century male musicians
Q:
How to programmatically connect to a particular PC of unknown IP from a WiFi network?
I want to connect and transfer data between a PC and an Android device both of which are on the same local WiFi network.
I cannot use the local IP as such in the code to actually get the connection going because it does not remain constant every time. I know I can set the local IP to be constant but I am looking for a more general solution to the problem.
Having a central server is also not what I'm looking for, because I want to transfer data offline.
I am not an expert on networks, as you might have already guessed, so if I am missing something, let me know. Also, is there some API in Android that could do this?
A:
You should look at broadcasting. This is a technique of sending your packets to all devices (IPs) in a subnet. It would look something like this:
Android 192.168.0.101: Send message packet to broadcast address
192.168.0.255 (the broadcast address of the 192.168.0.0/24 subnet)
PC 192.168.0.110: Reply to 192.168.0.101's broadcast message
Android now knows the PC's IP address and can communicate directly
It's really simple. You can find a Java sample in this SO question: https://stackoverflow.com/questions/12999425/simple-udp-broadcast-client-and-server-on-different-machines
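The same handshake can be sketched in a few lines of Python. This runs both sides on loopback so it is self-contained; on a real LAN you would send the probe to the subnet's broadcast address (e.g. 192.168.0.255 for a /24) instead of 127.0.0.1, and the message strings and port number here are arbitrary:

```python
import socket
import threading

DISCOVERY_PORT = 50000          # arbitrary port for the example
ready = threading.Event()

def responder():
    """The PC side: answer any probe so the sender learns our address."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("127.0.0.1", DISCOVERY_PORT))
    ready.set()                 # listening; the probe may now be sent
    data, addr = srv.recvfrom(1024)
    if data == b"DISCOVER_PC":
        srv.sendto(b"PC_HERE", addr)
    srv.close()

def discover():
    """The Android side: send a probe, learn the PC's IP from the reply."""
    cli = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    cli.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    cli.settimeout(5.0)
    cli.sendto(b"DISCOVER_PC", ("127.0.0.1", DISCOVERY_PORT))
    data, (ip, _port) = cli.recvfrom(1024)
    cli.close()
    return ip if data == b"PC_HERE" else None

t = threading.Thread(target=responder)
t.start()
ready.wait()
pc_ip = discover()              # the previously unknown IP of the PC
t.join()
print(pc_ip)
```

The key point is that the reply datagram carries the responder's address implicitly (`recvfrom` returns it), so neither side needs to know the other's IP in advance.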
Monoranjan Bezboruah 952 days ago
President Trump in tonight's State of the Union address asked for the same thing: to have the various concerned officials meet and expedite project permissions within a maximum period of 24 months, preferably 12. Democracies lapse into stupor, and India simply cannot afford to let all these nonsensical rules and petty officialdoms delay the move ahead. The largest office building, the Pentagon, was built in 8 months; today, obtaining the various needed permissions would take over 12 years! The same thing happens in India: officials simply give you the runaround for no good reason. Way to go: set the shortest deadline, and such a short time frame should be sufficient to do the needful, as time and tide wait for none. By the way, China is the fastest at getting things done or constructed, South Korea second. No wonder they are way ahead of us!
Total knee arthroplasty (TKA) is one of the most clinically successful and cost-effective interventions in health care, with numerous investigators reporting excellent long-term results in terms of reducing pain, improving function, and increasing quality of life in patients. Among the many factors that influence TKA outcomes, surgical technique and final prosthesis alignment appear the most critical. It is known that an accurate restoration of the overall lower limb alignment reduces polyethylene wear and therefore the risk of aseptic loosening. Furthermore, errors in varus/valgus alignment greater than 3 degrees for the femoral or tibial component have been shown to result in less satisfactory clinical outcomes and a higher risk of loosening. Also, malrotation in the transverse plane of the femoral or tibial components has been shown to affect correct function of the extensor mechanism of the replaced knee, which may lead to anterior knee pain. The elevation of the joint line limits the range of motion of the knee. Finally, instability, malalignment, and/or malposition are the major reasons for TKA revision and, except for infection, are all related to incorrect surgical technique. Although various guides for alignment have been designed to improve accuracy, several limitations remain.
Computer Assisted Orthopedic Surgery (CAOS) is a discipline where computer technology is applied pre-, intra- and/or post-operatively to improve the outcome of orthopedic surgical procedures. The principal idea behind CAOS is that operative outcomes will be improved through the use of computer technology. Taking the example of joint replacement, the task of the surgeon is to integrate the new joint components into the patient's existing anatomy. CAOS technologies allow the surgeon to: plan the component placement in advance, including determination of the appropriate component sizes; measure the intra-operative placement of the components in real time, making sure that the plan is adhered to; and measure the post-operative result.
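The intra-operative measurement CAOS enables boils down to a tolerance check against the plan. A hypothetical sketch, using the 3-degree varus/valgus threshold cited above; all numbers and names are illustrative, not from any real navigation system:

```python
# Illustrative check of measured component alignment against the plan.
TOLERANCE_DEG = 3.0  # varus/valgus error beyond this is linked to poorer outcomes

def within_tolerance(planned_deg, measured_deg, tolerance=TOLERANCE_DEG):
    """True if the measured coronal alignment is within tolerance of the plan.
    Positive values denote valgus, negative varus (sign convention assumed)."""
    return abs(measured_deg - planned_deg) <= tolerance

# Planned neutral (0 deg) femoral component, measured 2.1 deg valgus:
print(within_tolerance(0.0, 2.1))   # True - within the 3-degree window
# Measured 4.5 deg varus would flag the cut for correction:
print(within_tolerance(0.0, -4.5))  # False
```

In a real system the measured angle would come from tracked instruments rather than a literal, but the accept/flag decision has this shape.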
/**
 * Converts an ASCII `string` to an array.
 *
 * @private
 * @param {string} string The string to convert.
 * @returns {Array} Returns the converted array.
 */
function asciiToArray(string) {
  return string.split('');
}

module.exports = asciiToArray;
// This file is part of Substrate.
// Copyright (C) 2017-2020 Parity Technologies (UK) Ltd.
// SPDX-License-Identifier: GPL-3.0-or-later WITH Classpath-exception-2.0
// This program is free software: you can redistribute it and/or modify
// it under the terms of the GNU General Public License as published by
// the Free Software Foundation, either version 3 of the License, or
// (at your option) any later version.
// This program is distributed in the hope that it will be useful,
// but WITHOUT ANY WARRANTY; without even the implied warranty of
// MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
// GNU General Public License for more details.
// You should have received a copy of the GNU General Public License
// along with this program. If not, see <https://www.gnu.org/licenses/>.
//! Communication streams for the polite-grandpa networking protocol.
//!
//! GRANDPA nodes communicate over a gossip network, where messages are not sent to
//! peers until they have reached a given round.
//!
//! Rather than expressing protocol rules,
//! polite-grandpa just carries a notion of impoliteness. Nodes which pass some arbitrary
//! threshold of impoliteness are removed. Messages are either costly, or beneficial.
//!
//! For instance, it is _impolite_ to send the same message more than once.
//! In the future, there will be a fallback for allowing sending the same message
//! under certain conditions that are used to un-stick the protocol.
use futures::{prelude::*, channel::mpsc};
use log::{debug, trace};
use parking_lot::Mutex;
use prometheus_endpoint::Registry;
use std::{pin::Pin, sync::Arc, task::{Context, Poll}};
use sp_core::traits::BareCryptoStorePtr;
use finality_grandpa::Message::{Prevote, Precommit, PrimaryPropose};
use finality_grandpa::{voter, voter_set::VoterSet};
use sc_network::{NetworkService, ReputationChange};
use sc_network_gossip::{GossipEngine, Network as GossipNetwork};
use parity_scale_codec::{Encode, Decode};
use sp_runtime::traits::{Block as BlockT, Hash as HashT, Header as HeaderT, NumberFor};
use sc_telemetry::{telemetry, CONSENSUS_DEBUG, CONSENSUS_INFO};
use crate::{
CatchUp, Commit, CommunicationIn, CommunicationOutH,
CompactCommit, Error, Message, SignedMessage,
};
use crate::environment::HasVoted;
use gossip::{
FullCatchUpMessage,
FullCommitMessage,
GossipMessage,
GossipValidator,
PeerReport,
VoteMessage,
};
use sp_finality_grandpa::{
AuthorityId, AuthoritySignature, SetId as SetIdNumber, RoundNumber,
};
use sp_utils::mpsc::TracingUnboundedReceiver;
pub mod gossip;
mod periodic;
#[cfg(test)]
pub(crate) mod tests;
pub use sp_finality_grandpa::GRANDPA_ENGINE_ID;
pub const GRANDPA_PROTOCOL_NAME: &'static str = "/paritytech/grandpa/1";
// cost scalars for reporting peers.
mod cost {
use sc_network::ReputationChange as Rep;
pub(super) const PAST_REJECTION: Rep = Rep::new(-50, "Grandpa: Past message");
pub(super) const BAD_SIGNATURE: Rep = Rep::new(-100, "Grandpa: Bad signature");
pub(super) const MALFORMED_CATCH_UP: Rep = Rep::new(-1000, "Grandpa: Malformed catch-up");
pub(super) const MALFORMED_COMMIT: Rep = Rep::new(-1000, "Grandpa: Malformed commit");
pub(super) const FUTURE_MESSAGE: Rep = Rep::new(-500, "Grandpa: Future message");
pub(super) const UNKNOWN_VOTER: Rep = Rep::new(-150, "Grandpa: Unknown voter");
pub(super) const INVALID_VIEW_CHANGE: Rep = Rep::new(-500, "Grandpa: Invalid view change");
pub(super) const PER_UNDECODABLE_BYTE: i32 = -5;
pub(super) const PER_SIGNATURE_CHECKED: i32 = -25;
pub(super) const PER_BLOCK_LOADED: i32 = -10;
pub(super) const INVALID_CATCH_UP: Rep = Rep::new(-5000, "Grandpa: Invalid catch-up");
pub(super) const INVALID_COMMIT: Rep = Rep::new(-5000, "Grandpa: Invalid commit");
pub(super) const OUT_OF_SCOPE_MESSAGE: Rep = Rep::new(-500, "Grandpa: Out-of-scope message");
pub(super) const CATCH_UP_REQUEST_TIMEOUT: Rep = Rep::new(-200, "Grandpa: Catch-up request timeout");
// cost of answering a catch up request
pub(super) const CATCH_UP_REPLY: Rep = Rep::new(-200, "Grandpa: Catch-up reply");
pub(super) const HONEST_OUT_OF_SCOPE_CATCH_UP: Rep = Rep::new(-200, "Grandpa: Out-of-scope catch-up");
}
// benefit scalars for reporting peers.
mod benefit {
use sc_network::ReputationChange as Rep;
pub(super) const NEIGHBOR_MESSAGE: Rep = Rep::new(100, "Grandpa: Neighbor message");
pub(super) const ROUND_MESSAGE: Rep = Rep::new(100, "Grandpa: Round message");
pub(super) const BASIC_VALIDATED_CATCH_UP: Rep = Rep::new(200, "Grandpa: Catch-up message");
pub(super) const BASIC_VALIDATED_COMMIT: Rep = Rep::new(100, "Grandpa: Commit");
pub(super) const PER_EQUIVOCATION: i32 = 10;
}
/// A type that ties together our local authority id and a keystore where it is
/// available for signing.
pub struct LocalIdKeystore((AuthorityId, BareCryptoStorePtr));
impl LocalIdKeystore {
/// Returns a reference to our local authority id.
fn local_id(&self) -> &AuthorityId {
&(self.0).0
}
/// Returns a reference to the keystore.
fn keystore(&self) -> &BareCryptoStorePtr {
&(self.0).1
}
}
impl AsRef<BareCryptoStorePtr> for LocalIdKeystore {
fn as_ref(&self) -> &BareCryptoStorePtr {
self.keystore()
}
}
impl From<(AuthorityId, BareCryptoStorePtr)> for LocalIdKeystore {
fn from(inner: (AuthorityId, BareCryptoStorePtr)) -> LocalIdKeystore {
LocalIdKeystore(inner)
}
}
/// If the voter set is larger than this value some telemetry events are not
/// sent to avoid increasing usage resource on the node and flooding the
/// telemetry server (e.g. received votes, received commits.)
const TELEMETRY_VOTERS_LIMIT: usize = 10;
/// A handle to the network.
///
/// Something that provides both the capabilities needed for the `gossip_network::Network` trait as
/// well as the ability to set a fork sync request for a particular block.
pub trait Network<Block: BlockT>: GossipNetwork<Block> + Clone + Send + 'static {
/// Notifies the sync service to try and sync the given block from the given
/// peers.
///
/// If the given vector of peers is empty then the underlying implementation
/// should make a best effort to fetch the block from any peers it is
/// connected to (NOTE: this assumption will change in the future #3629).
fn set_sync_fork_request(&self, peers: Vec<sc_network::PeerId>, hash: Block::Hash, number: NumberFor<Block>);
}
impl<B, H> Network<B> for Arc<NetworkService<B, H>> where
B: BlockT,
H: sc_network::ExHashT,
{
fn set_sync_fork_request(&self, peers: Vec<sc_network::PeerId>, hash: B::Hash, number: NumberFor<B>) {
NetworkService::set_sync_fork_request(self, peers, hash, number)
}
}
/// Create a unique topic for a round and set-id combo.
pub(crate) fn round_topic<B: BlockT>(round: RoundNumber, set_id: SetIdNumber) -> B::Hash {
<<B::Header as HeaderT>::Hashing as HashT>::hash(format!("{}-{}", set_id, round).as_bytes())
}
/// Create a unique topic for global messages on a set ID.
pub(crate) fn global_topic<B: BlockT>(set_id: SetIdNumber) -> B::Hash {
<<B::Header as HeaderT>::Hashing as HashT>::hash(format!("{}-GLOBAL", set_id).as_bytes())
}
/// Bridge between the underlying network service, gossiping consensus messages and Grandpa
pub(crate) struct NetworkBridge<B: BlockT, N: Network<B>> {
service: N,
gossip_engine: Arc<Mutex<GossipEngine<B>>>,
validator: Arc<GossipValidator<B>>,
/// Sender side of the neighbor packet channel.
///
/// Packets sent into this channel are processed by the `NeighborPacketWorker` and passed on to
/// the underlying `GossipEngine`.
neighbor_sender: periodic::NeighborPacketSender<B>,
/// `NeighborPacketWorker` processing packets sent through the `NeighborPacketSender`.
//
// `NetworkBridge` is required to be cloneable, thus one needs to be able to clone its children,
// thus one has to wrap `neighbor_packet_worker` with an `Arc` `Mutex`.
neighbor_packet_worker: Arc<Mutex<periodic::NeighborPacketWorker<B>>>,
/// Receiver side of the peer report stream populated by the gossip validator, forwarded to the
/// gossip engine.
//
// `NetworkBridge` is required to be cloneable, thus one needs to be able to clone its children,
// thus one has to wrap gossip_validator_report_stream with an `Arc` `Mutex`. Given that it is
// just an `UnboundedReceiver`, one could also switch to a multi-producer-*multi*-consumer
// channel implementation.
gossip_validator_report_stream: Arc<Mutex<TracingUnboundedReceiver<PeerReport>>>,
}
impl<B: BlockT, N: Network<B>> Unpin for NetworkBridge<B, N> {}
impl<B: BlockT, N: Network<B>> NetworkBridge<B, N> {
/// Create a new NetworkBridge to the given NetworkService. Returns the service
/// handle.
/// On creation it will register previous rounds' votes with the gossip
/// service taken from the VoterSetState.
pub(crate) fn new(
service: N,
config: crate::Config,
set_state: crate::environment::SharedVoterSetState<B>,
prometheus_registry: Option<&Registry>,
) -> Self {
let (validator, report_stream) = GossipValidator::new(
config,
set_state.clone(),
prometheus_registry,
);
let validator = Arc::new(validator);
let gossip_engine = Arc::new(Mutex::new(GossipEngine::new(
service.clone(),
GRANDPA_ENGINE_ID,
GRANDPA_PROTOCOL_NAME,
validator.clone()
)));
{
// register all previous votes with the gossip service so that they're
// available to peers potentially stuck on a previous round.
let completed = set_state.read().completed_rounds();
let (set_id, voters) = completed.set_info();
validator.note_set(SetId(set_id), voters.to_vec(), |_, _| {});
for round in completed.iter() {
let topic = round_topic::<B>(round.number, set_id);
// we need to note the round with the gossip validator otherwise
// messages will be ignored.
validator.note_round(Round(round.number), |_, _| {});
for signed in round.votes.iter() {
let message = gossip::GossipMessage::Vote(
gossip::VoteMessage::<B> {
message: signed.clone(),
round: Round(round.number),
set_id: SetId(set_id),
}
);
gossip_engine.lock().register_gossip_message(
topic,
message.encode(),
);
}
trace!(target: "afg",
"Registered {} messages for topic {:?} (round: {}, set_id: {})",
round.votes.len(),
topic,
round.number,
set_id,
);
}
}
let (neighbor_packet_worker, neighbor_packet_sender) = periodic::NeighborPacketWorker::new();
NetworkBridge {
service,
gossip_engine,
validator,
neighbor_sender: neighbor_packet_sender,
neighbor_packet_worker: Arc::new(Mutex::new(neighbor_packet_worker)),
gossip_validator_report_stream: Arc::new(Mutex::new(report_stream)),
}
}
/// Note the beginning of a new round to the `GossipValidator`.
pub(crate) fn note_round(
&self,
round: Round,
set_id: SetId,
voters: &VoterSet<AuthorityId>,
) {
// is a no-op if currently in that set.
self.validator.note_set(
set_id,
voters.iter().map(|(v, _)| v.clone()).collect(),
|to, neighbor| self.neighbor_sender.send(to, neighbor),
);
self.validator.note_round(
round,
|to, neighbor| self.neighbor_sender.send(to, neighbor),
);
}
/// Get a stream of signature-checked round messages from the network as well as a sink for round messages to the
/// network all within the current set.
pub(crate) fn round_communication(
&self,
keystore: Option<LocalIdKeystore>,
round: Round,
set_id: SetId,
voters: Arc<VoterSet<AuthorityId>>,
has_voted: HasVoted<B>,
) -> (
impl Stream<Item = SignedMessage<B>> + Unpin,
OutgoingMessages<B>,
) {
self.note_round(
round,
set_id,
&*voters,
);
let keystore = keystore.and_then(|ks| {
let id = ks.local_id();
if voters.contains(id) {
Some(ks)
} else {
None
}
});
let topic = round_topic::<B>(round.0, set_id.0);
let incoming = self.gossip_engine.lock().messages_for(topic)
.filter_map(move |notification| {
let decoded = GossipMessage::<B>::decode(&mut ¬ification.message[..]);
match decoded {
Err(ref e) => {
debug!(target: "afg", "Skipping malformed message {:?}: {}", notification, e);
future::ready(None)
}
Ok(GossipMessage::Vote(msg)) => {
// check signature.
if !voters.contains(&msg.message.id) {
debug!(target: "afg", "Skipping message from unknown voter {}", msg.message.id);
return future::ready(None);
}
if voters.len().get() <= TELEMETRY_VOTERS_LIMIT {
match &msg.message.message {
PrimaryPropose(propose) => {
telemetry!(CONSENSUS_INFO; "afg.received_propose";
"voter" => ?format!("{}", msg.message.id),
"target_number" => ?propose.target_number,
"target_hash" => ?propose.target_hash,
);
},
Prevote(prevote) => {
telemetry!(CONSENSUS_INFO; "afg.received_prevote";
"voter" => ?format!("{}", msg.message.id),
"target_number" => ?prevote.target_number,
"target_hash" => ?prevote.target_hash,
);
},
Precommit(precommit) => {
telemetry!(CONSENSUS_INFO; "afg.received_precommit";
"voter" => ?format!("{}", msg.message.id),
"target_number" => ?precommit.target_number,
"target_hash" => ?precommit.target_hash,
);
},
};
}
future::ready(Some(msg.message))
}
_ => {
debug!(target: "afg", "Skipping unknown message type");
future::ready(None)
}
}
});
let (tx, out_rx) = mpsc::channel(0);
let outgoing = OutgoingMessages::<B> {
keystore,
round: round.0,
set_id: set_id.0,
network: self.gossip_engine.clone(),
sender: tx,
has_voted,
};
// Combine incoming votes from external GRANDPA nodes with outgoing
// votes from our own GRANDPA voter to have a single
// vote-import-pipeline.
let incoming = stream::select(incoming, out_rx);
(incoming, outgoing)
}
/// Set up the global communication streams.
pub(crate) fn global_communication(
&self,
set_id: SetId,
voters: Arc<VoterSet<AuthorityId>>,
is_voter: bool,
) -> (
impl Stream<Item = CommunicationIn<B>>,
impl Sink<CommunicationOutH<B, B::Hash>, Error = Error> + Unpin,
) {
self.validator.note_set(
set_id,
voters.iter().map(|(v, _)| v.clone()).collect(),
|to, neighbor| self.neighbor_sender.send(to, neighbor),
);
let topic = global_topic::<B>(set_id.0);
let incoming = incoming_global(
self.gossip_engine.clone(),
topic,
voters,
self.validator.clone(),
self.neighbor_sender.clone(),
);
let outgoing = CommitsOut::<B>::new(
self.gossip_engine.clone(),
set_id.0,
is_voter,
self.validator.clone(),
self.neighbor_sender.clone(),
);
let outgoing = outgoing.with(|out| {
let voter::CommunicationOut::Commit(round, commit) = out;
future::ok((round, commit))
});
(incoming, outgoing)
}
/// Notifies the sync service to try and sync the given block from the given
/// peers.
///
/// If the given vector of peers is empty then the underlying implementation
/// should make a best effort to fetch the block from any peers it is
/// connected to (NOTE: this assumption will change in the future #3629).
pub(crate) fn set_sync_fork_request(
&self,
peers: Vec<sc_network::PeerId>,
hash: B::Hash,
number: NumberFor<B>
) {
Network::set_sync_fork_request(&self.service, peers, hash, number)
}
}
impl<B: BlockT, N: Network<B>> Future for NetworkBridge<B, N> {
type Output = Result<(), Error>;
fn poll(self: Pin<&mut Self>, cx: &mut Context) -> Poll<Self::Output> {
loop {
match self.neighbor_packet_worker.lock().poll_next_unpin(cx) {
Poll::Ready(Some((to, packet))) => {
self.gossip_engine.lock().send_message(to, packet.encode());
},
Poll::Ready(None) => return Poll::Ready(
Err(Error::Network("Neighbor packet worker stream closed.".into()))
),
Poll::Pending => break,
}
}
loop {
match self.gossip_validator_report_stream.lock().poll_next_unpin(cx) {
Poll::Ready(Some(PeerReport { who, cost_benefit })) => {
self.gossip_engine.lock().report(who, cost_benefit);
},
Poll::Ready(None) => return Poll::Ready(
Err(Error::Network("Gossip validator report stream closed.".into()))
),
Poll::Pending => break,
}
}
match self.gossip_engine.lock().poll_unpin(cx) {
Poll::Ready(()) => return Poll::Ready(
Err(Error::Network("Gossip engine future finished.".into()))
),
Poll::Pending => {},
}
Poll::Pending
}
}
fn incoming_global<B: BlockT>(
gossip_engine: Arc<Mutex<GossipEngine<B>>>,
topic: B::Hash,
voters: Arc<VoterSet<AuthorityId>>,
gossip_validator: Arc<GossipValidator<B>>,
neighbor_sender: periodic::NeighborPacketSender<B>,
) -> impl Stream<Item = CommunicationIn<B>> {
let process_commit = move |
msg: FullCommitMessage<B>,
mut notification: sc_network_gossip::TopicNotification,
gossip_engine: &Arc<Mutex<GossipEngine<B>>>,
gossip_validator: &Arc<GossipValidator<B>>,
voters: &VoterSet<AuthorityId>,
| {
if voters.len().get() <= TELEMETRY_VOTERS_LIMIT {
let precommits_signed_by: Vec<String> =
msg.message.auth_data.iter().map(move |(_, a)| {
format!("{}", a)
}).collect();
telemetry!(CONSENSUS_INFO; "afg.received_commit";
"contains_precommits_signed_by" => ?precommits_signed_by,
"target_number" => ?msg.message.target_number.clone(),
"target_hash" => ?msg.message.target_hash.clone(),
);
}
if let Err(cost) = check_compact_commit::<B>(
&msg.message,
voters,
msg.round,
msg.set_id,
) {
if let Some(who) = notification.sender {
gossip_engine.lock().report(who, cost);
}
return None;
}
let round = msg.round;
let set_id = msg.set_id;
let commit = msg.message;
let finalized_number = commit.target_number;
let gossip_validator = gossip_validator.clone();
let gossip_engine = gossip_engine.clone();
let neighbor_sender = neighbor_sender.clone();
let cb = move |outcome| match outcome {
voter::CommitProcessingOutcome::Good(_) => {
// if it checks out, gossip it. not accounting for
// any discrepancy between the actual ghost and the claimed
// finalized number.
gossip_validator.note_commit_finalized(
round,
set_id,
finalized_number,
|to, neighbor| neighbor_sender.send(to, neighbor),
);
gossip_engine.lock().gossip_message(topic, notification.message.clone(), false);
}
voter::CommitProcessingOutcome::Bad(_) => {
// report peer and do not gossip.
if let Some(who) = notification.sender.take() {
gossip_engine.lock().report(who, cost::INVALID_COMMIT);
}
}
};
let cb = voter::Callback::Work(Box::new(cb));
Some(voter::CommunicationIn::Commit(round.0, commit, cb))
};
let process_catch_up = move |
msg: FullCatchUpMessage<B>,
mut notification: sc_network_gossip::TopicNotification,
gossip_engine: &Arc<Mutex<GossipEngine<B>>>,
gossip_validator: &Arc<GossipValidator<B>>,
voters: &VoterSet<AuthorityId>,
| {
let gossip_validator = gossip_validator.clone();
let gossip_engine = gossip_engine.clone();
if let Err(cost) = check_catch_up::<B>(
&msg.message,
voters,
msg.set_id,
) {
if let Some(who) = notification.sender {
gossip_engine.lock().report(who, cost);
}
return None;
}
let cb = move |outcome| {
if let voter::CatchUpProcessingOutcome::Bad(_) = outcome {
// report peer
if let Some(who) = notification.sender.take() {
gossip_engine.lock().report(who, cost::INVALID_CATCH_UP);
}
}
gossip_validator.note_catch_up_message_processed();
};
let cb = voter::Callback::Work(Box::new(cb));
Some(voter::CommunicationIn::CatchUp(msg.message, cb))
};
gossip_engine.clone().lock().messages_for(topic)
.filter_map(|notification| {
// this could be optimized by decoding piecewise.
let decoded = GossipMessage::<B>::decode(&mut ¬ification.message[..]);
if let Err(ref e) = decoded {
trace!(target: "afg", "Skipping malformed commit message {:?}: {}", notification, e);
}
future::ready(decoded.map(move |d| (notification, d)).ok())
})
.filter_map(move |(notification, msg)| {
future::ready(match msg {
GossipMessage::Commit(msg) =>
process_commit(msg, notification, &gossip_engine, &gossip_validator, &*voters),
GossipMessage::CatchUp(msg) =>
process_catch_up(msg, notification, &gossip_engine, &gossip_validator, &*voters),
_ => {
debug!(target: "afg", "Skipping unknown message type");
None
}
})
})
}
impl<B: BlockT, N: Network<B>> Clone for NetworkBridge<B, N> {
fn clone(&self) -> Self {
NetworkBridge {
service: self.service.clone(),
gossip_engine: self.gossip_engine.clone(),
validator: Arc::clone(&self.validator),
neighbor_sender: self.neighbor_sender.clone(),
neighbor_packet_worker: self.neighbor_packet_worker.clone(),
gossip_validator_report_stream: self.gossip_validator_report_stream.clone(),
}
}
}
/// Type-safe wrapper around a round number.
#[derive(Debug, Clone, Copy, Eq, PartialEq, PartialOrd, Ord, Encode, Decode)]
pub struct Round(pub RoundNumber);
/// Type-safe wrapper around a set ID.
#[derive(Debug, Clone, Copy, Eq, PartialEq, PartialOrd, Ord, Encode, Decode)]
pub struct SetId(pub SetIdNumber);
/// A sink for outgoing messages to the network. Any messages that are sent will
/// be replaced, as appropriate, according to the given `HasVoted`.
/// NOTE: The votes are stored unsigned, which means that the signatures need to
/// be "stable", i.e. we should end up with the exact same signed message if we
/// use the same raw message and key to sign. This is currently true for
/// `ed25519` and `BLS` signatures (which we might use in the future); care must
/// be taken when switching to different key types.
pub(crate) struct OutgoingMessages<Block: BlockT> {
round: RoundNumber,
set_id: SetIdNumber,
keystore: Option<LocalIdKeystore>,
sender: mpsc::Sender<SignedMessage<Block>>,
network: Arc<Mutex<GossipEngine<Block>>>,
has_voted: HasVoted<Block>,
}
impl<B: BlockT> Unpin for OutgoingMessages<B> {}
impl<Block: BlockT> Sink<Message<Block>> for OutgoingMessages<Block>
{
type Error = Error;
fn poll_ready(mut self: Pin<&mut Self>, cx: &mut Context) -> Poll<Result<(), Self::Error>> {
Sink::poll_ready(Pin::new(&mut self.sender), cx)
.map(|elem| { elem.map_err(|e| {
Error::Network(format!("Failed to poll_ready channel sender: {:?}", e))
})})
}
fn start_send(mut self: Pin<&mut Self>, mut msg: Message<Block>) -> Result<(), Self::Error> {
// if we've voted on this round previously under the same key, send that vote instead
match &mut msg {
finality_grandpa::Message::PrimaryPropose(ref mut vote) =>
if let Some(propose) = self.has_voted.propose() {
*vote = propose.clone();
},
finality_grandpa::Message::Prevote(ref mut vote) =>
if let Some(prevote) = self.has_voted.prevote() {
*vote = prevote.clone();
},
finality_grandpa::Message::Precommit(ref mut vote) =>
if let Some(precommit) = self.has_voted.precommit() {
*vote = precommit.clone();
},
}
// when locals exist, sign messages on import
if let Some(ref keystore) = self.keystore {
let target_hash = *(msg.target().0);
let signed = sp_finality_grandpa::sign_message(
keystore.as_ref(),
msg,
keystore.local_id().clone(),
self.round,
self.set_id,
).ok_or_else(
|| Error::Signing(format!(
"Failed to sign GRANDPA vote for round {} targeting {:?}", self.round, target_hash
))
)?;
let message = GossipMessage::Vote(VoteMessage::<Block> {
message: signed.clone(),
round: Round(self.round),
set_id: SetId(self.set_id),
});
debug!(
target: "afg",
"Announcing block {} to peers which we voted on in round {} in set {}",
target_hash,
self.round,
self.set_id,
);
telemetry!(
CONSENSUS_DEBUG; "afg.announcing_blocks_to_voted_peers";
"block" => ?target_hash, "round" => ?self.round, "set_id" => ?self.set_id,
);
// announce the block we voted on to our peers.
self.network.lock().announce(target_hash, Vec::new());
// propagate the message to peers
let topic = round_topic::<Block>(self.round, self.set_id);
self.network.lock().gossip_message(topic, message.encode(), false);
// forward the message to the inner sender.
return self.sender.start_send(signed).map_err(|e| {
Error::Network(format!("Failed to start_send on channel sender: {:?}", e))
});
};
Ok(())
}
fn poll_flush(self: Pin<&mut Self>, _cx: &mut Context) -> Poll<Result<(), Self::Error>> {
Poll::Ready(Ok(()))
}
fn poll_close(mut self: Pin<&mut Self>, cx: &mut Context) -> Poll<Result<(), Self::Error>> {
Sink::poll_close(Pin::new(&mut self.sender), cx)
.map(|elem| { elem.map_err(|e| {
Error::Network(format!("Failed to poll_close channel sender: {:?}", e))
})})
}
}
// checks a compact commit. returns the cost associated with processing it if
// the commit was bad.
fn check_compact_commit<Block: BlockT>(
msg: &CompactCommit<Block>,
voters: &VoterSet<AuthorityId>,
round: Round,
set_id: SetId,
) -> Result<(), ReputationChange> {
// A commit may carry up to `total + f` weight, where f = total - threshold is
// the tolerated faulty weight: each of f equivocating voters can be counted
// twice (with total = 3f + 1 this gives the 4f + 1 bound).
let f = voters.total_weight() - voters.threshold();
let full_threshold = (f + voters.total_weight()).0;
// check total weight is not out of range.
let mut total_weight = 0;
for (_, ref id) in &msg.auth_data {
if let Some(weight) = voters.get(id).map(|info| info.weight()) {
total_weight += weight.get();
if total_weight > full_threshold {
return Err(cost::MALFORMED_COMMIT);
}
} else {
debug!(target: "afg", "Skipping commit containing unknown voter {}", id);
return Err(cost::MALFORMED_COMMIT);
}
}
if total_weight < voters.threshold().get() {
return Err(cost::MALFORMED_COMMIT);
}
// check signatures on all contained precommits.
let mut buf = Vec::new();
for (i, (precommit, &(ref sig, ref id))) in msg.precommits.iter()
.zip(&msg.auth_data)
.enumerate()
{
use crate::communication::gossip::Misbehavior;
use finality_grandpa::Message as GrandpaMessage;
if !sp_finality_grandpa::check_message_signature_with_buffer(
&GrandpaMessage::Precommit(precommit.clone()),
id,
sig,
round.0,
set_id.0,
&mut buf,
) {
debug!(target: "afg", "Bad commit message signature {}", id);
telemetry!(CONSENSUS_DEBUG; "afg.bad_commit_msg_signature"; "id" => ?id);
let cost = Misbehavior::BadCommitMessage {
signatures_checked: i as i32,
blocks_loaded: 0,
equivocations_caught: 0,
}.cost();
return Err(cost);
}
}
Ok(())
}
// checks a catch up. returns the cost associated with processing it if
// the catch up was bad.
fn check_catch_up<Block: BlockT>(
msg: &CatchUp<Block>,
voters: &VoterSet<AuthorityId>,
set_id: SetId,
) -> Result<(), ReputationChange> {
// A catch up may carry up to `total + f` weight, where f = total - threshold is
// the tolerated faulty weight: each of f equivocating voters can be counted
// twice (with total = 3f + 1 this gives the 4f + 1 bound).
let f = voters.total_weight() - voters.threshold();
let full_threshold = (f + voters.total_weight()).0;
// check total weight is not out of range for a set of votes.
fn check_weight<'a>(
voters: &'a VoterSet<AuthorityId>,
votes: impl Iterator<Item=&'a AuthorityId>,
full_threshold: u64,
) -> Result<(), ReputationChange> {
let mut total_weight = 0;
for id in votes {
if let Some(weight) = voters.get(&id).map(|info| info.weight()) {
total_weight += weight.get();
if total_weight > full_threshold {
return Err(cost::MALFORMED_CATCH_UP);
}
} else {
debug!(target: "afg", "Skipping catch up message containing unknown voter {}", id);
return Err(cost::MALFORMED_CATCH_UP);
}
}
if total_weight < voters.threshold().get() {
return Err(cost::MALFORMED_CATCH_UP);
}
Ok(())
};
check_weight(
voters,
msg.prevotes.iter().map(|vote| &vote.id),
full_threshold,
)?;
check_weight(
voters,
msg.precommits.iter().map(|vote| &vote.id),
full_threshold,
)?;
fn check_signatures<'a, B, I>(
messages: I,
round: RoundNumber,
set_id: SetIdNumber,
mut signatures_checked: usize,
buf: &mut Vec<u8>,
) -> Result<usize, ReputationChange> where
B: BlockT,
I: Iterator<Item=(Message<B>, &'a AuthorityId, &'a AuthoritySignature)>,
{
use crate::communication::gossip::Misbehavior;
for (msg, id, sig) in messages {
signatures_checked += 1;
if !sp_finality_grandpa::check_message_signature_with_buffer(
&msg,
id,
sig,
round,
set_id,
buf,
) {
debug!(target: "afg", "Bad catch up message signature {}", id);
telemetry!(CONSENSUS_DEBUG; "afg.bad_catch_up_msg_signature"; "id" => ?id);
let cost = Misbehavior::BadCatchUpMessage {
signatures_checked: signatures_checked as i32,
}.cost();
return Err(cost);
}
}
Ok(signatures_checked)
}
let mut buf = Vec::new();
// check signatures on all contained prevotes.
let signatures_checked = check_signatures::<Block, _>(
msg.prevotes.iter().map(|vote| {
(finality_grandpa::Message::Prevote(vote.prevote.clone()), &vote.id, &vote.signature)
}),
msg.round_number,
set_id.0,
0,
&mut buf,
)?;
// check signatures on all contained precommits.
let _ = check_signatures::<Block, _>(
msg.precommits.iter().map(|vote| {
(finality_grandpa::Message::Precommit(vote.precommit.clone()), &vote.id, &vote.signature)
}),
msg.round_number,
set_id.0,
signatures_checked,
&mut buf,
)?;
Ok(())
}
/// An output sink for commit messages.
struct CommitsOut<Block: BlockT> {
network: Arc<Mutex<GossipEngine<Block>>>,
set_id: SetId,
is_voter: bool,
gossip_validator: Arc<GossipValidator<Block>>,
neighbor_sender: periodic::NeighborPacketSender<Block>,
}
impl<Block: BlockT> CommitsOut<Block> {
/// Create a new commit output stream.
pub(crate) fn new(
network: Arc<Mutex<GossipEngine<Block>>>,
set_id: SetIdNumber,
is_voter: bool,
gossip_validator: Arc<GossipValidator<Block>>,
neighbor_sender: periodic::NeighborPacketSender<Block>,
) -> Self {
CommitsOut {
network,
set_id: SetId(set_id),
is_voter,
gossip_validator,
neighbor_sender,
}
}
}
impl<Block: BlockT> Sink<(RoundNumber, Commit<Block>)> for CommitsOut<Block> {
type Error = Error;
fn poll_ready(self: Pin<&mut Self>, _: &mut Context) -> Poll<Result<(), Self::Error>> {
Poll::Ready(Ok(()))
}
fn start_send(self: Pin<&mut Self>, input: (RoundNumber, Commit<Block>)) -> Result<(), Self::Error> {
if !self.is_voter {
return Ok(());
}
let (round, commit) = input;
let round = Round(round);
telemetry!(CONSENSUS_DEBUG; "afg.commit_issued";
"target_number" => ?commit.target_number, "target_hash" => ?commit.target_hash,
);
let (precommits, auth_data) = commit.precommits.into_iter()
.map(|signed| (signed.precommit, (signed.signature, signed.id)))
.unzip();
let compact_commit = CompactCommit::<Block> {
target_hash: commit.target_hash,
target_number: commit.target_number,
precommits,
auth_data
};
let message = GossipMessage::Commit(FullCommitMessage::<Block> {
round,
set_id: self.set_id,
message: compact_commit,
});
let topic = global_topic::<Block>(self.set_id.0);
// the gossip validator needs to be made aware of the best commit-height we know of
// before gossiping
self.gossip_validator.note_commit_finalized(
round,
self.set_id,
commit.target_number,
|to, neighbor| self.neighbor_sender.send(to, neighbor),
);
self.network.lock().gossip_message(topic, message.encode(), false);
Ok(())
}
fn poll_close(self: Pin<&mut Self>, _: &mut Context) -> Poll<Result<(), Self::Error>> {
Poll::Ready(Ok(()))
}
fn poll_flush(self: Pin<&mut Self>, _: &mut Context) -> Poll<Result<(), Self::Error>> {
Poll::Ready(Ok(()))
}
}
console.log(Object.prototype.toString.call([].__proto__)); // "[object Array]" (Array.prototype)
console.log(Object.prototype.toString.call([].__proto__.__proto__)); // "[object Object]" (Object.prototype)
console.log(Object.prototype.toString.call([].__proto__.__proto__.__proto__)); // "[object Null]" (end of the chain)
UnPoetia: I am the Calculator
I'm a calculator calculating numerals and fractions,
Although I am a calculator I do no other actions,
Though I am smart and logical,
I'm not at all philosophical,
I do not have the will at all,
To do non-mathematical interactions.
Calculating calculations is my only function,
To use me any other way would be mental interruption,
My only goal is to create,
Equations to substantiate,
Polynomials to evaluate,
Or finding electric potential in a p-n junction.
Though many think of me only as a tool for math equations,
Math lovers see me as a bulky box of infatuation,
It's a sin to feel this way,
We'll stay together come what come may,
We won't add our hearts to nostalgic dismay,
Together in love, no explanation.
Our hearts will intertwine like two parabolas quadratic,
Our affections, they will multiply like pigeons in an attic,
Don't leave me here to die,
If you do I'll pout and cry,
These emotions, I don't know why,
Are causing an unexpectedly incredible panic.
In Control (EliZe album)
In Control is the first full-length studio album by the Dutch pop and dance singer EliZe, released on October 6, 2006, in the Netherlands. In 2007, it was released in Japan as a special edition with bonus tracks.
Track listing
"Shake"
"Itsy Bitsy Spider"
"Let's Dance"
"I'm No Latino"
"Bodytalk"
"Automatic (I'm Talking to You)"
"Rhythm of Love"
"Come Along"
"Into Your System"
"Sexually Healing"
"100%"
Japanese edition
"Automatic (DJ Uto Remix)"
"Into Your System (Shiny☆Mix)"
"100%"
"Let's Dance"
"Shake"
"Automatic"
"Into Your System"
"Itsy Bitsy Spider"
"I'm No Latino"
"Come Along"
"Bodytalk"
"Rhythm Of Love"
"Sexually Healing"
Singles
"Shake" was the first single of In Control. When released on October 18, 2004, the single entered the Dutch Top 40 at #36 and peaked at #32.
"Automatic (I'm Talking To You)" was the second single to be released, on March 7, 2005. It is EliZe's most successful song to date. It spent a total of 13 weeks in the Dutch chart and peaked at #7.
"I'm No Latino" was the third single from In Control, released on August 22, 2005. It reached #14 in the Dutch Top 40.
"Into Your System" was the fourth single, released on June 2, 2006. The single entered the Dutch Top 40 at #33 and peaked at #18.
"Itsy Bitsy Spider" was the final single from this album. It failed to enter the Dutch Top 40 chart and peaked at #5 in the Tipparade.
Chart performance
References
Category:2006 albums
# Subscription
## Properties
Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------
**subscriptionId** | **String** | | [optional]
**applicationId** | **String** | |
**apiIdentifier** | **String** | |
**tier** | **String** | |
**status** | [**StatusEnum**](#StatusEnum) | | [optional]
<a name="StatusEnum"></a>
## Enum: StatusEnum
Name | Value
---- | -----
BLOCKED | "BLOCKED"
PROD_ONLY_BLOCKED | "PROD_ONLY_BLOCKED"
UNBLOCKED | "UNBLOCKED"
ON_HOLD | "ON_HOLD"
REJECTED | "REJECTED"
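For orientation, a subscription object with these properties serialized as JSON might look like the following (every value below is invented for illustration, not taken from a real API response):

```json
{
  "subscriptionId": "64eca60b-2e55-4c38-8603-c9dc92ad496c",
  "applicationId": "8a7d2cb2-23e4-4d22-ae23-6f3cf2a9f6f1",
  "apiIdentifier": "admin-PizzaShackAPI-1.0.0",
  "tier": "Unlimited",
  "status": "UNBLOCKED"
}
```

Note that `subscriptionId` and `status` are marked optional in the table, so either may be absent, e.g. in a creation request before the server has assigned an ID.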
American moms apparently aren't the only ones locked in an epic battle between organic-juice perfectionists and bad-mother ne'er-do-wells. To hear feminist Elisabeth Badinter tell it, the mommy wars have spread to France.
According to the Guardian's Lizzy Davies, French moms have been seen as "the wonder women of Europe" for their ability to juggle jobs with a fertility rate higher than Britain's or Germany's. But Badinter (not pictured — that's Carla Bruni and her stepson Louis Sarkozy) is worried they can only juggle for so long. She says,
The majority of French women [now] reconcile maternity with professional life. Many of them work full-time when they have a child. They are resisting the model of the perfect mother, but for how long? I get the impression that we may now be at a turning point.
Evidence for said turning point includes, according to Davies, "the new image of the 'ideal mother' – one who breastfeeds for six months, does not rush to return to full-time work, avoids painkillers in childbirth, rejects disposable nappies and occasionally lets her baby sleep in her bed" and a possible trend of more French schoolgirls wanting to be stay-at-home moms. Cécile Duflot, mom of four and leader of the French Green Party, says Badinter "is completely wrong … The examples she uses totally miss the point." But it's interesting to see issues of women's work-life balance — which in America so often seem related to our long hours, employment insecurity, and lack of a social safety net — still playing out in a more socialist country.
Of course, France is no utopia — like the US, it has serious problems with race and class. Still, it's sad that in a nation with reasonable workdays, paid maternity leave, and subsidized childcare, motherhood is still fraught with unreasonable expectations and infighting. Next thing they'll be telling us French women get fat.
French Philosopher Says Feminism Under Threat From 'Good Motherhood' [Guardian]
The Real Housewives of Miami star has called off her engagement to fiancé Romain Zago, E! News confirms.
"We got in a huge fight, and I just pretty much got fed up with it," the 33-year-old model and reality star tells In Touch. "He loves me to death, he wants to be with me, but there's something that's holding him from committing 100 percent."
Krupa and Zago were together for over five years but had not yet set a date to walk down the aisle.
The split isn't exactly a huge surprise for Real Housewives of Miami fans. The couple's rocky romance has been a major plotline throughout the show, with Krupa and Zago often getting into heated arguments.
So how does the now-single Krupa feel about returning for another season of Housewives?
"Despite all the drama, of course I would love to," Krupa told us exclusively this past weekend. "I signed up knowing that it's one of the biggest franchises and there's also a lot of drama. But if you get seven girls in a room, somebody's going to start up something."
"I don't have any regrets," she added. "You live and learn. Miami is just another platform in my career, and I appreciate it."
Q:
get the margin size of an element with jquery
How can I get the properties of an element with jquery? I want to get the size of the margin of a div in particular.
I have set the style of the div in a .css file, for instance,
.item-form {
margin:0px 0px 10px 0px;
background: blue;
}
the html,
<form>
...
<div class="item-form">
<textarea class="autohide"></textarea>
</div>
...
</form>
I tried with this code, but it fails obviously,
$(".autohide").each(function(){
var $this = $(this);
alert($this.parents("div:.item-form").css("margin"));
});
any ideas? thanks.
A:
The CSS 'margin' property is actually a shorthand for the four separate margin values, top/left/bottom/right. Use css('marginTop'), etc. - note they will have 'px' on the end if you have specified them that way.
Use parseInt() around the result to turn it into the number value.
NB. As noted by Omaty, the order of the shorthand 'margin' property is: top right bottom left - the above list was not written in that order; it is just a list of the values the shorthand sets.
A:
You'll want to use...
alert(parseInt($this.parents("div.item-form").css("marginTop").replace('px', '')));
alert(parseInt($this.parents("div.item-form").css("marginRight").replace('px', '')));
alert(parseInt($this.parents("div.item-form").css("marginBottom").replace('px', '')));
alert(parseInt($this.parents("div.item-form").css("marginLeft").replace('px', '')));
A:
Example, for:
<div id="myBlock" style="margin: 10px 0px 15px 5px;"></div>
In this js code:
var myMarginBottom = $("#myBlock").css("marginBottom");
The var becomes "15px", a string.
If you want an Integer, to avoid NaN (Not a Number), there are multiple ways.
The fastest is to use the native js method:
var myMarginBottom = parseInt( $("#myBlock").css("marginBottom") );
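As a quick jQuery-free check of that parsing step, the snippet below uses plain strings shaped like the values `css()` returns; `parseInt` reads leading digits and ignores the rest, so stripping `'px'` first is optional:

```javascript
// Strings shaped like what jQuery's css() returns for the four margins.
var values = ["10px", "0px", "15px", "5px"];

// parseInt stops at the first non-digit, so the "px" suffix is
// dropped without an explicit replace('px', '').
var numbers = values.map(function (v) { return parseInt(v, 10); });

console.log(numbers.join(",")); // "10,0,15,5"
console.log(parseInt("15px", 10) === parseInt("15px".replace("px", ""), 10)); // true
```

Passing the radix `10` explicitly avoids surprises with leading zeros in very old engines.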
Staff / Samantha M. Shal / Lexii Cassell was the lone senior for Holy Innocents’, which finished runner-up in the Class A private school girls’ state basketball tournament after losing to Southwest Atlanta Christian 57-52 in the finals in Macon last Saturday.
What was otherwise a magical season for the Holy Innocents’ girls’ basketball team came to a disappointing end with a 57-52 loss to Southwest Atlanta Christian in the Class A private school championship game in Macon last Saturday.
The loss deprived the Lady Bears (29-1) of both a state championship – which would have been their first since 1999 – and an undefeated season after winning their first 29 games of the 2013-14 campaign.
“We did have an outstanding season and not just with the wins,” Holy Innocents’ coach Tony Watkins said. “We had 11 girls playing for each other. We had outstanding team chemistry. We were very disappointed and I feel like we weren’t just happy to be there. We really played to win and it just didn’t work out for us.”
Holy Innocents’ led most of the way through the first three quarters – leading 16-12 at the end of the first quarter, 27-22 at halftime and 38-33 at the end of the third period.
However, back-to-back 3-pointers by Marquita Daniels ignited what would turn out to be a 15-2 run for SACA, which also was able to take advantage of some Lady Bear turnovers at the end.
“We had a five-point lead going into the fourth quarter and we made a couple of turnovers and Marquita Daniels hit a couple of bombs from NBA range,” Watkins said. “We also gave up a couple of offensive rebounds. We outrebounded them 50-35 for the game and that’s one thing I was worried about, because they’re so athletic. But, they got a couple of [of offensive rebounds] in the fourth quarter, which led to baskets and they made their run late – hats off to them.”
Khayla Pointer scored 13 points, while Erika Cassell added 12 points and 18 rebounds and Sydney Long had 11 points to lead HIES in the finals.
The Lady Bears advanced to the championship game with a 46-20 victory over Darlington in a semifinal contest at Kennesaw State last Monday after having beaten Savannah Country Day 83-29 in the first round Feb. 24 and Mount Paran Christian 72-51 Feb. 26.
With only senior Lexii Cassell – a Murray State signee – departing from this year’s Class A private school girls’ runner-up, Watkins said the future looks bright for his team.
“We return 10 of 11 on that roster and that’s exciting,” Watkins said. “We’re looking forward to the future.”
Meanwhile, St. Pius X (30-3) won its second consecutive Class AAA girls championship with a 48-45 victory over Buford in the title game in Macon last Saturday.
The Golden Bears fell behind 15-3 at the beginning of the game, but came back and took the lead for good towards the end of the first half.
Asia Durr scored 23 points, while Jasmine Carter added 12 points and seven rebounds and Miah Allen had nine points on three 3-pointers and grabbed seven rebounds to lead St. Pius.
No Retreat, No Surrender, No Compromise: The New GOP?
Something rare and wonderful in politics has happened. It has very little to do with the GOP takeover of the House of Representatives, although that is certainly a side benefit. What is truly remarkable is the massive amount of new blood that has been transfused into the Republican Party as a result of their victory in this election. It makes for a delicious feeling of uncertainty and unpredictability. Anything is going to be possible over the next two years, including a GOP implosion, a Democratic Party explosion, or a presidential meltdown.
About a third of the GOP caucus that is sworn in on January 3, 2011, will never have served in Congress previously. If they organize and stay together, they could affect everything from the battle to repeal health care reform to who becomes speaker of the House. Almost all of them are as conservative as any group of first-termers who have ever been elected. The question being asked by both tea party folk and the GOP establishment is: how wedded to "principle" are the newcomers?
Similar questions were being asked by Democrats in 1974 when the Watergate class of liberal congressmen upended the Democratic establishment and forever after skewed the party to the far left. There were 72 new congressmen in that class (the Democrats gained 49 seats) and they quickly organized themselves into a powerful caucus that changed the committee and seniority system, thus altering the way the Congress did business. Their example may be followed by this new group of freshly minted conservative House members who come to Washington as a result of the GOP tidal wave.
Not all of them have bubbled up from the tea party movement, but most are in sync with its goals: fiscal responsibility and a return to some semblance of prudent government. But what does that mean? We are in a nightmarish economy with slow growth, continuing job losses, and the specter of inflation in the background due to the irresponsible policies of the Federal Reserve. We are also faced with depressing budget deficits and a truly frightening national debt.
Is there no role for government at all in fixing this mess? If there is, the Republicans are not going to be able to accomplish much on their own. They will need to work with the Democrats and the president in order to get something done about the economy and the budget. Spending and tax cuts will have to be negotiated to have any chance of being signed by the president and put into effect. Otherwise, the GOP will simply be posturing, and nothing at all will be accomplished.
But the Republicans have already indicated that there will be no compromise with the Democrats, and the newcomers are completely in tune with that promise. In fact, if a move is made by establishment Republicans to work with the opposition, there is the probability that the tea party caucus will possess the solidarity to shoot down any attempt to reach a bipartisan deal on the budget and taxes with the Democrats. This will deny President Obama and his party any semblance of "victory," but it will also prevent the GOP from achieving anything they can take back home with them to show their constituents how much they care about their suffering in this miserable economy.
Some may believe this is not important. If so, they are out of touch with the vast majority of the American people. And ignoring voters while doing exactly the opposite of what they desire sounds suspiciously like the very same strategy employed by the Democrats recently. The results for the GOP in 2012 are likely to mirror what happened to the Democrats in 2010.
When it comes to compromising with the opposition or "sticking with principles," it's no contest. A Bloomberg poll had it 80-16 for compromise, while a CBS/New York Times poll showed 69-22 in favor of working with the opposition to get something done. With numbers like that, it would seem that the GOP is willing to commit political suicide with their "no compromise" stand, playing right into the hands of President Obama. The president will no doubt make what are reasonable sounding accommodations with the Republicans, knowing full well he can do so safely since they will be rejected and the GOP will get the lion's share of the blame if nothing gets accomplished. | {
"pile_set_name": "Pile-CC"
} |
Behavioural phenotype of a patient with a de novo 1.2 Mb chromosome 4q25 microdeletion.
A female patient, 20 years of age, is reported with a history characterized by developmental and psychomotor delay and, during the grammar-school period, increasing learning problems, ritualistic behaviours and social withdrawal. Subsequently, challenging and autistic-like behaviours became prominent. The patient showed mild facial dysmorphisms, long thin fingers with bilateral mildly short V metacarpals, and hyperlaxity of the joints. Neuropsychiatric examination disclosed obsessive, ritualistic behaviours and vague ideas of reference. Neuropsychological assessment demonstrated mild intellectual disability, mental inflexibility and incongruent affect. MRI scanning of the brain showed no relevant abnormalities. Genome-wide SNP array analysis revealed a 1.2 Mb de novo interstitial microdeletion in 4q25 comprising 11 genes, which was considered to be causative for the developmental delay, perseverative cognitive phenotype and dysmorphisms. To the authors' knowledge, this is the first report of a de novo 4q25 microdeletion that presents with a specific behavioural phenotype. | {
"pile_set_name": "PubMed Abstracts"
} |
Some electronic components, such as circuit-level devices, integrated-circuit-level devices, board-level devices, and system-level devices, use a multi-phase clock signal. Typically, the multi-phase clock is generated by a ring oscillator, a delay locked loop (DLL), or logic dividers with an external reset or initialization.
However, the circuitry that generates the multi-phase clock generally does not operate at low voltage, has relatively narrow bandwidth support, does not have good noise characteristics, and consumes a large amount of power. Moreover, an external reset or initialization for such circuitry does not provide adequate design robustness. | {
"pile_set_name": "USPTO Backgrounds"
} |
Modelling of the temporal indoor radon variation in Bulgaria.
In this study, temporal variations of indoor radon concentrations in Bulgaria were investigated. The radon concentrations were measured by nuclear track detectors as part of the Bulgarian National Survey, performed in the dwellings of 28 regional districts. The detectors were exposed through a year in two consecutive time periods of different lengths. For 2433 dwellings, measurements could be completed for both time periods, while for 345 dwellings they could only be completed for one of the periods. To estimate any missing radon concentrations, a temporal correction procedure was developed. This procedure included development of a linear correlation between the ln-transformed radon concentrations from the 9-month period [CRn(L)] and from the 3-month period [CRn(S)]. A normal distribution of the data, which is a condition for linear regression, was achieved when the ln-transformed radon concentrations were grouped by climate zone, then by regional districts, and finally by the presence/absence of a basement in the investigated building. The linear models obtained for each group showed reasonable coefficients of determination (R2 ≈ 0.50) and root mean square errors (RMSEs) of about 0.50. When these correlations were used to reconstruct radon concentrations in missing measurement periods, it turned out that the reconstructed data (for 345 dwellings) were within the 95% confidence interval of the measured data (for 2433 dwellings). The geometric means of CRn(L) and CRn(S) were 76 Bq/m3 and 100 Bq/m3, respectively, for 2433 dwellings, which are almost equal to those of 75 Bq/m3 and 98 Bq/m3, which represent the measured and reconstructed data together (for 2778 dwellings). | {
"pile_set_name": "PubMed Abstracts"
} |
Buy my book: Game Art (No Starch Press)!
Digitally Downloaded editor-in-chief, Matt Sainsbury, has written a book about games as works of art, from America to Japan, Australia to Europe. Accompanied by gorgeous, high quality art and interviews with over 20 of the world's best game creators, this book is a collector's piece for fans of all kinds of games!
| {
"pile_set_name": "OpenWebText2"
} |
Q:
How to fire the following command from VC++?
I want to change an image's EXIF data. For that I've used Exiv2.exe. Now I want to fire the command from my program, which is written in VC++ 2008. To modify the GPS data of an image, the exiv2 command is
exiv2 -M"set Exif.GPSInfo.GPSLatitude 4/1 15/1 33/1" D:\test\image.jpg
I've placed exiv2.exe into the system32 folder, and this command works fine from the command prompt. For example,
C:\Users\Me>exiv2 -M"set Exif.GPSInfo.GPSLatitude 4/1 15/1 33/1" D:\test\image.jpg
Now how can I fire this same command from my c++ program?
Thanks in advance...
A:
Finally got it,
const char *change_latitude = "exiv2 -M\"set Exif.GPSInfo.GPSLatitude 14/1 15/1 13/1\" D:\\test\\image.jpg";
system(change_latitude);
In this example the assumption is that exiv2.exe is in the system32 folder.
Thanks...
| {
"pile_set_name": "StackExchange"
} |
World Small Animal Veterinary Association World Congress Proceedings, 2003
Ray Butcher, MA, VetMB, MRCVS
The Wylie Veterinary Centre, Upminster, Essex, UK
About the speaker
As well as being a Partner in a large companion animal practice near to London, Ray Butcher is a veterinary advisor to the World Society for the Protection of Animals (WSPA), and has worked on a number of projects in Eastern Europe, Asia and South America. He has represented WSPA and the World Veterinary Association (WVA) on the World Health Organisation (WHO) "Rabies in Asia working group".
Introduction
I have been very fortunate to have traveled widely and have come to understand that there are often no simple answers to animal welfare problems. We must take account of religious, cultural and economic differences as well as realise that we cannot consider animal welfare in isolation of human welfare issues.
Asia has many cultures and so there are many different solutions to the same challenges. I feel a little embarrassed as a European making this presentation in Asia, but hope that you will accept the observations of an "outsider" as food for thought rather than an attempt to tell you what is right or wrong.
This presentation will use humane stray dog control programmes as an example and suggest challenges for our profession.
Basic concepts
Before discussing the stray dog control programmes in more detail, it is important to outline a number of basic concepts:
1. Human Animal interactions
We interact with animals throughout our daily lives. There are many benefits of owning pets, but when they are not kept in a responsible way or allowed to stray, they may cause significant problems to society as a whole. It is important that all involved realise there must be a balance achieved, and that any control schemes must employ only humane methods.
2. The Five Freedoms
The so-called "Five Freedoms" were introduced by the Farm Animal Welfare Council of the UK to assess the welfare of farm animals kept in intensive husbandry systems. They represent an ideal situation, but still remain a useful guide. Having accepted their value in the agricultural context, it is important that we use the same criteria to assess our programmes for companion animals. Sadly, the examples will show that significant welfare problems can result from the actions of well-meaning animal protection groups.
3. Responsible Pet ownership
This is an easy concept when considering a household pet. The owner is responsible for providing adequate care and ensuring that the behaviour of their pet does not adversely affect society in general. In cultures with so-called "community dogs", it is important to educate society to have a degree of corporate responsibility for these dogs.
4. Stakeholders
A stakeholder is anyone (man, animal or association) that is directly or indirectly affected by any proposed programme. The potential stakeholders in stray dog control programmes will be discussed, illustrating the wide range of interests and views that need to be taken note of. Failure to co-ordinate these groups at the start will adversely affect the chances of success.
Stray dog control programmes
The WHO estimate that the worldwide dog population is about 10% of the human population, and of these, 75% can be regarded as strays. These dogs can cause direct injury or disease to humans, livestock and pets and many other indirect costs. In countries with endemic rabies, these costs can become very high. Historically, municipalities have tended to respond using mass slaughter campaigns. These are often inhumane, indiscriminate and ineffective. For this reason in 1991, the WHO and WSPA introduced guidelines on population control programmes. The overall framework required:
1. Legislation
2. Registration and identification
3. Garbage Control
4. Neutering of owned pets
5. Neutering of un-owned dogs
6. Regulation of breeders and sales outlets
7. Education
The specific situation in different countries means that the priorities will vary, but it is important that all factors are considered. It is also essential that all groups involved (municipalities, human health ministries, veterinarians, animal protection societies) work to an agreed coordinated plan.
Before starting a programme it is also important to perform an accurate survey of the population. Potentially a stray dog population can be made up of:
Owned dogs that have accidentally been lost
Owned dogs that are allowed to wander
Owned dogs that have been abandoned
Community dogs
Feral dogs
The priorities of the specific programme will reflect the relative numbers in each group. It is also important that the survey method is to an agreed standard such that data can be compared and the success of the project monitored accurately.
While a "no kill" policy may be the moral ideal, it is probably unrealistic for groups with limited resources in areas with a poor economy. In such cases, failure to face up to difficult decisions may cause more welfare problems.
Neuter and release programmes have been successfully used in some situations where there is a large community or feral population. The requirements for such programmes will be discussed, as well as the potential welfare problems resulting from playing the "numbers game".
A number of projects within Asia will be described that illustrate these points.
WHO and Rabies Control
Although many animal protection societies are involved in stray dog control from a perspective of animal welfare, it is my opinion that the success of any project will be judged by its success at controlling rabies. Although the guidelines described above were produced jointly by WHO and WSPA, it is clear that the lessons are ignored by many human health agencies that still promote mass slaughter. There is also relatively poor communication between those devising strategies and those working in the field. Veterinarians have an important role to play in this connection.
Education
The biggest challenge is perhaps education, and the most responsive group in any society is children. The majority of successful projects have an education component and veterinarians have an important role to play.
To stimulate this progress, it is important that we "educate the educators". It is the policy of WVA to encourage the introduction of Animal Welfare as a core curriculum subject in the veterinary faculties of developing countries, and a WSPA project called "Concepts in Animal Welfare" has been developed to facilitate this. The philosophy of this WSPA project is endorsed by WSAVA. | {
"pile_set_name": "Pile-CC"
} |
Supervising family therapy trainees in primary care medical settings: context matters.
The purpose of this article is to identify and describe four essential skills for effective supervision of family therapy trainees in primary care medical settings. The supervision skills described include: (1) Understand medical culture; (2) Locate the trainee in the treatment system; (3) Investigate the biological/health issues; and (4) Be attentive to the self-of-the-therapist. Recommendations are also made to help supervisors become better prepared for the questions medical family therapy trainees bring to supervision. | {
"pile_set_name": "PubMed Abstracts"
} |
We are all incompetent to some capacity. There are two ways to respond to our inadequacies, violence or love. I’ve learned to accept that there are just some poses in yoga that my body, anatomically, just isn’t prepared for yet. It’s really frustrating. My background in competitive sports often kicks in and I want to surpass what others are doing and even what I thought I could do myself. When recently working on flying pigeon (https://www.youtube.com/watch?v=zEsJMOW6RKc), I injured my right knee. Everything went fantastic on the left side, but I don’t have as much openness or flexibility in my right hip. The lack of flexibility here also puts strain on my right knee. So in a hurried state of thinking both sides should be equal, I heard a pop in my right knee. It seems to be healing pretty well so far, but all of that could have been avoided if I just decided to not be violent towards myself. There was no reason to throw myself into a pose that I know my body was not prepared for.
I tell the above recent reflection because I do not want you to be discouraged by your practice. Yoga is not a “on to the next level of poses and see what I can do one or two times.” It is a constant communication between your mind, body and the world within and around us. A practice filled with familiar poses and a positive acceptance of the state of your body can do wonders for you mentally and physically. You aren’t powerless … you are quite powerful when you display patience, acceptance and love. Don’t let the anger you hold towards yourself keep you from the beauty of the moment.
Dealing with Powerlessness
Mantra – I love who I am right now.
Whether we respond with anger, withdrawal, frustration, or resignation, there is a way in which our mind shuts down, as if we are riding a train through a dark tunnel and we can’t see anything but darkness and anxiety. Ahimsa, nonviolence, invites us to question the feeling of powerlessness rather than accept it.
There are three ways of thinking that can shift you out of a feeling of powerlessness: practicing gratitude, trust in the moment, and thinking about others. Any sense of powerlessness we are feeling can be traced back to the story we are telling ourselves in the moment about the situation. We all have the choice to tell a different story and grow ourselves up to take responsibility for our lives in a new and fresh way. With this attitude, feelings of powerlessness become opportunities to become competent rather than violent. | {
"pile_set_name": "Pile-CC"
} |
CfnmTV – Among Friends 2
Chris is in possession of some quite racy underwear! Normally he’d keep it as a saucy surprise for his wife Fiona. But tonight, at her insistence, he is to model it for all her friends to see. And she won’t even allow him to leave the room to get changed… | {
"pile_set_name": "Pile-CC"
} |
[Absence of echoviral RNA sequences in medulla oblongata samples taken from patients who died of ALS].
The role of viral infection in the pathogenesis of ALS has been raised many times in previous papers. The presence of an enterovirus genome has previously been confirmed using RT-PCR and in situ PCR in spinal cord tissue samples taken from patients who died of ALS. Viral genome sequencing has also been used to show 91% and 88% homology with ECHO 6 and 7 viruses, respectively. A year later the same method did not confirm the presence of the virus in spinal cord fragments taken from 30 patients who died of ALS nor in an 18-person control group. The present study aimed to find persistent ECHO 6 and 7 viral infections in tissue samples taken from patients who died of ALS. RNA was isolated from frozen medulla oblongata samples taken from six patients who died of ALS (hospitalized in the Neurological Clinic CSK AM in the years 2000-2002). The presence of RNA was confirmed using beta-actin RNA. Oligo 2 and 3 as well as pEforward and pErevers primers were used for amplification. All samples returned negative results. In the samples studied no correlation was found between ECHO 6 and 7 viral infections and ALS. | {
"pile_set_name": "PubMed Abstracts"
} |
Perimeter Institute for Theoretical Physics is offering fully funded masters scholarship for international students. These scholarships are available for pursuing one-year master’s level course at the Perimeter Institute for Theoretical Physics. All students who are admitted receive full scholarships (Accommodation, Meals, Living stipend, Full tuition, Health insurance, Books and materials, Laptop and Travel supplement).
Each year up to 30 adventurous and exceptional students from countries around the world are admitted to the PSI program. To be eligible for the scholarship minimum TOEFL score required is 90 overall, with a minimum of 25 in writing and speaking. The application deadline is February 1, 2017.
Study Subject(s): Scholarships are awarded in the field of theoretical physics.
Course Level: Scholarships are available for pursuing one-year master’s level course at the Perimeter Institute for Theoretical Physics
Scholarship Award: All students who are admitted receive full scholarships (Accommodation, Meals, Living stipend, Full tuition, Health insurance, Books and materials, Laptop and Travel supplement).
Scholarship can be taken in Canada
Eligibility: Perimeter Scholars International welcomes applications from exceptional, enthusiastic, interactive students who have obtained an undergraduate degree in physics and/or math with a minimum of three undergraduate or graduate courses in physics beyond Introductory Physics.
-To be eligible for the scholarship minimum TOEFL score required is 90 overall, with a minimum of 25 in writing and speaking.
Scholarship Open for International Students: International students can apply for this scholarship.
Scholarship Description: Perimeter Scholars International is a one-year Masters level course in Theoretical Physics designed to bring highly qualified, highly motivated graduate students to the cutting edge of theoretical physics in an intense, interactive learning environment. Taught by some of the world’s top physicists, students at PSI are fully integrated into the unique international culture and dynamic atmosphere of a leading research institute, the Perimeter Institute for Theoretical Physics, while earning a master’s degree from the University of Waterloo. | {
"pile_set_name": "Pile-CC"
} |
‘Man Camp’ Development Moratorium Likely for N.Y. Town
As state officials consider allowing a possible shale gas boom in New York, town officials in Campbell, a community southwest of the Finger Lakes Region, are beginning to discuss a related issue – “man camps” that provide a home away from home for out-of-state drilling workers.
On Monday (March 12), the Campbell Town Board is expected to enact a two-year moratorium on establishing or expanding dormitories, campgrounds, RV parks, trailer parks or similar facilities, the Corning Leader reported.
“The goal is to allow time to develop a comprehensive plan without something like a ‘man camp’ moving in,” Tennent said.
“We haven’t had any proposal of a man camp seeking to come here. This is just a precaution and time to develop a comprehensive plan.”
The comprehensive plan will include zoning laws on residential density, water use, sanitation and other issues, Tennent said.
Tennent said the Campbell Town Board will hold a public hearing Monday and then vote to adopt the moratorium.
“I expect it will pass,” Tennent said.
The town currently has several campgrounds, RV parks and trailer parks.
Tennent said the board will begin working on the comprehensive plan soon, with the assistance of the Southern Tier Central Regional Planning and Development Board.
Other local municipalities may soon tackle the “man camp” issue as well.
“We haven’t really discussed it yet,” said Corning Town Supervisor Kim Feehan. “But gas drilling is an issue that we’ll look into all elements of.”
Erwin Town Supervisor Rita McCarthy said her town already has a comprehensive plan in place that includes strict zoning laws.
“Man camps” have opened in Pennsylvania’s Northern Tier, including one in Athens Township that opened in 2010. It’s a dormitory-style facility with a dining hall and other accommodations.
“We haven’t had any problems with them,” said Maurice Fay, chairman of the Athens Town Board of Supervisors. “They run a tight ship. I asked our police (Thursday) if they have had a problem with them, and they said not at all. It’s a well run place.”
Such facilities can be convenient for gas industry workers and may ease the area’s housing shortage that has driven up rent for locals. But concerns have also been raised about the facilities bringing in hard-drinking “roughnecks” who may cause problems.
Local businessman Dan Hurley’s plan to open an RV park for gas industry workers on his property behind Bradley Farms in Southport drew criticism from neighbors before it was put on hold last fall. | {
"pile_set_name": "Pile-CC"
} |
The only method by which people
can be supported is out of the effort of those who are earning their own way.
We must not create a deterrent
to hard work. - Robert A. Taft
What Makes a Republican - a REPUBLICAN?
A Long History Opposing the Same Enemy
We have spent the better part of the last half century forgetting the reasons that Republicans are part of
an America First tradition and the real meaning of the GOP. Just what are the principles and policies that separate the platform
of Republicans from that of the Socialists that wear the Democratic label? Sorry to say, not much of a difference presently
exists, let alone a dedication to enact legislation that counters the legacy of FDR. It wasn't like this - once upon a time
. . . For Republicans knew what they were all about and had an example of a true champion of principle in one, Senator
Robert A. Taft.
Taft is most famous for his opposition to Franklin Roosevelt's New Deal legislation and policies. He has been
called the last "Old Right" politician. While some may conclude that this description points out that we have 'moved on', the essential
question remains: were the policies of Taft the real essence of Republicanism? Principles never die; changing circumstances
only seek out appropriate applications. Liberty of the individual was the hallmark of Taft that earned him the name Mr. Republican.
The New Deal's expansion of federal power at the expense of state and local government is incompatible with the core
bedrock of Republican philosophy. Taft vigorously urged economy in government and restoration of balanced budgets, while supporting
a very limited role in foreign affairs. He voted against NATO, supported strong tariffs, opposed the draft and sponsored legislation
that bears his name, the Taft-Hartley Law.
If Republicanism isn't about opposing the Federal Income Tax and the Federal Reserve System, just what did
the party ever stand for to begin with?
When it comes to foreign policy, the last century is one of "Perpetual War for Perpetual Peace". Taft speaks
directly to this point:
"Fundamentally, I believe the ultimate purpose of our foreign policy must be to protect the liberty of the people of the United States. The American Revolution was fought to establish
a nation "conceived in liberty." That liberty has been defended in many wars since that day. That liberty has enabled our
people to increase steadily their material welfare and their spiritual freedom. To achieve that liberty we have gone to war,
and to protect it we would go to war again . . .
Only second to liberty is the maintenance of peace. . . . Our traditional policy of neutrality and non-interference
with other nations was based on the principle that this policy was the best way to avoid disputes with other nations and to
maintain the liberty of this country without war. From the days of George Washington that has been the policy of the United
States. It has never been isolationism; but it has always avoided alliances and interference in foreign quarrels as a preventive
against possible war, and it has always opposed any commitment by the United States, in advance, to take any military action
outside of our territory. It would leave us free to interfere or not according to whether we consider the case of sufficiently
vital interest to the liberty of this country. It was the policy of the free hand."
In his book Principles Without Program: Senator Robert A. Taft and American Foreign Policy, he conveys his views as core Republican principles that
are as valid today as they were when originally written. So why does the Republican Party work overtime to run in lock step
with the Socialism of the New Frontier, Great Society and New World Order? The answer is obvious: Republicanism has been
removed from the party and replaced with a neo-conservative sham that is a betrayal of America's tradition.
How many remember the names of these brave leaders that fought so hard to retain the promise of the American
way of life? Just what was their cause and why do most Republicans ignore their heritage? Taft sums up nicely the purpose
of their task:
"There are a good many Americans who talk about an American century in which America will dominate the world.... If we confine our activities to the field of moral leadership we
shall be successful if our philosophy is sound and appeals to the people of the world. The trouble with those who advocate
this policy is that they really do not confine themselves to moral leadership. They are inspired by the same kind of New Deal
planned-control ideas abroad as recent Administrations have desired to enforce at home. In their hearts they want to force
on these foreign people through the use of American money and even, perhaps, arms, the policies which moral leadership is
able to advance only through the sound strength of its principles."
Robert Taft believed in the "Federalism" model of the American Republic. His faith was in basic American values
and the abilities of the people to seek Liberty. Achieving this goal requires that such liberty is founded upon an economic
system based on free enterprise, a political system based on citizen participation, and national independence and sovereignty
for our country.
Internationalist Republicans have become mutants, with the abdication of purpose for their party. Just what
is the point of having two shades of the same color when that hue is one and the same in Socialism? If you say the debate
is over and the future belongs to the most popular collectivist, then America is already deceased.
Even under the great Ronald Reagan, the Departments of Education and Energy continued. Just look at the record!
When was the last time a 'so-called' conservative remained ardent in the fight against social democracy? Taft's principles
are timeless because they represent the best chance for the freedom of a free people. Or does that idea scare so many, that
Liberty is no longer our mutual objective? With the dawn of this new century, it is time to remember the common sense of past
generations and devote ourselves to the reinvention of practical policies that apply those principles to our current condition.
Anything short of this reformation will confirm that the GOP has lost its way. Rediscover what a Republican really means
. . .
SARTRE - March 10, 2002
I always find that statistics are
hard to swallow and impossible to digest. The only one I can ever remember is that if all the people who go to sleep in church
were laid end to end they would be a lot more comfortable. | {
"pile_set_name": "Pile-CC"
} |
The principal aim of this proposal is to further the development of new methods for analyzing observational data bases and randomized trials of HIV-infected persons and the application of these methods to data obtained in randomized and observational studies in an attempt to help answer important open substantive questions concerning the treatment and course of HIV-related disease. The proposed approaches are based on (i) the estimation of new classes of causal models, which include structural nested models, marginal structural models (MSMs), direct effect structural nested models, continuous time structural nested models, and optimal regime structural models (SNMs). Many of the new methods are fundamentally epidemiologic in that they require data on time-dependent confounding factors, that is, risk factors for outcomes that also predict subsequent treatment with the drug or cofactor under study. In particular, we plan to further develop optimal regime SNMs and dynamic MSMs to help determine the optimal times to start HAART therapy and to change HAART regimens as a function of a subject's CD4 count, HIV RNA, clinical history, and, where available, results of genotypic or phenotypic resistance testing. Our methods will be developed with the goal of directing analyses and reanalyses, with collaborators, of data from the HIV Causal Collaboration at HSPH, the Multicenter AIDS Cohort Study, The Women's Interagency HIV Study, The Swiss HIV Cohort Study, The Study of The Consequences of Protease Inhibitor Era (SCOPE), Pediatric Late Outcomes Protocol (PACTG 219) and the ALLRT study. RELEVANCE (See instructions): Observational methods are used to answer pressing causal questions that cannot be or have not yet been studied in randomized trials.
In particular we are developing methods that are the best available to determine the optimal CD4 and HIV RNA levels at which to initiate HAART therapy in HIV-infected subjects and the optimal time to change therapy once resistance to an initial HAART regimen has developed. | {
"pile_set_name": "NIH ExPorter"
} |
Men’s squash takes on West Point, a matchup they won, 7-2. (Ajon Brodie – The Triangle)
The Drexel University men’s and women’s squash teams suffered losses Jan. 30 when they traveled to face the Middlebury College Panthers, ranked No. 19 (men) and No. 16 (women). The men fell in a close contest 5-4, while the women lost 6-3. The men’s overall record drops to 4-8 and the women’s overall record drops to 6-8.
The men got wins from Atticus Kelly, Luke Willemse, Michael Thompson and Cillian Dunne in the one, two, four and five spots, respectively. Kelly swept his Panther opponent Andrew Jung in three games. Willemse defeated his opponent in four games after losing the first game of the match. Thompson won in three games, sweeping Harrison Croll. Dunne won in four games after dropping the first game of his set.
Mark Kauf and Sebastian Dangond both forced a five game set but could not come out with a win for the Dragons.
On the women’s side, the Dragons got wins from Hayley Hughes, Ryan Morgan and Kaitlyn Money in the one, two, and four spots, respectively. Hughes beat Panther Saskia Pownall-Gray in a five-game battle. Morgan swept Anne Wymard in three games. Money staged a comeback in five games to secure a third win for the Dragons after being down 2-1 after three games.
Laura Rahauser lost a tough match in five games after putting in a strong effort. She had a 2-1 lead on her opponent Liddy Renner, but Renner was able to battle back and win for Middlebury. The other Dragon losses came in three game sweeps in spots three and six through nine.
The Drexel men and women traveled to Williamstown, Massachusetts, Jan. 31 to take on the Williams College Ephs, ranked No. 15 and No. 12, respectively.
The men earned a 6-3 win, snapping a four-match losing streak and improving their record to 5-8 on the season. The women fell to the Ephs 6-3, moving their record to 6-9.
The Dragon men secured wins in spots one through four, seven and nine. Kelly began the competition for the men, winning in four games. Willemse, Ibrahim Bakir and Thompson followed, sweeping their opponents.
Nat Fry also won in four games, while Joey Gingold swept his opponent in the last spot to finish off a victory for the Dragons.
The women got wins in spots one, four and five. Hughes started off with a victory for Drexel, sweeping her opponent Nicole Friedman in three games. Money and Rahauser got the other two wins for the Dragons, fighting it out in five games.
Morgan and Mary Fung-A-Fat both dropped their matches in four games, while the Dragons suffered three-game sweeps in spots six through nine.
The Dragons returned home to the Kline & Specter Squash Center Feb. 1 where the men’s team faced off against the Midshipmen of the U.S. Naval Academy. Drexel defeated Navy 7-2, the first win over Navy in program history.
Kelly started the Dragons off with a win over Midshipman Andrew McGuinness in five games in the first spot. Willemse followed with a win in three games. The other Dragon victories came in spots four through seven and spot nine. Dunne swept opponent Randy Beck. Thompson, Fry and Gingold all won their matches in four games, while Kauf battled it out in five games for the win.
Bakir and Dangond dropped their matches in four games, the only two losses for the Dragons during the competition.
The men and women return to action when they welcome the Dickinson College Red Devils Friday, Feb. 6. | {
"pile_set_name": "Pile-CC"
} |
Development and validation of a bioanalytical method using automated solid-phase extraction and LC-UV for the simultaneous determination of lumefantrine and its desbutyl metabolite in plasma.
A bioanalytical method for the determination of lumefantrine (LF) and its metabolite desbutyl-lumefantrine (DLF) in plasma by solid-phase extraction (SPE) and liquid chromatography has been developed. Plasma proteins were precipitated with acetonitrile:acetic acid (99:1, v/v) containing a DLF analogue internal standard before being loaded onto an octylsilica (3 M Empore) SPE column. Two different DLF analogues were evaluated as internal standards. The compounds were analysed by liquid chromatography with UV detection on an SB-CN (250 mm x 4.6 mm) column with a mobile phase of acetonitrile-sodium phosphate buffer (pH 2.0; 0.1 M) (55:45, v/v) and sodium perchlorate 0.05 M. Different SPE columns were evaluated during method development to optimise reproducibility and recovery for LF, DLF and the two different DLF analogues. The within-day precisions for LF were 6.6 and 2.1% at 0.042 and 8.02 microg/mL, respectively, and for DLF 4.5 and 1.5% at 0.039 and 0.777 microg/mL, respectively. The between-day precisions for LF were 12.0 and 2.9% at 0.042 and 8.02 microg/mL, respectively, while for DLF 0.7 and 1.2% at 0.039 and 0.777 microg/mL, respectively. The limit of quantification was 0.024 and 0.021 microg/mL for LF and DLF, respectively. Different amounts of lipids in plasma did not affect the absolute recovery of LF or DLF. | {
"pile_set_name": "PubMed Abstracts"
} |
359 F.2d 886
Application of Albert BOWERS and James C. Orr.
Patent Appeal No. 7584.
United States Court of Customs and Patent Appeals.
May 12, 1966.
Evelyn K. Merker, Leon Simon, Washington, D. C., for appellants.
Clarence W. Moore, Washington, D. C. (Jack E. Armore, Washington, D. C., of counsel), for Commissioner of Patents.
Before RICH, Acting Chief, MARTIN, SMITH, and ALMOND, Judges, and Judge WILLIAM H. KIRKPATRICK.*
SMITH, Judge.
1
Syntex Corporation is the common assignee of the appealed application1 and the 2 patents2 relied upon for the rejection. Different joint inventors are named in the application and in the patents. Albert Bowers, one of the nominal appellants here, is one of the joint inventors in the appealed application and in the 2 above named patents. The appealed application is senior in filing date to the applications upon which the patents were issued.
2
The Board of Appeals in its decision of March 18, 1964 affirmed3 the rejection of appealed claims 1 to 12 of appellants' application as "unpatentable over" claim 1 of the Bowers and Edwards patent and affirmed the rejection of appealed claim 13 as being "unpatentable over" claim 16 of the Bowers and Berkoz patent.
3
Closely related subject matter is disclosed in the patents and the application on appeal. The steroid compounds here claimed differ from the compounds claimed in the indicated patents by the presence in the steroid structure of a 2-methyl group instead of a hydrogen atom.
4
It was the examiner's position that the 2-methyl compounds of the appealed claims are so closely related to the hydrogen containing, or 2-desmethyl, compounds of the indicated patent claims as to be, in the words of the Board of Appeals, "obvious therefrom to those skilled in the art."
5
The statutory basis for the rejection is not clear from the record. While using certain of the language of 35 U.S.C. § 103, in affirming the examiner's rejection,4 the board does not explain how the patents, issuing on applications filed later than the filing date of the appealed application, can be considered as prior art against the invention here claimed. Earlier filed applications of "another" describing the invention claimed in a later filed application are prior art under 35 U.S.C. § 102(e) and as such are available for consideration in a 35 U.S.C. § 103 "obviousness" rejection. Hazeltine Research, Inc. v. Brenner, 382 U.S. 252, 86 S.Ct. 335, 15 L.Ed.2d 304. However, the rule does not warrant a rejection under 35 U.S.C. §§ 102 or 103 on patents that issued on later filed applications. Such references are clearly excluded by the precise language of sections 102(e) and 103.
6
The opinion of the board seeks to justify the rejection in its statement:
7
* * * The Examiner rules that, in view of this close relationship and the obviousness of the claimed compounds from the patented claims, appellants are not entitled to receive a patent on the basis of the appealed claims, since appellants' assignee had received patent protection on essentially the same inventions in the Bowers and Edwards and the Bowers and Berkoz patents. * * *
* * * * * *
8
It is unfortunate that the issue of "double patenting" was not raised at the earliest possible date * * *.
9
There is no objection of record concerning this being a new ground of rejection.
10
Subsequent to the decision of the board, a petition for rehearing was filed in which the separate nature of the involved inventions was pointed out and discussed. Later, and subsequent to our decision of May 14, 1964 in In re Robeson, 331 F.2d 610, 51 CCPA 1271, appellants filed a letter of June 19, 1964, in which the Board of Appeals was requested to consider the disclaimer, filed concurrently therewith, in which they disclaimed:
11
* * * the terminal portion of the term of the above identified application Serial No. 138,265 as would extend beyond October 2, 1979, the expiration date of U. S. Patent No. 3,056,814, with respect to Claims 1-12, and as would extend beyond March 19, 1980, the expiration date of U. S. Patent No. 3,082,220, with respect to Claim 13.
12
Appellants' letter of June 19, 1964 refers to our Robeson decision as being "directly in point" and argued:
13
* * * that the attached disclaimer obviates the basis of the double patenting rejection of claims 1-13 on appeal, over the common assignee's Patent No. 3,056,814 and No. 3,082,220. The disclaimer precludes any extension of the monopoly since it provides for the expiration of the above identified application, if patented, simultaneously with Patent Nos. 3,056,814 and 3,082,220.
14
In its decision on the petition for reconsideration, the board considered the contents of the letter of June 19, 1964, and criticized one of the signatures appearing on the disclaimer. It then stated:
15
Assuming that this paper were a disclaimer operative to disclaim the indicated portions of a patent granted on the instant applications, we could give it no weight in the present appeal because it is not apparent that In re Robeson, supra, or the subsequent decision, In re Kaye, 51 CCPA [1465, 332 F.2d 816] 141 USPQ 829, apply to the situation where a terminal disclaimer is offered with respect to the commonly owned patent of a different inventive entity. * * *
16
Subsequently appellants filed a new disclaimer to overcome the board's criticism as to form, which was accepted and has been duly recorded in the United States Patent Office. By order of the court upon granting appellants' motion to correct diminution of record, which was not objected to by the Solicitor for the U. S. Patent Office, this new disclaimer was added to the record. The Solicitor here does not challenge the sufficiency of the new disclaimer nor does he argue that the effect of the terminal disclaimer on the rejection is not before us. We will therefore turn to a consideration of the subject matter defined in the appealed claims and the patent claims.
17
We find that we are here dealing with different inventions. As pointed out in appellants' brief:
18
It is apparent that a single invention is not involved. The inventions of the involved application and those of the reference patents are not identical; they are different and distinct inasmuch as the inventions differ in the presence of a CH2 grouping at a specific position in the complex steroid molecule. It is clear that the claims of the Bowers and Orr application, which was the first filed application, define an invention separate and different from those defined in the reference patents owned by the same assignee.
19
Appellants also point out in their brief that "Each invention would be patentable absent the other," which is not disputed by the Patent Office. The brief then continues:
20
It is clear that separate, distinct and nonidentical inventions are described in the application at bar and in the patented inventions. It is therefore contended that a terminal disclaimer is appropriate in the case at bar to overcome a double patenting rejection under the holding of In re Robeson and In re Kaye.
21
As we stated in Kaye, supra, 332 F.2d at 819, 51 CCPA at 1468, in reference to Robeson, supra:
22
In that case we held that where, as here, the claims define separate, albeit patentably indistinct, inventions, the filing of a terminal disclaimer may obviate a double patenting rejection.
23
Thus, it seems to us that the board's position must stand or fall on the issue of whether our decisions in Robeson, supra, and Kaye, supra, as stated by the board, "apply to the situation where a terminal disclaimer is offered with respect to the commonly owned patent of a different inventive entity."
24
It is true that in both Robeson, supra, and Kaye, supra, the double patenting rejections which we found to be obviated by the terminal disclaimer were predicated in each case on the same inventorship. However, we find this to be a distinction without legal significance in the present context.
25
Statutory authority for the terminal disclaimer here in issue is found in 35 U.S.C. § 253, the second paragraph of which provides:
26
In like manner any patentee or applicant may disclaim or dedicate to the public the entire term, or any terminal part of the term, of the patent granted or to be granted.
27
It is to be noted that the parties authorized by the statute to file the terminal disclaimer are "any patentee or applicant." It seems clear that Congress intended that the remedies of this section were also to be available to assignees in view of the express provision of 35 U.S.C. § 100(d) that:
28
(d) The word "patentee" includes not only the patentee to whom the patent was issued but also the successors in title to the patentee.
29
The statutory provisions thus support appellants' position and are contrary to the solicitor's arguments.
30
The solicitor argues that the common assignee here, in effect, seeks to circumvent sections 102(e) and 103 by filing a terminal disclaimer. This argument lacks substance as here there can be no resort to those sections. Where the inventorship is different and there is a common assignee, as here, the first filed application which issues as a patent is "by another" and if the invention claimed in the second application is "described" in the first application, it is available under section 102(e) as prior art which is relevant for consideration under section 103. If the second filed application issues first, there can be no resort to section 102(e) to establish it as prior art under section 103 as against the first filed application. This results from the controlling effect which is given under the statute to the United States filing dates. Sections 102 (e) and 103 provide grounds of rejection which are distinct and separate from "double patenting," requiring different inquiries and representing mutually exclusive grounds for rejecting claims. Where there are separate inventions and the ground of rejection is double patenting, based on the alleged unlawful timewise extension of monopoly, the rejection may be overcome by a terminal disclaimer.
31
We are of the opinion, therefore, that the common assignee of the appealed application and the involved patents is entitled to proceed under 35 U.S.C. § 253.
32
The solicitor relies on two decisions in other courts, Sterling Varnish Co. v. Louis Allis Co., 145 F.Supp. 810 (E.D.Wis. 1956), aff'd on rehearing, 149 F.Supp. 826 (1957); and Hays v. Reynolds, 242 F.Supp. 206 (D.C.1965), aff'd, Hays v. Brenner, 357 F.2d 287 (D.C. Cir. 1966). The solicitor has extracted from these two cases the rule that a common assignee may not file a terminal disclaimer to obviate a double patenting rejection. Upon analysis of the cases we find they do not support such a rule.
33
In the Sterling case the court sustained a defense of double patenting in a suit for patent infringement (145 F.Supp. 810). The two patents issued to the same inventor (a fact apparently overlooked by the solicitor) and were assigned to the plaintiff. Both patents related to coating an electrical winding with varnish, the later issued patent being a continuation-in-part of the first. The first patent required that the winding be coated by "rotating" in the varnish while the second patent required that the winding be coated by "contacting" or "immersing" the winding in the varnish. Also the first patent called for "heating" the winding to the temperature of the varnish while the second patent required that the winding be "highly heated" or "at a temperature not below 275° F." The court found the invention to be the same in both patents, the alleged differences being no more than the "mere use of obviously alternate, immaterial and equivalent terms." 145 F.Supp. at 815.
34
Plaintiff filed a terminal disclaimer as to the later issued patent after the court's decision and on plaintiff's motion to modify the decision the court found the two patents claimed the "same invention" and held that the terminal disclaimer was "legally insufficient," (149 F.Supp. 829) relying on our decision in In re Siu, 222 F.2d 267, 42 CCPA 864. In Siu we held that when in fact a second application claimed the same subject matter as a previously issued patent, a double patenting rejection was proper and a terminal disclaimer would not overcome such a rejection. The Sterling and Siu decisions are not applicable to the fact situation here in which the appealed application and the issued patents in fact claim different inventions.5 Thus in the instant case, unlike the situation in Sterling and Siu, there are clearly three inventions involved and the respective claims define different subject matter.
35
In Hays a rather complex record clouds the issue ultimately decided by the courts. Hays' claims were rejected in view of a patent to Keating. The Hays application and the Keating patent were commonly assigned. The application on which the Keating patent issued, although having a later filing date than the Hays application, was, however, a continuation-in-part of three applications filed prior to Hays. The examiner's answer before the board stated two grounds of rejection of the appealed Hays' claims: first, as "unpatentable over the claims of the Keating patent;" and second, as "lacking invention over the Keating parent case No. 598,215, as noted in the Keating patent." The examiner argued that the filing date of the Keating parent case had not been overcome.
36
The board in its decision affirming the examiner refers to the first ground of rejection as "double patenting." Concerning the second basis for the rejection, the board stated:
37
Appellant has filed no reply to this rejection and inasmuch as we find no obvious error therein it is sustained.6
38
In a civil action under 35 U.S.C. § 145 in the District Court, District of Columbia, the position of the Patent Office was that the Hays claims stood rejected on two grounds: (1) double patenting based on the claims of the Keating patent; and (2) that the subject matter claimed by Hays was obvious in view of a prior disclosure by Keating. The filing date of the parent application of Keating and the subject matter carried over into the Keating patent was relied on by the Patent Office. The solicitor disclaimed any reliance on the Keating abandoned application alone. The solicitor explained that this had been the position of both the examiner and the board.
39
The District Court in its decision utilized both "double patenting" and section 103 language in finding for the Commissioner. The findings of fact make it clear that the court found the subject matter claimed by Hays would have been obvious to one of ordinary skill in the art having the benefit of the work done previously by Keating. The record shows that the District Court relied on the earlier filing date of Keating. In double patenting situations, of course, the filing date is of no concern, which suggests to us that the real basis for the decision of the court was 35 U.S.C. § 103.
40
The briefs filed in the Court of Appeals, District of Columbia, in the Hays case and that court's opinion leave no doubt that the statutory basis for refusing a patent to Hays was 35 U.S.C. § 103, and this notwithstanding some of the District Court's findings of fact and conclusions of law.
41
The solicitor stated in his brief before the Court of Appeals in the Hays case:
42
* * * Hence, it is clear from the record that Keating's parent application, which was filed on July 16, 1956 antedates the application at bar. Section 120 of Title 35 USC makes that date the effective date of all subject matter disclosed in Keating's patent which is also disclosed in the parent abandoned application.
43
* * * * * *
44
* * * Hays' purported invention is a narrow improvement over the Keating invention * * * [and] that improvement must be evaluated in accordance with statutory standards (35 U.S.C. § 103) to determine whether appellants are entitled to a patent based on the claims at issue.
45
In view of the fact that Keating's work, as disclosed in the abandoned application and carried forward into the issued patent, was subject matter "described in a patent granted on an application for patent by another [Keating] filed in the United States before the invention thereof by the applicant [Hays]" it was prior art under 35 U.S.C. § 102(e) and 103; Hazeltine Research, Inc. v. Brenner, supra. Despite this fact, both parties injected some double patenting language into the issue before the Court of Appeals.7
46
The Court of Appeals in its opinion refused to consider the merits of any alleged double patenting rejection:
47
Appellants' principal effort before us is limited to showing that, while the Hays "invention" may be obvious under 35 U.S.C. § 103, "in any event the claims of Hays define a different and significantly different invention from the invention pointed out by the claims of Keating and thus the double patenting rejection is obviated by the terminal disclaimer." (Emphasis in appellants' brief.) In answering this contention, we find it unnecessary to undertake an analysis of the differences between the Keating patent and the Hays application, or to determine whether the Hays "invention" is different from the Keating invention. Since appellants' contention for patentability based on the filing of the terminal disclaimer assumes obviousness, § 103 is an absolute bar to the grant of a patent. (357 F.2d at 289)
48
We agree with the Court of Appeals in Hays that a terminal disclaimer will not obviate a rejection for obviousness in view of the prior art under 35 U.S.C. § 103. That situation is not presented here and the solicitor's reliance on the Hays decision is misplaced. The facts here, however, present a case in which the filing of a terminal disclaimer, as permitted under section 253, is effective to overcome a rejection based only on double patenting. The different chemical compounds defined in the respective claims are different inventions. The claims specify these differences and thus do not define the same invention. In re Siu, supra.
49
In summary, where there are in fact separate inventions, each of which is considered patentable over the prior art absent a patent on the other, a rejection based upon double patenting can be obviated by the filing of a terminal disclaimer under 35 U.S.C. § 253 which may be filed by a common assignee.
50
In view of the foregoing, the decision of the board is reversed.
51
Reversed.
Notes:
*
United States Senior District Judge for the Eastern District of Pennsylvania, designated to participate in place of Chief Judge WORLEY, pursuant to provisions of Section 294(d), Title 28, United States Code
1
Bowers and Orr, Ser. No. 138,265, filed Sept. 15, 1961 for "Cyclopentanophenanthrene Derivatives and Process."
2
Bowers and Edwards, Patent No. 3,056,814 issued Oct. 2, 1962, filed Nov. 2, 1961; Bowers and Berkoz, Patent No. 3,082,220 issued Mar. 19, 1963, filed Feb. 21, 1962
3
An additional reference cited by the examiner "to show the state of the art" was not made a part of the record before this court. We therefore do not consider its teachings in this appeal
4
It is clear from the following statement in the examiner's answer that the rejection was based on obviousness under 35 U.S.C. § 103:
Claims 1-12 are rejected as being unpatentable over claim 1 of copending application Serial No. 149,502, of common assignee, now U. S. Patent No. 3,056,814. It is the Examiner's position that the claimed 2-methyl compounds are so closely allied to the 2-desmethyl compounds set forth in the reference, as to be obvious to those skilled in the art and therefore unpatentable under the terms of 35 U.S.C. 103. * * *
* * * * *
Claim 13 has also been rejected as being unpatentable over claim 16 of copending application Serial No. 175,680, of common assignee, now U. S. Patent No. 3,082,220, for the same reasons as set forth in the preceding paragraph.
* * * * *
* * * In view of the cited references, the instant claimed 2-methylated compounds are deemed obvious to those skilled in the art and therefore unpatentable under the terms of 35 U.S.C. 103. * * *
No such rejection is warranted under section 103. In view of the later filing date of the reference patents they cannot properly be considered as prior art for purposes of a section 103 rejection.
5
In Sterling, sections 102(e) and 103 were inapplicable during prosecution because of the same inventorship and copendency. In Siu, involving different inventorship and a common assignee, the patent issued on the earlier filed application. The Patent Office pursued the narrower ground of rejection, double patenting, because the patent claimed the same subject matter as in the second application.
6
On petition for rehearing the board adhered to its original decision stating a reply to the second rejection during oral argument was no substitute for written argument. On petition to the Commissioner, reopening of prosecution of the case was denied because an action had been commenced in the District Court. In view of the arguments made in the District Court and Court of Appeals, the solicitor apparently preferred to pursue a rejection under section 103, discussed infra in the text, and did not rely on the alleged failure of Hays to reply to the rejection
7
Appellant apparently tried to avoid the section 103 obviousness rejection by calling it "double patenting" so as to argue that a terminal disclaimer obviated the rejection. The solicitor apparently wished to characterize the issues in In re Robeson and in In re Kaye, supra, as involving section 103, which they did not
While in those cases we held that an "obvious" type double patenting rejection could be overcome by filing a terminal disclaimer, that does not mean that a section 103 rejection for obviousness may be similarly overcome. Under section 103, a reference patent is available for all it fairly discloses to one of ordinary skill in the art. There is no inquiry as to what is claimed therein. In the "obviousness" type of double patenting rejections, the test is not what would be obvious to one of ordinary skill in the art from reading the specification on the claims. In re Sarett, 327 F.2d 1005, 51 CCPA 1180. Rather, the inquiry is much more limited in nature and the patent is considered only to compare the invention defined in the patent claims with the invention defined in the application claims.
52
KIRKPATRICK, Judge (concurring).
53
The opinion in this case is a logical extension of the reasoning of the Court in In re Robeson, 331 F.2d 610, 51 CCPA 1271, and In re Kaye, 332 F.2d 816, 51 CCPA 1465. The rationale of those decisions requires the reversal of the board's decision in the present case and, for that reason only, I concur.
| {
"pile_set_name": "FreeLaw"
} |
[Evaluation of patients with below knee amputation with respect to postoperative prosthesis fitting].
A follow-up of 78 patients who underwent amputations just below the knee at Herlev University Hospital in the county of Copenhagen during the period 1985-88 is reported. The examination was carried out an average of 39 months later. There was a high early mortality postoperatively as well as in the subsequent years. Of the survivors, about 87% were found to be candidates for prosthetic fitting. Nearly all of these patients became functional ambulators. The long-term survival rate was primarily correlated with a reduced occurrence of concurrent medical diseases, especially of a cardiovascular nature. All women with diabetes mellitus were deceased at the time of examination. This is a well-known phenomenon, diabetes being associated with a sixfold higher mortality risk. The study indicates that the effort and expense of fitting and training patients with prostheses may be well worthwhile. | {
"pile_set_name": "PubMed Abstracts"
} |
Q:
window.close() not working on Windows Mobile 6.5
I'm calling a webpage from a mobile device (a motorola MC55A0).
The browser is IEmobile.
The page has a button on it; a press on that button calls some JavaScript code ending with the line:
window.close();
The javascript executes fine until that line, where nothing happens.
The browser is expected to close but doesn't.
What could be the cause of that behavior?
EDIT: I would like to add that the same webpage worked on another mobile device, with Windows CE 5.0 (Motorola MC3000 series)
A:
Remember: you can call window.close() only on windows that have been opened with window.open().
See: Scripts may close only the windows that were opened by it
This method is only allowed to be called for windows that were opened
by a script using the window.open method. If the window was not
opened by a script, the following error appears in the JavaScript
Console: Scripts may not close windows that were not opened by
script.
that's the "nothing happens" you're facing. actually it's not absolutely nothing- a script message has been printed to the console.
hope that helps.
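To make that restriction concrete, here is a minimal sketch. The helper name and the injected opener function are made up for this example (they are not part of any browser API); the point is the pattern: keep the handle returned by window.open() and call close() on that handle, since calling window.close() on a window the script did not open may be silently ignored, as on IEmobile here.

```javascript
// Illustrative sketch: a script may only close windows it opened itself,
// so hold on to the handle that window.open() returns and close that.
// The opener function is injected so the sketch can also run outside a
// browser (e.g. in Node) for demonstration purposes.
function openWorkAndClose(opener, url, work) {
  var child = opener(url);   // in a browser: function (u) { return window.open(u); }
  if (!child) {
    return false;            // no handle (popup blocked, or page not script-opened):
  }                          // a close() attempt would be refused
  work(child);               // do whatever the page needs in the child window
  child.close();             // allowed: this script opened the window
  return true;
}
```

In a browser this would be invoked as `openWorkAndClose(function (u) { return window.open(u); }, "page.html", doStuff)`. A page that was launched directly on the device was not opened by script, so there is no handle to close, which matches the "nothing happens" behavior described above.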
| {
"pile_set_name": "StackExchange"
} |
This blog comments on a variety of technology news, trends, and products and how they connect. I'm in Red Hat's cloud product strategy group in my day job although I cover a broader set of topics here. This is a personal blog; the opinions are mine alone.
Tuesday, November 15, 2016
Andi Mann is Chief Technology Advocate at Splunk. In this podcast, he discusses some of the ways in which data plays an important role in DevOps. I’ve known Andi for ages since we were both IT industry analysts and we had a chance to sit down at CloudExpo/DevOps Summit/IoT Summit in Santa Clara where Andi was chairing a DevOps track and I was one of the speakers. (We also did a data and DevOps panel together on the main stage but that video doesn’t seem to be up yet. I’ll post once it is.)
Among the topics we tackle are choosing appropriate metrics that align with the business rather than just technical measures, creating feedback loops, using data to promote accountability, and DevSecOps.
Gordon Haff: I'm sitting down here with an old analyst mate of mine, Andi Mann. Also, formerly of CA, also the author of some books, and now he is the chief technology advocate with Splunk. What we're going to talk about today is data in DevOps. Welcome, Andi.
Andi Mann: A lot of the customers I talk to, who are doing DevOps in various versions using Splunk...it boils down to three key areas that they really want to know about, the metrics that matter for them.
The first is really about how fast are they? What's their cycle time? How quick does it take for an idea to get in front of a customer? How long does it take someone in business to come up with something and then basically make money from it, or, in government, they service their citizens with it? That cycle time is really important, the velocity of delivery.
The second key area that people look at is around the quality of what they're delivering. Are they doing good? Are they delivering good applications? Are they creating downtime? Are they having availability issues? Is one release better than another?
The third area is really around what sort of impact do they have? Measuring real business goals, MBOs, things like revenue and customer sign‑up rates, and cart fulfillment, and cart abandonment. These sorts of things. Those are the metrics that my customers, the people I talk to are interested in for DevOps, closing those feedback loops in those three areas.
Gordon: One of the things I find interesting, what you just said, Andi, is that you read these DevOps surveys, DevOps reports, and often the metrics, or at least what they're calling metrics, are framed in much more technical terms. How many releases do we have per year, or per week, per hour?
What's the failure rate? How quickly can we do builds? How quickly can we integrate? Which, I think to your point, are probably worth measuring, but they're really...The ultimate goal of DevOps is not to release software faster.
Andi: Exactly. It's interesting because you do look at these metrics in isolation, and they matter. All this matters. 10 deploys a day, we all know that from 2009 in Velocity. That matters, but 10 deploys a day is no good if they're all bad deploys. You need to measure quality in that.
But even if it's a good quality deploy and you do it quickly, if it's not moving the needle on what your business wants you to be doing, then again, it doesn't matter. I think it's actually really important to connect these together so you really are getting metrics, correlating metrics, that matter across the whole range to really understand whether you're doing good or not.
Gordon: One of my favorite Dilbert cartoons, I don't remember the exact wording, but it's to the effect of...Pointy Hair goes, "We're now going to measure you on the number of lines of code you write," and Wally says, "I'm going to go off and write myself a new car today."
Andi: [laughs] Yeah, exactly. That's one of the things that I actually do measure. We measure it internally. A bunch of our customers do actually measure code volume. There's a couple of interesting reasons for that. Especially in a DevOps and Agile mode, actually delivering too much code can be a signifier that you're doing things badly.
You're writing too much code, you're doing too much in one release rather than doing small, iterative releases. It can also signify that one person has too much of a workload. When you think about DevOps and the concepts around empathy and wanting to make sure that life doesn't suck for everyone, when one person is doing all the work, that sucks for them.
There are actually good things that come out of measuring code volume [laughs] but saying that more code equals better code, equals a bonus? That's a really bad thing. [laughs]
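The code-volume signal Andi describes is easy to prototype against a git repository. Here is a minimal sketch; the repo path, time window, and overload threshold are all illustrative assumptions, not how Splunk or anyone else actually measures this internally:

```python
from collections import Counter
import subprocess

def lines_changed_per_author(repo_path, since="30 days ago"):
    """Sum added plus deleted lines per author using `git log --numstat`."""
    out = subprocess.run(
        ["git", "-C", repo_path, "log", f"--since={since}",
         "--numstat", "--format=author:%an"],
        capture_output=True, text=True, check=True,
    ).stdout
    totals, author = Counter(), None
    for line in out.splitlines():
        if line.startswith("author:"):
            author = line[len("author:"):]
        elif line.strip():
            added, deleted, _path = line.split("\t", 2)
            if added != "-":  # binary files report "-" for line counts
                totals[author] += int(added) + int(deleted)
    return totals

def overloaded(totals, ratio=0.5):
    """Flag anyone responsible for more than `ratio` of all code churn,
    a possible sign one person is carrying too much of the workload."""
    grand = sum(totals.values()) or 1
    return [author for author, n in totals.items() if n / grand > ratio]
```

The point, as in the conversation, is that a spike in one person's churn is a signal to investigate, not a target to reward.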
Gordon: I think a lot of people tend to lump data metrics into this one big bucket. As we've had discussions before, there are these business metrics which have to be somehow connected to things.
It's not clear that overall company revenue is necessarily a good DevOps metric. Some of the other things you mentioned certainly are. In many cases, it does make sense to collect a lot of underlying data for data analytics and things like that. Then, you also have alerts.
Andi: Yeah, the business stuff is really interesting. I know one of our customers who delivers software-as-a-service. They're a SaaS company, cloud native and all that. Their developers actually do care about who uses specific features.
They'll implement a feature. They do canary releases. They'll implement a feature on 10 out of 1,000 servers, or whatever. A certain volume or percentage of their customers will get access to it. Then they'll measure, using Splunk, the way that those features are being used or not. They also measure the satisfaction of those customers.
They've got these nice smileys, and tick marks, and stuff that say, "Yes, I enjoyed using this feature." They can correlate that together, and it actually means that the day after doing a commit, after doing a release, they actually know whether the business use case is being satisfied, which is very cool.
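That correlation of canary usage with satisfaction ratings can be sketched as a small aggregation. Everything below (the event shape, the cohort and field names) is an illustrative assumption, not Splunk's actual data model:

```python
def canary_report(events):
    """events: iterable of dicts with keys 'cohort' ('canary' or 'control'),
    'used_feature' (bool) and 'satisfied' (bool, or None if not rated).
    Returns per-cohort feature usage rate and satisfaction among raters."""
    stats = {}
    for e in events:
        c = stats.setdefault(e["cohort"],
                             {"n": 0, "used": 0, "rated": 0, "happy": 0})
        c["n"] += 1
        c["used"] += e["used_feature"]
        if e["satisfied"] is not None:
            c["rated"] += 1
            c["happy"] += e["satisfied"]
    return {
        cohort: {
            "usage_rate": c["used"] / c["n"],
            "satisfaction": (c["happy"] / c["rated"]) if c["rated"] else None,
        }
        for cohort, c in stats.items()
    }
```

Run daily after a canary deploy, a report like this is the feedback loop Andi describes: the day after a release you know whether the business use case is being satisfied.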
I know a television company in the UK that we work with. They actually send reports on a weekly basis, I think it is, to their marketing department, based on whether users are using the website, what they're doing on the website, whether they're clicking through on competitions.
That's actually really important, but obviously mostly what people are doing in using data and the feedback...Closing the feedback loops is what I'm talking about here at DevOps Summit.
They're closing the feedback loop around those technical measurements.
Am I creating more bugs? Am I creating availability issues? Am I creating problems with uptime? Am I closing out the feature set that is in the story or in the epic that I was promising to do? Partially, it's also around this accountability to each other. Am I doing what I'd promised I'd do?
Gordon: Talk a little more about accountability.
Andi: Yeah, that's one of my soapboxes at the moment. I see a lot of the empowerment that DevOps gives developers to make decisions. I think that's great, especially in companies where you've got systems thinking and they understand their role in the organization and what it means to deliver good outputs for their customers.
You give them a lot of responsibility. Their manager is the leader. You give your developers, and your operations team, and those DevOps professionals a lot of responsibility and a lot of empowerment to do the right thing.
Also, I think that there's a need for them to be accountable for doing the right thing as well, especially as DevOps grows in larger organizations and there are more and more people involved. There's also the concept that DevOps is about helping and making sure that everyone is having a good experience in their life and their work.
As a developer, you're making sure that operators aren't getting called out late at night, and all this sort of stuff. If DevOps is about helping to work with each other, to collaborate, to communicate better, to make sure each other's lives get better as Dev and Ops professionals, then I think you need to be accountable in two ways.
You need to be accountable to your business, which often means being accountable to your manager for doing the work that you're meant to do, and doing the work you promised you would, within the bounds of the responsibility you've been given.
It's also being accountable to each other, for doing good work and doing the right work in ways that help your whole team move forward and make everyone else's life positive. I think we talk a lot about empowerment and enablement. We don't really talk much about the flip side of that, which I think is that accountability.
Gordon: There's a lot of culture talk around DevOps, and we have had lots of discussions around culture and some of the ways it can be overextended and over‑applied. Yeah, it can turn into this "don't fear failure," empathy, transparency, etc. Unicorns farting rainbows. This very touchy-feely, everyone's happy and sings "Kumbaya," but you are, at the end of the day, being paid to produce business outcomes.
There does need to be some accountability there. If you crash the SQL server three weekends in a row, and call in Ops, somebody's going to have to talk with you, as they should.
Andi: Exactly. Especially when you talk about the DevOps toolchain and the life cycle of software. It's a very complex and opaque thing to try to see what's going on at every stage, especially if you're a manager who's not necessarily fully fluent in specific tools. You can't dig into the specific tools to have a look at that.
I think reporting up to your management and reporting to each other and saying, "I introduced these bugs and I'm sorry for it. I won't do it again." By the same token, "I introduced these newest features, and they were really successful. We should all celebrate that as a team."
I think that accountability is actually really important. You'll see this in manufacturing as well where we get a lot of our examples from. You'll see that if one person makes the same mistake several times, then they'll get into a training program, or they'll get different mentoring.
Maybe they'll move into a different part of the line where they're better suited, and their skills are better suited. You don't know how to make your team better if you're not being accountable to each other, and to your management.
That's, I think, something we've got to step up to as DevOps professionals, for want of a better term: how do we be accountable to each other, and to the company that pays us, as you said, to do the job?
Gordon: You just talked about manufacturing. You just mentioned quality, and I think that's a pretty good segue because we often think about DevOps primarily, well, through the lens of developer for one thing, but that's another topic for another day.
We also tend to view DevOps, first and foremost, through the lens of this velocity, business agility, and so forth, but there is a very important quality component there as well. What are some of the ways that data can help to surface that quality component?
Andi: Absolutely. Some of the things we're looking at ‑‑ and our customers are doing a lot of this at the moment ‑‑ are areas like code coverage in tests, the number of defects, and defect rates per release. Looking at aggregating and correlating the quality metrics out of multiple test and scanning tools.
Doing static analysis and looking at the defect rates, doing dynamic analysis, and then also looking at the defect rates, as well as application performance and health scores. Looking at the performance in terms of resource utilization, response time, availability, execution failures, and so forth.
Comparing the current release in production with the next release just about to come forward, and being able to run that over time, so you can see whether you're making quality improvements over time.
If you're able to actually give your application a health score, and then you can measure that not just in production, but also in staging, or pre‑prod, whatever you want to call it, then you can start to make sure that you're getting better with every release. Your quality is going up with every release.
You can do that with actual data, real measurements coming out of these testing tools, as well as out of actually running that in a stood-up environment. There are lots of feedback loops you can close there.
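The health-score idea can be sketched as a simple weighted combination of normalized metrics. The metric names and weights below are illustrative assumptions, not a standard scoring model:

```python
def health_score(metrics, weights=None):
    """Combine normalized metrics (each in 0..1, higher is better)
    into a single 0-100 score. Weights are illustrative assumptions."""
    weights = weights or {"availability": 0.4,
                          "response_time": 0.3,
                          "error_free_rate": 0.3}
    return round(100 * sum(w * metrics[k] for k, w in weights.items()), 1)

# Score the same application in production and in staging/pre-prod,
# so every release can be checked for quality movement.
prod = health_score({"availability": 0.99, "response_time": 0.80,
                     "error_free_rate": 0.95})
staging = health_score({"availability": 0.99, "response_time": 0.90,
                        "error_free_rate": 0.97})
quality_is_improving = staging >= prod
```

The useful part is not the absolute number but the trend: running the same score in staging and production lets you verify that quality is going up with every release.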
Once you start to find problems, especially in production but also in pre‑prod and staging, you feed those things back into the test cycle so that you never find the same mistake twice, because the first time you find it, the next time you test for it.
Gordon: This idea of doing things incrementally in stages, before they hit production, is really important from a security perspective as well. I was just having a conversation with one of my colleagues, or actually several of my colleagues, about this kind of tension between the traditional security guy who is sort of, "Stop. Stop. Don't push it out there," and this idea of whether you like the term or not, DevSecOps, where security gets baked in, and added incrementally.
What we were saying, and what was really coming out as we were having this discussion, was that the reason there's this tension, or maybe disconnect, is that from the security guys' point of view, serious security flaws are being pushed out into production.
Well, that is something that simply needs to be stopped. To the degree that you can tolerate failures and errors in security that don't hit the actual production environment, because you found them through automated testing or whatever, then that makes more sense as this incremental, sometimes breaking-things sort of process.
Andi: Yeah. Absolutely. This is actually something I've done a little bit of work on, and most of the work is being done by someone that you probably know well, Ed Haletky of the TVP, @Texiwill on Twitter.
Also using some of those tools like Fortify, which will do quality-of-code scanning for security purposes. You can start to shift left in that respect, but also continue to get inputs from security testing even post-release. There's no reason why security testing can't keep going even after you've released.
You can get to a certain coverage rate. This is where data helps. You get to a 90 percent, or a 92 percent, or a 95 percent coverage rate, or confidence level if you will. You go, "OK, I'm ready to release. I know that the remaining five percent is potentially low impact, or low risk. I'll put it out there anyway, but continuing to test."
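That coverage-based release decision can be sketched as a tiny gate function. The 90 percent default mirrors the confidence levels mentioned above, but the exact threshold and policy are assumptions for illustration, not any particular tool's behavior:

```python
def release_gate(coverage_pct, high_severity_open, threshold=90.0):
    """Decide whether to ship based on security-test coverage.
    Assumes residual risk below the threshold is low impact and
    that testing continues after release, as described above."""
    if high_severity_open:
        return "blocked: high-severity findings still open"
    if coverage_pct < threshold:
        return f"blocked: coverage {coverage_pct:.0f}% is below {threshold:.0f}%"
    # Residual risk is accepted; post-release testing keeps running.
    return "release: keep testing post-release"
```

A gate like this makes the "I'll put it out there anyway, but continue testing" decision explicit and data-driven rather than a gut call.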
There's some really interesting work out there that Ed's published about cloud, cloud promotion, and cloud delivery that actually really focuses on using these metrics from security testing, both pre and post release, which I think is actually really important.
Gordon: We're going to be hearing a lot more about this whole security angle everywhere. This is partly an IoT show. We've heard a lot about security. I'm not sure we've heard a lot of solutions, but we've heard a lot about security.
Obviously, it is a big part of the DevOps discussion. It's a big, scary world out there, and it's pretty universally recognized that having an auditor sign off once a year, and then you don't think about security for that application for another six months or whatever, really doesn't work today.
Andi: Yeah. It's not my joke. I saw someone post it the other day. "What did you get owned by? Your toaster or your fridge?" It's so true, especially in IoT, but in a DevOps perspective, or DevOps context, being able to do that continuous security testing, I think, is really important, and it shifts security left.
We talk about a shift left in all sorts of other areas, and we're doing it with QA, which I think is awesome. We need to start doing it more with security, I believe. At Splunk, we do have a whole security practice around incident and event monitoring and user behavior analytics. Being able to start to apply some of that in the test, pre‑prod, and staging environments, I think, is really important.
Being able to do some automated audit reporting around what is happening, penetrations, security violations, PII exposure, potential hard-coded passwords, stuff like that. There's a bunch of stuff that developers could be, and should be, responsible for that actually makes a security pro's life easier, not harder. I think there's a lot of work yet to be done on that.
Gordon: Absolutely. I'd go back to DevSecOps. I think there's this school of thought that, well, if you'd read the Phoenix Project properly, you wouldn't have to be having this discussion. You know, security was baked in. Meanwhile, in the real world, security has tended to be this separate profession.
We were both at DevOpsDays London. I still remember a security professional, I guess in his 40s, standing up in an open space and going, "I'm one of those security guys who's been getting in your way. You know, this is the first time I've ever been to an IT conference that wasn't purely a security conference."
I love that story. Certainly not to pick on that guy. It was quite brave of him, getting up like that. I think it's such a perfect illustration of how security has operated in its own world as this gatekeeper to releasing applications.
Andi: Yeah. People joke about IT being a department of no. Security has that moniker, fair or not. Obviously, security teams are just looking out to protect the business. That's their job. Having them in the tent, I think, is a better option, and we've started to bring other teams into the tent of DevOps.
I actually gave a presentation. You can find it online at the Splunk user conference, that was titled something along the lines of "Biz PMO Dev Sec QA Biz Ops," or something crazy like that, about broadening the tent of DevOps.
Security's got to come into this tent. Bringing a security pro into your team, into your scrum, that's got to be a good start, doesn't it?
Gordon: Right, even if they're not in the stand‑up meeting every week, or every day, at least have them be part of the team, just like there used to be a business analyst who was part of the team. Take our own Products and Technologies operations team and their DevOps story.
I call it the "Banana Pickle Story," because they would get asked for a banana, and, as Katrinka describes it, six months later they'd deliver this pickle. Really, their DevOps story is, again, at the business level, because that's what matters to me.
They used a lot of technology like OpenShift Platform-as-a-Service and Ansible for automation, things like that. But again, they were really focused on the business story of how do we get the stakeholders iterating with us. "Oops, that banana's looking a little green. Let's dial that back to yellow and get on with the other things."
Andi: Yeah, and this is the agile model for development: getting someone from the business...You're creating an MVP and getting someone from the business to evaluate it, and you continue to iterate with their advice.
You know that you're creating the right thing as you're creating it, rather than finding out in six months' time that you've created a pickle instead of a banana. [laughs] I love that analogy.
We should be doing that more and more with security. If security is saying no to you all the time, then maybe you're not inviting them to the party as much as you should, so that they can say yes iteratively, rather than one big no at the end.
Gordon: Right. Just to cap off this podcast. In order to prove to security, much less external auditors and to prove to these other stakeholders, you need data.
Andi: Absolutely. Exactly right. This is fundamental to what I believe. We cannot continue making decisions based on "I feel that this is the right thing to do. I think we're going to have good results here." We're living in a society that's driven by data and facts. Especially as developers or IT professionals, we need to have these feedback loops based on real data.
Not just people coming back and saying "I don't feel like you did the right thing. I don't think that this was good. I think our release worked and helped our customers." We need to come back and stop having these back‑and‑forths over opinions.
There are some very crude statements about "Everyone's got opinions," right? I like to say, "In God we trust. All others bring data." That's how we get these real feedback loops in a systems mode, getting feedback from production systems, from customer interactions, from the security violations and the passes that we do make.
From the coverage, to know if we are doing the right thing in terms of speed, in terms of quality, in terms of impacting our business, that's where data has a huge role to play. It’s those feedback loops that DevOps depends on.
Wednesday, November 09, 2016
Traditionally, in open source, there was a lot of emphasis on singular projects. Today, it's much more about how multiple communities interact and build on each other. In this podcast recorded at the OpenShift Commons Gathering and Kubecon in Seattle, Red Hat's Diane Mueller discusses what she's learned as Director for Community Development at OpenShift and what's coming next.
Show notes:
You can find Diane on Twitter @pythondj or email dmueller (at) redhat.com.
Tuesday, November 08, 2016
As DevOps practices have been put into wide use, it's become evident that developers and operations aren't merging to become one discipline. Nor is operations simply going away. Rather, DevOps is leading software development and operations - together with other practices such as security - to collaborate and coexist with less overhead and conflict than in the past.
In my session at @DevOpsSummit at 19th Cloud Expo, I discussed what modern operational practices look like in a world in which applications are more loosely coupled, are developed using DevOps approaches, and are deployed on software-defined, and often containerized, infrastructures - and where operations itself is increasingly another "as a service" capability from the perspective of developers.
How does the operations tool chest change? How does the required skill set differ? How are the interactions between operations and other IT and business organizations different from in the past? How can operations provide the confidence to the entire organization that this new pipeline is still delivering non-functional requirements such as regulatory compliance and a secure and certified operating environment? How does operations safely consume vendor and upstream dependencies while meeting developer desires for the latest and greatest?
Operations is more important than ever for a business to derive value from its IT organization. But the roles and the goals of operations are significantly different than they were historically.
Tuesday, November 01, 2016
If you're ignoring PaaS because early offerings didn't meet your needs, or because you're more focused on operations than developers, you should look again. It enables ops to enable developers efficiently and to manage an underlying container infrastructure.
Circulating in drafts beginning in 2009, some variant of the NIST Cloud Computing definition used to be de rigueur in just about every cloud computing presentation. Among other terms, this document defined Infrastructure-as-a-Service (IaaS), Platform-as-a-Service (PaaS), and Software-as-a-Service (SaaS) and, even as technology has morphed and advanced, this is the taxonomy that we still largely accept and adhere to today.
That said, PaaS was never as crisply defined as IaaS and SaaS because “platform” was never as crisply defined as infrastructure or (end-user) software. For example, some platforms were specific to a SaaS, such as Salesforce.
Others, specifically the online platforms that were most associated with the PaaS term early on, were typically tied to particular languages and frameworks. These PaaSs were very “opinionated.” For example, the original Google App Engine supported an environment that was essentially just Python, and Heroku was all about Ruby. Heroku's twelve-factor app manifesto was an additional type of opinion: write your apps this way or they won’t really be suitable for the platform. These platforms may not have been just for hobbyists, but they were certainly much more suited to developer prototyping and experimentation than production deployments.
At the same time, platform was also used more broadly to cover the integration of a range of middleware, languages, frameworks, other tools, and architecture decisions (such as persistent storage) that a developer might use to create both web-centric and more traditional enterprise applications. Furthermore, such PaaSs as OpenShift remained not only “polyglot” but also allowed for an increasing range of deployment types both on-premise and in multi-tenant and dedicated online environments. (As well as on developer laptops using the upstream open source OpenShift Origin project.)
However, the various approaches to PaaS did have a common thread. They were bundles of technology that were largely framed as appealing to developers.
One of the greatest benefits of a PaaS is its ability to create a bright line between what's "operations" and what's "development". In other words, what's "yours" and what's "theirs".
Things get complicated and expensive when that line blurs: developers demand tweaks to kernel settings, particular hardware, etc. which fly in the face of any standardization or automation effort. Operations, on the other hand, creates inflexible rules for development platforms that prevent developers from doing their jobs. PaaS decouples these two, and permits each group to do what they're good at.
If you've outsourced your operations or development, this problem gets worse because any idiosyncrasies on the ops or the development side create friction when sourcing work to alternate vendors.
By using a PaaS, you make it perfectly clear who's responsible for what: above the PaaS line, developers can do whatever they like in the context of the PaaS platform, and it will automatically comply with operations standards. Below the line, operations can implement whatever they like, choose whatever vendors they like, as long as they're delivering a functional PaaS environment.
We spend a lot of time talking about why PaaS is great for developers. I think it's even better for procurements, architecture, and budget.
Today, with the rise of DevOps on one hand and containers on the other, it’s increasingly clear that a PaaS can be the sum of parts that are of direct interest mostly to developers and parts that are of direct interest mostly to operations.
DevOps both leads to change and reflects change in a couple of areas.
First is the number of tools that organizations are bringing into their DevOps (or DevSecOps if you prefer) software delivery workflow. Most obvious is the continuous integration/continuous delivery pipeline, most notably with Jenkins. But there are also any number of testing, source code control, collaboration, and monitoring tools that need to be integrated into the workflow. At the same time, developers still want their self-service provisioning with an overall user experience that’s tailored to how they work. A PaaS is an obvious integration and aggregation point for this tooling.
DevOps is also changing the way that developers and operations work with each other. Early DevOps discussions often focused on breaking down the wall between Dev and Ops. But this isn’t quite right. DevOps does indeed embody cultural elements such as collaboration and cooperation across teams—including Dev and Ops. But there’s also a recognition that the best form of communication is sometimes eliminating the need to communicate at all. To the degree that Ops can build a self-service platform for developers and get out of the way, that can be more effective than improving how dev and ops can work together. I don’t want to communicate more effectively with a bank teller; I want to use an ATM (or skip cash entirely).
Containers have also influenced how some organizations are thinking about PaaS. Many PaaS solutions (including OpenShift) have been based on containers from the beginning. But each platform did their own implementation of containers; in OpenShift it was Gears, in Heroku it was Dynos, in CloudFoundry it was Warden (now Garden) containers.
As the industry moved to a container standard (Docker-format with standardization through the Open Container Initiative (OCI)), OpenShift moved with it. Red Hat has helped drive that movement along with many others though not all PaaS platforms have participated in the shift to standards.
With container formats, runtimes, and orchestration increasingly standardized through the OCI and Cloud Native Computing Foundation (where kubernetes is hosted), there’s increasing interest from many ops teams in deploying a tested and integrated bundle of these technologies outside of any specific development environment initiatives within their companies.
That’s because the huge amount of technological innovation happening around containers and DevOps can be something of a double-edged sword. On the one hand it creates enormous possibilities for new types of applications running on a very dynamic and flexible platform. At the same time, channeling and packaging the rapid change happening across a plethora of open source projects isn’t easy—and can end up being a distraction from the ultimate business goals.
As a result, at Red Hat, we talk to customers who view OpenShift primarily through the lens of a container management platform rather than the more traditional developer-centric PaaS view. There’s still a developer angle of course—a platform isn’t much use unless you’re going to run applications on it. But sometimes there are already developer tooling and workflows in place and the pressing need is to deploy a container platform using Docker-format containers and kubernetes orchestration without having to assemble these from upstream community bits and support them in-house.
An integrated platform leads to real savings. For example, based on a set of interviews, IDC found that:
IT organizations that want to decouple application dependencies from the underlying infrastructure are adopting container technology as a way to migrate and deploy applications across multiple cloud environments and datacenter footprints. OpenShift provides a consistent application development and deployment platform, regardless of the underlying infrastructure, and provides operations teams with a scalable, secure, and enterprise-grade application platform and unified container and cloud management capabilities.
Among its quantitative findings was 35 percent less IT staff time required per application deployed. [1]
In short, PaaS remains a central part of the cloud computing discussion even if the name is sometimes discarded for something more specific or descriptive such as container platform. What’s perhaps changed the most is the recognition that PaaS isn’t just a tool for developers. It’s also a way for ops to enable developers most efficiently and to manage the underlying container infrastructure.
[1] I’ve got some other good data points and outside perspectives that I’ll share in a future post.
About Me
I'm a technology evangelist for Red Hat, the leading provider of commercial open source software. I'm a frequent speaker at customer and industry events. I also write extensively on and develop strategy for Red Hat’s hybrid cloud portfolio.
Prior to Red Hat, as an IT industry analyst, I wrote hundreds of research notes, was frequently quoted in publications such as The New York Times on a wide range of IT topics, and advised clients on product and marketing strategies. Among other hobbies, I do a lot of photography and enjoy the outdoors.
Description
Visi-Flow Irrigation Starter Set is designed for persons with colostomies for whom irrigation is indicated. It is a complete irrigation system offering the advantages of flow visibility and control, easy front filling and a removable stoma cone. The Visi-Flow Irrigation System is compatible with the Visi-Flow Irrigation Sleeves. Flange: 1 3/4" (45 mm)
CDKN2A germline mutations in U.K. patients with familial melanoma and multiple primary melanomas.
We report six of 16 U.K. melanoma families and two of 17 patients with multiple primary melanomas and a negative family history who have between them four different functionally damaging mutations of the CDKN2A (p16) gene: an Arg 24 Pro substitution in exon 1 in one family, a stop codon at codon 44 of exon 1 in one family, and a Met 53 Ile substitution in exon 2 in four families. One multiple primary melanoma patient also has the Met 53 Ile mutation and a second has a G-T substitution at the IVS2 + 1 splice donor site. Our data together with other recent publications from France and the U.S.A. indicate that screening melanoma kindreds with only two affected family members for CDKN2A mutations is justified.
Isoft is a very powerful SaaS, software, and startup WordPress theme. It uses all the latest features to help you make a website that stands out from the rest. It's a powerful, super flexible, responsive, professional, and multipurpose tool. It comes with numerous customizable and reusable components that are designed to fit as many purposes as possible. They are also easy to customize and to combine with other components.
The theme allows you to create as many new page layouts as you wish. Isoft comes with 17 home pages, plus Portfolio, Pricing, Services, 11 blog types, Support, and Shop layouts to choose from. There are also Contact page and About me/us layouts, and unique headers and footers to display your work, along with our one-click demo importer!
We made the theme compatible with such plugins as The Grid, WPBakery Page Builder (Visual Composer), Contact Form 7, Google Maps, Booked, WooCommerce, and more!
For more information regarding this product, you can read the extensive theme documentation source. If you meet any issues using this theme our tech support is always happy to help!
Grid
The Grid is a premium WordPress grid plugin which allows you to show off any custom post types in a fully customizable and responsive grid system. It is perfectly suited for displaying your blog, portfolio, e-commerce or any kind of WordPress post type. This plugin support the following post format: standard, video, audio, gallery, link, quote.
WPBakery Page Builder for WordPress (Visual Composer)
WPBakery Page Builder is easy to use drag and drop page builder that will help you to create any layout you can imagine fast and easy. WPBakery Page Builder gives you full control over responsiveness. Create responsive websites automatically or adjust preferences you need to ensure your WordPress website looks perfect on mobile or tablet. WPBakery Page Builder has everything it takes to create a responsive website.
Google Maps
Google Map Visual Composer Addon is a very easy-to-use and powerful free Google Maps addon for Visual Composer. With this addon, you can create different types of maps, with options to change the language, map controls, and zoom controls, plus lots more functionality; you can also set up the style. It generates fully customized, responsive Google Maps in your local language. 114 different styles (skins) are available, and counting.
Contact form 7
We have added the Contact Form 7 plugin to let you create attractive contact forms. It can manage multiple contact forms, and you can customize the form and the mail content flexibly with simple markup. Forms support Ajax-powered submission, CAPTCHA, Akismet spam filtering, and more.
WooCommerce integration
Isoft is 100% WooCommerce compatible and includes full design integration that looks amazing. With Isoft theme you can easily create a highly professional and fully functional online shop.
100% Fully Responsive
Isoft theme is fully responsive. No matter what device your visitors are using to access your site, the layout will fluidly respond to the screen size to ensure they can still read, browse, shop, download, and interact with your website in every other way.
Lifetime Updates and User Support
Each purchase of the theme guarantees you lifetime access to future theme updates at no extra cost. You also get six months of user support, with the option of extending this period should you wish.
More features:
Built with HTML5 and CSS3
Responsive layout (desktops, tablets, mobile devices)
Clean & Modern Design
Cross Browser Compatible (Chrome, FF, Safari, Opera)
One-click demo install
Sidebar widget area
Unlimited color/skins
Child theme included
Page preloader
404 Page customization
Theme settings backup import/export
Professional Support
Regular Updates
Theme Documentation
Notice:
All images in the iSoft theme may be used only in the demo version. You will need to upload your own images for your site.
Charles H. Grosvenor
Charles Henry Grosvenor (September 20, 1833 – October 30, 1917) was a multiple-term U.S. Representative from Ohio, as well as a brigade commander in the Union Army during the American Civil War.
Biography
Grosvenor was born in Pomfret, Connecticut. He was the uncle of Charles Grosvenor Bond. In 1838, Grosvenor moved with his parents to southeastern Ohio, where he attended school in Athens County. He later taught school before studying law. He was admitted to the bar in 1857 and practiced in Athens.
During the Civil War, Grosvenor served in the 18th Ohio Infantry and was promoted through the ranks to colonel. He led his regiment at the Battle of Chickamauga in 1863, and was a brigade commander in the division of Charles Cruft at the Battle of Nashville in December 1864. At the close of the war, Grosvenor was brevetted as a colonel in the Regular Army. He was mustered out of the volunteers on October 9, 1865. On January 13, 1866, President Andrew Johnson nominated Grosvenor for appointment to the grade of brevet brigadier general of volunteers, to rank from March 13, 1865, and the United States Senate confirmed the appointment on March 12, 1866.
Following the war, Grosvenor held diverse township and village offices. He served as a member of the State house of representatives from 1874–1878 and served as Speaker of the House for two years. He served as member of the board of trustees of the Ohio Soldiers' and Sailors' Orphans' Home in Xenia from April 1880 until 1888, and president of the board for five years.
He was a presidential elector for Grant/Wilson in 1872 and for Garfield/Arthur in 1880.
He served as delegate to the Republican National Convention in 1896 and 1900. Grosvenor was elected as a Republican to the Forty-ninth, Fiftieth, and Fifty-first Congresses (March 4, 1885 – March 3, 1891). He was an unsuccessful candidate for renomination in 1890.
Grosvenor was elected to the Fifty-third and to the six succeeding Congresses (March 4, 1893 – March 3, 1907).
He served as chairman of the Committee on Expenditures in the Department of the Treasury (Fifty-fourth Congress), Committee on Mines and Mining (Fifty-fifth Congress), Committee on Merchant Marine and Fisheries (Fifty-sixth through Fifty-ninth Congresses). He was an unsuccessful candidate for renomination in 1906.
He resumed the practice of law in Athens. The combat veteran was appointed as chairman of the Chickamauga and Chattanooga National Park Commission and served from 1910 until his death in Athens on October 30, 1917. He was interred in Union Street Cemetery.
Grosvenor married Samantha Stewart of Athens County, December 1, 1858. She died in 1866, leaving a daughter. He married Louise A. Currier, also of Athens County, May 21, 1867. She had two daughters.
See also
List of American Civil War brevet generals (Union)
Notes
References
External links
1904 Photo
Category:1833 births
Category:1917 deaths
Category:Members of the United States House of Representatives from Ohio
Category:People from Athens, Ohio
Category:Ohio lawyers
Category:People of Ohio in the American Civil War
Category:Union Army colonels
Category:Union Army generals
Category:Speakers of the Ohio House of Representatives
Category:1872 United States presidential electors
Category:1880 United States presidential electors
Category:Ohio Republicans
Category:Republican Party members of the United States House of Representatives
Category:19th-century American politicians
Q:
Need help on deploying node js project?
I am very new to Node.js. I have developed a Node.js project; on my local machine I run it with the command 'node app.js' from the command prompt. Now I want to deploy this project to our QA environment. Can someone help me with how to bundle the project and deploy it to the QA environment?
I also need to set up environment-specific variables outside the project so that other projects can use them. Please help me with this as well.
A:
It's easy: make a copy, zip the copied directory, and transfer your application to your QA environment. In your QA environment, run:
npm install
Optionally, if you have any tests, you can run:
npm test
Also don't forget to set the environment:
NODE_ENV=development
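Once deployed, a small config module can read those environment-specific variables so the same code runs unchanged in dev, QA, and production. This is a minimal sketch; the variable names APP_PORT and DB_HOST are illustrative assumptions, not part of your project:

```javascript
// config.js - one central place to read environment-specific settings.
// Values come from the environment, with safe local-development defaults.
const config = {
  env: process.env.NODE_ENV || 'development',
  port: parseInt(process.env.APP_PORT || '3000', 10), // assumed variable name
  dbHost: process.env.DB_HOST || 'localhost',         // assumed variable name
};

module.exports = config;
```

Other modules then just `require('./config')` instead of reading `process.env` directly, which keeps every environment-dependent value in one file.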
Q:
Is there a name for the center of a line?
Is there a name for the center point for a line?
For example:
---------o---------
If the dashes represent a straight line and the O represents the center of that line, what would the name for that center point be?
A:
A line goes forever in both directions, so it has no center. If you have a line segment - a part of a line with two definite ends - then the name is "midpoint."
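In coordinates, the midpoint of a segment is just the average of its two endpoints:

```latex
M = \left( \frac{x_1 + x_2}{2},\; \frac{y_1 + y_2}{2} \right)
```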
Essential ECG clues in patients with congenital heart disease and arrhythmias.
The prevalence of adults with congenital heart disease has increased dramatically during the last decades due to significant advances in the surgical correction of these conditions. As a result, patients' survival has been prolonged, and arrhythmias have become one of the principal causes of morbidity and mortality for these patients. The surface 12-lead ECG may play a critical role in identifying the patient's underlying heart disease and recognizing the arrhythmia mechanism, and it may also help in planning the ablation procedure in this setting. Finally, important prognostic information can also be obtained from the ECG in these patients. The present review offers an overview of the principal uses of the surface ECG in the diagnosis and management of patients with CHD and arrhythmias.
Conventionally, storing a multi-bit value in a single memory cell transistor is a known technique for increasing the capacity of a NAND-type flash memory. For example, a flash memory employing a multi-level cell (MLC) scheme can store a two-bit value in a single memory cell transistor, and a flash memory employing a triple-level cell (TLC) scheme can store a three-bit value in a single memory cell transistor.
125 Ariz. 53 (1980)
607 P.2d 372
AMERICAN CONTINENTAL LIFE INSURANCE COMPANY, an Arizona Corporation, Appellant and Cross-Appellee,
v.
RANIER CONSTRUCTION CO., INC., Appellee and Cross-Appellant.
No. 13950.
Supreme Court of Arizona, In Banc.
February 6, 1980.
Rehearing Denied March 11, 1980.
*54 Fennemore, Craig, von Ammon & Udall by Silas H. Shultz, Dwayne L. Burton, Michael Green, Phoenix, for appellant and cross-appellee.
Thomas W. Murphy, Pago Pago, American Samoa, for appellee and cross-appellant.
GORDON, Justice:
This appeal involves a suit for breach of a construction contract between American Continental Life Insurance Co. (American) and Ranier Construction Co., Inc. (Ranier). After trial by jury, a verdict was returned for Ranier in the amount of $130,000 and for American in the amount of $10,000 on its counterclaim. The trial judge determined that neither party was entitled to recover attorney's fees from the other. Both parties appeal. Having jurisdiction pursuant to 17A A.R.S., Rules of Civil Appellate Procedure, Rule 19(e), we reverse the judgment in favor of Ranier and the court's disposition of attorney's fees.
American contracted with Ranier to construct a building for $517,286.30. To date, American has paid Ranier $457,247.47. The contract required American to make monthly progress payments for 90% of the work completed each month upon the issuance by the architect of a certificate for payment. American refused to make the final payment, consisting of the 10% retained each month and the amount due for work completed after the date of the last progress payment. It claimed that Ranier had breached the procedural requirements of the contract and had failed to construct the building in a workmanlike manner and in accordance with the plans and specifications. Ranier subsequently instituted suit for breach of the contract, to recover funds retained under the contract and damages for delays and lost profits. American counterclaimed for breach of contract and negligence, to recover damages for faulty construction and delays. The jury returned a single verdict for Ranier in the amount of $130,000 and a single verdict for American in the amount of $10,000. American appeals the verdict in favor of Ranier. Ranier cross appeals the trial court's refusal to award attorney's fees to Ranier as the prevailing party.[1]
At the close of Ranier's case and again at the close of all the evidence, American moved for a directed verdict, which was denied. One of the grounds urged by American was that Ranier had failed to meet a condition precedent to the right to final payment, because it had failed to procure from the architect a final certificate for payment as provided in the contract. *55 American renews this argument on appeal, contending that the court erred in denying its motion for a directed verdict.
Article 7 of the contract provides:
"Final payment constituting the entire unpaid balance of the Contract Sum shall be paid by the Owner to the Contractor within thirty (30) days after Substantial Completion of the Work * * * provided the work has then been completed * * * and a final Certificate for Payment has been issued by the Architect."
Issuance of the final certificate for payment is governed by paragraph 9.7.2 of the General Conditions of the contract:
"Upon receipt of written notice that the Work is ready for final inspection and acceptance and upon receipt of a final Application for Payment, the Architect will promptly make such inspection and, when he finds the Work acceptable under the Contract Documents and the Contract fully performed, he will promptly issue a final Certificate for Payment stating that to the best of his knowledge, information and belief, and on the basis of his observations and inspections, the Work has been completed in accordance with the terms and conditions of the Contract Documents and that the entire balance found to be due the Contractor, and noted in said final Certificate, is due and payable."
The architect issued a certificate of substantial completion,[2] but Ranier admits that a final certificate for payment, as provided for in the contract, was never applied for or obtained. American asserts, and it is undisputed by Ranier, that it was Ranier's responsibility to procure issuance of the certificate. Ranier argues, however, that strict compliance with the requirement of a final certificate for payment was waived, because, from the beginning, both parties deviated from the formal requirements of the contract in other respects. Ranier cites as examples the fact that change orders, although done at the owner's request, were not signed by the owner; that on occasion the owner even ordered changes to be made without execution of a formal change order; that extensions of time were granted both formally and informally; and that the owner, although in agreement, also failed to sign extensions of time. The trial court believed that there was sufficient evidence to support a submission to the jury as to whether strict compliance had been waived. We disagree.
Waiver is either the express, voluntary, intentional relinquishment of a known right or such conduct as warrants an inference of such an intentional relinquishment. See, e.g., City of Tucson v. Koerber, 82 Ariz. 347, 313 P.2d 411 (1957). Waiver by conduct must be established by evidence of acts inconsistent with an intent to assert the right. Occidental Life Insurance Co. v. Jacobson, 15 Ariz. 242, 137 P. 869 (1914); see Bolo Corp. v. Homes and Son Construction Co., Inc., 105 Ariz. 343, 464 P.2d 788 (1970). The waiver of one right under a contract does not necessarily waive other rights under the contract. See O'Malley v. Cummings, 86 Ill. App.2d 446, 229 N.E.2d 878 (1967). Thus, even if American did waive other rights under the contract relating to change orders or extensions of time, that conduct does not manifest an intent to waive any right relating to payment for *56 work. See Practical Construction Co. v. Granite City Housing Authority, 416 F.2d 540 (7th Cir.1969). Ranier does not indicate any evidence, nor does any evidence appear in the record, that the parties ever disregarded any of the terms of the contract relating to payments. Accordingly, we find no waiver.[3]
Ranier also argues, alternatively, that certain acts by American[4] prevented fulfillment of the condition precedent, thereby excusing performance. We fail to perceive how these acts prevented Ranier from seeking a final certificate of payment from the architect. Similarly, we reject Ranier's assertion that seeking a final certificate of payment would have been a futile act, because American had already demonstrated its unwillingness to co-operate by refusing to sign the certificate of substantial completion. The failure of American to sign the certificate of substantial completion has nothing to do with Ranier's obligation under the contract to procure the final certificate of payment in order to be in a posture to claim that payment from American is due. Moreover, even if Ranier is correct in assuming that American would not have made the final payment had Ranier sought and received the final certificate of payment from the architect, Ranier is not excused from the contractually-imposed duty of acquiring the certificate. Without it, we have no way of knowing if the architect was satisfied that the list of items to be completed between the time of issuance of the certificate of substantial compliance and the application for final payment[5] had been completed and that the contract was, thus, fully performed. At the trial, in fact, the architect testified that after he issued the certificate of substantial completion, the building was not complete, and items on the "punch" list remained unfinished.[6]
We agree with American that the final certificate for payment is not "procedural chaff." It is a major substantive right, which "serves a vital interest, in that it induces the contractor to render a performance that conforms in fact to plans and specifications, spurs him to stay with the job and, upon completion, furnishes the main incentive to make conforming corrections." Loyal Erectors, Inc. v. Hamilton & Son, Inc., 312 A.2d 748, 755 (Me. 1973). The trial court erred in failing to direct a verdict in American's favor because of Ranier's *57 non-compliance with the condition precedent of obtaining a final certificate of payment.
American challenges the jury's award to Ranier on several other grounds, which we need not consider, because we reverse on the basis of the court's denial of American's motion for a directed verdict. We next address the issue of attorney's fees raised by Ranier.
The contract between American and Ranier provides for attorney's fees in the following language:
"In the event of litigation between the parties hereto arising out of this Contract or the performance of the Work hereunder, the prevailing party shall be entitled to recover reasonable attorney's fees in addition to any other damages allowed by law." Supplementary General Conditions, Paragraph 15.3.
The trial judge would not allow either party to recover attorney's fees from the other, because he believed that both parties had breached their agreements, and he considered this finding implicit in the jury's verdicts. Because of our decision today reversing the judgment in favor of Ranier, American is clearly the prevailing party and is entitled to recover attorney's fees by the above express provision of the contract.
The judgment in favor of Ranier is reversed and remanded to the trial court with directions to enter judgment in American's favor upon Ranier's complaint and to award attorney's fees to American.
HOLOHAN, V.C.J., and HAYS and CAMERON, JJ., concurring. STRUCKMEYER, Chief Justice, dissenting.
I cannot agree with the disposition of this case by the majority of this Court. In order that the case be placed in perspective, certain facts should be emphasized.
Eight years ago, Ranier Construction Company entered into a contract by which it agreed to construct an office building for American Continental for $517,286.30. Approximately three years later, Ranier brought this suit for damages for partial nonpayment of the cost of constructing the building. American Continental asserted that Ranier had not constructed the building in conformity with standards of good workmanship and that no final certificate for payment had been issued by the architect as required by the contract. American Continental counterclaimed against Ranier for an amount necessary to put the building in compliance with the construction agreement. After a trial and a view of the building by the jury, a verdict of $130,000 was returned in favor of Ranier, and $10,000 in favor of American Continental on its counterclaim.
American Continental employed the architectural firm of Haver, Nunn & Nelson, Inc., by separate written agreement, to prepare the plans for the building and to supervise its construction. By the specific language of paragraph 2.2.2 of the construction contract, the architect was made the owner's agent.
American Continental early became dissatisfied with the progress of the work, and the delay in completion of the contract became one of its grounds for refusal to pay Ranier the balance due on the contract. By paragraph 8.3.1 of the construction contract, as amended, it is provided:
"If Contractor is delayed at any time in the progress of the Work by an act * * of the Owner or Architect or by any employee of either, or by any separate contractor employed by Owner, or by changes ordered in the Work * * * then the Contract time shall be extended by a Change Order for a reasonable period of time as determined by the Architect to cover such occurrences."
The evidence at the trial established that the architect had extended by change order every delay of which American Continental complained. (By amended subparagraph 2.2.10, either party was authorized to bring an action "concerning the matter decided by the architect.")
The construction contract also provided by paragraph 3.2.4, "The owner shall issue *58 all instructions to the Contractor through the Architect." American Continental early breached this clause of the contract by placing a personal representative on the job site who gave orders directly to sub-contractors as well as the contractor's personnel working on the job.
By the construction agreement, when the work on the building was substantially completed, Ranier was authorized to request the architect to issue a Certificate of Substantial Completion. A Certificate of Substantial Completion is issued by the architect when the building is in such a condition that the owner can occupy it. At that time, the responsibility for insurance and maintenance of the property becomes the owner's. Accompanying the Certificate of Substantial Completion is what is called the "final punch list." This is a list of matters to be corrected or completed before the architect will issue a Certificate of Final Payment. Copies of the Certificate of Substantial Completion and "final punch list" were issued by the architect and delivered to Walter Bush, President of American Continental, for his signature on October 10, 1973. However, Bush refused to accept the architect's "final punch list", submitting a list of things with which he required Ranier to comply before the building would be acceptable to him. Lawrence Blesh, President of Ranier Construction, Inc., later testified at the trial without contradiction:
"Q Did Ranier comply with the punch list that the architect prepared?
A Yes, we did.
Q Did Mr. Bush himself prepare his own punch list?
A Yes, he did.
Q Did you attempt to comply with that?
A We attempted to."
Eventually an impasse developed with the architect supporting Ranier against Bush's demands for further corrections as being unreasonable.
Blesh testified:
"Q Was payment demanded by you from Mr. Bush?
A Yes.
Q Mr. Bush refused to make payment?
A Yes, he did."
Blesh, in bringing suit, obviously treated Bush's conduct, including his refusal to make payment, as an anticipatory breach.
"We have recognized that an action may be maintained for breach of contract based upon the anticipatory repudiation by one of the parties to the contract. Sarle v. School Dist. No. Twenty-Seven of Pima County, 32 Ariz. 96, 255 P. 994. It is well established that in order to constitute an anticipatory breach of contract there must be a positive and unequivocal manifestation on the part of the party allegedly repudiating that he will not render the promised performance when the time fixed for it in the contract arrives. Mobley v. New York Life Ins. Co., 295 U.S. 632, 55 S.Ct. 876, 79 L.Ed. 1621, 99 A.L.R. 1166; Salot v. Wershow, 157 Cal. App.2d 352, 320 P.2d 926; Atkinson v. District Bond Co., 5 Cal. App.2d 738, 43 P.2d 867; Preston v. Love, 240 S.W.2d 486 (Tex.Civ.App. 1951); 4 Corbin on Contracts § 973 (1951); 5 Williston on Contracts § 1324 (rev. ed. 1937); Restatement of the Law of Contracts § 318." Diamos v. Hirsch, 91 Ariz. 304, 372 P.2d 76 (1962).
One further fact should be stressed. In December 1973, prior to the bringing of this action, American Continental took possession of the building and has been occupying and using it for the purpose for which it was built ever since.
It is clear from the most cursory perusal of the facts in this case that when ultimately an impasse developed, Bush treated the construction contract as at an end. It is of no significance or materiality to this lawsuit as to who was at fault in causing the final repudiation[1] because even if we assume *59 that it was Ranier, the law does not foreclose the right to recover for the proper work done under the contract and for materials which were furnished to the building. American Continental acquired a building which the architect said was substantially completed in conformity with the plans and specifications.
"The common-law rule required literal performance of building contracts, but the American courts generally hold that substantial performance of such contracts will support a recovery either on the contract or on a quantum meruit basis. Three reasons are given for that holding. One is that materials and labor upon a building are such that even if rejected by the owner of the land he receives benefit thereof. Since the owner receives the benefits of the builder's labor and materials, it is equitable to require the owner to pay for what he gets. The second reason is that it is next to impossible for a builder to comply literally with all the minute specifications in a building contract. And the third is that the parties are presumed to have impliedly agreed to do what is reasonable under all the circumstances with reference to the subject of performance. 6 R.C.L., page 667, Sec. 343; 9 Am.Jur., page 30, Sec. 40; 17 C.J.S. Contracts § 508, page 1085; Hickory Investment Co. v. Wright Lumber Co., 152 Miss. 825, 119 So. 308." Standard Millwork & Supply Co. v. Mississippi Steel & Iron Co., 205 Miss. 96, 38 So.2d 448 (1949).
The California Court of Appeals, in Martin v. Karsh, 142 Cal. App.2d 468, 298 P.2d 635, 636-637 (1956), put it this way:
"The law is settled in this state, that in the case of building contracts, especially where the owner has taken possession of the building and is enjoying the fruits of the contractor's work, no literal compliance with the contract in all details and no absence of all defects and imperfections is required to entitle the contractor to recovery on the contract, but that he can have such recovery after substantial performance in good faith, if the deviations and imperfections do not substantially affect the usefulness of the building for the purposes for which it was intended, subject to an allowance for damages if the owner has suffered any by reason of the failure to perform strictly."
The Rhode Island court, in Ferris v. Mann, 99 R.I. 630, 210 A.2d 121 (1965), said:
"The common-law rule that there must be complete or absolute performance with the terms of a building contract for plaintiff to recover the agreed consideration has long since been relaxed in this country. It has been repeatedly held in a number of jurisdictions that where the builder has not wilfully deviated from the specifications of a contract and all that remained to be done in order to constitute performance within the meaning of the contract is of a trivial or minor nature, the builder is entitled to recover the contract price with adjustments made by the court to compensate defendant for the unfinished or unsatisfactory work. See Connell v. Higgins, 170 Cal. 541, 150 P. 769, Sgarlat v. Griffith, 349 Pa. 42, 36 A.2d 330, and Pelletier v. Masse, 49 R.I. 408, 143 A. 609." 210 A.2d at 124.
The Missouri Court of Appeals, in Talbot-Quevereaux Const. Co. v. Tandy, 260 S.W.2d 314, 316 (1953), said:
"In the case of substantial but defective performance, where the contractor sues for the contract price and the owner defends by way of recoupment, the owner, upon proof of defective performance, is entitled to have the contractor's recovery reduced by the amount that would reasonably be required to remedy the defects and make the structure conform to the plans and specifications. Spink v. Mueller, 77 Mo. App. 85; Walter v. Huggins, 164 Mo. App. 69, 148 S.W. 148."
And see Jim Arnott, Inc. v. L & E, Inc., 539 P.2d 1333, 1336 (Colo. App. 1975), wherein the court said:
*60 "If it appears that a contractor has substantially performed his contract, then he is entitled to the contract price, less an offset for the cost to remedy the deficiencies. The purpose of the doctrine of substantial performance is to avoid injustice where a building contractor has performed all major aspects of the construction and the owner seeks to avoid payment for inconsequential defects in the work."
The foregoing statements beyond question reflect the status of the law in Arizona. In Cracchiolo v. Carlucci, 62 Ariz. 284, 157 P.2d 352 (1945), a building contract was entered into for the construction of a tourist court or motel near Tucson. The owner took possession of the motel and opened it for business. He refused to pay some $14,000 on the contract and for extras, claiming non-performance of the contract in accordance with the plans and specifications. It was established that the contractor failed to regrade and repave the motel driveway. This Court said:
"Where a contract has been partly performed by one party and the other has derived a substantial benefit therefrom, the latter cannot refuse to comply with its terms simply because the former fails to complete performance. Where there has been part performance and there is a breach of a promise which goes only to a part of the consideration and the breach may be compensated for in damages, the breach does not relieve the other party from his obligation to perform his promise. * * * We believe [these principles] to be applicable to the facts here." 62 Ariz. at 292, 157 P.2d at 355.
When Ranier finally left the job, the construction contract was ended, irrespective of who was responsible for causing its ultimate repudiation. The enforcement by this Court of a condition in the contract which had obviously been renounced, forecloses any recovery by the contractor for his labor and materials put into the building, even though the architect had issued a Certificate of Substantial Completion, and American Continental had occupied the building, thereby unjustly enriching the owner.[2]
The jury, after a twelve-day trial devoted to evidence concerning Ranier's work and after a view of the building, brought in a verdict for Ranier for $130,000 and offset that verdict with a verdict for American Continental on its counterclaim for $10,000. The ultimate absurdity resulting from the Court's opinion is that the owner of the building gets $10,000 for either deficiencies or delays in construction from the contractor, but the contractor is never paid for his work and materials. The order of this Court reversing the judgment of the trial court with directions to enter judgment in American Continental's favor upon Ranier's complaint and, in addition, awarding attorney's fees to American Continental is a gross miscarriage of justice.
I dissent.
NOTES
[1] We note that Ranier is not appealing the jury's $10,000 verdict in favor of American, and we, therefore, do not consider that facet of the judgment.
[2] Issuance of the certificate of substantial completion is provided for in Paragraph 9.7.1 of the General Conditions of the contract:
"9.7.1. When the Contractor determines that the Work or a designated portion thereof acceptable to the Owner is substantially complete, the Contractor shall prepare for submission to the Architect a list of items to be completed or corrected. The failure to include any items on such list does not alter the responsibility of the Contractor to complete all Work in accordance with the Contract Documents. When the Architect on the basis of an inspection determines that the Work is substantially complete, he will then prepare a Certificate of Substantial Completion which shall establish the Date of Substantial Completion, shall state the responsibilities of the Owner and the Contractor for maintenance, heat, utilities, and insurance, and shall fix the time within which the Contractor shall complete the items listed therein. The Certificate of Substantial Completion shall be submitted to the Owner and the Contractor for their written acceptance of the responsibilities assigned to them in such Certificate."
[3] Because we find no waiver, we need not address American's contentions that waiver must be specially pleaded and that Ranier failed to plead it.
[4] The president of American at one point, in a fit of anger, broke a panel of drywall with an ax to protest what he considered shoddy workmanship. He also hired a special supervisor who Ranier claims improperly began giving orders directly to the workmen.
[5] The list of items to be completed is provided for in Paragraph 9.7.1 of the General Conditions of the contract. See footnote 2 infra.
[6] The dissenting opinion discusses another argument to support Ranier's contention that failure to fulfill the condition precedent should not preclude it from bringing suit for the contract price. This is the theory that the contract had been repudiated, thereby obviating the necessity to comply with the condition precedent. We are convinced that the facts of the case do not support repudiation. American's refusal to make the final payment may not be equated with a denial on its part of its obligation to pay under any circumstances, no matter what Ranier did, thereby bringing the contract to an end. While refusing to pay, American continued to demand that Ranier complete the punch list, a clear indication that American considered the contract extant. As stated in its answer, American refused to make final payment because it believed that Ranier had not completed construction according to the contract plans and specifications. Thus, American thought that its duty to make final payment had not yet arisen according to the terms of the contract, not that its contractual obligation was forever terminated.
Additionally, the dissent relies on the doctrine of substantial performance, apparently satisfied that Ranier's deviations were so trivial as to give rise to the doctrine, despite contradictory evidence. By express provision of the contract, the parties set up a system of progress payments whereby the agreed upon value of full compliance by Ranier with the plans and specifications called for within the contract, and as vouched for by the architect, was the final payment by American. To allow the doctrine of substantial performance to operate here would fly in the face of the original intent of the parties and would nullify the contract.
[1] The court instructed the jury:
"First, in regard to Ranier's complaint if you find that American Continental did not breach the contract, your verdict must be for American Continental. If, however, you find that American Continental breached the contract, your verdict must be for Ranier, * * *."
Manifestly, the jury must have found that American Continental breached the contract.
[2] The court instructed the jury:
"As one of its defenses to Ranier's claims that American Continental has breached the contract by failing to pay the entire amount agreed upon for construction, American Continental has alleged that Ranier has failed or omitted to establish that it has performed certain conditions precedent. * * * a condition precedent is a fact which must exist or occur before a duty to perform a contractual obligation arises.
In this case, if the contract includes conditions precedent those conditions must have been satisfied before Ranier is entitled to release of the retention and final payment, unless you find that American Continental has by its conduct waived its right to rely on performance of those conditions precedent. To constitute waiver there must be voluntary and intentional relinquishment of a known right. It may be expressly stated or inferred by conduct."
The jury could have concluded that when American Continental advised Ranier that it would not pay the balance of the contract, such conduct terminated the contract, thus waiving the condition that payment was dependent on the architect's issuance of a Certificate of Final Payment. But irrespective, if American Continental's conduct is not within the outer limits of waiver, it clearly falls within the doctrine of estoppel in pais, a doctrine courts invoke to promote the ends of justice and to prevent injury and inequitable consequences.
| {
"pile_set_name": "FreeLaw"
} |
Further characterization of the interaction of histidine-rich glycoprotein with heparin: evidence for the binding of two molecules of histidine-rich glycoprotein by high molecular weight heparin and for the involvement of histidine residues in heparin binding.
Rabbit histidine-rich glycoprotein (HRG, 94 kDa) binds heparin with high affinity (apparent Kd 60-110 nM). Eosin Y (1 equiv) bound to HRG was used as a reporter group to monitor associations of HRG with heparins of molecular mass 10, 17.5, and 30 kDa. The stoichiometries of the heparin-HRG complexes were determined by fluorescence and absorbance measurements as well as by analytical ultracentrifugation. Two types of complex form: complexes of 1 heparin:1 HRG and of 1 heparin:2 HRG. The 1:2 complex formation requires a minimum heparin chain length since 17.5-kDa but not 10-kDa heparin binds two HRG molecules. The formation of the 1:2 complexes of the larger heparin fractions is enhanced by divalent copper or zinc (1-10 equiv) bound to HRG. However, metal is not required for complex formation since all sizes of heparin examined interact tightly with HRG in the presence of ethylenediaminetetraacetic acid. Between 0.1 and 0.3 M ionic strength, both 1:1 and 1:2 complexes of heparin with HRG are progressively destabilized. No heparin-HRG complex is found at ionic strengths of 0.5 M. Between pH 8.5 and pH 6.5 both 1:2 and 1:1 complexes are found with 17.5-kDa heparin, but at pH 5.5 only 1:1 complexes are formed. The heparin-HRG interaction is progressively decreased by modification of the histidine residues of HRG, whereas modification of 22 of the 33 lysine residues of HRG has little effect.(ABSTRACT TRUNCATED AT 250 WORDS) | {
"pile_set_name": "PubMed Abstracts"
} |
Trade unions in Palau
Although the Constitution of Palau recognizes the right to free association, there is no specific mention of trade union rights, including the right to collective bargaining.
Palau is not a member of the International Labour Organization, and there are no functioning trade unions in the country.
ICTUR reports that there is no right to strike, and no strikes have been reported in recent years.
References
Category:Economy of Palau
Category:Organizations based in Palau | {
"pile_set_name": "Wikipedia (en)"
} |
By Gaius Publius, a professional writer living on the West Coast of the United States and frequent contributor to DownWithTyranny, digby, Truthout, and Naked Capitalism. Follow him on Twitter @Gaius_Publius, Tumblr and Facebook. GP article archive here. Originally published at DownWithTyranny
Image credit: Mike Thompson / Detroit Free Press
Like “swing vote” Justice Sandra Day O’Connor before him, “swing vote” justice Anthony Kennedy has been one of the worst Supreme Court jurists of the modern era.
With swing-vote status comes great responsibility, and in the most consequential — and wrongly decided — cases of this generation, O’Connor and Kennedy were the Court’s key enablers. They
Cast the deciding vote that made each decision possible
Kept alive the illusion of the Court’s non-partisan legitimacy
Each of these points is critical in evaluating the modern Supreme Court. For two generations, it has made decisions that changed the constitution for the worse. (Small “c” on constitution to indicate the original written document, plus its amendments, plus the sum of all unwritten agreements and court decisions that determine how those documents are to be interpreted).
These horrible decisions are easy to list. They expanded the earlier decision on corporate personhood by enshrining money as political speech in a group of decisions that led to the infamous Citizens United case (whose majority opinion, by the way, was written by the so-called “moderate” Anthony Kennedy); repeatedly undermined the rights of citizens and workers relative to the corporations that rule and employ them; set back voting rights equality for at least a generation; and many more. After this next appointment, many fear Roe v. Wade may be reversed.
Yet the Court has managed to keep (one is tempted to say curate) its reputation as a “divided body” and not a “captured body” thanks to its so-called swing vote justices and the press’s consistent and complicit portrayal of the Court as merely “divided.”
Delaying the Constitutional Crisis
The second point above, about the illusion of the Court's legitimacy, is just as important as the first. If the Court were ever widely seen as acting outside the bounds of its mandate, or worse, seen as a partisan, captured organ of a powerful and dangerous political minority (which it certainly is), all of its decisions would be rejected by the people at large, and more importantly, the nation would be plunged into a constitutional crisis of monumental proportions.
We are in that constitutional crisis now, but just at the start of it. We should have been done with it long ago. Both O’Connor and Kennedy are responsible for that delay.
O’Connor’s greatest sin, of course, was as the swing vote in Bush v. Gore, the judicial coup that handed the 2000 election to George W. Bush and Dick Cheney. It was also widely reported that on election night at a dinner party “Sandra Day O’Connor became upset when the media initially announced that Gore had won Florida, her husband explaining that they would have to wait another four years before retiring to Arizona.” (More on that here.)
Consider: If the Supreme Court were part of a coup that makes a losing presidential candidate the winner, and makes that ruling along partisan and preferential lines that can't be judicially defended, how could any decision issued by that court be deemed legitimate afterward?
Yet here we are, still publicly asserting the Court’s legitimacy, whatever people think privately, and still watching in horror as decision after decision dismantles old constitutional agreements and erects new ones.
The Legacy of Anthony Kennedy
Kennedy will be praised for his so-called “moderate” or “case-by-case” ideology, bolstered largely by decisions protecting gay rights. Perhaps that will be his legacy.
But in the main he has been horrible, a record of ideological and indefensible votes capped by his landmark decision in the Citizens United case. Enough has been written about that to make repetition here unnecessary. As noted, Kennedy not only provided the crucial "swing vote," he also wrote the majority opinion, which in essence reduced the broad and complex sweep of both public corruption and the appearance of corruption to only provable, documented, evidence-based quid pro quo exchanges. This is beyond naïve, and itself touches the broader meaning of corrupt.
If justice exists in the world, his legacy will be this: First, in giving to Donald Trump the ability to hand a person of relative youth the fifth and deciding Republican vote on the Court, Kennedy has changed the Court for a generation. After his successor is confirmed, no good thing will come from the Court for the next 20 years, and much, perhaps fatal, damage will be done.
Second, thanks to Kennedy’s handing his seat to Trump, the next new justice will be unable to claim the propagandistic “swing vote” mantle held by O’Connor and Kennedy, which fact should destroy the Court’s perceived, illusory legitimacy forever. The full consequences of loss of legitimacy will be considered elsewhere, but suffice it to say that when a nation’s highest court is not just captured, but widely seen to be captured, a constitutional crisis is at hand.
This is the legacy of Justice Anthony Kennedy, and though he may bask for the next few months in the glory of his pronounced moderation, the awful truth, to his enduring shame, should follow him to the grave — and be printed on it.
The Crisis to Come
Let’s close by quoting Anthony Kennedy in the Citizens United case, the most bizarre defense of a decision in the modern era. (There have been many bizarre decisions — the “money is speech” decision in Buckley v. Valeo is among the worst in the last 50 years, but none has been as bizarrely defended by the Court as Citizens United.)
Remember that in Citizens United the Court, building on the decision in Buckley, ruled that the First Amendment prohibits Congress from passing any law limiting so-called “independent expenditures” by corporations and unions to political campaigns. (Of course, those independent expenditures are almost never independent at all, but that’s another problem.)
To the objection that unlimited campaign contributions would foster widespread public corruption, Kennedy countered with this absurdity (quote taken from Jonathan Cohn here). In his majority opinion, Kennedy wrote:
[W]e now conclude that independent expenditures, including those made by corporations, do not give rise to corruption or the appearance of corruption. … The fact that speakers [i.e., donors] may have influence over or access to elected officials does not mean that these officials are corrupt. … The appearance of influence or access, furthermore, will not cause the electorate to lose faith in our democracy.
Each assertion above strains belief that the writer is sane. Consider those assertions in simpler language:
Gifts of money do not corrupt.
Gifts of money don't look corrupt.
Influence over politicians does not corrupt.
Voters will have no problem with nakedly bought elections.
The first three are either plain nonsense, in which case Kennedy is unqualified to sit on the bench at all, or nonsense in service of ideology, in which case Kennedy is a political actor on an already captured Court.
The obvious explanation is the latter.
But the worse of his assertions may be the fourth, which is also patently wrong. That assertion, which says in effect “and people will let us get away with all these changes,” has set the final table for the constitutional crisis to come — the one that questions the legitimacy of the Court itself and with it, perhaps, our entire political process.
That crisis, if it comes, will threaten the national fabric as fundamentally as any of the earlier three — the crisis of 1776, the crisis of 1860, and the Great Depression. We’re now much closer to that point than anyone with a microphone or media column inches will say. But you did hear it here. Stay tuned. | {
"pile_set_name": "OpenWebText2"
} |
Hindu Mahajana Sangam
The Hindu Mahajana Sangam () is an association of Indian workers. It is an Indian non-profit organization in Penang, Malaysia, officially established in 1935 by waterfront workers. Before 1935, it was known as Kootakkadai ().
The name Mahajana comes from two words: maha, meaning great, and jana, meaning people. Mahajana can thus be described as "great people" in Tamil, though in Sanskrit it is translated as "great vehicle". The Indian workers who arrived here had been farmers back home. They preferred to call themselves Kootakadai, because they worked as a kootam (), a gang or group, rather than calling themselves coolies or labourers.
The Hindu Mahajana Sangam was established in 1935, following the first consecration of the Sri Muthu Mariamman Temple in Queen Street, held in 1933. It was established for religious observances, to promote education and social and cultural development, to participate in the administration and development of the Sri Muthu Mariamman and Arulmigu Balathandayuthapani temples, and to act as a union for the workers.
The sangam's madam (hall) at 674, Jalan Kebun Bunga (formerly known as Waterfall Road), near the foothill of the Waterfall Arulmigu Balathandayuthapani hilltop temple, was named the Gandhiji Ashram in remembrance of Mahatma Gandhi on 22 February 1948.
History
Indian farmers came to Penang from South India in the early 18th century for economic reasons to work as labourers on the Penang waterfront. They worked for the Indian merchants and contributed to the development of the import and export spice trade on the island.
Relatives of the Tamil freedom fighters, the Maruthu Pandiyars, along with 72 soldiers, were also deported to Penang in 1802 by the Madras Presidency government (the British India government).
Koottakadai
More than 2,000 workers and their dependents lived in the prewar houses in the town centre, then known as simpang lelong, or elam muchanthi in Tamil; today this area is known as Little India. They worked in groups or gangs (kootam in Tamil), each consisting of 30 to 50 workers with a head (thandal in Tamil) and a clerk (kanakku pillai in Tamil) in charge of the daily working income account, which would be divided and distributed to the workers at the end of the month. They worked for the Indian merchant houses, handling their goods in the import/export trade. These workers were referred to as Kootakadai, as they worked as a group (kootam). More than 50 groups operated on the Weld Quay waterfront, handling cargo in the tongkangs, or wooden barges, in the import and export trade. In the many open concrete fields in the town, some of the workers were involved in storing, drying and re-bagging betel nuts imported from Indonesia for re-export. For the workers' relaxation in the evening, the authorities operated two government toddy shops in the town centre: one at the Market Lane/Muda Lane junction, which still exists today, and the other in King Street, which was bombed by the Japanese during the Second World War; this site has since been rebuilt as a warehouse for the customs department. During the war many workers joined the Indian National Army. The workers also had a unique form of gathering to raise emergency funds, called Needs (thaevai in Tamil), in which each worker's contribution was recorded against his name and the same amount was returned to him at another such function.
Sri Muthu Mariamman Temple
In 1833 the workers founded the Queen Street Sri Muthu Mariamman Temple (renamed the Arulmigu Mahamariamman Temple in 1980 by the Hindu Endowment Board). The waterfront workers contributed to the development, construction and management of the Queen Street and Waterfall temples. The inner-city Hindu population consisted mostly of waterfront workers, who established the Hindu Mahajana Sangam in 1935. The sangam was established with the main purpose of managing the Queen Street and Waterfall hilltop temples. The Queen Street temple was then managed by a few trustees appointed by the Board, one of whom, Mr K.V. Karuppiah Thandal, became the inaugural secretary of the sangam.
The sangam was also involved in promoting religious observance and developing Tamil education and Indian culture. It arbitrated disputes within the community, serving as an avenue to discuss and address social problems affecting Indian workers, and set up a panchayat committee to solve these problems. The sangam started Tamil schools to educate Tamil children and was active in promoting cultural activities and religious practices.
Union for the workers
The Hindu Mahajana Sangam was a channel for bringing the grievances of the Indian community to the proper authorities and acted as a union for the workers until 1950, when the Waterfront Workers Union was formed. This later became known as the Weld Quay Workers Union. In the early 1970s, when the government established the Penang Labour Board, the Weld Quay Workers Union was amalgamated with the Prai Cargo Handlers and the Penang Stevedores Union to form the Penang Port Workers Union, with its office in King Street.
The sangam's honorary secretary, the late K. V. Karuppiah Thandal, head of one of the numerous groups of workers, was a trustee of the Queen Street Sri Muthu Mariamman temple. Sangam members were regularly appointed to the Hindu Endowment Board, held various positions on the temples' management committees, and contributed to the temples' development.
The sangam's president, the late Mr. M. Doraisamy Thevar JP, PJK, was the last Hindu Endowment Board member representing the Hindu Mahajana Sangam and was also the chairman of the temples' management committee, which carried out the second consecration ceremony of the Queen Street Sri Muthu Mariamman Temple.
Between the 1960s and 1970s many of the workers living in the inner city became jobless, owing to the Indonesian confrontation and because the Indian merchants began to reduce their use of tongkangs/barges for transporting cargo to vessels, switching to the newly built PPC Butterworth deepwater wharves on the mainland to cut costs. With few job opportunities and growing families, many workers gradually left the town centre for various parts of the island, the slums in the Weld Quay reclamation area, the Noordin Street flats, the mainland and the far south; some even left Penang permanently to settle in India.
Building
The Hindu Mahajana Sangam operated out of 47 Church Street from 1935 until 1947, when it moved to 40 Church Street. The sangam moved out of 40 Church Street in 1988 after the property was sold. It was then that it moved to its own building, the Gandhiji Ashram, at 674 Jalan Kebun Bunga.
Today it is based at the Gandhiji Ashram, a community hall at the foot of the Balathandayuthapani Temple near the Penang Botanic Gardens. The hall was built in the late 1920s by the early Indian settlers, the waterfront workers, and was originally known as Madaalayam or Kootakadai Madam. The sangam intends to preserve the Dewan Mahatma Gandhi as a heritage building, believing it to be the only building based on South Indian architecture left in the whole of Malaysia; in its view, a building of such historic value needs to be preserved for future generations to appreciate their heritage.
Contributions
To strengthen religious beliefs in the community, the sangam organizes various festivals and celebrations every year at the Queen Street Sri Maha Mariamman Temple, the Waterfall Sri Bala Thendayuthapani hilltop temple, the Penang Road Sri Kunj Bihari Krishna Temple and the Dewan Mahatma Gandhi.
Dewan Mahatma Gandhi
The members of the sangam built the hall (madam in Tamil) in the Waterfall Balathandayuthapani Temple compound to cater for the devotees attending the annual Thaipusam (January/February) and Chitraparuvam (April/May) festivals. It is a rare heritage building representing South Indian architecture. Not only is the Dewan Mahatma Gandhi used as a religious retreat, it also provides shelter to estate workers who come to Penang during the festivals. Owing to the lack of transport and accommodation in earlier years, the sangam provided free temporary accommodation for devotees from outstation attending the festivals, along with a free vegetarian lunch for the public — a tradition the members still carry on today. On 22 February 1948, during a condolence gathering for the late Mahatma Gandhi, the members unanimously named the sangam's hall the Dewan Mahatma Gandhi (Gandhiji Ashram).
On 5 October 2008, a ceremony was held at the Gandhiji Ashram to unveil a life-size statue of Mahatma Gandhi. The ceremony also celebrated the United Nations' International Day of Non-Violence and commemorated the birthday of Mahatma Gandhi.
Since August 2013, an exhibit of rare photographs of Mahatma Gandhi has been open to the public from 10:00 am to 1:00 pm every Friday, Saturday and Sunday.
Pillaiyar Temple
In 1951 the Sangam built the Pillaiyar Temple in the Waterfall Temple compound and held its first consecration ceremony. In the late 1970s the Hindu Endowment Board rebuilt this temple and renamed it Arulmigu Sri Ganesar Temple.
Chitraparuvam
Notable among the Hindu Mahajana Sangam's festivals is the annual Chitraparuvam Festival, celebrated every year in the Tamil month of Chithirai (April/May) and organised with a chariot procession of the Panchaloha () deity of Lord Subramaniyaswami from the Queen Street Sri Mahamariamman Temple. It falls on the day of the first full moon of the first Tamil month. In the early years, the festival started with a special pooja and ubayam for the Hindu Mahajana Sangam (Koota Kadai) in the Queen Street Sri Mahamariamman temple; the chariot procession commenced at 7.00 am and reached the Waterfall Dewan Mahatma Gandhi (Gandhiji Ashram) in the afternoon, where the deity was carried in and placed in the ashram until the chariot's return journey to the Queen Street temple in the evening of the same day.
Since the early 1970s this festival has been celebrated over three days. The deity of Lord Subramaniyaswami is brought in procession from the Queen Street Sri Mahamariamman temple, passing through many streets and roads before reaching the Waterfall Arulmigu Sri Ganesha temple, and is then carried up to the hilltop Arulmigu Sri Balathandayuthapani Temple. On the second day, the Chitraparuvam Festival proper, the deity is taken in procession around the hilltop temple compound in the evening. On the evening of the third day, the deity is carried down and placed on the chariot for the procession back to the Queen Street Sri Mahamariamman temple. In 1992, the Hindu Mahajana Sangam imported a new chariot from India for the annual Chitraparuvam festival, replacing the old chariot, which was found to be unroadworthy and in a decaying condition.
On the first day, the chariot passes through Queen Street, Chulia Street, Chulia Street Ghaut, Victoria Street, Prangin Road Ghaut, C. Y. Choy Road, Magazine Road, Dato Keramat Road, Western Road and Waterfall Road before reaching the Ganesar Temple. The chariot stops at Kamatchi Amman Temple, Sivan Temple and Muneeswarar Temple along the way. Then the Lord Subramaniyaswami is carried up to the Sri Balathandayuthapani Temple at the hilltop.
On the return journey, the Lord Subramaniyaswami is carried down and the chariot passes through Waterfall Road, Gottlieb Road, Tunku Abdul Rahman Road, Macalister Road, Anson Road, Burma Road, Transfer Road, Sri Bahari Road, Penang Road, Kimberley Street, Carnarvon Street, Chulia Street, Pitt Street, Church Street, Queen Street, China Street, King Street, Light Street, Penang Street, Chulia Street, King Street, China Street, Beach Street, Market Street and Queen Street before reaching the Sri Mahamariamman Temple. The chariot stops at Balathandayuthapani Temple, Meenatchi Sundaraeswarar Temple, ISKCON Centre, Muneeswarar Temple and Kunj Bihari Temple along the way.
Navarathiri
The Navarathiri festival begins with the lion flag raising ceremony and ends with a procession in which the Panchaloha () deity of Mahamariamman is paraded in a decorated wooden chariot through the streets of Little India. Navarathiri is a nine nights' vegetarian festival. According to the Hindu Puranas, the festival commemorates the victory of Goddess Adi Parashakti over the demon king Mahishasuran. The evil king so ill-treated the people that they turned to the goddess, the consort of Lord Shiva, to save them. Goddess Adi Parashakti fought a battle for nine days and ultimately destroyed him on the tenth day, which is known as Vijayadashami.
Various Indian organisations and communities sponsor the prayers for each of the nights. On the final tenth day of the celebration, Vijayadashami is celebrated by the Hindu Mahajana Sangam. For many years the sangam organised the chariot procession from the Queen Street Sri Mahamariamman Temple to Dhoby Ghaut; the festival concluded after the shooting of arrows from the chariot in the evening at Dhoby Ghaut, and the chariot returned to the temple at about midnight the same day. Since the late 1970s, the organisation of the chariot procession has been taken over from the sangam by the temple committee and the Hindu Endowments Board, and the procession is now confined to the Fort Cornwallis area, near the Kedah Pier Muneeswarar Temple on the Esplanade. However, the sangam still continues to celebrate the annual Navarathiri festival's final-day Vijayadashami ubayam every year without fail.
Vaikunda Ekadhasi
The sangam has celebrated the annual Vaikunda Ekadhasi festival in the Sri Kunj Bihari Temple, Penang Road, since the early 1900s. The festival falls in December/January every year and is celebrated with a special ubayam and pooja at night, followed by a procession in which the Panchaloha () deities of Krishna, Rukmini and Satyabhama are carried around the temple premises. Religious hymns (bhajan) are sung until the following morning in the main hall of the temple. The celebration concludes with a special pooja in the early morning of the following day, after which the devotees break their fast and sangam members serve a vegetarian meal.
Thaipusam
The annual Thaipusam festival is celebrated in the Gandhiji Hall (Dewan Mahatma Gandhi) at the Waterfall from the eve of the festival day. Free lunch is provided on the day before, and in the evening light food and drinks are served to the public. On Thaipusam day, free breakfast, lunch and dinner are served to the public.
Since 1927, Hindu Mahajana Sangam members have carried a traditional kavadi weighing approximately 80 kilograms, known as the Atta Kavadi, from the Queen Street Sri Mahamariamman temple in the evening to the Dewan Mahatma Gandhi in the compound of the Arulmigu Balathandayuthapani Temple, Penang, with traditional Nadhaswaram music accompanying it. A late-night dinner is served in the Dewan Mahatma Gandhi after the Atta Kavadi arrives. In the early years, the arrival of the Atta Kavadi signified the conclusion of the day's Thaipusam festival celebration, and devotees were not expected to carry any kavadi after it reached the Waterfall temple compound.
In the last few years the sangam has been organizing exhibitions on Hindu religious practices, in which various organizations participated, along with anti-dadah (anti-drug) and health exhibitions. These exhibitions were officiated by YB Dato' Dr. K. Rajapathy, the Hindu Endowment Board chairman, during the festivals.
Non-violence Day
The Hindu Mahajana Sangam and the Taiping Peace Initiative jointly organize the United Nations International Day of Non-Violence, commemorating the birthday of Mahatma Gandhi.
Activities and future plans
In the past 15 years, the sangam has enhanced its activities with the cooperation of NGOs in Penang by providing free medical camps, yoga for health, weekly evening lectures on Hinduism, weekly education classes for children and charity for the underprivileged.
Since early 1999, the sangam has revamped its activities in line with its objectives and refurbished the hall to incorporate various facilities. It now has a library, a computer lab, a meeting room and toilet facilities. With the help of volunteers, the sangam organizes religious and Tamil classes for children every Sunday afternoon, and tuition and computer classes for primary school children every Sunday morning.
A religious class for adults on the Art of Living is conducted every Monday evening by Swamiji H. H. Sri Brahmanada Saraswathi from the Kulim Diyana Mandram; the sangam provides light vegetarian food after the discourse. Every Tuesday and Wednesday evening, in cooperation with the Ananda Marga Society, the sangam organizes Yoga for Health for all races from 7.40 pm to 9.30 pm. On Thursday evenings the sangam conducts basic Tamil language classes for adults from 7.45 pm to 9.00 pm.
The sangam's future plan is to focus on developing traditional art, music and other cultural activities to preserve the community's diverse identity, in the hope of creating a vibrant Bangsa Malaysia.
Since 2008, the sangam has come to the aid of the less fortunate and underprivileged. In conjunction with Deepavali, poor families received hampers consisting of grocery essentials such as rice, oil, milk, noodles and biscuits as well as cash packets from the association at its premises.
Getting there
The nearest bus stop is along Jalan Kebun Bunga, which is served by Rapid Penang bus 10.
See also
Sri Mahamariamman Temple, Penang
Arulmigu Balathandayuthapani Temple, Penang
History of Penang
Tamil Malaysians
References
Category:Labour in India
Category:Indian culture
Category:Indian religions | {
"pile_set_name": "Wikipedia (en)"
} |
/*
* Copyright (c) 2006-2011 Christian Plattner. All rights reserved.
* Please refer to the LICENSE.txt for licensing details.
*/
package ch.ethz.ssh2.packets;
import ch.ethz.ssh2.DHGexParameters;
/**
* PacketKexDhGexRequestOld.
*
* @author Christian Plattner
* @version 2.50, 03/15/10
*/
public class PacketKexDhGexRequestOld
{
    byte[] payload;

    int n;

    public PacketKexDhGexRequestOld(DHGexParameters para)
    {
        this.n = para.getPref_group_len();
    }

    public byte[] getPayload()
    {
        // Build the packet lazily: one message-type byte followed by the
        // preferred group length as a uint32.
        if (payload == null)
        {
            TypesWriter tw = new TypesWriter();
            tw.writeByte(Packets.SSH_MSG_KEX_DH_GEX_REQUEST_OLD);
            tw.writeUINT32(n);
            payload = tw.getBytes();
        }
        return payload;
    }
}
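The payload built above is tiny: one message-type byte (SSH_MSG_KEX_DH_GEX_REQUEST_OLD, value 30 per RFC 4419) followed by the preferred group length as a big-endian uint32. A minimal sketch of that encoding, shown in Python purely for illustration (the `build_payload` name is mine, not part of the library):

```python
import struct

# SSH_MSG_KEX_DH_GEX_REQUEST_OLD message number, per RFC 4419.
SSH_MSG_KEX_DH_GEX_REQUEST_OLD = 30

def build_payload(n: int) -> bytes:
    # One message-type byte, then the preferred group length as a
    # big-endian unsigned 32-bit integer -- five bytes in total.
    return struct.pack(">BI", SSH_MSG_KEX_DH_GEX_REQUEST_OLD, n)

print(build_payload(1024).hex())  # prints 1e00000400
```

For a preferred group length of 1024 bits the payload is `1e 00 00 04 00`, which is what the `TypesWriter` calls above should emit as well.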
| {
"pile_set_name": "Github"
} |
Impact of bullying due to dentofacial features on oral health-related quality of life.
The aim of this study was to investigate whether there is a relationship between self-reported bullying because of dentofacial features and oral health-related quality of life among a representative sample of Jordanian schoolchildren. This was a cross-sectional study in which a representative sample of sixth-grade students (age, 11-12 years) from randomly selected schools in Amman, Jordan, were asked to complete questionnaires distributed in the classroom in the presence of the researchers. The questionnaire used for this purpose was the short form of the Child Perceptions Questionnaire for 11- to 14-year-old children. The final sample size was 920 children (470 girls, 450 boys). There were significant differences between the sexes for the total Child Perceptions Questionnaire score and for the oral symptoms and the social well-being subscales, with boys reporting higher scores and thus more negative effects on their oral health-related quality of life. Comparison of the total scores and subscales scores for boys and girls subdivided into those who reported being bullied and not being bullied about their teeth showed that bullied boys had significantly greater effects on overall oral health-related quality of life and on all subscales than did not-bullied boys (P <0.001 for all comparisons). Bullied girls also had significantly greater effects on the overall oral health-related quality of life and all subscales than did not-bullied girls (P <0.001 for all comparisons). However, bullied boys and girls reported similar scores for the different subscales of the Child Perceptions Questionnaire; there were no significant differences. This study demonstrated a significant relationship between bullying because of dentofacial features and negative effects on oral health-related quality of life. 
The results highlight the importance of addressing the bullying problem among schoolchildren and provide important data for educational authorities to create antibullying programs to help students receive education in a safe and healthy environment. | {
"pile_set_name": "PubMed Abstracts"
} |
Bimonia is one of the last Birdlings alive on this earth. The Birdlings see humans as sworn enemies, and Bimonia has been taught to kill on sight.
One day, Bimonia meets Tayo, a human child hunting in the forest. Bimonia fails to kill Tayo, and they instead become friends. Mother will be furious if she finds out, or worse, this relationship may cause tension between the two species to erupt.
The Last Birdling explores Bimonia and Tayo's fight for their friendship as the world threatens to tear them apart.
* * *
Unhack, the first game created under the InvertMouse name, was produced in 2012. That means The Last Birdling happens to be my fifth anniversary project. This game is a culmination of all the lessons I have learned through the past five years. Now, I would like to share with you several features available in The Last Birdling:
Dual perspectives:
The Last Birdling alternates between Bimonia and Tayo's perspectives. Follow their journeys from childhood to adolescence as they struggle to maintain their friendship against all odds.
Multiple endings:
There are 21 decisions to make throughout Bimonia and Tayo's journeys. Depending on their resolve, this story will conclude in one of five ways. Use the progress tracker to determine how to obtain every ending.
Glossary:
Dive into The Last Birdling's lore through the glossary page. Throughout your journey, you will find links that take you to relevant entries.
* * *
Thank you! I hope you will find the experience worthwhile. | {
"pile_set_name": "OpenWebText2"
} |
NGC 298
NGC 298 is a spiral galaxy in the constellation Cetus. It was discovered on September 27, 1864 by Albert Marth.
References
External links
Category:Cetus (constellation)
Category:Spiral galaxies | {
"pile_set_name": "Wikipedia (en)"
} |
true
false
| {
"pile_set_name": "Github"
} |
Q:
How do I merge data from several rows in SQL Server
Here is my situation:
TABLE PEOPLE (code, name, + other fields that are identical for records with same code)
1;John Wayne
2;Jack Smith
2;Jill Smith
3;Bill Peyton
3;Gill Peyton
3;Billy Peyton
The result I would like:
VIEW PEOPLE (code, name, + other fields that are identical for records with same code)
1;John Wayne
2;Jack Smith Jill Smith
3;Bill Peyton Gill Peyton Billy Peyton
Can some one please help me create a view that would give me this result?
The point is merging rows with same "code" and merge names in column "name". All the other fields are 100% identical for rows with same "code".
Thank you.
A:
Try this
SELECT Code,
( SELECT Name + ' '
FROM Table1 t2
WHERE t2.Code = t1.Code
ORDER BY Name
FOR XML PATH('') ) AS Name
FROM Table1 t1
GROUP BY Code ;
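For illustration, the same group-and-concatenate idea can be sketched with Python's built-in sqlite3 module and its GROUP_CONCAT aggregate, using the question's sample data (this just shows the grouping logic, not SQL Server syntax):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE people (code INTEGER, name TEXT)")
conn.executemany("INSERT INTO people VALUES (?, ?)", [
    (1, "John Wayne"), (2, "Jack Smith"), (2, "Jill Smith"),
    (3, "Bill Peyton"), (3, "Gill Peyton"), (3, "Billy Peyton"),
])
# GROUP_CONCAT plays the role of the FOR XML PATH subquery: one row per
# code, with all names for that code joined by a space.
rows = conn.execute(
    "SELECT code, GROUP_CONCAT(name, ' ') FROM people GROUP BY code ORDER BY code"
).fetchall()
for code, names in rows:
    print(f"{code};{names}")
```

On SQL Server 2017 and later, `STRING_AGG(Name, ' ') ... GROUP BY Code` gives the same result more directly. Note also that the `Name + ' '` trick in the answer above leaves a trailing space on each name list, which can be removed with `RTRIM`.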
| {
"pile_set_name": "StackExchange"
} |
Mangham, Louisiana
Mangham is a village in Richland Parish in northeastern Louisiana, United States. The population was 595 at the 2000 census. Mangham was established in 1890. It is named for Wiley P. Mangham (died 1896), the publisher of the Richland Beacon-News, a weekly newspaper in Rayville, the seat of Richland Parish.
Geography
Mangham is located at (32.308304, -91.776225).
According to the United States Census Bureau, the town has a total area of , all land.
The community straddles the border with Franklin Parish. Louisiana Highway 15 runs through Mangham.
Demographics
As of the census of 2000, there were 595 people, 247 households, and 170 families residing in the town. The population density was 590.0 people per square mile (227.5/km²). There were 268 housing units at an average density of 265.7 per square mile (102.5/km²). The racial makeup of the town was 59.66% White, 39.83% African American, 0.17% Asian, and 0.34% from two or more races. Hispanic or Latino of any race were 1.34% of the population.
There were 247 households out of which 28.3% had children under the age of 18 living with them, 40.9% were married couples living together, 27.1% had a female householder with no husband present, and 30.8% were non-families. 28.7% of all households were made up of individuals and 13.0% had someone living alone who was 65 years of age or older. The average household size was 2.41 and the average family size was 2.96.
In the town, the population was spread out with 27.1% under the age of 18, 7.4% from 18 to 24, 24.4% from 25 to 44, 23.5% from 45 to 64, and 17.6% who were 65 years of age or older. The median age was 38 years. For every 100 females, there were 85.4 males. For every 100 females age 18 and over, there were 72.9 males.
The median income for a household in the town was $17,500, and the median income for a family was $23,558. Males had a median income of $22,273 versus $18,125 for females. The per capita income for the town was $15,813. About 22.0% of families and 28.3% of the population were below the poverty line, including 42.8% of those under age 18 and 9.0% of those age 65 or over.
Education
Public schools in Richland Parish are operated by the Richland Parish School Board. Three campuses serve the village of Mangham - Mangham Elementary School (Grades PK-5), Mangham Junior High School (Grades 6-8), and Mangham High School (Grades 9-12). The current principal of MJH is Elizabeth Gregorie. The principal of MHS is Mrs. Futch.
Notable people
Ralph Abraham (born September 16, 1954), veterinarian and physician elected as a Republican to the U.S. House of Representatives in 2014
Delores Chapman Danley, author of I Think I Heard a Rooster Crow, a collection of inspirational poems and devotionals, published in 2011 by Westbow Press.
Noble Ellington is a former member of both houses of the Louisiana State Legislature, having served continuously from 1988 to 2012. He graduated in 1960 from Mangham High School.
Robert Neal Harwell (born February 1944) is a commissioner of the Tensas Basin Levee District and the former Republican mayor of Mangham.
Bennie McLain Hixon (June 21, 1923 – October 23, 2014), was a principal of Mangham High School, self-published The History of Mangham and the Big Creek—Boeuf River to 1940. He also penned columns for the Richland Beacon. He enlisted in the United States Army during World War II and served with the 14th Armored Division, which entered France at Marseilles in 1944 and became part of General George S. Patton's Third Army as it moved across Germany. In 1963, he relocated to Monroe to join the history faculty of Neville High School. After retiring from professional education in 1969, he became a Louisiana state employee.
Dr. D. B. "Bruce" Magee (born May 30, 1955), former newscaster at KTVE-TV (El Dorado/Monroe) from 1979 to 1983; former international beauty pageant emcee for Cinderella Scholarship Pageants (1979-1989); former editor of College English (1989-1991); co-author of Mosaics (a series of writing textbooks) published in 1998 by Prentice-Hall; professor of English at Fullerton College in Fullerton, California; 1973 graduate of Mangham High School
Myrtis Methvin, mayor of Castor in Bienville Parish 1933 to 1945, second woman mayor in Louisiana; native of Attala County, Mississippi, resided in Mangham in her early years and is interred at Gwinn Cemetery
Keith Munyan, commercial photographer in North Hollywood, California, reared in Mangham, son of Keith Owen Munyan, Sr.
Keith Owen "Moose" Munyan, Sr. (September 6, 1932 – July 8, 1996), became the head football coach at Mangham High School in 1963 and held the position for twenty-seven years. Teams under Coach Munyan compiled an outstanding record of 204 wins, 78 losses, and 6 ties. He led his teams to sixteen district championships and was selected "District Coach of the Year" on eighteen occasions. Munyan held various offices with the Louisiana High School Coaches Association and was honored by the organization for twenty-five years of service.
Frellsen Reese (March 19, 1919 – January 31, 2007) was a Republican mayor of Mangham and a retired lieutenant colonel of the United States Army with service in both World War II and the Vietnam War.
Robert Max Ross (August 5, 1933 – September 15, 2009) ran as a Republican candidate for governor in 1972 and 1983, as well as for the United States Senate, the U.S. House of Representatives, the Louisiana State Senate, and for Mayor of Mangham, Louisiana.
References
External links
Mangham Progress Community Progress Site for Mangham, LA
Category:Villages in Richland Parish, Louisiana
Category:Villages in Louisiana | {
"pile_set_name": "Wikipedia (en)"
} |
Q:
Spring Security Oauth - Custom format for OAuth2Exceptions
The error format of spring security oauth conforms with the OAuth spec and looks like this.
{
"error":"insufficient_scope",
"error_description":"Insufficient scope for this resource",
"scope":"do.something"
}
Especially on a resource server I find it a bit strange to get a different error format for authentication issues. So I would like to change the way this exception is rendered.
The documentation says
Error handling in an Authorization Server uses standard Spring MVC
features, namely @ExceptionHandler methods
So I tried something like this to customize the format of the error:
@ControllerAdvice
@Order(Ordered.HIGHEST_PRECEDENCE)
public class MyErrorHandler {
@ExceptionHandler(value = {InsufficientScopeException.class})
ResponseEntity<MyErrorRepresentation> handle(RuntimeException ex, HttpServletRequest request) {
return errorResponse(HttpStatus.FORBIDDEN,
MyErrorRepresentation.builder()
.errorId("insufficient.scope")
.build(),
request);
}
}
But this does not work.
Looking at the code, all the error rendering seems to be done in DefaultWebResponseExceptionTranslator#handleOAuth2Exception. But implementing a custom WebResponseExceptionTranslator would not allow changing the format.
Any hints?
A:
First of all, some background knowledge on Spring Security OAuth2.
OAuth2 has two main parts:
AuthorizationServer: /oauth/token, issues tokens
ResourceServer: manages privileges for URL resources
Spring Security works by adding filters to the servlet container's filter chain, so exceptions raised there never reach @ControllerAdvice.
Custom OAuth2Exceptions therefore have to be handled separately for the AuthorizationServer and the ResourceServer.
This is the configuration:
@Configuration
@EnableAuthorizationServer
public class OAuthSecurityConfig extends AuthorizationServerConfigurerAdapter {
@Override
public void configure(AuthorizationServerEndpointsConfigurer endpoints) throws Exception {
//for custom
endpoints.exceptionTranslator(new MyWebResponseExceptionTranslator());
}
}
@Configuration
@EnableResourceServer
public class ResourceServerConfiguration extends ResourceServerConfigurerAdapter {
@Override
public void configure(ResourceServerSecurityConfigurer resources) {
// format message
resources.authenticationEntryPoint(new MyAuthenticationEntryPoint());
resources.accessDeniedHandler(new MyAccessDeniedHandler());
}
}
MyWebResponseExceptionTranslator translates the exception into our own OAuth2 exception type, and we customize that exception's serializer with Jackson — the same mechanism the default OAuth2Exception uses.
@JsonSerialize(using = OAuth2ExceptionJackson1Serializer.class)
public class OAuth2Exception extends RuntimeException {
The other custom handler classes:
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.security.core.AuthenticationException;
import org.springframework.security.oauth2.common.exceptions.OAuth2Exception;
import org.springframework.security.oauth2.provider.error.WebResponseExceptionTranslator;
/**
* @author qianggetaba
* @date 2019/6/21
*/
public class MyWebResponseExceptionTranslator implements WebResponseExceptionTranslator {
@Override
public ResponseEntity<OAuth2Exception> translate(Exception exception) throws Exception {
if (exception instanceof OAuth2Exception) {
OAuth2Exception oAuth2Exception = (OAuth2Exception) exception;
return ResponseEntity
.status(oAuth2Exception.getHttpErrorCode())
.body(new CustomOauthException(oAuth2Exception.getMessage()));
}else if(exception instanceof AuthenticationException){
AuthenticationException authenticationException = (AuthenticationException) exception;
return ResponseEntity
.status(HttpStatus.UNAUTHORIZED)
.body(new CustomOauthException(authenticationException.getMessage()));
}
return ResponseEntity
.status(HttpStatus.OK)
.body(new CustomOauthException(exception.getMessage()));
}
}
import com.fasterxml.jackson.databind.annotation.JsonSerialize;
import org.springframework.security.oauth2.common.exceptions.OAuth2Exception;
/**
* @author qianggetaba
* @date 2019/6/21
*/
@JsonSerialize(using = CustomOauthExceptionSerializer.class)
public class CustomOauthException extends OAuth2Exception {
public CustomOauthException(String msg) {
super(msg);
}
}
import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.databind.SerializerProvider;
import com.fasterxml.jackson.databind.ser.std.StdSerializer;
import java.io.IOException;
import java.util.Arrays;
import java.util.Map;
/**
* @author qianggetaba
* @date 2019/6/21
*/
public class CustomOauthExceptionSerializer extends StdSerializer<CustomOauthException> {
public CustomOauthExceptionSerializer() {
super(CustomOauthException.class);
}
@Override
public void serialize(CustomOauthException value, JsonGenerator jsonGenerator, SerializerProvider serializerProvider) throws IOException {
jsonGenerator.writeStartObject();
jsonGenerator.writeNumberField("code4444", value.getHttpErrorCode());
jsonGenerator.writeBooleanField("status", false);
jsonGenerator.writeObjectField("data", null);
jsonGenerator.writeObjectField("errors", Arrays.asList(value.getOAuth2ErrorCode(),value.getMessage()));
if (value.getAdditionalInformation()!=null) {
for (Map.Entry<String, String> entry : value.getAdditionalInformation().entrySet()) {
String key = entry.getKey();
String add = entry.getValue();
jsonGenerator.writeStringField(key, add);
}
}
jsonGenerator.writeEndObject();
}
}
For the custom ResourceServer exceptions:
import com.fasterxml.jackson.databind.ObjectMapper;
import org.springframework.security.core.AuthenticationException;
import org.springframework.security.web.AuthenticationEntryPoint;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import java.util.Date;
import java.util.HashMap;
import java.util.Map;
/**
* @author qianggetaba
* @date 2019/6/21
*/
public class MyAuthenticationEntryPoint implements AuthenticationEntryPoint {
@Override
public void commence(HttpServletRequest request, HttpServletResponse response,
AuthenticationException authException)
throws ServletException {
Map map = new HashMap();
map.put("errorentry", "401");
map.put("message", authException.getMessage());
map.put("path", request.getServletPath());
map.put("timestamp", String.valueOf(new Date().getTime()));
response.setContentType("application/json");
response.setStatus(HttpServletResponse.SC_UNAUTHORIZED);
try {
ObjectMapper mapper = new ObjectMapper();
mapper.writeValue(response.getOutputStream(), map);
} catch (Exception e) {
throw new ServletException();
}
}
}
import com.fasterxml.jackson.databind.ObjectMapper;
import org.springframework.security.access.AccessDeniedException;
import org.springframework.security.web.access.AccessDeniedHandler;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import java.io.IOException;
import java.util.Date;
import java.util.HashMap;
import java.util.Map;
/**
* @author qianggetaba
* @date 2019/6/21
*/
public class MyAccessDeniedHandler implements AccessDeniedHandler{
@Override
public void handle(HttpServletRequest request, HttpServletResponse response, AccessDeniedException accessDeniedException) throws IOException, ServletException {
response.setContentType("application/json;charset=UTF-8");
Map map = new HashMap();
map.put("errorauth", "403");
map.put("message", accessDeniedException.getMessage());
map.put("path", request.getServletPath());
map.put("timestamp", String.valueOf(new Date().getTime()));
// access denied for an authenticated caller is 403 Forbidden, not 401
response.setStatus(HttpServletResponse.SC_FORBIDDEN);
try {
ObjectMapper mapper = new ObjectMapper();
mapper.writeValue(response.getOutputStream(), map);
} catch (Exception e) {
throw new ServletException();
}
}
}
A:
I found a similar question with answers that really helped me solve this - Handle spring security authentication exceptions with @ExceptionHandler
But my question is specifically about spring-security-oauth2 - so I think it is still worth stating the answer specific to spring-security-oauth2. My solution was picked from different answers to the question mentioned above.
My samples work for spring-security-oauth2 2.0.13
So the solution for me to achieve a different custom error structure for oauth2 errors on resource server resources was to register a custom OAuth2AuthenticationEntryPoint and OAuth2AccessDeniedHandler using a ResourceServerConfigurerAdapter. It is worth mentioning that this only changes the format for ResourceServer endpoints - not for AuthorizationServer endpoints like the TokenEndpoint.
class MyCustomOauthErrorConversionConfigurerAdapter extends ResourceServerConfigurerAdapter {
@Override
public void configure(ResourceServerSecurityConfigurer configurer) throws Exception {
configurer.authenticationEntryPoint(new MyCustomOauthErrorOAuth2AuthenticationEntryPoint());
configurer.accessDeniedHandler(new MyCustomOauthErrorOAuth2AccessDeniedHandler());
}
}
I could not reuse the functionality in OAuth2AuthenticationEntryPoint and OAuth2AccessDeniedHandler because the relevant methods translate the exception and flush it in the same method. So I needed to copy some code:
public class MyCustomOauthErrorOAuth2AccessDeniedHandler extends OAuth2AccessDeniedHandler {
private final MyCustomOauthErrorOAuth2SecurityExceptionHandler oAuth2SecurityExceptionHandler = new MyCustomOauthErrorOAuth2SecurityExceptionHandler();
/**
* Does exactly what OAuth2AccessDeniedHandler does only that the body is transformed to {@link MyCustomOauthError} before rendering the exception
*/
@Override
public void handle(HttpServletRequest request, HttpServletResponse response, org.springframework.security.access.AccessDeniedException authException)
throws IOException, ServletException {
oAuth2SecurityExceptionHandler.handle(request, response, authException, this::enhanceResponse);
}
}
public class MyCustomOauthErrorOAuth2AuthenticationEntryPoint extends OAuth2AuthenticationEntryPoint {
private final MyCustomOauthErrorOAuth2SecurityExceptionHandler oAuth2SecurityExceptionHandler = new MyCustomOauthErrorOAuth2SecurityExceptionHandler();
/**
* Does exactly what OAuth2AuthenticationEntryPoint does only that the body is transformed to {@link MyCustomOauthError} before rendering the exception
*/
@Override
public void commence(HttpServletRequest request, HttpServletResponse response, AuthenticationException authException) throws IOException, ServletException {
oAuth2SecurityExceptionHandler.handle(request, response, authException, this::enhanceResponse);
}
}
@RequiredArgsConstructor
public class MyCustomOauthErrorOAuth2SecurityExceptionHandler {
private final WebResponseExceptionTranslator exceptionTranslator = new DefaultWebResponseExceptionTranslator();
private final OAuth2ExceptionRenderer exceptionRenderer = new DefaultOAuth2ExceptionRenderer();
private final HandlerExceptionResolver handlerExceptionResolver = new DefaultHandlerExceptionResolver();
/**
* This is basically what {@link org.springframework.security.oauth2.provider.error.AbstractOAuth2SecurityExceptionHandler#doHandle(HttpServletRequest, HttpServletResponse, Exception)} does.
*/
public void handle(HttpServletRequest request, HttpServletResponse response, RuntimeException authException,
BiFunction<ResponseEntity<OAuth2Exception>, Exception, ResponseEntity<OAuth2Exception>> oauthExceptionEnhancer)
throws IOException, ServletException {
try {
ResponseEntity<OAuth2Exception> defaultErrorResponse = exceptionTranslator.translate(authException);
defaultErrorResponse = oauthExceptionEnhancer.apply(defaultErrorResponse, authException);
//this is the actual translation of the error
final MyCustomOauthError customErrorPayload =
MyCustomOauthError.builder()
.errorId(defaultErrorResponse.getBody().getOAuth2ErrorCode())
.message(defaultErrorResponse.getBody().getMessage())
.details(defaultErrorResponse.getBody().getAdditionalInformation() == null ? emptyMap() : defaultErrorResponse.getBody().getAdditionalInformation())
.build();
final ResponseEntity<MyCustomOauthError> responseEntity = new ResponseEntity<>(customErrorPayload, defaultErrorResponse.getHeaders(), defaultErrorResponse.getStatusCode());
exceptionRenderer.handleHttpEntityResponse(responseEntity, new ServletWebRequest(request, response));
response.flushBuffer();
} catch (ServletException e) {
// Re-use some of the default Spring dispatcher behaviour - the exception came from the filter chain and
// not from an MVC handler so it won't be caught by the dispatcher (even if there is one)
if (handlerExceptionResolver.resolveException(request, response, this, e) == null) {
throw e;
}
} catch (IOException | RuntimeException e) {
throw e;
} catch (Exception e) {
// Wrap other Exceptions. These are not expected to happen
throw new RuntimeException(e);
}
}
}
| {
"pile_set_name": "StackExchange"
} |
Human endothelial cells. Light micrograph of cultured human endothelial cells, a cell type found lining the blood vessels. Fluorescent dyes have been used to show the cell structures. Actin filaments are red, the cell nuclei are blue, and microcorpuscules containing Von Willebrand factor are yellow. Actin is the most abundant cellular protein, forming part of the cells' cytoskeleton, the system that is responsible for intracellular transport, and the structure and motility of the cells. The cell nuclei contain the cells' genetic information, DNA (deoxyribonucleic acid), packaged in chromosomes. Von Willebrand factor is a blood clotting protein. Magnification: x350 when printed at 10 centimetres wide. | {
"pile_set_name": "Pile-CC"
} |
//===========================================
// The following is for 8812A 1ANT BT Co-exist definition
//===========================================
#define BT_AUTO_REPORT_ONLY_8812A_1ANT 1
#define BT_INFO_8812A_1ANT_B_FTP BIT7
#define BT_INFO_8812A_1ANT_B_A2DP BIT6
#define BT_INFO_8812A_1ANT_B_HID BIT5
#define BT_INFO_8812A_1ANT_B_SCO_BUSY BIT4
#define BT_INFO_8812A_1ANT_B_ACL_BUSY BIT3
#define BT_INFO_8812A_1ANT_B_INQ_PAGE BIT2
#define BT_INFO_8812A_1ANT_B_SCO_ESCO BIT1
#define BT_INFO_8812A_1ANT_B_CONNECTION BIT0
#define BT_INFO_8812A_1ANT_A2DP_BASIC_RATE(_BT_INFO_EXT_) \
(((_BT_INFO_EXT_&BIT0))? TRUE:FALSE)
#define BTC_RSSI_COEX_THRESH_TOL_8812A_1ANT 2
#define BT_8812A_1ANT_WIFI_NOISY_THRESH 30 //max: 255
typedef enum _BT_INFO_SRC_8812A_1ANT{
BT_INFO_SRC_8812A_1ANT_WIFI_FW = 0x0,
BT_INFO_SRC_8812A_1ANT_BT_RSP = 0x1,
BT_INFO_SRC_8812A_1ANT_BT_ACTIVE_SEND = 0x2,
BT_INFO_SRC_8812A_1ANT_MAX
}BT_INFO_SRC_8812A_1ANT,*PBT_INFO_SRC_8812A_1ANT;
typedef enum _BT_8812A_1ANT_BT_STATUS{
BT_8812A_1ANT_BT_STATUS_NON_CONNECTED_IDLE = 0x0,
BT_8812A_1ANT_BT_STATUS_CONNECTED_IDLE = 0x1,
BT_8812A_1ANT_BT_STATUS_INQ_PAGE = 0x2,
BT_8812A_1ANT_BT_STATUS_ACL_BUSY = 0x3,
BT_8812A_1ANT_BT_STATUS_SCO_BUSY = 0x4,
BT_8812A_1ANT_BT_STATUS_ACL_SCO_BUSY = 0x5,
BT_8812A_1ANT_BT_STATUS_MAX
}BT_8812A_1ANT_BT_STATUS,*PBT_8812A_1ANT_BT_STATUS;
typedef enum _BT_8812A_1ANT_WIFI_STATUS{
BT_8812A_1ANT_WIFI_STATUS_NON_CONNECTED_IDLE = 0x0,
BT_8812A_1ANT_WIFI_STATUS_NON_CONNECTED_ASSO_AUTH_SCAN = 0x1,
BT_8812A_1ANT_WIFI_STATUS_CONNECTED_SCAN = 0x2,
BT_8812A_1ANT_WIFI_STATUS_CONNECTED_SPECIAL_PKT = 0x3,
BT_8812A_1ANT_WIFI_STATUS_CONNECTED_IDLE = 0x4,
BT_8812A_1ANT_WIFI_STATUS_CONNECTED_BUSY = 0x5,
BT_8812A_1ANT_WIFI_STATUS_MAX
}BT_8812A_1ANT_WIFI_STATUS,*PBT_8812A_1ANT_WIFI_STATUS;
typedef enum _BT_8812A_1ANT_COEX_ALGO{
BT_8812A_1ANT_COEX_ALGO_UNDEFINED = 0x0,
BT_8812A_1ANT_COEX_ALGO_SCO = 0x1,
BT_8812A_1ANT_COEX_ALGO_HID = 0x2,
BT_8812A_1ANT_COEX_ALGO_A2DP = 0x3,
BT_8812A_1ANT_COEX_ALGO_A2DP_PANHS = 0x4,
BT_8812A_1ANT_COEX_ALGO_PANEDR = 0x5,
BT_8812A_1ANT_COEX_ALGO_PANHS = 0x6,
BT_8812A_1ANT_COEX_ALGO_PANEDR_A2DP = 0x7,
BT_8812A_1ANT_COEX_ALGO_PANEDR_HID = 0x8,
BT_8812A_1ANT_COEX_ALGO_HID_A2DP_PANEDR = 0x9,
BT_8812A_1ANT_COEX_ALGO_HID_A2DP = 0xa,
BT_8812A_1ANT_COEX_ALGO_MAX = 0xb,
}BT_8812A_1ANT_COEX_ALGO,*PBT_8812A_1ANT_COEX_ALGO;
typedef struct _COEX_DM_8812A_1ANT{
// hw setting
u1Byte preAntPosType;
u1Byte curAntPosType;
// fw mechanism
BOOLEAN bCurIgnoreWlanAct;
BOOLEAN bPreIgnoreWlanAct;
u1Byte prePsTdma;
u1Byte curPsTdma;
u1Byte psTdmaPara[5];
u1Byte psTdmaDuAdjType;
BOOLEAN bAutoTdmaAdjust;
BOOLEAN bPrePsTdmaOn;
BOOLEAN bCurPsTdmaOn;
BOOLEAN bPreBtAutoReport;
BOOLEAN bCurBtAutoReport;
u1Byte preLps;
u1Byte curLps;
u1Byte preRpwm;
u1Byte curRpwm;
// sw mechanism
BOOLEAN bPreLowPenaltyRa;
BOOLEAN bCurLowPenaltyRa;
u4Byte preVal0x6c0;
u4Byte curVal0x6c0;
u4Byte preVal0x6c4;
u4Byte curVal0x6c4;
u4Byte preVal0x6c8;
u4Byte curVal0x6c8;
u1Byte preVal0x6cc;
u1Byte curVal0x6cc;
BOOLEAN bLimitedDig;
u4Byte backupArfrCnt1; // Auto Rate Fallback Retry cnt
u4Byte backupArfrCnt2; // Auto Rate Fallback Retry cnt
u2Byte backupRetryLimit;
u1Byte backupAmpduMaxTime;
// algorithm related
u1Byte preAlgorithm;
u1Byte curAlgorithm;
u1Byte btStatus;
u1Byte wifiChnlInfo[3];
u4Byte preRaMask;
u4Byte curRaMask;
u1Byte preArfrType;
u1Byte curArfrType;
u1Byte preRetryLimitType;
u1Byte curRetryLimitType;
u1Byte preAmpduTimeType;
u1Byte curAmpduTimeType;
u4Byte nArpCnt;
u1Byte errorCondition;
} COEX_DM_8812A_1ANT, *PCOEX_DM_8812A_1ANT;
typedef struct _COEX_STA_8812A_1ANT{
BOOLEAN bBtLinkExist;
BOOLEAN bScoExist;
BOOLEAN bA2dpExist;
BOOLEAN bHidExist;
BOOLEAN bPanExist;
BOOLEAN bUnderLps;
BOOLEAN bUnderIps;
u4Byte specialPktPeriodCnt;
u4Byte highPriorityTx;
u4Byte highPriorityRx;
u4Byte lowPriorityTx;
u4Byte lowPriorityRx;
s1Byte btRssi;
BOOLEAN bBtTxRxMask;
u1Byte preBtRssiState;
u1Byte preWifiRssiState[4];
BOOLEAN bC2hBtInfoReqSent;
u1Byte btInfoC2h[BT_INFO_SRC_8812A_1ANT_MAX][10];
u4Byte btInfoC2hCnt[BT_INFO_SRC_8812A_1ANT_MAX];
u4Byte btInfoQueryCnt;
BOOLEAN bC2hBtInquiryPage;
BOOLEAN bC2hBtPage; //Add for win8.1 page out issue
BOOLEAN bWiFiIsHighPriTask; //Add for win8.1 page out issue
u1Byte btRetryCnt;
u1Byte btInfoExt;
u4Byte popEventCnt;
u1Byte nScanAPNum;
u4Byte nCRCOK_CCK;
u4Byte nCRCOK_11g;
u4Byte nCRCOK_11n;
u4Byte nCRCOK_11nAgg;
u4Byte nCRCErr_CCK;
u4Byte nCRCErr_11g;
u4Byte nCRCErr_11n;
u4Byte nCRCErr_11nAgg;
BOOLEAN bCCKLock;
BOOLEAN bPreCCKLock;
u1Byte nCoexTableType;
BOOLEAN bForceLpsOn;
}COEX_STA_8812A_1ANT, *PCOEX_STA_8812A_1ANT;
//===========================================
// The following is interface which will notify coex module.
//===========================================
VOID
EXhalbtc8812a1ant_PowerOnSetting(
IN PBTC_COEXIST pBtCoexist
);
VOID
EXhalbtc8812a1ant_PreLoadFirmware(
IN PBTC_COEXIST pBtCoexist
);
VOID
EXhalbtc8812a1ant_InitHwConfig(
IN PBTC_COEXIST pBtCoexist,
IN BOOLEAN bWifiOnly
);
VOID
EXhalbtc8812a1ant_InitCoexDm(
IN PBTC_COEXIST pBtCoexist
);
VOID
EXhalbtc8812a1ant_IpsNotify(
IN PBTC_COEXIST pBtCoexist,
IN u1Byte type
);
VOID
EXhalbtc8812a1ant_LpsNotify(
IN PBTC_COEXIST pBtCoexist,
IN u1Byte type
);
VOID
EXhalbtc8812a1ant_ScanNotify(
IN PBTC_COEXIST pBtCoexist,
IN u1Byte type
);
VOID
EXhalbtc8812a1ant_ConnectNotify(
IN PBTC_COEXIST pBtCoexist,
IN u1Byte type
);
VOID
EXhalbtc8812a1ant_MediaStatusNotify(
IN PBTC_COEXIST pBtCoexist,
IN u1Byte type
);
VOID
EXhalbtc8812a1ant_SpecialPacketNotify(
IN PBTC_COEXIST pBtCoexist,
IN u1Byte type
);
VOID
EXhalbtc8812a1ant_BtInfoNotify(
IN PBTC_COEXIST pBtCoexist,
IN pu1Byte tmpBuf,
IN u1Byte length
);
VOID
EXhalbtc8812a1ant_RfStatusNotify(
IN PBTC_COEXIST pBtCoexist,
IN u1Byte type
);
VOID
EXhalbtc8812a1ant_HaltNotify(
IN PBTC_COEXIST pBtCoexist
);
VOID
EXhalbtc8812a1ant_PnpNotify(
IN PBTC_COEXIST pBtCoexist,
IN u1Byte pnpState
);
VOID
EXhalbtc8812a1ant_CoexDmReset(
IN PBTC_COEXIST pBtCoexist
);
VOID
EXhalbtc8812a1ant_Periodical(
IN PBTC_COEXIST pBtCoexist
);
VOID
EXhalbtc8812a1ant_DbgControl(
IN PBTC_COEXIST pBtCoexist,
IN u1Byte opCode,
IN u1Byte opLen,
IN pu1Byte pData
);
VOID
EXhalbtc8812a1ant_DisplayCoexInfo(
IN PBTC_COEXIST pBtCoexist
);
| {
"pile_set_name": "Github"
} |
AV Haunts
AV Haunts
is creating adventure and exploration videos
Thanks for stopping by! We create videos of our wacky adventures. We search for historical locations. Sometimes we find that we are not alone. Join us and show us some support as we traverse across the land.
| {
"pile_set_name": "Pile-CC"
} |
Bacterol Products Company
Bacterol Products Company was a New York City business incorporated during the Great Depression, in February 1930, with 12,000 shares of common stock. The firm was located at 25 Broadway.
The organization was headed by Dr. Kurt Erich Schlossingk (1888 - March 13, 1930), a German born physician and chemist. Previously the export manager for the American Drug Syndicate, Schlossingk introduced twilight sleep to the United States in 1914. This was a method of maternity anesthesia induced by morphine and scopolamine. Schlossingk died an untimely death following gallstone surgery at Lenox Hill Hospital.
On February 7, 1930 the Bacterol Products Company announced a capital increase, from the initial 12,000 shares of no par stock, to 41,000. At the end of February 1930, the corporation leased property at 11 East 44th Street. This location was their corporate headquarters.
References
Category:Defunct companies based in New York City
Category:Chemical companies established in 1930
Category:1930 establishments in New York (state) | {
"pile_set_name": "Wikipedia (en)"
} |
package crdapps
import (
"testing"
routefake "github.com/openshift/client-go/route/clientset/versioned/fake"
v1alpha1 "github.com/stakater/Forecastle/pkg/apis/forecastle/v1alpha1"
"github.com/stakater/Forecastle/pkg/kube"
"github.com/stakater/Forecastle/pkg/testutil"
metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
kubefake "k8s.io/client-go/kubernetes/fake"
)
func Test_getURL(t *testing.T) {
clients := kube.Clients{
RoutesClient: routefake.NewSimpleClientset(),
KubernetesClient: kubefake.NewSimpleClientset(),
}
type args struct {
clients kube.Clients
forecastleApp v1alpha1.ForecastleApp
}
tests := []struct {
name string
args args
want string
err error
}{
{
name: "TestGetURLWithDefaultURLValue",
args: args{
clients: clients,
forecastleApp: *testutil.CreateForecastleApp("app-1", "https://google.com", "default", "https://icon"),
},
want: "https://google.com",
},
{
name: "TestGetURLWithNoURL",
args: args{
clients: clients,
forecastleApp: *testutil.CreateForecastleApp("app-1", "", "default", "https://icon"),
},
want: "",
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
if got, err := getURL(tt.args.clients, tt.args.forecastleApp); got != tt.want || err != tt.err {
t.Errorf("getURL() = %v, want %v, err = %v, wantErr = %v", got, tt.want, err, tt.err)
}
})
}
}
func Test_discoverURLFromRefs(t *testing.T) {
clients := kube.Clients{
RoutesClient: routefake.NewSimpleClientset(),
KubernetesClient: kubefake.NewSimpleClientset(),
}
clients.KubernetesClient.ExtensionsV1beta1().Ingresses("").Create(testutil.CreateIngressWithHost("my-app-ingress", "https://ingress-url.com"))
clients.RoutesClient.RouteV1().Routes("").Create(testutil.CreateRouteWithHost("my-app-route", "ingress-url.com"))
type args struct {
clients kube.Clients
forecastleApp v1alpha1.ForecastleApp
}
tests := []struct {
name string
args args
want string
err error
}{
{
name: "TestDiscoverURLFromRefsWithIngressName",
args: args{
clients: clients,
forecastleApp: *testutil.CreateForecastleAppWithURLFromIngress("app-1", "default", "https://icon", "my-app-ingress"),
},
want: "http://https://ingress-url.com",
},
{
name: "TestDiscoverURLFromRefsWithRouteName",
args: args{
clients: clients,
forecastleApp: *testutil.CreateForecastleAppWithURLFromRoute("app-1", "default", "https://icon", "my-app-route"),
},
want: "http://ingress-url.com",
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
if got, err := discoverURLFromRefs(tt.args.clients, tt.args.forecastleApp); got != tt.want || err != tt.err {
t.Errorf("discoverURLFromRefs() = %v, want %v, err = %v, wantErr = %v", got, tt.want, err, tt.err)
}
})
}
clients.KubernetesClient.ExtensionsV1beta1().Ingresses("").Delete("my-app-ingress", &metav1.DeleteOptions{})
clients.RoutesClient.RouteV1().Routes("").Delete("my-app-route", &metav1.DeleteOptions{})
}
| {
"pile_set_name": "Github"
} |
//------------------------------------------------------------------------------
// <auto-generated>
// This code was generated by a tool.
// Runtime Version:4.0.30319.1
//
// Changes to this file may cause incorrect behavior and will be lost if
// the code is regenerated.
// </auto-generated>
//------------------------------------------------------------------------------
namespace ClearCanvas.ImageServer.Common {
[global::System.Runtime.CompilerServices.CompilerGeneratedAttribute()]
[global::System.CodeDom.Compiler.GeneratedCodeAttribute("Microsoft.VisualStudio.Editors.SettingsDesigner.SettingsSingleFileGenerator", "10.0.0.0")]
public sealed partial class ProductManifestServiceSettings : global::System.Configuration.ApplicationSettingsBase {
private static ProductManifestServiceSettings defaultInstance = ((ProductManifestServiceSettings)(global::System.Configuration.ApplicationSettingsBase.Synchronized(new ProductManifestServiceSettings())));
public static ProductManifestServiceSettings Default {
get {
return defaultInstance;
}
}
[global::System.Configuration.ApplicationScopedSettingAttribute()]
[global::System.Diagnostics.DebuggerNonUserCodeAttribute()]
[global::System.Configuration.DefaultSettingValueAttribute("http://localhost:9998/")]
public string BaseUrl {
get {
return ((string)(this["BaseUrl"]));
}
}
[global::System.Configuration.ApplicationScopedSettingAttribute()]
[global::System.Diagnostics.DebuggerNonUserCodeAttribute()]
[global::System.Configuration.DefaultSettingValueAttribute("")]
public string FailoverBaseUrl {
get {
return ((string)(this["FailoverBaseUrl"]));
}
}
[global::System.Configuration.ApplicationScopedSettingAttribute()]
[global::System.Diagnostics.DebuggerNonUserCodeAttribute()]
[global::System.Configuration.DefaultSettingValueAttribute("ClearCanvas.ImageServer.Common.ClientWsHttpConfiguration, ClearCanvas.ImageServer" +
".Common")]
public string ConfigurationClass {
get {
return ((string)(this["ConfigurationClass"]));
}
}
[global::System.Configuration.ApplicationScopedSettingAttribute()]
[global::System.Diagnostics.DebuggerNonUserCodeAttribute()]
[global::System.Configuration.DefaultSettingValueAttribute("2000000")]
public int MaxReceivedMessageSize {
get {
return ((int)(this["MaxReceivedMessageSize"]));
}
}
[global::System.Configuration.ApplicationScopedSettingAttribute()]
[global::System.Diagnostics.DebuggerNonUserCodeAttribute()]
[global::System.Configuration.DefaultSettingValueAttribute("None")]
public global::System.ServiceModel.Security.X509CertificateValidationMode CertificateValidationMode {
get {
return ((global::System.ServiceModel.Security.X509CertificateValidationMode)(this["CertificateValidationMode"]));
}
}
[global::System.Configuration.ApplicationScopedSettingAttribute()]
[global::System.Diagnostics.DebuggerNonUserCodeAttribute()]
[global::System.Configuration.DefaultSettingValueAttribute("NoCheck")]
public global::System.Security.Cryptography.X509Certificates.X509RevocationMode RevocationMode {
get {
return ((global::System.Security.Cryptography.X509Certificates.X509RevocationMode)(this["RevocationMode"]));
}
}
[global::System.Configuration.ApplicationScopedSettingAttribute()]
[global::System.Diagnostics.DebuggerNonUserCodeAttribute()]
[global::System.Configuration.DefaultSettingValueAttribute("")]
public string UserCredentialsProviderClass {
get {
return ((string)(this["UserCredentialsProviderClass"]));
}
}
}
}
| {
"pile_set_name": "Github"
} |
SU-E-T-155: Dose Response Curve of EBT2 and EBT3 Radiochromic Films to a Synchrotron-Produced Monochromatic X-Ray Beam.
This work investigates the dose-response curves of Gafchromic EBT2 and EBT3 radiochromic films using synchrotron-produced monochromatic x-ray beams. These dosimeters are being utilized for dose verification in photoactivated Auger electron therapy at the LSU Center for Advanced Microstructures and Devices (CAMD) synchrotron facility. Monochromatic beams of 25, 30 and 35 keV were generated on the tomography beamline at CAMD. Ion chamber depth-dose measurements were used to calculate the dose delivered to films irradiated simultaneously at depths from 0.7 - 8.5 cm in a 10×10×10-cm³ polymethylmethacrylate phantom. AAPM TG-61 protocol was applied to convert measured ionization into dose. Calibrations of films at 4 MV were obtained for comparison using a Clinac 21 EX radiotherapy accelerator at Mary Bird Perkins Cancer Center. Films were digitized using an Epson 1680 Professional flatbed scanner and analyzed using the optical density (OD) derived from the red channel. For EBT2 film the average sensitivity (OD/dose) at 50, 100, and 200 cGy relative to that for 4-MV x-rays was 1.07, 1.20, and 1.23 for 25, 30, and 35 keV, respectively. For EBT3 film the average sensitivity was within 3% of unity for all three monochromatic beams. EBT2 film sensitivity shows strong energy dependence over an energy range of 25 keV - 4 MV. EBT3 film shows weak energy dependence, indicating that it would be the better dosimeter for Auger electron therapy. This research was supported by contract W81XWH-10-1-0005 awarded by The U.S. Army Research Acquisition Activity, 820 Chandler Street, Fort Detrick, MD 21702-5014. This report does not necessarily reflect the position or policy of the Government, and no official endorsement should be inferred. | {
"pile_set_name": "PubMed Abstracts"
} |
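The sensitivity comparison reported in the abstract above (optical density per unit dose, normalized to the 4-MV reference) can be sketched as follows. The helper names and the numeric values in the comments are illustrative assumptions, not measured data from the study.

```python
import math

def net_optical_density(pixel_exposed, pixel_unexposed):
    # Net OD from red-channel pixel values of a flatbed film scan.
    return math.log10(pixel_unexposed / pixel_exposed)

def relative_sensitivity(od_mono, od_ref, dose_cgy):
    # Sensitivity is OD per unit dose; at equal dose the ratio of
    # sensitivities reduces to the ratio of optical densities.
    return (od_mono / dose_cgy) / (od_ref / dose_cgy)

# Illustrative values: a monochromatic-beam OD of 0.24 versus a 4-MV
# reference OD of 0.20 at the same 100 cGy gives a relative
# sensitivity of 1.2, in the range the abstract reports for EBT2.
```

A relative sensitivity near 1.0 across all beam energies is what marks EBT3 as the weakly energy-dependent film in the abstract.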
<?php
/*
* Copyright 2007-2017 Charles du Jeu - Abstrium SAS <team (at) pyd.io>
* This file is part of Pydio.
*
* Pydio is free software: you can redistribute it and/or modify
* it under the terms of the GNU Affero General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Pydio is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Affero General Public License for more details.
*
* You should have received a copy of the GNU Affero General Public License
* along with Pydio. If not, see <http://www.gnu.org/licenses/>.
*
* The latest code can be found at <https://pydio.com>.
*/
namespace Pydio\Access\Core\Filter;
use Pydio\Access\Core\Model\AJXP_Node;
use Pydio\Access\Core\Model\UserSelection;
use Pydio\Core\Utils\Vars\PathUtils;
defined('AJXP_EXEC') or die( 'Access not allowed');
/**
* Class ContentFilter
*/
class ContentFilter {
public $filters = array();
public $virtualPaths = array();
/**
* @param AJXP_Node[] $nodes
*/
function __construct($nodes){
foreach($nodes as $n){
$virtualPath = $this->getVirtualPath($n->getPath());
$this->filters[$n->getPath()] = $virtualPath;
}
$this->virtualPaths = array_flip($this->filters);
}
/**
* @param $path
* @return string
*/
private function getVirtualPath($path){
return "/".substr(md5($path), 0, 10)."/".basename($path);
}
/**
* @param UserSelection $userSelection
*/
function filterUserSelection( &$userSelection ){
if($userSelection->isEmpty()){
foreach($this->filters as $path => $virtual){
$userSelection->addFile($path);
}
}else{
$newFiles = array();
foreach($userSelection->getFiles() as $f){
if(isSet($this->virtualPaths[$f])){
$newFiles[] = $this->virtualPaths[$f];
}else{
$testB = base64_decode($f);
if(isSet($this->virtualPaths[$testB])){
$newFiles[] = $this->virtualPaths[$testB];
}
}
}
$userSelection->setFiles($newFiles);
}
}
/**
* @return mixed|string
*/
function getBaseDir(){
return PathUtils::forwardSlashDirname(array_keys($this->filters)[0]);
}
/**
* Retrieves the path of the first object
* @return mixed|string
*/
function getUniquePath(){
return PathUtils::forwardSlashBasename(array_keys($this->filters)[0]);
}
/**
* @param AJXP_Node $node
* @return String
*/
function externalPath(AJXP_Node $node){
return $this->getVirtualPath($node->getPath());
}
/**
* @param String $vPath
* @return String mixed
*/
function filterExternalPath($vPath){
if(isSet($this->virtualPaths) && isSet($this->virtualPaths[$vPath])){
return $this->virtualPaths[$vPath];
}
return $vPath;
}
/**
* @param String $oldPath
* @param String $newPath
* @return bool Operation result
*/
public function movePath($oldPath, $newPath){
if(isSet($this->filters[$oldPath])){
$this->filters[$newPath] = $this->getVirtualPath($newPath);
unset($this->filters[$oldPath]);
$this->virtualPaths = array_flip($this->filters);
return true;
}
return false;
}
/**
* @return array public data as array, pre-utf8 encoded
*/
public function toArray(){
$data = array("filters" => array(), "virtualPaths" => array());
foreach($this->filters as $k => $v){
$data["filters"][$k] = $v;
}
foreach($this->virtualPaths as $k => $v){
$data["virtualPaths"][$k] = $v;
}
return $data;
}
/**
* @param $filters
*/
public function fromFilterArray($filters){
$this->filters = $filters;
$this->virtualPaths = array_flip($this->filters);
}
} | {
"pile_set_name": "Github"
} |
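The ContentFilter class above hides real repository paths behind short virtual ones built from an md5 prefix plus the file's basename. A minimal Python sketch of the same mapping (the function name is mine, not Pydio's):

```python
import hashlib
import os

def virtual_path(path):
    # Mirror ContentFilter::getVirtualPath: a 10-character md5 prefix
    # of the full path, followed by the file's basename.
    digest = hashlib.md5(path.encode("utf-8")).hexdigest()[:10]
    return "/" + digest + "/" + os.path.basename(path)

# Distinct paths with the same basename map to distinct virtual paths,
# while the visible filename is preserved for display.
```

Because the prefix is derived from the full path, two files named `report.pdf` in different folders get different virtual paths, which is what lets `virtualPaths` (the flipped array) resolve a virtual path back to exactly one real path.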
Regulating the licensing of DNA replication origins in metazoa.
Eukaryotic DNA replication is a highly conserved process; the proteins and sequence of events that replicate animal genomes are remarkably similar to those that replicate yeast genomes. Moreover, the assembly of prereplication complexes at DNA replication origins ('DNA licensing') is regulated in all eukaryotes so that no origin fires more than once in a single cell cycle. And yet there are significant differences between species both in the selection of replication origins and in the way in which these origins are licensed to operate. Moreover, these differences impart advantages to multicellular animals and plants that facilitate their development, such as better control over endoreduplication, flexibility in origin selection, and discrimination between quiescent and proliferative states. | {
"pile_set_name": "PubMed Abstracts"
} |
Ethylene glycol or methanol intoxication: which antidote should be used, fomepizole or ethanol?
Ethylene glycol (EG) and methanol poisoning can cause life-threatening complications. Toxicity of EG and methanol is related to the production of toxic metabolites by the enzyme alcohol dehydrogenase (ADH), which can lead to metabolic acidosis, renal failure (in EG poisoning), blindness (in methanol poisoning) and death. Therapy consists of general supportive care (e.g. intravenous fluids, correction of electrolytes and acidaemia), the use of antidotes and haemodialysis. Haemodialysis is considered a key element in the treatment of severe EG and methanol intoxication and is aimed at removing both the parent compound and its toxic metabolites, reducing the duration of antidotal treatment and shortening the hospital observation period. Currently, there are two antidotes used to block ADH-mediated metabolism of EG and methanol: ethanol and fomepizole. In this review, the advantages and disadvantages of both antidotes in terms of efficacy, safety and costs are discussed in order to help the physician to decide which antidote is appropriate in a specific clinical setting. | {
"pile_set_name": "PubMed Abstracts"
} |