The Tax Justice Network has criticised a Senate report on corporate tax avoidance, saying it fails to recommend the most needed reforms, such as whistleblower protection for private-sector workers and disclosure of all subsidiaries in offshore tax havens. The report also fails to recommend a reform that would show where a company conducts its business compared to where it actually parks its profits, Mark Zirnsak of the Tax Justice Network said. “Those being made public would be a huge step forward, that’s a huge one,” Zirnsak said. The Senate inquiry into corporate tax avoidance released its interim report on Friday afternoon, a day after convening a special hearing in response to the infamous Panama Papers leak. The tax office commissioner, Chris Jordan, gave explosive evidence at that hearing, saying the Panamanian law firm at the centre of the leak, Mossack Fonseca, was not unique and was being used to conceal and aid criminal behaviour, as well as tax minimisation strategies. The Senate committee was supposed to publish its final report on Friday but, after the Panama Papers revelations, it has asked for an extension until 30 September, well after the election. Zirnsak said he thought the committee had had to rush its work, with the election looming, and he fears its work will be forgotten as political events take over between now and September. But the committee’s chair, the Labor senator Chris Ketter, said that would not happen. “The Coalition government would be very foolish to forget about this committee and the forward-thinking reports and recommendations it provides,” he said. The report, Corporate Tax Avoidance: Gaming the System, recommends stronger penalties for not providing information to the ATO and action on general purpose accounts reporting. But it said it needed more time to consider the revelations from the Panama Papers leak, in which 800 Australians have been identified.
“Although the committee has delved into the use of tax havens as a means of avoiding tax, recent revelations emanating from a law firm operating out of Panama have enlivened commentary about this practice,” the report stated. “The committee intends to consider Australians’ use of offshore facilities to avoid paying their due taxes and will seek an extension to the final reporting date in order to examine and report on this matter. “The papers seem predominantly to involve individuals, not multinationals. That said, these revelations provide additional evidence that tax avoidance and aggressive minimisation extends beyond the realm of large multinationals.” The report was released hours after the Guardian revealed the Turnbull government plans to create a public register revealing the identities of the beneficial owners of shell companies in an effort to quell mounting public outrage about multinational tax avoidance in the countdown to the federal election. The creation of the public register – which will be announced within weeks before an international tax avoidance and evasion summit convened by the British prime minister, David Cameron, in London in mid-May – will bring Australia into line with G20 commitments on transparency. The independent senator Nick Xenophon said Australians ought not worry that the Senate report was less comprehensive than some people expected. “The final report is the report that will count. People shouldn’t read too much into the fact that this is an interim report,” he said. “Conclusions will be drawn by the time of the final report, and I for one will be pushing to make that a very strong report.”
Q: Close Fancybox modal window and reload the page after login form has been submitted

I've got a login form opening in a FancyBox 2 modal iframe window. But when you submit the form, the web page opens within the modal. Can anyone tell me if and how to make the Fancybox modal close, and the underlying page reload, when the login form is submitted please? Here's my current code, within a functions.js file:

$(".login-popup").fancybox({
    maxWidth    : 310,
    height      : 300,
    autoSize    : false,
    closeClick  : false,
    openEffect  : 'none',
    closeEffect : 'none'
});

And here's my login link:

<a class="login-popup" data-fancybox-type="iframe" href="/popup/login" title="">Login</a>

Thanks for any help, Ste

Edit 1: Thanks for the help with this. Sorry for being a complete newbie, but I'm having trouble getting the parent.$.fancybox.close(); to work on the form submit() event. Here's the code I'm now using, but it doesn't close the window, so I'm doing something wrong. Any ideas what? Thanks.

$(".login-popup").fancybox({
    closeEffect : 'none',
    'afterClose' : function () {
        window.location.reload();
    },
});

$(".login-form").submit(function(){
    parent.fancyBoxClose();
});

A: To close the fancybox from within the iframe, you can use parent.$.fancybox.close(); on the form submit() event. Then add an afterClose callback on the Fancybox initializer, which reloads the page:

$(".login-popup").fancybox({
    maxWidth    : 310,
    height      : 300,
    autoSize    : false,
    closeClick  : false,
    openEffect  : 'none',
    closeEffect : 'none',
    'afterClose' : function () {
        window.location.reload();
    },
});
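Putting the answer's pieces together: the submit handler has to live in the page that is loaded inside the iframe, not in the parent's functions.js — which is why the attempt in Edit 1 fails (parent.fancyBoxClose() is not a FancyBox 2 API function). A minimal sketch, assuming jQuery and FancyBox 2 are loaded on both pages and reusing the question's .login-popup / .login-form selectors:

```javascript
// --- Parent page (functions.js) ---
$(".login-popup").fancybox({
    maxWidth    : 310,
    height      : 300,
    autoSize    : false,
    closeClick  : false,
    openEffect  : 'none',
    closeEffect : 'none',
    afterClose  : function () {
        // Runs once the modal has closed, so the underlying page
        // reloads and picks up the new logged-in session.
        window.location.reload();
    }
});

// --- Login page loaded inside the iframe (/popup/login) ---
$(".login-form").submit(function () {
    // Reach up into the parent window and close the FancyBox modal,
    // which triggers the parent's afterClose handler above.
    parent.$.fancybox.close();
});
```

One caveat worth noting: closing the modal the instant the form submits can race the login request itself; in practice you may prefer to submit the form via Ajax and call parent.$.fancybox.close() only in the success callback.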
redspottedhanky.com, the award-winning online travel retailer, has announced it is using Masabi, a developer of mTicketing technology for the transport sector, for its mobile ticketing service. The mobile application allows travellers to search for train times, buy tickets and earn loyalty points for any route in the UK, with displayable mTickets available on certain routes, eliminating the need for many passengers to use ticket machines. The apps currently support iPhone, Android, BlackBerry and Nokia smartphones, as well as most everyday phones, with a Windows Phone 7 version launching soon. Customers can quickly and easily search for and buy cheap train tickets and earn loyalty points without paying any booking fees – repeat purchases are even faster, requiring only a few key presses. mTickets are supported on certain routes, such as the Chiltern Railways network, in the form of 2D barcodes. Where mTickets are not available, customers can collect their tickets using a booking reference number at most UK rail stations. The application also has the option to link purchases to the user’s web account, giving them a single convenient place to view all purchases, or print out receipts for expenses claims. “It is now possible to get the great redspottedhanky.com service on the go where customers can check train times and buy the cheapest possible tickets whilst earning loyalty points using their mobile phone, whether it’s the latest smartphone or a basic everyday handset,” said James Bain, redspottedhanky.com’s Director and Co-Founder. “We look forward to continuing to work with Masabi to further develop the mobile applications and offer more advanced features, ultimately improving our customers’ experience.” “As more and more consumers opt to travel by rail, there is a need to find new ways to sell tickets without necessarily expanding every station. 
This application allows consumers to completely bypass ticket machines, making their journeys as effortless and enjoyable as possible, from buying the ticket to boarding the train,” said Giacomo Biggiero, CPO of Masabi. “We are extremely pleased to be working with forward-thinking online rail ticket vendors like redspottedhanky.com in bringing this innovative service to the public.” All transactions are secured using Masabi’s award-winning, US government certified, encryptME security system that will ultimately support phone-based contactless technology, such as NFC, once it is widely available on handsets.
Sunday, April 4, 1999 Published at 12:09 GMT 13:09 UK

UK: Harriers thwarted again

[Picture caption: Harrier GR7: Still firmly rooted to the ground]

RAF Harriers have abandoned their latest mission against Serb targets after technical problems on a support plane. The aborted attempt has meant growing frustration for British pilots flying out of the Gioia del Colle base in Italy. The Harrier GR7s, which took off at about midnight GMT, were airborne for just minutes before they were recalled without even crossing the Yugoslav border. "The frustration continues," Group Captain Ian Travers Smith said. "Because of problems with assets in the other parts of the package, to go on with the mission would have put the other aircraft in the package at risk."

[Picture caption: The Harriers have had only two successful missions so far]

But he said it was unfortunate the mission was recalled, as weather conditions may have been right for the pilots to effectively target Serb units. "The boys are definitely a bit miffed. I think you can imagine their frustration," he said. At Sunday's Ministry of Defence briefing Air Marshal Sir John Day said improving weather in the Balkans would help them intensify the ongoing air strikes.

'Substantial attacks'

He said: "Nato carried out a substantial series of attacks last night. This was in spite of the fact that the continuing poor weather disrupted the overall air effort. And a technical difficulty with a support aircraft led to the cancellation of part of Nato's planned bombing missions."

[Picture caption: An RAF weatherman assesses the day's forecast]

But he added: "The weather in the operational area is at last improving and we confidently expect that the full weight of Nato's air power will be brought to bear over the next few days." Since the air strikes began on 25 March, the Harriers at the base have carried out just two successful missions, and this is the fifth consecutive night that the Harrier pilots have had their flights put on hold. 
Tons of extra weapons were delivered to the base on Saturday, but have still not been put to use. Military concerns over the number of aborted Nato missions have reportedly prompted the UK to send unmanned reconnaissance drones to the area. The Phoenix UAVs (unmanned aerial vehicles) came into service with the Army in November last year after being developed at a cost of £300m by Marconi.

Pilots 'keeping occupied'

They will fly over the Kosovo battlefields to provide "real time" video pictures of targets. In the meantime, the pilots at the base are keeping occupied with reading and writing home, as well as enjoying the contents of Easter packages sent by relatives. An Easter service was held at the base on Sunday by Padre Jonathan Chaffey. He has been with the squadron on previous trips abroad, including their posting to the Gulf.
Lightning beat Caps in OT, take 2-0 series lead

Washington, DC (Sports Network) - Vincent Lecavalier's second goal of the game came at 6:19 of overtime to lift Tampa Bay over Washington, 3-2, in Game 2 of this Eastern Conference semifinal. Randy Jones took advantage of a slow Capitals shift change, flipping a pass from deep inside his own zone up to Teddy Purcell at the Washington blue line. Purcell was free down the left wing, then fed Lecavalier in front for a shot that was chipped over Michal Neuvirth for the winner. "It means a lot. Once again they played a great game," Lecavalier said when asked if it was important to snag two games on the road to begin this series. "They came after us, but 'Rolie' made some huge saves all night. Even in overtime, they played pretty solid. We took advantage of our chances." Martin St. Louis added a goal and one assist for the Lightning, who won both games in Washington and can put a stranglehold on the series with Games 3 and 4 this week in Florida. Purcell collected two assists and Dwayne Roloson stopped 35-of-37 shots as Tampa posted its fifth consecutive win in these playoffs. Brooks Laich and Alex Ovechkin tallied for the Capitals, while Neuvirth made just 20 stops in defeat. "The series is not over. We're going to go there and we're going to have to win two games and it's going to be hard. We have to win," Ovechkin said defiantly. After Washington failed to click on three straight power plays, Tampa used its second chance with the advantage to take a 1-0 lead inside the final minute of the first. Stationed between the bottom of the circles, St. Louis fed Lecavalier near the right faceoff dot, and his rising slap shot sailed past traffic and over Neuvirth's glove inside the right post. Three more Capitals power plays fell by the wayside in the second until the game-tying score with 5:08 left. 
Nicklas Backstrom fired from 15 feet, and though Roloson got his blocker up to deflect the shot, Laich managed to bat the puck down and across the goal line. "The most important penalty killer is your goaltender," noted Bolts head coach Guy Boucher. "We also got lucky. Goaltending and luck had a lot to do with this." St. Louis gave Tampa a 2-1 edge at 7:35 of the third period, when his centering feed from near the goal line on the left side caromed off Capitals defenseman Mike Green's skate and in. Neuvirth went to the bench for an extra skater as the clock ticked under two minutes left, and Ovechkin chipped a loose puck under the crossbar with 1:08 to go. "I thought we had the momentum, quite frankly, for about 45 minutes of this game," said Caps head coach Bruce Boudreau. "I felt very comfortable going into overtime."

Game Notes: This is the first time in franchise history that the Lightning have won the first two games in a series on the road...Washington's power play was 0-for-6 in the game and has come up empty in 11 chances through the first two contests...The Lightning were without forward Simon Gagne and defenseman Pavel Kubina. Both were listed as day-to-day, with Blair Jones and Randy Jones filling the roster spots accordingly...Caps forward Mike Knuble returned to the lineup after a three-game absence with an upper-body injury believed to be a hand issue...The Capitals last dropped the first two games of a playoff series at home against the Rangers in 2009, but turned a 1-3 deficit into a seven-game victory in the East quarterfinals.
Raven Radio Andre Rieu is one of the premier radio stations of the UK, targeted at content from the UK. For those who love the culture and musical passion of the UK, and who love to listen to its cultural music and musical heritage, Raven Radio Andre Rieu is the perfect station. Raven Radio Andre Rieu's official website address is www.raven-radio.com/rieu
PowerCLI is the best tool for automating management and configuration of VMware vSphere

Following the last videos showing an introduction to PowerCLI and reporting in PowerCLI, there is now a new video that takes you through managing the virtual machine lifecycle with PowerCLI. These videos are a fantastic, quick way to learn PowerCLI tips and tricks. Administrators are no longer responsible for managing a handful of servers; they now have hundreds or even thousands to maintain. The automation capabilities of PowerCLI allow you to manage the lifecycle of a single virtual machine, or a large number of virtual machines, by using custom scripts. Every stage of the virtual machine lifecycle, from creation and modification to migration and deletion, can be automated with just a few lines of PowerCLI code. What's more, the number of virtual machines that you are managing has no significant impact on the complexity of the PowerCLI scripts that you need to use. Watch the video

One of the great new features introduced in vSphere 5.1 was the ability to export and import the configuration of your vSphere Distributed Switch (VDS) and port groups to a file. This gives you a quick restore method in case of issues or misconfigurations and also allows you to copy the entire VDS or port group configuration to a new VDS. This feature is detailed by this VMware KB and is available via the vSphere Web Client; below you can see how we would do this via the web client:

Exporting the configuration with PowerCLI

With the introduction of the VDS cmdlets in PowerCLI 5.1 R2 we can also automate this process using the Export-VDSwitch and
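The export/restore workflow described above can be sketched roughly as follows. This is a sketch rather than the post's actual script: it assumes an existing Connect-VIServer session, and the switch name, datacenter name and backup paths are purely illustrative; the Export-VDSwitch and New-VDSwitch cmdlets are the ones introduced with the PowerCLI 5.1 R2 distributed switch support (it cannot run standalone without a live vCenter).

```powershell
# Assumes a prior connection, e.g.:
# Connect-VIServer -Server vcenter.example.com

# Export the configuration of an existing distributed switch to a backup file.
$vds = Get-VDSwitch -Name "dvSwitch01"
Export-VDSwitch -VDSwitch $vds -Description "Backup before change" -Destination "C:\backups\dvSwitch01.zip"

# Later, restore (or clone) the switch from that backup into a datacenter.
$dc = Get-Datacenter -Name "DC01"
New-VDSwitch -Name "dvSwitch01-restored" -Location $dc -BackupPath "C:\backups\dvSwitch01.zip"
```

The same pattern applies at the port-group level with Export-VDPortGroup, which is the companion cmdlet in the same release.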
Not now? Yes, now. Quick PSA, by way of one of my friends who actually knows something* about this: When companies publish software updates and security patches, hackers (not the good kind) can run a diff on the old software versus the new (patched) software and see just what hole the company was patching. They can then take advantage of that knowledge and attack computers that haven’t been updated yet; aka, the systems where someone has pressed “Not Now” instead of “Continue.”
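The patch-diffing step is easy to demonstrate. A minimal sketch using Python's difflib, where the "old" and "patched" versions of a password check are made-up stand-ins for real vendor code (an attacker would diff decompiled binaries the same way):

```python
import difflib

# Hypothetical pre-patch code: a timing-unsafe string comparison.
old = """\
def check_password(supplied, stored):
    return supplied == stored
""".splitlines(keepends=True)

# Hypothetical post-patch code: the vendor switched to a constant-time compare.
patched = """\
def check_password(supplied, stored):
    import hmac
    return hmac.compare_digest(supplied, stored)
""".splitlines(keepends=True)

# The unified diff pinpoints exactly what was fixed -- and therefore
# what is still exploitable on machines that clicked "Not Now".
diff = list(difflib.unified_diff(old, patched, fromfile="old.py", tofile="patched.py"))
print("".join(diff), end="")
```

The removed line (prefixed with `-`) tells the attacker precisely which construct was vulnerable, which is why unpatched systems become easier targets the moment a patch ships.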
Izhar Gafni is reinventing the two-wheeler. The Israeli mechanical engineer wowed the opening-day audience Wednesday at Moses Znaimer’s ideacity with his 28-pound (less than 13 kilograms) cardboard bike. Waterproof, rust-proof, made of renewable and recycled resources such as paper, plastic bottles and car tires, and able to support a rider 20 times its weight, Gafni’s invention costs about $9 to build. “If it’s cheap, it’s affordable,” Gafni told the Star. “The only issue is to make it have a softer ride,” he continued, adding that he expects that, along with a seat height adjustment mechanism, a shock absorber system will be in place within the year. Gafni’s presentation at the three-day conference was part of an afternoon focussed on transportation innovation. He followed a lineup of whiz-bang speakers who showed off the-future-is-here flying cars and superfast evacuated tube travel, sold as “space travel on Earth.” The cardboard bike is much more down-to-earth, and accessible. Gafni’s hope is not only to have it sold all over the world — so that, for example, children in remote areas of Africa can get to school — but also to get it manufactured where it’s sold. An assembly-line expert, he believes that local factories could be established that would provide both jobs and the means to get to them for workers. He maintains the production process is fairly simple and has a low carbon footprint. “There are no big ovens, not a lot of heat that will chew up a lot of energy,” he explained. Gafni got the idea for the bike — which is not yet named — in 2008, while working on a pomegranate peeling and extraction machine, a prize-winning invention that made pomegranate juice commercially viable. So far, it has won a 2013 Invention Award from Popular Science magazine and has been recognized by CNN, as well as a number of design companies, for its potential to revolutionize transportation.
His next move is to reinvent the wheelchair, to make a paper version so that more people can afford better mobility devices. Moses Znaimer’s 14th annual ideacity conference continues through Friday at Koerner Hall, in the Royal Conservatory of Music complex at 273 Bloor St. W. Pass information and talks can be found at www.ideacityonline.com
// Hint: Action is triggered by user interaction, network request, ...
import {Injectable} from '@angular/core';
import {NgRedux} from '@angular-redux/store';
import {RootState} from '../store';
import {CalendarDay} from '../../models/calendar-day';
import {CalendarDaysService} from '../../services/calendar-days.service';
import {CalendarType} from './settings-actions';
import {addMonths, addWeeks, subMonths, subWeeks} from 'date-fns';

export enum PeriodChange {
  add,
  subtract
}

@Injectable()
export class CalendarActions {
  static BUILD_DAYS = 'BUILD_DAYS';
  static RESET_CALENDAR_STORE = 'RESET_CALENDAR_STORE';

  constructor(private ngRedux: NgRedux<RootState>,
              private calendarService: CalendarDaysService) {
  }

  public buildDays(date: Date, calendarType: CalendarType = CalendarType.Month) {
    this.calendarService.buildDaysAsync(date, calendarType)
      .then(days => {
        this.ngRedux.dispatch({type: CalendarActions.BUILD_DAYS, payload: days, date: date});
      });
  }

  public navigate(date: Date, change: PeriodChange, calendarType: CalendarType) {
    switch (calendarType) {
      case CalendarType.Month:
        switch (change) {
          case PeriodChange.add:
            date = addMonths(date, 1);
            break;
          case PeriodChange.subtract:
            date = subMonths(date, 1);
            break;
        }
        break;
      case CalendarType.Week:
      case CalendarType.WorkWeek:
        switch (change) {
          case PeriodChange.add:
            date = addWeeks(date, 1);
            break;
          case PeriodChange.subtract:
            date = subWeeks(date, 1);
            break;
        }
        break;
    }
    this.buildDays(date, calendarType);
  }

  // navigates to a date
  public navigateToDate(date: Date, calendarType: CalendarType) {
    this.buildDays(date, calendarType);
  }

  public resetStore() {
    this.ngRedux.dispatch({type: CalendarActions.RESET_CALENDAR_STORE});
  }
}
Adventure Lifestyle Photographer working and playing in the Canadian Rockies

Wednesday, November 10, 2010

Oil Slick | Adventure Lifestyle Photographer

Tonight I had the privilege to photograph a friend of mine from town. Mike is a mechanic at the local garage and a member of the "Diablos" classic car club. I have had this idea to shoot Mike for about a year now, and just 2 hours before the session I had this wild idea pop into my head. Mike doesn't really drink anymore, but what do you get a gear head to do in studio that is going to shock my audience? Naturally I imagined him drinking from a quart of QuakerState 10-40 motor oil. So, without further ado, here is Mike! (ps. butter flavoured pancake syrup looks just like fresh motor oil...)
COBBLE HILL - Neighbors are upset over shingles flying off a roof and plenty of squirrels and pigeons in a vacated Cobble Hill home. The city slapped a vacate order last week on 149 Kane St., which has landmark status. Residents say it's been a long time coming. "With the roof open and water coming through the building, what's on the sides of the home can be affected," said neighbor Anthony DiGuglielmo. According to residents, the owner, who reportedly lived in the rundown brownstone as recently as last week, often got defensive when they suggested that she make repairs.
Up to 30% of the US population are now affected by nasal allergy, resulting in a total (direct and indirect) cost of approximately $14.6 billion per year. Seasonal allergic rhinitis (AR), due to exposure to airborne pollen and molds, is a major component of this problem in the US and worldwide. The limitations of current pharmaceutical and specific immunotherapy for AR include their side-effects and the time, effort and cost associated with their use. The goal of this Phase I project is to explore an entirely new paradigm for preventing seasonal AR, employing an antibody-based, nasal prophylactic to be used only when patients know they will be exposed to large quantities of specific allergens. This project builds upon our NIAID-funded experimental model system of mountain cedar allergy. Using this model, we discovered that ~90% of patients' IgE against mountain cedar pollen binds to conformational epitopes of a single allergen, Jun a 1, and that a unique monoclonal antibody (E58) causes extensive loss of these conformational epitopes on Jun a 1. Further, the binding of E58 extensively reduces the release of allergic mediators from mast cells sensitized with human IgE and challenged with Jun a 1. We have therefore developed a multidisciplinary, academic-corporate team of clinical and basic investigators with expertise to determine the feasibility of developing E58 as a novel therapeutic to benefit seasonal pollinosis sufferers. This project is within an NIAID STTR Area of Interest. The goal of this Phase I project is to determine whether nasal instillation of bioengineered, recombinant E58 (rE58) will prevent the acute nasal airway obstruction and other signs of allergic inflammation. The Specific Aims are to: 1) Complete the optimization of a recombinant E58 antibody (rE58) for enhanced avidity, expression level and down-modulatory activity on Jun a 1 reactivity, and express rE58 as a univalent, partially humanized antibody. 
2) Test in our mouse model of cedar pollinosis the efficacy of intranasal instillations of varying doses of rE58 antibody in preventing or substantially reducing the allergic response to subsequent nasal exposure to mountain cedar pollen. Positive Phase I results will provide "proof of concept" that nasal instillation of optimized antibodies can provide an effective "barrier" between an inhaled allergen and a patient's IgE antibodies in the nasal mucosa. Further, these results will lay the groundwork for Phase II studies, in which we will: a) produce and test potential commercial formulations and methods of delivery of rE58 antibody to optimize rapid onset and prolonged duration of the effect; b) validate efficacy and safety studies in GLP-compliant facilities; and c) initiate FDA discussions and submission of an IND application to initiate clinical trials. The ultimate success of this product may lead to similar anti-allergen based therapeutics for other causes of seasonal AR.

PUBLIC HEALTH RELEVANCE: Allergic rhinitis (AR) has become one of the most common chronic diseases in industrialized countries, yet despite the billions of dollars spent each year on therapeutics, many symptoms are not fully relieved. We have discovered that a specific monoclonal antibody, rE58, reduces allergen-IgE interactions and propose that instillation of this antibody to the nasal mucosa may prevent seasonal pollinosis-induced AR. This project will test the feasibility of this new paradigm for treating seasonal nasal allergy and, if successful, may lead to a new class of anti-allergy antibody drugs.
Cable & Wireless fiscal-year net profit up 3%

Sarah Turner

LONDON (MarketWatch) -- Fixed-line telecom provider Cable & Wireless (UK:CW) said Thursday that its fiscal-year net profit rose to 226 million pounds, from 220 million pounds at the same point a year ago. Adjusted operating profit climbed 36% to 822 million pounds, just exceeding analyst expectations for an 813 million pound profit. The firm said it's expecting EBITDA of around 1.0 billion pounds in fiscal 2010. "Each of our businesses has produced another strong set of results," said Chairman Richard Lapthorne. It hiked its dividend 13% to 8.50 pence for the year.
The Napa Valley, internationally known for its fine wines, exciting restaurants and world-class resorts, is home to 130,000 residents who share a strong sense of community and a legacy of preserving and protecting our rich agricultural heritage. Located in the heart of California’s preeminent wine region, the Napa Valley is also part of the dynamic San Francisco Bay Metropolitan Area. With its sunny Mediterranean climate and proximity to the mountains and ocean, the Valley offers residents easy access to virtually unlimited shopping, dining, cultural and recreational opportunities. The Napa Valley’s strategic location, natural and cultural resources, history of responsible land use planning and attractive quality of life provide the ideal mix of small town living and big city amenities.

COUNTY OF NAPA AS AN EMPLOYER

The County of Napa is a highly respected employer within the local community as well as throughout the region. We offer rewarding and challenging work, flexible hours, competitive salaries, a comprehensive benefits package and tremendous opportunities for career growth. At the County of Napa, we truly value our employees and are committed to diversity in our family-oriented environment. This is why we are the Employer of Choice for more than 1,300 employees. As an organization, the County is dedicated to improving the lives of our citizens and reflecting the best of the community’s values: integrity, accountability, and service.

THE POSITION

The current vacancy is a full-time permanent position within the Pollution Prevention Team. This recruitment process will establish three separate eligibility lists for Environmental Health Specialists with different areas of expertise: Pollution Prevention, Land Use, and Consumer Protection. Please answer the corresponding supplemental questions for the area of expertise for which you are applying.

TEAM DESCRIPTIONS

Pollution Prevention Team: is the Certified Unified Program Agency for Napa County. 
It is responsible for implementation of the Unified Programs (Underground Storage Tank Program, Aboveground Petroleum Storage Act, Hazardous Materials Business Plan Program, Hazardous Waste Program and California Accidental Release Prevention Program), and the County’s storm water ordinance.

Land Use Team: regulates the construction and operation of on-site wastewater treatment systems and the siting and construction of domestic water wells.

Consumer Protection Team: regulates the construction and operation of food facilities, public water systems, public bathing places, and compliance with the County’s storm water ordinance.

The Recruitment Process

1. This recruitment closes on Friday, January 30, 2015 at 5:00 PST.
2. Application and Supplemental Questionnaire reviews for minimum qualifications will tentatively take place the week of February 2, 2015.
3. Oral Panel Interviews will tentatively take place the week of February 16, 2015.

Only the most qualified candidates, as a result of the supplemental questions, will be invited to interview. Human Resources reserves the right to change the recruitment process at any time.

EXAMPLE OF DUTIES

Conduct inspections and investigations of food handling and drinking establishments, sewage disposal plants, garbage and waste disposal sites, public schools, public and semipublic buildings, hazardous materials facilities, underground storage tank facilities and swimming pools; give instructions for correction of deficiencies and violations; obtain compliance with laws, regulations and ordinances governing environmental health and safety; Investigate complaints filed by the public involving environmental health and safety issues; make recommendations and give instructions for correction of deficiencies and violations; issue permits or licenses when conditions warrant; Advise food handlers on methods of dish and glass sanitation and personal hygiene. 
Conduct inspections of water systems and private wells to ensure potable water sources in the community served; Take water samples of small water systems and public swimming pools for bacterial analysis by laboratory personnel; Collect samples of beverage, food and other matter for epidemiological studies and determine results; Inspect labor camps and other group housing to ensure Health and Safety Code compliance, and to assure compliance with the Employee Housing Act; Participate in infectious disease control investigations; Instruct individuals and communities in methods of control of animal carriers of plague, rabies and other transmitted diseases; Inspect new land developments and subdivisions and recommend suitable water and sewage installations; review and approve federal housing drawings and specifications, insofar as sewage disposal and water installations are concerned; Conduct detailed and complex investigations and inspections of hazardous waste facilities including underground storage tanks; obtain compliance with laws and ordinances governing hazardous materials; Evaluate, inspect and regulate hazardous materials during transfer, storage and disposal; Participate as an emergency response team member during hazardous materials incidents within all areas of the County; determine containment, protection and clean-up procedures, coordinating with appropriate federal, state and local agencies; Conduct preliminary surveys and make evaluations of occupational work sites; Conduct studies of the effects of hazardous solid, liquid and air contaminant waste management proposals on human health and the environment; Collect samples of soil, liquid and air to evaluate the presence of hazardous contaminants and interpret the results; Work with other local, state and federal agencies involved in determining the effects of hazardous materials on the environment; Coordinate activities with other agencies and groups; inform interested parties of recent developments 
in the programs; Conduct joint inspections with other agencies such as Fire, Building and OSHA, as required; Collect, analyze and interpret hazardous material data; Conduct chemical testing and sampling; Confer with and advise engineers, architects, builders, developers and others on matters related to environmental health and safety; Advise and instruct individuals, business concerns, institutions and public officials in the prevention and correction of health hazards, vector control and other conditions contrary to good environmental health and safety practices; Interpret environmental health and safety laws and regulations to the public; Prepare correspondence; Write reports and findings of investigations; Answer questions and provide the public with information concerning environmental health and safety requirements, both in the office and in the field; May address community groups on the work of the department; Prepare reports and legal complaints; Appear in court as a witness in matters relating to actions taken by the department; And perform related duties as assigned. TYPICAL QUALIFICATIONS Knowledge of: State, federal and local laws and regulations dealing with environmental health and safety. Sanitation requirements and procedures. Principles and practices of environmental health and safety. Penal Code sections related to public nuisances. Methods and procedures used in the inspection, investigation and correction of unsanitary and hazardous conditions. Proper and effective enforcement techniques and the appropriate response to various violations. Causes of the spread of disease. Basic building construction principles. Basic practices in the storage, transportation and disposal of hazardous materials. Statistical methods, survey techniques and analysis. Ability to: Understand, interpret and apply laws and regulations. 
Conduct environmental health and safety inspections and investigations using the proper tools and equipment. Analyze sanitary conditions and make recommendations for their correction. Identify potential problems or violations and take corrective action to eliminate that potential. EXPERIENCE AND EDUCATION To qualify for this classification, an individual must possess any combination of experience and education that would likely produce the required knowledge and abilities. A desirable combination is: Experience: One year of responsible experience in the field of environmental health and safety regulation and enforcement. Education: Equivalent to graduation from an accredited college or university with major course work in Public Health, Environmental Health or another basic science, including courses in land use, solid waste management, occupational health and/or toxic substances. License or Certificate: Possession of a valid Certificate of Registration as an Environmental Health Specialist issued by the State of California. Possession of a valid California Driver's License. ADA ACCOMMODATION Applicants requiring accommodation during the application and/or selection process pursuant to the Americans with Disabilities Act (ADA) should contact County of Napa Human Resources at (707) 253-4303. THE LAW: AB 300 (Ma), the Safe Body Art Act, was signed into law on October 9, 2011. This law applies to all body art, including piercing, tattoos and permanent cosmetics, but there are limited requirements for mechanical ear piercing. It also applies to permanent, temporary and mobile body art facilities. The law becomes effective on January 1, 2012, but the compliance date for registration and facility permitting is July 1, 2012. THE TRAINING: The training sessions will be facilitated by CAEHA and other sponsors and will begin in January 2012. Each training session will consist of a full day of training offered at several locations throughout California. 
The training sessions are designed to meet the training requirements in AB 300 for practitioner registration, with instructors from public environmental health and industry. The topics covered during the course are: an overview of the law, practitioner registration requirements, bloodborne pathogen exposure control training, safe body art practices, permanent body art facility requirements, temporary body art facilities, enforcement, and mechanical stud-and-clasp ear piercing. The training will be followed by a Q&A session, and certificates will be issued at the end of the day. Question: I understand that the Webinars have already been completed... can I still review them, take the quizzes and receive Continuing Education Credits? Yes: watch the video or videos below, take the test, complete the evaluation form and pay only $25 (each), and you will receive a certificate. Dive in and expand your knowledge by attending four sessions throughout the summer. Learn about state and local initiatives and goals. Check out a showcase of successful and innovative local IT projects. Embrace sound data management principles consistent with the CCDEH goals across the state. This year, compelling presentations are just a few clicks away! In an effort to make the CCDEH Data Summit as accessible as possible, this year the CCDEH IT Committee chose to use a virtual format. By design, this allows many stakeholders from across the state to present and participate in segments without travel. Get ready to jump online and catch the web presentations live! Sessions will involve various technical materials but are applicable for directors, program managers, and IT professionals working within Environmental Health. Webinars are FREE!* *Continuing Education Contact Hours: To receive a Certificate of Completion for REHS Continuing Education Contact Hours, a small processing fee is required. Each of the following sessions will offer 1 continuing education unit. The processing fee is $25.00 for each session. 
You will also be required to fill out a survey questionnaire (see table below) in order to receive continuing education units. Educational objectives and description of class: To provide an overview of how social media and web-based tools can be used in the event of a disaster, and to review a case study of the County of San Diego's response to the Southern California power outage in 2011. Anticipated educational outcomes for each class: Explore different means of communication that can be used when traditional methods are not available. Presenter Bio: Arleen Lim is a CUPA inspector in the Hazardous Materials Division for the County of San Diego. She received her bachelor's degree in Public Health Microbiology from SDSU and has a diverse background in scientific research and development, regulatory inspections and enforcement, as well as hands-on and web-based training. In addition to CUPA inspections, Arleen is a member of the County Department of Environmental Health's emergency management team. She was most recently involved in the Southern California power outage emergency response in September 2011, and she will share the knowledge gained from her experience during that event. The course will survey the changing modalities of information transfer that affect the movement of data among Local Primacy Agencies (LPAs), the California Department of Public Health (CDPH), and the federal Environmental Protection Agency (EPA). The rationale for the changes will be reviewed and placed in the context of evolving information priorities, including water conservation planning, water costs, and infrastructure needs. A survey of present technology will set the stage for a discussion of how the technology for information transfer is expected to change and how that will impact the different entities involved. 
Anticipated educational outcomes for each class: Participants in the course should obtain a thorough understanding of the current and proposed future infrastructure, with emphasis on their individual roles and responsibilities in the process. This understanding will allow the affected entities to plan, adapt their systems and processes to meet changing information needs, and substantially increase compliance rates in their information environment. Presenter Bio: Paul Collins has been the manager of the Office of Information Systems (OIS) within the Division of Drinking Water and Environmental Management (DDWEM) of the California Department of Public Health (CDPH) since the mid-1990s. His current emphasis is on the use of environmental data to solve public health and environmental issues across organizational boundaries. This session shares Solano County's experience with implementing handheld devices during field inspections, working with data vendors, and developing the Consumer Protection Tablet PC food inspection application and the Consumer Protection inspection database. Anticipated educational outcomes for each class: A description of the oversight and testing of multiple types of handheld Tablet PCs to determine which staff liked best for field use; ergonomic considerations; user friendliness; staff training, involvement and experiences; equipment considerations (pros and cons of tablets, printers, means of transport and usage, and obstacles encountered); benefits of data integration; database challenges; and the role of the database in standardization efforts as a result of Tablet PC use. Presenter Bios: Ricardo M. Serrano, MS, REHS: Ricardo has more than 22 years of experience in several environmental health programs. 
He currently supervises a merged section (the Consumer Protection and Technical Services Sections) of the Environmental Health Division, Resource Management Department of Solano County, comprising 10 registered environmental health specialists. This section is involved in Food Protection, Recreational Health, Solid Waste, Housing and Institutions, Vector Control, Childhood Lead Poisoning Prevention, and the new Safe Body Art program. Ricardo is currently analyzing and reviewing data obtained from the field use of Tablet PCs as part of the standardization process in the Food Protection program, with the intent of measuring program improvements and consistency among staff. Ricardo was in charge of the biosolids program that received the 2008 CCDEH Excellence in Environmental Health Award as part of the Comprehensive Program Addressing Community Concerns in the Land Spread Application of Biosolids. Ricardo was responsible for the implementation of the 24/7 Odor-Nuisance Response Protocol, including the development of an interactive website complaint form and the organization of stakeholder meetings with residents near solid waste facilities to address community concerns. Ricardo has also worked in the Hazardous Materials/Remediation-Cleanup and Consumer Protection sections in Solano County. Ricardo holds a Bachelor's degree in Biological Science from Trujillo National University in Peru and earned a Master of Science degree in Environmental Management from the University of San Francisco. Joyce H. Benefield: Joyce is a Senior Environmental Health Specialist with more than 15 years of experience in environmental health programs including food protection, recreational health, liquid waste, land use, housing and institutions, and water well programs. She is currently the staff trainer and manager of food program improvements, with extensive experience in food safety inspections and auditing in the public and private sectors. 
Joyce is leading the standardization efforts in the Consumer Protection Program of Solano County, conducting staff field assessments and providing structured training in the requirements of CalCode. Joyce worked with the data vendor to develop the Consumer Protection Tablet PC food inspection application, and she oversaw the testing of multiple types of handheld Tablet PCs, assessing field usage and suitability. Joyce holds a Bachelor of Science degree in Environmental/Occupational Health from Cal State University, Northridge. Educational objectives and description of class: To provide an update on the use of automated data exchange to fulfill local, state and federal reporting requirements, and on the implementation of electronic reporting for regulated businesses statewide. Anticipated educational outcomes for each class: Participants will learn about the methods being implemented by state and local agencies to meet the 2013 mandate for electronic reporting. Additionally, participants will learn about the newest technologies being planned to make use of the collected information. Presenter Bios: Don Johnson: Don Johnson was appointed Assistant Secretary for Local Program Coordination and Emergency Response for the California Environmental Protection Agency in October 1999. He is responsible for implementation of the "Unified Hazardous Waste and Hazardous Materials Management Regulatory Program," or Unified Program; coordination of disaster and emergency response activities for Cal/EPA; and environmental enforcement program development; and he represents the Secretary of Cal/EPA on the Site Designation Committee for hazardous material site cleanup. Prior to assuming his responsibilities within Cal/EPA, he held a number of supervising and management positions within state and local government environmental programs, including the Department of Toxic Substances Control and Riverside, Tulare and San Diego Counties. 
In Riverside County he managed environmental health programs for the Desert Health District, in Tulare County he was the Director of Environmental Health, and in San Diego he supervised county hazardous materials programs. Don Johnson holds a baccalaureate degree in Environmental Health Science from California State University, San Diego and a Master's degree in Public Administration from Golden Gate University. He is registered by the State of California as an Environmental Health Specialist. Jim Bohon: Jim Bohon has been in government service for over 39 years: first with the U.S. Navy for 11 years, then in county government for 6 years, and now in California state government for 22 years. He is the Program Manager for the State's Unified Program, which establishes program standards for hazardous materials, hazardous waste and underground storage tank management and oversees more than 100 delegated local government agencies. Jim graduated from the University of Colorado, Boulder with a Bachelor of Arts in Physics and from Colorado State University, Fort Collins with a Bachelor of Science in Mathematics and teaching credentials in science and mathematics. During his six years with county government and his first three years of state service, Jim was an emergency manager and responded to dozens of locally and state declared emergencies. Jim also has an extensive background in information technology, and he is currently the project manager for many of Cal/EPA's environmental data exchange projects. It's easy! Go to CCDEH.com, register and pay online here, or use the form below. You must view the webinar for the entire session in order to receive credit. If you are viewing a pre-recorded session, you will be required to fill out a survey questionnaire in order to receive continuing education units. What does the REHS CECH cost? The REHS CECH is offered for a low processing fee of $25.00 for each session. 
Dive in and expand your knowledge by attending four Webinar sessions throughout this summer. Learn about state and local initiatives and goals. Check out a showcase of successful and innovative local IT projects. Embrace sound data management principles consistent with the CCDEH goals across the state. These compelling presentations are only clicks away! To make the CCDEH Data Summit as accessible as possible, the CCDEH IT Committee chose to use a virtual format. By design, this allows stakeholders from across the state to present and participate in segments without travel, which is ideal for leaders with limited schedules. Get ready to jump online and catch the web presentations live! Sessions will involve various technical materials but are applicable for Directors of Environmental Health, Environmental Health Program Managers, Supervisors, Program Leads, and IT professionals working within Environmental Health. We look forward to seeing you at the scheduled events (see below for Webinar links). The webinar session series is open to all Environmental Health Directors, Program Managers, Supervisors, Program Leads, and IT Professionals within the industry. These informative sessions are sure to enhance your agency's knowledge, impact, and use of effective IT resources within your regulatory programs. • Learn about state and local initiatives and goals within environmental health • Grasp successful and innovative IT projects • Embrace powerful data management principles. The Webinars are FREE! REHS Continuing Education Contact Hours (CECH) will be offered for a small processing fee of $25 per session. Webinar sign-up information and CECH credit request forms can be found below. Access Code: 274-413-037. AudioPIN: shown after joining the webinar. Space is limited, so please register for the session early. We ask that you share computers to provide everyone with access to the webinar. 
DATE: August 27, 2014, 9:00 AM to 10:00 AM. TOPIC: A data-driven approach for LAMP/OWTS compliance. SPEAKER: Dilan Roe, Alameda County Environmental Health. Please register for this session at https://attendee.gotowebinar.com/register/6324897018292207105; after registering, you will receive a confirmation email containing information about joining the webinar. Meeting Number: (415) 655-0053. Access Code: 274-413-037. AudioPIN: shown after joining the webinar. Space is limited, so please register for the session early. We ask that you share computers to provide everyone with access to the webinar. Manager of the Year Award (http://www.ccdeh.com/resources/announcements/63-manager-of-the-year-award-for-2009): Get all of the information about submitting an application for Manager of the Year before the March 27, 2015 deadline.
Extracted Text. The following text was automatically extracted from the image on this page using optical character recognition software (transcription cleaned up; the affiant's original wording is preserved): AFFIDAVIT IN ANY FACT. THE STATE OF TEXAS, COUNTY OF DALLAS. BEFORE ME, Mary Rattan, Notary Public in and for said County, State of Texas, on this day personally appeared Billy Nolan Lovelady, w/m/26, of 7722 Hume Drive, Dallas, Texas, who, after being by me duly sworn, on oath deposes and says: I work at Texas School Book Depository, 411 Elm. On Friday, November 22, 1963, I worked on the 6th floor along with Danny Arce, Jack Dougherty, Bill Shelley and Charles Givens. When the President came by, Bill Shelley and I was standing on the steps in front of the building where I work. After he had passed and was about 50 yards past us I heard three shots. There was a slight pause after the first shot, then the next two was right close together. I could not tell where the shots come from but sounded like they were across the street from us. However, that could have been caused by the echo. After it was over we went back into the building and I took some police officers up to search the building. I did not see anyone around the building that was not supposed to be there. Our lunch period is from 12 to 12:45 pm. All of us had left the 6th floor to see the President. [signature] SUBSCRIBED AND SWORN TO BEFORE ME THIS 22 DAY OF November, 1963. Mary Rattan, Notary Public, Dallas County, Texas. Related Items: Handwritten affidavit by Billy Nolan Lovelady. Lovelady was working on the sixth floor on the morning of November 22nd. When the President's car passed by, he and Shelley were standing outside the building. The car was about fifty yards away when shots were heard. Lovelady went back inside the building and escorted some police officers inside. He did not see anyone in the building who was not supposed to be there.
Ask HN: I found a security issue with my bank and they don't care - jason_slack I found out that when answering my bank's security question they allow it to be incorrect by one character.<p>Example: If the answer is: washington I could type something like: woshington<p>and it works.<p>I contacted my bank and was told that it was a &quot;feature&quot; in case users are typing fast and just fat-finger a key.<p>To me this is an issue. None of my other banks and websites allow this &quot;feature&quot;.<p>Can I get thoughts from others? Let it go? Is it not really an issue as far as other people are concerned? ====== DanielStraight If you're relying on the extra protection of an exact match over a match with a Levenshtein distance of 1, your security question isn't secure to begin with. Proper management of security questions is pretty straightforward: Set them to a long random string and keep track of them as if they were another password. If you do that, the fuzzy matching shouldn't be a problem at all. If you're using real answers that are easy to obtain from knowing anything about you, exact matching isn't helping. ~~~ jason_slack This is a good idea. I'll try this approach. ------ bifrost I'm going to guess it cut down on their customer service tickets, thats kinda a weird one though. I could see allowing case insensitivity, but maybe not spelling...
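The behavior described in the thread — accepting "woshington" for "washington" — is exactly a match within Levenshtein distance 1, as DanielStraight notes. A minimal Python sketch of how such a check might work (function names are illustrative, not the bank's actual code):

```python
def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance (insert/delete/substitute)."""
    if len(a) < len(b):
        a, b = b, a  # ensure a is the longer string
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                  # deletion
                            curr[j - 1] + 1,              # insertion
                            prev[j - 1] + (ca != cb)))    # substitution
        prev = curr
    return prev[-1]

def fuzzy_answer_match(stored: str, typed: str, tolerance: int = 1) -> bool:
    """Accept a security answer if it is within `tolerance` edits of the stored one."""
    return levenshtein(stored.lower(), typed.lower()) <= tolerance

print(fuzzy_answer_match("washington", "woshington"))  # True: one substitution away
print(fuzzy_answer_match("washington", "wishingtan"))  # False: two edits away
```

This also illustrates DanielStraight's point: with a tolerance of 1, every guess effectively covers all of its one-edit neighbors, so a short dictionary-word answer loses even more entropy than it already had.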
Devendra Fadnavis has clarified that suggestions of the farmer loan waiver scheme being a "scam" are baseless, as not a single rupee will exchange hands. The Maharashtra government has looked to tackle the farmers' loan waiver scheme in a manner that brings benefits to the people concerned while ensuring public funds are not wasted or diverted to fraudsters. (Photo: PTI) However, it has been alleged that a huge number of fake Aadhaar cards are floating around and that the loan waiver benefits might be cornered by them. Chief Minister Devendra Fadnavis has dismissed these suggestions, but that the government is nevertheless on the lookout for this fraud was made clear in a big revelation by Maharashtra revenue minister Chandrakant Patil. As per the procedure, the Maharashtra government has been preparing a "green list" of all the eligible farmers who will benefit from the farm loan waiver scheme. The loan amount will be credited to the banks concerned, which will then write off the farmers' outstanding dues according to the norms. Writing off the loans will enable farmers to apply for fresh agricultural loans. Maharashtra revenue minister Chandrakant Patil has said that at least 10 lakh bank accounts of farmers are fake and will not receive any benefit from the loan waiver, according to PTI. The minister further claimed that most of these accounts were earlier opened by banks or credit societies for illegal transfer of the loan amount. The minister said: "As per the previous assessment, some 89 lakh farmers were expected to benefit from the loan waiver scheme. 
Now, we have realised that some 10 lakh bank accounts of farmers are fake, most of them opened by banks or credit societies for siphoning off the loan amount." The government has found discrepancies in the linking of a single Aadhaar number with several farmers, which has been attributed to errors in feeding data into the software, which is designed to reject duplicate entries. The Chief Minister immediately called an urgent meeting with bankers and I-T Department officials and specified the loopholes in the scheme. Patil also said: "Earlier, such fake accounts were regularly used for availing loans. With the state government's intervention, we want to fix such leakages in our system; hence we have taken such a tough decision. The government is keen on helping genuine farmers." The banks admitted that the data they received from the online registration portal, Aapale Sarkar, varies from the data in their records: the names of some farmers are missing, and some do not match the land size or type of loan. Fadnavis has since directed the banks to check and rectify the problems on an urgent basis.
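The core discrepancy described — one Aadhaar number linked to several farmers' accounts — is a straightforward duplicate-key check. A purely illustrative Python sketch of that kind of check (field names and data are hypothetical, not the state's actual system):

```python
from collections import defaultdict

def find_duplicate_aadhaar(accounts):
    """Group account IDs by Aadhaar number and flag any Aadhaar
    number linked to more than one account."""
    by_aadhaar = defaultdict(set)
    for account_id, aadhaar in accounts:
        by_aadhaar[aadhaar].add(account_id)
    return {a: ids for a, ids in by_aadhaar.items() if len(ids) > 1}

records = [
    ("ACC-001", "1234-5678-9012"),
    ("ACC-002", "1234-5678-9012"),  # same Aadhaar as ACC-001 -> flagged
    ("ACC-003", "9999-0000-1111"),
]
print(find_duplicate_aadhaar(records))
```

In practice such a check would run against the registration portal's records before any waiver amount is credited, which is consistent with the article's point that flagged accounts are excluded rather than paid out.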
Thank you for your interest in the Global Justice XML Data Model (GJXDM). To access the latest available information, please visit the GJXDM Archive. While this initiative is still supported, adoption of the broader successor to the GJXDM, the National Information Exchange Model (NIEM), is highly encouraged.
#
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.

include $(CORE_DEPTH)/coreconf/UNIX.mk

#
# The default implementation strategy for Irix is classic nspr.
#
ifeq ($(USE_PTHREADS),1)
	ifeq ($(USE_N32),1)
		IMPL_STRATEGY = _n32_PTH
	else
		IMPL_STRATEGY = _PTH
	endif
endif

DEFAULT_COMPILER = cc

ifdef NS_USE_GCC
	CC = gcc
	AS = $(CC) -x assembler-with-cpp
	ODD_CFLAGS = -Wall -Wno-format -Wno-switch
	ifdef BUILD_OPT
		OPTIMIZER = -O6
	endif
else
	CC = cc
	CCC = CC
	ODD_CFLAGS = -fullwarn -xansi -woff 1209
	ifdef BUILD_OPT
		ifeq ($(USE_N32),1)
			OPTIMIZER = -O -OPT:Olimit=4000
		else
			OPTIMIZER = -O -Olimit 4000
		endif
	endif

	# For 6.x machines, include this flag
	ifeq (6., $(findstring 6., $(OS_RELEASE)))
		ifeq ($(USE_N32),1)
			ODD_CFLAGS += -n32 -mips3 -exceptions
		else
			ODD_CFLAGS += -32 -multigot
		endif
	else
		ODD_CFLAGS += -xgot
	endif
	ifeq ($(USE_N32),1)
		OS_CFLAGS += -dollar
	endif
endif

ODD_CFLAGS += -DSVR4 -DIRIX

CPU_ARCH = mips

RANLIB = /bin/true

# For purify
# NOTE: should always define _SGI_MP_SOURCE
NOMD_OS_CFLAGS += $(ODD_CFLAGS) -D_SGI_MP_SOURCE

OS_CFLAGS += $(NOMD_OS_CFLAGS)
ifdef USE_MDUPDATE
	OS_CFLAGS += -MDupdate $(DEPENDENCIES)
endif

ifeq ($(USE_N32),1)
	SHLIB_LD_OPTS += -n32 -mips3
endif

MKSHLIB += $(LD) $(SHLIB_LD_OPTS) -shared -soname $(@:$(OBJDIR)/%.so=%.so)

ifdef MAPFILE
# Add LD options to restrict exported symbols to those in the map file
endif
# Change PROCESS to put the mapfile in the correct format for this platform
PROCESS_MAP_FILE = cp $< $@

DSO_LDOPTS = -elf -shared -all

ifdef DSO_BACKEND
	DSO_LDOPTS += -soname $(DSO_NAME)
endif

#
# Revision notes:
#
# In the IRIX compilers prior to version 7.2, -n32 implied -mips3.
# Beginning in the 7.2 compilers, -n32 implies -mips4 when the compiler
# is running on a system with a mips4 CPU (e.g. R8K, R10K).
# We want our code to explicitly be mips3 code, so we now explicitly
# set -mips3 whenever we set -n32.
#
Tech Security: Here's How to Rein in Shadow IT. Accelerate the decision-making process: willingness to work with employees and provide alternatives is the first step, but if IT does not move as fast as the business needs, it will be left behind. Shadow IT has been featured in the headlines for years, often as a massive headache for enterprise IT. Line-of-business managers swiping credit cards to pay for services outside IT's watchful eye created a scare in the industry, as did BYOD and the idea that employees would share corporate data through consumer file-sharing applications. At the same time, the use of consumer-grade collaboration and productivity applications in the SMB market has been seen as a benefit, as small companies that cannot afford expensive enterprise software have gained access to cheap, effective tools. But the sophistication of SMB IT has grown in recent years, and cybersecurity has become an issue that no organization can ignore. The time has come for SMBs to take shadow IT seriously and address the issue in a way that is inclusive of employees and meets the needs of the business. Here are seven tips from Paessler on how best to address shadow IT.
Successful as a team! We offer your company a huge variety of promotional measures and tools: support at in-house exhibitions and customer events; rental of exhibits for your trade show appearance; promotional material for your customers; a solar argumentation kit for all target groups; and sales campaigns in our list of special offers. All this and much more is available in the EWS partner area, including comprehensive technical information for your daily sales activities. Register online to benefit from the many advantages. Feel free to contact us! Our team will be happy to advise you! In addition to the classic marketing tools, we will also provide more support in the field of online marketing in the future. Our planning and communication platform QuickPlan is available from EWS for free integration on your website. The professional web portal helps you win over the visitors to your website and easily generate end-customer leads. Via the integrated project administration tool, we forward to you additional leads for potential new customers who have contacted us or our suppliers by phone, at exhibitions, or online via one of our various websites. In addition, we also team up with multipliers, e.g. energy suppliers, construction companies and financial service providers, to procure more leads for your business. More information here: forwarding end consumer leads.
Midwest Technical Services is owned and operated by Tim Moes (WDØFKC) who has over 40 years of Amateur Radio repair experience. Tim was the Service Manager and Radio Technician at Burghardt Amateur Center in Watertown, SD for 24 years, so you can have a nationally recognized and qualified technician handle any of your service needs. Tim is a licensed Amateur Radio operator, First Class FCC License Holder and a factory trained authorized service technician for Yaesu, Kenwood, and Icom. At Midwest Technical Services you will receive the best repair service available at a very affordable price. Service is provided on all out of warranty equipment and is fully warranted on work performed for 30 days.
public class A {
    public interface Cond {
        boolean cond();
    }

    void test1(int x, Cond c) {
        int i;

        // --- do loops ---
        do {
            if (c.cond()) continue; // BAD
            if (c.cond()) break;
        } while (false);

        do {
            if (c.cond()) continue;
            if (c.cond()) break;
        } while (true);

        do {
            if (c.cond()) continue;
            if (c.cond()) break;
        } while (c.cond());

        // --- while, for loops ---
        while (false) {
            if (c.cond()) continue; // GOOD [never reached; if the condition changed so it was reached, the result would no longer apply]
            if (c.cond()) break;
        }

        for (i = 0; false; i++) {
            if (c.cond()) continue; // GOOD [never reached; if the condition changed so it was reached, the result would no longer apply]
            if (c.cond()) break;
        }

        // --- nested loops ---
        do {
            do {
                if (c.cond()) continue; // BAD
                if (c.cond()) break;
            } while (false);
            if (c.cond()) break;
        } while (true);

        do {
            do {
                if (c.cond()) continue; // GOOD
                if (c.cond()) break;
            } while (true);
        } while (false);

        do {
            switch (x) {
                case 1:
                    // do [1]
                    break; // break out of the switch
                default:
                    // do [2]
                    // break out of the loop entirely, skipping [3]
                    continue; // BAD; labelled break is better
            }
            // do [3]
        } while (false);
    }
}
Your brain’s reward pathways become active during art-making activities like doodling, according to a new Drexel University study. Girija Kaimal, EdD, assistant professor in the College of Nursing and Health Professions, led a team that used fNIRS (functional near-infrared spectroscopy) technology to measure blood flow in the areas of the brain related to rewards while study participants completed a variety of art-making projects. “This shows that there might be inherent pleasure in doing art activities independent of the end results. Sometimes, we tend to be very critical of what we do because we have internalized societal judgements of what is good or bad art and, therefore, who is skilled and who is not,” said Kaimal of the study, which was published in The Arts in Psychotherapy. “We might be reducing or neglecting a simple potential source of rewards perceived by the brain. And this biological proof could potentially challenge some of our assumptions about ourselves.” Examples of the coloring activity. For the study, co-authored by Drexel faculty including Jennifer Nasser, PhD, and Hasan Ayaz, PhD, 26 participants wore fNIRS headbands while they completed three different art activities (each with rest periods between). For three minutes each, the participants colored in a mandala, doodled within or around a circle marked on a paper, and had a free-drawing session. During all three activities, there was a measured increase in blood flow in the brain’s prefrontal cortex, compared to rest periods during which blood flow decreased to normal rates. The prefrontal cortex is related to regulating our thoughts, feelings and actions. It is also related to emotional and motivational systems and is part of the wiring of our brain’s reward circuit. So seeing increased blood flow in these areas likely means a person is experiencing feelings related to being rewarded. There were some distinctions between the activities in the data collected. 
Doodling in or around the circle had the highest average measured blood flow increase in the reward pathway compared to free-drawing (the next highest) and coloring. However, the difference between each form of art-making was not statistically significant, according to analysis. Examples of the doodling activity. “There were some emergent differences but we did not have a large-enough sample in this initial study to draw any definitive conclusions,” Kaimal said. It was noted and tracked which participants in the study considered themselves artists so that their results could be compared to non-artists. In that way, Kaimal and her team hoped to understand whether past experience played a factor in triggering feelings of reward. Doodling seemed to initiate the most brain activity in artists, but free-drawing was observed to be about the same for artists and non-artists. Interestingly, the set coloring activity actually resulted in negative brain activity in artists. “I think artists might have felt very constrained by the pre-drawn shapes and the limited choice of media,” Kaimal explained. “They might also have felt some frustration that they could not complete the image in the short time.” Again, however, these results regarding artists versus non-artists proved statistically insignificant, which might actually track with Kaimal’s previous research that found experience-level did not have a bearing on the stress-reduction benefits people had while making art. Examples of the free drawing activity. Overall, though, the finding that any form of art-making resulted in the significant activation of feelings of reward is compelling, especially for art therapists who see art as a valuable tool for mental health. In fact, in surveys administered to the participants after the activities were complete, respondents indicated that they felt more like they had “good ideas” and could “solve problems” than before the activities.
Participants even said they felt the three-minute time spans for art-making weren’t long enough. “There are several implications of this study’s findings,” Kaimal said. “They indicate an inherent potential for evoking positive emotions through art-making — and doodling especially. Doodling is something we all have experience with and might re-imagine as a democratizing, skill-independent, judgment-free pleasurable activity.” Additionally, Kaimal felt that the findings of increased self-opinion were intriguing. “There might be inherent aspects to visual self-expression that evoke both pleasure and a sense of creative agency in ourselves,” she said. Those interested in the full study, “Functional near-infrared spectroscopy assessment of reward perception based on visual self-expression: Coloring, doodling and free drawing,” can read it here.
Anisah Smith didn’t have to look too far in order to find a home for the next four years. The Daily News Girls Basketball Player of the Year out of Carlmont-Belmont signed a letter of intent to play at Menlo College, which returns every starter on a team that won its third consecutive Cal-Pac Conference Championship. “We are very excited about Anisah joining our program,” Menlo College coach Shannon Osborne said in a statement. “She is an outstanding player who will make an impact with her scoring ability, her athleticism and her work ethic. We are looking forward to Anisah having an outstanding four years as an Oak.” The 5-foot-8 point guard averaged 23.6 points, 7.2 rebounds, 4.4 steals and 3.4 assists as a senior with the Scots, who went 24-4 and reached the Central Coast Section Division I quarterfinals. Smith, who played two years at Eastside Prep-East Palo Alto, transferred to Carlmont her junior year. “Anisah was a tremendous player for Carlmont,” Carlmont coach Dan Mori said. “She was someone who could really do everything on the court. … I think she’ll be a tremendous asset to the Menlo College program. I have no doubt that she can be extremely successful there. And I think part of her wanted to stay close to home so her family and friends could see her play. I think it’s a great opportunity for her and Menlo is going to be really excited to have her.”
Tag Archives: domain dispute July, 2010 Paris Hilton, a celebrity and heiress to the Hilton fortune, has filed a dispute with the WIPO over the domain paris-hilton-perfume.net. She contends that the domain violates the “Paris Hilton” trademark. This dispute was more than likely filed by one of Paris Hilton’s legal representatives. This is a good thing, … In a huge victory for Verizon, the company has won a dispute it filed with the National Arbitration Forum over the domain VerizonWireless.com. The company markets itself as “Verizon Wireless,” so having this name is very valuable. This case is interesting because the arbitration panel ruled that the domain was … Charter Communications, an American ISP and cable company that filed for Chapter 11 bankruptcy in 2009, lost an arbitration dispute this week over the name charterbusiness.com. Charter claimed the domain violated a number of its trademarks and alleged that the owner was using it in bad faith. The name … Usually big companies win domain disputes. In this instance, however, French publisher Editions Milan tried to abuse the WIPO domain arbitration system and ended up screwing itself over in the process. It all started when American soldier Steve Hill registered kokeshi.com in 2001. He registered the single-keyword domain because of … June, 2010 After filing a complaint with the National Arbitration Forum, Microsoft has won control of the domain xboxonline.com. The Redmond company claimed the owner registered the name in bad faith and that its use violated Microsoft’s trademark on “XBox.” It’s no surprise that the NAF agreed with Microsoft. The company further … Yahoo’s Flickr.com is the most popular photo sharing site online, but for years the site has been missing out on traffic from users who misspelled the name as “Flicker.com.” According to changes in the name’s WHOIS records, Yahoo now has control over it.
According to technology news site Softpedia, acquiring … May, 2010 Apple won an important victory with the WIPO this week after it was given control of the name itunes.com.mx. I’m sure Cupertino lost quite a bit of Mexican traffic to this domain, and owning it will help protect the Apple brand. The company is also the first multinational corporation to … Ed Teal, a Republican candidate in the Marshall County, Alabama Sheriff’s election, has filed a cybersquatting lawsuit against his opponent, Sheriff Chief Deputy Doug Gibbs. He alleges that the incumbent purposely registered some 19 domains, such as VoteEdTeal.com and EdTealforSheriff.com, to hinder his use of the Internet in the election. … Facebook has filed a dispute with the WIPO over the domain Facebook.me, which is currently owned by UAE citizen Amjad Abbas. This is just one of many ccTLDs the site has gone after lately. Interestingly, the name’s current owner has it redirecting to Facebook.com. Why he would do this is unclear … When company Transure Enterprise registered baylorcme.com and fansofbaylor.com and offered them to Baylor University for $500, the last thing it probably expected was to end up in an arbitration dispute. Because the Waco, Texas, university has a trademark for “Baylor University,” it took the cybersquatter to the National Arbitration Forum. …
Q: IBM x3650 M1 SATA support up to 2 TB I have an IBM x3650 (M1, I think, because of the DDR2 RAM), and as far as I know it only supports SATA hard drives up to 1 TB. Is there any RAID controller firmware/BIOS update which can give the server 2 TB hard drive support? And how can I update it? Thanks. Ewert A: The M1's only support drives up to 1 TB, and the full list is found here: X3650 M1 compatibility guide. There's not a ServerProven controller there with firmware that would support anything beyond 1 TB. Trying to go beyond or circumvent the compatibility guide IMO would be against proper server administration practices for a production environment.
Meet Our Team JENNY GORDON I am a proud mom of two young men named Max and Noah. I am also a proud mom of two mischievous canines named Rosie and Vinny. I’ve been married for over 2 decades to my best friend and confidant, Dr. Larry Gordon. … DANIEL ZIPPERER I am a fun and energetic agent that enjoys a satisfied client. I am very focused on making sure that a big decision in your life and the worries that may accompany it are put to ease. I work hard and follow through, keeping you informed every step of the way. … ROB IRIARTE After spending over 20 years working with institutional clients in the financial markets, my wife Kathy and I decided to move to Wausau and raise our three children Parker, Abby and William. After years of work transfers between New York and Chicago, we feel very fortunate to be able to …
Opening Bids in Search for Budget Deal Excerpt Looking at the Ryan plan, the Concord Coalition, a nonpartisan group of high-powered budget and tax specialists, noted that Ryan has left the door open for possible compromise, acknowledging that some elements of his plan are unlikely to be accepted by Democrats.
Q: How can I calculate the area of a bezier curve? Given the following path (for example) which describes an SVG cubic bezier curve: "M300,140C300,40,500,40,500,140", and assuming a straight line connecting the end points 300,140 to 500,140 (closing the area under the curve), is it possible to calculate the area so enclosed? Can anyone suggest a formula (or JavaScript) to accomplish this?

A: Convert the path to a polygon of arbitrary precision, and then calculate the area of the polygon.

Interactive Demo: Area of Path via Subdivision

At its core the above demo uses functions for adaptively subdividing the path into a polygon and computing the area of a polygon:

// path:      an SVG <path> element
// threshold: a 'close-enough' limit (ignore subdivisions with area less than this)
// segments:  (optional) how many subdivisions to create at each level
// returns:   a new SVG <polygon> element
function pathToPolygonViaSubdivision(path, threshold, segments) {
  if (!threshold) threshold = 0.0001; // Get really, really close
  if (!segments)  segments  = 3;      // 2 segments creates 0-area triangles
  var points = subdivide(ptWithLength(0), ptWithLength(path.getTotalLength()));
  for (var i = points.length; i--; ) points[i] = [points[i].x, points[i].y];
  var doc  = path.ownerDocument;
  var poly = doc.createElementNS('http://www.w3.org/2000/svg', 'polygon');
  poly.setAttribute('points', points.join(' '));
  return poly;

  // Record the distance along the path with the point for later reference
  function ptWithLength(d) {
    var pt = path.getPointAtLength(d);
    pt.d = d;
    return pt;
  }

  // Create segments evenly spaced between two points on the path.
  // If the area of the result is less than the threshold return the endpoints.
  // Otherwise, keep the intermediary points and subdivide each consecutive pair.
  function subdivide(p1, p2) {
    var pts = [p1];
    for (var i = 1, step = (p2.d - p1.d) / segments; i < segments; i++) {
      pts[i] = ptWithLength(p1.d + step * i);
    }
    pts.push(p2);
    if (polyArea(pts) <= threshold) return [p1, p2];
    else {
      var result = [];
      for (var i = 1; i < pts.length; ++i) {
        var mids = subdivide(pts[i - 1], pts[i]);
        mids.pop(); // We'll get the last point as the start of the next pair
        result = result.concat(mids);
      }
      result.push(p2);
      return result;
    }
  }

  // Calculate the area of a polygon represented by an array of points
  function polyArea(points) {
    var p1, p2;
    for (var area = 0, len = points.length, i = 0; i < len; ++i) {
      p1 = points[i];
      p2 = points[(i - 1 + len) % len]; // Previous point, with wraparound
      area += (p2.x + p1.x) * (p2.y - p1.y);
    }
    return Math.abs(area / 2);
  }
}

// Return the area for an SVG <polygon> or <polyline>
// Self-crossing polys reduce the effective 'area'
function polyArea(poly) {
  var area = 0, pts = poly.points, len = pts.numberOfItems;
  for (var i = 0; i < len; ++i) {
    var p1 = pts.getItem(i), p2 = pts.getItem((i - 1 + len) % len);
    area += (p2.x + p1.x) * (p2.y - p1.y);
  }
  return Math.abs(area / 2);
}

Following is the original answer, which uses a different (non-adaptive) technique for converting the <path> to a <polygon>.

Interactive Demo: http://phrogz.net/svg/area_of_path.xhtml

At its core the above demo uses functions for approximating a path with a polygon and computing the area of a polygon.

// Calculate the area of an SVG polygon/polyline
function polyArea(poly) {
  var area = 0, pts = poly.points, len = pts.numberOfItems;
  for (var i = 0; i < len; ++i) {
    var p1 = pts.getItem(i), p2 = pts.getItem((i + len - 1) % len);
    area += (p2.x + p1.x) * (p2.y - p1.y);
  }
  return Math.abs(area / 2);
}

// Create a <polygon> approximation for an SVG <path>
function pathToPolygon(path, samples) {
  if (!samples) samples = 100; // default sample count
  var doc  = path.ownerDocument;
  var poly = doc.createElementNS('http://www.w3.org/2000/svg', 'polygon');

  // Put all path segments in a queue
  for (var segs = [], s = path.pathSegList, i = s.numberOfItems - 1; i >= 0; --i) segs[i] = s.getItem(i);
  var segments = segs.concat();

  var seg, lastSeg, points = [], x, y;
  var addSegmentPoint = function (s) {
    if (s.pathSegType == SVGPathSeg.PATHSEG_CLOSEPATH) {
      // a closepath contributes no new point
    } else {
      if (s.pathSegType % 2 == 1 && s.pathSegType > 1) { // relative commands
        x += s.x; y += s.y;
      } else {
        x = s.x; y = s.y;
      }
      var last = points[points.length - 1];
      if (!last || x != last[0] || y != last[1]) points.push([x, y]);
    }
  };

  for (var d = 0, len = path.getTotalLength(), step = len / samples; d <= len; d += step) {
    var seg = segments[path.getPathSegAtLength(d)];
    var pt  = path.getPointAtLength(d);
    if (seg != lastSeg) {
      lastSeg = seg;
      while (segs.length && segs[0] != seg) addSegmentPoint(segs.shift());
    }
    var last = points[points.length - 1];
    if (!last || pt.x != last[0] || pt.y != last[1]) points.push([pt.x, pt.y]);
  }
  for (var i = 0, len = segs.length; i < len; ++i) addSegmentPoint(segs[i]);
  for (var i = 0, len = points.length; i < len; ++i) points[i] = points[i].join(',');
  poly.setAttribute('points', points.join(' '));
  return poly;
}

A: I hesitated between leaving a comment and writing a full reply. But a simple Google search for "area bezier curve" turns up, among its first three links (the first one being this same post), the following: http://objectmix.com/graphics/133553-area-closed-bezier-curve.html (archived), which provides the closed-form solution using the divergence theorem. I am surprised that this link has not been found by the OP.
Copying the text in case the website goes down, and crediting the author of the reply, Kalle Rutanen:

An interesting problem. For any piecewise differentiable curve in 2D, the following general procedure gives you the area inside the curve / series of curves. For polynomial curves (Bezier curves), you will get closed form solutions.

Let g(t) be a piecewise differentiable curve, with 0 <= t <= 1. g(t) is oriented clockwise and g(1) = g(0).

Let F(x, y) = [x, y] / 2. Then div(F(x, y)) = 1, where div is for divergence.

Now the divergence theorem gives you the area inside the closed curve g(t) as a line integral along the curve:

int(dot(F(g(t)), perp(g'(t))) dt, t = 0..1)
= (1 / 2) * int(dot(g(t), perp(g'(t))) dt, t = 0..1)

perp(x, y) = (-y, x)

where int is for integration, ' for differentiation and dot for dot product. The integration has to be pieced to the parts corresponding to the smooth curve segments.

Now for examples. Take the Bezier degree 3 and one such curve with control points (x0, y0), (x1, y1), (x2, y2), (x3, y3). The integral over this curve is:

I := 3/10 * y1*x0 - 3/20 * y1*x2 - 3/20 * y1*x3
   - 3/10 * y0*x1 - 3/20 * y0*x2 - 1/20 * y0*x3
   + 3/20 * y2*x0 + 3/20 * y2*x1 - 3/10 * y2*x3
   + 1/20 * y3*x0 + 3/20 * y3*x1 + 3/10 * y3*x2

Calculate this for each curve in the sequence and add them up. The sum is the area enclosed by the curves (assuming the curves form a loop). If the curve consists of just one Bezier curve, then it must be that x3 = x0 and y3 = y0, and the area is:

Area := 3/20 * y1*x0 - 3/20 * y1*x2 - 3/20 * y0*x1
      + 3/20 * y0*x2 - 3/20 * y2*x0 + 3/20 * y2*x1

Hope I did not do mistakes. -- Kalle Rutanen http://kaba.hilvi.org
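The closed form above drops straight into JavaScript. The sketch below is a transcription of Rutanen's integral (the function names are invented for illustration): for a curve closed by a straight chord, summing the cubic segment's integral and the chord's integral gives the enclosed area. For the question's path "M300,140C300,40,500,40,500,140" closed by the chord back to (300,140), this evaluates to 12000 square units, which matches integrating the curve directly.

```javascript
// Line integral (1/2) * int(x dy - y dx) over one cubic Bezier segment,
// per Rutanen's closed form. Control points are [x, y] pairs.
function cubicBezierSegmentIntegral([x0, y0], [x1, y1], [x2, y2], [x3, y3]) {
  return ( 3/10*y1*x0 - 3/20*y1*x2 - 3/20*y1*x3
         - 3/10*y0*x1 - 3/20*y0*x2 - 1/20*y0*x3
         + 3/20*y2*x0 + 3/20*y2*x1 - 3/10*y2*x3
         + 1/20*y3*x0 + 3/20*y3*x1 + 3/10*y3*x2 );
}

// The same integral for a straight segment from [xa, ya] to [xb, yb].
function lineSegmentIntegral([xa, ya], [xb, yb]) {
  return (xa * yb - xb * ya) / 2;
}

// Area enclosed by one cubic Bezier arc plus the chord closing it.
function closedBezierArea(p0, p1, p2, p3) {
  return Math.abs(cubicBezierSegmentIntegral(p0, p1, p2, p3) +
                  lineSegmentIntegral(p3, p0));
}
```

The absolute value makes the result orientation-independent; if you need signed areas (for curves with holes, say), drop the Math.abs and keep a consistent winding direction across all segments.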
In the search for alien life, China has built the world's biggest radio telescope, which it says could 'lead to discoveries beyond our wildest imagination.' The Five-hundred-metre Aperture Spherical Radio Telescope (FAST), nestled between hills in the mountainous region of Guizhou, began working this week. Built at a cost of 1.2 billion yuan (£140 million), the telescope dwarfs the Arecibo Observatory in Puerto Rico as the world's largest single-dish radio telescope, with twice the sensitivity and a reflector as large as 30 football fields.

FAST - KEY FACTS
Cost to build: 1.2 billion yuan (£140 million)
Number of panels: 4,450
Size: 500 metres
Electromagnetic waves detected: 1,300 light-years away
Number of people relocated: 10,000
Relocation costs: 1.8 billion yuan ($270 million)

FAST will use its vast dish, made up of 4,450 panels, to search for signs of intelligent life, and to observe distant pulsars - tiny, rapidly spinning neutron stars believed to be the products of supernova explosions. China sees its ambitious military-run, multi-billion-dollar space programme as symbolising the country's progress. It plans a permanent orbiting space station by 2020 and eventually a manned mission to the moon. Chinese President Xi Jinping celebrated the launch, with reports that he had sent a congratulatory letter to the scientists and engineers who contributed to its creation. The telescope represents a leap forward for China's astronomical capabilities and will be one of several 'world-class' telescope projects launched in the next decade, said Yan Jun, head of China's National Astronomical Observation (NAO), according to Xinhua news agency.
In a test run before the launch, FAST detected electromagnetic waves emitted by a pulsar more than 1,300 light-years away, state media reported an NAO researcher as saying. Earlier Xinhua cited Wu Xiangping, director-general of the Chinese Astronomical Society, as saying that the telescope's high degree of sensitivity 'will help us to search for intelligent life outside of the galaxy'. Experts have been hunting for alien intelligence for six decades, pointing radio telescopes at stars in the hope of discovering signals from other civilisations, but have not yet found any evidence. Construction of FAST began in 2011, and local officials relocated nearly 10,000 people living within five kilometres (three miles) to create a quieter environment for monitoring. In the first two to three years, the telescope will need re-adjusting and will be used for small research projects during that time.
Last month a 'strong signal' detected by a Russian telescope searching for extraterrestrial signals stirred interest among scientists, but experts said it was far too early to make conclusions about its origin. But the new FAST telescope could 'lead to discoveries beyond our wildest imagination,' Douglas Vakoch, president of METI, a group seeking to send messages to space in search of alien life, told Xinhua. Cell phones in the area must be powered off to maintain radio silence. In the past, China has relocated hundreds of thousands of people to make way for large infrastructure projects such as dams and canals. The area surrounding the telescope is remote and relatively poor. State media said it was chosen because there are no major towns nearby. The villagers will be compensated with cash or housing. The budget for relocation is 1.8 billion yuan ($270 million), it was reported, more than the cost of constructing the telescope. China has poured money into big-ticket science and technology projects as it seeks to become a high-tech leader, but despite some gains the country's scientific output still lags behind.
NONPRECEDENTIAL DISPOSITION To be cited only in accordance with Fed. R. App. P. 32.1 United States Court of Appeals For the Seventh Circuit Chicago, Illinois 60604 Argued December 15, 2009 Decided January 19, 2010 Before TERENCE T. EVANS, Circuit Judge JOHN DANIEL TINDER, Circuit Judge DAVID F. HAMILTON, Circuit Judge No. 08-3952 BARBARA THOMAS-BAGROWSKI, Plaintiff-Appellant, v. RAY LaHOOD, Secretary of Transportation, Defendant-Appellee. Appeal from the United States District Court for the Northern District of Illinois, Eastern Division. No. 04 C 3544 Wayne R. Andersen, Judge. ORDER In this lawsuit under Title VII of the Civil Rights Act of 1964, see 42 U.S.C. § 2000e et seq., Barbara Thomas-Bagrowski claims that she was the victim of racial discrimination, retaliation, and a hostile work environment while employed by the Federal Aviation Administration (FAA). The district court granted summary judgment for the FAA, and Thomas-Bagrowski appeals. Thomas-Bagrowski was already a management employee in November 1997 when she and three other employees applied for a permanent position as a team leader in the FAA’s human resources division. A panel interviewed the four candidates and gave a white applicant, Valerie Granahan, the highest score. Thomas-Bagrowski, who is African-American, finished third. Thomas-Bagrowski did not contest the scoring. Eventually, the position was not filled due to a hiring freeze and internal job restructuring. In June 1998, the FAA wanted to fill a temporary (one-year) opening for a team leader. Joseph Yokley, the regional manager for human resources, decided to appoint Granahan and Thomas-Bagrowski to split the position for six months each based on their 1997 interview scores.
Yokley, who is African-American and was not a member of the interview panel, later admitted that he selected Thomas-Bagrowski, instead of the runner-up in the 1997 “team leader” competition, in a misconceived attempt at “affirmative action.” But then, before Granahan completed the first six-month rotation, Yokley was approached by employees who had not applied for the permanent position but were interested in serving in the temporary team leader spot. Yokley then opened up the second rotation to other applicants and, instead of allowing Thomas-Bagrowski to serve the entire six months, divided the remaining time equally between her and five other employees (whose races are not specified). In response, Thomas-Bagrowski filed an internal complaint with the FAA in October 1998 claiming that Yokley engaged in racial discrimination. Around the same time that Thomas-Bagrowski filed her internal complaint, the FAA’s Great Lakes Region was undergoing reorganization. The FAA dissolved Thomas-Bagrowski’s working group, and in November 1998 she was assigned to the Airways Facilities team, led by David Pinner. The FAA also revised its operating policy for human resources, in part to place greater emphasis on working together in teams and holding the entire team accountable for ensuring that leave and telecommuting requests from individual members do not impede customer service. Thomas-Bagrowski first encountered problems with her new team over a demand to telecommute. In her previous position she worked from her home in Milwaukee, Wisconsin, instead of going to the office in Des Plaines, Illinois, and so she submitted a telecommuting proposal in November 1998 for her new team to approve. The team did not deny her request but asked her for more information. Instead of responding, Thomas-Bagrowski asked Pinner to approve the request himself and asserted that she qualified for accommodation under the Americans with Disabilities Act because of lower-back pain.
Thomas-Bagrowski had previously cited back pain in a claim for worker’s compensation, which the Department of Labor denied after she failed to provide sufficient documentation of her impairment and its cause. When Pinner asked for more information, Thomas-Bagrowski told him that her medical clinic had faxed him information about her condition and that she did not feel the need to submit anything further. In December 1998, Thomas-Bagrowski and Pinner clashed over her requests for annual leave. Before her reassignment to Pinner’s team, Thomas-Bagrowski received approval to schedule annual leave. After joining the Airways Facilities team, she balked at having to resubmit the same leave requests to her new team for approval. Thomas-Bagrowski refused to coordinate the leave requests with her new team members, but Pinner approved the requests anyway. But Pinner did not approve Thomas-Bagrowski’s requests for sick leave. In February 1999 she asserted to Pinner that she was entitled to paid sick leave whenever she needed it, whether accrued or not. In the past Thomas-Bagrowski had received approval to take an advance of up to 240 hours against future sick leave, but that was to cover expected absences while she was receiving cancer treatments. Her requests to Pinner were for routine sick days and medical matters, which under FAA policy do not qualify for an advance against future sick leave. Pinner requested additional medical information, but Thomas-Bagrowski refused to submit anything. She told Pinner his requests violated the Privacy Act. The record establishes that Pinner required similar documentation from white employees who requested advances against their sick leave. In March 1999, Thomas-Bagrowski filed another claim with the Department of Labor for worker’s compensation. This time she alleged that her work environment was hostile and that it caused her to develop shingles. Her claim was denied for insufficient medical documentation.
Meanwhile, Thomas-Bagrowski simply stopped going to work. She was fired in September 1999 after unauthorized absences totaling nearly 500 hours over three months. She had submitted some documentation to justify her absences, but Pinner consulted with a physician in the medical division and concluded that the documentation was inadequate. Thomas-Bagrowski had not supplied an expected date of recovery, or a medical explanation for her incapacity, or a list of suggested duty restrictions. She challenged her discharge before the Merit Systems Protection Board, but her claim was dismissed. She did not appeal the Board’s decision. In 2004, Thomas-Bagrowski filed this lawsuit claiming racial discrimination (for denying her a six-month detail as team leader), retaliation (for denying her sick leave and telecommuting requests after she filed her internal complaint), and a hostile work environment (for requiring approval from her team and additional medical documentation for her leave requests). The litigation, however, does not concern Thomas-Bagrowski’s termination from the FAA. In granting summary judgment for the FAA, the district court concluded that Thomas-Bagrowski, who rested her discrimination claim on the indirect, burden-shifting analysis of McDonnell Douglas Corp. v. Green, 411 U.S. 792 (1973), had produced no evidence suggesting that Yokley’s reason for denying her the six-month detail--that he wanted to consider employees who had not applied for the permanent position--was pretextual. The court concluded that Pinner’s refusal to permit Thomas-Bagrowski to telecommute was not a materially adverse action, because she had not shown that she was entitled to any accommodation for a disability under the ADA and, in any event, could not insist on being granted a telecommuting option as an accommodation. The court also reasoned that the refusal of sick leave was not materially adverse, as her requests did not comport with FAA policy.
In addition, she showed no causal connection to her internal complaint. Finally, the court found no evidence of a hostile work environment and did not credit Thomas-Bagrowski’s assertions that FAA personnel policies were “overwhelmingly excessive and unnecessary.” On appeal, Thomas-Bagrowski first argues that her discrimination claim should have survived summary judgment. She asserts that under the direct method the claim should have gone forward, but that theory was waived by the failure to present it to the district court, see Burks v. Wis. Dep’t of Transp., 464 F.3d 744, 751 n.3 (7th Cir. 2006), and is frivolous in any event because she has not identified any direct evidence of discrimination. Thomas-Bagrowski also contends that she established a prima facie case, and that the FAA failed to provide a legitimate explanation for denying her a six-month term as team leader. Thus, she concludes, a material dispute exists about whether the agency acted out of a preference for white employees. The FAA’s motion for summary judgment discussed Thomas-Bagrowski’s case under the indirect method of racial discrimination. In her response, Thomas-Bagrowski presented no legal argument, only an extended list of facts and exhibits. The district court followed the FAA’s lead and simply assumed that Thomas-Bagrowski had established a prima facie case and skipped directly to the question of pretext. The agency offered undisputed evidence that Yokley divided up the last six months of the temporary position of team leader because he wanted to consider employees who were interested in that brief assignment but had not applied for the permanent position. See Fane v. Locke Reynolds, LLP, 480 F.3d 534, 538 (7th Cir. 2007) (explaining that employer may prevail on summary judgment if it offers genuine, nondiscriminatory reason for employment action); Barricks v. Eli Lilly & Co., 481 F.3d 556, 560 (7th Cir. 2007) (explaining that employer’s reasons need only have been honest, regardless of whether they were accurate or wise, to prevail on summary judgment). There is no evidence that Yokley lied about his desire to consider more applicants (or even any evidence that the employees who shared the final six months with Thomas-Bagrowski were not African-American), and he stated that he always planned to reevaluate the division’s needs before Thomas-Bagrowski began her term. In fact, as Thomas-Bagrowski acknowledges, Yokley originally selected her in part out of a racial preference, in an attempt at affirmative action. The lack of evidence of pretext is reason enough to sustain the grant of summary judgment on the discrimination claim, but we add the observation that Thomas-Bagrowski did not establish a prima facie case. What was missing is evidence that she was best qualified for the six-month position in the first place. See Jackson v. City of Chi., 552 F.3d 619, 622 (7th Cir. 2009). As the FAA acknowledges, Yokley’s initial decision to award her the detail because of her race, despite the fact that the interview panel favored another white candidate, could have led to a legitimate reverse-discrimination claim from the second-place candidate. See Ballance v. City of Springfield, 424 F.3d 614, 617 (7th Cir. 2005). In that sense, then, Thomas-Bagrowski lost nothing when Yokley rescinded the decision to let her fill the temporary position for the entire six months. Thomas-Bagrowski also challenges the district court’s conclusion that the FAA’s refusals to approve her sick leave and telecommuting requests were insufficient to show a genuine issue of retaliation. Her brief is difficult to follow, but apparently she argues that the timing of the changes in FAA policies after she had filed an internal complaint alleging discrimination was enough to show a causal connection to a materially adverse action.
The district court correctly recognized, however, that the FAA did not take a materially adverse action by declining to approve Thomas-Bagrowski's requests for sick leave or to telecommute. By refusing to provide the required documentation, Thomas-Bagrowski never completed the requests in the first place. See, e.g., Hudson v. Chi. Transit Auth., 375 F.3d 552, 558 (7th Cir. 2004). The FAA did not alter the conditions of her employment, see Griffin v. Potter, 356 F.3d 824, 829 (7th Cir. 2004), but simply required that she follow agency-wide procedures in asking for those benefits. AFFIRMED.
78,125,231
Funky Entertainment, Inc. is a small, family owned and operated entertainment company specializing in photo booth rentals. We are located in Suffolk County, Long Island. We provide entertainment for... Our portable, new digital-era photo booth has the latest technology, which makes it ideal for any special event. Its beautiful, sleek design has four (4) HD color screens. Packages starting at only $400.00.
78,125,757
The present invention relates to a dot-matrix printer with a font cartridge, which reads out character pattern data stored in advance in a memory mounted in the font cartridge and prints in accordance with the character pattern data. In order to print characters in a plurality of fonts, dot-matrix printers must have a read-only memory (ROM) storing character pattern data for each font. However, different fonts are usually used in different countries. For this reason, when a dot-matrix printer mounts a ROM storing character pattern data for each country, a ROM having a considerable memory capacity is required, resulting in high cost. In order to overcome this drawback, a plurality of font cartridges mounting ROMs or random-access memories (RAMs) storing character pattern data for a plurality of fonts are prepared, and the desired font cartridge is mounted on the dot-matrix printer so as to read out character pattern data of a desired font for printing. However, with this method, when the font cartridge is mounted on the dot-matrix printer, the control circuit of the printer cannot recognize in which address area of the font cartridge a memory is mounted, or whether a ROM or a RAM is mounted. This recognition therefore requires a complex operation and extra time.
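The recognition problem the passage describes, which address areas of a cartridge hold memory and whether each is a ROM or a RAM, can be pictured as a simple write/read-back probe. The sketch below is a hypothetical illustration in Python, not the patent's actual circuit: the `read` and `write` callables stand in for bus accesses a real printer controller would make on the cartridge connector.

```python
EMPTY, ROM, RAM = "empty", "rom", "ram"

def probe_region(read, write, addr, test_mask=0xA5):
    """Classify one cartridge address region as empty, ROM, or RAM.

    `read` and `write` are stand-ins for cartridge-bus accesses; a real
    controller would drive the connector lines instead of calling
    Python functions.
    """
    original = read(addr)
    if original is None:               # nothing answers on the bus
        return EMPTY
    flipped = original ^ test_mask     # try to flip some bits in place
    write(addr, flipped)
    if read(addr) == flipped:          # the write stuck: writable RAM
        write(addr, original)          # restore the original contents
        return RAM
    return ROM                         # contents unchanged: read-only
```

In principle, a controller could run such a probe once when the cartridge is inserted and build a map of the mounted memories, rather than performing the complex recognition operation the passage mentions each time.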
78,125,860
Grateful Dead Help On The Way for RatDog: Kimock Steps In For Karan
By Gary Lambert
Central Park, NYC, 7/9/07 (Photo: Gary Lambert)
When life looks like easy street… well, you know how the rest of that song goes. And danger could not have announced itself much more dramatically at RatDog's door than it did last month, with the news that the band's great guitarist (and more important, our dear brother), Mark Karan, had been diagnosed with throat cancer, and would have to take a break from touring so that he can get the best and most thorough treatment available. Just days before the start of a long summer trek, the members of RatDog were faced with multiple challenges: dealing both personally and professionally with the absence of such a close pal and vital musical collaborator; and finding someone who could, on dauntingly short notice, step into Mark's shoes. The latter task was made immeasurably easier by the availability of a longtime member of the extended family: Steve Kimock. "Steve is an old friend and a good friend," Bob Weir explained when we caught up with him a half-dozen stops into the tour. "He already knew a lot of our repertoire, having played it in various contexts over the years, and Steve and I know each other's sensibilities. So we couldn't have been luckier, given the situation."
Kimock, Barefoot in the Park (Photo: Gary Lambert)
It takes more than luck, though, to incorporate a new musical element into a close-knit unit like RatDog. There's also a lot of time and effort involved, and this band clearly has the resolve and work ethic to make it happen. As of this writing, the concept of "down time" had not yet been introduced to the touring party. On the tour bus, in the backstage rehearsal room and in every available moment, the musicians work and re-work the material, figuring out how to incorporate Kimock's distinctive, gorgeously melodic playing into the ensemble sound, and how to tailor their own playing to complement his.
As Weir notes about RatDog: "There's a collective sense of orchestration that only occurs when you're together for a long time." Asked how he thinks Kimock is doing at finding his way comfortably into that collective, Bob adds, with great admiration, "Steve astonishes me several times a night." The audible evidence of the tour thus far emphatically suggests that the astonishments will keep right on coming. Another source of delightful discovery on the July leg of the tour has been the immensely entertaining presence of Keller Williams, who has been opening the shows with his uncanny solo act, using high-tech looping devices to transform himself into a kind of one-man jamband. Williams has also been sitting in with RatDog at various spots in the shows, with Mr. Weir returning the compliment by guesting on a few tunes in Keller's set each night. "Keller is a whole lot of fun to play with," Bobby notes, "and we've been figuring out tunes we can do together on the fly, just running through 'em once or twice and then seeing what happens to 'em on stage." The August leg of the tour will commence at one of the best events of any summer, The Gathering of the Vibes, which returns this time around to Bridgeport, CT, after several years in upstate New York.
Bobby and Keller (Photo: Gary Lambert)
From there it's on to Lowell, MA, hometown of Jack Kerouac, for a very special event celebrating the 50th anniversary of Kerouac's great and hugely influential novel, On The Road.
Before the performance, Bob Weir and Grateful Dead historian Dennis McNally (who also wrote Desolate Angel, an acclaimed Kerouac biography) will take to the stage to discuss Kerouac and Neal Cassady, who was immortalized in the novel as Dean Moriarty and would later be a pivotal force in the formative years of the Grateful Dead, as primary driver ("Cowboy Neal at the wheel") of the legendary Merry Pranksters bus known as "Furthur." The summer wraps up with a bunch of dates pairing RatDog with another bunch that has a lot of distinguished history behind it and a lot of great music still in it, the Allman Brothers Band.

To tell you the truth, I almost gave up on the DOG. After hearing DJ Logic in the band, I hated it, listened again and then loved this new kind of DOG. Then Wasserman left; I hated the fact that his replacement rocked. I mean that bass rocked my bones! (Sylvester's) But Rob felt like an old friend; he gave a sense of dignity to the music, if not to the shows. I am remorseful that it took Mark Karan's illness to bring Kimock in to form a new DOG, for a while. A rare breed that I had not conceived of. With the current line-up, along with the support of Keller, the show I caught on this tour (opening night in Chi-town) blew me away. It was the best RatDog since the Family Reunion's second night/day, which included Logic, whom I gave the cold shoulder to for a year.

First, I want to say I will miss Mark Karan and hope he's doin' ok. I'll miss his playing and influence on the music and the band. Next, I am looking forward to hearing Steve with the band and his distinctive style of playing. Finally, if some more flavor or assistance is needed, I'm sure that either Derek or Warren from the Brothers would make some acceptable contributions musically, as well as anybody in RatDog could certainly add some of their wonderful flavor to the ABB stew. Could be quite the night!!
Plus, if anybody hasn't been to the Laurel center for a show, I think they will be very satisfied with the venue - it's fairly new and interesting. The only problem could be traffic on some of the PA blue highways.
78,125,981
Stockwell Toilet Watch
Wednesday, 31 March 2010
COP THUG SMELLIE SURROUNDED - CLOCK HIS NON-ATTACKERS!
CLOCK this single-frame pic of RIOT COP THUG SMELLIE "SURROUNDED". Not by attackers but by several photographers who absolutely for sure were not about to attack - indeed, if they were bent on doing that they could have done it. Of course they didn't do anything of the kind. Smellie innocent? Yeah, and the world is flat. The show ain't over yet, altho' he and the excuse for a District Judge
Friday, 26 March 2010
VERY THICK-SKINNED G20 "DELROY ATTACK DOG" SMELLIE. ELBOW PADDING WORN BY DIMWIT ATTACK DOG DELROY SMELLIE.
Maybe the Westminster Magistrates "DELROY ATTACK DOG SMELLIE" comic opera (in a manner of speaking) will get wrapped up today - it's already over-run its scheduled linoleum. Delroy "attack dog" was at times a truly dim-witted star performer yesterday. Unable to "asp" or break the nose or arms of those attempting to penetrate his secretive frontal lobes, he instead confessed to the court that it had been explained to him on a number of occasions by bigger Met Police fishes than he why he'd been suspended from duty after his violent "attack dog" antics at the IAN TOMLINSON "Memorial" demo last year. Trouble is that his frontal lobes had played hard to find - so he just didn't get it, not even to this day, as to why he'd been suspended. He's not into acknowledging the situation one little bit - one would have thought that this was hardly the head space for a conscientious, sagacious, careful cop. And yet this specimen is, in police estimation, one of the "elite" bright public order policing "attack dogs". Heaven help us with regard to those who are less highly regarded in the police farmyard than dimwit Delroy.
Genuine apologies are extended to farmyard animals everywhere.
Of course dimwit knows very well why he's been suspended, but he's just doing the verbal - lying - after all, it's only a courtroom - to save his skin. Exactly like ex-Commissioner Ian Blair lied transparently through his crooked teeth about how soon he knew that innocent Jean Charles de Menezes had been shot dead by police at Stockwell Station.
On yesterday's narrative, dimwit Delroy recounted that the 2nd April 2009 events threw him big time, so much so that he almost faced his 9/11 attack come-uppance because of the provocative, threatening presence of Nicola Fisher. The verbal went that woweee, she was about to launch a truly big-time attack on him - poor isolated diminutive dimwit Smellie Delroy. There was no evidence that fighter planes were scrambled to deal with the emergency, but certainly dimwit's brain cells were. Maybe fighter planes from City Airport or Biggin Hill will feature today.
Diminutive dimwit Delroy. Just stop and look at him in the above vid frame with his heavy-duty "attack dog" elbow pads - he appears to be the ONLY truly full-time prowling pugnacious cop shown in the Nicola Fisher attack vid sequence - click here to view - kitted out big time, and clearly very keen to put the ASP-like boot in.
Tuesday, 16 March 2010
Do not hesitate to click on today's EVENING STANDARD news report immediately above to ENLARGE the text and thereby read about LONDON's CROWN PROSECUTION SERVICE failings. There is also a "COMMENT" made by this blogger that has been carried by the EVENING STANDARD.
It specifically concerns IAN TOMLINSON, who was killed as a result of being repeatedly beaten by police almost a year ago now (April 1st 2009).
Altho the repeated criminal attacks were filmed, still no charges have as yet been laid against a single police officer involved in and/or complicit with the attacks. It is absolutely breathtaking that this state of affairs is allowed to exist, and with barely a single murmur being made.
Friday, 12 March 2010
MOSQUE BRUVS ON FRIDAY.
Saturday, 6 March 2010
re HARBOROUGH PUBLIC TOILETS - WILL SELF PLEASE NOTE!
Local Harborough folk (Leicestershire) are currently fighting to keep their public loos open - see here. These loos are threatened with closure by the local council not because so many local eateries or whatever have sprung up a la Will Self - see discussion here - and so have thereby made their continuation unnecessary or superfluous to demand - on the contrary, they are being closed, as is so often the case, as a cost-cutting exercise. Well, it's only 2011 - give it another 700 years and maybe folk will have wised up - that is, if any folk will still be in existence. Well, this blogger won't, and nor will Will either.
PHONEY BLAIR COVER-UP - PURE BUM-WIPE MATERIAL.
About Me
See post < sCRAPBOOK STOCKWELL > text. I'm a Stockwell (London) pensioner, with diabetes type 2, have had radiotherapy treatment for cancer of the prostate, and am in regular need of public conveniences - for initial STOCKWELL TOILET WATCH purposes, what else do you need to know?
78,126,105
Helping to educate our customers on how to stay safe when using electricity is part of our job, and one we take seriously. Safety guides us in our day-to-day duties as Ontario's largest distributor of electricity. Our website is home to a wealth of information on things you should look out for when working outside, renovating your house, or trimming trees near power lines.
78,126,121
A young man who was initially denied a crucial double-lung transplant after he tested positive for marijuana has died after undergoing the procedure in Philadelphia. Twenty-year-old Riley Hancey died Saturday due to complications from the transplant procedure. "We are extremely thankful to all the wonderful doctors and staff at the University of Pennsylvania and the University of Utah for their expertise and care that Riley received," the Hancey family said in a statement, according to the PhillyVoice. "We would also like to thank the donor family, who in their own grief chose to save a life. We will never forget your kindness and generosity." According to reports, Hancey contracted a rare form of pneumonia and developed a lung infection just after Thanksgiving in 2016. When it became clear that he would need a lung transplant in order to survive, doctors at the University of Utah reportedly rejected Hancey as a transplant candidate because he tested positive for THC, a chemical found in marijuana. Hancey's father claims that his son had smoked pot with his friends on Thanksgiving night in 2016 — the first time he had used drugs in a year. Hancey's family contacted hospitals across the country, searching for a doctor who would perform the procedure despite his drug test. University of Pennsylvania Hospital agreed to perform the procedure, and he was life-flighted to Philadelphia in February. He received his new lungs on March 28. "We know that in our hearts we gave him every opportunity to survive," the Hancey family's statement said. "He will live in our hearts forever. Riley is now free to climb every mountain, ski the back country, go fishing, and run every river. He will continue to do so with his family in spirit." Drug use in transplant candidates is currently handled on a case-by-case basis, but Hancey's case has sparked a call for a consistent policy in regard to marijuana. The drug is now legal in some capacity in 26 states and the District of Columbia.
Alex Hider is a writer for the E.W. Scripps National Desk. Follow him on Twitter @alexhider.
78,126,183
Share this article on LinkedIn Email Formula 1 cannot afford to remain with fully frozen engines if the sport is going to be more exciting in the future, says Red Bull boss Christian Horner. As pressure mounts on Mercedes to end its opposition to an easing of F1's engine freeze rules, Horner reckons that changing the regulations is vital for the health of grand prix racing. "For F1 it is important," he said. "We saw Nico Rosberg's performance [in Russia] - the true performance is they can drive through the field. It is too out of kilter. "There were five Mercedes-powered cars in the top five, and the immaturity of this technology is still quite raw. "I think Mercedes should not be afraid of competition, they are doing a super job. "But it is healthy for F1 that Ferrari, Honda and Renault should have that ability to close that gap otherwise we are going to end up in a very stagnant position." Mercedes has made it clear that it will not approve the changes to the rules for 2015. And with unanimity required at the F1 Commission for it to happen, there appears no chance of it going through unless Mercedes can be convinced to change its mind. When asked what could be done to persuade Mercedes, Horner said: "I think it is a bigger issue than just about the teams. "It is about what is right for the sport, what is right for the fans. "It is easy to take a self-interest position, but when you look at what is the right thing for F1, it is to have competition. "We need to be big enough to say let's open it a little bit, and let's be responsible on costs - so there is no cost impact for customer teams - but have that competition." When asked about the potential danger of Mercedes actually pulling further clear if the engine freeze is relaxed, Horner added: "It is quite possible. "But at least you have the ability to try to improve because at the moment you are frozen with what you have got. "You are running with your hands tied behind your back. 
"It is competition, like it is on the chassis side. If you start on the back foot then you can develop your way out of that."
78,126,275
DİDEM ÜRER: The legislative proposal banning the retail sale of alcohol between 22:00 and 06:00 has been passed. In addition, there will be no advertising of alcohol in any form and no marketing aimed at consumers.
ADNAN OKTAR: Now that part is highly important. Night is when all kinds of awful things happen; people can protect themselves in the daytime. This is an excellent and most appropriate move, masha'Allah.
DİDEM ÜRER: A similar legislative proposal has been passed in Russia, where 38% of deaths are caused by vehicle use while under the influence of alcohol. It began being enforced in Russia in January. However, there is considerable criticism, saying that an age limit after 22:00 hours would be reasonable, but that it is wrong to impose a blanket ban.
ADNAN OKTAR: In my view, it is essential to prevent sales at night. It is great. People drink, they become drunk and then turn up at people's doorsteps. There is no one there, no one to help out. Everyone is asleep at night. There are lots of people around in the daytime. It is easy to deal with a drunk in the daytime, isn't it? Everyone is up. But at night, everyone is in bed. What can anyone do if a drunk turns up at the door and attacks his children and family? These people drink at night and then go home and attack their wives and children. I am not saying everyone is like that; some people drink in moderation and I exonerate them. I am talking about some other cases. But hundreds of thousands of families suffer from this in Anatolia. There is no sense in ignoring it. It is a fact. But these measures are applied all over the world, in all modern countries. Turkey came to it somewhat late. Labels and packaging all over the world also say that alcohol is harmful. Night-time sales are banned everywhere. These measures are entirely normal. In America, for instance, they refuse to sell it even to anyone aged 18. They have to be 21. It is no good otherwise.
DİDEM ÜRER: The sale of alcohol is banned after 22:00 in Canada and after 23:00 in Britain. (Adnan Oktar, 24 May 2013: A9 TV)
78,126,394
School of Business and Technology
The School of Business and Technology (SBT) was a charter school that provided small class sizes in an effort to increase students' chances of learning the subject matter. The school offered a high level of education and experience in business in an effort to prepare its students for college. SBT was created in 2002 by the Oceanside Unified School District in Oceanside, California. In July 2008, the board of directors for the Oceanside Unified School District voted to discontinue funding to the School of Business and Technology, essentially ending the school. The school then decided to search for a city willing to house it; at the time of writing, it was still searching for a suitable new location.
Category:Defunct schools in California Category:Educational institutions established in 2002 Category:Oceanside, California Category:2002 establishments in California
78,126,537
Opération Sentinelle – in place since January 2015, after the attack on satirical weekly Charlie Hebdo – involves 10,000 troops and costs €400,000 a day. But some now question whether it has succeeded in minimising terror attacks on French soil. Six soldiers were injured on Wednesday after a suspect rammed his car into a group of soldiers, the latest in a series of attacks targeting French security services. The attack has renewed questions over the efficiency of Opération Sentinelle's anti-terror patrols, which have often become the targets of attacks themselves. "It is essentially just posturing that has zero operational impact," Jean-Charles Brisard of the Center for the Analysis of Terrorism told FRANCE 24. "Sentinelle has never stopped, prevented or hindered any terrorist attack in France since its creation in 2015." Alain Rodier of the Centre for Intelligence Research says that the mission remains in place largely for political reasons. "As it stands right now, no politician is willing to dismantle Opération Sentinelle in case there is a terror attack right afterwards, because the opposition would immediately blame them," he says. Click on the video player above to view the full FRANCE 24 report.
78,126,601
Google has released a completely visual programming language that lets you build software without typing a single character. Now available on Google Code – the company's site for hosting open source software – the new language is called Google Blockly, and it's reminiscent of Scratch, a platform developed at MIT that seeks to turn even young children into programmers. Like Scratch, Blockly lets you build applications by piecing together small graphical objects in much the same way you'd piece together Legos. Each visual object is also a code object – a variable or a counter or an "if-then" statement or the like – and as you piece them together, you create simple functions. And as you piece the functions together, you create entire applications – say, a game where you guide a tiny figurine through a maze. "Users can drag blocks together to build an application," reads the description on Google's site. "No typing required." The project is part of a much larger effort to bring programming skills to, well, everyone. In the summer of 2010, Google announced a similar platform known as App Inventor, and this year, an outfit called Codecademy has made headlines as it seeks to educate a whole new world of programmers over the web. New York Mayor Michael Bloomberg is among those using the service – or at least that's what he says. "Programming has followed a steady progression of becoming more and more accessible," says Neil Fraser, one of the Googlers behind Blockly. "From Assembly, to Fortran, to C++, to Python, to Blockly, each generation gets to use an even higher-level interface. Eventually one will be able to instruct computers with completely natural language. At which point everyone will be able to tell a computer what to do." Fraser says that Blockly is designed to replace the "blocks editor" previously used by App Inventor and that, unlike Scratch, it's intended for more than just children.
"Blockly is designed to be embeddable into any program or website which wants to enable novice programmers to write scripts," he says. "One of the goals for Blockly is to generate readable code – whether it be JavaScript, Dart, Python, or some other language – which the user can continue working with once they outgrow the blocks editor. We want users to be able to take their data and leave, whenever they want." App Inventor was the brainchild of the MIT computer science and engineering professor Hal Abelson, who was on sabbatical at Google at the time. The platform was actually an outgrowth of Scratch, which Abelson had worked on at MIT. It was billed as a tool that would allow even the greenest of techies to build applications for the company's Android mobile operating system, but its life at Google was short. When Abelson returned to MIT the following summer, he essentially took the platform with him. At the University of California at Berkeley, researchers are offering their own port of Scratch, known as Snap. With the Google name behind it, Blockly has already sparked at least a temporary flurry of interest. At Hacker News – the online hangout for Silicon Valley developers – a post about the platform has received over 100 comments over the past day, and some include programs built with the platform. From Google's site, you can translate Blockly applications into existing languages, including JavaScript; Dart, Google's new take on JavaScript; and Python. And there's a "Hebrew and Arabic" programming mode where you piece together the objects from right to left, rather than left to right. So, Google Blockly is a little like a Pixar movie. It's for kids. But it's also for adults. And it has a nice sense of humor. Update: This story has been updated with comment from Google's Neil Fraser.
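The block-to-code idea is easy to picture in miniature. The sketch below is not Blockly's actual API; it is a hypothetical toy in the same spirit: each "block" is a small data object, and a tree walk emits readable Python source, much as Blockly's generators emit JavaScript, Dart, or Python from the graphical blocks.

```python
def generate(block):
    """Turn a tiny block tree into readable Python source (toy example)."""
    kind = block["type"]
    if kind == "number":                 # literal-value block
        return str(block["value"])
    if kind == "set":                    # "set variable to" block
        return f'{block["var"]} = {generate(block["value"])}'
    if kind == "change":                 # "change variable by" block
        return f'{block["var"]} = {block["var"]} + {generate(block["delta"])}'
    if kind == "repeat":                 # loop block holding nested blocks
        body = "\n".join("    " + generate(b) for b in block["body"])
        return f'for _ in range({generate(block["times"])}):\n{body}'
    raise ValueError(f"unknown block type: {kind}")

# A "program" built by snapping three blocks together:
program = [
    {"type": "set", "var": "count", "value": {"type": "number", "value": 0}},
    {"type": "repeat",
     "times": {"type": "number", "value": 3},
     "body": [{"type": "change", "var": "count",
               "delta": {"type": "number", "value": 2}}]},
]
code = "\n".join(generate(b) for b in program)
```

The emitted text is ordinary, readable code, which is the point Fraser makes: a user can keep editing the generated source by hand after leaving the blocks behind.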
78,126,762
Chicago, IL (February 7, 2019) – NBC Sports Chicago - THE Home of the #AuthenticFan - is ready to start up the highly anticipated 2019 MLB season as it gets set to deliver South Side and North Side baseball fans its most extensive cross-platform White Sox & Cubs Cactus League Spring Training coverage to date, featuring special remote editions of SportsTalk Live presented by The Chevy Silverado, along with in-depth reports and video offerings at NBCSportsChicago.com and via the new MyTeams by NBC Sports app and much more. (NOTE: Fans located anywhere in the U.S. can download MyTeams for free on iOS and Android devices in the Apple App Store and Google Play Store.) Note the following highlights beginning next Monday, February 11.

But unlike HBO and "Game of Thrones," WGN and the Cubs aren't hyping their final season together after 72 years. In my first year as Cubs beat writer in 1997, WGN televised 144 games. The following January, the Cubs owners — Tribune Co. — decided to reduce the WGN schedule to 92 games, placing 62 games on CLTV, the company's local cable news channel. Fans gradually grew accustomed to watching more games on cable, and in 2003 the Cubs, White Sox, Bulls and Blackhawks began their own sports channel, Comcast SportsNet Chicago, which is now NBC Sports Chicago.
78,126,770
French doctor speaks out in pollution controversy
Domhnall MacAuley is a CMAJ Associate Editor and professor of primary care in Northern Ireland, UK
CMAJ and other medical journals have called for medical leadership on climate change and related issues. Kirsten Patrick's recent editorial addresses physicians' responsibility as agents for change. But what happens if you do speak out? Dr. Frédéric Champly, chief of the emergency department in Sallanches hospital, which serves the Chamonix valley and the local region of the French Alps, recently drew attention to pollution by particles and mono-nitrogen oxides in the Arve valley. Although we may be concerned yet unsurprised at pollution in industrial cities, this river valley in a beautiful alpine landscape is the gateway to major winter sports centres, a climbing and walking paradise, and is overlooked by Mont Blanc. It also contains the Autoroute Blanche (A40) leading to the Mont Blanc tunnel. Medical colleagues locally had already raised their concerns. When Dr. Champly was interviewed on television, he described the effect of this pollution on patients attending his hospital, his concern for those exercising in this environment, his worries about the local children and, on a personal note, that he kept his children away from school at times when the pollution level was highest. He was then criticized publicly by a local politician, Jean-Marc Peillex, mayor of the commune of St Gervais, who did not feel there should be public debate and who asked the Regional Health Agency to sanction Dr. Champly. In response, 220 of Dr. Champly's medical colleagues in Haute-Savoie signed a letter of support and sent it to the French President, outlining the severity of the problem, their indignation at the reaction of the authorities and the ineffectiveness of any measures already taken.
To learn more about how this story has evolved, take a look at the Facebook page created in his support, which contains some remarkable photographs, maps and diagrams indicating the level of pollution, in addition to television interviews and news stories from the local newspaper. The page also contains a copy of his colleagues’ letter of support with additional online signatories. The evolution of this story shows just what can happen if doctors take a stand, together with the remarkable power of social media and its potential to facilitate change. Pollution is, of course, a complex and politically sensitive issue; in this case, it has implications for tourism, local industry, transport, individual behaviour and the local tradition of wood-burning fires. No wonder it is such a difficult issue for local politicians. However, pollution continues to be a major problem in France; just this week, the entire northern part of the country was covered by a cloud of pollution. Evening news bulletins showed pollution maps as an integral part of their weather forecasts. Paris was shrouded in smog, and some regions introduced severe speed restrictions. As has happened in the past, Paris introduced “circulation alternée” on Monday, in which traffic is restricted by number plates. Only vehicles with odd-numbered plates were allowed to drive, and public transit was free. France gasped this week, but the story from Haute Savoie shows how doctors can raise awareness. Perhaps we need more Dr. Champlys.
78,126,773
The use of electronic devices is very pervasive in today's society, as electronic devices are used in many different locations. In fact, the same electronic device, such as a mobile device like a cell phone, smart phone or laptop, can be used in many different locations such as the office, at home or in a vehicle. However, these electronic devices are used differently in these different locations. For example, a mobile device user may use certain applications on the device, such as a clock application while at home, and use a text-to-speech application while travelling in a vehicle. Accordingly, it would be useful if electronic devices could determine their environment and automatically execute certain applications based on user preferences.
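As a hypothetical sketch of what the passage proposes (none of these environment names, signal inputs, or application names come from the patent itself), a device could map an inferred environment onto a user-configured preference table of applications:

```python
# User preferences: which applications to auto-start in which environment.
# The environment labels, signals, and app names are illustrative assumptions.
PREFERENCES = {
    "home": ["clock"],
    "vehicle": ["text_to_speech"],
    "office": ["email"],
}

def detect_environment(wifi_ssid=None, moving=False):
    """Crude environment inference from a couple of assumed signals."""
    if moving:                       # e.g. accelerometer/GPS suggests driving
        return "vehicle"
    if wifi_ssid == "HomeNet":       # hypothetical home network name
        return "home"
    return "office"                  # fallback guess

def apps_to_launch(wifi_ssid=None, moving=False):
    """Applications the device would execute in the detected environment."""
    return PREFERENCES.get(detect_environment(wifi_ssid, moving), [])
```

A real implementation would of course draw on richer signals (paired Bluetooth devices, location, time of day) and let the user edit the preference table, but the launch decision reduces to the same lookup.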
Unfortunately, the touch panel has experienced a failure. You can try disconnecting the power for about 5 minutes to see if it resets, but I doubt it will. The touch pad will likely need to be replaced. I will need the rest of the model number to give you the correct part information. Related Questions: You have activated the delayed start function. Please read the following instruction from the user manual. 8. Delayed Start button. Allows the start of the programme to be delayed by a minimum of 1 hour, up to 9 hours. How to set the delayed start: When the button is pressed, the number appears in the display (4). Press the button again to select the number of hours by which you wish to delay the programme start, up to a maximum of 9. E.g.: if you have selected number 2, the wash programme will start in 2 hours' time. To cancel the delayed start set, press the button until the number appears. Close the dishwasher door to start the countdown: a long audible signal (about one second) informs you that the function is active. If you open the door again you will see, in the display (4), how long there is to go (in hours) before the programme starts. A flashing dot in the digital display means that the delayed start function has been activated.
Opening the door has no effect on a countdown which has already been started. If, during the countdown, you wish to see the programme number selected in the display, press the Programme Selector button (5). If, after having started the countdown by closing the dishwasher door, you wish to cancel the delayed start, press button (9) (Cancel) for about 2 seconds until the short «instruction accepted» signal is heard. This is a countdown-type timer, for precise cooking times, etc. Press the hour and minute buttons to set the time. Press the start button to begin the countdown. Press the hour and minute buttons at the same time to reset the timer. The timer beeps at 5 and 10 minute intervals during the countdown. This timer does not connect to any electrical device, has a magnet on the back, and runs on an internal battery. Setting the Time: Pull out the crown to position 3. Rotate the crown to set the exact time. Ensure that you are in the daytime 12-hour segment by allowing the hands to pass 12 o'clock and noticing whether the date moved on or not. If the date changed, then you will need to advance at least 2 hours before moving on to set the date. If you do not need to change the date, return the crown to position 1. Setting the Date: DO NOT DO THIS IF THE WATCH IS BETWEEN 8pm & 2am. Pull the crown to position 2. Rotate the crown clockwise to set the correct date. Return the crown to position 1. Setting the Alarm [12 hour]: Pull the crown to position 2. Rotate the crown anticlockwise to move the alarm hand to the desired time. Return the crown to position 1. To activate the alarm, press the 8 o'clock button. You will hear 2 beeps. To deactivate the alarm, press the 8 o'clock button. You will hear 1 beep. NB: If the alarm is not cancelled while it sounds the first time, it will repeat in 2 minutes' time. The alarm sounds for 15 seconds. If the alarm time coincides with 1 beep of the countdown, the alarm function will be cancelled.
Battery EOL [End Of Life]: The Swiss ISA 8270 movement uses a 395 cell and will signify a low power state by advancing the small seconds hand 4 steps every 4 seconds. Setting the Countdown: You can specify a countdown of between 1 and 10 minutes. Press the 4 o'clock button to enter setting mode. You will hear 1 beep. Press the 4 o'clock button to move the central timing hand successively to the desired countdown time. Wait 3 seconds for the watch to automatically exit the setting mode. You can change the duration of the countdown [except in the last minute] at any time. E.g. if the countdown is set to 10 minutes and you now want to change the countdown to 9 minutes, press the 4 o'clock button. The hours/seconds counter sub-dial will perform one complete turn and then stop at 60, before continuing. Activating the Countdown: Press the 2 o'clock button to start the countdown. You will hear 2 beeps. The hours/seconds counter sub-dial will perform one complete turn and then stop at 60, before continuing. When the countdown begins, the hour/seconds counter sub-dial indicates the seconds with 1 step every 2 seconds, and the central timing hand indicates the minutes with 1 step every 2.4 seconds. Countdown tone: There are 2 beeps every minute. In the last minute: 50 seconds = 1 beep. 40 seconds = 1 beep. 30 seconds = 2 beeps. 20 seconds = 2 beeps. 10 seconds and less = 1 beep every second, with zero being a 1-second-long beep. Timer: After the countdown has finished, the timer starts automatically. The central timing hand indicates the minutes with 1 step every 12 seconds. The hour/minutes counter sub-dial indicates the hours up to a maximum of 96. Press the 4 o'clock button to see the timer seconds [display lasts up to a minute]. Press the 4 o'clock button to see the timer hours. The timer has a maximum time of 96 hours [4 days], after which the hour/seconds sub-dial counter will stay at 0 and the central timing hand will stay at 12.
Start/Stop Timer: Press the 2 o'clock button to stop the timer. You will hear 1 beep. Press the 2 o'clock button to resume the timer. You will hear 2 beeps. Resetting the Countdown & Timer: Countdown reset: Press the 2 o'clock button to stop the countdown. You will hear 1 beep. Press the 4 o'clock button to reset. Timer reset: Press the 2 o'clock button to stop the timer. The hour/seconds counter sub-dial will show the elapsed hours and the central timing hand the elapsed minutes. Press the 4 o'clock button to view the seconds elapsed on the hour/seconds counter sub-dial. Press the 4 o'clock button to reset. Resetting the Hands: If the hands do not zero correctly, which might occur after a battery change or a violent shock, you can make adjustments with the following settings as long as the countdown is not in operation. To correct the hour/seconds counter sub-dial, pull the crown out to position 2 and wait 1 second. Press either the 2 or 4 o'clock button; the central timing hand will turn and stop [not necessarily at 12]. Press the 2 o'clock button to increment the hour/seconds counter sub-dial hand to position it at 60. To correct the central timing hand, press the 4 o'clock button to increment the hand to the 12 position. Return the crown to position 1. If the battery has been replaced, the central timing hand defaults to the 10-minute position. If the battery has not been replaced, the central timing hand will return to the previously set position. SHU 53 MODELS: To start the test program, press and hold both the SCRUB WASH and DELICATE/ECONO program buttons, then turn the unit on by pressing the ON/OFF button. Initially, the control module version number will be displayed (e.g. "21" = version 1 with jumper). When the wash program buttons are released, the lights above them will flash. To end the test program, press the ON/OFF button. To check each program indicator light, press its button.
To check the Cycle Countdown display, Refill Rinse Agent light and REGULAR WASH light, press the REGULAR WASH button (press the Delay Start button to test the Cycle Countdown display alone). The Cycle Countdown display will show "88" when the REGULAR WASH button is pressed and "8h" when the Delay Start button is pressed. To start testing, press both the SCRUB WASH and DELICATE/ECONO buttons a second time. When testing has ended, the Cycle Countdown display will show a fault using the codes below. If more than one fault occurs, the code numbers will be added together (e.g. "6" = faults 2 + 4).
0 = No faults detected
1 = Aqua Sensor ("Sensotronic") fault
2 = Heating fault
4 = Filling fault
8 = NTC (temperature sensor) fault
F = Filling fault (display occurs during wash, not test)
To check for heater, high limit or flow switch problems, start testing and wait until heating starts. Using a meter with a current coil, measure the current going into the dishwasher. If the current is ~11 A, then the heater, high limit and flow switch are OK. If the current stays at ~1.5-2 A, then the heater, high limit or flow switch is faulty. If the water level switch (f1) has failed (opened), the display will show fault code "4" and the unit will continually fill and drain, so testing won't be completed. If the flow switch (e5) has failed (opened), the display will show "0", the water won't heat (to 150°F) and the water won't stop circulating. If the NTC probe (f2/f4) has failed (opened), the display will show fault code "8" immediately and testing will end shortly after the water has started circulating.
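Because the fault codes add together, the displayed value behaves like a bitmask and can be split back into individual faults. A small illustrative decoder follows; the code table comes from the service notes above, but the function itself is hypothetical, not part of any official Bosch tool:

```python
# Decode an SHU 53 test-program fault display value into individual faults.
# Additive codes mean the display value is effectively a bitmask
# (e.g. a displayed "6" = heating fault (2) + filling fault (4)).
FAULT_CODES = {
    1: "Aqua Sensor (Sensotronic) fault",
    2: "Heating fault",
    4: "Filling fault",
    8: "NTC (temperature sensor) fault",
}

def decode_faults(display_value):
    """Return the list of fault descriptions encoded in the display value."""
    if display_value == 0:
        return ["No faults detected"]
    # Keep every fault whose bit is set in the displayed value.
    return [desc for bit, desc in FAULT_CODES.items() if display_value & bit]

print(decode_faults(6))  # ['Heating fault', 'Filling fault']
```

The "F" code is omitted since it appears only during a wash, not in the test program's additive display.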
/* LAPACK ZTGEXC (v3p_netlib wrapper): reorders the generalized Schur
   decomposition of a complex matrix pair (A, B), moving the eigenvalue
   with row index IFST to row ILST, optionally updating the unitary
   transformation matrices Q and Z. */
extern int v3p_netlib_ztgexc_(
  v3p_netlib_logical *wantq,     /* update Q if true */
  v3p_netlib_logical *wantz,     /* update Z if true */
  v3p_netlib_integer *n,         /* order of matrices A and B */
  v3p_netlib_doublecomplex *a,   /* upper triangular factor A (in/out) */
  v3p_netlib_integer *lda,       /* leading dimension of A */
  v3p_netlib_doublecomplex *b,   /* upper triangular factor B (in/out) */
  v3p_netlib_integer *ldb,       /* leading dimension of B */
  v3p_netlib_doublecomplex *q,   /* left transformation matrix Q */
  v3p_netlib_integer *ldq,       /* leading dimension of Q */
  v3p_netlib_doublecomplex *z__, /* right transformation matrix Z */
  v3p_netlib_integer *ldz,       /* leading dimension of Z */
  v3p_netlib_integer *ifst,      /* source index of the eigenvalue to move */
  v3p_netlib_integer *ilst,      /* destination index (in/out) */
  v3p_netlib_integer *info       /* 0 on success, nonzero on error */
  );
Posted by Iphonne King to iPhone Data Recovery. Newly upgraded iOS data recovery software - EaseUS MobiSaver 6. Newly added features support restoring lost iOS data under more complex circumstances. With all these newly added features, more lost data can be restored by EaseUS MobiSaver 6. With this newly updated iOS data recovery software, you can directly restore all iCloud backups under iOS 9 and later versions, selectively restore iTunes backup data and get more photos restored accurately with high quality. So how to get EaseUS MobiSaver 6? You can directly click the download button below to download and install it on your PC or Mac computer. What about the serial key? If you are trying to use a totally free but cracked iOS data recovery software to restore lost data from iPhone, iPad or iPod touch, stop. Besides all the disadvantages of cracked software, you may also not get tech support services when you have iOS data recovery problems with an EaseUS MobiSaver crack, and there is no refund policy to protect your rights. If you are looking for an easy iOS data recovery method or solution, let EaseUS MobiSaver 6 do the job.
An incomplete version of Android 4.3 has been leaked for the Motorola Moto X. While it has been posted on the XDA Developers Forum, this version of the firmware should not be loaded on your Moto X unless you really, really know what you're doing. Unlocking the bootloader on the phone, even with help from Motorola, can lead to the voiding of your warranty and the bricking of your Moto X. As the first Motorola release that Google had a lot of influence over, it is expected that the Motorola Moto X will be closer to the top of the list of released Android models receiving an update to Android 4.3. According to those who have installed the new firmware on their Moto X, the update brings improvements to the camera. Other new features of the update are far from final and are full of bugs. The Motorola Moto X launched on August 23rd with Android 4.2.2 installed. Source: XDA via IntoMobile
Southampton Politician Del Singh Killed in Kabul Suicide Blast. Del Singh was a Labour Party candidate for the European Parliament. (Vimeo) One of the two Britons who died in the suicide bombing in Kabul has been named as Del Singh, a Labour candidate for Southampton in the forthcoming European elections. Ed Miliband, the Labour party leader, confirmed that Singh was among those killed and said he was "appalled and shocked by this barbarous act of terror". "My thoughts - and the thoughts of the whole Labour Party - are with the family and friends of Del Singh, who was killed in yesterday's tragic suicide bomb in Kabul," Miliband continued. "People everywhere will be appalled and shocked by this barbarous act of terror deliberately targeting members of the international community living and working in Kabul in the service of the Afghan people. "Del spent over 10 years carrying out vital work on development projects in Afghanistan, Kosovo, Sudan, Sierra Leone and other countries. "He dedicated his life to working with people across the world who needed his support." The other Briton was named as Simon Chase from County Derry, who was the bodyguard of a Danish female police officer who was also killed. A spokeswoman said she did not believe there were any other British injuries. Two Britons and two Americans were among at least 21 people killed when a suicide bomber and gunmen attacked one of Kabul's most popular restaurants. The Taverna du Liban was a favourite dinner destination for aid workers, diplomats, security contractors and journalists. Afghan security forces arrive at the scene of an explosion in Kabul. (Reuters) Wabel Abdallah, the Lebanese head of the International Monetary Fund (IMF) office in Afghanistan, and three other staff of the UN were also killed in the attack on Friday, along with the Lebanese restaurant owner, several Afghans and two Canadians.
The Taliban claimed responsibility for the attack, saying it was in reprisal for an Afghan military operation earlier in the week against insurgents in eastern Parwan province, which they said killed many civilians. "The target of the attack was a restaurant frequented by high-ranking foreigners," Taliban spokesman Zabihullah Mujahid said in an emailed statement. He said the attack targeted a place "where the invaders used to dine with booze and liquor in the plenty."
What's It All About: Urban Living – redux. My husband grew up in Washington DC, so whenever I mentioned that I’d like to move into the city, he let me know that I had no idea what that meant. So the compromise was to live in suburbia, but close to the city. Like most spouses, I love opportunities to hear my spouse say the three golden words “you were right”, but it is now my turn to utter those words. I finally got my wish to live the urban lifestyle, and now I do know what that means. On the positive side, it means that I can walk one block to the grocery store and numerous shops and restaurants. It means there is free live music on the waterfront just 2 blocks from our house, the added bonus of fireworks over the water at least 3 days a week, and I regularly see ginormous ships pass by that are so big as to block out half the view of the city. Plus, the Navy SEALs do their water and air exercises right outside of my window. On the not so positive side, it means that along with the fireworks comes the sound of car alarms set off by the fireworks. The free music and restaurants draw large crowds of tourists, so the restaurants frequented by the “locals” are now overrun with tourists. It also means that on days when I can work at home, I get a close-up and personal experience with every delivery and trash truck in the area as they beep beep beep during their backups. I just love the people driving by who like to share their music with the world; plus, I get to listen to the live music at the Ferry and from across the Bay whether I want to or not, unless I close my windows and doors. Then there are the Navy SEALs: as cool as they are to look at, their boats and planes make a lot of noise. So... to my husband, “you were right”. But even with all the noise and confusion, I’m loving urban living. The picture below is the view I get every waking moment that I am home... not so shabby, huh 🙂
964 A.2d 2 (2009) COM. v. SILVERSTEIN. No. 427 EAL (2008). Supreme Court of Pennsylvania. January 16, 2009. Disposition of petition for allowance of appeal. Denied.
Calm down. The operative word is it has been put on hold. These are not normal times. No one saw this coming or we could have planned better. Give us a few days to figure everything out. We just made the decision today to preserve cash while we were waiting to see what was going to happen. Thanks for your cooperation which I desperately need right now. -----Original Message----- From: Corman, Shelley Sent: Wednesday, November 07, 2001 4:07 PM To: Hayslett, Rod Cc: Horton, Stanley; McCarty, Danny Subject: Floor Space Announcement I was very disappointed to not have any advance warning about the announcement that construction on 41 has been put on hold for financial reasons. The Gas Logistics team was "relocated" off the 41st floor on October 5. Prior to the move, we had a very efficient, open configuration that let us work together as a unified team. After the move, we are located on 3 different floors. Besides the general impact on team unity, this move also means that our evening staff work alone on separate floors, rather than in a group. It means fewer individuals to cover customer calls. It means having more equipment to cover the day to day business. I have worked hard to calm the team that this arrangement is just temporary and that we will eventually be put back together as a team. Prior to our move I constantly questioned whether our move was necessary. I asked about smaller, more practical changes to fit more people on the floor. And, I specifically asked about whether Enron issues might impact the group's desire to continue to spend $7-$10 million -- even as I reluctantly packed up my boxes. I hope you can appreciate that the current seating arrangement is a serious issue for my team and one that generates constant questions at my staff meetings. Finding out that we moved off of 41 for no reason at an all-employee floor meeting undermines my team's faith in my leadership.
Determination of deletion sizes in the MHC-linked complement C4 and steroid 21-hydroxylase genes by pulsed-field gel electrophoresis. In man, the genes encoding the complement component C4 (C4A, C4B) of the immune system and the steroid 21-hydroxylase enzyme (CYP21A, CYP21B) of adrenal steroid biosynthesis are located in the major histocompatibility complex (MHC). Frequent gene deletions and duplications have been described in the C4 and CYP21 genes, particularly in patients with autoimmune diseases and congenital adrenal hyperplasia. Here we report the determination of deletion sizes in 11 chromosomes with six different deletions. The deletions spanned the C4A+CYP21A, C4B+CYP21A, and C4B+CYP21B gene pairs as determined by standard Southern blot analysis. The deletion size fell within the range of 30-38 kb in all the chromosomes, as determined by pulsed-field gel electrophoresis. Because the deletion sizes in most other gene clusters are more heterogeneous, the results suggest the involvement of a specific mechanism in the generation of C4+CYP21 deletions.
Stable housing for vet leads to second chance Steve Elliott has a good job, good prospects and a good home. He commutes to his job at a gas and petroleum distribution company in Cleveland from the small, lakeside community of Vermillion, Ohio, where he lives. He calls his car a "beater," but it works for him. Steve likes his neighbors, likes to grill, and likes his quiet community. It may sound simple and may not be glamorous, but it’s a good life for Steve. "I’m content with going to work. I live in a nice neighborhood. I find serenity in that," Steve said. But Steve’s life now is much different than what it once was. In a relatively short span of time, Steve had gone from making $70,000-plus per year to facing an eviction. He met with complicated legal issues, moved from one state to another, went through the VA medical system, struggled to find work, and faced eviction and the threat of homelessness before seeking help. Through the Great Lakes Community Action Partnership (formerly WSOS Community Action Commission) Supportive Services for Veteran Families (SSVF) program, Steve gained the support he needed to avoid homelessness, continue working and achieve stability. For many years, Steve had worked as a general manager of a commercial truck stop in Virginia. He made good money, enjoyed the beach and enjoyed the night life. Steve had previously served in the U.S. Navy, beginning his service in 1985. He worked as a technician, traveling to Florida, Mississippi, the Great Lakes, Connecticut and San Diego during his naval career. "I had good experiences and bad experiences," Steve said. "One of the good things was that I got to travel." After his first four years, Steve re-upped and was preparing for another six years of service. Unfortunately, a knee injury cut Steve’s time short. He exited the Navy in 1990 with a medical discharge and began working in Virginia. 
His career as a manager was going well, until a situation arose in which a client passed more than $18,000 worth of bad checks to Steve. The insufficient funds caused by the client in turn caused Steve to bounce checks to different vendors. He attempted to reconcile with vendors, but the matter soon became a legal issue. Although Steve won a civil case, he lost a criminal case and had to serve a year of jail time. After his sentence ended, Steve moved to Ohio for a six-week treatment program at the Cleveland VA Medical Center. He was able to find low-income housing, but had difficulty finding work due to his prior sentence. Even after explaining the mitigating circumstances of his sentence, many employers would not or could not hire Steve. "I had received four job offers, but the background check would always come back and the offer would be withdrawn," Steve said. "It was deflating to interview four times before being able to connect with an employer." Steve continued his search and eventually had success, landing a part-time job in data entry. Unfortunately, although Steve had gained a job, he soon faced another crisis. The housing management company that owned Steve’s apartment issued Steve a notice of eviction. Although the company would later withdraw its case after realizing that it had issued the eviction in error, the blemish on Steve’s record was still there, and made what was already a difficult situation even worse. Without many resources, Steve had to find a place to live, and fast. "That’s when I reached out to Irene." Irene Miller, a long-time family advocate for the Supportive Services for Veteran Families (SSVF) program, has built a reputation among her clients for being trustworthy and dependable. Irene said she approaches every person with kindness and without judgment in order to build trust and help her clients be successful, which is especially important when working with veterans. "Veterans are a rare breed," Irene said.
"They have such pride and a lot of times would rather do without everything — even basic needs — before asking for a handout." However, SSVF is not a handout, Irene explained. The program helps connect veterans who are homeless or at risk of becoming homeless with housing and basic needs, allowing them to gain stability and enact life goals. "I always tell them it’s a hand up, and not a handout," Irene said. Irene helped Steve locate a new apartment in Vermillion, secure a few months’ rent and some household supplies, and connect with other support through the local veterans’ services office. "The first meeting went really well. I had a sense of ease with her to be open about my needs and wants," Steve said. "I went in humbled. She took away the feeling of being embarrassed and asking for help, and hooked me up with the resources I needed." With a stable residence in a nice neighborhood with no fear of eviction, Steve could better concentrate on his work and becoming successful at his career. His situation illustrates the reason as to why SSVF places emphasis on providing stable housing as the first step from which clients can set goals and grow. Without a stable place to live, Steve said he would have had to focus on meeting basic needs before anything else. "I don’t know if I would have been successful. I might have had to work two or three part-time jobs to survive," Steve said. "[SSVF] allowed me to focus on a more professional job, which panned out." His ability to focus has paid off. Steve’s part-time job will soon become full-time, with a promotion. While Steve worked hard to be where he is, he says the help from SSVF gave him a boost to be able to keep going. "I don’t know if I would be where I am today without these organizations and Irene taking the pressure off. It made an incredible difference," Steve said. "I can lay my head on the pillow at night. The rent is paid. The refrigerator is full. I can focus on work," he said. 
"It’s a change of lifestyle, but I’m more self-content now."
As Google grows in size, so does its political influence. And, while this may not overly worry most people, there is one man who is keeping a close watch on the search engine firm. He is Consumer Watchdog’s John Simpson, and one gets the feeling that he revels in his role of giant-killer. Simpson thinks that Google enjoys far too much power in Washington–not least in the White House, where Deputy CTO Andrew McLaughlin is seen as being too close to his former employers. And last week, Inside Google, an offshoot of CW, broke what Simpson says is, “one of the biggest wire-tapping scandals in U.S. history.” He’s talking about Google’s data-collecting methods, which may have been harvesting information that could be harmful to national security. On Tuesday, FastCompany put some questions to Simpson about Google’s growing influence in D.C., and these are his replies. Why do you think there is so much panic about Google’s influence in the White House? “I don’t see panic. I see legitimate concern about the amount of influence Google now wields in Washington. Google has gone from having no presence a few years ago to becoming one of the biggest players. Last year, Google spent $4.03 million lobbying. Spending on lobbying increased 57% in the first quarter of 2010 over the first quarter of 2009.” According to a Google spokesperson, McLaughlin was never a lobbyist. Do you dispute Google’s version of events? “McLaughlin was ‘Director of Global Public Policy.’ I understand that to mean he was attempting to shape regulations and laws to benefit Google around the world. In simple language he’s a lobbyist. Google’s Political Action Committee, Google Inc. NetPAC–which contributes to political candidates–listed McLaughlin as assistant treasurer and designated agent in a March 16, 2009 filing. 
He was registered as a federal lobbyist in 2007, though Google now claims that was a ‘mistake.'” Which side of the Google-White House partnership do you fear the most: the government, with its attitude to grabbing data, or Google’s data-gathering antics? “I distrust them equally.”
Passionate about IP! Since June 2003 the IPKat weblog has covered copyright, patent, trade mark, info-tech and privacy/confidentiality issues from a mainly UK and European perspective. The team is David Brophy, Birgit Clark, Merpel, Jeremy Phillips, Eleonora Rosati, Darren Smyth, Annsley Merelle Ward and Neil J. Wilkof. You're welcome to read, post comments and participate in our community. You can email the Kats here For the half-year to 30 June 2015, the IPKat's regular team is supplemented by contributions from guest bloggers Suleman Ali, Tom Ohta and Valentina Torelli. Regular round-ups of the previous week's blogposts are kindly compiled by Alberto Bellan. Wednesday, 25 May 2005 This, from the BBC: the Association of American University Presses (AAUP), which represents 125 non-profit-making academic book and journal publishers, has accused Google of infringing copyright if it puts university libraries online. This will, the AAUP claims, have financially troubling consequences and could undermine sales of works in which publishers own the rights. Last December Google announced deals with four leading universities, Oxford, Harvard, Stanford and Michigan. At a cost of $200m (£110m) Google aims to put 15 million volumes online from four top US libraries by 2015. It will also scan in out-of-copyright books from the UK's Oxford University. The idea is to make millions of important but previously inaccessible texts available to researchers everywhere. The AAUP seeks clarification of 16 questions and claims the book-scanning scheme "appears to involve systematic infringement of copyright on a massive scale". Its members, who depend on book sales and other licensing agreements for the majority of their revenue, are worried that if users can get the information they want from its books by searching them online, they won't buy them. 
Further opposition has come from France, where there are fears that the Google project will enhance the dominance of the English language and of Anglo-Saxon ways of thinking. France and several other European countries recently got European Union backing for a separate book-scanning project for works not in English. The IPKat is delighted at Google's plan, having suffered enough at the hands of several publishers and librarians over the years. Merpel's pleased too: she recalls how the introduction of printing threw 30,000 scribes out of work in Venice alone, and remembers the squeals of outrage from English publishers when copyright was introduced in 1710 in the Act of Anne. No-one has a divine right to make money out of books that are out of copyright anyway.
City chooses Toll, Starwood to develop hotel and residence at Brooklyn Bridge Park. From left: Robert Toll, Barry Sternlicht and renderings of the development (credit: Mayor's office). Recent press speculation and predictions have been confirmed: Mayor Michael Bloomberg’s office today announced the selection of Toll Brothers and Starwood Capital Group to develop a 550,000-square-foot complex at Pier 1 in Brooklyn Bridge Park, which will include a 200-room luxury hotel and 159 residential units. The hotel and residential complex will rise 10 stories, and there will be a separate five-story residential building, a city press release says. The team expects to break ground on the site by next summer — with a projected completion date of fall 2015. The hotel will operate under the name 1 Hotel, and its proposals include about 16,000 square feet of restaurant space, another 16,000 square feet of banquet and meeting space, 2,000 square feet of retail, a 6,000-square-foot fitness center and spa, as well as 300 parking spaces, the release said. The joint Toll Brothers and Starwood venture will enter into two 97-year ground leases on the site. When complete, the team’s development will yield a projected total of $3.3 million annually from owners of residential units and the hotel operator. The development is expected to create roughly 210 permanent jobs and 300 construction jobs. As The Real Deal recently reported, the New York City construction industry employed fewer people in the first quarter of this year than it has at any point over the past 13 years.
“Not only will this project create a new hotel and residential complex along Brooklyn’s emerging waterfront,” the city’s Economic Development Corporation president, Seth Pinsky, said, “it will also provide a large portion of the necessary funding for the maintenance and operations for what has already become one of the city’s most desirable open spaces.” Rogers Marvel Architects is in charge of the hotel and residential building’s design, and the green development hopes to nab LEED Silver Certification, the release said. — Zachary Kussin
--- abstract: 'Evidence for the existence of low energy cosmic rays in the Galaxy comes from the COMPTEL observations of gamma ray line emission from Orion, and also from light element abundance data which seem to suggest a low energy rather than a relativistic Galactic cosmic ray origin for most of the light elements. The Orion and light isotope data are more consistent with a composition that is depleted in protons and $\alpha$ particles than with one which is similar to that of the Galactic cosmic rays. This low energy cosmic ray phenomenon appears to be highly localized in space and time in the Galaxy, and is probably associated with star formation regions similar to Orion.' author: - Reuven Ramaty date: 'Received ; accepted ' title: Interstellar Gamma Ray Lines from Low Energy Cosmic Ray Interactions --- Introduction ============ The interactions of accelerated particles with ambient matter produce a variety of gamma ray lines following the deexcitation of excited nuclei in both the ambient matter and the accelerated particles. Apart from solar flares, nuclear deexcitation lines following accelerated particle interactions have so far only been observed from the Orion molecular cloud complex (Bloemen et al. 1994). The Orion observations, carried out with the COMPTEL instrument on the Compton Gamma Ray Observatory (CGRO) revealed emission features at 4.44 and 6.13 MeV, due to deexcitations in $^{12}$C and $^{16}$O. Since there are no significant long lived radioactive isotopes of nucleosynthetic origin that decay into the excited states of these nuclei, the observed lines must be produced contemporaneously by large fluxes of accelerated particles interacting in Orion. Gamma ray emission in the energy range from about 30 MeV to 10 GeV was also observed from Orion with the EGRET instrument on CGRO (Digel, Hunter, & Mukherjee 1995). 
However, the fact that these EGRET data do not require that the cosmic rays in Orion be enhanced relative to the relativistic Galactic cosmic rays observed near Earth implies that the particles which produce the line emission in Orion are mostly low energy cosmic rays confined to energies below the effective pion production threshold. The discovery of gamma ray line emission from Orion and the implied existence of large fluxes of low energy cosmic rays, not only in Orion but probably also elsewhere in the Galaxy, has led to renewed discussion of the origin of the light elements. It has been known for over two decades that cosmic ray spallation is important to the origin of $^6$Li, $^7$Li, $^9$Be, $^{10}$B, and $^{11}$B (Reeves, Fowler, & Hoyle 1970). It was shown that the relativistic Galactic cosmic rays (GCR) interacting with interstellar matter prior to the formation of the solar system may have produced the observed solar system abundances of $^6$Li, $^9$Be and $^{10}$B (Meneguzzi, Audouze & Reeves 1971; Mitler 1972). These cosmic rays, however, cannot account for the abundances of $^7$Li and $^{11}$B. It is believed that about 10% of the $^7$Li is produced in the big bang and that most of the remaining production is due to nucleosynthesis in stars (e.g. Reeves 1994). Recent measurements of the boron isotopic ratio, $^{11}$B/$^{10}$B, in meteorites yielded values in the range 3.84–4.25 (Chaussidon & Robert 1995), which exceed the calculated GCR value by a factor of about 1.5. The implications of the Orion gamma ray observations for the origin of the light isotopes have been considered by Cassé, Lehoucq & Vangioni-Flam (1995) and Ramaty, Kozlovsky & Lingenfelter (1996). Unlike the GCR, low energy cosmic rays, similar to those which produce gamma ray line emission in Orion, could account for the meteoritic B isotopic ratio. It is in fact possible that the bulk of the light isotopes, except $^7$Li, is produced by low energy cosmic rays in the Galaxy. 
Although $^{11}$B could have been produced by neutrino spallation of $^{12}$C in supernovae (Woosley et al. 1990), recent B and Be observations in stars of various metallicities (Duncan 1995) support a low energy cosmic ray origin for $^{11}$B since neutrino spallation is not expected to produce much Be. The present paper is in large part a review based on a series of previous articles dealing with the gamma ray line emission from Orion and related subjects (Ramaty 1995; Ramaty, Kozlovsky, & Lingenfelter 1995a,b;1996). After a general discussion, we consider the implications of the gamma ray line observations on the composition, energy spectrum, energy deposition and energy density of the low energy cosmic rays in Orion. We also consider the ionization produced by these cosmic rays, and briefly review the possible origins of the accelerated particles. We then proceed to calculate the expected gamma ray line emission produced by low energy cosmic rays in the Galaxy. We base this calculation on the close relationship between the gamma ray line and light isotope production. General Considerations ====================== The gamma ray lines produced by accelerated particle interactions can be broad, narrow or very narrow (Ramaty, Kozlovsky, & Lingenfelter 1979). Broad lines are produced by accelerated C and heavier nuclei interacting with ambient H and He. The broadening of these lines (widths ranging from a few hundreds of keV to an MeV) is due to the motion of the accelerated heavy particles themselves. Narrow lines are produced by accelerated protons and $\alpha$ particles interacting with ambient He and heavier nuclei. The broadening in this case (widths ranging from a few tens of keV to around 100 keV) is due to the motion of the heavy targets which recoil with velocities much lower than those of the projectiles. Very narrow lines result from excited nuclei which have slowed down and stopped due to energy losses before emitting gamma rays. 
The broadening of these lines is due only to the bulk motion of the ambient medium (widths around a few keV or less for the interstellar medium). There are two distinct processes which can lead to very narrow line emission: deexcitation of heavy nuclei embedded in dust grains and excited by protons or $\alpha$ particles (Lingenfelter & Ramaty 1976), and deexcitation of excited nuclei populated by the decay of long lived radionuclei. Line emission from dust is not discussed in the present article. Dust grains, however, may play an important role in the injection and acceleration of the low energy cosmic rays which produce the gamma ray lines (see below). Long lived radionuclei produced by accelerated particle bombardment can stop in ambient gas before they decay, thereby producing excited nuclei essentially at rest. The most important such radionuclei are $^{55}$Co($\tau_{1/2}=17.5$h), $^{52}$Mn($\tau_{1/2}=5.7$d), $^{7}$Be($\tau_{1/2}=53.3$d), $^{56}$Co($\tau_{1/2}=78.8$d), $^{54}$Mn($\tau_{1/2}=312$d), $^{22}$Na($\tau_{1/2}=2.6$y), and $^{26}$Al($\tau_{1/2}=0.72$my), all of which can be produced in accelerated particle interactions, for example $^{56}$Fe(p,n)$^{56}$Co. Unlike the very narrow grain lines which are produced almost exclusively by accelerated protons and $\alpha$ particles, very narrow lines from long lived radioactivity can result from both these interactions and interactions due to accelerated heavy nuclei. In the following discussion we shall need to make assumptions on the composition of both the ambient medium and the accelerated particles, on the energy spectrum of the accelerated particles, and on the interaction model. As these have been described in detail in Ramaty et al. (1996), we only give a brief summary here. For the ambient medium we assume a solar system composition (Anders & Grevesse 1989). 
For the accelerated particles we consider six different compositions: solar system (SS), cosmic ray source (CRS), the ejecta of supernovae of 35M$_\odot$ and 60M$_\odot$ progenitors (SN35 and SN60), the winds of Wolf-Rayet stars of spectral type WC, and pick-up ions resulting from the breakup of interstellar grains (GR). The grain case is analogous to the anomalous component of the cosmic rays observed in interplanetary space (Fisk, Kozlovsky, & Ramaty 1974; Adams et al. 1991). Interstellar neutral atoms that penetrate into the solar cavity are picked up by the magnetized solar wind after being ionized by solar UV and charge exchange with solar wind protons. Because in the frame of the wind the ions acquire considerable energy during the pick up process ($m_i V^2/2$, where $V$ is the speed of the wind), they form a seed population that is much more easily accelerated than the rest of the ambient plasma. For Orion it is conceivable that the equivalent incoming matter is essentially neutral dust that is broken up by evaporation, sputtering or other processes. The assumed GR abundances are SS abundances modified by depletion factors (Sofia, Cardelli & Savage 1994). The noble gas (He, Ne, Ar) abundances are set to zero and H/O = 2, assuming that the bulk of the H and O are in ice. The acceleration of pick up ions in Orion was also considered by Ip (1995). The essential properties of these compositions are the following: SS, CRS and to some extent SN35 have large proton and $\alpha$ particle abundances relative to C and heavier nuclei. On the other hand, because of prior mass loss for SN60 and WC, and because H and He are essentially absent in dust, the SN60, WC and GR compositions are very poor in protons and $\alpha$ particles. There is no Ne in the GR composition. The WC composition is dominated by C and O and has some additional amount of $^{22}$Ne. 
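The pick-up energy $m_iV^2/2$ quoted above is easy to evaluate numerically. A minimal sketch, assuming a representative wind speed of 400 km s$^{-1}$ (an illustrative value chosen here, not one given in the text):

```python
# Pick-up ion energy in the frame of the wind: E = m V^2 / 2 per nucleon.
M_NUCLEON = 1.673e-24    # nucleon mass, g
ERG_PER_KEV = 1.602e-9   # erg per keV

v_wind = 4.0e7           # assumed wind speed, cm/s (400 km/s)
e_pickup_kev = 0.5 * M_NUCLEON * v_wind**2 / ERG_PER_KEV

# Of order 1 keV/nucleon: far above thermal energies, so pick-up ions form
# an easily accelerated seed population, yet far below the tens of
# MeV/nucleon needed for gamma ray line production.
print(f"pick-up energy ~ {e_pickup_kev:.2f} keV/nucleon")
```

This illustrates why pick-up ions are only a seed population: further acceleration by some four orders of magnitude in energy per nucleon is still required.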
All the calculations are carried out in a thick target model in which accelerated particles with given energy spectrum and composition are injected into an interaction region where they produce nuclear reactions as they slow down due to Coulomb interactions to energies below the thresholds of the various reactions. Energetically this is the most efficient way of producing the nuclear reactions; if the particles are allowed to escape at higher energies, then energy that would otherwise be available for producing nuclear reactions, is removed from the system rendering the process less efficient. Energetic efficiency is important for gamma ray line production in Orion because even under optimal conditions the deposited energy, and the accompanying ionization if the medium is neutral, are very large. A model in which the particles escape from the interaction region with negligible energy loss is referred to as a thin target model. In a thick target the energy spectrum of the interacting particles is flatter (because of the energy losses) than their source spectrum; in a thin target the interacting particle spectrum is identical to the source spectrum. A ’thin target’ situation can also arise without escape, namely when continuous particle acceleration compensates for the energy losses so that the spectrum of the interacting particles remains identical to their source spectrum. Because the Coulomb energy losses depend on the charge of the particles, these losses reduce the importance for gamma ray production of heavy nuclei relative to lighter nuclei. Thus, for the compositions poor in protons and $\alpha$ particles (i.e. SN60, WC and GR), for which practically all the gamma ray production is due to C and heavier nuclei, the ratio of the line emission from Ne, Mg, Si and Fe to that from C and O is smaller in a thick target than in a thin target. 
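The charge dependence of the Coulomb losses can be made concrete with the rough nonrelativistic scaling $dE/dt \propto Z^2/A$ per nucleon at fixed energy per nucleon, so that the effective range scales as $A/Z^2$. A sketch of this scaling argument only (the species list is illustrative, and real loss rates also involve a slowly varying Coulomb logarithm):

```python
# At fixed energy per nucleon, Coulomb losses per nucleon scale ~ Z^2/A,
# so the range in energy per nucleon scales ~ A/Z^2 relative to a proton.
# Heavier nuclei therefore range out sooner in a thick target.
species = {"p": (1, 1), "alpha": (2, 4), "C": (6, 12),
           "O": (8, 16), "Ne": (10, 20), "Fe": (26, 56)}

rel_range = {name: a / z**2 for name, (z, a) in species.items()}
for name, r in rel_range.items():
    print(f"{name:5s} range / proton range ~ {r:.2f}")
```

Since Fe ranges out roughly four times sooner than C or O by this scaling, the Ne–Fe contribution to thick-target line production is suppressed relative to C and O, as stated above.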
This has an observable consequence: because C and O produce lines predominantly in the 4–7 MeV region while Ne–Fe produce lines at energies below 3 MeV, for identical source spectra and compositions, deexcitation line production below 3 MeV relative to that above this energy is smaller in a thick target than in a thin target. This is relevant for Orion, where, as we shall see, emission in the 1–3 MeV range was strongly suppressed relative to the emission between 4 and 7 MeV. The calculations that we present are carried out with two spectral forms for the accelerated particle source function, $$\begin{aligned} {dN_i \over dt}(E) = K_i \Big({E \over E_0} \Big)^{-1.5}e^{-E/E_0}, \end{aligned}$$ and $$\begin{aligned} {dN_i \over dt}(E) = K_i \Big({E \over E_c} \Big)^{-s} ~~{\rm for}~~ E>E_c \nonumber\\ = K_i ~~{\rm for}~~ E \le E_c,\end{aligned}$$ where the $K_i$’s are proportional to the accelerated particle source abundances, $E$ is energy per nucleon, and the parameters $E_0$ and $E_c$ are allowed to vary over the broad range from 2 to 100 MeV/nucl. The nonrelativistic spectral index of 1.5 in Eq. (1) is the consequence (e.g. Ellison & Ramaty 1985) of shock acceleration with a compression ratio at its maximal value of 4; the exponential turnover characterizes the effects of a finite acceleration time or a finite shock size. There is little theoretical basis for the flat spectrum given by Eq. (2). It is a simple form that has been used in previous calculations of gamma ray line and light element production (e.g. Ramaty et al. 1995a; Cassé et al. 1995; Ramaty et al. 1996), and we use it here as well; as in the previous studies (Ramaty et al. 1995a; 1996) we take $s$ = 10. We refer to the spectrum given by Eq. (1) as the strong shock spectrum and to that given by Eq. (2) as the flat spectrum. Gamma Ray Line Emission from Orion ================================== Gamma ray line emission in the 3 to 7 MeV range was observed from the Orion complex with COMPTEL (Bloemen et al. 
1994). We show the COMPTEL data in Fig. 1 together with the higher energy gamma ray emission observed from Orion with EGRET (Digel et al. 1995). The 3–7 MeV observations show emission peaks near 4.44 and 6.13 MeV, consistent with the deexcitations in $^{12}$C and $^{16}$O following accelerated particle interactions. This implies that the ambient matter in Orion is undergoing bombardment by an unexpectedly intense, locally accelerated, population of low energy cosmic rays. At other photon energies the COMPTEL observations reveal only upper limits. As we shall see, the 1–3 MeV upper limit places strong constraints on the composition of the low energy cosmic rays. Digel et al. (1995) have shown that the EGRET data can be understood in terms of a relativistic cosmic ray flux similar to that observed near Earth producing gamma rays in Orion via pion decay and bremsstrahlung. This implies that the low energy cosmic rays should be confined to energies below the effective pion production threshold (see also Cowsik & Friedlander 1995). In the following we shall be concerned with the properties and effects of these low energy cosmic rays. Composition ----------- In all the calculations of gamma ray production in Orion the ambient gas was assumed to have a solar system composition (Anders & Grevesse 1989). This should not suggest that there could be no departures from solar abundances in the ambient medium; such effects were simply not yet investigated. The results discussed here, therefore, pertain only to the accelerated particles. Information on the proton and $\alpha$ particle abundances relative to those of C and heavier nuclei could be obtained from observations of the shapes of the gamma ray lines. In Fig. 2 we show theoretical gamma ray spectra for the CRS (cosmic ray source) and GR (grain) compositions calculated using Eq. 2 (flat spectrum) with $E_c$=20 MeV/nucl. (Spectra obtained with the strong shock spectrum, Eq. (1), are not very different.) 
The CRS spectrum (top panel) shows both broad and narrow lines, as well as very narrow lines from long lived radionuclei. The $^{12}$C complex around 4.44 MeV clearly shows the narrow line superimposed on its broad counterpart. On the other hand, the narrow lines are absent in the GR spectrum (bottom panel) showing the effects of the absence of protons and $\alpha$ particles. However, there are still very narrow lines from the long lived radionuclei. To allow the shortest lived radionucleus $^{55}$Co($\tau_{1/2}=17.5$h) to stop before it decays, we assumed that the ambient density exceeds $2 \times 10^6$ cm$^{-3}$. By convolving spectra similar to those in Fig. 2 with Gaussians representing the COMPTEL energy resolution, Ramaty et al. (1995a) showed that the current data cannot yet rule out a mixed broad-narrow line spectrum (e.g. the CRS) even though the COMPTEL energy resolution at 4.44 MeV (FWHM$\simeq$300 keV, Schönfelder et al. 1993) would appear sufficient. A similar result was also obtained by Cowsik & Friedlander (1995). On the other hand, arguments of energetics tend to support an accelerated particle composition poor in protons and $\alpha$ particles. To demonstrate this, we calculate the deposited power that is associated with the production of the nuclear reactions in a thick target. It is given by $$\begin{aligned} {dW \over dt} = \sum_{i} A_i \int_0^{\infty} E {dN_i \over dt}(E) dE~, \end{aligned}$$ where $A_i$ is the mass number and ${dN_i \over dt}$ is given by Eq. (1) or (2). We emphasize that in a thick target, the ratio of the nuclear reaction rate to the deposited power is essentially independent of details of the interaction region, especially the density of the ambient gas. It depends mainly on the spectrum and composition of the accelerated particles. The composition of the ambient gas could play a major role but only if there were a substantial enhancement of the C and heavier element abundances relative to H and He. 
Such an enhancement would reduce the deposited power. There is also a weak dependence on the state of ionization of the ambient gas, as the Coulomb energy losses are higher in a plasma (by about a factor of 2) than in a neutral gas. The results are shown in Fig. 3 for an ambient neutral medium. The top panel is for the strong shock spectrum (Eq. 1) and the bottom panel is for the flat spectrum (Eq. 2). The deposited power, normalized to the distance of Orion and the observed 3–7 MeV nuclear deexcitation flux (Bloemen et al. 1994), is shown as a function of $E_0$ or $E_c$ for the various compositions. We see that the gamma ray line production is energetically most efficient for compositions that are poor in protons and $\alpha$ particles (i.e. the SN60, WC and GR compositions), favoring such compositions. However, as we shall see below, even in the most favorable cases, the rate of ionization of gas in Orion is extremely large. We mention here that if the light isotopes are indeed produced mostly by low energy cosmic rays, then these cosmic rays must also be depleted in protons and $\alpha$ particles. The $\alpha$ particle depletion is necessary in order not to overproduce $^6$Li; the proton depletion ensures a linear dependence of the Be and B abundances on the Fe abundance in stars of various ages (Duncan 1995). If the low energy cosmic rays are poor in protons and $\alpha$ particles they will produce Be and B from the breakup of accelerated C and O on ambient H and He; in this case both the target and projectile abundances could remain constant, leading to a linear growth of the Be and B abundances. On the other hand, the GCR would produce much of the isotopes from the breakup of C and O in the ambient medium whose abundances increase with time, leading to a quadratic growth. Information on the composition of the heavy nuclei can be obtained from the observed gamma ray spectrum. In Fig. 
4 we show calculations of the ratio $R$, $$\begin{aligned} R = 2 {\int_{1{\rm MeV}}^{3{\rm MeV}} E_{\gamma}^2 Q(E_{\gamma}) dE_{\gamma} \over \int_{3{\rm MeV}}^{7{\rm MeV}} E_{\gamma}^2 Q(E_{\gamma}) dE_{\gamma}}, \end{aligned}$$ for which Bloemen et al. (1994) set a 2$\sigma$ upper limit of 0.13 (see also Fig. 1). We see that the CRS and SS predicted ratios are in disagreement with the observations by more than $3\sigma$, and the SN35 and SN60 predictions are inconsistent at greater than $2\sigma$. Modifications in the abundances, however, can invalidate these results. For example, for the SN60 case, by increasing the C abundance by a factor of 2, but leaving all the other abundances unchanged, we obtain $R\simeq 0.13$ for either $E_0$ or $E_c$ equal to 30 MeV/nucl. On the other hand, both the GR and WC compositions yield $R$’s which are lower than the COMPTEL upper limit. However, while the WC composition predicts practically no emission in the 1–3 MeV region, the GR composition predicts significant broad line emission in this region, due to Mg, Si and Fe. The reduction in the overall 1–3 MeV emission for the GR case is caused by the absence of the 1.634 MeV line due to the lack of Ne in grains. Energy Spectrum --------------- We have already pointed out that the fact that the EGRET data do not require an enhancement over the locally observed GCR implies that the accelerated particles which produce the gamma ray line emission in Orion are confined to low energies, below the pion production threshold. On the other hand, the energetics discussed above require a hard spectrum. This can be seen from Fig. 3 which shows that for steep spectra (i.e. small $E_0$ or $E_c$) the deposited power becomes very large. Thus the combined COMPTEL–EGRET data imply that the low energy cosmic rays in Orion should typically have energies around a few tens of MeV/nucl. 
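The two source spectral forms, Eqs. (1) and (2), are simple to write down; a sketch with unit normalization $K_i = 1$ and $E_0 = E_c = 20$ MeV/nucl (parameter choices within the range quoted in the text):

```python
import math

def strong_shock(e, e0):
    """Eq. (1): dN/dt = K (E/E0)^-1.5 exp(-E/E0), with K = 1 here."""
    return (e / e0) ** -1.5 * math.exp(-e / e0)

def flat(e, ec, s=10.0):
    """Eq. (2): constant below E_c, steep power law of index s = 10 above."""
    return (e / ec) ** -s if e > ec else 1.0

e0 = ec = 20.0  # MeV/nucl
# Both forms concentrate the particle energy at a few tens of MeV/nucl,
# below the effective pion production threshold, as the EGRET data require.
print(strong_shock(20.0, e0))      # = exp(-1)
print(flat(10.0, ec), flat(40.0, ec))
```

The exponential turnover in Eq. (1) and the very steep index $s = 10$ in Eq. (2) play the same role: both cut the spectrum off above a few times $E_0$ or $E_c$.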
The shapes of the observed gamma ray lines could place further constraints on the hardness of the accelerated particle spectra. For values of $E_0$ or $E_c$ larger than about 50 MeV/nucleon, for pure broad line spectra, the 4.44 and 6.13 MeV lines become very broad (see Ramaty et al. 1996 for details). Although the current COMPTEL data cannot yet rule out such spectra, future observations could employ line shapes to limit the hardness of the low energy cosmic rays. Low Energy Cosmic Ray Energy Density ------------------------------------ Whereas the power deposited in Orion by the accelerated particles can be calculated independently of the total irradiated mass of ambient gas, the low energy cosmic ray energy density does depend on this total mass. We have calculated this energy density assuming a steady state in which the accelerated particles lose energy by Coulomb collisions in a neutral gas. As already mentioned, the energy loss rates in an ionized gas are higher by only a factor of 2. The results are shown in Fig. 5, where $w(E)$ is the differential energy density measured in MeV cm$^{-3}$ MeV$^{-1}$. The top (bottom) panel is for the CRS (WC) composition. Also shown is the energy density in the local Galactic cosmic rays (GCR) evaluated from direct cosmic ray observations and assuming a power law in momentum source spectrum (Skibo 1993). We see that for the CRS composition the low energy cosmic ray energy density exceeds the local GCR energy density by almost two orders of magnitude; for the WC composition the excess is only about a factor of 10. Ionization of the Ambient Gas ----------------------------- For the SN60, WC and GR compositions and $E_0$ or $E_c$ about 30 MeV/nucl, the deposited power is (2.5–5)$\times$10$^{38}$ erg s$^{-1}$. The total deposited energy depends on the duration of the irradiation. 
For example, if the irradiation lasts 10$^{5}$ years, the total energy requirement would be (0.8–1.6)$\times$10$^{51}$ ergs, comparable to the total kinetic energy output of a supernova. Just such a supernova, occurring $\sim$80,000 years ago in the OB association at $l$ = 208$^\circ$ and $b$ = $-$18$^\circ$, the same direction as the center of the gamma ray line source, was suggested by Burrows et al. (1993) from analyses of the X-ray emission from the Orion-Eridanus bubble (see also Ramaty, Kozlovsky & Lingenfelter 1996). As it takes 36 eV of cosmic ray energy to produce an ion pair in neutral H, the above deposited power corresponds to an ionization rate of (4.3–8.7)$\times$10$^{48}$ H atoms s$^{-1}$ or (0.17–0.34) M$_\odot$ yr$^{-1}$. Even with no recombination, in 10$^5$ years the amount of H that would be ionized is only (1.7–3.4)$\times$10$^4$ M$_\odot$, which is significantly smaller than the current neutral H mass in Orion. Considering the current ionization rate per H atom, we obtain $\zeta$=(0.5–1.0)$\times$10$^{-13}$M$_5^{-1}$ s$^{-1}$, where M$_5$ is the irradiated neutral H mass in units of 10$^{5}$ M$_\odot$. The effects of such a very large ionization rate have not yet been examined. It is nevertheless possible that a large fraction of the power that accompanies the gamma ray production is deposited in ionized gas. Origin of the Low Energy Cosmic Rays in Orion --------------------------------------------- In considering the origin of the low energy cosmic rays in Orion, we distinguish the problem of the acceleration from that of the injection of the particles into the accelerator. 
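Before turning to specific mechanisms, the ionization figures of the preceding subsection can be cross-checked numerically. A sketch assuming 36 eV per ion pair, as in the text, together with a gas mass of 1.4 H masses per H atom to account for He (the He correction is an assumption made here, not one stated in the text):

```python
# Convert the quoted deposited power in Orion into an H ionization rate.
ERG_PER_EV = 1.602e-12
W_PAIR = 36.0 * ERG_PER_EV   # erg per ion pair in neutral H
SEC_PER_YR = 3.156e7
M_H = 1.673e-24              # H atom mass, g
M_SUN = 1.989e33             # g
HE_FACTOR = 1.4              # assumed gas mass per H atom, in units of M_H

rates = {}
for power in (2.5e38, 5.0e38):   # erg/s, range quoted in the text
    ion_rate = power / W_PAIR                                  # H atoms/s
    msun_yr = ion_rate * SEC_PER_YR * M_H * HE_FACTOR / M_SUN  # Msun/yr
    rates[power] = (ion_rate, msun_yr)
    print(f"{power:.1e} erg/s -> {ion_rate:.1e} H atoms/s, {msun_yr:.2f} Msun/yr")
```

This reproduces the quoted (4.3–8.7)$\times$10$^{48}$ H atoms s$^{-1}$ and comes within a few percent of the quoted (0.17–0.34) M$_\odot$ yr$^{-1}$; the small residual difference presumably reflects slightly different constants.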
The proposed acceleration mechanisms are shock acceleration, due to shocks associated with the winds of O and B stars (Nath & Biermann 1994) or shocks produced by colliding stellar winds and supernova explosions (Bykov and Bloemen 1994), and stochastic acceleration, due to gyroresonance with cascading Alfvén turbulence in the accretion disk of a black hole (Miller & Dermer 1995). The proposed injection sources are the winds of Wolf Rayet stars (the WC composition, Ramaty et al. 1995a), the ejecta of supernovae from massive star progenitors (c.f. the SN60 composition, Cassé et al. 1995; Ramaty et al. 1996), and the pick up ions resulting from the breakup of interstellar grains (c.f. the GR composition, Ramaty et al. 1995b;1996). The suppression of accelerated protons and $\alpha$ particles relative to C and heavy nuclei (§3.1) finds an explanation in stochastic acceleration. This mechanism, however, does not predict the suppression of Ne and heavier nuclei relative to C and O (§3.1). The suppression of protons and $\alpha$ particles is relatively easily achieved by all the proposed injection processes. Concerning the heavier nuclei, Ne–Fe are strongly suppressed in WC, but less so in the SN60. The suppression of Ne in the GR composition is sufficient to account for the 1–3 MeV COMPTEL upper limit. The comparison with solar flares is quite instructive (Ramaty et al. 1995;1996). The solar flare gamma ray spectra show much higher ratios of 1–3 MeV to 3–7 MeV fluxes than does Orion. It was shown (Murphy et al. 1991) that this enhanced emission below 3 MeV is, in part, due to the enrichment of the flare accelerated particle population in heavy nuclei. Such enrichments are routinely seen in direct observations of solar energetic particles from impulsive flares (e.g. Reames, Meyer & von Rosenvinge 1994). These impulsive flare events are also rich in relativistic electrons. 
On the other hand, in gradual events the composition is coronal and the electron-to-proton ratio is low. The acceleration in impulsive events is thought to be due to gyroresonant interactions with plasma turbulence while in gradual events it is the result of shock acceleration. The fact that the ratio of bremsstrahlung-to-nuclear line emission in Orion is very low lends support to the shock acceleration scenario. Low Energy Cosmic Rays in the Galaxy ==================================== We now provide estimates of the Galactic gamma ray line emission expected from low energy cosmic ray interactions. We base these estimates on the close relationship between light isotope and nuclear deexcitation line production. In Fig. 6 we show the B to 3–7 MeV nuclear gamma ray production ratio, $Q({\rm B})/Q(3-7)$, as a function of $E_0$ and $E_c$ for the strong shock (Eq. 1) and flat (Eq. 2) accelerated particle spectra, and for the various compositions. We see that at the higher values of $E_0$ and $E_c$ the ratio is not strongly dependent on composition. This is because both the B and the 3–7 MeV deexcitation photons are mostly produced from C and O. However, $Q({\rm B})/Q(3-7)$ does depend quite strongly on the spectrum of the accelerated particles. For the subsequent estimate, we take $Q({\rm B})/Q(3-7)\simeq$0.1, which, for $E_0=E_c=20$ MeV/nucl, is a mean for the two assumed spectral forms. To estimate the total B inventory in the Galaxy we first use the meteoritic B/H ratio given by Anders & Grevesse (1989) and a total Galactic mass of 5$\times$10$^{10}$ M$_\odot$. This yields $N_{\rm B}$$\simeq$3$\times$10$^{58}$ atoms. However, as the meteoritic B/H is probably higher than the B/H measured in Pop I stars by a factor of 3 to 4 (Reeves 1994), we take the Galactic B inventory at the formation of the solar system to be approximately 10$^{58}$ atoms, and assume that it was produced by low energy cosmic rays in about 5$\times$10$^9$ years. 
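The inventory-to-rate bookkeeping behind this estimate can be sketched directly; all input numbers below are those quoted in the text:

```python
SEC_PER_YR = 3.156e7

n_b = 1.0e58                   # adopted Galactic B inventory, atoms
t_prod = 5.0e9 * SEC_PER_YR    # assumed production time, s
q_ratio = 0.1                  # Q(B)/Q(3-7 MeV), the adopted mean value

q_b = n_b / t_prod             # mean B production rate, atoms/s
q_37 = q_b / q_ratio           # implied Galactic 3-7 MeV luminosity, ph/s

print(f"Q(B)   ~ {q_b:.1e} atoms/s")
print(f"Q(3-7) ~ {q_37:.1e} photons/s")
```

Both outputs round to the values adopted in the estimate that follows.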
This yields an average B production rate prior to the formation of the solar system of about 6$\times$10$^{40}$ atoms s$^{-1}$. We then further assume that the current production rate is equal to this average and use $Q({\rm B})/Q(3-7{\rm MeV})\simeq 0.1$ to estimate a current Galactic 3–7 MeV photon production rate $Q_{\rm G}(3-7{\rm MeV}) \simeq 6\times 10^{41}$ photons s$^{-1}$. In terms of this total Galactic production, the flux from the central Galactic radian is $$\begin{aligned} \Phi_{3-7}\simeq \xi 10^{-46} Q_{\rm G}(3-7) \simeq 6 \times 10^{-5}\xi~{\rm ph}~{\rm cm}^{-2}~ {\rm s}^{-1},\end{aligned}$$ where $0.5 \lesssim \xi \lesssim 2$ depending on the spatial distribution of the sources (Skibo 1993). Clearly this prediction is highly uncertain. The estimate of the current B production rate would be larger if there were significant destruction of B due to incorporation into stars; in this case we would predict a higher central radian flux. On the other hand, the current B production rate could be lower than the average rate prior to the formation of the solar system, in which case our prediction would also be lower. In addition, since the B to 3–7 MeV photon conversion depends on the spectrum of the accelerated particles, for values of $E_0$ or $E_c$ larger than 20 MeV/nucl, the ratio would be smaller than the value we used, yielding a lower predicted central radian flux. Clearly much could be learned from an actual measurement of C and O deexcitation line emission from the direction of the Galactic center or elsewhere. In Fig. 7 we compare our calculations with observations. None of these observations have revealed line emission. The COMPTEL data (Strong et al. 1994) are continuum emission, most likely a combination of bremsstrahlung and inverse Compton radiation produced by relativistic electrons. The SMM upper limit refers specifically to line emission (Harris, Share, & Messina 1995). We obtained the calculated curves by normalizing the spectra shown in Fig. 
1 to the 3–7 MeV flux of $6 \times 10^{-5}~{\rm ph}~{\rm cm}^{-2}~ {\rm s}^{-1}$ given above. While the calculations are not inconsistent with data, it is conceivable that with more sensitive instruments or with longer COMPTEL exposures, the predicted line emission could be observed. For the WC, SN60 and GR compositions and $E_0 = E_c = 30$ MeV/nucl the energy required to produce 1 B atom is about 1 erg (Ramaty et al. 1996). The current rate of B production of about 6$\times$10$^{40}$ atoms s$^{-1}$ then implies a current total power deposition by low energy cosmic rays in the Galaxy of about 6$\times$10$^{40}$ erg s$^{-1}$. When compared with the energy deposition rate in Orion (§3.4), we find that about 200–400 Orion-like regions could be currently active in the Galaxy. This small number, and the fact that the irradiation time in Orion probably lasted for only 10$^{5}$ years, implies that the low energy Galactic cosmic ray phenomenon is highly localized in space and time, in contrast with the GCR whose spatial distribution is relatively uniform and their time dependence, as evidenced by meteoritic studies, is relatively constant in time. I wish to thank S. W. Digel for discussions on the relationship between the COMPTEL and EGRET data, and B. Kozlovsky and R. E. Lingenfelter with whom much of this research was jointly carried out. Adams, J. H. et al. 1991, ApJ, 375, L45 Anders, E., & Grevesse, N. 1989, Geochim. et Cosmochim. Acta, 53, 197. Bloemen, H. et al. 1994, A&A, 281, L5 Burrows, D. N. et al. 1993, ApJ, 406, 97 Bykov, A., & Bloemen, H. 1994, A&A, 283, L1 Cassé, M., Lehoucq, R., & Vangioni-Flam, E., 1995, Nature, 373, 318 Chaussidon, M., & Robert, F. 1995, Nature, 374, 337 Cowsik, R. & Friedlander, M. 1995, ApJ, 444, L29 Digel, S. W., Hunter, S. D., & Mukherjee, R. 1995, ApJ, 441, 270 Duncan, D. 1995, paper presented at the Cosmic Abundance Conference, College Park, MD, October 1995 Harris, M. J., Share, G. J., & Messina, D. C. 
1995, ApJ, 448, 157 Ellison, D. C., & Ramaty, R. 1985. ApJ, 298, 400 Fisk, L. A., Kozlovsky, B., & Ramaty, R. 1974, ApJ, 190, L35 Ip, W.-H. 1995, A&A, 300, 283 Lingenfelter, R. E., & Ramaty, R. 1976, ApJ, 211, L19 Meneguzzi, M., Audouze, J., and Reeves, H. 1971, A&A, 15, 337 Miller, J. A., & Dermer, C. D. 1995, A&A, 298, L13 Mitler, H. E. 1972, Astrophys. & Sp. Sci., 17, 186 Murphy, R. J., Ramaty, R., Kozlovsky, B., & Reames, D. V. 1991, ApJ, 371, 793 Nath, B. B., and Biermann, P. 1994, MNRAS, 270, L33 Ramaty, R. 1995, in The Gamma Ray Sky with COMPTON GRO and SIGMA, eds. M. Signore, P. Salati, and G. Vedrenne, (Dordrecht: Kluwer), 279 Ramaty, R., Kozlovsky, B., & Lingenfelter, R. E. 1979, ApJ Suppl., 40, 487 Ramaty, R., Kozlovsky, B., & Lingenfelter, R. E. 1995a, ApJ, 438, L 21 Ramaty, R., Kozlovsky, B., & Lingenfelter, R. E. 1995b, Ann. New York Academy of Sciences, (17th Texas Symposium on Relativistic Astrophysics and Cosmology, eds. H. Bohringer, G. E. Morfill and J. Trumper), 759, 392 Ramaty, R., Kozlovsky, B., & Lingenfelter, R. E. 1996, ApJ, 456, 525 Reames, D. V., Meyer, J-P., & von Rosenvinge, T. T. 1994, ApJ Suppl., 90, 649 Reeves, H. 1994, Revs. Modern Physics, 66, 193 Reeves, H., Fowler, W. A., & Hoyle, F. 1970, Nature, Phys. Sci, 226, 727 Schönfelder, V. et al. 1993, ApJ Suppl., 86, 657 Skibo, J. G. 1993, Ph.D. Thesis, University of Maryland Strong, A. W. 1994, A&A, 292, 82 Sofia, U. J., Cardelli, J. A., & Savage. B. D. 1994, ApJ, 430, 650 Woosley, S. E., Hartmann, D., Hoffman, R., D., & Haxton, W. C. 1990, ApJ ,356, 272
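The order-of-magnitude estimates above are easy to verify numerically. The sketch below is not part of the paper; the per-region deposition rate of 1.5–3 × 10^38 erg s^-1 is back-derived from the quoted 200–400 figure, since the value from §3.4 is not reproduced in this excerpt. It recomputes the central-radian flux and the implied count of Orion-like regions:

```python
# Back-of-envelope check of the quoted estimates. All inputs are the
# paper's stated values except the per-Orion deposition rate, which is
# an assumption inferred from the "200-400 regions" result.

Q_B = 6e40                  # current Galactic B production rate, atoms/s
RATIO = 0.1                 # Q(B) / Q(3-7 MeV)
Q_G = Q_B / RATIO           # Galactic 3-7 MeV photon rate, ~6e41 photons/s

def central_radian_flux(xi):
    """Flux from the central Galactic radian in ph cm^-2 s^-1,
    where 0.5 <~ xi <~ 2 reflects the source spatial distribution."""
    return xi * 1e-46 * Q_G   # ~6e-5 * xi

# Energetics: ~1 erg per B atom (WC/SN60/GR, E0 = Ec = 30 MeV/nucl)
ERG_PER_B_ATOM = 1.0
power = Q_B * ERG_PER_B_ATOM          # ~6e40 erg/s Galaxy-wide

# Assumed per-region deposition of ~1.5-3e38 erg/s (back-derived)
# reproduces the quoted 200-400 Orion-like regions.
n_orion_low = power / 3e38
n_orion_high = power / 1.5e38
```

Varying $\xi$ over 0.5–2 simply scales the flux between about 3 × 10^-5 and 1.2 × 10^-4 ph cm^-2 s^-1, the range relevant to the comparison with the COMPTEL and SMM limits.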
DJ Drama – Quality Street Music (Album Cover & Track List) DJ Drama debuts a slightly unusual but refreshing album cover for his upcoming 4th studio album Quality Street Music, which now hits stores on October 2nd, the same day as Kendrick Lamar’s good kid, m.A.A.d City. Expect a lot of A-list features on this one. UPDATE: Back to the top with the official track list. This looks solid! Pre-order on iTunes.
The bets that Silicon Valley is placing on Bitcoin are getting bigger. Even though concerns remain about the digital currency’s staying power, Bitcoin start-ups are attracting more dollars from well-known venture capitalists. In the latest move, Blockchain, a Bitcoin wallet provider and software developer, is expected to announce on Tuesday that it has closed a roughly $30.5 million fund-raising round, led by Lightspeed Venture Partners and Wicklow Capital. The investment, Blockchain’s first round of outside financing, is one of the biggest in the digital currency industry to date. Since it was founded in 2011, Blockchain, which is based in Britain, has gained respect in the industry for adhering to the virtual currency’s original philosophy of anonymity and decentralization. Roger Ver, a libertarian known in some circles as the Bitcoin Jesus, was the first backer and supporter of the company. The price of one Bitcoin, which reached a peak of about $1,150 last year, fell over the weekend to its lowest point of the year after tumbling 20 percent, to about $286, according to CoinDesk, a virtual currency website. Bitcoin was trading on Monday evening at about $330. But despite the failure of some Bitcoin-related sites and new regulatory challenges, more than $250 million has been invested in Bitcoin companies, most of that in the last 12 months, according to Wedbush Securities, a financial services firm. “Over the course of the next 10 years, Bitcoin is going to have a big impact,” said Jeremy Liew, a partner at Lightspeed who will join Blockchain’s board. “Where is the central nexus of value creation in this whole industry? It has to be the wallet,” he said. Blockchain’s financing round follows a spate of big investments in Bitcoin companies, particularly those that offer storage services. Coinbase, for example, secured $25 million last year in a financing round led by the venture capital firm Andreessen Horowitz.
Another Bitcoin company, Xapo, said in July that it had raised $40 million from investors including Greylock Partners and Index Ventures. Bitcoin has been slow to gain more widespread use in the mainstream. In developed countries, virtual money is still largely the plaything of technology enthusiasts and speculators, though some retailers and stores have trumpeted their acceptance of Bitcoin. In emerging markets, where some see enormous potential for Bitcoin, the infrastructure to process transactions simply does not exist. For Bitcoin to become more widely adopted, supporters say, the virtual currency must find a unique application that will take it beyond the realm of speculation. And for that to happen, companies must first build a robust platform, which is what Blockchain says it is trying to do. Blockchain has criticized other companies that essentially allow customers to bet on Bitcoin’s price, a practice that it sees as straying from the currency’s core mission. Blockchain says it tries to make it easier for people across the globe to use Bitcoin. The company has 2.3 million consumer wallets, making it among the most popular wallet services in the world. It has also developed a search engine that allows users to verify transactions quickly in the currency’s public ledger, known as the Bitcoin blockchain. Most of the company’s revenue comes from advertising. Peter Smith, the president of Blockchain, said the company planned to use its new financing to expand and to invest in developing markets. The company says it is working on software to make it easier to transfer Bitcoin. “Right now, we’re going into a period where it’s not just enough to have Bitcoin be tantalizing,” Mr. Smith said. “We need Bitcoin to actually be useful.” It is this emphasis on improving the Bitcoin platform that seems to have attracted Blockchain’s latest investors, a group that also included Mosaic Ventures and Richard Branson. 
In particular, investors said that they had been impressed with Blockchain’s ability to establish itself as a key player in the industry without outside capital. “Blockchain is far and away an early leader in Bitcoin,” said Rafael Corrales, a partner at the venture capital firm CRV who personally invested in Blockchain’s funding round. “They’re actually enabling the entire ecosystem.”
LA city officials revive long-stalled move to legalize street vending After three years spent languishing in City Hall, a proposal to legalize street vending in Los Angeles is once again on the move, although questions, including how the rules would be enforced, remain unanswered. On Tuesday, City Council backers of street vending released the most detailed proposal yet for legalizing the operations that have spurred opposition in some neighborhoods. In a letter to the full council, members Curren Price Jr. and Joe Buscaino outlined a plan that would cover vending throughout the city. "We value the notion that everyone deserves the opportunity to start a small business, on a level playing field, with failure or success determined by our own talent, hard work, and perseverance," Price and Buscaino wrote in their letter. The general plan would limit street vendors to two per side of a city block. But it would allow the creation of special vending districts with customized rules allowing more or fewer vendors in an area. The districts could be created at the request of city officials, business owners or residents. Under the plan, street vendors who follow the rules would no longer face misdemeanor charges as they do now for operating illegally. A council committee review and approval by the full council would be required before the plan could be implemented. The plan goes before the council's Public Works and Gang Reduction Committee on Dec. 12. The Price and Buscaino plan for legalized street vending is referred to as a "hybrid" model. It is one of three different approaches that city officials have debated. Residents in several communities weighed in on ideas for a street vending program during a series of community meetings in 2015. Key aspects of the council members' plan released Tuesday call for:
- Stationary carts only in commercial and industrial zones, with a maximum of two allowed on each side of a city block.
- Limited roaming carts in residential zones.
They must be moving at all times and complete transactions in a timely manner. Street vendors would be required to have mandatory permits, liability insurance, lists of goods they sell and an established route or location. They would also need the consent of adjacent property or business owners to operate in a spot and would be required to provide a trash receptacle to prevent littering. On Tuesday, proponents of legalized street vending familiar with the plan said they were, by and large, pleased. "We're definitely optimistic that there seems to be a new second wind of enthusiasm from the council offices," said Mike Dennis, director of community organizing with the East L.A. Community Corp., a local development group that has backed legalizing street vending. Dennis said proponents did find some aspects of the plan too restrictive, citing the proposed sidewalk rules that limit vendors to two operators per block. But the plan is seen as a compromise that so far is sitting well with some opponents, like Blair Besten, executive director of the Historic Core Business Improvement District. The downtown group has opposed the idea of legalizing street vending, in part because of the large concentration of carts downtown. "I like that there is a restriction of how many vendors can operate by block, so you are not seeing too much density of vendors competing for space, which still allows the free movement of people on the sidewalk," Besten said. However, she said there are many unanswered questions, among them: how will the program be enforced? Critics of legalized street vending have argued that there aren't enough police to enforce a program. "It's all about enforcement, so how are you going to do that in order to reward the vendors that do go through the system?" Besten asked.
Price and Buscaino propose that the city Bureau of Street Services be responsible for handling complaints about unpermitted vending or other problems, although Los Angeles police would retain enforcement authority over those who violate the rules.
The benefits of increased access and transparency are many. Democracy’s first principles strongly support the people’s right to know how their government works. This would seem to be underscored by this court’s stubborn insistence on freedom of communication in a democratic society. Recall that earlier this year, the court held that the First Amendment protected the right of protesters to hector a military family during a funeral service for their son, who was killed in Iraq. And the court decided that the same societal interest in free speech outweighed California’s interest in protecting minors from extremely violent video games. These are but two of many examples in which the current court has made plain its view that, in extreme cases, the force of First Amendment rights shall outweigh all else. Year after year, the court issues decisions that profoundly affect the nation. Think of civics classes. The retired Justice Sandra Day O’Connor is one of many who have lately lamented the apparent collapse of civic literacy in public schools. Think of older Americans affected by President Obama’s health care program. Think of women or other groups affected by important class-action cases, like the Wal-Mart discrimination case last term. These citizens should have a chance to hear what the justices think about important questions that touch their lives. The issue of cameras in the courtroom is one of precious few on which conservative Republicans, like Senator John Cornyn of Texas, and liberal Democrats, like Representative Henry A. Waxman of California, agree. The views cherished by the court’s old guard are nicely dramatized by the retired justice David H. Souter, who, by his own account, preferred death to the quiet illumination of cameras in the courtroom. Justice Anthony M. 
Kennedy’s fear is that televising the oral arguments would introduce “the insidious temptation to think that one of my colleagues is trying to get a sound bite for the television.” But this fear seems groundless in light of the already available sound recordings from these sessions. Newspapers, radio and television were all once condemned for their demagogic potential, but we have long since accepted these media as vitally important pieces of our national dialogue. The idea that cameras would transform the court into “Judge Judy” is ludicrous. Happily, the old guard’s views are now in decline. Justice Elena Kagan, the newest and youngest member of the court, has spoken fervently for openness and transparency. At an Aspen Institute event in August, she said, “If everybody could see this, it would make people feel so good about this branch of government and how it’s operating.”
The 2016 A-League grand final looms as the most profitable in history for Football Federation Australia, but the governing body has ruled out awarding any prize money to the champion this season. Fairfax Media understands the FFA will receive a gross profit of up to $3 million if all tickets to Sunday's final at Adelaide Oval are sold, but the winning club won't receive a cut until a new TV deal is struck. A higher capacity than many recent grand final venues, a lower hiring fee than Brisbane's Suncorp Stadium and incredible demand for tickets are making this year's decider a considerable cash cow for the FFA. Tickets went on sale to the general public at 11.30am on Tuesday and by the close of business more than 40,000 tickets to the 53,000-capacity stadium had been purchased. Staff, leasing, transport, security and accommodation costs mean the final figure will be lower than $3 million, but sources suggest the FFA will still walk away with a total profit well in excess of $1 million.
WASHINGTON—Following months of analysis into the animal’s stunning good looks, the Food and Drug Administration announced Monday that genetically modified salmon are far too handsome to eat. “After several rounds of clinical testing, we have determined that these genetically altered fish are safe for human consumption, but between their striking, rugged good looks and the air of devil-may-care competence they exude, no one should want to eat them,” said FDA commissioner Scott Gottlieb, particularly noting how the engineered salmons’ pouty lips and lustrous silver-blue sheen masked a deep, instinctual desire to swim upstream and spawn like the intertidal bad boys they are. “These salmon may be raised for greater immunity to waterborne pathogens and to produce higher levels of Omega 3 fatty acids than their wild counterparts, but between their supple, hunky bodies and those eyes—These salmon are farmed and dangerous, honey. Therefore, the FDA cannot recommend them for human consumption at this time.” Gottlieb further warned that no amount of genetic manipulation could take a fish away from its true nature and that handsome or not, salmon would never be as kind-hearted as trout.
Wednesday’s decision does not overturn the approvals for the project, which were granted again by Prime Minister Justin Trudeau in June, but it does introduce another legal hurdle for the pipeline project. “There’s no tools-down requirement,” Newman said, noting that opponents would need to apply for an injunction in order to force work to stop on the pipeline, a move which would frustrate the oil industry’s plans. Tsleil-Waututh Nation, one of the First Nations that filed the motion, said it was “confident the court will once again decide in our favour.” “Canada continued to do the legal minimum and in our view, fell well below the mark again,” Chief Leah George-Wilson said in a release, which noted that “the same six First Nations that stopped the Trans Mountain pipeline expansion in its tracks last year will have another chance” to overturn approvals for the project again. Amid entrenched opposition from First Nations and the government of British Columbia, a Houston-based pipeline giant sold the Trans Mountain pipeline system and expansion project to Ottawa in May 2018 for $4.5 billion. The federal government has been trying to move the project forward since that time and recently announced construction would restart and work would be complete within two years. “It’s a very disappointing decision, but I’m not sure anyone could say it’s surprising,” Petroleum Services Association of Canada president Gary Mar said following the ruling, which he said demonstrates that the project is still mired in uncertainty.
For some business travelers, the start of a new year offers an opportunity to take stock of ways to improve travel habits. Here are some ideas for resolutions from and for business travelers in 2016.

Make time for fun

Kat Cohen, a university admissions counselor and founder of IvyWise, hopes to build in "time to experience one event that is for pleasure on each business trip," even if it's "just a meal or one museum." Cheryl Andrews, president of Cheryl Andrews Marketing Communications, also hopes to "work at least one event of culture or beauty" into every trip. She got an early start on her resolution this past fall, flying in to London ahead of a November business trip to tour the Victoria & Albert Museum and attend a classical concert. Leon Rbibo, who frequently travels to Tahiti, Japan and Hong Kong for his Los Angeles-based jewelry company, The Pearl Source, says one of his resolutions for 2016 is to "extend my arrival and departure by one day each, landing a day early and staying a day later" in order to "take the time to enjoy some of the places I visit."

Keep calm and stay fit

Jamie Sigler, founding partner of J Public Relations, based in San Diego, plans to "leave time to listen to a daily meditation to keep calm and carry on when I am traveling for work. Two apps I'm loving are buddhify and Smiling Mind." Sigler's colleague at J Public Relations, Ali Lundberg, pledges to pack her running shoes so she can explore "urban trails in 2016." "With not a lot of time to explore a destination during business travel, and the desire to get my morning fitness routine checked off the list, combining the two is at the top of my resolutions list," Lundberg said.
Protect yourself online

David Grubb, president of CMIT Solutions of Tribeca, an information technology solutions and services company, is encouraging clients to improve their cybersecurity in the new year so that they're as safe online on the road as they are at home. Grubb recommends backing up all data, updating passwords, avoiding public Wi-Fi (including free airport networks) because the networks are not secure, and using two-factor authentication for all financial or purchasing transactions. Grubb describes two-factor authentication as "something you know," like a password, plus "something you have," like a one-time code received via text or cellphone, or a fingerprint scan.

Change up your dinner plans, cut out the snacks

Jared Blank, chief marketing officer of Deal News, a shopping comparison site based in Huntsville, Alabama, says travelers who frequent the same cities again and again for work "tend to fall into a rut where they eat at the same restaurants every time they go ... But for the new year make a resolution to avoid the same places you've always gone. It'll make that 11th trip to Atlanta a little more fun." Gayle B. MacIntyre of Global Ink Communications says that "as a frequent business traveler who works in the hospitality industry, my resolution for 2016 is to cut out the peanuts, pretzels and Biscoff cookies. Empty calories add up for frequent business travelers. Arriving at a destination sans the salt and sugar has got to be a better and healthier way to arrive energized."

Don't rush the connecting flights

It might seem counterintuitive to those who hate hanging around airports, but Pamela Wagner pledges to build in three to four hours between flights as a way to cut stress. "Why? I can absolutely calmly go into one of the lounges and enjoy all their facilities, and have a good two to three hours of concentrated, uninterrupted work," said Wagner, who has her own digital marketing business and is currently based in Austria.
Even if you're not a frequent flyer on a given airline, Wagner says it's worth the $25 or $30 for an airline lounge pass to access "showers, work stations, good food and drinks." Then she calmly boards her next flight, watches a movie, gets some rest and is ready to work. "It's an ideal rhythm," she said.
FlipSize – Kids Clothing Solution I have written about FlipSize before when I first came across their service and felt a need to review them and let everyone know about them. Since that time, I have used FlipSize again and again. I have also been spreading the word about my FlipSize bliss. So for everyone that keeps asking me, what is the name of that website again……this post is for you. xoxo FlipSize is an amazing business where you package up your old clothes, shoes, coats, etc. that your kids have outgrown and then either mail them in or drop them off at a local FlipSize drop-off location. You are awarded points for the clothing items that you drop off. These points are then available in your account to use to either buy other upcycled clothing items or cash out in the form of PayPal cash or other gift cards. Before you drop off your clothes or order a bag, FlipSize asks that you contact them to ensure that they are in need of what you have. What FlipSize needs is always changing depending on what they receive. To read more about this, check out the FlipSize blog post here. If you do not live near a FlipSize location, then you need a FlipBag, which you order off their website. The FlipBags are a very hot item and run out of stock quite quickly, so make sure to sign up for the FlipSize Newsletter so that you are among the first to know when the bags are back in stock. If you are local, then you are lucky as you do not need a FlipBag. You just need to package up your items nicely in a bag with your information in there as well, and then drop the bag off at one of the local drop sites. Are you local to Abbotsford, Coquitlam, Langley, Maple Ridge or Surrey, BC? Drop off your bag at one of our residential drop off locations and earn 100 BONUS POINTS. Email for details.
Here are the cross streets to get a better idea of locations in the Lower Mainland of BC… FlipSize only accepts clothes that are in good condition that can then be put on the FlipSize website to be purchased. They do have a strict no-duds policy, so make sure the clothes are still in good condition. No holes, tears, pulls, etc. If you send in items that do not meet their quality control, then they get donated. I accidentally included socks in one of my drop-off bags one time and they donated them to charity. I must have gotten my bag sorting a bit mixed up, but I was happy to hear that the items went to the local charity and were not just thrown out. Once you get your points, off shopping you can go. You can turn the points into cash that is paid into your PayPal account, or gift cards for places like Toys R Us, Babies R Us, or Amazon. You can also use your points to buy the next size of clothing items for your little one. If you use your points to buy clothes from FlipSize, you receive extra reward points, so your points go even further when you use them to purchase clothes from the FlipSize website. FlipSize orders over $99 have free shipping and there is also the option of free local pick-up for orders. FlipSize is an amazing way to turn the clothes that your kids have outgrown into the next size of clothes for them. FlipSize has a wide selection of clothing items, and their stock is always changing as they get new items in. Their clothes sizing runs from newborn to size 12 plus. I have seen items up to kid’s size 14 on there before also. My Little One is out of their sizing range, so we have not shopped on FlipSize. She is just leaving their top end of available sizes, so we use the drop-off services quite a bit.
Website | FlipSize Newsletter Sign-up | Facebook | Twitter | Pinterest
I am excited to announce a $50 FlipSize e-gift certificate for one lucky fan of MomMomOnTheGo.
This giveaway is open to residents of Canada over the age of 18 and where permitted by local laws to enter and win giveaways. Please ensure that you read the full terms and conditions found at the bottom of the entry form.
The U.S. Environmental Protection Agency Region 3 recently awarded a $1.5 million grant to the West Virginia Department of Environmental Protection (DEP) to support the state's Nonpoint Source Water Pollution Control Program and finance implementation of high-priority watershed restoration projects. The funding also will help finance treatment of streams polluted by acid mine drainage from abandoned coal mines, according to the EPA. The agency also awarded DEP a $45,000 grant "to determine statistically valid compliance rates for municipalities from combined sewer overflows (CSOs)," according to a press statement. The noncompliance rate for CSOs is high, and the "assessment will provide valuable information on the current compliance rate of CSO communities and may lead to appropriate enforcement actions if necessary," the statement said.
At Cancer Research UK, we’re often asked about alternative cancer cures. These are usually circulated on the internet and end up plastered onto our Facebook page (often accompanied by the phrase “they don’t want you to know about it”). Just a small selection of the cures we’ve heard about recently includes lemon juice, baking soda, apricot kernels, coffee enemas, tropical fruit, “alkaline” foods (whatever they are…), even bleach. But while there are plenty of ‘miracle cures’ out there, a little investigation shows that there’s very little evidence that any of them actually work. In some cases – particularly chemicals found in plants and other foodstuffs – there may be lab studies suggesting it has an anti-cancer effect. But many things can kill cancer cells growing in a Petri dish in the lab, and chemicals that seem promising in the lab or even in animal models of tumours can be disappointingly ineffective when faced with the real deal in a cancer patient. Yet the internet is bursting with anecdotes from patients who have apparently been “cured” by all kinds of pills, lotions and potions. So what should we make of them? Despite what people may claim, videos and stories are not scientific evidence for the effectiveness of any cancer treatment. When faced with a patient story, it’s impossible to tell whether these patients have been ‘cured’ by a particular treatment or not. We know nothing about their medical diagnosis (did they actually have cancer? If so, what type and how was it confirmed?), the stage and aggressiveness of the disease or their outlook. Often it turns out that people have had conventional cancer treatments too, yet this may not be mentioned. We don’t know about the chemical composition of the treatment they got – for example, one alternative prostate cancer treatment was found to contain prescription drugs. And we only hear about the success stories – what about the people who have tried alternative therapies and not been cured?
People who make bold claims only pick their best cases without presenting the full picture. This highlights the importance of publishing data from rigorous lab research and well-designed clinical trials in peer-reviewed scientific and medical journals. Conducting proper clinical studies enables researchers to compare like with like, and prove that a prospective cancer treatment is safe and effective. And publishing their data in scientific journals allows doctors around the world to judge for themselves and use the information for the benefit of their patients. This is the standard to which all conventional cancer treatments are held, and it’s one that alternative treatments should be held to too. Anecdotes and videos prove nothing and benefit no-one – we need reliable, scientific research to judge whether a treatment is effective. When faced with a diagnosis of cancer, it’s tempting to turn to “Dr Google” to find out more, but we urge patients to check out the evidence behind any alternative treatments they might be thinking of taking and talk it through with a medical professional. Not only are people at risk of wasting their time and money on completely ineffective treatments, there is also the possibility that a therapy might be harmful or interact with conventional treatments. Through our information services, we’re well aware how distressing this kind of misinformation about ‘cures’ for serious illnesses can be for people. It gives them false hope and can lower their confidence in the treatment they are receiving from their own doctors.

@Francisco Silveira – That is fantastic that you have found a cure for cancer! I would love to see double blind studies of your formula in a peer-reviewed journal. You should be nominated for the Nobel Prize for medicine.

My name is Francisco Silveira. I’m 82 and by the way you judge people who cure themselves with natural products, I’m a Quack.
My medical records show that I have cured my skin cancers and about one-year-ago, my homemade medicine also cured multiple tumors in my liver, spleen and a bleeding one in my bladder. What you have to treat any cancer will be considered by the coming generations; the Dark-Ages of Medicine. Before calling other people quacks, look at yourselves. Look at your survival rates.

Some people here need to study science more as it is a proven fact by science that there are what many would call alternative medicine that actually work! I have researched and read over 1.5 million pages of research from around the world and have found many cures for disease. Most people do not know that Some Doctors in the U.S. wrote books an Curing disease in the late 1800s -early 1900s. In their books they wrote about how they cured all types of cancers and other disease. CURED DISEASE! NOT MASK IT WITH A DRUG! Best wishes to those looking for the truth and a cure to disease.

Well, there are doctors who have had great success by using natural treatments. Most cancer patients die from the poison of chemotherapy, not the cancer. Based on my research, cancer is easily curable and the people who own big pharma are only interested in money. I believe the medical profession is the biggest cause of death in the Western world, not because doctors are bad people, but because the system is controlled. I no longer fear cancer because I know how easy it is to cure. The sleeping giant is rising. Soon the truth will become widespread, and the people who own the system can expect a very long time behind bars.

Oh Please not another alt pusher trying to sell a book of the backs of desperate and vulnerable cancer patients, you people really have no shame do you, you didnt cure your breast cancer by a EMEM frequency generator , or by useing natural therapies to “Boost” your immune system, and your breast cancer certainly WASNT caused by a root canal tooth, do you know anything about the biology of cancer?????
i suggest doing some proper research into the history of cancer and the biology of this disease (Its free from reptuable cancer organisations BTW no books to sell) 1- there is no “cure” for breast cancer there are treatments that will hopefully keep your cancer at bay , breast cancer can come back at anytime 5/10/20 yrs later the same cancer on path report ,the ONLY time you will know if you are cured of breast cancer is when you grow old and die of something else!!, furthermore you can “Boost ” the immune system as much as you like it wont stop or cure a cancer , The Immune system DOES NOT reconise cancer cells, cancer cells are your own body cells multiplying out of control the immune system only reconises bacteria and viruses for goodness sake why the heck do you think scientists are trying to develope vacines to TRAIN the Immune system to reconise cancer cells jeeeessseee , please stop being so gullable people lots of people out there makeing a very good liveing out of all this twaddle and nonsence , some of you people really need to wise up and get yourselves EDUCATED. , What utter Piffle!!!!!!!!

Yes Rex your right drugs companies want to make shed loads of money from this terrible illness. But you should research the latest treatment for prostate cancer. It involves turmeric .. And it’s being used by the medical profession so don’t give up believing that cures will be found in the garden or cupboard .

How can natural treatment or Natural FOODS damage our body??? Eat any kind of fruits and you’ll even live longer and healthy everyday. I’ve contributed lots of money for Cancer Research UK in the past just to realize that they’ve never able to produce a cure for cancer which you can supposedly get from your garden for FREE. Of course they don’t want you to know as they can’t make money from it. They want you to get a conventional treatment of Chemo, radio, surgery and pay big sums just to die few years later.
Take your time to look at life for what it really is around you if you wish to seek a breakthrough. Lemon juice could never be considered a cure. It is acidic and full of sugar, exactly what a cancer cell needs to thrive. Stop looking at killing cancer and start looking at helping it grow; paradoxically, look to preventing normal cells from growing without cancer-causing "ingredients" and killing them slowly, and look away from helping them live. You're blinding yourselves with mechanisms. While it is right to look to these mechanisms, you have to look at cancer for what it really is: why and how, philosophically. In the same way, is it right for us to call every person "different" from us when we are abiding by the same mechanisms in a different way? You may be ecstatic over a new car and show this in your speech, body language and thoughts, while I may become "my version" of ecstatic through a different means of the same mechanisms. That is why there are numerous different cancers. Curing one would only cure that one, and while that may be beneficial to treating others with the secrets of its mechanism, it will never be obtained if you chase two rabbits for 10 days with 20 different tools. You will always go hungry. Have patience; look to the trees, the grass and, most importantly, yourself. Look to life and death and you will find your cure.
Can't you see that initially, cancer wants to die?

Having myself lived with an aggressive type of breast cancer for over 6 years, I'm fortunately still alive and kicking thanks to cancer research and conventional cancer treatments that have been proven to work. I also know many people up and down the UK with secondary breast cancer, and many are still alive some 12 or more years later. It's nonsense that anyone who has an aggressive cancer only lasts a few months after treatment; getting the right treatments for each individual's cancer, and finding the ones which work for them, can keep even secondary cancer at bay for many years. Conventional medicine doesn't dismiss "natural" medicine when it has been proven to work: aspirin comes from the bark of the willow tree, and Taxotere, a type of chemo, comes from the yew tree, to give just two small examples. However, so-called "alternative" cancer treatments touted on the internet, with nothing more to offer than anecdotes in YouTube videos and on dubious alternative-cancer-cure websites, DO NOT work. They have no clinical scientific evidence to back them up; they have not been tested in robust clinical trials to prove either efficacy or safety; and they offer no more than false hope to a cancer patient, which is utterly cruel. When someone has been told they have no more treatment options left, it is cruel for these snake-oil salesmen to prey on those vulnerable cancer patients. It robs them of their time, which would be better spent with their families; it robs them of their savings and, in some cases, their homes. There really is no such thing as alternative medicine. There are treatments that work and there are treatments that don't. When a treatment has been proven to work, it is called MEDICINE, and therefore it is no longer alternative.

Well said, Nicholas. I don't know of anyone who has survived aggressive cancer. Some have gained a few months after having chemo, but the treatment made them so ill.
A thought on natural medicine: the treatment for gout stems from the Egyptians and dates back five thousand years. A great many herbs, spices and berries used by people in the past have found their way into modern medicine! When you are told that you have three months to live, and conventional treatment may extend your life by a month or so, if you are prepared to spend your time being sick from the chemo, and there is NO hope of it curing you, then who the hell are any of you to criticise anyone trying something different? Mind over matter is a very strong healer, and if you believe in the herbal, go for it. I have seen too many people go down the conventional treatment route with exactly the same result, death, normally in the month predicted. I also know people who have refused conventional treatment and are still alive years later, and that is a fact! Do NOT try to force people to conform to conventional medicine if they decide not to; it is their choice, not yours. As such, why put up such a case for going down the conventional medical line? Afraid that some of this has a positive result, as opposed to chemo that will poison your body?

Quackery flourishes when conventional treatments are unpleasant and only partially effective. Before the advent of antibiotics there were "alternative" treatments for diseases such as syphilis and TB. Effective modern treatments for these formerly serious diseases have driven the VD and TB quacks away. Sadly, even with modern treatment cancers cannot always be cured (although some cancers are cured and others can be kept at bay for years), so there are frightened and desperate people out there for cancer quacks to exploit. It is good that cancer charities are mobilising information to fight exploitation of the vulnerable.

Alternative therapies make more money in a year than Cancer Research does. I find that staggering, especially when alternative therapies don't have a hope in hell of working, let alone curing anything.
It's quite shocking really just how many cancer patients believe in all this alt rubbish, which sadly seems particularly abundant online. Of course, alt is often aimed at vulnerable, frightened and often desperate cancer patients, so it's very big business. I find it quite disgraceful, having known quite a few friends with cancer turn to alternative therapies because they have been brainwashed by all this silly nonsense online. I think much more needs to be done to make people aware of all the hokum out there surrounding this disease, and to bring to task those websites that continue to promote this kind of nonsense to cancer patients. Thanks, CRUK, for publishing this very good article.

Celia, these cancer treatments that damage people's bodies have got rid of Hodgkin's lymphoma for me and saved my life. To my family, the drug companies deserve to make a lot of money because they are heroes to millions of survivors. Remember, they will only make their millions if they are selling something that can work; maybe not for everybody, but it certainly has for me. x x

I guess CRUK and Kat Arney feel they have to appeal to the lowest common denominator, but it's still depressing how this blog always assumes people are stupid and cannot distinguish between peer-reviewed scientific articles, which are available on the internet, and rubbish. Cancer treatments make massive amounts of money for the drug companies. These drugs damage people's bodies and don't always cure them. We need to be told what is causing cancer so we can help ourselves to stay healthy. And we need cures, not treatments.

Cancer Research UK is a registered charity in England and Wales (1089464), Scotland (SC041666) and the Isle of Man (1103). A company limited by guarantee. Registered company in England and Wales (4325234) and the Isle of Man (5713F). Registered address: Angel Building, 407 St John Street, London EC1V 4AD.
Majikoi vs. Horizon

Yet another in a long series of articles that started out as a comment at Steven's, and had to be moved here. In this case, as the title suggests, I am talking in part about another series, and wanted to avoid the derail in his comments. (I'm a little more hands-off on that kind of thing, but his blog, his rules.)

Steven complains that Majikoi has degenerated from being Dog Days (edit: or BakaTest, if he's not pulling my leg) to something that's terribly derivative. I don't quite get that complaint — after all, if it were all about harmless fights between rival groups of young people, it would be derivative of Dog Days, right? Don't slam a show for not being the sequel to a recent fave.

NSFW: Here's some of that inventive censoring that some folks demand.

But let's take the complaint at face value — it's derivative. Even at that, it's head and shoulders above Horizon in the Middle of Nowhere, which is taking "original" into "incoherent, with inferior fanservice." Both shows have one thing in common — it appears that they started with an action-packed episode that didn't really represent the tenor of the series. Bait-and-switch, as it were. But in Horizon's case, it combines a too-large cast, a total loss of a male lead, complex-to-the-point-of-incoherent politics and background, inconsistent technology, and motivations that make no sense.

She's as fanservice-y as it gets, and she's his sister.

Why is all the world but Japan uninhabitable? Why did the rest of the world move into an alternate dimension that was shaped like Japan? How could it "fall to Earth"? What in the world would give these people the idea that they can recreate "Heaven" by reenacting history? And how are a bunch of schoolkids supposed to change this world?
Let's not even get into the one that got featured this episode, who had to undergo a sex change from girl to boy in order to take service with some great house, only to be laid off mid-change when the lord replaced his staff with androids. Too much is being thrown at the viewer too fast — this might be salvageable if it were a two-cour series and the concepts were added along the way, much like what Scrapped Princess did. But it was designed that way, whereas I suspect if I read the novels from which this was derived, the first 50 pages would read like a David Weber infodump. In other words, fascinatingly boring as hell. (FYI: David Weber, if you didn't know, has a habit in the Honor Harrington series of starting an action scene, then pausing it for three pages of Context You Need to Have. Just start the damn missiles flying already!)

Sometimes, derivative is better than the alternative. It's just not possible to be original all the time. Sometimes, you just gotta give the audience what they're asking for, and I think that's the route they went with Majikoi. This is hardly surprising, seeing as it's coming from a dating sim with little plot and prominent sex scenes. (I reiterate my complaint that not enough of these games that get made into anime are making it to American shores. Then again, I suspect there might be problems with some of the laws involving under-age sex and jail terms. At least, unless you're the head of HoustonMetro.)

NSFW: If you've ever shot a bow, you know what's wrong with this picture.

If there was anything that I found disappointing, it's that Momoyo "doesn't want" Yamato, but she isn't going to let any of the other girls have him either. So instead of hijinks of him fending off amorous girls while chasing his goal (Momoyo), we're going to get Momoyo cock-blocking the competition while not knowing her own heart. Blech.
Well, anyway, I’ll take MajiKoi for what it is — entertaining fluff, with a lot of really good cheesecake. Speaking of which, I’ll leave you with this racy last scene from the bath chase.
Head On Electric's Different Kind of Grunge

Head On Electric's Sleep Slaughter Sheep isn't a debut album, but it could easily be mistaken for one. It's the band's first release for the Milwaukee rock 'n' roll clearinghouse Dusty Medical, and it's the band's first wide release, following some long-out-of-print cassette tapes and small-run CDs. So unless you've diligently frequented their merch table, it's probably the first Head On Electric album you've had a chance to buy. Just as importantly, Sleep Slaughter Sheep has the feel of a debut album, capturing the excitement of a band with more ideas than they could squeeze into a reasonable playing time as they realize how to pare them down and make them cohere into something powerful. Although Head On Electric has been performing since 2005, when singer-guitarist Erik Oley, bassist Steve Deau and drummer Cole Juntila were members of the seven-piece noise circus The Kind of Jazz Music That Kills, it was only fairly recently that the trio began to regard itself as something more than a side project. “The three of us have always taken the band fairly seriously, practicing every week,” Juntila explains. “But I guess until two or three years ago, this was always our secondary band. There were times when we were all in two or three, sometimes even four, bands at the same time. Once this became our main band, we started to put a little more effort into it.” Not coincidentally, it was during that time that Head On Electric began to distinguish itself from its many peers in Milwaukee's magnificently fertile and very crowded garage scene with a sound that filtered the pure melodic sensibilities of garage-pop through the muddy angst of first-wave grunge. There's a bit of the Meat Puppets in the band's punky, bruising rhythms and a shade of J. Mascis in Oley's emotive yowl, as well as a whole lot of Kurt Cobain in his feral screams. The resemblance was at times uncanny when the band performed a Nirvana tribute set this Halloween.
“We all went to high school together, and me and Erik got really into the whole Nirvana thing, although we were hilariously a little late to it—we all only found out that Cobain was dead two years after he had died,” Juntila says of the band's teen years in Pulaski, Wis. “We sort of built from that when we got to Milwaukee. We really loved what was going on with The Catholic Boys and all the other guys here, and we've developed from there.” What thrills so much about that seemingly basic template—the grunge of America's collective youth by way of a very Milwaukee-specific form of garage-punk—is how it leaves so much room for variations and tangents. Throughout Sleep Slaughter Sheep, the band breaks form regularly, detouring into twitchy psychedelia, oddball Americana and unflinching hard-rock. “Erik is the one who writes all of our songs, and he doesn't have a set style,” Juntila says. “He would be the first to admit he's not a singer, which is why our vocals have about 20,000 gallons of reverb on them, but he has such an ear for melody that it doesn't matter. “He's really prolific, and has a new song or five every practice, and they're always a little more impressive and eclectic than the last batch,” Juntila continues. “They never sound like something you'd expect. He'll have this barrage of heavy songs, then he'll break out this beautiful country song. Being in a band with him for so long and watching his songwriting develop has been really incredible.” Head On Electric plays an album release show on Saturday, Jan. 14, with Static Eyes and Mayday at Linneman's Riverwest Inn at 9:30 p.m.
FROM alpine:3.8
ARG VERSION
ENV VERSION master
ARG URLCONTEXT
ENV uid 1337
ENV gid 1337
ENV user lemur
ENV group lemur
RUN addgroup -S ${group} -g ${gid} && \
    adduser -D -S ${user} -G ${group} -u ${uid} && \
    apk --update add python3 libldap postgresql-client nginx supervisor curl tzdata openssl bash && \
    apk --update add --virtual build-dependencies \
        git \
        tar \
        curl \
        python3-dev \
        npm \
        bash \
        musl-dev \
        gcc \
        autoconf \
        automake \
        libtool \
        make \
        nasm \
        zlib-dev \
        postgresql-dev \
        libressl-dev \
        libffi-dev \
        cyrus-sasl-dev \
        openldap-dev && \
    mkdir -p /opt/lemur /home/lemur/.lemur/ && \
    curl -sSL https://github.com/Netflix/lemur/archive/$VERSION.tar.gz | tar xz -C /opt/lemur --strip-components=1 && \
    pip3 install --upgrade pip && \
    pip3 install --upgrade setuptools && \
    mkdir -p /run/nginx/ /etc/nginx/ssl/ && \
    chown -R $user:$group /opt/lemur/ /home/lemur/.lemur/
WORKDIR /opt/lemur
RUN npm install --unsafe-perm && \
    pip3 install -e . && \
    node_modules/.bin/gulp build && \
    node_modules/.bin/gulp package --urlContextPath=${URLCONTEXT} && \
    apk del build-dependencies
COPY entrypoint /
COPY src/lemur.conf.py /home/lemur/.lemur/lemur.conf.py
COPY supervisor.conf /
COPY nginx/default.conf /etc/nginx/conf.d/
COPY nginx/default-ssl.conf /etc/nginx/conf.d/
RUN chmod +x /entrypoint
WORKDIR /
HEALTHCHECK --interval=12s --timeout=12s --start-period=30s \
    CMD curl --fail http://localhost:80/api/1/healthcheck | grep -q ok || exit 1
USER root
ENTRYPOINT ["/entrypoint"]
CMD ["/usr/bin/supervisord","-c","supervisor.conf"]
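For context, a typical build of this image might look like the following invocation. This is only a sketch: the version number and tag are example values, not taken from the Dockerfile itself, and VERSION falls back to master when the build arg is omitted.

```shell
# Example only: build a pinned Lemur version, serving the UI under a sub-path
docker build \
  --build-arg VERSION=0.7.0 \
  --build-arg URLCONTEXT=lemur \
  -t lemur:example .
```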
single instance XULRunner applications

Hi all, I'm developing a XULRunner application on Windows. My application should have only a single instance running at a time, i.e. like a Single Document Application (SDI). But when I click my app's EXE, it launches a new instance of the app every time. I've searched the Mozilla site for this and found a link describing the singletonWindowType preference. I tried it with my app, but it didn't work. (http://developer.mozilla.org/en/docs/toolkit.singletonWindowType)
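For anyone hitting the same wall: per the MDN page linked above, the pref's value has to match the windowtype attribute of the application's main XUL window, or the second launch won't find the existing window to focus. A minimal sketch (the file paths and window names here are just examples, not taken from the poster's app):

```js
// defaults/preferences/prefs.js
pref("toolkit.singletonWindowType", "main-window");

// ...and in the main XUL file, the <window> element must carry the
// matching attribute, e.g.: <window id="main" windowtype="main-window" ...>
```

If the two values don't match, each launch of the EXE will still open a fresh instance.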
open Types

exception ExecError of string

val exec_code : environment -> instruction list -> syntactic_value * environment option
Chicago Blackhawks Charities will sell limited-edition, certified Chicago Blackhawks ice from the 2015 Stanley Cup victory – the organization’s first win on home ice in 77 years – on Friday, Dec. 11. The one-of-a-kind vials and cubes of melted home ice collected by United Center staff can be purchased exclusively at the Blackhawks Store locations on Michigan Avenue, Oakbrook Center and Old Orchard Mall for $99 per vial or $49 per cube, with all proceeds benefiting Blackhawks Charities. Both items will also be available for purchase at the United Center Blackhawks Store location on Friday, Dec. 11. There will be a limited number of vials and cubes available for phone orders starting on Monday, Dec. 14; please contact the Blackhawks Store at 312-759-0079.

Each commemorative vial kit will include a leather-embossed display box engraved with the Blackhawks logo and one glass vial, as well as a certificate of authenticity. Each cube kit will include a box with the Blackhawks logo and one cube, which includes postseason opponents and series scores.

Blackhawks Charities pledges to support programs and institutions throughout Illinois that focus on health and wellness, education and housing, while striving to serve local citizens and impact the lives of youth and their families in and around the city of Chicago. Since its inception in October 1993, Blackhawks Charities has granted over $14 million to local nonprofit organizations.
In “Why Unlimited Mobile-to-Mobile Calling is Evidence of a Lack of Wireless Competition”, Mr. Weinberg suggests that there has been no competition in the wireless marketplace because AT&T is offering a new unlimited mobile-to-mobile plan. He claims that this new plan is evidence that there has not been competition in the market for some time.

Congresswoman Marsha Blackburn (R-TN) grilled FCC Chairman Julius Genachowski about the FCC’s Net Neutrality rules, and whether the rules would cover disputes like the one between Level 3 and Comcast. Genachowski evaded the question and even misrepresented his own rules to preserve the flexibility for future FCC intervention and overreach.

A 57% majority of the US House of Representatives has voted to block funding for the FCC’s Net Neutrality rules passed by a slim FCC majority last December. The vote now heads to the Senate and then the President’s desk. We found the FCC rules to be incoherent because the FCC ignored the record and overreached when it outlawed paid prioritization.

I recently ran into a blog article by Jonathan at WhoIsHostingThis.com, a web hosting review site. The article in question was calling for the Internet as we know it to become a public utility. This is a meme that crops up from time to time, but it has definitely gained some additional traction as of late.

Deep packet inspection and web crawlers had nothing to do with the Egyptian Internet shutdown, but Free Press rarely lets facts get in the way of exploiting a good crisis to call for government hearings. Ironically, it was Free Press asking the FCC to regulate Internet speech for decency.

The new report commissioned by European broadband providers speaks of an impending crisis if content providers don’t pay up, while the content providers continue to propagate the myth that all websites should run at the same speed regardless of what they pay.
But both sides are being ridiculous, and their alarmism could lead to nasty political outcomes that they will both regret. Netflix and its CDN partners want the media and the government to pressure and force consumers to subsidize Netflix to the tune of thousands of Gbps of free bandwidth. Netflix CEO Reed Hastings is misleading all of us when he says that Netflix already pays its share of the bandwidth costs and that it deserves free server capacity.

Barry Collins of PC Pro UK claimed that “ISPs are threatening to cripple websites that don’t pay them first” and set off some predictable outrage against the ISPs. But Collins’ article is a perfect example of scaremongering, because its thesis wasn’t even supported by his own evidence gathered for the article, and some of the quotations seem to have been misinterpreted to throw gas on the fire against ISPs.

Nearly a year late to the party, Nate Anderson repeats the myth that the FCC concluded that higher-speed broadband is turning into a cable DOCSIS 3.0 monopoly outside of Verizon FiOS fiber-to-the-home (FTTH) territory because the fiber-to-the-node (FTTN) VDSL2 technology used by other telcos can’t compete. This is wrong on many levels, because the FCC didn’t make such a conclusion and FTTN can compete with cable. I asked Blair Levin (the man in charge of the National Broadband Plan) to clarify last April and …

About Us

Digital Society is a digital think tank that believes culture and commerce are inseparable, that the digital economy flourishes when people are free and rights are secure, and that free markets free people. Digital Society is an independent 501(c)3 non-profit organization, funded by donations from Jon Henke and from Arts+Labs. We advocate for a pro-culture, pro-commerce digital society through research, analysis and debate on emerging technology issues.

Reply Comments

Transparency and interactivity are trademarks of the Internet era, and we aim to foster them here at Digital Society.
It is inevitable that some people will disagree with the technology policy positions we take. We want to have that constructive debate. The Reply Comments feature gives our critics a chance to respond to our viewpoints and the Digital Society audience convenient access to competing arguments. Any time we directly challenge the views of an individual or a group on this site, the party in question may substantively respond in a guest post.
We've teamed up with our friends at Perfect World Entertainment to celebrate the recent console release of the "Victory is Life" expansion in Star Trek Online by giving away Yellowstone Class Runabout Xbox One codes.

Smaller than a starship but larger than a shuttlecraft, the Yellowstone is the newest Runabout in the Federation line. Like other Runabouts, the Yellowstone is often used as a tow vehicle. It comes equipped with a Tractor Beam to handle these utilitarian missions. The Yellowstone comes with an advanced Tetryon-Plasma Engine. This engine grants a +32 bonus to your Starship Warp Core Potential. The Tetryon-Plasma can also be ejected behind the Runabout. Any enemy ship that passes through the Tetryon-Plasma cloud will be slowed and may have its engines knocked offline.

Coinciding with the 25th anniversary of Star Trek: Deep Space Nine, "Victory is Life" takes players onto the iconic starbase after it has been badly damaged in an invasion by the Hur'q. Spread across seven new episodes, captains will have to come up with a plan to defend the starbase from further incursions and prevent any further damage. You can find further info about the "Victory is Life" expansion, along with screenshots and the launch trailer, here, and you can register and download the game here.

To grab a code, simply complete one of the actions below using the embedded Gleam app. Once completed, your download code will be displayed.

Rules and regulations:
- Entry is via one or more entries using the Gleam app
- The code will be displayed in the Gleam app immediately
- This competition is available worldwide
- Entries close on Wednesday, August 15th at 7PM UTC.
Q: Undefined reference to "timer_create" even though "-lrt" is included at compile

I'm having some issues with a Makefile for a project I am working on. I am getting "undefined reference to 'timer_create'" and similar errors even though the libraries are included in the link options. I think the issue is that the libraries are at the front of the compile line instead of at the end, but I am pretty unfamiliar with a Makefile like this. How can I ensure the libraries come at the end instead of at the beginning? When I run make, it tries this:

gcc -g -lpthread -lrt -Wall -o scheduler scheduler.o worker.o list.o smp5_tests.o testrunner.o

But I am pretty sure it should be this:

gcc -g -Wall -o scheduler scheduler.o worker.o list.o smp5_tests.o testrunner.o -lpthread -lrt

Here is the Makefile:

CC = gcc
CCOPTS = -c -g -Wall
LINKOPTS = -g -lpthread -lrt -Wall
EXEC = scheduler
OBJECTS = scheduler.o worker.o list.o smp5_tests.o testrunner.o

all: $(EXEC)

$(EXEC): $(OBJECTS)
	$(CC) $(LINKOPTS) -o $@ $^

A: I think the issue is that the libraries are at the front of the compile line instead of at the end. Well, this is specific to the GNU ld linker, not to make itself: ld resolves dependencies in a single pass from left to right, except between -Wl,--start-group and -Wl,--end-group (which is intended specially for handling circular dependencies). This means that the libraries must be put after the modules (or other libraries) that use them.

How can I ensure the links come at the end instead of at the beginning? Consider how it's done with (a simplified version of) the default rule:

%: %.o
	$(CC) $(LDFLAGS) $^ $(LDLIBS) -o $@

Here LDFLAGS are "normal" linker flags which can safely precede the objects list, and LDLIBS is the list of system libraries used by the program.
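Applying that convention back to the asker's Makefile would look roughly like this. This is a sketch, not tested against the poster's sources; the %.o pattern rule is added for completeness, and recipe lines must be indented with a tab:

```makefile
CC      = gcc
CFLAGS  = -c -g -Wall
LDFLAGS = -g -Wall          # flags that may safely precede the object files
LDLIBS  = -lpthread -lrt    # libraries: must come AFTER the objects that use them
EXEC    = scheduler
OBJECTS = scheduler.o worker.o list.o smp5_tests.o testrunner.o

all: $(EXEC)

# Link step: objects first, libraries last, so ld's single left-to-right
# pass can resolve timer_create and the pthread symbols.
$(EXEC): $(OBJECTS)
	$(CC) $(LDFLAGS) $^ $(LDLIBS) -o $@

%.o: %.c
	$(CC) $(CFLAGS) $< -o $@
```

With this layout the link line becomes `gcc -g -Wall scheduler.o ... -lpthread -lrt -o scheduler`, which is exactly the ordering the asker guessed was needed.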
553 Pa. 233 (1998) 719 A.2d 217 COMMONWEALTH of Pennsylvania, Appellee, v. John Charles LESKO, Appellant. Supreme Court of Pennsylvania. Submitted September 15, 1997. Decided May 21, 1998. *238 Rabe F. Marsh, III, Greensburg, Paul G. Kay, Pittsburgh, for John Lesko. John Peck, Greensburg, Robert A. Graci, Harrisburg, for Com. Before FLAHERTY, C.J., and ZAPPALA, CAPPY, CASTILLE, NIGRO, NEWMAN and SAYLOR, JJ. OPINION OF THE COURT FLAHERTY, Chief Justice. In 1981, the appellant, John Charles Lesko, was convicted of murder of the first degree and conspiracy and sentenced to death for the killing of a police officer, Leonard C. Miller. Appellate and post-conviction review of the conviction and sentence resulted in the exhaustion of state remedies and the denial of certiorari by the Supreme Court of the United States. See Commonwealth v. Travaglia, 502 Pa. 474, 467 A.2d 288 (Pa.1983) (direct appeal), cert. denied, 467 U.S. 1256, 104 S.Ct. 3547 (1984); Commonwealth v. Lesko, 509 Pa. 67, *239 501 A.2d 200 (Pa.1985) (post-conviction review), reargument denied, 509 Pa. 625, 506 A.2d 897 (Pa.1986), cert. denied, 479 U.S. 1101, 107 S.Ct. 1328, 94 L.Ed.2d 179 (1987). In subsequent habeas corpus proceedings, the sentence of death was reversed by the United States Court of Appeals for the Third Circuit on the basis that improper prosecutorial comments made during the penalty phase of trial had tainted the jury's sentencing decision. Lesko v. Lehman, 925 F.2d 1527 (3d Cir.1991), cert. denied, 502 U.S. 898, 112 S.Ct. 273, 116 L.Ed.2d 226 (1991). The case was remanded to federal district court to resolve an evidentiary issue relevant to resentencing. Id. In 1995, a resentencing proceeding was held in the Court of Common Pleas of Westmoreland County. Appellant was again sentenced to death.[1] The present appeal ensued. 
Appellant's first contention is that, in 1991, when his initial sentence of death was reversed, a remand for imposition of a sentence of life imprisonment should have followed. Under the sentencing statute that was previously in effect, a remand for imposition of a sentence of life imprisonment would indeed have been required. Commonwealth v. Wharton, 542 Pa. 83, 665 A.2d 458, 460 (Pa.1995), cert. denied, 517 U.S. 1247, 116 S.Ct. 2504 (1996); Commonwealth v. Young, 536 Pa. 57, 637 A.2d 1313, 1316 (Pa.1993). However, the statute was amended in 1988 to provide that a new sentencing hearing must be conducted whenever a sentence of death is vacated, except where it is vacated for disproportionality or lack of evidence of aggravating factors. 42 Pa.C.S. § 9711(h)(4). Appellant argues that his due process and ex post facto rights under the federal and state constitutions were violated by subjecting him to resentencing under the amended statute. The same arguments have already been rejected by this court. We have repeatedly held that the revised sentencing provision can be applied to cases, like appellant's, that were pending in the appellate process at the time of the amendment. Commonwealth v. Wharton, 665 *240 A.2d at 460; Commonwealth v. Young, 637 A.2d at 1316-18 (no violation of ex post facto clause). See also Commonwealth v. Chambers, 546 Pa. 370, 685 A.2d 96, 100-02 (Pa.1996) (no denial of due process), cert. denied, ___ U.S. ___, 118 S.Ct. 90 (1997). Appellant further asserts that the resentencing provision does not apply where a death sentence has been vacated by any court other than the Supreme Court of Pennsylvania. He relies on the following language in 42 Pa.C.S. § 9711(h)(4): "If the Supreme Court determines that the death penalty must be vacated for any other reason [i.e., reasons other than disproportionality or lack of evidence of aggravating circumstances], it shall remand for a new sentencing hearing pursuant to subsections (a) through (g)." 
Appellant reasons that because the statute does not specify what is to occur when courts other than this one vacate a sentence, the legislature must have intended that there would be no new sentencing hearing and that a remand for imposition of a life sentence would occur. Such an approach would lead, however, to a highly irrational sentencing scheme. Those whose sentences are vacated by this court would be in a far worse position than those whose sentences are vacated by other courts, since the former would be at risk of incurring another death sentence while the latter would not. The legislature cannot be deemed to have intended such an illogical result. See 1 Pa.C.S. § 1922(1) (presumption that the legislature did not intend a result that is absurd or unreasonable). Further, appellant asserts that the resentencing provision is inapplicable to cases that were pending before any court other than the Supreme Court of Pennsylvania at the time of the 1988 amendment. Appellant notes that, because we completed appellate and post-conviction review of this matter in 1986, this case was not pending before us at the time of the 1988 amendment. The statute contains no language, however, that makes it applicable only to cases pending in a particular court. In fact, the legislature expressly designated that the amendment should be applied to "all criminal cases *241 and appeals pending on the effective date of this act." Act of 1988, Dec. 21, P.L. 1862, No. 179, § 3 (emphasis added). This plainly sets no limits as to the courts in which cases and appeals were pending. Appellant next contends that it was error to allow the Commonwealth to introduce evidence of two other murder convictions as aggravating circumstances for the present murder, inasmuch as evidence of the other convictions was not introduced at the original sentencing proceeding. We do not agree. 
The other murders, namely those in which Peter Levato and Marlene Sue Newcomer were victims, were committed prior to appellant's trial in this case. No trials in the other cases had yet taken place. Hence, the convictions were simply not available for use at appellant's original sentencing hearing. Regardless, it is well established that the Commonwealth may introduce new aggravating factors in the penalty phase of a second capital murder trial without providing an excuse for not having presented those factors in the first trial. Commonwealth v. Zook, 532 Pa. 79, 615 A.2d 1, 19-21 (Pa. 1992), cert. denied, 507 U.S. 974, 113 S.Ct. 1420, 122 L.Ed.2d 789 (1993). Appellant also asserts that allowing admission of the other convictions serves to reward the Commonwealth for prior misconduct in that there would have been no second penalty hearing and therefore no use of the other convictions if the Commonwealth had not made improper remarks at the previous hearing. In reversing appellant's sentence of death, the United States Court of Appeals for the Third Circuit held that certain remarks by the Commonwealth at the penalty hearing had tainted the sentencing determination. However, there was absolutely no indication that the remarks were viewed as being so intentional and egregious as to constitute misconduct. Lesko, 925 F.2d at 1540-46. See generally Commonwealth v. Smith, 532 Pa. 177, 615 A.2d 321, 325 (Pa. 1992) (intentional prosecutorial misconduct). In fact, it was only when the remarks were considered cumulatively that habeas *242 corpus relief was deemed warranted at all. Id. at 1541, 1546. Appellant's assertion is without merit. It is next alleged that the trial court erred in dismissing a juror for cause.
The juror, Arlene Selepic, asked the following question while she was undergoing voir dire by the Commonwealth regarding her ability to render a verdict of death: "In the State of Pennsylvania if a person is granted or is sentenced to life in prison, is that with the proviso there is no parole possible?" Defense counsel nodded his head in the affirmative and proclaimed, "The judge will charge the jury that life imprisonment in Pennsylvania is without parole." The Commonwealth immediately challenged the accuracy of this remark. The court responded by telling Selepic that life imprisonment "has no requirement of parole," and added that the matter could not be explained any further at that time. A challenge for cause was made by the Commonwealth. The court sustained the challenge on the basis that the information conveyed to Selepic by defense counsel was information that other jurors might or might not receive during the sentencing hearing. The court noted that the question of whether, and under what circumstances, jurors should be instructed that a life sentence means life without parole had not yet been resolved by Pennsylvania courts.[2] A trial court's decision to discharge a juror will not be reversed absent a palpable abuse of discretion. Commonwealth v. Jacobs, 536 Pa. 402, 639 A.2d 786, 790 (Pa.1994). No such abuse is evident here, inasmuch as the court's action was a mere precaution designed to assure that the jury would not be tainted by the information that defense counsel conveyed. Appellant's next contention is that the Commonwealth, during cross-examination and closing argument, improperly commented on his fifth amendment privilege. At the resentencing hearing appellant testified that, around the time of the Levato, *243 Newcomer, and Miller murders, he was taking drugs and abusing alcohol. He also testified that he was upset and traumatized by sexual molestation that he and his brother experienced during their childhood years. 
During the original sentencing hearing appellant testified regarding various aspects of his past, but he did not mention any drug or alcohol abuse. Nor did he mention any molestation. Similarly, confessions that he gave to the police contained no references to such matters. At the resentencing hearing, appellant was cross-examined by the Commonwealth regarding his failure to have ever previously mentioned drug and alcohol abuse and sexual molestation. Appellant conceded that he had an opportunity to include such details in his confessions, but that he did not do so. He also admitted that he did not mention these matters in testimony that he gave at his original sentencing hearing. Consequently, the Commonwealth made the following comments during its closing argument: And isn't it interesting, ladies and gentlemen, in 1981 when John Lesko testified before another jury, he didn't tell them that he was under the influence of drugs and alcohol. Don't you think that would be important when you committed these homicides, Peter Levato, Marlene Sue Newcomer and Officer Miller, to tell the jury that I was drunk, that I had drugs and didn't know what I was doing, don't you think that would be something you would remember? . . . After 15 years, he's remembered that detail. He remembers all of his drug abuse and alcohol abuse started early, continued, only got worse, but in 1981, he didn't mention a word to the jury about that, a word. . . . . Remember too when he testified in 1981, there's no rage, there's no hostility about the molestation of his brother . . . . That was important to him. Why didn't he mention it? Appellant's contention that the Commonwealth's cross-examination and closing argument amounted to an impermissible comment on his right to remain silent is without basis. *244 Appellant's testimony regarding drug and alcohol abuse and sexual molestation was directed at establishing mitigating circumstances for sentencing. See 42 Pa.C.S. 
§ 9711(e) (mitigating circumstances). The cross-examination and closing remarks of the Commonwealth simply tested the veracity of the mitigating testimony that appellant presented. As the Third Circuit Court of Appeals noted when it reversed appellant's original sentence of death, Lesko provided testimony of a biographical nature at the penalty phase of his trial. Clearly, then, he could not claim a fifth amendment privilege against cross-examination or prosecutorial comment on matters reasonably related to his credibility or the subject matter of his testimony. See Harrison v. United States, 392 U.S. 219, 222, 88 S.Ct. 2008, 2010, 20 L.Ed.2d 1047 (1968) ("A defendant who chooses to testify waives his privilege against compulsory self-incrimination with respect to the testimony he gives . . ."). 925 F.2d at 1542. Unlike the original sentencing proceeding where the Commonwealth was deemed to have improperly commented on appellant's failure to testify regarding the merits of the charges against him, see Lesko, 925 F.2d at 1544, the Commonwealth's cross-examination and closing argument at the resentencing hearing addressed only the credibility of the testimony that appellant provided. The Commonwealth was perfectly within its bounds to challenge the credibility of biographical testimony that appellant provided. Appellant next argues that the following excerpt from the Commonwealth's closing argument constituted an impermissible appeal to vengeance: Remember all of the sympathy that he showed him, all the mercy that he showed him. When you think about Marlene Sue Newcomer, think about her handcuffed in the back seat of her car with a blanket over her, being shot at and then being killed and remember all of the mercy and all the sympathy that John Lesko showed her. 
On January 1, 1980, and the climax of these eight days, remember Officer Miller being goaded, literally goaded into chasing Travaglia and when he comes up to him, being killed and being urged *245 by John Lesko to hit him again. Remember his prideful remarks that night, I wanted to kill him. Remember all of the mercy and all the sympathy that John Lesko showed Officer Leonard Clifford Miller on January 3, 1980. In fact, appellant claims that this was essentially identical to the appeal for vengeance that was made during his original sentencing hearing. We do not agree. At the original hearing, the prosecutor argued that the death penalty should be imposed to fulfill the jury's "duty" to even the "score," which stood at "John Lesko and Michael Travaglia two, Society nothing." Lesko, 925 F.2d at 1545-46. See generally Commonwealth v. Johnson, 542 Pa. 384, 668 A.2d 97, 107-08 (Pa.1995) (prosecutorial remarks at penalty hearing must not "arouse the jury's emotions to such an extent that it is impossible for the jury to impose a sentence based on relevant evidence"), cert. denied, ___ U.S. ___, 117 S.Ct. 90 (1996). The comments at the resentencing proceeding were of a different sort. They focused on details of the crimes, and encouraged the jury to pay particular attention to the callous manner in which appellant acted. This did not invite a retaliatory sentencing decision; rather, it invited a decision that took into account all of the relevant facts — including those which shed light on appellant's cold-hearted and unmerciful character. Consideration of evidence relating to character is properly at the core of the sentencing decision. Commonwealth v. Beasley, 505 Pa. 279, 479 A.2d 460, 465 (Pa.1984). The remarks in question would not have impeded the jury in rendering a verdict based on the evidence. 
Appellant next argues that by again being sentenced to death, after having been imprisoned on death row for sixteen years, he is being subjected to cruel and unusual punishment in violation of the eighth amendment to the United States Constitution and article I, § 13 of the Pennsylvania Constitution. Appellant describes death row as a hostile and hopeless environment where there are "shouts of the ugliest profanities" and where "basic concentration is next to impossible." Further, he reports that death row houses some of the Commonwealth's most violent offenders, and that many of *246 these people do not conduct themselves in a manner that he finds tolerable; for example, they refuse to take showers and they utter "unintelligible shrieks." We are not persuaded that, by being housed with people like himself who have committed crimes so reprehensible to society as to warrant imposition of the death penalty, appellant has been subjected to an unconstitutional punishment. It has been repeatedly held that the death penalty does not constitute cruel and unusual punishment. E.g., Commonwealth v. Hardcastle, 519 Pa. 236, 546 A.2d 1101, 1111 (Pa.1988), cert. denied, 493 U.S. 1093, 110 S.Ct. 1169, 107 L.Ed.2d 1072 (1990). This conclusion is not altered by the fact that, prior to being executed, one ordinarily spends substantial time in incarceration while appeals challenging one's sentence are adjudicated. Appellant's next claim is that the trial court erred in excluding certain mitigating testimony at the penalty hearing. It is well established that a defendant is to be accorded wide latitude in demonstrating mitigating circumstances. See Commonwealth v. Travaglia, 467 A.2d at 300. The testimony that appellant sought to introduce was, however, properly excluded. It was, quite simply, not relevant to proof of mitigating circumstances. 
Specifically, the court did not permit a social worker, Lois Nardone, to testify regarding tangential aspects of appellant's family background that were not known to appellant before he committed the crime. Nardone provided extensive testimony about various aspects of appellant's family history. But when she attempted to testify regarding the marital relationship of appellant's grandmother, including extraneous details such as whether the grandmother slept alone or engaged in sex only on paydays, the testimony was excluded as irrelevant. Nardone was unable to provide any basis for belief that appellant would have been aware of such details. Matters of which appellant had no knowledge would certainly not have been mitigating factors in his criminal acts. The court also excluded small portions of the testimony of Hamid Abdul, an Islamic chaplain who served as a *247 religious mentor and counselor to appellant in prison. Abdul testified regarding the prison environment where appellant was incarcerated, and about appellant's personality, interests, and behavior. Abdul was not, however, permitted to state his personal opinion as to whether appellant is remorseful, though he was allowed to say that appellant's appearance is demonstrative of remorse. Further, Abdul was not permitted to explain his personal reasons for testifying in support of appellant. Nor was he allowed to state whether the death penalty is recognized in the Islamic tradition. In excluding this testimony, the trial court reasoned that Abdul's personal opinions and motivations, as well as the traditions of Islamic culture, were not relevant to the statutory mitigating circumstances. See 42 Pa.C.S. § 9711(e) (mitigating circumstances). We agree. Next, appellant argues that it was error for the Commonwealth to introduce evidence of three aggravating circumstances at sentencing. Appellant's claim is that the prosecutor created three aggravating circumstances out of two prior murder convictions. 
The Commonwealth introduced evidence that appellant was convicted of two prior murders: those of Peter A. Levato and Marlene Sue Newcomer. It also introduced evidence that appellant was convicted of conspiracy to commit murder in the first degree in each case (Levato, Newcomer and Miller). Finally, it introduced evidence that Officer Leonard Miller was killed in the performance of his duties. Appellant's argument is that using each murder conviction as a separate aggravating circumstance and also using those convictions to show felony convictions involving the use or threat of violence to the person allows the Commonwealth an impermissible double usage of the evidence. 42 Pa.C.S. § 9711(d) provides, in pertinent part: (d) Aggravating circumstances. Aggravating circumstances shall be limited to the following. *248 (1) The victim was a ... peace officer ... who was killed in the performance of his duties or as a result of his official position. . . . . (9) The defendant has a significant history of felony convictions involving the use or threat of violence to the person. (10) The defendant has been convicted of another Federal or State offense, committed either before or at the time of the offense at issue, for which a sentence of life imprisonment or death was imposable or the defendant was undergoing a sentence of life imprisonment for any reason at the time of the commission of the offense. It is plain that the killing of Officer Miller in the performance of his duties was an aggravating circumstance under (d)(1). It is also plain that the convictions of conspiracy to commit murder were aggravating circumstances under (d)(9) in that they were a significant history of felony convictions involving the use of violence. Appellant complains, however, that the convictions for the murders of Peter Levato and Marlene Newcomer are but one aggravating circumstance either under (d)(9) or under (d)(10), and that they may not be used also as separate aggravating circumstances.
The argument is meritless. Section (d)(10) allows the jury to consider as an aggravating circumstance "another Federal or State offense ... for which a sentence of life imprisonment or death was imposable," and (d)(9) allows the jury to consider as an aggravating circumstance a significant history of felony convictions involving the use of violence. Nothing in the statute provides that a criminal conviction may be considered under only one subsection. Here, the murders of Levato and Newcomer fit under both (d)(9) and (d)(10), and were, therefore, properly considered by the jury as presenting aggravating circumstances under both (d)(9) and (d)(10). Next, appellant claims that it was error to admit into evidence photographs of the crime scenes of all three murders, *249 the two revolvers and bullets used in the slayings, the autopsy reports in the Levato and Newcomer cases, and statements made by appellant following the initial trial. The essence of appellant's claim is that this evidence is irrelevant, immaterial and unduly prejudicial. We rejected a similar claim in Commonwealth v. Young, 637 A.2d at 1320, where we stated: Beasley makes it clear that a sentencing court does not err when it refuses to limit evidence of this particular aggravating circumstance [prior convictions] to a certified record and sanitized summary of the circumstances of the prior crimes. The jury is entitled to know more than the mere fact of conviction. In Beasley we stated: In this Commonwealth, sentencing has long been regarded as having at its core a function of character analysis . . . and the central idea of the present sentencing statute is to allow a jury to take into account such relevant information, bearing upon a defendant's character and record, as is applicable to the task of considering the enumerated aggravating circumstances. . . .
The nature of an offense, as ascertained through examination of the circumstances concomitant to its commission, has much bearing upon the character of a defendant, and, indeed, without reference to those facts and circumstances, consideration of "convictions" would be a hollow process, yielding far less information about a defendant's character than is relevant. 479 A.2d at 465. Bearing in mind that the Commonwealth has the burden of proving the existence of aggravating circumstances, one of which was that appellant had a significant history of felony convictions involving the use of violence, evidence of the revolvers, bullets, a photograph of the crime scene and autopsy reports were relevant to the jury's understanding of the nature of the offenses. With respect to photographs, the only photograph admitted over defense objection was a black and white photograph of the crime scene in the Miller homicide. It showed the officer's body lying in the *250 road under a blanket. This photograph served to familiarize the jury with the facts and circumstances of the crime and was not unduly prejudicial. Similarly, the guns, bullets and autopsy reports were also relevant to the jury's understanding of the nature of the offenses for they concern the nature of the prior crimes and the character of the person who committed them. Since there has been no showing of undue prejudice, this evidence was a relevant and proper factor in the jury's consideration of the sentence which should be imposed. Similarly, appellant's claim of error with respect to cross-examination concerning his statement that he had a better chance of being hit by a car than getting the electric chair, and that he would be out of jail in ten years, is without merit. At the sentencing hearing, appellant claimed that he was remorseful. Cross-examination concerning this statement was proper in light of the claim of remorse. There was no error in admission of any of the complained of evidence. 
Next, appellant argues that it was error to admit the facts of the Nichols homicide in the sentencing proceeding. In a habeas corpus proceeding filed by appellant, a United States district court conducted a hearing concerning the voluntariness of appellant's guilty plea in the Nichols matter. The district court ruled that appellant's guilty plea was wrongfully induced by the representation that it would not be introduced during the penalty phase of the Westmoreland County case involving the murder of Officer Miller. Since the Nichols guilty plea was introduced in the Miller case, the district court determined that a second penalty proceeding was to be conducted in the Miller case and that evidence of the guilty plea in the Nichols matter was not to be introduced in the second Miller penalty trial. Lesko v. Lehman, Civil Action No. 86-1238, U.S. Dist. Lexis 1123, 1992 WL 717815 (W.D.Pa.1992). At the second penalty hearing, over defense objection, the Commonwealth introduced evidence of the Nichols conviction and the circumstances of the Nichols murder, but not the guilty plea. The evidence which was introduced was not considered as an aggravating factor in sentencing; instead, it was used to demonstrate appellant's motive and intent in *251 killing Officer Miller. As Judge Scirica stated in the first Third Circuit appeal:[3] At trial, Lesko and Travaglia's sole defense to the charge of first degree murder was that they each lacked the requisite intent to kill. Lesko's counsel argued principally that his client was at most guilty of felony-murder. He argued that in instigating the police chase, defendants planned first to divert the officer from the Stop-and-Go store, and later return to rob the establishment. Therefore, Lesko's lawyer urged, the killing was not pre-meditated, but was the unintended result of a botched robbery attempt. . . . . 
[T]he central issue in the Commonwealth's case against Lesko was whether he deliberately supported Travaglia in a premeditated killing of Miller or whether he was guilty only of participating in an abortive attempt at robbery. Members of the jury could not have determined what was in Lesko's mind based solely on the events immediately preceding Miller's death. The jury could only have fairly evaluated the Commonwealth's theory regarding Lesko's state of mind by hearing evidence tending to show that *252 Travaglia and Lesko had jointly embarked that evening on a crime spree, that they had already committed a homicide likely to command the death penalty, that they had in their possession powerful evidence of their guilt of that homicide. Moreover, to be in a position to evaluate Lesko's state of mind during the critical moments during the Miller encounter, the jury needed to hear sufficient details about these matters to be able to appreciate the nature of the evening's joint undertaking, the relationship and mood of the participants, and the extent of the criminal exposure of those participants in the event of their apprehension by Miller. Lesko v. Owens, 881 F.2d 44, 47, 54 (3d Cir.1989) (footnote omitted). The issue, thus, is whether it was error to introduce evidence of the Nichols killing in order to prove appellant's motive and intent in killing Officer Miller when the district court had ruled that the guilty plea in the Nichols case was not to be introduced in the second penalty trial. The trial court charged the jury as follows: You have heard evidence concerning the killing of William Nichols in Indiana County. This evidence may be considered only as evidence tending to show the motive for and the circumstances surrounding the killing of Officer Leonard Miller. This event is not to be considered by you as an aggravating circumstance upon which you might base a sentence of death. 
That is, that the killing of William Nichols is not to be considered as an aggravating circumstance, not as a conviction to determine if Mr. Lesko has a significant history of felony convictions involving the use or threat of violence toward a person or persons, nor as a conviction of any state offense committed either before or at the time of the offense at issue for which a sentence of life imprisonment or death was imposable. I want it to be clear to you that you are not to use the evidence of the Nichols killing to determine any of the aggravating circumstances described to you. *253 As we have already stated, the circumstances surrounding the killing are admissible for the jury to consider in imposing sentence. Commonwealth v. Young, supra. Because the jury was clearly instructed that the Nichols matter was introduced into evidence only for the purpose of showing the motive and circumstances surrounding the killing of Officer Miller, it was not error to admit evidence of the Nichols evidence for the narrow purpose which was instructed. Next appellant asserts that he was denied due process and equal protection of the law because he was not granted the same proportion of challenges to alternate jurors as he was to jurors. Pa.R.Crim.P. 1126 provides that in capital felony trials involving one defendant, the Commonwealth and the defendant each has twenty peremptory challenges. That is a ratio of 10 challenges to every 6 jurors. However, in selecting alternate jurors, Pa.R.Crim.P. 1108 provides that each party is entitled to one peremptory challenge for each two alternate jurors. Thus, the ratio for alternate jurors in this case was 3 challenges to every 6 jurors. The claim is without merit. First, since both parties are afforded the same number of challenges, there can be no equal protection violation. Second, appellant has articulated no injury from the application of the rules. The fact that an alternate juror became a juror is not, in itself, an injury. 
Third, rules 1126 and 1108 provide an evenhanded and reasonable process for the selection of jurors. Fourth, appellant offers no authority for the proposition that due process is violated merely by providing for a different number of peremptory challenges for jurors and alternate jurors. And finally, as this court stated in Commonwealth v. Morales, "There is no constitutional right to any peremptory challenges, let alone any particular number of challenges." 494 A.2d 367, 373 n. 5 (Pa.1985). Finally, appellant asserts that his second penalty trial was barred by the double jeopardy clauses of the United States and the Pennsylvania constitutions. In Commonwealth v. Smith, supra, this court held that the double jeopardy *254 clause of the Pennsylvania Constitution bars reprosecution "not only when prosecutorial misconduct is intended to provoke the defendant into moving for a mistrial, but also when the conduct of the prosecutor is intentionally undertaken to prejudice the defendant to the point of the denial of a fair trial." 615 A.2d at 325. In Smith the prosecution withheld evidence that one of its witnesses was promised lenient treatment in exchange for his testimony and it intentionally withheld material exculpatory evidence. No court which has reviewed this case has determined that the prosecutor's conduct intentionally and impermissibly prejudiced the appellant. The Third Circuit Court of Appeals in Lesko v. Lehman, supra, after a careful review of the Commonwealth's conduct in this case, concluded that the prosecutor had impermissibly referred to appellant's silence and had impermissibly appealed to vengeance in his closing remarks. The United States District Court subsequently determined that the Commonwealth reneged on the plea bargain between appellant and the county prosecutor. 
Although these errors required a reversal of the imposition of the death penalty, neither the Third Circuit Court of Appeals nor any other court has determined that the prosecutor's conduct was an intentional attempt to impermissibly prejudice appellant. It was, instead, erroneous. Accordingly, this claim is without merit, for the prosecutorial misconduct in this case does not rise to the level of intentional misconduct and double jeopardy is not, therefore, implicated. Finally, after a thorough review of the record we conclude that the record supports the aggravating circumstances as found by the jury. Additionally, we have conducted a proportionality review as is required by 42 Pa.C.S. § 9711(h)(3)(iii) and find the sentence to be proportionate to sentences imposed in similar cases. There is no basis to find that the sentence of death was the product of passion, prejudice or any other arbitrary factor. Accordingly, the death sentence imposed by the Court of Common Pleas of Westmoreland County *255 is affirmed.[4] NIGRO, J., concurred in the result. NOTES [1] Appellant's conviction was amply supported by the evidence, as recounted in Commonwealth v. Travaglia, supra. [2] Appellant's resentencing hearing preceded our decision in Commonwealth v. Christy, 540 Pa. 192, 656 A.2d 877, 888-89 (Pa.1995), cert. denied, 516 U.S. 872, 116 S.Ct. 194, 133 L.Ed.2d 130 (1995), wherein we addressed the matter of providing jury instructions regarding the meaning of a "life sentence." [3] There were two appeals to the Third Circuit involving Lesko's habeas corpus petition. In the first appeal, Lesko v. Owens, 881 F.2d 44 (3d Cir.1989), the Third Circuit reversed the district court's issuance of the writ, holding that Lesko was not deprived of a fair trial by the introduction of evidence of his role in the prior murder of William Nichols. However, the court remanded the case to the district court to consider the remaining claims in Lesko's habeas corpus petition. 
On remand the district court dismissed Lesko's petition after considering the remaining claims, but without an evidentiary hearing. On appeal, Lesko v. Lehman, 925 F.2d 1527 (3d Cir.1991)(the second appeal), the Third Circuit again reversed the district court, holding that the district court erred in failing to hold an evidentiary hearing on the claim that introduction of evidence of his guilty plea to the Nichols murder at the penalty phase of the Miller trial violated his due process rights and that the jury's sentencing was tainted by improper prosecutorial remarks during the penalty phase of the trial. The Third Circuit ordered the district court to conduct an evidentiary hearing on the issue of the voluntariness of Lesko's Indiana County guilty plea (the Nichols murder) and to issue a writ of habeas corpus subject to the holding of a Pennsylvania resentencing proceeding. If the district court concluded as a result of its evidentiary hearing that the Nichols guilty plea was not voluntary, then evidence of the guilty plea was not to be introduced at the resentencing proceeding. [4] The prothonotary of the Supreme Court is directed to transmit the complete record in this case to the Governor, 42 Pa.C.S. § 9711(i).
In the United States Court of Federal Claims
OFFICE OF SPECIAL MASTERS
No. 12-589V
Filed: July 11, 2013
Not for Publication

STEPHEN SALVO, Petitioner,
v.
SECRETARY OF HEALTH AND HUMAN SERVICES, Respondent.

Damages decision based on stipulation; influenza vaccine; neurologic injuries; Bell’s palsy

Ronald C. Homer, Boston, MA, for petitioner. Lisa A. Watts, Washington, DC, for respondent.

MILLMAN, Special Master

DECISION AWARDING DAMAGES[1]

On July 11, 2013, the parties filed the attached stipulation in which they agreed to settle this case and described the settlement terms. Petitioner alleges that he suffered neurologic injuries, including Bell’s palsy, that were caused by his September 22, 2009 receipt of influenza vaccine. Respondent denies that petitioner’s Bell’s palsy, or any other injury, was caused by flu vaccine. Nonetheless, the parties agreed to resolve this matter informally.

[1] Because this unpublished decision contains a reasoned explanation for the special master's action in this case, the special master intends to post this unpublished decision on the United States Court of Federal Claims's website, in accordance with the E-Government Act of 2002, Pub. L. No. 107-347, 116 Stat. 2899, 2913 (Dec. 17, 2002). Vaccine Rule 18(b) states that all decisions of the special masters will be made available to the public unless they contain trade secrets or commercial or financial information that is privileged and confidential, or medical or similar information whose disclosure would constitute a clearly unwarranted invasion of privacy. When such a decision is filed, petitioner has 14 days to identify and move to delete such information prior to the document's disclosure. If the special master, upon review, agrees that the identified material fits within the banned categories listed above, the special master shall delete such material from public access.
The court hereby adopts the parties’ said stipulation, attached hereto, and awards compensation in the amount and on the terms set forth therein. Pursuant to the stipulation, the Court awards petitioner a lump sum of $175,000.00, representing compensation for all damages that would be available under 42 U.S.C. § 300aa-15(a). The award shall be in the form of a check for $175,000.00 made payable to petitioner. In the absence of a motion for review filed pursuant to RCFC Appendix B, the clerk of the court is directed to enter judgment herewith.[2]

IT IS SO ORDERED.

Dated: July 11, 2013

s/ Laura D. Millman
Laura D. Millman
Special Master

[2] Pursuant to Vaccine Rule 11(a), entry of judgment can be expedited by each party, either separately or jointly, filing a notice renouncing the right to seek review.
"The future is dangerous" - FabCity Summit 2018 17/07/2018 At this year’s Fab City summit, professor Carlos Moreno of the Sorbonne painted a potentially grim picture of the future. “The future is dangerous,” he declared, a prognosis which he had the maps and data to support. Speaking at the event, Saskia Sassen, who made a recent appearance in Brussels as part of the EUROCITIES Cities4Europe campaign, warned of the risks of letting the interests of international financial markets take precedence over the real human beings who live and work in cities. Nathalie Guri, projects and knowledge sharing director of EUROCITIES, was present at the event to represent Sharing Cities. Mrs Guri can attest to the fact that it wasn’t all doom and gloom. Again and again during the summit the message surfaced, amongst the speakers and the 900-strong audience, that cities hold the power to create new paradigms for resilience, innovation, justice and inclusiveness. To realise this power, the message was that large cities must take control of the data generated in their territories. This does not mean jeopardising residents’ ‘digital sovereignty’, but rather ensuring that data is opened up. This will allow government, citizens, universities, civil society and industry to co-create solutions that work for everyone.
The problem with a desk job is low energy burnoff. My engine runs hot, at a high rpm. If I idle for a while, I get edgy. It affects the way I walk, even the way I think. Just getting a soda became a mini-adventure as my imagination ran with images and stories. One of those times where ... well ... if there were a playground nearby I'd take a ten minute recess and come back focused and collected.
On all orders over £30.00 excluding VAT to all UK mainland addresses and for most non-mainland UK addresses. Bought this for my elderly mother to enable her to maintain her independence and it was easy to set up and she managed to use it with very little guidance. It’s a sturdy piece of equipment and folds up easily for transporting. Great service. A deterioration in the health of the person I bought this for prompted me to request urgent delivery. It was delivered the next day after request. Brilliant service. Arrived on time. Excellent piece of equipment well designed and made. Does what it is intended to do. Thank you to the driver who left it in my safe place as I was out. There was an error in the ordering and I was anxious that I might receive more than one walking stick. Contacted the seller direct and they dealt with it efficiently and courteously. Then I had a query on how to unfold and refold the stick and once again the seller was most helpful and so quick with their response. Their service was excellent. I would happily recommend them. So easy. I just looked for one I wanted, very easy to order. Arrived two days later when due for delivery. My first experience of ordering a walking stick online. Wouldn’t hesitate to do so again. I asked for early delivery as it was for my father-in-law who had fallen out of bed twice and hurt himself. It was dispatched immediately after I asked for it. Excellent service. Prompt delivery. Very helpful when called with a query. Product as described and expected. Narrow size ideal for small 97 year old lady. Item very good, just what I needed and as described, very good quality, well packaged and arrived well before the estimated date. • Ideal for users who struggle to get to the toilet • Simple folding commode designed for easy storage • Perfect for house guests or overnight visitors • Lightweight and portable for ease of handli... 
Difficulties getting into and out of the bath are quickly and easily overcome when using the Bathmaster Deltis. Key Features: Easy to assemble and no technical knowledge needed. Manufactured... Designed to provide a multipurpose work surface for bedridden patients or users spending long periods of time in chairs With a large, curved surface, the table is ideal for meals, reading, jigsaws, a... Pedal exerciser for the gentle exercise of your upper or lower body Low-impact pedalling motion helps stimulate blood circulation whilst strengthening arm and leg muscles Place on a table to exercis... Height adjustable perching stool is ideal for those with limited mobility, the elderly, disabled, or handicapped Front of seat is slightly lowered so the seat has a slight incline for ea... Days white line suspended bath seat is a plastic seat that allows you to sit and relax in the bath with a horseshoe front cut out for personal cleaning. Key features Perfect for individuals with l... Our Best Selling Wheelchair. Available in 4 attractive colour options and 3 widths. Key Features: Ultra-lightweight Folding frame and backrest for compact storage Generously padded flame-retard... PREMIUM QUALITY:- Our three wheel walker is made of rustproof heavy duty aluminium to ensure durability. Quality tested ultralight material for longevity and performance. ERGONOMIC DESIGN:- Comfortab... Aids 4 Mobility uses Royal Mail, Parcel Force, DPD, Hermes, TNT and UPS. We have chosen these couriers based on their reliability and cost-effective delivery solutions. This helps us to keep your order costs as low as possible. There is FREE DELIVERY* on all orders over £30.00 excluding VAT to all UK mainland addresses and for most non-mainland UK addresses. Orders up to £30.00 excluding VAT are only £3.95 P and P If you are looking to have orders delivered outside the UK mainland, please contact us for a quotation. We will do our best to keep the delivery costs as low as we can.
I didn't do 30 days in a row; I put it in rotation with other exercise (including C25K). That kind of gave me a break and made different muscles sore on different days. I thought it was effective as something that will give you visible results pretty quickly, so that helps with motivation to keep going. I am planning to take this and the No More Trouble Zones on vacation with me. I'll be gone for 10 days, won't have access to a gym, won't be running in the 90+ degree heat, and don't want to lug my weights along. The two of these should be enough to keep me in a holding pattern at least, especially since I haven't done either in 2-3 months. __________________Cassie And now that you don't have to be perfect, you can be good. -John Steinbeck I just lost a post about this-- yes I am hanging with Jillian and getting results...haaha My dh also, and we laugh when she says "hit the mat, ladies!" Our dog also likes to lie on the towels, so it's a real family affair. I really like the weight training that is part of it.....we also like Kali Ray Yoga tapes, which I tried to get a link to: Kali Ray TriYoga - Free the Spine DVD ~ Kali Ray Permalink: Amazon.com: Kali Ray TriYoga - Free the Spine: Kali Ray: Movies & TV Well maybe that will work....free the spine, and there is also strengthening, and free the hips. We do all three, and they are wonderful. Take about 40 minutes. I am a definite shred fan! Grandma Sharon I've been doing the 30 day shred every day (I pretty much started when I started fitday). I was dying on the first day of day one, but stuck with it. I'm now on Day 22 (Day 2 of Level 3) and workout 3 is Killer. I've definitely seen a change in my body - my lower body especially - much more firm. Plus I've lost almost 9 pounds in 3 weeks. I did change my diet slightly (just watched my calorie intake) so the diet-exercise combo really works. 
Diet wise, I don't really even feel like I'm on a diet because I never feel hungry, I just switched out bad choices with good ones. I'll probably recheck my measurements after the full 30 days and give you all an update. Has anyone tried the 30 Day Shred with Jillian Michaels? I just recently cancelled my 2...yes TWO...gym memberships because I could never make it to the gym! I have P90X but I don't feel that I am in any shape to begin that at this time. I would really like to get my cardio and muscle strength up some and then begin P90X. I have three high school students that are in sports year around, so while they look like I used to, I am acquiring 'bleacher butt'. Heya, I'm one of the many that have tried and consistently use 30 Day Shred. I think it's intense, but the improvements for me came pretty quickly. You'll notice a change in your endurance (and measurements) within the first week or so, if you're anything like me. I do the Shred every other day; I would just get too bored with it doing it every single day. I highly recommend it. I like that I can burn a slew of calories in about 30 minutes and it definitely goes by fast. Good luck to you and welcome to your fitness journey. Let us know how it goes. I have 3 DD's that are all in soccer, swimming and scouts. One way I've found to get time to work out is to stay at soccer practice and lap the field with my ankle weights on. Occasionally I get pulled on to the field to help with a drill, while the other moms look at me like I'm some sort of wacko. I might be a wacko, but I could blow them all out of the water in a foot race.... I also have the Shred and like it, although I've not used it as often as changeisgood, I'm going back to it for a while because I broke my elliptical.
"Antonelli Electr. works mainly with sequencers and drum machines; a computer is used only for studio work. Only sequencers and drum machines are used for the live interpretation as well. This minimalist procedure has proved effective over the years and it has been perfected and viewed as a challenge. His way of working is often conceptional and serially motivated, which is reflected in the visual interpretation of his work as well (cover design, live performance, music video). Antonelli Electr. creates an independently unique language of music and means of expression: his own personal digital draft, minimally arranged, slender in aesthetics, hedonistic in attitude, abstract in programming and detailed in sound."
Coffee Hair Color Yes, coffee can do a lot more for you than just give you a quick morning pick-me-up. There is some research that shows coffee can also stimulate your hair growth and at the same time increase the shine and add some depth to darker hair. There are several ways you can do this and here’s a couple of them. Super Charge Your Hair With This Coffee Rinse Make a Coffee Rinse Start by brewing a strong pot of coffee. This means that you should use about two more tablespoons of coffee than what you would normally put in for your drinking pleasure. Here are a couple of things to take into consideration: The stronger the coffee is, the darker the brew, and this means that soaking your hair with a darker brew is going to darken your hair color. This is great for brunettes or people with gray hair because it can add some depth to your existing hair color. If you are blonde or have light red hair, or if your hair is already dyed, then you may want to try something different because you could end up making your hair look dingy or dirty. Make sure the coffee rinse is totally cooled off before using. Shampoo Your Hair Once you have the coffee rinse made, shampoo your hair like normal, making sure to rinse well. Make sure all the shampoo is out of your hair and then gently squeeze out excess water; it doesn’t have to be totally dry, you just don’t want it dripping wet. Stay in the shower or tub and take the cooled coffee and pour it through your hair, starting at the roots. You might want to use a large bowl or bucket to catch the coffee as it falls off your hair so that you can pour the drippings back through your hair a second time for a complete hair color rinse. Once you’re done doing this, make sure that you rinse out your tub well to prevent any coffee stains. 
Use a Shower Cap In order for the coffee to work well for your hair color, put your hair in a shower cap and leave it on for 20 to 60 minutes. If you don’t have a shower cap then use an old towel that you don’t mind ruining. Make sure to wash off any coffee on your face or neck with soap and water so it doesn’t leave any stains. Remember, the longer you leave the coffee on your hair, the darker your hair color is going to be. Rinse and Dry Once you feel you’ve left it on long enough, rinse your hair with some warm water and then let your hair air dry. If you really want the coloring process of the coffee to work really well, try rinsing your hair with some apple cider vinegar. This will actually help to set the color. Enhance Your Hair Color With Coffee Grounds Use Coffee Grounds If you don’t want to use a coffee rinse, you can use coffee grounds instead to enhance your hair color. All you need to do is take some of your used coffee grounds and massage a handful of them into your wet hair after you have shampooed it. Work the coffee grounds into your hair, rubbing them into your scalp and working your way down your hair. This actually stimulates your hair follicles too and could stimulate hair growth. Rinse Your Hair After you feel like you have gotten the coffee grounds all the way through your hair, you can then rinse with warm water and dry your hair with an old towel.
import scala.reflect.macros.blackbox.Context

object Macros {
  def impl[T, U](c: Context) = {
    import c.universe._
    c.Expr[Unit](q"()")
  }

  def foo[T, U] = macro impl[T, U]
}
President Barack Obama will issue an executive order Monday that will expand the “Pay as You Earn” student debt repayment plan to as many as five million borrowers who had previously been ineligible for the plan, the New York Times reports. The Pay as You Earn plan caps repayments on federal student loans at 10 percent of debtors’ monthly income, but is not currently available to those who have taken out older loans—those who took out loans before October 2007, or who stopped borrowing before October 2011. Obama’s order would not go into effect until December of next year, the Times reports, to give the education department time to put it into place. That timeline would also give time for other elements of Obama’s student debt plans, as highlighted in his proposed 2015 budget, to already be in place before borrowers opt in to the program. That budget plan also involved extending the Pay as You Earn program. The order will kickstart a busy week on the student debt front in Washington. A Democrat-led bill from Massachusetts Sen. Elizabeth Warren that would allow student debtors to refinance their loans at lower interest rates is expected to see the Senate floor this week. (The bill would involve new taxes on the wealthy.) And on Tuesday, Obama plans to answer questions about student debt and education costs submitted on the social networking site Tumblr. Speaking to the Times, Democratic New York Sen. Charles Schumer framed Obama’s executive action as, in part, a hedge against the Senate bill, ensuring borrowers would at least see some good news this week. “Even though our bill goes further, the president’s action means something will be done even if Republicans block it,” Schumer told the Times.
Experts praise new Vatican laws, but ask for inspection of Vatican bank Share this story VATICAN CITY (CNS) — European experts on preventing financial crimes praised a series of new Vatican laws and procedures but urged the Vatican to move immediately to conduct onsite inspections of the Vatican bank and the Administration of the Patrimony of the Holy See. “Moneyval” — the Council of Europe’s Committee of Experts on the Evaluation of Anti-Money Laundering Measures and the Financing of Terrorism — said, “wide-ranging legislative and institutional measures were instituted to rectify deficiencies” in Vatican law and structures and showed “significant efforts” to implement recommendations Moneyval made in July 2012. The committee said it was encouraged that the Institute for the Works of Religion, commonly known as the Vatican bank, had completed a preliminary survey of its account holders, was in the midst of a more thorough review of them, had redefined the categories of customers eligible to hold an account at the Vatican, and had closed some accounts. However, Moneyval also recommended the Vatican oversight agency, known as the Financial Intelligence Authority, conduct “a full inspection of the IOR (the Italian acronym for the Vatican bank) without any further delay.” Moneyval approved the Holy See-Vatican progress report at a meeting Dec. 9 in Strasbourg, France, and published the full report on its website Dec. 12. In its report, the Vatican said that at the end of November, the Financial Intelligence Authority was at an advanced stage in planning its onsite inspection of the bank and expected to begin in January 2014 at the latest. 
Moneyval also encouraged an inspection of the Administration of the Patrimony of the Holy See, commonly referred to by its Italian acronym APSA, but the Vatican said the inspection would take place “at a later stage.” APSA handles the Vatican’s investment portfolio and its real estate holdings, and also serves as the Vatican employment office and procurement agency. The office originally was set up to manage the assets received as a settlement from the Italian government in 1929 with the signing of the Lateran Pacts. In its 2012 report to Moneyval, the Vatican estimated APSA’s assets to be worth about 680 million euros. The assets included the accounts of 23 individuals — cardinals and bishops who deposited charitable contributions made in their name to the Vatican or their home dioceses and a handful of laypeople who made large donations to the Vatican and were receiving annuities. The Vatican stopped accepting such arrangements in 2001 and has since been taking steps to close the existing accounts. Moneyval praised changes in Vatican laws dealing with finance, financial crimes, money laundering and terrorism financing and changes to the role, authority and responsibilities of the Vatican’s Financial Intelligence Authority. But it also noted that the Vatican said major changes to the Vatican bank and APSA would have to await the conclusion of work by commissions appointed by Pope Francis to study the two entities. A key focus of Moneyval has been encouraging the Vatican to strengthen the ability of its Financial Intelligence Authority to set policy to prevent financial crimes, to freeze assets when suspicious activities occur and to audit financial transactions throughout Vatican City. The Vatican report indicated that Vatican employees have begun to use the new procedures initiated by the authority and to report potentially problematic transactions. 
In 2012 the authority said it received six “suspicious transaction reports,” while in the first 10 months of 2013, it had received 105 reports. The Vatican attributed the huge jump to a combination of the initial review of all Vatican bank accounts and “increased transaction monitoring.” Moneyval had urged the Vatican to implement procedures for seizing or freezing assets when alarms are raised by suspicious transactions. In 2013, the report said, the Vatican seized 1.98 million euro “in a money laundering investigation” involving Msgr. Nunzio Scarano, an accountant at APSA arrested by Italian authorities on charges of fraud, corruption and slander. He also is under investigation in an Italian money laundering case. Moneyval’s approval of the Vatican progress report Dec. 9 “confirms the significant efforts undertaken by the Holy See and Vatican City State to strengthen its legal and institutional framework,” said Msgr. Antoine Camilleri, undersecretary for relations with states and head of the Vatican’s delegation to Moneyval. “The Holy See is fully committed to continuing to improve further the effective implementation of all necessary measures to build a well-functioning and sustainable system aimed at preventing and fighting financial crimes,” Msgr. Camilleri said in a statement released by the Vatican press office. Rene Brulhart, director of the Financial Intelligence Authority, told Vatican Radio Dec. 12 that the changes enacted at the Vatican and the verification by Moneyval demonstrate the church’s commitment to “applying the measures necessary to present ourselves as a credible partner” in the global fight against financial crime, particularly “the struggle against laundering dirty money and financing terrorism.”
Kanye West’s total embrace of President Donald Trump may be starting to have real-world implications beyond Twitter. According to the results of a Reuters weekly tracking poll released this week, support for the president among black men doubled from 11 percent, for the week ending April 22, to 22 percent, for the week ending April 29. The approval numbers are the highest Trump has enjoyed in the survey among black men all year. The timing is noteworthy since the rapper began to go public with his pro-Trump views on April 21, first tweeting support for right-wing pundit Candace Owens. Four days later, he proclaimed “love” for POTUS and his “dragon energy” — and posted a selfie in which he wore a MAGA hat. President Trump still remains broadly underwater among African American men, with disapproval in the most recent survey holding at 71 percent. While the burst in approval is an intriguing data point, the figures might also be a statistical anomaly. The sample sizes of black men surveyed for both weeks were well under 200 people. Like most Republicans, Trump has never polled well among African Americans — he won just 8 percent of the African-American vote in 2016. Still, the news is evidence that West’s pro-Trump positions are raising eyebrows offline as well as on. On Thursday, Breitbart Editor-in-Chief Alex Marlow told TheWrap that he would be keen to sign on Yeezy as a columnist. “Absolutely we’d publish Kanye or interview him on our national radio shows on SiriusXM or in print,” he said. “Kanye is one of the few people in public life — or non-public life, for that matter — who appears to be ‘fully realized,'” Marlow added. 
“In other words, he gives off the impression that he thinks what he wants and does what he wants without preoccupation with the judgment of others,” Marlow added. Trump and West’s friendship goes back some time. Just after Trump won the election in 2016, West was given a private meeting with the real estate magnate in Trump Tower. When it was over, the president-elect laid it on thick. “He’s a good man,” Trump told reporters before patting the rapper on the shoulder. “We’ve been friends for a long time.”
// ------------------------------------------------------------------------------
// Copyright (c) Microsoft Corporation. All Rights Reserved. Licensed under the MIT License. See License in the project root for license information.
// ------------------------------------------------------------------------------

// **NOTE** This file was generated by a tool and any changes will be overwritten.
// <auto-generated/>

// Template Source: Templates\CSharp\Requests\EntityCollectionRequestBuilder.cs.tt

namespace Microsoft.Graph
{
    using System;
    using System.Collections.Generic;

    /// <summary>
    /// The type DeviceAppManagementManagedAppRegistrationsCollectionRequestBuilder.
    /// </summary>
    public partial class DeviceAppManagementManagedAppRegistrationsCollectionRequestBuilder : BaseRequestBuilder, IDeviceAppManagementManagedAppRegistrationsCollectionRequestBuilder
    {
        /// <summary>
        /// Constructs a new DeviceAppManagementManagedAppRegistrationsCollectionRequestBuilder.
        /// </summary>
        /// <param name="requestUrl">The URL for the built request.</param>
        /// <param name="client">The <see cref="IBaseClient"/> for handling requests.</param>
        public DeviceAppManagementManagedAppRegistrationsCollectionRequestBuilder(
            string requestUrl,
            IBaseClient client)
            : base(requestUrl, client)
        {
        }

        /// <summary>
        /// Builds the request.
        /// </summary>
        /// <returns>The built request.</returns>
        public IDeviceAppManagementManagedAppRegistrationsCollectionRequest Request()
        {
            return this.Request(null);
        }

        /// <summary>
        /// Builds the request.
        /// </summary>
        /// <param name="options">The query and header options for the request.</param>
        /// <returns>The built request.</returns>
        public IDeviceAppManagementManagedAppRegistrationsCollectionRequest Request(IEnumerable<Option> options)
        {
            return new DeviceAppManagementManagedAppRegistrationsCollectionRequest(this.RequestUrl, this.Client, options);
        }

        /// <summary>
        /// Gets an <see cref="IManagedAppRegistrationRequestBuilder"/> for the specified DeviceAppManagementManagedAppRegistration.
        /// </summary>
        /// <param name="id">The ID for the DeviceAppManagementManagedAppRegistration.</param>
        /// <returns>The <see cref="IManagedAppRegistrationRequestBuilder"/>.</returns>
        public IManagedAppRegistrationRequestBuilder this[string id]
        {
            get
            {
                return new ManagedAppRegistrationRequestBuilder(this.AppendSegmentToRequestUrl(id), this.Client);
            }
        }

        /// <summary>
        /// Gets the request builder for ManagedAppRegistrationGetUserIdsWithFlaggedAppRegistration.
        /// </summary>
        /// <returns>The <see cref="IManagedAppRegistrationGetUserIdsWithFlaggedAppRegistrationRequestBuilder"/>.</returns>
        public IManagedAppRegistrationGetUserIdsWithFlaggedAppRegistrationRequestBuilder GetUserIdsWithFlaggedAppRegistration()
        {
            return new ManagedAppRegistrationGetUserIdsWithFlaggedAppRegistrationRequestBuilder(
                this.AppendSegmentToRequestUrl("microsoft.graph.getUserIdsWithFlaggedAppRegistration"),
                this.Client);
        }
    }
}
Q: Thread keeps updating for a while before going to next activity and collisions are off I have a little Android game I've been working on and have also implemented some pixel perfect collision checking based on code here (I say based on, the only thing I changed was the class name). This used to work fine a few months ago but I stopped development for a while, came back and had to rebuild a lot of code including that. Now when I implement it, the collisions seem horribly off. If the large weight is on the left of the player, there's a significant distance and yet it still counts as a collision. I have also noticed that the player can pass through some of the smaller weights but not all the way through. I know the player has some non-transparent white pixels which don't help, but I tried with a transparent smaller image and the same thing happened. I edited the player image to remove the white pixels, but I'm still getting the odd collision detection. Any ideas? The second issue: when the player collides with a weight, it goes to the next activity (which it does), however it jerks all the objects on screen for about a second, and then proceeds to the next activity. Any ideas on this? I have a feeling it's to do with the way I handle the threads, or the way I close them. Check GitHub link below for the GameThread class which handles the main operations on my surface thread. 
Collision code can be found in above link, my calling of the class can be found below:

// handles game over circumstances
if (CollisionUtil.isCollisionDetected(player.getBitmap(), player.getX(), player.getY(),
        weight.getBitmap(), weight.getX(), weight.getY())) {
    Intent gameOverIntent = new Intent(this.getContext(), GameOverActivity.class);
    player.setTouched(false);
    this.getContext().startActivity(gameOverIntent);
    ((Activity) this.getContext()).finish();
    save(score, time);
    try {
        gameTimers.join();
    } catch (InterruptedException e) {
        e.printStackTrace();
    }
    gameOver = true;
}

The rest of the classes and all my other code can be found here: https://github.com/addrum/Francois/tree/master/src/com Thanks for your help in advance. Edit: Forgot to mention that if you need to test it (you probably will to see what I'm talking about) you can download it here: https://play.google.com/store/apps/details?id=com.main.francois Edit: I've now uploaded a short recording on YouTube to make it easier to see what I'm talking about: http://youtu.be/vCjKmTmhabY

A: Thanks for the video; now I understand the "jerk" part. The explanation for that is simple: your code is doing this (simplified):

while (running) {
    canvas = this.surfaceHolder.lockCanvas();
    this.gameLogic.update();
    this.gameLogic.render(canvas);
    surfaceHolder.unlockCanvasAndPost(canvas);
}

public void render(Canvas canvas) {
    if (!gameOver) {
        ...draw...
    }
}

If gameOver is true, your render() function doesn't draw anything, so you get the previous contents of the buffer. You might expect this to show whatever was drawn from the frame just before this one. However, the SurfaceView surface is double- or triple-buffered. So if you just keep calling lock/post without drawing anything, you'll cycle through the last two or three frames. Simple rule: if you lock the surface, you need to draw. If you don't want to draw when the game is over, you need to put your "game over" test back before you lock the canvas.
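The buffer-cycling behaviour described in the answer can be illustrated without Android at all. The sketch below is a minimal, self-contained simulation: the `BufferCycleDemo` class, the three-buffer assumption, and the `lockCanvas`/`post` stand-ins are all illustrative (not taken from the question's code or the Android API), but they reproduce the mechanism the answer describes — locking and posting without drawing replays stale frames, while checking `gameOver` before locking exits cleanly.

```java
import java.util.ArrayList;
import java.util.List;

public class BufferCycleDemo {
    // Stand-in for the surface's buffer queue: three reusable buffers (triple buffering).
    static String[] buffers = {"", "", ""};
    static int current = 0;

    // Like lockCanvas(): hands out the next buffer WITHOUT clearing its old contents.
    static int lockCanvas() {
        current = (current + 1) % buffers.length;
        return current;
    }

    // Like unlockCanvasAndPost(): whatever is in the buffer becomes the visible frame.
    static String post(int idx) {
        return buffers[idx];
    }

    // Runs six frames; a "collision" flips gameOver at frame 4.
    static List<String> run(boolean checkBeforeLock) {
        List<String> shown = new ArrayList<>();
        boolean gameOver = false;
        for (int frame = 1; frame <= 6; frame++) {
            if (frame == 4) gameOver = true;        // collision happens here
            if (checkBeforeLock && gameOver) break; // fixed loop: stop before locking
            int idx = lockCanvas();
            if (!gameOver) buffers[idx] = "frame" + frame; // buggy render(): draws nothing after game over
            shown.add(post(idx));                   // stale buffer is shown if nothing was drawn
        }
        return shown;
    }

    public static void main(String[] args) {
        buffers = new String[]{"", "", ""};
        current = 0;
        System.out.println(run(false)); // buggy: frames 4-6 replay old buffers -> the visible "jerk"
        buffers = new String[]{"", "", ""};
        current = 0;
        System.out.println(run(true));  // fixed: loop exits cleanly after frame 3
    }
}
```

The buggy run shows `[frame1, frame2, frame3, frame1, frame2, frame3]` — the last three entries are the replayed stale buffers — whereas the fixed run stops at `[frame1, frame2, frame3]`, which is exactly the answer's "test gameOver before you lock the canvas" rule.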
---
author:
- Gabriele Vissio
- Valerio Lucarini
title: 'A proof of concept for scale-adaptive parameterizations: the case of the Lorenz ’96 model'
---

Constructing efficient and accurate parameterizations of sub-grid scale processes is a central area of interest in the numerical modelling of geophysical fluids. Using a modified version of the two-level Lorenz ’96 model, we present here a proof of concept of a scale-adaptive parameterization constructed using statistical mechanical arguments. By a suitable use of the Ruelle response theory and of the Mori-Zwanzig projection method, it is possible to derive explicitly a parameterization for the fast variables that translates into deterministic, stochastic and non-Markovian contributions to the equations of motion of the variables of interest. We show that our approach is computationally parsimonious, has great flexibility, as it is explicitly scale-adaptive, and we prove that it is competitive compared to empirical ad-hoc approaches. While the parameterization proposed here is universal and can be easily analytically adapted to changes in the parameters’ values by a simple rescaling procedure, the parameterization constructed with the ad-hoc approach needs to be recomputed each time the parameters of the system are changed. The price of the higher flexibility of the method proposed here is having a lower accuracy in each individual case.

**Keywords:** Parameterization; Multiscale systems; Stochastic Dynamics; Memory; Noise; Response theory; Mori-Zwanzig theory; Chaos; Scale-adaptivity; Prediction

Introduction
============

The climate is a forced and dissipative system featuring variability on a vast range of spatial and temporal scales. This results essentially from the fact that a) the climate system is composed by subdomains having different characteristic time scales; and b) the dynamics inside each subdomain and the couplings between them are strongly nonlinear. 
As a result, even the most sophisticated and computationally expensive numerical climate models are far from being able to represent explicitly more than a relatively small fraction of the whole dynamical range of the geophysical fluids [@Ghil1987; @Peixoto1993; @Lucarini2014]. Therefore, it is crucial to develop approximate - yet accurate and efficient - dynamical/statistical representations - the so-called parameterizations - of the effects of unresolved scales on the scales the model is able to explicitly describe [@Palmer2008; @Franzke2015; @Berner2016]. Lacking proper parameterizations substantially reduces a model’s skill in terms of short-to-medium range weather prediction and, on climatic time scales, in terms of average properties, variability, and climate response to forcings [see e.g. @Holton2004; @McGuffie2005; @Palmer2006; @Plant2016]. A fundamental problem in the construction of parameterizations is that they are typically tuned for being accurate for a specific configuration of a model in terms of numerical resolution, and the operation of re-tuning needed when a new model version at higher resolution is available can be extremely tedious and costly. The need of achieving scale-adaptive parameterizations has been recently emphasized in the scientific literature, see e.g. [@Arakawa2011; @Park2014; @Sakradzija2016]. Additionally, parameterizations are typically tested against specific observables of interest and tuned in order to better represent those observables, but it is not always clear whether optimizing the skill for such observables might come at the price of reducing the skill on other climatic properties that might prove crucial for, e.g., modulating the climatic response to forcings. A somewhat peculiar point of view is provided by the so-called superparameterizations, mostly used to represent convection [@Majda2007; @Li2012]. 
The idea is to have lower-dimensional (and so computationally much cheaper) models run in parallel with the main code in order to resolve at high resolution the dynamics inside each atmospheric column. For a long time, parameterizations were aimed at describing the mean effects of small, fast scales on slower ones, but recently it has become apparent that it is crucial to widen their formulation in order to be able to include in some way the effect of fluctuations. The pursuit of stochastic parameterizations for weather and climate models has thus become an extremely active area of research, see e.g. the recent contributions by [@Palmer2008; @Franzke2015; @Berner2016] and the now classical collection of results in [@Imkeller2001]. The construction of stochastic parameterizations in geophysical fluid dynamical models is also usually approached using a pragmatic method: one tries to construct empirical functions able to represent well the effect of the mean state and of the fluctuations of the unresolved variables, see e.g. the illustrative examples of [@Orrell2003] and [@Wilks2005]. Mathematical arguments do indeed support the idea of going towards stochastic parameterizations. The homogenization method shows that the effect of the fast scales on the slow scales can be represented, in the limit of infinite time scale separation, as the sum of two extra terms in the equations of motion of the slow variables, precisely a deterministic (mean field) and a white-noise stochastic (fluctuations) component [@Pavliotis2008]. This point of view has led to an important set of results by Majda and collaborators on the possibility of constructing explicitly reduced order models for geophysical fluid dynamical systems (see, e.g., [@Majda1999; @Majda2001; @Majda2003; @Franzke2005]). A different point of view focuses, instead, on constructing effective dynamics comprising deterministic as well as stochastic terms purely from data.
The idea proposed by [@Kravtsov2005] has been to extend the multilevel linear regressive method, which is suitable for linear problems, to the nonlinear case, allowing for the representation of quadratic nonlinearities in the evolution equations, which are in fact typical of (geophysical) fluid dynamical processes. The method allows for constructing an optimal representation of the deterministic, linear and nonlinear, dynamics as well as of the stochastic forcing, so that its correlation properties are suitably recovered without making any assumption on the existence of time scale separation between resolved and neglected variables. The Mori-Zwanzig theory [@Zwanzig1960; @Zwanzig1961; @Mori1974] provides an - unfortunately implicit - expression for the effect of the small, fast scales on the scales of interest. One finds that, once the hypothesis of infinite time scale separation is relaxed, the parameterization requires in fact three terms: a deterministic correction, a stochastic term, and a memory, non-markovian term. In the limit of infinite time scale separation, the memory term drops off and the stochastic term indeed becomes white noise, in agreement with what is predicted by homogenization theory. Recently [@Chekroun2015a; @Chekroun2015b] have provided a comprehensive treatment of these issues that combines mathematical rigor and physical intuition. In a few recent papers, [@Wouters2012; @Wouters2013; @Wouters2016] have provided explicit formulas for constructing parameterizations able to incorporate the deterministic, stochastic, and non-markovian components.
The formulas have been obtained independently using two rather different approaches, namely a second order expansion of the Mori-Zwanzig projection operator, and a reworking of the [@Ruelle1998; @Ruelle2009] response theory, which, under suitable conditions, allows one to compute the change in the expectation value of any smooth observable of a system resulting from perturbations of the dynamics in terms of the statistical properties of the unperturbed flow. The idea followed by Wouters and Lucarini has been to treat the coupling between the slow and fast variables as a weak forcing added on top of the uncoupled dynamics, and then evaluate the impact of the forcings on the statistical properties of a generic observable of the slow variables. Finally, the last step has been to retro-engineer explicit formulas for terms that, added to the uncoupled dynamics of the slow variables, provide, up to second order, the same results as the actual coupling. It is important to note that since explicit formulas are provided, one can indeed construct the parameterizations *ab-initio*, and not empirically. Additionally, the parameterization is automatically optimized for all possible observables of the system. Such an approach seems especially promising in all systems, as in the extremely relevant case of the climate, where there is no *spectral gap* in the scales of motions that justifies the assumption of infinite time scale separation between fast and slow scales. It is then in general relevant to be able to retain and check the relevance of the memory term and to construct a suitable model for the stochastic forcing, going beyond the approximation of using white noise or simple empirical autoregressive processes. Note that the approach discussed here is not *per se* constructed to deal with multiscale systems only.
In fact, the explicit expressions for the terms responsible for the parameterization are constructed by performing an asymptotic expansion controlled by a parameter determining the degree of coupling between the set of variables of interest and those we want to parameterize. Clearly, if such a condition is satisfied, we can apply our method also to multiscale systems, as done here. Recently, a parameterization constructed according to such a statistical mechanical point of view has been tested successfully in a simple low dimensional model [@Wouters2016a] and in a more complex, yet still simple, coupled model [@Demaeyer2015]. In this paper, we want to stress another quality of this approach, namely the possibility of having automatically scale-adaptive formulations of the parameterization. We choose as our benchmark system (a modified version of) the Lorenz ’96 model [@Lorenz1996], which provides a prototypical yet convincing representation of a two-scale system where large scale, synoptic variables are coupled to small scale, convective variables. The Lorenz ’96 model has quickly become the test-bed for evaluating new methods of data assimilation [@Trevisan2004; @Trevisan2010] and is receiving a lot of attention also in the community of statistical physics [@Abramov2008; @Hallerberg2010; @Lucarini2011; @Gallavotti2014]. More importantly for our specific case, the Lorenz ’96 model has been used in the papers of [@Orrell2003] and [@Wilks2005] to construct explicit models of stochastic parameterization, so we have previous results to compare to. [We wish to stress that data driven closure models have recently been extended in order to be able to deal with the unavoidable memory effects due to the presence of neglected, hidden variables [@Kondrashov2017]. One can see the approach proposed here as the top-down counterpart of the bottom-up approach provided by the data-driven methods. ]{} The paper is structured as follows.
Section 2 provides the main ingredients of the method for constructing general parameterizations introduced by [@Wouters2012; @Wouters2013; @Wouters2016]. Section 3 describes the Lorenz ’96 system and highlights the modifications we have applied in the present work, most importantly the introduction of a forcing acting also on the fast variables. Section 4 is dedicated to describing the performance of the parameterization, discussing its scale-adaptive properties, and comparing its performance with previous results. Section 5 concludes the paper with the discussion of the results and the perspectives for future research in this area. In Appendix A we present some new ideas, building on Wouters and Lucarini 2012, 2013, 2016, that extend the range of applicability of the theory.

Wouters-Lucarini’s parameterization
===================================

In this paper we test the effectiveness of the methodology introduced by [@Wouters2012; @Wouters2013; @Wouters2016] for constructing parameterizations for dynamical systems of the form: $$\begin{aligned}
%\begin{split}
\frac{dX}{dt}&=F_X(X)+\epsilon\Psi_X(X,Y),\label{eq:perturbedX}\\
\frac{dY}{dt}&=F_Y(Y)+\epsilon\Psi_Y(X,Y),\label{eq:perturbedY}
%\end{split}
\end{aligned}$$ where the $X$ variables correspond to the dynamics of interest and the $Y$ variables correspond to the dynamics we want to parameterize. The $F$ vector field on the right hand side of Eqs. - corresponds to the uncoupled dynamics of the $X$ and $Y$ variables respectively, while the $\Psi$ field describes the coupling, with $\epsilon$ being a bookkeeping variable describing the coupling strength. Note that Eqs. - do not describe, in general, a multiscale dynamical system, where the $X$ (slow) and the $Y$ (fast) variables are essentially characterized by different scales of motion. Nonetheless, we can bring it to the standard form elucidating multiscale behaviour by considering the following form for Eqs.
-: $$\label{newperturbedX} \frac{dX}{dt}=F_X(X)+\epsilon\Psi_X(X,Y)$$ $$\label{newperturbedY} \frac{dY}{dt}=\gamma\tilde{F}_Y(Y)+\epsilon{\Psi}_Y(X,Y)$$ where $\gamma\gg1$ and $F_Y(Y)=\gamma \tilde{F}_Y(Y)$. As is clear from the discussion below, it is not important in our case to include the factor $\gamma$ also for the coupling term affecting the $Y$ variables in Eq. , because we are primarily interested in separating the time scales of the decoupled ($\epsilon=0$) X- and Y-systems. Following the discussion presented in the introduction, the goal is to find an approximate equation of the form $$\label{eq:Xpluschi} \frac{dX}{dt}=F_X(X)+\chi\{X\}$$ able to provide a good approximation of the statistical properties of the $X$ variables, where $\chi\{X\}$ can in general correspond to an integro-differential contribution with also a stochastic component. It is desirable to be able to specify in advance the accuracy of the approximation in terms of the properties of the coupling and, in particular, of the coupling strength $\epsilon$. Clearly, if $\epsilon=0$, we have that $\chi\{X\}=0$ provides a (trivial) solution to our problem. The approach can be seamlessly followed also in the presence of a functional form for the equations where the parameter $\gamma$ explicitly controls the scale separation between the $X$ and $Y$ variables. Note that [@Abramov2016] has recently introduced an extension of the homogenization method able to deal with a problem formulated as in Eqs. -.

The method
----------

The basic idea is to consider the dynamical system - as resulting from an $\epsilon-$perturbation of the following dynamical system: $$\label{eq:unperturbedX} \frac{dX}{dt}=F_X(X),$$ $$\label{eq:unperturbedY} \frac{dY}{dt}=F_Y(Y),$$ where the coupling plays the role of the perturbation. We now focus on the $X$ variables by considering a general observable $A=A(X)$, i.e. a smooth function of the $X$ variables only.
Making suitable hypotheses on the mathematical properties of the unperturbed system and taking advantage of the [@Ruelle1998; @Ruelle2009] response theory, Wouters and Lucarini have been able to find a useful expression for the expectation value $\rho_\epsilon(A)$ of the observable $A$ taken according to the invariant measure $\rho_\epsilon(dXdY)$ of the coupled dynamical system -: $$\rho_\epsilon(A)=\int\rho_\epsilon(dXdY)A(X) .$$ In what follows, we assume that all invariant measures considered are of the Sinai-Ruelle-Bowen kind [@Eckmann1985; @Young2002]. This assumption can be physically motivated by taking into account the chaotic hypothesis (e.g. [@Gallavotti2014a]). We can also introduce the projected measure $$\rho^*_\epsilon(dX)=\int_Y \rho_\epsilon(dX dY)$$ where the integration is performed on the $Y$ variables only, such that $\rho_\epsilon(A)=\rho^*_\epsilon(A)$. Using ergodicity, we also have: $$\label{ensambleaverage} \rho^*_\epsilon(A)=\lim_{T\to\infty} \frac{1}{T}\int_0^Td\tau\, A(x(\tau)) ,$$ where $x(\tau)=\widetilde f^\tau(x_0)$, with $\widetilde f^t$ defining the flow determined by the dynamical system -. It is possible to find a perturbative expansion of the expectation value of $A$ taken according to the invariant measure of the coupled system. One can in fact write $$\rho^*_\epsilon(A)=\rho_{0,X}(A)+\epsilon \delta_\Psi^{(1)}\rho(A)+\epsilon^2 \delta_{\Psi,\Psi}^{(2)}\rho(A)+\mathcal{O}(\epsilon^3) ,$$ where the first term $\rho_{0,X}(A)$ is the expectation value of $A$ taken according to the invariant measure of the $X$-component of the unperturbed system : $$\rho_{0,X}(A)=\int\rho_{0,X}(dX)A(X)=\lim_{T\to\infty} \frac{1}{T}\int_0^Td\tau\, A(f^\tau(x_0)) ,$$ where we have again used ergodicity and defined $f^t$ as the flow of the $X$ variables part of the dynamical system .
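Expectation values such as the ones above are estimated in practice as long-time averages along a single trajectory. As a minimal illustration (a toy example of ours, not taken from the paper), consider the logistic map at parameter $4$, whose invariant density $1/(\pi\sqrt{x(1-x)})$ has mean $1/2$; the time average of the observable $A(x)=x$ converges to that value:

```python
# Ergodic time average (our own toy illustration, not from the paper): the
# expectation rho(A) is estimated as (1/T) * sum of A along one long orbit.
# Toy system: the logistic map x -> 4x(1-x), whose invariant density
# 1/(pi*sqrt(x(1-x))) has mean 1/2; we take A(x) = x.

def time_average(A, x0=0.2, n_transient=1000, n_samples=200000):
    x = x0
    for _ in range(n_transient):          # discard the transient
        x = 4.0 * x * (1.0 - x)
    acc = 0.0
    for _ in range(n_samples):
        acc += A(x)
        x = 4.0 * x * (1.0 - x)
    return acc / n_samples

mean_x = time_average(lambda x: x)        # close to the exact value 0.5
```

The same single-orbit strategy is what is used below to evaluate the expectation values entering the parameterization terms.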
The second term $\epsilon \delta_\Psi^{(1)}\rho(A)$ and the third term $\epsilon^2 \delta_{\Psi,\Psi}^{(2)}\rho(A)$ correspond to the first and second order corrections, and can also be expressed as expectation values on $\rho_{0,X}(dX)$ of explicitly determined observables, which are constructed non-trivially from $A$ and the vector field $\Psi$. All the terms can be computed from the statistical properties of the uncoupled dynamics of the $Y$ variables given in Eq. \[eq:unperturbedY\]. The explicit expressions can be found in [@Wouters2012]. While the previous result allows for computing the impact of the coupling on the statistics of any given observable $A$, it is not useful *per se* for constructing a parameterization. Nonetheless, it is possible to retro-engineer an educated guess for the term $\chi\{X\}$ introduced in Eq. , such that *up to second order in $\epsilon$* the expectation value of $A$ according to the invariant measure $\rho'_\epsilon(dX)$ of the system: $$\label{eq:wl_par} \frac{dX}{dt}=F_X(X)+\epsilon D(X)+\epsilon S\{X\}+\epsilon^2 M\{X\}$$ is the same as the expectation value of $A$ according to $\rho_\epsilon$, or, more explicitly: $$\label{eq:rhoOthird} \rho_\epsilon(A)=\rho'_\epsilon(A)+O(\epsilon^3).$$ Therefore, Eq. provides a useful basis for defining a parameterization where we are able to control the error on the statistics of the surrogate dynamics with respect to the full dynamics as a function of $\epsilon$, and where this applies *for all possible observables* $A$. The three perturbation vector fields $D$, $S$ and $M$ correspond to, respectively, a mean field term, a stochastic forcing and a non-markovian memory term. Note that the stochastic term has a second order effect on the measure even if its intensity is proportional to $\epsilon$; see [@Lucarini2012].
As shown in [@Wouters2012; @Wouters2013; @Wouters2016], the explicit expression for these three terms can also be obtained by performing a second order expansion of the Mori-Zwanzig projection operator, which constructs the effective projected dynamics for the $X$ variables only. This suggests that the proposed parameterization might have skill also in terms of prediction (in the sense of weather forecast); we will test this elsewhere. In what follows, we refer to this approach as the W-L parameterization. The explicit expressions for the three terms providing the parameterization shown in Eq. are given below in Eqs. , and . Therefore, once we derive $D$, $S$, and $M$, we can use them to construct parameterizations for all values of $\epsilon$ within the radius of convergence of the expansion. Additionally, if the coupled model given in Eqs. - is multiscale, this approach allows for constructing parameterizations integrating the single scale equation . This can significantly ease the computational burden of our problem.

### Deterministic, stochastic, and non-markovian terms

We assume that the coupling terms $\Psi_X(X,Y)$ and $\Psi_Y(X,Y)$ are separable in the $X$ and $Y$ variables, so that we can write $\Psi_X(X,Y)=\Psi_{X,1}(X)\Psi_{X,2}(Y)$ and $\Psi_Y(X,Y)=\Psi_{Y,1}(X)\Psi_{Y,2}(Y)$. As explained in [@Wouters2012; @Wouters2013; @Wouters2016], such an assumption does not really impact the generality of our results. $D(X)$ is a deterministic term that accounts for the average impact that the coupling has on the $X$ variables and is given by: $$\label{eq:original_D} D(X)=\Psi_{X,1}(X)\rho_{0,Y}(\Psi_{X,2}(Y)).$$ The second order contribution is composed of two parts. $S\{X\}$ represents a stochastic forcing due to the temporal correlation of the fluctuations of the forcing exerted by the Y-variables onto the $X$ variables.
We can write $$\label{eq:Smult} S\{X\}=\Psi_{X,1}(X)\sigma(t) ,$$ where $\sigma(t)$ is a stochastic term and is constructed in such a way as to reproduce the lagged correlation of the fluctuations of the forcing. The statistical properties of the noise $\sigma(t)$ can be expressed as: $$\begin{aligned}
%\begin{split}
R(t)&=\langle\sigma(t),\sigma(0)\rangle\nonumber\\
&=\rho_{0,Y}\left((\Psi_{X,2}(Y)-\rho_{0,Y}(\Psi_{X,2}(Y)))(\Psi_{X,2}(f^{t}(Y))-\rho_{0,Y}(\Psi_{X,2}(Y)))\right),\\
\langle\sigma(t)\rangle&=0.
%\end{split}
\end{aligned}$$ where the brackets indicate the expectation value of the stochastic process and $R(t)$ is the lagged correlation of the (stationary) noise. Finally, $M\{X\}$ is a memory term that describes the effects of the history of the $X$ variables on their present value through the influence of the $Y$ variables. This term is essential for capturing the effect of the hidden (Y) variables on the (X) variables of interest, as clarified by [@Chekroun2015a; @Chekroun2015b]. It is expressed as: $$M\{X\}=\int_0^\infty h(\tau,X(t-\tau))d\tau,$$ where the integral kernel is given by: $$\label{eq:original_h} {\color{black}h(\tau,\tilde{X})=\Psi_{Y,1}(\tilde{X})\Psi_{X,1}(f^{\tau}(\tilde{X}))\rho_{0,Y}(\Psi_{Y,2}(Y)\partial_Y\Psi_{X,2}(f^{\tau}(Y))).}$$ Such an average resembles a cross-correlation between the actual state of the two fields $X,Y$ and the deviation of the trajectory of the same fields evolved at $t=\tau$. A remarkable property of this parameterization is its universality, as shown by Eqs. through , because we have explicit formulas for computing the three factors $D$, $S$ and $M$ for any given expression of the coupling terms or of the uncoupled dynamics. Another positive aspect of these equations is the scale adaptivity of the parameterization terms, as we are going to show in the next sections.
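Schematically, the surrogate tendency of Eq. combines the three ingredients as $F_X(X)+\epsilon D+\epsilon\sigma(t)+\epsilon^2 M\{X\}$. The sketch below is our own discretization with hypothetical helper names (`ar1_step`, `memory_term`, `surrogate_tendency` are not from the paper): $D$ enters as a mean-field constant, $\sigma(t)$ is surrogated by an AR(1) process, and the memory integral is truncated to a finite sum over the stored history of $X$.

```python
import numpy as np

# Schematic assembly (our own discretization, with hypothetical helper names)
# of the surrogate tendency dX/dt = F_X(X) + eps*D + eps*sigma(t) + eps^2*M{X}.

def ar1_step(sigma_prev, phi, noise_std, rng):
    """One step of an AR(1) surrogate for the noise sigma(t)."""
    return phi * sigma_prev + noise_std * rng.standard_normal()

def memory_term(X_history, kernel, dt):
    """Discretized M{X} = int h(tau, X(t-tau)) dtau; X_history[i] = X(t - i*dt),
    kernel[i] ~ h evaluated at lag i*dt (placeholder samples)."""
    return dt * sum(k * x for k, x in zip(kernel, X_history))

def surrogate_tendency(F_X, X, D, sigma, X_history, kernel, eps, dt):
    return F_X(X) + eps * D + eps * sigma + eps**2 * memory_term(X_history, kernel, dt)
```

The truncation of the memory integral is justified whenever the kernel decays fast enough, which is the case discussed in the following sections.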
Independent coupling case
-------------------------

The special case where the two coupling terms are independent of the variables they are affecting, namely $\Psi_X(X,Y)=\Psi_X(Y)$ and $\Psi_Y(X,Y)=\Psi_Y(X)$, is particularly important for the purposes of this paper. The three terms discussed above take the following simpler form: $$\label{eq:Dterm} D(X)=\rho_{0,Y}(\Psi_X(Y)),$$ $$\label{eq:m2} S\{X\}=\sigma(t),$$ where [ $$\label{eq:fluctuationterm} \begin{split} R(t)=\langle\sigma(0),\sigma(t)\rangle&=\rho_{0,Y}((\Psi_X(Y)-D)(\Psi_X(f^{t}(Y))-D)),\\ \langle\sigma(t)\rangle&=0, \end{split}$$]{} and $$\label{eq:m3} M\{X\}=\int_0^\infty h(t_2,X(t-t_2))dt_2,$$ where [ $$\label{eq:new_h} h(t_2,\tilde{X})=\Psi_Y(\tilde{X})\rho_{0,Y}(\partial_Y\Psi_X(f^{t_2}(Y))).$$]{} In this special case, the stochastic contribution reduces to a simple additive noise term - compare Eqs. and - while the evaluation of the memory kernel $h$ is significantly easier, as a simpler expression appears in the ensemble average - compare Eqs. and . We can now prove the scale adaptivity of the method adopted here as follows. We consider the case where the equations of motion can be written as in -. We note that the expectation values are computed according to the invariant measure of the uncoupled equation $\frac{dY}{dt}=\gamma\tilde{F}_Y(Y)$, which can be rewritten as $$\label{universa} \frac{dY}{d\tau}=\tilde{F}_Y(Y)$$ where $\tau=\gamma t$. We clearly have that the constant $D$ in Eq. is not affected by the choice of the time scale. Instead, the correlation function in Eq. and the memory kernel in Eq. are affected by the rescaling of time, and only the rescaled time $\tau$ will appear in their arguments. By substituting $\tau=\gamma t$ one then obtains the actual parameterization for every choice of $\gamma$.
In particular, large values of $\gamma$ will lead to a compression of the time axis for the correlation function and memory kernel, as seen below in the specific case investigated in this study.

The Lorenz ’96 Model
====================

It is crucial to test the methodology outlined in the previous section on a concrete numerical model having some practical and conceptual relevance. We refer the reader to the recent contributions by [@Wouters2016a] and [@Demaeyer2015]. The study presented here is constructed in such a way that we systematically explore how the performance of the parameterization changes when we alter both the intensity of the coupling and the time scale separation between the fast and slow variables. In particular, we are able to construct a scale-adaptive scheme that requires minimal computational time for constructing a general parameterization. In this regard, we have chosen to perform our analysis on the Lorenz ’96 model [@Lorenz1996]. The Lorenz ’96 model provides a conceptually meaningful yet extremely simplified representation of the atmosphere; there are two sets of variables, one describing the dynamics on large scales (the so-called synoptic variables), and one characterizing the dynamics on small scales (the so-called convective variables). The convective variables are divided into as many subgroups of equal size as the number of synoptic variables, each subgroup being coupled to a different synoptic scale variable. The system is then characterized by coupling within and across scales of motion. The Lorenz ’96 model has quickly established itself as one of the reference models in nonlinear dynamics for testing, e.g., data assimilation schemes [@Trevisan2004; @Trevisan2010] and properties of Lyapunov exponents and covariant Lyapunov vectors, and is becoming increasingly popular also within the community of statistical mechanics [@Abramov2008; @Hallerberg2010; @Lucarini2011; @Gallavotti2014].
The evolution equations of the Lorenz ’96 model can be written as: $$\label{eq:lor1before} \frac{dX_k}{dt}=X_{k-1}(X_{k+1}-X_{k-2})-X_k+F_1-\frac{hc}{b}\sum\limits_{j=1}^JY_{j,k},$$ $$\label{eq:lor2before} \frac{dY_{j,k}}{dt}=cbY_{j+1,k}(Y_{j-1,k}-Y_{j+2,k})-cY_{j,k}+\frac{hc}{b}X_k,$$ with $k=1,...,K$; $j=1,...,J$. The boundary conditions are defined as $$\begin{split} X_{k-K}=X_{k+K}=X_k,\\ Y_{j,k-K}=Y_{j,k+K}=Y_{j,k},\\ Y_{j-J,k}=Y_{j,k-1},\\ Y_{j+J,k}=Y_{j,k+1}. \end{split}$$ The latitudinal circle is divided into $K$ sectors, each one corresponding to one *synoptic* slow $X$ variable. Each $X$ variable is coupled to $J$ *convective* fast $Y$ variables. As discussed in detail later, the constant $c$ defines the time scale separation between the fast and slow variables (see also the general form of a multiscale system as given in Eqs -), the amplitude of the fluctuations is determined by $b$, and $h$ controls the strength of the coupling. In the absence of forcing and dissipation, the sum of the squares of the variables (the *energy* of the system) is conserved. For a detailed description of the statistical mechanical and conservation properties of the system (yet in a simplified version), the reader is encouraged to look into [@Lucarini2011; @Blender2013; @Gallavotti2014]. We note that the coupling between the $X$ and the $Y$ terms has the simplified form discussed in the previous section (what we referred to as the independent coupling), and is linear. This simplifies the treatment below, which is nonetheless possible also for more complex forms of coupling. The choice of the parameters defining the strength of the external forcing, the number of sectors and subsectors, the strength of the coupling, the relative amplitude of the fluctuations and the time scale separation between the two systems determines the properties of the dynamical system.
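The conservation property quoted above can be verified numerically. In the sketch below (our own code, not the authors'), we integrate only the quadratic advection term of the $X$ equation, with forcing and dissipation removed; the triple products telescope around the latitudinal ring, so $E=\sum_k X_k^2$ is conserved exactly by the continuous dynamics, and the small residual drift is only the RK4 truncation error:

```python
import numpy as np

# Numerical check (our own sketch) of the conservation property quoted above:
# the quadratic advection term of the X equation alone conserves E = sum X_k^2.

def advection(X):
    """Conservative part of the slow tendency: X_{k-1}(X_{k+1} - X_{k-2})."""
    return np.roll(X, 1) * (np.roll(X, -1) - np.roll(X, 2))

def rk4_step(f, X, dt):
    k1 = f(X)
    k2 = f(X + 0.5 * dt * k1)
    k3 = f(X + 0.5 * dt * k2)
    k4 = f(X + dt * k3)
    return X + dt / 6.0 * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

rng = np.random.default_rng(1)
X = rng.standard_normal(36)            # K = 36 sectors
E0 = np.sum(X**2)
for _ in range(2000):                  # dt = 0.005, i.e. about 7 model days
    X = rk4_step(advection, X, 0.005)
drift = abs(np.sum(X**2) - E0) / E0    # only RK4 truncation error remains
```

The same shift-based implementation of the periodic boundary conditions is reused below for the full model.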
The original parameters chosen by Lorenz are $c=10.0$, $b=10.0$, $h=1.0$, $K=36$ and $J=10$ (providing therefore a total of $36$ $X$ variables and $360$ $Y$ variables). We recall that, following the original derivation of the model, $1$ unit of time is equivalent to $5$ days, while the usual integration time step is $0.005$, corresponding to $36$ minutes. When one is well within the chaotic regime (e.g. $F_1$ is sufficiently large) and considers a sufficiently large number of sectors (and subsectors), it is reasonable to expect to be able to define intensive properties that are stable with respect to the specific choice of $K$ and $J$; see the discussion in [@Gallavotti2014] for a simpler version of the model. We have implemented two modifications to the Lorenz ’96 model:

- We have introduced a forcing term also in the equations describing the dynamics of the $Y$ variables (see Eq. ), in order to represent the direct effect of forcings at small scales (mimicking, e.g., the impact of direct solar forcing on convective motions). This has the effect of making the fast variables an active component of the system: they can also pump energy into the $X$ variables and are not exclusively dissipating energy coming from larger scales.

- We have changed the boundary conditions on the $Y$ variables in such a way that the fast variables of different sectors do not interact with each other, in the spirit of having the fast variables represent sub-grid scale phenomena (see Eqs. ). Note that if $J\gg1$ and we are in a chaotic regime, it is reasonable to expect that this change has a negligible impact on the statistics of the system, as information does not propagate efficiently between convective variables belonging to neighbouring sectors. Additionally, the parameterization becomes easier to implement, because, following the basic idea behind super-parameterization, subgrid variables belonging to different $X$ sectors are independent and equivalent in the uncoupled case (see Eqs. -).
Therefore, the evolution equations - are modified as follows: $$\label{eq:lor1} \frac{dX_k}{dt}=X_{k-1}(X_{k+1}-X_{k-2})-X_k+F_1-\frac{hc}{b}\sum\limits_{j=1}^JY_{j,k},$$ $$\label{eq:lor2} \frac{dY_{j,k}}{dt}=cbY_{j+1,k}(Y_{j-1,k}-Y_{j+2,k})-cY_{j,k}+\frac{c}{b}F_2+\frac{hc}{b}X_k,$$ with modified boundary conditions $$\begin{split} \label{eq:newboundary} X_{k-K}=X_{k+K}=X_k,\\ Y_{j-J,k}=Y_{j+J,k}=Y_{j,k}. \end{split}$$ The parameter $\epsilon$ in Eqs. - is $\frac{hc}{b}$ and the coupling terms are $\Psi_X=-\epsilon \sum\limits_{j=1}^JY_{j,k}$ and $\Psi_Y=\epsilon X_k$; $b$ defines the ratio between the typical size of the $X$ and $Y$ variables, while the parameter $\gamma$ controlling the scale separation is given by $c$. We choose $F_1=10.0$ and $F_2=6.0$, so that chaos is realized in the uncoupled version of the system (obtained from Eqs. - by setting $h=0$) for both the large and small scale variables of the system separately. [We choose for $h$, $b$, $c$, $K$, and $J$ the standard values mentioned above. We have verified that the change in the boundary conditions for the $Y$ variables has a negligible effect on the statistical properties of the $X$ variables: the pdf of each $X$ variable (Fig. \[fig:probdenscomp\]), its time correlation (Fig. \[fig:autocorrcomp\]), and the spatial correlation of the $X$ variables at zero time lag (Fig. \[fig:sautocorrcomp\]) are virtually identical for the original and modified Lorenz ’96 model.]{} The presence of chaos and of a corresponding nontrivial invariant measure for the $Y$ variables is necessary for being able to construct the W-L parameterization. In Appendix A we discuss how such a requirement can be relaxed through a suitable re-definition of the background around which the perturbative theory is applied. We now show how to practically construct a scale-adaptive parameterization. This provides us with a great deal of flexibility at extremely parsimonious numerical cost.
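The modified tendencies can be written compactly with periodic shifts. In the sketch below (our code, not the authors'), $X$ has shape $(K,)$ and $Y$ has shape $(J,K)$; rolling $Y$ only along the $j$ axis implements the modified boundary conditions, i.e. each sector's fast variables are periodic among themselves and different sectors do not interact:

```python
import numpy as np

# Sketch of the modified Lorenz '96 tendencies (slow X, fast Y). Rolling Y
# along axis 0 only (the j index) encodes the modified boundary conditions.

K, J = 36, 10
F1, F2 = 10.0, 6.0
h, b, c = 1.0, 10.0, 10.0

def tendencies(X, Y):
    """X: shape (K,); Y: shape (J, K). Returns (dX/dt, dY/dt)."""
    dX = (np.roll(X, 1) * (np.roll(X, -1) - np.roll(X, 2))
          - X + F1 - (h * c / b) * Y.sum(axis=0))
    dY = (c * b * np.roll(Y, -1, axis=0)
          * (np.roll(Y, 1, axis=0) - np.roll(Y, -2, axis=0))
          - c * Y + (c / b) * F2 + (h * c / b) * X)
    return dX, dY
```

Note that `X` broadcasts across the $J$ rows of `Y`, so each fast variable in sector $k$ receives the same forcing $\frac{hc}{b}X_k$, as in the equations above.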
We show that the uncoupled evolution equation for the $Y$ variables (Eq. ) can be written in a universal form. In fact, it is easy to check that, operating the substitutions $$\label{eq:rescc} \tau=ct$$ and $$\label{eq:rescb} Z_{j,k}=bY_{j,k},$$ we get [for the uncoupled evolution equation for the rescaled Y variables:]{} $$\label{eq:rescaledY} \frac{dZ_{j,k}}{d\tau}=Z_{j+1,k}(Z_{j-1,k}-Z_{j+2,k})-Z_{j,k}+F_2 .$$ Therefore, for all values of $h$, $b$, and $c$ we can construct the parameterizations just by resorting to the invariant measure of Eq. and adopting the suitable rescaling. Note that in the case of this specific system we are able to rescale also the size of the $Y$ variables and achieve a higher degree of flexibility than in the general case discussed above. This emphasizes the scale-adaptivity of the approach proposed here, and ensures that only a modest computational effort is needed to deal with the problem of parameterization. Note that, compared to the general case of the multiscale system discussed before, in this case we have the additional problem that changing the value of $c=\gamma$ leads also to an increase in the value of $\epsilon$, so that large values of $c$ might break the weak coupling hypothesis. The problem can be circumvented by increasing at the same time the value of $b$ or considering smaller values of $h$. ![Probability density of the $X$ variable for the original (solid line) and the modified (dashed line) Lorenz 96 model. See text for details.[]{data-label="fig:probdenscomp"}](Vissio_Fig1-eps-converted-to.pdf){width="\linewidth"} ![Time autocorrelation of the $X$ variable for the original (solid line) and the modified (dashed line) Lorenz 96 model. See text for details.[]{data-label="fig:autocorrcomp"}](Vissio_Fig2-eps-converted-to.pdf){width="\linewidth"} ![Spatial autocorrelation of the $X$ variable for the original (solid line) and the modified (dashed line) Lorenz 96 model.
See text for details.[]{data-label="fig:sautocorrcomp"}](Vissio_Fig3-eps-converted-to.pdf){width="\linewidth"}

Constructing the Parameterization
---------------------------------

The first order term in the parameterization is recovered using ergodicity and averaging $D(X)$ in Eq. . [By symmetry, the coupling is the same for all the X variables:]{} $$\label{eq:m1} D_k(X_k)=D(X)=D=-\frac{1}{b} \lim_{T\to\infty}\frac{1}{T}\int_0^T \sum\limits_{j=1}^JZ_{j,k} (\tau)d\tau,$$ where [$k=1, \ldots, K$ and]{} the average is performed by integrating Eq. . The value of this term is $\frac{-20.12}{b}$ for all $k$; choosing $b=10$ we get $D_k(X_k)=-2.012$. Therefore, the coupling between fast and slow scales leads on average to a reduction in the effective forcing applied to the slow variables. In other words, this indicates a net energy flux from the slow to the fast variables. [Despite the simplicity of the model considered here and of the coupling between the X and Y variables, this corresponds to the effect of introducing eddy viscosity in more complex fluid dynamical models.]{} The $k^{th}$ component of the stochastic term $S\{X\}$ in Eq. is constructed as an additive noise $\sigma(t)$ featuring the following lagged covariance: $$\label{eq:autocovariance} R_k(\tau)=R_k(ct)=\lim_{T\to\infty} \frac{1}{T} \int_0^T (-\sum\limits_{j=1}^J\frac{Z_{j,k}(\tau_1)}{b}-D)(-\sum\limits_{j=1}^J \frac{Z_{j,k}(\tau+\tau_1)}{b} -D) d\tau_1 ,$$ where the evolution of the $Z$ variables is given by Eq. and the covariance is reported in Fig. \[fig:m2ac\]. We can construct surrogate time series of $\sigma$ to be used for the parameterized simulation either by properly resampling time series of the fluctuation term $-\sum\limits_{j=1}^J\frac{Z_{j,k}}{b}-D$ or by reproducing them using simple stochastic models like those belonging to the $AR(n)$ family. We follow the second route, taking advantage of the software package ARFIT [@Neumaier2001; @Schneider2001].
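The estimation of $D$ and the fit of a surrogate noise can be sketched as follows. The choices of run length and autoregressive order below are ours (the paper uses the ARFIT package and $AR(n)$ models; a plain lag-1 $AR(1)$ fit keeps the example minimal): we integrate the universal rescaled system for the $Z$ variables and record the forcing term $-\sum_j Z_{j,k}/b$ along the orbit:

```python
import numpy as np

# Sketch (our own choices of run length and AR order, not the paper's setup):
# estimate D = -(1/b) <sum_j Z_j> from a long run of the universal rescaled
# system and fit an AR(1) surrogate to the fluctuations of the forcing.

b, F2, J = 10.0, 6.0, 10

def g_Z(Z):
    return np.roll(Z, -1) * (np.roll(Z, 1) - np.roll(Z, -2)) - Z + F2

def rk4(f, x, dt):
    k1 = f(x); k2 = f(x + 0.5*dt*k1); k3 = f(x + 0.5*dt*k2); k4 = f(x + dt*k3)
    return x + dt/6.0 * (k1 + 2*k2 + 2*k3 + k4)

rng = np.random.default_rng(3)
Z, dt = rng.standard_normal(J), 0.005
for _ in range(2000):                       # discard the transient
    Z = rk4(g_Z, Z, dt)
series = np.empty(30000)                    # samples of -sum_j Z_j / b
for i in range(series.size):
    Z = rk4(g_Z, Z, dt)
    series[i] = -Z.sum() / b

D_est = series.mean()                       # to be compared with -2.012
fluct = series - D_est
phi = np.dot(fluct[:-1], fluct[1:]) / np.dot(fluct, fluct)  # lag-1 coefficient
noise_std = np.sqrt(fluct.var() * (1.0 - phi**2))           # AR(1) innovation
```

Because the integration is performed in the rescaled time $\tau$, the same statistics serve all values of $b$ and $c$ after the rescaling described above; this is where the scale adaptivity pays off computationally.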
[Note that this term describes the backscatter of energy from the small towards the large scales.]{} Note that, since the argument of the function is $ct$, in the limit of $c\to\infty$ the autocovariance tends to zero for all $t>0$, because the function $R$ tends to zero for large values of its argument, while $R(0)$ is finite for all values of $c$. As a result, one obtains in the limit a white noise of vanishing amplitude [for any fixed value of $\epsilon$.]{} ![Time lagged autocovariance of the noise term $\sigma(t)$ with $b=10$ and $h=1$. See text for details.[]{data-label="fig:m2ac"}](Vissio_Fig4-eps-converted-to.pdf){width="\linewidth"} We now wish to provide an explicit expression for the $k^{th}$ component of the non-markovian term $M\{X\}$ given in Eq. . We express the memory kernel [$h_k(\tau,\tilde{X_k})$]{} (where $\tau=ct$) as follows: [ $$\label{eq:memorykernel} h_k(\tau_1,\tilde{X_k})= - \frac{1}{b} \tilde{X_k} H(\tau_1),$$ ]{} where $$\label{eq:kernelH} H(\tau_1)=\lim_{\Omega\to\infty} \frac{1}{\Omega} \int_0^\Omega \sum\limits_{j=1}^J \frac{\partial}{\partial Z_{j,k}(\omega)} Z_{j,k}(\tau_1+\omega) d\omega .$$ In Fig. \[fig:m3ws\] we plot the factor $H$ on the right hand side of Eq. ; this clarifies that the kernel gives less weight to states of the $X$ variable with larger time separation, as expected. Increasing the value of $c$ leads to a compression of the time axis by a factor of $c$. Since [$H(\tau)\to 0$]{} in the limit of $c\to\infty$, $h$ vanishes for all values of $t>0$. Hence, memory effects disappear in this limit. ![ Memory effects as measured by the factor $H$, see Eq. , with $b=10$ and $h=1$. See text for details.[]{data-label="fig:m3ws"}](Vissio_Fig5-eps-converted-to.pdf){width="\linewidth"} We expect that, for a given value of $\epsilon$, the larger the value of $c$, the more dominant is the contribution to the parameterization coming from the deterministic first order term.
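In practice, once $H$ has been estimated, the non-markovian term of Eq. \[eq:memorykernel\] can be evaluated as a discrete convolution of the past states with the (truncated) kernel. The sketch below uses a hypothetical exponentially decaying kernel as a stand-in for the empirically estimated factor $H$ of Fig. \[fig:m3ws\]; the decay rate, truncation length, and time step are assumptions of this illustration, not values from the paper.

```python
import numpy as np

b, dt = 10.0, 0.05
tau = np.arange(0.0, 5.0, dt)      # truncate the kernel where it has decayed
H = np.exp(-tau / 2.0)             # assumed stand-in for the estimated factor H

def memory_term(X_hist, H, dt, b):
    """Discretization of M(t) = -(1/b) * integral of X(t - tau) H(tau) dtau.

    X_hist[0] is the current state, X_hist[n] the state n steps in the past.
    """
    n = len(H)
    return -(dt / b) * np.dot(X_hist[:n], H)

# Usage: a constant history X = x0 gives M = -(x0/b) * (discrete integral of H)
x0 = 3.0
X_hist = np.full(len(H), x0)
M = memory_term(X_hist, H, dt, b)
```

The compression of the time axis discussed above corresponds here to sampling `H` at arguments multiplied by $c$, so that for large $c$ only the most recent history effectively contributes.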
Note that [@Abramov2016] addresses the problem of parameterizing a modified version of the Lorenz ’96 system similar to the one presented here by means of a modified version of the homogenization method. The derived parameterization is different from the one obtained here, as the homogenization method assumes infinite separation of scales between the fast and slow variables. Abramov obtains a stochastic contribution that is always white (yet its variance depends on the time scale separation), and an extra deterministic linear term that, by construction, might point at a surrogate way to implicitly deal with memory effects. Performance of the parameterization =================================== In this section a series of statistical tests is performed in order to check the skill of the W-L parameterization. In what follows, we refer to Eqs. - as the *coupled model*. The *uncoupled model* is, instead, given by the evolution equations of Eq. where the last term is excluded (or, equivalently, $h$ is set to 0). The model with *first order parameterization* is obtained by inserting expression in Eq. and disregarding the other terms. The model with *second order parameterization* is obtained by inserting in Eq. both the first and second order terms. We test the skill of the parameterization in reproducing the statistical properties of the coupled model and compare it to the performance of the parameterization constructed according to the method proposed by [@Wilks2005]. We choose for this test the standard values of the parameters $c=10$, $b=10$, $h=1$; every other possible choice of these factors can be covered through a proper rescaling of the values of $D$, $S$ and $M$, as shown in sections 4.1 and 4.2. Wilks proposed an empirical parameterization of the fast dynamical variables in the Lorenz ’96 model.
The idea is to fit the *unresolved tendencies* of the $X$ variables (*i.e.* the forcing terms written as a function of the $Y$ variables) using a polynomial regression of the form $$g_U(X_k)=b_0+b_1 X_k+b_2 X_k^2+b_3 X_k^3+b_4 X_k^4+e_k,$$ where the $b$s are the regression coefficients, while $e_k$ is a stochastic function constructed according to the following AR(1) process: $$e_k(t)=\phi e_k(t-\Delta)+\sigma_e(1-\phi^2)^{1/2} z_k(t),$$ written in terms of the fitting parameters $\phi$ (lag-1 autocorrelation of $e_k$) and $\sigma_e$ (standard deviation of the process $e_k$), where $z_k$ is a Gaussian uncorrelated process with zero mean and unitary variance. The parameterized system is then written as follows: $$\frac{dX_k}{dt}=X_{k-1}(X_{k+1}-X_{k-2})-X_k+F_1-g_U(X_k).$$ Note that in the case of Wilks’s parameterization all terms are markovian, and there is no clear justification of why the stochastic residual is captured by an $AR(1)$ process, nor of why a fourth order polynomial is chosen. On the other hand, the W-L parameterization provides a simple constant as deterministic term $D(X)$ (see Eq. ), which seems an oversimplification compared to the fourth order polynomial used by Wilks. This clarifies that the two approaches are rather different in nature. [We remark that the framework for parameterizations of slow-fast systems recently proposed by [@Wouters2017] and based on the Edgeworth expansion might provide a sound basis for justifying and possibly deriving explicitly closures structurally analogous to the one proposed by Wilks for the Lorenz ’96 system.]{} We test the ability of the parameterizations to reproduce the probability density function of the variable $X_k$, the lagged temporal correlation $Corr(t)=\langle X_k X_k(t)\rangle$, and the spatial correlation at zero time lag $Sp(l)= \langle X_k X_{k+l}\rangle$. Fig. \[fig:prob\_dens\] shows the probability density of the $X_k$ variables for all considered models.
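The two steps of Wilks's procedure, the fourth order polynomial fit of the unresolved tendencies and the AR(1) residual, can be sketched as follows. The synthetic "tendency" data, the true polynomial coefficients, and the value of $\phi$ below are all assumptions of this illustration, standing in for tendencies diagnosed from a coupled run.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(-5, 10, size=50_000)              # stand-in for resolved states
true_poly = np.array([0.02, -0.1, 0.3, 1.5, 0.5])  # assumed b4..b0
U = np.polyval(true_poly, X) + rng.normal(0.0, 0.5, X.size)  # "tendencies"

# Fourth order polynomial regression: returns b4, b3, b2, b1, b0
coeffs = np.polyfit(X, U, deg=4)
resid = U - np.polyval(coeffs, X)

# AR(1) residual model: e(t) = phi e(t - Delta) + sigma_e sqrt(1 - phi^2) z(t)
phi = 0.98                    # assumed lag-1 autocorrelation of the residual
sigma_e = resid.std()         # standard deviation of the residual process

def ar1_noise(n, phi, sigma_e, rng):
    e = np.zeros(n)
    for t in range(1, n):
        e[t] = phi * e[t - 1] + sigma_e * np.sqrt(1.0 - phi**2) * rng.standard_normal()
    return e

e = ar1_noise(10_000, phi, sigma_e, rng)
```

With this much data the polynomial coefficients are recovered closely; by construction the stationary variance of the AR(1) process equals $\sigma_e^2$, which is the rationale for the $(1-\phi^2)^{1/2}$ factor in the equation above.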
It is clear that the second order parameterization offers a better result than the first order one, which in turn clearly improves on the basic uncoupled system. Both Wilks’s approach and the W-L parameterization provide rather good approximations of comparable quality for the distribution of the $X$ variable of the original system. ![Probability density of the $X$ variable for the different models considered in the paper. See text for details.[]{data-label="fig:prob_dens"}](Vissio_Fig6-eps-converted-to.pdf){width="\linewidth"} We then consider normalized second order properties of the $X$ variables. We first look at the lagged time autocorrelation (see Fig. \[fig:temp\_acorr\]). Higher order parameterizations lead to a better agreement with the coupled model, even if the improvement in skill is most evident for small time lags. Nevertheless, the Wilks method provides very good results also for lags larger than 0.4 time units. Fig. \[fig:sp\_acorr\] shows the performance of the parameterizations in simulating the spatial correlation of the $X_k$ variable. We find that considering higher order approximations in the parameterizations does not substantially improve the results, even if the first and second order parameterizations lead to an improvement with respect to the uncoupled case. In this case Wilks’s parameterization follows closely the full coupled model and outperforms the parameterizations constructed according to the method discussed here. ![Temporal autocorrelation of the $X$ variable for the different models considered in the paper. See text for details.[]{data-label="fig:temp_acorr"}](Vissio_Fig7-eps-converted-to.pdf){width="\linewidth"} ![Spatial autocorrelation of the $X$ variable for the different models considered in the paper. See text for details.[]{data-label="fig:sp_acorr"}](Vissio_Fig8-eps-converted-to.pdf){width="\linewidth"} The analysis of the second and higher order moments is shown in the next section.
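The diagnostics used in these comparisons, the lagged temporal correlation $Corr(t)$ and the zero-lag spatial correlation $Sp(l)$, can be estimated from a simulated field as in the following sketch. The white-noise field and the sizes `T`, `K` are placeholders for an actual $X(t,k)$ trajectory.

```python
import numpy as np

rng = np.random.default_rng(2)
T, K = 20_000, 36
field = rng.standard_normal((T, K))   # stand-in for a simulated X(t, k)

def lagged_corr(x, max_lag):
    """Normalized autocorrelation Corr(t) of a single time series."""
    x = x - x.mean()
    c0 = np.dot(x, x) / len(x)
    return np.array([np.dot(x[: len(x) - l], x[l:]) / len(x) / c0
                     for l in range(max_lag + 1)])

def spatial_corr(field, max_l):
    """Sp(l) = <X_k X_{k+l}>, averaged over time and (by homogeneity) over k."""
    f = field - field.mean()
    var = (f * f).mean()
    return np.array([(f * np.roll(f, -l, axis=1)).mean() / var
                     for l in range(max_l + 1)])

corr = lagged_corr(field[:, 0], 10)
sp = spatial_corr(field, 5)
```

By construction both estimators equal 1 at zero lag, and for this uncorrelated placeholder field they drop to statistical noise at any nonzero lag; for the Lorenz '96 output they would instead reproduce the decaying curves of Figs. \[fig:temp\_acorr\] and \[fig:sp\_acorr\].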
Sensitivity to the Strength of the Coupling ------------------------------------------- As our expansion is based on assuming the presence of weak coupling between the slow and the fast variables, it is crucial to test its performance as we vary the value of the coupling strength $h=\epsilon \frac{b}{c}$ (see Eqs. \[eq:lor1\] and \[eq:lor2\]) with respect to its standard value of 1. Note that we are treating the case where $b=c=10$ are held fixed, so that $h=\epsilon$ is changed in what follows. We look at the first moment and at the second, third and fourth central moments of the variable $X_k$. Fig. \[fig:1stmom\] shows that all parameterizations perform quite well in representing the first moment of $X_k$ for all considered values of $h<1.4$. Larger values of $h$ lead to a qualitative change in the properties of the system and fall outside the range of interest. We note that, surprisingly, the first order parameterization constructed using the W-L method outperforms the second order model for $h\lessapprox1$, which hints at the importance (at least in this case) of possibly developing a theory for the third order scheme, beyond the W-L parameterization. Figs. \[fig:2ndmom\], \[fig:3rdmom\] and \[fig:4thmom\] portray the performance of the parameterizations in reproducing the values of the second, third and fourth centered moments, respectively. We consistently find that, while all methods are quite successful, the Wilks parameterization provides the best results, with the second order model constructed with the W-L method coming a close second. We wish to underline that the Wilks parameterization needs to be constructed from scratch for each different value of $h$ (as well as of $b$ and $c$). This marks a fundamental difference with the parameterization tested in this study, where we just need to linearly rescale the first order term and quadratically rescale the two second order terms.
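The quantities compared in these sensitivity tests are simple sample statistics. As a self-contained check of the estimators, the sketch below computes the first moment and the second-to-fourth centered moments from a Gaussian sample with known moments; the sample is a placeholder for a simulated $X_k$ trajectory.

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(loc=2.0, scale=1.5, size=200_000)  # stand-in for an X_k series

def centered_moments(x):
    """First moment and second/third/fourth centered moments of a series."""
    m1 = x.mean()
    d = x - m1
    return m1, (d**2).mean(), (d**3).mean(), (d**4).mean()

m1, m2, m3, m4 = centered_moments(x)
# For N(2.0, 1.5): m1 -> 2.0, m2 -> 1.5^2, m3 -> 0, m4 -> 3 * 1.5^4
```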
Another problem shown by Wilks’s method is the lack of stability for high values of $h$; as a matter of fact, in order to obtain results for $h>1.2$ we had to drastically reduce the time step of the numerical integration, at a much higher computational cost. ![First moment as a function of the coupling strength for the different models considered in the paper. See text for details.[]{data-label="fig:1stmom"}](Vissio_Fig9-eps-converted-to.pdf){width="\linewidth"} ![Second centered moment as a function of the coupling strength for the different models considered in the paper. See text for details.[]{data-label="fig:2ndmom"}](Vissio_Fig10-eps-converted-to.pdf){width="\linewidth"} ![Third centered moment as a function of the coupling strength for the different models considered in the paper. See text for details.[]{data-label="fig:3rdmom"}](Vissio_Fig11-eps-converted-to.pdf){width="\linewidth"} ![Fourth centered moment as a function of the coupling strength for the different models considered in the paper. See text for details.[]{data-label="fig:4thmom"}](Vissio_Fig12-eps-converted-to.pdf){width="\linewidth"} Scale adaptivity ---------------- The most relevant advantage of the W-L approach proposed here is [that it allows one]{} to construct general parameterizations by suitably rescaling the three terms - deterministic, stochastic and non-markovian - after having estimated them through a single numerical simulation. The method proposed by Wilks is more precise for each given choice of the system’s parameters but lacks such flexibility, which might be of crucial relevance when trying to develop self-adaptive parameterizations. The coefficients appearing in the Wilks parameterization (see Table \[tab:Wparameters\]) cannot be readily predicted with suitable expressions.
Using the general results for the first and second order terms of the W-L parameterization and adopting the suitable rescaling of the amplitude and of the time axis discussed in the previous section, it is possible to explore an infinite range of scenarios for the values of $b$, $c$, and $h$. We present some examples below. In Fig. \[fig:c1pd\] we show that the probability densities of $X_k$ obtained through the different parameterizations are in good agreement with the one given by the coupled model. Note that choosing $c=1$ also implies assuming that there is *no* scale separation between the $X$ and the $Y$ variables. In fact, as discussed before, the W-L method can be used also in this case. [.5]{} ![a) Probability density of the $X$ variable in the case of $c=1$, $b=10$ and $h=1$. See text for details. b) Zoom on the peak of the distribution.[]{data-label="fig:c1pd"}](Vissio_Fig13a-eps-converted-to.pdf "fig:"){width="\linewidth"} [.5]{} ![a) Probability density of the $X$ variable in the case of $c=1$, $b=10$ and $h=1$. See text for details. b) Zoom on the peak of the distribution.[]{data-label="fig:c1pd"}](Vissio_Fig13b-eps-converted-to.pdf "fig:"){width="\linewidth"} Since, as said before, we are treating the case where the coupling should not be too strong compared to the unperturbed vector flow (this is the condition under which we can use the W-L method), increasing the value of $c$ can be problematic unless we accordingly reduce the value of $h$ (or increase the value of $b$). We then show in Fig. \[fig:c100h1pd\] the probability density function of the $X$ variable in the case $c=100$, $b=10$, $h=0.1$, with a much stronger coupling than in the previous case of Fig. \[fig:c1pd\]. In this case, it is clear that considering a parameterization is crucial for reproducing satisfactorily the statistics of the $X$ variable, and we observe that the first order parameterization is already rather successful.
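The rescaling rule underpinning this scale adaptivity can be sketched as a small helper: following the statements above, the first order term is rescaled linearly in the coupling strength, the two second order terms quadratically, and a change in $c$ compresses or stretches the time axis of the estimated covariance and memory factor. The reference values and the exponential shapes below are placeholders; this is a paraphrase of the procedure of Sections 4.1-4.2, not the authors' code.

```python
import numpy as np

def rescale_terms(D0, R0, H0, tau0, h0, c0, h, c):
    """Rescale parameterization terms estimated at (h0, c0) to new (h, c).

    D0: first order (deterministic) term; R0, H0: second order noise
    covariance and memory factor sampled on the time axis tau0.
    """
    D = D0 * (h / h0)            # first order term: linear in the coupling
    R = R0 * (h / h0) ** 2       # second order terms: quadratic in the coupling
    H = H0 * (h / h0) ** 2
    tau = tau0 * (c0 / c)        # time axis compressed by a factor c / c0
    return D, R, H, tau

# Usage: move from the reference (h0=1, c0=10) to the scenario h=0.1, c=100
tau0 = np.linspace(0.0, 5.0, 50)
D, R, H, tau = rescale_terms(-2.012, np.exp(-tau0), np.exp(-tau0 / 2.0),
                             tau0, h0=1.0, c0=10.0, h=0.1, c=100.0)
```

The key point is that no new simulation of the fast subsystem is required: the single estimate at the reference parameters serves every scenario reachable by this algebraic rescaling.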
Note that as $c$ becomes larger, the memory term plays a less and less relevant role and the stochastic contribution becomes rather similar to a white noise forcing. [.5]{} ![a) Probability density of the $X$ variable in the case of $c=100$, $b=10$ and $h=0.1$. See text for details. b) Zoom on the peak of the distribution.[]{data-label="fig:c100h1pd"}](Vissio_Fig14a-eps-converted-to.pdf "fig:"){width="\linewidth"} [.5]{} ![a) Probability density of the $X$ variable in the case of $c=100$, $b=10$ and $h=0.1$. See text for details. b) Zoom on the peak of the distribution.[]{data-label="fig:c100h1pd"}](Vissio_Fig14b-eps-converted-to.pdf "fig:"){width="\linewidth"} As a last test (Fig. \[fig:stresspd\]) we stress-test the rescaling of the model by applying the transformation to all the parameters at the same time, shifting from the $c=10$, $b=10$, $h=1$ scenario to the $c=5$, $b=8$, $h=1.1$ one. ![Probability density of the $X$ variable in the case of $c=5$, $b=8$ and $h=1.1$. See text for details.[]{data-label="fig:stresspd"}](Vissio_Fig15-eps-converted-to.pdf){width="\linewidth"} Conclusions =========== Constructing accurate and efficient parameterizations is a central task in the numerical modelling of geophysical fluids, because these systems are characterized by variability on a large range of spatial and temporal scales. Parameterizations are expected to improve our ability to predict and represent the statistical properties of the slow, large scale variables of interest, bypassing the need to represent explicitly the dynamics of the fast, small scale variables. Modern climate and weather prediction systems devote substantial resources to improving the parameterizations of subgrid scale processes. Unfortunately, parameterizations are usually constructed ad hoc, targeting the optimisation of one or a few specific properties of interest, and must usually be re-tuned each time the resolution of the model is changed or new components are added.
This creates intrinsic uncertainties in the performance of the model and reduces its overall robustness. General mathematical findings suggest that parameterizations should include deterministic, stochastic, and non-markovian contributions [@Chekroun2015a; @Chekroun2015b; @Wouters2012; @Wouters2013; @Wouters2016]. In particular, non-white noise and non-markovian terms result from the finiteness of the scale separation between resolved and unresolved scales. Indeed, current developments in meteorology and climate science strongly favour supplementing the common deterministic parameterizations with stochastic components, and satisfactory improvements in skill are observed [@Palmer2008; @Franzke2015; @Berner2016]. In this paper we implement the parameterization scheme developed for general systems by Wouters and Lucarini through a re-elaboration of the Ruelle response theory [@Ruelle1998; @Ruelle2009] and, independently, through an expansion of the Mori-Zwanzig projection operator [@Zwanzig1960; @Zwanzig1961; @Mori1974], where the coupling between the variables of interest and the variables we want to parameterize is seen as a small perturbation to the uncoupled dynamics of the former, thus adopting a weak coupling hypothesis. The parameterization describing the dynamical impact of the neglected variables can be written as the sum of a deterministic term (mean field effect), a stochastic term (impact of fluctuations), and a non-markovian term (role of memory). We underline that, following this point of view, no hypotheses are made on the scale separation between the two sets of variables, as opposed to well-established approaches such as the homogenization method [@Pavliotis2008]. We also show that the W-L method can be used in general for constructing scale-adaptive parameterizations when multi-scale systems are considered.
We test this parameterization scheme on a mildly modified version of the Lorenz ’96 two-level model [@Lorenz1996], which is a prototypical multi-scale system of great interest for nonlinear science in general. We construct a scale adaptive parameterization able to describe accurately the coupling between slow and fast scales, to describe conditions of finite scale separation, and to reach the infinite time separation limit. In particular, we are able to construct explicitly the properties of the noise term responsible for the stochastic component of the parameterization and the memory kernel responsible for the non-markovian term. The parameterization does a very good job in surrogating the effects of the fast variables, as tested by evaluating the expectation value and the correlation properties of the slow variables, and shows a great deal of flexibility when the intensity of the coupling is varied. We have also tested the parameterization discussed here against the heuristic approach previously proposed by [@Wilks2005]. The Wilks method allows for constructing detailed parameterizations for each choice of the system’s parameters, and outperforms the parameterizations constructed following Wouters and Lucarini. Nonetheless, the Wilks parameterization is not scale adaptive and needs to be retuned each time we change one or more parameters of the system, whereas the W-L parameterization is universal [within the approximation defined by Eq.]{}, except for the application of an algebraic rescaling, as proved by our last tests. We argue that, depending on the specific problem one needs to address, either an accurate ad-hoc method or the flexible but less precise method proposed here might prove more advantageous. The flexibility of this approach has been demonstrated by changing the time scale separation by two orders of magnitude and also in the most general case where all the parameters $c$, $b$ and $h$ are changed with respect to their original values.
It would be interesting to systematically compare the parameterization described here with those recently proposed by [[@Abramov2016] and [@Wouters2017] through modifications]{} of the homogenization method, in order to assess the benefits and pitfalls of each approach. [It would also be extremely interesting to test, for a given system of interest, the correspondences and differences between the point of view proposed here and the bottom-up, data-based, complementary approach proposed by [@Kondrashov2017], which also allows for dealing with non-markovian effects. Especially promising seems the possibility of testing and comparing the scale-adaptivity of the two approaches.]{} We wish to test next the relative performance of the parameterizations described here in terms of prediction of the state of the system described by the $X$ variables. Additionally, we plan to test systematically what is presented in Appendix A, i.e. the flexibility we have in the theory used here of selecting different background states for constructing the parameterization. What we have shown in this paper is, evidently, mostly a proof of concept aimed at assessing the potential (and the pitfalls) of the W-L approach [and showing its scale-adaptivity, which had not been thoroughly studied before.]{} This is, [together with the recent contributions by [@Wouters2016] and [@Demaeyer2015],]{} just the first step in the direction of understanding its applicability in GFD systems of practical interest, where large datasets need to be processed and the computation of the memory term seems at first sight problematic. In particular, we will aim at constructing filters for large eddy simulations [@Pope2004; @Arakawa2004]. This is clearly an ambitious task and further investigations are needed in this direction. [In fact, the potential of the WL parameterization might be higher than what has been shown so far.
In a recent publication, [@Lucarini2017] showed that it is possible to derive explicit formulas allowing for projecting changes imposed on the dynamics of the full system by perturbations onto the reduced, parameterized dynamics. This potentially paves the way for constructing extremely flexible parameterizations. This is another direction of work worth investigating.]{} Acknowledgement {#acknowledgement .unnumbered} =============== The authors wish to thank Sebastian Schubert for various fruitful discussions. GV was supported by the Hans Ertel Center for Weather Research (HErZ), a collaborative project involving universities across Germany, the Deutscher Wetterdienst, and funded by the BMVI (Federal Ministry of Transport and Digital Infrastructure, Germany). VL acknowledges the financial support provided by the DFG cluster of excellence CliSAP and by the SFB/Transregio Project TRR181. Forcing in the fast dynamics ============================ As discussed in [@Wouters2012; @Wouters2013; @Wouters2016], a basic requirement for the proposed approach to allow for the construction of a parameterization of the $Y$ variables is that the uncoupled dynamics of the $Y$ variables given in Eq. features a non-trivial invariant measure and a fast decay of correlations due to the presence of chaos. Physically, this requires the presence of an external forcing leading to the injection of energy into the $Y$ variables; this is achieved in the system studied here by choosing a sufficiently large value for the constant $F_2$. Another way to address this problem is shown in [@Wouters2016a], where a stochastic forcing, corresponding to the presence of energy injection coming from even smaller, unresolved scales, is considered.
In order to extend the method to physical situations where energy is injected only into the $X$ variables, we need to resort to a simple mathematical trick that amounts to changing the background state around which the perturbation induced by the presence of coupling is considered. The idea is to rewrite Eq. as follows: [ $$\frac{dY}{dt}=F_Y(Y)+\epsilon G+\epsilon\Psi_Y(X)-\epsilon G ,$$ ]{} such that the vector flows defining the uncoupled dynamics and the coupling are defined as follows: [ $$\begin{aligned} %\begin{split} & \widetilde F_Y(Y) = F_Y(Y)+\epsilon G ,\label{plusG}\\ & \widetilde {\epsilon\Psi_Y}(X) = \epsilon\Psi_Y(X)-\epsilon G. \label{minusG} %\end{split}\end{aligned}$$ ]{} The choice of the *artificial* forcing $G$ gives us a degree of flexibility and must only obey the requirement that $\dot{Y}=\tilde{F}_Y(Y)$ is chaotic. Note that, within the radius of expansion ensuring the validity of the [perturbative]{} approach, the specific choice of $G$ affects our final result only weakly. An obvious option is to choose $G=\rho_{0,X}(\Psi_Y(X))$, which ensures that, at zero order, the uncoupled system has a nontrivial dynamics, because we have chosen a background state where the $Y$ variables receive from the $X$ variables approximately as much energy as in the fully coupled case. The procedure can be repeated also in the case, like the one analyzed here, where we do not have the requirement of shifting the background state, and the natural definition of the uncoupled dynamics of the $Y$ variables given in Eq. can be used. We have tested this hypothesis by using the framework given in Eqs. - and choosing the standard values for the system’s parameters and $G=\rho_{0,X}(\Psi_Y(X))=2.57$. In Figs. \[fig:probdensg\] and \[fig:autocorrg\] we show that, at the second order, we obtain results almost indistinguishable from those shown in Fig. \[fig:prob\_dens\] for the probability density and Fig.
\[fig:temp\_acorr\] for the time autocorrelation using $F_2=6$ and $\frac{c}{b}F_2+G=8.57$ as forcing of the uncoupled $Y$ equation. ![Probability density of the $X$ variable calculated adding $G$ to uncoupled $Y$ equation. The standard case is the one shown in section 4. See text for details.[]{data-label="fig:probdensg"}](Vissio_Fig16-eps-converted-to.pdf){width="\linewidth"} ![Time autocorrelation of the $X$ variable calculated adding $G$ to uncoupled $Y$ equation. The standard case is the one shown in section 4. See text for details.[]{data-label="fig:autocorrg"}](Vissio_Fig17-eps-converted-to.pdf){width="\linewidth"} Abramov, R. V. (2016). . , 1, 2(October 2015):1–18. Abramov, R. V. and Majda, A. J. (2008). . , 18(3):303–341. Arakawa, A. (2004). . , 17(13):2493–2525. Arakawa, A., Jung, J., and Wu, C. (2011). . , 11:3731–3742. Berner, J., Achatz, U., Batt[é]{}, L., Bengtsson, L., C[á]{}mara, A. D. L., Christensen, H. M., Colangeli, M., Coleman, D. R. B., Crommelin, D., Dolaptchiev, S. I., Franzke, C. L., Friederichs, P., Imkeller, P., J[ä]{}rvinen, H., Juricke, S., Kitsios, V., Lott, F., Lucarini, V., Mahajan, S., Palmer, T. N., Penland, C., Sakradzija, M., Storch, J.-S. V., Weisheimer, A., Weniger, M., Williams, P. D., and Yano, J.-I. (2016). . . Blender, R. and Lucarini, V. (2013). . , 243(1):86–91. Chekroun, M. D., Liu, H., and Wang, S. (2015a). . . Springer International Publishing, Cham. Chekroun, M. D., Liu, H., and Wang, S. (2015b). . . Springer International Publishing, Cham. Demaeyer, J. and Vannitsem, S. (2017). . , 143(703):881–896. Eckmann, J. and Ruelle, D. (1985). . , 57(4):617–656. Franzke, C., Majda, A. J., and Vanden-Eijnden, E. (2005). . , (62):1722–1745. Franzke, C. L. E., O’Kane, T. J., Berner, J., Williams, P. D., and Lucarini, V. (2015). . , 6(1):63–78. Gallavotti, G. (2014). Springer International Publishing. Gallavotti, G. and Lucarini, V. (2014). . Ghil, M. and Childress, S. (1987). 
, volume 60 of [*[Applied Mathematical Sciences]{}*]{}. Springer New York, New York, NY. Hallerberg, S., Pazó, D., López, J. M., and Rodríguez, M. a. (2010). . , 81(6):1–8. Holton, J. R. (2004). , volume 88. Elsevier Academic Press. Imkeller, P. and von Storch, J.-S. (2001). , volume 66. Birkhäuser Basel, Basel. Kondrashov, D., Chekroun, M. D., and Ghil, M. (2017). In [*[Advances in Nonlinear Geosciences]{}*]{}. Springer International Publishing. Kravtsov, S., Kondrashov, D., and Ghil, M. (2005). . , 18:4404–4424. Li, F., Rosa, D., Collins, W. D., and Wehner, M. F. (2012). . , 4(4):1–10. Lorenz, E. N. (1996). . In Palmer, T. and Hagedorn, R., editors, [*[Predictability of Weather and Climate]{}*]{}, pages 40–58. Cambridge University Press. Lucarini, V. (2012). . , 146(4):774–786. Lucarini, V., Blender, R., Herbert, C., Ragone, F., and Pascale, S. (2014). . , 52:809–859. Lucarini, V. and Sarno, S. (2011). . , 18(1):7–28. Lucarini, V. and Wouters, J. (2017). . , 50. Majda, A. J. (2007). . , 64(7):2726–2734. Majda, A. J., Timofeyev, I., and Vanden-Eijnden, E. (1999). . , 96:14687–14691. Majda, A. J., Timofeyev, I., and Vanden-Eijnden, E. (2001). . , (54):891–974. Majda, A. J., Timofeyev, I., and Vanden-Eijnden, E. (2003). . , 60:1705–1722. McGuffie, K. and Henderson-Sellers, A. (2005). . John Wiley & Sons, Ltd, Chichester, UK. Mori, H., Fujisaka, H., and Shigematsu, H. (1974). . , 51(1):109–122. Neumaier, A. and Schneider, T. (2001). . , 27(1):27–57. Orrell, D. (2003). . , 60(17):2219–2228. Palmer, T. and Hagedorn, R. (2006). . Cambridge University Press. Palmer, T. N. and Williams, P. D. (2008). , 366(1875):2421–7. Park, S. (2014). . , 71(Lcl):3902–3930. Pavliotis, G. A. and Stuart, A. M. (2008). . Texts in applied mathematics : TAM, Springer, New York, NY. Peixoto, J. and Oort, A. (1993). . American Institute of Physics, New York, NY. Plant, R. S. and Yano, J.-I. (2016). . Imperial College Press. Pope, S. B. (2004). . , 6. Ruelle, D. (1998). . , 245(3-4):220–224. 
Ruelle, D. (2009). . , 22:855–870. Sakradzija, M., Seifert, A., and Dipankar, A. (2016). . , 8:786–812. Schneider, T. and Neumaier, A. (2001). . , 27(1):58–65. Trevisan, A., Isidoro, D., and Talagrand, O. (2010). . , (January):487–496. Trevisan, A. and Uboldi, F. (2004). . , pages 103–113. Wilks, D. S. (2005). . , 131(606):389–407. Wouters, J., Dolaptchiev, S. I., Lucarini, V., and Achatz, U. (2016). . , 23:435–445. Wouters, J. and Gottwald, G. (2017). . . Wouters, J. and Lucarini, V. (2012). . , 2012(03):P03003. Wouters, J. and Lucarini, V. (2013). . , 151(5):850–860. Wouters, J. and Lucarini, V. (2016). . In Chang, C.-P., Ghil, M., Latif, M., and Wallace, J. M., editors, [*[Climate Change: Multidecadal and Beyond]{}*]{}, volume 15, pages 67–80. Young, L. S. (2002). , 108(5-6):733–754. Zwanzig, R. (1960). . , 33(5):1338–1341. Zwanzig, R. (1961). . , 124(4):983.
A new book about Beatles producer George Martin explores the tensions surrounding the recording of the band's classic track “Hey Jude,” which took place 50 years ago this month. Sound Pictures: The Life of Beatles Producer George Martin — The Later Years, 1966-2016 is Kenneth Womack’s second title about Martin; it will be published on Sept. 4 via Chicago Review Press. In an excerpt provided to Variety, Womack detailed the events of late July and early August 1968. Martin recalled, “I thought that we had made ["Hey Jude"] too long. It was very much a Paul [McCartney] song, and I couldn’t understand what he was on about by just going round and round the same thing.” He remained concerned about the track running to seven minutes and 11 seconds. “In fact,” Martin remembered, “after I timed it, I actually said ‘You can’t make a single that long.’ I was shouted down by the boys – not for the first time in my life – and John [Lennon] asked, ‘Why not?’ I couldn’t think of a good answer, really, except the pathetic one that disc jockeys wouldn’t play it.” Lennon countered, “They will if it’s us.” George Harrison remembered McCartney’s rejection of the suggestion that a guitar part should mimic the vocal melody, and noted it wasn’t a new situation. “Personally, I’d found that for the last couple of albums, the freedom to be able to play as a musician was being curtailed – mainly by Paul,” Harrison said later. “Paul had fixed an idea in his brain as to how to record one of his songs. He wasn’t open to anybody else’s suggestions.” “Hey Jude” went on to sell over two million copies in its first month of release, holding the Billboard No. 1 position for nine weeks, making it not only the Beatles’ longest-topping single, but the longest-playing single to reach the top.
Green Party presidential candidate Jill Stein announced she is complying with a Senate Intelligence Committee request for campaign documents in the “Russiagate” probe, but warned against politicized targeting. “We are cooperating by sharing all communications relevant to the committee’s mission,” Stein wrote in a Facebook post on Tuesday. “We support safeguarding our elections from interference, while at the same time we caution strongly against the targeting of political opposition.” The committee is investigating whether there was Russian interference into the 2016 presidential election, and whether the Trump campaign was involved. Russia has consistently denied any interference in the election. "What is noteworthy about this latest move is that it can be seen as both a warning shot against third party challenges and a cynical move by the state and corporate sector to limit the range of information and acceptable opinion reaching the US public – a win-win for the ruling elite," Ajamu Baraka, Stein's vice-presidential running mate, said in an emailed statement to RT. "The latest bit of ridiculousness from the Senate Select Committee’s investigation of the Stein/Baraka campaign" is part of "the neo-McCarthyist assault on the freedom of thought and speech," he added. READ MORE: ‘CNN should register as agent of capitalism’: Green Party VP hits out at RT ‘foreign agent’ tag The committee is looking into Stein’s potential “collusion with Russia,” Senator Richard Burr (R-North Carolina) said Monday, according to BuzzFeed. Burr had previously said the committee was also looking into reports the Democratic National Committee paid for research that went into a dossier containing allegations of Donald Trump’s alleged exploits in Moscow. Stein’s former campaign manager told BuzzFeed he expects the committee will want to know more about the 2015 Gala dinner hosted by RT, celebrating its 10th anniversary in Moscow.
Stein sat at the same table as Russian President Vladimir Putin and Michael Flynn, who briefly served as Trump’s national security adviser. Stein said she was not paid to attend the dinner and paid her own travel costs.

Click on "Jill Stein" that's trending & you'll see countless leading Dems - with large platforms - strongly implying if not outright stating she's a Kremlin agent: all because of a Congressional inquiry. They couldn't better replicate McCarthyism if they tried. https://t.co/HOWU16bS6O — Glenn Greenwald (@ggreenwald) December 18, 2017

“[We] made the trip with the goal of reaching an international audience and Russian officials with a message of Middle East peace, diplomacy, and cooperation against the urgent threat of climate change, consistent with long-standing Green principles and policies,” Stein wrote on Facebook.

Stein has been a frequent guest on RT America, commenting on issues such as gun control, foreign policy, and environmental matters. She notably appeared in the third-party presidential debates hosted by the network during the 2012 and 2016 elections. The only other US network to host third-party candidates was CNN, which organized a Libertarian town hall in June and a Green town hall in August.

Stein requested to speak with Putin, Russian Foreign Minister Sergey Lavrov, or someone else in the Russian government to discuss her policies, but the request was not granted, she told the Intercept. “We sought contact with every powerful world leader we had access to,” Stein said, adding that she was particularly interested in the Russian government’s involvement in the Syrian war.

Jill and our campaign will not be intimidated by this BS. https://t.co/jfYFaS6Opy — Ajamu Baraka (@ajamubaraka) December 18, 2017

Stein and her vice-presidential running mate, Ajamu Baraka, received about 1 percent of the popular vote in the 2016 election.
Going to be horrible watching Democrats use more baseless ammo to obsessively attack @DrJillStein for another year because Hillary couldn’t beat a detestable game show host https://t.co/qdzQcntVVZ — Abby Martin (@AbbyMartin) December 19, 2017

Stein cautioned that the Senate investigation is exercising “overreach, politicizing, and sensationalism,” which she called “a danger to democracy, especially in the current climate of all-out war on our First Amendment rights.”

Hillary Clinton’s partisans are painting Stein as a Russian spy to “explain away their colossal failure to defeat Donald Trump,” Max Blumenthal, senior editor at AlterNet, told RT. “I spoke to Jill Stein,” Blumenthal said. “She told me that Putin sat at the table for two minutes. There was no conversation between the two of them, no discussions whatsoever. Michael Flynn at the time was not even known as a Trump surrogate; he was known as a former Obama general. So this story is revolving around a picture that’s developed into a pseudo-scandal being used to suppress all opposition to the left of the Democratic Party. It is a McCarthyite scam.”

Led by Dianne Feinstein, Democrats are exploiting Russiagate to eliminate the political opposition to their left. The innuendo spread against Stein and the Green Party contains all the classical elements of McCarthyism. https://t.co/B0KRBlTZ5L — Max Blumenthal (@MaxBlumenthal) December 18, 2017

The expansion of the Senate Intelligence Committee investigation comes as the House panel is finishing up its own probe. The House Intelligence Committee is interviewing a number of high-profile witnesses this week, according to the Boston Globe. Among them are Rob Goldstone, the music producer who helped set up a meeting with a Russian lawyer at Trump Tower in June 2016; FBI Deputy Director Andrew McCabe; Felix Sater, a Russian-born businessman; and Rhona Graff, Trump’s longtime aide.
require 'aws-sdk-s3'

namespace :json do
  desc "Export all data to JSON files"
  task :export => :environment do
    # Connect to S3 (credentials are picked up from the environment)
    # and get a handle on the target object in the export bucket.
    s3 = Aws::S3::Resource.new
    bucket = s3.bucket('learnawesome-export')
    obj = bucket.object('dataset.json')

    # Serialize every record of the three models into one JSON document.
    json = {
      topics: Topic.all,
      idea_sets: IdeaSet.all,
      experts: Person.all
    }.to_json

    # Upload the document as the body of the S3 object.
    obj.put(body: json)
    puts "Done uploading"
  end
end
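The task serializes three ActiveRecord collections into a single JSON document before uploading it. A minimal stdlib-only sketch of that payload shape, with the Topic, IdeaSet, and Person models stubbed as plain hashes (the real task depends on Rails' `as_json` behavior and live AWS credentials, neither of which is assumed here):

```ruby
require 'json'

# Hypothetical stand-ins for Topic.all, IdeaSet.all, and Person.all.
topics    = [{ id: 1,  name: "Programming" }]
idea_sets = [{ id: 10, name: "SICP" }]
experts   = [{ id: 7,  name: "Ada Lovelace" }]

# Same top-level structure the rake task builds before calling obj.put.
payload = {
  topics: topics,
  idea_sets: idea_sets,
  experts: experts
}.to_json

# Round-trip to confirm the document has the three expected export keys.
parsed = JSON.parse(payload)
puts parsed.keys.sort.inspect  # => ["experts", "idea_sets", "topics"]
```

Because `to_json` flattens symbol keys to strings, any consumer of dataset.json should expect string keys when parsing the export back.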
One of Our 50 Is Missing

Reader stories of misinformation and misunderstanding about The Land of Enchantment

Hannah Pollard, a college student, traveled overseas with her class to help teach English to Italian middle schoolers. One of the teaching exercises consisted of each American college student stating his or her state of origin and the Italian children finding it on a map. When it was Pollard’s turn, she said “New Mexico.” All of the children promptly pointed at Mexico. The Italian teacher tried to correct them (“Nuovo. Nuovo Mexico”), but the students kept pointing at Mexico. When the teacher finally pulled a huge map to the front of the class and showed them the correct state, the students all started to laugh. The teacher asked what was so funny, and the students told her, in Italian: “But that’s Arizona!”

WHY THEY CALL IT “YAHOO”

Engineer Abbas Akhil, a 40-year Albuquerque resident, was browsing Yahoo News when he came across an image in a photo essay about people who have fought oppression. Included was the famous 1968 photo of Olympic medal winners Tommie Smith (gold) and John Carlos (bronze) protesting racism by raising their fists and staring downward during the playing of “The Star-Spangled Banner.” While the text correctly states that the 1968 Olympic Games were held in Mexico City, Mexico, the headline reads “1968 racism: new mexico olympics.”

I LIKE TO BE IN AMERICA

Las Cruces native Diann Bowlin has traveled the world widely for both business and pleasure. Her most memorable airport experience, however, happened in Baltimore. When Bowlin checked in at the counter for her flight to Albuquerque, the clerk asked for her U.S. passport. “I said I wasn’t carrying it with me, and I asked why I would need it.” The clerk told her that she needed her passport for the trip.
“I told her that I was not traveling outside the United States, and she said, ‘Yes, you are—New Mexico is a foreign destination.’” Bowlin told her that New Mexico is in the U.S., between Arizona and Texas; that it and Arizona both became states in 1912; that New Mexico is the fifth-largest state; and that she did not need a passport to go to New Mexico. She kept her voice down until uttering the last phrase, which she admits to having screamed at the top of her lungs. The clerk looked a little shocked, then embarrassed. She looked back at her computer screen, finally printed out Bowlin’s tickets, and told her to have a nice flight. “As I walked away, I could hear some other passengers applauding. It felt so good to be a New Mexican!”

UNDER THE RADAR

Pat Aalseth recently visited the National Weather Service’s National Oceanic and Atmospheric Administration Dial-A-Forecast web page (nws.noaa.gov/pa/recordedforecasts.php) to obtain the phone number for a recorded weather forecast for New Mexico. Although the page lists numbers for every other state in the U.S., New Mexico was . . . you guessed it . . . missing.

JERSEY ACCENT

The eighth-grade class of Albuquerque’s Montessori Middle School recently enjoyed a class trip to Philadelphia that included a day on a New Jersey beach. Afterward, during dinner at a nearby pizzeria, student Julia King and her friend were approached by a local woman who asked where they were from. “We told her we were a school group visiting the area from New Mexico. She looked at me with wide eyes and asked, ‘How do you speak such good English?’ With that, my friends and I started cracking up. She looked confused until I told her we were from New Mexico, the state. I’m not entirely sure she got what I meant, but, embarrassed, she walked away.”

NEW YORK STATE OF MIND

While in New York City, bestselling author James McGrath Morris, of Tesuque, browsed the May 21 edition of Metro, a free daily paper.
An article advised Big Apple residents that JetBlue’s new flights to Albuquerque would deliver them directly to Arizona. “I think we may need to inform the mayor of Albuquerque that they are now part of Arizona,” Morris quipped.

Send Us Your Story—Please!

Dear “50” fans: Help sustain this popular feature by sharing your anecdotes—we know you have some choice ones that you haven’t gotten around to sending in. Just dash it off if you like, and we’ll take it from there. Submissions will be edited for style and space. Please include your name, hometown, and state. E-mail to fifty@nmmagazine.com, or mail to Fifty, New Mexico Magazine, 495 Old Santa Fe Trail, Santa Fe, NM 87501.