Last month, Americans turned out in droves for a once-in-a-lifetime event, as a total solar eclipse crossed the mainland US from coast to coast for the first time since 1918. While we knew a lot of people watched it, a study from the University of Michigan has revealed just how popular the event was. And, well, it’s pretty impressive.

According to the study, 215 million adults watched the eclipse either directly or electronically. That’s 88 percent of the adult population of the US, and more than twice the number of people who viewed the last Super Bowl (111 million).

“This level of public interest and engagement with a science-oriented event is unparalleled,” said Jon Miller, director of the International Center for the Advancement of Scientific Literacy at the University of Michigan’s Institute for Social Research, in a statement.

The findings are based on responses from 2,211 adults, collected by the National Opinion Research Center at the University of Chicago. Miller conducted his research in partnership with NASA.

Of those who watched it, 61 million did so electronically, via the Internet, TV, or the like. Around 20 million travelled from their home to another area to watch it, since only a thin band across the US actually saw totality (which, having been there, I can tell you was a wonderful and unbeatable experience). Only about 6.5 million watched the eclipse as part of an organized group, with most opting to see it with family or friends. This is probably because it occurred on a Monday morning, a regular working day. About a third of people took pictures or video, and half of those shared them on social media.

Somewhat amusingly, on a scale of zero to 10 for how much they enjoyed it, those surveyed gave the eclipse an average of just 7.6. Guess people are hard to please, eh. On an educational scale, they gave it a 7. But hey, at least it was widely seen.
If you were one of the 12 percent of Americans who didn’t watch the eclipse, fear not. There’s going to be another in 2024 that, while it won’t cross the whole US, will be visible from Mexico all the way up to Maine. After that, you’ll have to wait until 2045.
Naveed Sattar, professor,1 Ewan Forrest, consultant,2 David Preiss, clinical senior lecturer1

1 BHF Glasgow Cardiovascular Research Centre, University of Glasgow, Glasgow G12 8TA, UK
2 Department of Gastroenterology, Glasgow Royal Infirmary, Glasgow, UK

Correspondence to: N Sattar naveed.sattar{at}glasgow.ac.uk

Summary points

- Non-alcoholic fatty liver disease (NAFLD) represents a spectrum of liver disease, with key stages consisting of hepatic steatosis (NAFL), steatohepatitis (NASH), fibrosis, and eventual cirrhosis
- NAFLD affects more than 20% of populations worldwide and most patients with type 2 diabetes mellitus
- The risk of progressive liver disease in the earliest stage of NAFLD, hepatic steatosis, is low, but patients with NASH are at far higher risk, and hepatic steatosis due to NAFLD is also a major risk factor for the development of type 2 diabetes
- Most patients with NAFLD are asymptomatic, and the disease is typically suspected from raised alanine aminotransferase (ALT) levels together with other clinical and biochemical features, or as an incidental finding during abdominal ultrasonography
- Owing to the slow progression of NAFLD, randomised clinical trials have been unable to identify drugs that conclusively reduce progression to cirrhosis, but sustained weight loss has been shown to improve liver function test results and liver histology; lifestyle improvement therefore remains the key intervention
- There is no convincing evidence that NAFLD independently increases a patient’s cardiovascular risk, but there is also no reason to withhold statins in patients with NAFLD who are at high cardiovascular risk unless transaminase levels are more than three times the upper limit of normal

Non-alcoholic fatty liver disease (NAFLD) is now more common than alcoholic liver disease owing to the rapid rise in the prevalence of obesity,1 and NAFLD is the most common cause of abnormal liver function tests.2 Its prevalence worldwide is thought to be approximately 20% in the
general population and up to 70% in patients with type 2 diabetes mellitus.3 The first recognisable stage of NAFLD is hepatic steatosis, when fat content exceeds 5% of liver volume. Simple steatosis is usually benign in terms of risk of progression to more advanced liver disease, but given its high prevalence it none the less represents an important cause of cirrhosis.4 Notably, NAFLD is strongly associated with insulin resistance and hyperglycaemia and it is therefore closely linked to type 2 diabetes. Non-alcoholic steatohepatitis (NASH), the next stage of NAFLD, develops when hepatic inflammation ensues, and its prevalence in the general population is estimated at 3-5%3; people with NASH are at much higher risk of clinically significant and progressive liver fibrosis, cirrhosis, and hepatocellular carcinoma.4 5 Relevant clinical questions include how to evaluate abnormal liver function test results, whether it is important to identify NAFLD, how to pragmatically identify patients who may have NASH, and who to refer for specialist evaluation. In this article we outline how NAFLD may be recognised in primary care, we suggest when further investigations are needed, and we show why NAFLD should be a strong driver for sustainable weight loss to reduce metabolic and, potentially, hepatic risks.

Sources and selection criteria

We sought relevant studies from the Cochrane Database of Systematic Reviews, Medline, and Embase, with particular emphasis on systematic reviews, randomised controlled trials, and meta-analyses of trials. Search terms included “non-alcoholic fatty liver disease” and “non-alcoholic steatohepatitis”. Studies were limited to those in adults and written in English.

Who gets NAFLD?

Obesity is a major risk factor for the development of NAFLD. The increase in obesity is therefore the main driver for the greater prevalence of NAFLD in the community.
There is a strong link between NAFLD and type 2 diabetes, even beyond adiposity.6 Male sex and a family history of type 2 diabetes are also associated with a greater risk of NAFLD and NASH at any given body mass index,7 and preliminary evidence suggests greater liver fat content in certain ethnicities that are also known to be at increased risk of type 2 diabetes.8 Preliminary evidence also suggests a genetic predisposition to hepatic accumulation of fat in some people through the PNPLA3 gene.9 Such people may not necessarily display the usual metabolic associations with NAFLD, but genetic screening for PNPLA3 is not currently recommended.3 Strictly speaking, NAFLD should only be diagnosed in people who consume no or only modest amounts of alcohol (daily intake <20 g (2.5 units) in women and <30 g (3.75 units) in men), although the clinical reality is that in many people both obesity and alcohol will contribute to their level of liver fat and risk of progressive liver disease.10 Uncommon causes of fatty liver should also be considered by taking an appropriate medical history; these include drugs (amiodarone, diltiazem, steroids, synthetic oestrogens, tamoxifen, and highly active antiretroviral therapy), refeeding syndrome and total parenteral nutrition, severe weight loss after jejunoileal or gastric bypass, and lipodystrophy and other rare disorders.

When should NAFLD be suspected and how is it diagnosed?

It is important to stress that many patients with NAFLD will be overweight or obese, asymptomatic, and have normal liver function test results. As NAFLD may have a benign asymptomatic course, and because there is a lack of definitive evidence about effective interventions, there is currently no compelling reason to screen for the condition in an untargeted fashion. NAFLD is typically first suspected when the results of liver function tests, measured as part of routine testing (for example, health checks), are moderately abnormal.
The usual observed biochemical pattern in hepatic steatosis due to NAFLD is of increased levels of transaminases, with alanine aminotransferase (ALT) levels exceeding those of aspartate aminotransferase (AST). This classical pattern is particularly useful in differentiating hepatic steatosis from NAFLD and alcoholic liver injury, the latter normally being associated with a high AST:ALT ratio. With the progression of hepatic steatosis to NASH and associated hepatic fibrosis, however, AST levels increase, with a resultant rise in the AST:ALT ratio.1 11 γ glutamyltransferase (GGT) may also be modestly increased along with the NAFLD pattern for transaminases. Both ALT and GGT have been shown in cross sectional studies to be modestly associated with the presence of fatty liver on ultrasonography and with liver fat content as measured by magnetic resonance spectroscopy,12 whereas AST is not. Another common way in which NAFLD is identified is as an incidental finding of increased hepatic echogenicity on abdominal ultrasonography carried out for other reasons, such as right upper quadrant abdominal pain. However, ultrasonography has poor sensitivity for diagnosing NAFLD, and many patients with important steatosis on biopsy are not recognised by such imaging,13 underlining the imperfect nature of both biochemistry and imaging for identifying NAFLD.

Common forms of presentation of NAFLD

Case 1 (box 1) shows one of the common presentations of NAFLD in which liver function tests have been checked. Although some uncertainty about alcohol intake is often present, this patient’s pattern of an ALT level above that of AST, along with other characteristics linked to insulin resistance or a higher risk of type 2 diabetes (namely, high body mass index, high triglyceride level, low high density lipoprotein cholesterol level, and high or high normal HbA1c level), strongly suggests the presence of NAFLD, specifically hepatic steatosis.
The question of whether ultrasonography of the liver is required to confirm the suspicion is discussed below.

Box 1 Typical scenario for identified NAFLD

Case 1: Mr A is seen for cardiovascular risk screening:
- Age 43, body mass index 31 kg/m2, 14 units of alcohol per week
- Cholesterol 6.2 mmol/L, triglyceride 3.5 mmol/L, high density lipoprotein cholesterol 0.9 mmol/L
- ALT 62 U/L, AST 38 U/L (normal <50 U/L)
- Blood pressure: systolic 145 mm Hg, diastolic 88 mm Hg
- HbA1c 44 mmol/mol

ALT=alanine aminotransferase; AST=aspartate aminotransferase.

In case 2 (box 2), fatty liver has been identified on abdominal ultrasonography and the clinician must distinguish between NAFLD and alcoholic liver disease. In this case, the patient is adamant about abstinence from alcohol. The clinician had recently checked the patient’s liver function test results and both transaminases were within the normal reference interval, with the ALT level higher than that of AST. Notably, higher ALT and GGT levels within the normal range are associated with a greater risk of type 2 diabetes.14 Further supportive evidence for the presence of NAFLD in this patient includes the related finding of hypertriglyceridaemia together with a family history of type 2 diabetes, although no formal glycaemia testing had been performed. Although overweight, the patient was not obese.

Box 2 Typical scenario for suspected NAFLD

Case 2: Mrs B is referred for abdominal ultrasonography because of abdominal discomfort:
- Age 56, body mass index 28 kg/m2, no alcohol
- Ultrasound findings: increased hepatic echogenicity in keeping with fatty liver
- Family history of type 2 diabetes
- ALT 38 U/L, AST 25 U/L
- Cholesterol 5.2 mmol/L, triglyceride 4.1 mmol/L, high density lipoprotein cholesterol 1.2 mmol/L
- No glycaemia testing ever performed

ALT=alanine aminotransferase; AST=aspartate aminotransferase.

How is NAFLD distinguished from alcoholic liver disease?
Hepatic steatosis due to alcohol excess is often associated with an AST:ALT ratio >1.5, unlike non-alcoholic fatty liver disease. Alcohol excess also commonly results in increased high density lipoprotein cholesterol levels together with triglyceride levels that can vary between normal and vastly increased, even in the same patient, depending on the timing of blood sampling in relation to alcohol intake. This pattern of biochemistry is less consistent with insulin resistance and NAFLD. Some other features can help to distinguish one from the other (table).

Table: Typical features of non-alcoholic fatty liver disease (NAFLD) and alcoholic liver disease

People may have mixed patterns of biochemistry and both obesity and alcohol related risk factors.

How should suspected or confirmed NAFLD be managed?

Several points require consideration in deciding how to treat patients with NAFLD (figure).

Figure: Proposed algorithm for diagnosis and initial management of suspected or confirmed non-alcoholic fatty liver disease (NAFLD) in primary care. ALT=alanine aminotransferase; LFTs=liver function tests; AST=aspartate aminotransferase. *Some biochemistry laboratories measure only one of the transaminases; in such cases it will be necessary to request both ALT and AST tests in relevant patients

Depending on how NAFLD is first suspected, whether based on abnormal transaminase levels (with AST levels less than those of ALT) or an incidental finding on ultrasonography, additional evidence is often helpful. Details of previous lipid profiles, type 2 diabetes in patients and their families, past results for fasting glucose or HbA1c, alcohol intake, and current weight provide incremental information to help in the diagnosis of NAFLD. Where such information is not available, or where lipid profiles or screening tests for type 2 diabetes have not been done in recent years, this medical history should be obtained and the necessary blood tests performed.
The patient should be provided with lifestyle advice to aid sustained weight loss and reduce alcohol intake. Repeating liver function tests in 3-6 months in those with NAFLD on ultrasonography gives patients time to implement lifestyle changes, at which point they can be reassessed by the clinician. Similar or improved results (a reduction in ALT or other metabolic parameters such as body weight, triglyceride, HbA1c) should drive ongoing improvements to lifestyle, whereas deterioration in results can be approached as described below. Screening for type 2 diabetes is particularly important given the close relation between NAFLD and dysglycaemia, as it provides the opportunity not only to potentially identify undiagnosed type 2 diabetes15 but also to identify those at increased risk, as defined in recent guidelines from the National Institute for Health and Care Excellence.16 Hepatic steatosis due to NAFLD is a risk factor for both type 2 diabetes and NASH, and its occurrence should form a major incentive for improvements to lifestyle. Where the liver function test results are mildly or moderately raised (transaminases 50-150 U/L (1 to 3 times the upper limit of normal) with AST levels less than those of ALT) and the available information (based on body weight, lipids, HbA1c or glucose, family history of type 2 diabetes, alcohol intake) suggests NAFLD, patients should also be asked to return for repeat liver function tests in 2-3 months, having been advised to reduce any alcohol intake (or preferably discontinue it) and to pursue lifestyle improvements to sustainably reduce weight. Marked increases in transaminases (>150 U/L (>3 times the upper limit of normal) with AST levels less than those of ALT), or an additional increase in alkaline phosphatase (ALP), should heighten awareness of the possibility of other causes and of the potential for progressive liver disease, whether due to NAFLD or another cause.
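The threshold-based triage described above can be sketched in code. This is an illustrative sketch only, not a clinical tool: the function name is ours, and the only assumption beyond the text is the upper limit of normal of 50 U/L, taken from the reference range quoted in Box 1 (so 3 times the upper limit corresponds to the 150 U/L figure used above).

```python
# Illustrative sketch of the transaminase triage thresholds described above.
# Not a clinical decision tool; function name and structure are hypothetical.

ULN = 50  # U/L, upper limit of normal for transaminases (from Box 1: "normal <50 U/L")


def triage_transaminases(alt: float, ast: float) -> str:
    """Rough first-pass triage of transaminase results in suspected NAFLD."""
    if ast >= alt:
        # A rising AST:ALT ratio is atypical for simple steatosis and may
        # signal fibrosis or alcohol related injury
        return "atypical pattern - consider fibrosis or alcohol, reassess"
    if alt > 3 * ULN:
        # >150 U/L: consider other causes and progressive liver disease
        return "markedly raised - repeat within weeks, consider referral"
    if alt > ULN:
        # 50-150 U/L: lifestyle advice, repeat liver function tests
        return "mild-moderate rise - lifestyle advice, repeat LFTs in 2-3 months"
    return "within normal limits - routine follow-up"


# Case 1 from Box 1: ALT 62 U/L, AST 38 U/L
print(triage_transaminases(alt=62, ast=38))
```

Run against the Case 1 figures, the sketch lands in the mild-moderate band, matching the management pathway the article proposes for that patient.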
These patients should be seen again within a few weeks for repeat testing and consideration of specialist referral. Published recommendations for the management of abnormal transaminase levels exist, and this review is not intended to be a comprehensive guide to investigating all abnormal liver function test results. On the basis of the available clinical information, it is important to consider other liver conditions that are treatable or that may have important consequences for family screening, such as chronic viral hepatitis, autoimmune liver disease, haemochromatosis, or drug induced liver injury. With the increasing prevalence of obesity it is inevitable that other liver diseases will be present among those with risk factors for NAFLD. This clinical overlap is sometimes compounded by the presence of mildly to moderately raised ferritin and immunoglobulin (predominantly IgA) levels in NAFLD, both of which may reflect the stage of liver damage in NAFLD without evidence of primary iron overload or autoimmune disease.17 18 Coexisting hepatic steatosis is itself a cofactor for the progression of other primary liver diseases. Although only a few patients with abnormal liver function test results will have serious liver disease requiring immediate treatment, studies have shown that most abnormal results remain abnormal on repeat testing. Appropriate investigation and treatment can therefore be planned when these are first identified.19

Is ultrasonography needed if NAFLD is strongly suspected?

In patients with mildly abnormal transaminase levels plus a suggestive biochemical and risk factor profile in keeping with hepatic steatosis due to NAFLD, many clinicians pursue the diagnosis by means of liver ultrasonography.
Proponents suggest a low threshold for ultrasonography scans for screening patients with suspected NAFLD.20 However, ultrasonography has several notable limitations:
- variability between sonographers
- the technical difficulty of scanning obese patients in a robust and reproducible way
- the inability to distinguish NASH, which is far more likely to progress to advanced liver disease, from simple steatosis
- the lack of an agreed grading system
- the huge number of patients potentially requiring ultrasonography, which would overwhelm local services
- the lack of additional treatment options based on the scan result21 22

In our opinion, the additional benefit of routinely requesting liver ultrasonography to diagnose NAFLD in patients with suggestive phenotypic and biochemical features, and no features of other or more advanced liver disease, is therefore unproved and highly questionable.

Weight loss and lifestyle improvements are the key goal in NAFLD

Because of the low incidence of progressive liver disease in early NAFLD and the time required for advanced liver disease to develop, randomised trials of lifestyle improvements and various drugs have necessarily been limited to changes in surrogate markers as their primary outcomes. As yet, therefore, there is no conclusive evidence for any particular treatment approach, and cost effective, non-invasive surrogates that robustly track later development of cirrhosis are much sought after. For most patients with presumed or confirmed NAFLD, the key is to offer lifestyle advice that can lead to sustained weight loss. A recent systematic review of 23 studies evaluating the effect of diet or physical activity in adult populations with NAFLD showed that these lifestyle modifications consistently reduced liver fat and improved glucose control and insulin sensitivity.23 Limited data suggest that lifestyle interventions may also yield benefits for liver histology.
Should glycaemia testing confirm type 2 diabetes, or show that a patient is at high risk of developing it, lifestyle advice is recognised to be critical to the management of these patients as per universal guidance for the disease,24 and here it may have a dual benefit. General advice on healthy eating and increasing levels of physical activity can be delivered in primary care, or in specialist settings where required. Patients can also be encouraged to attend a commercial weight loss programme of their choice. Recent evidence from a randomised trial shows that commercial weight loss programmes may perform better than advice given by the National Health Service in achieving weight loss.25 Should ALT and GGT levels decline along with weight reduction, these encouraging results should be shared with patients as further incentive to sustain their lifestyle improvements. Numerous trials of drug treatments, such as metformin, pioglitazone, vitamin E, and statins, have failed to deliver conclusive evidence of reductions in clinically significant progression of liver disease, although some studies have yielded improvements in surrogate markers.26

Does NAFLD indicate that patients are at increased cardiovascular risk?

Undoubtedly NAFLD is often accompanied by classical cardiovascular risk factors including, but not limited to, type 2 diabetes and low levels of high density lipoprotein cholesterol.
This has driven a plethora of observational studies linking markers of NAFLD (including ALT and GGT, fatty liver on ultrasonography, steatosis on liver histology) to cardiovascular surrogate markers and cardiovascular outcomes.27 While some studies have found associations between these NAFLD surrogates and cardiovascular risk, many have been limited by inadequate adjustment for established cardiovascular risk factors.28 Crucially, for NAFLD to be considered as a truly important and independent risk factor, it will need to show clinically meaningful improvements in cardiovascular risk prediction when added to calculators that already include these established risk factors.29 No such evidence yet exists. Importantly, our suggested approach for evaluating the likelihood of NAFLD provides much of the information needed to calculate cardiovascular risk using established risk calculators. Therefore, current evidence suggests that cardiovascular risk should be calculated using the usual available tools without consideration of the presence or absence of NAFLD.

What if patients are already using a statin or require a statin based on cardiovascular risk?

Given that many patients with NAFLD will have risk factors for cardiovascular disease, many will already be taking a statin or may require statin treatment because of their increased cardiovascular risk.
Statin treatment, including high potency statin treatment, is safe in the presence of NAFLD and should not be avoided because of mildly to moderately raised transaminase levels (up to three times the upper limit of normal).30 Indeed, preliminary evidence from the Greek Atorvastatin and Coronary Heart Disease Evaluation study suggests that those with increased transaminase levels (up to three times the upper limit of normal) may derive an even greater cardiovascular benefit from statins.31 Robust evidence for the safety of statins in those with NAFLD and transaminases over three times the upper limit of normal is lacking, and statin treatment is probably best avoided in such people unless recommended after specialist hepatology review. In those with moderately abnormal liver function test results (<150 U/L (<3 times the upper limit of normal)) at the time of starting statins, it is prudent to repeat the tests one or two months after the start of statin treatment.

When should patients with confirmed or suspected NAFLD be referred to a gastroenterologist?

Given the high prevalence of NAFLD coupled with the low risk of progressive liver disease in most people with simple steatosis, only those considered to be at high risk of progressive liver disease should be referred to secondary care for further evaluation. Several risk scores have been developed for the assessment of NAFLD severity.32 33 The NAFLD fibrosis score seems to be the most accurate in comparative studies, but it is not readily calculable and a large proportion of patients have indeterminate results. A simpler score, FIB-4, seems to be of similar accuracy but still requires clinicians to carry out some form of electronic calculation. The simplest instrument is the BARD score, which places a heavy weighting on the AST:ALT ratio, with a ratio >0.8 considered to be associated with advanced fibrosis.
Indeed an AST:ALT ratio >0.8 on its own has been found to perform well as an indicator of more severe liver disease.33 In this study, the AST:ALT ratio provided the best negative predictive value for advanced fibrosis and also demonstrated good diagnostic accuracy with a C statistic of 0.83 which was comparable to or better than results for more complex scores, where C statistics ranged from 0.67 to 0.86. In practical terms, patients with features of NAFLD in whom other major liver disease has been excluded and whose AST:ALT ratio is increasing to >0.8 as a result of a rising AST level should be considered at risk of progressive liver disease and referred for further evaluation. In addition, we would suggest that patients with ALT or AST levels more than three times the upper limit of normal or with abnormal ALP levels should be considered for specialist referral. Development of other clinical or laboratory features of advanced liver disease or portal hypertension, such as the appearance of spider naevi or unexplained thrombocytopenia, would warrant specialist referral. Liver ultrasonography and assessment of the severity of NAFLD using more specific severity scoring, serological assessment of fibrosis, or measurement of liver stiffness (transient elastography or acoustic radiation force imaging), can be performed in a secondary care setting. Liver biopsy may be required to clarify the severity of the underlying liver disease but even this “definitive” investigation is subject to considerable variability.34 Recognition of those patients with more advanced liver disease or at risk of progressive liver damage allows appropriate monitoring; in particular patients with cirrhosis can be entered into surveillance programmes for hepatocellular carcinoma and the presence of oesophagogastric varices. 
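For illustration, two of the indices discussed above are simple enough to compute by hand. The sketch below is ours, not part of the article: the FIB-4 formula shown (age × AST divided by platelets × √ALT) is the published form of that index, the 0.8 AST:ALT threshold comes from the text, and the function names and worked example are hypothetical.

```python
import math

# Illustrative calculations for two NAFLD severity indices mentioned above.
# Function names and the worked example are ours; not a clinical tool.


def ast_alt_ratio(ast: float, alt: float) -> float:
    """AST:ALT ratio; values rising above 0.8 suggest more advanced fibrosis."""
    return ast / alt


def fib4(age_years: float, ast: float, alt: float, platelets_e9_per_l: float) -> float:
    """FIB-4 index: age (years) x AST (U/L) / (platelets (10^9/L) x sqrt(ALT, U/L))."""
    return (age_years * ast) / (platelets_e9_per_l * math.sqrt(alt))


# Case 1 from Box 1: ALT 62 U/L, AST 38 U/L
ratio = ast_alt_ratio(ast=38, alt=62)
print(round(ratio, 2))  # -> 0.61, well below the 0.8 referral threshold
```

As the worked example shows, the Case 1 patient's ratio of about 0.61 sits below the 0.8 threshold, consistent with the article's suggestion that referral be driven by a ratio rising above 0.8 owing to increasing AST.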
Tips for non-specialists

- In patients with raised alanine aminotransferase (ALT) or γ glutamyltransferase (GGT) levels or with hepatic steatosis noted on ultrasonography, non-alcoholic fatty liver disease (NAFLD) should be suspected in those with risk factors (increased body weight, raised fasting glucose or HbA1c, modestly raised triglycerides, low high density lipoprotein cholesterol, and AST:ALT ratio <0.8) and with daily alcohol intake <20 g in women and <30 g in men
- Patients with, or suspected of having, NAFLD should be screened for type 2 diabetes and, in the absence of type 2 diabetes, given preventive advice whether or not they fall into the high diabetes risk category
- Cardiovascular risk in NAFLD should be calculated using the usual risk scores
- There should be no hesitation in recommending statins for patients with NAFLD who are at increased cardiovascular risk unless their transaminase levels are more than three times the upper limit of normal
- Routine liver ultrasonography is not required in most patients with suspected NAFLD
- Patients with suspected or confirmed NAFLD should be given lifestyle advice on sustainable weight reduction and, because obesity and alcohol may act synergistically to promote liver disease, a reduction in alcohol intake should also be strongly advised
- Referral to a gastroenterologist should be considered in patients with features of NAFLD, in whom other significant liver disease has been excluded and whose AST:ALT ratio reaches >0.8 owing to increasing AST levels

Additional educational resources

Resources for healthcare professionals

- British Society of Gastroenterology (www.bsg.org.uk/clinical/commissioning-report/nash-and-non-alcoholic-fatty-liver-disease.html): a source of information for clinicians on background, current practice, and recommended practice for NAFLD in the United Kingdom
- American Association for the Study of Liver Diseases, American College of Gastroenterology, and the American
Gastroenterological Association (http://gi.org/clinical-guidelines/clinical-guidelines-sortable-list/): a comprehensive American guideline on all aspects of NAFLD

Resources for patients

- National Health Service (www.nhs.uk/conditions/fatty-liver-disease/pages/introduction.aspx): a comprehensive source of information for patients on NAFLD, its stages, and sensible lifestyle modifications
- American College of Gastroenterology (http://patients.gi.org/topics/fatty-liver-disease-nafld/): a source of patient information on the causes, risk factors, investigation, and treatment of NAFLD

All the websites are free to access and do not require registration.

Questions for future research

- Are there non-invasive surrogates for progression to advanced liver disease (fibrosis, cirrhosis) that can be robustly applied in future trials in non-alcoholic fatty liver disease (NAFLD)?
- What simple algorithms can be developed that effectively predict the presence of or progression to progressive liver disease?
- Is NAFLD an independent predictor of cardiovascular disease after established risk factors have been fully accounted for?

Cite this as: BMJ 2014;349:g4596
Christyn Cianfarani is president of the Canadian Association of Defence and Security Industries

Stimulating innovation in the Canadian economy has proven to be one of the hardest things for governments to achieve. For a generation, most attempts have barely moved the needle. But a recent idea floated by Navdeep Bains, the Minister of Innovation, Science and Economic Development – namely, using government procurement as a tool to promote innovation – suggests that new thinking is in the air in Ottawa.

For nearly a generation, Canada has followed the standard policy menu to stimulate innovation. Governments have reduced marginal tax rates on business and workers; invested in university-based research, postsecondary education, training/skills development and infrastructure; opened Canada's markets to international trade and investment; and provided generous research and development tax incentives. The results have been disappointing. Canada's productivity growth, which is highly dependent on innovation, remains stagnant, at about 1 per cent a year. New approaches and new policy tools are needed to break this logjam.

One such tool is government buying power: procurement. And defence procurement is particularly well suited in this connection. The defence industry operates in a managed global marketplace in which governments are the main if not sole customers. In the defence market, governments also have policy levers at their disposal and wider discretion to achieve their objectives than in other markets. Defence procurement, for example, is not nearly as constrained by trade agreements as other economic sectors. For this and other reasons, many of our allies have for decades used military procurement to drive innovation in their economies.
The British, U.S., French and Swedish economies, for example, would not be nearly as innovative today had they not approached military procurement with a focus on developing key defence technologies and innovations at home that often lead to wider commercial applications, and then exporting them abroad. Canada has not traditionally been in that game. But since the 2014 introduction of the federal Defence Procurement Strategy, we have had a new tool in the policy tool box: "Industrial and technological benefits" (ITBs) and, in particular, the "value proposition" part of that program. The value proposition principle can require bidders on defence projects to make commitments to R&D and intellectual property transfer to Canada, both of which are key drivers of innovation. This principle, still in its infancy, needs to be embraced and used systematically and strategically going forward to enhance innovation in Canada's defence industrial base.

This would not mean creating an innovative domestic defence industry out of nothing. Rather, it is a case of building upon a solid foundation. A recently completed Innovation, Science and Economic Development/Statistics Canada survey concludes that the Canadian-based defence industry generates 60 per cent of its revenue from exports, which is 20 per cent higher than the overall Canadian manufacturing industry. An export intensity of that magnitude is an indicator of an innovative industry, especially given the highly regulated and protected international marketplace where defence companies operate.

The survey also reported that more than 30 per cent of employment in the Canadian defence industry is concentrated among highly skilled engineers, scientists, researchers, technicians and technologists, and the sector outpaced the overall Canadian manufacturing average in employee compensation, all of which are innovation indicators.
But to really achieve the result the government is looking for, an effort should be made to marry the innovation agenda with the recently announced defence review. Together, these provide the opportunity to make practical changes to improve the process of defence procurement while enhancing innovation. Canada should take a page from the playbooks of its allies, which have addressed the industrial dimensions of military procurement in their defence reviews. The Australians, for example, have just published a defence white paper accompanied by a defence industrial policy focused on innovation in that sector. The notion of using government procurement to move beyond the textbook innovation policy menu would be a welcome innovation in and of itself. The test case for this new approach should be the defence sector, in which Ottawa already possesses innovation policy instruments and is less constrained by trade agreements. Canada's defence industrial base can be a real source of innovation-led growth over the next few years. All it takes is government and industry working co-operatively and strategically toward that goal.
Earlier this month, while visiting Monticello with a French delegation that included socialist President François Hollande, President Obama was caught in a moment joking about breaking protocol to view the grounds, quipping, “That’s the good thing as a President, I can do whatever I want.” Forget the enormity of the irony of an American President joking at Thomas Jefferson’s home that the rules don’t apply to him, and give Obama the benefit of the doubt that this was just a casual line. It was no different than a line Michael Douglas would say in The American President, or Kevin Kline in Dave, or Martin Sheen in The West Wing – and that’s exactly the point. Obama has become a President of good movie lines who is unable to act like a real leader.

The reason for the uproar over comments like this is that this President never wastes an opportunity to show just how right the absurd social media noise machine is. When Obama jokes about being able to do whatever he wants, then turns around and hits an HBO producer up at a State Dinner for advance copies of television shows to get him through an extended weekend, how are we as a desperate electorate supposed to react? We tolerate the luxuries afforded to our leaders. Just don’t be a dick about it. How is a world currently engulfed in flames of revolution supposed to react? The problem for a President who makes any excuse to hit up a golf course, or who admits to watching tons of HBO, is that there are still events happening in the world outside his windows. People are desperate for American leadership; he can’t wait for the killer on True Detective to be revealed. Nobody in Kiev is interested in the fallout of the Red Wedding. Nobody in Venezuela cares about the fate of Zoe Barnes. Obama and his administration can’t wait to inject themselves into pop culture as it suits their narratives. If The Lego Movie is number one at the box office, the Secretary of State is comparing global warming to 3D movies.
If the Super Bowl is trending on social media, out come the football analogies. Michelle Obama is on Jimmy Fallon’s new Tonight Show, and Joe Biden is on Seth Meyers’ follow-up. Actors push health care that they themselves refuse to sign up for. What message does it send the world when cries for democracy in Ukraine and Venezuela are met with silence, but tweeting about a cable show is paramount? Barack Obama is the first President optimized for SEO, and therein lies the problem. When #Venezuela and #Kiev are the top trends on Twitter for two days straight, and not #HouseOfCards or #TrueDetective, the online persona machine that elected Barack Obama goes dark and resorts to a spam account selling us crappy insurance.

At the height of the violence that erupted with both protests this past weekend, where was he? Hosting a Hollywood-premiere-style party for #GeorgeClooney and the cast of his film #TheMonumentsMen, in private at the White House, simply because he could. Right now in Kiev, historical statues and art are being burned in front of the world. He was content to remain silent and watch a movie about it happening instead. The real world does not interest this President. The set design does. @BarackObama (an account run by these people) can tweet about House of Cards spoilers but can’t be bothered to tweet a statement condemning violence from the regimes in Kiev and Venezuela and sign it -bo. Meanwhile, millions of people globally are captivated by the live streams and Hugoesque images of revolution. If Barack Obama is going to be a viral President, he can’t simply sit out the events of the world as he chooses. The coming-out announcement of a former college and future NFL football star cannot outweigh events that have captivated the entire world for two days and cost people their blood and lives. The same goes for his complex of dying network media defenders. If Occupy Wall St.
and their twisted logic deserve prime-time coverage by NBC, CBS and ABC, why not pro-democracy rallies happening simultaneously in two different corners of the world? Instead, we’re stuck with Sunday morning show hosts drooling over Kevin Spacey. The networks become no different than the governments of those nations instituting blackouts. The desperate attempt to remain relevant and cool is now accompanied by a heavy vacuum of leadership in exactly the one country on Earth that cannot afford one.

This is the eventual dilemma with electing a President under the superstar-celebrity, media-driven premise that Barack Obama was. Remember, in 2008 this was not about electing another American politician to the Presidency. This was a citizen of the world, drawing outdoor crowds of over 70,000 people. This was a transcendence, not an election. Oprah cried. Candidate Obama stepped up in front of the world and told us that this was the time we all began to heal. All over the world, the same nations that yesterday believed this mysterious Kryptonian figure now ignore him for the thrill of a Molotov cocktail. Nations all over the world have rejected his movie speeches of peace and prosperity for his more clear, instinctual desires of social upheaval. Why is this President, elected because of a hope he promised in front of the world, so content to sit it out instead? Is he bored? Does massive public upheaval not interest him if it does not serve his political ends? Does it remind him that everything is, in fact, not awesome?

I wrote about this very thing last August, when riots escalated in Egypt and the Administration’s top officials were all on vacation except for Joe Biden, who was talking to a camel. I’m forced to raise it again because of the current calamities in Ukraine and Venezuela, and once again the country is forced to reckon with a disinterested President, completely unengaged with the moment as it unfolds in front of the rest of us.
Instead he’s focusing on the weather, truck fuel and George Clooney. Once again, on opposite sides of the globe, thousands of people took to the streets of their respective nations demanding change from the heavy-handed governments they have been suffering under. Right between these two uprisings sits the United States, with a historic American President who has a Nobel Peace Prize sitting on his mantle collecting dust. And he’s watching HBO. Railing about growing income inequality one day, while hosting a lavish State Dinner for an open socialist the next. Two men committed to taxing the success right out of their countries while celebrating their own in luxury, all while our media commands us to #BOWDOWN to the queen.

Optics matter. Flying to California to blame a drought on global warming right before spending the entire afternoon golfing in it, while also using a separate airliner to fly bills for him to sign, matters. Remaining silent on casualties of uprisings not aligned with our President’s personal philosophies matters. Thousands of people right now in Venezuela and Kiev are risking their lives – 140 characters at a time. Barack Obama uses his Twitter account to talk about magnets. His personal statement on Kiev finally came yesterday while visiting Mexico, warning the Ukrainian Government, “There will be consequences if people step over the line” – four days after the violence had reached its zenith, resulting in the reported deaths of 25 people in Kiev and nine in Venezuela. What line exactly is left to be crossed, Mr. President? He’s content to ride the worst of these situations out until the very end, when he can issue his stern warning and then claim credit for tempering the situation. Until then – hey, Girls is on! If these examples seem absurd, it’s because they are, and no one should have to be making them. But over and over again, we are. Barack Obama went in front of the world and told it that he would heal it.
Six years later, there is not a single corner of the Earth that is better off than when he spoke those words. Not even Winterfell. – SM – From Reddit: Independence Square in Kiev before and after government clashes with protesters.
Opponents of Senate Bill 4 — which would outlaw “sanctuary” jurisdictions in Texas — stage a protest at the State Insurance Building in Austin on May 1, 2017. (Bob Daemmrich for The Texas Tribune)

by Julián Aguilar | The Texas Tribune

About 50 protesters took over the lobby of the State Insurance Building on the grounds of the Texas Capitol on Monday to protest Senate Bill 4, a measure that would outlaw “sanctuary” jurisdictions in Texas and that passed out of the Texas House last week. The insurance building houses several of Gov. Greg Abbott’s administrative offices, including the human resources, homeland security grants and criminal justice divisions. Abbott has urged lawmakers to pass a bill this session that bans "sanctuary cities" — places where local officials do not fully cooperate with federal immigration authorities.

The morning began with a modest gathering at the south entrance of the Capitol. Demonstrators made clear their intent to keep fighting the bill, even if it meant possible civil disobedience. Then the protesters told reporters to stay tuned for something more direct and began the march to the east side of the Capitol grounds.

The bill would make department heads like sheriffs, constables, police chiefs and local leaders subject to a Class A misdemeanor if they don’t cooperate with federal authorities and honor requests from immigration agents to hold noncitizen inmates subject to deportation. It also provides civil penalties for entities in violation of the provision that begin at $1,000 for a first offense and climb to as high as $25,500 for each subsequent infraction. Though protesters had been adamantly opposed to the bill from the start, the issue took on a new dimension last week after the Texas House amended the bill and added a more controversial measure. The amendment allows police officers to question a person’s immigration status during a detainment, as opposed to being limited to those under lawful arrest.
Democrats and immigrant rights groups argue this makes the bill "show-me-your-papers"-type legislation, where police will be able to inquire into status during the most routine exchanges, including traffic stops. “What we're hopeful for is that communities around Texas are not just going to just lay down and accept this,” said Bob Libal, the executive director of Grassroots Leadership, an Austin-based immigrant rights group and private prison watchdog. Libal said the end goal of the protests is for Abbott to veto the bill, a highly unlikely outcome since the governor declared the issue an “emergency” item shortly after the current legislative session began. Abbott’s office did not immediately respond to a request for comment.

Shortly after arriving in the lobby, protesters sat down in front of the building’s east entrance, barring employees from entering through that part of the building. Then half of that group moved to the west entrance and blocked that door. Attendees chanted “this entrance is closed” as employees tried to enter the ground floor. When asked if the protesters could be arrested, a DPS officer on the scene referred all questions to the agency’s public information office, where a spokesperson said the agency is "continuing to monitor the situation."

Demonstrator Norma Herrera, a 29-year-old former staffer for the state’s Legislative Budget Board, said on Monday that every day she worked for the state “now felt like a lie.” “It’s our job to tell them they’re fired and they can go to hell,” she told the group. As Monday wore on, Herrera, a former undocumented immigrant who currently holds legal residency status, said she knows she could eventually be arrested. “It’s a possibility, but if that’s the case, I know that I’m doing it because it’s my moral obligation to resist unjust laws,” she said. It's unclear what the final version of the bill will look like.
After the House's action last week, the bill now heads back to the Senate, where lawmakers there can accept the changes and send it along to the governor's desk or reject the House version and call for a conference committee in which members from both chambers will meet to iron out the differences. As of 1 p.m., the protesters were still in the lobby. One gave a brief update to the crowd. “We’re not leaving,” she said. The Texas Tribune is a nonprofit, nonpartisan media organization that informs Texans — and engages with them — about public policy, politics, government and statewide issues.
International space law and the basis of space regulation

Space law has its origins in the treaties and principles established by the UN Committee on the Peaceful Uses of Outer Space (COPUOS), a committee set up following the Soviet Union’s 1957 launch of the world’s first ever satellite, ‘Sputnik’. As early as 1962, the first set of principles on outer space was agreed, adopted by the UN General Assembly as the “Declaration of Legal Principles” for space activities. These principles were elaborated upon in the UN Outer Space Treaty of 1967 and in subsequent UN treaties, resolutions, and principles. Currently, over one hundred states are party to the Outer Space Treaty.

For regulators, the most salient points of the UN space treaties are:

- the use of space must be exclusively for peaceful purposes
- space must be accessible to all countries and used for the benefit of all countries
- each state is internationally responsible and liable for its space activities, including activities carried out by non-governmental entities of that state
- each state must authorise and continuously supervise the space activities of its non-governmental entities
- each state must maintain a register of space objects it launches and furnish details regarding the orbital parameters and basic function of each space object to the UN
- each state must, in conducting, authorising, or supervising its space activities, avoid harmful contamination of outer space

States have often sought to secure compliance with their international obligations by introducing national legislation and regulations. Typically, national legislation confers licensing or authorisation powers on the state. The state then uses these powers to determine the requirements a licensee has to meet and to set out the obligations of each licensee.

The Outer Space Act 1986

The UK is one of the pioneering nations in space activity.
Today it has a leading reputation in a number of key space sectors, both in the scientific and commercial arenas. The UK is also a signatory to four UN space treaties, and takes seriously its international commitment to the safe, responsible, and sustainable use of space; only with this global commitment can the world continue to exploit and enjoy the unique opportunities that space offers. The Outer Space Act 1986 (“OSA”) is the legal means by which the UK regulates the use of outer space. The purpose and provisions of the Act stem from the obligations that the UN space treaties place on state parties.

Purpose and remit of the Outer Space Act

The Outer Space Act is the basis for the regulation of activities in outer space carried out by organisations or individuals established in the United Kingdom, or in one of its overseas territories or Crown dependencies. It confers licensing and other powers on the Secretary of State for Business, Energy and Industrial Strategy, who acts through the UK Space Agency to exercise these powers.
The OSA seeks to:

- ensure compliance with the UK’s various obligations under international treaties and principles covering the use of outer space, including liability for damage caused by space objects, the registration of objects launched into outer space and the principles for the remote sensing of the Earth
- ensure that space activities do not jeopardise public health or the safety of persons or property
- ensure that space activities licensed by the UK do not undermine national security
- manage the risk of claims for third-party damage being brought against the UK Government, and transfer some of that liability from the UK Government and taxpayers to the licensed organisation or individual whose space activities caused the third-party damage

An OSA licence is required for the following activities:

- launching or procuring the launch of a space object
- operating a space object
- any activity in outer space

It is an offence for a person to whom the Act applies to carry out a licensable activity without a valid licence (OSA, s. 12). The 1986 Act was amended in 2015 to introduce a limit to the operator’s indemnity to the UK Government for third-party claims brought against the UK. The Act has also been extended, through Orders in Council, to apply with modifications to Crown dependencies and overseas territories. Fees regulations have also been made under powers in the Act.

For more information about OSA licensing, such as what to expect from the licensing process and the obligations of licensees, please visit the ‘Applying for a licence’ section.

2015 Amendment to the Outer Space Act

Section 12 of the Deregulation Act 2015 amended the Outer Space Act to make it obligatory for a licence to specify the maximum amount of a licensee’s liability to indemnify the Government in respect of activities authorised by the licence. The following archived material contains further details on the 2015 amendment.
Impact Assessment: Review of the Outer Space Act (1986) (PDF, 677KB, 46 pages)

This file may not be suitable for users of assistive technology. Request an accessible format. If you use assistive technology (such as a screen reader) and need a version of this document in a more accessible format, please email enquiries@bis.gsi.gov.uk. Please tell us what format you need. It will help us if you say what assistive technology you use.

Consultation: Reform of the Outer Space Act 1986 (PDF, 91.1KB, 12 pages)

Government response: Outer Space Act consultation (PDF, 182KB, 13 pages)

Applying for a licence

Process and application form

UK nationals and UK companies intending to launch or procure the launch of a space object, operate a space object, or carry on any other activity in outer space should make themselves familiar with the provisions of the Outer Space Act 1986, plus the amendments made to the Act by the Deregulation Act 2015. Unless acting as an employee or agent of another organisation, you need to apply for a licence at least six months in advance of carrying out the licensable activity. In certain circumstances it may be possible to process an application in a reduced timescale, although no guarantees can be given.
Applications should be made using the licence application form below. There is flexibility in the UK approach to licensing, and the UK Space Agency encourages potential applicants to contact them as early as possible to discuss the best way forward and solution for their mission. Please get in touch at regulation@ukspaceagency.gov.uk. Please read the ‘Guidance for Applicants’ document carefully before submitting an application.

OSA application form (MS Word Document, 330KB)

Example of an OSA licence (where the indemnity cap applies) (PDF, 135KB, 8 pages)

Guidance for applicants (PDF, 346KB, 19 pages)

Outer Space Act Database of Standards (PDF, 157KB, 23 pages)

The Traffic Light System of licensing

The Traffic Light System (TLS) for licensing is a pre-application stage that has been developed in response to suggestions from operators. Based on a small subset of information, it gives prospective licence applicants a pre-application Red/Amber/Green rating, which indicates the likelihood of a licence being granted. Depending on the mission, the Traffic Light System may also provide a shorter, tailored application form to help streamline the licensing process.

The purpose of the TLS is threefold:

- to help less experienced operators understand the need for safety, security, and sustainability, as reflected in the UK’s licensing process
- to advise operators of the likelihood of application success before they submit a full application, and, where possible, to advise them on what modifications would be required to bring their proposed mission in line with the UK Space Agency’s licensing requirements
- where possible, to provide a smoother, more tailored application process that is more proportionate to the scale and risks of a particular mission type

Some streamlining of the process is also possible in the case of repeat applications (applications from the same operator for identical or near-identical missions), reducing the administrative burden on the applicant. The Traffic Light rating is based on the operator’s responses to a short set of questions. These responses provide the UK Space Agency OSA Licensing team with the basic technical details – the what, when, why, how, and where – of the planned mission. The ratings indicate three likely, but non-binding, outcomes of the application.
The three ratings are:

- ‘Green’: the mission is deemed sufficiently low-risk, and it is likely that the licence application will be successful
- ‘Amber’: either more information is needed for the UK Space Agency to understand the risks, or modifications to the mission would be required before it can be rated ‘Green’
- ‘Red’: the mission is deemed unsafe or too high-risk, and the Licensing team would advise the operator not to apply and to reconsider its mission plans

Prospective applicants are not obliged to use the Traffic Light System and can submit an application for an OSA licence without having first received a Traffic Light rating. However, we recommend that new operators, and all operators hoping to launch new types of missions, make use of the Traffic Light System. If you have any queries regarding the Traffic Light System, please contact the OSA Licensing team at regulation@ukspaceagency.gov.uk.

Traffic Light System: 10 questions (MS Word Document, 16.8KB)

Obligations of licensees

Once a licence has been granted, licensees are obliged to:

- permit reasonable access to documents, and inspection and testing of equipment and facilities, by the UK Space Agency or its advisors as appropriate
- inform the UK Space Agency of any planned change to the licensed activity (e.g. change of orbit, change of owner) and seek approval prior to the change being made
- prevent contamination of outer space and adverse changes in the environment of the Earth
- avoid interference with the space activities of others
- avoid jeopardising public health or the safety of persons or property
- avoid any breach of the UK’s international obligations
- preserve the national security of the UK
- indemnify the UK Government for any claims for third-party damage brought against the Government which arise from each licensed space activity
- in most cases, insure themselves against third-party liabilities arising from each licensed activity, with the UK Government named as an additional insured; insurance should cover the launch and in-orbit phases of the mission (for each licence application, a risk assessment will be performed to consider the potential risks posed by the mission, and a commensurate level of insurance cover will be determined – for more information about insurance requirements, please visit the insurance section)
- dispose of the licensed space object appropriately at the end of the licensed activity, and inform the UK Space Agency of the disposal and the termination of the activity

Space liability and insurance requirements

Background

The 1972 UN Convention on International Liability for Damage Caused by Space Objects (“the Liability Convention”), which the UK has ratified, is the foundation for space liability regimes worldwide. Under the Convention, a “Launching State” is internationally liable for damage arising out of its space activities to i) objects owned by nationals of another state, and ii) nationals of another state.

A Launching State is defined in Art. I of the Convention as:

- a State that launches or procures the launching of a space object
- a State from whose territory or facility a space object is launched

There can be, and often is, more than one Launching State.
Damage caused on the surface of the Earth or to an aircraft in flight carries absolute liability (Art. II). For damage caused elsewhere, or to a space object, the Launching State is liable “only if the damage is due to its fault or the fault of persons for whom it is responsible” (Art. III). The Convention has significant implications for how states regulate space activities, and many states have sought to manage or offset some of their liability for non-governmental space activities. Typically, this is done by minimising the risk of a collision or third-party damage in the first place (through a state’s licensing and compliance procedures); by requiring an indemnity from licensees for claims presented to the state; and by requiring a minimum level of third-party liability insurance cover as a condition of the licence, to better protect both the operator and the government of that state from such claims.

The UK’s space liability and indemnity regime

Under section 10 of the Outer Space Act 1986 (OSA), operators must indemnify the UK Government for claims brought against the latter, other than in the circumstances set out in that section. The operator’s indemnity to the UK Government is a vital part of the UK’s overall space risk management approach, helping the UK offset some of the liability that it incurs on behalf of satellite operators. Up until 2015, this indemnity was unlimited: if a claim was brought against the UK Government, the Government could seek recovery from the operator, however large the claim. The OSA was amended by the Deregulation Act 2015. Henceforth, all licences issued must state a limit to the operator’s liability to indemnify the UK Government for claims made against the latter. This means that, if a third-party claim is brought against the UK Government, the operator would be liable to indemnify the Government up to a limit, and the UK Government would meet the remaining liability.
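As a quick numeric illustration of this capped indemnity, the split of a third-party claim between the operator and the Government can be sketched as follows. This is a minimal sketch, not an official calculation: the function name is invented, and the €60 million default is only the normal cap for standard missions, with the actual limit set per licence by the UK Space Agency.

```python
def split_claim(claim_eur, indemnity_limit_eur=60_000_000, licensed=True):
    """Illustrative split of a third-party claim between the operator
    (via the indemnity) and the UK Government under the capped regime.

    For unlicensed activities the operator's indemnity is unlimited,
    so the operator bears the whole claim.
    """
    if not licensed:
        return claim_eur, 0
    # Operator indemnifies the Government up to the cap; the
    # Government meets any remaining liability above it.
    operator_share = min(claim_eur, indemnity_limit_eur)
    government_share = claim_eur - operator_share
    return operator_share, government_share

# A EUR 100m claim on a licensed standard mission: the operator
# indemnifies up to the EUR 60m cap, the Government meets EUR 40m.
print(split_claim(100_000_000))  # (60000000, 40000000)
```

A claim below the cap is borne entirely by the operator; only claims above the cap reach the taxpayer, which is the risk-sharing point of the 2015 amendment.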
The limit on the liability to indemnify the Government (“the indemnity limit”), also referred to as the indemnity ‘cap’, is set out in licence conditions and determined by the UK Space Agency on a case-by-case basis. The UK Space Agency has full discretion to vary the indemnity limit for each licence, depending on the risks associated with that mission. It should be noted that each satellite is licensed individually, so in effect the indemnity limit is set on a per-satellite basis. For ‘standard missions’, the indemnity limit will normally be €60 million for each licence issued. This means that, if a third-party claim is brought against the UK Government, the operator would be liable to indemnify the Government up to €60 million, and the UK Government would meet the remaining liability. For missions deemed by the UK Space Agency to be higher-risk, the indemnity limit may be set at a higher level. We will engage closely with prospective licensees to inform them of the likely indemnity limit for their licensed activities.

It should be noted that the indemnity limit applies only to licensed activities: if damage is caused by unlicensed activities (i.e. activities not expressly authorised by the UK Space Agency), then the operator’s liability to indemnify the Government is without limit. The indemnity limit has important interactions with the amount of third-party liability insurance that we require operators to hold. Please see the ‘third-party liability insurance’ section for further details.

Third-party liability insurance

In order to better protect both operators and the UK Government from third-party claims, the UK Space Agency may also require licensees to hold third-party liability (TPL) insurance for regulated activities, which currently consist of in-orbit operations and the procurement of overseas launches. The UK Government should be named on the insurance policy as an additional insured.
The minimum insurance cover required is determined at the discretion of the UK Space Agency; operators are free to hold more cover than the UK Space Agency requires.

Launch insurance

For launch, in the majority of cases involving single-satellite missions employing established launchers, this insurance cover would be limited to €60 million.

In-orbit insurance

On 1 October 2018, the UK Space Agency introduced a new approach to its in-orbit third-party liability insurance requirements. The changes were introduced in response to feedback from the space industry and in recognition of a rapidly diversifying space sector. Such a diversification of technology and risk meant that the previous model of requiring €60 million of TPL insurance in all cases was no longer appropriate. The key changes, explained further in the Fact sheet: new requirements for in-orbit Third Party Liability insurance (PDF, 434KB, 6 pages), are:

- Requiring TPL insurance on a per-occurrence basis, rather than a per-satellite basis. This means that, for operators of more than one satellite, the UK Space Agency may allow all of that operator’s satellites to be covered under a single ‘any one occurrence’ TPL insurance policy. In essence, this would function as a fleet TPL insurance policy.
- For low-risk smallsat missions launched to an operational altitude below that of the International Space Station, the UK Space Agency may waive the TPL insurance requirement. The operator’s indemnity limit will remain €60 million per licence (with each satellite licensed individually).
- For standard missions (see definition in the Q&A section), the TPL insurance requirement remains the same as before – €60 million. Where an operator has more than one standard mission, the UK Space Agency may allow all of the operator’s standard-mission satellites to be covered under a single €60 million ‘any one occurrence’ insurance policy.
- After a certain number of satellites have been launched by that operator, the UK Space Agency may offer the operator the option to add an aggregate (an overall limit) to their per-occurrence TPL insurance policy. The UK Space Agency would determine the aggregate to be applied in such cases. The operator's indemnity to the UK Government will continue to apply in respect of each licence; in most cases involving standard missions, the indemnity limit will be set at €60 million per licence (with each satellite licensed individually).
- For higher-risk licensable missions (see definition in Q&A), the UK Space Agency may require a higher per-occurrence level and/or a higher aggregate, depending on the risks of each mission. These requirements will be considered on a case-by-case basis, and set following an appropriate risk assessment. The UK Space Agency will take into account the capacity of the insurance market when setting its insurance requirements. The operator's indemnity to the UK Government will continue to apply for each licence; in most cases involving higher-risk missions, the indemnity limit would continue to be covered by the TPL insurance requirement, to minimise the risk of an operator carrying an uninsured liability to the UK Government.

The above points should be taken as indicative only. The UK Space Agency will, through its risk assessment procedures, determine the amount required for both the per-occurrence level and, where appropriate, the aggregate level.
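As an illustration of how a per-occurrence policy with an aggregate behaves (a sketch with assumed figures; `policy_payouts` is a hypothetical helper, and the actual levels are set by the UK Space Agency), each occurrence is paid up to the per-occurrence limit, and once total payouts reach the aggregate the policy is exhausted:

```python
def policy_payouts(claims_eur, per_occurrence_eur, aggregate_eur=None):
    """Model an 'any one occurrence' TPL policy, optionally capped by an
    aggregate on total payouts across all occurrences. Illustrative only."""
    payouts = []
    paid_total = 0
    for claim in claims_eur:
        # Each occurrence is covered up to the per-occurrence limit...
        payout = min(claim, per_occurrence_eur)
        # ...and, if an aggregate applies, only while headroom remains.
        if aggregate_eur is not None:
            payout = max(min(payout, aggregate_eur - paid_total), 0)
        payouts.append(payout)
        paid_total += payout
    return payouts

# Three occurrences against a €60m per-occurrence policy with a €100m aggregate:
# €60m is paid on the first, €40m remains for the second, nothing for the third.
print(policy_payouts([70e6, 50e6, 30e6], 60e6, 100e6))
```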
In setting its TPL insurance requirements, the UK Space Agency will consider, amongst other factors:

- the heritage and reliability of the technology;
- the orbital parameters;
- the contingency plans and redundancy of the planned mission;
- the manoeuvrability of the satellite and the capacity for it to be tracked;
- the estimated value of satellites in nearby orbits;
- the orbit-raising and de-orbiting plans, including the value of satellites that may be encountered during the procedures;
- the operational practices followed by the operator;
- the performance of similar space systems on orbit.

The licence applicant will be kept closely informed about the likely TPL insurance requirements for their mission. We may be able to give an early, non-binding indication of the likely TPL insurance requirements at the pre-application Traffic Light stage.

Fact sheet: new requirements for in-orbit Third Party Liability insurance (PDF, 434KB, 6 pages)

This file may not be suitable for users of assistive technology. Request an accessible format. If you use assistive technology (such as a screen reader) and need a version of this document in a more accessible format, please email enquiries@bis.gsi.gov.uk. Please tell us what format you need. It will help us if you say what assistive technology you use.

Q&A

When did the changes to the UK Space Agency's setting of insurance requirements come into force?

The UK Space Agency implemented the new approach to insurance requirements from 1 October 2018. No legislative change was required, as the Outer Space Act 1986 enables the UK Space Agency to enact these changes as policy initiatives.

What is a 'standard mission'?

Standard missions represent very low and well-characterised third-party risks. For licensing purposes, we define a standard mission as a mission involving a single satellite employing an established launcher, a proven satellite platform, and recognised operational practices.
Ultimately, it is the UK Space Agency that decides whether or not a particular mission is 'standard', based on information conveyed in the licence application. A standard mission will likely carry with it a €60 million indemnity limit. We will in most cases require that a standard mission is covered by a €60 million 'any one occurrence' third-party liability insurance policy. We may also allow an operator to place all satellites that count as 'standard missions' onto a single 'any one occurrence' insurance policy. Please see the TPL insurance section for more information.

What is a 'higher-risk mission'?

Higher-risk missions are licensable missions that:

i) are novel in nature or scale; and/or
ii) use techniques, technologies and/or systems which are unproven; and/or
iii) present a higher risk of higher-value third-party liability claims; and/or
iv) present third-party risks that are not well-characterised.

It should be noted that novelty in itself will not automatically render a mission 'higher-risk'. As with all licensing decisions, the UK Space Agency will take a holistic view when determining whether a proposed mission is 'standard' or 'higher-risk'. The UK Space Agency may require operators of higher-risk missions to hold more third-party liability insurance than the standard €60 million requirement. Higher-risk missions may also be subject to an indemnity limit that is higher than the standard €60 million per licensed satellite. In each case, the UK Space Agency will keep prospective operators informed of their likely indemnity limit and insurance requirements.

What kind of factors will the UK Space Agency take into account when determining the indemnity limit and the insurance requirements?
In setting its TPL insurance requirements, the UK Space Agency will consider, amongst other factors:

- the heritage and reliability of the technology;
- the orbital parameters;
- the contingency plans and redundancy of the planned mission;
- the manoeuvrability of the satellite and the capacity for it to be tracked;
- the estimated value of satellites in nearby orbits;
- the orbit-raising and de-orbiting plans, including the value of satellites that may be encountered during the procedures;
- the operational practices followed by the operator;
- the performance of similar space systems on orbit.

The licence applicant will be kept closely informed about the likely TPL insurance requirements for their mission. We may be able to give an early, non-binding indication of the likely TPL insurance requirements at the pre-application Traffic Light stage.

Where can I find likely indemnity limits alongside likely third-party liability insurance requirements for various types of missions?

The indemnity limit will continue to be set on a per-satellite basis. Please see the table on page 4 of the Fact sheet: new requirements for in-orbit Third Party Liability insurance (PDF, 434KB, 6 pages), which contains both indicative TPL insurance requirements and indicative indemnity limits. Please note that the table provides an indication only; the UK Space Agency retains full discretion to set both the indemnity limit and the TPL insurance requirements appropriately.

How does the indemnity limit work for third-party claims brought directly against the operator rather than against the UK Government?

The limit to the operator's liability to indemnify set out in the licence applies only to claims brought against the Government, a point on which the Outer Space Act 1986 is clear. Operators take on full liability for any claims brought against them and must consider their TPL insurance cover accordingly. Further, the indemnity limit applies to licensed activities only.
If damage is caused to a third party through unlicensed activities (i.e. activities not expressly authorised by the UK Space Agency's OSA Licensing team), then the operator's liability to indemnify the UK Government for claims brought against the Government is without limit.

When will I be informed by the UK Space Agency what the insurance requirements are for my mission?

The UK Space Agency may provide an early indication of the likely minimum third-party liability insurance requirements as part of the pre-application Traffic Light assessment. However, this will be a guideline only, and may change in light of the UK Space Agency's assessment of the full licence application. This will be made clear to each prospective applicant. Once an application is submitted, the UK Space Agency will commence its detailed assessments of the mission. If the UK Space Agency considers that the indicative third-party liability insurance requirement conveyed at the pre-application stage should change, the operator will be promptly informed.

What documentation does the UK Space Agency require from operators for the purposes of TPL insurance requirements?

The UK Space Agency will need to see evidence that the operator holds insurance that meets the minimum requirements. This means the UK Space Agency will need to review both the insurance policy certificate and the policy wording (including exclusions and definitions), and any other documentation that may affect the TPL insurance policy. As per current practice, the UK Space Agency will seek specialist advice on any of these documents from its retained insurance advisors, on a strict commercial-in-confidence basis.

If a collision involving one of my satellites should occur, would the insurance requirements from the UK Space Agency change?
In the case of a collision, the UK Space Agency may suspend licensing activities for that operator while it undertakes a detailed assessment of the factors that led to the collision, as well as its consequences. The UK Space Agency would remain in close dialogue with the operator, and with appropriate national or international entities, to try to better understand the event. If licensing were to resume, the minimum TPL insurance requirements for missions from that operator would partly depend on the outcome of the UK Space Agency's assessment of the cause of the collision.

Compliance and Monitoring

The UK Space Agency carries out annual checks on all UK-licensed spacecraft to meet its obligations under the UN treaties to monitor and supervise the activities of its nationals, most notably Article VI of the Outer Space Treaty:

"States Parties to the Treaty shall bear international responsibility for national activities in outer space, including the moon and other celestial bodies, whether such activities are carried on by governmental agencies or by non-governmental entities, and for assuring that national activities are carried out in conformity with the provisions set forth in the present Treaty. The activities of non-governmental entities in outer space, including the moon and other celestial bodies, shall require authorization and continuing supervision by the appropriate State Party to the Treaty. [...]"

Our supervisory activities include monitoring the health of the spacecraft. All licensed operators are required to provide information regarding their satellites by way of a 'Health Check', at least annually. We require only subsystem-level reporting, but welcome exception reporting at lower levels. The UK Space Agency has introduced measures to help streamline its compliance and monitoring procedures: we use a Red-Amber-Green health check form, which is emailed to the operator annually.
In addition, we request a copy of the third-party liability insurance policy upon its renewal each year, so that we can assess the policy and ensure that it continues to meet our requirements.

Form: OSA Health Check (PDF, 176KB, 1 page)

UK Registers of Space Objects

- UK registry of outer space objects
- Supplementary registry of space objects

The UK Space Agency OSA Licensing team welcomes early engagement with prospective operators, and strongly encourages individuals or organisations thinking about applying for an OSA licence to get in touch as early in the mission planning as possible. We also welcome any queries or comments about our licensing and regulatory regime. Please get in touch at regulation@ukspaceagency.gov.uk.
(CNN) Call it a pop-up alliance. After spending much of this year berating each other after Turkey shot down a Russian jet over the Syrian-Turkish border, the two governments are suddenly the "honest brokers" of a ceasefire in Syria -- one that is designed to lead to political negotiations. The United States, which has long championed the stuttering diplomatic process on resolving the Syrian conflict, is nowhere to be seen. Russian President Vladimir Putin declared that the ceasefire was only the first step, with other documents signed on enforcing the truce and beginning peace talks. The Syrian military promised to cease operations nationwide at midnight Thursday. Here's how the deal looks.

Russia, Turkey in driving seat

Russia and Turkey are now driving what had been a UN-led political process. Each is responsible for bringing its own allies into the process: the Russians will bring the Assad regime on board and the Turks as many moderate factions as they can coax or cajole. Both sides envisage a rapid timeline, with the Turkish Foreign Ministry saying the Assad regime and opposition would meet soon in Kazakhstan, according to Turkish state media. Plenty can still go wrong, and recent history gives little cause for optimism. Putin acknowledged that "all the agreements reached are very fragile." Turkish Foreign Minister Mevlut Cavusoğlu said Thursday that details on how to monitor the ceasefire and apply sanctions against those who breached it were still being worked out. And he insisted there would be no direct negotiations between Turkey and the Syrian government. Putin announced the Syrian ceasefire in a meeting with Defense Minister Sergei Shoigu and Foreign Minister Sergei Lavrov at the Kremlin. But the intent is clear: peel off moderate rebel groups from the tacit alliances they have formed with radical Islamist groups in parts of Syria. Then crush the militant groups excluded from the process.
And this is where, inevitably, things get complicated. The Russian Defense Ministry said influential Islamist groups such as Ahrar al-Sham and Jaysh al-Islam have signed up to the process. But the Syrian army has asserted that groups linked to Fateh al-Sham, of which Ahrar al-Sham has been the most prominent, will be excluded from the deal. One source in Ahrar al-Sham, which receives extensive support from Turkey, acknowledged that it is involved in the negotiations. But late Thursday the group said on its Twitter feed that it "has reservations about the proposed agreement" and had not yet signed. Its ultimate decision will be important. If Ahrar does sign up, Jabhat Fateh al-Sham will be isolated and vulnerable.

Obama administration ignored

The timing of the deal is critical. The Russian-Turkish entente has exploited the political transition in Washington, ignoring the Obama administration in its dying days and betting that the Trump administration will accept a process already under way. Russian Foreign Minister Sergey Lavrov has already issued the invitation. "I would like to express the hope that as soon as the administration of Donald Trump takes office, they will also be able to join these efforts," he said during a meeting with Putin in Moscow.

[Photo gallery: The battle for Aleppo in 20 photos]
Sergey Aleksashenko at the Brookings Institution says "Trump may believe that there isn't a clear alternative at this point, since the Syrian opposition has not been able to unify or put forward a single leader." The President-elect has expressed doubts about the reliability of moderate groups previously supported by the Obama administration and suggested his administration would consider joining Russia in the battle against ISIS in Syria.

Why Turkey embraced Russia

The critical event in bringing about the new entente between Moscow and Ankara was the fall of Aleppo. The Assad regime -- along with its Russian and Iranian sponsors -- turned the tide of the civil war in seizing complete control of the largest city, or what remained of it. Rebel groups -- some of them supported by Turkey and the Gulf States -- retreated in disarray. Their evacuation was negotiated between Russia and Turkey. Vladimir Putin held all the best cards, and it seems that Ankara decided to cash in its chips. It was not abandoning its allies among the Syrian factions, but refocusing its efforts on creating a "Turkey-friendly" region in northern Syria. In the process, President Recep Tayyip Erdogan has sought to kill two birds with one stone. Turkey has long been exasperated by US support for the Syrian Kurdish militia known as the YPG. "Up to now, the US has given weapons to the YPG, full stop," Cavusoğlu said Thursday.
Washington sees the YPG as its battering ram against ISIS in northern Syria. Turkey sees it as a terror organization intimately linked to its own Kurdish insurgents, the PKK. A CNN team inside northern Syria earlier this year saw plenty of evidence of those links, including large posters of the PKK leader Abdullah Ocalan in many Kurdish-dominated towns. The United States has denied supplying the YPG with weapons but says it provides tactical support to the Syrian Democratic Forces, the umbrella group in which the YPG is the main component. One sign of the new cooperation between Russia and Turkey is the introduction of Russian air power this week to help groups allied to Turkey rid the strategic town of al-Bab in northern Syria of ISIS fighters. Another is the establishment of a hotline to monitor breaches of the new ceasefire.

Assad remains a hurdle

The Russia-Turkey axis may be a marriage of convenience, and there are some major hurdles to overcome. Turkey wants the Lebanese Shia militia Hezbollah -- a key military supporter of the Assad regime -- out of Syria. That will not sit well with Iran, which is also sponsoring the ceasefire process. Nor has Turkey abandoned its long-term position that President Bashar al-Assad must step down. That also remains the position of the Free Syrian Army, whose representative Abu Zeid said Thursday: "The fact that the negotiations are based on Geneva 1 [the UN-sponsored peace process] means that there is no place for Assad in Syria's future." In addition, some fighters with Islamist groups -- especially in Ahrar al-Sham -- may opt to join the resistance rather than lay down their arms. And the Gulf states -- especially Qatar and Saudi Arabia -- may decide to double down on their support for groups opposed to negotiations. Abu Zeid, who took part in the negotiations on behalf of the opposition, insisted Thursday: "Our message to the Syrian population is that our fingers will remain on the trigger."
But the truth is that armed and political opposition to the Syrian regime, shorn of its support from Ankara, has rarely been weaker since the uprising against Bashar al-Assad began five years ago. And the most influential actors on the ground in Syria -- Russia, Turkey, Iran and the Syrian regime -- are, for now, all on the same page.
NOTE: If you intend on using techniques such as these and allowing such wide-open functionality in Jenkins, I recommend that you run your entire Jenkins build system without outbound internet access by default. Allow access from the build system network segment only to approved endpoints while dropping the rest of the traffic. This will allow you to use potentially dangerous, but extremely powerful, scripts while maintaining a high level of security in the system.

API Calls

Making HTTP calls in a Jenkinsfile can prove tricky, as everything has to remain serializable or else Jenkins cannot keep track of the state when it needs to restart a pipeline. If you've ever received Jenkins' java.io.NotSerializableException error in your console output, you know what I mean. There are not too many clear-cut examples of remote API calls from a Jenkinsfile, so I've created an example Jenkinsfile that will talk to the Docker Hub API using GET and POST to perform authenticated API calls.

Explicit Versioning

The Docker image build process leaves a good amount to be desired when it comes to versioning. Most workflows depend on the :latest tag, which is very ambiguous and can lead to problems being swallowed within your build system. In order to maintain a higher level of determinism and auditability, you may consider creating your own versioning scheme for Docker images. For instance, a version of 2017.2.1929 (<year>.<week>.<build number>) can express much more information than a simple :latest. Having this information available for audits, or for tracking down when a failure was introduced, can be invaluable, but there is no built-in way to do Docker versioning in Jenkins. One must rely on an external system (such as Docker Hub or an internal registry) to keep track of versions and use this system of record when promoting builds.
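To make the scheme concrete, here is a small sketch of deriving the <year>.<week>.<build number> string from the ISO calendar (illustrative only; `docker_version` is a hypothetical helper, and in a real Jenkinsfile the build number would come from the build environment):

```python
from datetime import date

def docker_version(build_number, today=None):
    """Build a <year>.<week>.<build number> Docker tag as described above.
    Hypothetical helper for illustration; the week comes from the ISO
    calendar, so the prefix increases over time even if builds reset."""
    today = today or date.today()
    year, week, _ = today.isocalendar()
    return f"{year}.{week}.{build_number}"

# ISO week 2 of 2017, build 1929 -> "2017.2.1929"
print(docker_version(1929, date(2017, 1, 12)))
```

Note that this string sorts correctly as a tuple of integers, not lexicographically, so any promotion tooling should compare the components numerically.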
This versioning scheme we are using is not based on Semver, but it does encode the information we need to keep versions in lockstep, and it will always increase in value. Even if the build number is reset, the year + week will keep the versions from ever being lower than those of the previous day. Version your artifacts however works for your release, but please make sure of these two things:

- The version string never duplicates
- The version number never decreases

Interacting with the Docker Hub API in a Jenkinsfile

For this example we are going to connect to the Docker Hub REST API in order to retrieve some tags and promote a build to RC. This type of workflow would be implemented in a release job in which a previously built Docker image is being promoted to a release candidate. The steps we take in the Jenkinsfile are:

Provision a node
- Stage 1
  - Make an HTTP POST request to Docker Hub to get an auth token
  - Use the token to fetch the list of tags on an image
  - Filter through those tags to find the tag for the given build number
- Stage 2
  - Promote (pull, tag, and push) the tag found previously as ${version}-rc
  - Push that tag to latest to make it generally available

This is a fairly complex-looking Jenkinsfile as it stands, but these functions can be pulled out into a shared library to simplify the Jenkinsfile. We'll talk about that in another post.

Jenkinsfile

```groovy
#!groovy
/* NOTE: This Jenkinsfile has the following pre-requisites:
   - SECRET (id: docker-hub-user-pass): Username / Password secret containing
     your Docker Hub username and password.
   - ENVIRONMENT: Docker commands should work, meaning DOCKER_HOST is set or
     there is access to the socket.
*/

import groovy.json.JsonSlurperClassic // Required for parseJSON()

// These vars would most likely be set as parameters
imageName = "technolog/serviceone"
build = "103"

// Begin our Scripted Pipeline definition by provisioning a node
node() {
    // First stage sets up version info
    stage('Get Docker Tag from Build Number') {
        // Expose our user/pass credential as vars
        withCredentials([usernamePassword(credentialsId: 'docker-hub-user-pass',
                                          passwordVariable: 'pass',
                                          usernameVariable: 'user')]) {
            // Generate our auth token
            token = getAuthTokenDockerHub(user, pass)
        }
        // Use our auth token to get the tag
        tag = getTagFromDockerHub(imageName, build, token)
    }

    // Example second stage tags version as -rc and pushes to latest
    stage('Promote build to RC') {
        // Enclose in try/catch for cleanup
        try {
            // Define our versions
            def versionImg = "${imageName}:${tag}"
            def latestImg = "${imageName}:latest"

            // Login with our Docker credentials
            withCredentials([usernamePassword(credentialsId: 'docker-hub-user-pass',
                                              passwordVariable: 'pass',
                                              usernameVariable: 'user')]) {
                sh "docker login -u${user} -p${pass}"
            }

            // Pull, tag, + push the RC
            sh "docker pull ${versionImg}"
            sh "docker tag ${versionImg} ${versionImg}-rc"
            sh "docker push ${versionImg}-rc"

            // Push the RC to latest as well
            sh "docker tag ${versionImg} ${latestImg}"
            sh "docker push ${latestImg}"
        } catch (err) {
            // Display errors and set status to failure
            echo "FAILURE: Caught error: ${err}"
            currentBuild.result = "FAILURE"
        } finally {
            // Finally perform cleanup
            sh 'docker system prune -af'
        }
    }
}

// NOTE: Everything below here could be put into a shared library

// GET Example
// Get a tag from Docker Hub for a given build number
def getTagFromDockerHub(imgName, build, authToken) {
    // Generate our URL. Auth is required for private repos
    def url = new URL("https://hub.docker.com/v2/repositories/${imgName}/tags")
    def parsedJSON = parseJSON(url.getText(requestProperties: ["Authorization": "JWT ${authToken}"]))

    // We want to find the tag associated with a build
    // EX: 2017.2.103 or 2016.33.23945
    def regexp = "^\\d{4}\\.\\d{1,2}\\.${build}\$"

    // Iterate over the tags and return the one we want
    for (result in parsedJSON.results) {
        if (result.name.findAll(regexp)) {
            return result.name
        }
    }
}

// POST Example
// Get an Authentication token from Docker Hub
def getAuthTokenDockerHub(user, pass) {
    // Define our URL and make the connection
    def url = new URL("https://hub.docker.com/v2/users/login/")
    def conn = url.openConnection()

    // Set the connection verb and headers
    conn.setRequestMethod("POST")
    conn.setRequestProperty("Content-Type", "application/json")

    // Required to send the request body of our POST
    conn.doOutput = true

    // Create our JSON Authentication string
    def authString = "{\"username\": \"${user}\", \"password\": \"${pass}\"}"

    // Send our request
    def writer = new OutputStreamWriter(conn.outputStream)
    writer.write(authString)
    writer.flush()
    writer.close()
    conn.connect()

    // Parse and return the token
    def result = parseJSON(conn.content.text)
    return result.token
}

// Contain our JsonSlurper in a function to maintain CPS
def parseJSON(json) {
    return new groovy.json.JsonSlurperClassic().parseText(json)
}
```

Script Security

Due to the nature of this type of script, there is definitely a lot of trust assumed when allowing something like this to run. If you follow the process we are using in Modern Jenkins, nothing gets into the build system without peer review, and nobody but administrators has access to run scripts like this. With the environment locked down, it can be safe to use something of this nature. Jenkins has two ways in which Jenkinsfiles (and Groovy in general) can be run: sandboxed or un-sandboxed. After reading Do not disable the Groovy Sandbox by rtyler (@agentdero on Twitter), I will never disable the sandbox again.
What we are going to do instead is whitelist all of the required signatures automatically with Groovy. The script we are going to use is adapted from my friend Brandon Fryslie and will basically pre-authorize all of the required methods that the pipeline will use to make the API calls.

Pre-authorizing Jenkins Signatures with Groovy

URL: http://localhost:8080/script

```groovy
import org.jenkinsci.plugins.scriptsecurity.scripts.ScriptApproval

println("INFO: Whitelisting requirements for Jenkinsfile API Calls")

// Create a list of the required signatures
def requiredSigs = [
    'method groovy.json.JsonSlurperClassic parseText java.lang.String',
    'method java.io.Flushable flush',
    'method java.io.Writer write java.lang.String',
    'method java.lang.AutoCloseable close',
    'method java.net.HttpURLConnection setRequestMethod java.lang.String',
    'method java.net.URL openConnection',
    'method java.net.URLConnection connect',
    'method java.net.URLConnection getContent',
    'method java.net.URLConnection getOutputStream',
    'method java.net.URLConnection setDoOutput boolean',
    'method java.net.URLConnection setRequestProperty java.lang.String java.lang.String',
    'new groovy.json.JsonSlurperClassic',
    'new java.io.OutputStreamWriter java.io.OutputStream',
    'staticMethod org.codehaus.groovy.runtime.DefaultGroovyMethods findAll java.lang.String java.lang.String',
    'staticMethod org.codehaus.groovy.runtime.DefaultGroovyMethods getText java.io.InputStream',
    'staticMethod org.codehaus.groovy.runtime.DefaultGroovyMethods getText java.net.URL java.util.Map',
]

// Get a handle on our approval object
approver = ScriptApproval.get()

// Approve each of them
requiredSigs.each { approver.approveSignature(it) }

println("INFO: Jenkinsfile API calls signatures approved")
```
Police have set up checkpoints at ferry piers on the island of Koh Tao and called in reinforcements to mount an intensive manhunt after two foreign tourists were found brutally murdered there on Monday morning. Police on Koh Tao investigate the scene of the brutal murder of two British tourists, whose half-naked bodies were found on Sai Ree beach early Monday. (Photo by Supapong Chaolan) The British man and woman, both 24, were found stripped early Monday near their budget bungalow on Sai Ree Beach, on the Surat Thani island popular for its full-moon parties and scuba diving. Police said no identification was found on the bodies, but described them as a woman from Great Yarmouth, Norfolk, and a man from Jersey, Channel Islands, The Associated Press said. Police suspect that the woman, whose skirt was hiked up to her waist and t-shirt pulled up, was also raped. Four metres away, the male victim was found naked on his back with a wound caused by a blunt blow to the back of his head. A pair of shorts, a t-shirt and a denim skirt were found nearby. About 50 metres from the victims, police found a blood-covered hoe, believed to be the murder weapon. The Bangkok Post is not naming the victims until it can be confirmed their families have been officially notified. Surat Thani provincial police chief Kiattipong Khawsamang said an initial investigation found that the two victims had travelled separately to Koh Tao, where they met while staying at the same seaside hotel. "They went out to a bar and left together after 1am, according to closed circuit TV camera footage,'' he said. Local police official Jakkrapan Kaewkhao told AFP that the woman had three wounds on her face and the man had four wounds on his back. Found around 6:30am, they had probably been slain between 4am and 5am, local media reported.
Police said they had no immediate suspects and were checking closed circuit TV cameras at nearby restaurants, hotels and shops in search of the attackers. But Pol Maj Gen Kiattipong told AFP that the suspect, or suspects, were "probably still on the island." "We don't know who the suspect might be yet but we have talked to different witnesses who might lead us to some clues," he said, adding that the woman was travelling with three other friends. Thai media quoted other Surat Thani officers speculating that migrant workers were responsible, although they provided no evidence to support their assertion. Investigators were also looking for witnesses who might know if the pair attended a nearby late-night beach party that attracted tourists. They also called in reinforcements from the neighbouring island of Koh Phangan. Local media reported that residents told police there had been a beach party for about 50 people, mostly foreigners, on Sunday night that went into the early morning. The double murder has sent shockwaves through the island. "It was the first time this has happened on the island, I have never seen anything like this," said an employee at the seaside resort where the victims were staying. In a statement, the British Embassy in Bangkok said officials are "urgently seeking information from local authorities." "Consular staff stand ready to provide assistance to friends and family at this tragic time," it added. Britain says Thailand is the country where its citizens are second most likely to require consular assistance, behind the Philippines. There were 389 deaths of British nationals in Thailand in the year to March 2013 -- about one for every 2,400 British visitors or residents -- although that figure includes natural causes. It is rare for tourists to be murdered in Thailand, although visitors frequently perish in accidents.
The bodies of two British tourists, a 24-year-old man and 23-year-old woman, were discovered on Koh Tao in Surat Thani province. (Reuters video)
The most important road not taken, however, occurred many months ago, in December of last year, when the president chose to keep his distance from the recommendations of his own fiscal commission. Suppose he had endorsed its broad approach while making it clear that he disagreed about a number of specifics. Suppose further that he had reinforced that message by featuring it in his 2011 State of the Union address and by using it as the framework for his 2012 budget proposal. If he had done so, he would have had a full six months to build support for his “everything on the table” approach and to rally the American people who, as countless surveys have shown, strongly prefer it to the Republicans’ spending-cuts-only strategy. I have heard two arguments against this strategy offered by White House officials. First, it is said the president did not want to step forward until after the Republicans had offered their own budget framework. If Representative Paul Ryan remained true to his principles, he would propose huge cuts in popular programs such as Medicare, generating a public backlash, after which the president could return to the fray in a much stronger position. Well, the president certainly smoked Ryan & Co. out. But what did he gain? As of now, I can’t think of anything. Sure, public approval of the Republican Party is way down. But so are his own numbers. And if the debt ceiling deal reflects a weakened Republican Party, one shudders to think of what a stronger one would have done. The second argument against endorsing the fiscal commission’s approach goes like this: Sure, Obama wanted to end up roughly where the commission did. But if he had said so back in December, then all the bargaining would have been between that position and options further to the right. In the end, the president would have been forced to accept a deal far less balanced than the one the commission had recommended.
Again, that sounds clever and strategic, exactly the kind of advice that senior political advisors and their wannabes love to offer. And what happened? Not only was the president forced to accept a deal to the right of the fiscal commission’s proposal; he also yielded the high ground for three crucial months, enduring unrelenting criticism for his lack of leadership. And even after his mid-April shift, which rendered his original budget proposal an embarrassing dead letter two months after it was submitted, he continued to bob and weave. It’s easy to speculate about what was going on behind the scenes. Congressional Democrats were urging Obama not to yield an inch on Medicare and Social Security: If he did, he would blur the bright-line contrast between the parties so beloved of political consultants. By the time he decided to grasp the nettle and enter into serious discussions with Boehner, the clock had virtually run out, making it far more difficult to reach an agreement embracing both revenues and entitlements, which would have been a tough sell even in better circumstances. I’m aware how easy it is to dismiss my arguments. Alternative history is a fool’s errand, it may be said. And how can I claim to know what senior White House advisors (let alone the president) were really thinking? Answer: I can’t. I’ve just tried to reconstruct a sequence of events that otherwise makes no sense. Obama began the year determined to talk about selective public investments as the key to “winning the future.” He ended up focused exclusively on the Tea Party’s preferred agenda. Once he couldn’t avoid the fiscal issue, he wanted a balanced approach but was forced to settle for something quite different. He tried to position himself as the adult riding herd on a brood of squabbling children; he ended up portraying himself as a suitor left at the altar, not just once, but repeatedly.
MagmaConf 2017 Anti-Harassment Policy MagmaConf is designed to be a harassment-free conference experience for all participants. We welcome everyone regardless of gender, sexual orientation, disability, physical appearance, body size, race, religion, mother tongue, the operating system of your devices, preferred programming language/framework, or anything else. We embrace diversity. We ask all participants to respect one another, and we will not tolerate harassment of conference participants in any form. Harassment includes offensive verbal comments related to gender, gender identity and expression, sexual orientation, disability, physical appearance, body size, race, or religion, sexual images in public spaces, deliberate intimidation, stalking, following, harassing photography or recording, sustained disruption of talks or other events, inappropriate physical contact, and unwelcome sexual attention. Participants asked to stop any harassing behavior are expected to comply immediately. We reserve the right to warn or expel offenders from the event without a refund, and they will not be able to attend any future MagmaConf-related events in any way. Please contact a staff member in case you are being harassed, or are concerned about anything that might be related to this subject. Conference staff will be happy to help participants contact hotel/venue security or local law enforcement, provide escorts, or otherwise assist those experiencing harassment to feel safe for the duration of the conference. We value your attendance.
Jocotepec ( Spanish pronunciation: [xokoteˈpek]) is a town and municipality in Jalisco in central-western Mexico. The municipality covers an area of 384.36 km². As of 2005, the municipality had a total population of 37,972.[1] History Perhaps as early as 100 BC, nomadic bands of Indians passed through the Lake Chapala Valley. Some moved on, others settled on the shore. Jocotepec, once Xuxutepeque, a small fishing village at the western end of the Lake, became a permanent home for the Nahua Indians in 1361. They built a temple to their god, Iztlacateotl, and practiced human sacrifice. The village became a trading and ceremonial site for the surrounding mountain area. "Xuxutepeque was the name given Jocotepec by its first Nahua settlers. (The last of the nomadic bands to settle in this area were the Purépecha.) It became a permanent home for the Nahuas in 1361. Xuxutepeque later became "Xilotepec", meaning "Hill of ear of Corn". Finally, with the arrival of the Spaniards, led by Jacob Tepec, the settlement's name became "Jocotepec" and was interpreted as meaning "Hill of Guavas". (Guavas are a small bitter-sweet tasting fruit.) The meaning of Jocotepec is thus derived: Xoco-tepe-K, meaning Xoco (acid); Tepetl (hill); and K (place)." [2] In 1520, Captain Alonzo de Avalos was given this area as an encomienda (land grant). Chief Xitomatl, who then governed the area between Chapala and Jocotepec, submitted his territory to Spanish rule without a battle. In 1529, Jocotepec was formally founded, according to a title of property issued by Hernan Cortes, a copy of which can be found today in Jocotepec records. Franciscan fathers then proceeded with conversion of the natives. Old Indian temples were destroyed and Catholic church foundations laid in their ruins. At that time, Jocotepec acquired its two religious protectors - Nuestro Senor del Monte and Nuestro Senor del Guaje.
The municipality of Jocotepec has a large variety of trees and plants, mostly located inside garden walls. The main plaza is surrounded by greenery, making it very inviting. Vegetation is composed mainly of jacaranda, galeana, hule, pine, roble, cazuarina, mesquite, guamuchil, chaparrale and encino. Fruit trees such as mango, avocado, lime, lemon and orange are also abundant. In North Jocotepec, acacia, huizache and palo-bobos predominate, while in the south (lake) side, there are a few sauce trees and sabinos. A large farm grows raspberries for export. Fields of corn and chayote are very common in this area. Products of Jocotepec are mainly wool carpets in typical weaves and many colors, and the traditional serapes of this village. Another important industry is the fabrication of tiles, ready-made or made to the client's design. Wood and forged iron furniture can also be made to order. A large sweater factory is expected to soon start exporting. Recently, painting and music have been given a boost by local organizations promoting cultural events. Jocotepec has two religious protectors: Nuestro Senor del Monte and Nuestro Senor del Guaje. A Fiesta Patronal (a religious celebration of these protectors) is held early in January. It lasts two weeks, and honors the first patron, the Lord of the Mountain, with daily masses, dances, cockfights, bullfights, parades and fireworks. Another fiesta, later in the year, honors Nuestro Senor del Guaje, but on a smaller scale. Services: several sports recreation centers, two banks, and two gas stations. Notable people María Guadalupe Urzúa Flores (1912 – 2004), Municipal President of Jocotepec, from 1983 to 1985.
Brazilian President Dilma Rousseff addresses the United Nations General Debate at the 68th United Nations General Assembly in the UN building in New York City on September 24, 2013. UPI/John Angelillo RIO DE JANEIRO, Nov. 14 (UPI) -- Brazilian President Dilma Rousseff is pushing legislation to give the government greater control over data processed by Google, Microsoft and other IT giants. That data includes all email traffic and Internet usage as well as ancillary services such as Google's mapping program, which led to Brazil demanding access to street view data collected by the U.S. company in Brazil. Rousseff sought tougher oversight of U.S. information technology companies active in Brazil after reports the U.S. National Security Agency spied on her, senior aides and other government functionaries. The Brazilian president demonstrated her ire by calling off an official visit to Washington and talks with U.S. President Barack Obama. During U.N. meetings in New York in September, Rousseff delivered a sharp attack on the United States, saying NSA actions violated international law. U.S. companies in Brazil are under increasing scrutiny and Rousseff has ordered government departments to avoid using Google, Microsoft and Yahoo email services. New legislation that Rousseff wants rushed through congress will require Google and other Internet companies to store all Brazil-related data within the country and to have it on call for examination by Brazilian government agencies. Google and other U.S. companies say compliance will escalate their costs and also expose them to hefty fines. In the furor this month over Google street view data, Brazilian judges ordered Google to hand over all street data obtained by Google vans operating across the country. Critics say Google technology allows the IT giant not only to capture images but also to eavesdrop on private WiFi and other local communications. Google hasn't denied the accusation.
Brazilian courts warned Google of fines of up to $500,000 for non-compliance with the order to hand over the data. The Brazilian Institute of Computer Policy and Rights said it knew Google engaged in similar electronic data interception in other parts of the world and called on the company to categorically state it hadn't done the same in the Latin American country. Reports of U.S. intelligence gathering activities first emerged from documents released by U.S. secrets-leaker Edward Snowden, who now lives in Russia after Moscow granted him asylum. Snowden is sought by U.S. authorities but has offered to cooperate with German officials on a European investigation of his activities. The spying debate has put on hold most of the business exchanges that appeared to be growing after Obama visited Brazil and neighboring countries in March 2011. Questions have arisen on U.S. participation in a Brazil defense procurement program that includes a multi-billion dollar contract for next generation fighter aircraft for the Brazilian air force. Google and other U.S. technology companies so far have resisted Brazilian demands for a data center based in the country. Faced with potential exclusion from Brazil's thriving economy and hefty fines, the companies are said to be resigned to complying with Brazilian government demands, local news media reported. Brazil is also said to be considering similar controls on other Western technology companies, including British and other European businesses active in the country.
Miami Marlins shortstop Adeiny Hechavarria sure does make a lot of awesome plays in the field! Unfortunately, that’s about all he does. The Marlins have taken to social media with the hashtag #HechofaPlay to bring attention to “Hech” and let fans see his glove wizardry. That’s all fun and impressive, but the fact remains that position players are expected to play the field AND hit. In 2013, Adeiny Hechavarria was statistically the worst player in baseball. There wasn’t even a close second. His pathetic .227/.267/.298 line combined with awful base running and mediocre-at-best defense (more on that later) made for a -2.1 fWAR. His closest competition was White Sox “first baseman” Paul Konerko at -1.8 and the last season of Michael Young‘s career at -0.6. 2014 was a bit better, as Hech raised his line to a decent .276/.308/.356 and ran the bases a little better. He was still a liability on the bases, but not to the extent he had been the previous season. His defense was still not great, but his overall improvement earned him a 0.3 fWAR, good enough to be outside of the bottom 20 in baseball for the season. 2015 was Adeiny’s breakout season. His defense finally measured up to the way MLB’s highlight videos would make you think they always had. His offense continued to improve just like it had the year before. He ended up with a line of .281/.315/.374. By no means an offensive juggernaut, but combined with his (finally) elite defense? That’s good for a 3.1 WAR, 5.1 points higher than his previous career total of -2.0. Defense was always Adeiny’s calling card, but advanced metrics showed that early in his career, he needed to dip, dodge, duck, dive, and dodge for balls that other shortstops would easily get to due to his bad instincts and a slow first step. Something clicked before the 2015 season though, and that athletic ability that he always had came through even better as he developed better instincts and a much quicker first step. 2016 should have been even better. 
Hechavarria was an elite defender and he had improved with the bat every single year. His 3.1 fWAR in 2015 was nice, but could he push for 4.0 with an extra year of experience? Well, yes and no. Yes, he pushed for 4, but no, it wasn’t 4.0. Hech finished the 2016 season with 0.4 fWAR. Unfortunately for him, that’s still the second best season of his career. His offense fell off a cliff to the tune of .236/.283/.311. His defense was still great though, making him a positive contributor to the Marlins, albeit only on one side of the ball. After being one of the best shortstops in baseball overall in 2015, Hech was back to the side of that list where he’s more comfortable: he was the worst offensive player in the game in 2016. As a positive, his base running improved significantly. 2016 was the first season where Hech provided his team positive value on the base paths instead of his usual negative. His base running value, per season beginning in 2013: -5.0, -1.3, -0.9, +2.0. If the Marlins hope to compete next season, they need to put Hech on a short leash. If he starts the season with signs of repeating 2015, then he’s a fine starting shortstop for a team with playoff aspirations. If he starts off looking like 2015 was a fluke career year and 2016 is more his norm, then the Marlins will need to look into moving some pieces (what pieces, no one knows…) to acquire a legitimate Major League caliber starting shortstop. 2017 will be a big year for the Marlins for many reasons, but it may be an even bigger year for Adeiny Hechavarria’s baseball future.
Turns Out 4chan Boards Not As Lawless As They Seem Share Tweet Nearly overlooked during the recent criminal trial of the Tennessee man who hacked Sarah Palin’s e-mail account was the testimony--as a government witness--of the founder of 4chan.org, the anarchic message board. Federal prosecutors put Christopher “Moot” Poole on the stand to identify 4chan records that linked David Kernell to the Palin hack. Poole, 22, had previously turned over to the FBI server logs and other records after being served with a search warrant. Kernell, 22, initially uploaded images from Palin’s Yahoo account to 4chan’s notorious “/b/” board, and later posted there a step-by-step description of the September 2008 e-mail incursion. The 4chan records (and other electronic fingerprints) clearly linked the Palin hack to Kernell, who was convicted of two felony counts. Poole’s testimony--which included quaint prosecution queries about b/tards, new fags, trolls, and getting Rickrolled--can be downloaded here (PDF). Denizens of /b/--where memes are born, cyberbullying campaigns are hatched, and child porn can sometimes be found--might be surprised to discover that 4chan isn’t as lawless as it seems.
(Editor’s Note: Magellan has updated its list of affected locations as of 9:00 a.m. Matt Skinner, public information director with the Oklahoma Corporation Commission, states lists will be updated throughout the day.) Magellan Midstream Partners released a statement Friday afternoon concerning the ongoing gas recall first announced Tuesday in an email press release from the Oklahoma Corporation Commission. The statement reads as follows: The majority of the affected gasoline with higher than intended ethanol content has been safely recovered and replaced with on-specification fuel allowing gasoline retailers to return to normal operations. All affected gasoline in Tulsa has been completely recovered and replaced and we have made significant progress in the greater Oklahoma City and other affected areas. We expect to recover and replace the remainder of the affected gasoline tomorrow. We wish to apologize to our customers, the fuel distributors, retail gasoline marketers, fuel consumers, state regulators and our fellow Oklahomans that have been affected by this unfortunate incident. The recalled gas totals 449,000 gallons that could contain up to 30 percent ethanol, well over the 10 percent threshold noted on most fuel pumps. The gas was distributed from a midstream facility in Oklahoma City to stations in several counties. Magellan has posted a color-coded list of affected retail locations and their status at this link. For a list of metro-area gas stations that feature 100 percent gas, visit this page and consult its map. Tracking down the bad gas From the release: The original OCC release from Tuesday noted the fuel was dispatched from the Magellan facility during the period between Aug. 23 and Monday from one of six delivery bays. The Oklahoma Corporation Commission’s Petroleum Storage Tank division fuel inspectors will be working to ensure impacted retailers have stopped sale of the fuel in question, and that the product is returned to Magellan.
Any consumer with a fuel-related issue should contact the retailer. Magellan will work with retailers to satisfy consumer complaints. Fuel complaints regarding this or any other concern related to fuel measurement or quality can also be filed online with the Oklahoma Corporation Commission or by calling (405) 521-2211. EPA fails to report on biofuels impact Recently the EPA announced that it has failed to provide reporting on the environmental impact of biofuels as the agency is required to do. The agency noted its last report on the subject came in 2011, but reports are supposed to be issued every three years. The entire EPA report on the federal Renewable Fuel Standard can be found here. Since the EPA’s last report, industry-led pushes for and against fuels with higher ethanol (E) ratings have been ongoing. In 2012, USA Today reported a CEO of the Renewable Fuels Association claimed E15 was safe for post-2001 vehicles. Meanwhile, the American Petroleum Institute produced a study that found “issues about engine durability” with E15 and E20 blends of biofuel. If you are concerned about the potential dangers of ethanol fuel blends, speak with a trusted automotive mechanic about the specifics of your vehicle.
It’s Snow Leopard day zero, so of course I had to upgrade. All in all, everything is great. (Especially the multiple monitor window migration fix!) But the #1 thing that annoys me about all OS X releases is the colors in Terminal.app. They’re pretty much unusable on a dark background (especially the blue). For some time, there have been hacks to fix the problem. Of course, these hacks no longer work on 10.6. Never one to shy away from the problem, I dove in. And we have success! Here’s how to make it work:

1. Find Terminal.app in Finder (/Applications/Utilities), right click, “Get Info”. There is a checkbox “Open in 32-bit mode”. Check it!
2. Install SIMBL. Plugsuit was installed on my machine before; it freaks out because of the 10.6 changes. SIMBL silently just works, or doesn’t.
3. Get my updated TerminalColours SIMBL plugin. See the original post for details on how to install it.
4. Restart Terminal.app.
5. Enjoy your readable colors!

This works because InputManagers still work in 32-bit mode, but not 64-bit mode. So by forcing Terminal.app to run in 32-bit mode, SIMBL can still hook in. I just had to update TerminalColours to swizzle a new method that 10.6 uses to pick colors. Hope you enjoy! Update: I’ve changed the tar.gz download link to one that should work better.
13 drivers line up for Round 2 of Formula 4 The CAMS Jayco Australian Formula 4 Championship is gearing up for the second instalment of the new open wheel development category certified by the FIA, with Round 2 to be held at Ipswich, Queensland from 31 July – 2 August. Like the successful inaugural championship opener in Townsville earlier this month, Round 2 will have 13 cars line up on the grid. Championship Manager Cameron McConville is delighted that a strong field will again be on track as another Queensland crowd witnesses the arrival of FIA Formula 4 in Australia. “Townsville was fantastic, all the drivers competed hard and we saw some good close racing and I’m sure we will see it again in Ipswich from the field of young and talented drivers,” McConville said. “We have certainly had a lot of interest after the first outing of Formula 4 and I am confident the field will grow if the interested parties take the next step and get on the grid for this exciting new category.” In Townsville, Will Brown went down in the history books as the first winner of a Formula 4 race in Australia. The CAMS Foundation supported driver also won the inaugural CAMS Burson Rookie of the Round Medal. Overall Round 1 honours went to Jordan Lloyd from exciting Western Australian driver Nick Rowe with Brown third.

CAMS Jayco Australian Formula 4 Championship Round 2, Ipswich, 31 July – 2 August

DRIVER AND TEAM LIST
94 Jordan Lloyd (QLD) - Team BRM
3 Zane Goddard (QLD) - Team BRM/BGDArchitects/PWR/Parry Group
26 Harry Hayek (NSW) - Team BRM / Laing+Simmons Potts Point I Pyrmont
2 Francesco Maiolo (SA) - Team BRM
04 Drew Ridge (NSW) - Brian Hilton Sydney Motor Group/Team BRM
99 William Brown (QLD) - Cars Galore/AGI Sport
25 James Vernon (NSW) - Team Toolforce Racing/AGI Sport
15 Tom Grech (VIC) - iQ Option Racing/AGI Sport
79 Jordan Love (WA) - AGI Sport
20 Jack Sipp (QLD) - JSR Motorsport
49 Thomas Randle (VIC) - Dream Motorsport
96 Luis Leeds (VIC) - Dream Motorsport
97 Nick Rowe (WA) - AGI Sport
Is the evidence on GP co-payments as bad as Labor says? Updated Healthcare spending has been named as the nation's single largest long-term fiscal challenge by the National Commission of Audit report. Earlier this year, Federal Health Minister Peter Dutton said the Government was considering introducing a co-payment for visits to general practitioners in the May 13 budget. This followed a speech in which he said Medicare was on an unsustainable path. Essentially, a co-payment would mean the 80 per cent of Australians who go to bulk-billing doctors would be required to make a payment of the Government's choosing. Opposition health spokeswoman Catherine King is critical of the proposal and recently said: "Evidence of co-payments from around the world, evidence from all of the health groups in Australia to a committee that's been looking at this issue are all saying that if you introduce a co-payment, what that does... it means that people avoid doctor visits, that's what it's designed to do. They end up much sicker and they end up in our emergency departments." The committee Ms King refers to in her claim is the Senate Select Committee into the Abbott Government's Commission of Audit. ABC Fact Check examines the evidence around co-payments to GPs. The claim: Catherine King says evidence shows that GP co-payments mean people avoid doctor visits, end up sicker and end up in emergency departments. The verdict: While Ms King is correct that the average person will visit the GP fewer times, there is only evidence to suggest the low income and chronically ill will get sicker. No studies could be found about the direct impact of GP co-payments on emergency departments. Paying to see the GP Medicare figures show around 20 per cent of Australians are not bulk-billed and already make a payment to visit a GP.
In 2012-13 the average gap fee paid for out of hospital GP visits (non-referred GP attendances) was $28.58 per visit. In 2013 a report was published by the Commonwealth Fund, a private American fund, which compared the health care systems of 11 countries including the US, Canada, France, New Zealand and the United Kingdom. According to the report, 25 per cent of Australians paid more than $1,000 annually in out-of-pocket medical expenses, which was the second highest of the 11 countries. This includes services such as specialist care, dental care, diagnostic tests and pathology. In the 1991 budget, the Hawke government announced a co-payment of $3.50. It was watered down to $2.50 before it began and was abandoned within months after Paul Keating replaced Bob Hawke as prime minister. A flat-rate $6 co-payment to visit the GP was floated in October 2013 by the Australian Centre for Health Research, an organisation funded by private health funds and private hospital groups. The proposal was made in a paper written by social policy consultant Terry Barnes, a former adviser to Tony Abbott when he served as health minister. A submission to the Senate Select Committee into the Commission of Audit by Mr Barnes recommends concessional patients and families with children under 16 would be exempt from the co-payment after 12 visits per year. Mr Barnes also said the $6 co-payment was simply the Hawke government's original payment indexed to 2013 using a Reserve Bank of Australia inflation calculator. "It was not plucked out of the air," he said. The Commission of Audit has since recommended a higher co-payment of $15 per visit to the GP with a safety net in operation once a patient reached 15 visits, reducing subsequent co-payment to $7.50. It also recommended state governments introduce co-payments for "less urgent conditions" in public hospital emergency departments. 
What Australian health groups say There has been criticism of the proposal by the Australian Medical Association, and Australia's largest consumer health group, the Consumer Health Forum, said it would "not support measures that increase co-payments and charges given the considerable evidence surrounding the impact of growing out-of-pocket costs on Australians". However, the Catholic Health Association "considers there is scope to examine whether there is a role for small mandatory co-payments to be introduced in areas of health service demand that are growing rapidly, for example pathology tests or public hospital emergency department treatment that could otherwise be accessed through primary care at the same time in the same locality". In the Senate committee into the Commission of Audit, Simon Cowan from The Centre for Independent Studies supported the introduction of GP co-payments saying: "Our model involves not just a $5 co-payment but a $5 reduction in the Medicare benefit that is paid, and that is where the savings to Government will come from." Other experts who gave evidence against the co-payment include Dr Stephen Duckett from the Grattan Institute, Professor Laurie Brown from NATSEM, and Professor Geoff Dobb from the AMA. The Senate committee believes measures which place a barrier to a person seeing a GP are not in the best interests of keeping people healthy, and "strongly recommends that the Government does not implement co-payments for GP consultations and emergency department services". Do co-payments mean 'people avoid doctor visits'? Dr Duckett referred Fact Check to a systematic review by Danish researchers Astrid Kiil and Kurt Houlberg covering 1990-2011 and published in the European Journal of Health Economics, which identified a total of 47 studies on the behavioural effects of co-payments.
It concluded that "the majority of the reviewed studies found that co-payment reduces the use of prescription medicine, consultations with general practitioners and specialists, and ambulatory care". A leading researcher in co-payments, Amal Trivedi, an Associate Professor at Brown University in the United States, says "economic theory and empirical evidence suggest that patients will use fewer health services when they have to pay more for them". The RAND Health Insurance Experiment - funded by the US Department of Health - is often cited as the stand-out study of co-payments in the health care system. The $15 million study took place in the 1970s and 80s, and remains the largest health policy study in the history of the US. It included co-payments for visits to the GP, prescriptions and hospital admissions. It showed that modest cost sharing, or co-payments, "reduces use of services". Dr Beverley Essue from the Menzies Centre for Health Policy at the University of Sydney says "there is evidence that co-payments can impact on access to health care – both necessary and unnecessary care – and that this impact differs among different population groups with the elderly, low income and those with chronic illness (i.e. frequent users of the health system) more substantially impacted". A 2013 report by the Australian Bureau of Statistics found 15 million people visited a GP in the previous 12 months. The report says 5.4 per cent of people - more than 800,000 - "delayed seeing or did not see" a GP at least once because of cost, indicating that the introduction of a co-payment would affect the number of patient visits. In sum: There is strong evidence to show people visit the GP less often when a modest co-payment is introduced. Do people 'end up much sicker'? In what was called "a striking finding" at the time, the RAND study found cost sharing did not significantly affect the quality of care received by participants, or their health.
However, negative health effects were found in "individuals with high blood pressure and low income as well as individuals with poor sight". Dr Brett Montgomery from the University of Western Australia says: "The RAND study is methodologically the most pure of our co-payment studies, because it was a proper randomised trial – that’s why everyone refers to it so often". However, he warns the study is dated and excluded older patients, suggesting its results may not be that meaningful for the present debate. Dr Montgomery says the Kiil and Houlberg review found that evidence of the effect of co-payments on health outcomes, or whether people became sicker, was sparse. It's a sentiment echoed by Associate Professor Trivedi, who recently said there are "remarkably few studies of the consequences of increasing co-payments for ambulatory care, and even these studies have been limited because they have excluded elderly patients". However, he told Fact Check: "Policymakers should exercise deep caution about increasing ambulatory co-payments, particularly for high-risk populations with chronic disease. Increasing outpatient co-payments can be an ill-advised cost-containment strategy. We have found that in response to modest increases in co-payments, elderly patients in the US cut back on the number of outpatient visits, but then experienced substantial increases in their use of expensive acute hospital care. In other words, the co-payment increases were penny-wise and pound foolish." Dr Essue says there are few Australian studies which have investigated the impact of co-payments on "hard clinical outcomes". However, she says one Australian study shows that in 2005, when PBS co-payments increased by over 20 per cent, dispensing of medicines prescribed for diseases including epilepsy, glaucoma, Parkinson’s disease, asthma, osteoporosis, and thyroid deficiency significantly decreased. She also notes dispensing of statins and some antiplatelet drugs decreased.
These drugs are commonly used to prevent heart attacks, strokes and other vascular diseases. A recent review of international evidence by health policy analyst Jennifer Doggett for the Consumers Health Forum concludes there is "a risk that the introduction of additional co-payments for bulk-billed and hospital emergency department visits could adversely impact upon the health of some already marginalised groups in the community and result in an overall increase in costs to the community". The Government has not yet indicated whether it would take up the Australian Centre for Health Research's suggestion that there would be exemptions from the co-payment for low-income people and children. In sum: There is some evidence to suggest co-payments for GP visits affect low-income people and the chronically ill. Do people 'end up in our emergency departments'? In relation to this claim, Ms King's office referred Fact Check to the 2013 report by the Commonwealth Fund. It found Australia currently has the lowest use of emergency departments of the 11 countries surveyed, with 22 per cent of Australians visiting an emergency department in the last two years. In the US, nearly 50 per cent of uninsured Americans had done so. The results indicate people who have to pay for healthcare - and possibly cannot afford to - do end up in emergency departments more frequently. The Government has not yet indicated whether it would take up the Commission of Audit's suggestion that state governments should be encouraged to introduce a co-payment to try to address this issue. Dr Essue says "there isn't a wealth of evidence to suggest that people will be pushed into emergency departments as a direct result of co-payments but it can be inferred".
"We know that compromised care in terms of non-adherence and non-compliance with recommended care often leads to exacerbations and complications and people use emergency departments when experiencing such exacerbations and complications of their conditions," she said. In sum: There are few studies, if any, that measure whether modest co-payments to GPs result in higher emergency rates. However, logic would dictate this to be the case. The verdict On the evidence, Ms King is overreaching. While she is correct that the average person will visit the GP fewer times, there is only evidence to suggest the low income and chronically ill will get sicker. No studies could be found about the direct impact of GP co-payments on emergency departments. Sources Topics: alp, health, federal-government, doctors-and-medical-professionals, australia First posted
A newly unearthed internal Republican Party memo shows preparations underway to deal with the multiple legal investigations that surround the Trump administration. The Republican Party is preparing for Donald Trump’s mountain of scandals and investigations to drag the whole party down with him. The Republican National Committee has issued a memo to its employees instructing them to preserve documents relating to the 2016 presidential election, indicating that the organization fears legal involvement in the investigations surrounding the Trump administration. BuzzFeed reports that lawyers for the RNC warned staffers that the probes from the FBI, special counsel, and Congress may soon reach their offices. “Given the important role that the RNC plays in national elections and the potentially expansive scope of the inquiries and investigations,” the memo stated, “it is possible that we will be contacted with requests for information.” The memo also told staff to “preserve all documents potentially relevant to these matters until they are resolved or until we are informed by all necessary parties that preservation is no longer necessary.” The RNC was of course hip-deep in the machinations at the top of the Trump campaign. By the time senior members of the campaign — including his son-in-law, Jared Kushner, his son Donald Trump Jr., and campaign chairman Paul Manafort — were meeting with Russian operatives at Trump Tower digging for anti-Hillary Clinton dirt, Trump was the presumptive nominee. A month later, he would accept the nomination at the party convention in Cleveland, and top figures at the RNC, like then-chairman Reince Priebus and then-communications director Sean Spicer, were being integrated into the Trump campaign apparatus. There would have been little separation between the campaign and the party, and it seems highly likely that investigations into Trump wrongdoing would reach into the official party.
These connections have spread Trump’s toxicity because the Republican Party has submitted itself completely to him, refusing to provide strong oversight of his team’s behavior, even with the possibility of criminal and ethical wrongdoing. This attitude was expressed early on, as the Obama administration became aware that Russia-backed hackers were breaking into Democratic Party computers and leaking information in a campaign to hurt Clinton and aid Trump. They wanted to go public, but as the Washington Post reported, Senate Majority Leader Mitch McConnell “raised doubts about the underlying intelligence and made clear to the administration that he would consider any effort by the White House to challenge the Russians publicly an act of partisan politics.” Republicans are now dealing with the karmic blowback for their inaction and complicity. The party is looking to protect itself legally, but it is probably too late. They have allowed the cancer to spread everywhere.
Periodically I like to share my current decklist. Mainly I do this because it helps me to remember what I’m playing and why, but occasionally some players like to read my decks and try them out themselves. So without further ado, here’s my current decklist post-GBT11. Grade 0 (17) Fetter Creator, Van x1 Van is MVP of this deck, no questions asked. There is simply no better forerunner for a revelation build. Van hard counters 10K vanillas by allowing you to give any unit +2k. This can even allow your Grade 2 Vanguard to hit a Grade 3. And then on first stride you can just shove him into the soul to draw. Nakisawame may be cute, but she has NOTHING on Van’s utility. Witch of Great Talent, Laurier x4 She cute, and she can be a free soul if you G-Guard with the Grade 4 Laurier. Goddess of Sound Sleep, Tahro x4 There are several reasons why Tahro is essential to this deck. It’s Tahro. Nuff said. She has revelation. She helps prevent deck-out. She synergizes beautifully with Vanargandr and Athena. She’s a stand trigger. Dreaming Dragon x4 Just like Tahro, Dreaming Dragon is essential to this deck, but for different reasons. It’s a stand trigger. It helps prevent deck-out. Free draw, yes? Often I’ve found that the difference between winning a game and losing resides on proc-ing Dreaming Dragon’s skill. Battle Maiden, Kukurihime x4 Kukurihime is not essential to this deck, but she is a nice add-on. +3k is good, +1 soul is good, and hey, sometimes crit pressure is good too! Grade 1 (15) Goddess of Fort, Kibitsuhime x4 Though I wish Genesis could get a Lien clone, Kibitsuhime is still really solid. She has revelation, and can allow for free soul charge. Detect Angel x3 Detect Angel actually surprised me in her utility. I never thought that her skill would be relevant, useful, or sought-out… until I realized that Blademaster also got a boost in GBT11. Detect Angel is great for being a Grade 1 beater when you need it, because sometimes YOU NEED IT.
Goddess of Transitory, Awanami x4 I was surprised to find that Awanami was my ideal first ride (I thought it would be Gelgja). Though her skill doesn’t go off when you first stride, she is useful to stuff the soul in preparation for an Ishtar turn. She also helps boost up Kotonoha’s numbers. Shackle Fetter, Gelgja x4 Essential. Re-standing a rested rearguard in your main phase is staple for any Revelation build. Grade 2 (12) Battle Maiden, Kotonoha x4 I’ll leave you with a quote from my friend. KILL KOTONOHA!! Battle Maiden, Senri x4 Though she can be used for stacking, I mainly use her to get an 11k beater. Numbers > triggers. Goddess of Favorable Wind, Ninnil x4 Ninnil is bae. She’s by far the best revelation Grade 2. On-ride Revelation is AMAZING, and low-cost power-up is FANTASTIC. Grade 3 (6) Prime Beauty, Amaruda x4 This deck lives and dies off of Amaruda. What did you expect? It’s an Amaruda deck! kjsdhiuerehrjhdiakfhearsnjdfs, Fenrir? x2 I’ve been playing Minerva, but I’m trying out the new Fenrir now. You can also play the old Fenrir, or Svava or Asuza. Heck you could even play Wiseman. All are great choices as long as you never ever ride them. (Though Revelation is generally preferred). I guess you could ride the old Fenrir. He’s still good. Grade 4 (16) Goddess of Investigation, Ishtar x4 Ishtar! Ishtar! ISHTAR! ISHTAAAAAAAAAAAAR! Goddess of Settlement, Athena x2 If they aren’t dead after Ishtar. Goddess of Seven Colors, Iris x2 Best generic g-guard. Gives you resources as a cost to give you shield. Soul charge Tahro, or Gelgja, or Awanami. Goddess of Twill, Tagwoot x1 I actually really like this card. Less soul charge than Iris, but WAAAAAAAAAAAAY better than Hanasatsuki. Great Angel, Doom Brace x1 Filler. You could replace him with Tyr. It doesn’t really matter. Mythical Destroyer Beast, Vanargandr x2 Vanargandr + Tahro. Great offensively, but even better defensively when you need to dig for key cards (heals, and PGs, mainly).
Mythical Hellsky Beast, Fenrir x2 The shiniest Ishtar flip fodder in my g-zone. I only stride into it when I’m losing and need the draw. Witch Queen of Accomplishment, Laurier x2 Accelerates GB8, recycles my PGs, and has a big shield. How to Play Revelation Most people I see seem to be a bit confused about how Revelation works. You get to look at the top card of the deck, but what do you do with it? What’s the goal? Am I trying to sack? Well to answer that last question: NO. OTT vs. Genesis Though at first glance these two clans seem very similar, there are some key differences. The main one being Genesis’ Revelation vs. OTT’s scry. Cards like Battle Sister, Stollen, or Chief Deity of the Heavens, Amaterasu scry the top cards of the deck and then reward the player for hitting triggers. With Genesis, it’s the opposite. There is nothing worse than hitting triple trigger on a Goddess of Investigation, Ishtar turn. What a whiff! The key difference is that modern OTT looks at more than 1 card from the deck at a time, and Genesis is limited to just 1 card. So they are very different in that sense. I made a handy guide to help with Revelation. This guide will not be applicable in ALL situations, so use your best judgment. Revelation It’s a critical trigger! SOUL CHARGE IT! It’s a stand trigger! SOUL CHARGE IT! It’s a heal trigger! LEAVE IT! It’s a normal (non-PG) unit! SOUL CHARGE IT! It’s a perfect guard! LEAVE IT! And there you go. Genesis’ winning image is to use the cards in the soul, not the deck, to win the game. Use Revelation to power-up Senri, Kotonoha, Ninnil, Asuza, Detect Angel, etc. Then use Gelgja to stand them, and Ishtar to stand them again!
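If you like thinking in code, the whole guide above boils down to one rule: soul charge everything except heals and perfect guards. Here's a throwaway snippet of that heuristic (the card-type labels are just my own informal strings):

```python
# The Revelation guide above, boiled down: keep heals and perfect guards
# on top of the deck, soul charge everything else. Labels are informal
# strings for this sketch, not official game terms.

KEEP_ON_DECK = {"heal trigger", "perfect guard"}

def revelation_choice(card_type: str) -> str:
    """Return the guide's recommendation for a revealed card."""
    return "LEAVE IT" if card_type in KEEP_ON_DECK else "SOUL CHARGE IT"

print(revelation_choice("critical trigger"))  # SOUL CHARGE IT
print(revelation_choice("perfect guard"))     # LEAVE IT
```

Remember the caveat from the guide itself: this won't be right in every situation, so use your best judgment.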
The Consequences of Sin and Evil June 24, 2014 at 11:22 am revchrisw Sin is not a popular word, so I will use the word evil also. They are of course related. Sin is not just minor offences but serious ones. Evil is clearly serious. All sin is contrary to God’s good will. In the Old Testament or Hebrew Bible there are three words for sin that can be translated as: falling short of the mark, transgression against a law, and rebellion. As can be seen, the three have increasing levels of seriousness. It is one thing to come short of God’s will; it is another to consciously rebel against God. I will focus on the serious end of the scale, namely that which is clearly evil, not misdemeanours or minor transgressions. In our society we clearly regard people such as Hitler and the Nazi regime as evil, the apartheid system in South Africa was evil, and perpetrators of child sexual abuse are evil. The reason is obvious. In each case people were treated cruelly, harmfully and unjustly, with negative consequences including death, deprivation and the taking away of their dignity. It is not surprising that those who have experienced serious sin against them feel shame, resentment and often desire to retaliate against those who treated them with evil intent. It can be the case that those who have been mistreated unfortunately go on to treat others in the same kinds of ways when they gain the power to do so. The oppressed become in turn the oppressors; the abused become abusers. Or they have to live the rest of their lives with diminished capacity and reduced self-esteem because of what has been done to them. The Bible speaks of the ‘sinned against’ as well as those who sin. Humans in Christian understanding are both made ‘in the image of God’ with all that implies in terms of capacity and ‘fallen’ with the recognition of the negative side of human life. Given the fallenness, the brokenness, the conflicts in human life, sin and evil are never fully overcome. They continue to emerge in new forms.
The Christian faith does not hold to the false notion of human progress but nor does it despair. It is realistic about the reality of sin and evil, yet its faith in God as revealed in Jesus provides hope. Human sin and evil lead, in general terms, to injustice, death, exploitation, shame, guilt, resentment, and retaliation. We see this clearly on a national scale with evil regimes, corporately with organised crime for example, and on a personal scale with murders and child abuse. As I indicated earlier all too often there is a cycle of sin and evil. Or else it is not resolved and people live with the consequences. How can the consequences of sin and evil be dealt with in a way that opens up a new future and does not simply repeat the past in new forms? Retaliation and punishment will not do it. Ignoring what has happened will not suffice. God in Jesus Christ has in fact acted in a way that has yet to be fully appreciated and appropriated. I am convinced that Jesus dealt with the consequences of sin and evil in a way that was entirely different to what was expected. We still find it hard to understand and follow. Jesus took upon himself the consequences of sin and evil. He was convicted unjustly by the Roman and Jewish powers of his day. The mob called for his crucifixion. The soldiers carried out their job. His disciples abandoned him. He was executed in the cruellest way having first been flogged. He died an agonising death. This was done not just to one unfortunate victim of the systems of the day. God had taken the initiative in Jesus to inaugurate his reign. The God of the Bible revealed supremely by Jesus wants people to have fullness of life lived according to God’s reign of love, justice and peace. God’s rule came in Jesus’ proclamation and actions. The forces of sin and evil were being overcome through Jesus as seen in his healings and exorcisms in which people were brought back to wholeness of body and mind. 
Yet the reality of sin and evil remained and the powers of the day conspired against Jesus, which led to his crucifixion. Instead of calling on his disciples to rise up with him against his enemies, Jesus willingly submitted to them. He rejected the way of retaliation and violence. He chose to take the consequences of sin and evil upon himself. He did so not just as one individual but as a representative person, indeed as the Son of Man and Son of God. He absorbed the sin and evil inflicted upon him. In doing so he nullified the power of sin and evil. He refused to let sin and evil have the last say. He loved and forgave to the end. Jesus broke the causal chains of sin and evil in the world and created a new basis on which it is possible to work off the consequences of sin. That basis includes facing the truth of what has happened, repentance and seeking reconciliation with one another, even former enemies. In this way he opened up new possibilities. People do not have to continue the cycle of sin and evil, of violence and retaliation. People can experience the healing love of God so that what they have been through does not determine their future. If people appropriate what Jesus did on the cross they can know forgiveness and healing, and new life can be opened for them. In following the way of Jesus and living in the reign of God now we can already experience something of the life God wants for us and are given hope for the future. God’s reign has come and will come. We can know reconciliation with God and others now and look forward to its fullness in the kingdom to come. Jesus not only died to deal with the consequences of sin; he was also raised that we might know new life through him the risen Lord.
The Georgia legislature seems to have lost the idea of transparency in government. The Peach State is suing Carl Malamud, operator of the Public.Resource.org website, for copyright infringement. His transgression? He published the Official Code of Georgia Annotated (OCGA) and made it available to the public free of charge and without restriction. Georgia has an agreement with Lexis-Nexis to make the annotated laws available in book form and on the Internet, but the Internet site is available only to users who agree to the "Terms and Conditions" of the website. When a user enters the site (after having been redirected from the Georgia General Assembly website) he is greeted with this notice: Your use of this service is subject to Terms and Conditions. These Terms and Conditions do not apply to the Statutory Text and Numbering contained in the Content of the site. However, the State of Georgia reserves the right to claim and defend the copyright in any copyrightable portions of the site. Please indicate your agreement to the Terms and Conditions by clicking "I Agree" below. This is troubling, because Lexis-Nexis refers to the OCGA as "the essential reference you need to guide you quickly and efficiently in understanding the Georgia statutory scheme." Furthermore, those terms and conditions do not allow the information to be copied or shared. Section 2.1 warns, You may not copy, modify, reproduce, republish, distribute, display, or transmit for commercial, non-profit or public purposes all or any portion of this Web Site, except to the extent permitted above. You may not use or otherwise export or re-export this Web Site or any portion thereof, or the Content in violation of the export control laws and regulations of the United States of America. Any unauthorized use of this Web Site or its Content is prohibited. 
So, users of the website are allowed to read "the essential reference" needed for understanding the laws of the State of Georgia, but they "may not copy, modify, reproduce, republish, distribute, display, or transmit" it. As Carl Malamud told The New American, "In America, when we say that the law is available to citizens, that's not only the right to read the law, but it's also the right to speak the law. You should not need a 'license' to speak the Official Code of Georgia Annotated." Agreeing to the terms and conditions of the Lexis-Nexis website grants users a "limited license." Section 1 says, Web Site Limited License. As a user of this Web Site you are granted a nonexclusive, nontransferable, revocable, limited license to access and use this Web Site and Content in accordance with these Terms of Use. Provider may terminate this license at any time for any reason. So, it is not only a "limited license," it is also "revocable," and Lexis-Nexis "may terminate this license at any time for any reason." Since that severely limits the ability of citizens to "speak the law," Malamud decided to make the law (annotations and all) freely available without restrictions. In doing this, he did not copy the information from the website. He instead went to what he calls the "only code of Georgia that's available," which is a printed book that "the law-making body of the state has issued under its own name with a copyright — State of Georgia — and that is the only code of Georgia which is official: this printed book. We purchased it, we copied it, and we posted it," he told The New American. As part of the agreement between Georgia and Lexis-Nexis, Lexis-Nexis handles the annotations and copyrights the finished product, but Georgia owns the copyright. Lexis-Nexis sells the OCGA books. The price tag is $378.
On the Lexis-Nexis page for purchasing the book, it is described as follows: The Official Code of Georgia Annotated (OCGA) provides users with the official Georgia statutes, fully annotated and including guidance from the Georgia Code Commission. If you live or work in Georgia, the OCGA is the essential reference you need to guide you quickly and efficiently in understanding the Georgia statutory scheme. So in the absence of Malamud's website, the only two ways for citizens to access the annotated code (including guidance from the Georgia Code Commission) are to read it from the Lexis-Nexis website (after obtaining a "Limited License," which is revocable) or by laying out almost $400 for the books (which are copyrighted to prohibit any copying or sharing). As Mike Masnick pointed out in his excellent article for TechDirt, Furthermore, multiple parts of the Georgia government refer to the OCGA as the law of Georgia, rather than the unannotated version. Just as two quick examples, the Georgia Department of Community Affairs cites the OCGA to explain Georgia's construction codes, rather than the unannotated law. And the Department of Banking and Finance insists that: Laws governing entities regulated by the Department are primarily found in the Official Code of Georgia Annotated (O.C.G.A.) Title 7. This would certainly bear out Malamud's claim that "this is the publication of the law-making entity of the state. It is therefore what's technically known as an 'edict of government.' If you look at the U.S. Copyright Office manual of office practices, [it is] very clear that in the United States, 'edicts of government' do not have copyright." That section declares plainly: Edicts of government, such as judicial opinions, administrative rulings, legislative enactments, public ordinances, and similar official legal documents are not copyrightable for reasons of public policy.
This applies to such works whether they are Federal, State, or local as well as to those of foreign governments. So why is Georgia taking what is so obviously an untenable position? The reasons are unclear. Especially in light of the fact that the state of Oregon considered and then dismissed the same course of action when Public.Resource.org published the Oregon Revised Statutes. After Oregon threatened suit against Public.Resource.org, Public.Resource.org threatened a countersuit, and the state then held public hearings to decide how to handle the situation. Malamud was invited to speak. He told The New American, We testified, the citizens of Oregon testified, Oregon's lawyer testified, then the legislature deliberated and they unanimously voted to waive copyright. And this is how these things should be resolved. This is an issue between the citizens of Georgia and their government. They should be holding hearings, not filing suits. But filing suits they are. The paper trail for this goes back to 2013 and includes letters back and forth. Finally, rather than resolving the issue the way Oregon did, Georgia has decided on the drastic action of suing Malamud. In a twist of logic that boggles the mind, the state claims that its agreement with Lexis-Nexis saves the taxpayers the cost of publishing the OCGA. It insists that without that agreement, it would be unable to publish the OCGA at all because it would be unfair to burden the taxpayers with the bill. Of course, it doesn't seem to be losing any sleep over burdening those same taxpayers with the bill for this lawsuit. Malamud started Public.Resource.org as a 501(c)(3) in 2007. He describes himself as "an old Internet guy who started the first Internet radio station." In 1993, he launched Internet Talk Radio, a weekly program featuring interviews with computer experts. Programming was later expanded to include live feeds from the floors of both the House and the Senate. 
He has written nine professional reference books about the Internet and was credited in the early 1990s with posting copies of the databases of both the Securities and Exchange Commission and the U.S. Patent Office online for the first time. He was able to convince the U.S. government to take over his work on those databases and begin providing those services itself. As he told The New American, I have 30 years' experience (of public service, frankly) helping the government put things online. And I want to make a point that this issue is not a [matter of] Right vs. Left or copyright vs. not copyright. My work has been on both sides of the aisle. Speaker Boehner and Congressman [Darrell] Issa [R-Calif.] asked me to help them put 14,000 hours of congressional video on the Internet — which I did. I was also the chief technology officer for John Podesta at the Center for American Progress. So, this is very much an issue that spans partisan divides. It is very much about a core part of our [government], which is about making the law available to citizens. As The New American's constitutional lawyer Joe Wolverton, II, J.D. noted about this case, Secrecy is often the tool of tyrants, particularly in an Anglo-American legal system where notice of the law has been a key principle for over 1,000 years. These annotations, while not part of the law, take on the color of law through the deference they are given. Thus, that which is de jure not the law, becomes the law de facto and will work to subtly diminish the civil liberties of all to whom these commentaries are applied. If anything should be open-source, it is the law. Without free and unhindered access to it, citizens are the servants of government. With such access to it, citizens can hold government accountable.
This article is copyrighted by GreenMedInfo LLC, 2018 Acne involves the over-production of sebum from the sebaceous glands, which results in the blockage of the pores with a sticky mass of dead cells and oil. This creates a breeding ground for the opportunistic overgrowth of bacteria normally present in the skin. These convert the mass into compounds that cause inflammation and unattractive raised surfaces. Skin irritation may also result from the use of artificial cosmetics and even some popular acne products [see Is Bleaching Your Face Really A 'Proactive' Acne Solution?]. The problem is aggravated by pinching, squeezing, scratching or rubbing the skin area plagued with acne. One of the most common contributing factors is the hormonal change and imbalance that occurs at the time of adolescence. Other contributing factors include emotional stress and sex hormones. When active during puberty, androgens (sex hormones) stimulate the oil glands, typically on the face, chest, back and shoulders. Many girls are prone to acne, usually in the week before menstruation. Junk food is also linked to the problem, with the prime suspects being high-glycemic foods containing sugars, saturated fat and iodized salt (liberally used on crisps, chips, and other fast foods). A diet based on junk foods, snacks, chocolates, soft drinks, and alcohol is also low in several vital nutrients, and deficiency of these nutrients may result in eruptions of acne. There are several herbal and folk remedies available for treating acne and pimples. Tribesmen in Patalkot and Dangs have been using these formulations since time immemorial. I would like to describe the tribes' traditional herbal formulations based on 11 important herbs. 1. Garlic (Allium sativum) Rub your acne with raw Garlic (Allium sativum) several times a day; it helps in relieving the pain and also heals acne fast. 2.
Aloe (Aloe vera) The pulp of Aloe (Aloe vera) is an exceptional skin cleanser. The juice of the plant counteracts infection and promotes healing. Split off a portion of an Aloe vera leaf and rub the pulp directly on the skin. 3. Amaranth (Amaranthus spinosus) Make a tea from Amaranth (Amaranthus spinosus) seeds and use it as a face wash. To make the tea, bring 3 cups of water to a boil and add 2 teaspoons of seeds. Cover and boil for five minutes; remove from heat, add 1 teaspoon of leaves and steep for 30 minutes. 4. Neem (Azadirachta indica) Neem (Azadirachta indica) is valued in Ayurvedic medicine for its varied healing properties due to its anti-bacterial, anti-fungal, and anti-viral capabilities. For acne, eating five fresh leaves every morning helps in removing stubborn acne. 5. Lemon (Citrus limon) Clean your skin and apply Lemon (Citrus limon) juice with a cotton ball. The acid in Lemon helps flush out the pores and keeps the skin looking beautiful. Another method using Lemon juice is to "steam clean" the face by putting it over a pan of boiling water with a towel over your head to trap the steam. This will loosen the dirt and oil. Then use a cotton ball to remove the dirt and oil buildup. Use this method once a week. 6. Coriander (Coriandrum sativum) Drink a tea made by combining 0.5 teaspoons each of Coriander (Coriandrum sativum), Cumin (Cuminum cyminum) and Fennel (Foeniculum vulgare) and steeping them in hot water for 10 minutes. Strain and drink the tea after breakfast, lunch and dinner. 7. Basil (Ocimum basilicum) Make an infusion of Basil (Ocimum basilicum) leaves. Put two to four teaspoons of dried Basil leaves in a cup of boiling water, steep for 10 to 20 minutes, cool, and apply to the acne. 8. Cucumber (Cucumis sativus) Liquefy a peeled Cucumber (Cucumis sativus) in a blender and apply the juice to the acne. Another variation of this remedy is to drink four or five cups of Cucumber juice daily for a week. 
This is said to purify the blood and lymphatic system, resulting in clearer skin. 9. Grape (Vitis vinifera) Grape (Vitis vinifera) seed extract is a powerful all-around antimicrobial agent and is an excellent disinfectant. Make a solution of 4-40 drops in four ounces of water and apply to the affected areas with a cotton ball two or three times a day. 10. Chickpea (Cicer arietinum) Wash your face with Chickpea (Cicer arietinum) paste (mix one teaspoon of chickpea flour with a little water). Dry with a clean towel. This is also a good remedy to cure acne. 11. Beet (Beta vulgaris) Use a blend of one part Beet (Beta vulgaris) root juice, three parts Carrot (Daucus carota) juice and two parts water to stimulate the liver and cleanse the system. First of all, get rid of associated infections such as skin infections and dandruff; there are several home remedies to help with disorders like this. One should not scratch, pinch or squeeze acne, as that will worsen the situation and spread it to other areas of the skin. It may also leave ugly marks on the face even after the skin heals. Food hygiene as well as body hygiene is an essential part of maintaining good health and keeping infections at bay. Proper care should be taken, and one should avoid an irregular diet; it should be well organized and maintained. Avoid excessive amounts of caffeine (in the form of coffee, soft drinks, etc.), alcohol and tobacco. Cut down on chocolates, sweets, highly salted snacks and added sugar. Practicing yoga, brisk walks, meditation and deep breathing exercises for a healthy mind and body would certainly help in getting rid of dreadful acne.
Earlier, we pointed out that Glenn Beck was "restoring honor" at his "Divine Destiny" religious revival with Rabbi Daniel Lapin. Lapin's honor-restoration bona fides include a close relationship with Republican-lobbyist/convicted felon Jack Abramoff. Unsurprisingly, Lapin was not the only controversial figure to join Beck on stage. The last religious figure to appear at the event was Rev. John Hagee, who you may remember from earlier this summer as one of the "leaders in the faith community" that Beck held out as an example of "people that need to start standing up." Back in July, Beck plugged Hagee's "excellent" book, Can America Survive? 10 Prophetic Signs That We Are The Terminal Generation. As I detailed at the time, Can America Survive? is an account of how the world is fast-approaching Biblical Armageddon. Hagee views himself as an expert in Biblical prophecy, and laid out several bold predictions in the book, including: The "very real fact" that one third of humanity will soon die in an ecological disaster. "The Antichrist will rule the world with the full authority and anointing of Satan himself. Satan and his legions are coming!" "Why is this divine covenant for a specific land to the Jewish people so crucial in the twenty-first century? It's urgent because World War III is about to begin over the failure of humanity to recognize Israel's historic right to the land."
Nelson Mandela, pictured in 1994, was in prison for nearly 28 years after police were told where to arrest him Rex Features A former CIA spy has revealed his key role in the arrest of Nelson Mandela, which led to the future South African president’s trial and imprisonment for almost 28 years. The bombshell disclosure led yesterday to a demand for the CIA to come clean about putting behind bars a figure who became one of the world’s most revered statesmen. A veteran political associate of Mandela called it a “shameful act of betrayal” that “hindered the struggle against apartheid”. The former CIA operative, Donald Rickard, was unrepentant, saying that when arrested in 1962 Mandela was “the world’s most dangerous communist outside of the Soviet Union”. He made his taped confession in March to the director John Irvin, who has recreated Mandela’s final months of freedom before…
Story of O Ring (Over time these rings have also come to be called "Collar Rings" because they look similar to the collars worn on necks) These rings are fashioned after the ring that O wore in the film version of the well-known "Story of O" as she experiences her journey through BDSM, exploring her submissiveness. Often known as the "Story of O ring", it has also been called the "Story of O slave ring", or simply a "slave ring". I make 9 styles of the Story of O ring. Each is available with a small, a medium or a large shackle. You can choose sterling silver; 10kt, 14kt or 18kt yellow or white gold; or 14kt rose gold. Sterling silver rings can be made with sterling silver shackles or a 14kt gold shackle in yellow or rose gold. These rings are now worn by submissives, Dominants, Tops, and bottoms alike. They are worn on any finger, including the thumb. Some wearers choose to wear a slave ring on the left hand if submissive and on the right hand if Dominant; however, this is by no means a hard-and-fast rule. This slave ring is a band with a representation of a shackle on the top of it. Originally it was a simple flat ring, but I have created a number of variations on it to cater to a variety of tastes. The Story of O slave rings that I make come in a flat style and in a rounded style. Both are available in three different widths, and the shackle can also be made in three different sizes. As well, I have made styles of rings that resemble the links of a chain as opposed to a solid band of metal. Add a shackle to a ring like this and you have a chain-link Story of O ring. There is another style of slave ring that I make that was popular in the 1970s. This "Classic slave ring" is built in such a fashion as to appear as a "V" on the wearer's finger. It is available with or without the same sort of shackle as on the Story of O ring. The Shackles The shackle is the round ring on the top of your ring. You can choose from three different sizes. The large shackle is 13.0 mm in diameter. 
The medium shackle is 11.5 mm. The small shackle is 9.25 mm. The shackle will not fall off! (Unlike the inexpensive look-alike rings that are available.) The shackle is free to move back and forth. It is welded shut. Shackles on Sterling Silver Rings When choosing a sterling silver ring you can have a sterling silver shackle, a 14kt yellow gold shackle, or a 14kt rose gold shackle. Shackles on Gold Rings A gold ring can be made with a choice of the color of gold for the shackle. The shackle in any size can be: yellow gold, white gold or rose gold. Anyone familiar with the distinctive style of a ring with a shackle on it will easily recognize it as a Slave ring or Story of O ring, and will associate the wearer with a BDSM lifestyle or interest of some sort. The slave rings that I make are of very high quality, hand-made and polished. I make them in yellow and white gold and rose gold. I also make them in sterling silver. I regularly work in platinum as well. Please write for a quote for any ring that you want in platinum. As you can see, there is a considerable selection from which to choose. Story of O Rings are made individually to your chosen specifications. The usual time to make your ring and have it ready to ship to you is 8 business days. See the Story of O Rings made in Sterling Silver ..... here See the Story of O Rings made in Gold ..... here
I am not a registered dietician and nothing in this post should be considered dietary/medical advice. I am a Certified Personal Trainer with a specialty in Sports Nutrition. Unfortunately, the only nutritional recommendations I am allowed to make follow the government’s model… and I refuse to do that, so I refrain from giving any advice at all. The following is merely a long explanation of how I started eating this way and my experience compared to other ways of eating. The whole topic of high-fat/low-carb vs. high-carb/low-fat is very polarizing. There are die-hard fans in each camp. It is hard to discuss nutrition these days without getting into a heated debate over which is the most optimal way to eat. I, like many people, had been led to believe that a high-carb/low-fat diet was the key to optimal health. I have trained for numerous endurance events and used this model of eating with success. I have done everything from Mediterranean/bodybuilding to 80/10/10. The more research I did on high-carb diets, the angrier I became. I will put some links at the end of this post – if you do enough research you can find links to support each side of the argument. Certain things are not secret, like Monsanto (don’t know what the big deal with Monsanto is? – see links at end) and the government, the FDA and the things they allow us to consume vs. what other countries have banned. My wariness with the government’s role in our health grows with each passing year. I won’t get into a long rant, but if you aren’t already, you should be cautious of any person/group who makes recommendations with something to gain, i.e. doctors/government/Big Pharma… I’ll leave it at that. Why I chose the Keto diet for my race: As I mentioned earlier, I have used high-carb diets for long endurance events. 
This required many carbohydrate/sugar-filled supplements during the course of a race: gels, jelly beans, liquids… I heard about a lot of ultramarathoners and professional athletes switching to this diet and the improvements they had in performance. Intrigued by the idea of endless energy without supplementation, I figured what better time than now to give it a try?!?! Keto Flu – Myth or Reality?: I read about the many horrible symptoms that people were experiencing while fat adapting. I guess I was one of the lucky ones because my symptoms were relatively mild. I had a very mild brain fog and felt a little more tired than normal the first week. There was a mild headache that I couldn’t shake for a week, and my energy levels were lower than normal. I have read some horror stories and ways to avoid them. I credit my lack of flu symptoms to drinking a lot of water and upping my salt intake to keep my electrolytes in balance during the process. I pushed through the cravings and after a week, it was like a light switched on and I had a crazy abundance of energy and a totally clear mind. My mind felt so sharp. The whole experience was surreal. Training while fat adapting: I won’t sugar-coat this, training was a challenge. My energy levels were low and my body was not efficient at turning fat into fuel. This proved especially frustrating for someone training for a marathon! I also struggled mentally with the fact that the best time I ever ran a half marathon was 2:15; not training for close to 3 years brought that time to 3 hours before switching my diet, and to over 3 hours after the switch!!! UGH! I feared that I would never get better and would have to withdraw from the race. My training really sucked for a good month. This had to be one of the most frustrating months of my life. I kept on and then something magical happened. I noticed running felt back to normal and I didn’t need to supplement with gels, never hit a wall, and felt like I could go on forever. 
When I do my long runs now I am still amazed by the fact that I don’t need to take energy gels. I remember the days when I would calculate how many hours I would train and how many calories I would have to supplement during a training session. I no longer have stomach cramps/nausea after a hard training session. The whole experience has been so liberating. Benefits I have noticed: You get a new-found level of mental clarity; I never realized what kind of mental fog I lived in until I cut out the carbs. I wake up and don’t depend on coffee to get my day going, whereas before a whole pot of coffee was necessary to get moving. I can run for hours on end without needing supplementation. My mood has stabilized; I no longer get “hangry” because I never feel that overwhelming need to eat like I did before. My energy throughout the day is very stable; I do not have highs and lows, no afternoon crashes. I feel satiated at the end of a meal; in fact, there are times where I don’t feel the need to eat everything on my plate because fat is so filling. One of the most impressive benefits is the stabilization of my blood pressure – I have always had slightly elevated pressure and now my pressure reads normal if not better! Negatives of the Keto diet: Nothing is perfect, and though this way of eating comes with some amazing benefits, it isn’t without its downfalls. Your body doesn’t hold onto water, so you must drink more and also add a lot of salt to everything to keep your electrolyte levels stable. Eating out takes a lot more planning/thinking; you can’t just sit and eat pizza with friends, you have to order something else, which always brings the same conversation until people realize you aren’t changing anytime soon. If you are an athlete, you can expect a major hit to your performance for at least a month. Even though the headaches pass, it takes a while for your body to truly become fat adapted. 
Digestion/bowel movements can be tricky the first few weeks, ranging from constipation to liquid, depending on how your body adapts and what you consume. Cravings suck the first few days, and you have to get creative to re-create your favorite treats. This diet lacks a lot of micronutrient content – vitamins are necessary. If you are an athlete you will definitely want to take a potassium/magnesium supplement to help with cramping! Have you tried this way of eating? Curious? Have questions? I would love to hear from you. I will do a post-marathon follow-up in a few weeks!
When staff at the Enterprise Rent-a-Car outlet in Hawkesbury came across some items in a returned vehicle this month, they were hardly surprised. Customers forget belongings all the time. So the staff followed their usual process of storing the items for safekeeping in case the client came back for them. A few days later, however, an employee detected an unusual odour emanating from a pouch that was among the items, and decided to call the nearby Ontario Provincial Police detachment in Hawkesbury, which is one hour east of Ottawa. “Officers attended the car rental centre and seized the item, which was revealed to be approximately one kilogram of hashish, in a solid brick form,” the OPP said in a release. No arrests have been made, but police said their investigation continues.
In March, the number of applications by social workers to take children into care set a new record: 882 in a single month. Over the past year, my reports on how our highly secretive “child protection” system seems, too often, to collude in seizing children without proper justification have provoked considerable irritation in a number of judges – and last week the judiciary hit back. Mr Justice Bellamy, presiding over a case to which I have referred several times, took the unusual course of publishing a judgment in which he was highly critical of me for my “unbalanced” and “inaccurate” reporting. Then the head of the family courts, Lord Justice Wall, in his ruling on another case, swiftly endorsed Bellamy’s attack on me (despite his own earlier criticisms of the “shocking” determination of some social workers to place children in “an unsatisfactory care system”). I am not displeased that Bellamy has published his judgment, because the main part of it provides a rare opportunity to see how a judge may rely on a particular medical argument which has become increasingly controversial. But first I must deal with his criticism of my reporting. In the many hundreds of words I have written about this case, on five separate occasions, he singled out only two points as inaccurate. On one of these he was right: I was misinformed that a particular medical witness had appeared in another of Bellamy’s cases. The next day, however, the judge had to add a supplementary judgment, correcting some of what he had said. It emerged that he had made several factual errors in his references to me. These included misquoting what I had written, through reliance on a website (which he misspelt), and claiming that my articles had appeared in The Daily Telegraph. Bellamy went on, however, to use my two errors as his text for a general homily on how inexcusable it is to give a tendentious account of family cases based on a one-sided picture given by aggrieved parents. 
This might sound damning to anyone unfamiliar with the whole secretive system, but it takes no account of the extraordinary obstacles placed in the way of any journalist wishing to report fairly on it. On more than one occasion when I approached a local authority to check on the facts of what seemed a very disturbing case, the only response was to seek a gagging order prohibiting me from mentioning the case at all. When I accurately reported on one case so embarrassing to the council concerned that it eventually dropped its bid to seize a child, the judge ruled that any future reference to the case outside the court could lead to summary imprisonment. So the only recourse left to those trying to establish the facts of such cases is rigorously to test what can be learned from the few people willing to speak, and to come to as informed and judicious a view as possible. Something else came to light in Bellamy’s judgment, however, that is far more important than his criticism of me. The case before him concerned a couple who last year became so concerned that their six-week-old baby had developed a “floppy arm” that in the middle of the night they took it to hospital to be examined. X-rays showed the child had suffered a “non-displaced” fracture of the humerus. The police were summoned to arrest the parents, who were led off in handcuffs and held for hours in police cells. Coventry social workers took the child into care and the police charged the father with physically abusing his son. Now that the judge has delivered his fact-finding judgment, on the basis of which he will decide the child’s future in September, we can see, for the first time, that its injuries included not only the fractured arm but also six “metaphyseal fractures” and several marks or bruises. (“Metaphyseal” refers to the metaphysis, the part of a long bone near where it meets a joint, the part that grows in childhood.) 
All of this sounded like a very grave set of injuries, which might point to serious physical abuse. The court heard that in every other respect the couple seemed to be devoted, conscientious parents, anxious only to do the right thing by their child. But what clearly weighed most heavily with the judge were those “metaphyseal fractures”. He heard evidence from no fewer than four medical experts that metaphyseal fractures are a virtually certain sign of “non-accidental injury” (a phrase used 20 times in his judgment), implying intentional physical harm. Finding on this basis that the child had definitely been abused, Bellamy then saw it as his duty to identify the “perpetrator”. Based on the timing of the events that led to the parents rushing their child to hospital, he concluded that the main injury must have been inflicted in a brief interval when the father was out of the room, and the person responsible must have been the mother. The police, he argued, had been wrong to charge the father (a charge still awaiting trial). The judge was thus, in effect, accusing the mother of a crime. The problem with regarding metaphyseal injuries as an indicator of abuse is that in recent years ever more medical experts have strongly questioned the idea. Their studies show that metaphyseal fractures may occur in babies with soft, still-forming bones, with minimal trauma. They even question whether such injuries can be properly described as fractures at all. The real explanation, they believe, lies in a metabolic bone disease, a contributory factor to which may be a deficiency in Vitamin D (of the type which evidence showed the mother in our present case to have). Only this month a leading American expert, Dr Marvin Miller, published a major new study suggesting that “the cause of multiple unexplained fractures in some infants” might be “metabolic bone disease, not child abuse”. 
Also something of an expert on this subject is Dr James Le Fanu of this newspaper, who in 2005 published a paper in the Journal of the Royal Society of Medicine entitled “The wrongful diagnosis of child abuse: a master theory”. In another paper, “The misdiagnosis of metaphyseal fractures: a potent cause of wrongful accusations of child abuse”, he described how the theory of metaphyseal fractures as characteristic of child abuse, first advanced by Dr Paul Kleinman in the US in 1986, was taken up by a small group of radiologists in Britain who became much in demand in our courts as expert witnesses. In 2005, under the headline “Happy, loving parents? They must be child abusers”, Dr Le Fanu explained in these pages how reliance on this diagnosis in the criminal courts was already strongly contested, to the point where it became discredited. But in the family courts, he wrote – citing a case remarkably similar to the one before Mr Justice Bellamy today – the theory was unchallenged. It is certainly noticeable from Bellamy’s account of the evidence that it was all strictly according to Kleinman’s theory. The four expert witnesses he heard all came across as committed advocates of the Kleinman thesis, in arguing that metaphyseal fractures are an indicator of child abuse. For whatever reason, not one expert was called who was prepared to challenge that view. Bellamy himself said that these injuries are often regarded as “pathognomonic of abuse”, meaning they can have no other cause – seemingly unaware that there is a growing body of scientific opinion to suggest that this may not be their cause at all. The lawyers for the mother, who has effectively been accused by the learned judge of a criminal act, are said to be considering an appeal against Bellamy’s ruling. 
If so, one hopes they will take the opportunity to call expert witnesses ready to challenge this still prevailing orthodoxy, on the basis of which scores of children have been removed from loving and conscientious parents – so that the bench on that occasion can at least be given a rather less “one-sided” view. Comments on this story have been disabled for legal reasons.
It’s official: Game of Thrones won’t only go for seven seasons. Even though Thrones showrunners have repeatedly declared they only want to make seven seasons of the fantasy hit, HBO programming president Michael Lombardo for the first time firmly announced the series will go at least eight seasons. “Seven-seasons-and-out has never been the [internal] conversation,” Lombardo said to critics at the Television Critics Association’s press tour Thursday. “The question is how much beyond seven are we going to do. [Showrunners David Benioff and Dan Weiss] feel like there’s two more years after six. I would always love for them to change their minds. That’s what we’re looking at right now.” And what about a prequel series? “I would be open to anything Dan and David want to do,” Lombardo said. “It really would depend fully on what they want to do. I think you’re right, there’s enormous storytelling to be mined in a prequel. We haven’t had any conversations.” This article originally appeared on EW.com
PhDs are hard. It has been tough to find a week to mess around with software, etc., that isn’t research related. I thought I would put out at least one piece of content this year, so here goes. I have been running both a desktop and a Surface Pro X for a few years now, and see no reason why that won’t continue into the future. If I could get away with not using Windows I would, but essentially the entire point of using a Surface is OneNote, which is definitely amazing, and also totally free now (OneNote 2016 standalone download for desktop). Also, plenty of entities in academia and industry use Windows environments, so in the interest of daily productivity, it’s not going away. It’s highly unlikely that there will be remotely comparable open-source alternatives to OneNote with stylus support unless some entity like Canonical debuts some tablet hardware. Therefore, it seems likely I will be running dual-boot setups well into the future. VMs are OK, but have their limitations, regardless of choice of host OS, especially when it comes to support of cutting-edge hardware (my desktop has a PCIe NVMe drive, etc). I spent some time this spring attempting to configure a VM box with some old hardware, and can safely say that, for my purposes, dual booting is much less demanding. Other academics and engineers will likely find it useful as well. I’ve been asked multiple times about my setup, so I am writing this quick little guide for setting up a dual-boot Arch Linux and Windows configuration running on UEFI hardware that works with Secure Boot enabled (signed kernels, loader, etc). Pieces of these instructions are sourced from forums/wikis across the web, but none of it is available in one place and most of them leave out crucial steps without elaboration. This guide will make minimum assumptions regarding background knowledge, and should work for distributions other than Arch with regards to the bootloader setup. I will cite where it makes sense to. 
I will provide instructions for both single disk (Linux and Windows on the same disk) and multi disk (Linux and Windows on different disks) setups. The only significant difference is the bootloader used. Some helpful webpages: In summary, I am outlining two schemes here that are not completely addressed elsewhere: dual boot Arch/Windows on a Surface Pro X with Secure Boot enabled, and dual boot Arch/Windows on a desktop with Arch on a SATA SSD and Windows on a PCIe NVMe drive. Both schemes use systemd-boot only. A Microsoft-signed bootloader and hashing tool, available here, enable compatibility with Secure Boot. All loaders sit on the same ESP (EFI partition). This should also give people ideas on what they might have to do with other, more exotic configurations. For a more general overview of loading OSs and kernels in a UEFI system, see this resource written by Rod Smith.
1 Overview
Based on past experience, I prefer to install Linux first and on the first few partitions, and then install Windows afterwards. I mostly do this to have explicit control over the size and order of partitions (see the partitioning section below). For single disk PCs, Windows 10 should find your existing EFI partition and add its bootloader and some recovery tooling to /boot/EFI. For multi disk PCs, Windows 10 will be installed to a separate drive and, presumably, have no knowledge of the Linux drive, which will be used as the boot drive. The multi disk configuration has worked for me with a variety of bootloaders (GRUB 2, rEFInd) in the past, but with Windows 10 sitting on an Intel 750 PCIe drive, only rEFInd, and with some tinkering, has worked well. If Secure Boot functionality is desired, the setup will vary depending on motherboard vendor. If running on Surface Pro hardware, there is no option to enable Secure Boot out of the box for alternate operating systems (Linux, etc). 
One must either use the Microsoft-prepared loader and hash tool (to generate signatures for kernels), or wipe the keys and certificates that ship with the platform and generate one's own (see the Creating Keys section of the Secure Boot entry in the Arch wiki). I see no reason why, for personal use, using the Microsoft loader and hash tool is undesirable, so that is the method outlined here. Regarding installation media, I use a live USB of the latest Arch build (see this entry on bootable media) and a bootable USB of Windows 10, which can be created either from an existing Windows 10 installation or by downloading the ISO from here. Note that this requires a Windows 10 OS key for full activation. From Windows, Rufus can be used to create bootable media. There are a plethora of methods to create a live USB of either OS from common Linux distros, from using packages to manual partitioning and file copying. Regardless of which hardware platform is being used, start by turning Secure Boot OFF. It is not guaranteed that the Linux distro live ISO image will have correctly signed loaders. Some firmware interfaces, like on my ASUS X99-A motherboard, allow switching the Secure Boot option to “Other OS”, which presumably allows non-vendor-signed kernels/modules to load. However, on my Surface Pro 3, there is no such option. Disabling Secure Boot completely during initial installation will likely minimize headaches. Additionally, disable Windows Fast Boot. It is a type of hibernation mode and may lead to headaches when you are jumping around between operating systems.
2 Partitioning
The installation starts with the Linux distro. On the Surface Pro, insert the live USB and boot by holding volume down, then the power button. Most Linux distributions have live ISOs with GUI installers. Arch does not, so if this is your first time with it, keep the Beginner's Installation guide handy. Start by navigating to your motherboard's BIOS and turning Secure Boot off. Then, load the live USB. 
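Once the live environment is up, it is worth confirming that it actually booted in UEFI mode before touching any partitions, since this whole guide assumes a UEFI install. A minimal sketch (this is a standard check, not a tool from this guide — the efivars directory only exists under UEFI firmware):

```shell
# Confirm the live environment booted in UEFI mode before partitioning.
# /sys/firmware/efi/efivars is only populated by UEFI firmware.
if [ -d /sys/firmware/efi/efivars ]; then
    mode="UEFI"
else
    mode="legacy"
fi
echo "Boot mode: $mode"
```

If this reports legacy mode, go back into the firmware settings and disable CSM/legacy boot before continuing.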
2.1 Single Disk
Most live ISOs will have gparted available for use. If not, you can make a gparted live USB with this image. The following can be done either in the gparted GUI or from the available Linux shell:

$ parted /dev/sdx                                 # on Surface Pro this will be /dev/sda
(parted) mklabel gpt                              # say Yes to the prompt
(parted) mkpart ESP fat32 1MiB 512MiB
(parted) set 1 boot on
(parted) mkpart primary linux-swap 512MiB 17GiB   # I have 16GB DRAM
(parted) mkpart primary ext4 17GiB 40%
(parted) mkpart primary ntfs 40% 60%
(parted) quit

That last entry is not for Windows; it is for a shared NTFS filesystem that both operating systems use (for Dropbox, email, downloads, etc). The Windows section is left unallocated; it will be formatted by the Windows installer. Afterwards, format the partitions as needed:

$ mkfs.fat -F32 /dev/sdx1
$ mkswap /dev/sdx2
$ swapon /dev/sdx2
$ mkfs.ext4 /dev/sdx3
$ mkfs.ntfs -f /dev/sdx4

FIGURE 1: Single disk partition scheme for dual booting in UEFI mode. Works on, for example, the Surface Pro line.

2.2 Multi Disk
With a multiple disk environment, I run Arch on one drive, Windows on another and shared storage on yet another drive (or two (or three…)). So partitioning the Arch drive would look like this:

$ parted /dev/sdx
(parted) mklabel gpt                              # say Yes to the prompt
(parted) mkpart ESP fat32 1MiB 512MiB
(parted) set 1 boot on
(parted) mkpart primary linux-swap 512MiB 17GiB   # I have 16GB DRAM
(parted) mkpart primary ext4 17GiB 100%

Make sure to format the partitions as shown above.

3 Arch Installation and Bootloader (systemd-boot)
The rest of the installation should proceed exactly as in the Arch Beginner's Guide until the last few steps or so. Please do everything on that page, except the "Boot Loader" and "Unmount/Reboot" sections, as this is covered here. After completing the other installation tasks, do not exit the mounted disk's root shell. If you restart at this point you will have a bad time. 
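One step from that guide worth calling out explicitly, because the bootloader installation that follows assumes it: the new filesystems must be mounted with the ESP at /boot before anything is installed. A minimal sketch for the single-disk scheme above (the /dev/sdx* names are placeholders from the partitioning layout — adjust them for your system):

```shell
# Mount the root partition first, then the ESP at /mnt/boot, following
# the single-disk partition numbering above (1 = ESP, 3 = root).
mount /dev/sdx3 /mnt
mkdir -p /mnt/boot
mount /dev/sdx1 /mnt/boot

# Sanity check: both mount points should be listed.
findmnt /mnt /mnt/boot
```

With this layout, everything written to /boot inside the chroot lands directly on the ESP, which is what lets systemd-boot find the kernels later.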
The bootloader needs to be installed. Go ahead and enter the following:

$ pacman -S efibootmgr efitools
$ bootctl --path=/boot install

Systemd-boot does not automatically detect Linux kernels, but it does automatically detect the Windows loader (bootmgfw.efi, I believe). Custom entries must be made for Linux kernels as well as tools like MemTest86+. Simply edit the following files:

$ vi /boot/loader/loader.conf
### file contents
default arch
timeout 4
editor 0

Then, add the boot entry for the Linux kernel. You can also install the Intel microcode updater and load it before the initramfs (recommended):

$ pacman -S intel-ucode
$ blkid    ### NOTE THE PARTUUID FOR THE / PARTITION ON YOUR ARCH DRIVE
$ vi /boot/loader/entries/arch.conf
### file contents
title Arch
linux /vmlinuz-linux
initrd /intel-ucode.img
initrd /initramfs-linux.img
options root=PARTUUID=THE-NUMBER-GIVEN-BY-BLKID rw

The last step is to sign the necessary items so that they are compatible with the present Secure Boot keys. This downloads a Microsoft-signed loader and hashing tool (so that you can generate compatible hashes for any kernel, tool, etc). Also, the standard systemd bootloader has to be copied to the name PreLoader expects (the last step below):

$ cd /boot/EFI/systemd
$ wget http://blog.hansenpartnership.com/wp-uploads/2013/PreLoader.efi
$ wget http://blog.hansenpartnership.com/wp-uploads/2013/HashTool.efi
$ cp /boot/EFI/systemd/systemd-bootx64.efi /boot/EFI/systemd/loader.efi

Now the boot order in the motherboard NVRAM has to be changed so that PreLoader.efi and HashTool.efi are a) present, and b) in the correct order:

$ efibootmgr -c -d /dev/sdx -p 1 -L PreLoader -l /EFI/systemd/PreLoader.efi
$ efibootmgr -c -d /dev/sdx -p 1 -L HashTool -l /EFI/systemd/HashTool.efi
$ efibootmgr -v    # see the order and note the codes for each item
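One maintenance note, not from the original setup: a pacman upgrade of the systemd package replaces systemd-bootx64.efi but knows nothing about the loader.efi copy made above. A sketch of a pacman hook that refreshes both after each systemd upgrade (the hook path and file name are my choice; /etc/pacman.d/hooks/ is the standard location for user hooks):

```ini
# /etc/pacman.d/hooks/95-systemd-boot.hook  (hypothetical name)
[Trigger]
Type = Package
Operation = Upgrade
Target = systemd

[Action]
Description = Updating systemd-boot and refreshing the PreLoader copy
When = PostTransaction
Exec = /usr/bin/sh -c 'bootctl update; cp /boot/EFI/systemd/systemd-bootx64.efi /boot/EFI/systemd/loader.efi'
```

Since the refreshed loader.efi has a new hash, HashTool will prompt you to enroll it again on the next boot.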
$ efibootmgr -o 000P,000H,000L

where P, H, L are the hex codes for the PreLoader.efi, HashTool.efi, and Linux Boot Manager entries, respectively. After this is done, you can exit the root shell, unmount the installation drive ($ umount -R /mnt), and reboot. Go into your BIOS and turn Secure Boot back on.

On the subsequent reboot, you should land in a blue screen prompting you that no signed binaries were found and that you need to hash them using the tool. The following items MUST be hashed for the Linux kernel to boot correctly:

/boot/EFI/systemd/loader.efi
/boot/vmlinuz-linux (or any other kernel that you want to boot)
Older versions of things like memtest.efi (newer memtest releases have been signed by Microsoft)

You can navigate back through folders using the ../ option. If you forgot to sign something and cannot boot, don't worry: load up the live USB (you have to disable Secure Boot first), set HashTool.efi to be the first boot entry using efibootmgr, and re-sign as needed.

4 Windows

Install Windows normally, on either the unallocated space of the single disk setup or on a separate disk. Remember to disable Fast Boot (in "Power Options") before you restart for the first time. Systemd-boot will not be able to load Windows through PreLoader.efi if the Windows boot manager is not on the same ESP; this is a trivial fix. For the single disk environment it is not an issue: Windows will see the ESP and install its manager to /boot/EFI/Microsoft/.. . For the multi disk environment, you can simply copy the manager from the other disk(s) to the first ESP.
For example, I have Windows installed on an NVMe drive:

$ lsblk
NAME        MAJ:MIN RM   SIZE RO TYPE MOUNTPOINT
sda           8:0    0 931.5G  0 disk
└─sda1        8:1    0 931.5G  0 part /home/USER/STORAGE_DISK
sdb           8:16   0 223.6G  0 disk
├─sdb1        8:17   0   512M  0 part /boot
├─sdb2        8:18   0 191.1G  0 part /
└─sdb3        8:19   0    32G  0 part [SWAP]
sdc           8:32   0 465.8G  0 disk /home/USER/OTHER_STORAGE_DISK
sr0          11:0    1  1024M  0 rom
sr1          11:1    1  1024M  0 rom
nvme0n1     259:0    0 372.6G  0 disk
├─nvme0n1p1 259:1    0   450M  0 part
├─nvme0n1p2 259:2    0   100M  0 part
├─nvme0n1p3 259:3    0    16M  0 part
└─nvme0n1p4 259:4    0 372.1G  0 part

The boot partition is clearly /dev/nvme0n1p2 (the Windows partitions are recovery, boot, reserved, and filesystem), so we can mount it and copy the relevant files over:

$ sudo mount /dev/nvme0n1p2 /mnt
$ sudo cp -R /mnt/EFI/Boot /boot/EFI/
$ sudo cp -R /mnt/EFI/Microsoft /boot/EFI/
$ sudo umount /mnt

For the single disk environment, Windows will likely overwrite your existing boot order in NVRAM to place its boot manager first. This is an easy fix: load up the live USB again (disable Secure Boot first) and change the boot order back so PreLoader.efi is first:

$ mount /dev/sdx3 /mnt         # the Arch / partition
$ mount /dev/sdx1 /mnt/boot    # the Arch /boot partition
$ arch-chroot /mnt /bin/bash
$ efibootmgr -o 000P,000H,000L,000W
$ exit
$ umount -R /mnt
$ reboot

where P, H, L, and W are the hex codes for the PreLoader.efi, HashTool.efi, Linux Boot Manager, and Windows Boot Manager entries, respectively. Again, you can check the order with $ efibootmgr -v. The same order is desired in the multi disk environment: we want to be able to sign new kernels, etc, upon restart.

After restarting, you should be booted into the systemd-boot menu. I like to make Windows the default entry because of automatic updates that require multiple restarts; this can be done by moving the cursor to the Windows entry and pressing "d" in the systemd-boot menu at startup.
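Systemd-boot should pick up the copied bootmgfw.efi on its own, but if it does not appear in the menu, an explicit entry can pin it. This is an optional sketch in the same style as arch.conf above; the title is my choice:

```
$ vi /boot/loader/entries/windows.conf
### file contents
title Windows
efi /EFI/Microsoft/Boot/bootmgfw.efi
```

The path is relative to the ESP root, i.e. the same /boot/EFI/Microsoft/.. location the files were copied to.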
5 Conclusion

Hopefully this was a straightforward, non-mysterious guide to setting up dual boot with Arch Linux and Windows for common disk configurations. It should work with any other Linux distribution with minimal tweaking. Maybe one day I'll care about having a sexy bootloader (rEFInd + themes), but the ease of setup for systemd-boot is attractive. From my experience with rEFInd I don't see why this method (copying the loader from the NVMe disk over to the ESP on the Arch disk) wouldn't simply work, but that's a project for another day.

Lastly, most desktop motherboards have their own boot menus now. If you don't switch operating systems often, it may be sufficient to do two separate installs, spam F8 on reboot (or whatever the key is for your BIOS), and load the correct disk. Regardless, if you want Microsoft-signed Secure Boot to work with your Linux OS, you need to install the signed loader, etc, as outlined above. The Surface Pro line, unfortunately, offers no such menu. Also, if I want to tinker, I can enable the boot options editor (change the line editor 0 to editor 1 in /boot/loader/loader.conf) and add cool words like "quiet" and "splash".
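Concretely, the tinkering mentioned above amounts to two small edits, shown here as a sketch (the PARTUUID placeholder is carried over from the earlier arch.conf; "quiet" and "splash" are just example kernel command-line options):

```
$ vi /boot/loader/loader.conf
### file contents
default arch
timeout 4
editor 1

$ vi /boot/loader/entries/arch.conf
### options line only
options root=PARTUUID=THE-NUMBER-GIVEN-BY-BLKID rw quiet splash
```

With editor 1, pressing "e" in the systemd-boot menu lets you edit these options for a single boot without touching the file.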
Index for "Baseball Analyst" Nos. 1-40, 1982-1989

Read Rob Neyer's introduction of the Baseball Analyst archives. Click on an issue below to download the text-searchable PDF file. Special thanks to Bill James and Phil Birnbaum for making these files available. Index compiled by Stephen Roney, Charles Pavitt, Pete Palmer, Cliff Blau, Bill Deane, Dan Heisman and Barry Mednick.

Introduction, by Bill James, p. 1.
Ballpark Effects on the Production of Infield Errors and Double Plays, by Paul W. Schwarzenbart, p. 2-7. Using data from the NL from 1972, '73, '76, '78, '79, and '80, presents home/road rates and ratios for infield and outfield errors and double plays, and analysis of the results.
The Distribution of Runs Scored, by Dallas Adams, p. 8-10. Using data from 1967-76 from both leagues, presents the % of games teams score a given number of runs, from 0 to 19+. Teams are lumped into 11 groups based on average runs per game. A graph presents the same data as the table. Two formulas are given to predict the distribution of runs for any team given its average scoring rate: one for 5 or fewer runs in a game, the other for 6 or more.
Nolan Ryan's Fifth No-Hitter, by Tom Jones, p. 11. Account from the box score of the 9/26/81 game.
Wins and Losses For All Players, by Mark D. Pankin, p. 12-14. Describes a method of assigning wins and losses to pitchers and hitters. No statistics or actual examples given. Wins are given only to players on the winning team and vice versa.
Home Runs - A Matter of Attitude, by Robert Kingsley, p. 15-20. Examines the reasons more home runs are hit at some parks than others, using Atlanta Fulton County Stadium and Busch Stadium for illustrative purposes. Concludes the 3 main factors are physical dimensions, climate, and hitter attitude. Statistics given for home and road homers for and against the Braves and Cardinals for 1976, '77, and '78.
Introduction, by Bill James, p. 1-2.
Some Patterns of Age and Skill, by Dallas Adams, p. 3-9. Data presented for all non-pitchers who finished their careers between 1901 and 1968. Gives data for each age from 16 to 46: total games, % of games, number of players, games per player, number of players playing 95 or more games, and % playing 95 or more games. The next table gives the same data for Hall of Famers included in table 1; table 3 gives the same data for catchers included in table 1. Much of these data are presented in graph form, too.
Ballpark Effects on Fielding Performance: Further Evidence, by Craig Wright, p. 10-12. Uses NL data from 1976-80 to compare double play rates on grass versus artificial turf. Speculates that ground balls are more prevalent on grass fields and thus the effect on fielding average is slightly less than the effect on errors.
Run Production by Batting Order Position, by Dick O'Brien, p. 13-14. For 1981, presents RBI by batting order position and % of RBI by batting order position. The same info is also given for pinch-hitters and other subs. Compares actual RBI performance for a select group of number 3 and 4 hitters to their expected RBI. Detailed comments on Steve Garvey and Dave Winfield.
Clutch Hitting, by Dick O'Brien, p. 15-16. Gives average runs per HR for each batting order position and compares the % for each to the cleanup slot. Based on players with at least 100 HR up to 1974. Gives runs per HR for about 3 dozen sluggers.
In Search of the "True" Slugging Percentage, by Jim Morrow, p. 17-19. Presents a formula for runs produced based on the average number of runs scored and driven in per each type of hit as well as walks. Its correlation with actual runs is slightly less than slugging average's.
The Effects of Overwork on Young Pitchers, by Dallas Adams, p. 20. Concludes that overworking rookie pitchers significantly shortens their careers. Presents a table of innings pitched in each of the first four years for 3 groups based on number of games pitched as a rookie.
More on the "True" Slugging Percentage, by Jim Reuter, p. 2-6. Responding to an article by Jim Morrow in the previous issue, and measuring the value of home runs and doubles in "true" slugging percentage.
Batting Average Comparisons, by Ward Larkin, p. 7-13. Studying batting average in different eras.
Comment on "Effects of Overwork of Rookie Pitchers", by Ward Larkin, p. 11. Responding to an article by Dallas Adams in the previous issue.
Effects of Overwork on Rookie Pitchers, Part II, by Dick O'Brien, p. 14-15. Studying if age has any effect on the conclusions made in Dallas Adams' article in the previous issue.
Player Development Study, by Craig Wright, p. 16-19. Responding to a 1981 study by Bill James on farm systems and the value of their performance, and the effect of the Major League Scouting Bureau.
Comment on Larkin, "Batting Average Comparisons", by Bill James, p. 20. Responding to Ward Larkin's article on page 11 of this issue.
Introduction, by Bill James, p. 2-3.
Measuring Relief Performance, by John Billheimer, p. 4-12. Evaluates relief pitchers in 1981 according to Prorated ERA (PERA), which charges relievers for a share of inherited runners they allow to score, and Slips (similar to blown saves). Gives stats for teams and individuals.
Some Additional Aspects of the Distribution of Runs Scored, by Dallas Adams, p. 13-16. Updates the article from Issue 1 with data from 1977-81. Also discusses the probability of scoring a given number of runs in an inning or per out.
Commentary on prior article, by Bill James, p. 17. Compares the actual # of times the Yankees and A's scored a given # of runs in an inning to theoretical results using Adams' formulas.
A New Look at "Hard Luck" Pitchers, by Mark Lazarus, p. 18-22. Gives team errors behind starting pitchers for 1982. Also breaks down overall errors by position and for left/right handed pitchers.
Thoughts on Isolated Power, by Jim Reuter, p. 23-24. Introduces Adjusted Isolated Power (AIP), which equals slugging average minus batting average, divided by (1 minus batting average); also equal to extra bases per out. Gives the top 20 career leaders and the top 10 for the first four 20-year periods of this century.
Introduction, by Bill James, p. 2. Philosophies of publication and plea for material.
Effect of Batting and Pitching Changes on Team Change In Won-Lost Record, by Dick O'Brien, p. 3-5. Examination of 1969-82 teams which improved or declined by at least five games from one year to the next. Were they effects of improvements/declines in hitting, pitching, or both? Are the results different when separated by teams in hitters' or pitchers' parks?
Home Park Factors, by Jim Reuter, p. 6. General formula to adjust a player's statistics based on his home park.
Balls and Strikes, by Pete Palmer, p. 7-8. Examination of pitch-by-pitch data from 1974-77 post-season series. What is the effect of the ball-strike count on a batter's batting, on-base, and slugging averages? How can this data be used to evaluate strategies such as pitch-outs, sacrifices, or swinging at 3-0 pitches?
Some Further Aspects of the Distribution of Runs Scored, by Dallas Adams, p. 9-20. Continuation of an article in the previous issue. Examines the probabilities of teams scoring and/or winning based on score, inning, and number of outs. Includes many equations, tables, and graphs.
Ballpark Effects on Fielding Statistics -- American League Parks, by Paul W. Schwarzenbart, p. 2-5. Follow-up to article in issue #1. Uses various data between 1972-80 to examine the effect of different AL parks on errors (by both infielders and outfielders) and double plays. Key point of discussion is the effect of grass vs. turf fields.
Quality Versus Quantity, by Dan Heisman, p. 6-9. What is the effect of quantity (i.e., career value) vs. quality (peak value) in Hall of Fame voting? In actual value to a player's team? Includes examination of the best five years of 31 past and present stars. Editor's comment on p. 20.
Team Won/Lost Percentage as a Function of Runs and Opponents' Runs, by Dallas Adams, p. 10-12. Follow-up to articles in previous issues. Uses 1967-76 game scores to determine a team's probability of winning based on number of runs scored and allowed. Includes tables and equations.
Adjusted Home Park Factor, by Pete Palmer, p. 13-16. Computation of home park effect, with commentary on Reuter's article in the previous issue. Includes a table showing home/road data in runs and home runs for 1977-82. Editor's comment on p. 20.
Evaluating Pitchers' Performances, by Cuthbert Magnolia, p. 17-19. Tongue-in-cheek article with serious formulas for rating starters and relievers, by comparing them with pitchers in their league and on their team.
New Editor's Notes, by Jim Baker, p. 2. Introduction from the publication's new editor, and plea for material.
Letter, by Mrs. B. Kwandrie, p. 3, 8. Letter to the editor, apparently fictitious and created to encourage real letters to the editor.
Run Production By Batting Order Position -- Part II, by Dick O'Brien, p. 4. Follow-up to article in August 1982 issue. Examination of 1978-82 box scores to reveal individual runs scored and RBI by position in the batting order.
On the Probability of Hitting .400, by Dallas Adams, p. 5-7. Examination of the probability of an individual batting .400 for a season, based on the league batting average. Includes graphs.
A Trend Analysis of Batting Averages, by Gary T. Brown, p. 8-17. Examines league batting averages over the years, illustrated by graphs. Offers explanations for changes over the years, and projections for the future.
The Chalmers Award Born Anew!, p. 13. Announcement of a mythical MVP-type award sponsored by the publication, and voted on by the readers.
Assigning Relative Values To Relief Wins, Losses and Saves, by John Schwartz, p. 18. Offers an alternative method of rating relievers, weighting wins, losses, and saves based on the frequency of each.
Distribution of Runs, by Pete Palmer, p. 19-20. Analyzes the distribution of runs, citing many previous studies on this subject. Examines 1980-82 AL data in testing formulae, and illustrates the effect random chance can have on statistics.
Editor's Note, by Jim Baker, p. 2. Satirical "tabloid" style story about the 1983 World Series.
Letter from Dallas Adams, p. 4. Discussing distribution of runs scored and team scoring efficiency.
Introducing Project Scoresheet, by Bill James, p. 5-6. Proposes the organization which collected play-by-play data for all games for several seasons.
Scoring Sequences, by Barry L. Mednick, p. 7-8. Based on 3 weeks' worth of games by the Giants and A's. Gives the % of time a given offensive event was associated with a run.
On Handedness and Pitchers' Fielding, by Warren Johnson, p. 9-14. Gives range factors for 1982 ERA title qualifiers and discusses the reasons for differences between leagues and righties/lefties. Range factor used here is chances accepted per batted ball out.
Pitchers' Range Factors, by Clem Comly, p. 15. Gives leaders and trailers for 1982 and the period 1974-82. Range factor used here is chances accepted per 27 batted ball outs.
Power Hitters' Strikeout/Home Run Ratios, by Dick O'Brien, p. 16-17. For the periods 1920-82 and 1970-82, for all players driving in at least 100 runs in a season, groups hitters by strikeout/HR ratio. Gives the percentage of total for each of 6 groups. For the latter period, gives the K/HR ratio for two other groups: those with at least 20 HRs but fewer than 100 RBIs, and those with at least 20 HRs but fewer than 70 RBIs. These two groups had higher ratios than the first group.
On Foul Balls, by David Aceto, p. 18-19. Discusses the theoretical effect on batting averages of foul outs. No actual data given.
The Left Handed Hitter's Advantage, by John Schwartz, p. 19. Gives the distance between home and the other bases that must be run by left and right handed hitters. Lefties have a 5% advantage running to first.
Humorous review of fictional film about Max Patkin, by John Borkowski and Jim Baker, p. 20.
Editor's Note, by Jim Baker, p. 2.
Review: "The Hidden Game of Baseball". Book review.
The Best Fielding Second Baseman Since 1925, by Dan Finkle, p. 4-8, 19. Mazeroski, Bill; Schoendienst, Red; Doerr, Bobby; Gordon, Joe; Trillo, Manny; Critz, Hugh; Gehringer, Charlie; Melillo, Oscar; Fielding; Second Baseman; Fielding Index; Double Plays; Errors.
The Worst, by Joe Ferrere, p. 9. On John Gochnaur.
Functions for Predicting Winning Percentage from Runs, by Charles Hofacker, p. 10-16. Won-Lost Record; Run Production; Pythagorean Projection.
Assists versus Strikeouts, by Barry Mednick, p. 17. Assists; Strikeouts; Range Factor.
An Analysis of Win Percentage, by Bill Deane, p. 18-19. Carlton, Steve; Guidry, Ron; Won-Lost Record; Wins Above Team; Pitcher Performance Percentage.
Old Baseball Pie, by Mike Ross, p. 2-3. Recipes; Humor.
A Critique of "The Best Fielding Second Basemen since 1925", by Dick O'Brien, p. 4-5. Fielding; Second Baseman; Double Play.
The Contenders During the Pressure Month - September, by Jack Carlson, p. 6-11. 1983; Pennant Race.
Minors 1983: Pitcher's or Hitter's Leagues, by Larry Smith, p. 12. 1983; Minor Leagues.
A Comparison of Baltimore's September Pennant Drive With Their Performance During the Rest of the Season, by Barry Mednick, p. 13-14, 17. 1983; Pennant Race; Baltimore Orioles.
The Difference Between Night and Day, by Paul Schwarzenbart, p. 15-17. Day/Night.
Commentary on Schwarzenbart's "Night and Day", by Bill James, p. 18. Day/Night.
A National League Rating System, by Joe Levy, p. 19-20. Ratings; Evaluations.
Mays vs. Aaron: A New Look, by Bill Deane, p. 4-6. Statistics: "Production"; "Star"; "SuperStar"; Home/Road HR comparison. Topics: Comparing Aaron and Mays career stats in many categories.
The Best Fielding Third Basemen Since 1925, by Dan Finkle, p. 7-9, 20. Consecutive years of selection; Range, DP, Fly Ball, and Misplay Index; Overall Index. Named person(s): Buddy Bell, Mike Schmidt, Brooks Robinson, Ron Santo, Eddie Mathews, Sal Bando, Clete Boyer, Rodriguez, Ken Keltner, Kamm, Whitney, Harland Clift, Graig Nettles. Comparing modern 3rd basemen on a variety of fielding stats.
Statistical Procedures for Baseball Research I: Correlation and Simple Regression, by Charles Pavitt and Elaine M. Gilby, p. 10-15. Correlation computations for pitchers. Named person(s): Steve Carlton, Bob Knepper, Phil Niekro, Rick Rhoden, Steve Rogers, Dick Ruthven, Tom Seaver, Mario Soto, Bob Walk, Bob Welch. Using correlation and linear regression to find the relationship between baseball stats.
Minor League Effects on Major League Pitching Performances, by Terry Bohn, p. 16-17. Table of poor, average, and star pitchers with below-AAA and AAA stats compared. Looking for patterns as to whether minor league success for pitchers indicated success at the major league level.
On the Importance of Getting the Leadoff Batter on Base, by Chuck Waseleski, p. 18-19. Red Sox and opponents' 1983 runs scored (number and frequency) vs. whether the batter reached first; runs in a game vs. the number of innings the leadoff batter got on; base reached by leadoff batter vs. number of runs scored. Looking at how teams performed in runs scored vs. how the leadoff batter performed.
The Metrodome and Home Runs, by Terry Bohn, p. 4. Metrodome; Home Runs; Park Factor.
Is Artificial Turf More Offensive?, by Robert S. Smith, p. 5-10. Artificial Turf; Park Factor.
The Pythagorean Theorem and Twenty-Two Recent Managers, by David F. Hoppes, p. 11-13. Weaver, Earl; Lasorda, Tommy; McNamara, John; Managers; Pythagorean Projection.
Four-Decade Candidates for 1990, by Daniel Greenia, p. 13. Four-Decade Players; Longevity.
When Games End and What That Tells Us, by Clem Comly, p. 14-17. Simulations; Line Up; Pinch-Hitting; Run Production.
Criteria for Hall of Fame Selection, by Dan Rappoport, p. 18-20. Hall of Fame; Runs Batted In.
Letter-comment on "Statistical Procedures for Baseball Research I: Correlation and Simple Regression", by Dan Rappoport, p. 4-5.
Letter, by Charles Pavitt, p. 5. On Double Plays.
Dan Greenia's Freak Show: Stealing Contribution, by Dan Greenia, p. 5. On Stolen Bases.
Some Comments on the Benefit of Getting the Leadoff Batter on Base, by Charles Hofacker, p. 6, 16. Leadoff; Run Production.
"The Natural": Strictly Artificial, by Chris Martens, p. 7. Review: "The Natural".
Joe D And The Halo Effect, by Bill Deane, p. 8-9. DiMaggio, Joe.
The Righetti Decision - A Historical Perspective, by Art Springsteen, p. 10-16. Righetti, Dave; Relief Pitcher.
One Adjustment to the Range Factor, by Tim Marcou, p. 17-19. Range Factor.
Base on Balls Abstract, Part I, by Craig Wright, p. 4-9. Table of Dodger stats for the three years they scored more than 100 runs more than any other team in the league ('49, '50, '53); table of league offensive stats broken down by top, middle, and lowest players in unintentional BB; plate appearance distribution for bases empty (not leading off), leading off an inning, runners in scoring position, and men on first only vs. hi/med/lo BB men for 1983. Several other tables with similar distribution information showing the value of the BB, the effect of BB ability to do "what is most valuable in a situation", clutch-situation ability, comparing individual players in Late-Inning Pressure situations, etc. Table of "Situational Stats for '83 AL players with 400+ PA separated into non-intentional BB average into three equal groups". Research of walk values (de-bunking myths about BB, etc.).
Unforgettable Dizziness, by Mike Kopf, p. 10-11, 17. Review of G.H. Fleming's book, "The Dizziest Season" (about 1934 NL St. Louis).
Batter's Offensive Wins and Losses, by Neil Munro, p. 12-17. Runs Created formulas; 1920 AL leaders in Offensive Wins; 1993 NL and AL leaders in Offensive Wins Above .500; career leaders in Offensive Wins Minus Offensive Losses; season leaders in Offensive Wins Minus Offensive Losses. Using Runs Created to convert to Offensive Winning Percentage; calculating the number of outs in a game. Notes: Some nice OW% info, including historical; discussion of outs.
Using Runs Created to convert to Offensive Winning Percentage; Calculating the Number of Outs in a Game. Notes: Some nice OW% info, including historical; discussion of outs. Distribution for Players' Offensive Performance Statistics by Charles Hofacker, p. 18-20 Formula for performance deviating from League Mean; Figures: Frequency Polygon for HR/AB, frequency of players with a certain percentage of HR, Frequency Polygon for H/AB, frequency of players with a certain BA, Frequency Polygon for 1B/AB, frequency of players with a certain percentage of 1B/AB. (Based on 1982 season? - not clear) Named Person(s): Ward Larkin (10/82 Baseball Analyst) Topics: Determining the relative frequency of certain events (and what is the curve - normal, etc.?) Notes: mostly mathematical; comparing large amounts of data with theoretical curves. Base on Balls Abstract, Part II by Craig Wright, p. 4-18 Walks in Career Context - 4A) 24 Individual Tables of (Walk Ave/League Walk Ave) by Age - 4B) Summary Charts for groups of 10 (Lo, Med, Hi) of (Walk Ave/League Walk Ave) by Age - 4C) Tables with 1982 and 1983 offensive stats (AB, H, BA, etc.) for sets of players with different walk frequencies - 4D) Table comparing two similar players who came up at age 20 but one walked more, and how they did in their 1st 3 seasons - 4E) 8 Tables of (Walk Ave/League Walk Ave) by Age, 4 each for careers with rising BB% and steady BB% - 4F) Summary Charts for the 2 sets of 4 each given in 4E for (Walk Ave/League Walk Ave) by Age, and for (HRC%/ HRC% of league) - 4G) Table of Pete Rose's performance in years when he walked frequently vs. other years - 4H) Table of Ernie Whitt's performance in years when he walked frequently vs. other years by Craig Wright, p. 4-18 Walks in Career Context - 4A) 24 Individual Tables of (Walk Ave/League Walk Ave) by Age - 4B) Summary Charts for groups of 10 (Lo, Med, Hi) of (Walk Ave/League Walk Ave) by Age - 4C) Tables with 1982 and 1983 offensive stats (AB, H, BA, etc.) 
for sets of players with different walk frequencies - 4D) Table comparing two similar players who came up at age 20 but one walked more, and how they did in their 1st 3 seasons - 4E) 8 Tables of (Walk Ave/League Walk Ave) by Age, 4 each for careers with rising BB% and steady BB% - 4F) Summary Charts for the 2 sets of 4 each given in 4E for (Walk Ave/League Walk Ave) by Age, and for (HRC%/ HRC% of league) - 4G) Table of Pete Rose's performance in years when he walked frequently vs. other years - 4H) Table of Ernie Whitt's performance in years when he walked frequently vs. other years Largest Metropolitan Areas with No Local Major League Baseball by Daniel Greenia, p. 10 Table of Metropolitan population of largest areas without ML Teams (shows teams within 50 miles); also lists 4 smallest areas with teams. Large Markets without Teams; Small Markets with Teams by Daniel Greenia, p. 10 Table of Metropolitan population of largest areas without ML Teams (shows teams within 50 miles); also lists 4 smallest areas with teams. Large Markets without Teams; Small Markets with Teams Hall of Fame Candidates by Daniel Greenia, p. 18 Top 3 in HR, RBI, BA, R, H, SL%, G, TB, BB who are not in the HoF, though eligible. Highest ranking players in offensive categories who are not in the HoF but eligible. by Daniel Greenia, p. 18 Top 3 in HR, RBI, BA, R, H, SL%, G, TB, BB who are not in the HoF, though eligible. Highest ranking players in offensive categories who are not in the HoF but eligible. Some Further Aspects of the Distribution of Runs by Dallas Adams, p. 19-20 On Home Team's Probability of Losing when the game enters extra innings and on Home Team's Probability of losing when it enters the bottom of an extra inning trailing by "I" runs or tied 8. Topics: Probabilities of a team winning (or losing) in extra innings (follow-up on Feb 83 Baseball Analyst paper by Adams) Letters p. 
1-3 Letter from Charles Pavitt responding to Dan Rappoport's critique of his article in an earlier issue on correlation and simple regression. Also a fictional letter from Cuthbert Magnolia on various theories in "savvymetrics". p. 1-3 Letter from Charles Pavitt responding to Dan Rappoport's critique of his article in an earlier issue on correlation and simple regression. Also a fictional letter from Cuthbert Magnolia on various theories in "savvymetrics". Base on Balls Abstract, Part III by Craig Wright, p. 4-10 Table of Burt Shotton's effect on his clubs' SB, BB, p%; HR. Learning to Walk (Manager's effect on clubs propensity to take BB) by Craig Wright, p. 4-10 Table of Burt Shotton's effect on his clubs' SB, BB, p%; HR. Learning to Walk (Manager's effect on clubs propensity to take BB) Percentage Baseball Reconsidered: Model and Method by Charles Pavitt, p. 11-16 Tables of Offensive States (e.g. Runners on base vs. outs; Markovian matrix of which states can turn into which other states: "First Order Transition Matrix for On-Base Out/Situations" Playing the Percentages, Probablilities of Scoring Runs, Multi-Order Finite State Transition Model by Charles Pavitt, p. 11-16 Tables of Offensive States (e.g. Runners on base vs. outs; Markovian matrix of which states can turn into which other states: "First Order Transition Matrix for On-Base Out/Situations" Playing the Percentages, Probablilities of Scoring Runs, Multi-Order Finite State Transition Model Meteors by Daniel Greenia, p. 16 Table of players who have hit over 25% of their career HR in one 30+ HR season by Daniel Greenia, p. 16 Table of players who have hit over 25% of their career HR in one 30+ HR season BA Comparison of Two Consistent Teams by Jack Carlson, p. 17-20 Comparing 1984 Detroit Tigers and 1984 Pittsburgh Pirates Won-Loss vs. Runs Scored, Won-Loss by Run Differential, and Scoring by Inning Review: "The 1985 Elias Baseball Analyst" by Jim Baker, p. 
2 A review from the editor of the new book "The 1985 Elias Baseball Analyst" by Jim Baker, p. 2 A review from the editor of the new book "The 1985 Elias Baseball Analyst" Technique of Run Estimation by Game-Line Assembly by Bill James, p. 4-9 Developing an alternative method of estimating the number of runs likely to result from the offensive contributions of any individual player. by Bill James, p. 4-9 Developing an alternative method of estimating the number of runs likely to result from the offensive contributions of any individual player. Notes on the Stolen Base: 1984 Ranger Games by Craig Wright, p. 10-14 Assessing the run value of stolen bases and failed stolen base attempts by the Texas Rangers in 1984 games. by Craig Wright, p. 10-14 Assessing the run value of stolen bases and failed stolen base attempts by the Texas Rangers in 1984 games. The All-Time Most Valuable Players by Neil Munro, p. 15-19 Developing a method to determine the cumulative voting results for MVPs since 1931 and selecting MVP all-star teams. Musial,Stan; Williams,Ted; Ford,Whitey; Most Valuable Player; Selection Process by Neil Munro, p. 15-19 Developing a method to determine the cumulative voting results for MVPs since 1931 and selecting MVP all-star teams. Musial,Stan; Williams,Ted; Ford,Whitey; Most Valuable Player; Selection Process Letter by Dan Rappoport, p. 20 Responding to Jack Carlson's article in the previous issue. 1984; Pittsburgh Pirates Comparing Statistics From Different Eras by Alden Mead, p. 4-9 Era Comparison; Batting Average; Relative Batting Averag by Alden Mead, p. 4-9 Era Comparison; Batting Average; Relative Batting Averag Pitcher Burnout(?) by Daniel Greenia, p. 9 Martin, Billy; Fowler, Art; Pitchers; Complete Game by Daniel Greenia, p. 9 Martin, Billy; Fowler, Art; Pitchers; Complete Game Some Additional Comments on The Benefit of Getting the Leadoff Batter On Base by Dallas Adams, p. 10-12 Leadoff; Run Production by Dallas Adams, p. 
10-12 Leadoff; Run Production Quantity vs Quality: A Look at Linear Weights Per Game by Dan Heisman, p. 13 Linear Weights by Dan Heisman, p. 13 Linear Weights Divergence of Won-Loss Records by Scott Segrin, p. 14-18 Won-Lost Record; Relief Pitching; Scoring by Scott Segrin, p. 14-18 Won-Lost Record; Relief Pitching; Scoring New Worlds to Conquer by Mike Kopf, p. 19-20 Dollar Sign on the Muscle; Scouting; Review by Mike Kopf, p. 19-20 Dollar Sign on the Muscle; Scouting; Review A Note to Fellow Sabermetricians by Scott Segrin, p. 20 Elias Baseball Analyst; Review Some Comments on the June Issue by Craig Wright, p. 4-5 Martin, Billy; Fowler, Art; Pitchers; Complete Game; Era Comparison; Batting Average; Elias Baseball Analyst by Craig Wright, p. 4-5 Martin, Billy; Fowler, Art; Pitchers; Complete Game; Era Comparison; Batting Average; Elias Baseball Analyst Comment on All-Time MVP by Dan Rappoport, p. 5-6 Most Valuable Player; Selection Process by Dan Rappoport, p. 5-6 Most Valuable Player; Selection Process A Report on the Accuracy of the Brock4 Career Projection Method by Dallas Adams, p. 7-13 Career Projection; Brock4 by Dallas Adams, p. 7-13 Career Projection; Brock4 A Note About "Percentage Baseball Reconsidered" by Mark D. Pankin, p. 14-15 Percentage Baseball; Markov Chain by Mark D. Pankin, p. 14-15 Percentage Baseball; Markov Chain Six Phases of a Baseball Research Project by Dan Rappoport, p. 15 Humor by Dan Rappoport, p. 15 Humor An Attempt at Determining Runs Batted In Efficiency by Terry Bohn, p. 16-17 Run Batted In; Run Production by Terry Bohn, p. 16-17 Run Batted In; Run Production Quantity vs Quality: A Look at Linear Weights Per Game by Dan Heisman, p. 18-19 Linear Weights Into the Baseball Mind's Eye p. 2 Humorous article from "X.X. Ludeman" on the end of the 1985 season p. 2 Humorous article from "X.X. Ludeman" on the end of the 1985 season Letters p. 4-5 Letters from Dan Greenia, Craig Wright, L.W. 
Beach and Robert Johnson commenting on the previous issue. p. 4-5 Letters from Dan Greenia, Craig Wright, L.W. Beach and Robert Johnson commenting on the previous issue. Freak Show: Strikeouts/Home Run: Good & Bad by Daniel Greenia, p. 6 A list of the best and worst all-time career strikeout/home run ratios. by Daniel Greenia, p. 6 A list of the best and worst all-time career strikeout/home run ratios. Ground Ball Pitchers on Artificial Turf by Craig Wright, p. 7-11 Debunking the myth that artificial surfaces are "deadly" to ground ball pitchers. by Craig Wright, p. 7-11 Debunking the myth that artificial surfaces are "deadly" to ground ball pitchers. Another Look At Ground Ball Pitchers on Artificial Turf by Craig Wright, p. 12-20 The author discusses his research methodology in the previous article and on a similar Bill James study that appeared in the "Baseball Abstract Newsletter" Casey Studied Further by Jim Baker, p. 2 A humorous look at what happened to the fictious Mudville teammates in "Casey At The Bat" by Jim Baker, p. 2 A humorous look at what happened to the fictious Mudville teammates in "Casey At The Bat" Freak Show: Won-Lost % of Expansion Teams by Daniel Greenia, p. 4 A chart looking at the win-loss records of expansion Teams since 1961 by Daniel Greenia, p. 4 A chart looking at the win-loss records of expansion Teams since 1961 A Response to Mark Pankin's "A Note About 'Percentage BB Reconsidered'" by Charles Pavitt, p. 5 The author responds to Mark Pankin's critique of his article in an earlier issue. by Charles Pavitt, p. 5 The author responds to Mark Pankin's critique of his article in an earlier issue. Changes in Productivity as Players Age by Bill James, p. 6-13 Studying a defined group of players and identifying the changes that take place in their performance as they age. by Bill James, p. 6-13 Studying a defined group of players and identifying the changes that take place in their performance as they age. 
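Several entries above turn on Bill James's Pythagorean projection ("The Pythagorean Theorem and Twenty-Two Recent Managers", and later articles on its accuracy at extreme scoring ratios). A minimal sketch of the classic formula, using the original exponent of 2 and made-up run totals purely for illustration:

```python
def pythagorean_wpct(runs_scored: float, runs_allowed: float, exponent: float = 2.0) -> float:
    """Estimate a team's winning percentage from runs scored and allowed.

    Classic Bill James form: RS^x / (RS^x + RA^x) with x = 2;
    later refinements fit a smaller exponent (around 1.83).
    """
    rs_x = runs_scored ** exponent
    ra_x = runs_allowed ** exponent
    return rs_x / (rs_x + ra_x)

# Illustrative (made-up) season totals: 800 runs scored, 700 allowed.
wpct = pythagorean_wpct(800, 700)
print(round(wpct, 3))        # expected winning percentage
print(round(wpct * 162, 1))  # expected wins over a 162-game season
```

Comparing this estimate with a team's actual record is the device behind the manager studies indexed here: a club that beats its projection year after year is credited to its manager (or to luck, which is what the articles argue about).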
Additional Fielding Statistics from Play-by-play Data by Clem Comly, p. 14-17. Using play-by-play from 1983 Phillies home games, the author looks at pitchers versus the running game, infielders' and outfielders' range factors, and other interesting studies that could now be done as Project Scoresheet expanded its data and scope.
Armchair at the Bedside by Mike Kopf, p. 18-20. Review: "The Armchair Book of Baseball".
Letter by Craig Wright, p. 2. The author and frequent "Analyst" contributor discusses his decision to leave the Texas Rangers.
Synchronicity and the Double Play by Dick O'Brien, p. 4, 20. Questioning the possible overuse of "the double play strategy" by managers.
Observations on the Stolen Base by Craig Wright, p. 5-9. Using 1984 data from Project Scoresheet, the author studies stolen bases, broken down by league, pitcher handedness, attempts and pickoffs.
Peripheral and Identifying Characteristics of Ground-Ball and Fly-Ball Type Pitchers by Bill James, p. 10-16. Developing new ways to classify ground-ball and fly-ball pitchers, using ratios released in the "Elias Baseball Analyst".
Run Estimation by Game-Line Assembly by Mark D. Pankin, p. 17-20. Offering corrections to Bill James' article in the previous issue on estimating run production.
Letter, p. 2, 18, 19, 20. Fictional letter to the editor from "Manny Mannitillo".
Some Reflections on the "Iorgy of Recrimination" by Mike Kopf, p. 4-8. The author reflects on his experience at Game Six of the 1985 World Series, and makes an impassioned appeal to all Project Scoresheet participants.
The Baseball Batting Sequence Problem by Jess Boronico, p. 9-17. Taking a further look at how batters should be arranged in the lineup, in response to a study that says they should be arranged in order of decreasing production.
Run Production and Run Prevention Correlated to Wins by David F. Hoppes, p. 4-6. Won-Lost Record; Run Production.
Bias in Infielding Evaluation 1: Pitching Handedness by Charles Pavitt, p. 7-11. Fielding; Pitcher Handedness; Total Chances.
The Size and Nature of The Platoon Differential by Tom Hanrahan, p. 12-16. Platoon Differential.
Ball Park Elevation and Humidity as Factors in Home Run Production by Dick O'Brien, p. 17-18. Home Runs; Park Factor; Weather; Elevation; Minor Leagues.
Reaction To Dick O'Brien's Article by Bill James, p. 19-20. Home Runs; Park Factor; Weather; Elevation; Psychology; Minor Leagues.
Computers and the Ballgame by Tony Formo, p. 2-5. Computers.
Why Right-Handed Hitters are Better High-ball Hitters and Left-Handed Batters are Better Low-ball Hitters by Dick O'Brien, p. 6-7. Platoon; High Ball Hitter; Low Ball Hitter; Hitting.
Some Random Thoughts on Baseball and Ballplayers by Mike Kopf, p. 8-10. Gossage, Goose; Kroc, Joan. Comiskey Park; Owners; Books; Humor.
How Long Do Players Spend in the Minor Leagues? by Bill James, p. 11. Minor Leagues.
Some Thoughts on Project Scoresheet by Tony Formo, p. 12-14. Project Scoresheet.
Cloudland Revisited by Dick O'Brien, p. 2-5. Strikeouts; Run Batted In.
Batting Leadoff For St. Louis: George Foster by Timothy Morway, p. 6-7. Coleman, Vince. Leadoff; Line Up.
Percentage Baseball Reconsidered: 2. Preliminary 1984 Findings by Charles Pavitt, p. 8-19. Baseball; Markov Chains; Project Scoresheet; Run Production.
Tip O'Neill: Estimating His 1887 RBIs by Dallas Adams, p. 2-3. O'Neill, Tip. Run Batted In; Linear Weights.
A Look at 'Turf' Teams and 'Turf' Players: 1983-1986 by Sandy Sillman, p. 4-7. Turf; Park Factor.
The Implications of Leadership Research on Baseball: 1. Past Research by Charles Pavitt, p. 8-11. Managers; Managerial Change; Management.
Some Relief Pitching Statistics from 1986 by Philip Meneely, p. 12-20. Relief Pitching; 1986; Pitching Staffs; Inherited Runner; Holds; Saves.
Scoring From First on a Double or Second on a Single by Dallas Adams, p. 8-10. Base Running; Run Production.
Why Making Double Plays is an Important Part of Fielding Skill by Dan Finkle, p. 10-13. Double Play; Fielding; Fielding Index.
Suggestions for Areas of Research by Bill James, p. 2-7. Mattingly, Don. Aging; Hall of Fame; Iron Man; Minor League Leaders; Race.
Looking at the 1987 Rookie Crop by Bill James, p. 18-26. 1987; Rookies; Major League Equivalencies.
Hall of Fame and a New Historical Ranking of the All-Time Greats by Robert K. McCleery and Robert O. Wood, p. 14-17. Hall of Fame; Ranking.
What Type Team Spells R-E-L-I-E-F Best by Russ Eagle, p. 2-5. Relief Pitching; Saves.
A New Framework for Assessing Individual Offensive Performance by David Smyth, p. 6-8. Run Production.
The Further Adventures of Clutch Hitting by Dick O'Brien, p. 9-11. Clutch Hitter.
The Effect of Relief Pitchers on Aggregate Batting Averages by Robert E. Shipley, p. 12-20. Relief Pitching.
Improving the Runs Created Formula by David H. Robinson, p. 2-6. James, Bill; Johnson, Paul. Runs Created.
Miracle Teams and Long Shots: What are the Odds? by Sandy Sillman, p. 7-10. Miracle Teams; Odds.
Some Comments on Charles Pavitt's "Bias in Fielding Evaluation 1: Pitcher Handedness" by Dallas Adams, p. 11. Fielding; Pitcher Handedness; Pitcher Fielding.
All-Time Greats Consensus by Daniel Greenia, p. 12. Rankings.
Implications of Leadership Research on Baseball 2: Current Theory by Charles Pavitt, p. 13-16. Manager; Leadership.
On the Inaccuracy of the Pythagorean Equation at Extreme Scoring Ratios by Dallas Adams, p. 17-18. Pythagorean Projection; Offensive Winning Percentages.
Strikeout-to-Walk Ratios and Winning Records by Russ Eagle, p. 19-20. Strikeout-to-Walk Ratio.
Similarity Scores Among the All-time Greats by Robert K. McCleery and Robert O. Wood, p. 2-6. Similarity Scores; Hall of Fame.
On Batting Order by Doug Bennion, p. 7-11. Line Up; Run Production.
Bias in Infielding Evaluation 2: Pitching Handedness and Strikeout Tendencies by Charles Pavitt, p. 12-15. Fielding; Pitching; Strikeouts.
Late Home Run Hitters by Stephen Roney, p. 16. Home Run; Age.
Using a Baseball Simulator Program to Calculate Batter Runs Created by Dallas Adams, p. 17-20. Ruth, Babe. Runs Created; Run Production; Computer Simulations; Offensive Winning Percentage.
Hitter Controlled or Pitcher Controlled by Bill James, p. 1. Thoughts on trying to calculate to what extent each area of run production is controlled by hitters or pitchers.
A Preliminary Review of Outside Influences on Rich Ashburn's Fielding Statistics by Bruce H. Garland, p. 3-8, 11. Studying the defensive abilities of Richie Ashburn and the effects of his home ballpark, Connie Mack Stadium; his teammates, including Del Ennis; the National League averages at the time; and other factors.
1987: How the East Was Won by Russ Eagle, p. 9-11. Studying how the St. Louis Cardinals created their runs on their way to the 1987 NL East division title.
Pitcher's Defensive Wins and Losses by Neil Munro, p. 12-17, 20. Trying to determine the number of defensive wins and losses that a pitcher contributes to his team over the course of a season.
Formulas: Awards by Tom Hanrahan, p. 18-20. Proposing new formulas for predicting who might win postseason awards, including Most Valuable Player, Cy Young Award, and Manager of the Year.
Alternatives for Career Projection by Bill James, p. 2-4. Seeking alternatives for career projection systems over the previously used 1Brock.wk1 system and developing a "target-oriented" system.
The 1988 Hall of Fame Ballot by Robert O. Wood, p. 5-9. Looking at the eligible candidates and predictions for the 1988 National Baseball Hall of Fame ballot.
Research in Progress by Bill James, p. 10-15. Reporting on the accuracy and improvements of the author's 1Brock.wk1 career projection system.
Sending the Runner by Doug Bennion, p. 16-20. Studying the wisdom of sending a runner from first base on a full count with zero or one outs.
Run/Opposition Run Connection by Bill James, p. 1-3. Run Production; 1987; Score.
Sustained Illusions in Platoon Effects by Bill James, p. 3-4. Platoon.
Markov Chain Models: Theoretical Background by Mark D. Pankin, p. 5-10. Markov Chain Model; Expected Runs.
On The Relief Pitcher's ERA Advantage by Phil Birnbaum, p. 11-16. Hernandez, Willie; Quisenberry, Dan; Righetti, Dave; Henke, Tom. Relief Pitcher; Earned Run Average; Expected Runs; Pitching Changes.
Home Run Miscellany by Stephen Roney, p. 17. Home Runs; 1987; Age.
Hitter Control vs Pitcher Control by Russ Eagle, p. 18-20. Pitching; Batting.
The Effect of Being Traded on Batting Performance: More Academic Baseball Research by Charlie Pavitt, p. 2-3. Murcer, Bobby. Trade; Review; Psychology; Experience.
By the Stars by Bill James, p. 4-5. Astrology; Similarity Scores.
Independent Situations and Baseball Statistics by Dan Heisman, p. 6-7. Clutch Hit; Game-Winning RBI; Victory-Important RBI; Pressure.
Are Hall of Fame Standards Declining? by Bill James, p. 8. Hall of Fame Selecting.
Rabbit Ball Revisited by Joe Mangano, p. 9-14. Rabbit Ball; 1987.
Reflections of a Megalomaniac Editor: Managerial Changes by Bill James, p. 15. Review; Managerial Change.
Reflections of a Megalomaniac Editor: Ballpark Elevation and Humidity by Bill James, p. 15. Park Factor; Weather; Elevation.
Reflections of a Megalomaniac Editor: Effect of Being Traded on Batting by Bill James, p. 16. Trade; Review.
Reflections of a Megalomaniac Editor: Independent Situations by Bill James, p. 16-18. Clutch Hit; Pressure; Review.
Reflections of a Megalomaniac Editor: Rabbit Revisited by Bill James, p. 18-20. Rabbit Ball; Review; Balata Ball; Dead Ball; Lively Ball.
Baseball's Hall of Fame Voting System by Robert O. Wood, p. 2-5. Hall of Fame Selecting.
A Brief Look at Left-Handed Pitching Effectiveness by Dick O'Brien, p. 6-7. Left-Handed Pitcher.
Lefties vs Lefties: As Time Goes Bye by Tom Locker, p. 8-9. Platoon; Left-Handed Batter.
Debunking a Pitching Cliché by Dallas Adams, p. 10-11. Run Production; Pitching.
Note on "Lefties vs Lefties" by Bill James, p. 11. Rookies; Platoon; Left-Handed Batter.
Predicting Team Performance: Beyond Pythagorean by Robert Cramer, Robin Ellins and David Lutz, p. 12-14. Pythagorean Projection; Run Production; Winning.
Response to Cramer, et al by Bill James, p. 15. Run Production.
Adventures in Computer Simulations by Gary Fletcher, p. 16-20. Lineup; Simulations; Run Production.
Introduction, with comments on Heisman, "Independent Situations" by Bill James, p. 1-2. Pressure Situations; Baseball Abstract.
Ballpark Effects on Home Runs by Robert K. McCleery and Robert O. Wood, p. 3-6. DiMaggio, Joe; Dawson, Andre.
Late Home Run Hitters Revisited by Stephen Roney, p. 7-8. Home Runs; Age.
Leadoff Man and His Effect on the Lineup by Gary Fletcher, p. 9-11. Line Up; Runs Created; Leadoff; Simulations.
Why Everyone Wants Lefty Pitching by Dick O'Brien, p. 12-13. Left-Handed Pitcher.
Thirty-Homer Guys: A Four Part Study by Bill James, p. 14-20. Canseco, Jose; Kingman, Dave. Home Runs; Similarity Scores.
Player Win Averages: An Extended Book Review by Paul R. Pudaite, p. 2-7. Player Win Averages; Run Production; Clutch Hitter.
Predicting Ws and Ls from Runs Scored and Allowed by Willie Runquist, p. 8-9. Soolman; Cook, Earnshaw. Win-Loss Record; Pythagorean Projection.
Response to "Predicting Ws and Ls from Runs" by Bill James, p. 10. Soolman; Cook, Earnshaw. Win-Loss Record; Pythagorean Projection.
Fred McMullin: An Underrated Crook? by Mike Kopf, p. 11. McMullin, Fred. Chicago White Sox; Black Sox; 1919 World Series.
Peer Comparison System by Andy Spark, p. 12-15. Comparison.
Intra-season Statistical Trends by Dallas Adams, p. 16-18. 1987; Statistics.
Hall of Fame Note by Bill James, p. 19-20. Hall of Fame Selecting.
An Addendum to Bruce Garland's Preliminary Review Of Outside Influences on Rich Ashburn's Fielding Statistics by Bill Carr, p. 2-3. Ashburn, Richie; Roberts, Robin. Outfield; Fielding.
Breaking in a Catcher by Tom Hanrahan, p. 4-6. Catcher; Earned Run Average.
Pitcher Run Average by Doug Bennion, p. 7-10. Earned Run Average; Pitcher Run Average; Run Potential.
Racism in Baseball: Offensive Performance by James R. McNesby, p. 11-15. Racism.
Parity: A New Look by Robert K. McCleery and Robert O. Wood, p. 16-20. Parity; Law of Competitive Balance; Luck; Expansion.
BlackBerry is shutting down its Bedford, N.S., office, affecting at least 350 workers as part of the beleaguered Ontario-based tech company's global layoffs. BlackBerry says the Bedford office, which handles customer service, will close Jan. 10. The company says 35 employees will be offered positions working from home and will remain with the company. BlackBerry was lured to Nova Scotia in 2005 by a previous Progressive Conservative government, and the Bedford location opened on Nov. 25, 2008. The province offered $19 million in subsidies, including $14 million in payroll rebates and $5 million for training and recruitment. The company was told it had to create 1,200 jobs over five years to get the full rebate. BlackBerry drew almost $11 million from the payroll rebate program over a six-year period ending in February 2012, Nova Scotia Business Inc. has said. The Nova Scotia government announced it would give BlackBerry $10 million over five years to create a Centre of Excellence and guarantee at least 400 jobs in the province at an average salary of at least $60,000 a year. It's not known exactly how much of that money the company has received, but BlackBerry said Thursday it is paying back a $2 million contribution.

Government back and forth

"I'm confident the current government knew this was coming and would have been prepared for this day," said MLA Kelly Regan, the Liberal candidate who will once again represent Bedford in the Nova Scotia legislature when she is sworn in. Premier-designate Stephen McNeil said he understands the outgoing government might be offering services for the laid-off workers. But when questioned, a spokesperson from Darrell Dexter's office said only, "I suggest you get in touch with someone from the premier-elect's communication team." "They are still the government and I would have expected them to act accordingly, like grownups, right?" said Regan. "The fact of the matter is if they don't do what they're supposed to do, we will."
Regan said she has been making calls and sending emails to existing high-tech firms in the area to see if they might have jobs available for the ousted BlackBerry employees. Local marketing company SimplyCast tweeted it has job openings.

More cuts coming

The Nova Scotia layoffs are the latest in a string of cost-cutting measures within BlackBerry. The struggling company, based in Waterloo, announced in September it is cutting 4,500 employees across its global operations. The losses will affect 40 per cent of its staff, leaving about 7,000 employees. On Tuesday, BlackBerry said it's laying off 300 people in its Waterloo office. Fairfax Financial Holdings Ltd. has made an offer to buy BlackBerry for $4.7 billion, although there are doubts the deal will go through.

Looking for rival bids

BlackBerry is known to be soliciting rival bids in the meantime, with companies such as SAP AG, Cisco Systems and Samsung Electronics reported to have approached its advisers. However, the tech titans are likely interested only in pieces of the company, Bloomberg reported, adding that BlackBerry's management is becoming more open to a breakup of the company. The federal government's move to quash a deal by Egyptian firm Accelero to take over MTS-Allstream may have bidders thinking twice about investing in Canada. The government nixed the deal over "national security concerns" but gave no further explanation. Treasury Board President Tony Clement said Oct. 8 that national security would play a role in assessing offers for BlackBerry as well.
Hey, remember how the Food and Drug Administration gave restaurants a yearlong extension on the deadline for getting their act together regarding calorie counts on menus nationwide? They were supposed to post that information on menus nationwide by December of this year. Now, though, a new bill passed in the House of Representatives seeks to change the rules before eateries are forced to comply; under the bill, compliance wouldn't come for another few years. The new version of the bill leaves it up to companies to decide what a "serving size" is. This leads to ridiculousness like a bag of cookies in a vending machine being labeled as two or three "servings," or pizzerias dividing their pies into as many slices as they want to make dishes appear lighter in fat and calories. Margo G. Wootan, director of nutrition policy at the Center for Science in the Public Interest, pointed out in a statement from the organization that this bill "would result in consumer confusion and prevent disclosure of straightforward, consistent calorie information at many food service establishments." As a food-eating consumer, here's what you need to know. Under the rules that were supposed to go into effect in December, a restaurant would have to give you the nutrition information for an entire pizza, since what you're buying is an entire pizza, no matter how many people you share it with. The bill just passed would allow them to post nutrition information for one slice of pizza instead, since pizzas generally come pre-sliced.
The bill pretty much leaves this up to restaurants, inserting this language (pay particular attention to the parenthetical phrases):

the number of calories contained in the whole standard menu item, or the number of servings (as reasonably determined by the restaurant or similar retail food establishment) and number of calories per serving, or the number of calories per the common unit division of the standard menu item, such as for a multiserving item that is typically divided before presentation to the consumer.

In a joint statement by a number of public health organizations, health care providers, and government agencies, the experts point out that letting companies decide on their own serving size is a recipe for posting calorie counts for half-muffins:

Posting the total calories per menu item enables consumers to more easily compare different types of food items, such as nachos, chicken wings, or pizza, and leaves it up to the individual – not the restaurant – to determine how many people will share the item. It would be deceptive to label muffins, entrees, desserts, and most menu items as multiple servings, since items are most often consumed by one person.

If the bill became law, it would also give food companies another few years to stall: new regulations would have to be written, and they wouldn't go into effect until at least two years after those regulations were released to the restaurant industry and to the public.
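To make the serving-size arithmetic concrete, here's a minimal sketch; the calorie total is a hypothetical number for illustration, not from the bill:

```python
# Hypothetical illustration of how serving-size discretion changes the label.
WHOLE_PIE_CALORIES = 2400  # assumed total for one large pizza

def labeled_calories(total_calories, servings):
    """Calories the menu would display if the item is declared as `servings` servings."""
    return total_calories / servings

# Original December rule: you buy the whole pie, so the menu shows the total.
whole_pie_label = labeled_calories(WHOLE_PIE_CALORIES, 1)     # 2400.0

# Under the House bill, the restaurant picks the serving count:
eight_slice_label = labeled_calories(WHOLE_PIE_CALORIES, 8)   # 300.0
twelve_slice_label = labeled_calories(WHOLE_PIE_CALORIES, 12) # 200.0
# Same pizza, but slicing it thinner makes each "serving" look lighter.
```

Nothing about the pizza changes; only the divisor the restaurant is allowed to choose does.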
In terms of ROI, video marketing and email marketing are two of the most effective forms of digital marketing available. But how can they be blended into a single marketing approach? Video marketing and email marketing can be highly effective when used together, but because of their formats, some care needs to be taken. Blending video and email marketing requires an in-depth understanding of why each marketing approach is so successful — and the demographic that it most appeals to. But first, some stats.

The Importance of Video and Email Marketing

51% of marketing professionals believe that video is the type of content with the best ROI. 68% of marketing professionals believe that email is the best digital channel for ROI. So we can see why combining video and email would be inherently so effective: it puts the best type of content on the best channel. Video marketing is effective because it's engaging. Everyone likes to watch a good video, and viewers absorb information and retain it for longer in that format. In fact, 90% of the information that the brain receives is visual. Videos also allow a company to better tell its story and establish its brand, as video can condense more information into a shorter amount of time. Email marketing is effective because it's direct. It can be customized and tailored to the individual, it can be targeted and triggered to specific scenarios, and it can reach practically anyone in every demographic. There will be 3.7 billion email users throughout the world by the end of the year; there is scarcely any demographic that a marketing professional cannot reach through this channel. Used together, they can push compelling, engaging content directly to your potential, current, and even former customers. Video marketing can be used to tell detailed product stories, build trust in your brand, or even just deliver entertaining and timely content.
But because both video marketing and email marketing are challenging, there are a lot of things you need to consider first.

11 Video and Email Marketing Best Practices

1. Create "buyer personas" before developing a strategy — and tailor each strategy to a single persona. A buyer persona is meant to represent one of your core demographics. Personas allow you to speak more authentically to your buyers and to determine what they want and what they would find most appealing. A buyer persona is the foundation of any solid marketing strategy and is especially important in email and video marketing. Ask yourself questions about your major demographics: how old are they? What gender? How much do they make and where do they live? When they encounter your product, what problem are they trying to solve? What actions will they try to take to solve it?

2. Keep your branding consistent. Video has the remarkable power to establish a company's brand identity and voice very quickly. But when that voice is inconsistent, customers can find it hard to trust your business or to understand what it's about. Throughout your email and video marketing, you need to establish a consistent voice above all else. Figure out your company's values early on and align all of your media with your mission statement.

3. Create multiple marketing strategies. Every demographic is different, as is every buyer persona. Multiple strategies may be needed to encompass the entirety of your market. Not only should you create multiple targeted strategies, but you also need to track them appropriately. Statistics will give you the information you need to determine which strategies are working best and which may need to be tweaked and fine-tuned. Just like traditional email marketing, video marketing via email also needs to be A/B tested.

4. Keep your video concise. The attention span for content on the Internet is quite short.
Though an email can provide additional information, the video itself needs to be short and snappy. Make your basic points, stick to them, and don't exceed a few minutes. If you need to add more information, make sure the most important information is front-loaded at the very beginning of the video.

5. Consider your entire sales funnel. Email and video marketing is going to guide your buyers through the purchase (and even beyond it). That means it has to be able to support their journey. At the very beginning, videos will be targeted towards educating customers about your product and your business. Further along, the videos will be persuasive; they will be designed to draw your customers in with information about why your product is better than the competition and why they should invest. After purchase, videos should be targeted towards supporting your product and encouraging additional sales. Altogether, your marketing is going to guide your buyer through their relationship with your company.

6. Prepare your hosting service. Your videos are going to have to be hosted somewhere. Third-party video platforms are often best; they can provide hosting without the need for you to pay for your own server. If you are paying for your own server, you need to make sure that you have the resources and bandwidth available. If your video goes "viral" (which is often the hope), you'll need to be able to keep up with the new demand.

7. Track your customer metrics. Just like a landing page, a video can track whether users clicked, how long they viewed it, and where they came from. Through your email marketing links, you should track as much information as possible so you fully understand how successful your campaign is. For instance, the "bounce" rate of the video is going to tell you a lot about whether the video was what viewers expected or whether they felt it didn't provide enough value.

8. Get the tone right.
Lighter is usually better. On the Internet, videos that are dry or boring are usually ignored. Though you don't need to create the pinnacle of entertainment, you usually want something fairly upbeat and fast-paced. The tone of your video is going to shape the relationship the customer develops with you.

9. Don't just focus on views. There are countless people, for instance, who watch Thai Life Insurance commercials — but very few people are looking for Thai life insurance. When you create a compelling ad, you can expect to get a lot of views, but not all of those viewers are actually interested in your product. Focus on performance-based metrics instead, such as the number of individuals who continued on to your product pages, or the number of customers who actually committed to a purchase.

10. Invest in a consolidated marketing system. There are many marketing systems specifically designed to track your strategies across multiple sales channels. They can track your brand mentions on social media in addition to the number of clicks that you're receiving on your video marketing campaign. No campaign occurs in a vacuum; the true impact of your email and video campaign will need to be measured across all of the social media platforms you use and all of the owned media channels you've developed.

11. Always have a clear call-to-action. Once the video ends, the customer must know the next step. This is important even if the next step isn't necessarily a purchase. The next step could also be watching more videos, visiting a certain website, or simply requesting a quote. Make your call to action as direct and simple as possible.

Challenges When Mixing Video and Email Marketing

• Video marketing needs to be produced professionally. Video marketing requires an in-depth knowledge of both video and audio production.
It also requires better equipment than most people have at hand — and a poorly produced video can actually have a negative impact on a campaign rather than a positive one. In addition, attempting to produce video without an understanding of video marketing can lead to costly expenditures. Professionally produced video can be less expensive than video produced by those outside the industry.

• Email marketing has to be appropriately targeted. Tremendous volumes have been written on targeting email marketing. Without appropriate targeting of specific demographics, most commercial emails go completely unread. Most people receive dozens of emails every day and, for the most part, delete them without opening them. Email marketing is a science: there are countless studies on exactly what gets someone to click on an email and open it. Professional marketing knowledge is required.

• Video marketing covers many types of video. From standard commercials to in-depth explainer videos, there are dozens of types of video that a company can choose from. But only some of these formats lend themselves to an email distribution channel. For the purposes of email marketing, videos need to be very specific and need to speak directly to the consumer. Each video has to be tailored to the email marketing campaign (and the opposite is also true) for it to be truly effective. General videos, such as commercials, are not well suited to the format.

• Email marketing requires contacts. Email "cold calling" usually doesn't result in a sale; instead, email addresses need to be appropriately sourced so that all messages are directed to customers who could be truly interested in your product.
Rather than attempting to convince someone to purchase your product, most email marketing is targeted towards educating customers on why they already need your product. That means that you also need to invest some time in developing your email list before you can begin your email campaign. The numbers don't lie: video and email marketing, together, are extraordinarily effective. But because they are also both complicated paths, it often takes a professional to pull them together into a cohesive and coherent campaign. Investing in video and email marketing can bring a company substantial returns, as well as further building out its marketing strategy as a whole. Article by Joe Forte, co-owner and producer at D-Mak Productions, a corporate video production company based in Phoenix, AZ. Download your guide to corporate videos and why you need them!
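The link-tracking described in best practice 7 above is usually implemented by tagging each email link with UTM parameters so analytics can attribute clicks to the campaign. Here's a minimal sketch, assuming the standard utm_* naming convention; the URL and campaign names are hypothetical:

```python
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

def tag_link(url, source, medium, campaign):
    """Append utm_source/utm_medium/utm_campaign to a landing-page URL."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))
    query.update({
        "utm_source": source,      # which list or newsletter sent the click
        "utm_medium": medium,      # the channel, here always "email"
        "utm_campaign": campaign,  # the specific video campaign
    })
    return urlunparse(parts._replace(query=urlencode(query)))

link = tag_link("https://example.com/product-video", "newsletter", "email", "spring_launch")
# → https://example.com/product-video?utm_source=newsletter&utm_medium=email&utm_campaign=spring_launch
```

Generating a distinct campaign value per A/B variant is one straightforward way to compare the click-through of two video emails.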
Sunday night, the Indians pushed their winning streak to 18 games. The next-longest active winning streak in baseball is five. Since this all began, the Indians, of course, have gone 18-0, and the next-best record in the American League has been 9-7. The goal is to win the World Series, and the Indians will be disappointed if they again come up short, but at a certain point, this will just become their legacy. Somebody wins the World Series every year. The Indians have one of the longest winning streaks that baseball’s ever seen. Doing something like this is more improbable, and it’s a reflection of how well the Indians have been built, from top to bottom. A truth about baseball is that a winning team is never as good as it looks when it’s winning, and a losing team is never as bad as it looks when it’s losing. The Indians feel like they’re bulletproof, mostly because they’ve been bulletproof for about three weeks. Their odds of winning everything haven’t meaningfully changed. It’s useful to keep the Dodgers in mind. The change in perception has been abrupt, even though it’s more or less the same active roster. Invulnerability isn’t forever, as demonstrated by the reality that everything dies. The Indians aren’t unbeatable. The chances are still that they won’t win it all. Upon eventual reflection, the team shouldn’t be judged only by how it performed when it didn’t lose a game. One should attempt to consider the whole of the picture. Let’s do that right now. Let’s talk about the Indians’ pitching. On August 17, Travis wrote a post titled, “Cleveland’s Rotation Is Distancing Itself from the Pack.” It was! The rotation came together, and the rotation got hot. There might not be another rotation more fearsome when it’s hot. Now we can take an even grander view. We can think about the entire 2017 season, and we can think about the Indians’ entire pitching staff. This plot should be very easy to understand. Here’s how all the staffs rank this year, by WAR. 
That’s FanGraphs’ version of WAR. (On FanGraphs, it’s always FanGraphs’ version of WAR.) The Indians are way out in front, having achieved a measure of separation. The Indians are in first, and they’re in first by more than five wins. The gap between the Indians and the Red Sox is a hair wider than the gap between the Red Sox and the Nationals. It’s uncommon to have a team achieve such an outlier status. The picture doesn’t look very different if you choose to evaluate the teams by the runs they’ve actually allowed, instead. The regular WAR model is somewhat theoretical. The RA9-WAR model is less theoretical. Here’s how that looks. Indians in front, now by about four wins. The first-place status isn’t threatened, nor is the separation between first and second. Every team looks a little different by RA9-WAR, as opposed to regular WAR, but the impression of the Indians is unchanged. This year, the Indians have had baseball’s best pitching, and it’s been true by a significant margin. Once you understand who looks the best in a given season, it follows that you should be curious about historical context. Let me tell you — there have been very nearly 2,500 individual team-seasons going all the way back to 1900. I don’t know when you, personally, believe that baseball “began,” but when discussing statistics, I don’t know that 1899 is ever relevant. Even 1900 is probably going back too far, but, whatever. Since then, the Indians’ current staff ranks seventh in total WAR. They still have a few weeks left to play. To make everything even, I calculated every team’s WAR per 162 games. Behold the top 10.

Best-Ever Pitching Staffs
Team      Season  Games  WAR/162
Indians   2017    143    30.9
Braves    1996    162    29.5
Braves    1994    114    28.8
Braves    1997    162    28.5
Yankees   2003    163    28.4
Yankees   2002    161    28.4
Phillies  2011    162    28.1
Braves    1999    162    27.8
Braves    1995    144    27.2
Cubs      1970    162    27.1

The Indians are on track to finish in first place. First place! Out of so many decades of baseball!
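The per-162 normalization is just pro-rating; a quick sketch (the sample values approximate the Indians' row above and are only illustrative):

```python
def war_per_162(war, games_played):
    """Pro-rate a team's season-to-date WAR to a full 162-game schedule."""
    return war * 162 / games_played

# Roughly the 2017 Indians: about 27.3 pitching WAR through 143 games.
print(round(war_per_162(27.3, 143), 1))  # 30.9
```

The blended figure later in the piece is simply the average of this number and its RA9-WAR counterpart.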
But, again, maybe you don’t love the theoretical model. Maybe you don’t love how much our WAR equation strips out. So here’s the same table, only this time sorting instead by RA9-WAR per 162 games.

Best-Ever Pitching Staffs
Team       Season  Games  RA9-WAR/162
Cubs       1905    155    39.2
Cubs       1906    154    39.0
Cubs       1909    155    38.0
Cardinals  1944    157    37.1
Yankees    1939    152    37.1
Braves     1974    163    35.2
Braves     1993    162    34.2
Cardinals  1943    157    33.9
Indians    2017    143    33.8
Cubs       1907    155    33.5

Now the Indians drop all the way to…ninth. Third, since the end of World War II. Therefore, this doesn’t change things too much. Yet just as regular WAR might strip too much away, RA9-WAR might leave too much intact. A shortcut we use sometimes is to simply average the two measures to get a clearer impression of true talent. So here’s one last table, which looks like the last two tables. For this one, I’ve just found the midpoint of each team’s WAR and RA9-WAR per 162 games.

Best-Ever Pitching Staffs
Team       Season  Games  Blend/162
Indians    2017    143    32.3
Braves     1997    162    30.9
Cubs       1909    155    30.7
Phillies   2011    162    30.4
Cardinals  1944    157    30.2
Braves     1974    163    30.2
Braves     1993    162    30.0
Braves     1999    162    29.9
Dodgers    2003    162    29.5
Braves     1995    144    29.4

And there it is. First place again. The Indians’ season isn’t complete. Their pitching numbers are going to change, possibly for the worse. And it should go without saying that our WAR metrics are far from perfect. They don’t, as one example, consider strength of opposing schedule. Maybe regular WAR is too far removed from reality. Maybe RA9-WAR isn’t removed enough. Maybe it’s too sloppy to just find the average of the two numbers as if it really means anything. You can argue as much as you’d like to, because there is always room for disagreement. But there’s a legitimate case to be made that the 2017 Cleveland Indians have the best pitching staff in baseball history. This is the evidence. By all-time WAR/162, they rank in first place.
By all-time combined WAR/162, they rank in first place. They still have games to play, but they can perform worse while remaining where they are. When you think about the Indians’ staff, you think about Corey Kluber. He’s great! And when you think about the Indians’ staff, you think about Andrew Miller. He’s also great! Carlos Carrasco, further, has been great. The Indians have benefited from remaining generally healthy; Miller is down now, but he’ll be back soon, and the team has needed to use only seven starters. An underrated aspect here has to do with depth, as the Indians have given just 24.2 innings to pitchers with a below-zero WAR. All those innings have been thrown by Shawn Armstrong, whose WAR is -0.1. The Indians rank first in baseball in combined WAR from their below-zero pitchers; they, again, stand at -0.1. The league average, and median, are -2.2. The Indians haven’t had to give time to bad pitchers, which is a subtle means of keeping the overall numbers terrific. The conclusion to something like this is always that, moving forward, it can mean only so much. Look back at that last table again. The 1997 Braves lost in the NLCS. The 2011 Phillies lost in the NLDS. I don’t need to tell you that this year’s Indians staff could lose, just as last year’s depleted playoff Indians staff mostly managed to win. There’s no such thing as solving October. The goal is always what lies at the end of October, but a team can only try to solve October by solving April through September. That’s where the Indians have accomplished their mission. Just in recent weeks, they’ve ripped off one of the longest winning streaks of all time. And if you go back to the beginning, the Indians have fielded what might stand as the best pitching staff in baseball history. It’s no wonder they’ve been a hard team to beat.
Sarah Palin hasn't even finished her speech and already the "blame the media" meme has been pushed explicitly to the forefront — this woman who has spent less than a week in the media spotlight, who has thus far made herself completely unavailable for interviews, has blasted the media with both barrels, deriding them for daring to cover her and the revelations that continue to emerge about this untested, unknown and unvetted candidate. Here's how she characterized the media's coverage of her these past few days, since it's been revealed that John McCain met her once and didn't investigate her political past: I'm not a member of the permanent political establishment. And I've learned quickly, these past few days, that if you're not a member in good standing of the Washington elite, then some in the media consider a candidate unqualified for that reason alone. Really? Care to qualify that with any examples? Because I haven't seen you submit to any interviews, sit for a grilling on "Meet The Press" (oh, to have Russert here right now!), or open yourself up willingly to the scrutiny of the fourth estate, whose importance as a check on the political process is so critical that its freedom was enshrined in the U.S. Constitution by the founders. I don't think the press has any problem with mavericks — hell, John McCain has gotten by on that one with the help of his 'base' for eight years now. And I don't think that the press requires any candidate to be part of the "Washington elite" or the "permanent political establishment" — otherwise a certain former community organizer (new dirty word!) wouldn't have had a hope in hell of those glowing magazine covers. But it's not unreasonable to expect that a candidate for the second-highest office in the land make herself available to the media — the representative of the people, at least in terms of asking the hard questions that a governor might, say, hire a lawyer to consult with before answering. Just by way of example.
So the McCain campaign was so miffed by Campbell Brown's tough questioning of their surrogate — a trained, smooth, competent, TV-ready surrogate well-versed in talking points, who still couldn't muster up proof of Palin's leadership mettle in the Alaska National Guard — that they pulled McCain from an interview? So they're accusing the press of being "on a mission to destroy" Sarah Palin? That's crazy. No one has had time yet to form an opinion — let alone enough information. And a campaign with nothing to fear would have no problem throwing open the doors and saying, come on in, we've got nothing to hide...and we know that because we actually, you know, checked. You need a whole lot less bluster when the facts are on your side. There's no reason for Sarah Palin — or the McCain campaign — to be so shocked that the media might want to actually know something about a VP nominee. My God, how many weeks of speculation and discussion about vetting did we endure between the end of the primaries and this point? Palin was a wild-card candidate, so far better known for her penchant for moose-burgers, aerial wolf hunting, and — yes — her pregnant and unmarried 17-year-old daughter than for her actual qualifications for being a heartbeat away from the presidency. There is absolutely nothing wrong with the media's desire to even that score. That is its job, plain and simple — and if it doesn't always thrill the guy on the Straight Talk Express, well, so be it. So — here's a little newsflash for Sarah Palin, to paraphrase her speech: The media isn't writing about you to seek your good opinion — they're writing about you to serve the people of this country. Americans expect the media to investigate their candidates for office for the right reasons, not just to get the right access. If you really want to serve the people — as opposed to just your party, or yourself — then you'll do well to remember that. Update: Quoting Joe Klein from earlier today:
A former marketing executive for textbook publishing giant Pearson Education reveals the anti-American agenda behind Common Core and the Advanced Placement U.S. History framework in the third video of a series produced by Project Veritas and focused on the corporate cronyism behind the education reform known as Common Core. Kim Koerber, a former Pearson executive who now works as a sales consultant for National Geographic – another Gates Foundation-funded Common Core publisher – tells the Project Veritas undercover journalist that “conservative voters are afraid of everything,” and proceeds to say why Common Core is important in her view. She explains that those behind Common Core and the new AP U.S. History framework have attempted to minimize the Constitution and remove Christianity from the core concepts, while they also stress the importance of teaching about Islam: “The dead white guys did not create this country,” Koerber says. “They [presumably conservatives] want to talk about those dead white guys.” Koerber continues that Common Core is necessary because “there needs to be cohesion between the states.” She expresses frustration, however, that “Texas keeps screwing it up over and over again.” “People who say they want to teach the Constitution, only want to teach the part of the Constitution that they like,” she tells the journalist, who then asks her about the Second Amendment. “But yet they don’t want to teach all of it,” she replies. “Damn the Second Amendment.” The discussion continues: Kim Koerber (KK): People that are not educated, Fox TV viewers think that Common Core comes from the educated liberal groups and that’s why they are against it. They don’t know anything about it. They think it’s liberal so they’re against it. That’s what I think it is. It’s a knee-jerk reaction. My mother, oh my God, she’s a Fox person.
If I could remove Fox from my television set, I would… I did a big presentation yesterday for AP US History and the AP US History agenda was set, until Texas got upset about it and they wanted to have their founders – they wanted founders in it. And it’s like – come on. The dead white guys did not create this country. It was a whole bunch of different kinds of people. And yes there were women, and yes there were people of color, and yes…you need to talk about them too. But they want to talk about those dead white guys. And that’s the problem. You’re getting pushback, because there’s a bunch of Republican people, conservatives that don’t like being told what to do by people they don’t agree with. For example, in AP U.S. History a long time ago, Texas wanted to have U.S. History books, right? Pearson made them. And it talked about the Wild West and how there were prostitutes, right. And Texas was really upset. They didn’t want to mention…I’m like…You’re too young to… Did you watch Gun Smoke? It was a TV show, and you had Marshall Dillon and Ms. Kitty was his friend. She owned a bar and she was a prostitute. They never mentioned it but that’s what she was. It’s like who was Ms. Kitty? Who were these people who went out and serviced these men that went out in the world? That was real. The Wild West was not a nice place. And our kids need to know that that’s what it was like, you know. Project Veritas Journalist (PV): But these people in Texas are really upset that the Constitution is not being covered. KK: It is being covered, but not the way they…cause they’re idiots and they don’t know what’s in it. PV: Is it covered as much as it would have been? KK: In 12th grade government it certainly is, and in 5th grade it is. Yes. PV: “It’s not a necessity for the kids.” KK: You should know a little bit about it, you shouldn’t have to memorize the thing. 
Republicans want to get in there and talk about stuff and change things about school stuff because they want to, they want to influence what is being taught. Common core doesn’t put up with that. PV: Yeah. And, so it’s not really being… A lot of these complaints about it are not so much about the content yeah… KK: They’re misunderstanding; they are people that don’t really know what they are talking about. I…I can’t stand it. If they talk to me one more time about…climate change not being real, I’m just gonna scream. PV: I am really glad I’m here in California, whatever religious affiliation you want to take is fine, but in Texas they want to push the Christianity. KK: Because they think it’s the only one. PV: They do, and I see that. KK: That’s why it’s so offensive to have these prayers in the school board. PV: Christianity is totally out of the common core? KK: Yes it is. Totally. It’s not a core concept at all. PV: But then there is a mention of other religions like Islam. KK: Yeah well you have to because … PV: So how did Islam get worked in? KK: Islam…they said you have to talk about Islam, you have to talk about Judaism and you have to talk about Christianity and they wanted to make it big about Christianity; no it’s like, everybody needs to know about everything else… PV: Is that one of the complaints, that common core does have a liberal bias? KK: Yes, they feel like we’d be educating their kids to the world which they don’t want to do that. They want their kids to only know this… It’s like birth control. They don’t want their kids to know about it, yet Chlamydia is huge in Texas. So it’s like, you know…In the schools that have kids that, because the kids don’t learn about anything about what they’re doing and they’re messing around and they get in trouble because they didn’t get educated. So, I think the progressive bias is the more educated you are, the better you are, and the conservative bias is the less they know the better they are going to be. Yeah. 
PV: What is it that they don’t agree with? KK: They don’t agree with Islam, so they don’t want their kids to be taught it. They don’t agree with birth control so they don’t want their kids to talk about it. They don’t agree with math because they don’t understand it. It’s not the same math they did in high school. So they don’t want their kids to know about it. It’s conservative push back, that, they are afraid. So these conservative, these conservative voters are afraid of everything. PV: I’m just wondering why Common Core specifically…before Common Core kids were learning about math and science. KK: Because it’s the government telling me what to do. People who say they want to teach the Constitution, only want to teach the part of the Constitution that they like. PV: Second Amendment? KK: But yet they don’t want to teach all of it. Damn the Second Amendment. I don’t think personal handguns need to be on anyone except the government, the police. What is the purpose of having a gun? The separation of church and state they don’t understand. They don’t like that. They don’t like equal rights between all groups. The voter suppression that is going on in the south is just unbelievably awful. People who are not educated are easy flim flam. And that they react by fear instead of by knowledge. The Project Veritas journalist asks Koerber about the profits for textbook publishers with the Common Core reform. KK: Anytime anybody changes something in a textbook its profitable for the textbook companies. So the textbooks have to change and the school district has to adopt the new ones, that’s profitable. PV: Say that again. KK: Anytime a change happens that has to be put in a textbook suddenly the school district has to adopt new books. 
The video also includes footage of Project Veritas president James O’Keefe at the South Carolina Tea Party convention, during which Republican 2016 frontrunner Donald Trump told his audience Common Core is a “disaster,” and that no candidate can win who is in favor of Common Core. “We spend more than anybody else and to a large extent that’s Common Core, because these people in Washington – the bureaucrats – are making a fortune,” Trump said. “They don’t give a damn about your kids in South Carolina.” O’Keefe asked Sen. Ted Cruz about the corporate cronyism behind Common Core. “I’ll tell you this, as President, I will instruct the Department of Education to end Common Core on day one,” Cruz said. Former Sen. Rick Santorum also asserted, “I am very much against Common Core, against any kind of federal intervention into our schools. That’s the big problem.” Asked about the major issues surrounding Common Core, Santorum replied: The elites in our culture who want to indoctrinate our young people into a certain way to think, a certain belief structure, and it’s all spread out through Common Core. I believe the best and safest way to maintain our values in this country is to leave it up to the people at the grass roots level. “It doesn’t matter if it’s corporate cronyism or liberal ideology, if you are slipping your agenda into our education system, we are going to expose you, one by one, until the whole rotten system is revealed,” O’Keefe tells Breitbart News about his project on Common Core and education. “Corporate cronyism and underhanded political deals have contributed to Common Core’s massive disruption and the unraveling of America’s educational fabric.”
Plotting Climate Data with Matplotlib and Python Is it getting hot in here, or is it just me? You’ve no doubt seen the barrage of coverage discussing climate change over the last century. But how do you separate the hype from the facts? Let’s go straight to the source. Today we’re going to use a dataset sourced directly from NOAA (National Oceanic and Atmospheric Administration) and plot that data in Python using Matplotlib. NOAA has a wide variety of datasets tracking all kinds of things, some of them reaching back hundreds of years. For this tutorial, we’re going to use a dataset tracking global land and ocean temperature anomalies each June. The dataset reaches all the way back to 1880, so that gives us a lot to work with. Let’s see what the data has to say. Access the Dataset The first thing you need to do is access the proper dataset from NOAA. They have a whole data gallery you can browse, but for this example we’ll be using the Climate at a Glance dataset. It comes in CSV format and shows a date, the mean temperature, and the variation of that mean from the average temperature between 1901 and 2000. That way we can see how much higher or lower the mean temperature is than the “average” temperature across the last century. Set up Dev Environment Once you’ve downloaded the dataset, we need to get our development environment set up. You’ll need Python 3.6 installed on your machine for this tutorial. We’ll begin by setting up a virtual environment to manage the dependencies. This uses the Python package virtualenv. If you don’t have it installed, you can get it by entering pip install virtualenv at your command line. $ mkdir climate_data $ cd climate_data $ virtualenv -p /usr/local/bin/python3 climate $ source climate/bin/activate This creates and activates a Python environment within the climate_data folder, so you can install your dependencies and not deal with conflicts from other Python versions or libraries.
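Before wiring anything into NumPy, it can help to eyeball the raw file. The snippet below is a minimal sketch: it first writes a tiny stand-in file (the header lines here are illustrative placeholders, not the exact NOAA text) so that it runs on its own, then prints the first few lines. With the real global_data.csv download, you would skip the writing step and just preview the file you downloaded.

```python
# Write a tiny stand-in for the NOAA download so the snippet is
# self-contained; with the real file, skip straight to the preview.
sample = """Global Land and Ocean Temperature Anomalies
June
Units: Degrees Fahrenheit
Base Period: 1901-2000
Missing: -999
189906,57.1,0.2
190006,57.3,0.4
"""
with open("global_data.csv", "w") as f:
    f.write(sample)

# Preview the first few lines: the description rows come first,
# followed by the comma-separated data rows.
def preview(path, n=7):
    with open(path) as f:
        return [line.rstrip() for line in f][:n]

for line in preview("global_data.csv"):
    print(line)
```

Seeing the description rows up front explains the header-skipping we do when loading the file later.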
Your shell prompt should look something like this now: (climate) Als-MacBook-Pro:climate_data alnelson$ The next thing we need to do is install matplotlib, which will help us plot the data on a graph. $ pip install matplotlib Once that’s done, we’re ready to move on to the coding part of this tutorial. Import the Data Create a Python file called climate.py and open it in your favorite text editor. Then import the necessary libraries: import matplotlib as mpl import numpy as np import matplotlib.pyplot as plt Note: if you’re on Mac OSX, then you may see an error when you try to import pyplot. This is a known issue with matplotlib and virtualenv. Luckily, you can use this workaround. Enter these lines right after the numpy import if you’re getting errors: mpl.use('TkAgg') import matplotlib.pyplot as plt The next thing we need to do is load in the CSV data file. We do that using NumPy’s genfromtxt function, like so: data = np.genfromtxt('global_data.csv', delimiter=',', dtype=None, skip_header=5, names=('date', 'value', 'anomaly')) We’ll break this down a little at a time. First, you need to enter the name and path of your data file. In my case, I have the dataset in the same directory as my Python file. Make sure you’re pointing to the correct file location. Next, I specify the delimiter, which is ',' since it’s a CSV. dtype=None tells the interpreter to automatically assign data types based on the data that appears in the columns. skip_header=5 tells it to skip the first 5 rows, because if you look at the dataset in a text editor, you’ll see that the first 5 rows are description. Finally, we tell NumPy what each column is called and save it to a variable called data.
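The genfromtxt call can also be tried on a few inline rows before pointing it at the real file. This is a sketch with made-up values in the same three-column shape as the NOAA CSV; skip_header is omitted here only because the inline sample has no description rows.

```python
import io
import numpy as np

# Illustrative rows in the same shape as the NOAA CSV: a date code,
# the mean temperature, and the anomaly vs. the 1901-2000 average.
csv_text = """188006,56.9,-0.2
194006,57.2,0.1
200006,58.0,0.9
"""

# dtype=None lets NumPy infer each column's type; names= lets us
# refer to columns by name instead of index.
data = np.genfromtxt(io.StringIO(csv_text), delimiter=',',
                     dtype=None, names=('date', 'value', 'anomaly'))

# Named columns can be pulled out like dictionary keys.
print(data['date'])     # the date codes
print(data['anomaly'])  # the deviations from the century average
```

The result is a structured array, which is why data['date'] and data['value'] work in the plotting step.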
Graph the Data Now that we’ve got our data loaded in, we need to set up matplotlib to receive it. plt.title('Global Land and Ocean Temperature Anomalies, June') plt.xlabel('year') plt.ylabel('degrees F +/- from average') plt.bar(data['date'], data['value'], color='blue') plt.show() You should now have a plot that looks like this: What does this mean for our Earth? Well, you have the data. I’ll let you be the judge. Next Steps If you wanted to do some other interesting experiments, you could look up the weather on the day you were born, or on major election days. Perhaps you could combine the weather with polling data to see if there was any correlation. NOAA has lots of datasets for you to play with, so go and check it out.
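If you want a number to go with the picture, one simple (and admittedly crude) summary is to compare the average anomaly in the early and late halves of the record. The values below are placeholders, not NOAA data; with the real dataset you would substitute data['anomaly'] from the genfromtxt call.

```python
import numpy as np

# Placeholder June anomalies (degrees F vs. the 1901-2000 average);
# with the real dataset, use data['anomaly'] here instead.
years = np.arange(1880, 1888)
anomalies = np.array([-0.3, -0.1, 0.0, 0.2, 0.4, 0.7, 0.9, 1.2])

# Compare the mean anomaly of the first half of the record
# against the mean anomaly of the second half.
first_half = anomalies[: len(anomalies) // 2].mean()
second_half = anomalies[len(anomalies) // 2 :].mean()
print(f"first half mean:  {first_half:+.2f} F")
print(f"second half mean: {second_half:+.2f} F")
```

A positive gap between the two means is one quick way to see a warming trend in the numbers, rather than just in the bars.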
Updated to reflect that Lee’s Liquors at 2903 Hamilton Street sits across the street from another drive-through news stand that is not affiliated with this story. BY REBECCA BENNETT — Local business owners Amrik and Ravinder Melhi received another setback this month when the Prince George’s County Board of License Commissioners revoked the liquor license of an Adelphi restaurant, The Golden Bull, which they allegedly control. Last August, the board ordered their business, Tick Tock Liquors in Langley Park, to stop selling alcohol. At Tick Tock, located at 1820 University Boulevard East, a handwritten sign on the door says, “We close,” even though the website still lists store hours as 7 a.m. to midnight. The store has a sordid past. Tick Tock Liquors and owners Amrik and Ravinder Melhi were implicated in the 2010 indictment of former Prince George’s County Executive Jack Johnson, who pleaded guilty to accepting bribes from developers, conspiracy, and extortion. Johnson was sentenced to seven years in prison and is scheduled to be released in 2018. Amrik Melhi admitted to conspiring with Johnson and a Prince George’s County Police officer to transport and distribute untaxed alcohol using extortion. In Dec. 2011, a U.S. District Court sentenced 52-year-old Melhi to 46 months in prison and three years of supervised release, the FBI said. He was released on March 18, 2014, according to the Federal Bureau of Prisons. The FBI says Ravinder Melhi, 49, pled guilty to illegally accessing a protected Maryland Motor Vehicle Administration computer for commercial and personal gain. She was sentenced to 18 months probation and ordered to pay a fine of $25,000. According to the 2010 federal indictment, the Melhis also own the building where Lee’s Liquor sits at 2903 Hamilton Street in West Hyattsville, which was highlighted as a target for forfeiture. A federal indictment alleged the properties were used to carry out illegal activity. 
The Hyattsville Patch reported in 2010 that Lee’s owner Ajay Sharma said the store was not involved. Amrik Melhi took a plea deal and agreed to pay a fine of $975,327.32, court records show. In the Sept. 2011 forfeiture consent order, the government did not require any of the personal properties, business properties or cars listed in the indictment to be turned over. That meant Tick Tock Liquors and Hyattsville tenant Lee’s Liquors could continue operating at their current locations. Despite being caught up in a major felony indictment, the Melhis and Tick Tock Liquors are still having trouble staying on the straight and narrow. Court records show that an investigator for the Comptroller of Maryland filed an incident report in Aug. 2013 alleging Tick Tock Liquors purchased nearly 4,000 bottles of contraband beer from an unlicensed wholesaler. According to notes from the Jan. 8, 2014 Board of the License Commissioner’s hearing, Tick Tock Liquors was found responsible for “unlawfully purchasing or keeping alcoholic beverages on a licensed premises which were purchased from other than a duly licensed manufacturer or wholesaler …” The establishment was fined $5,000 and its license suspended for 10 days. The license commissioners asked that Tick Tock Liquors show cause for two incidents in March 2014 at the establishment. On March 8, according to a Prince George’s County Police report, officers responded to a stabbing just outside of the restaurant and store. The board fined Tick Tock $5,000 and imposed a 30-day suspension on its liquor license. The board stated, “The bartender, who served in that position for a number of years never received the formal training in alcoholic beverage service which is so necessary to enable service personnel to recognize the signs of intoxication and prevent over service.” On March 23, according to Prince George’s County Police, a customer unscrewed a foot-long beer tap handle and struck another customer over the head several times.
The board fined Tick Tock Liquors $12,500. According to the license commissioners, “The staff of the licensee did not summon medical assistance for a customer who was injured so seriously that he was transported to the hospital by medical personnel when they were finally summoned by the police.” “Police resources are unduly taxed by the operation of the premises. The situation is intolerable and clearly a danger to the community,” license commissioner notes say, citing that the board found approximately 20 percent of recent calls to police in the area involved Tick Tock Liquors. … The board concludes that the operation of the premises causes a danger to the peace and good order of the community …” In Aug. 2014, the board held a special session to recommend Tick Tock’s liquor license not be renewed. The license commissioners alleged that even though Ravinder Melhi could no longer hold a liquor license after being convicted of a felony, and that she had technically sold her share to other people, she was still controlling the liquor licenses at Tick Tock Liquors and at Golden Bull, a restaurant located near Adelphi Road and Riggs Road. “Although Mrs. Melhi holds herself out to be a mere salaried employee of the company, she continues to exercise operational control over the business and to represent the business in any number of critical areas,” says notes from the Aug. 6 Board of License Commissioners meeting. Ravinder Melhi’s daughter is a stockholder at Golden Bull, according to the license commissioners. “The board finds that these two licensed premises are operated under one management umbrella,” board notes state. Calling Ravinder Melhi’s actions fraudulent, the board voted to not renew Tick Tock’s liquor license effective Aug. 22, 2014. Court records show attorney Edward Leyden filed a petition for judicial review in the Prince George’s County Circuit Court. 
Leyden also filed a motion to stay the license commissioner’s decision, claiming it was arbitrary and unsupported by substantial evidence. Court documents filed on behalf of Tick Tock Liquors stated, “Mrs. Melhi worked unrelentingly and ceaselessly to maintain and grow the Tick Tock business in order to provide the kind of financial legacy for her three children that her own parents could never have dreamed of providing to her.” Tick Tock Liquors grossed revenues of $7.4 million in 2012, according to its lawyer. In response, court documents filed by the Board of License Commissioners state, “the board’s decision does not cut off all sources of revenue to the petitioner. It does not prevent the petitioner from … providing meals to patrons without the service of alcoholic beverages.” Golden Bull, the other liquor license the board alleges Ravinder Melhi controls, has had its fair share of problems this year, including the alleged operation of a separate liquor store when they were only authorized to sell alcohol in a restaurant. According to Dec. 17 findings of fact and conclusions, the license commissioners also decided to revoke that license effective Jan. 9, 2015. Leyden, who is also representing Golden Bull, has already filed a request for judicial review with the circuit court. He said it is not an accurate characterisation in reference to Mrs. Melhi controlling both liquor licenses. Even though Leyden filed the relevant paperwork for Tick Tock Liquors, he said the primary counsel is Tim Maloney with law firm Joseph Greenwald and Laake. The Hyattsville Life & Times reached out to Maloney’s office and is awaiting a comment.
Have you prepared yourself to return, this Sunday, to Twin Peaks, that small Washington town, so well known for its coffee and cherry pie, once rocked by the murder of homecoming queen Laura Palmer? Fans of the eponymous television series, which first made surreal prime-time television history on ABC in 1990, have binge-watched and re-binge-watched its original two seasons in advance of the new Twin Peaks' May 21st debut on Showtime. Even fans who disliked the second season, in which series creators David Lynch and Mark Frost gave in to network pressure to resolve the story of Palmer's murder, have re-watched it, and with great excitement. But can simply watching those first thirty episodes (and maybe the follow-up feature film Twin Peaks: Fire Walk with Me, once booed at Cannes, the very same festival which will screen the first two parts of the new Twin Peaks on the 25th) suffice? To get yourself as deep into the show's reality as possible, we recommend dipping into the Twin Peaks material we've posted over the years here at Open Culture, beginning with the four-hour video essay on the series' making and mythology we featured just this past January. You can orient yourself by keeping an eye on Lynch's hand-drawn map of the town of Twin Peaks, which he used to pitch the show to ABC in the first place, and which appears just above. But Twin Peaks has its foundation as much in music as in geography. Just above, you can hear composer Angelo Badalamenti, a frequent collaborator with Lynch, tell the story of how he and the director composed the show's famous "Love Theme," which not only made an impact on the televisual zeitgeist but set the tone for everything to follow. "It's the mood of the whole piece," Lynch once said of the composition. "It is Twin Peaks."
Badalamenti has scored the new series as well, joining the long list of returnees to the project that includes not just Lynch and Frost, but Kyle MacLachlan as FBI Special Agent Dale Cooper and many others from the original cast as well, including the late Miguel Ferrer and Warren Frost. “There’s so much more to Twin Peaks than a riveting murder mystery,” says Alan Thicke, another performer no longer with us, hosting the 1990 behind-the-scenes preview of the show's second season just above. “There’s a whole look and a feel and a texture,” an experience “180 degrees away from anything else on television.” As dramatically as televisual possibilities have expanded over the past 27 years, it seems safe to say that the continuation of Twin Peaks, which comes after such expansions of its fictional universe as Frost's Secret History of Twin Peaks, will maintain a similar creative distance from the rest of what's on the air. "The one thing I feel I can say with total confidence," to paraphrase David Foster Wallace writing about Lost Highway twenty years ago, is that the new Twin Peaks will be... Lynchian. Above, you can watch a mini-season of Twin Peaks, which also doubles as a series of Japanese coffee commercials. They, too, come courtesy of David Lynch. And below, watch “Previously, on Twin Peaks…”, an abbreviated, 55-minute refresher on what happened during the first two seasons of the show. (It comes to us via WelcometoTwinPeaks.) You can also read a recap of every episode over at The New York Times.
Related Content: Watch an Epic, 4-Hour Video Essay on the Making & Mythology of David Lynch’s Twin Peaks Hear 20 Minutes of Mark Frost’s New Secret History of Twin Peaks, the Book Fans Have Waited 25 Years to Read David Lynch Draws a Map of Twin Peaks (to Help Pitch the Show to ABC) Angelo Badalamenti Reveals How He and David Lynch Composed the Twin Peaks‘ “Love Theme” David Lynch Directs a Mini-Season of Twin Peaks in the Form of Japanese Coffee Commercials Based in Seoul, Colin Marshall writes and broadcasts on cities and culture. He’s at work on a book about Los Angeles, A Los Angeles Primer, the video series The City in Cinema, the crowdfunded journalism project Where Is the City of the Future?, and the Los Angeles Review of Books’ Korea Blog. Follow him on Twitter at @colinmarshall or on Facebook.
Christmas is a very special occasion, and people love to make things for it. I've put together some sweet Chex mix recipes, savory Chex mix recipes, and Christmas crack to make your holiday season a memorable one. Making special recipes at Christmas has long been a tradition. Here I have sorted out my favorite twenty-five sweet Chex mix recipes: one recipe per day through the twenty-fifth of December. I hope you will love this idea of making one recipe each day. If you have kids, you would love to see crafts for them here: EASY TO MAKE CHRISTMAS CRAFTS FOR TODDLERS Sweet Chex Mix Recipes 1. Sweet & Salty Cashew Chex Mix Get the Recipe 2. Caramel Churro Chex Mix Enjoy the Recipe 3. Zesty Ranch Chex Mix Get the Recipe 4. Sweet Holiday Chex Mix Get the Recipe 5. Lucky Rainbow Chex Mix Get the Recipe 6. Chex Mix Arare Get the Recipe 7. Thin Mint Flavored Chex Mix Get the Recipe 8. Addicting Toffee Chex Mix Get the Recipe 9. Salt and Vinegar Chex Mix Get the Recipe 10. Apple Cinnamon Snack Mix Get the Recipe 11. Buffalo Cheddar Ranch Chex Mix Get the Recipe 12. Almond Coconut Chex Mix Get the Recipe 13. Whole Grain Chex Mix Get the Recipe 14. Ranch Chex Party Mix Get the Recipe 15. Sugar Cookie Chex Mix Get the Recipe 16. Num Num Chex Mix Recipe Get the Recipe 17. Chocolate Cherry Trail Mix Recipe Get the Recipe 18. Homemade Margarita Chex Mix Get the Recipe 19. Orange and Cranberry Chex Mix Get the Recipe 20. Buckeye Chex Mix Recipe Get the Recipe 21. Bear Lake Trail Mix Get the Recipe 22. Snickers Chex Mix Get the Recipe 23. Healthier Coconut Almond Chex Mix Get the Recipe 24. Autumn Spice Chex Mix with Pecans and Cranberries Get the Recipe 25. Mom’s Secret Christmas Eve Chex Mix Get the Recipe Note: I do not claim these sweet Chex mix recipes as mine. I have simply collected them from different sources and linked them back to their owners. If anyone has a problem with that, just write me an email and I will remove the recipe in question.
ISLAMABAD (Reuters) - Taliban insurgents said on Tuesday that the Pakistani schoolgirl its gunmen shot in the head deserved to die because she had spoken out against the group and praised U.S. President Barack Obama. A student holds a placard with a picture of schoolgirl Malala Yousufzai, who was shot on October 9 by the Taliban, during a rally in Lahore October 16, 2012. REUTERS/Mohsin Raza Malala Yousufzai, 14, was flown to Britain on Monday, where doctors said she has every chance of making a “good recovery”. The attack on Yousufzai, who had been advocating education for girls, drew widespread condemnation. Pakistani surgeons removed a bullet from near her spinal cord during a three-hour operation the day after the attack last week, but she now needs intensive specialist follow-up care. Authorities have said they have made several arrests in connection with the case but have given no details. Pakistan’s Taliban described Yousufzai as a “spy of the West”. “For this espionage, infidels gave her awards and rewards. And Islam orders killing of those who are spying for enemies,” the group said in a statement. “She used to propagate against mujahideen (holy warriors) to defame (the) Taliban. The Quran says that people propagating against Islam and Islamic forces would be killed. “We targeted her because she would speak against the Taliban while sitting with shameless strangers and idealized the biggest enemy of Islam, Barack Obama.” Yousufzai, a cheerful schoolgirl who had wanted to become a doctor before agreeing to her father’s wishes that she strive to be a politician, has become a potent symbol of resistance against the Taliban’s efforts to deprive girls of an education. Pakistanis have held some protests and candlelight vigils but most government officials have refrained from publicly criticizing the Taliban by name over the attack, in what critics say is a lack of resolve against extremism. “We did not attack her for raising voice for education. 
We targeted her for opposing mujahideen and their war,” said the Taliban. “Shariah (Islamic law) says that even a child can be killed if he is propagating against Islam.”
It was nearly 30 years ago—Jan. 24, 1986, nearly a decade after it had been launched—that the Voyager 2 spacecraft made its closest pass to Uranus and, as TIME phrased it, “taught scientists more about Uranus than they had learned in the entire 205 years since it was discovered.” But the sophisticated equipment that sent information back to NASA wasn’t the only important thing on board the spacecraft. The Voyager 2, like the Voyager 1, carried with it a record, plated in gold, on which had been encoded sounds and images meant to “portray the diversity of life and culture on Earth,” according to NASA. The message from Earth was curated by a committee led by Carl Sagan and contained 115 images of “scenes from Earth.” It was estimated in 1977, when the Voyagers launched, that it would take 40,000 years for them to reach a star system where there might be a being capable of deciphering the record. But, should that ever happen, what exactly could those photos say about humanity? Here’s a hint, from a few of the pictures on the golden record, and our best guesses at how hypothetical aliens might interpret them: Cute young Earthlings and an image of their planet, or maybe giant Earthlings and a smaller planet under their control: A group of children learn at the UN International School in New York City in 1968 Yutaka Nagata—UN Photo A fully grown Earthling, or maybe a demonstration of the kinds of weapons available on Earth: Portrait of a worker engaged in thinning out a densely regenerated area of trees in Nicaragua in May 1971. Yutaka Nagata—UN Photo An Earth city building at sunset, or maybe the spacecraft with which a large number of Earthlings will come find you: The United Nations headquarters in New York City in 1968. Yutaka Nagata—UN Photo Earth traffic jam, or maybe why Earthlings will be fleeing to move to your home planet: Traffic congestion at rush hour jam in Bangkok in 1972. 
UN Photo Earth scientist at work, or maybe an Earthling with goggles that can see you right now: A student uses a microscope in a health center in Mogadishu, Somalia in 1970. Rice—UN Photo How this thing got to you, or maybe a missile: Voyager 2 launching aboard a Titan-Centaur rocket in Cape Canaveral, Fla., on Aug. 20, 1977. NASA An image of early Earth spaceflight, or a being we abandoned in space: Gemini 4 astronaut Ed White, the first American to take a spacewalk. NASA Celestial bodies near the Earth, Jupiter, Mercury and Mars, or maybe the places we’ve already conquered: Jupiter, Mercury and Mars. NASA Where to find us, or maybe where to stay away from: Planet Earth. NASA Contact us at editors@time.com.
Authorities confirm remains of Galveston teen Jessica Cain, who disappeared in 1997 C.H. Cain and his wife, Suzy Cain, release a dove in memory of their daughter, Jessica Cain, during a tribute site dedication at Highland Bayou Park in 2012, in La Marque. Photo: handout Authorities have confirmed the remains found in a south Houston field are those of Jessica Cain, a Galveston teenager who disappeared almost two decades ago. Galveston County Criminal DA Jack Roady announced Friday that remains found last month in southeast Houston have been identified as those of Cain, according to a press release. RELATED: Police confirm remains of missing UNT student Kelli Cox Cain, 17, vanished on Aug. 17, 1997 as she drove home from a high school musical cast party at a Clear Lake restaurant. Her truck was later found on the shoulder of Interstate 45 with her purse locked inside. "We are relieved at the news that Jessica has been found," Roady said in a release. "But while this news brings confirmation, it also brings new sorrow to Jessica's family, friends and those in law enforcement who have mourned her loss. We ask that everyone be respectful of her family and friends' need for privacy during this time of grieving." Earlier this week authorities confirmed that a second set of remains found in the area belonged to Kelli Cox, a University of North Texas student who went missing on July 15, 1997. EXPLAINED: What you need to know about William Lewis Reece William Reece, a convicted kidnapper, is linked to both cases. He is currently serving a 60-year prison sentence for a separate kidnapping in 1997.
Reece, 56, led police to the pasture near Hobby Airport where the remains were found. He is also suspected in the kidnappings and killings of 12-year-old Laura Smither and 19-year-old Tiffany Johnson. Both of those slayings occurred in 1997.
WHAT if aliens really do exist? The thought itself is too much for BBC bosses to handle. The media behemoth shut down a TV presenter's plan to point a radio telescope at a newly-discovered planet out of fear that aliens might answer back. Professor Brian Cox said the BBC was concerned the experiment, to be staged live on air during his show Stargazing Live, broke the corporation's health and safety rules. "We decided that we'd point the Jodrell Bank telescope at the planet (Threapleton Holmes B) that had been discovered by these two viewers (in January) and listen because no one had ever pointed a radio telescope at it and you never know," Mr Cox said on BBC radio. "The BBC actually said, 'But you can't do that because we need to go through the regulations and health and safety and everything in case we discover a signal from an alien civilisation'." "(I said), 'You mean we would discover the first hint that there is other intelligent life in the universe beyond Earth, live on air, and you're worried about the health and safety of it?' "It was incredible. They did have guidelines. Compliance."
(CNN) -- To project elections, CNN and its election experts use scientific statistical procedures to make estimates of the final vote count in each race. CNN will broadcast a projected winner only after an extensive review of data from a number of sources. CNN editorial policy strictly prohibits reporting winners or characterizing the outcome of a statewide contest in any state before all the polls are scheduled to close in every precinct in that state. CNN will receive information from the following sources: The Associated Press: The Associated Press will provide vote totals for each race. The AP will be gathering numbers via stringers based in each county or other jurisdiction where votes are tabulated. Edison Media Research: To assist CNN in collecting and evaluating this information, CNN, the other television networks and The Associated Press have employed Edison Media Research (EMR). In previous elections, this firm has assisted CNN in projecting winners in state and national races. EMR will conduct exit polls, which ask voters their opinion on a variety of relevant issues, determine how they voted, and ask a number of demographic questions to allow analysis of voting patterns by group. Using exit poll results, scientifically selected representative precincts, vote results from The AP, and a number of sophisticated analysis techniques, EMR also recommends projections of a winner for each race it covers. Collecting data The process of projecting races begins by creating a sample of precincts. The precincts are selected by random chance, like a lottery, and every precinct in the state has an equal chance to be in the sample. They are not bellwether precincts or "key" precincts. Each one does not mirror the vote in a state but the sample collectively does. The first indication of the vote comes from the exit polls conducted by EMR. On the day of the election, EMR interviewers stand outside of precincts in a given state. 
They count the people coming out after they have voted and are instructed to interview every third person or every fifth person, for example, throughout the voting day. The rate of selection depends on the number of voters expected at the polling place that day. They do this from the time the polling place opens until shortly before it closes. The interviewers give each selected voter a questionnaire, which takes only a minute or two to complete. It asks about important issues and the voter's background characteristics, and it asks for whom they voted in the most important races. During the day, the interviewer phones the information from the questionnaires to a computer center. Next, vote totals come in from many of the same sample precincts as the exit polls after the voting has finished in those precincts. These are actual votes that are counted after the polls have closed. Election officials post the results so anyone at the precinct can know them. The third set of vote returns comes from the vote tallies done by local officials. The local figures become more complete as more precincts report vote returns. The county or township vote is put into statistical models, and EMR makes estimates and projections using those models. In addition, CNN will be monitoring the Web sites of the Secretaries of State's offices to help analyze the outcome of early voting and absentee voting. Projections The projections for CNN will be made from the CNN Election Analysis Center at the Time Warner Center. An independent team of political analysts and statistical experts will analyze the data that will lead to the final decisions on projections. CNN will decide when and how to make a projection for a race depending on how close the race is. In races that do not appear to be very close, projections may be made at poll closing time based entirely on exit poll results, which are the only information available when the polls close about how people voted. 
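The "every third person or every fifth person" rule described above is a standard systematic sample. As a rough illustration only (the function name, voter list and interview rate below are invented for this sketch, not EMR's actual procedure):

```python
import random

def exit_poll_sample(voters, rate):
    """Select every `rate`-th voter leaving the polling place,
    starting from a random offset so each voter has an equal
    chance of being interviewed."""
    start = random.randrange(rate)
    return voters[start::rate]

voters = list(range(1, 101))           # 100 voters over the day
sampled = exit_poll_sample(voters, 5)  # interview every 5th voter
print(len(sampled))                    # 20 interviews regardless of offset
```

The random starting offset is what keeps the sample unbiased: a fixed rate alone would still be fair across the day, but the offset removes any systematic preference for, say, the very first voter at the door.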
The races projected from exit polls alone are races with comfortable margins between the top two candidates. Projections from exit polls also take into account the consistency between exit poll results and pre-election polls. In the case of close races, CNN will wait for actual votes to be tabulated and reported. EMR may make projection recommendations to its clients, but CNN will make all final calls for broadcast. Shortly after poll closing time, CNN may make projections using models that combine exit polls and actual votes. This happens in closer races. For extremely close races, CNN will rely on actual votes collected at the local level. These are the races that cannot be projected when the polls close from exit polls or even from actual votes collected at the sample precincts mentioned earlier. The projection for these races will be based on a statistical model that uses the actual votes. If it is too close for this model to provide a reliable projection, CNN will wait for election officials to tally all, or almost all, of the vote. What a projection call means CNN analysts will make all projections for CNN broadcasts. When CNN's analysts project a winner in a race, whether it is based upon data from EMR or from the CNN computations, it means that when all the votes are counted, CNN projects that the candidate will win the race. A projection is as close to statistical certainty as possible, but that does not mean that a mistake cannot happen; rather, it means that every precaution has been taken to see that a mistake is not made. CNN will not "declare" someone a winner because that declaration is up to election officials. CNN will make projections based on our best estimate of how CNN expects an election to turn out. When a lot of vote returns have been tallied, a race may be referred to as "too close to call" by CNN anchors and analysts. "Too close to call" means the final result will be very close and that the CNN analysts may not know who won. 
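CNN's actual models are far more sophisticated, but the escalating caution described here can be caricatured as a simple threshold rule: the closer the observed lead is to the sampling error, the more evidence is required before a call. All numbers and names below are invented for illustration:

```python
def projection_status(pct_a, pct_b, margin_of_error):
    """Toy decision rule: project a winner only when the observed
    lead clearly exceeds the sampling error; otherwise keep
    waiting for harder evidence (actual votes)."""
    lead = abs(pct_a - pct_b)
    if lead > 2 * margin_of_error:
        return "projectable from exit polls"
    elif lead > margin_of_error:
        return "wait for sample-precinct votes"
    else:
        return "too close to call"

print(projection_status(54.0, 46.0, 3.0))  # comfortable margin
print(projection_status(50.4, 49.6, 3.0))  # wait for the full tally
```

The point of the sketch is only the ordering of evidence: comfortable margins can be called from the survey alone, narrow ones demand counted ballots, and razor-thin ones may not be callable at all until officials finish the tally.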
For the races that are the closest, the CNN Election Analysis Desk will keep CNN viewers up to date on the state-by-state rules regarding automatic recounts and will report immediately on any official candidate challenge regarding the results or voting irregularities.
Come January, a new mayor of New York will take office with the city facing a bad budget forecast: cloudy, chilly, with a chance of apocalypse. You wouldn’t know this by following the candidates around now. They’re out campaigning on the sunny side of the street — a place where, to hear them tell it, thousands of new police officers will soon be on patrol, tens of thousands of dilapidated public-housing units are about to be swiftly repaired and new affordable apartments built, and children will one day be heading off to their universal prekindergarten classes, getting a head start on life in a greener, safer, sturdier city in which the middle class and the City University are resurgent, jobs and services are abundant, and there’s money to cover all needs. Those visions are seductive; they’re supposed to be. Politicians don’t get elected by predicting failure and doom. But doom can happen, especially when unrealistic promises meet dismal facts like these: a $2 billion gap in the city’s $70 billion budget; huge and steadily growing pension obligations and health care costs for city workers; middling economic growth; the unexpected and continuous costs of disasters like Hurricane Sandy. One burden is the contracts covering 300,000 workers in nearly every municipal union. Mayor Michael Bloomberg let them all expire years ago, and they have not been renegotiated. The unions haven’t had raises in recent years, and they are expecting the next mayor to hand over nearly $8 billion in retroactive pay. That is more than the city’s annual operating budgets for police, fire and corrections combined.
There was a jackhammer and a crew from Muddruckers hard at work at the entrance to the Bomber Store at the stadium, Monday, with the statue of Bud Grant looking nervously on. The home of Winnipeg’s CFL team obviously still needs some work. As for the team itself, current head coach Mike O’Shea is convinced the roster he settled on over the weekend is better than the one that posted an 11-7 record last year. “We had a lot of depth at camp, some great competition, and our team’s going to be better for it,” O’Shea said in his first media session since Thursday’s pre-season finale. To get there, though, he had to take a jackhammer to some dreams. Aside from having to regularly meet with local media types, the worst part of a head coach’s job is still calling a player into his office and telling him it’s time to pack his bags because he’s not good enough. Try doing that a dozen times in a single day. “Obviously decision-making time becomes difficult,” O’Shea said. “There’s just little things that separate one guy from the next. And you’re dealing with real people that have real dreams that maybe change when they finish talking to me.” Some are fresh out of college, some might be at their last pro stop. One was supposed to be a key receiver this season. Kenny Stafford signed a two-year contract with the Bombers in January, hoping to put down some roots after a nomadic first four years in the CFL. Over the weekend he was cut loose after two pre-season games in which he didn’t catch a single pass. The reason? Two, actually: Old Reliable, aka Clarence Denmark, and flashy newcomer L’Damian Washington. “There were just a couple of guys that their play was pretty elevated and they ended up winning jobs,” O’Shea said. “I don’t think it was anything that Kenny did or didn’t do. It’s a matter of a guy like Denny and Washington stepping up pretty good.” The performance of the receivers in the pre-season probably made that decision an easy one. 
Denmark just keeps making plays, while Washington is a mouthwatering combination of speed, size and hands the Bombers just couldn’t release. Of course, seven NFL teams and the Edmonton Eskimos, briefly last fall, have also had a look at Washington, but the 26-year-old has yet to find a football home. He quietly joined the Bombers a few days into training camp. “He came in a few days late and picked everything up very quickly,” O’Shea said. “And just that intrigue… his physical stature and athleticism is a pretty good combination.” Finding the right combination for the Canada Day season-opener in Regina is now Job 1. Injuries always play a role – dime back Moe Leggett will try to ease his way back from injury this week, while first-year receiver/kick returner T.J. Thorpe is “at least a couple weeks” away – and O’Shea sounds like someone who’s still undecided at a couple of spots. “There’s still some interesting things that could play out,” he said. “We still have maybe a couple more decisions to make.” Canadian Sam Hurl looks to be the starting middle linebacker, although O’Shea wouldn’t confirm it, other than to say Hurl had another good camp. The boss did express his displeasure with the team’s lack of discipline in the pre-season, vowing to address it. “We took too many penalties, overall,” O’Shea said. “I’m not singling guys out for penalties. We handle that in-house. There were far too many penalties in the pre-season, and we’ll fix all that.” As for a defence that leaked considerably the last two weeks, O’Shea shrugged off any concerns, saying a lot of players were moving around. Stopping the other guys, you’ll recall, was an issue last season. “We’ve done a good job of addressing and improving the entire team,” O’Shea said. “That’s not just players and personnel. It’s systems and concepts, it’s the way we plan – all that.” One thing he hasn’t improved: the difficulty in letting players go, especially those he’s been around for a while. 
Four training camps into this gig, he’s thinking that may never get easier. “I hope it doesn’t.” pfriesen@postmedia.com Twitter: @friesensunmedia Bombers ponder Week 1 bye To bye, or not to bye? That is the question facing the Blue Bombers, as they’ll be the only CFL team not playing, Week 1. Advantage, or disadvantage? “Getting to watch film and them not watching us would I guess be an advantage to us,” coach Mike O’Shea said. “But we’re also not going to be hitting this week or tackling and things like that. We’ve got to make sure we take care of the little details that we need to take care of.” To that end, the Bombers, who were off on the weekend, will practice Tuesday through Friday this week. That’s a far cry from the long holiday players would get if the bye week happened a month or two from now. “It’s too fresh out of camp for me to think they can all basically go home for a week and come back and pick up,” O’Shea said. “If it was mid-season it’s a different story.” Other teams, of course, have gone through this: Saskatchewan last year, B.C. in 2015 and the expansion Ottawa Redblacks the year before that. All three lost the next week. O’Shea says he hasn’t picked anyone’s brain to try to find the best approach. “It’s just different,” he said. “There are a lot of things good about it. You’ve just got to find the right plan. Until you do it once, you’re not really going to know if it was the right plan or not. We’ll see how it turns out.” Winnipeg opens the season in Regina on Canada Day.