There is much less need to learn foreign languages in countries where English is the first language. For those from countries where English is not the first language, English is an obvious language to study: it is useful all over the world, not just in countries where English is the native language, because so many people speak it as a second language. There is no equally obvious second language for native English speakers. It is undeniable that English is increasingly a global language; it is the language of technology and global communication, and the language likely to be used in a conversation between, for example, a German scientist and an Italian politician. [1] It is therefore realistic for English speakers to believe that any other language they learn will have less utility than their own. [1] English Online Learners, ‘English the Global Language’, British Council,
There is a large gap between those who make progress in languages and those who do not. Those with a natural ability in languages struggle to deconstruct difficult concepts and explain them to learners who cannot understand; such teachers cannot empathise with students who struggle. Expecting students who have great difficulty in learning languages to learn them from teachers who cannot even explain linguistic concepts successfully is far too much to ask. This is one reason why in the UK Ofsted (Office for Standards in Education) considers language teaching in secondary schools to be weak. [1] There are similar problems with grammar between those who are bilingual and those who are not. People who are bilingual because of their background do not think in terms of grammar. If they do not know when and why certain grammatical constructions are used, how is an absolute beginner struggling with languages supposed to understand such grammar rules? [2] [1] Webb, Lauren, ‘Ofsted reports poor language teaching in UK’, Veritas, [2] Reynolds, ‘Bilingualism, Multiculturalism and Second Language Learning’, 1990, p.164
In many countries it would not be practical to make foreign languages mandatory. It would not always be practical to expand foreign language teaching until it is mandatory for all students. In the United Kingdom, for example, there is already a shortage of foreign language teachers, with 73% of Local Education Authorities struggling to find teachers, particularly for Maths and Languages. [1] At the same time, many countries worry about their competitiveness in the world because of the success of East Asian countries in education. The PISA tests show that East Asian countries, particularly China (Shanghai and Hong Kong), South Korea and Singapore, far exceed countries where English is the first language in Maths and Science, suggesting a need to improve those subjects first. [2] [1] MailOnline, ‘Teacher shortage reaching crisis levels’, [2] PISA, ‘What Students Know and Can Do: Student Performance in Reading, Mathematics and Science’, OECD, 2009,
While it is undeniable that English is currently the most used international language, this is not a reason to be complacent. Just because English is dominant now does not mean it will remain so. In the 18th, 19th and into the 20th centuries French was the international language, and before that Latin held that role in Europe. It is as likely that the dominance of the English language will decline as that it will continue to increase. Students should be taught other languages to take advantage of changes that may occur within their lifetimes.
Students not excelling in a subject is not a reason for not teaching that subject. Even a basic understanding of another language is useful. Anxiety is something students have to work through, and it may well affect other subjects too; students who are anxious about learning foreign languages will never be willing to attempt them if they are not compulsory at school.
Teachers accept that marking student work is an important part of their job. Well planned homework should not take so long to mark that the rest of their job suffers, and it can inform their understanding of their students, helping them design new activities to engage and stretch them. As for recruitment, although teachers do often work in the evenings, they are not alone in this and they get long holidays to compensate.
Marking homework reduces the amount of time teachers have to prepare good lessons. Irrespective of homework's educational value, marking it takes up much of teachers' time. Australian teachers have complained that 'homework marking can result in four extra hours of work a day and they are rarely rewarded for their effort'.1 This leaves teachers tired and with little time to prepare effective, inspiring lessons. If the lessons aren't to the standard they should be, the point of homework is lost as the students have little to practise in the first place. The heavy workload also puts young graduates off becoming teachers, and so reduces the talent pool from which schools can recruit. 1 Speranza, 2011
The ban on homework could be easily enforced through school inspections. In many countries public schools require regular school inspections to ensure students are receiving a relatively equal level of education. In Britain, for example, Ofsted is a public body that exists specifically to inspect public schools.1 A ban on homework would thus not require a level of trust between the state and individual school principals, for state inspectors could very quickly work out whether homework was being given out by asking the children themselves. Children, who don't like homework at the best of times, would not lie. 1 Ofsted, 2011
Homework has not prevented students doing other activities; it takes very little time to complete. Recent American surveys found that most students in the USA spent no more than an hour a night on homework. That suggests there is not a terrible problem with the amount being set. Furthermore, British studies have shown that 'more children are engaging in sport or cultural activities' than ever before.1 As such, there is no clear evidence to suggest that students are stuck at home doing their homework instead of doing other activities. In addition, concerns over how busy children are suggest that parents need to help their children set priorities so that homework does not take a back seat to other activities. 1 BBC News, 2008
Homework is about 'winning' on tests, not learning. Many governments make their schools give students a national test (a test taken by all students of the same age). After the tests, they compare schools and punish the schools and teachers whose students do badly. Because schools and teachers are therefore scared about their students doing poorly, they give them more homework, not in the hope they learn more but simply so they do better on the tests.1 As such, homework is not designed to help the student, just their teachers and schools, who want students to 'win' the test and make them look good rather than to learn for their own benefit. 1 Sorrentino
Homework has little educational worth, and therefore is a waste of students' time. Homework adds nothing to the time spent in school. Some schools and some countries don't bother with homework at all, and their results do not seem to suffer from it. Studies show that homework adds nothing to standardised test scores for primary/elementary pupils. As Alfie Kohn notes, no study has ever found a link between homework and better test results in elementary school, and there is no reason to believe it is necessary in high school.1 International comparisons of older students have found no positive relationship between the amount of homework set and average test scores - students in Japan and Denmark get little homework but score very well on tests.2 If anything, countries with more homework get worse results! 1 Sorrentino, 2011 2 Britt, 2005
Homework has a lot of educational value; the reason it has not shown this is that teachers do not set the right kind of homework, or they set the wrong amount of it. Some teachers believe homework is for reviewing material, others think it is better for learning new concepts. The result is 'confusion for students'.1 If the homework were consistent, however, and related specifically to what is learnt in the classroom, it would have a great deal of educational value, helping students remember their lessons and increasing their confidence in how much they are learning. Furthermore, Professor Cooper of Duke University has shown that by the high school years there is a strong and positive relationship between homework and how well students do at school. There are two main reasons why this relationship does not appear in elementary school: 1) elementary school teachers assign homework not so much to enhance learning as to encourage the development of good study skills and time management;2 2) young children have less developed cognitive skills to focus and concentrate on their work.3 Thus, they are more easily distracted from their homework assignments. 1 Strauss, 2006 2 Muhlenbruck, Cooper, Nye, & Lindsey, 2000 3 Hoover-Dempsey et al., 2001
Setting homework with the intention of encouraging students to do well at tests benefits students as much as it does teachers and schools. National tests are a way of assessing whether students are at the level they should be; if they do well on the tests, that is a good thing. Therefore, a 'win' for the teachers and schools is also a great deal of learning for the student; the two need not be separated.
Many states do not in fact have a structured school inspection system that could enforce such a ban. The United States, for example, has one of the largest student bodies in the world, but the state does not have a formal inspection system that could enforce a ban on homework. Any ban would therefore amount to a recommendation at best, and could not possibly hope to be enforced. Furthermore, even in those states that do have inspection bodies, the predictability of inspections allows school principals to prepare for their arrival. Students might be forced by their teachers to lie to inspectors, as otherwise they would receive even more homework. Finally, school inspections exist partly to test the ability of students – teachers are therefore encouraged to give their students homework so that they do better on these inspections.
If homework puts students off learning, then it has been badly planned by the teacher. As Linda Darling-Hammond, a professor of Education, notes, 'many teachers lack the skills to design homework assignments that help kids learn and don't turn them off to learning'.1 The best homework tasks engage and stretch students, encouraging them to think for themselves and follow through ideas which interest them. Over time, well planned homework can help students develop good habits, such as reading for pleasure or creative writing. The research however suggests that homework is not in fact putting students off learning. Rather, studies in Britain indicate that 'most children are happy (and) most are achieving a higher level than before'.2 Homework cannot be blamed for a problem that does not exist. Poor children may indeed lack support to do their homework, but this just means that schools need to do more to provide the help they need. 1 Strauss, 2006 2 BBC News, 2008
Homework reduces the amount of time for students to do other activities. Homework takes up a lot of time. In America, they encourage the '10 minute rule', 10 minutes of homework for every grade, meaning that high-school students are all doing more than an hour's worth of homework each night.1 Being young is not just about doing school work every night. It should also be about being physically active, exploring the environment through play, doing creative things like music and art, and playing a part in the community. It is also important for young people to build bonds with others, especially family and friends, but homework often squeezes the time available for all these things. 1 Associated Press, 2009
Homework puts students off learning. Studies have shown that many children find doing homework very stressful, boring and tiring. Often teachers underestimate how long a task will take, or set an unrealistic deadline. Sometimes, because a teacher has not explained something new well in class, the homework task is impossible. So children end up paying with their free time for the failings of their teachers. They also suffer punishments if work is done badly or late. After years of bad homework experiences, it is no wonder that many children come to dislike education and switch off, or drop out too early. Teachers in Britain fear that poor children, because they lack the support to do their homework, will be turned off school.1 1 BBC News, 2008
Homework is a class issue. In school everyone is equal, but at home some people have advantages because of their family background. Middle-class families with books and computers will be able to help their children much more than poorer ones can. This can mean poorer children end up with worse grades and more punishments for undone or badly done homework. David Baker, a researcher, believes too much homework causes parents and children to get angry with each other and argue, destroying the child’s confidence.1 On the other hand, pushy parents may even end up doing their kids’ homework for them – cheating and not helping the student learn at all. 1 Britt, 2005
Setting homework does little to develop good study skills. It is hard to check whether the homework students produce is really their own. Some students have always copied off others or got their parents to help them. But today there is so much material available on the internet that teachers can never be sure. It would be better to have a mixture of activities in the classroom which help students to develop a whole range of skills, including independent learning. Furthermore, if teachers want to develop independence in their students, students should be given a choice in the matter of homework. Otherwise, they’re not using their judgement and therefore they aren’t being independent at all.1 1 Sorrentino, 2011
Homework ensures that students practise what they are taught at school. Having homework allows students to really fix in their heads work they have done in school. Doing tasks linked to recent lessons helps students strengthen their understanding and become more confident in using new knowledge and skills. For younger children this could be practising reading or multiplication tables. For older ones it might be writing up an experiment, revising for a test or reading in preparation for the next topic. Professor Cooper of Duke University has found evidence that elementary school students do better on tests when they do short homework assignments related to the test.1 Students gain confidence from such practice, and that shows when they sit the tests. 1 Strauss, 2006
Homework is an essential part of education, allowing students to learn information beyond that which they are taught at school. Homework is a vital and valuable part of education. There are only a few hours in each school day – not enough time to cover properly all the subjects children need to study. Setting homework extends study beyond school hours, allowing a wider and deeper education. It also makes the best use of teachers, who can spend lesson time teaching rather than just supervising individual work that could be done at home. Education is about pushing boundaries, and learning should not stop at the entrance to the classroom – students should take skills learnt in the classroom and apply them at home. Homework allows this to happen, encouraging students to go above and beyond what they do in school. Reading is the best example: students learn how to read at school, but in order to get better they need to practise, and that is best done at home, with the support of parents and at the right pace for the student.
Homework provides a link between child, school and the home. Education is a partnership between the child, the school and the home.1 Homework is one of the main ways in which the student’s family can be involved with their learning. Many parents value the chance to see what their child is studying and to support them in it. It has been described as the ‘window into the school’ for parents, the area in which schools, parents and students interact daily.2 And schools need parents’ support in encouraging students to read at home, to help with the practising of tables, and to give them opportunities to research new topics. 1 Walker et al., 2004 2 Gill & Schlossman, 2003
Homework encourages students to work more independently (by themselves). Homework encourages students to work independently, as they will have to at college and in their jobs. Everyone needs to develop responsibility and skills in personal organization, working to deadlines, being able to research, etc. If students are always “spoon-fed” topics at school they will never develop study skills and self-discipline for the future. A gradual increase in homework responsibilities over the years allows these skills to develop.1 For instance, to read a novel or complete a research project, there is simply no time at school to do it properly. Students have to act independently and be willing to read or write, knowing that if they struggle, they will have to work through the problem or the difficult words themselves. Diane Ravitch points out that a novel like Jane Eyre cannot be completed if it is not read at home – students have to work through it themselves.2 When given the choice of homework or no homework, most students would choose not to do it. But by doing homework they are effectively taught independence in finding their own ways to explain and understand the topic. 1 Bempechat, 2004 2 Ravitch, 2007
Homework is not an essential part of education. If what was to be learnt from homework was that essential, it would not be left to the child to learn on their own and away from school. In fact, many teachers admit to simply setting homework because they are expected to set it, not because they think it will be helpful.1 The best environment for learning is a classroom, where the student is able to ask for assistance if stuck and the teacher is available to help. 1 BBC News, 2008
Homework does not ensure that students practise what they are taught at school. Teachers often give pupils the end of the exercise they were doing in class to complete at home; these tend to be the harder questions towards the end of the exercise, and if a teacher or a tutor is not present to explain or help, the pupil comes to doubt their ability. To practise what a student has been taught requires the presence of a teacher or tutor who can guide the student if they get something wrong. Homework, done by the student on their own, offers little support and is only a source of stress. If confused, the student may only come to dislike the topic or subject, which will further reduce their ability to remember what they were taught.
The facts are against the premise again. Research does not support the idea that young people who play violent video games have decreased social ability. This is refuted most notably in studies by Anderson and Ford (1986), Winkel et al. (1987), Scott (1995), Ballard and Lineberger (1999), and Jonathan Freedman (2002). More recently, Block and Crain (2007) claim that in a critical paper by Anderson (and his co-author, Bushman), data was improperly calculated and produced fallacious results. Additional meta-analyses (reviews of research that attempt to statistically combine data from multiple studies for more powerful results) by other researchers, such as those by Ferguson and Kilburn (2009) and Sherry (2007), have failed to find any causal link between video game violence and aggression, as have recent reviews by the Australian Government (2010) and the US Supreme Court (June 2011). The question of whether violent games that only allow violence as a solution to problems could negatively affect young people in subtle ways deserves further study. However, there are many aspects of video games, such as puzzle solving, that are intrinsic parts of even the basest first-person shooters. Many first-person shooters themselves require tactical deployment and thinking – all of which can stimulate thought in people, albeit in a different manner than negotiation might. [1] Further, newer military games are more sophisticated, often requiring the player to take one side of a conflict and then the other in different levels of the game, or forcing the player to face moral dilemmas that affect the game’s storyline or outcome. [1] Freedman, Jonathan L. Media violence and its effect on aggression: assessing the scientific evidence. Toronto: University of Toronto Press, 2002. ISBN 0802084257
Violent Video Games Cause Social Interaction Problems. Video games of a violent nature tend to fail to offer many solutions to a problem. Most military shooters have no form of negotiation with enemies; players are asked to simply kill as many nameless terrorists as possible. Given this, social interaction problems can be caused because people are presented with problems and then told that they must be solved with violence instead of other methods. In other words, physical violence is portrayed as the first-choice (and often only-choice) solution to a conflict. This lack of portrayal of alternate solutions can stifle growth of other skills, especially amongst children and adolescents, specifically skills important to making friends and engaging in negotiation in times of conflict or pressure. Further, it encourages children to see people who oppose them as “others,” and thus presents them psychologically as enemies instead of as people who are simply different to the player and thus might have other grievances. This can lead to increases in aggression among players. This is especially true given the relatively simplistic portrayal of conflicts within areas such as the Middle East and Afghanistan. [1] [1] "Violent Video Games May Increase Aggression in Some But Not Others, Says New Research". apa.org. American Psychological Association. 27 September 2011.
The facts are strongly against the Proposition’s analysis. The proposition’s arguments fail to stand up in the real world. Several major studies published in The Journal of Adolescent Health, The British Medical Journal and The Lancet (among others) have shown no conclusive link between video game usage and real-life violent behaviour. The Federal Bureau of Investigation found no evidence linking video game use to the massacre at Columbine (or other highly publicized school shootings). [1] There is no evidence to support the idea that people exposed to violent video games (or other violent media content) will then go on to commit crimes. [2] Further, if violent video games were causing violent behaviour, we would expect to see rates of violent crime increase as games with realistic portrayals of violence became more widely available on popular game consoles. Instead, violent crime has decreased in recent years. Some economists have argued (based on time series modelling) that increased sales of violent video games are associated with decreases in violent crime. [3] In Grand Theft Childhood: The Surprising Truth About Violent Video Games and What Parents Can Do, researchers Lawrence Kutner, PhD, and Cheryl K. Olson, ScD, of Harvard Medical School and Massachusetts General Hospital’s Center for Mental Health and Media refute claims that violent video games cause an increase in violent behaviour. The researchers' quantitative and qualitative studies (surveys and focus groups) found that young adolescents view game behaviour as unrelated to real-life actions, and this is why they can enjoy criminal or violent acts in a game that would horrify them in reality. They also found evidence that those relatively few adolescents who did not play video games at all were more at risk for violent behaviours such as bullying or fighting (although the sample size was too small for statistical significance). The authors speculated that because video game play has gained a central and normative role in the social lives of adolescent boys, a boy who does not play any video games might be socially isolated or rejected. Finally, although more study is needed, there is some evidence to suggest that violent video games might allow players to get aggressive feelings out of their system (i.e., video game play might have a cathartic effect), in a scenario that does not harm anyone else. [4], [5], [6] [1] O’Toole, Mary Ellen, ‘The School Shooter: A Threat Assessment perspective’, Critical Incident Response Group, www.fbi.gov/stats-services/publications/school-shooter [2] Editorial. Is exposure to media violence a public-health risk? The Lancet, 2008, 371:1137. [3] Cunningham, Scott, et al., ‘Understanding the Effects of Violent Video Games on Violent Crime’, 7 April 2011, [4] Kutner, Lawrence & Cheryl K. Olson. Grand Theft Childhood: The Surprising Truth About Violent Video Games and What Parents Can Do. Simon and Schuster, 2008 [5] Bensley, Lillian and Juliet Van Eenwyk. Video games and real-life aggression: A review of the literature, Journal of Adolescent Health, 2001, 29:244-257. [6] Griffiths, Mark. Video games and health. British Medical Journal, 2005, 331:122-123.
Children See Violent Video Games. Whilst violent video games may be acceptable in the hands of a person old enough to see them and to understand the context in which the violence is being wrought, this may not be true of younger people who acquire games. Games with violent content are often easily acquired by players too young to purchase them. They may also gain access to them at home from older siblings. Because children do not have fully developed mental faculties yet, and may not clearly separate fantasy from reality, exposure to violent games can have a large impact upon children. This has a greater impact than children seeing films that feature realistic violence because whilst a child might get bored with films owing to the lack of interaction with the medium, this is much less likely to be the case with, for example, a military shooting game, which a child might play over and over. As such, all violent video games should be banned to prevent their acquisition by young children either by accident, or owing to parental ignorance. [1] [1] Anderson, Craig et al. The influence of media violence on youth. Psychological Science in the Public Interest, 2003, 4:81-110
This is empirically false. Again, the crux of the opposition's counter-argument is that the evidence in this regard is strongly behind the opposition. In April 2011, the U.S. Federal Trade Commission undercover shopper survey found that video game retailers continue to enforce the ratings, allowing only 13% of underage teenage shoppers to buy M-rated video games, a statistically significant improvement from the 20% purchase rate in 2009. By contrast, underage shoppers purchased R-rated movies 38% of the time, and unrated movies 47% of the time. Given that children are able to easily access violent content in other visual media, and there is no evidence that video games are more harmful than other media, this argument falls. Further, there is a long tradition of exposing children to extremely violent content in the form of fairy tales. Moreover, with greater education of parents regarding the harms of video games (and with more parents having played video games themselves), many are becoming savvier about appropriate restrictions on their children’s video game play. Given the lack of evidence that video games are clearly or uniquely harmful, but acknowledging society’s interest in protecting vulnerable children, investing in additional parent education is a more logical response than attempting to ban all violent games. [1] [1] Federal Trade Commission. FTC undercover shopper survey on enforcement of game ratings finds compliance worst for retailers of music CDs and the highest among video game sellers. News release, 20 April 2011.
Violent Video Games Cause Violent Behaviour. Video games exist as an interactive medium. The player has control over their character and many of their character’s actions, whereas in a book or movie the audience does not. This means that the player can become emotionally invested in characters to a greater extent because of the autonomy afforded to them. Given this, it becomes more difficult to ensure dissociation between the real world and the game world with which the player interacts. With the growing drive towards realism in video game graphics, game environments are able to look incredibly similar to real life, further blurring the distinction. If this is the case, then a person who visits violence upon another person within a game universe feels the same emotions as someone who does so in real life, and may therefore become desensitised to real-life violence. Whilst game producers would claim that this is not their aim and that their games do not cause this desensitisation, many have been actively pursuing technologies that allow for greater immersion within their game worlds. If so, acts of violence may fail to register the same level of shock or revulsion in a person as they usually do, and people who play video games become more able to harm others or less likely to intervene to prevent harm. In terms of actual evidence, however, there is very little to back up this analysis; most studies supporting the concept have been debunked by others. [1] [1] Anderson, Craig & Bushman, Brad. Effects of violent video games on aggressive behavior, aggressive cognition, aggressive affect, physiological arousal, and prosocial behavior: A meta-analytic review of the scientific literature. Psychological Science, 2001, 12: 353-359
Video games teach people to deal with frustrations in the wrong way. In dealing with frustrations and aggression by using video games as an outlet, players of these games often assume that the problem is gone or dealt with. This is often not the case, with many sources of frustration being ones which repeat day in and day out. Given that this is the case, video games prevent people from dealing with the root causes of their problems and thus leave people more susceptible to frustration in the future. Further, returning to the first proposition point, video games teach players only one method of dealing with their problems, resorting to violence, so when players do seek to deal with their frustrations in the real world, the solutions they reach for are often suboptimal.
There is a generation gap. Children today have grown up with computers and digital media devices, whereas their parents have not. Whilst some parents are able to readily adapt to new technology, a large proportion are unable to do so. Even if parents have adapted to the digital age, there are still plenty of things their children know about that parents simply cannot keep up with. It is entirely feasible for a child to keep the presence of a violent video game hidden from his or her parents through use of the various “Home” menus that all the major games consoles now possess. Further, on a computer, a user can simply Alt+Tab out of any application to avoid detection. Given that there are many ways for children to evade their parents, and given the generation gap, it seems unfair to expect parents to be able to monitor their children in this way.
Video Games Improve Skills. First, the claims of harm caused by video games have not been proven. The most criticised violent video games are generally military shooters. However, these games generally focus much more strongly on multiplayer components of the game. These multiplayer components often require significant levels of teamwork in order for one side to be successful over the other. As such, many of these video games end up teaching players core teamwork skills as well as often teaching leadership skills when players become part of organised gaming groups. Further, numerous researchers have proposed potential positive effects of video games on aspects of social and cognitive development and psychological well-being. It has been shown that action video game players have better hand-eye coordination and visuo-motor skills, such as their resistance to distraction, their sensitivity to information in the peripheral vision and their ability to count briefly presented objects, than non-players. Video games also promote the development of intellectual skills such as planning and problem-solving, and social games may improve the social capabilities of the individual. [1] Given then that video games provide these benefits, banning violent games would harm the industry overall, causing many of the developers of other games which encourage these kinds of skills to lose their funding from game publishers. Put simply, the banning of violent video games would lead to fewer games overall being published, and if these games have the effects listed above then a great net benefit is lost in the process. [2] [1] Green, C. Shawn & Daphne Bavelier. Action video game modifies visual selective attention. Nature, 2003, 423:534-537. [2] Olson, Cheryl K. Children’s motivations for video game play in the context of normal development. Review of General Psychology, 2010, 14: 180-187.
Violent Video Games Prevent Violent Behaviour. In most people’s lives there are instances where they might like to react to a situation with a level of aggression. However, owing to a number of reasons, such a solution is often impossible and undesirable. It has been theorised by psychologists that pent-up frustrations with the world are the root of many psychological problems. Given that this is true, an outlet for frustrations is required in society such that aggressive behaviour in individuals can be avoided. Video games in this situation provide such an outlet for aggression and frustrations. Firstly, aggression is dealt with through the simple act of defeating enemies within games, and frustration is dealt with through the completion of goals within the video games, allowing players a sense of satisfaction upon their completion. Hence, one could argue that this may result in comparatively lower levels of aggressive behaviour among video game players. This is supported by research conducted by Dr. Cheryl Olson and her team at Harvard. Studying a sample of 1,254 students aged 12 to 14 years, she found that over 49% of boys and 25% of girls reported using violent games such as Grand Theft Auto IV as an outlet for their anger. She suggests that instead of a blanket ban on M-rated game use by young adolescents, parents should monitor how much time children spend playing games and how they react to specific game content. [1] [1] Olson, Cheryl K., et al., ‘Factors Correlated with Violent Video Game Use by Adolescent Boys and Girls’, Journal of Adolescent Health, Vol.41 no.1, pp77-83, July 2007,
The Responsibility Lies With Parents. In the digital age, young people are almost certain to be exposed to violent media content, including violent video games, even if parents attempt to restrict children’s exposure to such content in the home. Parents therefore have an obligation to educate themselves about video games (many government, industry and private websites provide such information) and to help their children become “media literate” regarding the content and context of games. The state places responsibility on parents for the welfare of a child and in doing so the state can allow things that would potentially be dangerous for children, anything from skateboards to R-rated films, as long as parents can supervise their children. Parents need not know how to skateboard to supervise such activity, but should know about potential risks and safety equipment. This same logic applies to video games. To not confer this responsibility on parents is to further undermine their status as role models for their children, as it assumes that parents are incapable of ensuring the safety of their children. Practically speaking, this could affect the respect they get from their children, with “The government says I can’t,” being a much weaker response when questioned about violent video games than an actual explanation of the harms behind them. [1] [1] American Psychological Association. "Violent Video Games — Psychologists Help Protect Children from Harmful Effects", 8 June 2004,
The skills learnt within video games are skills that could be learnt elsewhere, without the negative problems that have been associated with video games. All of the benefits listed are thus moot in this context, because activities such as team sports are able to develop many of the skills team shooters do, whilst also improving fitness and other areas of well-being. More tactical sports can have a great impact on somebody’s intellectual well-being as well as their physical well-being. Additionally, video games in general might be able to improve some skills, but we are discussing violent video games in particular; there are other, much less violent, video games that allow people to further develop their skills.
Asking only state-funded schools to accept military recruiters ensures that those entering the military out of school are disproportionately from state schools rather than privately-funded schools, and therefore more likely to be middle and lower class. Furthermore, there should be no quid pro quo regarding the funding of schools; conditions for further funding should be related to the success of students and the quality of teaching, not whether the school has furthered the state's desire to fill its military's ranks. Schools should in fact protect students, not expose them annually to military recruiters who can incrementally pressure them into a military career.
All high schools accepting state funding should accept military recruiters once a year. The relationship between the state and the schools that it establishes and funds goes both ways; if schools accept state funding, the state is entitled to use schools as a platform for the military to appeal to future recruits. All state-funded high schools, irrespective of location and student demographics, would be expected to accept military recruiters once a year to speak to the entire student body. The event would be a condition of further funding for the school; however, no minimum number of students would be required to enlist as a result.
However it is dressed up, the military's only interest in schools is the chance to recruit students. The various educational materials (not always clearly marked as coming from the military) and courses on offer are all intended to interest students in a military career. Such methods are dishonest and should not be allowed in schools; Paul McGarr, a teacher in East London, stated that 'only when recruiting materials gave a true picture of war would he welcome them into his school'1. If students are genuinely interested in joining the military, they can go along to a recruitment centre outside school, potentially with their parents, and ask the necessary questions there. 1 Goff, H. (2008, March 25). Teachers reject 'Army propaganda'. Retrieved May 18, 2011, from BBC News:
The military is an all-volunteer force and needs a percentage of school-age recruits each year. Our military is an all-volunteer force and must recruit openly to keep up its numbers. The army, navy and air force need well-educated and motivated recruits; as the pool of potential recruits shrinks, efforts to attract young people must be permitted to 'intensify and diversify'.1 The alternative is a return to the conscription and national service that offer recruits little choice. Military recruitment in schools permits the recruitment of only those with an interest in the armed forces, allowing those who wish to pursue other endeavours that opportunity. As such, visits to schools are not about forcing militaristic propaganda on children, but about making sure that 16-18 year olds know about the military as a potential career choice. After all, college representatives and local employers are allowed to make presentations to students, so it would be unfair to keep just the military out. If you accept that we need armed forces, then you must allow them to recruit openly. 1 Gee, D. (2008, January). Informed Choice? Armed forces recruitment practice in the United Kingdom. Retrieved May 18, 2011, from Informed Choice:
Young people should hear of the opportunities available in the armed services whilst in school. School children are entitled, as part of their education, to a wide range of careers information, including potential roles in the military. It is a school's duty to offer not only paths to employment, but opportunities to engage with future employers like the military. With university places now increasingly competitive, schools must remain more vigilant than ever that they do not encourage purely academic paths to future careers. Furthermore, nationalism is a powerful factor in school curriculums worldwide, and permitting militaries into schools to talk to students is but an extension of already-permitted activities like the recital of the Lord's Prayer in British state schools or the Pledge of Allegiance in American schools. As such, it comes as little surprise that the predominant reason given for enlistment is service to country1. If schools are asked to ensure that such activities are carried out to foster national sentiment, it follows that military service should be, if not actively encouraged, respected sufficiently to grant the armed services an opportunity to engage with students. 1 Accardi, M. (2011, June 15) Army recruiters become a 'partner' In education Retrieved June 16, 2011, from The Huntsville Times:
The armed services have no right to preach to the youth, particularly when they are in a trusting environment like a school. To permit any organization to advertise to schoolchildren about job prospects is misguided at a time when their critical faculties are nascent and they are endowed with the belief that what is taught at school is to be imbibed with little rebuttal. Mandated school activities like the Lord's Prayer and Pledge of Allegiance do serve to promote nationalism, but do not do so in such a way as to threaten the lives or disrupt the career paths of school children. School children must be protected from organizations that have the potential to put pressure on them and guilt trip them into signing away the rest of their young adult life. If their choices are to be respected, they must be left to develop their critical faculties and then permitted to use information available to the general public to make a decision.
The need for recruits, however genuine, does not necessitate recruitment within schools. There will of course be certain students who would be attracted voluntarily to a role in the armed services, however these students can be reached through means other than their schools. Furthermore, if the motivation of recruits is paramount, then recruits can do no more to prove their motivation than actively and independently seek out a role in the armed services, rather than having it forced upon them through visits to their schools.
Young people are not aware and are, in many cases, deliberately misled as to the risks of military service. School children, conditioned by modern television, film and video games as to the heroism of military service, do not often ponder the dangers inherent in conflict. Modern video games, in which war deaths are the norm and immediate 're-spawning' dulls all sensitivity to death, do not serve to educate the youth about the risks but downplay them to the point of banality. Studies indicate that military recruiters, whilst not actively seeking to downplay risks or obscure the truth, are reluctant to volunteer information that would dissuade potential recruits.1 1 Gee, D. (2008, January). Informed Choice? Armed forces recruitment practice in the United Kingdom. Retrieved May 18, 2011, from Informed Choice:
The purpose of the military entering schools is not solely recruitment but awareness. Militaries provide a public service that too often goes unnoticed and underappreciated; school visits raise the level of understanding of the important job they do. In the UK the army publicly states that it does not directly recruit in schools but does visit many each year "with the aim of raising the general awareness of the armed forces in society".1 They always visit by invitation of the Head teacher. Compared to the USA, fewer young people in the UK have local or family connections with the military, so it is important for them to learn about the role the armed forces play in our country. And in both the UK and the USA the military offers other services to schools, from educational materials to leadership courses and team-building exercises. Sgt. Maj. Jerome DeJean, of the U.S. Army's 2nd Recruiting Brigade, describes their role as 'a partner in education'.
Young people are aware of the risks of military service and therefore would not be easily misled by military personnel. Young people are not stupid – they know that there are risks involved in joining the military. In fact the media usually focuses on the bad news coming out of Afghanistan and Iraq, ignoring the good work of our military there. A career in the military also offers young people a lot of benefits, and it is only right that they should get to hear about those as well. As Donald Rumsfeld noted, ‘for some of our (US) students, this may be the best opportunity they have to get a college education’1. In addition, no one is signed up on the spot in the classroom; they always get the chance to think about it over a few months or more, and to discuss the decision carefully with parents and peers. As such, military recruitment in schools should be seen as no more unethical than the visits to schools of policemen, for whom there is similar risk but little public conjecture. 1 Vlahos, K. B. (2005, June 23). Heavy military recruitment at high schools irks some parents. Retrieved May 18, 2011, from Fox News:
Military presentations in schools are not designed to be propaganda for their institutions, or the state as a whole, but to educate school children as to the undeniably important role that the armed forces play. State survival is invariably dependent upon the existence of a strong, well-trained armed force filled with motivated volunteers. Furthermore, demonstrations of modern technology and smart uniforms do not paint an unfair or inaccurate image of contemporary warfare. Such examples in fact illustrate the honesty of militaries in their portrayal to school children of modern combat. They act not merely as an educational tool, but as a life lesson, demonstrating that the world of their video games is, in conflict zones at least, very much real.
School children are not targeted for military service; the intention is to raise awareness about the work that the military do. A Ministry of Defence spokesman in the UK stated that they 'visit about 1,000 schools a year only at the invitation of the school – with the aim of raising the general awareness of their armed forces in society, not to recruit’. Furthermore, children interested in a military career are not instantly signed up; they are granted the time until they turn 18 to decide. In addition, before official enlistment, all potential recruits are sent away on a six-week camp to find out what a career in the army will be like.1 1 Goff, H. (2008, March 25). Teachers reject 'Army propaganda'. Retrieved May 18, 2011, from BBC News:
Military recruiters downplay the risks of a military career, tempting schoolchildren into a career they would not have chosen had they been given honest information. Recruitment officers often make highly misleading pitches about life in the military. They play up the excitement and chances to travel, as well as the pay and benefits such as college fees and training in special skills. They don't talk about the dangers of military life, the casualty rates in Iraq and Afghanistan, or the thousands of young soldiers who have lost limbs or been emasculated in recent years. And they don't mention the impact of war on soldiers' mental health, or the lack of support when they leave the military. If we must have the military in our schools, then they should be made to give a much more realistic view of military life. Evidence suggests that 'whilst staff are generally willing to answer questions honestly, information that might dissuade potential recruits from enlisting is not routinely volunteered'1. If we are to accept the military in schools, they must similarly accept the moral necessity of presenting the risks of the career in a fair and truthful manner. 1 Gee, D. (2008, January). Informed Choice? Armed forces recruitment practice in the United Kingdom. Retrieved May 18, 2011, from Informed Choice:
Military recruitment in schools is illegal. Recruitment in schools is against parts of the UN Convention on the Rights of the Child. A set of rules that the USA signed up to in 2002 forbids the recruitment of children under the age of 18.1 Despite this, the American Civil Liberties Union has found that US military recruiters target children as young as 11, visiting their classrooms and making unfair promises to them.2 Though the military would argue that its school visits do not constitute recruitment, if recruitment of those under 18 is wrong, then advertising to those under 18 should similarly be considered wrong. In order to live up to its pledge in 2002, the USA should stop trying to recruit in schools. 1 United Nations General Assembly. (2000, May 25). Optional Protocol to the Convention on the Rights of the Child. Retrieved May 18, 2011, from Office of the United Nations High Commissioner for Human Rights: 2 American Civil Liberties Union. (2008, May 13). Military recruitment practices violate international standards, says ACLU. Retrieved May 18, 2011, from American Civil Liberties Union:
Military recruitment in schools is less education than propaganda. Allowing members of the military into schools is a form of propaganda. They promote the military and make war seem glamorous. Soldiers in smart uniforms come into classes with specially-made videos and powerful weapons, making violence and state-organised murder seem cool. A recent report into the practice stated 'key messages are routinely tailored to children's interests: military roles are promoted as glamorous…(and) warfare is portrayed as game-like and enjoyable.’1 This encourages young people to support aggressive action abroad. It also promotes an unthinking loyalty to the state, whether its actions are right or wrong. By allowing the military in, schools are signalling to their students that these things are OK. 1 Gee, D. (2008, January). Informed Choice? Armed forces recruitment practice in the United Kingdom. Retrieved May 18, 2011, from Informed Choice:
School children are too young to target for military service. School children should be protected from targeted appeals for jobs they are unprepared for, both physically and emotionally. The army is short of manpower due to high casualty rates and the unwillingness of current soldiers to re-enlist. This means that they are very keen to get into schools to sign up young people. But it is not right to let them get at students who are too young to vote, or even drive. 16 and 17 year olds are not grown-up enough to make life and death decisions like joining the army. They may not be able to see through exciting presentations or resist a persuasive and experienced recruitment officer. Under the No Child Left Behind Act, military recruiters collect data on 30 million students. The act 'grants the Pentagon access to directories of all public high schools to facilitate contact for military service recruitment'.1 A huge database contains their personal details, including social security numbers, email addresses and academic records. The purpose of this is to allow recruiters to pester young people with messages, phone calls and home visits. Schools should be safe places to grow and learn, not somewhere to sign your life away before it has even properly begun. Upon enlisting, recruits enter a contract that legally binds them to the Armed Forces for up to six years;2 school children should not be exposed to pressure to sign their young adolescence away. 1 Berg, M. (2005, February 23). Military recruiters have unrivaled access to schools. Retrieved May 18, 2011, from Common Dreams: 2 Gee, D. (2008, January). Informed Choice? Armed forces recruitment practice in the United Kingdom. Retrieved May 18, 2011, from Informed Choice:
Military recruitment in schools is not illegal in the United States, for the USA has not signed the relevant documents. The USA has not signed the UN Convention on the Rights of the Child referred to opposite, although it has signed the UN's Optional Protocol on the Involvement of Children in Armed Conflict (United Nations General Assembly, 2000). However, the US military does not recruit under-18s anyway, so it is keeping to its agreement. In any case, neither of these agreements stops recruiters visiting schools in order to make students aware of military career options once they turn 18.
Recruiters do not minimise the risks of a military career; rather, the armed forces have a good story to tell and they are not shy about telling it. Furthermore, it is policy for recruitment staff to 'explain the recruits' rights and responsibilities and the nature of the commitment to the Armed Forces'.1 There really are great opportunities for keen, talented young people in the military, and almost all service personnel find it a very satisfying life. And compared with the past, soldiers today are much better looked after in terms of physical, medical and psychological wellbeing. 1 Gee, D. (2008, January). Informed Choice? Armed forces recruitment practice in the United Kingdom. Retrieved May 18, 2011, from Informed Choice:
Dress codes are a half-way house that does not work. It does not make students look at all uniform and it does not show what school they are from. In the United States there has been a move away from allowing either no uniform or dress codes towards having school uniforms.[6]
Dress Codes instead of school uniform. Rather than having school uniform, why not have a dress code instead? This has all the benefits of uniform without the many disadvantages. While uniforms force all children to wear the same clothes, dress codes give students a lot of choice in what to wear. Only a few unsuitable things are banned - for example, gang colors, very short skirts, crop tops, bare shoulders, etc.
A lot of schools have a choice of uniform so that children can wear what they feel most comfortable in. For example, in Australia, which is a very hot country, schools often have a summer uniform of clothes that are more comfortable in the hot weather [9]. This means that in summer, children might be allowed to wear shorts instead of trousers and short-sleeved instead of long-sleeved shirts. If children were allowed to choose their own clothes to wear to school, instead of a uniform, they might choose impractical clothes themselves, like baggy tee shirts or long skirts, or jeans with chains hanging from them. To make sure that children are all wearing sensible clothes in which they will be able to take part in all their school activities, there needs to be one uniform that all children at the school wear.
Individuality and creativity should be encouraged. Article 19 of The Universal Declaration of Human Rights states that "Everyone has the right to freedom of opinion and expression"[18]. Children's freedom of expression is restricted by school uniforms, because children who have to wear the same clothing as every other child in their school are not able to express their individuality and creativity. We should get rid of school uniform so that all children can express themselves freely.
Students should be allowed to wear religious dress. If children are religious, they should be allowed to wear the clothes that express their religion, but a school uniform can often restrict this. Religious beliefs can be extremely valuable and important to many children, giving their lives a great deal of meaning and structure and inspiring them to work hard and behave compassionately in a school environment. Some religions place a great deal of value upon worn symbols of faith, such as turbans, headdresses and bracelets. When a school demands that a child remove these symbols, it inadvertently attacks something central to that child’s life. This may cause the child to see her school and her faith as mutually exclusive institutions[1]. Vulnerable young people should not be forced into an adversarial relationship with their school, as close, collaborative involvement with teaching and learning techniques will greatly affect a child’s ability to adapt, learn and acquire new skills in the future. For example, school skirts are often not long enough for Muslim girls, who believe that they should cover most of their bodies. To allow children to express their religions, we should get rid of school uniforms.
Some schools do have different rules for religious students, so that those students can express their beliefs. For example, a school might let Muslim girls wear some of their religious items of clothing mixed with the school uniform (e.g., Reading Girls' School)[2].
Schools can foster creativity and individuality without getting rid of school uniform. There are many schools with a uniform which still support creativity and individuality with "Child Initiated Independent Learning", and other schemes which encourage children to think for themselves [19, 20]. Also, if children are participating in creative activities like art, it is surely better for them to wear sensible clothes, and it's easier to make sure all children are wearing sensible clothes if they all have to wear the same uniform.
In many countries, parents can apply for help with the cost of school uniform. For example, in the U.K., parents who don't earn a lot of money can get money from the government to help pay for their child's school uniform.[13] In Australia, the Australian Scholarships Group, which specialises in helping parents save money on their children's education, has tips for parents on getting their child's uniform cheaper.[14] Also, parents would probably have to spend a lot more money if their children didn't wear a uniform to school, because they would have to buy them more casual clothes. Since children don't like to wear the same thing too often (in case they get bullied), parents would have to spend a lot of money making sure their children have lots of different outfits.
School uniforms are often impractical or uncomfortable School uniforms are often not very comfortable or practical. In state schools (schools for which parents don't have to pay fees) in the U.K., for example, girls often have to wear dresses or skirts, when they might feel more comfortable in trousers, and boys often have to wear button-up shirts and ties, which can also be uncomfortable for active children[7]. In independent schools, uniforms are often even more impractical and uncomfortable, with blazers or even tailcoats for the children to wear[8].
School uniforms are often expensive If a school has a uniform, parents are expected to buy it, and then buy a new one every time their child outgrows the last. This can be expensive. It has been reported that parents in South Africa[10], Australia[11] and the U.K.[12] have to pay a lot of money for their children's school uniforms, and it is probably the same in other countries too.
Researchers have actually found that having to wear a school uniform does not make children better behaved. For example, Brunsma and Rockquemore[22] looked at data for more than 4,500 students and found that those who wore a school uniform did not have fewer behavioural problems or better attendance. School uniform does not encourage discipline, so there is no need to make children wear one.
There will always be teasing between children. If it's not based on what clothes the kids are wearing, it'll be because of their hair colour[4] or the fact that they wear glasses.[5] Children need to learn from an early age that everyone is different; how else can they learn to accept that? The differences between people should be embraced; in making students wear a uniform, schools are wrongly teaching children that everyone should look the same. As for the opposition's evidence, opinion polls are slippery things whose results depend heavily on the question asked, and a belief in the benefits of school uniforms is exactly that kind of result. There is also no evidence linking parents' belief that uniform promotes equality to whether it really does.
School uniforms contribute to the sense of school unity Schools that have a uniform often say that they do so because wearing a uniform helps their students feel a sense of unity and pride in their school (e.g., Sacred Heart Catholic School, 2010)[15]. The headmistress of Fulham Cross School in London, England, has been quoted as saying that introducing a uniform at her school gave students "an incredible sense of pride"; after the introduction of a school uniform, GCSE passes at her school rose from 42 to 53 per cent[16]. This sense of unity is especially important on school trips, where teachers need to be able to tell which children belong to their school, so that no one gets lost.
School uniforms encourage discipline Having to wear smart clothes encourages children to respect their school and their teachers and to behave themselves, because of the association between smart clothes and work. Casual wear at school can also make students feel over-relaxed and 'at home', meaning they don't focus as much on work. A lot of schools are bringing back school uniform because they want to improve discipline.[21] Opponents counter that school uniform can actively encourage students to enter into an adversarial relationship with the curriculum and their teachers. Exercising arbitrary control over children in the interests of "discipline" is likely to convince them that the very sensible, rational principles of learning and critical thought that they acquire during the school day are equally arbitrary and meaningless. By refusing to allow children to participate in enjoyable, beguiling processes of discovery and understanding unless they comply with unjustified and meaningless rules about dress, schools risk being seen as oppressive and capricious by their students. [21] The Telegraph, 'School uniforms return in drive to improve school discipline', 1 October 2009.
School uniforms create a sense of equality School catchment areas are diverse, and in private schools some children are there on a scholarship. Without uniforms, what children wear becomes a clear indicator of wealth, making poorer children stand out (or, possibly, richer ones). Children can then be bullied for being different, which diminishes a child's enjoyment of school. A study in New York has shown that 84% of parents think uniforms promote equality, and 89% of guidance counselors think uniforms help teach children to be more accepting of others who are less fortunate.[3] This perception among parents will help create the same perception among their children. It is also likely to translate to the teachers, who will therefore treat their pupils more equally.
School uniforms might help improve the feeling of unity within schools, but pride in one's school depends on being distinct and different from other schools. This can increase rivalry between schools (already present from school sports matches). There are many examples of school rivalry, often made worse by the fact that children from different schools wear different uniforms, leading to children being beaten up or worse. For example, in New Zealand a boy was beaten up by boys from a rival school; he said the boys told him he should be shot because he went to a different school, which they could see from his uniform.[17] Because of this rivalry, it might be better for students not to wear school uniforms on outings, where they might encounter children from other schools. Schools can use other things to make sure children don't get lost on school trips, like buddy schemes, where each child has a buddy, and having plenty of teachers or assistant teachers. [17] TVNZ, 'Boy beaten as school rivalry heats up', 21 October 2007.
Vocational training would not actually improve the skills which employers are concerned about. When people complain about a skills gap, there are two kinds of skills they worry about: technical ones, in subjects like engineering, and general ones, such as the ability to present or to write clearly. The general skills are already taught at university; the best way to learn how to present and write is to practise presenting and writing, and picking a subject, such as history, simply acts as a useful focus for this work. As long as employers can be sufficiently clear about what they want graduates to be capable of, we will be able to incorporate this into existing courses - so in fact, even supposedly non-vocational courses will teach the right skills. Technical careers like engineering and computer science might indeed benefit from the change, but it makes no sense to shape the whole education system around a limited set of jobs.
Vocational courses produce better employees The courses which are generally offered at the moment are not serving students well when it comes to providing the skills for employment. 65% of businesses complain of being unable to hire people with the right skills. [1] Increasingly, universities are offering as a selling point the fact that they have extra-curricular courses to teach people business skills, but this is a tacit admission that they are selling people degrees which are not fit for purpose. Solving this requires us to teach more vocationally. There are schemes underway in many areas to do just that – to give one example, in Maine, USA, a bill has been passed to improve local colleges. [2] Our policy moves these efforts from the fringes to the core of the system: isolate as far as possible the specific things which make good employees and teach those to people. This will help them get jobs more easily, and also ensure that companies are able to operate effectively. The consequences of such a policy would be good all round. [1] Personnel Today, ‘Skills gap ‘hindering UK business growth’, say CEOs’, agr, 29 April 2013 [2] State House Bureau, ‘House Oks bill to plug ‘skills gap’, Portland Press Herald, 21 May 2013
The statement “universities can’t take everyone” is clearly true. But there is a big jump from that to saying “we should stop people from applying,” for two reasons. Firstly, the more obvious conclusion would be to find a way to increase the number of places available, on the grounds that more students means a larger pool of knowledge to draw from and therefore academia will be better. Secondly, for this to have the desired effect we would need the good people to continue to apply, and this is by no means guaranteed – they may simply waltz off into jobs and be lost to academia, in which case we will actually end up worse off. The limited number of places is a problem, but the proposed solution may make things worse.
Students are forcing themselves through university even when it is not right for them Not everyone should be spending their time in academic study. As well as requiring certain skills, it requires a personality suited to it. Students must be capable of sustaining an interest in a subject, or they will not be able to drag themselves through three or more years of thinking about little else. Some people are, by nature, not that kind of person - they may think in a short-term way or simply not be curious about the world. It also requires a level of intelligence which some people simply don't have. These people will gain very little from spending time at university. In fact, at some (typically less prestigious) universities, dropout rates can be as high as 20%, and those who drop out gain nothing at all. [1] Many people are putting themselves through university despite it not being right for them. Partial blame for this lies with employers - the large number of graduates means a culture has developed among recruiters of using the presence or absence of a degree as a default filter for applicants; 78% of leading employers filter out anyone with less than a 2:1. [2] We should discourage this. By implementing this policy, we create a different and better way to measure someone's employability. This will make employers more likely to hire these people, and allow them to follow a path through life better suited to their personality. [1] Paton, Graeme, 'University drop-out rate soars by 13pc in a year', The Telegraph, 29 March 2012 [2] Tims, Anna, 'Get a third-class degree? Time to turn on the charm', The Guardian, 11 September 2010
Everyone gains something from university, whether quantifiable or not. Simply getting out into the world and meeting more people - not just minorities and other social groups, but even a wider variety of people within your own social group - is an effective way to learn to think more broadly. Many university students live away from home for the first time, forcing them to do things for themselves and learn how things like personal finance work. It also allows them space to explore themselves and shape their own principles. Non-academic activities within university, such as joining student clubs or societies like the debating society, can also broaden horizons and teach new things. Although university may not be the only way of doing this, it has proven effective over the years, so it's not true to say non-academic people get absolutely nothing from it. Despite the problems associated with a degree culture, there are other problems with a non-academic culture. Academia creates things: products and inventions in the case of the sciences, and thoughts or ideas in the case of the humanities (and even though some people argue against government funding for the humanities, almost no-one argues they should not be studied at all). Sustaining this creativity requires at least some new people entering the field, bringing their own insights and approaches. For this to happen, it has to be both respectable and accessible. A government policy against academic courses will cripple this and damage all of us.
This is a mischaracterisation of how academics work. No serious researcher cuts themselves off from the world to work: collaboration, the exchange of ideas and chatting by the water cooler are invaluable. Often, a crucial insight into a problem comes from a casual remark by a colleague. Every report into improving research environments stresses the importance of collaboration, both within a discipline and between disciplines. Anyone who loves their subject will be happy to have more people studying and sharing ideas with them, even if those people are not quite as committed as they are. If those people then leave for vocational work, they will at least have been a positive presence.
Universities don’t have unlimited places available Universities cannot take every student who applies. They have to balance the number of applications they get with both the number of teaching staff they have and the time they need for research. In the UK, almost a third of applicants do not get places as it is, [1] and those that do often find they have less contact time with staff than they had expected. [2] Simply put, if you want to have academics doing useful research, you can’t expect them to teach all the time. If universities have a finite number of places, it makes sense that they should be allocated to the people best suited for them. Currently, universities are so overwhelmed by demand that it isn’t possible for them to test this properly – in most cases, they will take a cursory look at predicted grades, and perhaps an interview with the candidate. Discouraging applicants would lower the stress on admissions departments, making the process more accurate. It will also allow them more leisure to reach out to and target students with the right personality, improving the quality of applications. Forget all of the discussion as to whether or not academic courses are useful – it’s simply not practical to have everyone do them. [1] ‘UCAS End of Cycle report 2012’, UCAS, 13 December 2012 [2] Paton, Graeme, ‘University teaching time ‘fails to rise’ despite fees hike’, The Telegraph, 15 May 2013
Academia must be free of distractions The best academic departments are ones run with purely academic aims. Intensive study of a field requires that you are given the resources, support, time and space that you need. Moreover, the best atmosphere is one in which everyone around you shares your love of study. It follows that departments should be allowed to make this their top priority. This affects undergraduate study in two ways: students must be free to spend time getting to grips with their subject properly, and lecturers must be allowed to teach the things they feel to be most important for their subject. Neither of these things is possible when you are worrying about jobs. Every subject has certain parts which are more or less relevant to its related careers, but these may not be the same as the parts which are important to academic study of the subject. For example, maths students will invariably be taught Linear Algebra and Group Theory, normally in the first year, but 20% of Mathematics graduates work in Business & Finance, where this material is not relevant. If everyone is expected to have one eye on vocational training, the academic study will necessarily suffer. Solving this problem requires that we split vocational and academic study, so that people doing one don't need to worry about the other. This will improve each of them.
Clearly, more tolerance is a good thing, but putting people through an expensive, three-year course with no career benefit is hardly a sensible way to achieve it. As an example of an alternative, we could give more support to gap-year programmes and run them in such a way as to produce an equivalent mixing. People will learn just as much tolerance in one year as in three, will save time, and can even do useful volunteering while they're at it. Nor is this mutually exclusive with our policy, which means that you get both benefits.
The importance of university to minority groups derives directly from its importance to the rest of the country. It is seen as the key to things like higher-paying jobs for low-income families because it is seen as the key to higher-paying jobs in general. Moreover, this is based on an attitude problem: there are plenty of jobs which do not require degree-level education and which can pay very well at the top end. [1] Under our vocational system, this will all change, and academic study will no longer be the benchmark for success. Alternatively, even under the current system, what matters to people generally is not the fact of university education alone, it is the careers which it opens up – in particular, stereotypically middle-class careers such as lawyers and bankers. Vocational training would give children just as many opportunities, if not more, as they are not being forced through an academic process of questionable utility first. [1] Smith, Jacquelyn, ‘America’s Best-Paying Blue-Collar Jobs’, Forbes, 6th April 2012
Life experience is an essential part of personal development People gain much more than a subject from their time at university. Life requires interpersonal skills, self-discipline and general knowledge, which must be absorbed over time. There are distinct advantages to picking up these skills before you start work. Firstly, it will make you a more effective worker, whether you are working alone (self-discipline) or with other people (interpersonal skills). Secondly, while working you are likely to have much less time for that sort of thing. Thirdly, you will be able to go through on-the-job training more easily if you already know how to study. All of this can be done very effectively at university. You are allowed time and space to learn planning, budgeting, finding and managing accommodation and a myriad of other things which will help you in life. So to say that people don't gain anything from non-vocational courses is misleading - even if the study doesn't help them, the life experience does.
We must retain a respect for academia Academia is important to society. Technical subjects have the obvious outcomes of new inventions, gadgets, medicines etc. – and although these applications are vocational, they are inspired by academic study. Creative arts are also a huge industry in their own right. Humanities are a source of ideas about society, happiness, social policy and cultural understanding, besides simply being interesting. [1] This is all activity which we should encourage. Emphasising vocational training would damage the image of academia. Quite apart from the fact that reduced government support for the sector is likely to damage it in real terms, it is very likely that if people are being told by the whole government education system that vocational training is more useful for themselves and for society, they will come to regard non-vocational courses with suspicion. Pressure to conform is a real factor, especially for schoolchildren at ages when they are unlikely to see any reason for a principled, pro-academia stance. This means fewer children will go into it and fewer people will tolerate support for it. Preserving the prestige of non-vocational courses is important, and it requires government policy to take them seriously. [1] ‘Section 3: What Research in the Humanities and Social Sciences Offers’, British Academy, accessed 12 June 2013
Universities cut across class and social divides in a unique way University is a great equaliser. One positive side-effect of people going through university is that they are virtually guaranteed to interact with people who are different from them in all sorts of ways - including ethnicity, where minority groups are sometimes better represented than they are in the general population, [1] and international students account for 17% of the university population. [2] The more this mixing happens, the easier it is for people to be tolerant of and sensitive to other people. While this isn't necessarily a problem everywhere, there are still places where these divides cause tension and violence, so the fact that our policy helps to tackle this counts in its favour. Vocational courses are rather less likely to be mixed. Certain careers are associated with certain groups, and people studying for a specific career will be drawn largely from the associated group. For example, the intakes of an accountancy course and a construction course are not likely to overlap very much, if at all. Whatever merits vocational education may have, government policy is not just about education: it should take into account the wider social good, and so we should be on the side which produces a more tolerant society. [1] Sellgren, Katherine, 'Rise in ethnic minority students at UK universities', BBC News, 3 February 2010 [2] 'International students in UK higher education: key statistics', UK Council for International Student Affairs, 2011-12
University education gives people something to aim for University education is something which a lot of traditionally disadvantaged groups aspire to, for themselves or, more commonly, for their children. Those who are accepted are seen as having "made good," partly because of the prestige attached to intelligence and partly because of the correlation with higher salaries. However, getting there can be hard: they may be from lower-income families with no family history of higher education, or from immigrant communities who have struggled to learn the local language. These children are therefore likely to encounter significant barriers to getting to university. But a permanent lottery-of-birth, where only the children of successful people can ever be successful, would not be fair. All children should be helped to build their own futures regardless of their background, and broad access to university is necessary for this. It is notable that most of the countries with the most social mobility (Denmark, Norway, Finland, Australia, New Zealand, Sweden) are also those with the highest rates of graduation from tertiary education; Canada is the only outlier. [1] This is why governments generally give these children extra support, so as to make university a realistic prospect. If the government switches focus to vocational courses, it will necessarily lower the amount of support available for these children to get to university. That makes it harder for them to break out of poverty, harder to improve their station in life and harder to gain status in their community. This is too valuable to give up. [1] This comparison is clearly inexact: there are countries with very high graduation rates (Iceland being top) that do not appear on the intergenerational earnings elasticity graph, so they cannot be compared. Corak, Miles, 'Here is the source for the "Great Gatsby Curve" in the Alan Krueger speech at the Center for American Progress on January 12', Economics for public policy, 12 January 2012; 'Tertiary education graduation rates: Percentage of graduates to the population at the typical age of graduation', OECD iLibrary, 14 June 2010
It is entirely consistent to respect academia while insisting it isn’t appropriate for everyone. By way of analogy, consider that few people do serious sport, but almost no-one looks down on those who do (thinking particularly of casual sport rather than professional sport). We are perfectly capable of seeing the value in things which we don’t do ourselves. It is even plausible that under the new system academics would become an elite cadre of intellectuals whom schoolchildren would aspire to join and the status of academia would be considerably enhanced. There is a well-known saying, “familiarity breeds contempt.” If fewer people were tempted to think of themselves as amateur scientists, amateur historians etc., we might have more respect for the real ones.
None of the above is unique to university. It is possible to find something useful to do practically wherever you are, including university. That doesn’t make it the most important, efficient or effective thing to do – or, indeed, the best place to do it. Anyone on a vocational course will pick up the same general skills and study techniques at least as well. We agree that there is an advantage to knowing how to study before you start job training, but we don’t think the right answer is to do other, random study first – the skills should ideally be taught at school, or as an introduction to the job training.
For whatever reason the treasures were first collected, we should not rewrite history. There is no reason to politicise this argument; museums have no 'political' agenda but merely wish to preserve historical objects for their intrinsic value. Their reasons for keeping these items may be financial, or in the interests of keeping the artefacts safe and accessible to the public; whatever they may be, they are not political. Don’t the nations who have expended resources protecting and preserving these artefacts deserve in return the right to display them? Additionally, not all artefacts held outside their country of origin are the result of imperial or exploitative relationships. The original Medieval Crown of England is held in Munich [1] . Artistic exchange has nothing to do with politics anymore. [1] Bayerische Verwaltung der staatlichen Schlösser, Gärten und Seen, ‘Treasury (Schatzkammer)'
Retaining artefacts is a relic of imperialist attitudes to non-occidental cultures Display of cultural treasures in Western museums may be seen as a last hangover from the imperial belief that "civilised" states such as Britain were the true cultural successors to Ancient Greece and Rome, and that the 'barbarian' inhabitants of those ancient regions were unable to appreciate or look after their great artistic heritage. Whether that was true in the 19th century is open to doubt; it certainly is not valid today, and the display of imperial trophies in institutions such as the British Museum or the Louvre is a reminder to many developing nations of their past oppression. For instance, the British Museum is refusing to return 700 of the Benin Bronzes to Nigeria despite repeated requests by the Nigerian government [1]. The Rosetta Stone has been the subject of demands by the Egyptian government but remains in London. These artefacts become almost souvenirs of imperialism, a way of retaining cultural ownership long after the political power of Britain has faded. Returning them would be a gesture of goodwill and cooperation. [1] "The British Museum, which refuses to state clearly how many of the bronzes it is alleged to be detaining, has 700 bronzes whilst the Ethnology Museum, Berlin, has 580 pieces and the Ethnology Museum, Vienna, has 167 pieces. These museums refuse to return any pieces despite several demands for restitution." From Opoku, Kwame, 'France returns looted artefacts to Nigeria: Beginning of a long process or an isolated act?', 29th January 2010
Although some treasures may have been acquired illegally, the evidence for this is often ambiguous. Experts agree that Greece could mount no court case because Elgin was granted permission by what was then Greece's ruling government. Lord Elgin's bribes were the common way of facilitating any business in the Ottoman Empire, and do not undermine Britain's solid legal claim to the Parthenon marbles, based upon a written contract made by the internationally recognised authorities in Athens at the time. Since the document survives only in translation, doubts about its veracity can never be conclusively settled either way. And while some Benin bronzes were undoubtedly looted, other "colonial trophies" were freely sold to the imperial powers; indeed, some were made specifically for the European market.
Cultural artefacts are enriched when displayed in the context from which they originated Cultural treasures should be displayed in the context in which they originated; only then can they be truly valued and understood. In the case of the Parthenon marbles this is an architectural context which only proximity to the Parthenon itself can provide. In the British Museum they appear as mere disconnected fragments, stripped of any emotional meaning. It is also useful for academics to have a cultural property in its original context in order to understand it: a carved door, for example, may be a beautiful artefact, but it cannot be truly understood unless we know what the door was used for and where it leads, something for which it is necessary to see the context. Cultural and historical tourism is an important source of income for many countries, and is especially important for developing countries. If their artefacts have been appropriated by foreign museums in wealthy nations then they are being deprived of the economic opportunity to build a successful tourist trade. Both the treasures themselves and the experience of seeing them are devalued.
The artefacts' place of origin has more often than not changed dramatically since they were in situ there. It is therefore unconvincing to argue that the context of modern Orthodox Greece aids visitors' appreciation of an ancient pagan relic. Too much has changed physically and culturally over the centuries for artefacts to speak more clearly in their country of origin than they do in museums, where they can be compared with large assemblies of objects from a wide variety of cultures. Similarly, a great many cultural treasures relate to religions and cultures which no longer survive, and there can be no such claim for their return. Technology has also evolved to the point that Ancient Greece can be just as accurately evoked virtually as it could be in modern Greece [1]. Countries with cultural heritage retain the attraction of being the original locations of historical events or places of interest even without all the artefacts in place. The sanctuaries of Olympia and Delphi in Greece are a good example of this; they are not filled with artefacts, but continue to attract visitors because the sites are interesting in themselves. In 2009, 2,813,548 people visited Athens, with 5,970,483 visiting archaeological sites across Greece [2], even without the Parthenon marbles. Also, people who have seen an artefact in a foreign museum may then be drawn to visit the area it originated from. It is the tourist trade of the nations where these artefacts are held (mostly northern European nations, like Britain and France) which would suffer if they were repatriated; lacking the climate and natural amenities of other tourist destinations, they rely on their cultural offerings to attract visitors. [1] Young Explorers, 'A brief history of…', The British Museum. [2] AFP, 'New Acropolis Museum leads rise in Greek Museum visitor numbers for 2009', Elginism, June 8th 2010. (Breakdown of visitor figures according to major destinations.)
In the case of the Parthenon marbles, Lord Elgin's action in removing them was an act of rescue, as the Parthenon was being used as a quarry by the local population. [1] The Parthenon had already been severely damaged by an explosion in 1687. [2] Once removed, the marbles were protected by the British between 1821 and 1833, while the Greek War of Independence was being fought and the Acropolis was twice besieged. [3] Furthermore, if they had been returned upon Greek independence in 1830, the heavily polluted air of Athens would have caused extensive damage to artefacts left open to the elements, and the Greek attempts at restoration in 1898 were as damaging as the British ones. [4] Today economic austerity lends new uncertainty to Greece's commitment to financing culture. Similar problems face the return of artefacts to African museums; wooden figures would decay in the humid atmosphere. Artefacts in Northern Africa are at risk because of the recent revolts and civil wars [5]. Wealthier countries sometimes simply have better resources to protect, preserve and restore historical artefacts than their country of origin. Our moral obligation is to preserve the artefact for future generations, and if this is best achieved by its remaining in a foreign country then that must be the course of action. [1] Beard, Mary, 'Lord Elgin - Saviour or Vandal?', BBC History, 17 February 2011. [2] Mommsen, Theodor E., 'The Venetians in Athens and the Destruction of the Parthenon in 1687', American Journal of Archaeology, Vol. 45, No. 4, Oct-Dec 1941, pp. 544-556. [3] Hitchens, Christopher, The Elgin Marbles: Should They Be Returned to Greece?, 1998, p.viii, ISBN 1-85984-220-8. [4] Hadingham, Evan, 'Unlocking Mysteries of the Parthenon', Smithsonian Magazine, February 2008. [5] Parker, Nick, 'Raiders of the Lost Mubarak', The Sun, 1st February 2011.
Many artefacts resting in western museums were acquired illegally. Western states have a duty to return them. Artefacts were often acquired illegally. Elgin, for instance, appropriated the Parthenon Marbles from the Ottoman authorities, who had invaded Greece and were arguably not the rightful owners of the site; he took advantage of political turmoil to pillage these ancient statues. Doubt has even been cast on the legality of the 1801 document which purportedly gave Elgin permission to remove the marbles [1]. The Axum obelisk was seized from Ethiopia by Mussolini as a trophy of war; fortunately the injustice of this action has since been recognised and the obelisk was restored to its rightful place in 2005 [2]. UNESCO regulations initially required the return of artefacts removed from their country of origin after 1970, when the treaty came into force, but did not deal with appropriations before that date, because deadlock in the negotiations over the framing of the convention prevented the inclusion of earlier removals. However, the 1995 UNIDROIT Convention on Stolen or Illegally Exported Cultural Objects essentially removes the ambiguity about the time limitations of UNESCO's 1970 convention. Under it, nations are required, in all cases, to return cultural artefacts to their countries of origin if those items were once stolen or removed illegally [3]. International law is thus on the side of returning artefacts. [1] Rudenstine, David, 'Did Elgin cheat at marbles?', Nation, Vol. 270, Issue 21, 25 May 2000. [2] BBC News, 'Who should own historic artefacts?', 26th April 2005. [3] Odor, 'The Return of Cultural Artefacts to Countries of Origin'.
Developing countries are able to guard and preserve their own cultural treasures It may have been true that countries such as Greece were not capable of looking after their heritage in the past, but that has now changed. Since 1975 Greece has been carefully restoring the Acropolis, and Athens now has a secure environment in which to maintain the marbles. The state-of-the-art New Acropolis Museum, which cost $200m, has now been completed to house the surviving marbles [1], and even contains a replica of the temple, so the marbles can be displayed exactly as they appeared on the real temple. Pollution control measures (such as installing pollution monitoring stations throughout metropolitan Athens and ensuring that motor vehicles comply with emission standards [2]) have reduced sulphur-dioxide levels in the city to a fifth of their previous levels. At the same time the curatorship of institutions such as the British Museum is being called into question, as it becomes apparent that controversial cleaning and restoration practices may have harmed the sculptures they claim to protect. In the 1930s the British Museum's attempt to clean the marbles using chisels caused irreparable damage. [3] The museum has also been irresponsible when it comes to protecting the fate of many of its artefacts: "The British Museum has sold off more than 30 controversial Benin bronzes for as little as £75 each since 1950, it has emerged"; "The museum now regrets the sales" [4]. [1] Acropolis Museum, home page. [2] Alexandros.com, 'Greece'. [3] Smith, Helena, 'British damage to Elgin marbles 'irreparable'', The Guardian, 12 November 1999. [4] BBC News, 'Benin bronzes sold to Nigeria', 27th March 2002.
Many people from an artefact's country of origin never get to see it, because they cannot afford to travel to a foreign museum; the price of admission to that museum is only a small part of the total cost of visiting. These artefacts are part of their cultural history and national identity, and it is important that local people are given the opportunity to see them. It is not all about the quantity of visitors; those closest to the artefacts have the greatest right to see them. For others, it should be a privilege, not a right.
If the artefacts are of sufficient historical and cultural interest, scholars will travel to any location in order to study them. Indeed, the proximity of artefacts in developing countries may even stimulate intellectual curiosity and increase the quality of universities there, which would be beneficial for world culture.
The historical significance of artefacts extends beyond their culture of origin Artefacts have a historical and symbolic meaning that transcends their origins; over the years they acquire a connection with the place in which they are housed. For example, the Egyptian obelisk that stands in the Piazza di San Pietro in Rome was brought to Italy in the reign of Caligula. [1] It is no longer merely an 'Egyptian' artefact - it has become a symbol of Roman dominance in the ancient world and of the European Christian culture that succeeded it. During the Middle Ages it was believed that the ashes of Julius Caesar were contained in the gilt ball at the top [2]. Further, all artefacts are part of a worldwide collective history. Olduvai handaxes (from countries in Eastern Africa such as Tanzania) are held in the British Museum [3] - but the people who made them are our ancestors just as much as they are the ancestors of local people. Holding these in London encourages us to see the common ground we share with people everywhere in the world, whereas keeping them only in their local country only highlights our differences and tribal identities. "Culture knows no political borders. It never has. It's always been mongrel; it's always been hybrid; and it's always moved across borders or bears the imprint of earlier contact" [4]. [1] Saintpetersbasilica.org, 'The Obelisk'. [2] Wikipedia, 'List of obelisks in Rome', and Wikipedia, 'Saint Peter's Square'. (Both have useful links and pictures.) [3] The British Museum, 'Highlights: Olduvai Handaxe'. [4] Cuno, James, author of 'Who Owns Antiquity? Museums and the Battle over our Ancient Heritage', quoted in Jenkins, Tiffany, 'Culture knows no political borders', The Spectator, July 2008.
In many cases, returning an artefact may prove to be unreasonably expensive Even with modern transport links and technology, transporting every artefact in a foreign museum back to its original location would be an impractically mammoth task. The risk of damage to artefacts would be unavoidable, not to mention the possibility of theft or sabotage en route; important artefacts in transit would be an ideal public target for acts of terrorism. Moreover, the infrastructure of developing countries is probably not sufficient to cope with that volume. Greece may have spent $200m developing a new museum, but it is, relatively speaking, one of the wealthier countries of origin for artefacts in the British Museum; places such as Nigeria are unlikely to put such emphasis on cultural investment. Museums all over the world already loan out their collections [1]. Just because artefacts are held in another country's museum does not mean that the place of origin cannot access them. Creating a generous and dynamic network for sharing relics between museums would be a much more realistic way of ensuring that all could benefit from seeing them. [1] The British Museum, 'Tours and loans'.
Artefacts should be made accessible to the largest possible number of visitors Art treasures should be accessible to the greatest number of people and to scholars, because only then can the educational potential of these artefacts be realised. In response to a question about whether museums have any social responsibility, Richard Armstrong, director of the Guggenheim, said: "Absolutely, it began with the French Revolution. It is the more than a 200-year-old quest to have the most powerful cultural artefacts available to the greatest number of people. One could say it is the project of democratizing beauty" [1]. In practice this means retaining them in the great museums of the world. Further, some of the world's great museums, such as those in Britain and the Smithsonian in Washington, D.C., are free of charge. [1] Boudin, Claudia, 'Richard Armstrong on the Future of the Solomon R Guggenheim Foundation', 4th November 2008.