30,348,134
I am working in Demandware Business Manager, where I need to change the site settings. Going to Manage Sites > selecting a site > Settings, there is a field called Cartridges where I put a list of cartridges to be deployed for this site. I want to organize these cartridges in a different hierarchy, e.g. inside a folder. Is there a way to define this in the Cartridges field under Settings?
2015/05/20
[ "https://Stackoverflow.com/questions/30348134", "https://Stackoverflow.com", "https://Stackoverflow.com/users/1036999/" ]
No, you cannot add cartridges into subfolders. Cartridges need to reside directly in the *version directory* on the server.
Technically, you can have a hierarchical folder structure ON YOUR LOCAL MACHINE (or in Git/SVN), but the cartridges will be uploaded to the DW WebDAV at the same level. For example, on your machine you might have:

* Cartridge1
  + Cartridge1.1
  + Cartridge1.2
* Cartridge2

When you "Import" cartridges into Eclipse, don't forget to select "Nested folders". When Eclipse uploads the cartridges to the WebDAV, the result should be:

* Cartridge1
* Cartridge1.1
* Cartridge1.2
* Cartridge2
30,348,134
I am working in Demandware Business Manager, where I need to change the site settings. Going to Manage Sites > selecting a site > Settings, there is a field called Cartridges where I put a list of cartridges to be deployed for this site. I want to organize these cartridges in a different hierarchy, e.g. inside a folder. Is there a way to define this in the Cartridges field under Settings?
2015/05/20
[ "https://Stackoverflow.com/questions/30348134", "https://Stackoverflow.com", "https://Stackoverflow.com/users/1036999/" ]
No, you cannot add cartridges into subfolders. Cartridges need to reside directly in the *version directory* on the server.
Cartridges are separate units (entities) and can be fully independent, with complete logic and UI features. Of course, you can develop an additional cartridge for your site, for example if you need to extend your current payment systems, or if you need a reusable cartridge that you plan to sell as a kind of plugin. Otherwise, if you only need to extend a feature locally, there is no reason to deploy a brand new cartridge. But each cartridge can only be uploaded to the sandbox separately; if you physically nest folders containing different cartridges, you will break them all. Hope this helps. Please clarify your issue in a comment if I have misunderstood the question.
30,348,134
I am working in Demandware Business Manager, where I need to change the site settings. Going to Manage Sites > selecting a site > Settings, there is a field called Cartridges where I put a list of cartridges to be deployed for this site. I want to organize these cartridges in a different hierarchy, e.g. inside a folder. Is there a way to define this in the Cartridges field under Settings?
2015/05/20
[ "https://Stackoverflow.com/questions/30348134", "https://Stackoverflow.com", "https://Stackoverflow.com/users/1036999/" ]
Technically, you can have a hierarchical folder structure ON YOUR LOCAL MACHINE (or in Git/SVN), but the cartridges will be uploaded to the DW WebDAV at the same level. For example, on your machine you might have:

* Cartridge1
  + Cartridge1.1
  + Cartridge1.2
* Cartridge2

When you "Import" cartridges into Eclipse, don't forget to select "Nested folders". When Eclipse uploads the cartridges to the WebDAV, the result should be:

* Cartridge1
* Cartridge1.1
* Cartridge1.2
* Cartridge2
Cartridges are separate units (entities) and can be fully independent, with complete logic and UI features. Of course, you can develop an additional cartridge for your site, for example if you need to extend your current payment systems, or if you need a reusable cartridge that you plan to sell as a kind of plugin. Otherwise, if you only need to extend a feature locally, there is no reason to deploy a brand new cartridge. But each cartridge can only be uploaded to the sandbox separately; if you physically nest folders containing different cartridges, you will break them all. Hope this helps. Please clarify your issue in a comment if I have misunderstood the question.
39,761
**Exceeding the federal personal exemption**

I'm an American citizen returning to the United States from abroad, and I'm planning on bringing 3 liters of spirits back from my trip. I know the United States federal government allows a "personal exemption" of 1 liter of spirits duty and excise tax free, so I understand I'll need to declare that I have more and pay excise tax and duty.

* I'm not worried that a few bottles would raise suspicions that I intend to resell them (which I won't).
* I've found the [excise tax rate here](http://www.ttb.gov/tax_audit/atftaxes.shtml).
* **Where can I learn how much duty I'll need to pay?**

**State rules**

Also, I know that there are laws about crossing state lines with alcohol that need to be respected and which can be enforced by customs officers. I am landing in Chicago, Illinois but have proof that I'm traveling onward to Wisconsin. Wisconsin allows residents returning from abroad to bring up to 4 liters of spirits into the state:

> *Can individuals bring liquor into Wisconsin?*
>
> No. By state law, you may not bring any liquor
>
> Individual Exceptions Only (not applicable to businesses):
>
> Individuals who have been in a foreign country for at least 48 hours may bring into Wisconsin no more than four liters of liquor in sealed original containers without payment of the Wisconsin liquor tax. The liquor must be in that individual's immediate possession and may not be sent, shipped or carried into Wisconsin in some other manner.

Is it correct that the Wisconsin law is applicable here and not the Illinois law?
2014/12/14
[ "https://travel.stackexchange.com/questions/39761", "https://travel.stackexchange.com", "https://travel.stackexchange.com/users/20576/" ]
As I originally pointed out, the quantity was within reasonable personal use, but still theoretically subject to tax. I did declare the small overage, but the customs agents weren't interested in it and waved me through. Here are the rates for [excise tax](http://www.ttb.gov/tax_audit/atftaxes.shtml) and for [duty](http://usitc.gov/tata/hts/archive/0200/0200c22.pdf) that I'd have had to pay; they amount to under $5 a bottle.
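For a rough sense of scale, the federal excise component can be worked out from the per-proof-gallon rate on the TTB page linked above. The sketch below is an assumption-laden estimate: it assumes 80-proof spirits and the historical flat rate of $13.50 per proof gallon; duty is extra and not modeled.

```python
# Rough federal excise-tax estimate for the overage. Assumptions: 80-proof
# spirits and the $13.50-per-proof-gallon rate listed by the TTB at the
# time; duty (typically a few percent of value) would be on top of this.
LITERS_PER_GALLON = 3.78541

def excise_tax(liters, proof, rate_per_proof_gallon=13.50):
    """Federal excise tax owed on the given volume at the given proof."""
    proof_gallons = (liters / LITERS_PER_GALLON) * (proof / 100)
    return proof_gallons * rate_per_proof_gallon

# Two liters over the 1-liter exemption:
print(round(excise_tax(2, 80), 2))  # 5.71 -> under $3 per extra liter
```

This is consistent with the answer's observation that the total came to well under $5 a bottle.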
This is anecdotal, but the last time I returned to the US from abroad (Singapore), my brother-in-law unloaded a bunch of his liquor on me because he gets it for Christmas, Chinese New Year, etc., but he doesn't drink. I queried the US Customs site, and it basically came down to: as long as the Customs officer doesn't think you're bringing it back to sell, there's no real restriction. I think I brought back half a dozen bottles of high-end cognac and whisky, declared them, and didn't even get asked about them on the way in. If you're only bringing 3 liters of booze into the country, there's little reason to be concerned. Bear in mind this was about four years ago (2010), and the rules may have changed, so prudence would dictate that you check their site, but I'd imagine you'd be fine.
49,697,285
I know such questions have already been answered, but none of the solutions is working in my case. I've changed the links in the database using all the methods I found on the internet. My site is not loading images because it's picking up the localhost path after migrating to the server. It doesn't seem to be a database issue, because I checked my database multiple times. Here is a screenshot of the errors. [![enter image description here](https://i.stack.imgur.com/CQ9tF.jpg)](https://i.stack.imgur.com/CQ9tF.jpg)
2018/04/06
[ "https://Stackoverflow.com/questions/49697285", "https://Stackoverflow.com", "https://Stackoverflow.com/users/8976140/" ]
Dump the database to an SQL file, open it with a text editor, and search it to check whether any of the old site's links are still there.
These are old links that still exist in your SQL database. You should search for and replace all of these links with the new URLs matching your website. I prefer using the Search-Replace-DB tool, but don't forget to delete it right after use. You may also find some broken links on posts/pages; don't worry, you just need to go to your dashboard, open Settings > Permalinks, and click Save to reset your permalinks. Here is the tool and its documentation: <https://interconnectit.com/products/search-and-replace-for-wordpress-databases/>
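A reason a plain find-and-replace on the SQL dump can fail is that WordPress stores some values as PHP-serialized strings, which embed byte lengths; replacing a URL with one of a different length corrupts them, which is exactly what serialization-aware tools like the one linked above handle. A minimal sketch of the idea (the URLs and the `replace_url` helper are hypothetical examples, and the regex assumes no embedded `";` sequences):

```python
# Sketch: replace a URL inside a PHP-serialized value and recompute the
# s:<len>:"..." length prefixes so the serialized data stays valid.
# Old/new URLs are made-up examples.
import re

old = "http://localhost/mysite"
new = "https://example.com"

def replace_url(serialized: str) -> str:
    """Swap old for new, then fix each serialized string's length prefix."""
    replaced = serialized.replace(old, new)
    return re.sub(
        r's:\d+:"(.*?)";',
        lambda m: 's:%d:"%s";' % (len(m.group(1).encode()), m.group(1)),
        replaced,
    )

row = 's:29:"http://localhost/mysite/image";'
print(replace_url(row))  # s:25:"https://example.com/image";
```

A naive replacement would have left the stale `s:29:` prefix, which PHP's `unserialize()` would reject.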
438
I've heard the motto [**interaction is more powerful than algorithms**](http://www.cs.brown.edu/people/pw/papers/ficacm.ps) from [Peter Wegner](http://www.cs.brown.edu/~pw/). The basis of the idea is that a (classical) Turing Machine cannot handle interaction, that is, communication (input/output) with the outside world/environment. > > How can this be so? How can something be more powerful than a Turing Machine? What is the essence of this story? Why is it not more well known? > > >
2012/03/16
[ "https://cs.stackexchange.com/questions/438", "https://cs.stackexchange.com", "https://cs.stackexchange.com/users/31/" ]
Turing machines can handle interaction just fine, using oracle tapes. It works as follows: from the point of view of a computer that handles interaction, the input of the operator is simply another sequence of bits that is sent into the computer from time to time. We all know that a lazy sysadmin could write a script to send the input to the program when it is requested, so that the sysadmin could go on break earlier. The interaction machine has no way to know if there is really a live operator at the console, or if the input is being piped from another program. Having all the interaction input prepared ahead of time is the same, in theoretical terms, as having all the input on a separate tape that is used by an oracle Turing machine. Whenever the computer would normally request interaction from the operator, the machine instead reads from the input tape. If the thing read from the tape seems invalid in some way, the Turing machine does exactly what the interaction machine would do on receiving that input. I would guess that Wegner is aware of the ability to use oracle tapes to encode input, so you have to take his comments with a grain of salt, or you have to ask what he actually means. I believe that people who think about interaction are generally worried about two things: * "Real" computers do have interaction, but algorithms as defined by Turing don't. We can get around this by coding the input on oracle tapes, but this still does not match the way that real computers operate. It might be nice to study models of computation that are more closely aligned with real computers. * Oracle-based algorithms are not often considered in day-to-day computing because normal computers don't come with a magic "oracle" to supply data. But we might be able to actually just use a person as the oracle. If the person understands the data that is being requested, they may even be able to help the algorithm along, thus improving its performance.
In other words a human may be able to provide a useful oracle tape rather than simply a random one, and in principle this might lead to faster or more powerful computing methods compared to non-oracle-based ones. A similar thing happens with randomized computing, after all, where the machine is given a sequence of random bits as an extra input.
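The "prerecorded input is indistinguishable from a live operator" point above can be sketched concretely. The machine below only sees a `read_input` callable; whether that callable is wired to a console or to a fixed tape makes no difference to the computation (the function and the tape contents are illustrative, not from the source):

```python
# Sketch: an "interactive" computation whose operator input is supplied
# from a prerecorded sequence. From the machine's point of view a live
# operator and a prepared oracle tape are indistinguishable.
def interactive_sum(read_input):
    """Keep adding numbers until the input source sends 'stop'."""
    total = 0
    while True:
        token = read_input()
        if token == "stop":
            return total
        total += int(token)

# A prerecorded "oracle tape" standing in for a live operator.
tape = iter(["3", "4", "5", "stop"])
print(interactive_sum(lambda: next(tape)))  # 12
```

Replacing the lambda with `input` would make the same program genuinely interactive without changing its logic, which is the substance of the oracle-tape argument.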
Recently ACM held the Ubiquity symposium '[What is Computation?](http://ubiquity.acm.org/article.cfm?id=1883611)', in which Peter Wegner published an article that reflects his views on interactive computing. Here are two excerpts from the article by Peter Wegner: > > One new concept, missing from early Turing machines, is "Interactive Computing," which accommodates interaction with the environment during the computation. > > > Interaction machines can perform more powerful forms of computing than Turing machines, and can perform the kind of thinking proposed by Turing because interaction improves their performance over that of Turing machines. > > > However, Fortnow, who has an article in the same symposium, appears to disagree with Wegner's views and believes that interactive computing does not offer any additional power over Turing Machines. To add to the mix, it appears that we are still debating and defining computation. Moshe Vardi has an article, "What is an Algorithm?", Communications of the ACM, Vol. 55, No. 3, March 2012. Vardi reports on two new definitions of algorithms, the first proposed by Gurevich and the second by Moschovakis. > > Gurevich argued that every algorithm can be defined in terms of an abstract state machine. > > > Moschovakis, in contrast, argued that an algorithm is defined in terms of a recursor, which is a recursive description built on top of arbitrary operations taken as primitives. > > >
438
I've heard the motto [**interaction is more powerful than algorithms**](http://www.cs.brown.edu/people/pw/papers/ficacm.ps) from [Peter Wegner](http://www.cs.brown.edu/~pw/). The basis of the idea is that a (classical) Turing Machine cannot handle interaction, that is, communication (input/output) with the outside world/environment. > > How can this be so? How can something be more powerful than a Turing Machine? What is the essence of this story? Why is it not more well known? > > >
2012/03/16
[ "https://cs.stackexchange.com/questions/438", "https://cs.stackexchange.com", "https://cs.stackexchange.com/users/31/" ]
Recently ACM held the Ubiquity symposium '[What is Computation?](http://ubiquity.acm.org/article.cfm?id=1883611)', in which Peter Wegner published an article that reflects his views on interactive computing. Here are two excerpts from the article by Peter Wegner: > > One new concept, missing from early Turing machines, is "Interactive Computing," which accommodates interaction with the environment during the computation. > > > Interaction machines can perform more powerful forms of computing than Turing machines, and can perform the kind of thinking proposed by Turing because interaction improves their performance over that of Turing machines. > > > However, Fortnow, who has an article in the same symposium, appears to disagree with Wegner's views and believes that interactive computing does not offer any additional power over Turing Machines. To add to the mix, it appears that we are still debating and defining computation. Moshe Vardi has an article, "What is an Algorithm?", Communications of the ACM, Vol. 55, No. 3, March 2012. Vardi reports on two new definitions of algorithms, the first proposed by Gurevich and the second by Moschovakis. > > Gurevich argued that every algorithm can be defined in terms of an abstract state machine. > > > Moschovakis, in contrast, argued that an algorithm is defined in terms of a recursor, which is a recursive description built on top of arbitrary operations taken as primitives. > > >
Check this out :) "***Turing's Ideas and Models of Computation***" <https://www.cs.montana.edu/~elser/turing%20papers/Turing%27s%20Ideas%20and%20Models%20of%20Computation.pdf>
438
I've heard the motto [**interaction is more powerful than algorithms**](http://www.cs.brown.edu/people/pw/papers/ficacm.ps) from [Peter Wegner](http://www.cs.brown.edu/~pw/). The basis of the idea is that a (classical) Turing Machine cannot handle interaction, that is, communication (input/output) with the outside world/environment. > > How can this be so? How can something be more powerful than a Turing Machine? What is the essence of this story? Why is it not more well known? > > >
2012/03/16
[ "https://cs.stackexchange.com/questions/438", "https://cs.stackexchange.com", "https://cs.stackexchange.com/users/31/" ]
Turing machines can handle interaction just fine, using oracle tapes. It works as follows: from the point of view of a computer that handles interaction, the input of the operator is simply another sequence of bits that is sent into the computer from time to time. We all know that a lazy sysadmin could write a script to send the input to the program when it is requested, so that the sysadmin could go on break earlier. The interaction machine has no way to know if there is really a live operator at the console, or if the input is being piped from another program. Having all the interaction input prepared ahead of time is the same, in theoretical terms, as having all the input on a separate tape that is used by an oracle Turing machine. Whenever the computer would normally request interaction from the operator, the machine instead reads from the input tape. If the thing read from the tape seems invalid in some way, the Turing machine does exactly what the interaction machine would do on receiving that input. I would guess that Wegner is aware of the ability to use oracle tapes to encode input, so you have to take his comments with a grain of salt, or you have to ask what he actually means. I believe that people who think about interaction are generally worried about two things: * "Real" computers do have interaction, but algorithms as defined by Turing don't. We can get around this by coding the input on oracle tapes, but this still does not match the way that real computers operate. It might be nice to study models of computation that are more closely aligned with real computers. * Oracle-based algorithms are not often considered in day-to-day computing because normal computers don't come with a magic "oracle" to supply data. But we might be able to actually just use a person as the oracle. If the person understands the data that is being requested, they may even be able to help the algorithm along, thus improving its performance.
In other words a human may be able to provide a useful oracle tape rather than simply a random one, and in principle this might lead to faster or more powerful computing methods compared to non-oracle-based ones. A similar thing happens with randomized computing, after all, where the machine is given a sequence of random bits as an extra input.
I do not think models with IO are "more powerful" than Turing machines; they just model different things. In theory, you could view IO as a (noisy?) oracle. With a perfect oracle you can compute Turing-uncomputable functions; but where would the oracle come from? Humans are the only "super-Turing" candidate (if there is any), and we are known to be very unreliable. A class of programs that fit this model are interactive proof assistants (e.g. [Isabelle/HOL](http://www.cl.cam.ac.uk/research/hvg/Isabelle/), [Coq](http://coq.inria.fr/)). They deal with undecidable proof spaces, but (arguably) every proof can be found (and checked) with suitable user input.
438
I've heard the motto [**interaction is more powerful than algorithms**](http://www.cs.brown.edu/people/pw/papers/ficacm.ps) from [Peter Wegner](http://www.cs.brown.edu/~pw/). The basis of the idea is that a (classical) Turing Machine cannot handle interaction, that is, communication (input/output) with the outside world/environment. > > How can this be so? How can something be more powerful than a Turing Machine? What is the essence of this story? Why is it not more well known? > > >
2012/03/16
[ "https://cs.stackexchange.com/questions/438", "https://cs.stackexchange.com", "https://cs.stackexchange.com/users/31/" ]
I do not think models with IO are "more powerful" than Turing machines; they just model different things. In theory, you could view IO as a (noisy?) oracle. With a perfect oracle you can compute Turing-uncomputable functions; but where would the oracle come from? Humans are the only "super-Turing" candidate (if there is any), and we are known to be very unreliable. A class of programs that fit this model are interactive proof assistants (e.g. [Isabelle/HOL](http://www.cl.cam.ac.uk/research/hvg/Isabelle/), [Coq](http://coq.inria.fr/)). They deal with undecidable proof spaces, but (arguably) every proof can be found (and checked) with suitable user input.
Check this out :) "***Turing's Ideas and Models of Computation***" <https://www.cs.montana.edu/~elser/turing%20papers/Turing%27s%20Ideas%20and%20Models%20of%20Computation.pdf>
438
I've heard the motto [**interaction is more powerful than algorithms**](http://www.cs.brown.edu/people/pw/papers/ficacm.ps) from [Peter Wegner](http://www.cs.brown.edu/~pw/). The basis of the idea is that a (classical) Turing Machine cannot handle interaction, that is, communication (input/output) with the outside world/environment. > > How can this be so? How can something be more powerful than a Turing Machine? What is the essence of this story? Why is it not more well known? > > >
2012/03/16
[ "https://cs.stackexchange.com/questions/438", "https://cs.stackexchange.com", "https://cs.stackexchange.com/users/31/" ]
Turing machines can handle interaction just fine, using oracle tapes. It works as follows: from the point of view of a computer that handles interaction, the input of the operator is simply another sequence of bits that is sent into the computer from time to time. We all know that a lazy sysadmin could write a script to send the input to the program when it is requested, so that the sysadmin could go on break earlier. The interaction machine has no way to know if there is really a live operator at the console, or if the input is being piped from another program. Having all the interaction input prepared ahead of time is the same, in theoretical terms, as having all the input on a separate tape that is used by an oracle Turing machine. Whenever the computer would normally request interaction from the operator, the machine instead reads from the input tape. If the thing read from the tape seems invalid in some way, the Turing machine does exactly what the interaction machine would do on receiving that input. I would guess that Wegner is aware of the ability to use oracle tapes to encode input, so you have to take his comments with a grain of salt, or you have to ask what he actually means. I believe that people who think about interaction are generally worried about two things: * "Real" computers do have interaction, but algorithms as defined by Turing don't. We can get around this by coding the input on oracle tapes, but this still does not match the way that real computers operate. It might be nice to study models of computation that are more closely aligned with real computers. * Oracle-based algorithms are not often considered in day-to-day computing because normal computers don't come with a magic "oracle" to supply data. But we might be able to actually just use a person as the oracle. If the person understands the data that is being requested, they may even be able to help the algorithm along, thus improving its performance.
In other words a human may be able to provide a useful oracle tape rather than simply a random one, and in principle this might lead to faster or more powerful computing methods compared to non-oracle-based ones. A similar thing happens with randomized computing, after all, where the machine is given a sequence of random bits as an extra input.
Turing Machines model computation, and don't have a concept of interaction. In that sense a machine that supported interaction with an outside system could do things a Turing Machine can't. But the computation done *between* bits of input from an outside source can obviously always be modelled by a Turing Machine, so even an "IO Machine" can't do anything *with* outside input that a Turing Machine couldn't do. In some sense such a machine may be able to "decide" problems that are undecidable by Turing Machines, but only if you imagine that the system it is interacting with has super-Turing-Machine powers *and is reliable* (in some way; probabilistic reliability would be enough). Imagine a program for an IO Machine like: "for any initial tape input, print the tape contents, then read a symbol from outside input; accept if the symbol is 1 and reject otherwise". This program can decide **any** problem. But *only* if the outside system it can interact with is capable of deciding the problem; to me that's not a very interesting sense in which the IO Machine is able to decide problems that are undecidable by Turing Machines. I think it would always be possible to represent an interactive computation by imagining a machine that takes as input on its tape an encoding of some prior configuration together with an outside input, and have the machine halt with its tape containing an encoding of a new configuration together with output. Then the process of "running a program" is repeatedly running this Turing Machine in a mechanical fashion, with the only "non-mechanical" part being however the outside input is sourced. I'm certain that you could prove that if such a system got its input by giving its output to *another Turing Machine* set up to operate in a similar manner, then the combined system has identical computational powers to a single Turing Machine.
I find that a convincing argument that interactive computation is no more powerful than non-interactive computation, *unless the system the computation interacts with is more powerful than a Turing Machine*. --- There is a non-theoretical sense in which interactivity can add to a computer's ability to solve problems, however. There are many things which humans do very accurately that we don't know how to get computers to do very well. But there are also many many things that humans are rubbish at that we can get computers to do. Combining these two can lead to projects such as [reCaptcha](http://www.google.com/recaptcha/learnmore), which is effectively automatically digitising books by farming out the problems of recognising words to humans in difficult cases. The resulting system of computer + human labour achieves a result that is currently impractical to achieve with either the computation alone or the human labour alone.
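The "configuration in, configuration out" construction described in the answer above can be sketched as a single step function run repeatedly, with only the input sourcing living outside the mechanical part. The step function below (a running sum) is a made-up stand-in for an arbitrary Turing-computable transition:

```python
# Sketch of the step-machine idea: one mechanical step maps
# (configuration, outside_input) -> (configuration, output), and an
# "interactive" run is just iterating it. The transition is illustrative.
def step(state, outside_input):
    """One halt-guaranteed 'Turing machine run' on an encoded configuration."""
    total = state + outside_input
    return total, total  # new configuration, and the output shown outside

state = 0
outputs = []
for x in [1, 2, 3]:          # the "outside system" supplying input
    state, out = step(state, x)
    outputs.append(out)
print(outputs)  # [1, 3, 6]
```

Feeding `outputs` into another such loop would model two step-machines interacting, which is the composition the answer argues adds no power beyond a single Turing Machine.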
438
I've heard the motto [**interaction is more powerful than algorithms**](http://www.cs.brown.edu/people/pw/papers/ficacm.ps) from [Peter Wegner](http://www.cs.brown.edu/~pw/). The basis of the idea is that a (classical) Turing Machine cannot handle interaction, that is, communication (input/output) with the outside world/environment. > > How can this be so? How can something be more powerful than a Turing Machine? What is the essence of this story? Why is it not more well known? > > >
2012/03/16
[ "https://cs.stackexchange.com/questions/438", "https://cs.stackexchange.com", "https://cs.stackexchange.com/users/31/" ]
Turing machines can handle interaction just fine, using oracle tapes. It works as follows: from the point of view of a computer that handles interaction, the input of the operator is simply another sequence of bits that is sent into the computer from time to time. We all know that a lazy sysadmin could write a script to send the input to the program when it is requested, so that the sysadmin could go on break earlier. The interaction machine has no way to know if there is really a live operator at the console, or if the input is being piped from another program. Having all the interaction input prepared ahead of time is the same, in theoretical terms, as having all the input on a separate tape that is used by an oracle Turing machine. Whenever the computer would normally request interaction from the operator, the machine instead reads from the input tape. If the thing read from the tape seems invalid in some way, the Turing machine does exactly what the interaction machine would do on receiving that input. I would guess that Wegner is aware of the ability to use oracle tapes to encode input, so you have to take his comments with a grain of salt, or you have to ask what he actually means. I believe that people who think about interaction are generally worried about two things: * "Real" computers do have interaction, but algorithms as defined by Turing don't. We can get around this by coding the input on oracle tapes, but this still does not match the way that real computers operate. It might be nice to study models of computation that are more closely aligned with real computers. * Oracle-based algorithms are not often considered in day-to-day computing because normal computers don't come with a magic "oracle" to supply data. But we might be able to actually just use a person as the oracle. If the person understands the data that is being requested, they may even be able to help the algorithm along, thus improving its performance.
In other words a human may be able to provide a useful oracle tape rather than simply a random one, and in principle this might lead to faster or more powerful computing methods compared to non-oracle-based ones. A similar thing happens with randomized computing, after all, where the machine is given a sequence of random bits as an extra input.
Check this out :) "***Turing's Ideas and Models of Computation***" <https://www.cs.montana.edu/~elser/turing%20papers/Turing%27s%20Ideas%20and%20Models%20of%20Computation.pdf>
438
I've heard the motto [**interaction is more powerful than algorithms**](http://www.cs.brown.edu/people/pw/papers/ficacm.ps) from [Peter Wegner](http://www.cs.brown.edu/~pw/). The basis of the idea is that a (classical) Turing Machine cannot handle interaction, that is, communication (input/output) with the outside world/environment. > > How can this be so? How can something be more powerful than a Turing Machine? What is the essence of this story? Why is it not more well known? > > >
2012/03/16
[ "https://cs.stackexchange.com/questions/438", "https://cs.stackexchange.com", "https://cs.stackexchange.com/users/31/" ]
Turing Machines model computation, and don't have a concept of interaction. In that sense a machine that supported interaction with an outside system could do things a Turing Machine can't. But the computation done *between* bits of input from an outside source can obviously always be modelled by a Turing Machine, so even an "IO Machine" can't do anything *with* outside input that a Turing Machine couldn't do. In some sense such a machine may be able to "decide" problems that are undecidable by Turing Machines, but only if you imagine that the system it is interacting with has super-Turing-Machine powers *and is reliable* (in some way; probabilistic reliability would be enough). Imagine a program for an IO Machine like: "for any initial tape input, print the tape contents, then read a symbol from outside input; accept if the symbol is 1 and reject otherwise". This program can decide **any** problem. But *only* if the outside system it can interact with is capable of deciding the problem; to me that's not a very interesting sense in which the IO Machine is able to decide problems that are undecidable by Turing Machines. I think it would always be possible to represent an interactive computation by imagining a machine that takes as input on its tape an encoding of some prior configuration together with an outside input, and have the machine halt with its tape containing an encoding of a new configuration together with output. Then the process of "running a program" is repeatedly running this Turing Machine in a mechanical fashion, with the only "non-mechanical" part being however the outside input is sourced. I'm certain that you could prove that if such a system got its input by giving its output to *another Turing Machine* set up to operate in a similar manner, then the combined system has identical computational powers to a single Turing Machine.
I find that a convincing argument that interactive computation is no more powerful than non-interactive computation, *unless the system the computation interacts with is more powerful than a Turing Machine*. --- There is a non-theoretical sense in which interactivity can add to a computer's ability to solve problems, however. There are many things which humans do very accurately that we don't know how to get computers to do very well. But there are also many many things that humans are rubbish at that we can get computers to do. Combining these two can lead to projects such as [reCaptcha](http://www.google.com/recaptcha/learnmore), which is effectively automatically digitising books by farming out the problems of recognising words to humans in difficult cases. The resulting system of computer + human labour achieves a result that is currently impractical to achieve with either the computation alone or the human labour alone.
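The "repeated mechanical runs" idea in the answer above can be made concrete with a small sketch. This is an illustrative Python model, not anything from the original answer: `machine_step` stands in for the Turing-computable function from (configuration, outside input) to (configuration, output), and only the sourcing of the input stream is "non-mechanical".

```python
# Sketch: interactive computation as repeated runs of a pure "machine step".
# Each run is an ordinary computable function; the session as a whole is
# driven by feeding it a stream of outside inputs.

def machine_step(config, outside_input):
    """One mechanical run: double a counter, then fold in the outside input."""
    counter = config["counter"] * 2 + outside_input
    new_config = {"counter": counter}
    output = counter  # what the machine "prints" before reading again
    return new_config, output

def run_session(initial_config, input_stream):
    """Drive the machine with a fixed input stream. When the stream itself
    comes from another deterministic source (e.g. another such machine),
    the whole session is Turing-computable."""
    config, outputs = initial_config, []
    for symbol in input_stream:
        config, out = machine_step(config, symbol)
        outputs.append(out)
    return outputs

print(run_session({"counter": 1}, [0, 1, 1]))  # [2, 5, 11]
```

Replacing `input_stream` with the output of a second `run_session` reproduces the answer's point: two such machines wired together are no more powerful than one Turing Machine.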
Check this out :) "***Turing's Ideas and Models of Computations***" <https://www.cs.montana.edu/~elser/turing%20papers/Turing%27s%20Ideas%20and%20Models%20of%20Computation.pdf>
3,049,658
I have a Silverlight problem. I have to deal with a fairly large legacy application that has some web services exposed (regular asmx, not WCF). What I need to do is build a Silverlight app that consumes those web services. Due to business logic I need the same web method to be called from the Silverlight app synchronously. I was able to build the web service consumption, but only asynchronously, because that is the default Silverlight behavior. Is there a way to make asynchronous calls synchronous in Silverlight? I looked here and googled around but came across only WCF examples. I would appreciate it if somebody could give me a code example for a regular asmx web service. I use Silverlight 4.0. Thanks!
2010/06/15
[ "https://Stackoverflow.com/questions/3049658", "https://Stackoverflow.com", "https://Stackoverflow.com/users/333948/" ]
Silverlight does not support sync calls. Even the TCP/IP stack is async. In some cases where you need the results of "WebServiceCall-A" to send to "WebServiceCall-B", you will have to write code to chain those together. For example, webservice-call-a -> callback-a -> process data -> webservice-b -> callback-b -> process data -> etc.
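The chaining pattern described in this answer can be sketched in plain Python for illustration (the real Silverlight code would use generated async proxies and `Completed` event handlers; the service functions and values here are made up):

```python
# Sketch of callback chaining: the result of "call A" is processed in its
# callback, which then kicks off "call B", whose callback does the final work.

def call_service_a(request, callback):
    callback(request + 1)          # pretend result of WebServiceCall-A

def call_service_b(request, callback):
    callback(request * 10)         # pretend result of WebServiceCall-B

results = []

def on_a_completed(result_a):
    # process data from A, then start B with it
    call_service_b(result_a, on_b_completed)

def on_b_completed(result_b):
    results.append(result_b)       # final processing

call_service_a(5, on_a_completed)
print(results)  # [60]
```

The control flow never blocks waiting for a result; each step resumes in the next callback, which is exactly the webservice-call-a → callback-a → webservice-b → callback-b chain described above.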
Just FYI: I have found the article by Daniel Vaughan today. Here is the link: <http://www.codeproject.com/KB/silverlight/SynchronousSilverlight.aspx> It explains how you can actually do Synchronous Web Service Calls with Silverlight.
4,224,844
I currently have a Nagios monitoring solution installed on one of our App Servers in our DEV environment, and I'd like this solution to monitor all of the other App servers on this network (3 in total). We also have a separate Web server, and I'm having difficulty configuring the web interface on this remote server. All the research I have done so far has given examples of a co-hosted Web and App server. Nothing seems to tell me what steps should be followed in order to configure the web interface remotely. FYI, the App servers are Unix, and the Web server has Apache installed on it. Any tips would be appreciated. Thanks,
2010/11/19
[ "https://Stackoverflow.com/questions/4224844", "https://Stackoverflow.com", "https://Stackoverflow.com/users/506074/" ]
You should check out the [Thruk](http://www.thruk.org/) Nagios web interface.
I think it may be possible to configure the web interface on the same server where Nagios is, then create a virtual host in Apache and point the right IP address to it in your domain configuration file.
4,224,844
I currently have a Nagios monitoring solution installed on one of our App Servers in our DEV environment, and I'd like this solution to monitor all of the other App servers on this network (3 in total). We also have a separate Web server, and I'm having difficulty configuring the web interface on this remote server. All the research I have done so far has given examples of a co-hosted Web and App server. Nothing seems to tell me what steps should be followed in order to configure the web interface remotely. FYI, the App servers are Unix, and the Web server has Apache installed on it. Any tips would be appreciated. Thanks,
2010/11/19
[ "https://Stackoverflow.com/questions/4224844", "https://Stackoverflow.com", "https://Stackoverflow.com/users/506074/" ]
The nagios webinterface needs access to the status.dat, so it has to be on the same machine as nagios. However, you could, as already suggested, have a look at the [Thruk](http://thruk.org) interface which supports remote installations through a livestatus connection.
I think it may be possible to configure the web interface on the same server where Nagios is, then create a virtual host in Apache and point the right IP address to it in your domain configuration file.
4,224,844
I currently have a Nagios monitoring solution installed on one of our App Servers in our DEV environment, and I'd like this solution to monitor all of the other App servers on this network (3 in total). We also have a separate Web server, and I'm having difficulty configuring the web interface on this remote server. All the research I have done so far has given examples of a co-hosted Web and App server. Nothing seems to tell me what steps should be followed in order to configure the web interface remotely. FYI, the App servers are Unix, and the Web server has Apache installed on it. Any tips would be appreciated. Thanks,
2010/11/19
[ "https://Stackoverflow.com/questions/4224844", "https://Stackoverflow.com", "https://Stackoverflow.com/users/506074/" ]
The nagios webinterface needs access to the status.dat, so it has to be on the same machine as nagios. However, you could, as already suggested, have a look at the [Thruk](http://thruk.org) interface which supports remote installations through a livestatus connection.
You should check out the [Thruk](http://www.thruk.org/) Nagios web interface.
87,020
Sometimes I see a search bar that already contains the text "Search ...", which is replaced by whatever the user writes once it is clicked. Which is the better UX design? a) leave the search bar empty and clean? b) insert the text "Search ..."?
2015/11/16
[ "https://ux.stackexchange.com/questions/87020", "https://ux.stackexchange.com", "https://ux.stackexchange.com/users/74706/" ]
Use a prompt ============ I've tested this in ecomm and enterprise contexts in the past and providing some kind of "placeholder prompt" text has always been the winner. An empty field doesn't capture the same level of engagement. As with all things, test with your users to validate.
I guess neither of your options is close to how an internet user thinks about it. It's just telling you there is a BAR for search, if you need it. I guess an internet user wants a search function while he or she is looking for something, and a computer user already knows what the process of searching is about. So, if a guest doesn't know there is a common feature for searching for what they want, then they don't even need to think about what it says. By the way, have you tried this yet? [![By the way, have you tried this yet?](https://i.stack.imgur.com/OZbOJ.png)](https://i.stack.imgur.com/OZbOJ.png)
87,020
Sometimes I see a search bar that already contains the text "Search ...", which is replaced by whatever the user writes once it is clicked. Which is the better UX design? a) leave the search bar empty and clean? b) insert the text "Search ..."?
2015/11/16
[ "https://ux.stackexchange.com/questions/87020", "https://ux.stackexchange.com", "https://ux.stackexchange.com/users/74706/" ]
I would say that having "Search" as a placeholder or having a blank search input amounts to pretty much the same thing. What I do as a pattern is using that placeholder to help the user by suggesting what to search, as in if you're doing a grocery site then just say "Search for vegetables, meat, fish..." according to what you formulate to be the most obvious search topics. Also there's a lot of UI's that use the exact same design pattern (icon/input styling) for search inputs and filter inputs so if you have both you might want to use the placeholder to make the difference more obvious.
I guess neither of your options is close to how an internet user thinks about it. It's just telling you there is a BAR for search, if you need it. I guess an internet user wants a search function while he or she is looking for something, and a computer user already knows what the process of searching is about. So, if a guest doesn't know there is a common feature for searching for what they want, then they don't even need to think about what it says. By the way, have you tried this yet? [![By the way, have you tried this yet?](https://i.stack.imgur.com/OZbOJ.png)](https://i.stack.imgur.com/OZbOJ.png)
87,020
Sometimes I see a search bar that already contains the text "Search ...", which is replaced by whatever the user writes once it is clicked. Which is the better UX design? a) leave the search bar empty and clean? b) insert the text "Search ..."?
2015/11/16
[ "https://ux.stackexchange.com/questions/87020", "https://ux.stackexchange.com", "https://ux.stackexchange.com/users/74706/" ]
Use a prompt ============ I've tested this in ecomm and enterprise contexts in the past and providing some kind of "placeholder prompt" text has always been the winner. An empty field doesn't capture the same level of engagement. As with all things, test with your users to validate.
As long as it's an obvious input field with the word “Search” and/or the icon either within the field or as a label for the field you should be fine. Facebook had some problems with this when they implemented graph search. Fastco wrote an [interesting article](http://www.fastcodesign.com/1673209/how-facebook-fixed-the-biggest-design-flaw-in-graph-search) about their research. [![enter image description here](https://i.stack.imgur.com/3qU3x.jpg)](https://i.stack.imgur.com/3qU3x.jpg)
87,020
Sometimes I see a search bar that already contains the text "Search ...", which is replaced by whatever the user writes once it is clicked. Which is the better UX design? a) leave the search bar empty and clean? b) insert the text "Search ..."?
2015/11/16
[ "https://ux.stackexchange.com/questions/87020", "https://ux.stackexchange.com", "https://ux.stackexchange.com/users/74706/" ]
I would say that having "Search" as a placeholder or having a blank search input amounts to pretty much the same thing. What I do as a pattern is using that placeholder to help the user by suggesting what to search, as in if you're doing a grocery site then just say "Search for vegetables, meat, fish..." according to what you formulate to be the most obvious search topics. Also there's a lot of UI's that use the exact same design pattern (icon/input styling) for search inputs and filter inputs so if you have both you might want to use the placeholder to make the difference more obvious.
As long as it's an obvious input field with the word “Search” and/or the icon either within the field or as a label for the field you should be fine. Facebook had some problems with this when they implemented graph search. Fastco wrote an [interesting article](http://www.fastcodesign.com/1673209/how-facebook-fixed-the-biggest-design-flaw-in-graph-search) about their research. [![enter image description here](https://i.stack.imgur.com/3qU3x.jpg)](https://i.stack.imgur.com/3qU3x.jpg)
107,421
I am wondering if the common sign-up strategy shared by almost all websites is always the best: 1. Provide a username 2. Provide a password and e-mail or mobile number 3. Provide more information and confirm the e-mail or mobile number I was thinking about tracking users and letting them interact with the website just as if they were subscribed. I was thinking of tracking their session in many ways in order to be almost (99%+) sure about who is using the site, so I am able to create a user account on my server without the user knowing it. I call it progressive account creation: 1. On first connection, I track the user's session 2. The user can visit the website just as if they were logged in 3. When the user wants to perform an action that requires a specific piece of data, there will be a form asking for that specific data. 4. At some point they will have entered all the info (name, e-mail, mobile...), but they will only do it when they need it I have three questions: Do you see any major drawback to this approach? Why isn't it used at all? Are there other improvements to be made to legitimize this approach?
2017/04/26
[ "https://ux.stackexchange.com/questions/107421", "https://ux.stackexchange.com", "https://ux.stackexchange.com/users/32309/" ]
I design enterprise software and this problem has come up many times. Firstly, if your users want a different and simplified UX, you don't need to provide those features in your Primary Search. For Primary Search **keep it simple** - one simple search box is all you need (think Google). For all those other features you mentioned, create an **Advanced Search link** and place it on your Primary Search page. When users click this Advanced Search link you can take them to a new page which has all those more complex Boolean search and Save features. You can now monitor how many people click the Advanced Search link. You might find that not many people will use the Advanced Search, so you can make the Advanced Search page as complicated as you like. But you don't have to take my word for this. Jakob Nielsen is a very respected expert in the field of UX and he said this about Advanced Search in 1997: <https://www.nngroup.com/articles/search-and-you-may-find/> (the emphasis is mine) > > **Boolean search should be avoided** since all experience shows that users > cannot use it correctly. We have studied many groups of users who have > been given tasks like this: You have the following pets: > > > * cats > * dogs > > > Find information about your pets. > > > Almost all users will enter the query cats AND dogs. In our studies, they typically do not find anything, since the site never mentions both animals on the same page. Upon encountering a "no hits found" message, the vast majority of users concludes that there is no information available about these pets. > > > Experienced programmers normally use the same erroneous query, but when they get the null result, they typically say "oh, yes, I should have used an OR instead of the AND." > > > Unfortunately, **most users have not been taught debugging**, so they are **very poor at query reformulation**. 
This is why I recommend minimal use of scoped search and **no use of Boolean search in the primary search interface**. > > > **Advanced search is fine if offered on a different page than the simple > search**. The advanced search page can provide a variety of fancy > options, including Booleans, scopes, and various parametric searches > (e.g., only find pages added or changed after a certain date). > > > It is important to use an intimidating name like "advanced search" to > scare off novice users from getting into the page and hurting > themselves. > > > Search is one of the few cases where I do recommend shaping the user's behaviour by intimidation. > > > Jakob has validated this many times since 1997. Peter Morville is a very respected expert on search and findability, and has identified many search patterns. I highly recommend you read Search Patterns: Design for Discovery by Peter Morville & Jeffery Callender. They say this about Advanced Search: > > A relative concept, advanced search includes whatever simple search > doesn't. It's a pattern that **many of us love to hate**. often, advanced > search is a clumsy add-on that's rarely used, and it lets engineers > and designers take the easy way out. > > > Is it a user-friendly query builder for novices or a power tool for > experts? > > > This pattern also suffers from an ignorance of context. > > > In conclusion, advanced search is a pattern on the edge. In practice, > it's often abused and rarely used. Yet, like federated search, it > invites us to go further in our search for ideas, and serves as a > forgiving playground for experiments and exploration. > > >
You can simplify that into a single unit like this, which is easier to understand and use than the existing model: 1. "Add" is the first element in the drop-down, and 2. the "Edit" option is only visible when hovering over a particular item in the drop-down. [![enter image description here](https://i.stack.imgur.com/54Usl.png)](https://i.stack.imgur.com/54Usl.png)
406,318
Here is my current setup, 2 VGA Monitors Mouse & Keyboard Speakers I would like a KVM switch that would allow me to use both monitors as consoles and then connect computers to the KVM Switch so I can assign each computer to one monitor. I haven't really been able to find a solution for this in my searching. I don't need dual monitors on any of the computers. Does there exist such a product or would I be better off getting two KVM switches?
2012/03/29
[ "https://superuser.com/questions/406318", "https://superuser.com", "https://superuser.com/users/89330/" ]
I don't know of a single utility that will handle everything, but some useful options are: 1. strings (from the binutils package) 2. lesspipe (from the less package) 3. antiword 4. odt2txt 5. pdftotext (from poppler-utils) Anything else, and you'll probably need to run /usr/bin/file and see if you can find a package that will let you convert to some canonical format that you know how to deal with.
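The per-format tools listed in this answer naturally fit a small dispatch table. Here is an illustrative Python sketch (not from the original answer); the extension-to-tool mapping is an assumption about typical usage, and the commands are only built, not executed, since whether these tools are installed varies by system:

```python
# Map file extensions to the text-extraction tool named in the answer;
# anything unrecognised falls back to `strings`, the binary catch-all.

TEXT_EXTRACTORS = {
    ".doc": "antiword",
    ".odt": "odt2txt",
    ".pdf": "pdftotext",
}

def extract_command(path):
    """Return the argv list that would extract plain text from `path`."""
    for ext, tool in TEXT_EXTRACTORS.items():
        if path.endswith(ext):
            return [tool, path]
    return ["strings", path]

print(extract_command("report.pdf"))  # ['pdftotext', 'report.pdf']
print(extract_command("a.out"))       # ['strings', 'a.out']
```

In practice you would pass the result to `subprocess.run`, and a more robust version would consult `/usr/bin/file` (or `python-magic`) instead of trusting the extension, as the answer suggests.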
No, the set of all filetypes and text-encodings is too large for a single program to be able to parse them all. Consider for example * Amiga Noisetracker mod file format (contains instrument names etc) * IBM VSAM file containing EBCDIC? * OS/2 Lotus Notes file (encrypted)
406,318
Here is my current setup, 2 VGA Monitors Mouse & Keyboard Speakers I would like a KVM switch that would allow me to use both monitors as consoles and then connect computers to the KVM Switch so I can assign each computer to one monitor. I haven't really been able to find a solution for this in my searching. I don't need dual monitors on any of the computers. Does there exist such a product or would I be better off getting two KVM switches?
2012/03/29
[ "https://superuser.com/questions/406318", "https://superuser.com", "https://superuser.com/users/89330/" ]
Since all of your needed filetypes are [here](http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35287) and [Google Desktop](http://googledesktop.blogspot.com/) has been discontinued I would try [Desktop Search Engines](http://en.wikipedia.org/wiki/List_of_search_engines#Desktop_search_engines) or look for a download site that still offers Google Desktop such as [CNET](http://download.cnet.com/Google-Desktop/3000-2379_4-10328117.html)
No, the set of all filetypes and text-encodings is too large for a single program to be able to parse them all. Consider for example * Amiga Noisetracker mod file format (contains instrument names etc) * IBM VSAM file containing EBCDIC? * OS/2 Lotus Notes file (encrypted)
406,318
Here is my current setup, 2 VGA Monitors Mouse & Keyboard Speakers I would like a KVM switch that would allow me to use both monitors as consoles and then connect computers to the KVM Switch so I can assign each computer to one monitor. I haven't really been able to find a solution for this in my searching. I don't need dual monitors on any of the computers. Does there exist such a product or would I be better off getting two KVM switches?
2012/03/29
[ "https://superuser.com/questions/406318", "https://superuser.com", "https://superuser.com/users/89330/" ]
Since all of your needed filetypes are [here](http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35287) and [Google Desktop](http://googledesktop.blogspot.com/) has been discontinued I would try [Desktop Search Engines](http://en.wikipedia.org/wiki/List_of_search_engines#Desktop_search_engines) or look for a download site that still offers Google Desktop such as [CNET](http://download.cnet.com/Google-Desktop/3000-2379_4-10328117.html)
I don't know of a single utility that will handle everything, but some useful options are: 1. strings (from the binutils package) 2. lesspipe (from the less package) 3. antiword 4. odt2txt 5. pdftotext (from poppler-utils) Anything else, and you'll probably need to run /usr/bin/file and see if you can find a package that will let you convert to some canonical format that you know how to deal with.
57,938
Is the bulging rule applicable in generalized linear models? Specifically, to transform the independent variables? I've only seen it discussed/used in OLS regression. Thanks in advance
2013/05/02
[ "https://stats.stackexchange.com/questions/57938", "https://stats.stackexchange.com", "https://stats.stackexchange.com/users/22552/" ]
A main point about GLM is that using a link function (assuming it's not the identity link) replaces transforming the response or outcome. (Why still use the poor term "dependent variable"?) Indeed, the GLM argument here is that you get the best of both worlds, namely that predictions are on the original scale, and the need for back-transforming -- and the corrections that should imply -- is avoided. I don't know anywhere that this is *really* well explained. Some of the original GLM literature was written by extremely smart people for very smart people, and the rest of us have to catch up somehow. So, the question is not "should I transform y to ln y?" but "should I work with a log link?". What you might do on the RHS is not changed, as I understand it.
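The distinction this answer draws can be checked numerically: a log link models log(E[y]), while regressing on ln(y) models E[log y], and these differ whenever y varies. This small Python illustration (mine, not the answer's) uses made-up values:

```python
# A log *link* targets log of the mean; transforming the response to
# ln(y) targets the mean of the logs. These are different quantities,
# and back-transforming them recovers different things.
import math

y = [1.0, 10.0, 100.0]

log_of_mean = math.log(sum(y) / len(y))              # what a log link models
mean_of_logs = sum(math.log(v) for v in y) / len(y)  # what ln(y) regression models

print(round(log_of_mean, 3))   # 3.611 -> exp(.) gives the mean, 37.0
print(round(mean_of_logs, 3))  # 2.303 -> exp(.) gives the geometric mean, 10.0
```

This is why predictions from a log-link GLM are on the original scale of the mean, while back-transforming a ln(y) regression requires a correction, as the answer says.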
I am interested by the term "Bulging rule" so I try to provide my answer here. Short answer: you can transform independent variables, but generalized linear model can do much more. I think Bulging rule can transform y and/or x into a linear relationship, but it may not work well for multiple regression, e.g. y ~ x1 + x2 + x3 . On the other side, generalized linear model (GLM) is a flexible framework. Not only you can model a non-linear relationship between y and multiple x (x1, x2, ...), you also have better interpretations from the model.
26,562
I have overheard, several times, that templates should not be used on video game consoles; however, I have never heard why, or understood why. Why shouldn't templates be used on video game consoles, and if they are to be used, what are some ways to avoid the negative effects?
2012/04/01
[ "https://gamedev.stackexchange.com/questions/26562", "https://gamedev.stackexchange.com", "https://gamedev.stackexchange.com/users/8507/" ]
The answer to your question is one thing, but I think you probably meant something slightly different, so I'll answer both. **Using Templates** There is absolutely no reason to avoid templates on a console any more than anywhere else. Templates by definition only really affect compile time, not runtime. There is an opportunity for templates to be abused in such a way as to result in bloated machine code, as John Calsbeek's answer describes in some detail. However, nothing is console specific about any of the potential problems with templates. To put it bluntly, your choice to use a template or not should be 100% based on whether a template is the correct way to solve the design problem you're working on. The target platform is entirely irrelevant. There are numerous AAA engines targeting consoles that make heavy use of templates, in fact. The problems that are solved well by templates are any pieces of code you write that need to be generic to types. Containers are an excellent place to use templates. Utility code that applies to many types -- algorithms, smart pointers/handles, and so on -- are excellent places to use templates. Even some template meta-programming can come in massively useful for handling script binding, serialization, editor GUI property sheet binding, messaging, and even network packet formation (done right, these techniques will have zero or very near-zero runtime overhead and again, being on a console or not will just not matter). **Standard Template Library** The C++ STL is a somewhat different matter to templates in general. What you probably heard was that the STL should be avoided on consoles. In fact, even many PC games avoid much of the STL as well. While the STL is a very good library for general purpose programming, it has various weaknesses for games. The STL containers are often poor fits for games as they're designed to be rather broadly applicable rather than targeted at specific needs. 
The std::vector container and *maybe* the std::deque are acceptable as-is, but the majority of the other containers are problematic. Many implementations of std::map for instance are basic binary trees, and your worst case times with them hence don't suit real-time applications like games. You're usually better off writing your own red-black tree container since that way you can know for sure you're getting a data structure with an acceptable worst-case lookup/insertion/removal time. The worst offender of the containers is the std::unordered\_map (hash table), whose implementation is defined by the standard to have various properties which are not really all that useful in most real code I've seen (including all game code I've seen). Those properties basically mandate a suboptimal implementation. In particular, std::unordered\_map is required to not invalidate references to objects on insert/removal, which means that the map cannot use open-addressing, and hence is going to have sloppy memory usage. The implementation is designed under the assumption that you might be storing expensive to move/copy objects, but in games you'll very rarely be storing anything besides numbers or cheap pointers/handles in these kinds of data structures. Algorithms like std::sort are implemented in ways that work well in general cases, but which again have abysmal worst-case performance, and honestly just unacceptable average-case performance for the kinds of things that games sort. Most games will only very rarely sort strings or such, but will sort plain integers very often. Hence using a radix sort algorithm is quite likely going to be a much better choice. The normal STL implementations also tend to make Debug builds abysmally slow due to the abstract nature of their implementation and the excessive debugging checks. 
EA has an in-house STL implementation that goes out of its way to avoid every unnecessary method call in its internal implementation, so that even in non-inlined debug builds the containers will remain at least reasonably fast. They submitted a paper to the C++ committee on the topic (http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2007/n2271.html) and apparently published it to GitHub (https://github.com/paulhodge/EASTL). The allocator model of the STL is also a pain (for games or any other application), and games in general want to have very explicit control over memory. Doubly so for consoles. While this can be worked around with the STL allocator model, it's so painful and obtuse and difficult to use correctly that most games end up using custom container libraries that have non-STL-compliant (but vastly superior for their needs) allocator APIs. So, it *may* be desirable to avoid using any normal STL implementation for console games. Using a template library that is written with games, real-time simulation, and constrained platforms' needs in mind is a very different proposition, however.
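The radix sort suggested above for plain integers can be sketched briefly. This is an illustrative version in Python (a real engine would do this in C++ over raw arrays); it is an LSD byte-wise radix sort, whose worst case is linear in the number of keys for fixed-width integers, unlike a comparison sort's O(n log n):

```python
# LSD radix sort over bytes: one stable bucket pass per byte of the key,
# from least to most significant. Works for non-negative fixed-width ints.

def radix_sort(values, key_bytes=4):
    for shift in range(0, key_bytes * 8, 8):
        buckets = [[] for _ in range(256)]
        for v in values:
            buckets[(v >> shift) & 0xFF].append(v)
        # stable gather: earlier passes' order is preserved within a bucket
        values = [v for bucket in buckets for v in bucket]
    return values

print(radix_sort([170, 45, 75, 90, 802, 24, 2, 66]))
# [2, 24, 45, 66, 75, 90, 170, 802]
```

The fixed 256-bucket pass structure is what gives the predictable worst-case time the answer asks for, at the cost of only working on integer-like keys.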
While I have not heard the 'do not use templates on a console', I can maybe give a personal experience as to why people may say this. Working on a very memory limited console, you of course want the compiler to strip out unused code to compact the exe as small as possible. This actually gave us issues on one platform where the compiler would improperly strip out (or just never compile to begin with) pieces of the templates used for the objects. It did not naturally see that those methods were needed so when the run time would execute, it would crash. We fixed the issue, but this did cause us a bit of code bloat as we had to forcibly declare the template classes. This added a bit more to the code but it ensured that the code we did need would still be there but allow us to still run code stripping as a compilation option. Hope this helps.
156,725
I'm not sure if the death troopers speak entirely in what I assume is code, or if it's just a very noticeable part of their communication and they do speak normally at times. Is there a canon (and out-of-universe would be very interesting) reason for this?
2017/04/07
[ "https://scifi.stackexchange.com/questions/156725", "https://scifi.stackexchange.com", "https://scifi.stackexchange.com/users/46963/" ]
Most special forces have their own special battle cant that is gibberish to anybody else hearing it. Comms can be intercepted, but decoding a specialised jargon or language would take insider knowledge. Just take a look at the [Enigma code](https://en.wikipedia.org/wiki/Enigma_machine) from WW2. The best codebreakers of the Allies weren't able to crack it for a few years. Their only "breakthrough" came from acquiring an Enigma decoder machine from a Nazi submarine. Or take a look at the [Navajo code talkers](http://www.californiaindianeducation.org/native_american_veterans/navajo_code_talkers.html), also from WW2... nobody but a Navajo would be able to understand, even if the code could be intercepted over radio waves. [![enter image description here](https://i.stack.imgur.com/qXE08.png)](https://i.stack.imgur.com/qXE08.png) We see this in many other universes too. Dan A. from W40K writes of battle-cant, battlecode etc.
They have encryption modules within their helmets that allow them to be able to understand what other Death Troopers are saying, but that prevents normal people without these encryption modules from being able to understand them. If you have read *Thrawn: Treason* you would know that Death Troopers are specially trained to be able to keep secrets even under pressure and torture, and this is why the encryption module was never stolen by Rebels, and why even Director Krennic can't understand them.
115,818
I built a zig-zag mechanic from a bone chain and enabled Inverse Kinematics for the first bone and the whole chain. When contracted it should look like this: VVVVVVV Currently it is a straight line of bones like this: --------------- Now the fun part ;) How can I force the bones to always build this zig-zag pattern if the IK-enabled bone is moved directly towards the opposite end of the chain? How can I force the bone chain into one plane of movement? How can I force the bone chain to build a half circle if the IK-enabled bone is not moved directly towards the end of the chain? The movement of the IK-enabled bone is always in the same plane as the rest of the bones. I tried "Limit rotation" but this does not work for me (which could mean either that I did something wrong or that "Limit rotation" is the wrong tool). How can I achieve what I want to do?
2018/08/11
[ "https://blender.stackexchange.com/questions/115818", "https://blender.stackexchange.com", "https://blender.stackexchange.com/users/-1/" ]
By making a Group Node you can easily adjust specific inputs. 1. Go the Node Editor 2. Select all Nodes **except** the Node you want to be flexible, in this case every node except the Color Node 3. Hit Ctrl+G 4. Hit Tab to toggle between the general node view and group node 5. Now you can use this group node in your other materials, and adjust the color per specific material. If you make adjustments to the group node itself it will be applied to all other materials
You can use the Pass Index to discriminate between objects with the same material. * In Object Properties > Relations, assign a Pass index to each object, between 1 and [ your number of distinct colors ]. You can assign the index to multiple selected objects by right-clicking the Pass Index field on the active object, and choosing 'Copy to Selected'. * Replace the RGB node in your tree with the nodes illustrated here: [![enter image description here](https://i.stack.imgur.com/Jxvuk.jpg)](https://i.stack.imgur.com/Jxvuk.jpg) The Divide is just to map the range of integers to 0.0 .. 1.0 for the color ramp, and the subtract nudges the point to make the tabs in the color ramp easier to place) [![enter image description here](https://i.stack.imgur.com/CUPuq.png)](https://i.stack.imgur.com/CUPuq.png) If you need the Pass index for something else, you can drive a Value node from a Custom Property, but that's not quite so convenient to assign.
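The Divide/Subtract arithmetic in the node setup above can be written out for illustration. This Python sketch is mine, not part of the answer; the nudge of half a ramp step is an assumed value (the answer doesn't give the exact Subtract amount), chosen so each index lands squarely between color-ramp stop boundaries:

```python
# Map integer pass indices 1..num_colors into (0, 1] (the Divide node),
# then nudge each value off a ramp-stop boundary (the Subtract node).

def ramp_position(pass_index, num_colors):
    return pass_index / num_colors - 1.0 / (2 * num_colors)

print([ramp_position(i, 4) for i in (1, 2, 3, 4)])
# [0.125, 0.375, 0.625, 0.875]
```

With a Constant-interpolation color ramp whose tabs sit at 0.0, 0.25, 0.5 and 0.75, each pass index then selects exactly one color band.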
109,993
In the past, some of my projects have required me to create a movie version of a fullscreen Flash application. The easiest way to do this has been to take a screen capture. However, capturing anything over 1024x768 has resulted in choppy video, which is unacceptable. I understand that there are hardware-based solutions for capturing fullscreen video, but I have not been able to find out what these are. My output needs to be scalable up to 1920x1080 and result in an uncompressed AVI file with no choppiness.
2008/09/21
[ "https://Stackoverflow.com/questions/109993", "https://Stackoverflow.com", "https://Stackoverflow.com/users/2715/" ]
Various professional products support full HD capture: <http://www.decklink.com/products/hd/> <http://www.aja.com/> There are others. Capturing the full, uncompressed digital or analog stream is a pretty heavy requirement. -Adam
With a bit of luck your graphics adapter already has an analog video output. You could hook up a DVD recorder and just digitize the video signal on a standalone hardware box. That won't give you 1920x1080 though. If you really need captures higher than DVD resolution you need professional (and *incredibly* expensive) video capture equipment. edit: ===== Btw - if you want to capture 1920x1080 in true color at 30 frames per second uncompressed, you have to somehow store around **237 megabytes per second**. Just to give you an idea how much data you have to deal with...
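The figure quoted above falls out of a quick back-of-the-envelope calculation (assuming 32-bit true color, i.e. 4 bytes per pixel, and 1 MB = 2^20 bytes):

```python
# Uncompressed 1080p capture bandwidth at 30 fps, 4 bytes per pixel.
width, height = 1920, 1080
bytes_per_pixel = 4
fps = 30

bytes_per_second = width * height * bytes_per_pixel * fps
megabytes_per_second = bytes_per_second / 2**20

print(round(megabytes_per_second, 1))  # -> 237.3
```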
109,993
In the past, some of my projects have required me to create a movie version of a fullscreen Flash application. The easiest way to do this has been to take a screen capture. However, capturing anything over 1024x768 has resulted in choppy video, which is unacceptable. I understand that there are hardware-based solutions for capturing fullscreen video, but I have not been able to find out what these are. My output needs to be scalable up to 1920x1080 and result in an uncompressed AVI file with no choppiness.
2008/09/21
[ "https://Stackoverflow.com/questions/109993", "https://Stackoverflow.com", "https://Stackoverflow.com/users/2715/" ]
Various professional products support full HD capture: <http://www.decklink.com/products/hd/> <http://www.aja.com/> There are others. Capturing the full, uncompressed digital or analog stream is a pretty heavy requirement. -Adam
If the Flash application is non-interactive, there are many tools that can do non-realtime capture (but completely smooth and perfect) to either an AVI file or a series of PNGs. If it is interactive and you absolutely need realtime capture, FRAPS might actually be able to do the job, at least on Vista, where it's not usually that difficult to manipulate FRAPS into recording various non-DirectShow applications by using Aero as a graphics layer.
109,993
In the past, some of my projects have required me to create a movie version of a fullscreen Flash application. The easiest way to do this has been to take a screen capture. However, capturing anything over 1024x768 has resulted in choppy video, which is unacceptable. I understand that there are hardware-based solutions for capturing fullscreen video, but I have not been able to find out what these are. My output needs to be scalable up to 1920x1080 and result in an uncompressed AVI file with no choppiness.
2008/09/21
[ "https://Stackoverflow.com/questions/109993", "https://Stackoverflow.com", "https://Stackoverflow.com/users/2715/" ]
If the Flash application is non-interactive, there are many tools that can do non-realtime capture (but completely smooth and perfect) to either an AVI file or a series of PNGs. If it is interactive and you absolutely need realtime capture, FRAPS might actually be able to do the job, at least on Vista, where it's not usually that difficult to manipulate FRAPS into recording various non-DirectShow applications by using Aero as a graphics layer.
With a bit of luck your graphics adapter already has an analog video output. You could hook up a DVD recorder and just digitize the video signal on a standalone hardware box. That won't give you 1920x1080 though. If you really need captures higher than DVD resolution you need professional (and *incredibly* expensive) video capture equipment. edit: ===== Btw - if you want to capture 1920x1080 in true color at 30 frames per second uncompressed, you have to somehow store around **237 megabytes per second**. Just to give you an idea how much data you have to deal with...
109,993
In the past, some of my projects have required me to create a movie version of a fullscreen Flash application. The easiest way to do this has been to take a screen capture. However, capturing anything over 1024x768 has resulted in choppy video, which is unacceptable. I understand that there are hardware-based solutions for capturing fullscreen video, but I have not been able to find out what these are. My output needs to be scalable up to 1920x1080 and result in an uncompressed AVI file with no choppiness.
2008/09/21
[ "https://Stackoverflow.com/questions/109993", "https://Stackoverflow.com", "https://Stackoverflow.com/users/2715/" ]
Various professional products support full HD capture: <http://www.decklink.com/products/hd/> <http://www.aja.com/> There are others. Capturing the full, uncompressed digital or analog stream is a pretty heavy requirement. -Adam
[Phillips dvdr3575h](http://www.google.com/products/catalog?rlz=1C1GGLS_enUS291&sourceid=chrome&ie=UTF-8&q=dvd+recoreder+hdd&um=1&cid=9245182892068372171&sa=X&oi=product_catalog_result&resnum=1&ct=result#ps-tech-specs) and other set-top boxes may be the simplest. However, I don't believe they support the resolution you are looking for.
109,993
In the past, some of my projects have required me to create a movie version of a fullscreen Flash application. The easiest way to do this has been to take a screen capture. However, capturing anything over 1024x768 has resulted in choppy video, which is unacceptable. I understand that there are hardware-based solutions for capturing fullscreen video, but I have not been able to find out what these are. My output needs to be scalable up to 1920x1080 and result in an uncompressed AVI file with no choppiness.
2008/09/21
[ "https://Stackoverflow.com/questions/109993", "https://Stackoverflow.com", "https://Stackoverflow.com/users/2715/" ]
Various professional products support full HD capture: <http://www.decklink.com/products/hd/> <http://www.aja.com/> There are others. Capturing the full, uncompressed digital or analog stream is a pretty heavy requirement. -Adam
<http://rgb.com/> I've looked at their product before - very high end/expensive but perfect video and it's a hardware solution so it's not processor intensive on the machine you are trying to do the demo on.
109,993
In the past, some of my projects have required me to create a movie version of a fullscreen Flash application. The easiest way to do this has been to take a screen capture. However, capturing anything over 1024x768 has resulted in choppy video, which is unacceptable. I understand that there are hardware-based solutions for capturing fullscreen video, but I have not been able to find out what these are. My output needs to be scalable up to 1920x1080 and result in an uncompressed AVI file with no choppiness.
2008/09/21
[ "https://Stackoverflow.com/questions/109993", "https://Stackoverflow.com", "https://Stackoverflow.com/users/2715/" ]
If the Flash application is non-interactive, there are many tools that can do non-realtime capture (but completely smooth and perfect) to either an AVI file or a series of PNGs. If it is interactive and you absolutely need realtime capture, FRAPS might actually be able to do the job, at least on Vista, where it's not usually that difficult to manipulate FRAPS into recording various non-DirectShow applications by using Aero as a graphics layer.
[Phillips dvdr3575h](http://www.google.com/products/catalog?rlz=1C1GGLS_enUS291&sourceid=chrome&ie=UTF-8&q=dvd+recoreder+hdd&um=1&cid=9245182892068372171&sa=X&oi=product_catalog_result&resnum=1&ct=result#ps-tech-specs) and other set-top boxes may be the simplest. However, I don't believe they support the resolution you are looking for.
30,994
I'm testing a tool, i.e. a Draw tool. The tool enables the user to draw on a map using the available graphics and to export drawings to the local disk. The user is able to draw, but the export sub-function is not working at all. In this case, what severity do we assign to the defect for the non-working export function? **Note:** The requirement in the SRS says that the user should be able to draw on the map and export his drawings.
2017/12/13
[ "https://sqa.stackexchange.com/questions/30994", "https://sqa.stackexchange.com", "https://sqa.stackexchange.com/users/29177/" ]
According to what you have described, it is definitely an issue, but how do you define "critical" in your organisation? * Is "critical" the highest severity within your organisation? I personally consider this issue to be a ***show stopper***, as its functionality is not met. What is the purpose of drawing something if you cannot export it? * You could argue this software is partially working if the user can save in the default format, but most users, if not all of them, will consider exporting a drawing into a different format or location a standard feature.
As per your description, the Draw tool has two functions: first, it allows the user to draw, and second, it allows the user to export the drawing. Both functions together are what make the tool useful; otherwise no one will be interested in working with a half-working tool. Based on experience at a [software testing services company](https://www.qasource.com/software-testing-company), I would suggest this kind of issue is a critical one and needs to be fixed.
206,053
I am writing a story that involves the colonization of an exoplanet with very similar properties to Earth. This exoplanet would be orbiting a red dwarf, but due to certain factors (its atmosphere and the presence of one or several moons) it is NOT tidally locked. Upon the humans' arrival, to their surprise this planet already harbors complex life. It has a large ocean, continents, and fauna and flora much like our own Earth's. My question is: despite all the similarities, would it be plausible for this planet to be radioactively hostile towards humans? What I mean by this is: despite the similarities, could the fauna and flora have adapted to thrive in an environment which would ultimately be too harsh for our own bodies in the long run? To provide some more context as to why I'm asking this: the plot I'm working on makes it so that when humans arrive, they can breathe without their suits, and at first it seems like the planet is just like Earth, but soon after they realize that prolonged exposure to the radiation on the planet leads to early onset of cancer, and thus humans can't really afford to live on this planet as it is without applying some form of alteration to the atmosphere (that's a whole other subject I'll leave for another post haha), which would lead to harmful consequences for the native life that has evolved to thrive within this specific environment. Love this forum, it's been super helpful. To anyone taking time out of their day to answer: thank you!
2021/06/24
[ "https://worldbuilding.stackexchange.com/questions/206053", "https://worldbuilding.stackexchange.com", "https://worldbuilding.stackexchange.com/users/84700/" ]
**Radiation belts.** <https://astronomy.com/magazine/ask-astro/2020/02/what-is-the-source-of-jupiters-radiation> > > Any charged particles in the space around Jupiter will experience the > planet’s strong magnetic field and get accelerated to high energies. > These negatively charged electrons and positively charged ions of > hydrogen (i.e., protons), oxygen, and sulfur form Jupiter’s radiation > belts... The dominant particle source, however, is the volcanic moon > Io, which dumps a ton per second of sulfur dioxide gas into space > where the molecules are broken up and ionized. These processes around > the moon drive million-amp currents between Io and Jupiter’s > ionosphere, where they excite intense aurorae on the planet. > > > The highest fluxes of energetic particles are found closest to the > planet, where they form a doughnut-shaped belt around the equator > (which causes radio emission that astronomers have measured from Earth > since the 1960s)... > > > It is Jupiter’s combination of a strong magnetic field, Io’s > prodigious source, and the magnetic coupling of charged particles to > the planet’s rapid (10-hour) spin that drives the intense radiation. > Earth’s radiation belts (named after James Van Allen, who discovered > them with the U.S.’s first satellite in 1958) are much weaker. > > > Your planet has a very strong magnetic field and one of its moons plays the role of Io, dumping particles that are accelerated to high speeds. Fast moving particles are the equivalent of electromagnetic radiation as regards biological damage. Your people do not realize that these belts exist and dip down to and below the planet surface because the belt is not homogenous - it moves with the moons, with high and low tides. In addition a detection method tuned to short wavelength radiation like Xrays and gamma rays might completely overlook fast moving large ions of sulfur and iron. As regards the prospect of life not affected by particulate radiation, why not? 
One could imagine Earth life made durable against this sort of thing. A simple method: if radiation makes random changes in a genome, you could keep multiple copies of the genome and check them against each other frequently, correcting according to consensus for a given gene.
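That consensus-repair idea can be sketched in a few lines (a toy model, not real biochemistry): keep several copies of each gene and, at each check, overwrite every position with the majority base across the copies:

```python
from collections import Counter

# Toy model of redundancy-based repair: an organism keeps several
# copies of a gene and periodically restores each position to the
# majority base across all copies.
def consensus_repair(copies):
    repaired = "".join(
        Counter(bases).most_common(1)[0][0]  # majority base at this position
        for bases in zip(*copies)            # walk the copies in lockstep
    )
    return [repaired] * len(copies)

# Radiation flips one base in one copy; the other two copies outvote it.
copies = ["ACGTAC", "ACGTAC", "ACGAAC"]
print(consensus_repair(copies))  # ['ACGTAC', 'ACGTAC', 'ACGTAC']
```

The scheme only fails when the same position is corrupted in a majority of copies between two repair passes, which is why more copies (or more frequent checks) buy more radiation tolerance.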
We have examples of life which has evolved to thrive in hostile environment. [Deinococcus radiodurans](https://en.wikipedia.org/wiki/Deinococcus_radiodurans) is famous for tolerating radiation > > Deinococcus radiodurans is an extremophilic bacterium and one of the most radiation-resistant organisms known. It can survive cold, dehydration, vacuum, and acid, and therefore is known as a polyextremophile. It has been listed as the world's toughest known bacterium in The Guinness Book Of World Records. > > > D. radiodurans is capable of withstanding an acute dose of 5,000 grays (Gy), or 500,000 rad, of ionizing radiation with almost no loss of viability, and an acute dose of 15,000 Gy with 37% viability. A dose of 5,000 Gy is estimated to introduce several hundred double-strand breaks (DSBs) into the organism's DNA (~0.005 DSB/Gy/Mbp (haploid genome)). For comparison, a chest X-ray or Apollo mission involves about 1 mGy, 5 Gy can kill a human, 200-800 Gy will kill E. coli, and more than 4,000 Gy will kill the radiation-resistant tardigrade. > > > It's not far fetched to think that, with suitable environmental pressure, complex life could evolve with a similar resistance. After all complex life has evolved to tolerate oxygen, which is also a nasty environmental factor. However, regarding your plot, a Geiger counter is sufficient to measure environmental radiation level. It sounds very naive to disembark on a planet without measuring the background radiation.
206,053
I am writing a story that involves the colonization of an exoplanet with very similar properties to Earth. This exoplanet would be orbiting a red dwarf, but due to certain factors (its atmosphere and the presence of one or several moons) it is NOT tidally locked. Upon the humans' arrival, to their surprise this planet already harbors complex life. It has a large ocean, continents, and fauna and flora much like our own Earth's. My question is: despite all the similarities, would it be plausible for this planet to be radioactively hostile towards humans? What I mean by this is: despite the similarities, could the fauna and flora have adapted to thrive in an environment which would ultimately be too harsh for our own bodies in the long run? To provide some more context as to why I'm asking this: the plot I'm working on makes it so that when humans arrive, they can breathe without their suits, and at first it seems like the planet is just like Earth, but soon after they realize that prolonged exposure to the radiation on the planet leads to early onset of cancer, and thus humans can't really afford to live on this planet as it is without applying some form of alteration to the atmosphere (that's a whole other subject I'll leave for another post haha), which would lead to harmful consequences for the native life that has evolved to thrive within this specific environment. Love this forum, it's been super helpful. To anyone taking time out of their day to answer: thank you!
2021/06/24
[ "https://worldbuilding.stackexchange.com/questions/206053", "https://worldbuilding.stackexchange.com", "https://worldbuilding.stackexchange.com/users/84700/" ]
We have examples of life which has evolved to thrive in hostile environment. [Deinococcus radiodurans](https://en.wikipedia.org/wiki/Deinococcus_radiodurans) is famous for tolerating radiation > > Deinococcus radiodurans is an extremophilic bacterium and one of the most radiation-resistant organisms known. It can survive cold, dehydration, vacuum, and acid, and therefore is known as a polyextremophile. It has been listed as the world's toughest known bacterium in The Guinness Book Of World Records. > > > D. radiodurans is capable of withstanding an acute dose of 5,000 grays (Gy), or 500,000 rad, of ionizing radiation with almost no loss of viability, and an acute dose of 15,000 Gy with 37% viability. A dose of 5,000 Gy is estimated to introduce several hundred double-strand breaks (DSBs) into the organism's DNA (~0.005 DSB/Gy/Mbp (haploid genome)). For comparison, a chest X-ray or Apollo mission involves about 1 mGy, 5 Gy can kill a human, 200-800 Gy will kill E. coli, and more than 4,000 Gy will kill the radiation-resistant tardigrade. > > > It's not far fetched to think that, with suitable environmental pressure, complex life could evolve with a similar resistance. After all complex life has evolved to tolerate oxygen, which is also a nasty environmental factor. However, regarding your plot, a Geiger counter is sufficient to measure environmental radiation level. It sounds very naive to disembark on a planet without measuring the background radiation.
You asked specifically about "radioactively hostile", but in case you just meant "very unhealthy in the longer term": radiation on the surface could be much higher than on Earth (e.g. like Mars) from cosmic sources, either due to the stellar neighborhood or to a weaker magnetic field (again, like Mars). As a total non-expert in this kind of biology, I can suppose that the native life has adapted to this, similar to the Earth organisms mentioned in other answers. Going beyond radiation, there are any number of trace elements that can be present in soil, and therefore also as dust in the air, that are long-term toxic or carcinogenic for humans, e.g. arsenic (As), cadmium (Cd), chromium (Cr), and nickel (Ni). Asbestos is also a naturally occurring contaminant in various minerals: vermiculite and taconite are well-known examples. Here's a completely different form of hostility: it seems pretty plausible, based on recent research, that life can be constructed from a different set of amino acids than Earth life. And if it can be, the odds are it WILL be on other planets. As a result, if your colonists use native life as food, they will quickly or slowly develop amino-acid deficiencies unless they supplement with the essential Earth amino acids that are missing. I speculate that some non-Earthly amino acids might even be actively toxic, but that's just a guess. One last comment: if you don't want your story to feel like a throwback to the 1950s or '60s, assume that if humans can colonize an extrasolar planet, their science and technology is MORE advanced than ours, and they will understand physics, chemistry, biology, and medicine better than we do. Unless they've gone through a radical loss of civilization (despite interstellar travel) they will NOT be caught by surprise by any of these environmental problems like radiation, toxic metals, or amino-acid deficits, any more than a team of Navy SEALs would get lost in the desert.
Their satellites, or landers, or smart-phones, or body augments, will have told them all about it.
206,053
I am writing a story that involves the colonization of an exoplanet with very similar properties to Earth. This exoplanet would be orbiting a red dwarf, but due to certain factors (its atmosphere and the presence of one or several moons) it is NOT tidally locked. Upon the humans' arrival, to their surprise this planet already harbors complex life. It has a large ocean, continents, and fauna and flora much like our own Earth's. My question is: despite all the similarities, would it be plausible for this planet to be radioactively hostile towards humans? What I mean by this is: despite the similarities, could the fauna and flora have adapted to thrive in an environment which would ultimately be too harsh for our own bodies in the long run? To provide some more context as to why I'm asking this: the plot I'm working on makes it so that when humans arrive, they can breathe without their suits, and at first it seems like the planet is just like Earth, but soon after they realize that prolonged exposure to the radiation on the planet leads to early onset of cancer, and thus humans can't really afford to live on this planet as it is without applying some form of alteration to the atmosphere (that's a whole other subject I'll leave for another post haha), which would lead to harmful consequences for the native life that has evolved to thrive within this specific environment. Love this forum, it's been super helpful. To anyone taking time out of their day to answer: thank you!
2021/06/24
[ "https://worldbuilding.stackexchange.com/questions/206053", "https://worldbuilding.stackexchange.com", "https://worldbuilding.stackexchange.com/users/84700/" ]
We have examples of life which has evolved to thrive in hostile environment. [Deinococcus radiodurans](https://en.wikipedia.org/wiki/Deinococcus_radiodurans) is famous for tolerating radiation > > Deinococcus radiodurans is an extremophilic bacterium and one of the most radiation-resistant organisms known. It can survive cold, dehydration, vacuum, and acid, and therefore is known as a polyextremophile. It has been listed as the world's toughest known bacterium in The Guinness Book Of World Records. > > > D. radiodurans is capable of withstanding an acute dose of 5,000 grays (Gy), or 500,000 rad, of ionizing radiation with almost no loss of viability, and an acute dose of 15,000 Gy with 37% viability. A dose of 5,000 Gy is estimated to introduce several hundred double-strand breaks (DSBs) into the organism's DNA (~0.005 DSB/Gy/Mbp (haploid genome)). For comparison, a chest X-ray or Apollo mission involves about 1 mGy, 5 Gy can kill a human, 200-800 Gy will kill E. coli, and more than 4,000 Gy will kill the radiation-resistant tardigrade. > > > It's not far fetched to think that, with suitable environmental pressure, complex life could evolve with a similar resistance. After all complex life has evolved to tolerate oxygen, which is also a nasty environmental factor. However, regarding your plot, a Geiger counter is sufficient to measure environmental radiation level. It sounds very naive to disembark on a planet without measuring the background radiation.
Increased Resilience to Errors in Genetic Information ===================================================== One of the biggest problems caused by radiation to humans is the damage done to our cells' DNA, which leads to [The Hallmarks of Cancer](https://en.wikipedia.org/wiki/The_Hallmarks_of_Cancer). In short, if the right combination of damage occurs to *any* cell's DNA, it will cause that cell to endlessly reproduce, which copies the defective DNA -- the problem is self-compounding. Our cells have mechanisms to protect against DNA damage via [proofreading and repair](https://www.khanacademy.org/science/high-school-biology/hs-molecular-genetics/hs-discovery-and-structure-of-dna/a/dna-proofreading-and-repair). As the examples on Khan Academy show, these repairs largely rely on the fact that each of the two strands in DNA is complementary to the other. The Wikipedia article on [DNA Replication](https://en.wikipedia.org/wiki/DNA_replication#DNA_structure) notes: > > The pairing of complementary bases in DNA (through hydrogen bonding) means that the information contained within each strand is redundant. > > > Moreover, each strand is used as a template during DNA replication, as shown here [[1](https://en.wikipedia.org/wiki/DNA_replication#/media/File:DNA_replication_split.svg)]: ![A diagram of the DNA double helix being unzipped for replication.](https://upload.wikimedia.org/wikipedia/commons/thumb/7/70/DNA_replication_split.svg/800px-DNA_replication_split.svg.png) This single redundancy is what allows our DNA to be so resilient against damage. 
Compare the double helix in DNA to the single-stranded RNA used for short-lived information in our cells [[2](https://en.wikipedia.org/wiki/Nucleic_acid#/media/File:Difference_DNA_RNA-EN.svg)]: ![A diagram of the DNA double helix being unzipped for replication.](https://upload.wikimedia.org/wikipedia/commons/thumb/3/37/Difference_DNA_RNA-EN.svg/1920px-Difference_DNA_RNA-EN.svg.png) Thus, RNA has no redundancy, and is *much* more susceptible to damage. This is why the genetic information of complex organisms is encoded using DNA instead of RNA: the information is far better protected against corruption. Note how essential [base pairs](https://en.wikipedia.org/wiki/Base_pair) are to both DNA replication and repair processes. Both DNA replication and repair rely entirely on the fact that the only possible DNA [nucleobase](https://en.wikipedia.org/wiki/Nucleobase) pairings are **A-T** and **C-G**. Now then, you might imagine a structure similar to DNA being used by the biology of your fictional planet. However, instead of using [nucleosides](https://en.wikipedia.org/wiki/Nucleoside) that facilitate base *pairs*, there might be alien nucleosides that can only bond as base *triples*, so instead of **A-T** and **C-G**, you might have **A-B-C** and **X-Y-Z**. This would result in a molecule with 3 strands instead of the 2 strands seen in DNA. Thus, instead of only having the *single* redundancy of humans, your lifeforms' genetic information has *dual* redundancy, making it far more resilient against the radiation-induced mutations that cause cancer. This would also render the biological mechanisms that protect your fictional planet's life against radiation incompatible with Earth-based life. Also note that, because each strand in DNA is a sequence built from 4 possible nucleotides, this is a base-4 information-storage scheme very similar to the base-2 system of 1's and 0's used in our computers. 
You could add biological mechanisms that serve as analogs to real-world human-developed [error detection and correction](https://en.wikipedia.org/wiki/Error_detection_and_correction) techniques. Again, such mechanisms would safeguard against the most damaging long-term effects of radiation. Side note --------- As someone involved in space science: radiation is always a major concern, even for robotic missions. I would find it extremely unrealistic for space-faring humans to be oblivious to radiation from the host star of any planet. You might want to give this area some thought.
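The base-pairing redundancy described above can be illustrated with a toy check (an analogy, not real repair enzymology): because the only legal pairings are A-T and C-G, a corrupted base shows up as exactly the position where the two strands no longer pair:

```python
# Toy illustration of why base-pairing redundancy aids repair:
# A pairs only with T, C only with G, so any corrupted base breaks
# the pairing rule at that position and can be located.
PAIR = {"A": "T", "T": "A", "C": "G", "G": "C"}

def mismatches(strand, complement):
    return [i for i, (a, b) in enumerate(zip(strand, complement))
            if PAIR[a] != b]

strand     = "ACGTGC"
complement = "TGCACG"
print(mismatches(strand, complement))  # [] -- a healthy double helix

damaged = "ACGAGC"  # radiation flipped position 3: T -> A
print(mismatches(damaged, complement))  # [3]
```

A hypothetical triple-stranded chemistry with rules like A-B-C would extend the same idea: two undamaged strands at a position identify the third, so the repair machinery can not only detect but also outvote a single corrupted base.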
206,053
I am writing a story that involves the colonization of an exoplanet with very similar properties to Earth. This exoplanet would be orbiting a red dwarf, but due to certain factors (its atmosphere and the presence of one or several moons) it is NOT tidally locked. Upon the humans' arrival, to their surprise this planet already harbors complex life. It has a large ocean, continents, and fauna and flora much like our own Earth's. My question is: despite all the similarities, would it be plausible for this planet to be radioactively hostile towards humans? What I mean by this is: despite the similarities, could the fauna and flora have adapted to thrive in an environment which would ultimately be too harsh for our own bodies in the long run? To provide some more context as to why I'm asking this: the plot I'm working on makes it so that when humans arrive, they can breathe without their suits, and at first it seems like the planet is just like Earth, but soon after they realize that prolonged exposure to the radiation on the planet leads to early onset of cancer, and thus humans can't really afford to live on this planet as it is without applying some form of alteration to the atmosphere (that's a whole other subject I'll leave for another post haha), which would lead to harmful consequences for the native life that has evolved to thrive within this specific environment. Love this forum, it's been super helpful. To anyone taking time out of their day to answer: thank you!
2021/06/24
[ "https://worldbuilding.stackexchange.com/questions/206053", "https://worldbuilding.stackexchange.com", "https://worldbuilding.stackexchange.com/users/84700/" ]
**Radiation belts.** <https://astronomy.com/magazine/ask-astro/2020/02/what-is-the-source-of-jupiters-radiation> > > Any charged particles in the space around Jupiter will experience the > planet’s strong magnetic field and get accelerated to high energies. > These negatively charged electrons and positively charged ions of > hydrogen (i.e., protons), oxygen, and sulfur form Jupiter’s radiation > belts... The dominant particle source, however, is the volcanic moon > Io, which dumps a ton per second of sulfur dioxide gas into space > where the molecules are broken up and ionized. These processes around > the moon drive million-amp currents between Io and Jupiter’s > ionosphere, where they excite intense aurorae on the planet. > > > The highest fluxes of energetic particles are found closest to the > planet, where they form a doughnut-shaped belt around the equator > (which causes radio emission that astronomers have measured from Earth > since the 1960s)... > > > It is Jupiter’s combination of a strong magnetic field, Io’s > prodigious source, and the magnetic coupling of charged particles to > the planet’s rapid (10-hour) spin that drives the intense radiation. > Earth’s radiation belts (named after James Van Allen, who discovered > them with the U.S.’s first satellite in 1958) are much weaker. > > > Your planet has a very strong magnetic field and one of its moons plays the role of Io, dumping particles that are accelerated to high speeds. Fast moving particles are the equivalent of electromagnetic radiation as regards biological damage. Your people do not realize that these belts exist and dip down to and below the planet surface because the belt is not homogenous - it moves with the moons, with high and low tides. In addition a detection method tuned to short wavelength radiation like Xrays and gamma rays might completely overlook fast moving large ions of sulfur and iron. As regards the prospect of life not affected by particulate radiation, why not? 
One could imagine Earth life made durable against this sort of thing. A simple method: if radiation makes random changes in a genome, you could keep multiple copies of the genome and check them against each other frequently, correcting according to consensus for a given gene.
Ramsar, Mazandaran ------------------ [![Lovely Ramsar (Wikipedia)](https://i.stack.imgur.com/fXdX7.jpg)](https://i.stack.imgur.com/fXdX7.jpg) is a city in Iran known for levels of background radiation 80 times the acceptable level from artificial sources. There is radium in the ground water and radon gas coming up into the air. Yet there is a mystery because reports keep coming in that the cancer rate is *not* elevated. See [Wikipedia](https://en.wikipedia.org/wiki/Ramsar,_Mazandaran) for some description. A leading explanation is a crack-pot theory\* called "radiation hormesis", essentially that you can get an internal suntan from being in radiation. Note this conflicts violently with [government advice from the U.S.](https://www.dep.pa.gov/Business/RadiationProtection/RadonDivision/Pages/Radon-in-the-home.aspx), though that hardly guarantees it's wrong. A [2019 paper in an Iranian journal](https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6709356/) even suggested that the residents should be considered as good Mars colonists. There is [a limited amount of ongoing research](https://pubmed.ncbi.nlm.nih.gov/?term=ramsar+radiation&sort=date) but it doesn't seem very enlightening. Sometimes - often - biology just digs in like a mule and rejects all entreaties to logic and reason. * *This term of endearment confused some readers. Please note that "crack-pot theories" turn out to be true more often than expected in biology. Why, just the other day we found out [humans carry an HIV-like polymerase to convert RNA to DNA](https://advances.sciencemag.org/content/7/24/eabf1771), which is to say, Lamarckism may be back on the menu. You never know!*
206,053
I am writing a story that involves the colonization of an exoplanet with very similar properties to Earth. This exoplanet would be orbiting a red dwarf, but due to certain factors (its atmosphere and the presence of one or several moons) it is NOT tidally locked. Upon the humans' arrival, to their surprise, this planet already harbors complex life. It has a large ocean, continents, and fauna and flora much like our own Earth. My question is: despite all the similarities, would it be plausible for this planet to be radioactively hostile towards humans? What I mean by this is: despite the similarities, could the fauna and flora have adapted to thrive in an environment which would ultimately be too harsh for our own bodies in the long run? To provide some more context as to why I'm asking: the plot I'm working on makes it so that when humans arrive, they can breathe without their suits, and at first it seems like the planet is just like Earth. But soon after, they realize that prolonged exposure to the radiation within the planet leads to early onset of cancer, and thus humans can't really afford to live on this planet as it is without applying some form of alteration to the atmosphere (that's a whole other subject I'll leave for another post haha), which would lead to harmful consequences for the native life that has evolved to thrive within this specific environment. Love this forum, it's been super helpful. To anyone taking time out of their day to answer: thank you!
2021/06/24
[ "https://worldbuilding.stackexchange.com/questions/206053", "https://worldbuilding.stackexchange.com", "https://worldbuilding.stackexchange.com/users/84700/" ]
You asked specifically about "radioactively hostile" but in case you just meant "very unhealthy in the longer term" - radiation on the surface could be much higher than on Earth (e.g. like Mars) from cosmic sources, either due to the stellar neighborhood or to a weaker magnetic field (again, like Mars). As a total non-expert in this kind of biology, I can suppose that the native life has adapted to this, similar to the Earth organisms mentioned in other answers. Going outside 'radiation', there are any number of trace elements that can be present in soil, and therefore also as dust in the air, that are long-term toxic or carcinogenic for humans, e.g. arsenic (As), cadmium (Cd), chromium (Cr), and nickel (Ni). Asbestos is also a naturally occurring contaminant in various minerals: vermiculite and taconite are well-known examples. Here's a completely different form of hostility: it seems pretty plausible, based on recent research, that life can be constructed from a different set of amino acids than Earth life. And if it can be, the odds are it WILL be on other planets. As a result, if your colonists use native life as food, they will quickly or slowly develop amino-acid deficiencies unless they supplement with the essential Earth amino acids that are missing. I speculate that some non-Earthly aminos might even be actively toxic, but that's just a guess. One last comment: if you don't want your story to feel like a throwback to the 1950s or '60s, assume that if humans can colonize an extrasolar planet, their science and technology is MORE advanced than ours, and they will understand physics, chemistry, biology, and medicine better than we do. Unless they've gone through a radical loss of civilization (despite interstellar travel), they will NOT be caught by surprise by any of these environmental problems like radiation, toxic metals, or amino-acid deficits, any more than a team of Navy SEALs would get lost in the desert.
Their satellites, or landers, or smart-phones, or body augments, will have told them all about it.
Increased Resilience to Errors in Genetic Information ===================================================== One of the biggest problems caused by radiation to humans is the damage done to our cells' DNA, which leads to [The Hallmarks of Cancer](https://en.wikipedia.org/wiki/The_Hallmarks_of_Cancer). In short, if the right combination of damage occurs to *any* cell's DNA, it will cause that cell to endlessly reproduce, which copies the defective DNA -- the problem is self-compounding. Our cells have mechanisms to protect against DNA damage via [proofreading and repair](https://www.khanacademy.org/science/high-school-biology/hs-molecular-genetics/hs-discovery-and-structure-of-dna/a/dna-proofreading-and-repair). As the examples on Khan Academy show, these repairs largely rely on the fact that each of the two strands in DNA is complementary to the other, so either strand determines the full sequence. The Wikipedia article on [DNA Replication](https://en.wikipedia.org/wiki/DNA_replication#DNA_structure) notes: > > The pairing of complementary bases in DNA (through hydrogen bonding) means that the information contained within each strand is redundant. > > > Moreover, each strand is used as a template during DNA replication, as shown here [[1](https://en.wikipedia.org/wiki/DNA_replication#/media/File:DNA_replication_split.svg)]: ![A diagram of the DNA double helix being unzipped for replication.](https://upload.wikimedia.org/wikipedia/commons/thumb/7/70/DNA_replication_split.svg/800px-DNA_replication_split.svg.png) This single redundancy is what allows our DNA to be so resilient against damage.
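That redundancy can be illustrated with a toy Python sketch (the sequences are invented for the example, and `?` marks a base knocked out by radiation): the intact strand rebuilds its damaged partner using nothing but the A-T / C-G pairing rules.

```python
# Watson-Crick base-pairing rules: each base determines its partner.
COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def repair_from_complement(damaged, template):
    """Rebuild a damaged strand ('?' = lost base) from its intact complement."""
    return "".join(
        COMPLEMENT[t] if base == "?" else base
        for base, t in zip(damaged, template)
    )

damaged  = "ATG?CC?TA"   # two bases lost to radiation damage
template = "TACCGGAAT"   # the surviving complementary strand

print(repair_from_complement(damaged, template))  # → "ATGGCCTTA"
```

Note what the pairing rules buy you: the repair machinery never needs to know what the lost base was, only which strand survived intact.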
Compare the double helix in DNA to the single-stranded RNA used for short-lived information in our cells [[2](https://en.wikipedia.org/wiki/Nucleic_acid#/media/File:Difference_DNA_RNA-EN.svg)]: ![A diagram comparing double-stranded DNA with single-stranded RNA.](https://upload.wikimedia.org/wikipedia/commons/thumb/3/37/Difference_DNA_RNA-EN.svg/1920px-Difference_DNA_RNA-EN.svg.png) Thus, RNA has no redundancy, and is *much* more susceptible to damage. This is why the genetic information of complex organisms is encoded using DNA instead of RNA: the information is far better protected against corruption. Note how essential [base pairs](https://en.wikipedia.org/wiki/Base_pair) are to both DNA replication and repair processes: both rely entirely on the fact that the only possible DNA [nucleobase](https://en.wikipedia.org/wiki/Nucleobase) pairings are **A-T** and **C-G**. Now then, you might imagine a structure similar to DNA being used by the biology of your fictional planet. However, instead of using [nucleosides](https://en.wikipedia.org/wiki/Nucleoside) that facilitate base *pairs*, there might be alien nucleosides that can only bond as base *triples*, so instead of **A-T** and **C-G**, you might have **A-B-C** and **X-Y-Z**. This would result in a molecule with three strands instead of the two strands seen in DNA. Thus, instead of only having the *single* redundancy of humans, your lifeforms' genetic information has *dual* redundancy, making it far more resilient against the radiation-induced mutations that cause cancer. This would also render the biological mechanisms that protect your fictional planet's life against radiation incompatible with Earth-based life. Also note that, because each strand in DNA is a sequence of 4 nucleotides, this is a base-4 information storage scheme, very similar to the base-2 system of 1's and 0's used in our computers.
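A quick sketch of why the hypothetical three-strand scheme tolerates more damage: if any one strand loses a base at some position, the two surviving bases identify the triple uniquely, so the lost base can be recovered. (The **A-B-C** / **X-Y-Z** alphabets here are the invented ones from this answer, not real chemistry.)

```python
# The only two allowed bonding triples in the hypothetical three-stranded molecule.
TRIPLES = [{"A", "B", "C"}, {"X", "Y", "Z"}]

def recover_base(surviving):
    """Given the two surviving bases at a position, return the lost third base."""
    known = set(surviving)
    for triple in TRIPLES:
        if known <= triple:                 # both survivors belong to this triple
            return (triple - known).pop()   # the one remaining member was lost
    raise ValueError("bases do not belong to a single allowed triple")

# Strand 2 took a radiation hit; strands 1 and 3 still pin down the full triple.
print(recover_base(["A", "C"]))  # → "B"
print(recover_base(["X", "Z"]))  # → "Y"
```

Contrast this with DNA's pair scheme, where a hit that leaves one strand intact is repairable but a simultaneous hit to both strands at the same position is not; the triple scheme still recovers from any single-strand loss per position.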
You could add biological mechanisms that serve as analogs to real-world human-developed [error detection and correction](https://en.wikipedia.org/wiki/Error_detection_and_correction) techniques. Again, such mechanisms would safeguard against the most damaging long-term effects of radiation. Side note --------- As someone involved in space science: radiation is always a major concern, even for robotic missions. I would find it extremely unrealistic for space-faring humans to be oblivious to radiation from the host star of any planet. You might want to give this area some thought.
206,053
I am writing a story that involves the colonization of an exoplanet with very similar properties to Earth. This exoplanet would be orbiting a red dwarf, but due to certain factors (it's atmosphere and the presence of one or several moons) it is NOT tidally locked. Upon arrival from humans, to their surprise this planet already harbors complex life. It has a large ocean, continents, fauna and flora much like our own earth. My question is: despite all the similarities, would it be plausible for this planet to be radioactively hostile towards humans? What I mean by this is, despite the similarities, could the fauna and flora have adapted to thrive in an environment which would be ultimately too harsh for our own bodies in the long run? To provide some more context as to why i'm asking this: The plot i'm working on makes it so that when humans arrive, they can breathe without their suits, and at first it seems like the planet is just like earth, but soon after they realize that prolongued exposure to the radiation within the planet leads to early offset of cancer and thus humans can't really afford to live in this planet as it is without applying some form of alteration to the atmosphere (that's a wholenother subject i'll leave for another post haha), which would lead to harmful consequences towards the native life that has evolved to thrive within this specific environment. Love this forum, it's been super helpful, to anyone taking time out of their day to answer: Thank you!
2021/06/24
[ "https://worldbuilding.stackexchange.com/questions/206053", "https://worldbuilding.stackexchange.com", "https://worldbuilding.stackexchange.com/users/84700/" ]
**Radiation belts.** <https://astronomy.com/magazine/ask-astro/2020/02/what-is-the-source-of-jupiters-radiation> > > Any charged particles in the space around Jupiter will experience the > planet’s strong magnetic field and get accelerated to high energies. > These negatively charged electrons and positively charged ions of > hydrogen (i.e., protons), oxygen, and sulfur form Jupiter’s radiation > belts... The dominant particle source, however, is the volcanic moon > Io, which dumps a ton per second of sulfur dioxide gas into space > where the molecules are broken up and ionized. These processes around > the moon drive million-amp currents between Io and Jupiter’s > ionosphere, where they excite intense aurorae on the planet. > > > The highest fluxes of energetic particles are found closest to the > planet, where they form a doughnut-shaped belt around the equator > (which causes radio emission that astronomers have measured from Earth > since the 1960s)... > > > It is Jupiter’s combination of a strong magnetic field, Io’s > prodigious source, and the magnetic coupling of charged particles to > the planet’s rapid (10-hour) spin that drives the intense radiation. > Earth’s radiation belts (named after James Van Allen, who discovered > them with the U.S.’s first satellite in 1958) are much weaker. > > > Your planet has a very strong magnetic field and one of its moons plays the role of Io, dumping particles that are accelerated to high speeds. Fast moving particles are the equivalent of electromagnetic radiation as regards biological damage. Your people do not realize that these belts exist and dip down to and below the planet surface because the belt is not homogenous - it moves with the moons, with high and low tides. In addition a detection method tuned to short wavelength radiation like Xrays and gamma rays might completely overlook fast moving large ions of sulfur and iron. As regards the prospect of life not affected by particulate radiation, why not? 
One could imagine Earth life durable to this sort of thing. A simple method: if radiation makes random changes in a genome you could have multiple copies of the genome and check them against each other frequently, correcting according to consensus for a given gene.
You asked specifically about "radioactively hostile" but in case you just meant "very unhealthy in the longer term" - radiation on the surface could be much higher than on Earth (e.g. like Mars) from cosmic sources, either due to the stellar neighborhood or to a weaker magnetic field (again, like Mars). As a total non-expert in this kind of biology I can suppose that the native life has adapted to this, similar to the Earth organisms mentioned in other answers. Going outside 'radiation', there are any number of trace elements that can be present in soil, and therefore also as dust in the air, that are long-term toxic or carcinogenic for humans e.g. Arsenic (As), cadmium (Cd), chromium (Cr), and nickel (Ni). Asbestos is also a naturally occuring contaminant in various minerals: vermiculite and taconite are well-known examples. Here's a completely different form of hostility: It seems pretty plausible based on recent research, that life can be constructed from a different set of amino acids than Earth life. And if it can be, the odds are it WILL be on other planets. As a result, if your colonists use native life as food, they will quickly or slowly develop amino-acid deficiencies unless they supplement with the essential Earth amino acids that are missing. I speculate that some non-Earthly aminos might even be actively toxic, but that's just a guess. One last comment: If you don't want your story to feel like a throwback to the 1950's or 60's, assume that if humans can colonize an extrasolar planet, that their science and technology is MORE advanced than ours, and they will understand physics, chemistry, biology, and medicine better than we do. Unless they've gone through a radical loss of civilization (despite interstellar travel) they will NOT be caught by surprise by any of these environmental problems like radiation, toxic metals, or amino-acid deficits, any more than a team of Navy SEALs would get lost in the desert. 
Their satellites, or landers, or smart-phones, or body augments, will have told them all about it.
206,053
I am writing a story that involves the colonization of an exoplanet with very similar properties to Earth. This exoplanet would be orbiting a red dwarf, but due to certain factors (it's atmosphere and the presence of one or several moons) it is NOT tidally locked. Upon arrival from humans, to their surprise this planet already harbors complex life. It has a large ocean, continents, fauna and flora much like our own earth. My question is: despite all the similarities, would it be plausible for this planet to be radioactively hostile towards humans? What I mean by this is, despite the similarities, could the fauna and flora have adapted to thrive in an environment which would be ultimately too harsh for our own bodies in the long run? To provide some more context as to why i'm asking this: The plot i'm working on makes it so that when humans arrive, they can breathe without their suits, and at first it seems like the planet is just like earth, but soon after they realize that prolongued exposure to the radiation within the planet leads to early offset of cancer and thus humans can't really afford to live in this planet as it is without applying some form of alteration to the atmosphere (that's a wholenother subject i'll leave for another post haha), which would lead to harmful consequences towards the native life that has evolved to thrive within this specific environment. Love this forum, it's been super helpful, to anyone taking time out of their day to answer: Thank you!
2021/06/24
[ "https://worldbuilding.stackexchange.com/questions/206053", "https://worldbuilding.stackexchange.com", "https://worldbuilding.stackexchange.com/users/84700/" ]
**Radiation belts.** <https://astronomy.com/magazine/ask-astro/2020/02/what-is-the-source-of-jupiters-radiation> > > Any charged particles in the space around Jupiter will experience the > planet’s strong magnetic field and get accelerated to high energies. > These negatively charged electrons and positively charged ions of > hydrogen (i.e., protons), oxygen, and sulfur form Jupiter’s radiation > belts... The dominant particle source, however, is the volcanic moon > Io, which dumps a ton per second of sulfur dioxide gas into space > where the molecules are broken up and ionized. These processes around > the moon drive million-amp currents between Io and Jupiter’s > ionosphere, where they excite intense aurorae on the planet. > > > The highest fluxes of energetic particles are found closest to the > planet, where they form a doughnut-shaped belt around the equator > (which causes radio emission that astronomers have measured from Earth > since the 1960s)... > > > It is Jupiter’s combination of a strong magnetic field, Io’s > prodigious source, and the magnetic coupling of charged particles to > the planet’s rapid (10-hour) spin that drives the intense radiation. > Earth’s radiation belts (named after James Van Allen, who discovered > them with the U.S.’s first satellite in 1958) are much weaker. > > > Your planet has a very strong magnetic field and one of its moons plays the role of Io, dumping particles that are accelerated to high speeds. Fast moving particles are the equivalent of electromagnetic radiation as regards biological damage. Your people do not realize that these belts exist and dip down to and below the planet surface because the belt is not homogenous - it moves with the moons, with high and low tides. In addition a detection method tuned to short wavelength radiation like Xrays and gamma rays might completely overlook fast moving large ions of sulfur and iron. As regards the prospect of life not affected by particulate radiation, why not? 
One could imagine Earth life durable to this sort of thing. A simple method: if radiation makes random changes in a genome you could have multiple copies of the genome and check them against each other frequently, correcting according to consensus for a given gene.
Increased Resilience to Errors in Genetic Information ===================================================== One of the biggest problems caused by radiation to humans is the damage done to our cells' DNA, which lead to [The Hallmarks of Cancer](https://en.wikipedia.org/wiki/The_Hallmarks_of_Cancer). In short, if the right combination of damage occurs to *any* cell's DNA, it will cause that cell to endlessly reproduce, which copies the defective DNA -- the problem is self-compounding. Our cells have mechanisms to protect against DNA damage via [proofreading and repair](https://www.khanacademy.org/science/high-school-biology/hs-molecular-genetics/hs-discovery-and-structure-of-dna/a/dna-proofreading-and-repair). As the examples on Khan academy show, these repairs largely rely on the fact that each of the two strands in DNA is identical to the other. The Wikipedia article on [DNA Replication](https://en.wikipedia.org/wiki/DNA_replication#DNA_structure) notes: > > The pairing of complementary bases in DNA (through hydrogen bonding) means that the information contained within each strand is redundant. > > > Moreover, each strand is used as a template during DNA replication, as shown here [[1](https://en.wikipedia.org/wiki/DNA_replication#/media/File:DNA_replication_split.svg)]: ![A diagram of the DNA double helix being unzipped for replication.](https://upload.wikimedia.org/wikipedia/commons/thumb/7/70/DNA_replication_split.svg/800px-DNA_replication_split.svg.png) This single redundancy is what allows our DNA to be so resilient against damage. 
Compare the double helix in DNA to the single-stranded RNA used for short-lived information in our cells [[2](https://en.wikipedia.org/wiki/Nucleic_acid#/media/File:Difference_DNA_RNA-EN.svg)]: ![A diagram of the DNA double helix being unzipped for replication.](https://upload.wikimedia.org/wikipedia/commons/thumb/3/37/Difference_DNA_RNA-EN.svg/1920px-Difference_DNA_RNA-EN.svg.png) Thus, RNA has no redundancy, and is *much* more susceptible to damage. This is why the genetic information of complex organisms in encoded using DNA instead of RNA: the information is far better protected against corruption. Note how essential [base pairs](https://en.wikipedia.org/wiki/Base_pair) are to both DNA replication and repair processes. Both DNA replication and repair rely entirely on the fact that the only possible DNA [nucleobase](https://en.wikipedia.org/wiki/Nucleobase) pairings are **A-T** and **C-G**. Now then, you might imagine a structure similar to DNA being used by the biology of your fictional planet. However, instead of using [nucleosides](https://en.wikipedia.org/wiki/Nucleoside) that facilitate base *pairs*, there might be alien nucleosides that can only bond as base *triples*, so instead of **A-T** and **C-G**, you might have **A-B-C** and **X-Y-Z**. This would result in molecule with 3 strands instead of the 2 strands seen in DNA. Thus, instead of only having the *single* redundancy of humans, your lifeforms' genetic information has *dual* redundancy, making it far more resilient against the radiation-induced mutations that cause cancer. This would also render the biological mechanisms that protect your fictional planet's life against radiation incompatible with Earth-based life. Also note that, because each strand in DNA is a sequence of 4 nucleotides, this is base-4 information storage scheme very similar to the base-2 system of 1's and 0's used in our computers. 
You could add biological mechanisms that serve as analogs to real-world human-developed [error detection and correction](https://en.wikipedia.org/wiki/Error_detection_and_correction) techniques. Again, such mechanisms would safeguard against the most damaging long-term effects of radiation. Side note --------- As someone involved in space science: radiation is always a major concern, even for robotic missions. I would find it extremely unrealistic for space-faring humans to be oblivious to radiation from the host star of any planet. You might want to give this area some thought.
206,053
I am writing a story that involves the colonization of an exoplanet with very similar properties to Earth. This exoplanet would be orbiting a red dwarf, but due to certain factors (it's atmosphere and the presence of one or several moons) it is NOT tidally locked. Upon arrival from humans, to their surprise this planet already harbors complex life. It has a large ocean, continents, fauna and flora much like our own earth. My question is: despite all the similarities, would it be plausible for this planet to be radioactively hostile towards humans? What I mean by this is, despite the similarities, could the fauna and flora have adapted to thrive in an environment which would be ultimately too harsh for our own bodies in the long run? To provide some more context as to why i'm asking this: The plot i'm working on makes it so that when humans arrive, they can breathe without their suits, and at first it seems like the planet is just like earth, but soon after they realize that prolongued exposure to the radiation within the planet leads to early offset of cancer and thus humans can't really afford to live in this planet as it is without applying some form of alteration to the atmosphere (that's a wholenother subject i'll leave for another post haha), which would lead to harmful consequences towards the native life that has evolved to thrive within this specific environment. Love this forum, it's been super helpful, to anyone taking time out of their day to answer: Thank you!
2021/06/24
[ "https://worldbuilding.stackexchange.com/questions/206053", "https://worldbuilding.stackexchange.com", "https://worldbuilding.stackexchange.com/users/84700/" ]
Increased Resilience to Errors in Genetic Information ===================================================== One of the biggest problems caused by radiation to humans is the damage done to our cells' DNA, which lead to [The Hallmarks of Cancer](https://en.wikipedia.org/wiki/The_Hallmarks_of_Cancer). In short, if the right combination of damage occurs to *any* cell's DNA, it will cause that cell to endlessly reproduce, which copies the defective DNA -- the problem is self-compounding. Our cells have mechanisms to protect against DNA damage via [proofreading and repair](https://www.khanacademy.org/science/high-school-biology/hs-molecular-genetics/hs-discovery-and-structure-of-dna/a/dna-proofreading-and-repair). As the examples on Khan academy show, these repairs largely rely on the fact that each of the two strands in DNA is identical to the other. The Wikipedia article on [DNA Replication](https://en.wikipedia.org/wiki/DNA_replication#DNA_structure) notes: > > The pairing of complementary bases in DNA (through hydrogen bonding) means that the information contained within each strand is redundant. > > > Moreover, each strand is used as a template during DNA replication, as shown here [[1](https://en.wikipedia.org/wiki/DNA_replication#/media/File:DNA_replication_split.svg)]: ![A diagram of the DNA double helix being unzipped for replication.](https://upload.wikimedia.org/wikipedia/commons/thumb/7/70/DNA_replication_split.svg/800px-DNA_replication_split.svg.png) This single redundancy is what allows our DNA to be so resilient against damage. 
Compare the double helix in DNA to the single-stranded RNA used for short-lived information in our cells [[2](https://en.wikipedia.org/wiki/Nucleic_acid#/media/File:Difference_DNA_RNA-EN.svg)]: ![A diagram of the DNA double helix being unzipped for replication.](https://upload.wikimedia.org/wikipedia/commons/thumb/3/37/Difference_DNA_RNA-EN.svg/1920px-Difference_DNA_RNA-EN.svg.png) Thus, RNA has no redundancy, and is *much* more susceptible to damage. This is why the genetic information of complex organisms in encoded using DNA instead of RNA: the information is far better protected against corruption. Note how essential [base pairs](https://en.wikipedia.org/wiki/Base_pair) are to both DNA replication and repair processes. Both DNA replication and repair rely entirely on the fact that the only possible DNA [nucleobase](https://en.wikipedia.org/wiki/Nucleobase) pairings are **A-T** and **C-G**. Now then, you might imagine a structure similar to DNA being used by the biology of your fictional planet. However, instead of using [nucleosides](https://en.wikipedia.org/wiki/Nucleoside) that facilitate base *pairs*, there might be alien nucleosides that can only bond as base *triples*, so instead of **A-T** and **C-G**, you might have **A-B-C** and **X-Y-Z**. This would result in molecule with 3 strands instead of the 2 strands seen in DNA. Thus, instead of only having the *single* redundancy of humans, your lifeforms' genetic information has *dual* redundancy, making it far more resilient against the radiation-induced mutations that cause cancer. This would also render the biological mechanisms that protect your fictional planet's life against radiation incompatible with Earth-based life. Also note that, because each strand in DNA is a sequence of 4 nucleotides, this is base-4 information storage scheme very similar to the base-2 system of 1's and 0's used in our computers. 
You could add biological mechanisms that serve as analogs to real-world human-developed [error detection and correction](https://en.wikipedia.org/wiki/Error_detection_and_correction) techniques. Again, such mechanisms would safeguard against the most damaging long-term effects of radiation. Side note --------- As someone involved in space science: radiation is always a major concern, even for robotic missions. I would find it extremely unrealistic for space-faring humans to be oblivious to radiation from the host star of any planet. You might want to give this area some thought.
You asked specifically about "radioactively hostile" but in case you just meant "very unhealthy in the longer term" - radiation on the surface could be much higher than on Earth (e.g. like Mars) from cosmic sources, either due to the stellar neighborhood or to a weaker magnetic field (again, like Mars). As a total non-expert in this kind of biology I can suppose that the native life has adapted to this, similar to the Earth organisms mentioned in other answers. Going outside 'radiation', there are any number of trace elements that can be present in soil, and therefore also as dust in the air, that are long-term toxic or carcinogenic for humans, e.g. arsenic (As), cadmium (Cd), chromium (Cr), and nickel (Ni). Asbestos is also a naturally occurring contaminant in various minerals: vermiculite and taconite are well-known examples. Here's a completely different form of hostility: It seems pretty plausible, based on recent research, that life can be constructed from a different set of amino acids than Earth life. And if it can be, the odds are it WILL be on other planets. As a result, if your colonists use native life as food, they will quickly or slowly develop amino-acid deficiencies unless they supplement with the essential Earth amino acids that are missing. I speculate that some non-Earthly aminos might even be actively toxic, but that's just a guess. One last comment: If you don't want your story to feel like a throwback to the 1950's or 60's, assume that if humans can colonize an extrasolar planet, their science and technology is MORE advanced than ours, and they will understand physics, chemistry, biology, and medicine better than we do. Unless they've gone through a radical loss of civilization (despite interstellar travel) they will NOT be caught by surprise by any of these environmental problems like radiation, toxic metals, or amino-acid deficits, any more than a team of Navy SEALs would get lost in the desert.
Their satellites, or landers, or smart-phones, or body augments, will have told them all about it.
10,173
If I were to ask a question on using separate fonts to represent distorted or alien speech, <or enclose some kind of bracket to represent talking in another language>, how well would that question be received?
2017/04/12
[ "https://english.meta.stackexchange.com/questions/10173", "https://english.meta.stackexchange.com", "https://english.meta.stackexchange.com/users/219667/" ]
I don't think it's [on-topic](https://english.stackexchange.com/help/on-topic) here, as it doesn't specifically deal with the *English* language. I don't know the context of your question (it might be good to add it!) but if you're writing a book, you might want to visit our sister sites [Writers](https://writers.stackexchange.com/help/on-topic) and [Worldbuilding](https://worldbuilding.stackexchange.com/help/on-topic).
I don't think that kind of question is on-topic here. I agree with Glorfindel's suggestion to consider [Writers](https://writers.stackexchange.com/help/on-topic), since the help center says > > Questions on these topics are welcome here: > > > * Non-fiction, technical, scholarly, or journalistic writing. > * Writing fiction, poetry, or song lyrics. > * General copywriting, **style,** and organization. > > > Another possibility for questions about typography is the [Graphic Design](https://graphicdesign.stackexchange.com/help/on-topic) Stack Exchange. However, keep in mind that questions that seem like "[idea generation](https://writers.meta.stackexchange.com/questions/1271/do-the-on-topic-rules-fit-non-fiction-particularly-technical-writing/1272#1272)" (per the Writers help center) or "Brainstorming or idea gathering" (per the Graphic Design help center) are broadly off-topic on all Stack Exchange sites. So make sure your question is narrow enough in scope to avoid falling into one of these categories. For example, rather than asking "How can I use typography to indicate talking in another language?", which if taken literally could be answered with anyone's opinion, you might ask "What techniques are commonly used in published novels/graphic novels to indicate talking in another language?", which implies that answers should be based primarily on facts rather than on opinion and which narrows the scope to whatever category of literature you're interested in. Some kinds of questions about [typography](https://english.stackexchange.com/questions/tagged/typography) are on-topic for ELU. The exact lines are probably blurry, but I would say that the main thing that makes a typography question on-topic here is some element of specificity with regard to the English language; it also helps if the question is clearly objective and interesting.
14,641
I understand that using encryption for email isn't particularly mainstream (a fraction of a percentage of users, at a guess). Is that the only reason that banks and other sensitive companies do not offer some kind of encryption for email notifications? For example, my bank sends me an email every month: > > **Subject: Your monthly statement is ready** > > > Dear Alex, > > > Your monthly statement is ready. Please sign in to the bank's website to download it. > > > For security *(i.e. to prove this really came from us)*, your postcode is S12 TYX. > > > While admittedly it's not a huge deal, I'd rather Google - or any mail server - didn't ever receive my postcode in plaintext. This also applies to, for example, receipts from PayPal. Technically, I would think it fairly trivial for the bank to encrypt this notification using my public key. Is there anything I've overlooked in coming to that conclusion?
2012/05/05
[ "https://security.stackexchange.com/questions/14641", "https://security.stackexchange.com", "https://security.stackexchange.com/users/3119/" ]
Key management is the primary challenge and barrier for peer-to-peer like implementation of encryption (be it email or otherwise). There are companies (like Zixcorp) that simplify data protection for larger organizations. Zixcorp is more commonly used in healthcare than banking. Other vendors such as Messagelabs, Websense, and other email security gateway vendors offer similar functionality to Zix where the user receives notification of a message and clicks a link to retrieve the message. This eliminates key management by forcing the user to retrieve the message through a secured channel (still subject to attack and forgery, but such a method is still in use). Fundamentally, organizations that need to send sensitive information have several options based on their philosophy on customer communication. Forcing users to connect to their servers to retrieve notifications simplifies the communication stream by reducing the "hops" involved as well as minimizing liability by centralizing data storage. For example, if a bank uses Zix, GPG keys, or some other method via a 3rd party to send out communication, there are overhead costs tied to the risk of the 3rd party. In the case of GPG keys, what if the bank screws up and sends your info to someone else using the recipient's GPG key? Such an event is easily possible - I still receive payment and account notifications from a big auto financing company after I sold my car to someone else. By centralizing communication, it's much easier for a bank to protect the data and reduce liability.
I think it's the lack of widespread usage. I can't think of any technical reason that your bank or any other party couldn't send automated, encrypted emails to you; rather, I think it's that so few people use any form of encrypted email (and therefore most of the public don't have public keys) that from their point of view it's not worth considering. Next question is obviously 'How can we provide a secure email setup for the masses, without having to teach them the (apparently too complicated) basics of GPG?'
14,641
I understand that using encryption for email isn't particularly mainstream (a fraction of a percentage of users, at a guess). Is that the only reason that banks and other sensitive companies do not offer some kind of encryption for email notifications? For example, my bank sends me an email every month: > > **Subject: Your monthly statement is ready** > > > Dear Alex, > > > Your monthly statement is ready. Please sign in to the bank's website to download it. > > > For security *(i.e. to prove this really came from us)*, your postcode is S12 TYX. > > > While admittedly it's not a huge deal, I'd rather Google - or any mail server - didn't ever receive my postcode in plaintext. This also applies to, for example, receipts from PayPal. Technically, I would think it fairly trivial for the bank to encrypt this notification using my public key. Is there anything I've overlooked in coming to that conclusion?
2012/05/05
[ "https://security.stackexchange.com/questions/14641", "https://security.stackexchange.com", "https://security.stackexchange.com/users/3119/" ]
I think it's the lack of widespread usage. I can't think of any technical reason that your bank or any other party couldn't send automated, encrypted emails to you; rather, I think it's that so few people use any form of encrypted email (and therefore most of the public don't have public keys) that from their point of view it's not worth considering. Next question is obviously 'How can we provide a secure email setup for the masses, without having to teach them the (apparently too complicated) basics of GPG?'
(I think you are really asking about signing the email, not encrypting it. Signing is the relevant cryptographic primitive, since you want to verify the origin of the email.) **Overview.** It is far from trivial for the bank to send all of its customers signed email. There are three major challenges: * **Deployment.** Many users' email clients don't support signature verification, so this would have no value for them. * **Key management.** How do users get their bank's public key, in a way that isn't subject to spoofing? These issues are not trivial. * **Usability.** Finally, the most serious issue: signed email is not usable by average users. It is horrifically confusing for end users. Apart from the fact that any security mechanism that isn't usable won't have much benefit to security, just think of all the help desk calls the bank is going to get about this. Each help desk call probably costs the bank $20-40, so that'd be a serious cost. Ouch. **Details.** For more on this topic, I'd like to point you to some scientific papers in the research literature: * [Views, Reactions and Impact of Digitally-Signed Mail in e-Commerce](http://www.simson.net/ref/2004/fc2005_smime_submitted.pdf). Simson L. Garfinkel, Jeffrey I. Schiller, Erik Nordlander, David Margrave, and Robert C. Miller. FC 2005. * [How to Make Secure Email Easier To Use](http://www.simson.net/ref/2004/chi2005_smime_submitted.pdf). Simson L. Garfinkel, David Margrave, Jeffrey I. Schiller, Erik Nordlander, Robert C. Miller. CHI 2005. * [Johnny 2: A User Test of Key Continuity Management with S/MIME and Outlook Express](http://simson.net/clips/academic/2005.SOUPS.johnny2.pdf). Simson L. Garfinkel, Robert C. Miller. SOUPS 2005. For instance, the last work shows that something like 20-50% of users fail to use signed email securely, and thus could be fooled by various attacks.
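In practice a bank would use an asymmetric signature scheme (S/MIME or PGP, as the answer notes) so that anyone holding the public key can verify the mail. As a self-contained illustration of the underlying idea (attach a tag the receiver can recompute, and compare tags in constant time), here is the *symmetric* stdlib analogue using HMAC; the shared key is purely illustrative and is not how bank-to-customer email would actually be keyed:

```python
import hmac
import hashlib

SECRET = b"demo-shared-key"  # illustrative only; real signed email uses asymmetric keys

def tag(message: bytes) -> str:
    """Authentication tag that only a holder of SECRET can produce."""
    return hmac.new(SECRET, message, hashlib.sha256).hexdigest()

def verify(message: bytes, received_tag: str) -> bool:
    """Recompute the tag; compare_digest avoids timing side channels."""
    return hmac.compare_digest(tag(message), received_tag)
```

Here `verify(msg, tag(msg))` is `True`, while any tampering with the message body makes verification fail, which is exactly the "did this really come from my bank?" property the postcode trick tries to approximate.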
14,641
I understand that using encryption for email isn't particularly mainstream (a fraction of a percentage of users, at a guess). Is that the only reason that banks and other sensitive companies do not offer some kind of encryption for email notifications? For example, my bank sends me an email every month: > > **Subject: Your monthly statement is ready** > > > Dear Alex, > > > Your monthly statement is ready. Please sign in to the bank's website to download it. > > > For security *(i.e. to prove this really came from us)*, your postcode is S12 TYX. > > > While admittedly it's not a huge deal, I'd rather Google - or any mail server - didn't ever receive my postcode in plaintext. This also applies to, for example, receipts from PayPal. Technically, I would think it fairly trivial for the bank to encrypt this notification using my public key. Is there anything I've overlooked in coming to that conclusion?
2012/05/05
[ "https://security.stackexchange.com/questions/14641", "https://security.stackexchange.com", "https://security.stackexchange.com/users/3119/" ]
Key management is the primary challenge and barrier for peer-to-peer like implementation of encryption (be it email or otherwise). There are companies (like Zixcorp) that simplify data protection for larger organizations. Zixcorp is more commonly used in healthcare than banking. Other vendors such as Messagelabs, Websense, and other email security gateway vendors offer similar functionality to Zix where the user receives notification of a message and clicks a link to retrieve the message. This eliminates key management by forcing the user to retrieve the message through a secured channel (still subject to attack and forgery, but such a method is still in use). Fundamentally, organizations that need to send sensitive information have several options based on their philosophy on customer communication. Forcing users to connect to their servers to retrieve notifications simplifies the communication stream by reducing the "hops" involved as well as minimizing liability by centralizing data storage. For example, if a bank uses Zix, GPG keys, or some other method via a 3rd party to send out communication, there are overhead costs tied to the risk of the 3rd party. In the case of GPG keys, what if the bank screws up and sends your info to someone else using the recipient's GPG key? Such an event is easily possible - I still receive payment and account notifications from a big auto financing company after I sold my car to someone else. By centralizing communication, it's much easier for a bank to protect the data and reduce liability.
(I think you are really asking about signing the email, not encrypting it. Signing is the relevant cryptographic primitive, since you want to verify the origin of the email.) **Overview.** It is far from trivial for the bank to send all of its customers signed email. There are three major challenges: * **Deployment.** Many users' email clients don't support signature verification, so this would have no value for them. * **Key management.** How do users get their bank's public key, in a way that isn't subject to spoofing? These issues are not trivial. * **Usability.** Finally, the most serious issue: signed email is not usable by average users. It is horrifically confusing for end users. Apart from the fact that any security mechanism that isn't usable won't have much benefit to security, just think of all the help desk calls the bank is going to get about this. Each help desk call probably costs the bank $20-40, so that'd be a serious cost. Ouch. **Details.** For more on this topic, I'd like to point you to some scientific papers in the research literature: * [Views, Reactions and Impact of Digitally-Signed Mail in e-Commerce](http://www.simson.net/ref/2004/fc2005_smime_submitted.pdf). Simson L. Garfinkel, Jeffrey I. Schiller, Erik Nordlander, David Margrave, and Robert C. Miller. FC 2005. * [How to Make Secure Email Easier To Use](http://www.simson.net/ref/2004/chi2005_smime_submitted.pdf). Simson L. Garfinkel, David Margrave, Jeffrey I. Schiller, Erik Nordlander, Robert C. Miller. CHI 2005. * [Johnny 2: A User Test of Key Continuity Management with S/MIME and Outlook Express](http://simson.net/clips/academic/2005.SOUPS.johnny2.pdf). Simson L. Garfinkel, Robert C. Miller. SOUPS 2005. For instance, the last work shows that something like 20-50% of users fail to use signed email securely, and thus could be fooled by various attacks.
5,005,753
What is the time complexity of the Newton-Raphson square method? * [Wikipedia: Newton's method](http://en.wikipedia.org/wiki/Newton%27s_method)
2011/02/15
[ "https://Stackoverflow.com/questions/5005753", "https://Stackoverflow.com", "https://Stackoverflow.com/users/435645/" ]
From <http://en.citizendium.org/wiki/Newton%27s_method#Computational_complexity>: > > Using Newton's method as described > above, the time complexity of > calculating a root of a function f(x) > with n-digit precision, provided that > a good initial approximation is known, > is O((\log n) F(n)) where F(n) is the > cost of calculating f(x)/f'(x) with > n-digit precision. > > > However, depending on your precision requirements, you can do better: > > If f(x) can be evaluated with variable > precision, the algorithm can be > improved. Because of the > "self-correcting" nature of Newton's > method, meaning that it is unaffected > by small perturbations once it has > reached the stage of quadratic > convergence, it is only necessary to > use m-digit precision at a step where > the approximation has m-digit > accuracy. Hence, the first iteration > can be performed with a precision > twice as high as the accuracy of x\_0, > the second iteration with a precision > four times as high, and so on. If the > precision levels are chosen suitably, > only the final iteration requires > f(x)/f'(x) to be evaluated at full > n-digit precision. Provided that F(n) > grows superlinearly, which is the case > in practice, the cost of finding a > root is therefore only O(F(n)), with a > constant factor close to unity. > > >
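The quadratic convergence the quote relies on is easy to see in the classic square-root special case, f(x) = x^2 - a, where the Newton step simplifies to averaging x with a/x. A rough sketch at fixed double precision (not the variable-precision scheme described above):

```python
def newton_sqrt(a: float, n_iter: int = 6) -> float:
    """Newton's method for f(x) = x^2 - a: x_{k+1} = (x_k + a / x_k) / 2.
    The number of correct digits roughly doubles each iteration."""
    x = max(a, 1.0)  # any positive initial guess converges for a > 0
    for _ in range(n_iter):
        x = 0.5 * (x + a / x)
    return x
```

For example, `newton_sqrt(2.0)` reaches full double precision in about five iterations from x_0 = 2, consistent with the O(log n) iteration count for n-digit precision.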
This [article](http://en.citizendium.org/wiki/Newton%27s_method#Computational_complexity) gives a relevant treatment of how to analyse the method's complexity.
171,684
I have decided to switch from Windows to Linux. Should I use Ubuntu or Kubuntu? Or are there other handy alternatives? I'm a LAMP-based developer. Any help would be appreciated.
2010/08/04
[ "https://superuser.com/questions/171684", "https://superuser.com", "https://superuser.com/users/-1/" ]
I enjoyed reading this: [Ubuntu VS Kubuntu](http://www.latest-technews.com/kubuntu-vs-ubuntu-or-ubuntu-kubuntu/) In the end I think the beauty of Linux is that *you don't have to pay* for it. So, why not try both? There are **Live CD** versions of both systems available for download!
I think both have a live CD; try them out and figure out which one you like better. There are tons and tons of choices, and which one works for you will be 100% up to you. Basically you are asking about GNOME vs KDE, which is basically a religious war :)
171,684
I have decided to switch from Windows to Linux. Should I use Ubuntu or Kubuntu? Or are there other handy alternatives? I'm a LAMP-based developer. Any help would be appreciated.
2010/08/04
[ "https://superuser.com/questions/171684", "https://superuser.com", "https://superuser.com/users/-1/" ]
The two use different desktop environments: Ubuntu ships GNOME, and Kubuntu ships KDE. <http://www.psychocats.net/ubuntu/kdegnome>
I think both have a live CD; try them out and figure out which one you like better. There are tons and tons of choices, and which one works for you will be 100% up to you. Basically you are asking about GNOME vs KDE, which is basically a religious war :)
171,684
I have decided to switch from Windows to Linux. Should I use Ubuntu or Kubuntu? Or are there other handy alternatives? I'm a LAMP-based developer. Any help would be appreciated.
2010/08/04
[ "https://superuser.com/questions/171684", "https://superuser.com", "https://superuser.com/users/-1/" ]
It all boils down to personal preference. Since this is Linux after all, I'd suggest downloading and trying both out and sticking with whichever you're more comfortable with. It's the same underlying system but with a different "wrapper." My personal preference is Gnome or [XFCE](http://www.xfce.org/about/). XFCE can be added to any version of Ubuntu or can be found in Xubuntu.
I think both have a live CD; try them out and figure out which one you like better. There are tons and tons of choices, and which one works for you will be 100% up to you. Basically you are asking about GNOME vs KDE, which is basically a religious war :)
3,547
There are so many reviews spread around the web about the Zoom H4N: some of them praise this recorder, others mention bad preamps, hissy mics, etc... Who has had experience with this device? Would you recommend it? How are the internal mics? I am thinking about going out into the woods to record ambient sounds. I also like the multi-track feature and would like to use external mics, too.
2010/09/14
[ "https://sound.stackexchange.com/questions/3547", "https://sound.stackexchange.com", "https://sound.stackexchange.com/users/258/" ]
I'm not a huge fan of the built-in mics on any of the smaller recorders, but in a pinch they can do the job. My small, discreet rig is the SONY PCM-M10 with a Sound Devices MP2 mixer as a front end. The entire package fits into a tiny hip pack.
I bought the recorder and only have one major grievance with it: each mode is limited to a different bit/sample rate, as opposed to them all being able to record up to 24-bit 96kHz broadcast WAV. It goes like this: Stereo: max 24-bit 96kHz WAV; 4 Channel: max 24-bit 48kHz WAV; multitrack: 16-bit 44.1kHz WAV. As I bought it to get into field recording I was slightly disappointed with the multitrack option being 16/44.1, as I had hoped this would have allowed me to record in mono using my NTG-2 at 24/96. HOWEVER I improvised a workaround: if you want to record 1 or 2 mono tracks using the XLR inputs, record in 'Stereo' mode with inputs 1/2 selected to get your 24/96 and then separate the stereo file into left and right tracks later (I use Snapper to quickly do this), which will produce your two mono tracks, e.g. input 1 = left track, 2 = right. Not ideal, but good for those of us with lower budgets or who are just getting started. Also, the software updates now allow for individual adjustment of the record levels of the mics (e.g. you can set mic 1 at rec. lvl 70 and mic 2 at rec. lvl 50) as opposed to this being fixed as the same for both XLR inputs, which it previously was, making my previous workaround even more viable. When using the inputs, press the rec. lvl button on the side and then input 1 or 2 to select which input level you wish to change before going back to the rec. lvl button on the side to then adjust. Here's the link for the updates: <http://www.zoom.co.jp/downloads/h4n/software/> From the other reviews it would also appear that I ran into the 1/4-inch -10dB problem when trying to use a hydrophone with it. I found the levels very quiet and in some cases it was necessary to turn the rec. level up to 100 (when recording stiller water, generally). Despite all that, I do find it to be a generally good recorder and all I'd say is just compare it to what you will be using it for before diving in. All in all though, for what it is, thumbs up!
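The "record stereo, split later" workaround can also be scripted instead of using a tool like Snapper. A minimal sketch for 16-bit stereo WAVs using only the Python standard library (file paths are placeholders; frames interleave one 2-byte left sample, then one right sample):

```python
import wave

def split_stereo(src_path, left_path, right_path):
    """De-interleave a 16-bit stereo WAV into two mono WAVs
    (e.g. XLR input 1 -> left file, input 2 -> right file)."""
    with wave.open(src_path, "rb") as src:
        assert src.getnchannels() == 2 and src.getsampwidth() == 2
        rate = src.getframerate()
        frames = src.readframes(src.getnframes())
    left, right = bytearray(), bytearray()
    for i in range(0, len(frames), 4):  # each frame = 2 bytes L + 2 bytes R
        left += frames[i:i + 2]
        right += frames[i + 2:i + 4]
    for path, data in ((left_path, left), (right_path, right)):
        with wave.open(path, "wb") as dst:
            dst.setnchannels(1)
            dst.setsampwidth(2)
            dst.setframerate(rate)
            dst.writeframes(bytes(data))
```

The split preserves the original sample rate. Note this sketch handles 16-bit samples only; 24-bit captures (as in the 24/96 workaround) use 3-byte samples, so the stride and slices would need adjusting.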
3,547
There are so many reviews spread around the web about the Zoom H4N: some of them praise this recorder, others mention bad preamps, hissy mics, etc... Who has had experience with this device? Would you recommend it? How are the internal mics? I am thinking about going out into the woods to record ambient sounds. I also like the multi-track feature and would like to use external mics, too.
2010/09/14
[ "https://sound.stackexchange.com/questions/3547", "https://sound.stackexchange.com", "https://sound.stackexchange.com/users/258/" ]
I bought the recorder and only have one major grievance with it: each mode is limited to a different bit/sample rate, as opposed to them all being able to record up to 24-bit 96kHz broadcast WAV. It goes like this: Stereo: max 24-bit 96kHz WAV; 4 Channel: max 24-bit 48kHz WAV; multitrack: 16-bit 44.1kHz WAV. As I bought it to get into field recording I was slightly disappointed with the multitrack option being 16/44.1, as I had hoped this would have allowed me to record in mono using my NTG-2 at 24/96. HOWEVER I improvised a workaround: if you want to record 1 or 2 mono tracks using the XLR inputs, record in 'Stereo' mode with inputs 1/2 selected to get your 24/96 and then separate the stereo file into left and right tracks later (I use Snapper to quickly do this), which will produce your two mono tracks, e.g. input 1 = left track, 2 = right. Not ideal, but good for those of us with lower budgets or who are just getting started. Also, the software updates now allow for individual adjustment of the record levels of the mics (e.g. you can set mic 1 at rec. lvl 70 and mic 2 at rec. lvl 50) as opposed to this being fixed as the same for both XLR inputs, which it previously was, making my previous workaround even more viable. When using the inputs, press the rec. lvl button on the side and then input 1 or 2 to select which input level you wish to change before going back to the rec. lvl button on the side to then adjust. Here's the link for the updates: <http://www.zoom.co.jp/downloads/h4n/software/> From the other reviews it would also appear that I ran into the 1/4-inch -10dB problem when trying to use a hydrophone with it. I found the levels very quiet and in some cases it was necessary to turn the rec. level up to 100 (when recording stiller water, generally). Despite all that, I do find it to be a generally good recorder and all I'd say is just compare it to what you will be using it for before diving in. All in all though, for what it is, thumbs up!
I did some comparisons (Zoom H1, H2, H4n, and Sony D-50) and really the Sony PCM D-50 is the best you can get from a handheld recorder. Just take a look at these samples: <http://www.wingfieldaudio.com/portable-recorder-sound-samples.html>
3,547
There are so many reviews spread around the web about the Zoom H4N: some of them praise this recorder, others mention bad preamps, hissy mics, etc... Who has had experience with this device? Would you recommend it? How are the internal mics? I am thinking about going out into the woods to record ambient sounds. I also like the multi-track feature and would like to use external mics, too.
2010/09/14
[ "https://sound.stackexchange.com/questions/3547", "https://sound.stackexchange.com", "https://sound.stackexchange.com/users/258/" ]
I'm not a huge fan of the built-in mics on any of the smaller recorders, but in a pinch they can do the job. My small, discreet rig is the SONY PCM-M10 with a Sound Devices MP2 mixer as a front end. The entire package fits into a tiny hip pack.
I did some comparisons (Zoom H1, H2, H4n, and Sony D-50) and really the Sony PCM D-50 is the best you can get from a handheld recorder. Just take a look at these samples: <http://www.wingfieldaudio.com/portable-recorder-sound-samples.html>
3,547
There are so many reviews spread around the web about the Zoom H4N: some of them praise this recorder, others mention bad preamps, hissy mics, etc... Who has had experience with this device? Would you recommend it? How are the internal mics? I am thinking about going out into the woods to record ambient sounds. I also like the multi-track feature and would like to use external mics, too.
2010/09/14
[ "https://sound.stackexchange.com/questions/3547", "https://sound.stackexchange.com", "https://sound.stackexchange.com/users/258/" ]
I love mine... Always in my briefcase, and often in my booth. I don't find the XLRs on my version to be noisy (not that I'm recording REALLY quiet sounds with it) - assuming you have your phantom power set correctly. The H4n has both 24 and true 48 volts phantom power, and I have found a lot of mics don't like the 24v very much, which is what the unit defaults to. Of course 48v eats batteries a LOT faster.
I spent a year with it, recording corporates, short films, etc., and I have to say that it got me through them, albeit clumsily. For field recording, I'm sure you can get some really nice results, but for stuff like shorts I had nowhere near enough control… but that's why you pay for something like the SD744, I guess. For recorders in its class I'm pretty confident in saying it beats them all, but it depends how much you need to push it.
3,547
There are so many reviews spread around the web about the Zoom H4N: some of them praise this recorder, others mention bad preamps, hissy mics, etc... Who has had experience with this device? Would you recommend it? How are the internal mics? I am thinking about going out into the woods to record ambient sounds. I also like the multi-track feature and would like to use external mics, too.
2010/09/14
[ "https://sound.stackexchange.com/questions/3547", "https://sound.stackexchange.com", "https://sound.stackexchange.com/users/258/" ]
I have that recorder as my *keep in the bag at all times recorder - just in case*. I gotta say I really do like it. The reason I bought it was for the multi-track recording and the M/S decoder. I thought I would use it as a travel recorder with my Sennheiser MKH418s (M/S), but it turns out that I only ever use the on-board mics. The reason being that if I'm gonna bring my other mic with the softie or windshield and all, then I might as well bring my Sound Devices. I used to only have a Fostex FR2 recorder, which was very big to travel with, so this made sense. I haven't yet tested the XLR inputs, but from what I hear they are pretty noisy. The on-board mics aren't very noisy at all. I think they sound pretty damn good, actually. I also like that you can choose 90 or 120 deg for your stereo image. If I had to do it again, though, I would buy the Sony [PCM D50](http://pro.sony.com/bbsc/ssr/product-PCMD50/), because from what I hear it is even better sounding, and considering I never plug in other mics, that recorder would be more suitable. If you wanna hear some recordings from the H4n, there are a few from my holiday that I posted on [{sound + design}](http://www.soundplusdesign.com/?p=3975). Otherwise [you can listen](http://www.soundplusdesign.com/index.php?s=h4n) to some older ones recorded on it. Hope this helps a little.
I'm really hoping to see a new product from Zoom that addresses the "hissing"; the upcoming IBC 2011 show would really be a swell time to announce it. The H4n really could use an update. That Sony D-50 and their PCM M10 indeed have excellent, low-noise recording and good user interfaces (but sadly do not double as a computer I/O, which is why the H4n interests me particularly).
3,547
There are so many reviews spread around the web about the Zoom H4N: some of them praise this recorder, others mention bad preamps, hissy mics, etc... Who has had experience with this device? Would you recommend it? How are the internal mics? I am thinking about going out into the woods to record ambient sounds. I also like the multi-track feature and would like to use external mics, too.
2010/09/14
[ "https://sound.stackexchange.com/questions/3547", "https://sound.stackexchange.com", "https://sound.stackexchange.com/users/258/" ]
I love my Zoom H4n, but be VERY AWARE that the 1/4-inch "line level" inputs that they advertise are actually -10dB high-impedance guitar-level inputs, not +4dB low-impedance professional line inputs. I got caught on that one when I tried to hook up my Sound Devices MM1 external preamp to clean up the mic inputs and there was a severe level/impedance mismatch. Was quite pissed when I discovered that, because they are not too forthcoming with that fact in their advertising :(
Interesting questions - I'm currently considering selling my Fostex FR2 LE and buying an H4N; the Fostex just isn't convenient enough for me, it's too bulky! I love it, but it spends most of its life sat on the shelf; I need something more handheld.
3,547
There are so many reviews spread around the web about the Zoom H4N: some of them praise this recorder, others mention bad preamps, hissy mics, etc... Who has had experience with this device? Would you recommend it? How are the internal mics? I am thinking about going out into the woods to record ambient sounds. I also like the multi-track feature and would like to use external mics, too.
2010/09/14
[ "https://sound.stackexchange.com/questions/3547", "https://sound.stackexchange.com", "https://sound.stackexchange.com/users/258/" ]
I love mine... always in my briefcase, and often in my booth. I don't find the XLRs on my version to be noisy (not that I'm recording REALLY quiet sounds with it) - assuming you have your phantom power set correctly. The H4n has both 24 and true 48 volts phantom power, and I have found a lot of mics don't like the 24v very much, which is what the unit defaults to. Of course 48v eats batteries a LOT faster.
I bought the recorder and only have one major grievance with it: each mode is limited to a different bit/sample rate, as opposed to all of them being able to record up to 24-bit 96kHz broadcast wav. It goes like this: Stereo: max 24-bit 96kHz wav; 4-Channel: max 24-bit 48kHz wav; Multitrack: 16-bit 44.1kHz wav. As I bought it to get into field recording, I was slightly disappointed with the multitrack option being 16/44.1, as I had hoped this would have allowed me to record in mono using my NTG-2 at 24/96. HOWEVER, I improvised a workaround: if you want to record 1 or 2 mono tracks using the XLR inputs, record in 'Stereo' mode with inputs 1/2 selected to get your 24/96, and then separate the stereo file into left and right tracks later (I use Snapper to quickly do this), which will produce your two mono tracks, e.g. input 1 = left track, 2 = right. Not ideal, but good for those of us with lower budgets or who are just getting started. Also, the software updates now allow for individual adjustment of the record levels of the mics (e.g. you can set mic 1 at rec. lvl 70 and mic 2 at rec. lvl 50) as opposed to this being fixed as the same for both XLR inputs, which it previously was, making my workaround even more viable. When using the inputs, press the rec. lvl button on the side and then input 1 or 2 to select which input level you wish to change, before going back to the rec. lvl button on the side to adjust. Here's the link for the updates: <http://www.zoom.co.jp/downloads/h4n/software/> From the other reviews it would also appear that I ran into the 1/4-inch -10 dB problem when trying to use a hydrophone with it. I found the levels very quiet, and in some cases it was necessary to turn the rec. level up to 100 (generally when recording stiller water). Despite all that, I do find it to be a generally good recorder, and all I'd say is just compare it to what you will be using it for before diving in. All in all though, for what it is, thumbs up!
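The stereo-to-dual-mono workaround described above is also easy to script. Here's a rough sketch using only Python's standard wave module (the function name and file paths are mine; this assumes an uncompressed PCM stereo file, which is what the H4n records):

```python
import wave

def split_stereo(path, left_out, right_out):
    """Split a stereo PCM WAV file into two mono WAV files (left and right)."""
    with wave.open(path, "rb") as src:
        assert src.getnchannels() == 2, "expected a stereo file"
        width = src.getsampwidth()  # bytes per sample (3 for 24-bit PCM)
        frames = src.readframes(src.getnframes())
        params = (1, width, src.getframerate(), 0, "NONE", "not compressed")
    # Frames are interleaved: [L sample][R sample][L][R]...
    left = b"".join(frames[i:i + width] for i in range(0, len(frames), 2 * width))
    right = b"".join(frames[i + width:i + 2 * width] for i in range(0, len(frames), 2 * width))
    for out, data in ((left_out, left), (right_out, right)):
        with wave.open(out, "wb") as dst:
            dst.setparams(params)
            dst.writeframes(data)
```

Since the wave module never interprets sample values, this works for 16-bit and 24-bit files alike; input 1 ends up in the left file and input 2 in the right, as in the workaround above.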
3,547
There are so many reviews spread around the web about the Zoom H4N; some of them praise this recorder, others say things about bad preamps, hissy mics, etc. Who has had experience with this device? Would you recommend it? How are the internal mics? I am thinking about going out into the woods to record ambient sounds. I also like the multi-track feature and would like to use external mics, too.
2010/09/14
[ "https://sound.stackexchange.com/questions/3547", "https://sound.stackexchange.com", "https://sound.stackexchange.com/users/258/" ]
I love my Zoom H4n, but be VERY AWARE that the 1/4-inch "line level" inputs that they advertise are actually -10 dB high-impedance guitar-level inputs, not +4 dB low-impedance professional line inputs. I got caught on that one when I tried to hook up my Sound Devices MM1 external preamp to clean up the mic inputs, and there was a severe level/impedance mismatch. I was quite pissed when I discovered that, because they are not too forthcoming with that fact in their advertising :(
I love mine... always in my briefcase, and often in my booth. I don't find the XLRs on my version to be noisy (not that I'm recording REALLY quiet sounds with it) - assuming you have your phantom power set correctly. The H4n has both 24 and true 48 volts phantom power, and I have found a lot of mics don't like the 24v very much, which is what the unit defaults to. Of course 48v eats batteries a LOT faster.
87,304
I'm reading *The Count of Monte Cristo* at the moment and found this sentence: > > One can't expect the son of a Girondin to be entirely free from the spice of the old leaven. > > >
2016/04/12
[ "https://ell.stackexchange.com/questions/87304", "https://ell.stackexchange.com", "https://ell.stackexchange.com/users/31660/" ]
It's a very clumsy translation from the French. The original is "One cannot be the son of a Girondin and not preserve a *goût de terroir*". * A *Girondin* is literally an inhabitant of the Gironde, a region in western France; but during the Revolution (a generation before the time at which this episode is set) the Gironde was also a name for the most extreme republican party, many of whose members hailed from that region. * The phrase *goût de* may be understood as either "a taste/hint of" or "a taste/appetite for". * *Terroir* means "land, soil, earth"; but it also evokes *la terreur*, the Terror, the revolutionary excesses of the Girondins. So in saying that Villefort's attitude is characteristic of the country his family comes from, the marquise also insinuates that he shares the attitude of the sponsors of the Terror. In fairness, it's almost impossible to reproduce such a cunning use of one language in another. This translator elects to reproduce the indirection, the insinuating character of the line, and bypasses the specific references. But this is so indirect that the real venom is entirely lost.
Without knowing anything about the context, it seems the sentence means something about the son (his personality, most likely) having been influenced by his family. It's like an old-fashioned way of saying "The apple doesn't fall far from the tree."
243,255
I'm using the Feeds module and entity references heavily on my site to add, update, and link content. Unfortunately, the data I'm using has a number of records that are named the same, which causes issues when matching Entity References through GUID. Is there a way to import entity reference fields through Feeds but using title and content-type instead?
2017/08/06
[ "https://drupal.stackexchange.com/questions/243255", "https://drupal.stackexchange.com", "https://drupal.stackexchange.com/users/77872/" ]
I believe I've found an answer to this problem. Using the Feeds Tamper module, I can use the Entity Fields Query Finder plugin to look up against any other field belonging to a particular content type. In my case I'm saving the lookup key I need in a field, and then matching against that field & content type.
> > Title **and** content-type > > > By default, there is no way to do that. So you have 2 choices: A) Go by the referenced nid instead. B) Create a custom parser, see the [The developer's guide to Feeds](https://www.drupal.org/node/622700).
17,817,631
I'm distributing a chrome app internally at our company and due to spec changes, it will need to include 500MB of files. Will Chrome allow that to be installed? Is there a size restriction? (This is an App, not an Extension)
2013/07/23
[ "https://Stackoverflow.com/questions/17817631", "https://Stackoverflow.com", "https://Stackoverflow.com/users/857025/" ]
Yes, there is a limit of 2 GB as of the time this answer is being posted. Here is a link for reference: <https://developer.chrome.com/webstore/faq#apps>
I think not. Google Chromium is not just trying to be a browser or OS for specific devices; it's trying to be a platform. See this link from GigaOM: <http://gigaom.com/2013/05/22/how-google-plans-to-rule-the-computing-world-through-chrome/> From the article: "A good example of a native client app is a game I played on my Chromebook Pixel recently called Cracking Sands Racing. The app, a port of a game for iOS and Android, was a **533 MB download** to my Pixel and I played it outside of the browser." Now mind you that this is a packaged app and not an extension per se (see the difference here: <http://developer.chrome.com/apps/about_apps.html>). But I would think that you should be able to create a bigger extension without much problem. As others have said, a sure way would be to create a dummy huge file and package it in your extension. Just my 2 cents, even though it does not answer your question fully.
63,105
This is my first question here on Information Security SE. Is there a recommendation to help tell the scenarios where authentication should precede authorization from the ones where authorization comes first? I experienced both situations at different workplaces (the situation was very similar: switching to a system user with a certain set of privileges on a Unix command line). When does authorization need to come first to spare the authentication? When should it authenticate first, so as not to tell a potential intruder that the user impersonated would not be entitled to the current operation? Are there situation-specific generic guidelines or rules for this? -- OK, let me come up with a specific situation. Logged in with my own employee user ID, SUing to another user. I mistyped the system user I was about to switch to, but the other is also an existing user in our environment. Before asking for a password, I got rejected, saying I wasn't authorized to perform this operation. Given that I just finished an anti-social-engineering course today, I wondered whether it was a good idea to tell me I wasn't authorized (even this may be a useful piece of information for someone trying to impersonate me on the corporate network).
2014/07/14
[ "https://security.stackexchange.com/questions/63105", "https://security.stackexchange.com", "https://security.stackexchange.com/users/51924/" ]
How can you know who you authorise if you haven't authenticated? Authentication always comes first, except when everyone is authorised or no one is authorised. **Edit:** seems like you have two questions now :) **1)** You were refused most likely because there is an access control policy that prevents your own original UID from performing an operation on the UID of the user you mistyped. It's your own UID that authenticated you (called an ambient authority) because the system that performs authentication checks trusts your system to provide an accurate UID. Your UID was not authorised, so you were refused access. As a side note, some auth server protocols like NIS failed to account for the fact that a user can locally escalate privilege and their local UID cannot be used as an authentication token. I honestly don't remember how a NIS+, LDAP or Kerberos server would check your identity before checking if you're allowed to use one of their resources (such as connecting to another account). **2)** [Some people](http://dl.acm.org/citation.cfm?id=2535822) believe that it's a good idea to obfuscate whether an authentication attempt worked by serving a honeypot session to the poor user that misspelt an ID or password. These people are wrong. If the system hadn't told you you were not authorised, you would not have noticed that you're not logged in to the account you expected, and you would not have been able to explain why the data you expected to find isn't there, or why the data you added during that session was not actually saved to that user's home directory. These interaction breakdowns are anything but straightforward to recover from, and the security benefit of obfuscation is very debatable.
I agree with both answers given thus far by [Thomas Pornin](https://security.stackexchange.com/users/655/thomas-pornin) and [Steve DL](https://security.stackexchange.com/users/49936/steve-dl). Authorization always comes after authentication, since authorization is the act of looking at a given user's claims and determining whether the user can do what they are trying to do based on those claims and based on the authorization policies in place. I would like to broaden the definition, though, and state that authentication is not necessarily about your identity directly (who you are) but rather about proving the authenticity of a claim you make. A good example of that is when I go buy alcohol. The clerk couldn't care less who I am, but they do care about my age (>21 in the US, >20 in Sweden, >18 in most places). So, when I show my driver's license / ID card / passport, I'm proving my date of birth. The stamps, holograms, and other mechanisms on the card prove the authenticity of the claim (my date of birth). But the clerk isn't interested in my identity in itself.
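The ordering both answers argue for — authenticate first, then authorize — is also what the familiar HTTP 401/403 split encodes. A minimal sketch of that flow (the user table, ACL, and function names are made-up examples, not any real system's API):

```python
USERS = {"alice": "s3cret"}       # hypothetical credential store
ACL = {"alice": {"read"}}         # hypothetical per-user permissions

def authenticate(creds):
    """Step 1: prove who the caller is; return the user name or None."""
    user, password = creds
    return user if USERS.get(user) == password else None

def handle_request(creds, action, acl=ACL):
    user = authenticate(creds)
    if user is None:
        return "401 Unauthorized"   # authentication failed: identity unproven
    if action not in acl.get(user, set()):
        return "403 Forbidden"      # authenticated, but not authorized for this action
    return "200 OK"
```

The order matters: the server only consults the ACL after it knows whose permissions to look up, exactly as the accepted answer states.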
2,521,302
I have made a Grails application. Now I have to make it compliant with the REST protocol. How do I do this? Can I get any material with examples of how to carry out this process?
2010/03/26
[ "https://Stackoverflow.com/questions/2521302", "https://Stackoverflow.com", "https://Stackoverflow.com/users/302301/" ]
See section 13.1 of the manual at the following link. <http://www.grails.org/doc/1.0.x/guide/13.%20Web%20Services.html>
There is a great Grails plugin called jaxrs: www.grails.org/plugin/jaxrs. To install it on your Grails app, just run `grails install-plugin jaxrs`. Get more info here: <http://code.google.com/p/grails-jaxrs/>
196,652
I switched to a dark theme (Dolorean Dark). I like it and I think it is better for the eyes. However, I have problems on some web pages. Most pages are intended to be shown on a white background. On a dark theme, form controls have a black background and mostly the font is also black, so the content of forms is invisible. I would like to use a white theme on web pages in Firefox.
2012/10/05
[ "https://askubuntu.com/questions/196652", "https://askubuntu.com", "https://askubuntu.com/users/92867/" ]
Changing userContent.css solves the problem; more information is [here](http://ubuntuforums.org/showthread.php?t=1715679). Also, the Dolorean theme comes with a Firefox tweak; more information is [here](http://gnome-look.org/content/show.php/DeLorean-Dark?content=153866). Just installing it is enough.
You cannot do anything with it, except changing to a white theme! EDIT: This is wrong. Please check the comment below given by the OP for the correct answer.
11,148
I am looking for a good antonym for reducing/simplifying equivalent fractions: 'reduce' and 'simplify' both make sense to me when dividing, but I'm struggling to name what it is we do when we multiply a fraction by 2/2, or 7/7, or any of their Unity Sisters! Ideas that came to mind: increasing? complexifying?
2016/06/20
[ "https://matheducators.stackexchange.com/questions/11148", "https://matheducators.stackexchange.com", "https://matheducators.stackexchange.com/users/6885/" ]
I thought the standard terminology was ***expanding*** fractions, at least it is in my language. Some support for this term found by googling: [Expanding and Reducing Fractions](https://www.wyzant.com/resources/lessons/math/elementary_math/fractions/expanding_and_reducing_fractions) [Expanding Fractions](http://www.personal-math-online-help.com/expanding-fractions.html)
How about: rescaling the fraction? This could refer to going in either direction.
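Whichever term wins out, the operation itself is just multiplication by k/k, and its inverse divides out the greatest common divisor. A tiny sketch of the pair (the function names here are mine, chosen to mirror the terminology in the question and answers):

```python
from math import gcd

def expand(num, den, k):
    """'Expand' a fraction by k, i.e. multiply by k/k; the value is unchanged."""
    return num * k, den * k

def simplify(num, den):
    """'Reduce'/'simplify' a fraction to lowest terms by dividing out the gcd."""
    g = gcd(num, den)
    return num // g, den // g

assert expand(1, 2, 3) == (3, 6)    # 1/2 -> 3/6
assert simplify(3, 6) == (1, 2)     # 3/6 -> 1/2
```

The two functions are inverse operations on the numerator–denominator pair, which is why a clean antonym for "reduce" is worth having.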
1,012
Over on [Politics SE](https://politics.stackexchange.com/), we often have questions about law enforcement policies and crime statistics, especially when big events related to law enforcement occur (i.e, the last couple weeks in the US). The questions about policies are fully on-topic there, but questions purely about current or historical crime statistics are a bit iffy and it'd be great to point those questions towards a site where they would be a better fit. So, are questions regarding crime statistics on-topic here on Law? For example, [this question](https://politics.stackexchange.com/questions/54017/is-the-frequency-of-lethal-police-interactions-between-black-and-white-americans) over on Politics is what prompted me to ask this question here. It is essentially asking for crime statistics for different demographics, and while the data certainly has political implications the question is only focused on the statistics themselves. Would that sort of question be accepted here? I haven't looked at Law much before now, but judging by most of the existing questions this sort of thing likely wouldn't be on-topic, though it's not fully clear: * According to [the help page](https://law.stackexchange.com/help/on-topic) about what's on topic, Law seems to focus solely on questions regarding specific application of laws, how individuals interact with a legal system, references to decisions, and things along those lines, and most of the questions seem to stick to that. * I wasn't able to find anything conclusive on Law Meta. 
[This question](https://law.meta.stackexchange.com/questions/993/can-one-question-posted-on-this-law-site-be-simultaneously-posted-on-the-politic) implies that some politics-related questions could be on topic if the focus is on the legal aspects, and [this question](https://law.meta.stackexchange.com/questions/302/how-do-i-ask-a-general-law-question) states that general questions regarding law are on-topic, but the example on-topic questions are about applications of laws (does X count as homicide in Y jurisdiction) rather the effects of laws (how many people have been arrested for X type of homicide in Y jurisdiction). * However, I've found a few questions that are more focused on broad statistics or research: [this question](https://law.stackexchange.com/questions/30169/any-solid-research-on-error-rate-of-criminal-conviction-system-jury-vs-judges/30201#30201) asking for research comparing certain legal system aspects, [this question](https://law.stackexchange.com/questions/6620/where-to-find-statistics-on-incarceration-rates-for-a-certain-time-period) asking for historical incarceration statistics, and the [reference-request tag](https://law.stackexchange.com/questions/tagged/reference-request) which is "For requests to off-site resources, statistics, and other information.", so it seems like at least some statistics questions are on-topic.
2020/06/18
[ "https://law.meta.stackexchange.com/questions/1012", "https://law.meta.stackexchange.com", "https://law.meta.stackexchange.com/users/32077/" ]
Depends on the question ----------------------- Crime statistics on its own is not a good fit. However, if the question is about the effect of laws or legal systems and natural experiments caused by different jurisdictions or different epochs and the *focus* is on the legal aspects then it's a definite maybe. Other sites that might be relevant depending on the thrust of the question are: * [Politics](https://politics.stackexchange.com/) which is where you started. * [Cross Validated](https://stats.stackexchange.com/) if it's about the statistics themselves. * [Economics](https://economics.stackexchange.com/) if it's about incentives and people's response to them. * [History](https://history.stackexchange.com/) if it's about change over time or crime in different time periods. * [Skeptics](https://skeptics.stackexchange.com/) if it's about debunking (or bunking?) a particular claim about crime.
I don't think so. These are fundamentally not questions about the law or legal process. Asking for statistics that help illuminate a particular point of law or explain something about the legal process should be fine, the same as asking about cases that show it. Asking about statistics of what happened where is no more about the law than it is about history. People seeking such information are best directed to Open Data SE, though that site may have further constraints.
1,012
Over on [Politics SE](https://politics.stackexchange.com/), we often have questions about law enforcement policies and crime statistics, especially when big events related to law enforcement occur (i.e, the last couple weeks in the US). The questions about policies are fully on-topic there, but questions purely about current or historical crime statistics are a bit iffy and it'd be great to point those questions towards a site where they would be a better fit. So, are questions regarding crime statistics on-topic here on Law? For example, [this question](https://politics.stackexchange.com/questions/54017/is-the-frequency-of-lethal-police-interactions-between-black-and-white-americans) over on Politics is what prompted me to ask this question here. It is essentially asking for crime statistics for different demographics, and while the data certainly has political implications the question is only focused on the statistics themselves. Would that sort of question be accepted here? I haven't looked at Law much before now, but judging by most of the existing questions this sort of thing likely wouldn't be on-topic, though it's not fully clear: * According to [the help page](https://law.stackexchange.com/help/on-topic) about what's on topic, Law seems to focus solely on questions regarding specific application of laws, how individuals interact with a legal system, references to decisions, and things along those lines, and most of the questions seem to stick to that. * I wasn't able to find anything conclusive on Law Meta. 
[This question](https://law.meta.stackexchange.com/questions/993/can-one-question-posted-on-this-law-site-be-simultaneously-posted-on-the-politic) implies that some politics-related questions could be on topic if the focus is on the legal aspects, and [this question](https://law.meta.stackexchange.com/questions/302/how-do-i-ask-a-general-law-question) states that general questions regarding law are on-topic, but the example on-topic questions are about applications of laws (does X count as homicide in Y jurisdiction) rather the effects of laws (how many people have been arrested for X type of homicide in Y jurisdiction). * However, I've found a few questions that are more focused on broad statistics or research: [this question](https://law.stackexchange.com/questions/30169/any-solid-research-on-error-rate-of-criminal-conviction-system-jury-vs-judges/30201#30201) asking for research comparing certain legal system aspects, [this question](https://law.stackexchange.com/questions/6620/where-to-find-statistics-on-incarceration-rates-for-a-certain-time-period) asking for historical incarceration statistics, and the [reference-request tag](https://law.stackexchange.com/questions/tagged/reference-request) which is "For requests to off-site resources, statistics, and other information.", so it seems like at least some statistics questions are on-topic.
2020/06/18
[ "https://law.meta.stackexchange.com/questions/1012", "https://law.meta.stackexchange.com", "https://law.meta.stackexchange.com/users/32077/" ]
Depends on the question ----------------------- Crime statistics on its own is not a good fit. However, if the question is about the effect of laws or legal systems and natural experiments caused by different jurisdictions or different epochs and the *focus* is on the legal aspects then it's a definite maybe. Other sites that might be relevant depending on the thrust of the question are: * [Politics](https://politics.stackexchange.com/) which is where you started. * [Cross Validated](https://stats.stackexchange.com/) if it's about the statistics themselves. * [Economics](https://economics.stackexchange.com/) if it's about incentives and people's response to them. * [History](https://history.stackexchange.com/) if it's about change over time or crime in different time periods. * [Skeptics](https://skeptics.stackexchange.com/) if it's about debunking (or bunking?) a particular claim about crime.
No: Such questions are most likely off topic here. -------------------------------------------------- Law.SE tries to deal with *what the law is*. We generally reject questions about *why the law is what it is* (see [Can "\*Why\* a law exists" be on topic?](https://law.meta.stackexchange.com/q/904/10)), and I believe will likewise continue to reject questions about *what systemic effects or outcomes the law produces*, which is another way of approaching the same underlying question. The one good counter-example you point to is [Any solid research on error rate of criminal conviction system: jury vs judges?](https://law.stackexchange.com/q/30169/10) which slipped by because it is a question about legal systems themselves, and because it got a phenomenal answer.
120,172
Which is correct? > > Was the cheese taste delicious? > > > or > > Did the cheese taste delicious? > > > I know that we should use 'did' when we want to ask a question in the past simple form, but I'm not sure about this case. When we use "did" in this sentence, it sounds like a supposition about the cheese's taste, but I need to ask directly. How do I say it correctly?
2017/02/21
[ "https://ell.stackexchange.com/questions/120172", "https://ell.stackexchange.com", "https://ell.stackexchange.com/users/49962/" ]
TL;DR; You should use "did", because your main verb is preceded by no auxiliaries. There's a phenomenon in questions called 'subject–auxiliary inversion'. You already probably know what a subject is; it's "Max" in "Max had lunch in a restaurant." 'auxiliary' is a type of verb, but not the normal type. You're familiar with auxiliaries as well; one such example is "was" in "It was raining very heavily yesterday." Auxiliary verbs are verbs, but they somehow **assist** the other verbs in the sentence they appear. So, what does 'subject–auxiliary inversion' mean? It means that, to form a question, you simply need to swap the places of the subject and the auxiliary: > > **It** *was* raining very heavily yesterday. > > > becomes > > *Was* **it** raining very heavily yesterday? > > > So the question becomes "which auxiliary should I use?" To answer, you need to remind yourself of what the verb phrase in the declarative sentence is. If there's a "be" verb in the initial sentence, you simply swap the place of subject and the auxiliary, and add a question mark, to create an interrogative sentence. > > **Kim** *was* creating a warm and welcoming atmosphere for all the strangers. > > > becomes > > *Was* **Kim** creating a warm and welcoming atmosphere for all the strangers? > > > In cases where "be" is a main verb, you can apply the same process. > > Kim was happy. > > > becomes > > Was Kim happy? > > > If, however, there are no auxiliary or main "be"s present, you create an interrogative sentence (question) by using "did" as auxiliary. > > I tried to fly with metal wings. > > > becomes > > **Did** I *try* to fly with metal wings? > > > Notice how "try" is in the plain form, without any suffix or prefixes. Any changes are done to the auxiliary "did". In your sentence, you need to examine what the initial sentence was. Was it > > \*It was taste delicious. > > > or > > It tasted delicious. > > > ? You already know that if you use an auxiliary "was" before a main verb in a sentence, you need to use the main verb in the gerund form. "I was try" is ungrammatical while "I was trying" is grammatical, albeit incomplete. --- As the comment by Ilmari points out below, you may also encounter constructions such as these: > > I **did** convince him to buy us a new laptop. > > He **does** seem inclined to eat junk food. > > > So, what's the deal with these? They have an auxiliary, but it's not one of "be"'s forms. It's called an [emphatic 'do'](http://grammar.about.com/od/e/g/Emphatic-do.htm). As you may have guessed, just like the case of "was", you can simply swap (or inverse) the auxiliary and the subject to create an interrogative. > > **Did** *I* convince him to buy us a new laptop? > > **Does** *he* seem inclined to eat junk food? > > >
Because you're using "taste", you need to use "did". > > Did the cheese taste delicious? > > > However, if you omit "taste", you would use "was". I would argue that this form is much more common because, in general, "taste" is implied with "delicious". > > Was the cheese delicious? > > > The former would be more (or at least equally) common if you used just about any word other than "delicious": > > Did the cheese taste [salty/sweet/sour/good...]? > > > versus > > Was the cheese [salty/sweet/sour/good]? > > > Both of these are pretty equal in my mind, though the version with "taste" is a bit more specific.
3,980
I was looking into applications of Quantum Computing for machine learning and encountered the following pre-print from 2003. [Quantum Convolution and Correlation Algorithms are Physically Impossible](https://arxiv.org/abs/quant-ph/0309070). The article doesn't appear to have been published in any journals, but it has been cited a few dozen times. The article author makes the case that it is impossible to compute discrete convolution over quantum states. Intuitively this seems incorrect to me, since I know we can perform quantum matrix multiplication, and I know that discrete convolution can be framed simply as multiplication with a Toeplitz (or circulant) matrix. The crux of his argument seems to be that there is no realizable composition of unitary operators for the elementwise (Hadamard) product of two vectors. Where is my disconnect? Is there any reason we cannot in general construct a Toeplitz matrix for discrete convolution in a quantum computer? Or is the article simply incorrect? I have worked through the contradiction that the author presents in his proof of Lemma 14, and it seems to make sense to me.
2018/08/08
[ "https://quantumcomputing.stackexchange.com/questions/3980", "https://quantumcomputing.stackexchange.com", "https://quantumcomputing.stackexchange.com/users/4298/" ]
Speaking as a computer scientist without any physics background making contributions to quantum computing: yes, computer scientists without any physics background can make contributions to quantum computing. Though I think it was always that way; it has nothing to do with the field being "mature". If you understand the postulates of quantum mechanics (operations are unitary matrices, states are unit vectors, measurement is a projection), and know how to apply those in the context of a computation, you can create quantum algorithms. The fact that these concepts were originally derived from physics is historically interesting, but not really relevant when optimizing a quantum circuit. As a concrete example: quantum physics is very heavy on calculus, but quantum computation isn't. Quantum physics does become relevant if you are trying to design algorithms for simulating quantum systems. And some of the concepts you would learn in a quantum physics course should also appear in a quantum computation course. But overall [I agree with Scott Aaronson](https://www.scottaaronson.com/democritus/lec9.html): > > [My] way to teach quantum mechanics [...] starts directly from the conceptual core -- namely, a certain generalization of probability theory to allow minus signs. Once you know what the theory is actually about, you can then sprinkle in physics to taste [...] > > > [quantum mechanics is] not a physical theory in the same sense as electromagnetism or general relativity. [...] Basically, quantum mechanics is the operating system that other physical theories run on as application software [...] > > > [...] [quantum mechanics is] about information and probabilities and observables, and how they relate to each other. > > >
It has pretty much always been like that. You can study the book by Nielsen & Chuang without knowing about physics. There is the [introduction by Mermin](https://arxiv.org/abs/quant-ph/0207118) aimed at computer scientists. There are probably lots of other resources (I'm pretty sure e.g. that Aaronson's book -- based on a [CS lecture](https://www.scottaaronson.com/democritus/) -- is perfectly suited for people without a physics background.) Overall, the physics formalism needed to understand quantum information and computation is pretty low-key, as compared to other fields of (quantum) physics. (This doesn't mean though that studying the phenomena in quantum information and computation is low-key.)
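The circulant-matrix framing from the question is easy to check numerically. A small sketch with NumPy (the kernel here is an arbitrary example of mine): circular convolution by a kernel equals multiplication by that kernel's circulant matrix, whose eigenvalues are the kernel's DFT — and the matrix is unitary only when all of those eigenvalues have modulus 1, which is where the impossibility argument for a general convolution gate bites.

```python
import numpy as np

def circulant(c):
    """Circulant matrix with first column c: C[i, j] = c[(i - j) % n]."""
    n = len(c)
    return np.array([[c[(i - j) % n] for j in range(n)] for i in range(n)])

x = np.array([1.0, 2.0, 3.0, 4.0])
k = np.array([1.0, 0.5, 0.0, 0.0])   # arbitrary example kernel

via_matrix = circulant(k) @ x                              # Toeplitz/circulant view
via_fft = np.fft.ifft(np.fft.fft(k) * np.fft.fft(x)).real  # convolution-theorem view

assert np.allclose(via_matrix, via_fft)
```

For this k, the eigenvalues np.fft.fft(k) do not all have modulus 1, so circulant(k) is not unitary and could not be applied directly as a quantum gate, even though the linear-algebraic identity above holds classically.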
3,980
I was looking into applications of Quantum Computing for machine learning and encountered the following pre-print from 2003. [Quantum Convolution and Correlation Algorithms are Physically Impossible](https://arxiv.org/abs/quant-ph/0309070). The article doesn't appear to have been published in any journals, but it has been cited a few dozen times. The article author makes the case that it is impossible to compute discrete convolution over quantum states. Intuitively this seems incorrect to me, since I know we can perform quantum matrix multiplication, and I know that discrete convolution can be framed simply as multiplication with a Toeplitz (or circulant) matrix. The crux of his argument seems to be that there is no realizable composition of unitary operators for the elementwise (Hadamard) product of two vectors. Where is my disconnect? Is there any reason we cannot in general construct a Toeplitz matrix for discrete convolution in a quantum computer? Or is the article simply incorrect? I have worked through the contradiction that the author presents in his proof of Lemma 14, and it seems to make sense to me.
2018/08/08
[ "https://quantumcomputing.stackexchange.com/questions/3980", "https://quantumcomputing.stackexchange.com", "https://quantumcomputing.stackexchange.com/users/4298/" ]
It has pretty much always been like that. You can study the book by Nielsen & Chuang without knowing about physics. There is the [introduction by Mermin](https://arxiv.org/abs/quant-ph/0207118) aimed at computer scientists. There are probably lots of other resources (I'm pretty sure e.g. that Aaronson's book -- based on a [CS lecture](https://www.scottaaronson.com/democritus/) -- is perfectly suited for people without a physics background.) Overall, the physics formalism needed to understand quantum information and computation is pretty low-key, as compared to other fields of (quantum) physics. (This doesn't mean though that studying the phenomena in quantum information and computation is low-key.)
Quantum computing is entirely understandable without a physics background, for several reasons: 1. Quantum computing is written in an abstract mathematical language (linear algebra). 2. Quantum computing does not involve dynamical equations such as the Schrödinger equation. Moreover, all operations can be understood as logic gates rather than physical entities. 3. Quantum mechanics itself built a bridge between physics and information in such a way that an information scientist can understand it from his or her own perspective. However, quantum mechanics and quantum computing do include new concepts that are hard for everyone to accept, physicists included. So don't worry.
3,980
I was looking into applications of Quantum Computing for machine learning and encountered the following pre-print from 2003. [Quantum Convolution and Correlation Algorithms are Physically Impossible](https://arxiv.org/abs/quant-ph/0309070). The article doesn't appear to have been published in any journals, but it has been cited a few dozen times. The article author makes the case that it is impossible to compute discrete convolution over quantum states. Intuitively this seems incorrect to me, since I know we can perform quantum matrix multiplication, and I know that discrete convolution can be framed simply as multiplication with a Toeplitz (or circulant) matrix. The crux of his argument seems to be that there is no realizable composition of unitary operators for the elementwise (Hadamard) product of two vectors. Where is my disconnect? Is there any reason we cannot in general construct a Toeplitz matrix for discrete convolution in a quantum computer? Or is the article simply incorrect? I have worked through the contradiction that the author presents in his proof of Lemma 14, and it seems to make sense to me.
2018/08/08
[ "https://quantumcomputing.stackexchange.com/questions/3980", "https://quantumcomputing.stackexchange.com", "https://quantumcomputing.stackexchange.com/users/4298/" ]
Speaking as a computer scientist without any physics background making contributions to quantum computing: yes, computer scientists without any physics background can make contributions to quantum computing. Though I think it was always that way; it has nothing to do with the field being "mature". If you understand the postulates of quantum mechanics (operations are unitary matrices, states are unit vectors, measurement is a projection), and know how to apply those in the context of a computation, you can create quantum algorithms. The fact that these concepts were originally derived from physics is historically interesting, but not really relevant when optimizing a quantum circuit. As a concrete example: quantum physics is very heavy on calculus, but quantum computation isn't. Quantum physics does become relevant if you are trying to design algorithms for simulating quantum systems. And some of the concepts you would learn in a quantum physics course should also appear in a quantum computation course. But overall [I agree with Scott Aaronson](https://www.scottaaronson.com/democritus/lec9.html): > > [My] way to teach quantum mechanics [...] starts directly from the conceptual core -- namely, a certain generalization of probability theory to allow minus signs. Once you know what the theory is actually about, you can then sprinkle in physics to taste [...] > > > [quantum mechanics is] not a physical theory in the same sense as electromagnetism or general relativity. [...] Basically, quantum mechanics is the operating system that other physical theories run on as application software [...] > > > [...] [quantum mechanics is] about information and probabilities and observables, and how they relate to each other. > > >
Speaking from my own experience, I will say yes. One can indeed design algorithms without physics knowledge; for me it has so far been maths concepts. I remember once watching a quantum computing course by Scott Aaronson in which he said: > > Quantum computing is really "easy" when you take the physics out of it. > > > However, if you are going to work on applications in physics or chemistry, it will always be useful to have a background in what you are going to work on. The field is open to many backgrounds (maths, physics, computer science...). Communicating across different backgrounds can sometimes be a challenge, but it is not impossible. Indeed, I would say it is constructive, and collaboration can be beneficial. And one can always relate to one's preferred interpretation or concepts. As a career, it again depends on your point of view. I think there is much work to be done in this field, so do not worry about it. Do it if you feel you like it. Plus, working in this field does not mean you have to restrict yourself to it: you will still work with classical algorithms, and you will need coding skills. If you are interested in learning it from a computer scientist's point of view, this book may be helpful: [https://www.amazon.com/Quantum-Computing-Computer-Scientists-Yanofsky/dp/0521879965](https://rads.stackoverflow.com/amzn/click/0521879965) Good luck on your quantum trip!
3,980
I was looking into applications of Quantum Computing for machine learning and encountered the following pre-print from 2003. [Quantum Convolution and Correlation Algorithms are Physically Impossible](https://arxiv.org/abs/quant-ph/0309070). The article doesn't appear to have been published in any journals, but it has been cited a few dozen times. The article author makes the case that it is impossible to compute discrete convolution over quantum states. Intuitively this seems incorrect to me, since I know we can perform quantum matrix multiplication, and I know that discrete convolution can be framed simply as multiplication with a Toeplitz (or circulant) matrix. The crux of his argument seems to be that there is no realizable composition of unitary operators for the elementwise (Hadamard) product of two vectors. Where is my disconnect? Is there any reason we cannot in general construct a Toeplitz matrix for discrete convolution in a quantum computer? Or is the article simply incorrect? I have worked through the contradiction that the author presents in his proof of Lemma 14, and it seems to make sense to me.
2018/08/08
[ "https://quantumcomputing.stackexchange.com/questions/3980", "https://quantumcomputing.stackexchange.com", "https://quantumcomputing.stackexchange.com/users/4298/" ]
Speaking from my own experience, I will say yes. One can indeed design algorithms without physics knowledge; for me it has so far been maths concepts. I remember once watching a quantum computing course by Scott Aaronson in which he said: > > Quantum computing is really "easy" when you take the physics out of it. > > > However, if you are going to work on applications in physics or chemistry, it will always be useful to have a background in what you are going to work on. The field is open to many backgrounds (maths, physics, computer science...). Communicating across different backgrounds can sometimes be a challenge, but it is not impossible. Indeed, I would say it is constructive, and collaboration can be beneficial. And one can always relate to one's preferred interpretation or concepts. As a career, it again depends on your point of view. I think there is much work to be done in this field, so do not worry about it. Do it if you feel you like it. Plus, working in this field does not mean you have to restrict yourself to it: you will still work with classical algorithms, and you will need coding skills. If you are interested in learning it from a computer scientist's point of view, this book may be helpful: [https://www.amazon.com/Quantum-Computing-Computer-Scientists-Yanofsky/dp/0521879965](https://rads.stackoverflow.com/amzn/click/0521879965) Good luck on your quantum trip!
Quantum computing is entirely understandable without a physics background, for several reasons: 1. Quantum computing is written in an abstract mathematical language (linear algebra). 2. Quantum computing does not involve dynamical equations such as the Schrödinger equation. Moreover, all operations can be understood as logic gates rather than physical entities. 3. Quantum mechanics itself built a bridge between physics and information in such a way that an information scientist can understand it from his or her own perspective. However, quantum mechanics and quantum computing do include new concepts that are hard for everyone to accept, physicists included. So don't worry.
3,980
I was looking into applications of Quantum Computing for machine learning and encountered the following pre-print from 2003. [Quantum Convolution and Correlation Algorithms are Physically Impossible](https://arxiv.org/abs/quant-ph/0309070). The article doesn't appear to have been published in any journals, but it has been cited a few dozen times. The article author makes the case that it is impossible to compute discrete convolution over quantum states. Intuitively this seems incorrect to me, since I know we can perform quantum matrix multiplication, and I know that discrete convolution can be framed simply as multiplication with a Toeplitz (or circulant) matrix. The crux of his argument seems to be that there is no realizable composition of unitary operators for the elementwise (Hadamard) product of two vectors. Where is my disconnect? Is there any reason we cannot in general construct a Toeplitz matrix for discrete convolution in a quantum computer? Or is the article simply incorrect? I have worked through the contradiction that the author presents in his proof of Lemma 14, and it seems to make sense to me.
2018/08/08
[ "https://quantumcomputing.stackexchange.com/questions/3980", "https://quantumcomputing.stackexchange.com", "https://quantumcomputing.stackexchange.com/users/4298/" ]
Speaking as a computer scientist without any physics background making contributions to quantum computing: yes, computer scientists without any physics background can make contributions to quantum computing. Though I think it was always that way; it has nothing to do with the field being "mature". If you understand the postulates of quantum mechanics (operations are unitary matrices, states are unit vectors, measurement is a projection), and know how to apply those in the context of a computation, you can create quantum algorithms. The fact that these concepts were originally derived from physics is historically interesting, but not really relevant when optimizing a quantum circuit. As a concrete example: quantum physics is very heavy on calculus, but quantum computation isn't. Quantum physics does become relevant if you are trying to design algorithms for simulating quantum systems. And some of the concepts you would learn in a quantum physics course should also appear in a quantum computation course. But overall [I agree with Scott Aaronson](https://www.scottaaronson.com/democritus/lec9.html): > > [My] way to teach quantum mechanics [...] starts directly from the conceptual core -- namely, a certain generalization of probability theory to allow minus signs. Once you know what the theory is actually about, you can then sprinkle in physics to taste [...] > > > [quantum mechanics is] not a physical theory in the same sense as electromagnetism or general relativity. [...] Basically, quantum mechanics is the operating system that other physical theories run on as application software [...] > > > [...] [quantum mechanics is] about information and probabilities and observables, and how they relate to each other. > > >
Quantum computing is entirely understandable without a physics background, for several reasons: 1. Quantum computing is written in an abstract mathematical language (linear algebra). 2. Quantum computing does not involve dynamical equations such as the Schrödinger equation. Moreover, all operations can be understood as logic gates rather than physical entities. 3. Quantum mechanics itself built a bridge between physics and information in such a way that an information scientist can understand it from his or her own perspective. However, quantum mechanics and quantum computing do include new concepts that are hard for everyone to accept, physicists included. So don't worry.
83,456
I'm new to web applications. I have developed a web application using ASP.net. After hearing about jquery and going through some examples, I changed the existing application and used jquery to do most of the ajax work. On the server side I used html handlers for db connections and data processing. Because I'm using jquery, my concern is my script is visible to users of the application. Will this be a security issue? Am i following a right path?
2011/06/13
[ "https://softwareengineering.stackexchange.com/questions/83456", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/27627/" ]
jQuery is not a programming language. It is a JavaScript framework. Your AJAX code is not any more visible just because you switched to jQuery. No matter what, AJAX relies on JavaScript in the client, so that code is visible to whoever uses the web application. (After all, AJAX stands for Asynchronous JavaScript And XML.) There may certainly be security holes in jQuery. However, chances are there will be bigger holes in what you write. (This isn't meant to put you down, but you are a neophyte. Expect to make mistakes.) jQuery is used by many and programmed by a group of developers. Security is one of the things that is on everyone's mind. I would worry more about the code I wrote with jQuery than the code that makes up jQuery. Just keep up on the updates and news about it, so that you are aware of any security issues that are identified. And, of course, learn the best practices for creating secure applications.
No. jQuery is executed in the browser, and from a security point of view the browser is always unreliable. No matter how much JavaScript you use or which JS frameworks, it's always under the control of the attacker. Security has to be implemented on the server side. A common mistake beginners make is to place security checks in the JavaScript and let the server trust them.
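The "never trust the client" rule above is independent of jQuery or any other framework. As a language-agnostic sketch of the pattern (shown here in Python with hypothetical names, not any real framework's API), the server must re-derive every security-relevant fact itself rather than trusting flags set by client-side JavaScript:

```python
# Hypothetical request data as the server receives it. An attacker
# controls every field, including any "validated" flag that
# client-side JavaScript may have set.
request = {"user_id": "42", "is_admin": "true", "amount": "-500"}

def handle_transfer(request, db_is_admin):
    """Server-side handler: re-derive every security-relevant fact."""
    # WRONG: trusting a client-supplied flag.
    # if request["is_admin"] == "true": ...

    # RIGHT: look the privilege up server-side (db_is_admin stands in
    # for a database query keyed on the authenticated session).
    if not db_is_admin:
        return "403 Forbidden"

    # RIGHT: re-validate inputs on the server, even if JavaScript
    # already validated them in the browser.
    amount = int(request["amount"])
    if amount <= 0:
        return "400 Bad Request"
    return "200 OK"

print(handle_transfer(request, db_is_admin=False))  # 403 Forbidden
```

Client-side validation is still worth having for responsiveness and user experience; it just cannot be the security boundary.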
83,456
I'm new to web applications. I have developed a web application using ASP.net. After hearing about jquery and going through some examples, I changed the existing application and used jquery to do most of the ajax work. On the server side I used html handlers for db connections and data processing. Because I'm using jquery, my concern is my script is visible to users of the application. Will this be a security issue? Am i following a right path?
2011/06/13
[ "https://softwareengineering.stackexchange.com/questions/83456", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/27627/" ]
jQuery is not a programming language. It is a JavaScript framework. Your AJAX code is not any more visible just because you switched to jQuery. No matter what, AJAX relies on JavaScript in the client, so that code is visible to whoever uses the web application. (After all, AJAX stands for Asynchronous JavaScript And XML.) There may certainly be security holes in jQuery. However, chances are there will be bigger holes in what you write. (This isn't meant to put you down, but you are a neophyte. Expect to make mistakes.) jQuery is used by many and programmed by a group of developers. Security is one of the things that is on everyone's mind. I would worry more about the code I wrote with jQuery than the code that makes up jQuery. Just keep up on the updates and news about it, so that you are aware of any security issues that are identified. And, of course, learn the best practices for creating secure applications.
Any JavaScript is visible to attackers; that is just the way the web works: the browser downloads the script file and then executes it. Making a request for a script file is the exact same process as making a request for a webpage. For example, if you are using Google Chrome right now (and for web development you should be using either Google Chrome or Firefox with Firebug installed), hit Ctrl+Shift+J to open up the Developer Tools. You will be able to see everything about this web page. Click the Elements tab, expand the area that says `<head>` and look in there for any of the script elements; you will see something like this: ![Script tags in Google Chrome Developer Tools](https://i.stack.imgur.com/zKCsL.png) See those URLs in the script tags? Those are the URLs you go to to get that code. You can type them directly into your browser and get those files; this is how all webpages work, and the trick is to accept it and program secure code even with your hand showing. Heck, nowadays it's not that hard to decompile and reverse engineer *any* application, so this is generally a good idea anyway. The only time that someone seeing your code would be a problem is if they can see that you have coded a security bug, and honestly, JavaScript does a decent enough job of minimizing your chances of doing something stupid that if you keep learning and don't do anything too crazy, you will be just fine.
83,456
I'm new to web applications. I have developed a web application using ASP.net. After hearing about jquery and going through some examples, I changed the existing application and used jquery to do most of the ajax work. On the server side I used html handlers for db connections and data processing. Because I'm using jquery, my concern is my script is visible to users of the application. Will this be a security issue? Am i following a right path?
2011/06/13
[ "https://softwareengineering.stackexchange.com/questions/83456", "https://softwareengineering.stackexchange.com", "https://softwareengineering.stackexchange.com/users/27627/" ]
No. jQuery is executed in the browser, and from a security point of view the browser is always unreliable. No matter how much JavaScript you use or which JS frameworks, it's always under the control of the attacker. Security has to be implemented on the server side. A common mistake beginners make is to place security checks in the JavaScript and let the server trust them.
Any JavaScript is visible to attackers; that is just the way the web works: the browser downloads the script file and then executes it. Making a request for a script file is the exact same process as making a request for a webpage. For example, if you are using Google Chrome right now (and for web development you should be using either Google Chrome or Firefox with Firebug installed), hit Ctrl+Shift+J to open up the Developer Tools. You will be able to see everything about this web page. Click the Elements tab, expand the area that says `<head>` and look in there for any of the script elements; you will see something like this: ![Script tags in Google Chrome Developer Tools](https://i.stack.imgur.com/zKCsL.png) See those URLs in the script tags? Those are the URLs you go to to get that code. You can type them directly into your browser and get those files; this is how all webpages work, and the trick is to accept it and program secure code even with your hand showing. Heck, nowadays it's not that hard to decompile and reverse engineer *any* application, so this is generally a good idea anyway. The only time that someone seeing your code would be a problem is if they can see that you have coded a security bug, and honestly, JavaScript does a decent enough job of minimizing your chances of doing something stupid that if you keep learning and don't do anything too crazy, you will be just fine.
53,242
I've got a Windows 2003 Active Directory infrastructure, and there are times (such as when terminating an employee) that I want instantaneous propagation across both of my AD servers. Currently, I make the change in both places, which I suspect is unhealthy, but it's the only way I know to make sure that the account is disabled to every machine. Is there a better way? Do I have to wait for the normal propagation time for convergence, or is there a way to "force" it?
2009/08/12
[ "https://serverfault.com/questions/53242", "https://serverfault.com", "https://serverfault.com/users/4392/" ]
There is an even easier approach. Just reset the user's password. That is one of the few instant replications that AD performs. No need to run site replication. EDIT: Small edit. It's not completely instant. What it does is forward the change in an out-of-band update (it doesn't wait for normal replication cycles); however, it's probably about as close to instant as you can get with AD.
There's also the possibility of doing this using *repadmin.exe* with the /sync switch, or you can even script it using *ReplicaSync*, which is included in the IADsTools. You might see [this](http://support.microsoft.com/?scid=kb;en-us;232072&x=11&y=10) KB article, which discusses the available options.
53,242
I've got a Windows 2003 Active Directory infrastructure, and there are times (such as when terminating an employee) that I want instantaneous propagation across both of my AD servers. Currently, I make the change in both places, which I suspect is unhealthy, but it's the only way I know to make sure that the account is disabled to every machine. Is there a better way? Do I have to wait for the normal propagation time for convergence, or is there a way to "force" it?
2009/08/12
[ "https://serverfault.com/questions/53242", "https://serverfault.com", "https://serverfault.com/users/4392/" ]
If you go into Active Directory Sites and Services, you can force replications. Open the Server object and click on the NTDS Settings. This will give you a list of their replication partners for GC data as well as regular DC-DC traffic. As I understand it, you can force replication by going to each of the connections, right-clicking on it, and choosing "Replicate Now". [![alt text](https://i.stack.imgur.com/kuvC4.png)](https://i.stack.imgur.com/kuvC4.png) (source: [sysadmin1138.net](http://sysadmin1138.net/images/replicate-now.png))
[**Here's a TechNet article**](http://technet.microsoft.com/en-us/library/cc772726(WS.10).aspx#w2k3tr_repup_how_huzs) that outlines the AD replication model. Look at the Urgent Replication section for a discussion of what gets replicated immediately and doesn't require a forced replication. Mostly it's critical user security events (password changes, account lockouts) that are replicated immediately. There are configuration considerations to be made to make this happen site to site.
53,242
I've got a Windows 2003 Active Directory infrastructure, and there are times (such as when terminating an employee) that I want instantaneous propagation across both of my AD servers. Currently, I make the change in both places, which I suspect is unhealthy, but it's the only way I know to make sure that the account is disabled to every machine. Is there a better way? Do I have to wait for the normal propagation time for convergence, or is there a way to "force" it?
2009/08/12
[ "https://serverfault.com/questions/53242", "https://serverfault.com", "https://serverfault.com/users/4392/" ]
There is an even easier approach. Just reset the user's password. That is one of the few instant replications that AD performs. No need to run site replication. EDIT: Small edit. It's not completely instant. What it does is forward the change in an out-of-band update (it doesn't wait for normal replication cycles); however, it's probably about as close to instant as you can get with AD.
[**Here's a TechNet article**](http://technet.microsoft.com/en-us/library/cc772726(WS.10).aspx#w2k3tr_repup_how_huzs) that outlines the AD replication model. Look at the Urgent Replication section for a discussion of what gets replicated immediately and doesn't require a forced replication. Mostly it's critical user security events (password changes, account lockouts) that are replicated immediately. There are configuration considerations to be made to make this happen site to site.
53,242
I've got a Windows 2003 Active Directory infrastructure, and there are times (such as when terminating an employee) that I want instantaneous propagation across both of my AD servers. Currently, I make the change in both places, which I suspect is unhealthy, but it's the only way I know to make sure that the account is disabled to every machine. Is there a better way? Do I have to wait for the normal propagation time for convergence, or is there a way to "force" it?
2009/08/12
[ "https://serverfault.com/questions/53242", "https://serverfault.com", "https://serverfault.com/users/4392/" ]
If you go into Active Directory Sites and Services, you can force replications. Open the Server object and click on the NTDS Settings. This will give you a list of their replication partners for GC data as well as regular DC-DC traffic. As I understand it, you can force replication by going to each of the connections, right-clicking on it, and choosing "Replicate Now". [![alt text](https://i.stack.imgur.com/kuvC4.png)](https://i.stack.imgur.com/kuvC4.png) (source: [sysadmin1138.net](http://sysadmin1138.net/images/replicate-now.png))
There is an even easier approach. Just reset the user's password. That is one of the few instant replications that AD performs. No need to run site replication. EDIT: Small edit. It's not completely instant. What it does is forward the change in an out-of-band update (it doesn't wait for normal replication cycles); however, it's probably about as close to instant as you can get with AD.
67,090
My wife and I just bought our first house. I think that we discovered that our dryer vent is going to require some regular maintenance to keep it clean and able to dry our clothes. When we moved in, we found that the vent opening in our laundry room was absolutely full of lint (solid). I got the vacuum cleaner and sucked up as much as I could and reached up and pulled out some more. When she ran our first load of laundry though, it didn't dry much at all. I went up to the attic and looked in our vent up there and there was more lint but not as much. I'm going to pick up a dryer vent [brush](http://ecx.images-amazon.com/images/I/51ea8-Yf%2BlL.jpg) today after work, but I noticed that the tape on their vent looked very new, which leads me to think this was something they regularly had to do. I'm wondering how easy / effective it is to get one of the fans that attach to your vent and essentially *help* it along. I can't think of what the fans are called right now though.
2015/06/02
[ "https://diy.stackexchange.com/questions/67090", "https://diy.stackexchange.com", "https://diy.stackexchange.com/users/33924/" ]
They're often called dryer booster fans. They aren't particularly difficult to install, and they are reasonably effective. However, they also have some significant downsides - the fan itself obstructs the vent pipe, which makes it much more difficult to effectively clean the vent, and it will further reduce airflow and increase lint accumulation should it fail to operate (due to user error or mechanical failure, both of which will occur eventually). Installing and running a booster fan would certainly be preferable to your existing situation, but it would be even better to improve the venting to get sufficient airflow without needing a booster. Any flex duct should be replaced with rigid, smooth wall metal. Turns in the duct work should be minimized, and where they cannot be avoided, use long turn "dryer-ell" bends instead of the standard sharp 90 if possible. Installing a recessed dryer vent box where you connect the dryer can often allow you to connect the dryer with fewer turns in the dryer hose, as well as protecting the dryer hose from crushing damage. The dryer hose should be semi rigid aluminum flex duct; soft foil flex duct or periscope connectors should not be used. Make sure the exterior vent opening in the wall or roof opens widely and is not too restrictive.
Check the vent on the outside. The vent on my house had a screen behind the flap, which is not appropriate for a dryer vent but ok for a bathroom fan. Removing the screen fixed the constant clogging behavior in my system.
67,090
My wife and I just bought our first house. I think that we discovered that our dryer vent is going to require some regular maintenance to keep it clean and able to dry our clothes. When we moved in, we found that the vent opening in our laundry room was absolutely full of lint (solid). I got the vacuum cleaner and sucked up as much as I could and reached up and pulled out some more. When she ran our first load of laundry though, it didn't dry much at all. I went up to the attic and looked in our vent up there and there was more lint but not as much. I'm going to pick up a dryer vent [brush](http://ecx.images-amazon.com/images/I/51ea8-Yf%2BlL.jpg) today after work, but I noticed that the tape on their vent looked very new, which leads me to think this was something they regularly had to do. I'm wondering how easy / effective it is to get one of the fans that attach to your vent and essentially *help* it along. I can't think of what the fans are called right now though.
2015/06/02
[ "https://diy.stackexchange.com/questions/67090", "https://diy.stackexchange.com", "https://diy.stackexchange.com/users/33924/" ]
Dryer vents need regular maintenance. Period. No one ever does it, of course, which is why every time you buy a new house, you pretty much need to replace the old vent because no one ever cleaned it out. Ideally, you'd clean the lint out every 6 months or so. A booster fan can help, but doesn't necessarily eliminate any of the maintenance. Plus, you now have another mechanical device to deal with.
Check the vent on the outside. The vent on my house had a screen behind the flap, which is not appropriate for a dryer vent but ok for a bathroom fan. Removing the screen fixed the constant clogging behavior in my system.
67,090
My wife and I just bought our first house. I think that we discovered that our dryer vent is going to require some regular maintenance to keep it clean and able to dry our clothes. When we moved in, we found that the vent opening in our laundry room was absolutely full of lint (solid). I got the vacuum cleaner and sucked up as much as I could and reached up and pulled out some more. When she ran our first load of laundry though, it didn't dry much at all. I went up to the attic and looked in our vent up there and there was more lint but not as much. I'm going to pick up a dryer vent [brush](http://ecx.images-amazon.com/images/I/51ea8-Yf%2BlL.jpg) today after work, but I noticed that the tape on their vent looked very new, which leads me to think this was something they regularly had to do. I'm wondering how easy / effective it is to get one of the fans that attach to your vent and essentially *help* it along. I can't think of what the fans are called right now though.
2015/06/02
[ "https://diy.stackexchange.com/questions/67090", "https://diy.stackexchange.com", "https://diy.stackexchange.com/users/33924/" ]
They're often called dryer booster fans. They aren't particularly difficult to install, and they are reasonably effective. However, they also have some significant downsides - the fan itself obstructs the vent pipe, which makes it much more difficult to effectively clean the vent, and it will further reduce airflow and increase lint accumulation should it fail to operate (due to user error or mechanical failure, both of which will occur eventually). Installing and running a booster fan would certainly be preferable to your existing situation, but it would be even better to improve the venting to get sufficient airflow without needing a booster. Any flex duct should be replaced with rigid, smooth wall metal. Turns in the duct work should be minimized, and where they cannot be avoided, use long turn "dryer-ell" bends instead of the standard sharp 90 if possible. Installing a recessed dryer vent box where you connect the dryer can often allow you to connect the dryer with fewer turns in the dryer hose, as well as protecting the dryer hose from crushing damage. The dryer hose should be semi rigid aluminum flex duct; soft foil flex duct or periscope connectors should not be used. Make sure the exterior vent opening in the wall or roof opens widely and is not too restrictive.
Definitely get the booster fan. We installed one in our old house. It worked quite well. They cost about $200 and you can pick them up at places like Home Depot. It sounds like your venting tube is probably very long. The booster fan will help force the gunk out.
67,090
My wife and I just bought our first house. I think that we discovered that our dryer vent is going to require some regular maintenance to keep it clean and able to dry our clothes. When we moved in, we found that the vent opening in our laundry room was absolutely full of lint (solid). I got the vacuum cleaner and sucked up as much as I could and reached up and pulled out some more. When she ran our first load of laundry though, it didn't dry much at all. I went up to the attic and looked in our vent up there and there was more lint but not as much. I'm going to pick up a dryer vent [brush](http://ecx.images-amazon.com/images/I/51ea8-Yf%2BlL.jpg) today after work but I noticed that it looked like the tape on their vent was very new looking which leads me to think this was something they regularly had to do. I'm wondering how easy / effective is it to get one of the fans to attach to your vent that will essentially *help* it along. I can't think of what the fans are called right now though.
2015/06/02
[ "https://diy.stackexchange.com/questions/67090", "https://diy.stackexchange.com", "https://diy.stackexchange.com/users/33924/" ]
They're often called dryer booster fans. They aren't particularly difficult to install, and they are reasonably effective. However, they also have some significant downsides - the fan itself obstructs the vent pipe, which makes it much more difficult to effectively clean the vent, and it will further reduce airflow and increase lint accumulation should it fail to operate (due to user error or mechanical failure, both of which will occur eventually). Installing and running a booster fan would certainly be preferable to your existing situation, but it would be even better to improve the venting to get sufficient airflow without needing a booster. Any flex duct should be replaced with rigid, smooth wall metal. Turns in the duct work should be minimized, and where they cannot be avoided, use long turn "dryer-ell" bends instead of the standard sharp 90 if possible. Installing a recessed dryer vent box where you connect the dryer can often allow you to connect the dryer with fewer turns in the dryer hose, as well as protecting the dryer hose from crushing damage. The dryer hose should be semi rigid aluminum flex duct; soft foil flex duct or periscope connectors should not be used. Make sure the exterior vent opening in the wall or roof opens widely and is not too restrictive.
Dryer vents need regular maintenance. Period. No one ever does it, of course, which is why every time you buy a new house, you pretty much need to replace the old vent because no one ever cleaned it out. Ideally, you'd clean the lint out every 6 months or so. A booster fan can help, but doesn't necessarily eliminate any of the maintenance. Plus, you now have another mechanical device to deal with.
67,090
My wife and I just bought our first house. I think that we discovered that our dryer vent is going to require some regular maintenance to keep it clean and able to dry our clothes. When we moved in, we found that the vent opening in our laundry room was absolutely full of lint (solid). I got the vacuum cleaner and sucked up as much as I could and reached up and pulled out some more. When she ran our first load of laundry though, it didn't dry much at all. I went up to the attic and looked in our vent up there and there was more lint but not as much. I'm going to pick up a dryer vent [brush](http://ecx.images-amazon.com/images/I/51ea8-Yf%2BlL.jpg) today after work but I noticed that it looked like the tape on their vent was very new looking which leads me to think this was something they regularly had to do. I'm wondering how easy / effective is it to get one of the fans to attach to your vent that will essentially *help* it along. I can't think of what the fans are called right now though.
2015/06/02
[ "https://diy.stackexchange.com/questions/67090", "https://diy.stackexchange.com", "https://diy.stackexchange.com/users/33924/" ]
Dryer vents need regular maintenance. Period. No one ever does it, of course, which is why every time you buy a new house, you pretty much need to replace the old vent because no one ever cleaned it out. Ideally, you'd clean the lint out every 6 months or so. A booster fan can help, but doesn't necessarily eliminate any of the maintenance. Plus, you now have another mechanical device to deal with.
Definitely get the booster fan. We installed one in our old house. It worked quite well. They cost about $200 and you can pick them up at places like Home Depot. It sounds like your venting tube is probably very long. The booster fan will help force the gunk out.
357,285
I was looking at a question that I had answered, and the person who had asked the question had many badges (including a gold badge and 6 silver badges), but only 1 point. And when I click on the profile of that person, the profile says that it is suspended until **November of 2021** because of "voting irregularities." Here is a screenshot of the above profile: [![enter image description here](https://i.stack.imgur.com/PmyFu.jpg)](https://i.stack.imgur.com/PmyFu.jpg) These voting irregularities are (I assume) the person in question having a bunch of downvotes on questions/answers that others deem appropriate. This person has been an integral part of multiple communities, with 251 points in matter modeling and hundreds, if not thousands in ELL, which is where I found this profile. This has been happening more and more often, with dedicated users being suspended for things such as voting irregularities. So my question is as follows: Is it fair for these committed users (across all sites) that obviously add to the community to be suspended for such a long period of time over something as inconclusive as "voting irregularities?" Because to me, it seems a little harsh to suspend a veteran account (which has contributed greatly) for such a long period of time, and I believe that this should be lessened or a warning should be issued beforehand. Any thoughts on this issue would be greatly appreciated as well as to where I can find what constitutes voting irregularities, because it seems kind of broad and vague. EDIT: I looked further at the profile of this person and found the following: [![enter image description here](https://i.stack.imgur.com/cEEHN.jpg)](https://i.stack.imgur.com/cEEHN.jpg) So I have absolutely no idea as to how 7 votes would lead to a suspension over voting irregularities.
2020/11/30
[ "https://meta.stackexchange.com/questions/357285", "https://meta.stackexchange.com", "https://meta.stackexchange.com/users/888841/" ]
There's not really much that we can talk about in public about a specific user but someone doesn't generally make it to a year long suspension without having done something repeatedly after several warnings. That's how we guide moderators - generally a warning followed by 7, 30, and finally 365-day suspensions for repeated infractions, particularly for the same or similar activities. There are several ways that accounts can commit voting fraud, including by using fake accounts to vote for a primary account (sockpuppeting) or by using fake accounts to target votes at someone else. In these cases, the standard action is to delete the sock accounts and suspend the primary account with an explanation of how voting on our network works. You can also commit voting fraud with a single account by targeting your votes at one other user - targeted voting is not allowed, particularly targeted downvoting, so someone who downvotes another user a lot may be suspended for the abusive behavior of attacking a person with downvotes. If this is discovered, the votes are invalidated by staff (mods can't do this) and the user will often be suspended with information about how votes should be cast based on the content of the post, not who wrote it. I can understand that you find this troubling but, should it not be more troubling to think that someone achieved that high-rep status by falsely inflating their own reputation or chose to use their voting privileges to target another user on the site for votes? That is something that I don't like, whether it's done for a positive (upvotes) or negative (downvotes) reason. Someone who has fallen into this several times, so that they have made their way to a full year suspension, may be struggling to understand what I see as a very simple request - vote on content, not the people creating it. 
How much reputation a user has amassed in the time they've spent on the site shouldn't really matter that much - and 1k reputation, which may seem like a lot to some, isn't actually that much in the grand scheme of some sites.
> > Is it fair for these committed users (across all sites) that obviously add to the community to be suspended for such a long period of time over something as inconclusive as "voting irregularities?" > > > Objectively, **yes**. You don't know any further details other than the fact that they were suspended for voting irregularities, and year-long bans are **not** given lightly. We mortals only see this from one perspective - maybe two if the person who was suspended posts about their experience on social media (and [that has happened before](https://meta.stackoverflow.com/q/288229/1079354)). You're not going to get the full picture, so establishing if it's "fair" is above our pay grade by a ***lot***.
410,647
Suppose that magnetic wire with resistivity similar to copper but with non-negligible permeability existed. What effect would magnetic wire have on the coil inductance versus a copper coil of the same size/construction? This is of course a purely academic exercise, since such a material does not exist, though it could perhaps exist for plated wire at high frequencies, due to the skin effect. EDIT: I guess it's not entirely academic. isdi pointed out that "high permeability wire" actually exists. According to [this](http://www.hongertech.com/high-quality-soft-magnetic-mu-metal-wire_p35.html): > > Mu Metal Wire is used in the production of inductors and electrical transformers. It has one of the highest magnetic permeability ratings of any metal or alloy, making it an industry standard in applications where metals with high magnetic permeability are needed. > > >
2018/12/05
[ "https://electronics.stackexchange.com/questions/410647", "https://electronics.stackexchange.com", "https://electronics.stackexchange.com/users/159890/" ]
The [IR2110](https://www.infineon.com/dgdl/ir2110.pdf?fileId=5546d462533600a4015355c80333167e) is designed to supply +/-2A to drive the gate of your output devices. You have an asymmetric drive caused by the 47 Ohm and diode across it. You have not specified what 2VM1 is, but assuming it's no more than a 15V supply your maximum gate drive current is about 0.3A during turn on. During turn off you may be exceeding the gate drive capability of the IR2110 since you then have 47 Ohm with whatever the effective Vf slope resistance of the diode is (along with a 0.8-0.9V offset). I'd suggest you could remove D2, 3, 5, 6 and replace the 47 Ohm with 10 Ohm resistors. This will make the turn on/off delays **the same** which is important. With a 15V VCC that will give you 1.5A charge/discharge current.
You have a problem with charging the bootstrap capacitor: because it is charging through 2L at a very high frequency, it will never charge to the needed value.
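The gate-drive current estimates quoted in the accepted answer above come down to simple Ohm's-law arithmetic. A minimal sketch, using only the numbers from that answer (an assumed 15 V VCC and the 47 Ω vs. suggested 10 Ω gate resistors; the diode's 0.8-0.9 V offset on the turn-off path is ignored for simplicity):

```python
# Resistor-limited peak gate current, I = V / R, assuming the gate sits at
# 0 V at the start of the switching edge. Values taken from the answer.

VCC = 15.0  # assumed gate-drive supply voltage (V)

def peak_gate_current(v_supply, r_gate):
    """Peak charge/discharge current through a series gate resistor (A)."""
    return v_supply / r_gate

i_47 = peak_gate_current(VCC, 47.0)  # original 47 ohm turn-on path
i_10 = peak_gate_current(VCC, 10.0)  # suggested 10 ohm replacement

print(f"47 ohm: {i_47:.2f} A")  # ~0.32 A, matching the "about 0.3A" in the answer
print(f"10 ohm: {i_10:.2f} A")  # 1.50 A, matching the answer
```

Both figures stay well inside the +/-2 A the IR2110 is rated to source and sink, which is why the answer treats the resistors, not the driver, as the limiting factor.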
18,701,297
My problem is that I can't get the full width and height, I have tried lots of things but I can't get it right and now I am asking you guys to help me. <http://i.imgur.com/tjACREv.png> <http://imgur.com/tjACREv> What you are seeing here is a picture of my header. Everything works but I can't remove the background lines at the outside of the image as you can see it: left, right and top, so how do I fix that?
2013/09/09
[ "https://Stackoverflow.com/questions/18701297", "https://Stackoverflow.com", "https://Stackoverflow.com/users/2761832/" ]
Given that you do not want to use a third party tool then this is not achievable with SSRS. Assuming end users have some PDF Reader - granted a third party tool, but probably already in use - then is exporting the report in PDF format a suitable alternative?
With SoftArtisans OfficeWriter's SSRS integration, you can create do this by turning on worksheet protection in the template Excel file that is embedded in the RDL when the report is designed with OfficeWriter's client-side designer (an Excel add-in). The output file will be locked and protected. See: [OfficeWriter for SQL Server Reporting Services](http://www.officewriter.com/officewriter-for-sql-server-reporting-services.aspx) Disclaimer: I work for SoftArtisans, makers of OfficeWriter
54,312
It bothers me when a coworker who is talking with me at my desk reaches out and stabs a finger into the screen, leaving fingerprints/skin oil (sometimes repeatedly and lingeringly rolling that finger around for emphasis). You can't see the marks, maybe, but later, as the sun moves into the right position, what seemed invisible becomes a slew of very visible scattered smudges making my monitor look all dirty and hard to read. I've said up to at least two times each to different people who were touching my screen something like (trying to be mild in tone and phrasing), "I'd rather you not touch the screen" or "please don't touch the screen", and the person always says "oh, sorry". But the next week, there comes someone doing it again. I thought about having screen cleaning materials out and very obviously cleaning the screen where they've been touching immediately after they do it, but I'm worried that is too passive-aggressive an approach. I thought about trying to sit in my chair in a way that blocks the person from reaching the screen, but that's not workable the way I have a straight desk open on the sides. I considered just consistently asking the person each and every time to not touch the screen, but I don't want to be offensive. I also tend to come across about 2 or 3 notches more upset or angry than I really am, and people tend to notice this and react unless I really make an effort to be mild—but this leaves me usually saying nothing. I'm concerned people will go tell stories to others about me freaking out about my screen. (Whether or not I do freak out or my reaction is more than they think it should be, it seems a simple problem that is solvable, I just don't know how. I think it's reasonable to want people to avoid dirtying/marring, even if temporarily, the work tools assigned for my use.) 
Besides not having these conversations at my desk (we could go to a conference room or to the other person's desk where I for sure will not be touching their screen), or putting up some kind of physical blockade, what are some strategies I can use to be tactful, mild, yet effective? Note: I shouldn't have asked this question at work because someone came by, saw what I was posting before I could switch to another tab, and deliberately dragged his finger across my screen to bait me. Sigh. Anyway, here's the question. I just thought of getting a small pointing device, something like the stylus for a tablet. I could attach it to the monitor with a small piece of adhesive Velcro. What do you think about, when someone starts touching the screen, to hand this to them and ask them to point with it? Photos added for reference. [![A neighboring screen covered in smudges](https://i.stack.imgur.com/Ic7z0.jpg)](https://i.stack.imgur.com/Ic7z0.jpg) [![My screen, where smudges normally can't be seen](https://i.stack.imgur.com/c2iuV.jpg)](https://i.stack.imgur.com/c2iuV.jpg) [![The sun stabbing into this high-rise](https://i.stack.imgur.com/N5Iaj.jpg)](https://i.stack.imgur.com/N5Iaj.jpg)
2015/09/11
[ "https://workplace.stackexchange.com/questions/54312", "https://workplace.stackexchange.com", "https://workplace.stackexchange.com/users/5483/" ]
Hand them the mouse. The mouse pointer is designed to point to things at the screen. On a regular basis I hand the mouse over to colleagues standing by my desk. Not as much to prevent them from touching the screen, but because the "now click here" leads to misinterpretation. If you do that with a smooth gesture when they start pointing to your screen, they will quickly grab the mouse. For me it is specifically efficient because I am left-handed, use the mouse on right-handed setting, and, due to my position in the office, people will generally be on my left-hand side.
Install a surge protector with a large, silent switch underneath your desk with only your monitor plugged into it. When your co-worker touches your screen, stealthily turn off the monitor with your foot. Now pretend that your monitor always breaks when it is touched, but it can be easily fixed if they'll just wait a moment. Now spend a few minutes wiggling it around to "try" to fix it. Don't act frustrated, but silently focus on resolving this new problem while ignoring their presence. Draw this act out for as long as is necessary until the co-worker becomes uncomfortable or impatient. When they seem ready to leave, sneakily switch your monitor back on with your foot. Do this as many times as is needed until this person decides to alter their behavior. Most will remember the problem they caused and will not want to see that happen a second time. If someone keeps doing it, you can safely assume that they are a sociopath, in which case you may want to consider putting a thin layer of petroleum jelly on your screen for the next time when they come by. If they touch the screen and ask about the slimy substance, tell them that you are sorry, but have had a hard time keeping enough tissues on hand. If you're going to be passive aggressive, you must make an art form of it.
54,312
It bothers me when a coworker who is talking with me at my desk reaches out and stabs a finger into the screen, leaving fingerprints/skin oil (sometimes repeatedly and lingeringly rolling that finger around for emphasis). You can't see the marks, maybe, but later, as the sun moves into the right position, what seemed invisible becomes a slew of very visible scattered smudges making my monitor look all dirty and hard to read. I've said up to at least two times each to different people who were touching my screen something like (trying to be mild in tone and phrasing), "I'd rather you not touch the screen" or "please don't touch the screen", and the person always says "oh, sorry". But the next week, there comes someone doing it again. I thought about having screen cleaning materials out and very obviously cleaning the screen where they've been touching immediately after they do it, but I'm worried that is too passive-aggressive an approach. I thought about trying to sit in my chair in a way that blocks the person from reaching the screen, but that's not workable the way I have a straight desk open on the sides. I considered just consistently asking the person each and every time to not touch the screen, but I don't want to be offensive. I also tend to come across about 2 or 3 notches more upset or angry than I really am, and people tend to notice this and react unless I really make an effort to be mild—but this leaves me usually saying nothing. I'm concerned people will go tell stories to others about me freaking out about my screen. (Whether or not I do freak out or my reaction is more than they think it should be, it seems a simple problem that is solvable, I just don't know how. I think it's reasonable to want people to avoid dirtying/marring, even if temporarily, the work tools assigned for my use.) 
Besides not having these conversations at my desk (we could go to a conference room or to the other person's desk where I for sure will not be touching their screen), or putting up some kind of physical blockade, what are some strategies I can use to be tactful, mild, yet effective? Note: I shouldn't have asked this question at work because someone came by, saw what I was posting before I could switch to another tab, and deliberately dragged his finger across my screen to bait me. Sigh. Anyway, here's the question. I just thought of getting a small pointing device, something like the stylus for a tablet. I could attach it to the monitor with a small piece of adhesive Velcro. What do you think about, when someone starts touching the screen, to hand this to them and ask them to point with it? Photos added for reference. [![A neighboring screen covered in smudges](https://i.stack.imgur.com/Ic7z0.jpg)](https://i.stack.imgur.com/Ic7z0.jpg) [![My screen, where smudges normally can't be seen](https://i.stack.imgur.com/c2iuV.jpg)](https://i.stack.imgur.com/c2iuV.jpg) [![The sun stabbing into this high-rise](https://i.stack.imgur.com/N5Iaj.jpg)](https://i.stack.imgur.com/N5Iaj.jpg)
2015/09/11
[ "https://workplace.stackexchange.com/questions/54312", "https://workplace.stackexchange.com", "https://workplace.stackexchange.com/users/5483/" ]
If you're worried that they will be mad at you for asking them to change their ways, you have to take a big breath and either put up with the smears (which do appear to bother you more than most people) or put up with them thinking you're weird for caring. You can try to minimize the extent to which they will be mad by **apologizing** and by **framing it as being your oddity**. For example: > > Hey! Don't touch my screen! > > > Is an order, a demand, and is all about them (with a little ownership thrown in). In contrast > > Sorry, it makes me crazy when people touch my screen, can I ask you please just to point without touching? > > > Is a request, acknowledges that they couldn't have known, is about you and asking them to accommodate your difference, and offers an alternative (always better than just "don't X.") Signs are useful only in the absence of the sign-writer. Since I doubt people are stopping by to touch your screen when you're away, you don't need them. Deep sighing and conspicuous screen cleaning will make people feel bad, but if you're distracted by a smudge someone left, go ahead and clean it the moment you notice it, even if that's the moment the person left it. And practice **asking** repeatedly (with the sorry from time to time, with the explanation from time to time, with the please every time) that they don't do that. Finally, model this behaviour yourself. Don't touch other people's screens. Demonstrate that it's possible to discuss code without leaving smudges. And if your coworkers care more than you do about chairs or whiteboard markers or putting garbage in someone else's garbage can or whatever, be super cool and understanding about that. Work towards having an office that cares about each other, foibles and all, and wouldn't find it funny to deliberately upset a coworker by doing something you know they don't like.
Perhaps I'm misunderstanding the question, but I'm surprised that no one has suggested [an anti-dust, anti-fingerprint screen protector](https://www.google.com.tw/search?newwindow=1&client=firefox-b-ab&q=monitor%20screen%20protectors%20fingerprints&oq=monitor%20screen%20protectors%20fingerprints&gs_l=serp.3...1177.5360.0.5621.17.14.2.0.0.0.255.1220.10j3j1.14.0....0...1c.1.64.serp..1.12.777...0i22i30k1j33i160k1j33i21k1.6LDXrwc1Tvc)? While some are unsightly, there are plenty of good options out there. And if you get the unsightly kind that are like a shell, you can just put them up when people come by and take them down after they have left. This being said, I do agree to a certain extent that you should maybe also learn to let it go a little bit. I suggest this because it is often easier [to change ourselves than to change others](https://en.wikipedia.org/wiki/Serenity_Prayer). Not to pass judgement on you, but to save you future headaches when and if similar things occur on future occasions. Disclaimer: I **do not** work for screen protector companies of any sort nor do I have any vested interest in any of them. Just thought to bring it up.
54,312
It bothers me when a coworker who is talking with me at my desk reaches out and stabs a finger into the screen, leaving fingerprints/skin oil (sometimes repeatedly and lingeringly rolling that finger around for emphasis). You can't see the marks, maybe, but later, as the sun moves into the right position, what seemed invisible becomes a slew of very visible scattered smudges making my monitor look all dirty and hard to read. I've said up to at least two times each to different people who were touching my screen something like (trying to be mild in tone and phrasing), "I'd rather you not touch the screen" or "please don't touch the screen", and the person always says "oh, sorry". But the next week, there comes someone doing it again. I thought about having screen cleaning materials out and very obviously cleaning the screen where they've been touching immediately after they do it, but I'm worried that is too passive-aggressive an approach. I thought about trying to sit in my chair in a way that blocks the person from reaching the screen, but that's not workable the way I have a straight desk open on the sides. I considered just consistently asking the person each and every time to not touch the screen, but I don't want to be offensive. I also tend to come across about 2 or 3 notches more upset or angry than I really am, and people tend to notice this and react unless I really make an effort to be mild—but this leaves me usually saying nothing. I'm concerned people will go tell stories to others about me freaking out about my screen. (Whether or not I do freak out or my reaction is more than they think it should be, it seems a simple problem that is solvable, I just don't know how. I think it's reasonable to want people to avoid dirtying/marring, even if temporarily, the work tools assigned for my use.) 
Besides not having these conversations at my desk (we could go to a conference room or to the other person's desk where I for sure will not be touching their screen), or putting up some kind of physical blockade, what are some strategies I can use to be tactful, mild, yet effective? Note: I shouldn't have asked this question at work because someone came by, saw what I was posting before I could switch to another tab, and deliberately dragged his finger across my screen to bait me. Sigh. Anyway, here's the question. I just thought of getting a small pointing device, something like the stylus for a tablet. I could attach it to the monitor with a small piece of adhesive Velcro. What do you think about, when someone starts touching the screen, to hand this to them and ask them to point with it? Photos added for reference. [![A neighboring screen covered in smudges](https://i.stack.imgur.com/Ic7z0.jpg)](https://i.stack.imgur.com/Ic7z0.jpg) [![My screen, where smudges normally can't be seen](https://i.stack.imgur.com/c2iuV.jpg)](https://i.stack.imgur.com/c2iuV.jpg) [![The sun stabbing into this high-rise](https://i.stack.imgur.com/N5Iaj.jpg)](https://i.stack.imgur.com/N5Iaj.jpg)
2015/09/11
[ "https://workplace.stackexchange.com/questions/54312", "https://workplace.stackexchange.com", "https://workplace.stackexchange.com/users/5483/" ]
If you're worried that they will be mad at you for asking them to change their ways, you have to take a big breath and either put up with the smears (which do appear to bother you more than most people) or put up with them thinking you're weird for caring. You can try to minimize the extent to which they will be mad by **apologizing** and by **framing it as being your oddity**. For example: > > Hey! Don't touch my screen! > > > Is an order, a demand, and is all about them (with a little ownership thrown in). In contrast > > Sorry, it makes me crazy when people touch my screen, can I ask you please just to point without touching? > > > Is a request, acknowledges that they couldn't have known, is about you and asking them to accommodate your difference, and offers an alternative (always better than just "don't X.") Signs are useful only in the absence of the sign-writer. Since I doubt people are stopping by to touch your screen when you're away, you don't need them. Deep sighing and conspicuous screen cleaning will make people feel bad, but if you're distracted by a smudge someone left, go ahead and clean it the moment you notice it, even if that's the moment the person left it. And practice **asking** repeatedly (with the sorry from time to time, with the explanation from time to time, with the please every time) that they don't do that. Finally, model this behaviour yourself. Don't touch other people's screens. Demonstrate that it's possible to discuss code without leaving smudges. And if your coworkers care more than you do about chairs or whiteboard markers or putting garbage in someone else's garbage can or whatever, be super cool and understanding about that. Work towards having an office that cares about each other, foibles and all, and wouldn't find it funny to deliberately upset a coworker by doing something you know they don't like.
> > You can't see the marks, maybe, but later, as the sun moves into the right position, what seemed invisible becomes a slew of very visible scattered smudges making my monitor look all dirty and hard to read. > > > I have a different point of view here. It sounds like lighting ergonomics are a significant factor in your problem. You should **never** have screen glare or the sun reflecting off your screen! I don't know what country you are from, but surely there are laws governing ergonomics in the workplace. Blinds on the windows, anti-glare screen covers, lighting requirements etc. If you can solve the screen glare issue so that the sun *doesn't* reflect off the screen, then you won't care any more about people touching your monitor and you can better get on with your work. Talk to your supervisor.
54,312
It bothers me when a coworker who is talking with me at my desk reaches out and stabs a finger into the screen, leaving fingerprints/skin oil (sometimes repeatedly and lingeringly rolling that finger around for emphasis). You can't see the marks, maybe, but later, as the sun moves into the right position, what seemed invisible becomes a slew of very visible scattered smudges making my monitor look all dirty and hard to read.

I've said, up to at least two times each, to different people who were touching my screen something like (trying to be mild in tone and phrasing), "I'd rather you not touch the screen" or "please don't touch the screen", and the person always says "oh, sorry". But the next week, there comes someone doing it again.

I thought about having screen cleaning materials out and very obviously cleaning the screen where they've been touching immediately after they do it, but I'm worried that is too passive-aggressive an approach. I thought about trying to sit in my chair in a way that blocks the person from reaching the screen, but that's not workable the way I have a straight desk open on the sides. I considered just consistently asking the person each and every time to not touch the screen, but I don't want to be offensive.

I also tend to come across about 2 or 3 notches more upset or angry than I really am, and people tend to notice this and react unless I really make an effort to be mild, which leaves me usually saying nothing. I'm concerned people will go tell stories to others about me freaking out about my screen. (Whether or not I do freak out, or my reaction is more than they think it should be, it seems a simple problem that is solvable; I just don't know how. I think it's reasonable to want people to avoid dirtying/marring, even if temporarily, the work tools assigned for my use.)

Besides not having these conversations at my desk (we could go to a conference room or to the other person's desk, where I for sure will not be touching their screen), or putting up some kind of physical blockade, what are some strategies I can use to be tactful, mild, yet effective?

Note: I shouldn't have asked this question at work, because someone came by, saw what I was posting before I could switch to another tab, and deliberately dragged his finger across my screen to bait me. Sigh. Anyway, here's the question.

I just thought of getting a small pointing device, something like the stylus for a tablet. I could attach it to the monitor with a small piece of adhesive Velcro. What do you think about handing this to someone and asking them to point with it when they start touching the screen?

Photos added for reference.

[![A neighboring screen covered in smudges](https://i.stack.imgur.com/Ic7z0.jpg)](https://i.stack.imgur.com/Ic7z0.jpg) [![My screen, where smudges normally can't be seen](https://i.stack.imgur.com/c2iuV.jpg)](https://i.stack.imgur.com/c2iuV.jpg) [![The sun stabbing into this high-rise](https://i.stack.imgur.com/N5Iaj.jpg)](https://i.stack.imgur.com/N5Iaj.jpg)
2015/09/11
[ "https://workplace.stackexchange.com/questions/54312", "https://workplace.stackexchange.com", "https://workplace.stackexchange.com/users/5483/" ]
Install a surge protector with a large, silent switch underneath your desk with only your monitor plugged into it. When your co-worker touches your screen, stealthily turn off the monitor with your foot. Now pretend that your monitor always breaks when it is touched, but it can be easily fixed if they'll just wait a moment. Now spend a few minutes wiggling it around to "try" to fix it. Don't act frustrated, but silently focus on resolving this new problem while ignoring their presence. Draw this act out for as long as is necessary until the co-worker becomes uncomfortable or impatient. When they seem ready to leave, sneakily switch your monitor back on with your foot. Do this as many times as is needed until this person decides to alter their behavior. Most will remember the problem they caused and will not want to see that happen a second time. If someone keeps doing it, you can safely assume that they are a sociopath, in which case you may want to consider putting a thin layer of petroleum jelly on your screen for the next time when they come by. If they touch the screen and ask about the slimy substance, tell them that you are sorry, but have had a hard time keeping enough tissues on hand. If you're going to be passive aggressive, you must make an art form of it.
If I were you I'd be putting a "Please Wash Your Hands" sign in the bathroom or something! I've been trying to reproduce your oily fingerprint problem after reading this post, and no matter what angle I look at it relative to the sun & the lights I can't see any mark on it... It may be seen as a bit overly sensitive to complain about people touching your screen... it risks inviting gossip. It may be a lot easier to position yourself so the sun is never glaring off your screen as Jimbo said in another post.
Make a small, colorful sign that says "PLEASE DON'T TOUCH MY SCREEN." Include some clip art of a fingerprint. Attach the sign to the frame of your monitor in a conspicuous spot. There you go.
You have to ask yourself two questions: "what do I really want?" and "will I really get it?" You can certainly push very hard on the "don't touch my monitor" angle, from cleaning the screen whenever someone touches it, to putting up a sign. The question is, what will you get out of that? As you've noted, it will probably come off as passive-aggressive. So then, ask yourself if having your coworkers not touch your screen is more important than having a good working relationship. I snore. It drives my wife crazy. Yet she loves me a lot and has decided putting up with my snoring is worth the great relationship we have (17 years and going strong). I would also especially note that given we are moving more and more to a touch-screen society, it is very likely you're fighting a losing battle. My advice is to keep the cleaning wipes handy and, when you are alone, clean your monitor. Your relationships with your coworkers are worth the extra screen cleanings.
You don't. Your next question would be how do I keep people from throwing their trash away in my trash can. Then how can I keep pens from getting stolen off my desk. Then how can I keep people from borrowing my chair when I am not there. This is in essence like the cleaning crew at my company standing next to the glass entry door asking people to only touch the handle. These are the dangers of going to a workplace. If these things truly bother you, then be prepared to be bothered. You don't put signs on things, ask people, or anything else. You have several pieces of advice here. Please do not say anything to anyone or proactively try to keep these things from happening. There is a good chance they will take you for something in between an OCD loon and someone who doesn't like them/people. You may feel better about saying something to John, and he may never touch your screen again. But he also may never think of you the same again. It just isn't worth it. So deal with it. Don't pull out the wipes while they are there, don't sigh, don't play passive-aggressive mind games, just chill.
If you're worried that they will be mad at you for asking them to change their ways, you have to take a big breath and either put up with the smears (which do appear to bother you more than most people) or put up with them thinking you're weird for caring. You can try to minimize the extent to which they will be mad by **apologizing** and by **framing it as being your oddity**. For example:

> Hey! Don't touch my screen!

is an order, a demand, and is all about them (with a little ownership thrown in). In contrast,

> Sorry, it makes me crazy when people touch my screen. Can I ask you please just to point without touching?

is a request, acknowledges that they couldn't have known, is about you and asking them to accommodate your difference, and offers an alternative (always better than just "don't X").

Signs are useful only in the absence of the sign-writer. Since I doubt people are stopping by to touch your screen when you're away, you don't need them. Deep sighing and conspicuous screen cleaning will make people feel bad, but if you're distracted by a smudge someone left, go ahead and clean it the moment you notice it, even if that's the moment the person left it. And practice **asking** repeatedly (with the sorry from time to time, with the explanation from time to time, with the please every time) that they don't do that.

Finally, model this behaviour yourself. Don't touch other people's screens. Demonstrate that it's possible to discuss code without leaving smudges. And if your coworkers care more than you do about chairs or whiteboard markers or putting garbage in someone else's garbage can or whatever, be super cool and understanding about that. Work towards having an office that cares about each other, foibles and all, and wouldn't find it funny to deliberately upset a coworker by doing something you know they don't like.
> > You can't see the marks, maybe, but later, as the sun moves into the > right position, what seemed invisible becomes a slew of very visible > scattered smudges making my monitor look all dirty and hard to read. > > > [...] > > > I thought about having screen cleaning materials out and very > obviously cleaning the screen where they've been touching immediately > after they do it, but I'm worried that is too passive-aggressive an > approach. > > > The easiest solution is just to keep cleaning materials in your drawer and clean the glass on a daily basis. No need to tell anyone you're cleaning it. Your objective should just be to get the glass clean. Period. After all, you don't curse at the wind or the rain when your car gets dirty. You just do the best you can to keep your car clean, and once in a while your car gets dirty despite your best efforts, so you take it to the car wash. This is the same thing here. Touching a screen is very often an unconscious gesture. Re-programming your colleagues so that they never touch your screen again is quite unfeasible. I know this because my own desktop screen is a touchscreen, and every time my colleagues touch it (forgetting, every time, that it's a touchscreen), their accidental touch steals the focus away from the current line and usually messes up our ongoing conversation (and yet they keep doing it, because it's just an automatic gesture for them). The most you could do is place an additional shading/privacy screen on top of the existing screen, so that only the shading screen gets touched, and not the glass itself. But any real solution would have to include some form of letting go on your part. And for that, I'd recommend you start watching YouTube videos and listening to audiobooks by [Byron Katie](https://www.youtube.com/user/TheWorkofBK). That's what her entire methodology is ultimately about: letting go.
If I were you, I'd be putting a "Please Wash Your Hands" sign in the bathroom or something! I've been trying to reproduce your oily fingerprint problem after reading this post, and no matter what angle I look at my screen from, relative to the sun and the lights, I can't see any mark on it... It may be seen as a bit overly sensitive to complain about people touching your screen; it risks inviting gossip. It may be a lot easier to position yourself so the sun is never glaring off your screen, as Jimbo said in another post.