Dataset columns: Title (string, 15 to 150 chars) | A_Id (int64, 2.98k to 72.4M) | Users Score (int64, -17 to 470) | Q_Score (int64, 0 to 5.69k) | ViewCount (int64, 18 to 4.06M) | Database and SQL (int64, 0 to 1) | Tags (string, 6 to 105 chars) | Answer (string, 11 to 6.38k chars) | GUI and Desktop Applications (int64, 0 to 1) | System Administration and DevOps (int64, 1 to 1) | Networking and APIs (int64, 0 to 1) | Other (int64, 0 to 1) | CreationDate (string, 23 chars) | AnswerCount (int64, 1 to 64) | Score (float64, -1 to 1.2) | is_accepted (bool, 2 classes) | Q_Id (int64, 1.85k to 44.1M) | Python Basics and Environment (int64, 0 to 1) | Data Science and Machine Learning (int64, 0 to 1) | Web Development (int64, 0 to 1) | Available Count (int64, 1 to 17) | Question (string, 41 to 29k chars)
Python 3: test command line arguments | 14,250,723 | 0 | 0 | 8,127 | 0 | python,python-3.x | All arguments passed when running your script are placed in sys.argv, so you have to import sys first and then step through the arguments as you like. You might consider counting how many arguments you received to decide what to do, and note that the first argument is always the name of your script. | 0 | 1 | 0 | 0 | 2013-01-10T04:05:00.000 | 3 | 0 | false | 14,250,658 | 1 | 0 | 0 | 1 | I'm a newbie to Python. I'd like to code a script running on Linux.
To test whether the user entered all the script arguments:
If the user types: myscript => print "Usage: myscript [Dir] [Old] [New]"
If the user types: myscript Dir => print "Please enter Old and New"
If the user types: myscript Dir Old => print "Please enter New"
If the user types a... |
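A minimal sketch of the sys.argv approach from the answer above; the script name and messages mirror the question, but the helper name is my own:

```python
import sys

def check_args(argv):
    """Return a usage message for missing arguments, or None if all are present."""
    # argv[0] is always the script name; real arguments start at index 1.
    missing = ["Dir", "Old", "New"][len(argv) - 1:]
    if not missing:
        return None
    if len(missing) == 3:
        return "Usage: myscript [Dir] [Old] [New]"
    return "Please enter " + " and ".join(missing)

if __name__ == "__main__":
    message = check_args(sys.argv)
    if message:
        print(message)
```

Counting len(sys.argv) this way keeps the script-name offset in one place.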
Cassandra reports:"Unable to complete request: one or more nodes were unavailable" when I use CQL:"select * from User" | 25,874,976 | 0 | 2 | 3,429 | 0 | java,python,cassandra,cql | What was the replication factor that you used for the keyspace?
How many rows of data does the "users" column family have?
I found myself in a similar situation (yesterday) with replication factor set to 1 and "users" column family having only one row.
Cluster Information:
3 nodes on AWS
Same datacenter name
Keysp... | 0 | 1 | 0 | 0 | 2013-01-10T04:06:00.000 | 5 | 0 | false | 14,250,664 | 0 | 0 | 0 | 3 | Cassandra runs in a cluster of 3 nodes. When all nodes are "UP" and I run the CQL query “select * from User” in cqlsh, Cassandra returns the right result. But after a node dies, when I run the "select" again, no result is returned; it just reports: "Unable to complete request: one or more nodes were unavailable".
I turned to use cassandra-... |
Cassandra reports:"Unable to complete request: one or more nodes were unavailable" when I use CQL:"select * from User" | 14,264,889 | 0 | 2 | 3,429 | 0 | java,python,cassandra,cql | cqlsh and cli both default to CL.ONE. I suspect the difference is actually that your cqlsh query says "select all the users" while a "get" in the cli is "select exactly one user." | 0 | 1 | 0 | 0 | 2013-01-10T04:06:00.000 | 5 | 0 | false | 14,250,664 | 0 | 0 | 0 | 3 | Cassandra runs in a cluster of 3 nodes. When all nodes are "UP" and I run the CQL query “select * from User” in cqlsh, Cassandra returns the right result. But after a node dies, when I run the "select" again, no result is returned; it just reports: "Unable to complete request: one or more nodes were unavailable".
I turned to use cassandra-... |
Cassandra reports:"Unable to complete request: one or more nodes were unavailable" when I use CQL:"select * from User" | 14,258,686 | 2 | 2 | 3,429 | 0 | java,python,cassandra,cql | I expect that when you are using CQL, the request is made with a consistency level of "ALL". In this case, it will wait for a reply from all the servers that host a replica of that data before returning. As one node is down, it fails because it cannot contact the down node.
When you are doing it through Cassandr... | 0 | 1 | 0 | 0 | 2013-01-10T04:06:00.000 | 5 | 0.07983 | false | 14,250,664 | 0 | 0 | 0 | 3 | Cassandra runs in a cluster of 3 nodes. When all nodes are "UP" and I run the CQL query “select * from User” in cqlsh, Cassandra returns the right result. But after a node dies, when I run the "select" again, no result is returned; it just reports: "Unable to complete request: one or more nodes were unavailable".
I turned to use cassandra-... |
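If the consistency-level explanation is right, you can check it from cqlsh by lowering the level explicitly before re-running the query (CONSISTENCY is a cqlsh shell command in recent versions; the table name follows the question):

```sql
-- Ask for a reply from a single replica instead of all of them
CONSISTENCY ONE;
SELECT * FROM "User";
```

With only one replica required, the query should succeed even with a node down, provided the surviving nodes hold a replica of the data.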
Twisted: ReconnectingClientFactory connection to different servers | 14,266,178 | 2 | 4 | 1,504 | 0 | python,python-2.7,twisted,failover | ReconnectingClientFactory doesn't have this capability. You can build your own factory which implements this kind of reconnection logic, mostly by hooking into the clientConnectionFailed factory method. When this is called, and the failure reason seems to you to justify switching servers (eg, twisted.internet.error.C... | 0 | 1 | 0 | 0 | 2013-01-10T10:08:00.000 | 2 | 0.197375 | false | 14,255,289 | 0 | 0 | 0 | 1 | I have a Twisted ReconnectingClientFactory and I can successfully connect to a given IP and port pair with this factory. And it works well.
reactor.connectTCP(ip, port, myHandsomeReconnectingClientFactory)
In this situation, when the server is gone, myHandsomeReconnectingClientFactory tries to connect to the same IP and por... |
API Versioning while maintaining git history | 14,271,577 | 2 | 1 | 605 | 0 | python,git,api,version | I think you want to use tags in your git repository. For each version of your api, use git tag vn and you don't need to maintain earlier versions of your files. You can access all files at a certain version just using git checkout vn.
If you use a remote repository, you need to use the flag --tags to send the tags to t... | 0 | 1 | 0 | 1 | 2013-01-11T04:03:00.000 | 1 | 0.379949 | false | 14,271,489 | 0 | 0 | 0 | 1 | I currently have a v1 API and have updated and created new scripts for v2. The API is consumed by other developers and consists of a bunch of scripts. Before migrating and adding v2 I want to make sure I have a successful versioning strategy to go ahead with.
Currently, there is a bash script called before using the A... |
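The tag-per-API-version workflow from the answer can be sketched in a scratch repository; the file name, tag names, and commit messages are illustrative:

```shell
# Scratch repo to demonstrate the tag-per-API-version workflow
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email demo@example.com
git config user.name demo

echo "v1 api" > api.py
git add api.py && git commit -qm "API v1"
git tag v1                            # freeze version 1

echo "v2 api" > api.py
git commit -qam "API v2"
git tag v2                            # freeze version 2

git checkout -q v1                    # files exactly as they were at v1
git push --tags 2>/dev/null || true   # on a real remote, --tags publishes the tags
```

After `git checkout v1`, consumers see the v1 scripts without you keeping duplicate copies in the tree.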
Which linux tool is best for parsing multiple files simultaneously | 14,271,786 | 2 | 0 | 217 | 0 | python,linux,perl,sed,awk | This thread is going to start a war on which is best :)
Since you know Python, you should definitely go with that. I myself have done a lot of text manipulation using Python, where everything else tends to become complex.
Even though awk can do what you need, you won't like what you see in the code. | 0 | 1 | 0 | 0 | 2013-01-11T04:25:00.000 | 2 | 0.197375 | false | 14,271,653 | 0 | 0 | 0 | 1 | I know a bit of sed, awk, and Python, but not Perl.
I need to parse hundreds of different files, find patterns, match multiple columns with each other, and put the results in new files.
And I have to do that on a regular basis.
I just want to know which tool will be best for that scenario.
Based on that I will buy the books and... |
How to start ipython notebook server at boot as daemon | 32,446,382 | 3 | 20 | 20,905 | 0 | python,ipython | I assume you don't want to run the program as root. So this is my modified version that runs as <username> (put in /etc/rc.local before the exit 0 line):
su <username> -c "/usr/bin/ipython notebook --no-browser --profile <profilename> &"
You can check to make sure your ipython is at that path with which ipython. Though... | 0 | 1 | 0 | 0 | 2013-01-12T20:39:00.000 | 3 | 0.197375 | false | 14,297,741 | 1 | 0 | 0 | 1 | I love ipython, especially the notebook feature. I currently keep a screen session running with the notebook process in it. How would I add ipython's notebook engine/webserver to my system's (CentOS5) startup procedures? |
Django-nonrel broke after installing new version of Google App Engine SDK | 14,368,275 | 1 | 0 | 191 | 1 | python,google-app-engine,django-nonrel | Did you update djangoappengine without updating django-nonrel and djangotoolbox?
While I haven't upgraded to GAE 1.7.4 yet, I'm running 1.7.2 with no problems. I suspect your problem is not related to the GAE SDK but rather that your django-nonrel installation has mismatching pieces. | 0 | 1 | 0 | 0 | 2013-01-13T20:03:00.000 | 2 | 0.099668 | false | 14,307,581 | 0 | 0 | 1 | 2 | I had GAE 1.4 installed on my local UBUNTU system and everything was working fine. The only warning I was getting at that time was something like "You are using old GAE SDK 1.4." So, to get rid of that, I did the following:
I removed the old version of GAE and installed GAE 1.7. Along with that I have
also changed my ... |
Django-nonrel broke after installing new version of Google App Engine SDK | 14,382,654 | 0 | 0 | 191 | 1 | python,google-app-engine,django-nonrel | Actually, I changed the Google App Engine path in the .bashrc file and restarted the system, which solved the issue. I think that because I was not restarting the system after the .bashrc changes, it was causing the problem. | 0 | 1 | 0 | 0 | 2013-01-13T20:03:00.000 | 2 | 1.2 | true | 14,307,581 | 0 | 0 | 1 | 2 | I had GAE 1.4 installed on my local UBUNTU system and everything was working fine. The only warning I was getting at that time was something like "You are using old GAE SDK 1.4." So, to get rid of that, I did the following:
I removed the old version of GAE and installed GAE 1.7. Along with that I have
also changed my ... |
How to tell if a file is being written to a Windows CIFS share from Linux | 14,489,863 | 1 | 9 | 3,260 | 0 | python,linux,share,cifs | What about this?:
Change the windows share to point to an actual Linux directory reserved for the purpose. Then, with simple Linux scripts, you can readily determine if any files there have any writers. Once there is a file not being written to, copy it to the windows folder—if that is where it needs to be. | 0 | 1 | 0 | 0 | 2013-01-14T06:12:00.000 | 2 | 0.099668 | false | 14,314,019 | 0 | 0 | 0 | 1 | I'm trying to write a script to take video files (ranging from several MB to several GB) written to a shared folder on a Windows server.
Ideally, the script will run on a Linux machine watching the Windows shared folder at an interval of something like every 15-120 seconds, and upload any files that have fully finished... |
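A hedged sketch of the "no writers" check the answer implies: sample a file's size and mtime twice and treat it as settled if nothing changed. The interval is illustrative, and a slow writer could still fool this heuristic:

```python
import os
import time

def looks_settled(path, interval=1.0):
    """Heuristic: a file is 'settled' if size and mtime don't change across an interval."""
    before = os.stat(path)
    time.sleep(interval)
    after = os.stat(path)
    return (before.st_size, before.st_mtime) == (after.st_size, after.st_mtime)
```

A watcher loop would call this for each candidate file and copy only the files that pass.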
sub domains in tornado web app for SAAS | 14,342,790 | 0 | 2 | 791 | 0 | python,webserver,subdomain,tornado,saas | Tornado itself does not handle subdomains.
You will need to use something like NGINX to control subdomain access. | 0 | 1 | 0 | 0 | 2013-01-15T09:03:00.000 | 2 | 0 | false | 14,334,222 | 0 | 0 | 1 | 2 | I have a web app which runs at www.mywebsite.com.
I am asking the user to register and enter a subdomain name for their login, e.g. if the user enters the subdomain "demo", then their login URL should be "www.demo.mywebsite.com".
How can this be done in a Tornado web app, given that Tornado is itself a web server?
Or serving the app with n... |
sub domains in tornado web app for SAAS | 14,419,302 | 3 | 2 | 791 | 0 | python,webserver,subdomain,tornado,saas | self.request.host under tornado.web.RequestHandler will contain the subdomain, so you can change application logic according to the subdomain, e.g. load current_user based on cookie + subdomain. | 0 | 1 | 0 | 0 | 2013-01-15T09:03:00.000 | 2 | 0.291313 | false | 14,334,222 | 0 | 0 | 1 | 2 | I have a web app which runs at www.mywebsite.com.
I am asking the user to register and enter a subdomain name for their login, e.g. if the user enters the subdomain "demo", then their login URL should be "www.demo.mywebsite.com".
How can this be done in a Tornado web app, given that Tornado is itself a web server?
Or serving the app with n... |
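A sketch of the idea in the second answer: extract the subdomain from the Host header and branch on it. The handler integration is Tornado-specific, but the parsing itself is plain Python; the base domain is illustrative:

```python
def subdomain_of(host, base_domain="mywebsite.com"):
    """Return the subdomain part of a Host header value, or None."""
    host = host.split(":")[0].lower()          # strip any port
    if host == base_domain or not host.endswith("." + base_domain):
        return None
    sub = host[: -len("." + base_domain)]
    if sub.startswith("www."):                 # tolerate a leading www.
        sub = sub[len("www."):]
    return sub or None

# Inside a tornado.web.RequestHandler you would call:
#   sub = subdomain_of(self.request.host)
```

The wildcard DNS entry (*.mywebsite.com) and the server routing still have to be set up separately, as the first answer notes.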
Choose from disks in Python? | 14,341,933 | 0 | 0 | 170 | 0 | python,unix | If your program is *nix-specific, I suppose your best bet is parsing the output of the mount command.
It gives you mount points, user names, and FS names. From those you could filter the mount points mounted by, or at least writable by, the current user, with the right FS on them (possibly vfat?). | 0 | 1 | 0 | 0 | 2013-01-15T16:03:00.000 | 2 | 0 | false | 14,341,737 | 0 | 0 | 0 | 1 | I'm writing a Python program that uses dd to write an OS image to a USB flash drive. Drives /dev/sda and /dev/sdb are mounted, in my case, with sdb being the flash drive I want to write to.
However, on someone else's system, the drive they want to write to might be /dev/sdc. How do I let the user choose what drive to w... |
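A sketch of the mount-parsing idea from the answer, run here against a captured sample of mount output rather than the live command; the device names and the vfat filter are illustrative:

```python
def parse_mount_output(text):
    """Parse `mount` output lines of the form 'DEV on MOUNTPOINT type FS (OPTS)'."""
    entries = []
    for line in text.splitlines():
        parts = line.split()
        # Expect: dev, 'on', mountpoint, 'type', fstype, (options)
        if len(parts) >= 5 and parts[1] == "on" and parts[3] == "type":
            entries.append({"device": parts[0], "mountpoint": parts[2], "fstype": parts[4]})
    return entries

sample = """/dev/sda1 on / type ext4 (rw,errors=remount-ro)
/dev/sdb1 on /media/usb type vfat (rw,nosuid,nodev)"""

# Filter to the vfat candidates a user might pick a USB stick from.
usb_candidates = [e for e in parse_mount_output(sample) if e["fstype"] == "vfat"]
```

In the real program you would feed it `subprocess.check_output(["mount"])` and present `usb_candidates` as a menu instead of hard-coding /dev/sdb.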
Best strategy for storing precomputed sunrise/sunset data? | 14,365,980 | 0 | 1 | 537 | 1 | python,google-app-engine,python-2.7 | I would say precompute those structures and output them into hardcoded python structures that you save in a generated python file.
Just read those structures into memory as part of your instance startup.
From your description, there's no reason to compute these values at runtime, and there's no reason to store it in th... | 0 | 1 | 0 | 0 | 2013-01-15T17:59:00.000 | 3 | 0 | false | 14,343,871 | 0 | 0 | 1 | 2 | I'm working on an NDB based Google App Engine application that needs to keep track of the day/night cycle of a large number (~2000) fixed locations. Because the latitude and longitude don't ever change, I can precompute them ahead of time using something like PyEphem. I'm using NDB. As I see it, the possible strateg... |
Best strategy for storing precomputed sunrise/sunset data? | 14,345,283 | 1 | 1 | 537 | 1 | python,google-app-engine,python-2.7 | For 2000 immutable data points - just calculate them when instance starts or on first use, then keep it in memory. This will be the cheapest and fastest. | 0 | 1 | 0 | 0 | 2013-01-15T17:59:00.000 | 3 | 0.066568 | false | 14,343,871 | 0 | 0 | 1 | 2 | I'm working on an NDB based Google App Engine application that needs to keep track of the day/night cycle of a large number (~2000) fixed locations. Because the latitude and longitude don't ever change, I can precompute them ahead of time using something like PyEphem. I'm using NDB. As I see it, the possible strateg... |
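A sketch of the compute-once-per-instance approach from the answers. The sunrise calculation here is a deterministic placeholder (a real app would use something like PyEphem), and the location list is illustrative:

```python
from functools import lru_cache

# ~2000 fixed (lat, lon) pairs would be listed here; two shown for illustration.
LOCATIONS = [(40.7128, -74.0060), (51.5074, -0.1278)]

@lru_cache(maxsize=None)
def day_night_table(lat, lon):
    """Placeholder for an expensive per-location computation (e.g. PyEphem).

    Computed once per location per instance, then served from memory.
    """
    # Fake deterministic "sunrise hour" so the sketch is self-contained.
    return {"sunrise_hour": round((lat + lon) % 24, 2)}

# Warm the cache at instance startup so requests never pay the cost.
for lat, lon in LOCATIONS:
    day_night_table(lat, lon)
```

Because the inputs never change, lru_cache makes every request after warm-up a dictionary lookup, with no datastore reads.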
Script to run at startup on Windows RT | 14,555,543 | 0 | 0 | 1,365 | 0 | python,windows,windows-8,startup,windows-rt | You can create a scheduled task to run on login or boot. The run registry key and the startup folder do not function on Windows RT, but the task scheduler does.
Off topic (since I seem unable to add a comment to the other answer): there has been a copy of Python ported over to Windows RT using the jailbreak. | 0 | 1 | 0 | 0 | 2013-01-16T15:02:00.000 | 2 | 0 | false | 14,361,331 | 0 | 0 | 0 | 2 | I have a python script which basically launches an XML-RPC server. I need this script to run always. I have a function that may call for the system to reboot itself. So once the system has rebooted, I need to get the script running again.
How can I add this to the Windows RT startup? |
Script to run at startup on Windows RT | 14,365,833 | 0 | 0 | 1,365 | 0 | python,windows,windows-8,startup,windows-rt | There is no way for a Windows Store app to trigger a system reboot, nor can it run a Python script unless you implement a Python interpreter inside your app. Windows Store apps run in their own sandbox and have very limited means of communication with the rest of the system. | 0 | 1 | 0 | 0 | 2013-01-16T15:02:00.000 | 2 | 0 | false | 14,361,331 | 0 | 0 | 0 | 2 | I have a python script which basically launches an XML-RPC server. I need this script to run always. I have a function that may call for the system to reboot itself. So once the system has rebooted, I need to get the script running again.
How can I add this to the Windows RT startup? |
Interactive console with NetBeans project modules on the classpath | 14,482,828 | 0 | 0 | 364 | 0 | python,netbeans,read-eval-print-loop | I had a similar problem.
Go to Tools -> Python Platforms and set the default platform to Python 2.7. This should cause Window -> Python Console to launch the correct version of Python.
As for then being able to import your custom modules... that's the problem I'm currently having too. | 0 | 1 | 0 | 0 | 2013-01-16T16:43:00.000 | 1 | 0 | false | 14,363,351 | 1 | 0 | 0 | 1 | I am trying to wrap my head around Python, while my brain works for Java and Scala, so please excuse if this question is ill-formulated.
I have managed to setup NetBeans 6.9 with Python 2.7 on OS X. I can compile and run my project, fine.
Now what I want is something equivalent to sbt's console command. I want to launc... |
Pytest and Python 3 | 59,968,198 | 5 | 61 | 65,185 | 0 | python,python-3.x,pytest | Install it with pip3:
pip3 install -U pytest | 0 | 1 | 0 | 1 | 2013-01-17T02:11:00.000 | 6 | 0.16514 | false | 14,371,156 | 0 | 0 | 0 | 3 | I've installed pytest 2.3.4 under Debian Linux. By default it runs under Python 2.7, but sometimes I'd like to run it under Python 3.x, which is also installed. I can't seem to find any instructions on how to do that.
The PyPI Trove classifiers show Python :: 3 so presumably it must be possible. Aside from py.test s... |
Pytest and Python 3 | 14,371,623 | 27 | 61 | 65,185 | 0 | python,python-3.x,pytest | python3 doesn't have the module py.test installed. If you can, install the python3-pytest package.
If you can't do that try this:
Install virtualenv
Create a virtualenv for python3
virtualenv --python=python3 env_name
Activate the virtualenv
source ./env_name/bin/activate
Install pytest
pip install pytest
Now... | 0 | 1 | 0 | 1 | 2013-01-17T02:11:00.000 | 6 | 1 | false | 14,371,156 | 0 | 0 | 0 | 3 | I've installed pytest 2.3.4 under Debian Linux. By default it runs under Python 2.7, but sometimes I'd like to run it under Python 3.x, which is also installed. I can't seem to find any instructions on how to do that.
The PyPI Trove classifiers show Python :: 3 so presumably it must be possible. Aside from py.test s... |
Pytest and Python 3 | 14,371,849 | 69 | 61 | 65,185 | 0 | python,python-3.x,pytest | I found a workaround:
Installed python3-pip using aptitude, which created /usr/bin/pip-3.2.
Next pip-3.2 install pytest which re-installed pytest, but under a python3.2 path.
Then I was able to use python3 -m pytest somedir/sometest.py.
Not as convenient as running py.test directly, but workable. | 0 | 1 | 0 | 1 | 2013-01-17T02:11:00.000 | 6 | 1.2 | true | 14,371,156 | 0 | 0 | 0 | 3 | I've installed pytest 2.3.4 under Debian Linux. By default it runs under Python 2.7, but sometimes I'd like to run it under Python 3.x, which is also installed. I can't seem to find any instructions on how to do that.
The PyPI Trove classifiers show Python :: 3 so presumably it must be possible. Aside from py.test s... |
Google App Engine Instances keep quickly shutting down | 14,377,741 | 4 | 8 | 2,656 | 0 | python,google-app-engine,python-2.7 | My solution to this was to increase the Pending Latency time.
If a webpage fires 3 ajax requests at once, AppEngine was launching new instances for the additional requests. After configuring the Minimum Pending Latency time - setting it to 2.5 secs, the same instance was processing all three requests and throughput wa... | 0 | 1 | 0 | 0 | 2013-01-17T03:47:00.000 | 3 | 1.2 | true | 14,371,920 | 0 | 0 | 1 | 3 | So I've been using app engine for quite some time now with no issues. I'm aware that if the app hasn't been hit by a visitor for a while then the instance will shut down, and the first visitor to hit the site will have a few second delay while a new instance fires up.
However, recently it seems that the instances only ... |
Google App Engine Instances keep quickly shutting down | 14,416,358 | 1 | 8 | 2,656 | 0 | python,google-app-engine,python-2.7 | 1 idle instance means that app-engine will always fire up an extra instance for the next user that comes along - that's why you are seeing an extra instance fired up with that setting.
If you remove the idle instance setting (or use the default) and just increase pending latency it should "wait" before firing the extr... | 0 | 1 | 0 | 0 | 2013-01-17T03:47:00.000 | 3 | 0.066568 | false | 14,371,920 | 0 | 0 | 1 | 3 | So I've been using app engine for quite some time now with no issues. I'm aware that if the app hasn't been hit by a visitor for a while then the instance will shut down, and the first visitor to hit the site will have a few second delay while a new instance fires up.
However, recently it seems that the instances only ... |
Google App Engine Instances keep quickly shutting down | 14,742,365 | 0 | 8 | 2,656 | 0 | python,google-app-engine,python-2.7 | I only started having this type of issue on Monday February 4 around 10 pm EST, and is continuing until now. I first started noticing then that instances kept firing up and shutting down, and latency increased dramatically. It seemed that the instance scheduler was turning off idle instances too rapidly, and causing su... | 0 | 1 | 0 | 0 | 2013-01-17T03:47:00.000 | 3 | 0 | false | 14,371,920 | 0 | 0 | 1 | 3 | So I've been using app engine for quite some time now with no issues. I'm aware that if the app hasn't been hit by a visitor for a while then the instance will shut down, and the first visitor to hit the site will have a few second delay while a new instance fires up.
However, recently it seems that the instances only ... |
Compile Python 2.7.3 on Linux for Embedding into a C++ app | 14,390,969 | 1 | 0 | 667 | 0 | c++,python,gcc,g++,redhat | You want to link to the python static library, which should get created by default and will be called libpython2.7.a
If I recall correctly, as long as you don't build Python with --enable-shared it doesn't install the dynamic library, so you'll only get the static lib and so simply linking your C++ application with -l... | 0 | 1 | 0 | 1 | 2013-01-17T09:02:00.000 | 1 | 1.2 | true | 14,375,397 | 0 | 0 | 0 | 1 | I have a C++ application from Windows that I wish to port across to run on a Red Hat Linux system. This application embeds a slightly modified version of Python 2.7.3 (I added the Py_SetPath command as it is essential for my use case) so I definitely need to compile the Python source.
My problem is that despite lookin... |
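The build-and-link flow the accepted answer describes might look roughly like this; the prefix, include path, and the extra -l flags are illustrative and vary by platform and build options:

```shell
# Build Python without --enable-shared so libpython2.7.a is produced
./configure --prefix=/opt/python2.7
make && make install

# Link the embedding C++ application against the static library
g++ app.cpp -I/opt/python2.7/include/python2.7 \
    /opt/python2.7/lib/python2.7/config/libpython2.7.a \
    -lpthread -ldl -lutil -lm -o app
```

Passing the .a file directly (rather than -lpython2.7) guarantees the static library is used even if a shared one exists somewhere on the link path.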
WSGI apps with python 2 and python 3 on the same server? | 14,375,870 | -1 | 5 | 1,947 | 0 | python,apache,wsgi | It's quite possible. This is what virtualenv is all about. Set up the second app in a virtualenv with python3.
You can add it in a VirtualHost configuration in Apache. | 0 | 1 | 0 | 0 | 2013-01-17T09:09:00.000 | 2 | -0.099668 | false | 14,375,520 | 0 | 0 | 1 | 1 | I already have a web application written in Python 2 that runs over WSGI (specifically, OpenERP web server).
I would like to write a new web application that would run on the same server (Apache 2 on Ubuntu), but using WSGI and Python 3. The two applications would be on different ports.
Is that possible? |
Pydev: How to import a gae project to eclipse Pydev gae project? | 14,387,118 | 0 | 1 | 712 | 0 | python,google-app-engine,pydev | If you want to use Eclipse's Import feature, go with General -> File system. | 0 | 1 | 0 | 0 | 2013-01-17T15:58:00.000 | 3 | 0 | false | 14,383,025 | 0 | 0 | 1 | 2 | Created a gae project with the googleappengine launch and have been building it with textmate.
Now, I'd like to import it to the Eclipse PyDev GAE project. Tried to import it, but it doesn't work.
Anyone know how to do that?
Thanks in advance. |
Pydev: How to import a gae project to eclipse Pydev gae project? | 14,383,720 | 2 | 1 | 712 | 0 | python,google-app-engine,pydev | You could try not using the eclipse import feature. Within Eclipse, create a new PyDev GAE project, and then you can copy in your existing files. | 0 | 1 | 0 | 0 | 2013-01-17T15:58:00.000 | 3 | 1.2 | true | 14,383,025 | 0 | 0 | 1 | 2 | Created a gae project with the googleappengine launch and have been building it with textmate.
Now, I'd like to import it to the Eclipse PyDev GAE project. Tried to import it, but it doesn't work.
Anyone know how to do that?
Thanks in advance. |
How to delete or reset a search index in Appengine | 14,390,379 | 7 | 5 | 2,747 | 0 | python,google-app-engine,full-text-search | If you empty out your index and call index.delete_schema() (index.deleteSchema() in Java) it will clear the mappings that we have from field name to type, and you can index your new documents as expected. Thanks! | 0 | 1 | 0 | 0 | 2013-01-17T21:14:00.000 | 1 | 1.2 | true | 14,388,251 | 0 | 0 | 1 | 1 | The Situation
Alright, so we have our app in appengine with full text search activated. We had an index set on a document with a field named 'date'. This field is a DateField and now we changed the model of the document so the field 'date' is now a NumericField.
The problem is, on the production server, even if I clea... |
How to implement "last command" function in a console based python program | 14,399,306 | 1 | 0 | 195 | 0 | python,windows | What you can do is apply some sort of shell-history functionality: every command issued by the user is placed in a list, and then you implement a special command of your console, say 'history', that prints out the list for the user in the order it was filled in, with increasing number ... | 0 | 1 | 0 | 0 | 2013-01-18T12:28:00.000 | 2 | 0.099668 | false | 14,398,980 | 1 | 0 | 0 | 1 | So I am working on a console-based Python (Python 3, actually) program where I use input(">") to get the command from the user.
Now I want to implement the "last command" function in my program - when users press the up arrow on the keyboard they can see their last command.
After some research I found I can use curses lib to i... |
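On Unix-like systems the stdlib readline module already provides the up-arrow recall the question asks for, as an alternative to curses; a minimal sketch (the numbering format is my own):

```python
import readline

def remember(command):
    """Add a command to the in-process history (up arrow will recall it)."""
    if command.strip():
        readline.add_history(command)

def show_history():
    """Return numbered history lines, like a shell's `history` builtin."""
    n = readline.get_current_history_length()
    return ["%d  %s" % (i, readline.get_history_item(i)) for i in range(1, n + 1)]

# Once readline is imported, input(">") records entered lines automatically;
# remember() is for adding entries manually.
```

On Windows, where the question is set, readline is not in the stdlib, so a third-party replacement such as pyreadline would be needed instead.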
clang error when installing MySQL-python on Mountain Lion (Mac OS X 10.8) | 14,399,388 | 0 | 1 | 506 | 1 | python,mysql,django,pip,mysql-python | At first glance it looks like a damaged pip package. Have you tried easy_install with the same package instead? | 0 | 1 | 0 | 0 | 2013-01-18T12:41:00.000 | 1 | 0 | false | 14,399,223 | 0 | 0 | 0 | 1 | When I try installing mysql-python using the command below,
macbook-user$ sudo pip install MYSQL-python
I get these messages:
/System/Library/Frameworks/Python.framework/Versions/2.7/include/python2.7/pyconfig.h:891:1: warning: this is the location of the previous definition
/usr/bin/lipo: /tmp/_mysql-LtlmLe.o and /tmp/_m... |
Whats are the options to transform a event-driven tornado application into a threaded or otherwise blocking system? | 14,434,691 | 1 | 0 | 218 | 0 | python,tornado | Use Tornado to just receive non-blocking requests. To do the actual XML processing you can then spawn another process or use an async task processor like celery. Using celery would facilitate easy scaling of your system in future. In fact with this model you'll just need one Tornado instance.
@Eren - I don't think that... | 0 | 1 | 0 | 0 | 2013-01-18T12:44:00.000 | 2 | 0.099668 | false | 14,399,278 | 0 | 0 | 0 | 1 | I have inherited a rather large code base that utilizes tornado to compute and serve big and complex data-types (imagine a 1 MB XML file). Currently there are 8 instances of tornado running to compute and serve this data.
That was a wrong design-decision from the start and I am facing many many timeouts from applicati... |
changing GAE db.Model schema on dev_server with ListProperties? BadValueError | 14,401,694 | 0 | 3 | 94 | 0 | python,google-app-engine,google-cloud-datastore | here's my workaround to get it working on dev_server:
1) update your model in production and deploy it
2) use appcfg.py download_data and grab all entities of the type you've updated
3) use appcfg.py upload_data and push all the entities into your local datastore
voila.. your local datastore entities can now be retriev... | 0 | 1 | 0 | 0 | 2013-01-18T13:11:00.000 | 1 | 1.2 | true | 14,399,722 | 0 | 0 | 1 | 1 | My understanding of changing db.Model schemas is that it 'doesn't matter' if you add a property and then try and fetch old entities without that property.
Indeed, adding the following property to my SiteUser db.Model running on dev_server:
category_subscriptions = db.StringProperty()
Still allows me to retrieve an old ... |
Django Gunicorn Long Polling | 14,403,008 | 2 | 0 | 1,026 | 0 | python,django,event-handling,gunicorn | Gunicorn by default will spawn regular synchronous WSGI processes. You can however tell it to spawn processes that use gevent, eventlet or tornado instead. I am only familiar with gevent which can certainly be used instead of Node.js for long polling.
The memory footprint per process is about the same for mod_wsgi and ... | 0 | 1 | 0 | 0 | 2013-01-18T13:28:00.000 | 1 | 1.2 | true | 14,399,958 | 0 | 0 | 1 | 1 | Is using Django with gunicorn considered a replacement for using evented/async servers like Tornado, Node.js, and similar? Additionally, will that be helpful in handling long-polling/comet services?
Finally, is Gunicorn only replacing the memory-consuming Apache threads (in the case of Apache/mod_wsgi) with lig... |
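The async-worker setup from the answer reduces to a single gunicorn invocation; the module path and worker count are illustrative, and the gevent package must be installed alongside gunicorn:

```shell
# Spawn 4 gevent-based workers instead of the default sync workers
gunicorn --worker-class gevent --workers 4 myproject.wsgi:application
```

With sync workers each long-poll holds a whole worker; with gevent workers each connection is a lightweight greenlet, which is what makes long polling practical here.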
Multiple Python installations in System Path | 17,404,756 | 0 | 2 | 1,186 | 0 | python,windows,command-line,path,executable | I use ixe013's junction approach. The one issue I have had is that Enthought's enpkg installer doesn't "read" the symbolic junction... I have lost the details, but it broke the symbolic link and then claimed the installation directory was empty...
So if you are using ixe013's approach with Enthought, I recommend the foll... | 0 | 1 | 0 | 0 | 2013-01-18T18:43:00.000 | 3 | 0 | false | 14,405,520 | 1 | 0 | 0 | 1 | I have two Python installations on my Windows 7 64bit workstation. I have 32bit Python 2.7 and 64bit Python 2.7. Each installation is required by specific applications. I currently have only the 32bit Python installation in my system path. However, I would like to add the 64bit version to the path as well.
Right no... |
Python .pyc files and Windows UAC | 14,410,362 | 0 | 2 | 1,244 | 0 | python,installation,uac | Re-reading the original question, there's a much simpler answer that probably fits your requirements.
I don't know much about Inno, but most installers give you a way to run an arbitrary command as a post-copy step.
So, you can just use python -m compileall to create the .pyc files for you at install time—while you've ... | 0 | 1 | 0 | 0 | 2013-01-18T21:16:00.000 | 3 | 0 | false | 14,407,779 | 1 | 0 | 0 | 1 | I'm working on an Inno Setup installer for a Python application for Windows 7, and I have these requirements:
The app shouldn't write anything to the installation directory
It should be able to use .pyc files
The app shouldn't require a specific Python version, so I can't just add a set of .pyc files to the installer
... |
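The post-copy compile step can also be driven from the stdlib compileall module rather than the command line; the throwaway directory here stands in for the Inno installation directory:

```python
import compileall
import glob
import os
import tempfile

# Throwaway "installation directory" with one module in it.
install_dir = tempfile.mkdtemp()
with open(os.path.join(install_dir, "app.py"), "w") as f:
    f.write("VALUE = 42\n")

# Equivalent of running `python -m compileall <dir>` as an installer post-step.
ok = compileall.compile_dir(install_dir, quiet=1)

# Python 3 writes the .pyc files into __pycache__ next to the source.
pyc_files = glob.glob(os.path.join(install_dir, "__pycache__", "*.pyc"))
```

Running the step at install time means compilation happens under the installer's (elevated) privileges, sidestepping the UAC write-permission issue at runtime.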
Python getting Itunes song | 14,410,871 | 0 | 0 | 2,056 | 0 | python,applescript,itunes | You can set up a simple Automator workflow to retrieve the current iTunes song. Try these two actions for starters:
iTunes: Get the Current Song
Utilities: Run Shell Script
Change the shell script to cat > ~/itunes_track.txt and you should have a text file containing the path of the current track. Once you get your ... | 0 | 1 | 0 | 1 | 2013-01-19T03:21:00.000 | 2 | 0 | false | 14,410,771 | 0 | 0 | 0 | 1 | I have been doing a little bit of research and haven't found anything that is quite going to work. I want to have python know what the current song playing in iTunes is so I can serially send it to my Arduino.
I have seen appscript, but it is no longer supported and, from what I have read, full of a few bugs now that it h... |
Piped std output with notification Python | 14,417,574 | 0 | 1 | 133 | 0 | python,notifications,pipe | You can generate an additional file somewhere in which you record the status of your application. On UNIX platforms users can tail it in another console. Sending e-mails with the status is another option. | 0 | 1 | 0 | 0 | 2013-01-19T18:41:00.000 | 2 | 0 | false | 14,417,535 | 0 | 0 | 0 | 1 | I'm playing around with Python and have written an application that generates a lot of text which it prints to stdout. (I run the application in a console (bash) and don't want to use a GUI or external tool.)
The process takes up a lot of time and I can calculate how much of it has already finished.
The output of the a... |
echoprint - stopping the service Solr, I lose the database | 14,444,869 | 2 | 0 | 696 | 0 | python,solr,audio-fingerprinting,echoprint | Well, I found my mistake, and it was with the ttserver. Thanks Alexandre for that information. The right way to make it work would be this:
/usr/local/tokyotyrant-1.1.33/bin/ttserver casket.tch
there you indicate the name of the on-disk hash, which makes it persistent. Then start Solr normally, and I can enter and view songs without pro... | 0 | 1 | 0 | 0 | 2013-01-20T17:28:00.000 | 1 | 0.379949 | false | 14,427,160 | 0 | 0 | 1 | 1 | How can I stop the Solr and tt services correctly? What I do is restart the PC and then bring the services back up, but when I validate a song I get a message as if the database had been damaged. I wonder what the right way is to shut down the service so that I can run it and test songs afterwards without damaging the database. Greetin... |
Child script execution in crontab | 17,640,694 | 0 | 1 | 147 | 0 | python,shell,cron | If you're having problems it's a good idea to use full qualified paths to commands in any script that's being called from cron, so as to avoid PATH and environment variable issues with the bare-bones environment that cron is called in. | 0 | 1 | 0 | 1 | 2013-01-20T18:00:00.000 | 1 | 0 | false | 14,427,475 | 0 | 0 | 0 | 1 | I have a shell script which launches many different python scripts.
The shell script exports many variables, which are in turn used by the python scripts.
This is working perfectly when run in command line, but it does not work when executed in crontab.
In the cron logs, I could see the shell script working, but the py... |
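Because cron runs jobs with a near-empty environment, the exported variables may simply be absent when the Python scripts start. A defensive sketch (the variable name is illustrative):

```python
import os

def get_setting(name, default):
    """Read an environment variable with an explicit fallback,
    instead of assuming the wrapper shell script exported it."""
    value = os.environ.get(name)
    return default if value is None else value

data_dir = get_setting("APP_DATA_DIR", "/var/tmp/app")
```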
Python Celery - lookup task by pid | 14,453,045 | 4 | 2 | 2,103 | 0 | python,celery,amqp | I'm going to make the assumption that by 'task' you mean 'worker'. The question would make little sense otherwise.
For some context it's important to understand the process hierarchy of Celery worker pools. A worker pool is a group of worker processes (or threads) that share the same configuration (process messages o... | 0 | 1 | 0 | 0 | 2013-01-21T17:26:00.000 | 1 | 1.2 | true | 14,444,015 | 0 | 0 | 0 | 1 | A pretty straightforward question, maybe -
I often see a celery task process running on my system that I cannot find when I use celery.task.control.inspect()'s active() method. Often this process will be running for hours, and I worry that it's a zombie of some sort. Usually it's using up a lot of memory, too.
Is ther... |
How do I make a twisted task send data down a cilent's connection? | 14,470,835 | 2 | 1 | 96 | 0 | python,asynchronous,network-programming,twisted | No. The tasks are not run in different threads; one might say that's the whole point of using Twisted in the first place. You are intended to be able to pass references to your protocol objects wherever you need them. | 0 | 1 | 0 | 0 | 2013-01-22T23:34:00.000 | 1 | 1.2 | true | 14,470,000 | 0 | 0 | 0 | 1 | If I create a looping task using twisted.internet.task.LoopingCall, how can that task safely access a client's twisted connection and send data it collects? Are the tasks run in separate threads meaning it might not be safe to send data from the task itself? Can I pass the task a reference to the client (instance of ... |
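The answer's point, sketched without Twisted: because a LoopingCall runs in the reactor thread, the periodic callable can safely hold a direct reference to the protocol object. With Twisted you would write `task.LoopingCall(send_stats, proto).start(interval)`; the FakeProtocol and loop below are stand-ins for illustration only:

```python
class FakeProtocol:
    def __init__(self):
        self.sent = []

    def send(self, data):
        self.sent.append(data)

def send_stats(proto, state={"ticks": 0}):
    # Collect data and push it down the client's connection.
    state["ticks"] += 1
    proto.send("tick %d" % state["ticks"])

proto = FakeProtocol()
for _ in range(3):  # stand-in for the reactor invoking the LoopingCall
    send_stats(proto)
```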
Using a celery task to create a new process | 14,514,760 | 2 | 2 | 2,089 | 0 | python,daemon,celery | Run the Celery daemons under a supervisor, such as supervisord. When the celeryd process dies, the supervisor will spin it back up. | 0 | 1 | 0 | 0 | 2013-01-23T18:25:00.000 | 2 | 0.197375 | false | 14,486,696 | 0 | 0 | 0 | 1 | In Celery, the only way to conclusively reload modules is to restart all of your celery processes. I know how to remotely shutdown workers (broadcast('shutdown', destination = [<workers>])), but not how to bring them back up.
I have a piece of Python code that works to create a daemon process with a new worker in it... |
Are pipes in each process independent? | 14,509,236 | 0 | 1 | 104 | 0 | python,subprocess,pipe | I don't know about Python, but as far as C is concerned, a pipe is not independent for each process.
Pipes are for the sole purpose of communicating between the parent and the child processes, or even between the child processes themselves.
The data written in a pipe by a particular process can be read by another process fr... | 0 | 1 | 0 | 0 | 2013-01-24T19:28:00.000 | 2 | 0 | false | 14,509,190 | 1 | 0 | 0 | 1 | If I generate multiple subprocess.Popen(['commands', 'that', 'I', 'called']) and for each I do stdin.write(..) or p.communicate(...)to interact with the commands, is it guarantee to be independent and will come back to the each process (stdout from the called command)? |
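The independence of subprocess pipes is easy to demonstrate: each Popen object carries its own stdin/stdout pipe pair, so output from one child never leaks into another.

```python
import subprocess
import sys

# Each child just echoes its stdin back to stdout.
child_src = "import sys; sys.stdout.write(sys.stdin.read())"

p1 = subprocess.Popen([sys.executable, "-c", child_src],
                      stdin=subprocess.PIPE, stdout=subprocess.PIPE)
p2 = subprocess.Popen([sys.executable, "-c", child_src],
                      stdin=subprocess.PIPE, stdout=subprocess.PIPE)

out1, _ = p1.communicate(b"hello from p1")
out2, _ = p2.communicate(b"hello from p2")
```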
Using celery to output to files | 14,516,799 | 1 | 5 | 2,459 | 0 | python,multithreading,celery | If I didn't want to have a separate service (outside of Celery) sync to file, the way I would accomplish this in Celery is to bind a single worker pool (a pool with concurrency 1) to a specific queue that accepts only file write tasks. This way tasks run serially and there won't be any race conditions while writing.
Alte... | 0 | 1 | 0 | 0 | 2013-01-25T06:20:00.000 | 2 | 0.099668 | false | 14,516,447 | 0 | 0 | 0 | 2 | I am trying to use Celery to output to multiple files. The task is very simple:
Get some data along with a file path
Append that data to the file path (and create the file if it doesn't exist)
I do not want to open/close the file handle each time, since I would write to the same file in many cases. So I made a simple... |
Using celery to output to files | 14,516,523 | 4 | 5 | 2,459 | 0 | python,multithreading,celery | The pool does not seem to be shared by the celery threads. Ideal way to do it is to assign a single process the task of writing to files and all the celery threads should write to that process via queue. | 0 | 1 | 0 | 0 | 2013-01-25T06:20:00.000 | 2 | 0.379949 | false | 14,516,447 | 0 | 0 | 0 | 2 | I am trying to use Celery to output to multiple files. The task is very simple:
Get some data along with a file path
Append that data to the file path (and create the file if it doesn't exist)
I do not want to open/close the file handle each time, since I would write to the same file in many cases. So I made a simple... |
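The single-writer suggestion can be sketched like this: route every write through one dedicated consumer so concurrent tasks never race on a shared file handle. Shown with a thread and a queue (a separate process works the same way); the dict stands in for real file appends.

```python
import queue
import threading

write_queue = queue.Queue()
written = {}

def writer():
    # Sole consumer: serializes all writes, so no race conditions.
    while True:
        item = write_queue.get()
        if item is None:  # sentinel: shut the writer down
            break
        path, data = item
        written.setdefault(path, []).append(data)  # stand-in for file append

t = threading.Thread(target=writer)
t.start()
for i in range(5):
    write_queue.put(("/tmp/out.log", "line %d" % i))
write_queue.put(None)
t.join()
```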
How to auto resize a window when opening a program. | 14,524,064 | 1 | 0 | 869 | 0 | python,python-idle | Try this :
Options > Configure IDLE > General tab > Initial Window size | 0 | 1 | 0 | 0 | 2013-01-25T14:36:00.000 | 1 | 1.2 | true | 14,523,959 | 1 | 0 | 0 | 1 | I'm sorry if this is really basic but: when I open the Python IDLE editor, the shell opens, then I have to open the editor from the shell. Is there any way I can have the shell and editor automatically resize to a specific size upon opening? Can I use a batch file perhaps? |
Celery Worker Database Connection Pooling | 14,526,700 | 2 | 47 | 24,474 | 1 | python,postgresql,connection-pooling,celery | You can override the default behavior to have threaded workers instead of a worker per process in your celery config:
CELERYD_POOL = "celery.concurrency.threads.TaskPool"
Then you can store the shared pool instance on your task instance and reference it from each threaded task invocation. | 0 | 1 | 0 | 0 | 2013-01-25T16:38:00.000 | 6 | 0.066568 | false | 14,526,249 | 0 | 0 | 0 | 2 | I am using Celery standalone (not within Django). I am planning to have one worker task type running on multiple physical machines. The task does the following
Accept an XML document.
Transform it.
Make multiple database reads and writes.
I'm using PostgreSQL, but this would apply equally to other store types that ... |
Celery Worker Database Connection Pooling | 14,549,811 | 3 | 47 | 24,474 | 1 | python,postgresql,connection-pooling,celery | Have one DB connection per worker process. Since celery itself maintains a pool of worker processes, your db connections will always be equal to the number of celery workers.
Flip side, sort of, it will tie up db connection pooling to celery worker process management. But that should be fine given that GIL allows only... | 0 | 1 | 0 | 0 | 2013-01-25T16:38:00.000 | 6 | 0.099668 | false | 14,526,249 | 0 | 0 | 0 | 2 | I am using Celery standalone (not within Django). I am planning to have one worker task type running on multiple physical machines. The task does the following
Accept an XML document.
Transform it.
Make multiple database reads and writes.
I'm using PostgreSQL, but this would apply equally to other store types that ... |
where I host apps developed using tornado webserver | 14,639,967 | 1 | 2 | 1,996 | 0 | python,google-app-engine,webserver,hosting,tornado | At Heroku, the WebSockets protocol is not yet supported on the Cedar stack. | 0 | 1 | 0 | 0 | 2013-01-28T06:42:00.000 | 3 | 0.066568 | false | 14,556,744 | 0 | 0 | 1 | 1 | Is there any hosting service for simple apps developed using Tornado (like hosting on Google App Engine)? Is it possible to host it on Google App Engine? The app just manages some student data (adding, removing, searching, etc.). I developed it using Python.
Thanks in advance |
Python and ControlDesk interaction | 14,650,472 | 1 | 1 | 3,445 | 0 | python,matlab,simulink | Yes, it is quite possible. You should take a look at "Real-Time Testing" document which you can find in your dSPACE installation directory. | 0 | 1 | 0 | 0 | 2013-01-29T15:18:00.000 | 1 | 0.197375 | false | 14,586,171 | 0 | 0 | 0 | 1 | I hope someone can help us.
We are using a dSpace 1103 out of Simulink/Matlab and ControlDesk.
What I would like to know is, is it possible to use python in ControlDesk to transfer data into the dSpace from network? I mean, write an UDP Listener in python and use that script to update variables inside the Simulink/Matl... |
How to call .ksh file as part of Unix command through ssh in Python | 18,859,497 | 0 | 0 | 1,343 | 0 | python,unix,ssh,autosys | After reading your comment on the first answer, you might want to create a bash script with bash path as the interpreter line and then the autosys commands.
This will create a bash shell and run the commands from the script in the shell.
Again, if you are using autosys commands in the shell you better set autosys envi... | 0 | 1 | 0 | 1 | 2013-01-29T16:08:00.000 | 2 | 0 | false | 14,587,135 | 0 | 0 | 0 | 1 | I would like to achieve the following things:
A given file contains a job list which I need to execute one by one on a remote server using SSH APIs, storing the results.
When I try to call the following command directly on remote server using putty it executes successfully but when I try to execute it through python SSH pro... |
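One way to apply the answer from Python is to compose the remote command so the login shell sources the Autosys environment before invoking the .ksh job. The profile path and job script below are illustrative placeholders:

```python
def remote_command(env_script, job_script, *args):
    """Build a one-liner that sources the environment, then runs the job."""
    arg_str = " ".join(args)
    return ". %s && ksh %s %s" % (env_script, job_script, arg_str)

cmd = remote_command("/opt/autosys/autosys.env", "/home/user/run_jobs.ksh", "JOB1")
# Then run it over SSH, e.g. subprocess.call(["ssh", "host", cmd])
```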
How can Linux program, e.g. bash or python script, know how it was started: from command line or interactive GUI? | 14,592,451 | 4 | 4 | 788 | 0 | python,linux,bash,user-interface,command-line-interface | It can check the value of $DISPLAY to see whether or not it's running under X11, and $(tty) to see whether it's running on an interactive terminal. if [[ $DISPLAY ]] && ! tty; then chances are good you'd want to display a GUI popup. | 0 | 1 | 0 | 1 | 2013-01-29T21:12:00.000 | 5 | 0.158649 | false | 14,592,390 | 0 | 0 | 0 | 1 | I want to do the following:
If the bash/python script is launched from a terminal, it shall do something such as printing an error message text. If the script is launched from GUI session like double-clicking from a file browser, it shall do something else, e.g. display a GUI message box. |
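The checks described in the answer translate directly to Python: a TTY on stdout means an interactive terminal, and $DISPLAY being set means an X11 session is available for a GUI message box.

```python
import os
import sys

def launched_from_terminal():
    return sys.stdout.isatty()

def x11_available():
    return bool(os.environ.get("DISPLAY"))

if launched_from_terminal():
    mode = "cli"       # print the error message as plain text
elif x11_available():
    mode = "gui"       # show a GUI message box instead
else:
    mode = "headless"  # e.g. log to a file
```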
IDLE no longer runs any script on pressing F5 | 62,436,025 | 1 | 4 | 7,570 | 0 | python,python-idle | I think your function keys are locked.
Function keys can be unlocked with Fn + Esc.
Then f5 will work without any issue. | 0 | 1 | 0 | 1 | 2013-01-29T21:44:00.000 | 2 | 0.099668 | false | 14,592,879 | 0 | 0 | 0 | 2 | I cannot run any script by pressing F5 or selecting run from the menus in IDLE. It stopped working suddenly. No errors are coughed up. IDLE simply does nothing at all.
Tried reinstalling python to no effect.
Cannot run even the simplest script.
Thank you for any help or suggestions you have.
Running Python 2.6.5 on win... |
IDLE no longer runs any script on pressing F5 | 48,695,999 | 1 | 4 | 7,570 | 0 | python,python-idle | I am using a Dell laptop, and ran into this issue. I found that if I pressed Function + F5, the program would run.
On my laptop keyboard, functions key items are in blue (main functions in white). The Esc (escape) key has a blue lock with 'Fn' on it. I pressed Esc + F5, and it unlocked my function keys. I can now r... | 0 | 1 | 0 | 1 | 2013-01-29T21:44:00.000 | 2 | 0.099668 | false | 14,592,879 | 0 | 0 | 0 | 2 | I cannot run any script by pressing F5 or selecting run from the menus in IDLE. It stopped working suddenly. No errors are coughed up. IDLE simply does nothing at all.
Tried reinstalling python to no effect.
Cannot run even the simplest script.
Thank you for any help or suggestions you have.
Running Python 2.6.5 on win... |
How to run a python background process periodically | 14,614,330 | 2 | 2 | 2,035 | 0 | python,unix,cron | Have the program run every 5 hours -- I'm not to familiar with
system-level timing operations.
On *nix, cron is the default solution to accomplish this
Have the program efficiently run in the background -- I want these
'updates' to occur without the user knowing.
Using cron the program will be run in the backgrou... | 0 | 1 | 0 | 0 | 2013-01-30T21:32:00.000 | 2 | 0.197375 | false | 14,614,196 | 0 | 0 | 0 | 1 | I have a fairly light script that I want to run periodically in the background every 5 hours or so. The script runs through a few different websites, scans them for new material, and either grabs .mp3 files from them or likes songs on youtube based on their content. There are a few things I want to achieve with this pr... |
Reading a windows file without preventing another process from writing to it | 14,619,854 | 0 | 2 | 1,040 | 0 | python,windows,delphi,winapi,filesystems | You can setup a filter driver which can act in two ways: (1) modify the flags when the file is opened, and (2) it can capture the data when it's written to the file and save a copy of the data elsewhere.
This approach is much more lightweight and efficient than volume shadow copy service, mentioned in comments, howeve... | 0 | 1 | 0 | 0 | 2013-01-31T03:31:00.000 | 2 | 0 | false | 14,617,983 | 0 | 0 | 0 | 1 | I have a file that I want to read. The file may at any time be overwritten by another process. I do not want to block that writing. I am prepared to manage corruption to the data that I read, but do not want my reading to be in any way change the behaviour of the writing process.
The process that is writing the file is... |
get bitbucket commit message for each push | 14,627,911 | 0 | 0 | 258 | 0 | python,post,push-notification,bitbucket,githooks | Select the administration menu for the repository (the gear symbol), then Services. There you can set up integration with external services, such as email or twitter. | 0 | 1 | 0 | 1 | 2013-01-31T11:16:00.000 | 1 | 0 | false | 14,624,421 | 0 | 0 | 0 | 1 | I want to fetch the commit message to my bitbucket repository each time a user is doing any push operation.
How can I do that?
I am in a development environment. So is there any way I can post to localhost/someurl for each commit to my repository?
Else suggest other ways by which I can achieve this.
Thanks in adva... |
Async URL Fetch and Memcache on Appengine | 14,630,265 | 0 | 1 | 150 | 0 | python,google-app-engine | No, there is no automated way where async Url Fetch would store data automatically to memcache on completion. You have to do it in your code, but this defeats what you are trying to do.
Also remember that memcache is volatile and its content can be purged at any time. | 0 | 1 | 0 | 0 | 2013-01-31T13:50:00.000 | 1 | 0 | false | 14,627,334 | 0 | 0 | 1 | 1 | Is it possible to make an async URL fetch on App Engine and to store the rpc object in the memcache?
What I try to do is to start the asynch url fetch within a task, but I don't want the task to wait until the fetch has finished.
Therefore I thought I would just write it to memcache and access it later from outside the ta...
Unit testing GAE Blobstore (with nose) | 29,110,829 | 0 | 1 | 123 | 0 | python,google-app-engine | I had the same question so I dug into the nosegae code, and then into the actual testbed code.
All you need to do is set nosegae_blobstore = True where you're setting up all the other stubs. This sets up a dict-backed blobstore stub. | 0 | 1 | 0 | 0 | 2013-01-31T17:09:00.000 | 2 | 0 | false | 14,631,306 | 0 | 0 | 1 | 1 | We're using nose with nose-gae for unit testing our controllers and models. We now have code that hits the blob store and files API. We are having a hard time testing those due to lack of testing proxies/mocks. Is there a good way to unit tests these services or lacking unit testing is there a way to automated acceptan... |
Query window id from python in linux and mac | 14,640,571 | 0 | 0 | 1,222 | 0 | python,linux,macos,window,cross-platform | Not on a cross-platform basis. While windows do have IDs on both Linux and Mac OS, the meaning of the IDs is quite different, as is what you can do with them. There's basically nothing in common between the two.
And no, you cannot get those IDs when you launch an application, as the window(s) aren't created until later... | 0 | 1 | 0 | 0 | 2013-02-01T03:40:00.000 | 1 | 0 | false | 14,639,338 | 0 | 0 | 0 | 1 | Is there any way to query window id by window name from python? Something that would work cross-platform perhaps (linux / mac)?
Or even better catch that id when starting a new window directly from os.sys ? |
The tab indent in emacs ipython shell | 14,666,166 | 1 | 0 | 592 | 0 | emacs,ipython | In emacs you can use python-mode, and from there send the code to *REPL* buffer with C-c C-c.
When you send the buffer for the first time, it asks you what executable you use for python, so you can use ipython, or other one. | 0 | 1 | 0 | 0 | 2013-02-02T10:47:00.000 | 2 | 1.2 | true | 14,661,070 | 1 | 0 | 0 | 2 | The environment is Emacs 24.1.1 on Ubuntu. using Ipython for python programming.
Auto indent works well when running the ipython command in a shell directly, but when I run IPython inside Emacs there is no auto indent any more. Even worse, when I type TAB it brings up the Completion buffer. I also have searched ...
The tab indent in emacs ipython shell | 14,671,433 | 0 | 0 | 592 | 0 | emacs,ipython | Any invocation of ipython-shell should do a correct setup.
Please file a bug-report.
If running python-mode.el -- modeline shows "Py" --
please check out the current trunk first.
When Bazaar is available:
bzr branch lp:python-mode | 0 | 1 | 0 | 0 | 2013-02-02T10:47:00.000 | 2 | 0 | false | 14,661,070 | 1 | 0 | 0 | 2 | The environment is Emacs 24.1.1 on Ubuntu. using Ipython for python programming.
Auto indent works well when running the ipython command in a shell directly, but when I run IPython inside Emacs there is no auto indent any more. Even worse, when I type TAB it brings up the Completion buffer. I also have searched ...
Is google app engine right for me (hosting a few rapidly updating text files created w/ python) | 14,670,069 | 0 | 0 | 108 | 0 | python,google-app-engine | Yes and no.
Appengine is great in terms of reliability, server speed, features, etc. However, it has two main drawbacks: You are in a sandboxed environment (no filesystem access, must use datastore), and you are paying by instance hour. Normally, if you're just hosting a small server accessed once in a while, you can... | 0 | 1 | 0 | 0 | 2013-02-03T05:38:00.000 | 2 | 0 | false | 14,669,819 | 0 | 0 | 1 | 1 | I have a python script that creates a few text files, which are then uploaded to my current web host. This is done every 5 minutes. The text files are used in a software program which fetches the latest version every 5 min. Right now I have it running on my web host, but I'd like to move to GAE to improve reliability. ... |
ndb.query.count() failed with 60s query deadline on large entities | 14,713,169 | 2 | 4 | 2,669 | 1 | python,google-app-engine,app-engine-ndb,bigtable | This is indeed a frustrating issue. I've been doing some work in this area lately to get some general count stats - basically, the number of entities that satisfy some query. count() is a great idea, but it is hobbled by the datastore RPC timeout.
It would be nice if count() supported cursors somehow so that you could ... | 0 | 1 | 0 | 0 | 2013-02-03T14:41:00.000 | 3 | 0.132549 | false | 14,673,642 | 0 | 0 | 1 | 1 | For 100k+ entities in google datastore, ndb.query().count() is going to cancelled by deadline , even with index. I've tried with produce_cursors options but only iter() or fetch_page() will returns cursor but count() doesn't.
How can I count large entities? |
Storing subprocess object in memory using global singleton instance | 14,691,638 | 1 | 1 | 1,465 | 0 | python,django,subprocess | You can use the same technique in Python as you did in Java, that is store the reference to the process in a module variable or implement a kind of a singleton.
The only problem you have as opposed to Java, is that Python does not have that rich analogy to the Servlet specification, and there is no interface to handle... | 0 | 1 | 0 | 0 | 2013-02-04T05:44:00.000 | 2 | 0.099668 | false | 14,681,015 | 1 | 0 | 1 | 1 | So I am using subprocess to spawn a long running process through the web interface using Django. Now if a user wants to come back to the page I would like to give him the option of terminating the subprocess at a later stage.
How can I do this? I implemented the same thing in Java and made a global singleton ProcessManag...
Connect to Sun ONC RPC server from Linux | 14,688,041 | 0 | 2 | 1,821 | 0 | python,c,linux,rpc | An ONC RPC client can be created by using the .idl file and rpcgen. The original RPC protocol precedes SOAP by several years.
Yes, you can create the RPC client in linux (see rpcgen)
Yes, you can create the RPC client in python (please see pep-0384) | 0 | 1 | 0 | 1 | 2013-02-04T12:31:00.000 | 3 | 0 | false | 14,686,861 | 0 | 0 | 0 | 1 | I am looking for solutions to create a RPC client in Linux that can connect to Sun ONC RPC server.
The server is written in C.
I would like to know if I can:
Create an RPC client in Linux
Create the RPC client in Python |
How can I integrate Tornado into my (currently) Apache driven site? | 14,700,450 | 1 | 0 | 956 | 0 | php,python,apache,localhost,tornado | Easiest is to run Tornado and Apache on different ports/addresses
So you probably have Apache listening to port 80 already. Tornado could listen to port 81
If the server is multihomed, you could have Apache listen to a.b.c.d:80 and Tornado listen to a.b.c.e:80. This means that you'll at least have to have the Apache pa... | 0 | 1 | 0 | 0 | 2013-02-05T04:37:00.000 | 1 | 1.2 | true | 14,700,305 | 0 | 0 | 1 | 1 | I have a website built in PHP and currently running on an Apache server (XAMPP locally). I would like to integrate a real-time chat system into the website. PHP and Apache not being geared for this in the slightest, I decided to work with Tornado and Python.
What is the easiest way to keep the base of the site in PHP ... |
Multiprocessing or os.fork, os.exec? | 40,097,923 | 4 | 4 | 6,063 | 0 | python | You can just rebind the logger in the child process to its own. I don't know about other OS, but on Linux the forking doesn't duplicate the entire memory footprint (as Ellioh mentioned), but uses "copy-on-write" concept. So until you change something in the child process - it stays in the memory scope of the parent pro... | 0 | 1 | 0 | 0 | 2013-02-05T07:03:00.000 | 2 | 0.379949 | false | 14,701,901 | 1 | 0 | 0 | 1 | I am using multiprocessing module to fork child processes. Since on forking, child process gets the address space of parent process, I am getting the same logger for parent and child. I want to clear the address space of child process for any values carried over from parent. I got to know that multiprocessing does fork... |
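"Rebind the logger in the child" can be sketched as below: after fork, the child drops inherited handlers and installs its own (here, its own log file), so parent and child no longer share one log destination. The fork start method is requested explicitly (POSIX only), and the paths are temp-file placeholders.

```python
import logging
import multiprocessing
import os
import tempfile

def worker(log_path):
    logger = logging.getLogger("child")
    logger.handlers = []  # drop anything inherited from the parent
    logger.addHandler(logging.FileHandler(log_path))
    logger.setLevel(logging.INFO)
    logger.info("hello from pid %d", os.getpid())

ctx = multiprocessing.get_context("fork")
log_path = os.path.join(tempfile.mkdtemp(), "child.log")
p = ctx.Process(target=worker, args=(log_path,))
p.start()
p.join()
```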
Eclipse PyDev use remote interpreter | 15,360,958 | 9 | 11 | 7,677 | 0 | eclipse,pydev,python | Unfortunately no. You can remotely connect to your Linux server via Remote System Explorer (RSE). But can't use it as a remote interpreter. I use Pycharm. You can use the free Community Edition or the Professional Edition for which you have to pay for it. It is not that expensive and it has been working great for me. | 0 | 1 | 0 | 1 | 2013-02-05T20:51:00.000 | 2 | 1 | false | 14,716,662 | 0 | 0 | 0 | 1 | is there a posibility to make eclipse PyDev use a remote Python interpreter?
I would like to do this, as the Linux Server I want to connect to has several optimization solvers (CPLEX, GUROBI etc.) running, that my script uses.
Currently I use eclipse locally to write the scripts, then copy all the files to the remote m... |
How to monitor google app engine from command line? | 14,723,922 | 1 | 1 | 104 | 0 | python,google-app-engine | I assume you are using Linux (Ubuntu/Mint); if not, that would be a good start.
Debug as much as you can locally using dev_appserver.py - this will display errors on start up (in the console)
Add your own debug logs when needed
Run code snippets in the interactive console - this is really useful to test snippets of code:
... | 0 | 1 | 0 | 0 | 2013-02-06T02:11:00.000 | 3 | 0.066568 | false | 14,720,476 | 0 | 0 | 1 | 1 | I'm starting to use Google App Engine and being a newcomer to much of the stuff going on here, I broke my webpage (all I see is "server error" in my web browser). I'd like to be able to see a console of some sort which is telling me what's going wrong (python syntax? file not found? something else?). Searching around a... |
How can I stop, restart or start Gunicorn running within a virtualenv on a Debian system? | 14,737,918 | 2 | 1 | 5,683 | 0 | python,python-2.7,debian,gunicorn | That is indeed the proper way to do it. Start it with the -p option so you don't have to guess at the PID if you have more than one instance running. You can tell gunicorn to reload your application without restarting the gunicorn process itself by sending it a SIGHUP instead of killing it.
If that makes you uncomforta... | 0 | 1 | 0 | 1 | 2013-02-06T19:06:00.000 | 1 | 1.2 | true | 14,736,788 | 0 | 0 | 0 | 1 | How can I stop, restart or start Gunicorn running within a virtualenv on a Debian system?
I can't seem to find a solution apart from finding the PID for the gunicorn daemon and killing it.
Thank you. |
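The SIGHUP reload mentioned in the answer can be scripted: read Gunicorn's pidfile and send the signal, which makes the master reload the application without restarting. The pidfile path is illustrative, and as a self-contained demo we signal our own pid (POSIX only):

```python
import os
import signal

received = []
signal.signal(signal.SIGHUP, lambda signum, frame: received.append(signum))

def reload_gunicorn(pidfile):
    with open(pidfile) as f:
        pid = int(f.read().strip())
    os.kill(pid, signal.SIGHUP)

# Demo: write our own pid into a fake pidfile and "reload" ourselves.
pidfile = os.path.join(os.getcwd(), "demo_gunicorn.pid")
with open(pidfile, "w") as f:
    f.write(str(os.getpid()))
reload_gunicorn(pidfile)
os.remove(pidfile)
```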
Effective implementation of one-to-many relationship with Python NDB | 14,749,034 | 6 | 11 | 1,389 | 1 | python,google-app-engine,app-engine-ndb | One thing that most GAE users will come to realize (sooner or later) is that the datastore does not encourage design according to the formal normalization principles that would be considered a good idea in relational databases. Instead it often seems to encourage design that is unintuitive and anathema to established n... | 0 | 1 | 0 | 0 | 2013-02-06T21:22:00.000 | 2 | 1 | false | 14,739,044 | 0 | 0 | 0 | 2 | I would like to hear your opinion about the effective implementation of one-to-many relationship with Python NDB. (e.g. Person(one)-to-Tasks(many))
In my understanding, there are three ways to implement it.
Use 'parent' argument
Use 'repeated' Structured property
Use 'repeated' Key property
I choose a way based on th... |
Effective implementation of one-to-many relationship with Python NDB | 14,740,062 | 7 | 11 | 1,389 | 1 | python,google-app-engine,app-engine-ndb | A key thing you are missing: How are you reading the data?
If you are displaying all the tasks for a given person on a request, 2 makes sense: you can query the person and show all his tasks.
However, if you need to query say a list of all tasks say due at a certain time, querying for repeated structured properties is ... | 0 | 1 | 0 | 0 | 2013-02-06T21:22:00.000 | 2 | 1 | false | 14,739,044 | 0 | 0 | 0 | 2 | I would like to hear your opinion about the effective implementation of one-to-many relationship with Python NDB. (e.g. Person(one)-to-Tasks(many))
In my understanding, there are three ways to implement it.
Use 'parent' argument
Use 'repeated' Structured property
Use 'repeated' Key property
I choose a way based on th... |
Convert Java Google AppEngine app to Python AppEngine | 14,743,018 | 2 | 0 | 201 | 0 | java,python,google-app-engine | It'll be a complete rewrite.
However, the server side should be independent of the client. You can have a python client for the Raspberry Pi and your server side code can still be written in Java. | 0 | 1 | 0 | 0 | 2013-02-07T00:14:00.000 | 1 | 1.2 | true | 14,741,395 | 0 | 0 | 1 | 1 | I'm a big noob to GAE, moderate level in Python, and moderate-to-rusty in Java.
I am looking to convert an existing and working GAE Java app (in the Google Play store and runs on Android) into GAE Python.
The end goal is to get it into the Raspberry Pi Store, so I'm assuming GAE Python would be the most seamless.
Has a... |
How read ip address under python without resource leaks | 14,754,106 | 0 | 1 | 265 | 0 | python,ip,resource-leak | What exactly do you want to do?
As far as I see, you don't count eth0 filehandles, but instead you count all filehandles.
If you just want open IP file handles, you can use lsof (a shell tool) under Linux.
lsof -u yourUser | grep IPv4
not just eth0, but I don't know how to filter that for interface. | 0 | 1 | 0 | 0 | 2013-02-07T14:12:00.000 | 1 | 0 | false | 14,753,159 | 0 | 0 | 0 | 1 | How to get the network information in Python in both Linux and Windows? I try to use netinfo package (ver 0.3.2) in Python 2.7 on Ubuntu 12.10 64 bit, but the use of this package makes the handles are not closed, as showed below. It is not accepted in my case.
import netinfo
def countOpenFiles():
import resource,... |
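A sketch of the leak check the question is attempting: count this process's open file descriptors. /proc/self/fd is Linux-specific; the fallback probes a bounded descriptor range and works on other POSIX systems.

```python
import os

def count_open_fds():
    fd_dir = "/proc/self/fd"
    if os.path.isdir(fd_dir):
        return len(os.listdir(fd_dir))
    # Portable (slower) fallback: probe a bounded range of descriptors.
    count = 0
    for fd in range(256):
        try:
            os.fstat(fd)
            count += 1
        except OSError:
            pass
    return count

before = count_open_fds()
leak = open(os.devnull)  # deliberately open one extra handle
after = count_open_fds()
leak.close()
```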
How to list all files, folders, subfolders and subfiles of a Google drive folder | 14,789,775 | 1 | 1 | 5,009 | 0 | python,python-2.7,google-drive-api | I think you have the right idea in your "update". Treat Drive as flat, make calls to list everything, and generate your own tree from that. | 0 | 1 | 0 | 0 | 2013-02-07T14:48:00.000 | 2 | 1.2 | true | 14,753,914 | 0 | 0 | 0 | 2 | Any ideas how to query for all the children and the children of the children in a single query?
Update
It seems like a simple question. I doubt if there is a simple solution?
Querying the tree of folders and files can cost a lot of API calls.
So, to solve my problem, I use a single query to list all the files and folde... |
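The "treat Drive as flat" idea can be sketched as follows: list every file once, then rebuild the folder tree locally from each item's parents list. The items below only mimic the shape of a files.list() response; the ids and fields are illustrative.

```python
items = [
    {"id": "root", "title": "My Drive", "parents": []},
    {"id": "f1", "title": "Docs", "parents": [{"id": "root"}]},
    {"id": "f2", "title": "2013", "parents": [{"id": "f1"}]},
    {"id": "a.txt", "title": "a.txt", "parents": [{"id": "f2"}]},
]

def build_tree(items):
    """Map each folder id to the ids of its children."""
    children = {}
    for item in items:
        for parent in item.get("parents", []):
            children.setdefault(parent["id"], []).append(item["id"])
    return children

tree = build_tree(items)
```

This needs only the paginated listing calls, regardless of how deep the folder tree is.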
How to list all files, folders, subfolders and subfiles of a Google drive folder | 14,808,043 | 2 | 1 | 5,009 | 0 | python,python-2.7,google-drive-api | I'm trying to do the same in PHP. My solution is:
Retrieve the complete list of files and folders from the drive
Make a double iteration (nested) on the retrieved json:
the first over the elements in the "items" array,
the second (recursive) over the parent ids of each element,
rejecting all the elements that do not contain ...
Update
It seems like a simple question. I doubt if there is a simple solution?
Querying the tree of folders and files can cost a lot of API calls.
So, to solve my problem, I use a single query to list all the files and folde... |
0MQ in virtualenv | 14,801,987 | 3 | 3 | 247 | 0 | python,virtualenv,zeromq | Once you make your virtualenv and activate it, use pip to install Python packages. They will install into your virtualenv.
Alternately, when you create your virtualenv, enable system-wide packages (with the --system-site-packages switch) within it so that system-installed packages will be visible in the virtualenv. | 0 | 1 | 0 | 0 | 2013-02-10T19:58:00.000 | 1 | 0.53705 | false | 14,801,979 | 0 | 0 | 0 | 1 | I was able to install 0MQ in Ubuntu 12.04 by doing the followinng:
$ sudo apt-get install libzmq-dev
$ sudo apt-get install python-zmq
but when I went to use it in a virtualenv it could not find the module. What do I have to do in my virtualenv to see it?
Excluding a call to a subroutine from a commercial library | 14,830,945 | 0 | 0 | 198 | 0 | python,fortran,f2py | This problem is solved in a following way:
All instances where the commercial FFT library is called are replaced by calls to a free FFT library (in this case FFTW3). Of course, ' include "fftw3.f" ' is placed at the top of the Fortran subroutines where necessary.
The extension module is created using f2py. First line creates the si... | 0 | 1 | 0 | 0 | 2013-02-11T13:57:00.000 | 1 | 0 | false | 14,813,494 | 0 | 0 | 0 | 1 | I have a Fortran file with a lot of useful subroutines, and I want to make a Python interface to it using f2py.
The problem arises because some fortran subroutines call the FFT subroutine from the NAG library (named c06ebf). When imported into Python, it produces the 'undefined symbol: co6ebf' warning.
Is there other ... |
Python web app that can download itself | 14,817,342 | 3 | 1 | 142 | 0 | python,setuptools | Just package your app and put it on PyPI. Trying to automatically package the code running on the server seems over-engineered. Then you can let people use pip to install your app. In your app, provide a link to the PyPI page.
Then you can also add dependencies in the setup.py, and pip will install them for you. It... | 0 | 1 | 0 | 0 | 2013-02-11T17:17:00.000 | 1 | 0.53705 | false | 14,817,288 | 1 | 0 | 0 | 1 | I'm writing a small web app that I'd like to include the ability to download itself. The ideal solution would be for users to be able to "pip install" the full app but that users of the app would be able to download a version of it to use themselves (perhaps with reduced functionality or without some of the less essent... |
Is possible to create a shell like bash in python, ie: Bash replacement? | 14,824,582 | 0 | 2 | 1,426 | 0 | python,bash,shell,replace | Yes, of course. You can simply make an executable Python script, call it /usr/bin/pysh, add this filename to /etc/shells and then set it as your user's default login shell with chsh. | 0 | 1 | 0 | 0 | 2013-02-12T02:33:00.000 | 3 | 0 | false | 14,824,538 | 1 | 0 | 0 | 1 | I wonder if it is possible to create a bash replacement in Python. I have done REPLs before and know about subprocess and that kind of stuff, but I wonder how to use my python-like-bash replacement in the OS X terminal as if it were a native shell environment (with limitations).
Or simply run ipython as is...
P.S. The majority ...
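A minimal sketch of such a shell in Python, assuming only the standard library: cd is handled as a builtin (a child process cannot change its parent's working directory), and everything else is delegated to subprocess. This is a toy, not a full bash replacement:

```python
import os
import shlex
import subprocess

def run_line(line):
    """Execute one shell-like command line; return its captured stdout."""
    args = shlex.split(line)
    if not args:
        return ""
    if args[0] == "cd":
        # 'cd' must be a builtin: a child process cannot change
        # the parent shell's working directory
        os.chdir(args[1] if len(args) > 1 else os.path.expanduser("~"))
        return ""
    result = subprocess.run(args, capture_output=True, text=True)
    return result.stdout

def repl():
    """Minimal read-eval-print loop; Ctrl-D exits."""
    while True:
        try:
            line = input("pysh> ")
        except EOFError:
            break
        print(run_line(line), end="")
```

As the accepted answer notes, an entry script that calls repl() could then be installed (e.g. as /usr/bin/pysh), added to /etc/shells, and selected with chsh.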
different types of scripting in linux | 14,824,929 | 2 | 0 | 966 | 0 | python,linux,perl,scripting | Every programmer will have a biased answer to this, but one thing to keep in mind is what your goal is. For instance, if you're only looking to be a successful sysadmin, then your goals might best be served by learning languages that are more conducive to sysadmin tasks (e.g. bash). However, if you're looking to do m... | 0 | 1 | 0 | 1 | 2013-02-12T03:22:00.000 | 3 | 0.132549 | false | 14,824,862 | 0 | 0 | 0 | 2 | I am very new to linux, and i want to learn scripting. It seems like there are quite a few options to learn about scripting from bash shell scripting, python, perl lisp, and probably more that i dont know about. I am just wonder what are the the advantage and disadvantage of all of them, and what would be a good place... |
different types of scripting in linux | 14,825,010 | 1 | 0 | 966 | 0 | python,linux,perl,scripting | I think a lot of times, people new to programming see all the options out there and don't know where to start. You listed a bunch of different languages in your post. My advice would be to pick one of those languages and find a book or tutorial and work through it.
I became interested in "scripting" from just trying ... | 0 | 1 | 0 | 1 | 2013-02-12T03:22:00.000 | 3 | 0.066568 | false | 14,824,862 | 0 | 0 | 0 | 2 | I am very new to linux, and i want to learn scripting. It seems like there are quite a few options to learn about scripting from bash shell scripting, python, perl lisp, and probably more that i dont know about. I am just wonder what are the the advantage and disadvantage of all of them, and what would be a good place... |
How to write on terminal after login with telnet to remote machine using python | 15,581,083 | 0 | 0 | 602 | 0 | python-2.7,telnetlib | You should try to log in to the machine using telnet; you will notice that you log into BusyBox. The string you printed is not an error; it is the normal BusyBox prompt.
It might not be what you expected; I only know BusyBox from Linux boxes that were unable to boot properly. | 0 | 1 | 0 | 1 | 2013-02-12T04:15:00.000 | 1 | 0 | false | 14,825,262 | 0 | 0 | 0 | 1 | I am trying to connect a remote machine in python. I used telnetlib module and could connect to machine after entering login id and password as
tn = Telnet("HOST IP")
tn.write("UID")
tn.write("PWD")
After entering password, the terminal connects to the remote machine which is a linux based software [having its own IP a... |
The system cannot find the path specified in cmd | 14,846,374 | 0 | 0 | 7,184 | 0 | python,windows-7 | Instead of
cd.. Python27
you need to type
cd \python27 | 0 | 1 | 0 | 1 | 2013-02-13T04:20:00.000 | 2 | 0 | false | 14,846,333 | 1 | 0 | 0 | 2 | When I open my Command Prompt,
the default path is C:\Users\acer>
so I want to change the path to C:\Python27
the method is as follows
I enter cd.. 2 times..
then I enter cd.. Python27
as my Python27 folder is located in C:\
however, I got this message "the system cannot find the path specified"
Can anyone help me? |
The system cannot find the path specified in cmd | 14,846,408 | 1 | 0 | 7,184 | 0 | python,windows-7 | No need for cd .. mumbo jumbo, just go cd C:/Python27. | 0 | 1 | 0 | 1 | 2013-02-13T04:20:00.000 | 2 | 0.099668 | false | 14,846,333 | 1 | 0 | 0 | 2 | When I open my Command Prompt,
the default path is C:\Users\acer>
so I want to change the path to C:\Python27
the method is as follows
I enter cd.. 2 times..
then I enter cd.. Python27
as my Python27 folder is located in C:\
however, I got this message "the system cannot find the path specified"
Can anyone help me? |
How to include third party Python libraries in Google App Engine? | 14,850,874 | 0 | 33 | 23,439 | 0 | python,google-app-engine | Just put Beautifulsoup in the root of your project and upload it all | 0 | 1 | 0 | 0 | 2013-02-13T10:01:00.000 | 6 | 0 | false | 14,850,853 | 0 | 0 | 1 | 2 | How to add third party python libraries in Google App Engine, which are not provided by Google? I am trying to use BeautifulSoup in Google App Engine and unable to do so. But my question is for any library I want to use in Google App Engine. |
How to include third party Python libraries in Google App Engine? | 35,193,844 | 0 | 33 | 23,439 | 0 | python,google-app-engine | pip install -t lib package_name
lib: the location for third party libraries
Then you can use this package like a normal library, just as you would from IPython or the terminal.
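pip install -t lib only copies the packages; App Engine still has to find them at import time. One common way to do that is to prepend lib/ to sys.path from appengine_config.py (a sketch; the file name is the conventional App Engine hook that runs before any handler is imported):

```python
# appengine_config.py (hypothetical; the conventional App Engine hook
# that runs before any handler module is imported)
import os
import sys

def add_vendor_dir(path=None):
    """Prepend the vendored-package directory to the import path."""
    if path is None:
        path = os.path.join(os.path.dirname(os.path.abspath(__file__)), "lib")
    if path not in sys.path:
        sys.path.insert(0, path)

add_vendor_dir()  # after this, packages copied into lib/ are importable
```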
Unix process running python | 14,864,397 | 0 | 0 | 127 | 0 | python,linux,process | ps aux | grep json ought to do it, or just pgrep -lf json. | 0 | 1 | 0 | 1 | 2013-02-13T22:23:00.000 | 1 | 0 | false | 14,864,378 | 0 | 0 | 0 | 1 | I have a cron job that executes 2 Python scripts. How can I see with the "ps" command if the processes are running?
My scripts' names are:
json1.py
json2.py |
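The same check can be done from inside Python, for example in a watchdog script, by asking pgrep whether anything matches the script name. A sketch assuming a Linux box with procps installed:

```python
import subprocess

def is_running(pattern):
    """Return True if `pgrep -f` finds a process whose full command
    line matches `pattern` (an extended regular expression)."""
    result = subprocess.run(["pgrep", "-f", pattern], capture_output=True)
    # pgrep exits with status 0 when at least one process matched
    return result.returncode == 0
```

is_running("json1.py") and is_running("json2.py") would then report on the two cron scripts.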
Python script handler for Google AppEngine | 14,868,496 | 2 | 0 | 100 | 0 | python,google-app-engine,urlfetch,app.yaml | myScript.py was for the 2.5 runtime; the model for invoking apps with the 2.7 runtime normally utilises the myScript.app method. Have a look at the age of the tutorials and also what Python runtime they have configured in their app.yaml. | 0 | 1 | 0 | 0 | 2013-02-14T04:44:00.000 | 1 | 0.379949 | false | 14,867,945 | 0 | 0 | 1 | 1 | I am writing an App Engine application to fetch URL content using urlfetch, which is available in Google App Engine.
However, in the app.yaml file I have a doubt about the script handler.
I have found that some people use the script name myScript.py while some tutorials use myScript.app.
What's the difference between the two?
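In app.yaml terms the difference looks like this (handler names are hypothetical): with the 2.7 runtime the handler names a WSGI application object inside a module, while the older 2.5/CGI style pointed at the file itself.

```yaml
runtime: python27
api_version: 1
threadsafe: true

handlers:
- url: /.*
  # 2.7 style: module "myScript", WSGI application object "app" inside it
  script: myScript.app

# the older 2.5/CGI style pointed at the file itself instead:
#   script: myScript.py
```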
Installing a new distribution of Python on Fedora | 14,869,972 | 3 | 0 | 91 | 0 | python,linux | Do not try to uninstall the pre-installed Python.
Install other Python interpreters side by side (in different directories).
You may come across an option to choose the default Python interpreter for your system. Don't change that from the pre-installed one, as that may break some important scripts used by the system. ... | 0 | 1 | 0 | 0 | 2013-02-14T07:33:00.000 | 1 | 1.2 | true | 14,869,861 | 1 | 0 | 0 | 1 | I have a Fedora virtual machine. It comes with Python pre-installed. I've read that it's not a good idea to uninstall it. I want to install a different version of Python, Enthought Python. Should I try to uninstall the existing Python installation and how would I do that? Should I instead install Enthought Python ... |
How can I make my program utilize tab completion? | 14,886,568 | 0 | 9 | 3,643 | 0 | python,shell,command-line-interface,tab-completion | Take a look at the source of the 'cmd' module in the Python library. It supports command completion. | 0 | 1 | 0 | 0 | 2013-02-14T15:26:00.000 | 4 | 0 | false | 14,878,215 | 0 | 0 | 0 | 1 | I've noticed that some programs (e.g. hg) allow the user to tab-complete specific parts of the command. For example, if, in an hg repository working directory, I type:
hg qpush --move b8<TAB>
It will try to complete the command with any mercurial patches in my patch queue that start with "b8".
What I'd like to do is i... |
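Within a program's own prompt, Python's cmd module implements exactly this convention: for a command do_foo, readline calls an optional complete_foo method when <TAB> is hit. A small sketch with made-up patch names:

```python
import cmd

class PatchShell(cmd.Cmd):
    """Toy interactive shell whose qpush command tab-completes."""
    prompt = "(patch) "
    patches = ["b801", "b812", "feature-x"]  # hypothetical patch queue

    def do_qpush(self, arg):
        print("pushing %s" % arg)

    def complete_qpush(self, text, line, begidx, endidx):
        # readline hands over the word under the cursor as `text`
        return [p for p in self.patches if p.startswith(text)]

    def do_EOF(self, arg):
        return True  # exit the loop on Ctrl-D
```

PatchShell().cmdloop() starts the prompt; typing `qpush b8<TAB>` then offers b801 and b812. For completing arguments of an external program from bash itself (as hg does), the equivalent is a completion function registered with bash's `complete` builtin.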
Programmatically modifying someones AppDelegate - categories, subclass? | 14,883,949 | 0 | 1 | 530 | 0 | python,ios,objective-c | One solution I am currently considering:
Add a NewAppDelegate.m/h file that subclasses AppDelegate.
This subclass does what I want, and then calls the super methods.
Find/replace AppDelegate with NewAppDelegate in main.m.
This seems pretty simple and robust. Thoughts on this? Will this work for all/most projects? | 0 | 1 | 0 | 0 | 2013-02-14T20:37:00.000 | 2 | 1.2 | true | 14,883,568 | 0 | 0 | 0 | 0 | I am working on a framework installer script. The script needs to modify the user's AppDelegate file and inject a few lines of code at the beginning or end of the applicationDidFinishLaunching and applicationWillTerminate methods.
Some options I've thought about:
Parse the source code, and insert lines at correct pos... |
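Step 3 of the accepted answer (find/replace in main.m) might look like this inside the Python installer script; the word-boundary regex is an assumption that keeps already-patched NewAppDelegate references from being rewritten on a second run:

```python
import re

def patch_source(source):
    """Rewrite whole-word AppDelegate references to NewAppDelegate.

    The word-boundary anchors leave an existing NewAppDelegate token
    untouched, so patching the same file twice changes nothing.
    """
    return re.sub(r"\bAppDelegate\b", "NewAppDelegate", source)

def patch_file(path):
    with open(path) as fh:
        patched = patch_source(fh.read())
    with open(path, "w") as fh:
        fh.write(patched)
```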
python vipscc in celery Missing argument | 14,896,601 | 0 | 0 | 77 | 0 | python,python-imaging-library,celery | OK, I've realised this only happens if I'm running Celery in debug mode; outside of this it works fine. | 0 | 1 | 0 | 0 | 2013-02-15T14:03:00.000 | 1 | 0 | false | 14,896,418 | 0 | 0 | 0 | 1 | I'm trying to write a Celery task for processing large TIFF files. From past experience I've found vipscc uses less memory than PIL to process/resize TIFFs, so I'd like to use that module. The problem is that when I try to import vipscc inside a Celery task executed by a worker I get this message:
fatal Python error: can't ... |
lxml on python-3.3.0 ImportError: undefined symbol: xmlBufContent | 14,927,230 | 1 | 1 | 2,277 | 0 | lxml,importerror,python-3.3 | You should probably mention the specific operating system you're trying to install on, but I'll assume it's some form of Linux, perhaps Ubuntu or Debian since you mention apt-get.
The error message you mention is typical on lxml when the libxml2 and/or libxslt libraries are not installed for it to link with. For whatev... | 0 | 1 | 1 | 0 | 2013-02-16T12:23:00.000 | 1 | 0.197375 | false | 14,910,250 | 0 | 0 | 0 | 1 | I am having a hard time installing lxml(3.1.0) on python-3.3.0. It installs without errors and I can see the lxml-3.1.0-py3.3-linux-i686.egg in the correct folder (/usr/local/lib/python3.3/site-packages/), but when I try to import etree, I get this:
from lxml import etree
Traceback (most recent call last):
... |
Is pynfs stable to run on windows server 2008? | 15,278,451 | 0 | 0 | 173 | 0 | python,python-2.7,windows-server-2008,ubuntu-12.04,nfs | pynfs is a test suite and is not meant to run as an NFS server in production. | 0 | 1 | 0 | 0 | 2013-02-16T15:54:00.000 | 2 | 0 | false | 14,912,150 | 0 | 0 | 0 | 2 | I need to connect from a Windows Server 2008 machine in a secure network to an Ubuntu box and read and write files easily from Python code. I want to avoid Samba or FTP, so I am considering NFS, and my question is whether pynfs works stably on Windows (if at all, or does it work on Linux only?)
I found the source and some forks on ... |
Is pynfs stable to run on windows server 2008? | 15,273,016 | 0 | 0 | 173 | 0 | python,python-2.7,windows-server-2008,ubuntu-12.04,nfs | I would prefer pynfs had some modern infrastructure around it.
I went with Samba this time. | 0 | 1 | 0 | 0 | 2013-02-16T15:54:00.000 | 2 | 1.2 | true | 14,912,150 | 0 | 0 | 0 | 2 | I need to connect from a Windows Server 2008 machine in a secure network to an Ubuntu box and read and write files easily from Python code. I want to avoid Samba or FTP, so I am considering NFS, and my question is whether pynfs works stably on Windows (if at all, or does it work on Linux only?)
I found the source and some forks on ... |
Python built exe process to kill itself after a period of time | 14,915,445 | 0 | 0 | 1,455 | 0 | python,process,terminate | Create a thread when your process starts.
Make that thread sleep for the required duration.
When that sleep is over, kill the process. | 0 | 1 | 0 | 0 | 2013-02-16T21:00:00.000 | 2 | 0 | false | 14,915,048 | 0 | 0 | 0 | 1 | How is it possible to get a compiled .exe program written in Python to kill itself after a period of time after it is launched?
If I have some code and I compile it into an .exe, then launch it and it stays in a 'running' or 'waiting' state, how can I get it to terminate after a few mins regardless of what the program ... |
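The three steps in the accepted answer can be implemented without a hand-written sleeping thread by using threading.Timer. The exit callback is injectable here purely so the behaviour can be demonstrated without killing the calling process; a real build would keep the os._exit default:

```python
import os
import threading

def schedule_exit(seconds, exit_fn=lambda: os._exit(0)):
    """Arrange for exit_fn to run after `seconds` and return at once.

    os._exit terminates immediately without cleanup, which is the
    point here: it works even if the main thread is blocked.
    """
    timer = threading.Timer(seconds, exit_fn)
    timer.daemon = True  # a pending timer must not keep the process alive
    timer.start()
    return timer
```

Called once at startup, e.g. schedule_exit(180) for a three-minute lifetime, before the program enters its normal 'running' or 'waiting' state.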