Python
I have a list of lists: I want to get the list whose sum of elements is the greatest. In this case [7,8,9]. I'd rather have a fancy map, lambda, or list comprehension method than a for/while/if loop. Best Regards
x = [[1,2,3], [4,5,6], [7,8,9], [2,2,0]]
How to find the list in a list of lists whose sum of elements is the greatest?
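A loop-free way to do this is the built-in `max` with a `key` function, a minimal sketch:

```python
x = [[1, 2, 3], [4, 5, 6], [7, 8, 9], [2, 2, 0]]

# max with key=sum compares the sublists by the sum of their elements
best = max(x, key=sum)
```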
Python
This code is my celery worker script: When I try to run the worker I get this error: My question is how can I send the email task when using a worker in celery (mymail.py, __init__). Apparently there is some conflict between the blueprint and the worker. Removing the blueprint is not an option, if possible, due th...
from app import celery, create_app
app = create_app('default')
app.app_context().push()

File "/home/vagrant/myproject/venv/app/mymail.py", line 29, in send_email_celery
    msg.html = render_template(template + '.html', **kwargs)
File "/home/vagrant/myproject/venv/local/lib/python2.7/site-packages/fla...
Send email task with correct context
Python
I am building a C++ application that will call Python + numpy and I would like to DELAYLOAD the Python DLL. I use Visual Studio 2015 on Windows with 64-bit Python 3.6. The DELAYLOAD works fine as long as I am not using numpy. As soon as I call import_array(), I can no longer build with the DELAYLOAD option. The link...
// Initialize python
Py_Initialize();
// If I remove this line, I am able to build with DELAYLOAD
import_array();

if (!Py_IsInitialized()) {
    // Initialize Python
    Py_Initialize();
    // Initialize threads
    PyEval_InitThreads();
    // Needed for datetime
    PyDateTime_IMPORT;
    // Needed to avoid use of Py_None, Py...
Delay load python DLL when embedding python+numpy
Python
I have a numpy array that I would like to share between a bunch of Python processes in a way that doesn't involve copies. I create a shared numpy array from an existing numpy array using the sharedmem package. My problem is that each subprocess needs to access rows that are randomly distributed in the array. Currentl...
import sharedmem as shm

def convert_to_shared_array(A):
    shared_array = shm.shared_empty(A.shape, A.dtype, order="C")
    shared_array[...] = A
    return shared_array

# idx = list of randomly distributed integers
local_array = shared_array[idx, :]
# Do stuff with local array
local_array = shared_array[star...
Shared Non-Contiguous-Access Numpy Array
Python
I tried to compile Pyparsing on my Windows machine but got the following errors: I did the compilation with Microsoft Visual C++ 2008 Express Edition and the Pyparsing module used is the most current version. Please, does anyone know how to make this work?
python setup.py build_ext --inplace
running build_ext
cythoning pyparsing.pyx to pyparsing.c
Error compiling Cython file:
------------------------------------------------------------
... If C{include} is set to true, the matched expression is also parsed (the skipped text and matched exp...
How do I compile Pyparsing with Cython on Windows?
Python
I have a 3D numpy array that looks like this: At first I want the maximum along axis zero with X.max(axis=0), which gives me: The next step is now my problem; I would like to find the location of each 10 and create a new 2D array from another 3D array which has the same dimensions as X. For example, the array...
X = [[[10  1] [ 2 10] [-5  3]]
     [[-1 10] [ 0  2] [ 3 10]]
     [[ 0  3] [10  3] [ 1  2]]
     [[ 0  2] [ 0  0] [10  0]]]

[[10 10]
 [10 10]
 [10 10]]

Y = [[[ 11   2] [  3  11] [ -4 100]]
     [[  0  11] [100   3] [  4  11]]
     [[  1   4] [ 11 100] [  2   3]]
     [[100   3] [  1   1] [ 11   1]]]

[[11 11] [11 11] [...
Find Maximum of 3D np.array along Axis = 0
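One loop-free approach, a sketch assuming X and Y are np.ndarrays of the same shape: take the argmax along axis 0 and use it to index the second array with np.take_along_axis.

```python
import numpy as np

X = np.array([[[10, 1], [2, 10]],
              [[-1, 10], [0, 2]],
              [[0, 3], [10, 3]]])
Y = np.array([[[11, 2], [3, 11]],
              [[0, 11], [100, 3]],
              [[1, 4], [11, 100]]])

idx = X.argmax(axis=0)  # 2D array: location of each maximum along axis 0
# take_along_axis needs indices with the same ndim as Y, hence the [np.newaxis]
picked = np.take_along_axis(Y, idx[np.newaxis], axis=0)[0]

assert picked.tolist() == [[11, 11], [11, 11]]
```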
Python
I have the problem that I am not seeing any default values for arguments when specifying them via add_argument for subparsers using the argparse Python package. Some research said that you need non-empty help parameters set for each add_argument step and you need ArgumentDefaultsHelpFormatter as formatter_class as descr...
from argparse import ArgumentParser, ArgumentDefaultsHelpFormatter

parser = ArgumentParser(description="Sample script",
                        formatter_class=ArgumentDefaultsHelpFormatter,
                        version="sample version")

# Initialize subparsers
subparsers = parser.add_subparsers(help="", dest="command")

# foo command
foo...
Argparse: Default values not shown for subparsers
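A likely cause, sketched below: formatter_class is not inherited by subparsers, so it has to be passed to each add_parser call as well (the foo/--bar names here are hypothetical; the version= keyword from the question is Python 2 only and is omitted):

```python
from argparse import ArgumentParser, ArgumentDefaultsHelpFormatter

parser = ArgumentParser(description="Sample script",
                        formatter_class=ArgumentDefaultsHelpFormatter)
subparsers = parser.add_subparsers(dest="command")

# each subparser needs its own formatter_class; it is not inherited
foo = subparsers.add_parser("foo", formatter_class=ArgumentDefaultsHelpFormatter)
foo.add_argument("--bar", default=42, help="bar value")

help_text = foo.format_help()  # now contains "(default: 42)"
```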
Python
I am trying to make a kind of recursive call on my first Click CLI app. The main point is to have sub-commands associated to the first and, so, I was trying to separate it all in different files/modules to improve its maintainability. I have the current directory: This is my main file: My project's __init__ file: M...
root
  commands
    project
      __init__
      command1
      command2
    database
      __init__
      command1
      command2

import click
from commands.project import project
from commands.database import database

@click.group(help="Main command")
def main():
    pass

main.add_command(project)
main.add_command(datab...
Include submodules on click
Python
I want to be able to save my array subclass to a .npy file, and recover the result later. Something like: The docs say (emphasis mine): The format explicitly does not need to: [...] Fully handle arbitrary subclasses of numpy.ndarray. Subclasses will be accepted for writing, but only the array data will be writ...
>>> class MyArray(np.ndarray): pass
>>> data = MyArray(np.arange(10))
>>> np.save('fname', data)
>>> data2 = np.load('fname')
>>> assert isinstance(data2, MyArray)  # raises AssertionError
How can I make np.save work for an ndarray subclass?
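Since the .npy format deliberately stores only the raw array data, one workaround (a sketch, not the only option) is to serialize with pickle, which ndarray subclasses support and which preserves the subclass on load:

```python
import io
import pickle
import numpy as np

class MyArray(np.ndarray):
    pass

data = np.arange(10).view(MyArray)  # view-cast a plain array to the subclass

buf = io.BytesIO()
pickle.dump(data, buf)   # ndarray implements __reduce__, so the subclass round-trips
buf.seek(0)
data2 = pickle.load(buf)

assert isinstance(data2, MyArray)   # subclass preserved
assert (data2 == data).all()
```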
Python
I am trying to upgrade my code to Python 3. Having some trouble with this line: output_file = open(working_dir + "E" + str(s) + ".txt", "w+"), and I get this error: TypeError: sequence item 0: expected str instance, bytes found. What I've tried: I've also tried using decode() on the join, also trie...
output_file.write(','.join(headers) + "\n")
output_file.write(b",".join(headers) + b"\n")
TypeError: write() argument must be str, not bytes
TypeError: write() argument must be str, not bytes, upgrade to python 3
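The headers are evidently bytes while the file is opened in text mode; one fix (a sketch with hypothetical header values) is to decode each item to str before joining:

```python
headers = [b'name', b'age', b'city']  # hypothetical byte-string headers

# decode each bytes item to str, then join for a text-mode file
line = ','.join(h.decode('utf-8') for h in headers) + '\n'
```

Alternatively, open the file in binary mode ("wb") and keep everything as bytes.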
Python
It seems that each Tensorflow session I open and close consumes 1280 bytes of GPU memory, which are not released until the Python kernel is terminated. To reproduce, save the following Python script as memory_test.py: Then run it from the command line with a different number of iterations: python memory_test.py 0 ...
import tensorflow as tf
import sys

n_Iterations = int(sys.argv[1])

def open_and_close_session():
    with tf.Session() as sess:
        pass

for _ in range(n_Iterations):
    open_and_close_session()

with tf.Session() as sess:
    print("bytes used=", sess.run(tf.contrib.memory_stats.BytesInUse()))
Tensorflow leaks 1280 bytes with each session opened and closed?
Python
I know that the Unix timestamp is defined as the number of seconds passed since 1970-01-01 00:00:00Z. However, I could not find a clear source that gives this definition. I've also read various different statements about the relationship between UTC and the Unix timestamp in regards to leap seconds. This Wikipedia p...
1972-06-30 23:59:60
>>> import datetime
>>> a = datetime.datetime(1972, 6, 30, 23, 59, 59)
>>> b = datetime.datetime(1972, 7, 1, 0, 0, 0)
>>> b - a
datetime.timedelta(0, 1)
>>> import time
>>> t3 = time.mktime((1972, 6, 30, 23, 59, 59, -1, -1, -1))
>>> t4 = time.mktime((19...
What do Unix timestamps actually track?
Python
I have a Python Discord bot built with discord.py, meaning the entire program runs inside an event loop. The function I'm working on involves making several hundred HTTP requests and adding the results to a final list. It takes about two minutes to do these in order, so I'm using aiohttp to make them async. The relat...
async def searchPostList(postUrls, searchString):
    futures = []
    async with aiohttp.ClientSession() as session:
        for url in postUrls:
            task = asyncio.ensure_future(searchPost(url, searchString, session))
            futures.append(task)
        return await asyncio.gather(*futures)

async def searchPost(url, searchStri...
Collecting results from python coroutines before loop finishes
Python
I'm trying to apply a color map to a two-dimensional gray-scale image in Numpy (the image is loaded/generated by OpenCV). I have a 256-entries-long list with RGB values, which is my colormap: When I input a gray-scale image (shape (y, x, 1)), I would like to output a color image (shape (y, x, 3)), where ...
cMap = [np.array([0, 0, 0], dtype=np.uint8),
        np.array([0, 1, 1], dtype=np.uint8),
        np.array([0, 0, 4], dtype=np.uint8),
        np.array([0, 0, 6], dtype=np.uint8),
        np.array([0, 1, 9], dtype=np.uint8),
        np.array([0, 0, 12], dtype=np.uint8),
        # Many more entries here
        ]
colorIm...
Map values to higher dimension with Numpy
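NumPy's fancy indexing does exactly this mapping in one step; a sketch with a tiny hypothetical colormap: stack the colormap into a (256, 3) array and index it with the gray image.

```python
import numpy as np

cMap = np.zeros((256, 3), dtype=np.uint8)  # hypothetical colormap: one RGB row per gray level
cMap[1] = [0, 1, 1]
cMap[2] = [0, 0, 4]

gray = np.array([[0, 1], [2, 1]], dtype=np.uint8)  # shape (y, x); squeeze a (y, x, 1) image first
color = cMap[gray]  # shape (y, x, 3): each pixel value replaced by its RGB row

assert color.shape == (2, 2, 3)
assert (color[0, 1] == [0, 1, 1]).all()
```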
Python
I want to deprecate a parameter alias in click (say, switch from underscores to dashes). For a while, I want both formulations to be valid, but throw a FutureWarning when the parameter is invoked with the to-be-deprecated alias. However, I have not found a way to access the actual alias a parameter was invoked ...
@click.command()
@click.option('--old', '--new')
def cli(*args, **kwargs):
    ...
Correct way to deprecate parameter alias in click
Python
I've been trying to understand how symmetric encryption works and how I can integrate it in my CLI application, but I've got stuck at some point which I'm going to describe below. My use case is the following: I have a CLI application (SQLAlchemy + click + Python 3.8) which is going to be a very simple Password Man...
import base64
from cryptography.fernet import Fernet, InvalidToken
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

def generate_key_derivation(salt, master_password):
    kdf = PBKDF2HMAC(algorithm...
Symmetric encryption using Fernet in Python - Master password use case
Python
I have a csv file that has several columns that I first delimit by semicolon (;). However, ONE column is delimited by a pipe | and I would like to delimit this column and create new columns. Input: Desired output: My code so far delimits the first time by ; and then converts to a DF (which is my desired end format).
Column 1  Column 2  Column 3
1         2         3|4|5
6         7         6|7|8
10        11        12|13|14

Column 1  Column 2  ID  Age  Height
1         2         3   4    5
6         7         6   7    8
10        11        12  13   14

delimit = list(csv.reader(open('test.csv', 'rt'), delimiter=';'))
df = pd.DataFrame(delimit)
Delimit a specific column and add them as columns in CSV (Python3, CSV)
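pandas can do the second split directly; a sketch assuming the frame already holds the ;-split columns, using Series.str.split with expand=True (the ID/Age/Height names come from the desired output):

```python
import pandas as pd

df = pd.DataFrame({'Column 1': ['1', '6', '10'],
                   'Column 2': ['2', '7', '11'],
                   'Column 3': ['3|4|5', '6|7|8', '12|13|14']})

# split the pipe-delimited column into three new columns
split = df['Column 3'].str.split('|', expand=True)
split.columns = ['ID', 'Age', 'Height']

# replace the original column with the expanded ones
out = pd.concat([df.drop(columns='Column 3'), split], axis=1)
```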
Python
With the command below, all stderr and stdout redirect into /tmp/ss.log and it runs as a background process. Now I want to redirect stderr and stdout into the /var/log directory as follows. It encounters a permission problem. I made a try with sudo tee as follows. All of them encounter another problem: the command can't run in b...
python sslocal -c /etc/shadowsocks.json > /tmp/ss.log 2>&1 &
python sslocal -c /etc/shadowsocks.json > /var/log/ss.log 2>&1 &
bash: /var/log/ss.log: Permission denied
python sslocal -c /etc/shadowsocks.json | sudo tee -a /var/log/ss.log 2>&1 &
python sslocal -c /etc/shadowsocks.json 2>&1 | sudo tee -a /var/l...
How to redirect stderr and stdout into /var/log directory in a background process?
Python
I have a directory structure that might look something like the below. In as simple/quick a step as possible, I want to rename Current to Previous, including the contents, and wipe out the original such that it is now: I've tried something like: The docs led me to hope this would work: If target points to an existing file or ...
Data
  Current
    A
    B
    C
  Previous
    A
    X

Data
  Previous
    A
    B
    C

from pathlib import Path
src = Path('Data/Current')
dest = Path('Data/Previous')
src.replace(dest)
Use pathlib to destructively rename one directory to another existing directory
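Path.replace can overwrite an existing empty directory on POSIX, but not a non-empty one; a sketch of a workaround (not atomic) is to remove the old target first with shutil.rmtree, demonstrated here in a temporary directory:

```python
import shutil
import tempfile
from pathlib import Path

base = Path(tempfile.mkdtemp())  # stand-in for Data/
src = base / 'Current'
dest = base / 'Previous'
src.mkdir(); (src / 'A').touch(); (src / 'B').touch()
dest.mkdir(); (dest / 'X').touch()

shutil.rmtree(dest)   # wipe the existing, non-empty Previous
src.replace(dest)     # then rename Current -> Previous

assert not src.exists()
assert (dest / 'A').exists() and not (dest / 'X').exists()
```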
Python
I want to check whether a particular string is present in a sentence. I am using simple code for this purpose. This is an easy, straightforward approach, but it fails when the sentence appears as "hello world I am Jo ker" or "hello world I am J oker". As I am parsing the sentence from a PDF file, some unnecessary spaces are coming here a...
subStr = 'joker'
Sent = 'Hello World I am Joker'
if subStr.lower() in Sent.lower():
    print('found')
How to match a string in a sentence
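One sketch of a workaround for stray spaces: strip all whitespace from the sentence before the membership test (note this can also create false matches across word boundaries):

```python
subStr = 'joker'
sent = 'Hello World I am Jo ker'  # stray space from the PDF extraction

# remove all whitespace, then compare case-insensitively
normalized = ''.join(sent.split()).lower()
found = subStr.lower() in normalized
```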
Python
In Python 2.x, the following code produces an error, as expected: However, the following is allowed: Why is the + operator not defined for function and int, but the < operator is?
>>> def a(x): return x+3
...
>>> a + 4
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: unsupported operand type(s) for +: 'function' and 'int'
>>> a < 4
False
In Python 2.x, why is the > operator supported between function and int?
Python
I have the following dictionary: To check if more than one key is present in the dictionary, I do something like: The above is maintainable with only 2 key lookups. Is there a better way to handle checking for a large number of keys in a very big dictionary?
sites = {'stackoverflow': 1, 'superuser': 2, 'meta': 3, 'serverfault': 4, 'mathoverflow': 5}
'stackoverflow' in sites and 'serverfault' in sites
How do you check the presence of many keys in a Python dictionary?
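A sketch using set operations on the dict's key view, which scales to any number of keys:

```python
sites = {'stackoverflow': 1, 'superuser': 2, 'meta': 3,
         'serverfault': 4, 'mathoverflow': 5}

required = {'stackoverflow', 'serverfault'}

# dict key views support set operations: one subset test checks all keys
ok = required <= sites.keys()
# equivalent spelling with a generator and all()
ok2 = all(k in sites for k in required)
```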
Python
This is code I have, but it looks non-Pythonic. What is the most "Pythonic" way of doing this? Use a lambda filter function? For some reason very few examples online actually work with a list of objects where you compare properties; they always show how to do this using a list of actual strings, but that's no...
def __contains__(self, childName):
    """Determines if item is a child of this item"""
    for c in self.children:
        if c.name == childName:
            return True
    return False
Python List operations
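The explicit loop collapses to any() with a generator expression, which short-circuits exactly like the original; a minimal sketch with a hypothetical Item class:

```python
from dataclasses import dataclass, field

@dataclass
class Item:
    name: str
    children: list = field(default_factory=list)

    def __contains__(self, childName):
        # any() stops at the first matching child, like the loop did
        return any(c.name == childName for c in self.children)

root = Item('root', [Item('a'), Item('b')])
assert 'a' in root
assert 'z' not in root
```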
Python
Let's assume we want to create a family of classes which are different implementations or specializations of an overarching concept. Let's assume there is a plausible default implementation for some derived properties. We'd want to put this into a base class. So a subclass will automatically be able to count its ele...
class Math_Set_Base:
    @property
    def size(self):
        return len(self.elements)

class Concrete_Math_Set(Math_Set_Base):
    def __init__(self, *elements):
        self.elements = elements

Concrete_Math_Set(1, 2, 3).size  # 3

import math

class Square_Integers_Below(Math_Set_Base):
    def __init__(self, cap):
        self.si...
Subclassing: Is it possible to override a property with a conventional attribute?
Python
Bear with me as I'm fairly new to programming. My basic question is this: I have a program written in Haskell whose stdout I want to connect to the stdin of a Python program (which will manage the GUI-related stuff). Similarly, I want to connect the stdout of the Python program to the stdin of the Haskell progra...
main :: IO ()
main = do
    -- putStrLn "Enter a number." <- this will be displayed in Python
    string <- getLine
    putStrLn $ 5 + read string ::Int -- or any equivalent function to send to stdout

from Tkinter import *
root = Tk()
label = Label(root, text="Enter a number.")
label.pack()
enternum = Entry...
Connecting A Haskell Program to a Python Program via Pipelines (in Python)
Python
In Matlab, this type of algorithm ("growing arrays") is advised against, whereas many examples for Python show this kind of algorithm (this is a really bad example, though): I wonder why that is -- what is the difference?
mine = [];
for i=1:100, mine = [mine, randn(1)]; end

import numpy.random as rand
mine = []
for i in range(100):
    mine.append(rand.random(1)[0])
"growing" (appending to) a sequence object
Python
Consider this Python script: The desired output would be the 2 span elements; instead I get: Why does it include the text after the element until the end of the parent element? I'm trying to use lxml to link footnotes, and when I a.insert() the span element into the a element I create for it, it's including the...
from lxml import etree

html = '''<html xmlns="http://www.w3.org/1999/xhtml">
<head></head>
<body>
<p>This is some text followed with 2 citations.
<span class="footnote">1</span>
<span class="footnote">2</span>
This is some more text.</p>
</body>
</html>'''

tree = etree...
Why does this element in lxml include the tail?
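lxml stores text that follows an element's closing tag on that element's .tail attribute, not on the parent; the stdlib ElementTree shares the same text/tail model, so a dependency-free sketch:

```python
import xml.etree.ElementTree as ET

p = ET.fromstring('<p>before <span>1</span> after</p>')
span = p.find('span')

# the text between </span> and </p> lives on the span, not on <p>
assert span.text == '1'
assert span.tail == ' after'

# clear the tail before moving the element so it is not dragged along
span.tail = None
```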
Python
The code below (to compute cosine similarity), when run repeatedly on my computer, will output 1.0, 0.9999999999999998, or 1.0000000000000002. When I take out the normalize function, it will only return 1.0. I thought floating-point operations were supposed to be deterministic. What would be causing this in m...
#!/usr/bin/env python3
import math

def normalize(vector):
    sum = 0
    for key in vector.keys():
        sum += vector[key]**2
    sum = math.sqrt(sum)
    for key in vector.keys():
        vector[key] = vector[key]/sum
    return vector

dict1 = normalize({"a": 3, "b": 4, "c": 42})
dict2 = dict1
n_grams = list...
Python floating point determinism
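Floating-point arithmetic is deterministic, but it is not associative, so summing the same values in a different order (e.g. when dict iteration order varied across runs under hash randomization in older Pythons) can round differently. A minimal sketch:

```python
# float addition is not associative: grouping changes the rounding
left = (0.1 + 0.2) + 0.3
right = 0.1 + (0.2 + 0.3)

assert left != right  # 0.6000000000000001 vs 0.6
```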
Python
I am a beginner at Python, and this is my first post, so don't be too harsh :). I've been playing around with Python lately and was wondering if something like the below would result in Python first creating a list of all the elements and then finding the max, resulting in O(2n) time, or whether it would keep track of the max...
max([x for x in range(25)])
Do list comprehensions in Python reduce in a memory-efficient manner?
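A list comprehension does build the whole list first; dropping the brackets makes it a generator expression, which feeds max() one value at a time (both are O(n) time, but the generator uses O(1) extra memory):

```python
# list comprehension: materializes all 25 values, then scans them
m1 = max([x for x in range(25)])

# generator expression: max() consumes values lazily, no intermediate list
m2 = max(x for x in range(25))

assert m1 == m2 == 24
```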
Python
I am working on a Python Eve based RESTful service with a SQLAlchemy backend. I have two models with a one-to-many relationship: CommonColumns is defined like here. This works great when inserting, updating and deleting users. However, I can't get inserting right for folders: Which is a rather cryptic error message ...
class User(CommonColumns):
    """Model of a user in the database"""
    __tablename__ = "user"
    id = Column(Integer, primary_key=True)
    username = Column(String, unique=True)
    email = Column(EmailType, unique=True)
    folders = relationship('Folder', backref='user')

    def __unicode__(self):
        ...
Python Eve, SQLAlchemy and ForeignKey
Python
I need to initialize arrays of variable shape (dim,) + (nbins,)*dim, where dim is usually small, but nbins can be large, so that the array has ndims = dim + 1. For example, if dim = 1 I need an array of shape (1, nbins); if dim = 2 the shape is (2, nbins, nbins), etc. Is it possible to type numpy arrays...
ctypedef uint32_t uint_t
ctypedef float real_t
...
cdef:
    uint_t dim = input_data.shape[1]
    np.ndarray[real_t, ndim=dim+1] my_array = np.zeros((dim,) + (nbins,)*dim,
                                                   dtype=np.float32)
Cython + Numpy variable ndim?
Python
When I create a PyTorch DataLoader and start iterating, I get an extremely slow first epoch (x10--x30 slower than all subsequent epochs). Moreover, this problem occurs only with the train dataset from the Google Landmark Recognition 2020 competition on Kaggle. I can't reproduce this on synthetic images; also, I tried to cr...
import argparse
import pandas as pd
import numpy as np
import os, sys
import multiprocessing, ray
import time
import cv2
import logging
import albumentations as albu
from torch.utils.data import Dataset, DataLoader

samples = 50000  # count of samples to speed up test
bs = 64           # batch size
dir = '/hdd0/datasets/ggl_landmark_recogn...
pytorch DataLoader extremely slow first epoch
Python
I created a notebook which should display plots in a tab widget. As far as I understand, to include something like plots in the tab widget, I need to wrap them in the Output widget. In the notebook itself this works, but when I convert it to HTML via nbconvert it produces the wrong output. Widgets like sliders, buttons ...
# Import libraries
import pandas as pd
import matplotlib.pyplot as plt
import ipywidgets as widgets
import numpy as np

# Generated data for plotting
data = pd.DataFrame()
for i in range(5):
    data[i] = np.random.normal(size=50)

# This does not work with plots
children = []
for i in range(data.shape[1]):
    o...
Output widget appears outside tab widget when using nbconvert on jupyter notebook with ipywidgets
Python
I love list comprehensions in Python, because they concisely represent a transformation of a list. However, in other languages, I frequently find myself writing something along the lines of: This example is in C#, where I'm under the impression LINQ can help with this, but is there some common programming constr...
foreach (int x in intArray)
    if (x > 3)  // generic condition on x
        x++;    // do other processing
Replacement for for ... if array iteration
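The Python equivalent of the filter-then-transform loop is a list comprehension with an if clause (the +1 here stands in for the generic processing):

```python
int_array = [1, 4, 5, 2]

# keep only elements matching the condition, transforming each as we go
processed = [x + 1 for x in int_array if x > 3]

assert processed == [5, 6]
```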
Python
Apparently Python will allow me to hash a generator expression like (i for i in [1, 2, 3, 4, 5]). On closer inspection, however, this hash value is always the same no matter what generator I put into it! Why is this happening? Why is hashing a generator even allowed? I need to do this because I'm storing st...
>>> hash(i for i in [1, 2, 3, 4, 5])
8735741846615
>>> hash(i for i in range(2))
8735741846615
>>> hash(i for i in [1, 2, 3, 4, 5, 6])
8735741846615
>>> hash(i for i in [0, 1, 2, 3, 4, 5, 6])
8735741846615
Hashing a generator expression
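Generators use the default object hash, which is derived from the object's memory address; each throwaway generator above is garbage-collected immediately, so the next one is allocated at the same address and hashes identically. To key on the contents, materialize them into a hashable tuple first; a sketch:

```python
# two generators alive at the same time occupy different addresses
g1 = (i for i in [1, 2, 3])
g2 = (i for i in [1, 2, 3])
assert hash(g1) != hash(g2)

# hash the *contents* by converting to a tuple
assert hash(tuple(i for i in [1, 2, 3])) == hash((1, 2, 3))
```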
Python
I am trying to use Entrez to import publication data into a database. The search part works fine, but when I try to parse: ... I get the following error: File "/venv/lib/python2.7/site-packages/Bio/Entrez/Parser.py", line 296, in parse raise ValueError("The XML file does not represent a list. Please use ...
from Bio import Entrez

def create_publication(pmid):
    handle = Entrez.efetch("pubmed", id=pmid, retmode="xml")
    records = Entrez.parse(handle)
    item_data = records.next()
    handle.close()

from Bio import Entrez
Entrez.email = "Your.Name.Here@example.org"
handle = Entrez.efetch("pubmed", ...
Issue with parsing publication data from PubMed with Entrez
Python
I get vertical stripes between the bins when creating a histogram with matplotlib 2.0.2, Python 2.7, Win7 64-bit, visible both in the PDF and PNG created. I am using pgf with latex to create a PDF which I will include via includegraphics in a pdflatex document. The PNG created is just a quick check. This was not the case in ...
import matplotlib as mpl
mpl.use('pgf')
pgf_with_latex = {                    # setup matplotlib to use latex for output
    "pgf.texsystem": "pdflatex",      # change this if using xetex or lautex
    "text.usetex": True,              # use LaTeX to write all text
    "font.family": "serif",
    "font.serif": [],                 # blank entries shoul...
Matplotlib 2.0 stripes in histogram
Python
I'm trying to take a screenshot of a desired portion of a webpage using Python in combination with Selenium. When I execute my script, I do get a screenshot, but it is not what I intended the script to grab. I wish to grab the portion shown in "Desired one" below instead of "Current output". To get the screensh...
import time
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

link = 'https://www1.ticketmaster.com/celine-dion-courage-world-tour/event/0600567BFDB0AB48'

def start_script():
    ...
Trouble getting the screenshot of any element after zooming in
Python
I'm trying to set up an API endpoint to reply with HTML or JSON depending on the incoming request's Accept headers. I've got it working, testing through curl: I can't figure out how to use the APITestCase().self.client to specify what content should be accepted, though. My view looks like, and my test code loo...
> curl --no-proxy localhost -H "Accept: application/json" -X GET http://localhost:8000/feedback/
{"message": "feedback Hello, world!"}
> curl --no-proxy localhost -H "Accept: text/html" -X GET http://localhost:8000/feedback/
<html><body><h1>Root</h1><h2>feedback Hello, worl...
How to specify Accept headers from rest_framework.test.Client?
Python
Question: How can I intercept __getitem__ calls on an object attribute? Explanation: So, the scenario is the following. I have an object that stores a dict-like object as an attribute. Every time the __getitem__ method of this attribute gets called, I want to intercept that call and do some special processing on...
class Test:
    def __init__(self):
        self._d = {'a': 1, 'b': 2}

    @property
    def d(self, key):
        val = self._d[key]
        if key == 'a':
            val += 2
        return val

t = Test()
assert t.d['a'] == 3  # Should not throw AssertionError
Intercepting __getitem__ calls on an object attribute
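A property cannot take a key, but it can return a small wrapper object whose __getitem__ does the special processing; a sketch:

```python
class Interceptor:
    """Wraps a dict-like object and intercepts item access."""
    def __init__(self, d):
        self._d = d

    def __getitem__(self, key):
        val = self._d[key]
        if key == 'a':       # the special processing from the question
            val += 2
        return val

class Test:
    def __init__(self):
        self._d = {'a': 1, 'b': 2}

    @property
    def d(self):
        return Interceptor(self._d)

t = Test()
assert t.d['a'] == 3   # intercepted
assert t.d['b'] == 2   # passed through unchanged
```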
Python
I wrote a small, naive regular expression that was supposed to find text inside parentheses: re.search(r'\((.|\s)*\)', name). I know this is not the best way to do it for a few reasons, but it was working just fine. What I am looking for is simply an explanation as to why for some strings this expression ...
x ( y ) z
In [62]: %timeit re.search(r'\((.|\s)*\)', 'x(y)' + (22 * ' ') + 'z')
1 loops, best of 3: 1.23 s per loop
In [63]: %timeit re.search(r'\((.|\s)*\)', 'x(y)' + (23 * ' ') + 'z')
1 loops, best of 3: 2.46 s per loop
In [64]: %timeit re.search(r'\((.|\s)*\)', ...
Regular expression that never finishes running
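This is catastrophic backtracking: a space matches both `.` and `\s`, so `(.|\s)*` has roughly 2^n ways to cover n spaces, and when no closing `)` follows, the engine tries them all, doubling the time with each extra space (as the timings show). A single character class has only one way to match each character; a sketch:

```python
import re

# [\s\S] matches any character (including newline) with no ambiguity
text = 'x(y)' + ' ' * 40 + 'z'
m = re.search(r'\(([\s\S]*)\)', text)  # returns instantly

assert m is not None
assert m.group(1) == 'y'
```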
Python
In Python, if I copy a list or a dictionary, the copied instance is equal to the original: But if I copy an object, the result is not equal to the original: Why?
>>> a = [1, 2, 3]
>>> b = copy.copy(a)
>>> a == b
True
>>> a = {"a": 1, "b": 2}
>>> b = copy.copy(a)
>>> a == b
True
>>> class MyClass():
...     def __init__(self, name):
...         self._name = name
...
>>> a = MyClass('a')
>>> b = copy.copy(a)
>>> a == b
False
Why does Python's copy.copy() return an object not equal to the original?
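Lists and dicts define value-based equality, but a plain class inherits object.__eq__, which compares identity (like `is`). Defining __eq__ restores value equality after a copy; a sketch:

```python
import copy

class MyClass:
    def __init__(self, name):
        self._name = name

    def __eq__(self, other):
        # compare by value instead of the default identity check
        return isinstance(other, MyClass) and self._name == other._name

a = MyClass('a')
b = copy.copy(a)

assert a == b        # value equality now holds
assert a is not b    # but they are still two distinct objects
```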
Python
I am trying to make a selection field with widget="radio" required using attrs in an XML file. The selection field doesn't get required with widget="radio" applied to it, but when I remove the radio widget, the selection field gets the required effect in form view upon creating new records. This is the sel...
<field name="installments_calculation" widget="radio"
       options="{'horizontal': true}"
       attrs="{'required': [('repayment_method', '=', 'salary deduction')]}"/>

repayment_method = fields.Selection([('cash/bank', 'Cash/Bank'),
                                     ('salary deduction', 'Salary Deduction')])
Selection field with widget="radio" not getting required effect applied with attrs in XML file in Odoo 12
Python
I'm new to Python and Django, so please be patient with me. I have the following models: I know that this may not be the best/easiest way to handle it with Django, but I learned it this way and I want to keep it like this. Now, all I want to do is get all the posts from one person and its friends. The question now...
class User(models.Model):
    name = models.CharField(max_length=50)
    ...

class Post(models.Model):
    userBy = models.ForeignKey(User, related_name='post_user')
    userWall = models.ForeignKey(User, related_name='receive_user')
    timestamp = models.DateTimeField()
    post = models.TextField()

class Friend(...
Complex query with Django (posts from all friends)
Python
Why is it that if you compile a conditional expression like the one below, the branches that use numbers get optimized out, but those that use None don't? Example: In which scenarios could if 0 and if None behave differently?
def f():
    if None:
        print(222)
    if 0:
        print(333)

  3       0 LOAD_CONST               0 (None)
          3 POP_JUMP_IF_FALSE       14
  4       6 LOAD_CONST               1 (222)
          9 PRINT_ITEM
         10 PRINT_NEWLINE
         11 JUMP_FORWARD             0 (to 14)
  5 >>   14 LOAD_CONST               0 (None)
         17 RETURN_VALUE
Why does Python optimize out "if 0", but not "if None"?
Python
I have a DataFrame whose columns are a PeriodIndex by month, as follows: I can select a subset of columns as follows: However, when it comes to slicing to get, for example, all months from '2015-01' to '2015-03', I am stumped as to the syntax required. I have tried all kinds of iterations without luck. For example:...
df = pd.DataFrame(np.random.randn(3, 4), index=np.arange(3),
                  columns=pd.period_range('2015-01', freq='M', periods=4))

    2015-01   2015-02   2015-03   2015-04
0 -1.459943 -1.572013  2.977714 -0.067696
1 -1.545259 -0.570757  0.133756 -1.231192
2  0.498197 -0.555625  0.174545  0.371475

testdf[[pd.Period('201...
Selecting and Slicing Columns Which are a PeriodIndex
Python
Methods have an attribute, __self__, that holds the instance to be passed when the underlying function gets invoked. Apparently, so do built-in functions. In Python 3, they hold the module object: In Python 2, on the other hand, they hold None: Does anyone know why there's a discrepancy here? In addition to t...
>>> len.__self__
<module 'builtins' (built-in)>
>>> sys.getrefcount.__self__  # also for other builtin modules
<module 'sys' (built-in)>

>>> type(len.__self__)
<type 'NoneType'>
>>> type(sys.getrefcount.__self__)
<type 'NoneType'>

>>> from pprint import pprint
>>> pprint.__self__
AttributeError:...
Why does __self__ of built-in functions return the builtin module it belongs to ?
Python
I have a dataframe with multiple columns: AC, BC, CC, and DC. I would like to set a new column "MyColumn" where, if BC, CC, and DC are all less than AC, you take the max of the three for that row. If only CC and DC are less than AC, you take the max of CC and DC for that row, etc. If none of them are less than AC, MyColumn...
AC  BC  CC  DC  MyColumn
numpy.where ( ) with 3 or more conditions
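Rather than nesting numpy.where() calls for every combination, one sketch masks the candidate columns against AC and takes the row-wise max of whatever survives (column names as in the question, sample values hypothetical):

```python
import pandas as pd

df = pd.DataFrame({'AC': [10, 5], 'BC': [3, 9], 'CC': [4, 2], 'DC': [11, 1]})

others = df[['BC', 'CC', 'DC']]
# keep only the values that are less than AC in the same row; the rest become NaN
masked = others.where(others.lt(df['AC'], axis=0))
# row-wise max of the surviving values; NaN when none qualify
df['MyColumn'] = masked.max(axis=1)

assert df['MyColumn'].tolist() == [4.0, 2.0]
```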
Python
I'm using PySide to manage some hardware and perform some relatively simple operations depending on (e.g.) button clicks in the interface. The code for running each of these pieces of hardware resides in another thread. For convenience, to all of those hardware drivers I've added a generic invoke_method signal ...
my_driver.invoke_method.emit('method_name', [arg, ...], {kwarg, ...})
Safe and lazy method invocations in PySide/PyQt
Python
I am prototyping a new system in Python; the functionality is mostly numerical. An important requirement is the ability to use different linear algebra back-ends: from individual user implementations to generic libraries, such as Numpy. The linear algebra implementation (that is, the back-end) must be independent...
>>> v1 = Vector([1,2,3])
>>> v2 = Vector([4,5,6])
>>> print v1 * v2
>>> # prints "Vector([4, 10, 18])"

# this example uses numpy as the back-end, but I mean
# to do this for a general back-end
import numpy

def numpy_array(*args):
    # creates a numpy array from the arguments
    return numpy.a...
Architecture for providing different linear algebra back-ends
Python
I have a data frame that records responses of 19717 people's choice of programming languages through multiple choice questions. The first column is of course the gender of the respondent, while the rest are the choices they picked. The data frame is shown below, with each response being recorded as the same name as c...
ID  Gender             Python  Bash  R    JavaScript  C++
0   Male               Python  nan   nan  JavaScript  nan
1   Female             nan     nan   R    JavaScript  C++
2   Prefer not to say  Python  Bash  nan  nan         nan
3   Male               nan     nan   nan  nan         nan

Gender             Python  Bash  R     JavaScript  C++
Male               5000    1000  800   1500        1000
Female             4000    500   1500  3000        800
Prefer Not To Say  2000    ...   ...   ...         860

df.iloc[:...
Unstack and return value counts for each variable?
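Since each language column holds either the language name or NaN, counting non-null cells per gender gives the desired table directly. A minimal sketch (tiny made-up data, only the column layout comes from the question):

```python
import pandas as pd
import numpy as np

df = pd.DataFrame({
    'Gender': ['Male', 'Female', 'Prefer not to say', 'Male'],
    'Python': ['Python', np.nan, 'Python', np.nan],
    'R': [np.nan, 'R', np.nan, np.nan],
})

# count() tallies non-null cells, so each language column becomes the
# number of respondents of that gender who picked it.
counts = df.groupby('Gender')[['Python', 'R']].count()
print(counts)
```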
Python
I am trying to create genetic signatures. I have a textfile full of DNA sequences. I want to read in each line from the text file, then add 4-mers (which are 4 bases) into a dictionary. For example, given the sample sequence ATGATATATCTATCAT, I want to add ATGA, TGAT, GATA, etc. into a dictionary with IDs that just i...
Genetic signatures, ID
ATGA,1
TGAT,2
GATA,3

import sys

def main():
    readingFile = open("signatures.txt", "r")
    my_DNA = ""
    DNAseq = {}  # creates dictionary
    for char in readingFile:
        my_DNA = my_DNA + char
    for char in my_DNA:
        index = 0
        DnaID = 1
        seq = my_DNA[index:index+4]
        if (DNAseq.has_key(seq))...
Splicing through a line of a textfile using Python
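The sliding-window part can be done with a single loop over string slices. A sketch (file handling left out; the sequence is the sample from the question):

```python
def kmer_ids(sequence, k=4):
    """Map each k-mer, in order of first appearance, to an incrementing ID."""
    ids = {}
    for i in range(len(sequence) - k + 1):
        kmer = sequence[i:i + k]
        if kmer not in ids:
            ids[kmer] = len(ids) + 1
    return ids

ids = kmer_ids("ATGATATATCTATCAT")
print(ids)  # {'ATGA': 1, 'TGAT': 2, 'GATA': 3, ...}
```

Note `dict.has_key()` is Python 2 only; `kmer in ids` works in both versions.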
Python
Is there a built-in function or standard-library function roughly equivalent to the following, in any version of Python? (The latter is as good as the other two when combined with itertools.takewhile.) A generator function like these would allow one to compute certain recursively defined sequences iteratively, namely fi...

def recur_until(start, step_fu, stop_predicate=lambda _: False):
    current = start
    while not stop_predicate(current):
        yield current
        current = step_fu(current)

def recur_while(start, step_fu, predicate=lambda _: True):
    current = start
    while predicate(current):
        yield current
        current = step_fu(cur...
Does Python have an iterative recursion generator function for first-order recurrence relations?
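To my knowledge there is no such builtin; the unbounded version is small enough to write inline and composes with itertools.takewhile to recover the bounded variants (a sketch, not from the original post):

```python
from itertools import takewhile

def iterate(start, step_fu):
    """Yield start, step_fu(start), step_fu(step_fu(start)), ... forever."""
    current = start
    while True:
        yield current
        current = step_fu(current)

# Powers of two below 100: takewhile turns iterate() into recur_while.
powers = list(takewhile(lambda x: x < 100, iterate(1, lambda x: x * 2)))
print(powers)  # [1, 2, 4, 8, 16, 32, 64]
```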
Python
I've set up a Python App Engine project with Cloud Endpoints. I'm having a problem where, when I test locally, the auth with the endpoints fails, but it seems to work fine when I deploy to App Engine. Here is what I've done: generated an Android client id using the debug keystore; generated a web client id; decorated my e...
ERROR 2014-01-22 23:29:07,006 users_id_token.py:367 ] Token info endpoint returned status 400 : Invalid Value
Google Cloud Endpoints Android Client - Auth error
Python
I have a data frame like this, and I want something like this: I want to pass the column names to values of a row with their corresponding id, and the result of the df. I was trying to do it in a loop, introducing each element according to its id, but it's horrible. Is there an easy way to do this?
 id  a  b  c
101  0  3  0
102  2  0  5
103  0  1  4

 id letter  num
101      a    0
101      b    3
101      c    0
102      a    2
102      b    0
102      c    5
103      a    0
103      b    1
103      c    4
Passing columns to rows on python pandas
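This wide-to-long reshape is exactly what pandas melt does; a sketch using the sample data from the question:

```python
import pandas as pd

df = pd.DataFrame({'id': [101, 102, 103],
                   'a': [0, 2, 0],
                   'b': [3, 0, 1],
                   'c': [0, 5, 4]})

# melt keeps id_vars fixed and turns every other column into
# (column name, value) pairs; sort to match the desired output order.
long_df = (df.melt(id_vars='id', var_name='letter', value_name='num')
             .sort_values(['id', 'letter'])
             .reset_index(drop=True))
print(long_df)
```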
Python
Last edit: I've figured out what the problem was (see my own answer below), but I cannot mark the question as answered, it would seem. If someone can answer the questions I have in my answer below, namely, is this a bug in Cython or is this Cython's intended behavior, I will mark that answer as accepted, bec...
Wed Dec 15 10:35:44 EST 2010
test.py:5: McryptSecurityWarning: get_key() is not recommended
  return ''.join(['{:02x}'.format(x) for x in o.get_key()])
key: b'\x01ez\xd5\xa9\xf9\x1f)\xa0G\xd2\xf2Z\xfc{\x7fn\x02?,\x08\x1c\xc8\x03\x061X\xb5\xc9\x99\xd0\xca'
key: b'\x01ez\xd5\xa9\xf9\x1f)\xa0G\xd...
Data corruption: Where's the bug‽
Python
Why is it that executing a set of commands in a function:

def main():
    [do stuff]
    return something

print(main())

will tend to run 1.5 to 3 times faster in Python than executing the same commands at the top level:

[do stuff]
print(something)

global vs. local namespace performance difference
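The usual explanation is that names inside a function are resolved with the LOAD_FAST opcode (an array index into the frame's local slots), while top-level names go through LOAD_GLOBAL (a dictionary lookup). A small sketch to see both opcodes side by side:

```python
import dis

x = 1

def f():
    y = 2
    return x + y

# Locals are fetched by array index (LOAD_FAST); globals go through a
# dictionary lookup (LOAD_GLOBAL), which is the main source of the gap.
ops = {ins.opname for ins in dis.get_instructions(f)}
print(ops)
```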
Python
I have now added the current problem onto GitHub; please find the URL for the repo below. I have included a Jupyter notebook that also explains the problem. Thanks guys. https://github.com/simongraham/dataExplore.git I am currently working with nutritional data for a project, where the data is in raw JSON format, and I ...
nutrition = pd.read_json ( 'data ' ) [ { `` vcNutritionPortionId '' : `` 478d1905-f264-4d9b-ab76-0ed4252193fd '' , `` vcNutritionId '' : `` 2476378b-79ee-4857-a81d-489661a039a1 '' , `` vcUserId '' : `` cc51145b-5a70-4344-9b55-1a4455f0a9d2 '' , `` vcPortionId '' : `` 1 '' , `` vcPortionName '' : `` 1 average pepper '' ,...
viewing nested JSON data into a pandas dataframe
Python
Given the following class: I don't understand why they use the property function in this case. Using the property decorator for the input_size function allows one to call input_size on an object, let's call it cell, of that class, but why don't they just simply call cell._input_size? Can anybody tell me why th...
class BasicRNNCell ( RNNCell ) : `` '' '' The most basic RNN cell . '' '' '' def __init__ ( self , num_units , input_size=None ) : self._num_units = num_units self._input_size = num_units if input_size is None else input_size @ property def input_size ( self ) : return self._input_size @ property def output_size ( self...
Usefulness of @ property in this case
Python
In Python everything is an object and you can pass it around easily. So I can do the following, but if I do a = print I get a SyntaxError. Why so?

>>> def b():
...     print "b"
>>> a = b
>>> a()
b
>>> a = print

Is print a function in Python?
Python
I've created a view that displays previews of other templates. I want to show the empty tags inside the templates, so I've included the setting below in my settings.py file. However, I would only like to enable this setting for a particular view, not globally in my app. Thanks in advance. :)
TEMPLATE_STRING_IF_INVALID = '%s'
Enable django 's TEMPLATE_STRING_IF_INVALID on a single method
Python
I couldn't find a straightforward way to compare two (multidimensional, in my case) arrays in a lexicographic way. I.e., for a < b I want to get True, but instead I get [True, False, False, True]. For a > b I want to get False, but instead I get [False, True, True, False].
a = [1, 2, 3, 4]
b = [4, 0, 1, 6]
Lexicographic comparison of two numpy ndarrays
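One simple option (a sketch, not from the original post): convert to nested Python lists, which compare lexicographically by definition; `<` on the arrays themselves is elementwise.

```python
import numpy as np

a = np.array([1, 2, 3, 4])
b = np.array([4, 0, 1, 6])

# Elementwise comparison (what a < b gives) vs. lexicographic order:
print(a < b)                    # elementwise boolean array
print(a.tolist() < b.tolist())  # True -- Python lists compare lexicographically
```

tolist() also works for multidimensional arrays, since nested lists compare recursively; for large arrays the conversion cost may matter.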
Python
I'm new to Twisted and I'm having trouble with some necessary subclassing of static.File in Twisted; I'm trying to set request headers within the subclass. The first bit of code is the subclass definition itself (pretty straightforward), while the second bit is the initialization portion from my code (this ...
class ResponseFile(static.File):
    def render_GET(self, request):
        request.setHeader('Content-Disposition',
                          ['attachment; filename="tick_db_export.csv"'])
        static.File.render_GET(self, request)

if __name__ == "__main__":
    from twisted.internet import reactor
    root = ResponseFile('WebFolder'...
Subclassing static.File
Python
I have a DataFrame containing a time series; the last entry is 2016-06-07 23:00:00. I now want to group this by, say, two days, basically like so: However, I want to group starting from my last data point backwards, so instead of getting the first result below, I'd much rather expect the second; and when grouping by '3D': Expect...
rng = pd.date_range('2016-06-01', periods=24*7, freq='H')
ones = pd.Series([1]*24*7, rng)
rdf = pd.DataFrame({'a': ones})
rdf.groupby(pd.TimeGrouper('2D')).sum()

             a
2016-06-01  48
2016-06-03  48
2016-06-05  48
2016-06-07  24

             a
2016-06-01  24
2016-06-03  48
2016-06-05  48
2016-06-07  48

             a
2016-06-01  24
20...
Groupby with TimeGrouper 'backwards'
Python
I want to keep the last n rows of each group, sorted by a variable var_to_sort, using pandas. This is how I would do it now: I group the below dataframe by name, then sort by date, and then use tail(n) to get the last n elements within each by-group. Is there a more efficient way to do this? What if the data...
data = [['tom', date(2018, 2, 1), "I want this"],
        ['tom', date(2018, 1, 1), "Don't want"],
        ['nick', date(2019, 4, 1), "Don't want"],
        ['nick', date(2019, 5, 1), "I want this"]]

# Create the pandas DataFrame
df = pd.DataFrame(data)
df.columns = ["names", "date", ...
pandas: How to keep the last `n` records of each group sorted by another variable?
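Sorting once before grouping is usually enough, since groupby preserves row order within each group (a sketch; the third column name `txt` is my placeholder, as it is truncated in the question):

```python
import pandas as pd
from datetime import date

data = [['tom', date(2018, 2, 1), "I want this"],
        ['tom', date(2018, 1, 1), "Don't want"],
        ['nick', date(2019, 4, 1), "Don't want"],
        ['nick', date(2019, 5, 1), "I want this"]]
df = pd.DataFrame(data, columns=['names', 'date', 'txt'])

# Sort once, then take the last n rows of each group; tail(1) is the
# latest date per name because within-group order is preserved.
out = df.sort_values('date').groupby('names').tail(1)
print(out)
```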
Python
In the following example, I'd like to exclude weekends and plot Y as a straight line, and specify some custom frequency for major tick labels, since they would be a "broken" time series (e.g., every Monday, a la matplotlib's set_major_locator). How would I do that in Altair?
import altair as alt
import pandas as pd

index = pd.date_range('2018-01-01', '2018-01-31', freq='B')
df = pd.DataFrame(pd.np.arange(len(index)), index=index, columns=['Y'])
alt.Chart(df.reset_index()).mark_line().encode(x='index', y='Y')
How can I exclude certain dates (e.g., weekends) from time series plots?
Python
I want to feed data coming from Spark clusters to train a deep network. I do not have GPUs in the nodes, so distributed TensorFlow or packages like elephas are not an option. I have come up with the following generator which does the job: it just retrieves the next batch from Spark. In order to handle batches I a...
class SparkBatchGenerator(tfk.utils.Sequence):
    def __init__(self, spark_df, batch_size, sample_count=None,
                 feature_col='features', label_col='labels'):
        w = Window().partitionBy(sf.lit('a')).orderBy(sf.lit('a'))
        df = spark_df.withColumn('index', sf.row_number().over(w)).sor...
Best practice for feeding spark dataframes for training Tensorflow network
Python
Here's a snippet of code from within TurboGears 1.0.6: I can't figure out how putting a list before a function definition can possibly affect it. In dispatch.generic's docstring, it mentions: Note that when using older Python versions, you must use '[dispatch.generic()]' instead of '@dispatch.generic()...
[dispatch.generic(MultiorderGenericFunction)]
def run_with_transaction(func, *args, **kw):
    pass
What does the []-esque decorator syntax in Python mean?
Python
I'm developing a package that requires Python bindings for the dgtsv subroutine from the LAPACK Fortran library. At the moment, I'm distributing the Fortran source file, dgtsv.f, alongside my Python code, and using numpy.distutils to automatically wrap it and compile it into a shared library, _gtsv.so, that is...
from numpy.distutils.core import setup, Extension, build_ext
import os

fortran_sources = ["dgtsv.f"]
gtsv = Extension(name="pyfnnd._gtsv",
                 sources=[os.path.join("pyfnnd", "LAPACK", ff)
                          for ff in fortran_sources],
                 extra_link_args=['-llapack'])
setup(name='pyfnnd', py_modules=['_...
Is it possible to wrap a function from a shared library using F2PY ?
Python
In NumPy functions, there are often initial lines that check variable types, force them to be certain types, etc. Can someone explain the point of these lines in scipy.signal.square? What does subtracting a value from itself do? (source)
t, w = asarray(t), asarray(duty)
w = asarray(w + (t - t))
t = asarray(t + (w - w))
Why subtract a value from itself (x - x) in Python?
Python
I wish to populate the values x, y and z, where x are the x coordinates, y the y coordinates, and z the associated values for each coordinate, as defined by p. Here is how I am doing it: Is there another, more readable way? Perhaps there is something in numpy or scipy for doing this? Why does z result in array([...
p = {(1, 2): 10, (0, 2): 12, (2, 0): 11}
k, z = np.array(list(zip(*p.items())))
x, y = np.array(list(zip(*k)))
unzip a dictionary of coordinates and values
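A somewhat more readable version (a sketch, not from the original post) avoids the zip dance entirely: keys and values of a dict iterate in matching order (guaranteed insertion order on Python 3.7+), so the key array can be sliced by column.

```python
import numpy as np

p = {(1, 2): 10, (0, 2): 12, (2, 0): 11}

# keys() and values() iterate in the same (insertion) order.
xy = np.array(list(p.keys()))   # shape (3, 2), one (x, y) pair per row
x, y = xy[:, 0], xy[:, 1]
z = np.array(list(p.values()))
print(x, y, z)
```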
Python
I'm attempting to leverage TensorFlow 2.0's automatic differentiation to automate the calculation of certain gradients on financial instruments. Generally this involves a piecewise interpolation scheme between various "benchmark points". The simplest example is below, which outputs the following: The issue is...
import tensorflow as tf

MATURITIES = tf.constant([1.0, 2.0, 3.0, 5.0, 7.0, 10.0, 12.0, 15.0, 20.0, 25.0])
CASH_FLOW_TIMES = tf.constant([n * 0.5 for n in range(1, 51)])
YIELDS = tf.Variable([0.04153733, 0.0425888, 0.04662959, 0.05406879, 0.05728735,
                     0.0606996, 0.06182699, 0.05854381, 0...
How can I make TensorFlow 2.0 handle piecewise gradients (e.g. across `tf.gather`)?
Python
In SymPy, is it possible to modify the way derivatives of functions are output with latex()? The default is quite cumbersome. This:

f = Function("f")(x, t)
print latex(f.diff(x, x))

will output

\frac{\partial^{2}}{\partial x^{2}} f{\left(x,t \right)}

which is quite verbose. If I prefer something like f_{xx}, is there a way to force this behavior?

Sympy: Modifying LaTeX output of derivatives
Python
As input I have two dataframes. I want to end up with the following dataframe, with an additional column in the first dataframe giving the category number if the code lies in the corresponding interval. Intervals are random and the original dataframes are pretty big; looping with itertuples is too slow. Any pythonic sol...
data1 = [{'code': 100}, {'code': 120}, {'code': 110}]
data1 = pd.DataFrame(data1)

   code
0   100
1   120
2   110

data2 = [{'category': 1, 'l_bound': 99,  'r_bound': 105},
         {'category': 2, 'l_bound': 107, 'r_bound': 110},
         {'category': 3, 'l_bound': 117, 'r_bound': 135}]
data2 = pd.DataFrame(data2)

   category  l_bo...
Merge two dataframes with interval data in one of them
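One vectorized option (a sketch, not from the original post) is to build a pandas IntervalIndex from the bounds and look up every code at once; this assumes the intervals do not overlap:

```python
import pandas as pd

data1 = pd.DataFrame([{'code': 100}, {'code': 120}, {'code': 110}])
data2 = pd.DataFrame([{'category': 1, 'l_bound': 99,  'r_bound': 105},
                      {'category': 2, 'l_bound': 107, 'r_bound': 110},
                      {'category': 3, 'l_bound': 117, 'r_bound': 135}])

idx = pd.IntervalIndex.from_arrays(data2['l_bound'], data2['r_bound'],
                                   closed='both')
# get_indexer returns the interval position per code, -1 if no interval
# matches (unmatched codes would need masking before the take below).
pos = idx.get_indexer(data1['code'])
data1['category'] = data2['category'].values[pos]
print(data1)
```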
Python
My goal is to generate functions dynamically and then save them in a file. For example, in my current attempt, on calling create_file the output I want (in file /tmp/code.py) differs from the output I get (in file /tmp/code.py). UPDATE: My solution uses inspect.getsource, which returns a string. I wonder if I have limited ...
import io

def create_file(a_value):
    a_func = make_concrete_func(a_value)
    write_to_file([a_func], '/tmp/code.py')

def make_concrete_func(a_value):
    def concrete_func(b, k):
        return b + k + a_value
    return concrete_func

def write_to_file(code_list, path):
    import inspect
    code_str_list = [inspect.ge...
Generate function with arguments filled in when creating it?
Python
I am trying to create a column on a data frame which contains the minimum of column A (the value column) for rows where column B (the id column) has a particular value. My code is really slow; I'm looking for a faster way to do this. Here is my little function, with example usage. Some more context: in my real d...
def apply_by_id_value(df, id_col="id_col", val_col="val_col",
                      offset_col="offset", f=min):
    for rid in set(df[id_col].values):
        df.loc[df[id_col] == rid, offset_col] = f(df[df[id_col] == rid][val_col])
    return df

import pandas as pd
import numpy as np

# create data frame
df ...
Fastest way to compute a function on DataFrame slices by column value (Python pandas)
Python
I want to upload packages to pypi.org as mentioned in the Migrating to PyPI.org documentation, but Twine uploads to https://upload.pypi.org/legacy/. It's available on pypi.python.org/pypi/mypolr, but is not found on pypi.org. I've tried to read several other questions, tutorials, and guides. My pip.ini file (I ...
[distutils]
index-servers =
    pypi

[pypi]
Why is Twine 1.9.1 still uploading to legacy PyPi ?
Python
I wish to yield the following when calling sub_combinations(('A', 'B', 'C', 'D')). Here's my attempt, but it doesn't work, although I think I'm on the right track. Additionally, I'd like to have a second argument called limit which limits the size of the sub-tuples, for example sub_combinations(('A...
(('A',), ('B',), ('C',), ('D',))
(('A',), ('B',), ('C', 'D'))
(('A',), ('B', 'C'), ('D',))
(('A',), ('B', 'C', 'D'))
(('A', 'B'), ('C',), ('D',))
(('A', 'B'), ('C', 'D'))
(('A', 'B', 'C'), (...
subtuples for a tuple
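The desired output is every split of the tuple into contiguous non-empty groups (there are 2**(n-1) of them). A recursive generator sketch (not from the original post):

```python
def sub_combinations(seq):
    """Yield every way to split seq into contiguous, non-empty groups."""
    if not seq:
        yield ()
        return
    for i in range(1, len(seq) + 1):
        head = (tuple(seq[:i]),)      # first group: the leading i elements
        for rest in sub_combinations(seq[i:]):
            yield head + rest

result = list(sub_combinations(('A', 'B', 'C', 'D')))
print(len(result))  # 8 splits for 4 elements (2 ** 3)
```

A `limit` argument capping group size would just change the loop bound to `range(1, min(limit, len(seq)) + 1)`.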
Python
I am executing a subprocess using Popen and feeding it input as follows (using Python 2.7.4). Adding the LC_ALL entry to the environment it is executed with is necessary because the input string includes Japanese characters, and when the script is not executed from the command line (in my case being called by Apache), ...
env = dict(os.environ)
env['LC_ALL'] = 'en_US.UTF-8'
args = ['chasen', '-i u', '-F"%m"']
process = Popen(args, stdout=PIPE, stderr=PIPE, stdin=PIPE, env=env)
out, err = process.communicate(input=string)
Popen subprocess does not exit when stdin includes unicode
Python
Note: There is a very similar question here. Bear with me, however; my question is not "Why does the error happen," but "Why was Python implemented so as to throw an error in this case?" I just stumbled over this: the first snippet below throws an UnboundLocalError. Now, I do know why that happens (later in this scope, a i...
a = 5
def x():
    print a
    a = 6
x()

a = 5
def x():
    print b
    b = 6
x()

a = 5
def x():
    print globals()["a"]
    a = 6  # local assignment
x()

import os
os.rename("foo", "bar")
Reason for unintuitive UnboundLocalError behaviour
Python
In my application I have a Job class as defined/outlined below. An instance of this Job class represents a particular job run. A job can have multiple checkpoints, and each checkpoint can have multiple commands. On any given day some 100k different jobs run. Job information is persisted in the file system. I...
Job
 - JobName
 - [JobCheckpoint]
 - StartTime
 - EndTime
 - Status
 - ...

JobCheckpoint
 - JobCheckpointName
 - [JobCommand]
 - StartTime
 - EndTime
 - Status
 - ...

JobCommand
 - JobCommandName
 - [Command]
 - StartTime
 - EndTime
 - Status
 - ...

get_jobs(Filter)
User Interface for filtering objects in Python
Python
I am using Python to scrape AAPL's stock price from Yahoo Finance, but the program always returns []. I would appreciate it if someone could point out why the program is not working. Here is my code, followed by the original page source it targets. Here I just want the price, 112.31. I copy and paste the code and find 'class' ch...
import urllib
import re

htmlfile = urllib.urlopen("https://ca.finance.yahoo.com/q?s=AAPL&ql=0")
htmltext = htmlfile.read()
regex = '<span id=\"yfs_l84_aapl\" class="">(.+?)</span>'
pattern = re.compile(regex)
price = re.findall(pattern, htmltext)
print price

<span id="yfs_l84_aapl" c...
Python Web Scraping Problems
Python
I want to add in the count of values that have been summed in the dataframe per year, as well as two additional columns, [total of years] and [total count]. EDIT: EDIT 2: @Jezrael, if I want to select only the rows I need (as discussed in the other question), I run into problems with columns being named the sam...
import pandas as pd
import numpy as np

df = pd.DataFrame({'A': ['d','d','d','f','f','f','g','g','g','h','h','h'],
                   'B': [5,5,6,7,5,6,6,7,7,6,7,7],
                   'C': [1,1,1,1,1,1,1,1,1,1,1,1],
                   'S': [2012,2013,2014,2015,2016,2012,2013,2014,2015,2016,2012,2013]})
df ...
Add in count of values and columns for totals
Python
Is it possible to start the bpython interpreter so that it always runs some custom commands when it launches? In my case I simply want to do the two imports below. I can't see anything in the docs; anyone know a way?
import numpy as np
import matplotlib.pyplot as plt
bpython configuration - importing numpy and matplotlib by default
Python
Okay, so after going through the tutorials on numpy's structured arrays I am able to create some simple examples. (My intended use case would have more than three entries and would use very long 1d-arrays.) So, all goes well until we try to perform some basic math: I get errors for all of the following. Appare...
from numpy import array, ones

names = ['scalar', '1d-array', '2d-array']
formats = ['float64', '(3,)float64', '(2,2)float64']
my_dtype = dict(names=names, formats=formats)
struct_array1 = ones(1, dtype=my_dtype)
struct_array2 = array([(42., [0., 1., 2.], [[5., 6.], [4., 3...
No binary operators for structured arrays in Numpy?
Python
Motivation: Take a look at the following picture. Given are the red, blue, and green curves. I would like to find, at each point on the x axis, the dominating curve. This is shown as the black graph in the picture. From the properties of the red, green, and blue curves (increasing and constant after a while) this boi...
test = A(5, 120000, 100000)
test.find_all_intersections()

>>> test.find_all_intersections()
iteration 4
to compute function values it took
0.0102479457855
iteration 3
to compute function values it took
0.0134601593018
iteration 2
to compute function values it took
0.0294270515442
iteration 1
to compute function values...
How to efficiently pass function through?
Python
The OpenSSL version on my Mac and the one used by my Python are not the same. The OpenSSL version of my Mac is as follows, whereas when I check it in Python with ssl.OPENSSL_VERSION I get 'OpenSSL 0.9.8zh 14 Jan 2016'. I've tried brew, but it installs Python 3.7 and I need 3.5. Tried installing it with pyenv but ...
OpenSSL 1.0.2q  20 Nov 2018
built on: reproducible build, date unspecified
platform: darwin64-x86_64-cc
Python 3.5 with OpenSSL v > 1 on macOS Mojave
Python
I'm running some Matlab code in parallel from inside a Python context (I know, but that's what's going on), and I'm hitting an import error involving matlab.double. The same code works fine in a multiprocessing.Pool, so I am having trouble figuring out what the problem is. Here's a minimal reproducing test ...
import matlab
from multiprocessing import Pool
from joblib import Parallel, delayed

# A global object that I would like to be available in the parallel subroutine
x = matlab.double([[0.0]])

def f(i):
    print(i, x)

with Pool(4) as p:
    p.map(f, range(10))
# This prints 1, [[0.0]]\n2, [[0.0]...
Error pickling a `matlab` object in joblib `Parallel` context
Python
Using Pillow 5.4.1, Python 3.6.8. Given an image image.png with 9 distinct colours, and given a data palette with 5 distinct colours, one would expect that asking Pillow to reduce the image to the described palette would produce an image containing colours from only that palette. However, using the im.im.conver...
from PIL import Image

im = Image.open("image.png")

# create palette from raw data
# colours: Red, Green, Blue, Black, and White (5 total)
RGBBW = [(255, 0, 0), (0, 255, 0), (0, 0, 255), (0, 0, 0), (255, 255, 255)]
data = sum([list(x) for x in RGBBW], [])[:256]
pimg = Image.new("P"...
Why does Pillow convert return colours outside the specified palette?
Python
How can I save a pixmap as a .png file? I do this, and I get this error:
image = gtk.Image()
image.set_from_pixmap(disp.pixmap, disp.mask)
pixbf = image.get_pixbuf()
pixbf.save('path.png')

pixbf = image.get_pixbuf()
ValueError: image should be a GdkPixbuf or empty
ValueError while trying to save a pixmap as a png file
Python
I would like to extract the source code verbatim from code directives in a reStructuredText string. What follows is my first attempt at doing this, but I would like to know if there is a better (i.e. more robust, or more general, or more direct) way of doing it. Let's say I have the following rst text as a s...
s = '''My title
========

Use this to square a number

.. code:: python

    def square(x):
        return x**2

and here is some javascript too

.. code:: javascript

    foo = function() {
        console.log('foo');
    }
'''

from docutils.core import publish_doctree

doctree = publish_doctree(s)
source_code = [child.astext() for...
extract code from code directive from restructuredtext using docutils
Python
There is some Python code which is supposed to support Python 3, but may or may not run in Python 2.7. For example, the snippet below can run in both Python 2.7 and Python 3. What is the standard way to enforce and recommend Python 3 compatibility in strict mode, even if the code runs fine on Python 2.7? Python 2.7: https...
print('This file works in both')
print('How to throw an exception, and suggest recommendation of python 3 only?')
What is the standard way to recommend "Python 3 only" compatibility for a Python module?
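A common runtime-level approach (a sketch, not from the original post) is an explicit version guard that fails fast with a clear message instead of a confusing SyntaxError somewhere later:

```python
import sys

# Fail fast on Python 2 with a readable message.
if sys.version_info < (3,):
    raise RuntimeError("This module requires Python 3; "
                       "you are running %d.%d" % sys.version_info[:2])

print('This file works only on Python 3')
```

At the packaging level, the counterpart is `python_requires=">=3"` in `setup()`, which lets pip refuse to install the package under Python 2 in the first place.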
Python
You may have heard of the classic checkerboard covering puzzle: how do you cover a checkerboard that has one corner square missing, using L-shaped tiles? There is a recursive approach to this, as explained in the book "Python Algorithms: Mastering Basic Algorithms in the Python Language." The idea is to split the...
def cover(board, lab=1, top=0, left=0, side=None):
    if side is None:
        side = len(board)  # Side length
    s = side // 2
    # Offsets for outer/inner squares of subboards
    offsets = ((0, -1), (side-1, 0))
    for dy_outer, dy_inner in offsets:
        for dx_outer, dx_inner in offsets:
            # If the outer corner is not se...
What is the intuition behind the checkerboard covering recursive algorithm, and how does one get better at formulating such an algorithm?
Python
I was recently asked to do this task (school): write a loop generator which takes as parameter a finite iterable, and generates the iterable in an infinite loop. So I wrote the first version below, and one of my classmates wrote the second. I'd like to know what the main differences are between the two, and if there is a more "pythonic" way to write ...
import itertools

def loop(l):
    for element in itertools.cycle(l):
        yield element

def loop(l):
    while True:
        for element in l:
            yield element
itertools.cycle(iterable) vs while True
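One behavioural difference worth noting (a sketch, not from the original post): cycle() caches the elements it has seen, so it also works when the argument is a one-shot iterator, whereas the while-True version would keep looping over the exhausted iterator without ever yielding again.

```python
import itertools

# A one-shot iterator: it can be consumed only once.
one_shot = iter([1, 2, 3])

# cycle() stores each element as it is produced, so the repetition
# still works even though the underlying iterator is exhausted.
first_seven = list(itertools.islice(itertools.cycle(one_shot), 7))
print(first_seven)  # [1, 2, 3, 1, 2, 3, 1]
```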
Python
I was working on a simple class that extends dict, and I realized that key lookup and use of pickle are very slow. I thought it was a problem with my class, so I did some trivial benchmarks. The results are really a surprise: while key lookup is 2x slower, pickle is 5x slower. How can this be? Other methods, like ...
(venv) marco@buzz:~/sources/python-frozendict/test$ python --version
Python 3.9.0a0
(venv) marco@buzz:~/sources/python-frozendict/test$ sudo pyperf system tune --affinity 3
[sudo] password for marco:
Tune the system configuration to run benchmarks
Actions
=======
CPU Frequency: Minimum frequency of CPU ...
Why does subclassing in Python slow things down so much?
Python
How may I introduce a continuous hue to my seaborn pairplots? I am passing in a pandas data frame train_df in order to visualise the relationships between the multiple features. However, I'd also like to add a hue which would use the corresponding target values, target_df. These target values are on a continuous s...
sns.pairplot(train_df)
Seaborn pairplots with continuous hues?
Python
I tried logging with multiprocessing, and found that under Windows I get a different root logger in the child process, but under Linux it is OK. The test code is in main.py and mymod.py below. Under Linux the result is as expected, but under Windows 7 64-bit I get different root loggers between main and func. And if I initialize r...
#!/usr/bin/env python
# -*- coding: utf-8 -*-
import logging
import multiprocessing
from mymod import func

def m_func():
    server = multiprocessing.Process(target=func, args=())
    server.start()

logger = logging.getLogger()
# print 'in global main:', logger

if __name__ == '__main__':
    print 'in main:', lo...
python logging with multiprocessing , root logger different in windows
Python
I have some code which is supposed to be a thread-safe Python/C++ API. I am using the macros Py_BEGIN_ALLOW_THREADS and Py_END_ALLOW_THREADS, which expand to save the thread state and create a lock. I am releasing the lock just before method exit: once inside the if-statement scope, and once at method scope. Why ...
uint8_t SerialBuffer::push_msg() {
#if defined(UBUNTU)
    Py_BEGIN_ALLOW_THREADS
#endif
    if (_type == ARRAY) {
        // array access
    } else if (_type == PRIORITY_QUEUE) {
        // queue access
    } else {
        // Placing the return statement in the preprocessor directive
        // has no effect.
#if defined(UBUNTU)
        Py_END_ALLOW_THRE...
C++ macro in scope of if statement not compiling
Python
I have a nested job structure in my Python Redis queue. First the rncopy job is executed. Once this is finished, the 3 dependent registration jobs follow. When the computation of all these 3 jobs is finished, I want to trigger a job to send a websocket notification to my frontend. My current try is below; unfortunately it seem...
rncopy = redisqueue.enqueue(raw_nifti_copymachine, patientid, imagepath, timeout=6000)
t1c_reg = redisqueue.enqueue(modality_registrator, patientid, "t1c", timeout=6000, depends_on=rncopy)
t2_reg = redisqueue.enqueue(modality_registrator, patientid, "t2", timeout=6000, depends_on=rncopy)
fla...
python rq: how to trigger a job when multiple other jobs are finished? Multi-job dependency workaround?
Python
I'm writing a program to calculate Levenshtein distance in Python. I implemented memoization because I am running the algorithm recursively. My original function implemented the memoization in the function itself; here's what it looks like. This works! However, I found a way to memoize using decorators. I trie...
# Memoization table mapping from a tuple of two strings to their Levenshtein distance
dp = {}

# Levenshtein distance algorithm
def lev(s, t):
    # If one string is empty, return the length of the other
    if not s:
        return len(t)
    if not t:
        return len(s)
    # If the last two characters are the same, no cost. Otherwise, cost...
Maximum recursion depth exceeded, but only when using a decorator
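For the decorator to help, it must wrap the very name `lev` that the recursive calls resolve, so every recursive call hits the cache rather than the bare function. A sketch using the standard-library functools.lru_cache (not the asker's custom decorator, which is truncated above):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def lev(s, t):
    """Recursive Levenshtein distance; recursive calls go through the cache."""
    if not s:
        return len(t)
    if not t:
        return len(s)
    cost = 0 if s[-1] == t[-1] else 1
    return min(lev(s[:-1], t) + 1,          # deletion
               lev(s, t[:-1]) + 1,          # insertion
               lev(s[:-1], t[:-1]) + cost)  # substitution

print(lev("kitten", "sitting"))  # 3
```

With memoization the recursion depth is bounded by len(s) + len(t) rather than the exponential call tree, which is why the undecorated (or mis-decorated) version can blow the recursion limit while this one does not.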