pythondev #help (2017-09-07)

[11:22:26] Johana: <@Orpha> :point_up: i have two examples there.
[11:23:56] Orpha: I was looking for more of an add_all replacement
[11:24:51] Johana: add_all?
[11:26:26] Orpha: so right now
[11:26:39] Orpha: i have this in a for loop
[11:27:07] Johana: oh you are using the orm.
[11:27:12] Orpha: yea
[11:27:27] Johana: then you will need to use session.merge()
[11:27:28] Orpha: i can take it out of the loop and use add_all
[11:27:39] Orpha: ok, yea i was looking at that just now
[11:28:15] Orpha: im moving files to s3 and storing the url in the db, but i have to handle duplicates
[11:28:29] Orpha: just in case #clients
[11:28:33] Johana: i wouldn’t consider that a bulk insert but ok.
[11:28:40] Orpha: fair enough
[11:29:23] Johana: through the orm you would have to loop through all of them and .merge() them into the session.
[11:29:28] Orpha: i haven’t actually read into it, but im assuming the current way im doing it is bad performance-wise
[11:29:45] Johana: yea, the fastest thing you’ll get is using the core.
[11:30:05] Orpha: sort of like doing a raw insert in a for loop 1000 times
[11:30:33] Gabriele: Moving the commit out of the loop will likely help a lot
[11:30:39] Chu: Hi! Does anyone know a way to validate a json string without `json.loads`? `eval` is faster, but no :slightly_smiling_face:
[11:30:43] Johana: yea
[11:30:48] Johana: it will commit everything at once.
[11:30:55] Johana: remember that the session is like a recorder.
[11:30:55] Orpha: yea, that was my other question
[11:31:08] Johana: it records everything in order and executes everything in that order when you do `session.commit()`
[11:31:23] Orpha: is it worth building a list, then adding it all at once, then committing, or just moving the commit out?
[11:31:33] Gabriele: <@Chu> Why don't you want to use `loads`?
[11:31:39] Johana: your loop above is doing an insert statement for every iteration.
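Johana's point, that the loop issues one INSERT and one COMMIT per iteration while `add_all` plus a single commit stages everything at once, can be sketched against a throwaway in-memory SQLite database. The `FileRecord` model and its columns are invented for illustration; Orpha's real table isn't shown in the chat.

```python
from sqlalchemy import create_engine, Column, Integer, String
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

class FileRecord(Base):
    # Hypothetical model standing in for the "url stored in db" table.
    __tablename__ = "files"
    id = Column(Integer, primary_key=True)
    url = Column(String, unique=True)

engine = create_engine("sqlite://")  # in-memory DB, just for the sketch
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()

urls = [f"https://example.com/file{i}" for i in range(5)]

# Slow pattern from the loop: session.add(obj) + session.commit() per item.
# Faster: stage all objects in the session, then flush and commit once.
session.add_all([FileRecord(url=u) for u in urls])
session.commit()

print(session.query(FileRecord).count())  # 5
```

Either way the session only talks to the database when it flushes, so the main win is issuing one transaction instead of one per row.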
[11:32:30] Chu: I'm receiving a string from the client side and need to store it in the database as is. But I need to know that my json is valid. (There is no way to make it a json field in the db; it's too large a db under load)
[11:32:34] Johana: it depends <@Orpha>. how many inserts do you suspect?
[11:32:39] Gabriele: <@Orpha> If there's no good reason why you need to commit after each one, then don't do it. Usually you would commit after a coherent block of work has been completed. In this case, finishing inserting everything is a logical block of work, I'd say
[11:33:04] Gabriele: That doesn't explain why you don't want to use `loads`
[11:33:44] Chu: It consumes a lot of memory and CPU time at my scale. Looking for optimization.
[11:34:11] Orpha: anywhere from 1 to 7k
[11:35:42] Gabriele: I don't know of a faster way than to use `loads`, sorry
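For reference, the stdlib validity check Gabriele is pointing at is just `json.loads` wrapped in try/except; faster third-party C parsers such as `orjson` exist, but there is no stdlib way to validate without parsing.

```python
import json

def is_valid_json(s: str) -> bool:
    """Return True if s parses as JSON. Parsing is the validation."""
    try:
        json.loads(s)
        return True
    except (ValueError, TypeError):
        return False

print(is_valid_json('{"a": 1}'))  # True
print(is_valid_json('{a: 1}'))    # False
```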
[11:37:41] Orpha: moving the commit out throws the duplicate entry error
[11:38:17] Gabriele: the commit won't affect that
[11:38:38] Orpha: well… it does
[11:38:54] Johana: did you try .merge instead of .add?
[11:38:58] Gabriele: at some point, a commit will happen, and your duplicates will get found
[11:39:08] Orpha: not yet
[11:39:13] Orpha: only tried moving the commit out
[11:39:47] Gabriele: moving the commit is to speed things up, that's all
[11:39:58] Johana: :point_up:
[11:40:02] Orpha: :thumbsup:
[11:40:21] Johana: for ultimate performance gains i urge using the core.
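"Using the core" for bulk inserts can be sketched as a single executemany against a Core `Table`, skipping ORM object bookkeeping entirely. Table and column names here are invented for the example.

```python
from sqlalchemy import (
    create_engine, MetaData, Table, Column, Integer, String,
    insert, select, func,
)

metadata = MetaData()
files = Table(
    "files", metadata,
    Column("id", Integer, primary_key=True),
    Column("url", String, unique=True),
)

engine = create_engine("sqlite://")  # in-memory DB for the sketch
metadata.create_all(engine)

rows = [{"url": f"https://example.com/f{i}"} for i in range(1000)]

# Passing a list of dicts makes this an executemany:
# one statement, all rows, one transaction.
with engine.begin() as conn:
    conn.execute(insert(files), rows)

with engine.connect() as conn:
    count = conn.execute(select(func.count()).select_from(files)).scalar()
print(count)  # 1000
```

For the 1-to-7k rows Orpha mentions, this is about as fast as SQLAlchemy gets short of raw driver calls.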
[11:40:48] Johana: but if that isn’t of utmost importance … carry on.
[11:41:10] Orpha: not the most important
[11:41:29] Orpha: so merge needs an existing object?
[11:42:08] Orpha: ah, same duplicate error too
[11:42:12] Johana: i think so.
[11:42:29] Johana: yea it will need the id of the entity and it will merge the records.
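What Johana describes, merging by the entity's id, can be sketched like this (again with an invented `FileRecord` model and an in-memory database):

```python
from sqlalchemy import create_engine, Column, Integer, String
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

class FileRecord(Base):  # illustrative model, not from the chat
    __tablename__ = "files"
    id = Column(Integer, primary_key=True)
    url = Column(String)

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()

session.add(FileRecord(id=1, url="old"))
session.commit()

# merge() looks the row up by primary key: existing rows are UPDATEd,
# missing ones are INSERTed, so re-running this doesn't raise the
# duplicate-key error that add() would.
session.merge(FileRecord(id=1, url="new"))
session.merge(FileRecord(id=2, url="other"))
session.commit()

print(session.query(FileRecord).count())  # 2
print(session.get(FileRecord, 1).url)     # new
```

Note that `merge()` matches on the primary key only; a UNIQUE constraint on a non-key column (like an S3 url) can still raise a duplicate error, which may be why Orpha sees the same error with merge.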
[11:43:20] Orpha: ah, forget it… ill just leave it as is… it works, and it’s not slow at all… was just bored seeing what other way i could quickly change it to
[11:43:22] Bruno: argh. i keep getting slow connections and connection errors to websites. happens both at home and at work, both of which are pretty fast connections, at least 100mbps. even google search stalls. sometimes i have to hit the search button again. not connected to vpn. even in an incognito window, and in other browsers.
[11:44:05] Orpha: <@Bruno> you have problems, your phone doesn’t even get text messages for a few days
[11:44:09] Johana: you’re hax0red
[11:44:20] Johana: wipe and pave
[11:44:27] Orpha: i recommend buying all new hardware
[11:44:29] Bruno: <@Orpha> lol that was overnight. and only once
[11:44:35] Bruno: lol i wish
[11:44:49] Johana: check all your network connections for any outgoing tor nodes
[11:45:03] Orpha: ^ oh fun
[11:45:55] Johana: it does sound like a dns-related thing. if it keeps happening you may want to flush your dns.
[11:46:13] Johana: maybe even try changing dns servers
[11:46:32] Bruno: flushed.
[11:47:02] Bruno: yea. i may try using google’s.
[11:47:12] Orpha: or opendns
[11:48:03] Johana: you may want to even do a traceroute and see what hops you’re taking.
[11:48:32] Johana: are you on windows?
[11:48:56] Bruno: xubuntu. yea, tracert is the windows one. running it now
[11:49:21] Bruno: that was pretty quick just to <http://google.com|google.com>
[11:50:05] Orpha: Try using a different browser
[12:00:29] Bruno: yea. gonna test vivaldi for a bit.
[12:01:33] Johana: i’ve been using brave. i really like it.
[12:07:31] Bruno: cool. i haven’t used that one. i do like vivaldi but the lack of tab sync across devices is a no-go for full-time usage for me. for personal use at least
[12:14:13] Johana: brave is more security-oriented.
[12:52:46] Adrian: could someone help me with this python code? I’m trying to build my own tuple by looping through a list and adding the items to the tuple, but something seems wrong and I can’t really figure out why:
```python
myTuple = (('', ''),)
answers = ['Test1', 'Test2', 'Test3']
l = list(myTuple)
for x in answers:
    l.append((x, x))
widget.choices = (myTuple)
```
What i want my widget.choices to look like is:
```python
widget.choices = (('', ''), ('Test1', 'Test1'), ('Test2', 'Test2'), ('Test3', 'Test3'))
```
[13:00:17] Mallie: What does `widget.choices` look like in reality? You are modifying a list `l` but then you assign `myTuple` to `widget.choices`. Also know that `(myTuple)` won't make `myTuple` a tuple, if that is what you were thinking the parens would do in that case.
[13:01:23] Glinda: Looks like you want a tuple of tuples.
[13:01:34] Glinda: Then you're better off using a list of namedtuples
[13:01:59] Glinda: Or any other object that isn't a tuple of tuples
[13:02:59] Cassondra: Looks like your widget.choices is expecting a tuple of tuples…
[13:03:24] Adrian: well I just figured out my mistake, I had to give widget.choices `l` instead of `myTuple`
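Adrian's fix, assigning the built-up list rather than the untouched one-element tuple, can be sketched with a standalone `choices` variable, since `widget` belongs to whatever framework Adrian is using:

```python
myTuple = (('', ''),)
answers = ['Test1', 'Test2', 'Test3']

# Build up the pairs in a list (tuples are immutable, so we grow a list).
l = list(myTuple)
for x in answers:
    l.append((x, x))

# Assign the built list, not myTuple; tuple(l) only if a tuple is required.
choices = tuple(l)
print(choices)
# (('', ''), ('Test1', 'Test1'), ('Test2', 'Test2'), ('Test3', 'Test3'))
```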
[13:07:33] Margrett: Anyone know of a way to make the `pd.read_csv` function change the header columns to lower case during the import operation? I'm currently pre-processing each input file to make this happen, but it feels like a common enough problem that I've overlooked a feature.
[13:07:56] Margrett: `pd` being pandas in this scenario
[13:09:37] Margrett: The ordering is important, as you can't be specific about which column names to bring in if they don't match case.
[13:10:59] Meg: can you be sure there won't be duplicate column names?
[13:11:17] Margrett: Yep, all my datasets are from database dumps of individual tables
[13:11:39] Glinda:
```python
from collections import namedtuple

# build the tuple class
example_named_tuple = namedtuple('example_named_tuple', ('val1', 'val2'))

# example call, a single one
single_tuple = example_named_tuple('', '')
print('this is value 1', single_tuple.val1, 'BLANK')
print('this is value 2', single_tuple.val2, 'BLANK')

# your items, and making all the named tuples
answers = ['', 'Test1', 'Test2', 'Test3']
choices = [example_named_tuple(item, item) for item in answers]

# print the list of tuples
print(choices)

# individually
for item in choices:
    print(item.val1, item.val2)
```
<https://repl.it/Knzz/3>
[13:12:13] Margrett: I think I have one dataset based on Oracle (which loves all-uppercase field names), and 10 others which are always lowercase
[13:13:32] Meg: I mean, you can do the conversion after import, but I don't think the `read_csv` options allow a column-name case change
[13:14:19] Meg: I mean, there is a `converters` option, but that's just for applying functions to convert values in specific columns
[13:14:38] Glinda: <@Adrian> this way works really well if you need to keep your items intact.
[13:14:54] Glinda: You can apply a lambda function for pandas on import
[13:16:35] Glinda: <https://github.com/pandas-dev/pandas/issues/15799>
[13:16:52] Glinda:
```python
skipcols = [...]
userows = [...]
read_csv(..., usecols=lambda x: x not in skipcols,
         skiprows=lambda x: x not in userows)
```
[13:18:01] Glinda: <https://www.programiz.com/python-programming/anonymous-function> `double = lambda x: x * 2` is equivalent to:
```python
def double(x):
    return x * 2
```
[13:18:15] Margrett: Ohh, okay, I can make that work. didn't realize usecols was that flexible
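As Meg says, `read_csv` has no lowercase-the-headers switch; the usual approach is a one-line rename after the read, sketched here with an inline CSV standing in for Margrett's dump files:

```python
import io
import pandas as pd

# Inline CSV with the kind of uppercase headers an Oracle dump produces.
csv = io.StringIO("ID,NAME\n1,alpha\n2,beta\n")

df = pd.read_csv(csv)
# Lowercase every column name in one shot, after import.
df.columns = df.columns.str.lower()
print(list(df.columns))  # ['id', 'name']
```

This keeps the per-file pre-processing step out of the pipeline entirely, at the cost of lowercasing happening after, rather than during, the read.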
[13:18:41] Glinda: It's really amazing
[13:20:29] Glinda: The reason it's better to use a named tuple is that you can preserve the relationship between the first and second item. You don't have to worry about inserting in any particular order. You can also do things like create a dictionary of named tuples.