Example: for a single column of data, read the file in chunks of three rows, sum each chunk, and insert each partial sum into a Series object.
out = pd.Series(dtype=float)
i = 0
pieces = pd.read_csv('myCSV_01.csv', chunksize=3)
for piece in pieces:
    print(piece)
    out[i] = piece['white'].sum()  # set_value() is deprecated; plain indexing works
    i += 1
out
Data_Analytics_in_Action/pandasIO.ipynb
gaufung/Data_Analytics_Learning_Note
mit
Writing to a file: to_csv(filename); to_csv(filename, index=False, header=False); to_csv(filename, na_rep='NaN'). Reading and writing HTML files: writing an HTML file.
frame = pd.DataFrame(np.arange(4).reshape((2, 2)))
print(frame.to_html())
Data_Analytics_in_Action/pandasIO.ipynb
gaufung/Data_Analytics_Learning_Note
mit
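The to_csv variants listed above can be sketched end to end; a minimal example (the filenames are illustrative):

```python
import numpy as np
import pandas as pd

frame = pd.DataFrame({'white': [1.0, np.nan], 'red': [3.0, 4.0]})

# Default: writes both the index and the header row.
frame.to_csv('out1.csv')

# Drop the index and the header row.
frame.to_csv('out2.csv', index=False, header=False)

# Represent missing values as the string 'NaN' instead of an empty field.
frame.to_csv('out3.csv', index=False, na_rep='NaN')

print(open('out3.csv').read())
```

With na_rep set, the missing value in the second row is written out as the literal text NaN rather than left blank.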
Creating a more complex DataFrame
frame = pd.DataFrame(np.random.random((4, 4)),
                     index=['white', 'black', 'red', 'blue'],
                     columns=['up', 'down', 'left', 'right'])
frame
s = ['<HTML>']
s.append('<HEAD><TITLE>MY DATAFRAME</TITLE></HEAD>')
s.append('<BODY>')
s.append(frame.to_html())
s.append('</BODY></HTML>')
html = ''.join(s)
Data_Analytics_in_Action/pandasIO.ipynb
gaufung/Data_Analytics_Learning_Note
mit
Reading HTML tables
web_frames = pd.read_html('myFrame.html')
web_frames[0]

# A URL can also be passed as the argument
ranking = pd.read_html('http://www.meccanismocomplesso.org/en/meccanismo-complesso-sito-2/classifica-punteggio/')
ranking[0]
Data_Analytics_in_Action/pandasIO.ipynb
gaufung/Data_Analytics_Learning_Note
mit
Reading and writing XML files, using the third-party library lxml
from lxml import objectify

xml = objectify.parse('books.xml')
xml
root = xml.getroot()
root.Book.Author
root.Book.PublishDate
root.getchildren()
[child.tag for child in root.Book.getchildren()]
[child.text for child in root.Book.getchildren()]

def etree2df(root):
    column_names = []
    for i in range(0, len(root....
Data_Analytics_in_Action/pandasIO.ipynb
gaufung/Data_Analytics_Learning_Note
mit
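The etree2df helper above is truncated; it builds a DataFrame from the parsed XML tree. A minimal sketch of the same idea, using the standard-library xml.etree.ElementTree instead of lxml.objectify, and a made-up two-book document (the structure of books.xml is assumed, not taken from the notebook):

```python
import xml.etree.ElementTree as ET
import pandas as pd

# A stand-in for books.xml (structure assumed for illustration).
XML = """<Catalog>
  <Book><Author>Ross</Author><Title>XML Cookbook</Title><PublishDate>2014</PublishDate></Book>
  <Book><Author>Bracket</Author><Title>XML for Dummies</Title><PublishDate>2014</PublishDate></Book>
</Catalog>"""

def etree2df(root):
    # Column names come from the tags of the first record's children;
    # each <Book> element becomes one row.
    columns = [child.tag for child in root[0]]
    rows = [[child.text for child in record] for record in root]
    return pd.DataFrame(rows, columns=columns)

root = ET.fromstring(XML)
df = etree2df(root)
print(df)
```

The lxml.objectify version in the notebook follows the same two loops, only with objectified attribute access.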
Reading and writing Excel files
pd.read_excel('data.xlsx')
pd.read_excel('data.xlsx', 'Sheet2')
frame = pd.DataFrame(np.random.random((4, 4)),
                     index=['exp1', 'exp2', 'exp3', 'exp4'],
                     columns=['Jan2015', 'Feb2015', 'Mar2015', 'Apr2015'])
frame
frame.to_excel('data2.xlsx')
Data_Analytics_in_Action/pandasIO.ipynb
gaufung/Data_Analytics_Learning_Note
mit
JSON data
frame = pd.DataFrame(np.arange(16).reshape((4, 4)),
                     index=['white', 'black', 'red', 'blue'],
                     columns=['up', 'down', 'right', 'left'])
frame.to_json('frame.json')

# Read the JSON file back
pd.read_json('frame.json')
Data_Analytics_in_Action/pandasIO.ipynb
gaufung/Data_Analytics_Learning_Note
mit
HDF5 data. HDF (Hierarchical Data Format) files store hierarchical data in a binary format.
from pandas import HDFStore  # pandas.io.pytables.HDFStore is the old import path

store = HDFStore('mydata.h5')
store['obj1'] = frame
store['obj1']
Data_Analytics_in_Action/pandasIO.ipynb
gaufung/Data_Analytics_Learning_Note
mit
Pickle data
frame.to_pickle('frame.pkl')
pd.read_pickle('frame.pkl')
Data_Analytics_in_Action/pandasIO.ipynb
gaufung/Data_Analytics_Learning_Note
mit
Database connections, illustrated with sqlite3
frame = pd.DataFrame(np.arange(20).reshape((4, 5)),
                     columns=['white', 'red', 'blue', 'black', 'green'])
frame

from sqlalchemy import create_engine

engine = create_engine('sqlite:///foo.db')
frame.to_sql('colors', engine)
pd.read_sql('colors', engine)
Data_Analytics_in_Action/pandasIO.ipynb
gaufung/Data_Analytics_Learning_Note
mit
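The same round trip can be sketched without SQLAlchemy: for SQLite, pandas also accepts a plain standard-library sqlite3 connection (an in-memory database here):

```python
import sqlite3

import numpy as np
import pandas as pd

frame = pd.DataFrame(np.arange(20).reshape((4, 5)),
                     columns=['white', 'red', 'blue', 'black', 'green'])

# An in-memory SQLite database; pandas accepts the DBAPI connection directly.
conn = sqlite3.connect(':memory:')
frame.to_sql('colors', conn, index=False)

# With a plain DBAPI connection, read back via an SQL query.
out = pd.read_sql('SELECT * FROM colors', conn)
print(out)
```

With a SQLAlchemy engine, read_sql can take just the table name, as in the cell above; with a raw connection an explicit query is needed.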
Craigslist houses for sale Look on the Craigslist website, select relevant search criteria, and then take a look at the web address: Houses for sale in the East Bay: http://sfbay.craigslist.org/search/eby/rea?housing_type=6 Houses for sale in selected neighborhoods in the East Bay: http://sfbay.craigslist.org/search/eb...
# Get the data using the requests module
npgs = np.arange(0, 10, 1)
npg = 100
base_url = 'http://sfbay.craigslist.org/search/eby/rea?'
urls = [base_url + 'housing_type=6']
for pg in range(len(npgs)):
    url = base_url + 's=' + str(npg) + '&housing_type=6'
    urls.append(url)
    npg += 100
more_reqs = []
for p in r...
ds/Webscraping_Craigslist_multi.ipynb
jljones/portfolio
apache-2.0
Extract and clean data to put in a database
# Define 4 functions for the price, neighborhood, sq footage & bedrooms, and
# time that can deal with missing values (to prevent errors from showing up
# when running the code)

# Prices
def find_prices(results):
    prices = []
    for rw in results:
        price = rw.find('span', {'class': 'price'})
        if pric...
ds/Webscraping_Craigslist_multi.ipynb
jljones/portfolio
apache-2.0
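The pattern behind those four functions is the same each time: look up a field, and fall back to a placeholder when it is missing so the row stays aligned. A self-contained sketch of that pattern, with plain dictionaries standing in for the parsed BeautifulSoup rows (the markup structure is assumed, not the notebook's actual one):

```python
import numpy as np

# Fake parsed rows: the missing 'price' key models a listing without a price tag.
results = [
    {'price': '$650000', 'neighborhood': 'berkeley'},
    {'neighborhood': 'albany / el cerrito'},            # no price tag
    {'price': '$820000', 'neighborhood': 'berkeley north / hills'},
]

def find_prices(results):
    prices = []
    for rw in results:
        price = rw.get('price')        # None when the field is absent
        if price is not None:
            prices.append(float(price.strip('$')))
        else:
            prices.append(np.nan)      # keep row alignment with NaN
    return prices

print(find_prices(results))
```

Using NaN for the gap (instead of skipping the row) is what lets the lists later be stacked column-wise into one array.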
Add data to a pandas DataFrame
# Make a dataframe to export cleaned data
data = np.array([sqft_all, bedrooms_all, prices_all]).T
print(data.shape)
alldata = pd.DataFrame(data=data, columns=['SqFeet', 'nBedrooms', 'Price'])
alldata.head(4)
alldata['DatePosted'] = times_all
alldata['Neighborhood'] = neighborhoods_all
alldata.head(4)
# Check da...
ds/Webscraping_Craigslist_multi.ipynb
jljones/portfolio
apache-2.0
Download data to a CSV file
alldata.to_csv('./webscraping_craigslist.csv', sep=',', na_rep='NaN',
               header=True, index=False)
ds/Webscraping_Craigslist_multi.ipynb
jljones/portfolio
apache-2.0
Data for Berkeley
# Get houses listed in Berkeley
print(len(alldata[alldata['Neighborhood'] == 'berkeley']))
alldata[alldata['Neighborhood'] == 'berkeley']

# Home prices in Berkeley (or the baseline)
# Choose a baseline, based on proximity to current location
# 'berkeley', 'berkeley north / hills', 'albany / el cerrito'
neighborhood_n...
ds/Webscraping_Craigslist_multi.ipynb
jljones/portfolio
apache-2.0
Scatter plots
# Plot house prices in the East Bay
def scatterplot(X, Y, labels, xmax):  # =X.max()):  # labels=[]
    # Set up the figure
    fig = plt.figure(figsize=(15, 8))  # width, height
    fntsz = 20
    titlefntsz = 25
    lablsz = 20
    mrkrsz = 8
    matplotlib.rc('xtick', labelsize=lablsz)
    matplotlib.rc('ytick', labelsize...
ds/Webscraping_Craigslist_multi.ipynb
jljones/portfolio
apache-2.0
Price
# How many houses for sale are under $700k?
price_baseline = 700000
print(alldata[(alldata.Price < price_baseline)].count())

# Return entries for houses under $700k
# alldata[(alldata.Price < price_baseline)]

# In which neighborhoods are these houses located?
set(alldata[(alldata.Price < price_baseline)].Neighborhood)...
ds/Webscraping_Craigslist_multi.ipynb
jljones/portfolio
apache-2.0
Group results by neighborhood and plot
by_neighborhood = alldata.groupby('Neighborhood').Price.mean()
# by_neighborhood
# alldata.groupby('Neighborhood').Price.mean().ix[neighborhoodsplt]

# Home prices in the East Bay
# Group the results by neighborhood, and then take the average home price
# in each neighborhood
by_neighborhood = alldata.groupby('Neighborho...
ds/Webscraping_Craigslist_multi.ipynb
jljones/portfolio
apache-2.0
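A minimal, self-contained sketch of the groupby step above (toy numbers, not the scraped data):

```python
import pandas as pd

# Toy stand-in for the scraped `alldata` frame.
alldata = pd.DataFrame({
    'Neighborhood': ['berkeley', 'berkeley', 'albany / el cerrito'],
    'Price': [700000.0, 900000.0, 650000.0],
})

# Average home price per neighborhood: one row per distinct Neighborhood value.
by_neighborhood = alldata.groupby('Neighborhood').Price.mean()
print(by_neighborhood)
```

The result is a Series indexed by neighborhood, which is what gets fed to the bar plot.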
The example configuration file can be downloaded here: tardis_example.yml
config = Configuration.from_yaml('tardis_example.yml')
sim = Simulation.from_config(config)
docs/research/code_comparison/plasma_compare/plasma_compare.ipynb
kaushik94/tardis
bsd-3-clause
Accessing the plasma states. In this example, we access Si, including the number density of its un-ionized (0) state.
# All Si ionization states
sim.plasma.ion_number_density.loc[14]

# Normalizing by Si number density
sim.plasma.ion_number_density.loc[14] / sim.plasma.number_density.loc[14]

# Accessing the first ionization state
sim.plasma.ion_number_density.loc[14, 1]

sim.plasma.update(density=[1e-13])
sim.plasma.ion_number_dens...
docs/research/code_comparison/plasma_compare/plasma_compare.ipynb
kaushik94/tardis
bsd-3-clause
Updating the plasma state. It is possible to update the plasma state with different temperatures or dilution factors (as well as different densities). Here we update the radiative temperature and plot the evolution of the ionization state.
si_ionization_state = None
for cur_t_rad in range(1000, 20000, 100):
    sim.plasma.update(t_rad=[cur_t_rad])
    if si_ionization_state is None:
        si_ionization_state = sim.plasma.ion_number_density.loc[14].copy()
        si_ionization_state.columns = [cur_t_rad]
    else:
        si_ionization_state[cur_t_rad] ...
docs/research/code_comparison/plasma_compare/plasma_compare.ipynb
kaushik94/tardis
bsd-3-clause
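The sweep above accumulates one DataFrame column per temperature. The pattern can be sketched with a toy function standing in for sim.plasma (running TARDIS itself is out of scope here; toy_ion_state is entirely made up):

```python
import pandas as pd

def toy_ion_state(t_rad):
    # Hypothetical stand-in for sim.plasma.ion_number_density.loc[14]:
    # a Series indexed by ionization state.
    return pd.Series([t_rad * 0.1, t_rad * 0.9], index=[0, 1])

si_ionization_state = None
for cur_t_rad in range(1000, 1300, 100):
    col = toy_ion_state(cur_t_rad)
    if si_ionization_state is None:
        # First pass: turn the Series into a one-column DataFrame.
        si_ionization_state = col.to_frame(name=cur_t_rad)
    else:
        # Later passes: append one column per temperature.
        si_ionization_state[cur_t_rad] = col

print(si_ionization_state)
```

The resulting frame has ionization states as rows and temperatures as columns, which is convenient for plotting each state's evolution with t_rad.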
Check for dependencies, set directories. The code below is a simple check that AFNI and FSL are installed. <br> We also set the input, data, and atlas paths.
# FSL
try:
    print(f"Your fsl directory is located here: {os.environ['FSLDIR']}")
except KeyError:
    raise AssertionError("You do not have FSL installed! See installation instructions here: https://fsl.fmrib.ox.ac.uk/fsl/fslwiki/FslInstallation")

# AFNI
try:
    print(f"Your AFNI directory is located here: {su...
tutorials/Overview.ipynb
neurodata/ndmg
apache-2.0
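The dependency check follows a simple pattern: read an environment variable and raise a helpful error when it is missing. A generic, runnable sketch of that pattern (FAKE_HOME and the URL are illustrative, not part of the notebook):

```python
import os

def check_tool(env_var, install_url):
    """Return the tool's directory, or raise with installation instructions."""
    try:
        return os.environ[env_var]
    except KeyError:
        raise AssertionError(
            f"{env_var} is not set! See installation instructions: {install_url}")

# Simulate an installed tool by setting the variable ourselves.
os.environ['FAKE_HOME'] = '/opt/fake'
print(check_tool('FAKE_HOME', 'https://example.org/install'))
```

Raising with the install URL in the message is what turns a cryptic KeyError into an actionable one for the reader.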
Set Input, Output, and Atlas Locations Here, you set:
1. the input_dir - this is where your input data lives.
2. the out_dir - this is where your output data will go.
# Get atlases
ndmg_dir = Path.home() / ".ndmg"
atlas_dir = ndmg_dir / "ndmg_atlases"
get_atlas(str(atlas_dir), "2mm")

# Set the input and output directories
input_dir = ndmg_dir / "input"
out_dir = ndmg_dir / "output"
print(f"Your input and output directory will be : {input_dir} and {out_dir}")
assert op.exists(input_dir), f"You must have an inpu...
tutorials/Overview.ipynb
neurodata/ndmg
apache-2.0
Choose input parameters Naming Conventions Here, we define input variables to the pipeline. To run the ndmg pipeline, you need four files:
1. a t1w - this is a high-resolution anatomical image.
2. a dwi - the diffusion image.
3. bvecs - this is a text file that defines the gradient vectors created by a DWI scan.
4. bva...
# Specify base directory and paths to input files (dwi, bvecs, bvals, and t1w required)
subject_id = 'sub-0025864'

# Define the location of our input files.
t1w = str(input_dir / f"{subject_id}/ses-1/anat/{subject_id}_ses-1_T1w.nii.gz")
dwi = str(input_dir / f"{subject_id}/ses-1/dwi/{subject_id}_ses-1_dwi.nii.gz")
bve...
tutorials/Overview.ipynb
neurodata/ndmg
apache-2.0
Parameter Choices and Output Directory Here, we choose the parameters to run the pipeline with. If you are inexperienced with diffusion MRI theory, feel free to just use the default parameters. atlases = ['desikan', 'CPAC200', 'DKT', 'HarvardOxfordcort', 'HarvardOxfordsub', 'JHU', 'Schaefer2018-200', 'Talairach', 'aal...
# Use the default parameters.
atlas = 'desikan'
mod_type = 'prob'
track_type = 'local'
mod_func = 'csd'
reg_style = 'native'
vox_size = '2mm'
seeds = 1
tutorials/Overview.ipynb
neurodata/ndmg
apache-2.0
Get masks and labels The pipeline needs these two variables as input. <br> Running the pipeline via ndmg_bids does this for you.
# Auto-set paths to neuroparc files
mask = str(atlas_dir / "atlases/mask/MNI152NLin6_res-2x2x2_T1w_descr-brainmask.nii.gz")
labels = [str(i) for i in (atlas_dir / "atlases/label/Human/").glob(f"*{atlas}*2x2x2.nii.gz")]
print(f"mask location : {mask}")
print(f"atlas location : {labels}")
tutorials/Overview.ipynb
neurodata/ndmg
apache-2.0
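The label lookup above is just a filename glob. A self-contained sketch of that step with temporary files (the filenames are illustrative, only shaped like the neuroparc ones):

```python
import tempfile
from pathlib import Path

# Fake atlas-label directory with a few candidate files.
label_dir = Path(tempfile.mkdtemp())
(label_dir / "desikan_space-MNI152_res-2x2x2.nii.gz").touch()
(label_dir / "aal_space-MNI152_res-2x2x2.nii.gz").touch()
(label_dir / "desikan_space-MNI152_res-1x1x1.nii.gz").touch()

atlas = "desikan"
# Keep only the 2mm files for the chosen atlas, as strings.
labels = [str(p) for p in label_dir.glob(f"*{atlas}*2x2x2.nii.gz")]
print(labels)
```

The pattern filters on both the atlas name and the voxel resolution, so other atlases and other resolutions in the same directory are skipped.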
Run the pipeline!
ndmg_dwi_pipeline.ndmg_dwi_worker(dwi=dwi, bvals=bvals, bvecs=bvecs, t1w=t1w,
                                  atlas=atlas, mask=mask, labels=labels,
                                  outdir=str(out_dir), vox_size=vox_size,
                                  mod_type=mod_type, track_type=track_type,
                                  mod_func=mod_func, seeds=seeds,
                                  reg_style=reg_style, clean=False,
                                  skipeddy=True, skipreg=True)
tutorials/Overview.ipynb
neurodata/ndmg
apache-2.0
Import Section class, which contains all calculations
from Section import Section
01_SemiMonoCoque.ipynb
Ccaccia73/semimonocoque
mit
Initialization of the sympy symbolic tool and of pint for dimensional analysis (not fully implemented yet, as pint is not directly compatible with sympy)
ureg = UnitRegistry() sympy.init_printing()
01_SemiMonoCoque.ipynb
Ccaccia73/semimonocoque
mit
Define the sympy symbols used for the geometric description of the sections
A, A0, t, t0, a, b, h, L = sympy.symbols('A A_0 t t_0 a b h L', positive=True)
01_SemiMonoCoque.ipynb
Ccaccia73/semimonocoque
mit
We also define numerical values for each symbol in order to plot the scaled section and perform calculations
values = [(A, 150 * ureg.millimeter**2), (A0, 250 * ureg.millimeter**2),
          (a, 80 * ureg.millimeter), (b, 20 * ureg.millimeter),
          (h, 35 * ureg.millimeter), (L, 2000 * ureg.millimeter)]
datav = [(v[0], v[1].magnitude) for v in values]
01_SemiMonoCoque.ipynb
Ccaccia73/semimonocoque
mit
First example: closed section. Define the graph describing the section:
1. stringers are nodes with parameters:
   - x coordinate
   - y coordinate
   - area
2. panels are oriented edges with parameters:
   - thickness
   - length, which is calculated automatically
stringers = {1: [(sympy.Integer(0), h), A],
             2: [(a / 2, h), A],
             3: [(a, h), A],
             4: [(a - b, sympy.Integer(0)), A],
             5: [(b, sympy.Integer(0)), A]}
panels = {(1, 2): t, (2, 3): t, (3, 4): t, (4, 5): t, (5, 1): t}
01_SemiMonoCoque.ipynb
Ccaccia73/semimonocoque
mit
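Each panel's automatically computed length comes straight from its endpoint coordinates. A numeric sketch of that bookkeeping, with plain floats standing in for the sympy symbols (the numbers are the millimeter magnitudes from the `values` list above):

```python
import math

# Node coordinates, using a=80, b=20, h=35 (mm) from the notebook's values.
a, b, h = 80.0, 20.0, 35.0
stringers = {1: (0.0, h), 2: (a / 2, h), 3: (a, h),
             4: (a - b, 0.0), 5: (b, 0.0)}
panels = [(1, 2), (2, 3), (3, 4), (4, 5), (5, 1)]

def panel_length(n1, n2):
    # Euclidean distance between the two stringer nodes of a panel.
    (x1, y1), (x2, y2) = stringers[n1], stringers[n2]
    return math.hypot(x2 - x1, y2 - y1)

lengths = {p: panel_length(*p) for p in panels}
print(lengths)
```

In the Section class the same computation is carried out symbolically, so the lengths stay expressions in a, b, and h until values are substituted.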
Define section and perform first calculations
S1 = Section(stringers, panels)
01_SemiMonoCoque.ipynb
Ccaccia73/semimonocoque
mit
Verify that the section forms a single closed loop
S1.cycles
01_SemiMonoCoque.ipynb
Ccaccia73/semimonocoque
mit
Plot of the S1 section in the original reference frame. Define a dictionary of coordinates used by Networkx to plot the section as a directed graph. Note that the arrows are actually just thicker stubs.
start_pos = {ii: [float(S1.g.node[ii]['ip'][i].subs(datav)) for i in range(2)]
             for ii in S1.g.nodes()}
plt.figure(figsize=(12, 8), dpi=300)
nx.draw(S1.g, with_labels=True, arrows=True, pos=start_pos)
plt.arrow(0, 0, 20, 0)
plt.arrow(0, 0, 0, 20)
# plt.text(0,0, 'CG', fontsize=24)
plt.axis('equal')
plt.title("Section in starting...
01_SemiMonoCoque.ipynb
Ccaccia73/semimonocoque
mit
Expression of the inertial properties with respect to the center of gravity, with the original rotation
S1.Ixx0, S1.Iyy0, S1.Ixy0, S1.α0
01_SemiMonoCoque.ipynb
Ccaccia73/semimonocoque
mit
Plot of the S1 section in the inertial reference frame. The section is plotted with respect to the center of gravity and rotated (if necessary) so that x and y are the principal axes. The center of gravity and the shear center are drawn.
positions = {ii: [float(S1.g.node[ii]['pos'][i].subs(datav)) for i in range(2)]
             for ii in S1.g.nodes()}
x_ct, y_ct = S1.ct.subs(datav)
plt.figure(figsize=(12, 8), dpi=300)
nx.draw(S1.g, with_labels=True, pos=positions)
plt.plot([0], [0], 'o', ms=12, label='CG')
plt.plot([x_ct], [y_ct], '^', ms=12, label='SC')
# plt.text(0,0, 'CG...
01_SemiMonoCoque.ipynb
Ccaccia73/semimonocoque
mit
Expression of inertial properties in principal reference frame
S1.Ixx, S1.Iyy, S1.Ixy, S1.θ
01_SemiMonoCoque.ipynb
Ccaccia73/semimonocoque
mit
Shear center expression
S1.ct
01_SemiMonoCoque.ipynb
Ccaccia73/semimonocoque
mit
Analysis of the symmetry properties of the section. For the x and y axes, pairs of symmetric nodes and edges are searched for.
S1.symmetry
01_SemiMonoCoque.ipynb
Ccaccia73/semimonocoque
mit
Compute axial loads in the stringers of S1. We first define some symbols:
Tx, Ty, Nz, Mx, My, Mz, F, ry, rx, mz = sympy.symbols('T_x T_y N_z M_x M_y M_z F r_y r_x m_z')
01_SemiMonoCoque.ipynb
Ccaccia73/semimonocoque
mit
Set loads on the section: Example 1: shear in y direction and bending moment in x direction
S1.set_loads(_Tx=0, _Ty=Ty, _Nz=0, _Mx=Mx, _My=0, _Mz=0)
01_SemiMonoCoque.ipynb
Ccaccia73/semimonocoque
mit
Compute axial loads in stringers and shear flows in panels
S1.compute_stringer_actions()
S1.compute_panel_fluxes();
01_SemiMonoCoque.ipynb
Ccaccia73/semimonocoque
mit
Axial loads
S1.N
01_SemiMonoCoque.ipynb
Ccaccia73/semimonocoque
mit
Shear flows
S1.q
01_SemiMonoCoque.ipynb
Ccaccia73/semimonocoque
mit
Example 2: twisting moment in z direction
S1.set_loads(_Tx=0, _Ty=0, _Nz=0, _Mx=0, _My=0, _Mz=Mz)
S1.compute_stringer_actions()
S1.compute_panel_fluxes();
01_SemiMonoCoque.ipynb
Ccaccia73/semimonocoque
mit
Panel fluxes
S1.q
01_SemiMonoCoque.ipynb
Ccaccia73/semimonocoque
mit
Set loads on the section: Example 3: shear in x direction and bending moment in y direction
S1.set_loads(_Tx=Tx, _Ty=0, _Nz=0, _Mx=0, _My=My, _Mz=0)
S1.compute_stringer_actions()
S1.compute_panel_fluxes();
01_SemiMonoCoque.ipynb
Ccaccia73/semimonocoque
mit
Panel fluxes Not really an easy expression
S1.q
01_SemiMonoCoque.ipynb
Ccaccia73/semimonocoque
mit
Compute Jt, the torsional moment of inertia:
S1.compute_Jt()
S1.Jt
01_SemiMonoCoque.ipynb
Ccaccia73/semimonocoque
mit
Second example: Open section
stringers = {1: [(sympy.Integer(0), h), A],
             2: [(sympy.Integer(0), sympy.Integer(0)), A],
             3: [(a, sympy.Integer(0)), A],
             4: [(a, h), A]}
panels = {(1, 2): t, (2, 3): t, (3, 4): t}
01_SemiMonoCoque.ipynb
Ccaccia73/semimonocoque
mit
Define section and perform first calculations
S2 = Section(stringers, panels)
01_SemiMonoCoque.ipynb
Ccaccia73/semimonocoque
mit
Verify that the section is open
S2.cycles
01_SemiMonoCoque.ipynb
Ccaccia73/semimonocoque
mit
Plot of the S2 section in the original reference frame. Define a dictionary of coordinates used by Networkx to plot the section as a directed graph. Note that the arrows are actually just thicker stubs.
start_pos = {ii: [float(S2.g.node[ii]['ip'][i].subs(datav)) for i in range(2)]
             for ii in S2.g.nodes()}
plt.figure(figsize=(12, 8), dpi=300)
nx.draw(S2.g, with_labels=True, arrows=True, pos=start_pos)
plt.arrow(0, 0, 20, 0)
plt.arrow(0, 0, 0, 20)
# plt.text(0,0, 'CG', fontsize=24)
plt.axis('equal')
plt.title("Section in starting...
01_SemiMonoCoque.ipynb
Ccaccia73/semimonocoque
mit
Expression of the inertial properties with respect to the center of gravity, with the original rotation
S2.Ixx0, S2.Iyy0, S2.Ixy0, S2.α0
01_SemiMonoCoque.ipynb
Ccaccia73/semimonocoque
mit
Plot of the S2 section in the inertial reference frame. The section is plotted with respect to the center of gravity and rotated (if necessary) so that x and y are the principal axes. The center of gravity and the shear center are drawn.
positions = {ii: [float(S2.g.node[ii]['pos'][i].subs(datav)) for i in range(2)]
             for ii in S2.g.nodes()}
x_ct, y_ct = S2.ct.subs(datav)
plt.figure(figsize=(12, 8), dpi=300)
nx.draw(S2.g, with_labels=True, pos=positions)
plt.plot([0], [0], 'o', ms=12, label='CG')
plt.plot([x_ct], [y_ct], '^', ms=12, label='SC')
# plt.text(0,0, 'CG...
01_SemiMonoCoque.ipynb
Ccaccia73/semimonocoque
mit
Expression of inertial properties in principal reference frame
S2.Ixx, S2.Iyy, S2.Ixy, S2.θ
01_SemiMonoCoque.ipynb
Ccaccia73/semimonocoque
mit
Shear center expression
S2.ct
01_SemiMonoCoque.ipynb
Ccaccia73/semimonocoque
mit
Analysis of the symmetry properties of the section. For the x and y axes, pairs of symmetric nodes and edges are searched for.
S2.symmetry
01_SemiMonoCoque.ipynb
Ccaccia73/semimonocoque
mit
Compute axial loads in the stringers of S2. Set loads on the section. Example 1: shear in y direction and bending moment in x direction
S2.set_loads(_Tx=0, _Ty=Ty, _Nz=0, _Mx=Mx, _My=0, _Mz=0)
01_SemiMonoCoque.ipynb
Ccaccia73/semimonocoque
mit
Compute axial loads in stringers and shear flows in panels
S2.compute_stringer_actions()
S2.compute_panel_fluxes();
01_SemiMonoCoque.ipynb
Ccaccia73/semimonocoque
mit
Axial loads
S2.N
01_SemiMonoCoque.ipynb
Ccaccia73/semimonocoque
mit
Shear flows
S2.q
01_SemiMonoCoque.ipynb
Ccaccia73/semimonocoque
mit
Set loads on the section: Example 2: shear in x direction and bending moment in y direction
S2.set_loads(_Tx=Tx, _Ty=0, _Nz=0, _Mx=0, _My=My, _Mz=0)
S2.compute_stringer_actions()
S2.compute_panel_fluxes();
S2.N
S2.q
01_SemiMonoCoque.ipynb
Ccaccia73/semimonocoque
mit
Second example, variant (2): open section with a different node ordering
stringers = {1: [(a, h), A],
             2: [(sympy.Integer(0), h), A],
             3: [(sympy.Integer(0), sympy.Integer(0)), A],
             4: [(a, sympy.Integer(0)), A]}
panels = {(1, 2): t, (2, 3): t, (3, 4): t}
01_SemiMonoCoque.ipynb
Ccaccia73/semimonocoque
mit
Define section and perform first calculations
S2_2 = Section(stringers, panels)
01_SemiMonoCoque.ipynb
Ccaccia73/semimonocoque
mit
Plot of the S2_2 section in the original reference frame. Define a dictionary of coordinates used by Networkx to plot the section as a directed graph. Note that the arrows are actually just thicker stubs.
start_pos = {ii: [float(S2_2.g.node[ii]['ip'][i].subs(datav)) for i in range(2)]
             for ii in S2_2.g.nodes()}
plt.figure(figsize=(12, 8), dpi=300)
nx.draw(S2_2.g, with_labels=True, arrows=True, pos=start_pos)
plt.arrow(0, 0, 20, 0)
plt.arrow(0, 0, 0, 20)
# plt.text(0,0, 'CG', fontsize=24)
plt.axis('equal')
plt.title("Section in st...
01_SemiMonoCoque.ipynb
Ccaccia73/semimonocoque
mit
Expression of the inertial properties with respect to the center of gravity, with the original rotation
S2_2.Ixx0, S2_2.Iyy0, S2_2.Ixy0, S2_2.α0
01_SemiMonoCoque.ipynb
Ccaccia73/semimonocoque
mit
Plot of the S2_2 section in the inertial reference frame. The section is plotted with respect to the center of gravity and rotated (if necessary) so that x and y are the principal axes. The center of gravity and the shear center are drawn.
positions = {ii: [float(S2_2.g.node[ii]['pos'][i].subs(datav)) for i in range(2)]
             for ii in S2_2.g.nodes()}
x_ct, y_ct = S2_2.ct.subs(datav)
plt.figure(figsize=(12, 8), dpi=300)
nx.draw(S2_2.g, with_labels=True, pos=positions)
plt.plot([0], [0], 'o', ms=12, label='CG')
plt.plot([x_ct], [y_ct], '^', ms=12, label='SC')
# plt.text(...
01_SemiMonoCoque.ipynb
Ccaccia73/semimonocoque
mit
Expression of inertial properties in principal reference frame
S2_2.Ixx, S2_2.Iyy, S2_2.Ixy, S2_2.θ
01_SemiMonoCoque.ipynb
Ccaccia73/semimonocoque
mit
Shear center expression
S2_2.ct
01_SemiMonoCoque.ipynb
Ccaccia73/semimonocoque
mit
Adversarial Regularization for Image Classification The core idea of adversarial learning is to train a model with adversarially-perturbed data (called adversarial examples) in addition to the organic training data. The adversarial examples are constructed to intentionally mislead the model into making wrong prediction...
!pip install --quiet neural-structured-learning

import matplotlib.pyplot as plt
import tensorflow as tf
import tensorflow_datasets as tfds
import numpy as np

import neural_structured_learning as nsl
workshops/kdd_2020/adversarial_regularization_mnist.ipynb
tensorflow/neural-structured-learning
apache-2.0
Hyperparameters We collect and explain the hyperparameters (in an HParams object) for model training and evaluation.
Input/Output:
- input_shape: The shape of the input tensor. Each image is 28-by-28 pixels with 1 channel.
- num_classes: There are a total of 10 classes, corresponding to 10 digits [0-9].
Model architectur...
class HParams(object):
    def __init__(self):
        self.input_shape = [28, 28, 1]
        self.num_classes = 10
        self.conv_filters = [32, 64, 64]
        self.kernel_size = (3, 3)
        self.pool_size = (2, 2)
        self.num_fc_units = [64]
        self.batch_size = 32
        self.epochs = 5
        self.adv_multiplier = 0.2
        self.adv_st...
workshops/kdd_2020/adversarial_regularization_mnist.ipynb
tensorflow/neural-structured-learning
apache-2.0
MNIST dataset The MNIST dataset contains grayscale images of handwritten digits (from '0' to '9'). Each image shows one digit at low resolution (28-by-28 pixels). The task is to classify images into 10 categories, one per digit. Here we load the MNIST dataset from TensorFlow Datasets. It handles downloading t...
datasets = tfds.load('mnist')
train_dataset = datasets['train']
test_dataset = datasets['test']

IMAGE_INPUT_NAME = 'image'
LABEL_INPUT_NAME = 'label'
workshops/kdd_2020/adversarial_regularization_mnist.ipynb
tensorflow/neural-structured-learning
apache-2.0
To make the model numerically stable, we normalize the pixel values to [0, 1] by mapping the normalize function over the dataset. After shuffling the training set and batching, we convert the examples to feature tuples (image, label) for training the base model. We also provide a function to convert from tuples to dictiona...
def normalize(features):
    features[IMAGE_INPUT_NAME] = tf.cast(
        features[IMAGE_INPUT_NAME], dtype=tf.float32) / 255.0
    return features

def convert_to_tuples(features):
    return features[IMAGE_INPUT_NAME], features[LABEL_INPUT_NAME]

def convert_to_dictionaries(image, label):
    return {IMAGE_INPUT_NAME: image, ...
workshops/kdd_2020/adversarial_regularization_mnist.ipynb
tensorflow/neural-structured-learning
apache-2.0
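The normalization itself is just a cast and a divide. A numpy sketch of what the mapped function does to each image (TensorFlow omitted so the snippet stays self-contained):

```python
import numpy as np

def normalize(image):
    # uint8 pixel values in [0, 255] -> float32 values in [0.0, 1.0]
    return image.astype(np.float32) / 255.0

image = np.array([[0, 51, 255]], dtype=np.uint8)
print(normalize(image))
```

Dividing by 255.0 after the cast (rather than before) avoids integer division and keeps the full dynamic range.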
Base model Our base model will be a neural network consisting of 3 convolutional layers followed by 2 fully-connected layers (as defined in HPARAMS). Here we define it using the Keras functional API. Feel free to try other APIs or model architectures.
def build_base_model(hparams):
    """Builds a model according to the architecture defined in `hparams`."""
    inputs = tf.keras.Input(
        shape=hparams.input_shape, dtype=tf.float32, name=IMAGE_INPUT_NAME)
    x = inputs
    for i, num_filters in enumerate(hparams.conv_filters):
        x = tf.keras.layers.Conv2D(
            nu...
workshops/kdd_2020/adversarial_regularization_mnist.ipynb
tensorflow/neural-structured-learning
apache-2.0
Next we train and evaluate the base model.
base_model.compile(
    optimizer='adam',
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=['acc'])
base_model.fit(train_dataset, epochs=HPARAMS.epochs)
results = base_model.evaluate(test_dataset)
named_results = dict(zip(base_model....
workshops/kdd_2020/adversarial_regularization_mnist.ipynb
tensorflow/neural-structured-learning
apache-2.0
Adversarial-regularized model Here we show how to incorporate adversarial training into a Keras model with a few lines of code, using the NSL framework. The base model is wrapped to create a new tf.Keras.Model, whose training objective includes adversarial regularization. We will train one using the FGSM adversary and ...
fgsm_adv_config = nsl.configs.make_adv_reg_config(
    multiplier=HPARAMS.adv_multiplier,
    # With FGSM, we want to take a single step equal to the epsilon ball size,
    # to get the largest allowable perturbation.
    adv_step_size=HPARAMS.pgd_epsilon,
    adv_grad_norm=HPARAMS.adv_grad_norm,
    clip_value_min=HPA...
workshops/kdd_2020/adversarial_regularization_mnist.ipynb
tensorflow/neural-structured-learning
apache-2.0
Now we can wrap a base model with AdversarialRegularization. Here we create new base models (base_fgsm_model, base_pgd_model), so that the existing one (base_model) can be used in a later comparison. The returned adv_model is a tf.keras.Model object, whose training objective includes a regularization term for the advers...
# Create model for FGSM.
base_fgsm_model = build_base_model(HPARAMS)
# Create FGSM-regularized model.
fgsm_adv_model = nsl.keras.AdversarialRegularization(
    base_fgsm_model,
    label_keys=[LABEL_INPUT_NAME],
    adv_config=fgsm_adv_config
)

# Create model for PGD.
base_pgd_model = build_base_model(HPARAMS)
# Creat...
workshops/kdd_2020/adversarial_regularization_mnist.ipynb
tensorflow/neural-structured-learning
apache-2.0
Next we compile, train, and evaluate the adversarial-regularized model. There might be warnings like "Output missing from loss dictionary," which is fine because the adv_model doesn't rely on the base implementation to calculate the total loss.
fgsm_adv_model.compile(
    optimizer='adam',
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=['acc'])
fgsm_adv_model.fit(train_set_for_adv_model, epochs=HPARAMS.epochs)
results = fgsm_adv_model.evaluate(test_set_for_adv_mo...
workshops/kdd_2020/adversarial_regularization_mnist.ipynb
tensorflow/neural-structured-learning
apache-2.0
Both adversarially regularized models perform well on the test set. Robustness under Adversarial Perturbations Now we compare the base model and the adversarial-regularized model for robustness under adversarial perturbation. We will show how the base model is vulnerable to attacks from both FGSM and PGD, the FGSM-regu...
# Set up the neighbor config for FGSM.
fgsm_nbr_config = nsl.configs.AdvNeighborConfig(
    adv_grad_norm=HPARAMS.adv_grad_norm,
    adv_step_size=HPARAMS.pgd_epsilon,
    clip_value_min=0.0,
    clip_value_max=1.0,
)

# The labeled loss function provides the loss for each sample we pass in. This
# will be used to calc...
workshops/kdd_2020/adversarial_regularization_mnist.ipynb
tensorflow/neural-structured-learning
apache-2.0
Let's examine what some of these images look like.
def examine_images(perturbed_images, labels, predictions, model_key):
    batch_index = 0
    batch_image = perturbed_images[batch_index]
    batch_label = labels[batch_index]
    batch_pred = predictions[batch_index]
    batch_size = HPARAMS.batch_size
    n_col = 4
    n_row = (batch_size + n_col - 1) // n_col  # integer ceiling division
    print('accuracy ...
workshops/kdd_2020/adversarial_regularization_mnist.ipynb
tensorflow/neural-structured-learning
apache-2.0
Our perturbation budget of 0.2 is quite large, but even so, the perturbed numbers are clearly recognizable to the human eye. On the other hand, our network is fooled into misclassifying several examples. As we can see, the FGSM attack is already highly effective, and quick to execute, heavily reducing the model accurac...
# Set up the neighbor config for PGD.
pgd_nbr_config = nsl.configs.AdvNeighborConfig(
    adv_grad_norm=HPARAMS.adv_grad_norm,
    adv_step_size=HPARAMS.adv_step_size,
    pgd_iterations=HPARAMS.pgd_iterations,
    pgd_epsilon=HPARAMS.pgd_epsilon,
    clip_value_min=HPARAMS.clip_value_min,
    clip_value_max=HPARAMS.cl...
workshops/kdd_2020/adversarial_regularization_mnist.ipynb
tensorflow/neural-structured-learning
apache-2.0
The PGD attack is much stronger, but it also takes longer to run. Attacking the FGSM Regularized Model
# Set up the neighbor config.
fgsm_nbr_config = nsl.configs.AdvNeighborConfig(
    adv_grad_norm=HPARAMS.adv_grad_norm,
    adv_step_size=HPARAMS.pgd_epsilon,
    clip_value_min=0.0,
    clip_value_max=1.0,
)

# The labeled loss function provides the loss for each sample we pass in. This
# will be used to calculate the...
workshops/kdd_2020/adversarial_regularization_mnist.ipynb
tensorflow/neural-structured-learning
apache-2.0
As we can see, the FGSM-regularized model performs much better than the base model on images perturbed by FGSM. How does it do against PGD?
# Set up the neighbor config for PGD.
pgd_nbr_config = nsl.configs.AdvNeighborConfig(
    adv_grad_norm=HPARAMS.adv_grad_norm,
    adv_step_size=HPARAMS.adv_step_size,
    pgd_iterations=HPARAMS.pgd_iterations,
    pgd_epsilon=HPARAMS.pgd_epsilon,
    clip_value_min=HPARAMS.clip_value_min,
    clip_value_max=HPARAMS.cl...
workshops/kdd_2020/adversarial_regularization_mnist.ipynb
tensorflow/neural-structured-learning
apache-2.0
While the FGSM-regularized model was robust to attacks via FGSM, as we can see it is still vulnerable to attacks from PGD, a stronger attack mechanism than FGSM. Attacking the PGD Regularized Model
# Set up the neighbor config.
fgsm_nbr_config = nsl.configs.AdvNeighborConfig(
    adv_grad_norm=HPARAMS.adv_grad_norm,
    adv_step_size=HPARAMS.pgd_epsilon,
    clip_value_min=0.0,
    clip_value_max=1.0,
)

# The labeled loss function provides the loss for each sample we pass in. This
# will be used to calculate the...
workshops/kdd_2020/adversarial_regularization_mnist.ipynb
tensorflow/neural-structured-learning
apache-2.0
2. Set Configuration This code is required to initialize the project. Fill in required fields and press play.
- If the recipe uses a Google Cloud Project: Set the configuration project value to the project identifier from these instructions.
- If the recipe has auth set to user:
  - If you have user credentials: Set the c...
from starthinker.util.configuration import Configuration

CONFIG = Configuration(
    project="",
    client={},
    service={},
    user="/content/user.json",
    verbose=True
)
colabs/google_api_to_bigquery.ipynb
google/starthinker
apache-2.0
3. Enter Google API To BigQuery Recipe Parameters Enter an api name and version. Specify the function using dot notation. Specify the arguments using json. Iterate is optional, use if API returns a list of items that are not unpacking correctly. The API Key may be required for some calls. The Developer Token may be re...
FIELDS = {
    'auth_read': 'user',  # Credentials used for reading data.
    'api': 'displayvideo',  # See developer guide.
    'version': 'v1',  # Must be supported version.
    'function': 'advertisers.list',  # Full function dot notation path.
    'kwargs': {'partnerId': 234340},  # Dictionary object of name value pairs.
    'kwarg...
colabs/google_api_to_bigquery.ipynb
google/starthinker
apache-2.0
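The "function using dot notation" parameter ('advertisers.list') is resolved by walking attributes, one name at a time. A generic sketch of that resolution, using a tiny stand-in object rather than the real Google API client (FakeService and its classes are invented for illustration):

```python
class Advertisers:
    def list(self, **kwargs):
        # Stand-in for the real API call; just echo the arguments.
        return {'called_with': kwargs}

class FakeService:
    advertisers = Advertisers()

def resolve(obj, dotted):
    """Walk a dotted path like 'advertisers.list' attribute by attribute."""
    for name in dotted.split('.'):
        obj = getattr(obj, name)
    return obj

func = resolve(FakeService(), 'advertisers.list')
print(func(partnerId=234340))
```

The kwargs dictionary from FIELDS is then simply unpacked into the resolved function, which is why it is specified as JSON name/value pairs.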
4. Execute Google API To BigQuery This does NOT need to be modified unless you are changing the recipe; just click play.
from starthinker.util.configuration import execute
from starthinker.util.recipe import json_set_fields

TASKS = [
    {
        'google_api': {
            'auth': {'field': {'name': 'auth_read', 'kind': 'authentication', 'order': 1, 'default': 'user', 'description': 'Credentials used for reading data.'}},
            'api': {'field': {'name': 'api', ...
colabs/google_api_to_bigquery.ipynb
google/starthinker
apache-2.0
Generate Features And Target Data
from sklearn.datasets import make_classification

# Generate features matrix and target vector
X, y = make_classification(n_samples=10000,
                           n_features=3,
                           n_informative=3,
                           n_redundant=0,
                           n_classes=2,
                           random_state=1)
machine-learning/f1_score.ipynb
tpin3694/tpin3694.github.io
mit
Create Logistic Regression
from sklearn.linear_model import LogisticRegression

# Create logistic regression
logit = LogisticRegression()
machine-learning/f1_score.ipynb
tpin3694/tpin3694.github.io
mit
Cross-Validate Model Using F1
from sklearn.model_selection import cross_val_score

# Cross-validate model using the F1 score
cross_val_score(logit, X, y, scoring="f1")
machine-learning/f1_score.ipynb
tpin3694/tpin3694.github.io
mit
Just adding some imports and setting graph display options.
from textblob import TextBlob
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
import matplotlib
import seaborn as sns
import cartopy

pd.set_option('display.max_colwidth', 200)
# pd.options.display.mpl_style = 'default'  # removed in modern pandas
matplotlib.style.use('ggplot')
sns.set_context('talk')
sns.set_style('whitegrid'...
arrows.ipynb
savioabuga/arrows
mit
Let's look at our data! load_df loads it in as a pandas.DataFrame, excellent for statistical analysis and graphing.
df = load_df('arrows/data/results.csv') df.info()
arrows.ipynb
savioabuga/arrows
mit
We'll be looking primarily at candidate, created_at, lang, place, user_followers_count, user_time_zone, polarity, influenced_polarity, and text.
df[['candidate', 'created_at', 'lang', 'place', 'user_followers_count', 'user_time_zone', 'polarity', 'influenced_polarity', 'text']].head(1)
arrows.ipynb
savioabuga/arrows
mit
First I'll look at sentiment, calculated with TextBlob using the text column. Sentiment is composed of two values, polarity - a measure of the positivity or negativity of a text - and subjectivity. Polarity is between -1.0 and 1.0; subjectivity between 0.0 and 1.0.
TextBlob("Tear down this wall!").sentiment
arrows.ipynb
savioabuga/arrows
mit
Unfortunately, it doesn't work too well on anything other than English.
TextBlob("Radix malorum est cupiditas.").sentiment
arrows.ipynb
savioabuga/arrows
mit
TextBlob has a cool translate() function that uses Google Translate to take care of that for us, but we won't be using it here - just because tweets include a lot of slang and abbreviations that can't be translated very well.
sentence = TextBlob("Radix malorum est cupiditas.").translate()
print(sentence)
print(sentence.sentiment)
arrows.ipynb
savioabuga/arrows
mit
All right - let's figure out the most (positively) polarized English tweets.
english_df = df[df.lang == 'en']
english_df.sort_values('polarity', ascending=False).head(3)[['candidate', 'polarity', 'subjectivity', 'text']]
arrows.ipynb
savioabuga/arrows
mit
Extrema don't mean much. We might get more interesting data with mean polarities for each candidate. Let's also look at influenced polarity, which takes into account the number of retweets and followers.
candidate_groupby = english_df.groupby('candidate')
candidate_groupby[['polarity', 'influence', 'influenced_polarity']].mean()
arrows.ipynb
savioabuga/arrows
mit