| repo_name | path | license | content |
|---|---|---|---|
DanielMcAssey/steamSummerMinigame | Analysis of WH spam strategy.ipynb | mit | %pylab inline
import matplotlib.pyplot as plt
n_wormholes = 10
n_games = 20
def calc(n_active, n_game, multiplier=1.0):
return n_wormholes * multiplier *(n_active/1000.0 + n_active/10000.0)**(n_game-1)
title = "WH spam strategy (starting w/ %d WHs for each player)" % n_wormholes
plt.figure(figsize=(8,4), dpi=72,... |
ClaudiaEsp/inet | Analysis/misc/Counting inhibitory connectivity motifs.ipynb | gpl-2.0 | # loading python modules
import numpy as np
np.random.seed(0)
from matplotlib.pyplot import figure
from terminaltables import AsciiTable
import matplotlib.pyplot as plt
%matplotlib inline
from __future__ import division
# loading custom inet modules
from inet import DataLoader, __version__
from inet.motifs import... |
dnc1994/MachineLearning-UW | ml-regression/ridge-regression-l2.ipynb | mit | import graphlab
"""
Explanation: Regression Week 4: Ridge Regression (interpretation)
In this notebook, we will run ridge regression multiple times with different L2 penalties to see which one produces the best fit. We will revisit the example of polynomial regression as a means to see the effect of L2 regularization.... |
unnati-xyz/intro-python-data-science | onion/3-Refine.ipynb | mit | # Import the two libraries we need: Pandas and Numpy
import pandas as pd
import numpy as np
# Read the csv file of Month Wise Market Arrival data that has been scraped.
df = pd.read_csv('MonthWiseMarketArrivals.csv')
df.head()
df.tail()
"""
Explanation: 2. Refine the Data
"Data is messy"
We will be perform... |
srcole/qwm | burrito/u/UNFINISHED_Burrito_correlations.ipynb | mit | %config InlineBackend.figure_format = 'retina'
%matplotlib inline
import numpy as np
import scipy as sp
import matplotlib.pyplot as plt
import pandas as pd
import seaborn as sns
sns.set_style("white")
"""
Explanation: San Diego Burrito Analytics
Scott Cole
23 April 2016
This notebook contains analyses on the burrito... |
steven-murray/halomod | docs/examples/beyond_galaxy.ipynb | mit | from halomod import TracerHaloModel
import numpy as np
from matplotlib import pyplot as plt
hm = TracerHaloModel(hod_model="Constant", transfer_model='EH')
hm.central_occupation
plt.plot(np.log10(hm.m),hm.satellite_occupation)
"""
Explanation: Going beyond galaxies as tracers with halomod
halomod is written in a wa... |
lit-mod-viz/middlemarch-critical-histories | old/e1/e1a-analysis.ipynb | gpl-3.0 | import pandas as pd
%matplotlib inline
from ast import literal_eval
import numpy as np
import re
import json
from nltk.corpus import names
from collections import Counter
from matplotlib import pyplot as plt
plt.rcParams["figure.figsize"] = [16, 6]
with open('../txt/e1a.json') as f:
rawData = f.read()
df = pd.re... |
simonsfoundation/CaImAn | demos/notebooks/demo_OnACID_mesoscope.ipynb | gpl-2.0 | try:
if __IPYTHON__:
# used for debugging purposes only; allows reloading classes when they change
get_ipython().magic('load_ext autoreload')
get_ipython().magic('autoreload 2')
except NameError:
pass
import logging
import numpy as np
logging.basicConfig(format=
... |
jorgemauricio/INIFAP_Course | ejercicios/Pandas/Ejercicios de Visualizacion con Pandas-Solucion.ipynb | mit | import pandas as pd
import matplotlib.pyplot as plt
df3 = pd.read_csv('../data/df3')
%matplotlib inline
df3.plot.scatter(x='a',y='b',c='red',s=50)
df3.info()
df3.head()
"""
Explanation: Data visualization exercise with Pandas - Solutions
This is a small exercise to review the different plots... |
chetan51/nupic.research | projects/dynamic_sparse/notebooks/ExperimentAnalysis-ImprovedMagvsSETcomparison.ipynb | gpl-3.0 | %load_ext autoreload
%autoreload 2
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import os
import glob
import tabulate
import pprint
import click
import numpy as np
import pandas as pd
from ray.tune.commands import *
from nupic.research.frameworks.dynamic... |
whitead/numerical_stats | unit_12/lectures/lecture_2.ipynb | gpl-3.0 | %matplotlib inline
import random
import numpy as np
import matplotlib.pyplot as plt
import scipy.stats
import scipy.linalg as linalg
import matplotlib
"""
Explanation: Nonlinear Least Squares
Unit 12, Lecture 2
Numerical Methods and Statistics
Prof. Andrew White, April 17 2018
Goals:
Be able to apply the same analys... |
mmathioudakis/moderndb | 2017/spark_tutorial.ipynb | mit | #On windows
#import findspark
#findspark.init(spark_home="C:/Users/me/software/spark-1.6.3-bin-hadoop2.6/")
import pyspark
import numpy as np # we'll be using numpy for some numeric operations
sc = pyspark.SparkContext(master="local[*]", appName="tour")
sc.stop()
"""
Explanation: Lecture 7: Spark Programming
In what... |
compsocialscience/summer-institute | 2018/materials/boulder/day2-digital-trace-data/Day 2 - Case Study - Web Scraping.ipynb | mit | pet_pages = ["https://www.boulderhumane.org/animals/adoption/dogs",
"https://www.boulderhumane.org/animals/adoption/cats",
"https://www.boulderhumane.org/animals/adoption/adopt_other"]
r = requests.get(pet_pages[0])
html = r.text
print(html[:500]) # Print the first 500 characters of the HTM... |
UltronAI/Deep-Learning | CS231n/assignment1/.ipynb_checkpoints/features-checkpoint.ipynb | mit | import random
import numpy as np
from cs231n.data_utils import load_CIFAR10
import matplotlib.pyplot as plt
from __future__ import print_function
%matplotlib inline
plt.rcParams['figure.figsize'] = (10.0, 8.0) # set default size of plots
plt.rcParams['image.interpolation'] = 'nearest'
plt.rcParams['image.cmap'] = 'gr... |
GoogleCloudPlatform/bigquery-notebooks | notebooks/community/analytics-componetized-patterns/retail/recommendation-system/bqml-scann/00_prep_bq_and_datastore.ipynb | apache-2.0 | !pip install -q -U apache-beam[gcp]
# Automatically restart kernel after installs
import IPython
app = IPython.Application.instance()
app.kernel.do_shutdown(True)
"""
Explanation: Import the sample data into BigQuery and Datastore
This notebook is the first of two notebooks that guide you through completing the prer... |
jasonding1354/PRML_Notes | 1.PROBABILITY_DISTRIBUTIONS/1.1 Binary_Variables.ipynb | mit | %matplotlib inline
import matplotlib.pyplot as plt
import numpy as np
"""
Explanation: Probability theory plays an important role in solving pattern recognition problems; it is the building block of more complex models.
One role of a probability distribution is to model the probability distribution p(x) of a random variable x given a finite set of observations x1, . . . , xN. This problem is known as density estimation. Choosing a suitable distribution is related to the problem of model selection, a central issue in pattern recognition.
Binary Variables
End of explanation
"""
from scipy.stats import bernoulli
"""
Expla... |
cvxgrp/cvxpylayers | examples/torch/data_poisoning_attack.ipynb | apache-2.0 | import cvxpy as cp
import matplotlib.pyplot as plt
import numpy as np
import torch
from cvxpylayers.torch import CvxpyLayer
"""
Explanation: Data poisoning attack
In this notebook, we use a convex optimization layer to perform a data poisoning attack; i.e., we show how to perturb the data used to train a logistic reg... |
ppik/playdata | Kaggle-Expedia/Expedia Hotel Recommendations - Logistic Regression.ipynb | mit | import collections
import itertools
import operator
import random
import heapq
import matplotlib.pyplot as plt
import ml_metrics as metrics
import numpy as np
import pandas as pd
import sklearn
import sklearn.decomposition
import sklearn.linear_model
import sklearn.preprocessing
%matplotlib notebook
"""
Explanation:... |
LSSTC-DSFP/LSSTC-DSFP-Sessions | Sessions/Session03/Day3/MapReduce.ipynb | mit | import numpy as np
def mapper(arr):
return np.sum(arr)
def reducer(x, y):
return x + y
a = [1, 12, 3]
b = [4, 12, 6, 3]
c = [8, 1, 12, 11, 12, 2]
inputData = [a, b, c]
# Find the sum of all the numbers:
from functools import reduce  # required in Python 3, where reduce is no longer a builtin
intermediate = map(mapper, inputData)
reduce(reducer, intermediate)
"""
Explanation: Data Management Pa... |
BrownDwarf/ApJdataFrames | notebooks/Chapman2009.ipynb | mit | import warnings
warnings.filterwarnings("ignore")
"""
Explanation: ApJdataFrames Chapman 2009
Title: THE MID-INFRARED EXTINCTION LAW IN THE OPHIUCHUS, PERSEUS, AND SERPENS MOLECULAR CLOUDS
Authors: Nicholas L. Chapman, Lee G Mundy, Shih-Ping Lai, and Neal J Evans
Data is from this paper:
http://iopscience.iop.org/00... |
robertoalotufo/ia898 | src/isolines.ipynb | mit | import numpy as np
def isolines(f, nc=10, n=1):
from colormap import colormap
from applylut import applylut
maxi = int(np.ceil(f.max()))
mini = int(np.floor(f.min()))
d = int(np.ceil(1.*(maxi-mini)/nc))
m = np.zeros((d,1))
m[0:n,:] = 1
m = np.resize(m, (maxi-mini, 1))
m = np.con... |
vinitsamel/udacitydeeplearning | intro-to-tflearn/TFLearn_Digit_Recognition.ipynb | mit | # Import Numpy, TensorFlow, TFLearn, and MNIST data
import numpy as np
import tensorflow as tf
import tflearn
import tflearn.datasets.mnist as mnist
"""
Explanation: Handwritten Number Recognition with TFLearn and MNIST
In this notebook, we'll be building a neural network that recognizes handwritten numbers 0-9.
This... |
wittawatj/kernel-gof | ipynb/gof_me_test.ipynb | mit | %load_ext autoreload
%autoreload 2
%matplotlib inline
#%config InlineBackend.figure_format = 'svg'
#%config InlineBackend.figure_format = 'pdf'
import freqopttest.tst as tst
import kgof
import kgof.data as data
import kgof.density as density
import kgof.goftest as gof
import kgof.intertst as tgof
import kgof.kernel as... |
rashikaranpuria/Machine-Learning-Specialization | Machine Learning Foundations: A Case Study Approach/Assignment_three/.ipynb_checkpoints/Document retrieval-checkpoint.ipynb | mit | import graphlab
graphlab.product_key.set_product_key("7348-CE53-3B3E-DBED-152B-828E-A99E-F303")
"""
Explanation: Document retrieval from wikipedia data
Fire up GraphLab Create
End of explanation
"""
people = graphlab.SFrame('people_wiki.gl/people_wiki.gl')
"""
Explanation: Load some text data - from wikipedia, page... |
asurve/arvind-sysml | scripts/staging/SystemML-NN/examples/Example - MNIST Softmax Classifier.ipynb | apache-2.0 | # Create a SystemML MLContext object
from systemml import MLContext, dml
ml = MLContext(sc)
"""
Explanation: Quick Setup
End of explanation
"""
%%sh
mkdir -p data/mnist/
cd data/mnist/
curl -O http://pjreddie.com/media/files/mnist_train.csv
curl -O http://pjreddie.com/media/files/mnist_test.csv
"""
Explanation: Dow... |
kyleabeauchamp/mdtraj | examples/ramachandran-plot.ipynb | lgpl-2.1 | traj = md.load('ala2.h5')
atoms, bonds = traj.topology.to_dataframe()
atoms
"""
Explanation: Lets load up the trajectory that we simulated in a previous example
End of explanation
"""
psi_indices, phi_indices = [6, 8, 14, 16], [4, 6, 8, 14]
angles = md.compute_dihedrals(traj, [phi_indices, psi_indices])
"""
Explana... |
statsmodels/statsmodels.github.io | v0.13.0/examples/notebooks/generated/exponential_smoothing.ipynb | bsd-3-clause | import os
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from statsmodels.tsa.api import ExponentialSmoothing, SimpleExpSmoothing, Holt
%matplotlib inline
data = [
446.6565,
454.4733,
455.663,
423.6322,
456.2713,
440.5881,
425.3325,
485.1494,
506.0482,
5... |
AtmaMani/pyChakras | udemy_ml_bootcamp/Machine Learning Sections/Natural-Language-Processing/NLP (Natural Language Processing) with Python.ipynb | mit | # ONLY RUN THIS CELL IF YOU NEED
# TO DOWNLOAD NLTK AND HAVE CONDA
# WATCH THE VIDEO FOR FULL INSTRUCTIONS ON THIS STEP
# Uncomment the code below and run:
# !conda install nltk #This installs nltk
# import nltk # Imports the library
# nltk.download() #Download the necessary datasets
"""
Explanation: <a href='http... |
gfeiden/Notebook | Projects/ngc2516_spots/cool_spot_colors.ipynb | mit | %matplotlib inline
import matplotlib.pyplot as plt
import numpy as np
from scipy.interpolate import LinearNDInterpolator
"""
Explanation: BC Grid Extrapolation
Testing errors generated by grid extrapolation for extremely cool spot bolometric corrections. A first test of this will be to use a more extensive Phoenix col... |
rflamary/POT | notebooks/plot_UOT_1D.ipynb | mit | # Author: Hicham Janati <hicham.janati@inria.fr>
#
# License: MIT License
import numpy as np
import matplotlib.pylab as pl
import ot
import ot.plot
from ot.datasets import make_1D_gauss as gauss
"""
Explanation: 1D Unbalanced optimal transport
This example illustrates the computation of Unbalanced Optimal transport
u... |
Cyb3rWard0g/ThreatHunter-Playbook | docs/notebooks/windows/06_credential_access/WIN-191030201010.ipynb | gpl-3.0 | from openhunt.mordorutils import *
spark = get_spark()
"""
Explanation: Remote Interactive Task Manager LSASS Dump
Metadata
| | |
|:------------------|:---|
| collaborators | ['@Cyb3rWard0g', '@Cyb3rPandaH'] |
| creation date | 2019/10/30 |
| modification date | 2020/09/20 |
| playbook rel... |
GoogleCloudPlatform/asl-ml-immersion | notebooks/end-to-end-structured/solutions/5b_deploy_keras_ai_platform_babyweight.ipynb | apache-2.0 | import os
"""
Explanation: LAB 5b: Deploy and predict with Keras model on Cloud AI Platform.
Learning Objectives
Setup up the environment
Deploy trained Keras model to Cloud AI Platform
Online predict from model on Cloud AI Platform
Batch predict from model on Cloud AI Platform
Introduction
In this notebook, we'll ... |
GoogleCloudPlatform/professional-services | examples/e2e-home-appliance-status-monitoring/notebook/EnergyDisaggregationEDA.ipynb | apache-2.0 | # @title Upload files (skip this if this is run locally)
# Use this cell to update the following files
# 1. requirements.txt
from google.colab import files
uploaded = files.upload()
# @title Install missing packages
# run this cell to install packages if some are missing
!pip install -r requirements.txt
# @title ... |
ES-DOC/esdoc-jupyterhub | notebooks/miroc/cmip6/models/miroc6/atmos.ipynb | gpl-3.0 | # DO NOT EDIT !
from pyesdoc.ipython.model_topic import NotebookOutput
# DO NOT EDIT !
DOC = NotebookOutput('cmip6', 'miroc', 'miroc6', 'atmos')
"""
Explanation: ES-DOC CMIP6 Model Properties - Atmos
MIP Era: CMIP6
Institute: MIROC
Source ID: MIROC6
Topic: Atmos
Sub-Topics: Dynamical Core, Radiation, Turbulence... |
mspieg/dynamical-systems | LorenzEquationsDerivation.ipynb | cc0-1.0 | %matplotlib inline
import numpy as np
import matplotlib.pyplot as plt
from scipy.integrate import odeint
from mpl_toolkits.mplot3d import Axes3D
"""
Explanation: <table>
<tr align=left><td><img align=left src="./images/CC-BY.png">
<td>Text provided under a Creative Commons Attribution license, CC-BY. All code is ma... |
kellerberrin/OSM-QSAR | Notebooks/OSM_Results/OSM Prelim Results.ipynb | mit | from IPython.display import display
import pandas as pd
print("Meta Results")
meta_results = pd.read_csv("./meta_results.csv")
display(meta_results)
"""
Explanation: OSM COMPETITION: A Meta Model that optimally combines the outputs of other models.
The aim of the competition is to develop a computational model that p... |
pfschus/fission_bicorrelation | methods/build_plot_bhp_e.ipynb | mit | %%javascript
$.getScript('https://kmahelona.github.io/ipython_notebook_goodies/ipython_notebook_toc.js')
"""
Explanation: Goal: Build and plot bhp_e
P. Schuster, University of Michigan
June 21, 2018
Load bhm_e
Build a function to sum across custom pairs for bhp_e
Plot it
Plot slices
End of explanation
"""
%load_ext... |
LDSSA/learning-units | units/07-data-diagnostics/examples/Diagnosing data problems example .ipynb | mit | import pandas as pd
import numpy as np
% matplotlib inline
from matplotlib import pyplot as plt
"""
Explanation: Diagnosing the data issues:
End of explanation
"""
data = pd.read_csv('all_data.csv')
data.head(10)
"""
Explanation: The data you'll be exloring:
End of explanation
"""
duplicated_data = data.duplic... |
daniel-koehn/Theory-of-seismic-waves-II | 00_Intro_Python_Jupyter_notebooks/4_NumPy_Arrays_and_Plotting.ipynb | gpl-3.0 | # Execute this cell to load the notebook's style sheet, then ignore it
from IPython.core.display import HTML
css_file = '../style/custom.css'
HTML(open(css_file, "r").read())
"""
Explanation: Content under Creative Commons Attribution license CC-BY 4.0, code under BSD 3-Clause License © 2017 L.A. Barba, N.C. Clementi
... |
turbomanage/training-data-analyst | courses/fast-and-lean-data-science/04_Keras_Flowers_transfer_learning_playground.ipynb | apache-2.0 | import os, sys, math
import numpy as np
from matplotlib import pyplot as plt
if 'google.colab' in sys.modules: # Colab-only Tensorflow version selector
%tensorflow_version 2.x
import tensorflow as tf
print("Tensorflow version " + tf.__version__)
AUTO = tf.data.experimental.AUTOTUNE
"""
Explanation: Training on GPU w... |
ES-DOC/esdoc-jupyterhub | notebooks/mohc/cmip6/models/hadgem3-gc31-ll/atmos.ipynb | gpl-3.0 | # DO NOT EDIT !
from pyesdoc.ipython.model_topic import NotebookOutput
# DO NOT EDIT !
DOC = NotebookOutput('cmip6', 'mohc', 'hadgem3-gc31-ll', 'atmos')
"""
Explanation: ES-DOC CMIP6 Model Properties - Atmos
MIP Era: CMIP6
Institute: MOHC
Source ID: HADGEM3-GC31-LL
Topic: Atmos
Sub-Topics: Dynamical Core, Radia... |
spencer2211/deep-learning | tv-script-generation/dlnd_tv_script_generation.ipynb | mit | """
DON'T MODIFY ANYTHING IN THIS CELL
"""
import helper
data_dir = './data/simpsons/moes_tavern_lines.txt'
text = helper.load_data(data_dir)
# Ignore notice, since we don't use it for analysing the data
text = text[81:]
"""
Explanation: TV Script Generation
In this project, you'll generate your own Simpsons TV scrip... |
darioflute/CS4A | Lecture-python_intro.ipynb | gpl-3.0 | %%writefile hello-world.py
#!/usr/bin/env python
print ("Hello world!")
ls hello-world*.py
cat hello-world.py
!python hello-world.py
"""
Explanation: Introduction to Python programming
Python program files
Python code is usually stored in text files with the file ending ".py":
myprogram.py
Every line in a Pyth... |
mspcvsp/cincinnati311Data | ComputeCincinnatiNeighborhoodCentroids.ipynb | gpl-3.0 | import findspark
import numpy as np
import os
import re
import subprocess
import shapefile
findspark.init()
import pyspark
sc = pyspark.SparkContext()
sqlContext = pyspark.sql.SQLContext(sc)
"""
Explanation: Initialize software environment
Initialize Spark Environment for Juypter Notebook
End of explanation
"""
... |
TomTranter/OpenPNM | examples/simulations/Transient Fickian Diffusion.ipynb | mit | import numpy as np
import openpnm as op
np.random.seed(10)
%matplotlib inline
np.set_printoptions(precision=5)
"""
Explanation: Transient Fickian Diffusion
The package OpenPNM allows for the simulation of many transport phenomena in porous media such as Stokes flow, Fickian diffusion, advection-diffusion, transport of... |
dietmarw/EK5312_ElectricalMachines | Chapman/Ch8-Problem_8-22.ipynb | unlicense | %pylab notebook
%precision %.4g
"""
Explanation: Excercises Electric Machinery Fundamentals
Chapter 8
Problem 8-22
End of explanation
"""
n0 = 1800 # [r/min]
Ra = 0.18 # [Ohm]
Vf = 120 # [V]
Radj_min = 0 # [Ohm]
Radj_max = 40 # [Ohm]
Rf = 20 # [Ohm]
Nf = 1000
... |
jackovt/Presentation-Design-Patterns | examples/python-example/observe.ipynb | mit | class Observable:
""" Extend this class to be observable. """
def __init__(self):
self.observers = []
def register(self, observer):
if not observer in self.observers:
self.observers.append(observer)
def unregister(self, observer):
if observer in self.observers:
... |
semsturgut/Robotic_ARM | SCS_Documents/ikpy-master/tutorials/ikpy/Moving the Poppy Torso using Inverse Kinematics.ipynb | gpl-3.0 | from poppy.creatures import PoppyTorso
poppy = PoppyTorso(simulator="vrep")
"""
Explanation: Moving the Poppy Torso using Inverse Kinematics
This notebook illustrates how you can use the kinematic chains defined by the PoppyTorso class to directly control the arms of the robot in the cartesian space.
Said in a simpl... |
kbase/data_api | examples/notebooks/data_api-display.ipynb | mit | %matplotlib inline
import matplotlib.pyplot as plt
import qgrid
qgrid.nbinstall()
from biokbase import data_api
from biokbase.data_api import display
display.nbviewer_mode(True)
"""
Explanation: Example
of building a notebook-friendly object into the output of the data API
Author: Dan Gunter
Initialization
Imports
S... |
emiliom/stuff | MMW_API_watershed_demo.ipynb | cc0-1.0 | import json
import requests
from requests.adapters import HTTPAdapter
from requests.packages.urllib3.util.retry import Retry
def requests_retry_session(
retries=3,
backoff_factor=0.3,
status_forcelist=(500, 502, 504),
session=None,
):
session = session or requests.Session()
retry = Retry(
... |
joannekoong/neuroscience_tutorials | basic/3. Imagined movement.ipynb | bsd-2-clause | %pylab inline
import numpy as np
import scipy.io
m = scipy.io.loadmat('data_set_IV/BCICIV_calib_ds1d.mat', struct_as_record=True)
# SciPy.io.loadmat does not deal well with Matlab structures, resulting in lots of
# extra dimensions in the arrays. This makes the code a bit more cluttered
sample_rate = m['nfo']['fs']... |
google-research/google-research | aptamers_mlpd/figures/Figure_2_Machine_learning_guided_aptamer_discovery_(submission).ipynb | apache-2.0 | import matplotlib.pyplot as plt
import seaborn as sns
import numpy as np
import pandas as pd
"""
Explanation: Copyright 2021 Google LLC
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
https://ww... |
sthuggins/phys202-2015-work | assignments/assignment07/AlgorithmsEx01.ipynb | mit | %matplotlib inline
from matplotlib import pyplot as plt
import numpy as np
"""
Explanation: Algorithms Exercise 1
Imports
End of explanation
"""
def tokenize(s, stop_words=None, punctuation='`~!@#$%^&*()_-+={[}]|\:;"<,>.?/}\t'):
"""Split a string into a list of words, removing punctuation and stop words."""
... |
feffenberger/StatisticalMethods | examples/Cepheids/PeriodMagnitudeRelation.ipynb | gpl-2.0 | from __future__ import print_function
import numpy as np
import matplotlib.pyplot as plt
%matplotlib inline
plt.rcParams['figure.figsize'] = (15.0, 8.0)
"""
Explanation: A Period - Magnitude Relation in Cepheid Stars
Cepheids are stars whose brightness oscillates with a stable period that appears to be strongly cor... |
zmechz/CarND-TrafficSign-P2 | Traffic_Sign_Classifier.ipynb | mit | # Load pickled data
import pickle
# TODO: Fill this in based on where you saved the training and testing data
training_file = "traffic-signs/train.p"
validation_file= "traffic-signs/valid.p"
testing_file = "traffic-signs/test.p"
with open(training_file, mode='rb') as f:
train = pickle.load(f)
with open(validatio... |
stevetjoa/stanford-mir | basic_feature_extraction.ipynb | mit | kick_signals = [
librosa.load(p)[0] for p in Path().glob('audio/drum_samples/train/kick_*.mp3')
]
snare_signals = [
librosa.load(p)[0] for p in Path().glob('audio/drum_samples/train/snare_*.mp3')
]
len(kick_signals)
len(snare_signals)
"""
Explanation: ← Back to Index
Basic Feature Extraction
Somehow, we... |
NeuroDataDesign/fngs | docs/ebridge2/fngs_merge/week_0602/timeseries.ipynb | apache-2.0 | %matplotlib inline
import numpy as np
import matplotlib
import matplotlib.pyplot as plt
fngs_ts = np.load('/home/eric/cmp/fngs/outputs/ts_roi/pp264-2mm/sub-0025864_ses-1_bold_pp264-2mm.npy')
cpac_ts = np.load('/home/eric/cmp/cpac/pipeline_HCPtest/sub-0025864_ses-1/roi_timeseries/_scan_rest_1_rest/_csf_threshold_0.96/_... |
empet/Plotly-plots | Plotly-cube.ipynb | gpl-3.0 | %matplotlib inline
import matplotlib.pyplot as plt
import numpy as np
img=plt.imread('Data/Plotly-logo3.png')
plt.imshow(img)
print 'image shape', img.shape
"""
Explanation: Plotly Cube: a cube with Plotly logo mapped on its faces
Our aim is to plot a cube having on each face the Plotly logo.
For, we choose a png imag... |
bspalding/research_public | lectures/drafts/Graphic presentation of data.ipynb | apache-2.0 | import numpy as np
import matplotlib.pyplot as plt
# Get returns data for S&P 500
start = '2014-01-01'
end = '2015-01-01'
spy = get_pricing('SPY', fields='price', start_date=start, end_date=end).pct_change()[1:]
# Plot a histogram using 20 bins
fig = plt.figure(figsize = (16, 7))
_, bins, _ = plt.hist(spy, 20)
labels... |
kadamkaustubh/project-Goldilocks | Ch2_MorePyMC_PyMC2.ipynb | mit | import pymc as pm
parameter = pm.Exponential("poisson_param", 1)
data_generator = pm.Poisson("data_generator", parameter)
data_plus_one = data_generator + 1
"""
Explanation: Chapter 2
This chapter introduces more PyMC syntax and design patterns, and ways to think about how to model a system from a Bayesian perspect... |
eds-uga/csci1360-fa16 | lectures/L15.ipynb | mit | # File "csv_file.txt" contains the following:
# 1,2,3,4
# 5,6,7,8
# 9,10,11,12
matrix = []
with open("csv_file.txt", "r") as f:
full_file = f.read()
# Split into lines.
lines = full_file.strip().split("\n")
for line in lines:
# Split on commas.
elements = line.strip().split(",")... |
ES-DOC/esdoc-jupyterhub | notebooks/cmcc/cmip6/models/cmcc-cm2-hr4/toplevel.ipynb | gpl-3.0 | # DO NOT EDIT !
from pyesdoc.ipython.model_topic import NotebookOutput
# DO NOT EDIT !
DOC = NotebookOutput('cmip6', 'cmcc', 'cmcc-cm2-hr4', 'toplevel')
"""
Explanation: ES-DOC CMIP6 Model Properties - Toplevel
MIP Era: CMIP6
Institute: CMCC
Source ID: CMCC-CM2-HR4
Sub-Topics: Radiative Forcings.
Properties: 8... |
tensorflow/probability | tensorflow_probability/examples/jupyter_notebooks/Factorial_Mixture.ipynb | apache-2.0 | #@title Licensed under the Apache License, Version 2.0 (the "License"); { display-mode: "form" }
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, sof... |
azhurb/deep-learning | sentiment_network/Sentiment Classification - How to Best Frame a Problem for a Neural Network (Project 1).ipynb | mit | def pretty_print_review_and_label(i):
print(labels[i] + "\t:\t" + reviews[i][:80] + "...")
g = open('reviews.txt','r') # What we know!
reviews = list(map(lambda x:x[:-1],g.readlines()))
g.close()
g = open('labels.txt','r') # What we WANT to know!
labels = list(map(lambda x:x[:-1].upper(),g.readlines()))
g.close()... |
amkatrutsa/MIPT-Opt | Fall2021/03-MatrixCalculus/jax_autodiff_tutorial.ipynb | mit | import jax
import jax.numpy as jnp
"""
Explanation: Automatic differentiation with JAX
Main features
Numpy wrapper
Auto-vectorization
Auto-parallelization (SPMD paradigm)
Auto-differentiation
XLA backend and JIT support
How to compute gradient of your objective?
Define it as a standard Python function
Call jax.grad... |
tarashor/vibrations | py/notebooks/draft/.ipynb_checkpoints/Corrugated geometries simplified-checkpoint.ipynb | mit | from sympy import *
from sympy.vector import CoordSys3D
N = CoordSys3D('N')
x1, x2, x3 = symbols("x_1 x_2 x_3")
alpha1, alpha2, alpha3 = symbols("alpha_1 alpha_2 alpha_3")
R, L, ga, gv = symbols("R L g_a g_v")
init_printing()
"""
Explanation: Corrugated Shells
Init symbols for sympy
End of explanation
"""
a1 = pi / ... |
nimish-jose/dlnd | tv-script-generation/dlnd_tv_script_generation.ipynb | gpl-3.0 | """
DON'T MODIFY ANYTHING IN THIS CELL
"""
import helper
data_dir = './data/simpsons/moes_tavern_lines.txt'
text = helper.load_data(data_dir)
# Ignore notice, since we don't use it for analysing the data
text = text[81:]
"""
Explanation: TV Script Generation
In this project, you'll generate your own Simpsons TV scrip... |
GoogleCloudPlatform/vertex-ai-samples | notebooks/community/gapic/automl/showcase_automl_tabular_regression_online_bq.ipynb | apache-2.0 | import os
import sys
# Google Cloud Notebook
if os.path.exists("/opt/deeplearning/metadata/env_version"):
USER_FLAG = "--user"
else:
USER_FLAG = ""
! pip3 install -U google-cloud-aiplatform $USER_FLAG
"""
Explanation: Vertex client library: AutoML tabular regression model for online prediction
<table align="... |
steinam/teacher | jup_notebooks/data-science-ipython-notebooks-master/numpy/02.03-Computation-on-arrays-ufuncs.ipynb | mit | import numpy as np
np.random.seed(0)
def compute_reciprocals(values):
output = np.empty(len(values))
for i in range(len(values)):
output[i] = 1.0 / values[i]
return output
values = np.random.randint(1, 10, size=5)
compute_reciprocals(values)
"""
Explanation: <!--BOOK_INFORMATION-->
<img a... |
hich28/mytesttxx | tests/python/highlighting.ipynb | gpl-3.0 | a = spot.translate('a U b U c')
"""
Explanation: This notebook shows you different ways in which states or transitions can be highlighted in Spot.
It should be noted that highlighting works using some special named properties: basically, two maps that are attached to the automaton, and associated state or edge numbe... |
oscaribv/pyaneti | inpy/example_toyp1/toy_model1.ipynb | gpl-3.0 | #Imort modules
from __future__ import print_function, division, absolute_import
import numpy as np
#Import citlalatonac from pyaneti_extras, note that pyaneti has to be compiled in your machine
#and pyaneti has to be in your PYTHONPATH, e.g., you have to add in your bashrc file
#export PYTHONPATH=${PYTHONPATH}:/pathtop... |
ActivisionGameScience/blog | _notebooks/IPython Parallel Introduction.ipynb | apache-2.0 | # You can also use the IPython magic shell command. but errors are harder to see and stopping the cluster can be janky.
!ipcluster start -n 4 --daemon
"""
Explanation: How to Deploy an IPython Cluster Using Mesos and Docker
John Dennison
April 19th, 2016
The members of the Analytics Services team here at Activision ar... |
manifoldai/merf | notebooks/MERF Gain Experiment.ipynb | mit | # Globals
num_clusters_each_size = 20
train_sizes = [1, 3, 5, 7, 9]
known_sizes = [9, 27, 45, 63, 81]
new_sizes = [10, 30, 50, 70, 90]
n_estimators = 300
max_iterations = 100
train_cluster_sizes = MERFDataGenerator.create_cluster_sizes_array(train_sizes, num_clusters_each_size)
known_cluster_sizes = MERFDataGenerator.c... |
mne-tools/mne-tools.github.io | 0.13/_downloads/plot_topo_compare_conditions.ipynb | bsd-3-clause | # Authors: Denis Engemann <denis.engemann@gmail.com>
# Alexandre Gramfort <alexandre.gramfort@telecom-paristech.fr>
# License: BSD (3-clause)
import matplotlib.pyplot as plt
import mne
from mne.viz import plot_evoked_topo
from mne.datasets import sample
print(__doc__)
data_path = sample.data_path()
"""
... |
keras-team/keras-io | examples/keras_recipes/ipynb/quasi_svm.ipynb | apache-2.0 |
from tensorflow import keras
from tensorflow.keras import layers
from tensorflow.keras.layers.experimental import RandomFourierFeatures
"""
Explanation: A Quasi-SVM in Keras
Author: fchollet<br>
Date created: 2020/04/17<br>
Last modified: 2020/04/17<br>
Description: Demonstration of how to train a Keras model that a... |
gully/adrasteia | notebooks/adrasteia_02-03_get_real_gaia_data.ipynb | mit | ! wget 'http://cdn.gea.esac.esa.int/Gaia/gaia_source/csv/GaiaSource_000-000-001.csv.gz'
! ls
! gzip -d GaiaSource_000-000-001.csv.gz
"""
Explanation: Gaia
Real data!
gully
Sept 14, 2016
Outline:
Download the data
Estimate how much data it will be
Batch download more
1. Download the data
End of explanation
"""
! ... |
anhaidgroup/py_entitymatching | notebooks/guides/step_wise_em_guides/Reading CSV Files from Disk.ipynb | bsd-3-clause | import py_entitymatching as em
import pandas as pd
import os, sys
"""
Explanation: Introduction
This IPython notebook illustrates how to read a CSV file from disk as a table and set its metadata.
First, we need to import py_entitymatching package and other libraries as follows:
End of explanation
"""
# Get the datas... |
Azure/azure-sdk-for-python | sdk/digitaltwins/azure-digitaltwins-core/samples/notebooks/04_Lots_on_Queries.ipynb | mit | from azure.identity import AzureCliCredential
from azure.digitaltwins.core import DigitalTwinsClient
# using yaml instead of
import yaml
import uuid
# using altair instead of matplotlib for vizuals
import numpy as np
import pandas as pd
# you will get this from the ADT resource at portal.azure.com
your_digital_twin... |
bakerjd99/jacks | notebooks/Extracting SQL code from SSIS dtsx packages with Python lxml.ipynb | unlicense | # imports
import os
from lxml import etree
# set sql output directory
sql_out = r"C:\temp\dtsxsql"
if not os.path.isdir(sql_out):
os.makedirs(sql_out)
# set dtsx package file
ssis_dtsx = r'C:\temp\dtsx\ParseXML.dtsx'
if not os.path.isfile(ssis_dtsx):
print("no package file")
# read and parse ssis package
tre... |
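The row above walks an SSIS `.dtsx` package with lxml; a minimal sketch of the same extraction pattern using the stdlib `xml.etree.ElementTree` instead (the miniature document and element names here are hypothetical - real packages keep SQL inside namespaced property elements):

```python
import xml.etree.ElementTree as ET

# Hypothetical miniature of a dtsx-like document; real packages keep
# SQL inside namespaced property elements.
doc = """
<Executables>
  <Executable Name="Load A"><SqlStatement>SELECT 1</SqlStatement></Executable>
  <Executable Name="Load B"><SqlStatement>SELECT 2</SqlStatement></Executable>
</Executables>
"""

root = ET.fromstring(doc)
# Map each task name to the SQL text found under it.
sql_by_task = {ex.get("Name"): ex.findtext("SqlStatement")
               for ex in root.iter("Executable")}
```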
folivetti/PIPYTHON | ListaEX_04.ipynb | mit | # Contador de palavras
import codecs
from collections import defaultdict
def ContaPalavras(texto):
    pass  # exercise stub: fill in the word-counting logic
for palavra, valor in ContaPalavras('exemplo.txt').items():
    print(palavra, valor)
"""
Explanation: Exercise 01: Create a function ContaPalavras that takes the name of a text file as input and return... |
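The exercise above asks for a `ContaPalavras` word counter; one possible Python 3 implementation (a sketch using `collections.Counter`, with a throwaway file standing in for `exemplo.txt`):

```python
from collections import Counter
import tempfile

def conta_palavras(nome_arquivo):
    """Count word frequencies in a text file (a Python 3 sketch)."""
    with open(nome_arquivo, encoding="utf-8") as fh:
        return Counter(fh.read().lower().split())

# Throwaway file standing in for 'exemplo.txt'.
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False,
                                 encoding="utf-8") as tmp:
    tmp.write("bom dia bom trabalho")
contagem = conta_palavras(tmp.name)
```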
ernestyalumni/MLgrabbag | kaggle/kaggle.ipynb | mit | print( os.listdir( os.getcwd() ))
timeseries_pd = pd.read_hdf( 'train.h5')
timeseries_pd.describe()
timeseries_pd.head()
timeseries_pd.columns
print( len(timeseries_pd.columns) )
for col in timeseries_pd.columns: print col
timeseries_pd["timestamp"]; # Name: timestamp, dtype: int16
timeseries_pd[["id","timestamp... |
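The row above inspects an HDF5 frame's columns, dtypes, and summary statistics; the same calls can be exercised on a tiny in-memory stand-in (columns here are made up, assuming pandas and numpy are installed):

```python
import numpy as np
import pandas as pd

# Tiny stand-in for the HDF5 market frame (columns are made up).
timeseries = pd.DataFrame({
    "id": [10, 11, 12],
    "timestamp": np.array([0, 0, 1], dtype=np.int16),
    "y": [0.10, -0.20, 0.05],
})

n_cols = len(timeseries.columns)       # how many features we have
summary = timeseries.describe()        # quick numeric overview
```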
tensorflow/examples | templates/notebook.ipynb | apache-2.0 | #@title Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under... |
yw-fang/readingnotes | machine-learning/McKinney-pythonbook2013/chapter03-note.ipynb | apache-2.0 | a = 5
a
import numpy as np
from numpy.random import randn
data = {i: randn() for i in range(7)}
print(data)
data1 = {j: j**2 for j in range(5)}
print(data1)
"""
Explanation: Reading notes
Author: Fang Yuewen
Email: fyuewen@gmail.com
Time: begun September 12, 2017
Chapter 3 notes begun September 28, 2017 at 23:38 and finished October 17, 2017
Chapter 3 IPython: an interactive computing and development environment
IPython encourages an "execute-explore... |
Danghor/Formal-Languages | Python/Top-Down-Parser.ipynb | gpl-2.0 | import re
"""
Explanation: A Recursive Parser for Arithmetic Expressions
In this notebook we implement a simple recursive descend parser for arithmetic expressions.
This parser will implement the following grammar:
$$
\begin{eqnarray}
\mathrm{expr} & \rightarrow & \mathrm{product}\;\;\mathrm{exprRest} ... |
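The grammar fragment above (`expr -> product exprRest`) drives a recursive descent parser; a minimal evaluating sketch for `+` and `*` over integers (function names here are illustrative, not the notebook's):

```python
import re

def tokenize(text):
    """Split an expression into number and operator tokens."""
    return re.findall(r"\d+|[+*]", text)

def parse_expr(tokens):
    """expr -> product ('+' product)*  (left-associative sums)."""
    value = parse_product(tokens)
    while tokens and tokens[0] == "+":
        tokens.pop(0)
        value += parse_product(tokens)
    return value

def parse_product(tokens):
    """product -> NUMBER ('*' NUMBER)*"""
    value = int(tokens.pop(0))
    while tokens and tokens[0] == "*":
        tokens.pop(0)
        value *= int(tokens.pop(0))
    return value

result = parse_expr(tokenize("2+3*4"))
```

Because `product` sits below `expr`, `*` binds tighter than `+` without any explicit precedence table.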
net-titech/CREST-Deep-M | notebooks/weight-clustering.ipynb | mit | import numpy as np
import os
import sys
weights_path = '/'.join(os.getcwd().split('/')[:-1]) + '/local-trained/alexnet/weights/'
print(weights_path)
os.listdir(weights_path)
keys = ['conv1', 'conv2', 'conv3', 'conv4', 'conv5', 'fc6', 'fc7', 'fc8']
weights = {}
for k in keys:
weights[k] = np.load(weights_path + k... |
ioshchepkov/SHTOOLS | examples/notebooks/tutorial_4.ipynb | bsd-3-clause | %matplotlib inline
from __future__ import print_function # only necessary if using Python 2.x
import matplotlib.pyplot as plt
import numpy as np
from pyshtools.shclasses import SHCoeffs, SHGrid, SHWindow
lmax = 100
coeffs = SHCoeffs.from_zeros(lmax)
coeffs.set_coeffs(values=[1], ls=[5], ms=[2])
"""
Explanation: Sphe... |
MingChen0919/learning-apache-spark | notebooks/04-miscellaneous/.ipynb_checkpoints/user-defined-sql-function (udf)-checkpoint.ipynb | mit | from pyspark.sql.types import *
from pyspark.sql.functions import udf
mtcars = spark.read.csv('../../data/mtcars.csv', inferSchema=True, header=True)
mtcars = mtcars.withColumnRenamed('_c0', 'model')
mtcars.show(5)
"""
Explanation: udf() function and SQL types
The pyspark.sql.functions.udf() function is a very impor... |
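`pyspark.sql.functions.udf()` needs a running Spark session to demonstrate; the pattern it implements - bundling a plain function with a declared return type and applying it column-wise - can be sketched in plain Python (`make_udf` and the sample column are purely illustrative, not pyspark API):

```python
def make_udf(fn, return_type):
    """Bundle a plain function with a declared return type, udf()-style."""
    def wrapped(column):
        # Apply fn to every value and coerce to the declared type.
        return [return_type(fn(v)) for v in column]
    return wrapped

# Hypothetical displacement column (cubic inches), as in mtcars.
disp = [160.0, 108.0, 258.0]
to_litres = make_udf(lambda cubic_inches: cubic_inches * 0.0163871, float)
disp_litres = to_litres(disp)
```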
joshspeagle/frankenz | demos/5 - Population Inference with Redshifts.ipynb | mit | from __future__ import print_function, division
import sys
import pickle
import numpy as np
import scipy
import matplotlib
from matplotlib import pyplot as plt
from six.moves import range
# import frankenz code
import frankenz
# plot in-line within the notebook
%matplotlib inline
np.random.seed(7001826)
# re-defini... |
ewulczyn/talk_page_abuse | misc/kaggle/src/n-grams.ipynb | apache-2.0 | data_filename = '../data/train.csv'
data_df = pd.read_csv(data_filename)
corpus = data_df['Comment']
labels = data_df['Insult']
train_corpus, test_corpus, train_labels, test_labels = \
sklearn.cross_validation.train_test_split(corpus, labels, test_size=0.33)
"""
Explanation: Load and Split Kaggle Data
End of explana... |
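The row above splits the Kaggle corpus with `sklearn.cross_validation.train_test_split`; the underlying operation is just a seeded shuffle-and-slice, sketched here in pure Python (the helper name mirrors sklearn's but is not its API):

```python
import random

def train_test_split(items, test_size=0.33, seed=0):
    """Seeded shuffle-and-slice split, mirroring sklearn's behaviour."""
    rng = random.Random(seed)
    shuffled = list(items)
    rng.shuffle(shuffled)
    n_test = int(round(len(shuffled) * test_size))
    return shuffled[n_test:], shuffled[:n_test]

comments = ["comment %d" % i for i in range(100)]
train, test = train_test_split(comments, test_size=0.33)
```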
lodrantl/github_analysis | github_analysis/analysis.ipynb | apache-2.0 | import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
%matplotlib inline
pd.options.display.max_rows = 20
"""
Explanation: GitHub Analysis
from
In this project we will analyze the most popular open repositories on the popular site GitHub. The data was collected from https://api.github.com, which, however, in t... |
vangj/py-bbn | jupyter/libpgm.ipynb | apache-2.0 | json_data = {
"V": ["Letter", "Grade", "Intelligence", "SAT", "Difficulty"],
"E": [["Difficulty", "Grade"],
["Intelligence", "Grade"],
["Intelligence", "SAT"],
["Grade", "Letter"]],
"Vdata": {
"Letter": {
"ord": 4,
"numoutcomes": 2,
"vals":... |
TwistedHardware/mltutorial | notebooks/IPython-Tutorial/4 - Numpy Basics.ipynb | gpl-2.0 | import numpy as np
"""
Explanation: Tutorial Brief
numpy is a powerful set of tools for performing mathematical operations on lists of numbers. It works faster than normal Python list operations and can manipulate high-dimensional arrays too.
Finding Help:
http://wiki.scipy.org/Tentative_NumPy_Tutorial
http://docs.sc... |
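As the explanation above says, numpy replaces explicit Python loops with fast array operations; a minimal example (assuming numpy is installed):

```python
import numpy as np

# Vectorized arithmetic replaces explicit Python loops.
prices = np.array([10.0, 20.0, 30.0])
quantities = np.array([3, 1, 2])
totals = prices * quantities   # element-wise multiply
grand_total = totals.sum()
```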
jph00/part2 | seq2seq-translation.ipynb | apache-2.0 | import unicodedata, string, re, random, time, math, torch, torch.nn as nn
from torch.autograd import Variable
from torch import optim
import torch.nn.functional as F
import keras, numpy as np
from keras.preprocessing import sequence
"""
Explanation: Requirements
End of explanation
"""
SOS_token = 0
EOS_token = 1
c... |
phoebe-project/phoebe2-docs | development/tutorials/emcee_continue_from.ipynb | gpl-3.0 | #!pip install -I "phoebe>=2.4,<2.5"
import phoebe
from phoebe import u # units
import numpy as np
logger = phoebe.logger('error')
"""
Explanation: Advanced: Continuing Emcee from a Previous Run
IMPORTANT: this tutorial assumes basic knowledge (and uses a file resulting from) the emcee tutorial.
Setup
Let's first mak... |
vanheck/blog-notes | QuantTrading/time-series-analyze_1-pandas.ipynb | mit | import datetime
MY_VERSION = 1,0
print('Verze notebooku:', '.'.join(map(str, MY_VERSION)))
print('Poslední aktualizace:', datetime.datetime.now())
"""
Explanation: Time Series Analysis 1 - manipulating data in Pandas
A description of basic functions for analyzing data in Pandas.
Notebook and version info
End of explanation
... |
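The notebook above manipulates trading time series in Pandas; a small sketch of two staple operations, resampling and differencing (the daily prices are made up, assuming pandas is installed):

```python
import pandas as pd

# Made-up daily closing prices for one trading week (Mon-Fri).
idx = pd.date_range("2020-01-06", periods=5, freq="D")
close = pd.Series([100.0, 101.0, 99.0, 102.0, 103.0], index=idx)

weekly_mean = close.resample("W").mean()   # aggregate days into weeks
daily_change = close.diff()                # day-over-day difference
```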
mne-tools/mne-tools.github.io | 0.24/_downloads/3d564af6b3f1e758cf01cd38abefd45f/50_epochs_to_data_frame.ipynb | bsd-3-clause | import os
import seaborn as sns
import mne
sample_data_folder = mne.datasets.sample.data_path()
sample_data_raw_file = os.path.join(sample_data_folder, 'MEG', 'sample',
'sample_audvis_filt-0-40_raw.fif')
raw = mne.io.read_raw_fif(sample_data_raw_file, verbose=False)
"""
Explanation... |
ueapy/ueapy.github.io | content/notebooks/2018-02-05-oop-vs-procedural.ipynb | mit | import numpy as np
import itertools
import warnings
warnings.simplefilter(action='ignore')
"""
Explanation: Instructions for the example in the code can be found here: https://adventofcode.com/2015/day/21
And other approaches to this problem (including other languages) can be found on Reddit: https://www.reddit.com/r/... |
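The Advent of Code 2015 day 21 puzzle referenced above enumerates every legal equipment loadout (one weapon, optional armor, up to two distinct rings); itertools expresses that enumeration directly. A sketch with a deliberately tiny, made-up shop:

```python
import itertools

# Made-up shop, far smaller than the puzzle's: (name, cost) pairs.
weapons = [("Dagger", 8), ("Shortsword", 10)]
armor = [("None", 0), ("Leather", 13)]
rings = [("Damage +1", 25), ("Defense +1", 20)]

costs = []
for weapon, suit in itertools.product(weapons, armor):
    # A fighter may wear 0, 1 or 2 distinct rings.
    for n_rings in range(3):
        for combo in itertools.combinations(rings, n_rings):
            costs.append(weapon[1] + suit[1] + sum(r[1] for r in combo))

cheapest = min(costs)
```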
ucsdlib/python-novice-inflammation | 6-errors.ipynb | cc0-1.0 | cd code
import errors_01
errors_01.favorite_ice_cream()
"""
Explanation: Errors and Exceptions
every programmer deals with errors, and they can be very frustrating
understanding what the different error types are and when you are likely to encounter them helps a lot
Errors in Python have a specific form, called a tra... |
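A traceback like the one `errors_01.favorite_ice_cream()` produces can be provoked and captured deliberately; a self-contained sketch (the function body here is a guess at the lesson's bug, not the actual module):

```python
import traceback

def favorite_ice_cream():
    flavors = ["chocolate", "vanilla", "strawberry"]
    return flavors[3]   # deliberate bug: index 3 is out of range

try:
    favorite_ice_cream()
except IndexError as err:
    report = traceback.format_exc()   # full traceback, as a string
    message = str(err)
```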
gautam1858/tensorflow | tensorflow/lite/g3doc/performance/post_training_integer_quant.ipynb | apache-2.0 | #@title Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under... |
ChadFulton/statsmodels | examples/notebooks/tsa_dates.ipynb | bsd-3-clause | from __future__ import print_function
import statsmodels.api as sm
import numpy as np
import pandas as pd
"""
Explanation: Dates in timeseries models
End of explanation
"""
data = sm.datasets.sunspots.load()
"""
Explanation: Getting started
End of explanation
"""
from datetime import datetime
dates = sm.tsa.datet... |
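The statsmodels row above builds dates for the annual sunspot data; a hedged sketch of the same idea with a plain pandas `DatetimeIndex` (the values are made up), which supports date-string slicing:

```python
import pandas as pd

# Made-up annual observations on a plain DatetimeIndex.
dates = pd.to_datetime(["1700-12-31", "1701-12-31",
                        "1702-12-31", "1703-12-31"])
values = pd.Series([5.0, 11.0, 16.0, 23.0], index=dates)

# Partial date-string slicing: everything up to the end of 1701.
early = values[:"1701"]
```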