| repo_name (string, 6-77 chars) | path (string, 8-215 chars) | license (15 classes) | content (string, 335-154k chars) |
|---|---|---|---|
LSST-Supernova-Workshops/Pittsburgh-2016 | Tutorials/Cadence/Cadence_And_OpSim.ipynb | mit | import numpy as np
import pandas as pd
import os
import sqlite3
from sqlalchemy import create_engine
%matplotlib inline
import matplotlib.pyplot as plt
opsimdbpath = os.environ.get('OPSIMDBPATH')
print(opsimdbpath)
engine = create_engine('sqlite:///' + opsimdbpath)
conn = sqlite3.connect(opsimdbpath)
cursor = co... |
srcole/qwm | burrito/.ipynb_checkpoints/UNFINISHED_Burrito_correlations-checkpoint.ipynb | mit | %config InlineBackend.figure_format = 'retina'
%matplotlib inline
import numpy as np
import scipy as sp
import matplotlib.pyplot as plt
import pandas as pd
import seaborn as sns
sns.set_style("white")
"""
Explanation: San Diego Burrito Analytics
Scott Cole
23 April 2016
This notebook contains analyses on the burrito... |
google-research/google-research | evolution/regularized_evolution_algorithm/regularized_evolution.ipynb | apache-2.0 | DIM = 100 # Number of bits in the bit strings (i.e. the "models").
NOISE_STDEV = 0.01 # Standard deviation of the simulated training noise.
class Model(object):
"""A class representing a model.
It holds two attributes: `arch` (the simulated architecture) and `accuracy`
(the simulated accuracy / fitness). Se... |
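The `Model` class is cut off in the excerpt; a minimal end-to-end sketch follows, with `DIM` and `NOISE_STDEV` as in the snippet. The bit-counting objective in `train_and_eval` is a hypothetical stand-in, not the notebook's actual simulated-training function.

```python
import random

DIM = 100           # number of bits in the bit strings (the "models")
NOISE_STDEV = 0.01  # std-dev of the simulated training noise

class Model:
    """Holds `arch` (the simulated architecture) and `accuracy`
    (the simulated accuracy / fitness)."""
    def __init__(self):
        self.arch = None
        self.accuracy = None

def random_arch():
    """A random DIM-bit architecture encoded as an int."""
    return random.getrandbits(DIM)

def train_and_eval(arch):
    # Hypothetical objective: fraction of set bits plus Gaussian noise.
    # The notebook's real objective may differ.
    return bin(arch).count("1") / DIM + random.gauss(0.0, NOISE_STDEV)

random.seed(1)
m = Model()
m.arch = random_arch()
m.accuracy = train_and_eval(m.arch)
print(0 <= m.arch < 2 ** DIM)  # → True
```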
dh7/ML-Tutorial-Notebooks | tf-image-generation.ipynb | bsd-2-clause | import numpy as np
import tensorflow as tf
"""
Explanation: TensorFlow to create useless images
To learn how to encode a simple image and a GIF
Import needed for Tensorflow
End of explanation
"""
%matplotlib notebook
import matplotlib
import matplotlib.pyplot as plt
from IPython.display import Image
"""
Explana... |
james-prior/cohpy | 20150601-dojo-in-membership-tests-lists-tuples-sets-dicts.ipynb | mit | n = 10**7
a = list(range(n))
b = tuple(a)
c = set(a)
d = dict(zip(a, a))
5 in a, 5 in b, 5 in c, 5 in d
i = n // 2  # integer division; n/2 would make the probe a float in Python 3
%timeit i in a
%timeit i in b
"""
Explanation: This notebook explores how fast it is to determine whether some value
is in lists, tuples, sets, and dictionaries.
See simplified conclusion at bottom of notebook.
The ... |
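The `%timeit` comparison in the snippet above can be reproduced as a plain script; `time.perf_counter` stands in for the IPython magic, and a smaller `n` (an assumption) keeps the run fast.

```python
import time

n = 10 ** 5  # smaller than the notebook's 10**7 so this finishes quickly
a = list(range(n))
b = tuple(a)
c = set(a)
d = dict(zip(a, a))
i = n // 2   # integer division keeps the probe an int

def avg_membership_time(container, value, repeats=100):
    """Average seconds for one `value in container` test."""
    start = time.perf_counter()
    for _ in range(repeats):
        value in container
    return (time.perf_counter() - start) / repeats

t_list = avg_membership_time(a, i)
t_tuple = avg_membership_time(b, i)
t_set = avg_membership_time(c, i)
t_dict = avg_membership_time(d, i)

# Lists and tuples scan linearly (O(n)); sets and dicts hash (O(1) average),
# so the hashed containers should win by orders of magnitude.
print(t_set < t_list and t_dict < t_tuple)  # → True
```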
tensorflow/docs-l10n | site/es-419/tutorials/keras/classification.ipynb | apache-2.0 | #@title Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under... |
ga7g08/ga7g08.github.io | _notebooks/2016-10-19-Basic-primer-on-MCMC.ipynb | mit | %matplotlib inline
import numpy as np
import scipy as sp
import pandas as pd
import matplotlib.pyplot as plt
import seaborn as sns
from scipy.stats import norm, uniform
sns.set_style('white')
sns.set_context('talk')
np.random.seed(123)
"""
Explanation: Primer on Markov Chain Monte Carlo (MCMC) sampling
This is my ... |
tensorflow/hub | examples/colab/senteval_for_universal_sentence_encoder_cmlm.ipynb | apache-2.0 | # Copyright 2021 The TensorFlow Hub Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by app... |
samgoodgame/sf_crime | iterations/misc/Cha_Goodgame_Kao_Moore_W207_Final_Project_updated_08_20_1213.ipynb | mit | # Import relevant libraries:
import time
import numpy as np
import pandas as pd
from sklearn.neighbors import KNeighborsClassifier
from sklearn import preprocessing
from sklearn.preprocessing import MinMaxScaler
from sklearn.preprocessing import StandardScaler
from sklearn.naive_bayes import BernoulliNB
from sklearn.na... |
zomansud/coursera | ml-classification/week-2/module-4-linear-classifier-regularization-assignment-blank.ipynb | mit | from __future__ import division
import graphlab
"""
Explanation: Logistic Regression with L2 regularization
The goal of this second notebook is to implement your own logistic regression classifier with L2 regularization. You will do the following:
Extract features from Amazon product reviews.
Convert an SFrame into a... |
kkozarev/mwacme | notebooks/ICRS_on_AIA_BinChen.ipynb | gpl-2.0 | aiamap=sunpy.map.Map('/Users/kkozarev/sunpy/data/sample_data/AIA20110319_105400_0171.fits')
"""
Explanation: Load an AIA image
End of explanation
"""
sunc_1au=SkyCoord(ra='23h53m53.47',dec='-00d39m44.3s', distance=1.*u.au,frame='icrs').transform_to(aiamap.coordinate_frame)
"""
Explanation: I then go to JPL Horizon... |
rishuatgithub/MLPy | nlp/UPDATED_NLP_COURSE/02-Parts-of-Speech-Tagging/01-Visualizing-POS.ipynb | apache-2.0 | # Perform standard imports
import spacy
nlp = spacy.load('en_core_web_sm')
# Import the displaCy library
from spacy import displacy
# Create a simple Doc object
doc = nlp(u"The quick brown fox jumped over the lazy dog's back.")
# Render the dependency parse immediately inside Jupyter:
displacy.render(doc, style='dep... |
chbrandt/pynotes | moc/MOC_LaMassa.ipynb | gpl-2.0 | baseurl = 'ftp://cdsarc.u-strasbg.fr/pub/cats/J/ApJ/817/172/'
readme_file = 'ReadMe'
chandra_file = 'chandra.dat'
import astropy
print("astropy version:", astropy.__version__)
import mocpy
print("mocpy version:", mocpy.__version__)
import healpy
print("healpy version:", healpy.__version__)
"""
Explanation: Building a MOC... |
kubeflow/kfp-tekton-backend | components/gcp/dataproc/submit_pig_job/sample.ipynb | apache-2.0 | %%capture --no-stderr
KFP_PACKAGE = 'https://storage.googleapis.com/ml-pipeline/release/0.1.14/kfp.tar.gz'
!pip3 install $KFP_PACKAGE --upgrade
"""
Explanation: Name
Data preparation using Apache Pig on YARN with Cloud Dataproc
Label
Cloud Dataproc, GCP, Cloud Storage, YARN, Pig, Apache, Kubeflow, pipelines, componen... |
googledatalab/notebooks | samples/Exploring Genomics Data.ipynb | apache-2.0 | import google.datalab.bigquery as bq
"""
Explanation: Exploring Genomic Data
This notebook demonstrates working with genetic variant data stored as publicly accessible Google BigQuery datasets.
Specifically, we will work with the Illumina Platinum Genomes data. The source data was originally in VCF format, which was i... |
rvm-segfault/edx | python_for_data_sci_dse200x/week3/.ipynb_checkpoints/Intro Notebook-checkpoint.ipynb | apache-2.0 | 365 * 24 * 60 * 60
print(str(_/1e6) + ' million')
x = 4 + 3
print (x)
"""
Explanation: Number of seconds in a year
End of explanation
"""
%matplotlib inline
from matplotlib.pyplot import plot
plot([0,1,0,1])
"""
Explanation: This is a markdown cell
This is heading 2
This is heading 3
Hi!
One Fish
Two Fish
Red ... |
stevetjoa/stanford-mir | segmentation.ipynb | mit | T = 3.0 # duration in seconds
sr = 22050 # sampling rate in Hertz
amplitude = numpy.logspace(-3, 0, int(T*sr), endpoint=False, base=10.0) # time-varying amplitude
print(amplitude.min(), amplitude.max()) # amplitude ramps from 1e-3 to 1
"""
Explanation: ← Back to Index
Segmentation
In audio processing, it... |
tensorflow/agents | docs/tutorials/per_arm_bandits_tutorial.ipynb | apache-2.0 | #@title Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under... |
hosford42/xcs | doc/XCSTutorial.ipynb | bsd-3-clause | import logging
logging.root.setLevel(logging.INFO)
"""
Explanation: XCS Tutorial
This is the official tutorial for the xcs package for Python 3. You can find the latest release and get updates on the project's status at the project home page.
What is XCS?
XCS is a Python 3 implementation of the XCS algorithm as descri... |
mramanathan/pydiary_notes | howto_pickle.ipynb | gpl-3.0 | """
An example to store the output without "pickle"
"""
testfile = 'nopickle.txt'
var1 = 1143
var2 = ["AECS", "LAYOUT", "KUNDALAHALLI"]
var3 = 58.30
var4 = ("Bangalore", 560037)
def ezhudhu():
with open(testfile, 'w+') as f:
f.write(str(var1))
f.write(str(var2))
f.write(str(var3))
... |
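For contrast with the no-pickle `ezhudhu` function above, here is a minimal sketch of round-tripping the same four variables with `pickle`; the temp-file name is hypothetical, and note that pickle requires binary file modes, unlike the text-mode `'w+'` in the snippet.

```python
import os
import pickle
import tempfile

var1 = 1143
var2 = ["AECS", "LAYOUT", "KUNDALAHALLI"]
var3 = 58.30
var4 = ("Bangalore", 560037)

path = os.path.join(tempfile.gettempdir(), "withpickle.bin")
with open(path, "wb") as f:      # binary mode, unlike the text-mode file above
    pickle.dump((var1, var2, var3, var4), f)

with open(path, "rb") as f:
    restored = pickle.load(f)

os.remove(path)
print(restored == (var1, var2, var3, var4))  # → True: types survive the round trip
```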
GoogleCloudPlatform/tf-estimator-tutorials | 08_Text_Analysis/06 - Part_3 - Text Classification - Hacker News - Custom Estimator Word Embedding.ipynb | apache-2.0 | import os
class Params:
pass
# Set to run on GCP
Params.GCP_PROJECT_ID = 'ksalama-gcp-playground'
Params.REGION = 'europe-west1'
Params.BUCKET = 'ksalama-gcs-cloudml'
Params.PLATFORM = 'local' # local | GCP
Params.DATA_DIR = 'data/news' if Params.PLATFORM == 'local' else 'gs://{}/data/news'.format(Params.BUCKE... |
LucaCanali/Miscellaneous | Spark_Notes/Spark_Histograms/Spark_SQL_Frequency_Histograms.ipynb | apache-2.0 | # Start the Spark Session
# This uses local mode for simplicity
# the use of findspark is optional
# install pyspark if needed
# ! pip install pyspark
# import findspark
# findspark.init("/home/luca/Spark/spark-3.3.0-bin-hadoop3")
from pyspark.sql import SparkSession
spark = (SparkSession.builder
.appName("... |
Illumina/interop | docs/src/Tutorial_04_Indexing_Metrics.ipynb | gpl-3.0 | run_folder = ""
"""
Explanation: Using the Illumina InterOp Library in Python: Part 4
Install
If you do not have the Python InterOp library installed, then you can do the following:
$ pip install -f https://github.com/Illumina/interop/releases/latest interop
You can verify that InterOp is properly installed:
$ python... |
tombstone/models | research/object_detection/colab_tutorials/eager_few_shot_od_training_tf2_colab.ipynb | apache-2.0 | !pip install -U --pre tensorflow=="2.2.0"
import os
import pathlib
# Clone the tensorflow models repository if it doesn't already exist
if "models" in pathlib.Path.cwd().parts:
while "models" in pathlib.Path.cwd().parts:
os.chdir('..')
elif not pathlib.Path('models').exists():
!git clone --depth 1 https://git... |
LSSTC-DSFP/LSSTC-DSFP-Sessions | Sessions/Session09/Day4/workbook_QPO.ipynb | mit | a = Table()
a.meta['dt'] = 0.0001 # time step, in seconds
a.meta['duration'] = 200 # length of time, in seconds
a.meta['omega'] = 2*np.pi # angular frequency, in radians
a.meta['phi'] = 0.0 # offset angle, in radians
"""
Explanation: Using Fourier Analysis to Analyze Quasi-Periodic Oscillations
By Abigail Stevens
P... |
dlsun/symbulate | labs/Lab 6 - Joint and Conditional Distributions.ipynb | mit | from symbulate import *
%matplotlib inline
"""
Explanation: Symbulate Lab 6 - Joint and Conditional Distributions
This Jupyter notebook provides a template for you to fill in. Read the notebook from start to finish, completing the parts as indicated. To run a cell, make sure the cell is highlighted by clicking on it... |
SteveDiamond/cvxpy | examples/machine_learning/ridge_regression.ipynb | gpl-3.0 | import cvxpy as cp
import numpy as np
import matplotlib.pyplot as plt
"""
Explanation: Machine Learning: Ridge Regression
Ridge regression is a regression technique that is quite similar to unadorned least squares linear regression: simply adding an $\ell_2$ penalty on the parameters $\beta$ to the objective function ... |
philmui/datascience2016fall | lecture03.numpy.pandas/ch05.ipynb | mit | from pandas import Series, DataFrame
import pandas as pd
from __future__ import division
from numpy.random import randn
import numpy as np
import os
import matplotlib.pyplot as plt
np.random.seed(12345)
plt.rc('figure', figsize=(10, 6))
from pandas import Series, DataFrame
import pandas as pd
np.set_printoptions(preci... |
tensorflow/docs | site/en/tutorials/images/classification.ipynb | apache-2.0 | #@title Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under... |
vascotenner/holoviews | doc/Examples/SRI_Model.ipynb | bsd-3-clause | import collections
import itertools
import math
import numpy as np
np.seterr(divide='ignore')
import numpy.random as rnd
import networkx as nx
import param
import holoviews as hv
SPREADING_SUSCEPTIBLE = 'S'
SPREADING_VACCINATED = 'V'
SPREADING_INFECTED = 'I'
SPREADING_RECOVERED = 'R'
DEAD = 'D'
class SRI_Model(para... |
gaufung/PythonStandardLibrary | DataPersistence/Pickle.ipynb | mit | import pickle
import pprint
data = [{'a': 'A', 'b': 2, 'c': 3.0}]
print('DATA:', end=' ')
pprint.pprint(data)
data_string = pickle.dumps(data)
print('PICKLE: {!r}'.format(data_string))
import pickle
import pprint
data1 = [{'a': 'A', 'b': 2, 'c': 3.0}]
print('BEFORE: ', end=' ')
pprint.pprint(data1)
data1_string = ... |
Vincibean/machine-learning-with-tensorflow | simulated-linear-regression.ipynb | apache-2.0 | import numpy as np
num_points = 1000
vectors_set = []
for i in range(num_points):
x1= np.random.normal(0.0, 0.55)
y1= x1 * 0.1 + 0.3 + np.random.normal(0.0, 0.03)
vectors_set.append([x1, y1])
x_data = [v[0] for v in vectors_set]
y_data = [v[1] for v in vectors_set]
"""
Explanation: Simulat... |
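The generating loop above plants slope 0.1 and intercept 0.3 in the data; before fitting them with TensorFlow, the closed-form least-squares solution recovers both directly. This stdlib-only sketch mirrors the excerpt's loop; the seed is an assumption (the excerpt fixes none).

```python
import random

random.seed(0)  # assumption: the excerpt fixes no seed
num_points = 1000
x_data, y_data = [], []
for _ in range(num_points):
    x1 = random.gauss(0.0, 0.55)
    y1 = x1 * 0.1 + 0.3 + random.gauss(0.0, 0.03)
    x_data.append(x1)
    y_data.append(y1)

# Closed-form simple linear regression: slope = cov(x, y) / var(x).
mx = sum(x_data) / num_points
my = sum(y_data) / num_points
slope = (sum((x - mx) * (y - my) for x, y in zip(x_data, y_data))
         / sum((x - mx) ** 2 for x in x_data))
intercept = my - slope * mx

print(round(slope, 2), round(intercept, 2))  # close to the planted 0.1 and 0.3
```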
waltervh/BornAgain-tutorial | talks/day_3/advanced_geometry_M/EvanescentWave.ipynb | gpl-3.0 | %matplotlib inline
# %load depthprobe_ex.py
import numpy as np
import bornagain as ba
from bornagain import deg, angstrom, nm
# layer thicknesses in angstroms
t_Ti = 130.0 * angstrom
t_Pt = 320.0 * angstrom
t_Ti_top = 100.0 * angstrom
t_TiO2 = 30.0 * angstrom
# beam data
ai_min = 0.0 * deg # minimum incident angle... |
balarsen/pymc_learning | Propagation_of_uncertainty/Sullivan1971_GF.ipynb | bsd-3-clause | from pprint import pprint
import numpy as np
import pymc3 as pm
import pandas as pd
import matplotlib.pyplot as plt
import seaborn as sns
sns.set(font_scale=1.5)
sns.set_context("notebook", rc={"lines.linewidth": 3})
%matplotlib inline
def getBoundedNormal_dist(mean=None, FWHM=None, name=None, lower=0, upper=1e6):
... |
gdementen/larray | doc/source/tutorial/tutorial_plotting.ipynb | gpl-3.0 | from larray import *
"""
Explanation: Plotting
Import the LArray library:
End of explanation
"""
demography_eurostat = load_example_data('demography_eurostat')
population = demography_eurostat.population / 1_000_000
# show the 'population' array
population
"""
Explanation: Import the test array population from the... |
sthuggins/phys202-2015-work | days/day08/Display.ipynb | mit | class Ball(object):
pass
b = Ball()
b.__repr__()
print(b)
"""
Explanation: Display of Rich Output
In Python, objects can declare their textual representation using the __repr__ method.
End of explanation
"""
class Ball(object):
def __repr__(self):
return 'TEST'
b = Ball()
print(b)
"""
Explanatio... |
Diyago/Machine-Learning-scripts | DEEP LEARNING/Pytorch from scratch/TODO/Autoencoders/convolutional-autoencoder/Convolutional_Autoencoder_Solution.ipynb | apache-2.0 | import torch
import numpy as np
from torchvision import datasets
import torchvision.transforms as transforms
# convert data to torch.FloatTensor
transform = transforms.ToTensor()
# load the training and test datasets
train_data = datasets.MNIST(root='data', train=True,
download=True... |
NEONScience/NEON-Data-Skills | tutorials/Python/Lidar/intro-lidar/classify_raster_with_threshold-py/classify_raster_with_threshold-py.ipynb | agpl-3.0 | import numpy as np
import gdal
import matplotlib.pyplot as plt
%matplotlib inline
import warnings
warnings.filterwarnings('ignore')
"""
Explanation: syncID: c324c554a35f463493349dbd0be19cec
title: "Classify a Raster Using Threshold Values in Python - 2017"
description: "Learn how to read NEON lidar raster GeoTIFFs (e.... |
csaladenes/csaladenes.github.io | test/eis-metadata-validation/Planon metadata validation2.ipynb | mit | import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
%matplotlib inline
"""
Explanation: EIS metadata validation script
Used to validate Planon output with spreadsheet input
1. Data import
End of explanation
"""
planon=pd.read_excel('Data Loggers.xlsx',index_col = 'Code')
master_loggerscontrollers ... |
ssunkara1/bqplot | examples/Marks/Pyplot/HeatMap.ipynb | apache-2.0 | import numpy as np
from ipywidgets import Layout
import bqplot.pyplot as plt
from bqplot import *
"""
Explanation: Heatmap
The HeatMap mark represents a 2d matrix of values as a color image. It can be used to visualize a 2d function, or a grayscale image for instance.
HeatMap is very similar to the GridHeatMap, but sh... |
DCPROGS/HJCFIT | exploration/CKS.ipynb | gpl-3.0 | %matplotlib notebook
import numpy as np
import matplotlib.pyplot as plt
from dcprogs.likelihood import QMatrix
tau = 0.2
qmatrix = QMatrix([[-1, 1, 0], [19, -29, 10], [0, 0.026, -0.026]], 1)
"""
Explanation: CKF Model
The following tries to reproduce Fig 9 from Hawkes, Jalali, Colquhoun (1992). First we create the ... |
google/applied-machine-learning-intensive | content/xx_misc/regular_expressions/colab.ipynb | apache-2.0 | # Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the L... |
DJCordhose/ai | notebooks/booster/2-manual-prediction.ipynb | mit | import warnings
warnings.filterwarnings('ignore')
%matplotlib inline
%pylab inline
import pandas as pd
print(pd.__version__)
"""
Explanation: Manual Prediction and Validation
End of explanation
"""
# df = pd.read_csv('./insurance-customers-300.csv', sep=';')
df = pd.read_csv('./insurance-customers-300-2.csv', sep=... |
KECB/learn | BAMM.101x/Functions_part_1.ipynb | mit | x=5
y=7
z=max(x,y) #max is the function. x and y are the arguments
print(z) #print is the function. z is the argument
"""
Explanation: <h1>Functions</h1>
<h2>Calling a function</h2>
End of explanation
"""
!pip install easygui
# pip: the Python package installer
# !: run the command in the shell (not from Python)
# ea... |
sys-bio/tellurium | examples/notebooks/core/tellurium_stochastic.ipynb | apache-2.0 | from __future__ import print_function
import tellurium as te
te.setDefaultPlottingEngine('matplotlib')
%matplotlib inline
import numpy as np
r = te.loada('S1 -> S2; k1*S1; k1 = 0.1; S1 = 40')
r.integrator = 'gillespie'
r.integrator.seed = 1234
results = []
for k in range(1, 50):
r.reset()
s = r.simulate(0, 40... |
ES-DOC/esdoc-jupyterhub | notebooks/noaa-gfdl/cmip6/models/gfdl-am4/land.ipynb | gpl-3.0 | # DO NOT EDIT !
from pyesdoc.ipython.model_topic import NotebookOutput
# DO NOT EDIT !
DOC = NotebookOutput('cmip6', 'noaa-gfdl', 'gfdl-am4', 'land')
"""
Explanation: ES-DOC CMIP6 Model Properties - Land
MIP Era: CMIP6
Institute: NOAA-GFDL
Source ID: GFDL-AM4
Topic: Land
Sub-Topics: Soil, Snow, Vegetation, Ener... |
akloster/amplicon_classification | notebooks/amplicon_metadata.ipynb | isc | %load_ext autoreload
%autoreload 2
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import seaborn
import porekit
import re
import pysam
import random
import feather
%matplotlib inline
"""
Explanation: Preparing the reads
[Loose et al] published their raw read files on ENA. This script uses four... |
maxis42/ML-DA-Coursera-Yandex-MIPT | 4 Stats for data analysis/Homework/14 test AB browser test/AB browser test.ipynb | mit | from __future__ import division
import numpy as np
import pandas as pd
from scipy import stats
from statsmodels.sandbox.stats.multicomp import multipletests
%matplotlib inline
import matplotlib.pyplot as plt
import seaborn as sns
from IPython.core.interactiveshell import InteractiveShell
InteractiveShell.ast_node_... |
jamesnw/wtb-data | notebooks/Style Similarity.ipynb | mit | import math
# Square the difference of each row, and then return the mean of the column.
# This is the average difference between the two.
# It will be higher if they are different, and lower if they are similar
def similarity(styleA, styleB):
diff = np.square(wtb[styleA] - wtb[styleB])
return diff.mean()
res... |
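Because the `wtb` DataFrame is not shown, a self-contained sketch of the same mean-squared-difference measure on hypothetical rating columns (plain lists standing in for DataFrame columns) illustrates how it orders styles: lower means more similar.

```python
def similarity(style_a, style_b):
    """Mean squared difference between two equal-length rating vectors;
    higher if the styles differ, lower if they are similar (as above)."""
    return sum((a - b) ** 2 for a, b in zip(style_a, style_b)) / len(style_a)

# Hypothetical rating columns standing in for the wtb DataFrame.
california = [4.0, 3.5, 5.0, 4.5]
carnitas = [4.0, 3.0, 4.5, 4.0]
surf_turf = [2.0, 5.0, 1.0, 3.5]

print(similarity(california, carnitas))  # → 0.1875
print(similarity(california, carnitas) <
      similarity(california, surf_turf))  # → True
```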
junhwanjang/DataSchool | Lecture/09. 기초 확률론 3 - 확률모형/2) 이항 확률 분포.ipynb | mit | N = 10
theta = 0.6
rv = sp.stats.binom(N, theta)
rv
"""
Explanation: The binomial probability distribution
A Bernoulli trial is an experiment whose outcome is either success or failure.
Consider running $N$ Bernoulli trials, each with success probability $\theta$. In the luckiest case all $N$ trials succeed, and in the unluckiest case none of them do. If the random variable $X$ counts the number of successes among the $N$ trials, then $X$ takes an integer value between 0 and $N$.
Such a random variable is ... |
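The distribution that `sp.stats.binom(N, theta)` wraps can be checked by hand; for the excerpt's N = 10 and θ = 0.6, a stdlib sketch using `math.comb` computes the full pmf.

```python
from math import comb

def binom_pmf(k, N, theta):
    """P(X = k) for X ~ Binomial(N, theta)."""
    return comb(N, k) * theta ** k * (1 - theta) ** (N - k)

N, theta = 10, 0.6
pmf = [binom_pmf(k, N, theta) for k in range(N + 1)]

print(round(sum(pmf), 10))                      # → 1.0 (the pmf sums to one)
print(max(range(N + 1), key=lambda k: pmf[k]))  # → 6, the mode: floor((N+1)*theta)
```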
Diyago/Machine-Learning-scripts | DEEP LEARNING/NLP/BERD pretrained model/Named Entity Recognition With Bert.ipynb | apache-2.0 | MAX_LEN = 75
bs = 32
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
n_gpu = torch.cuda.device_count()
torch.cuda.get_device_name(0)
"""
Explanation: We will limit our sequence length to 75 tokens and we will use a batch size of 32 as suggested by the Bert paper. Note, that Bert natively suppor... |
msultan/msmbuilder | examples/Clustering-Comparison.ipynb | lgpl-2.1 | import numpy as np
from sklearn import datasets
from collections import OrderedDict
np.random.seed(0)
n_samples = 2500
ds = OrderedDict()
ds['noisy_circles'] = datasets.make_circles(
n_samples=n_samples, factor=.5, noise=.05)
ds['noisy_moons'] = datasets.make_moons(
n_samples=n_samples, noise=.05)
ds['blobs... |
harpolea/CMG_testing_workshop | Testing.ipynb | mit | import numpy
from numpy.random import rand
import matplotlib.pyplot as plt
%matplotlib inline
plt.rcParams.update({'font.size': 18})
from scipy.integrate import quad
import unittest
"""
Explanation: Testing Scientific Codes
End of explanation
"""
def normalise(v):
norm = numpy.sqrt(numpy.sum(v**2))
retu... |
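The `normalise` function is cut off above; here is a plain-list reconstruction (a stand-in for the numpy version, with a hypothetical zero-vector guard added) together with the kind of `unittest` checks this testing notebook is about.

```python
import math
import unittest

def normalise(v):
    """Return v scaled to unit Euclidean length (plain-list stand-in
    for the numpy version excerpted above)."""
    norm = math.sqrt(sum(x * x for x in v))
    if norm == 0:
        raise ValueError("cannot normalise the zero vector")
    return [x / norm for x in v]

class TestNormalise(unittest.TestCase):
    def test_unit_length(self):
        w = normalise([3.0, 4.0])
        self.assertAlmostEqual(math.sqrt(sum(x * x for x in w)), 1.0)
        self.assertEqual(w, [0.6, 0.8])

    def test_zero_vector_raises(self):
        with self.assertRaises(ValueError):
            normalise([0.0, 0.0, 0.0])

suite = unittest.TestLoader().loadTestsFromTestCase(TestNormalise)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # → True
```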
daniel-koehn/Theory-of-seismic-waves-II | 05_2D_acoustic_FD_modelling/4_fdac2d_absorbing_boundary.ipynb | gpl-3.0 | # Execute this cell to load the notebook's style sheet, then ignore it
from IPython.core.display import HTML
css_file = '../style/custom.css'
HTML(open(css_file, "r").read())
"""
Explanation: Content under Creative Commons Attribution license CC-BY 4.0, code under BSD 3-Clause License © 2018 by D. Koehn, heterogeneou... |
joshnsolomon/phys202-2015-work | assignments/assignment06/InteractEx05.ipynb | mit | %matplotlib inline
import matplotlib.pyplot as plt
import numpy as np
from IPython.html.widgets import interact, interactive, fixed
from IPython.html import widgets
from IPython.display import display, SVG
"""
Explanation: Interact Exercise 5
Imports
Put the standard imports for Matplotlib, Numpy and the IPython widge... |
MingChen0919/learning-apache-spark | notebooks/01-data-strcture/1.4-merge-and-split-columns.ipynb | mit | mtcars = spark.read.csv(path='../../data/mtcars.csv',
sep=',',
encoding='UTF-8',
comment=None,
header=True,
inferSchema=True)
mtcars.show(n=5)
# adjust first column name
colnames = mtcars.columns
c... |
andim/pysnippets | sparse-matrix-updating-benchmarking.ipynb | mit | import numpy as np
import scipy.sparse
sparsity = 0.001
N, M = 10000, 200
rowvec = np.ones(N)
colvec = np.ones(M)
Aorig = np.random.random((N, M)) < sparsity
%timeit scipy.sparse.csr_matrix(Aorig)
%timeit scipy.sparse.lil_matrix(Aorig)
Alil = scipy.sparse.lil_matrix(Aorig)
%timeit scipy.sparse.csr_matrix(Alil)
Ado... |
ling7334/tensorflow-get-started | mnist/MNIST_For_ML_Beginners.ipynb | apache-2.0 | from tensorflow.examples.tutorials.mnist import input_data
mnist = input_data.read_data_sets("MNIST_data/", one_hot=True)
"""
Explanation: MNIST for ML beginners
This guide is written for readers who are new to machine learning and TensorFlow. If you already know MNIST and softmax (multinomial) regression, you can skim the quick tutorial instead. Make sure TensorFlow is installed before starting this guide.
When people learn to program, they always start with a "Hello World" program. Machine learning has its own "Hello World": MNIST.
MNIST is a simple comput... |
statsmodels/statsmodels.github.io | v0.12.2/examples/notebooks/generated/variance_components.ipynb | bsd-3-clause | import numpy as np
import statsmodels.api as sm
from statsmodels.regression.mixed_linear_model import VCSpec
import pandas as pd
"""
Explanation: Variance Component Analysis
This notebook illustrates variance components analysis for two-level
nested and crossed designs.
End of explanation
"""
np.random.seed(3123)
"... |
xlbaojun/Note-jupyter | 05其他/pandas文档-zh-master/.ipynb_checkpoints/检索 ,查询数据-checkpoint.ipynb | gpl-2.0 | import numpy as np
import pandas as pd
"""
Explanation: Indexing and querying data
This section covers how to index and query pandas data.
End of explanation
"""
dates = pd.date_range('1/1/2000', periods=8)
dates
df = pd.DataFrame(np.random.randn(8,4), index=dates, columns=list('ABCD'))
df
panel = pd.Panel({'one':df, 'two':df-df.mean()})
panel
"""
Explanation: Python... |
turbomanage/training-data-analyst | quests/rl/a2c/a2c_on_gcp.ipynb | apache-2.0 | %%bash
BUCKET=<your-bucket-here> # Change to your bucket name
JOB_NAME=pg_on_gcp_$(date -u +%y%m%d_%H%M%S)
REGION='us-central1' # Change to your bucket region
IMAGE_URI=gcr.io/cloud-training-prod-bucket/pg:latest
gcloud ai-platform jobs submit training $JOB_NAME \
--staging-bucket=gs://$BUCKET \
--region=$REGI... |
TUW-GEO/rt1 | doc/examples/example01.ipynb | apache-2.0 | # imports
from rt1.rt1 import RT1
from rt1.volume import Rayleigh
from rt1.surface import CosineLobe
import numpy as np
import pandas as pd
# definition of volume and surface
V = Rayleigh(tau=0.7, omega=0.3)
SRF = CosineLobe(ncoefs=10, i=5, NormBRDF=np.pi)
"""
Explanation: Example 01
This example reproduces the... |
statsmodels/statsmodels.github.io | v0.12.1/examples/notebooks/generated/quantile_regression.ipynb | bsd-3-clause | %matplotlib inline
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf
import matplotlib.pyplot as plt
data = sm.datasets.engel.load_pandas().data
data.head()
"""
Explanation: Quantile regression
This example page shows how to use statsmodels' QuantReg class to r... |
GoogleCloudPlatform/training-data-analyst | courses/machine_learning/deepdive2/launching_into_ml/labs/improve_data_quality.ipynb | apache-2.0 | !sudo chown -R jupyter:jupyter /home/jupyter/training-data-analyst
"""
Explanation: Improving Data Quality
Learning Objectives
Resolve missing values
Convert the Date feature column to a datetime format
Rename a feature column, remove a value from a feature column
Create one-hot encoding features
Understand temporal ... |
keras-team/keras-io | examples/generative/ipynb/gan_ada.ipynb | apache-2.0 | import matplotlib.pyplot as plt
import tensorflow as tf
import tensorflow_datasets as tfds
from tensorflow import keras
from tensorflow.keras import layers
"""
Explanation: Data-efficient GANs with Adaptive Discriminator Augmentation
Author: András Béres<br>
Date created: 2021/10/28<br>
Last modified: 2021/10/28<br>
... |
kaczla/PJN | src/NLTK/nltk.ipynb | gpl-2.0 | import nltk
"""
Explanation: <center>NLTK</center>
<center>PYTHON</center>
Table of contents:
What is NLTK?
Requirements
Installing NLTK
An example for English:
Import NLTK
Word Tokenize
Sentence Tokenize
Pos Tagger
Lemmatizer
Sentence Tokenize for German
Online version
Notes
What is NLTK?
<b>Natural La... |
sn0wle0pard/tracer | example/sort/.ipynb_checkpoints/Insertion-checkpoint.ipynb | mit | def insertion_sort(unsorted_list):
x = ipytracer.List1DTracer(unsorted_list)
display(x)
for i in range(1, len(x)):
j = i - 1
key = x[i]
        while j >= 0 and x[j] > key:  # check the index before subscripting
x[j+1] = x[j]
j = j - 1
x[j+1] = key
return x.data
"""
Explanation: Insertion Sor... |
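Stripped of the `ipytracer` instrumentation (its `List1DTracer` only wraps the list for live display), the same algorithm in plain Python looks like this:

```python
def insertion_sort(unsorted_list):
    """Return a sorted copy of unsorted_list using insertion sort."""
    x = list(unsorted_list)
    for i in range(1, len(x)):
        key = x[i]
        j = i - 1
        while j >= 0 and x[j] > key:  # shift larger elements one slot right
            x[j + 1] = x[j]
            j -= 1
        x[j + 1] = key
    return x

print(insertion_sort([5, 2, 4, 6, 1, 3]))  # → [1, 2, 3, 4, 5, 6]
```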
tritemio/multispot_paper | out_notebooks/usALEX - Corrections - Leakage fit-out.ipynb | mit | #bsearch_ph_sel = 'all-ph'
#bsearch_ph_sel = 'Dex'
bsearch_ph_sel = 'DexDem'
data_file = 'results/usALEX-5samples-PR-raw-%s.csv' % bsearch_ph_sel
"""
Explanation: Executed: Mon Mar 27 11:37:02 2017
Duration: 3 seconds.
Leakage coefficient fit
This notebook estracts the leakage coefficient from the set of 5 us-ALEX s... |
ilyasku/jpkfile | notes_on_jpk_archives/notes_on_jpk.ipynb | mit | from zipfile import ZipFile
fname = "../examples/force-save-2016.06.15-13.17.08.jpk-force"
z = ZipFile(fname)
"""
Explanation: JPK archive
JPK files are zipped archives of data.
There is a header file at the top-level
Header files are normal text files, nothing special needed to read them
There is a segments folder... |
dsacademybr/PythonFundamentos | Cap10/Mini-Projeto2-Solucao/Mini-Projeto2 - Analise1.ipynb | gpl-3.0 | # Python language version
from platform import python_version
print('Python version used in this Jupyter notebook:', python_version())
# Imports
import os
import subprocess
import stat
import numpy as np
import pandas as pd
import seaborn as sns
import matplotlib as mat
import matplotlib.pyplot as plt
fr... |
PMEAL/OpenPNM | examples/reference/class_inheritance/creating_a_custom_phase.ipynb | mit | import numpy as np
import openpnm as op
pn = op.network.Cubic(shape=[3, 3, 3], spacing=1e-4)
print(pn)
"""
Explanation: Creating a Custom Phase
Creating a custom fluid using GenericPhase
OpenPNM comes with a small selection of pre-written phases (Air, Water, Mercury). In many cases users will want different options ... |
QuantScientist/Deep-Learning-Boot-Camp | day03/Advanced_Keras_Tutorial/2.0. AutoEncoders and Embeddings.ipynb | mit | from keras.layers import Input, Dense
from keras.models import Model
from keras.datasets import mnist
import numpy as np
# this is the size of our encoded representations
encoding_dim = 32 # 32 floats -> compression of factor 24.5, assuming the input is 784 floats
# this is our input placeholder
input_img = Input(... |
ehongdata/Network-Analysis-Made-Simple | 7. Bipartite Graphs (Instructor).ipynb | mit | import networkx as nx
from networkx.algorithms import bipartite
# Initialize the city/person bipartite graph.
B = nx.Graph()
cities = ['Beijing', "Xi'an", 'Vancouver', 'San Francisco', 'Austin', 'Boston'] # populate a list of cities
people = ['Eric', 'Nan'] # populate a list of people's names
B.add_nodes_from(citie... |
mne-tools/mne-tools.github.io | 0.20/_downloads/6f729e8febc223b1d7003304f207eea9/plot_40_epochs_to_data_frame.ipynb | bsd-3-clause | import os
import seaborn as sns
import mne
sample_data_folder = mne.datasets.sample.data_path()
sample_data_raw_file = os.path.join(sample_data_folder, 'MEG', 'sample',
'sample_audvis_filt-0-40_raw.fif')
raw = mne.io.read_raw_fif(sample_data_raw_file, verbose=False)
"""
Explanation... |
mapagron/Boot_camp | hw6/Homework#6.ipynb | gpl-3.0 | # Dependencies
import json
import requests as req
import random
import seaborn as sns
import pandas as pd
import math as math
import time
import numpy as np
import matplotlib.pyplot as plt
from citipy import citipy
"""
Explanation: In this example, you'll be creating a Python script to visualize the weather of 500+ ci... |
AAbercrombie0492/satellite_imagery_feature_detection | notebooks/data_wrangling/interface_Spark_with_EC2.ipynb | mit | # Environment at time of execution
%load_ext watermark
%pylab inline
%watermark -a "Anthony Abercrombie" -d -t -v -p numpy,pandas,matplotlib -g
from __future__ import print_function
import os
import pandas as pd
import matplotlib.pyplot as plt
import seaborn as sns
import dotenv
import os
import sys
import dotenv
... |
AhmetHamzaEmra/Deep-Learning-Specialization-Coursera | Convolutional Neural Networks/Face+Recognition+for+the+Happy+House+-+v3.ipynb | mit | from keras.models import Sequential
from keras.layers import Conv2D, ZeroPadding2D, Activation, Input, concatenate
from keras.models import Model
from keras.layers.normalization import BatchNormalization
from keras.layers.pooling import MaxPooling2D, AveragePooling2D
from keras.layers.merge import Concatenate
from kera... |
SIMEXP/Projects | NSC2006/labo1/labo_NSC2006_donnees_multidimentionnelles_Octave.ipynb | mit | %matplotlib inline
from pymatbridge import Octave
octave = Octave()
octave.start()
%load_ext pymatbridge
"""
Explanation: <div align="center">
<h2> Quantitative Methods in Neuroscience </h2>
</div>
<div align="center">
<b><i> Course NSC-2006, year 2015</i></b><br>
<b>Laboratory: multidimensional data analys... |
ga7g08/ga7g08.github.io | _notebooks/2015-02-09-Gibbs-sampler-with-a-bivariate-normal-distribution.ipynb | mit | from numpy.random import normal
import matplotlib.pyplot as plt
%matplotlib inline
plt.rcParams['figure.figsize'] = (8, 6)
plt.rcParams['axes.labelsize'] = 22
def GibbsSampler(theta0, y, k, rho):
""" Simple implementation of the Gibbs sampler for a bivariate normal
distribution. """
theta =... |
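The `GibbsSampler` body in the row above is truncated. As a hedged, stdlib-only sketch of the technique it names — alternately drawing each coordinate of a standard bivariate normal from its exact conditional `N(rho * other, 1 - rho**2)` — one could write (an illustration, not the notebook's actual code):

```python
import math
import random

def gibbs_bivariate_normal(n_samples, rho, seed=0):
    """Gibbs sampler for a standard bivariate normal with correlation rho.

    Each scan draws one coordinate from its exact conditional:
    theta1 | theta2 ~ N(rho * theta2, 1 - rho**2), and symmetrically.
    """
    rng = random.Random(seed)
    sd = math.sqrt(1.0 - rho * rho)   # conditional standard deviation
    t1, t2 = 0.0, 0.0                 # arbitrary starting point
    samples = []
    for _ in range(n_samples):
        t1 = rng.gauss(rho * t2, sd)
        t2 = rng.gauss(rho * t1, sd)
        samples.append((t1, t2))
    return samples

samples = gibbs_bivariate_normal(20000, rho=0.8)

# After a short burn-in the empirical correlation should approach rho.
xs = [s[0] for s in samples[1000:]]
ys = [s[1] for s in samples[1000:]]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
vx = sum((x - mx) ** 2 for x in xs) / n
vy = sum((y - my) ** 2 for y in ys) / n
corr = cov / math.sqrt(vx * vy)
```

The chain mixes quickly here because the conditionals are exact; for correlations very close to 1 a much longer burn-in would be needed.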
sadahanu/DataScience_SideProject | Movie_Rating/Culture_difference_movie_rating.ipynb | mit | imdb_dat = pd.read_csv("movie_metadata.csv")
imdb_dat.info()
"""
Explanation: Data from <font color = "red"> the "IMDB5000"</font> database
End of explanation
"""
import requests
import re
from bs4 import BeautifulSoup
import time
import string
# return the douban movie rating that matches the movie name and year
# ... |
UWashington-Astro300/Astro300-A17 | FirstLast_Sympy.ipynb | mit | %matplotlib inline
import numpy as np
import matplotlib.pyplot as plt
import sympy as sp
"""
Explanation: First Last - SymPy
End of explanation
"""
sp.init_printing()
x = sp.symbols('x')
my_x = np.linspace(-10,10,100)
"""
Explanation: $$ \Large {\displaystyle f(x)=3e^{-{\frac {x^{2}}{8}}}} \sin(x/3)$$
Find the ... |
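The SymPy cell above defines $f(x)=3e^{-x^{2}/8}\sin(x/3)$, but the task statement is cut off. As a plain-Python companion sketch (introduced here, not part of the original notebook), the same function can be evaluated numerically on the notebook's interval $[-10, 10]$:

```python
import math

def f(x):
    """The notebook's function: 3 * exp(-x**2 / 8) * sin(x / 3)."""
    return 3.0 * math.exp(-x * x / 8.0) * math.sin(x / 3.0)

# Sample 100 points on [-10, 10], matching the notebook's linspace call.
xs = [-10 + 20 * i / 99 for i in range(100)]
ys = [f(x) for x in xs]
```

In the notebook itself the symbolic `sp.symbols('x')` expression would typically be turned into such a numeric function with `sp.lambdify` before plotting.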
zihangdai/xlnet | notebooks/colab_imdb_gpu.ipynb | apache-2.0 | ! pip install sentencepiece
"""
Explanation: <a href="https://colab.research.google.com/github/zihangdai/xlnet/blob/master/notebooks/colab_imdb_gpu.ipynb" target="_parent"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/></a>
XLNet IMDB movie review classification project
This n... |
kubeflow/community | scripts/github_stats.ipynb | apache-2.0 | import os
import subprocess
if os.path.exists("/var/run/secrets/kubernetes.io/serviceaccount"):
subprocess.check_call(["pip", "install", "--user", "-r", "requirements.txt"], stderr=subprocess.STDOUT, bufsize=1)
# NOTE: The RuntimeWarnings (if any) are harmless. See ContinuumIO/anaconda-issues#6678.
import altair a... |
BrownDwarf/ApJdataFrames | notebooks/Rayner2009.ipynb | mit | import warnings
warnings.filterwarnings("ignore")
from astropy.io import ascii
import pandas as pd
"""
Explanation: ApJdataFrames Rayner et al. 2009
Title: THE INFRARED TELESCOPE FACILITY (IRTF) SPECTRAL LIBRARY: COOL STARS
Authors: John T. Rayner, Michael C. Cushing, and William D. Vacca
Data is from this paper:
ht... |
mne-tools/mne-tools.github.io | 0.24/_downloads/93beebed8738eca8bfe26d41a12f4260/10_stc_class.ipynb | bsd-3-clause | import os
from mne import read_source_estimate
from mne.datasets import sample
print(__doc__)
# Paths to example data
sample_dir_raw = sample.data_path()
sample_dir = os.path.join(sample_dir_raw, 'MEG', 'sample')
subjects_dir = os.path.join(sample_dir_raw, 'subjects')
fname_stc = os.path.join(sample_dir, 'sample_au... |
I2MAX-LearningProject/Flask-server | Tests/Prophet_trial2_8_16.ipynb | mit | rawArrayDatas=[["2017-08-11", "2017-08-12", "2017-08-13", "2017-08-14", "2017-08-15","2017-08-16"],
[20.0, 30.0, 40.0, 50.0, 60.0,20.0]]
processId=12
forecastDay=4
"""
Explanation: Actual function input
End of explanation
"""
mockForecast={}
rmse={}
forecast=[]
realForecast={}
trainSize=int(len(rawArrayDatas[0]... |
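The `trainSize` expression in the row above is truncated. A hypothetical hold-out split of the same `rawArrayDatas` — the 0.7 ratio is an assumption introduced here, not taken from the notebook — might look like:

```python
raw_array_datas = [["2017-08-11", "2017-08-12", "2017-08-13",
                    "2017-08-14", "2017-08-15", "2017-08-16"],
                   [20.0, 30.0, 40.0, 50.0, 60.0, 20.0]]

# Hypothetical 70% training split for a hold-out RMSE check.
train_size = int(len(raw_array_datas[0]) * 0.7)
train_ds, test_ds = raw_array_datas[0][:train_size], raw_array_datas[0][train_size:]
train_y, test_y = raw_array_datas[1][:train_size], raw_array_datas[1][train_size:]
```

The held-out pairs (`test_ds`, `test_y`) are what an RMSE dictionary like the notebook's `rmse` would be computed against.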
ajgpitch/qutip-notebooks | development/development-smesolver-new-methods.ipynb | lgpl-3.0 | %matplotlib inline
%config InlineBackend.figure_formats = ['svg']
from qutip import *
from qutip.ui.progressbar import BaseProgressBar
import numpy as np
import matplotlib.pyplot as plt
from scipy.integrate import odeint
y_sse = None
import time
"""
Explanation: Test for different SME solvers against analytical solut... |
drvinceknight/cfm | assets/assessment/2020-2021/ind/assignment.ipynb | mit | import random
def sample_experiment():
### BEGIN SOLUTION
### END SOLUTION
"""
Explanation: Computing for Mathematics - 2020/2021 individual coursework
Important Do not delete the cells containing:
```
BEGIN SOLUTION
END SOLUTION
```
write your solution attempts in those cells.
To submit this notebook:
Change... |
GoogleCloudPlatform/training-data-analyst | courses/machine_learning/deepdive2/production_ml/labs/distributed_training_with_TF.ipynb | apache-2.0 | # Import TensorFlow
import tensorflow as tf
"""
Explanation: Distributed training with TensorFlow
Learning Objectives
1. Create MirroredStrategy
2. Integrate tf.distribute.Strategy with tf.keras
3. Create the input dataset and call tf.distribute.Strategy.experimental_distribute_dataset
Introduction
tf.distribute... |
CNS-OIST/STEPS_Example | user_manual/source/diffusion.ipynb | gpl-2.0 | import math
import numpy
import pylab
import random
import time
import steps.model as smodel
import steps.solver as solvmod
import steps.geom as stetmesh
import steps.rng as srng
"""
Explanation: Simulating Diffusion in Volumes
The simulation script described in this chapter is available at STEPS_Example repository.
... |
benneely/qdact-basic-analysis | notebooks/caresetting.ipynb | gpl-3.0 | from IPython.core.display import display, HTML;from string import Template;
HTML('<script src="//d3js.org/d3.v3.min.js" charset="utf-8"></script>')
css_text2 = '''
#main { float: left; width: 750px;}#sidebar { float: right; width: 100px;}#sequence { width: 600px; height: 70px;}#legend { padding: 10px 0 0 3px;}... |
ajrichards/bayesian-examples | reference/linear-algebra.ipynb | bsd-3-clause | import numpy as np
from numpy.random import randn as randn
from numpy.random import randint as randint
"""
Explanation: Linear Algebra with examples using Numpy
End of explanation
"""
from IPython.display import Image
Image('images/vector.png')
x = np.array([1,2,3,4])
print(x)
print(x.shape)
"""
Explanation: Linear... |
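The explanation cell in the row above is cut off mid-sentence. As a minimal stdlib companion to the NumPy vector cell (not part of the original notebook), the dot product and Euclidean norm it introduces can be written out by hand:

```python
import math

def dot(a, b):
    """Inner product of two equal-length sequences."""
    return sum(x * y for x, y in zip(a, b))

x = [1, 2, 3, 4]
squared_norm = dot(x, x)        # 1 + 4 + 9 + 16 = 30
norm = math.sqrt(squared_norm)  # Euclidean length of x
```

NumPy's `np.dot(x, x)` and `np.linalg.norm(x)` compute the same quantities on arrays.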
alsam/jlclaw | src/euler/Euler.ipynb | mit | %matplotlib inline
%config InlineBackend.figure_format = 'svg'
from exact_solvers import euler
from exact_solvers import euler_demos
from ipywidgets import widgets
from ipywidgets import interact
State = euler.Primitive_State
gamma = 1.4
"""
Explanation: The Euler equations of gas dynamics
In this notebook, we discus... |
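The Euler-equations row above sets `gamma = 1.4` before the text is cut off. As a hedged sketch of the standard ideal-gas relations such a notebook relies on — the function names are introduced here, not taken from the `exact_solvers.euler` module — one could write:

```python
import math

gamma = 1.4  # ratio of specific heats, as in the notebook

def primitive_to_conserved(rho, u, p):
    """Convert primitive variables (density, velocity, pressure) to the
    conserved variables (density, momentum, total energy) of the 1D Euler
    equations with an ideal-gas equation of state."""
    momentum = rho * u
    energy = p / (gamma - 1.0) + 0.5 * rho * u * u
    return rho, momentum, energy

def sound_speed(rho, p):
    """Ideal-gas sound speed c = sqrt(gamma * p / rho)."""
    return math.sqrt(gamma * p / rho)

rho, mom, E = primitive_to_conserved(1.0, 0.0, 1.0)
c = sound_speed(1.0, 1.0)
```

For the unit state (rho = 1, u = 0, p = 1) the total energy reduces to the internal energy p / (gamma - 1) = 2.5.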
ES-DOC/esdoc-jupyterhub | notebooks/ncc/cmip6/models/sandbox-1/landice.ipynb | gpl-3.0 | # DO NOT EDIT !
from pyesdoc.ipython.model_topic import NotebookOutput
# DO NOT EDIT !
DOC = NotebookOutput('cmip6', 'ncc', 'sandbox-1', 'landice')
"""
Explanation: ES-DOC CMIP6 Model Properties - Landice
MIP Era: CMIP6
Institute: NCC
Source ID: SANDBOX-1
Topic: Landice
Sub-Topics: Glaciers, Ice.
Properties: 3... |
ES-DOC/esdoc-jupyterhub | notebooks/ncc/cmip6/models/noresm2-lme/atmos.ipynb | gpl-3.0 | # DO NOT EDIT !
from pyesdoc.ipython.model_topic import NotebookOutput
# DO NOT EDIT !
DOC = NotebookOutput('cmip6', 'ncc', 'noresm2-lme', 'atmos')
"""
Explanation: ES-DOC CMIP6 Model Properties - Atmos
MIP Era: CMIP6
Institute: NCC
Source ID: NORESM2-LME
Topic: Atmos
Sub-Topics: Dynamical Core, Radiation, Turb... |
ppossemiers/analyzer | analyzer.ipynb | mit | # Imports and directives
%matplotlib inline
import numpy as np
from math import log
import matplotlib.pyplot as plt
from matplotlib.mlab import PCA as mlabPCA
import javalang
import os, re, requests, zipfile, json, operator
from collections import Counter
import colorsys
import random
from StringIO import StringIO
fro... |
khalido/nd101 | keyboard-shortcuts.ipynb | gpl-3.0 | # mode practice
"""
Explanation: Keyboard shortcuts
In this notebook, you'll get some practice using keyboard shortcuts. These are key to becoming proficient at using notebooks and will greatly increase your work speed.
First up, switching between edit mode and command mode. Edit mode allows you to type into cells whi... |
Jackporter415/phys202-2015-work | assignments/assignment05/InteractEx01.ipynb | mit | %matplotlib inline
from matplotlib import pyplot as plt
import numpy as np
from IPython.html.widgets import interact, interactive, fixed
from IPython.display import display
"""
Explanation: Interact Exercise 01
Import
End of explanation
"""
def print_sum(a, b):
print (a+b)
"""
Explanation: Interact basics
Wri... |
radu941208/DeepLearning | Convolutional_Neural_Network/Convolution+model+-+Application+-+v1.ipynb | mit | import math
import numpy as np
import h5py
import matplotlib.pyplot as plt
import scipy
from PIL import Image
from scipy import ndimage
import tensorflow as tf
from tensorflow.python.framework import ops
from cnn_utils import *
%matplotlib inline
np.random.seed(1)
"""
Explanation: Convolutional Neural Networks: Appli... |
yl565/statsmodels | examples/notebooks/generic_mle.ipynb | bsd-3-clause | from __future__ import print_function
import numpy as np
from scipy import stats
import statsmodels.api as sm
from statsmodels.base.model import GenericLikelihoodModel
"""
Explanation: Maximum Likelihood Estimation (Generic models)
This tutorial explains how to quickly implement new maximum likelihood models in statsm... |
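The statsmodels tutorial row above is truncated before its example. The core of a `GenericLikelihoodModel` subclass is a user-supplied (negative) log-likelihood; as a stdlib-only illustration of that idea for i.i.d. normal data — the toy data and names here are introduced for illustration, not taken from the tutorial — one could write:

```python
import math
import random

# Toy data; in the tutorial this would come from a statsmodels dataset.
rng = random.Random(42)
data = [rng.gauss(2.0, 1.0) for _ in range(500)]

def neg_loglike(mu, sigma, xs):
    """Negative log-likelihood of i.i.d. N(mu, sigma**2) data -- the kind
    of function a GenericLikelihoodModel subclass must supply."""
    n = len(xs)
    ss = sum((x - mu) ** 2 for x in xs)
    return n * math.log(sigma) + 0.5 * n * math.log(2 * math.pi) + ss / (2 * sigma ** 2)

# Closed-form maximum-likelihood estimates for the normal, to check against:
mu_hat = sum(data) / len(data)
sigma_hat = math.sqrt(sum((x - mu_hat) ** 2 for x in data) / len(data))
```

statsmodels would instead minimize `neg_loglike` numerically via the model's `fit()` method, which also yields standard errors from the Hessian.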