Dataset schema (one row per notebook cell): markdown (string, 0–1.02M chars), code (string, 0–832k chars), output (string, 0–1.02M chars), license (string, 3–36 chars), path (string, 6–265 chars), repo_name (string, 6–127 chars)
Set up TMVA
TMVA.Tools.Instance()
_____no_output_____
CC-BY-4.0
MVA/TMVA_tutorial_classification_tmva_app.py.nbconvert.ipynb
LailinXu/hepstat-tutorial
Reader. One reader for each application.
reader = TMVA.Reader("Color:!Silent")
reader_S = TMVA.Reader("Color:!Silent")
reader_B = TMVA.Reader("Color:!Silent")
_____no_output_____
CC-BY-4.0
MVA/TMVA_tutorial_classification_tmva_app.py.nbconvert.ipynb
LailinXu/hepstat-tutorial
Inputs: load data (an unknown sample)
trfile = "Zp2TeV_ttbar.root"
data = TFile.Open(trfile)
tree = data.Get('tree')
_____no_output_____
CC-BY-4.0
MVA/TMVA_tutorial_classification_tmva_app.py.nbconvert.ipynb
LailinXu/hepstat-tutorial
Known signal
trfile_S = "Zp1TeV_ttbar.root"
data_S = TFile.Open(trfile_S)
tree_S = data_S.Get('tree')
_____no_output_____
CC-BY-4.0
MVA/TMVA_tutorial_classification_tmva_app.py.nbconvert.ipynb
LailinXu/hepstat-tutorial
Known background
trfile_B = "SM_ttbar.root"
data_B = TFile.Open(trfile_B)
tree_B = data_B.Get('tree')
_____no_output_____
CC-BY-4.0
MVA/TMVA_tutorial_classification_tmva_app.py.nbconvert.ipynb
LailinXu/hepstat-tutorial
Set input variables. Do this for each reader
branches = {} for branch in tree.GetListOfBranches(): branchName = branch.GetName() branches[branchName] = array('f', [-999]) tree.SetBranchAddress(branchName, branches[branchName]) if branchName not in ["mtt_truth", "weight", "nlep", "njets"]: reader.AddVariable(branchName, branches[branchName]...
_____no_output_____
CC-BY-4.0
MVA/TMVA_tutorial_classification_tmva_app.py.nbconvert.ipynb
LailinXu/hepstat-tutorial
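The branch buffers registered with `SetBranchAddress` above are plain stdlib `array` objects that ROOT fills in place. A ROOT-free sketch of this shared-buffer pattern (the branch names here are hypothetical placeholders):

```python
from array import array

# Fixed-size float32 buffers, like those handed to ROOT's SetBranchAddress.
# Branch names below are hypothetical placeholders.
branches = {}
for name in ["mtt", "pt_lep", "njets"]:
    branches[name] = array('f', [-999])

# When a tree entry is loaded, ROOT would overwrite the buffer in place;
# any reader holding a reference to the same array object sees the new value.
branches["mtt"][0] = 1250.0
```

Because `reader.AddVariable` stores a reference to the same buffer, updating it once per entry updates what every booked method evaluates.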
Book method(s): BDT
methodName1 = "BDT"
weightfile = 'dataset/weights/TMVAClassification_{0}.weights.xml'.format(methodName1)
reader.BookMVA(methodName1, weightfile)
reader_S.BookMVA(methodName1, weightfile)
reader_B.BookMVA(methodName1, weightfile)
_____no_output_____
CC-BY-4.0
MVA/TMVA_tutorial_classification_tmva_app.py.nbconvert.ipynb
LailinXu/hepstat-tutorial
BDTG
methodName2 = "BDTG"
weightfile = 'dataset/weights/TMVAClassification_{0}.weights.xml'.format(methodName2)
reader.BookMVA(methodName2, weightfile)
reader_S.BookMVA(methodName2, weightfile)
reader_B.BookMVA(methodName2, weightfile)
_____no_output_____
CC-BY-4.0
MVA/TMVA_tutorial_classification_tmva_app.py.nbconvert.ipynb
LailinXu/hepstat-tutorial
Loop events for evaluation: book histograms
nbins, xmin, xmax = 20, -1, 1
_____no_output_____
CC-BY-4.0
MVA/TMVA_tutorial_classification_tmva_app.py.nbconvert.ipynb
LailinXu/hepstat-tutorial
Signal
tag = "S" hname="BDT_{0}".format(tag) h1 = TH1F(hname, hname, nbins, xmin, xmax) h1.Sumw2() hname="BDTG_{0}".format(tag) h2 = TH1F(hname, hname, nbins, xmin, xmax) h2.Sumw2() nevents = tree_S.GetEntries() for i in range(nevents): tree_S.GetEntry(i) BDT = reader_S.EvaluateMVA(methodName1) BDTG = reader_S.Evaluat...
_____no_output_____
CC-BY-4.0
MVA/TMVA_tutorial_classification_tmva_app.py.nbconvert.ipynb
LailinXu/hepstat-tutorial
Background
tag = "B" hname="BDT_{0}".format(tag) h3 = TH1F(hname, hname, nbins, xmin, xmax) h3.Sumw2() hname="BDTG_{0}".format(tag) h4 = TH1F(hname, hname, nbins, xmin, xmax) h4.Sumw2() nevents = tree_B.GetEntries() for i in range(nevents): tree_B.GetEntry(i) BDT = reader_B.EvaluateMVA(methodName1) BDTG = reader_B.Evaluat...
_____no_output_____
CC-BY-4.0
MVA/TMVA_tutorial_classification_tmva_app.py.nbconvert.ipynb
LailinXu/hepstat-tutorial
New sample
tag = "N" hname="BDT_{0}".format(tag) h5 = TH1F(hname, hname, nbins, xmin, xmax) h5.Sumw2() hname="BDTG_{0}".format(tag) h6 = TH1F(hname, hname, nbins, xmin, xmax) h6.Sumw2() nevents = tree.GetEntries() for i in range(nevents): tree.GetEntry(i) BDT = reader.EvaluateMVA(methodName1) BDTG = reader.EvaluateMVA(met...
_____no_output_____
CC-BY-4.0
MVA/TMVA_tutorial_classification_tmva_app.py.nbconvert.ipynb
LailinXu/hepstat-tutorial
Helper function to normalize hists
def norm_hists(h):
    h_new = h.Clone()
    hname = h.GetName() + "_normalized"
    h_new.SetName(hname)
    h_new.SetTitle(hname)
    ntot = h.Integral()
    if ntot != 0:
        h_new.Scale(1./ntot)
    return h_new
_____no_output_____
CC-BY-4.0
MVA/TMVA_tutorial_classification_tmva_app.py.nbconvert.ipynb
LailinXu/hepstat-tutorial
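The normalization in `norm_hists` divides every bin by the integral; the same logic, sketched without ROOT on a plain list of bin contents:

```python
def norm_values(values):
    """Scale values so they sum to 1; return a copy unchanged if the total is 0."""
    total = sum(values)
    if total != 0:
        return [v / total for v in values]
    return list(values)

normed = norm_values([2.0, 3.0, 5.0])  # sums to 1 afterwards
```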
Plotting
myc = TCanvas("c", "c", 800, 600)
myc.SetFillColor(0)
myc.cd()
_____no_output_____
CC-BY-4.0
MVA/TMVA_tutorial_classification_tmva_app.py.nbconvert.ipynb
LailinXu/hepstat-tutorial
Compare the performance for BDT
nh1 = norm_hists(h1) nh1.GetXaxis().SetTitle("BDT") nh1.GetYaxis().SetTitle("A.U.") nh1.Draw("hist") nh3 = norm_hists(h3) nh3.SetLineColor(2) nh3.SetMarkerColor(2) nh3.Draw("same hist") nh5 = norm_hists(h5) nh5.SetLineColor(4) nh5.SetMarkerColor(4) nh5.Draw("same") ymin = 0 ymax = max(nh1.GetMaximum(), nh3.GetMaximum(...
_____no_output_____
CC-BY-4.0
MVA/TMVA_tutorial_classification_tmva_app.py.nbconvert.ipynb
LailinXu/hepstat-tutorial
Draw legends
lIy = 0.92
lg = TLegend(0.60, lIy-0.25, 0.85, lIy)
lg.SetBorderSize(0)
lg.SetFillStyle(0)
lg.SetTextFont(42)
lg.SetTextSize(0.04)
lg.AddEntry(nh1, "Signal 1 TeV", "l")
lg.AddEntry(nh3, "Background", "l")
lg.AddEntry(nh5, "Signal 2 TeV", "l")
lg.Draw()
myc.Draw()
myc.SaveAs("TMVA_tutorial_cla_app_1.png")
Info in <TCanvas::Print>: png file TMVA_tutorial_cla_app_1.png has been created
CC-BY-4.0
MVA/TMVA_tutorial_classification_tmva_app.py.nbconvert.ipynb
LailinXu/hepstat-tutorial
Compare the performance for BDTG
nh1 = norm_hists(h2) nh1.GetXaxis().SetTitle("BDTG") nh1.GetYaxis().SetTitle("A.U.") nh1.Draw("hist") nh3 = norm_hists(h4) nh3.SetLineColor(2) nh3.SetMarkerColor(2) nh3.Draw("same hist") nh5 = norm_hists(h6) nh5.SetLineColor(4) nh5.SetMarkerColor(4) nh5.Draw("same") ymin = 0 ymax = max(nh1.GetMaximum(), nh3.GetMaximum...
_____no_output_____
CC-BY-4.0
MVA/TMVA_tutorial_classification_tmva_app.py.nbconvert.ipynb
LailinXu/hepstat-tutorial
Draw legends
lIy = 0.92
lg = TLegend(0.60, lIy-0.25, 0.85, lIy)
lg.SetBorderSize(0)
lg.SetFillStyle(0)
lg.SetTextFont(42)
lg.SetTextSize(0.04)
lg.AddEntry(nh1, "Signal 1 TeV", "l")
lg.AddEntry(nh3, "Background", "l")
lg.AddEntry(nh5, "Signal 2 TeV", "l")
lg.Draw()
myc.Draw()
myc.SaveAs("TMVA_tutorial_cla_app_2.png")
Info in <TCanvas::Print>: png file TMVA_tutorial_cla_app_2.png has been created
CC-BY-4.0
MVA/TMVA_tutorial_classification_tmva_app.py.nbconvert.ipynb
LailinXu/hepstat-tutorial
Draw all canvases
from ROOT import gROOT
gROOT.GetListOfCanvases().Draw()
_____no_output_____
CC-BY-4.0
MVA/TMVA_tutorial_classification_tmva_app.py.nbconvert.ipynb
LailinXu/hepstat-tutorial
Data Processing
%pylab inline matplotlib.rcParams['figure.figsize'] = [20, 10] import pandas as pd import numpy as np import warnings warnings.filterwarnings("ignore") # All variables we concern about columnNames1 = ["releaseNum", "1968ID", "personNumber", "gender", "marriage", "familyNumber", "sequenceNum", "relationT...
_____no_output_____
MIT
20201120/20201116/empirical/.ipynb_checkpoints/DataProcessing-checkpoint.ipynb
dongxulee/lifeCycle
Individual Data
Idf = compile_data_with_features(["1968ID", "personNumber", "familyNumber","gender", "marriage", "age", 'employmentStatus', "education", "relationToHead"], years) Idf["ID"] = Idf["1968ID"]* 1000 + Idf["personNumber"] # pick out the head in the individual df_head = Idf[Idf["relationToH...
_____no_output_____
MIT
20201120/20201116/empirical/.ipynb_checkpoints/DataProcessing-checkpoint.ipynb
dongxulee/lifeCycle
Family Data
# prepare the combined dataset and set up dummy variables for qualitative data df = Fcompile_data_with_features(['composition', 'headCount', 'ageHead', 'maritalStatus', 'employmentStatus', 'liquidWealth', 'race', 'industry' ,'geoCode','incomeHead', "incomeWife", ...
_____no_output_____
MIT
20201120/20201116/empirical/.ipynb_checkpoints/DataProcessing-checkpoint.ipynb
dongxulee/lifeCycle
Link Family Data with Individual Head Data
completeFamilyData = [] for individual in completeIndividualData: idf = pd.DataFrame() for i in range(len(individual)): idf = pd.concat([idf, Fdf[(Fdf.year == individual.iloc[i].year)& (Fdf.familyID == individual.iloc[i].familyNumber)]]) completeFamilyData.append(idf...
_____no_output_____
MIT
20201120/20201116/empirical/.ipynb_checkpoints/DataProcessing-checkpoint.ipynb
dongxulee/lifeCycle
Individual plot
def inFeaturePlot(FamilyData, feature, n): plt.figure() for i in range(n[0],n[1]): FamilyData[i][feature].plot(marker='o') plt.show() def plotFeatureVsAge(FamilyData, feature, n): plt.figure() for i in range(n[0],n[1]): plt.plot(FamilyData[i].ageHead, FamilyData[i][feature], marker ...
_____no_output_____
MIT
20201120/20201116/empirical/.ipynb_checkpoints/DataProcessing-checkpoint.ipynb
dongxulee/lifeCycle
Average variable plot
def plotFeature(FamilyData, feature): df = FamilyData[0][feature] * 0 for i in range(len(FamilyData)): df = df + FamilyData[i][feature] df = df/len(FamilyData) df.plot(marker='o') print(df) # laborIncome plotFeature(FamilyData, "laborIncome") # laborIncome plotFeature(FamilyData, "investment...
year
1999    31462.509377
2001    34869.478459
2003    31720.994932
2005    40220.458186
2007    50619.510897
2009    41815.880892
2011    62674.657881
2013    65190.544349
2015    78211.521034
2017    84070.374050
Name: annuityIRA, dtype: float64
MIT
20201120/20201116/empirical/.ipynb_checkpoints/DataProcessing-checkpoint.ipynb
dongxulee/lifeCycle
Compare The Distribution Over Age
df = Fdf[(Fdf["ageHead"]>=20) & (Fdf["ageHead"]<=80)] df[['liquidWealth', 'laborIncome', 'costPerPerson', 'totalExpense','investmentAmount', 'annuityIRA', 'wealthWithoutHomeEquity', 'wealthWithHomeEquity']] = df[['liquidWealth', 'laborIncome', 'costPerPerson', 'totalExpense','investmentAmount', 'annuityIRA', ...
_____no_output_____
MIT
20201120/20201116/empirical/.ipynb_checkpoints/DataProcessing-checkpoint.ipynb
dongxulee/lifeCycle
Data
path = Path('/content/drive/My Drive/Archieve/ValueLabs'); path.ls() train_df = pd.read_csv(path/'Train.csv'); train_df.head() bs = 24
_____no_output_____
MIT
Colab Notebooks/competition/valuelabs_ml_hiring_challenge.ipynb
ankschoubey/notes
LM
data_lm = (TextList .from_df(train_df,cols=['question', 'answer_text', 'distractor']) .split_by_rand_pct(0.1) .label_for_lm() .databunch(bs=bs) ) data_lm.save(path/'lm.pkl') data_lm = load_data(path, 'lm.pkl', bs=bs) data_lm.show_batch() learn = language_model_lear...
_____no_output_____
MIT
Colab Notebooks/competition/valuelabs_ml_hiring_challenge.ipynb
ankschoubey/notes
RadarCOVID-Report Data Extraction
import datetime import json import logging import os import shutil import tempfile import textwrap import uuid import matplotlib.pyplot as plt import matplotlib.ticker import numpy as np import pandas as pd import retry import seaborn as sns %matplotlib inline current_working_directory = os.environ.get("PWD") if curr...
_____no_output_____
Apache-2.0
Notebooks/RadarCOVID-Report/Daily/RadarCOVID-Report-2020-11-07.ipynb
pvieito/Radar-STATS
Constants
from Modules.ExposureNotification import exposure_notification_io spain_region_country_code = "ES" germany_region_country_code = "DE" default_backend_identifier = spain_region_country_code backend_generation_days = 7 * 2 daily_summary_days = 7 * 4 * 3 daily_plot_days = 7 * 4 tek_dumps_load_limit = daily_summary_days...
_____no_output_____
Apache-2.0
Notebooks/RadarCOVID-Report/Daily/RadarCOVID-Report-2020-11-07.ipynb
pvieito/Radar-STATS
Parameters
environment_backend_identifier = os.environ.get("RADARCOVID_REPORT__BACKEND_IDENTIFIER") if environment_backend_identifier: report_backend_identifier = environment_backend_identifier else: report_backend_identifier = default_backend_identifier report_backend_identifier environment_enable_multi_backend_download ...
_____no_output_____
Apache-2.0
Notebooks/RadarCOVID-Report/Daily/RadarCOVID-Report-2020-11-07.ipynb
pvieito/Radar-STATS
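The parameter cell follows a standard environment-variable override pattern: read the variable, fall back to the default constant if it is unset or empty. A minimal sketch (the default value is taken from the constants cell above):

```python
import os

def resolve_backend_identifier(default="ES"):
    # Prefer the environment variable; fall back to the default when
    # it is unset or empty.
    value = os.environ.get("RADARCOVID_REPORT__BACKEND_IDENTIFIER")
    return value if value else default

os.environ["RADARCOVID_REPORT__BACKEND_IDENTIFIER"] = "DE"
resolved = resolve_backend_identifier()
```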
COVID-19 Cases
report_backend_client = \ exposure_notification_io.get_backend_client_with_identifier( backend_identifier=report_backend_identifier) @retry.retry(tries=10, delay=10, backoff=1.1, jitter=(0, 10)) def download_cases_dataframe_from_ecdc(): return pd.read_csv( "https://opendata.ecdc.europa.eu/covid1...
_____no_output_____
Apache-2.0
Notebooks/RadarCOVID-Report/Daily/RadarCOVID-Report-2020-11-07.ipynb
pvieito/Radar-STATS
Extract API TEKs
raw_zip_path_prefix = "Data/TEKs/Raw/" fail_on_error_backend_identifiers = [report_backend_identifier] multi_backend_exposure_keys_df = \ exposure_notification_io.download_exposure_keys_from_backends( backend_identifiers=report_backend_identifiers, generation_days=backend_generation_days, fa...
/opt/hostedtoolcache/Python/3.8.6/x64/lib/python3.8/site-packages/pandas/core/frame.py:4110: SettingWithCopyWarning: A value is trying to be set on a copy of a slice from a DataFrame See the caveats in the documentation: http://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-co...
Apache-2.0
Notebooks/RadarCOVID-Report/Daily/RadarCOVID-Report-2020-11-07.ipynb
pvieito/Radar-STATS
Dump API TEKs
tek_list_df = multi_backend_exposure_keys_df[ ["sample_date_string", "region", "key_data"]].copy() tek_list_df["key_data"] = tek_list_df["key_data"].apply(str) tek_list_df.rename(columns={ "sample_date_string": "sample_date", "key_data": "tek_list"}, inplace=True) tek_list_df = tek_list_df.groupby( ["sa...
_____no_output_____
Apache-2.0
Notebooks/RadarCOVID-Report/Daily/RadarCOVID-Report-2020-11-07.ipynb
pvieito/Radar-STATS
Load TEK Dumps
import glob def load_extracted_teks(mode, region=None, limit=None) -> pd.DataFrame: extracted_teks_df = pd.DataFrame(columns=["region"]) file_paths = list(reversed(sorted(glob.glob(tek_list_path_prefix + mode + "/RadarCOVID-TEKs-*.json")))) if limit: file_paths = file_paths[:limit] for file_pat...
_____no_output_____
Apache-2.0
Notebooks/RadarCOVID-Report/Daily/RadarCOVID-Report-2020-11-07.ipynb
pvieito/Radar-STATS
Daily New TEKs
tek_list_df = daily_extracted_teks_df.groupby("extraction_date").tek_list.apply( lambda x: set(sum(x, []))).reset_index() tek_list_df = tek_list_df.set_index("extraction_date").sort_index(ascending=True) tek_list_df.head() def compute_teks_by_generation_and_upload_date(date): day_new_teks_set_df = tek_list_df.c...
_____no_output_____
Apache-2.0
Notebooks/RadarCOVID-Report/Daily/RadarCOVID-Report-2020-11-07.ipynb
pvieito/Radar-STATS
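The `set(sum(x, []))` idiom in the `groupby` above flattens the per-extraction lists of TEKs and deduplicates them. A small sketch; note that `sum()` on lists is quadratic in the total length, so `itertools.chain` is the usual linear-time equivalent:

```python
from itertools import chain

tek_lists = [["a", "b"], ["b", "c"], ["c", "d"]]  # toy TEK lists

via_sum = set(sum(tek_lists, []))                # the notebook's idiom
via_chain = set(chain.from_iterable(tek_lists))  # linear-time equivalent
```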
Hourly New TEKs
hourly_extracted_teks_df = load_extracted_teks( mode="Hourly", region=report_backend_identifier, limit=25) hourly_extracted_teks_df.head() hourly_new_tek_count_df = hourly_extracted_teks_df \ .groupby("extraction_date_with_hour").tek_list. \ apply(lambda x: set(sum(x, []))).reset_index().copy() hourly_new_t...
_____no_output_____
Apache-2.0
Notebooks/RadarCOVID-Report/Daily/RadarCOVID-Report-2020-11-07.ipynb
pvieito/Radar-STATS
Data Merge
result_summary_df = exposure_keys_summary_df.merge( new_tek_df, on=["sample_date_string"], how="outer") result_summary_df.head() result_summary_df = result_summary_df.merge( shared_teks_uploaded_on_generation_date_df, on=["sample_date_string"], how="outer") result_summary_df.head() result_summary_df = result_su...
_____no_output_____
Apache-2.0
Notebooks/RadarCOVID-Report/Daily/RadarCOVID-Report-2020-11-07.ipynb
pvieito/Radar-STATS
Report Results
display_column_name_mapping = { "sample_date": "Sample\u00A0Date\u00A0(UTC)", "source_regions": "Source Countries", "datetime_utc": "Timestamp (UTC)", "upload_date": "Upload Date (UTC)", "generation_to_upload_days": "Generation to Upload Period in Days", "region": "Backend", "region_x": "Bac...
_____no_output_____
Apache-2.0
Notebooks/RadarCOVID-Report/Daily/RadarCOVID-Report-2020-11-07.ipynb
pvieito/Radar-STATS
Daily Summary Table
result_summary_df_ = result_summary_df.copy()
result_summary_df = result_summary_df[summary_columns]
result_summary_with_display_names_df = result_summary_df \
    .rename_axis(index=display_column_name_mapping) \
    .rename(columns=display_column_name_mapping)
result_summary_with_display_names_df
_____no_output_____
Apache-2.0
Notebooks/RadarCOVID-Report/Daily/RadarCOVID-Report-2020-11-07.ipynb
pvieito/Radar-STATS
Daily Summary Plots
result_plot_summary_df = result_summary_df.head(daily_plot_days)[summary_columns] \ .droplevel(level=["source_regions"]) \ .rename_axis(index=display_column_name_mapping) \ .rename(columns=display_column_name_mapping) summary_ax_list = result_plot_summary_df.sort_index(ascending=True).plot.bar( title=f"...
_____no_output_____
Apache-2.0
Notebooks/RadarCOVID-Report/Daily/RadarCOVID-Report-2020-11-07.ipynb
pvieito/Radar-STATS
Daily Generation to Upload Period Table
display_generation_to_upload_period_pivot_df = \ generation_to_upload_period_pivot_df \ .head(backend_generation_days) display_generation_to_upload_period_pivot_df \ .head(backend_generation_days) \ .rename_axis(columns=display_column_name_mapping) \ .rename_axis(index=display_column_name_mappin...
_____no_output_____
Apache-2.0
Notebooks/RadarCOVID-Report/Daily/RadarCOVID-Report-2020-11-07.ipynb
pvieito/Radar-STATS
Hourly Summary Plots
hourly_summary_ax_list = hourly_summary_df \ .rename_axis(index=display_column_name_mapping) \ .rename(columns=display_column_name_mapping) \ .plot.bar( title=f"Last 24h Summary", rot=45, subplots=True, legend=False) ax_ = hourly_summary_ax_list[-1] ax_.get_figure().tight_layout() ax_.get_fi...
_____no_output_____
Apache-2.0
Notebooks/RadarCOVID-Report/Daily/RadarCOVID-Report-2020-11-07.ipynb
pvieito/Radar-STATS
Publish Results
def get_temporary_image_path() -> str: return os.path.join(tempfile.gettempdir(), str(uuid.uuid4()) + ".png") def save_temporary_plot_image(ax): if isinstance(ax, np.ndarray): ax = ax[0] media_path = get_temporary_image_path() ax.get_figure().savefig(media_path) return media_path def save_...
_____no_output_____
Apache-2.0
Notebooks/RadarCOVID-Report/Daily/RadarCOVID-Report-2020-11-07.ipynb
pvieito/Radar-STATS
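The `get_temporary_image_path` helper shown above is self-contained: a UUID-based file name in the system temp directory gives each saved plot a fresh, collision-free path.

```python
import os
import tempfile
import uuid

def get_temporary_image_path() -> str:
    # A collision-free scratch .png path in the system temp directory.
    return os.path.join(tempfile.gettempdir(), str(uuid.uuid4()) + ".png")

path_a = get_temporary_image_path()
path_b = get_temporary_image_path()  # always distinct from path_a
```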
Save Results
report_resources_path_prefix = "Data/Resources/Current/RadarCOVID-Report-" result_summary_df.to_csv( report_resources_path_prefix + "Summary-Table.csv") result_summary_df.to_html( report_resources_path_prefix + "Summary-Table.html") hourly_summary_df.to_csv( report_resources_path_prefix + "Hourly-Summary-Ta...
_____no_output_____
Apache-2.0
Notebooks/RadarCOVID-Report/Daily/RadarCOVID-Report-2020-11-07.ipynb
pvieito/Radar-STATS
Publish Results as JSON
summary_results_api_df = result_summary_df.reset_index() summary_results_api_df["sample_date_string"] = \ summary_results_api_df["sample_date"].dt.strftime("%Y-%m-%d") summary_results_api_df["source_regions"] = \ summary_results_api_df["source_regions"].apply(lambda x: x.split(",")) today_summary_results_api_d...
_____no_output_____
Apache-2.0
Notebooks/RadarCOVID-Report/Daily/RadarCOVID-Report-2020-11-07.ipynb
pvieito/Radar-STATS
Publish on README
with open("Data/Templates/README.md", "r") as f: readme_contents = f.read() readme_contents = readme_contents.format( extraction_date_with_hour=extraction_date_with_hour, github_project_base_url=github_project_base_url, daily_summary_table_html=daily_summary_table_html, multi_backend_summary_table_...
_____no_output_____
Apache-2.0
Notebooks/RadarCOVID-Report/Daily/RadarCOVID-Report-2020-11-07.ipynb
pvieito/Radar-STATS
Publish on Twitter
enable_share_to_twitter = os.environ.get("RADARCOVID_REPORT__ENABLE_PUBLISH_ON_TWITTER") github_event_name = os.environ.get("GITHUB_EVENT_NAME") if enable_share_to_twitter and github_event_name == "schedule" and \ (shared_teks_by_upload_date_last_hour or not are_today_results_partial): import tweepy t...
_____no_output_____
Apache-2.0
Notebooks/RadarCOVID-Report/Daily/RadarCOVID-Report-2020-11-07.ipynb
pvieito/Radar-STATS
Custom conversions. Here we show how custom conversions can be passed to OpenSCM-Units' `ScmUnitRegistry`.
# NBVAL_IGNORE_OUTPUT
import traceback
import pandas as pd
from openscm_units import ScmUnitRegistry
_____no_output_____
BSD-3-Clause
notebooks/custom-conversions.ipynb
openscm/openscm-units
Custom conversions DataFrame. On initialisation, a `pd.DataFrame` can be provided which contains the custom conversions. This `pd.DataFrame` should be formatted as shown below, with an index that contains the different species and columns which contain the conversion for different metrics.
metric_conversions_custom = pd.DataFrame([
    {
        "Species": "CH4",
        "Custom1": 20,
        "Custom2": 25,
    },
    {
        "Species": "N2O",
        "Custom1": 341,
        "Custom2": 300,
    },
]).set_index("Species")
metric_conversions_custom
_____no_output_____
BSD-3-Clause
notebooks/custom-conversions.ipynb
openscm/openscm-units
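Before handing the table to `ScmUnitRegistry`, its shape can be sanity-checked with pandas alone: the index holds species and each column holds one conversion factor per custom metric.

```python
import pandas as pd

# Rebuild the custom-conversions table from the cell above.
metric_conversions_custom = pd.DataFrame([
    {"Species": "CH4", "Custom1": 20, "Custom2": 25},
    {"Species": "N2O", "Custom1": 341, "Custom2": 300},
]).set_index("Species")

# Factors are read as .loc[species, metric].
ch4_custom1 = metric_conversions_custom.loc["CH4", "Custom1"]
```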
With such a `pd.DataFrame`, we can use custom conversions in our unit registry as shown.
# initialise the unit registry with custom conversions unit_registry = ScmUnitRegistry(metric_conversions=metric_conversions_custom) # add standard conversions before moving on unit_registry.add_standards() # start with e.g. N2O nitrous_oxide = unit_registry("N2O") display(f"N2O: {nitrous_oxide}") # our unit registry...
_____no_output_____
BSD-3-Clause
notebooks/custom-conversions.ipynb
openscm/openscm-units
It is interesting that the loss of high-frequency information had almost no effect on the SVM classifier, and that shot noise was more harmful to the vRNN than to the SNN methods. It is not surprising that additive white noise had the greatest effect on accuracy for all classification methods.
mf1 = pd.read_hdf('results/whitenoise_mag_exp_res_11_03_42.h5') mf2 = pd.read_hdf('results/whitenoise_mag_exp_res_10_35_19.h5') mf = pd.concat((mf1, mf2)) mf.columns mf_filt = mf[mf['noise magnitude'] < 0.15] ff = sns.lmplot("noise magnitude", "accuracy", hue="approach", data=mf_filt, x_estimator=np.mean, ...
_____no_output_____
MIT
plot_noise_results.ipynb
Seanny123/rnn-comparison
Table of Contents: Tower of Hanoi · Learning Outcomes · Demo: How to Play Tower of Hanoi · Definitions · Demo: How to Play Tower of Hanoi · Student Activity · Reflection · Did any group derive formula for minimum number of moves? · Questions? · Tower of Hanoi as RL problem · Tower of Hanoi Solutions · Greedy Tower of Hanoi · Tower of Hanoi Solutions · THER...
%reset -fs
from IPython.display import YouTubeVideo

# 3 rings
YouTubeVideo('S4HOSbrS4bY')
# 6 rings
YouTubeVideo('iFV821yY7Ns')
_____no_output_____
Apache-2.0
01_rl_introduction__markov_decision_process/2_tower_of_hanoi_intro.ipynb
loftiskg/rl-course
Did any group derive a formula for the minimum number of moves?
# Calculate the optimal number of moves
print(f"{'# disks':>7} | {'# moves':>10}")
for n_disks in range(1, 21):
    n_moves = (2 ** n_disks) - 1
    print(f"{n_disks:>7} {n_moves:>10,}")
# disks |    # moves
      1          1
      2          3
      3          7
      4         15
      5         31
      6         63
      7        127
      8        255
      9        511
     10      1,023
     11      2,047
     12      4,095
     13      8,191
     14     16,383
     15     32,767
     16     65...
Apache-2.0
01_rl_introduction__markov_decision_process/2_tower_of_hanoi_intro.ipynb
loftiskg/rl-course
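The `(2 ** n_disks) - 1` closed form comes from the recurrence T(n) = 2*T(n-1) + 1 with T(1) = 1: move n-1 disks aside, move the largest disk, then move the n-1 disks back on top. A short check that the recurrence and the closed form agree:

```python
def hanoi_moves(n_disks: int) -> int:
    """Minimum number of moves via the recurrence T(n) = 2*T(n-1) + 1, T(0) = 0."""
    moves = 0
    for _ in range(n_disks):
        moves = 2 * moves + 1
    return moves
```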
Auto detection to main + 4 cropped images

**Pipeline:**
1. Load cropped image csv file
2. Apply prediction
3. Save prediction result back to csv file
   * pred_value
   * pred_cat
   * pred_bbox
# Import libraries %matplotlib inline from pycocotools.coco import COCO from keras.models import load_model # from utils.utils import * # from utils.bbox import * # from utils.image import load_image_pixels from keras.preprocessing.image import load_img from keras.preprocessing.image import img_to_array import numpy as...
_____no_output_____
MIT
thesis_code/auto_detection.ipynb
hhodac/keras-yolo3
Utilities
class BoundBox: def __init__(self, xmin, ymin, xmax, ymax, objness = None, classes = None): self.xmin = xmin self.ymin = ymin self.xmax = xmax self.ymax = ymax self.objness = objness self.classes = classes self.label = -1 self.score = -1 def get_l...
_____no_output_____
MIT
thesis_code/auto_detection.ipynb
hhodac/keras-yolo3
Load model
# load yolov3 model model = load_model('yolov3_model.h5') # define the expected input shape for the model input_w, input_h = 416, 416 # define the anchors anchors = [[116,90, 156,198, 373,326], [30,61, 62,45, 59,119], [10,13, 16,30, 33,23]] # define the probability threshold for detected objects class_threshold = 0.6 #...
WARNING:tensorflow:From /Users/haiho/PycharmProjects/yolov3_huynhngocanh/venv/lib/python3.5/site-packages/tensorflow_core/python/ops/resource_variable_ops.py:1630: calling BaseResourceVariable.__init__ (from tensorflow.python.ops.resource_variable_ops) with constraint is deprecated and will be removed in a future versi...
MIT
thesis_code/auto_detection.ipynb
hhodac/keras-yolo3
Gather & concatenate all csv files
all_files = [] cat = 'book' for subdir, dirs, files in os.walk(os.path.join(imageDir,cat)): for filename in files: filepath = subdir + os.sep + filename if filepath.endswith(".csv"): all_files.append(filepath) print(filepath) li = [] for filename in all_files: df = pd.rea...
_____no_output_____
MIT
thesis_code/auto_detection.ipynb
hhodac/keras-yolo3
Apply prediction to multiple images
df_pred = pd.DataFrame(columns=['pred','pred_cat','pred_bbox']) iou_threshold = 0.5 for idx, item in df_images.iterrows(): file_path = os.path.join(item['path'], item['filename']) image, image_w, image_h = load_image_pixels(file_path, (input_w, input_h)) yhat = model.predict(image) boxes = list() fo...
_____no_output_____
MIT
thesis_code/auto_detection.ipynb
hhodac/keras-yolo3
Generate speaker labels from max raw audio magnitudes
data_dir = '/Users/cgn/Dropbox (Facebook)/EGOCOM/raw_audio/wav/' fn_dict = {} for fn in sorted(os.listdir(data_dir)): key = fn[9:23] + fn[32:37] if 'part' in fn else fn[9:21] fn_dict[key] = fn_dict[key] + [fn] if key in fn_dict else [fn] samplerate = 44100 window = 1 # Averages signals with windows of N second...
_____no_output_____
MIT
paper_experiments_work_log/speaker_recognition.ipynb
cgnorthcutt/EgoCom-Dataset
Generate ground truth speaker labels
def create_gt_speaker_labels( df_times_speaker, duration_in_seconds, time_window_seconds = 0.5, ): stack = rev_times[::-1] stack_time = stack.pop() label_times = np.arange(0, duration_in_seconds, time_window_seconds) result = [-1] * len(label_times) for i, t in enumerate(label_times): ...
day_1__con_1__part1 | day_1__con_1__part2 | day_1__con_1__part3 | day_1__con_1__part4 | day_1__con_1__part5 | day_1__con_2__part1 | day_1__con_2__part2 | day_1__con_2__part3 | day_1__con_2__part4 | day_1__con_2__part5 | day_1__con_3__part1 | day_1__con_3__part2 | day_1__con_3__part3 | day_1__con_3__part4 | day_1__con_4...
MIT
paper_experiments_work_log/speaker_recognition.ipynb
cgnorthcutt/EgoCom-Dataset
Generate subtitles
for key in labels.keys(): gt = labels[key] with open("subtitles/est_" + key + '.srt', 'w') as f: for t, s in enumerate(gt): print(t + 1, file = f) print(async_srt_format_timestamp(t*window), end = "", file = f) print(' --> ', end = '', file = f) print(asyn...
_____no_output_____
MIT
paper_experiments_work_log/speaker_recognition.ipynb
cgnorthcutt/EgoCom-Dataset
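The `async_srt_format_timestamp` helper is not shown in this excerpt; a plausible implementation of the SRT timestamp format it appears to produce (`HH:MM:SS,mmm`) might look like this (the name and exact behavior are assumptions):

```python
def srt_format_timestamp(seconds: float) -> str:
    # Hypothetical stand-in for the notebook's async_srt_format_timestamp:
    # format a time offset as an SRT timestamp, HH:MM:SS,mmm.
    millis = int(round(seconds * 1000))
    hours, rem = divmod(millis, 3_600_000)
    minutes, rem = divmod(rem, 60_000)
    secs, millis = divmod(rem, 1000)
    return f"{hours:02d}:{minutes:02d}:{secs:02d},{millis:03d}"
```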
Cannibals and missionaries, solved by breadth-first search.
from copy import deepcopy from collections import deque import sys # (m, c, b) hace referencia a el número de misioneros, canibales y el bote class Estado(object): def __init__(self, misioneros, canibales, bote): self.misioneros = misioneros self.canibales = canibales self.bote = bote #se establecen lo...
0 misioneros y 2 canibales Ida. < Estado (3, 1, 0) > 0 misioneros y 1 canibales Vuelta. < Estado (3, 2, 1) > 0 misioneros y 2 canibales Ida. < Estado (3, 0, 0) > 0 misioneros y 1 canibales Vuelta. < Estado (3, 1, 1) > 2 misioneros y 0 canibales Ida. < Estado (1, 1, 0) > 1 misioneros y 1 canibales Vuelta. < Estado...
MIT
Artificial-Int.ipynb
danielordonezg/Machine-Learning-Algorithms
A Python function that takes a graph, a start node, and an end node, and returns the route from the start node to the end node using depth-first search. The Grafo and Nodo classes must be created with their respective methods and attributes. The function must return None when no route is possible.
class Enlace: #pendientes def __init__(self, a=None, b=None): self.a = a self.b = b def __eq__(self, other): return self.a == other.a and self.b == other.b def __str__(self): return "(" + str(self.a) + "," + str(self.b) + ")" def __repr__(self): return self.__str__() class Nodo: def...
La ruta final es [A, C, F] None
MIT
Artificial-Int.ipynb
danielordonezg/Machine-Learning-Algorithms
Develop a Python program that solves the 8-number sliding-puzzle problem using breadth-first search. The program must read the initial state from a file. Some configurations have no solution.
class Estados(): def __init__(self,Mat,Npadre=None): self.Npadre=Npadre self.Mat=Mat #Mat=[[0 1 2],[3 4 5],[6 7 8]] def BuscZ(self): #Buscar el 0 dentro de la matriz itera=0 Pos=[] for y,i in enumerate(self.Mat): if 0 in i: itera+=1 Pos.append(y) Pos....
_____no_output_____
MIT
Artificial-Int.ipynb
danielordonezg/Machine-Learning-Algorithms
Develop a Python program that finds the exit route in a maze represented by a matrix of 0s and 1s. A 0 means that cell can be crossed, a 1 means there is a wall in that cell, and a 2 marks the exit. The program must read the maze configuration from a file, ask for ...
#creacion de ambiente from IPython.display import display import ipywidgets as widgets import time import random # se crea el tablero class Tablero: def __init__(self, tamanoCelda=(40, 40), nCeldas=(5,5)): #dimensiones del tablero self.out = widgets.HTML() display(self.out) self.tamanoCelda = tamanoCeld...
_____no_output_____
MIT
Artificial-Int.ipynb
danielordonezg/Machine-Learning-Algorithms
Prelim Exam. Question 1: A 4 x 4 matrix whose diagonal elements are all ones (1s).
import numpy as np

A = np.array([1, 1, 1, 1])
C = np.diag(A)
print(C)
[[1 0 0 0]
 [0 1 0 0]
 [0 0 1 0]
 [0 0 0 1]]
Apache-2.0
Prelim_Exam.ipynb
MishcaGestoso/Linear-Algebra-58019
Question 2: Double all the values of each element.
import numpy as np

A = np.array([1, 1, 1, 1])
B = np.diag(A)
print(B * 2)  # double the value of every element of the diagonal matrix
[[2 0 0 0]
 [0 2 0 0]
 [0 0 2 0]
 [0 0 0 2]]
Apache-2.0
Prelim_Exam.ipynb
MishcaGestoso/Linear-Algebra-58019
Question 3: The cross product of the vectors A = [2,7,4] and B = [3,9,8].
import numpy as np

A = np.array([2, 7, 4])
B = np.array([3, 9, 8])
# Compute the cross product of A and B
cross = np.cross(A, B)
print(cross)
[20 -4 -3]
Apache-2.0
Prelim_Exam.ipynb
MishcaGestoso/Linear-Algebra-58019
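`np.cross` on 3-vectors expands to the component formula (a2*b3 - a3*b2, a3*b1 - a1*b3, a1*b2 - a2*b1); a NumPy-free check that reproduces the `[20 -4 -3]` output above:

```python
def cross3(a, b):
    """Cross product of two 3-vectors, written out component by component."""
    return (
        a[1] * b[2] - a[2] * b[1],
        a[2] * b[0] - a[0] * b[2],
        a[0] * b[1] - a[1] * b[0],
    )

result = cross3((2, 7, 4), (3, 9, 8))
```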
SkyScan Config. Make temporary changes to a running SkyScan instance. It will revert to the values in the environment file when restarted.
broker = "192.168.1.47"  # update with the IP for the Raspberry Pi

!pip install paho-mqtt
import paho.mqtt.client as mqtt
import json

client = mqtt.Client("notebook-config")
_____no_output_____
Apache-2.0
Config.ipynb
rcaudill/SkyScan
Camera Zoom. This is how much the camera is zoomed in. It is an int between 0 and 9999 (max).
client.connect(broker)
data = {}
data['cameraZoom'] = 9999  # Update this Value
json_data = json.dumps(data)
client.publish("skyscan/config/json", json_data)
_____no_output_____
Apache-2.0
Config.ipynb
rcaudill/SkyScan
Camera Delay. Float value for the number of seconds to wait after sending the camera move API command before sending the take-picture API command.
client.connect(broker)
data = {}
data['cameraDelay'] = 0.25  # Update this Value
json_data = json.dumps(data)
client.publish("skyscan/config/json", json_data)
_____no_output_____
Apache-2.0
Config.ipynb
rcaudill/SkyScan
Camera Move Speed. This is how fast the camera will move. It is an int between 0 and 99 (max).
client.connect(broker)
data = {}
data['cameraMoveSpeed'] = 99  # Update this Value
json_data = json.dumps(data)
client.publish("skyscan/config/json", json_data)
_____no_output_____
Apache-2.0
Config.ipynb
rcaudill/SkyScan
Camera Lead. This is how far ahead of the currently tracked plane the tracker should move the center point. It is a float measured in seconds (for example, 0.25), based on the plane's current heading and speed.
client.connect(broker)
data = {}
data['cameraLead'] = 0.45 # Update this Value
json_data = json.dumps(data)
client.publish("skyscan/config/json", json_data)
_____no_output_____
Apache-2.0
Config.ipynb
rcaudill/SkyScan
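To see what the lead does, here is a rough dead-reckoning sketch of projecting a plane's position `cameraLead` seconds ahead along its heading. This is an illustration only: the function name and the flat-earth formula are mine, not the tracker's actual code.

```python
import math

def lead_position(lat, lon, heading_deg, speed_mps, lead_s):
    # Distance travelled during the lead time, in metres.
    d = speed_mps * lead_s
    # Convert the north/east offsets to degrees
    # (roughly 111320 m per degree of latitude).
    dlat = d * math.cos(math.radians(heading_deg)) / 111320.0
    dlon = d * math.sin(math.radians(heading_deg)) / (111320.0 * math.cos(math.radians(lat)))
    return lat + dlat, lon + dlon

# A plane heading due north at 250 m/s, with a 0.45 s lead:
print(lead_position(38.9, -77.0, 0.0, 250.0, 0.45))
```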
Camera Bearing. This is a float used to correct the camera's heading so it better aligns with true north. It can range from -180 to 180.
client.connect(broker)
data = {}
data['cameraBearing'] = -2 # Update this Value
json_data = json.dumps(data)
client.publish("skyscan/config/json", json_data)
_____no_output_____
Apache-2.0
Config.ipynb
rcaudill/SkyScan
Minimum Elevation. The minimum elevation above the horizon at which the tracker will follow an airplane. Int value between 0 and 90 degrees.
client.connect(broker)
data = {}
data['minElevation'] = 45 # Update this Value
json_data = json.dumps(data)
client.publish("skyscan/config/json", json_data)
_____no_output_____
Apache-2.0
Config.ipynb
rcaudill/SkyScan
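For intuition on what a 45-degree cutoff means in practice, the elevation angle follows directly from the plane's altitude and horizontal distance. The helper below is my own illustration, not SkyScan code, and ignores earth curvature and refraction:

```python
import math

def elevation_deg(horizontal_dist_m, altitude_m):
    # Elevation angle above the horizon, flat-earth approximation.
    return math.degrees(math.atan2(altitude_m, horizontal_dist_m))

# A plane cruising at 10 km altitude passes a 45-degree cutoff only
# once it comes within 10 km horizontal distance of the camera.
print(elevation_deg(10_000, 10_000))  # 45.0
```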
Running and Plotting Coeval Cubes The aim of this tutorial is to introduce you to how `21cmFAST` does the most basic operations: producing single coeval cubes, and visually verifying them. It is a great place to get started with `21cmFAST`.
%matplotlib inline
import matplotlib.pyplot as plt
import os

# We change the default level of the logger so that
# we can see what's happening with caching.
import logging, sys, os
logger = logging.getLogger('21cmFAST')
logger.setLevel(logging.INFO)

import py21cmfast as p21c
# For plotting the cubes, we use the plott...
Using 21cmFAST version 3.0.2
MIT
docs/tutorials/coeval_cubes.ipynb
daviesje/21cmFAST
Clear the cache so that we get the same results for the notebook every time (don't worry about this for now). Also, set the default output directory to `_cache/`:
if not os.path.exists('_cache'):
    os.mkdir('_cache')

p21c.config['direc'] = '_cache'
cache_tools.clear_cache(direc="_cache")
2020-10-02 09:51:10,651 | INFO | Removed 0 files from cache.
MIT
docs/tutorials/coeval_cubes.ipynb
daviesje/21cmFAST
Basic Usage The simplest (and typically most efficient) way to produce a coeval cube is simply to use the `run_coeval` method. This consistently performs all steps of the calculation, re-using any data that it can without re-computation or increased memory overhead.
coeval8, coeval9, coeval10 = p21c.run_coeval(
    redshift = [8.0, 9.0, 10.0],
    user_params = {"HII_DIM": 100, "BOX_LEN": 100, "USE_INTERPOLATION_TABLES": True},
    cosmo_params = p21c.CosmoParams(SIGMA_8=0.8),
    astro_params = p21c.AstroParams({"HII_EFF_FACTOR":20.0}),
    random_seed=12345
)
_____no_output_____
MIT
docs/tutorials/coeval_cubes.ipynb
daviesje/21cmFAST
There are a number of possible inputs for `run_coeval`, which you can check out either in the [API reference](../reference/py21cmfast.html) or by calling `help(p21c.run_coeval)`. Notably, the `redshift` must be given: it can be a single number, or a list of numbers, defining the redshift at which the output coeval cube...
print("Random Seed: ", coeval8.random_seed)
print("Redshift: ", coeval8.redshift)
print(coeval8.user_params)
Random Seed:  12345
Redshift:  8.0
UserParams(BOX_LEN:100, DIM:300, HII_DIM:100, HMF:1, POWER_SPECTRUM:0, USE_FFTW_WISDOM:False, USE_RELATIVE_VELOCITIES:False)
MIT
docs/tutorials/coeval_cubes.ipynb
daviesje/21cmFAST
This is where the utility of being able to pass a *class instance* for the parameters arises: we could run another iteration of coeval cubes, with the same user parameters, simply by doing `p21c.run_coeval(user_params=coeval8.user_params, ...)`.Also in the `Coeval` instance are the various outputs from the different st...
print(coeval8.hires_density.shape)
print(coeval8.brightness_temp.shape)
(300, 300, 300)
(100, 100, 100)
MIT
docs/tutorials/coeval_cubes.ipynb
daviesje/21cmFAST
Along with these, full instances of the output from each step are available as attributes that end with "struct". These instances themselves contain the `numpy` arrays of the data cubes, and some other attributes that make them easier to work with:
coeval8.brightness_temp_struct.global_Tb
_____no_output_____
MIT
docs/tutorials/coeval_cubes.ipynb
daviesje/21cmFAST
By default, each of the components of the cube is cached to disk (in our `_cache/` folder) as we run it. However, the `Coeval` cube itself is _not_ written to disk by default. Writing it to disk incurs some redundancy, since that data probably already exists in the cache directory in separate files. Let's save to dis...
filename = coeval8.save(direc='_cache')
_____no_output_____
MIT
docs/tutorials/coeval_cubes.ipynb
daviesje/21cmFAST
The filename of the saved file is returned:
print(os.path.basename(filename))
Coeval_z8.0_a3c7dea665420ae9c872ba2fab1b3d7d_r12345.h5
MIT
docs/tutorials/coeval_cubes.ipynb
daviesje/21cmFAST
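The cache filename encodes the redshift, a hash of the parameter set, and the random seed. If you ever need to recover those from a directory listing, a small parser could look like the sketch below; it is an illustrative helper written against the filename printed above, not part of the py21cmfast API:

```python
import re

def parse_coeval_filename(fname):
    # Split a filename of the form Coeval_z<redshift>_<param hash>_r<seed>.h5
    # into its (redshift, hash, seed) components.
    m = re.match(r"Coeval_z([\d.]+)_([0-9a-f]+)_r(\d+)\.h5$", fname)
    if m is None:
        raise ValueError(f"unrecognised filename: {fname}")
    return float(m.group(1)), m.group(2), int(m.group(3))

print(parse_coeval_filename("Coeval_z8.0_a3c7dea665420ae9c872ba2fab1b3d7d_r12345.h5"))
```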
Such files can be read in:
new_coeval8 = p21c.Coeval.read(filename, direc='.')
_____no_output_____
MIT
docs/tutorials/coeval_cubes.ipynb
daviesje/21cmFAST
Some convenient plotting functions exist in the `plotting` module. These can work directly on `Coeval` objects, or any of the output structs (as we'll see further on in the tutorial). By default the `coeval_sliceplot` function will plot the `brightness_temp`, using the standard traditional colormap:
fig, ax = plt.subplots(1, 3, figsize=(14,4))

for i, (coeval, redshift) in enumerate(zip([coeval8, coeval9, coeval10], [8,9,10])):
    plotting.coeval_sliceplot(coeval, ax=ax[i], fig=fig)
    plt.title("z = %s"%redshift)

plt.tight_layout()
_____no_output_____
MIT
docs/tutorials/coeval_cubes.ipynb
daviesje/21cmFAST
Any 3D field can be plotted, by setting the `kind` argument. For example, we could alternatively have plotted the dark matter density cubes perturbed to each redshift:
fig, ax = plt.subplots(1, 3, figsize=(14,4))

for i, (coeval, redshift) in enumerate(zip([coeval8, coeval9, coeval10], [8,9,10])):
    plotting.coeval_sliceplot(coeval, kind='density', ax=ax[i], fig=fig)
    plt.title("z = %s"%redshift)

plt.tight_layout()
_____no_output_____
MIT
docs/tutorials/coeval_cubes.ipynb
daviesje/21cmFAST
To see more options for the plotting routines, see the [API Documentation](../reference/_autosummary/py21cmfast.plotting.html). `Coeval` instances are not cached themselves -- they are containers for data that is itself cached (i.e. each of the `_struct` attributes of `Coeval`). See the [api docs](../reference/_autosum...
coeval8.init_struct.filename
_____no_output_____
MIT
docs/tutorials/coeval_cubes.ipynb
daviesje/21cmFAST
You can also write the struct anywhere you'd like on the filesystem. Such a file cannot then be used automatically as a cache, but it can be useful for sharing files with colleagues.
coeval8.init_struct.save(fname='my_init_struct.h5')
_____no_output_____
MIT
docs/tutorials/coeval_cubes.ipynb
daviesje/21cmFAST
This brief example covers most of the basic usage of `21cmFAST` (at least with `Coeval` objects -- there are also `Lightcone` objects for which there is a separate tutorial). For the rest of the tutorial, we'll cover a more advanced usage, in which each step of the calculation is done independently. Advanced Step-by-S...
initial_conditions = p21c.initial_conditions(
    user_params = {"HII_DIM": 100, "BOX_LEN": 100},
    cosmo_params = p21c.CosmoParams(SIGMA_8=0.8),
    random_seed=54321
)
_____no_output_____
MIT
docs/tutorials/coeval_cubes.ipynb
daviesje/21cmFAST
We've already come across all these parameters as inputs to the `run_coeval` function. Indeed, most of the steps have very similar interfaces, and are able to take a random seed and parameters for where to look for the cache. We use a different seed than in the previous section so that all our boxes are "fresh" (we'll ...
p21c.CosmoParams._defaults_
_____no_output_____
MIT
docs/tutorials/coeval_cubes.ipynb
daviesje/21cmFAST
(these defaults correspond to the Planck15 cosmology contained in Astropy). So what is in the ``initial_conditions`` object? It is what we call an ``OutputStruct``, and we have seen it before, as the `init_box_struct` attribute of `Coeval`. It contains a number of arrays specifying the density and velocity fields of ou...
initial_conditions.cosmo_params
_____no_output_____
MIT
docs/tutorials/coeval_cubes.ipynb
daviesje/21cmFAST
A handy tip is that the ``CosmoParams`` class also has a reference to a corresponding Astropy cosmology, which can be used more broadly:
initial_conditions.cosmo_params.cosmo
_____no_output_____
MIT
docs/tutorials/coeval_cubes.ipynb
daviesje/21cmFAST
Merely printing the initial conditions object gives a useful representation of its dependent parameters:
print(initial_conditions)
InitialConditions(UserParams(BOX_LEN:100, DIM:300, HII_DIM:100, HMF:1, POWER_SPECTRUM:0, USE_FFTW_WISDOM:False, USE_RELATIVE_VELOCITIES:False); CosmoParams(OMb:0.04897468161869667, OMm:0.30964144154550644, POWER_INDEX:0.9665, SIGMA_8:0.8, hlittle:0.6766); random_seed:54321)
MIT
docs/tutorials/coeval_cubes.ipynb
daviesje/21cmFAST
(side-note: the string representation of the object is used to uniquely define it in order to save it to the cache... which we'll explore soon!).To see which arrays are defined in the object, access the ``fieldnames`` (this is true for *all* `OutputStruct` objects):
initial_conditions.fieldnames
_____no_output_____
MIT
docs/tutorials/coeval_cubes.ipynb
daviesje/21cmFAST
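To make the side-note above concrete: any stable string representation of a parameter set can be hashed into a compact cache key, so equal parameters land on the same cached file. The sketch below mimics, but does not reproduce, how py21cmfast actually derives its cache hashes:

```python
import hashlib

def cache_key(obj_repr: str) -> str:
    # Hash a parameter string into a short hex identifier.
    # Equal parameter sets give equal keys; any change gives a new key.
    return hashlib.md5(obj_repr.encode()).hexdigest()

print(cache_key("UserParams(BOX_LEN:100, DIM:300, HII_DIM:100)"))
```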
The `coeval_sliceplot` function also works on `OutputStruct` objects (as well as the `Coeval` object as we've already seen). It takes the object, and a specific field name. By default, the field it plots is the _first_ field in `fieldnames` (for any `OutputStruct`).
plotting.coeval_sliceplot(initial_conditions, "hires_density");
_____no_output_____
MIT
docs/tutorials/coeval_cubes.ipynb
daviesje/21cmFAST
Perturbed Field After obtaining the initial conditions, we need to *perturb* the field to a given redshift (i.e. the redshift we care about). This step clearly requires the results of the previous step, which we can easily just pass in. Let's do that:
perturbed_field = p21c.perturb_field(
    redshift = 8.0,
    init_boxes = initial_conditions
)
_____no_output_____
MIT
docs/tutorials/coeval_cubes.ipynb
daviesje/21cmFAST
Note that we didn't need to pass in any input parameters, because they are all contained in the `initial_conditions` object itself. The random seed is also taken from this object.Again, the output is an `OutputStruct`, so we can view its fields:
perturbed_field.fieldnames
_____no_output_____
MIT
docs/tutorials/coeval_cubes.ipynb
daviesje/21cmFAST
This time, it has only density and velocity (the velocity direction is chosen without loss of generality). Let's view the perturbed density field:
plotting.coeval_sliceplot(perturbed_field, "density");
_____no_output_____
MIT
docs/tutorials/coeval_cubes.ipynb
daviesje/21cmFAST
It is clear here that the density used is the *low*-res density, but the overall structure of the field looks very similar. Ionization Field Next, we need to ionize the box. This is where things get a little more tricky. In the simplest case (which, let's be clear, is what we're going to do here) the ionization occurs...
ionized_field = p21c.ionize_box(
    perturbed_field = perturbed_field
)
2020-02-29 15:10:43,902 | INFO | Existing init_boxes found and read in (seed=54321).
MIT
docs/tutorials/coeval_cubes.ipynb
daviesje/21cmFAST