You can imagine that if you had a long list of ENSG IDs and you created a `for` loop, you could quickly convert all of the ENSG IDs to their common gene names. Finally, we note that it is advisable to structure your code into reusable "chunks". This is useful both for simple organization of code and for cases where you can...
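As a sketch of that idea, the loop below converts IDs with a small hard-coded dictionary; a real workflow would query an annotation resource instead (the mapping and the `convert_id` helper are hypothetical, not part of the course material):

```python
# Hypothetical ID-to-symbol mapping; real pipelines would query e.g. Ensembl
ensg_to_symbol = {
    'ENSG00000141510': 'TP53',
    'ENSG00000012048': 'BRCA1',
}

def convert_id(ensg_id):
    # Reusable "chunk": look up one ID, falling back to the ID itself
    return ensg_to_symbol.get(ensg_id, ensg_id)

for ensg_id in ['ENSG00000141510', 'ENSG00000012048']:
    print(ensg_id, '->', convert_id(ensg_id))
```

Packaging the lookup in a function is exactly the kind of reusable chunk the text recommends.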
def is_even(x):
    # need to check if it's an integer. If not, raise an error
    if type(x) == int:
        if x % 2 == 0:
            return True
        else:
            return False
    else:
        print('This only works on integers')
        raise Exception('is_even only accepts integers')
_____no_output_____
MIT
Python_DS_course/intro.ipynb
yaoyu-e-wang/teaching
The function takes a single input variable which we call `x`. It produces an output that is either `True` or `False` (both of which are special Boolean values in Python). The value that is produced by the function is often called its "return value" and is made explicit when we write something like `return True`. One ...
a_list = [0, 5, 6, 7, 'a']
for x in a_list:
    if is_even(x):
        print('Even!')
    else:
        print('Odd')
Even!
Odd
Even!
Odd
This only works on integers
MIT
Python_DS_course/intro.ipynb
yaoyu-e-wang/teaching
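To make the notion of a return value concrete, here is a condensed sketch (the body is my paraphrase of the function above, not the original cell) showing the returned Boolean being captured in a variable:

```python
def is_even(x):
    # Condensed paraphrase of the is_even function above
    if type(x) != int:
        raise Exception('is_even only accepts integers')
    return x % 2 == 0

# The return value can be captured in a variable and reused later
result = is_even(10)
print(result)
```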
Computes $y = \sqrt{x}$
# My comment
for i in range(1, 11):
    print(i, math.sqrt(i))

for i in range(1, 11):
    for j in range(1, 11):
        print(i*j, end=' ')
    print()

for i in range(2, 41):
    print(i/2, (i/2)**2, (i/2)**3, end="")

x_2 = 0
x_1 = 1
print(x_1, end="\n")
for i in range(3):
    y = x_1 + x_2
    print(x_2, x_1,...
_____no_output_____
MIT
fe588/FE588 Fall 2018.ipynb
atcemgil/notes
$\pi\alpha \frac{3}{\eta}$
plt.rcParams['text.usetex'] = True
plt.rcParams['text.latex.unicode'] = True
TH = [th/360*2*math.pi for th in range(-360, 361)]
s = [math.sin(th/360*2*math.pi) for th in range(-360, 361)]
c = [math.cos(th/360*2*math.pi) for th in range(-360, 361)]
plt.plot(TH, s, 'r')
plt.plot(TH, c, 'b')
plt.xticks([-2*math.pi, 0, 2...
_____no_output_____
MIT
fe588/FE588 Fall 2018.ipynb
atcemgil/notes
Algorithmic trading
%matplotlib inline
plt.figure(figsize=(12,5))

L = 30   # Moving average window
N = 500  # Number of timesteps

# Generate Gaussian noise
e = np.random.randn(N)

# Brownian walk
y = np.zeros_like(e)
y[0] = e[0]
for t in range(1, N):
    y[t] = y[t-1] + e[t]

mav = np.zeros_like(e)
for t in range(1, N):
    idx0 = max(0,...
_____no_output_____
MIT
fe588/FE588 Fall 2018.ipynb
atcemgil/notes
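For reference, a complete version of the moving-average pass begun in the (truncated) cell above, rewritten with a fixed seed and a vectorized walk (an illustrative sketch, not the original cell):

```python
import numpy as np

rng = np.random.default_rng(0)   # fixed seed; the notebook uses np.random.randn
L = 30    # moving-average window
N = 500   # number of timesteps
e = rng.standard_normal(N)       # Gaussian noise
y = np.cumsum(e)                 # Brownian walk: y[t] = y[t-1] + e[t]

# Trailing moving average over a window of at most L samples
mav = np.array([y[max(0, t - L + 1):t + 1].mean() for t in range(N)])
print(mav.shape)
```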
European
S0 = 100
r = 0.1
T = 1
sigma = 0.5
K = 120
Num = 10000

opt = 'Put'
#opt = 'Call'

C_T = 0.0
for i in range(Num):
    S_T = S0*np.exp(T*(r - 0.5*sigma**2) + sigma*np.sqrt(T)*np.random.randn())
    if opt=='Call':
        C_T += np.max([S_T-K,0])
    else:
        C_T += np.max([K-S_T,0])
C_T = C_T/Nu...
Put: 25.3531139008
MIT
fe588/FE588 Fall 2018.ipynb
atcemgil/notes
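As a rough sanity check on a Monte Carlo estimate like the one above, the closed-form Black-Scholes put value for the same parameters (S0=100, K=120, r=0.1, sigma=0.5, T=1) can be computed; this is a sketch, not part of the notebook, and `norm_cdf` is a helper defined here:

```python
import math

def norm_cdf(x):
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_put(S0, K, r, sigma, T):
    # Closed-form Black-Scholes price of a European put
    d1 = (math.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return K * math.exp(-r * T) * norm_cdf(-d2) - S0 * norm_cdf(-d1)

price = bs_put(100, 120, 0.1, 0.5, 1)
print(price)
```

The analytic value lands near the Monte Carlo estimate printed above, as expected for 10000 samples.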
Asian
S0 = 100
r = 0.1
T = 1
sigma = 0.05
K = 90
N = 100
Num = 10000

C_T = 0.0
for i in range(Num):
    S = [0]*N
    S[0] = S0
    for n in range(1, N):
        S[n] = S[n-1]*np.exp(T/N*(r - 0.5*sigma**2) + sigma*np.sqrt(T/N)*np.random.randn())
    C_T += np.max([np.mean(S)-K,0])
C_T = C_T/Num
print('Asian Call:', np...
Asian Call: 13.6679815687
MIT
fe588/FE588 Fall 2018.ipynb
atcemgil/notes
Lookback
S0 = 100
r = 0.1
T = 1
sigma = 0.05
K = 90
N = 10
Num = 10000

C_T = 0.0
for i in range(Num):
    S = [0]*N
    S[0] = S0
    for n in range(1, N):
        S[n] = S[n-1]*np.exp(T/N*(r - 0.5*sigma**2) + sigma*np.sqrt(T/N)*np.random.randn())
    C_T += np.max([np.max(S)-K,0])
C_T = C_T/Num
print('Lookback Call:', n...
Lookback Call: 18.0746854795
MIT
fe588/FE588 Fall 2018.ipynb
atcemgil/notes
Floating Lookback
S0 = 100
r = 0.1
T = 1
sigma = 0.05
K = 90
N = 10
Num = 10000

C_T = 0.0
for i in range(Num):
    S = [0]*N
    S[0] = S0
    for n in range(1, N):
        S[n] = S[n-1]*np.exp(T/N*(r - 0.5*sigma**2) + sigma*np.sqrt(T/N)*np.random.randn())
    C_T += np.max([S[-1]-np.min(S),0])
C_T = C_T/Num
print('Floating Lookb...
_____no_output_____
MIT
fe588/FE588 Fall 2018.ipynb
atcemgil/notes
Finding the root of a function
def f(x):
    return x**2 - 2

def sgn(x):
    return 1 if x > 0 else -1

def find_root(a, b, epsilon=0.0000001):
    # Bisection: repeatedly halve [a, b] while keeping a sign change inside
    if sgn(f(a)) == sgn(f(b)):
        raise ValueError('No root in the interval')
    while b - a > epsilon:
        mid = (a + b) / 2
        if sgn(f(mid)) == sgn(f(a)):
            a = mid
        else:
            b = mid
    return (a + b) / 2
1 2 2 5 5 8 8 1 1 3 3 5
MIT
fe588/FE588 Fall 2018.ipynb
atcemgil/notes
1 2 2 5 5 8 8 1 1 3 3 5
x = [1, 2, 5, 8, 1, 3, 5]  # input sequence (inferred from the printed output)
N = 4
for i in range(len(x)-N+1):
    for j in range(N):
        print(x[i+N-1-j], end=' ')
    print('')
8 5 2 1
1 8 5 2
3 1 8 5
5 3 1 8
MIT
fe588/FE588 Fall 2018.ipynb
atcemgil/notes
Binary Search
L = [1,3,4,7,8,9,12]
i = 0
j = len(L)-1
x = 12
found = False
while (i <= j):
    mid = (i+j)//2
    if L[mid] == x:
        found = True
        break
    elif L[mid] < x:
        i = mid+1
    elif L[mid] > x:
        j = mid-1
if found:
    print('Found')
else:
    print('Not Found')
Found
MIT
fe588/FE588 Fall 2018.ipynb
atcemgil/notes
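For comparison, the standard library's `bisect` module performs the same search; this sketch reuses the sorted list from the cell above:

```python
import bisect

L = [1, 3, 4, 7, 8, 9, 12]
x = 12
# bisect_left returns the insertion point for x; the element is present
# exactly when that index is in range and already holds x
i = bisect.bisect_left(L, x)
print('Found' if i < len(L) and L[i] == x else 'Not Found')
```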
Instance 1
instance1 = Data('instances/instance1.txt')
instance1.gurobi_solver()
Using license file /home/jpvt/gurobi.lic Academic license - for non-commercial use only Gurobi Optimizer version 9.0.2 build v9.0.2rc0 (linux64) Optimize a model with 20 rows, 13 columns and 39 nonzeros Model fingerprint: 0x83656b1a Coefficient statistics: Matrix range [1e+00, 1e+00] Objective range [1e+00, 1e...
MIT
PO_class/assignment_1/gurobi/instace_examples.ipynb
ItamarRocha/Operations-Research
Instance 2
instance2 = Data('instances/instance2.txt')
instance2.gurobi_solver()
Gurobi Optimizer version 9.0.2 build v9.0.2rc0 (linux64) Optimize a model with 22 rows, 14 columns and 42 nonzeros Model fingerprint: 0xcd6ccbdb Coefficient statistics: Matrix range [1e+00, 1e+00] Objective range [1e+00, 1e+00] Bounds range [0e+00, 0e+00] RHS range [1e+00, 4e+01] Presolve remove...
MIT
PO_class/assignment_1/gurobi/instace_examples.ipynb
ItamarRocha/Operations-Research
Instance 3
instance3 = Data('instances/instance3.txt')
instance3.gurobi_solver()
Gurobi Optimizer version 9.0.2 build v9.0.2rc0 (linux64) Optimize a model with 24 rows, 16 columns and 48 nonzeros Model fingerprint: 0xbe752641 Coefficient statistics: Matrix range [1e+00, 1e+00] Objective range [1e+00, 1e+00] Bounds range [0e+00, 0e+00] RHS range [4e+00, 3e+01] Presolve remove...
MIT
PO_class/assignment_1/gurobi/instace_examples.ipynb
ItamarRocha/Operations-Research
Instance 4
instance4 = Data('instances/instance4.txt')
instance4.gurobi_solver()
Gurobi Optimizer version 9.0.2 build v9.0.2rc0 (linux64) Optimize a model with 13 rows, 9 columns and 21 nonzeros Model fingerprint: 0xec335147 Coefficient statistics: Matrix range [1e+00, 1e+00] Objective range [1e+00, 1e+00] Bounds range [0e+00, 0e+00] RHS range [2e+00, 2e+01] Presolve removed...
MIT
PO_class/assignment_1/gurobi/instace_examples.ipynb
ItamarRocha/Operations-Research
Instance 5
instance5 = Data('instances/instance5.txt')
instance5.gurobi_solver()
Gurobi Optimizer version 9.0.2 build v9.0.2rc0 (linux64) Optimize a model with 17 rows, 11 columns and 33 nonzeros Model fingerprint: 0x52b28128 Coefficient statistics: Matrix range [1e+00, 1e+00] Objective range [1e+00, 1e+00] Bounds range [0e+00, 0e+00] RHS range [4e+00, 3e+01] Presolve remove...
MIT
PO_class/assignment_1/gurobi/instace_examples.ipynb
ItamarRocha/Operations-Research
Instance 6
instance6 = Data('instances/instance6.txt')
instance6.gurobi_solver()
Gurobi Optimizer version 9.0.2 build v9.0.2rc0 (linux64) Optimize a model with 37 rows, 22 columns and 66 nonzeros Model fingerprint: 0xa62c8fc4 Coefficient statistics: Matrix range [1e+00, 1e+00] Objective range [1e+00, 1e+00] Bounds range [0e+00, 0e+00] RHS range [1e+00, 5e+00] Presolve remove...
MIT
PO_class/assignment_1/gurobi/instace_examples.ipynb
ItamarRocha/Operations-Research
Instance 7
instance7 = Data('instances/instance7.txt')
instance7.gurobi_solver()
Gurobi Optimizer version 9.0.2 build v9.0.2rc0 (linux64) Optimize a model with 36 rows, 25 columns and 75 nonzeros Model fingerprint: 0x7a1ac2be Coefficient statistics: Matrix range [1e+00, 1e+00] Objective range [1e+00, 1e+00] Bounds range [0e+00, 0e+00] RHS range [5e+00, 7e+01] Presolve remove...
MIT
PO_class/assignment_1/gurobi/instace_examples.ipynb
ItamarRocha/Operations-Research
Functional Expansions

OpenMC's general tally system accommodates a wide range of tally *filters*. While most filters are meant to identify regions of phase space that contribute to a tally, there are a special set of functional expansion filters that will multiply the tally by a set of orthogonal functions, e.g. Legend...
%matplotlib inline import openmc import numpy as np import matplotlib.pyplot as plt
_____no_output_____
MIT
docs/source/examples/expansion-filters.ipynb
gridley/openmc
To begin, let us first create a simple model. The model will be a slab of fuel material with reflective boundary conditions in the x- and y-directions and vacuum boundaries in the z-direction. However, to make the distribution slightly more interesting, we'll put some B4C in the middle of the slab.
# Define fuel and B4C materials
fuel = openmc.Material()
fuel.add_element('U', 1.0, enrichment=4.5)
fuel.add_nuclide('O16', 2.0)
fuel.set_density('g/cm3', 10.0)

b4c = openmc.Material()
b4c.add_element('B', 4.0)
b4c.add_element('C', 1.0)
b4c.set_density('g/cm3', 2.5)

# Define surfaces used to construct regions
zmin, zm...
_____no_output_____
MIT
docs/source/examples/expansion-filters.ipynb
gridley/openmc
For the starting source, we'll use a uniform distribution over the entire box geometry.
settings = openmc.Settings()
spatial_dist = openmc.stats.Box(*geom.bounding_box)
settings.source = openmc.Source(space=spatial_dist)
settings.batches = 210
settings.inactive = 10
settings.particles = 1000
_____no_output_____
MIT
docs/source/examples/expansion-filters.ipynb
gridley/openmc
Defining the tally is relatively straightforward. One simply needs to list 'flux' as a score and then add an expansion filter. For this case, we will want to use the `SpatialLegendreFilter` class which multiplies tally scores by Legendre polynomials evaluated on normalized spatial positions along an axis.
# Create a flux tally
flux_tally = openmc.Tally()
flux_tally.scores = ['flux']

# Create a Legendre polynomial expansion filter and add to tally
order = 8
expand_filter = openmc.SpatialLegendreFilter(order, 'z', zmin, zmax)
flux_tally.filters.append(expand_filter)
_____no_output_____
MIT
docs/source/examples/expansion-filters.ipynb
gridley/openmc
The last thing we need to do is create a `Tallies` collection and export the entire model, which we'll do using the `Model` convenience class.
tallies = openmc.Tallies([flux_tally])
model = openmc.model.Model(geometry=geom, settings=settings, tallies=tallies)
_____no_output_____
MIT
docs/source/examples/expansion-filters.ipynb
gridley/openmc
Running a simulation is now as simple as calling the `run()` method of `Model`.
sp_file = model.run(output=False)
_____no_output_____
MIT
docs/source/examples/expansion-filters.ipynb
gridley/openmc
Now that the run is finished, we need to load the results from the statepoint file.
with openmc.StatePoint(sp_file) as sp:
    df = sp.tallies[flux_tally.id].get_pandas_dataframe()
_____no_output_____
MIT
docs/source/examples/expansion-filters.ipynb
gridley/openmc
We've used the `get_pandas_dataframe()` method that returns tally data as a Pandas dataframe. Let's see what the raw data looks like.
df
_____no_output_____
MIT
docs/source/examples/expansion-filters.ipynb
gridley/openmc
Since the expansion coefficients are given as

$$ a_n = \frac{2n + 1}{2} \int_{-1}^1 dz' P_n(z') \phi(z') $$

we just need to multiply the Legendre moments by $(2n + 1)/2$.
n = np.arange(order + 1)
a_n = (2*n + 1)/2 * df['mean']
_____no_output_____
MIT
docs/source/examples/expansion-filters.ipynb
gridley/openmc
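The coefficient relation above can be verified on a function whose expansion is known in closed form. This sketch (not part of the notebook) uses $\phi(z') = z'^2 = \tfrac{1}{3}P_0 + \tfrac{2}{3}P_2$ and recovers its moments from the integral formula:

```python
import numpy as np
from numpy.polynomial import Legendre

# phi(z') = z'^2 has the exact expansion (1/3) P_0 + (2/3) P_2,
# so a_n = (2n+1)/2 * integral of P_n(z') phi(z') over [-1, 1]
# should give a_0 = 1/3, a_1 = 0, a_2 = 2/3
phi = Legendre([1/3, 0, 2/3])
moments = []
for n in range(3):
    integrand = Legendre.basis(n) * phi      # P_n(z') * phi(z')
    antideriv = integrand.integ()
    moments.append((2*n + 1) / 2 * (antideriv(1) - antideriv(-1)))
print(moments)
```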
To plot the flux distribution, we can use the `numpy.polynomial.Legendre` class which represents a truncated Legendre polynomial series. Since we really want to plot $\phi(z)$ and not $\phi(z')$ we first need to perform a change of variables. Since

$$ \lvert \phi(z) dz \rvert = \lvert \phi(z') dz' \rvert $$

and, for this...
phi = np.polynomial.Legendre(a_n/10, domain=(zmin, zmax))
_____no_output_____
MIT
docs/source/examples/expansion-filters.ipynb
gridley/openmc
Let's plot it and see how our flux looks!
z = np.linspace(zmin, zmax, 1000)
plt.plot(z, phi(z))
plt.xlabel('Z position [cm]')
plt.ylabel('Flux [n/src]')
_____no_output_____
MIT
docs/source/examples/expansion-filters.ipynb
gridley/openmc
As you might expect, we get a rough cosine shape but with a flux depression in the middle due to the boron slab that we introduced. To get a more accurate distribution, we'd likely need to use a higher order expansion.

One more thing we can do is confirm that integrating the distribution gives us the same value as the f...
np.trapz(phi(z), z)
_____no_output_____
MIT
docs/source/examples/expansion-filters.ipynb
gridley/openmc
In addition to being able to tally Legendre moments, there are also functional expansion filters available for spherical harmonics (`SphericalHarmonicsFilter`) and Zernike polynomials over a unit disk (`ZernikeFilter`). A separate `LegendreFilter` class can also be used for determining Legendre scattering moments (i.e....
# Define fuel
fuel = openmc.Material()
fuel.add_element('U', 1.0, enrichment=5.0)
fuel.add_nuclide('O16', 2.0)
fuel.set_density('g/cm3', 10.0)

# Define surfaces used to construct regions
zmin, zmax, radius = -1., 1., 0.5
pin = openmc.ZCylinder(x0=0.0, y0=0.0, r=radius, boundary_type='vacuum')
bottom = openmc.ZPlane(z...
_____no_output_____
MIT
docs/source/examples/expansion-filters.ipynb
gridley/openmc
For the starting source, we'll use a uniform distribution over the entire box geometry.
settings = openmc.Settings()
spatial_dist = openmc.stats.Box(*geom.bounding_box)
settings.source = openmc.Source(space=spatial_dist)
settings.batches = 100
settings.inactive = 20
settings.particles = 100000
_____no_output_____
MIT
docs/source/examples/expansion-filters.ipynb
gridley/openmc
Defining the tally is relatively straightforward. One simply needs to list 'flux' as a score and then add an expansion filter. For this case, we will want to use the `SpatialLegendreFilter`, `ZernikeFilter`, and `ZernikeRadialFilter` classes, which multiply tally scores by Legendre, azimuthal Zernike and radial-only Zern...
# Create a flux tally
flux_tally_legendre = openmc.Tally()
flux_tally_legendre.scores = ['flux']

# Create a Legendre polynomial expansion filter and add to tally
order = 10
cell_filter = openmc.CellFilter(fuel)
legendre_filter = openmc.SpatialLegendreFilter(order, 'z', zmin, zmax)
flux_tally_legendre.filters = [cell_f...
_____no_output_____
MIT
docs/source/examples/expansion-filters.ipynb
gridley/openmc
The last thing we need to do is create a `Tallies` collection and export the entire model, which we'll do using the `Model` convenience class.
tallies = openmc.Tallies([flux_tally_legendre, flux_tally_zernike, flux_tally_zernike1d])
model = openmc.model.Model(geometry=geom, settings=settings, tallies=tallies)
_____no_output_____
MIT
docs/source/examples/expansion-filters.ipynb
gridley/openmc
Running a simulation is now as simple as calling the `run()` method of `Model`.
sp_file = model.run(output=False)
_____no_output_____
MIT
docs/source/examples/expansion-filters.ipynb
gridley/openmc
Now that the run is finished, we need to load the results from the statepoint file.
with openmc.StatePoint(sp_file) as sp:
    df1 = sp.tallies[flux_tally_legendre.id].get_pandas_dataframe()
_____no_output_____
MIT
docs/source/examples/expansion-filters.ipynb
gridley/openmc
We've used the `get_pandas_dataframe()` method that returns tally data as a Pandas dataframe. Let's see what the raw data looks like.
df1
_____no_output_____
MIT
docs/source/examples/expansion-filters.ipynb
gridley/openmc
Since the scaling factors for the expansion coefficients are handled by the Python API, we do not need to multiply the moments by scaling factors ourselves.
a_n = df1['mean']
_____no_output_____
MIT
docs/source/examples/expansion-filters.ipynb
gridley/openmc
The coefficients are loaded by calling the OpenMC Python API as follows:
phi = openmc.legendre_from_expcoef(a_n, domain=(zmin, zmax))
_____no_output_____
MIT
docs/source/examples/expansion-filters.ipynb
gridley/openmc
Let's plot it and see how our flux looks!
z = np.linspace(zmin, zmax, 1000)
plt.plot(z, phi(z))
plt.xlabel('Z position [cm]')
plt.ylabel('Flux [n/src]')
_____no_output_____
MIT
docs/source/examples/expansion-filters.ipynb
gridley/openmc
A rough cosine shape is obtained. One can also numerically integrate the function using the trapezoidal rule.
np.trapz(phi(z), z)
_____no_output_____
MIT
docs/source/examples/expansion-filters.ipynb
gridley/openmc
The following cases show how to reconstruct the flux distribution from the tallied Zernike polynomial results.
with openmc.StatePoint(sp_file) as sp:
    df2 = sp.tallies[flux_tally_zernike.id].get_pandas_dataframe()
df2
_____no_output_____
MIT
docs/source/examples/expansion-filters.ipynb
gridley/openmc
Let's plot the flux in the radial direction at a specific azimuthal angle ($\theta = 0.0$).
z_n = df2['mean']
zz = openmc.Zernike(z_n, radius)
rr = np.linspace(0, radius, 100)
plt.plot(rr, zz(rr, 0.0))
plt.xlabel('Radial position [cm]')
plt.ylabel('Flux')
_____no_output_____
MIT
docs/source/examples/expansion-filters.ipynb
gridley/openmc
A polar figure over all azimuthal angles can be plotted like this:
z_n = df2['mean']
zz = openmc.Zernike(z_n, radius=radius)

# Using linspace so that the endpoint of 360 is included
azimuths = np.radians(np.linspace(0, 360, 50))
zeniths = np.linspace(0, radius, 100)
r, theta = np.meshgrid(zeniths, azimuths)
values = zz(zeniths, azimuths)
fig, ax = plt.subplots(subplot_kw=dict(pr...
_____no_output_____
MIT
docs/source/examples/expansion-filters.ipynb
gridley/openmc
Sometimes we only need the flux distribution tallied with radial-only Zernike polynomials. Let us extract the tallied coefficients first.
with openmc.StatePoint(sp_file) as sp:
    df3 = sp.tallies[flux_tally_zernike1d.id].get_pandas_dataframe()
df3
_____no_output_____
MIT
docs/source/examples/expansion-filters.ipynb
gridley/openmc
A plot along the r-axis can also be produced.
z_n = df3['mean']
zz = openmc.ZernikeRadial(z_n, radius=radius)
rr = np.linspace(0, radius, 50)
plt.plot(rr, zz(rr))
plt.xlabel('Radial position [cm]')
plt.ylabel('Flux')
_____no_output_____
MIT
docs/source/examples/expansion-filters.ipynb
gridley/openmc
Similarly, we can also reconstruct the polar figure based on the radial-only Zernike polynomial coefficients.
z_n = df3['mean']
zz = openmc.ZernikeRadial(z_n, radius=radius)
azimuths = np.radians(np.linspace(0, 360, 50))
zeniths = np.linspace(0, radius, 100)
r, theta = np.meshgrid(zeniths, azimuths)
values = [[i for i in zz(zeniths)] for j in range(len(azimuths))]
fig, ax = plt.subplots(subplot_kw=dict(projection='polar'), f...
_____no_output_____
MIT
docs/source/examples/expansion-filters.ipynb
gridley/openmc
Based on the Legendre polynomial coefficients and the azimuthal or radial-only Zernike coefficients, it is possible to reconstruct the flux in both the radial and axial directions.
# Reconstruct 3-D flux based on radial only Zernike and Legendre polynomials
z_n = df3['mean']
zz = openmc.ZernikeRadial(z_n, radius=radius)
azimuths = np.radians(np.linspace(0, 360, 100))  # azimuthal mesh
zeniths = np.linspace(0, radius, 100)            # radial mesh
zmin, zmax = -1.0, 1.0
z = np.linspace(zmin, zmax, 100)  #...
_____no_output_____
MIT
docs/source/examples/expansion-filters.ipynb
gridley/openmc
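The separable radial-times-axial reconstruction described above can be sketched generically as an outer product of two 1-D profiles; the coefficient vector and radial shape below are made up for illustration, not OpenMC tally output:

```python
import numpy as np

# Toy sketch: combine an assumed radial profile f(r) with an assumed axial
# Legendre series g(z) into a separable reconstruction flux(r, z) = f(r) * g(z)
zmin, zmax, radius = -1.0, 1.0, 0.5
a_n = np.array([1.0, 0.0, -0.3])                 # made-up axial Legendre moments
g = np.polynomial.Legendre(a_n, domain=(zmin, zmax))
r = np.linspace(0, radius, 50)                   # radial mesh
z = np.linspace(zmin, zmax, 50)                  # axial mesh
f = 1.0 - (r / radius)**2                        # made-up radial shape
flux = np.outer(f, g(z))                         # shape (len(r), len(z))
print(flux.shape)
```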
One can also reconstruct the 3D flux distribution based on Legendre and Zernike polynomial tallied coefficients.
# Define needed function first
def cart2pol(x, y):
    rho = np.sqrt(x**2 + y**2)
    phi = np.arctan2(y, x)
    return (rho, phi)

# Reconstruct 3-D flux based on azimuthal Zernike and Legendre polynomials
z_n = df2['mean']
zz = openmc.Zernike(z_n, radius=radius)

xstep = 2.0*radius/20
hstep = (zmax - zmin)/20
x = n...
_____no_output_____
MIT
docs/source/examples/expansion-filters.ipynb
gridley/openmc
Let us write the result out in VTK format.
# You'll need to install pyevtk as a prerequisite
from pyevtk.hl import gridToVTK
import numpy as np

# Dimensions
nx, ny, nz = len(x), len(x), len(h)
lx, ly, lz = 2.0*radius, 2.0*radius, (zmax-zmin)
dx, dy, dz = lx/nx, ly/ny, lz/nz

ncells = nx * ny * nz
npoints = (nx + 1) * (ny + 1) * (nz + 1)

# Coordinates
x = n...
_____no_output_____
MIT
docs/source/examples/expansion-filters.ipynb
gridley/openmc
Use VisIt or ParaView to plot it however you wish. The resulting image can then be loaded and shown as follows.
f1 = plt.imread('./images/flux3d.png')
plt.imshow(f1, cmap='jet')
_____no_output_____
MIT
docs/source/examples/expansion-filters.ipynb
gridley/openmc
Practical 8: Pandas to Cluster Analysis

Objectives: In this practical we keep moving with applied demonstrations of modules you can use in Python. Today we continue to practice using Pandas, but also start applying some common machine learning techniques. Specifically, we will use Cluster Analysis [also known as unsupe...
import pandas as pd  # I'm using pd here as it's easier to keep writing! You can use whatever you want, but it might help you to use 'pd' for now.
import matplotlib.pyplot as plt
from sklearn.cluster import KMeans
import seaborn as sns

# Read data from file
# We are going to use the function 'read_csv' within the Pandas p...
_____no_output_____
CC0-1.0
solutions/Practical_8.ipynb
loftytopping/DEES_programming_course
Exercise 1: Plot a histogram of meteorological variables and fire extent. The purpose of this exercise is to understand our dataset a little before we start to apply any cluster analysis. We will discuss the reason for this as we apply cluster analysis. For the meteorological variables you need to produce a histogr...
# Make a boxplot for each column. We could group them into one figure but this is beyond the scope of this practical.
# In the template below I have given you a template to include a boxplot in each subplot
import matplotlib.pyplot as plt

fig, axs = plt.subplots(2, 3, figsize=(12, 8), sharey=False)

# Temperature data...
_____no_output_____
CC0-1.0
solutions/Practical_8.ipynb
loftytopping/DEES_programming_course
For the first three variables, we can easily infer a distribution of values. For the final two variables, however, the distribution is much harder to interpret due to a very high number of small values. Given that we are interested in forest fires, we need to consider whether this might influence our clustering. Why is...
import seaborn as sns

# calculate the correlation matrix
#------'INSERT CODE HERE'------
corr = data[['temp','RH','wind','rain','area']].corr()
# Now use an internal function within Seaborn called '.heatmap'
sns.heatmap(corr, xticklabels=corr.columns, yticklabels=corr.columns)
#------------------------------
# And we n...
_____no_output_____
CC0-1.0
solutions/Practical_8.ipynb
loftytopping/DEES_programming_course
K-means cluster analysis

K-means cluster analysis is perhaps the simplest of all, but allows us to practice turning a dataset into one that contains a different number of clusters, members of which should have 'similar' properties. How we define the similarity between members can vary widely. Take the following [figur...
from sklearn.cluster import KMeans

# Extract our variables of interest from the dataframe into a new Numpy matrix
numpy_matrix = data[['temp','RH','area']].values
# Specify how many clusters we want the Kmeans algorithm to find
clusterer = KMeans(n_clusters=4)
# Fit the clustering algorithm to our Numpy matrix
clusterer....
_____no_output_____
CC0-1.0
solutions/Practical_8.ipynb
loftytopping/DEES_programming_course
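Under the hood, the K-means fit above is Lloyd's algorithm: alternate assigning points to their nearest centre and recomputing centres. A minimal NumPy-only sketch on synthetic two-blob data (all names and data here are illustrative, not the practical's fire dataset):

```python
import numpy as np

rng = np.random.default_rng(0)
# Two well-separated synthetic blobs (made-up toy data)
X = np.vstack([rng.normal(0, 0.5, (50, 2)), rng.normal(5, 0.5, (50, 2))])

def kmeans(X, k, iters=20):
    # Lloyd's algorithm: alternate nearest-centre assignment and centroid update
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        new_centers = []
        for j in range(k):
            pts = X[labels == j]
            # keep the old centre if a cluster happens to be empty
            new_centers.append(pts.mean(axis=0) if len(pts) else centers[j])
        centers = np.array(new_centers)
    return labels, centers

labels, centers = kmeans(X, 2)
print(len(set(labels)))
```

With two blobs this far apart, the algorithm recovers the generating groups; scikit-learn's `KMeans` adds smarter initialization and convergence checks on top of the same idea.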
Now let us look at the properties of these clusters by generating box-plots of values from our dataframe. We have already met multiple functions that can be applied to our dataframe. In Practical 7 we briefly produced box-plots of our dataframe using the `.boxplot(column=[...])` command. We can also select a su...
fig, ax = plt.subplots(1, 3, figsize=(10, 5))
data.boxplot(column=['temp'], by=['K-means label'], ax=ax[0])
data.boxplot(column=['RH'], by=['K-means label'], ax=ax[1])
data.boxplot(column=['area'], by=['K-means label'], ax=ax[2]).set_yscale('log')
_____no_output_____
CC0-1.0
solutions/Practical_8.ipynb
loftytopping/DEES_programming_course
Exercise 3: Create a new dataframe with only positive values of fire area and repeat the above cluster analysis

In this exercise you can copy the above code example, but you need to ensure operations are performed on a new set of datapoints from a new dataframe. Can you remember how we select a new dataframe acco...
#-------'INSERT CODE HERE'-------
data_new = data[data["area"] > 0]
numpy_matrix_new = data_new[['temp','RH','area']].values
clusterer = KMeans(n_clusters=4)
clusterer.fit(numpy_matrix_new)
labels = clusterer.labels_
data_new['K-means label'] = labels
#--------------------------------
data_new['K-means label']
#data['O...
C:\Users\Dave\anaconda3\lib\site-packages\ipykernel_launcher.py:7: SettingWithCopyWarning: A value is trying to be set on a copy of a slice from a DataFrame. Try using .loc[row_indexer,col_indexer] = value instead See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.h...
CC0-1.0
solutions/Practical_8.ipynb
loftytopping/DEES_programming_course
Working with 'other' data

In the following code box we load some freely available data on Air BnB listings from New York in 2019. By previewing our column names, you can see we have a collection of both numeric and non-numerical data. Why might cluster analysis be of use here? Let's see if we can assign each available...
# Load the Air BnB data
if 'google.colab' in str(get_ipython()):
    data_NYC = pd.read_csv('https://raw.githubusercontent.com/loftytopping/DEES_programming_course/master/data/AB_NYC_2019.csv')
    data_NYC.head()
else:
    data_NYC = pd.read_csv("data/AB_NYC_2019.csv")
    data_NYC.head()  # Preview the first 5 line...
_____no_output_____
CC0-1.0
solutions/Practical_8.ipynb
loftytopping/DEES_programming_course
For example, we can see there is a variable that reflects the neighborhood group of the listings. Let us say we wish to see how many unique entries there are. Rather than repeating the calculation we have done a number of times, we can produce a bar plot that automatically places each unique entry on the x axis, and ...
data_NYC['neighbourhood_group'].value_counts().plot(kind='bar')
_____no_output_____
CC0-1.0
solutions/Practical_8.ipynb
loftytopping/DEES_programming_course
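The `value_counts()` call used above is easy to check on a toy Series; the data here are made up, and pandas is assumed to be available:

```python
import pandas as pd

# Toy neighbourhood column; value_counts sorts by frequency, descending
s = pd.Series(['Manhattan', 'Brooklyn', 'Manhattan', 'Queens', 'Manhattan'])
counts = s.value_counts()
print(counts)
```

Chaining `.plot(kind='bar')` onto the result, as in the cell above, then puts each unique entry on the x-axis with its count as the bar height.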
Exercise 4: Clustering AirBnB data from New York by latitude, longitude and price

We want to cluster this dataset in order to determine the properties of similar listings. For this exercise, as this is a different dataset, let us repeat the procedure of using the K-means algorithm to produce 4 clusters by focus...
#-------'INSERT CODE HERE'-------
numpy_matrix_NYC = data_NYC[['latitude', 'longitude', 'price']].values
model_NYC = KMeans(n_clusters=4)
model_NYC.fit(numpy_matrix_NYC)
labels = model_NYC.labels_
#------------------------------
data_NYC['K-means label'] = labels
data_NYC.boxplot(column=['price'], by=['K-means label'])
_____no_output_____
CC0-1.0
solutions/Practical_8.ipynb
loftytopping/DEES_programming_course
Ideally we would also like to get a feel for the ratio of each neighborhood in each cluster. We can certainly do that by producing 4 separate barcharts as per the code box below. In this code snippet I'm expanding on the previous example of a barchart by selecting a subset according to the value of the K-means cluster ...
fig, axes = plt.subplots(1, 4, figsize=(10, 5))

# Produce 4 separate plots
data_NYC[data_NYC['K-means label']==0]['neighbourhood_group'].value_counts().plot(kind='bar', title='Cluster 0', ax=axes[0])
data_NYC[data_NYC['K-means label']==1]['neighbourhood_group'].value_counts().plot(kind='bar', title='Cluster 1', ax=axe...
_____no_output_____
CC0-1.0
solutions/Practical_8.ipynb
loftytopping/DEES_programming_course
What does this graph tell us? The median price of Cluster '1' is high, and these results confirm those listings are dominated by properties in Manhattan. However, there appears to be a very similar profile in Cluster '2' which has a much lower median price range.

Exercise 5: Visualise cluster data by room type

I...
fig, axes = plt.subplots(1, 4, figsize=(10, 5))

#-------'INSERT CODE HERE'-------
# Produce 4 separate plots
data_NYC[data_NYC['K-means label']==0]['room_type'].value_counts().plot(kind='bar', title='Cluster 0', ax=axes[0])
data_NYC[data_NYC['K-means label']==1]['room_type'].value_counts().plot(kind='bar', title='Clus...
_____no_output_____
CC0-1.0
solutions/Practical_8.ipynb
loftytopping/DEES_programming_course
WeatherPy
----

Note: Instructions have been included for each segment. You do not have to follow them exactly, but they are included to help you think through the steps.

Generate Cities List
import matplotlib.pyplot as plt
import pandas as pd
import numpy as np
import requests
import time
import json
import scipy.stats as st
from scipy.stats import linregress
from api_keys import weather_api_key
from citipy import citipy

output_data_file = "output_data/cities.csv"

lat_range = (-90, 90)
lng_range = (-18...
_____no_output_____
ADSL
Weather and Vaction Notebooks/WeatherPy/WeatherPy.ipynb
uchenna23/Python-API-Challenge
Perform API Calls

* Perform a weather check on each city using a series of successive API calls.
* Include a print log of each city as it's being processed (with the city number and city name).
url = "http://api.openweathermap.org/data/2.5/weather?"
units = "imperial"
query_url = f"{url}appid={weather_api_key}&units={units}&q="

city_id_list = []
city_name_list = []
country_list = []
lng_list = []
lat_list = []
temp_list = []
humidity_list = []
clouds_list = []
wind_speed_list = []

print('Beginning Data Ret...
Beginning Data Retrieval ---------------------------- City Name: rikitea, City ID: 4030556 City Name: palmer, City ID: 4946620 City Name: saldanha, City ID: 3361934 City Name: castro, City ID: 3466704 City Name: cabo san lucas, City ID: 3985710 City Name: nikolskoye, City ID: 546105 City Name: anda, City ID: 2038650 Ci...
ADSL
Weather and Vaction Notebooks/WeatherPy/WeatherPy.ipynb
uchenna23/Python-API-Challenge
Convert Raw Data to DataFrame

* Export the city data into a .csv.
* Display the DataFrame
cities_df = pd.DataFrame({"City ID": city_id_list,
                          "City": city_name_list,
                          "Country": country_list,
                          "Lat": lat_list,
                          "Lng": lng_list,
                          "Temperature": temp_list,
                          "Humidity": humidity_list,
                          "Clouds": clouds_list,
                          "Wind Speed": wind_speed_list})
cities_df.head()
cities_df.t...
_____no_output_____
ADSL
Weather and Vaction Notebooks/WeatherPy/WeatherPy.ipynb
uchenna23/Python-API-Challenge
Plotting the Data

* Use proper labeling of the plots using plot titles (including date of analysis) and axes labels.
* Save the plotted figures as .pngs.

Latitude vs. Temperature Plot
x_values = cities_df["Lat"]
y_values = cities_df["Temperature"]
plt.scatter(x_values, y_values)
plt.title('City Latitude vs. Max Temperature (04/01/20)')
plt.xlabel('Latitude')
plt.ylabel('Max Temperature (F)')
plt.ylim(0, 100)
plt.xlim(-60, 80)
plt.minorticks_on()
plt.grid(which='major', linestyle='-')
plt.grid(which=...
_____no_output_____
ADSL
Weather and Vaction Notebooks/WeatherPy/WeatherPy.ipynb
uchenna23/Python-API-Challenge
Latitude vs. Humidity Plot
x_values = cities_df["Lat"]
y_values = cities_df["Humidity"]
plt.scatter(x_values, y_values)
plt.title('City Latitude vs. Humidity')
plt.xlabel('Latitude')
plt.ylabel('Humidity (%)')
plt.ylim(0, 105)
plt.xlim(-60, 80)
plt.minorticks_on()
plt.grid(which='major', linestyle='-')
plt.grid(which='minor', linestyle=':')
plt....
_____no_output_____
ADSL
Weather and Vaction Notebooks/WeatherPy/WeatherPy.ipynb
uchenna23/Python-API-Challenge
Latitude vs. Cloudiness Plot
x_values = cities_df["Lat"]
y_values = cities_df["Clouds"]
plt.scatter(x_values, y_values)
plt.title('City Latitude vs. Cloudiness (04/01/20)')
plt.xlabel('Latitude')
plt.ylabel('Cloudiness (%)')
plt.ylim(-5, 105)
plt.xlim(-60, 80)
plt.minorticks_on()
plt.grid(which='major', linestyle='-')
plt.grid(which='minor', lines...
_____no_output_____
Latitude vs. Wind Speed Plot
x_values = cities_df["Lat"] y_values = cities_df["Wind Speed"] plt.scatter(x_values,y_values) plt.title('City Latitude vs. Wind Speed (04/01/20)') plt.xlabel('Latitude') plt.ylabel('Wind Speed (mph)') plt.ylim(0, 40) plt.xlim(-60, 80) plt.minorticks_on() plt.grid(which='major', linestyle='-') plt.grid(which='minor', l...
_____no_output_____
Linear Regression
mask = cities_df['Lat'] > 0 northern_hemisphere = cities_df[mask] southern_hemisphere = cities_df[~mask]
_____no_output_____
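The boolean-mask split into hemispheres can be illustrated on a toy DataFrame (latitudes invented):

```python
import pandas as pd

# Toy stand-in for cities_df with just a latitude column
df = pd.DataFrame({"Lat": [45.0, -33.9, 10.2, -5.5]})

mask = df["Lat"] > 0   # True for northern-hemisphere rows
northern = df[mask]    # rows where the mask is True
southern = df[~mask]   # ~ negates the boolean mask

print(len(northern), len(southern))  # 2 2
```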
Northern Hemisphere - Max Temp vs. Latitude Linear Regression
x_values = northern_hemisphere["Lat"] y_values = northern_hemisphere["Temperature"] (slope, intercept, rvalue, pvalue, stderr) = st.linregress(x_values, y_values) regress_values = x_values * slope + intercept line_eq = "y = " + str(round(slope,2)) + "x + " + str(round(intercept,2)) correlation = st.pearsonr(x_values, ...
_____no_output_____
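A minimal sketch of the `st.linregress` pattern repeated in these regression cells, on perfectly linear made-up data:

```python
import numpy as np
from scipy import stats as st

x_values = np.array([0.0, 10.0, 20.0, 30.0])
y_values = 2.0 * x_values + 1.0  # toy data lying exactly on y = 2x + 1

(slope, intercept, rvalue, pvalue, stderr) = st.linregress(x_values, y_values)
regress_values = x_values * slope + intercept
line_eq = "y = " + str(round(slope, 2)) + "x + " + str(round(intercept, 2))
print(line_eq)  # y = 2.0x + 1.0
```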
Southern Hemisphere - Max Temp vs. Latitude Linear Regression
x_values = southern_hemisphere["Lat"] y_values = southern_hemisphere["Temperature"] (slope, intercept, rvalue, pvalue, stderr) = st.linregress(x_values, y_values) regress_values = x_values * slope + intercept line_eq = "y = " + str(round(slope,2)) + "x + " + str(round(intercept,2)) correlation = st.pearsonr(x_values, ...
_____no_output_____
Northern Hemisphere - Humidity (%) vs. Latitude Linear Regression
x_values = northern_hemisphere["Lat"] y_values = northern_hemisphere["Humidity"] (slope, intercept, rvalue, pvalue, stderr) = st.linregress(x_values, y_values) regress_values = x_values * slope + intercept line_eq = "y = " + str(round(slope,2)) + "x + " + str(round(intercept,2)) correlation = st.pearsonr(x_values, y_v...
_____no_output_____
Southern Hemisphere - Humidity (%) vs. Latitude Linear Regression
x_values = southern_hemisphere["Lat"] y_values = southern_hemisphere["Humidity"] (slope, intercept, rvalue, pvalue, stderr) = st.linregress(x_values, y_values) regress_values = x_values * slope + intercept line_eq = "y = " + str(round(slope,2)) + "x + " + str(round(intercept,2)) correlation = st.pearsonr(x_values, y_v...
_____no_output_____
Northern Hemisphere - Cloudiness (%) vs. Latitude Linear Regression
x_values = northern_hemisphere["Lat"] y_values = northern_hemisphere["Clouds"] (slope, intercept, rvalue, pvalue, stderr) = st.linregress(x_values, y_values) regress_values = x_values * slope + intercept line_eq = "y = " + str(round(slope,2)) + "x + " + str(round(intercept,2)) correlation = st.pearsonr(x_values, y_val...
_____no_output_____
Southern Hemisphere - Cloudiness (%) vs. Latitude Linear Regression
x_values = southern_hemisphere["Lat"] y_values = southern_hemisphere["Clouds"] (slope, intercept, rvalue, pvalue, stderr) = st.linregress(x_values, y_values) regress_values = x_values * slope + intercept line_eq = "y = " + str(round(slope,2)) + "x + " + str(round(intercept,2)) correlation = st.pearsonr(x_values, y_val...
_____no_output_____
Northern Hemisphere - Wind Speed (mph) vs. Latitude Linear Regression
x_values = northern_hemisphere["Lat"] y_values = northern_hemisphere["Wind Speed"] (slope, intercept, rvalue, pvalue, stderr) = st.linregress(x_values, y_values) regress_values = x_values * slope + intercept line_eq = "y = " + str(round(slope,2)) + "x + " + str(round(intercept,2)) correlation = st.pearsonr(x_values, y...
_____no_output_____
Southern Hemisphere - Wind Speed (mph) vs. Latitude Linear Regression
x_values = southern_hemisphere["Lat"] y_values = southern_hemisphere["Wind Speed"] (slope, intercept, rvalue, pvalue, stderr) = st.linregress(x_values, y_values) regress_values = x_values * slope + intercept line_eq = "y = " + str(round(slope,2)) + "x + " + str(round(intercept,2)) correlation = st.pearsonr(x_values, y...
_____no_output_____
Loading sentence-transformers
!pip install -U sentence-transformers import torch from sentence_transformers import SentenceTransformer, models, util model_name = 'sentence-transformers/paraphrase-xlm-r-multilingual-v1' embedding_model = models.Transformer(model_name) pooler = models.Pooling( embedding_model.get_word_embedding_dimension(), poolin...
_____no_output_____
MIT
cohesion_test/[paraphrase_xlm_r_multilingual_v1]sentence_transformer.ipynb
cateto/python4NLP
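Downstream of the model, sentence similarity is usually a cosine between the pooled embedding vectors; a numpy-only sketch of that computation (toy 2-d vectors, not real embeddings):

```python
import numpy as np

def cos_sim(a, b):
    # Cosine similarity between two embedding vectors
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(round(cos_sim([1.0, 0.0], [1.0, 1.0]), 4))  # 0.7071
```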
Filtering nouns, verbs, and adjectives
# Install Mecab-ko on Colab !git clone https://github.com/SOMJANG/Mecab-ko-for-Google-Colab.git %cd Mecab-ko-for-Google-Colab !bash install_mecab-ko_on_colab190912.sh def cleaning(sentence): clean_words = [] for word in okt.pos(sentence, stem=True): if word[1] in ['Noun', 'Verb', 'Adjective']: # nouns, verbs, adjectives clean_wo...
_____no_output_____
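The `cleaning` filter above keeps only tokens whose POS tag is in a whitelist; the same pattern, with a hypothetical pre-tagged list standing in for `okt.pos` output:

```python
def keep_content_words(tagged):
    # tagged: list of (word, pos) pairs, as a POS tagger would return
    keep = {"Noun", "Verb", "Adjective"}
    return [word for word, pos in tagged if pos in keep]

tokens = [("나는", "Noun"), ("매우", "Adverb"), ("좋다", "Adjective")]
print(keep_content_words(tokens))  # ['나는', '좋다']
```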
Tuning the $\kappa$ exponents
h = HeuristicScore() line = N_9x9().setline('e', [2,0,0,1, 1,1,0,0]).as_bits()[0] h.f_range(line, 0), h.cscore(line, 0), h.score(line) # expected: (array([0, 0, 1, 1, 1, 0, 0]), (3, 2), 3.4460950649911055) line = N_9x9().setline('e', [2,0,2,1, 1,1,0,0]).as_bits()[0] h.f_range(line, 0), h.cscore(line, 0), h.score(line)...
_____no_output_____
Apache-2.0
other_stuff/DeepGomoku/TuningKappa.ipynb
Project-Ellie/tutorials
--- Complete score of an N_9x9 neighbourhood
h = HeuristicScore() n1 = (N_9x9() .setline('e', [1,2,0,2, 0,2,1,2]) .setline('ne', [1,2,2,1, 2,0,2,2]) .setline('n', [0,0,0,1, 1,0,0,2]) .setline('nw', [1,2,1,1, 0,1,0,2])) n2 = (N_9x9() .setline('e', [1,2,0,2, 0,2,1,2]) .setline('ne', [1,2,2,1, 2,0,2,2]) .setline('n', [0,0,2,1, 1,0,0,2]) .setline('nw', [1...
|o o o| | o | | x | | x o | |x o o * o x o| | x x x | | o o x | | o o | |x x| [0.0, 0.0, 2.0, 3.0] 3.2710663101885897
---
_You are currently looking at **version 1.2** of this notebook. To download notebooks and datafiles, as well as get help on Jupyter notebooks in the Coursera platform, visit the [Jupyter Notebook FAQ](https://www.coursera.org/learn/python-social-network-analysis/resources/yPcBs) course resource._
---

Assignment 2 - ...
import networkx as nx # This line must be commented out when submitting to the autograder: #!head email_network.txt
_____no_output_____
MIT
Part II/Excercises1.ipynb
ivanarielcaceres/boocamp-networks-python
Question 1

Using networkx, load up the directed multigraph from `email_network.txt`. Make sure the node names are strings.

*This function should return a directed multigraph networkx graph.*
def answer_one(): G = nx.read_edgelist('email_network.txt', delimiter='\t', data=[('timestamp', int)], create_using=nx.MultiDiGraph()) return G
_____no_output_____
Question 2

How many employees and emails are represented in the graph from Question 1?

*This function should return a tuple (employees, emails).*
def answer_two(): G = answer_one() return len(G.nodes()), len(G.edges())
_____no_output_____
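The node/edge counting above behaves as follows on a tiny stand-in graph (edges invented); note that a `MultiDiGraph` keeps parallel edges, so repeated emails between the same pair are counted separately:

```python
import networkx as nx

G = nx.MultiDiGraph()
G.add_edge("1", "2", timestamp=100)
G.add_edge("1", "2", timestamp=200)  # second email on the same channel
G.add_edge("2", "3", timestamp=300)

print(len(G.nodes()), len(G.edges()))  # 3 employees, 3 emails
```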
Question 3

* Part 1. Assume that information in this company can only be exchanged through email. When an employee sends an email to another employee, a communication channel has been created, allowing the sender to provide information to the receiver, but not vice versa. Based on the emails sent in the data, is...
def answer_three(): G = answer_one() return nx.is_strongly_connected(G), nx.is_connected(G.to_undirected())
_____no_output_____
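The strong/weak connectivity distinction can be seen on a minimal one-way chain (toy graph, not the email data):

```python
import networkx as nx

G = nx.MultiDiGraph()
G.add_edges_from([("a", "b"), ("b", "c")])  # one-way chain a -> b -> c

# Not strongly connected: there is no directed path from c back to a
print(nx.is_strongly_connected(G))         # False
# Connected once edge directions are ignored
print(nx.is_connected(G.to_undirected()))  # True
```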
Question 4

How many nodes are in the largest (in terms of nodes) weakly connected component?

*This function should return an int.*
def answer_four(): G = answer_one() wccs = nx.weakly_connected_components(G) return len(max(wccs, key=len))
_____no_output_____
Question 5

How many nodes are in the largest (in terms of nodes) strongly connected component?

*This function should return an int.*
def answer_five(): G = answer_one() sccs = nx.strongly_connected_components(G) return len(max(sccs, key=len))
_____no_output_____
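The `max(..., key=len)` idiom used in Questions 4 and 5 can be checked on a toy digraph where the largest weak and strong components differ:

```python
import networkx as nx

# a <-> b form a directed cycle; c is reachable only one way
G = nx.DiGraph([("a", "b"), ("b", "a"), ("b", "c")])

largest_wcc = max(nx.weakly_connected_components(G), key=len)
largest_scc = max(nx.strongly_connected_components(G), key=len)
print(len(largest_wcc), len(largest_scc))  # 3 2
```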
Question 6

Using the NetworkX function strongly_connected_component_subgraphs, find the subgraph of nodes in a largest strongly connected component. Call this graph G_sc.

*This function should return a networkx MultiDiGraph named G_sc.*
def answer_six(): G = answer_one() scc_subs = nx.strongly_connected_component_subgraphs(G) G_sc = max(scc_subs, key=len) return G_sc
_____no_output_____
Question 7

What is the average distance between nodes in G_sc?

*This function should return a float.*
def answer_seven(): G = answer_six() return nx.average_shortest_path_length(G)
_____no_output_____
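`average_shortest_path_length` averages the shortest-path distance over all ordered node pairs; on a directed 3-cycle (toy graph) each node reaches one neighbour at distance 1 and the other at distance 2:

```python
import networkx as nx

G = nx.DiGraph([("a", "b"), ("b", "c"), ("c", "a")])
avg = nx.average_shortest_path_length(G)
print(avg)  # (3*1 + 3*2) / 6 = 1.5
```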
Question 8

What is the largest possible distance between two employees in G_sc?

*This function should return an int.*
def answer_eight(): G = answer_six() return nx.diameter(G)
_____no_output_____
Question 9

What is the set of nodes in G_sc with eccentricity equal to the diameter?

*This function should return a set of the node(s).*
def answer_nine(): G = answer_six() return set(nx.periphery(G))
_____no_output_____
Question 10

What is the set of node(s) in G_sc with eccentricity equal to the radius?

*This function should return a set of the node(s).*
def answer_ten(): G = answer_six() return set(nx.center(G))
_____no_output_____
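Diameter, periphery, and center (Questions 8-10) are easy to verify on a small path graph, where the endpoints are most eccentric and the midpoint least:

```python
import networkx as nx

G = nx.path_graph(5)  # 0 - 1 - 2 - 3 - 4
print(nx.diameter(G))        # 4
print(set(nx.periphery(G)))  # {0, 4}: eccentricity equals the diameter
print(set(nx.center(G)))     # {2}: eccentricity equals the radius
```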
Question 11

Which node in G_sc is connected to the most other nodes by a shortest path of length equal to the diameter of G_sc?

How many nodes are connected to this node?

*This function should return a tuple (name of node, number of satisfied connected nodes).*
def answer_eleven(): G = answer_six() d = nx.diameter(G) peripheries = nx.periphery(G) max_count = -1 result_node = None for node in peripheries: sp = nx.shortest_path_length(G, node) count = list(sp.values()).count(d) if count > max_count: result_node = node ...
_____no_output_____
Question 12

Suppose you want to prevent communication from flowing to the node that you found in the previous question from any node in the center of G_sc, what is the smallest number of nodes you would need to remove from the graph (you're not allowed to remove the node from the previous question or the center nodes)?...
def answer_twelve(): G = answer_six() center = nx.center(G)[0] node = answer_eleven()[0] return len(nx.minimum_node_cut(G, center, node))
_____no_output_____
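`minimum_node_cut` returns a smallest set of intermediate nodes whose removal disconnects source from target; on a toy graph with two disjoint routes, both middle nodes must go:

```python
import networkx as nx

# Two node-disjoint paths from s to t: s-a-t and s-b-t
G = nx.Graph([("s", "a"), ("a", "t"), ("s", "b"), ("b", "t")])
cut = nx.minimum_node_cut(G, "s", "t")
print(len(cut))  # 2
```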
Question 13

Construct an undirected graph G_un using G_sc (you can ignore the attributes).

*This function should return a networkx Graph.*
def answer_thirteen(): G = answer_six() undir_subgraph = G.to_undirected() G_un = nx.Graph(undir_subgraph) return G_un
_____no_output_____
Question 14

What is the transitivity and average clustering coefficient of graph G_un?

*This function should return a tuple (transitivity, avg clustering).*
def answer_fourteen(): G = answer_thirteen() return nx.transitivity(G), nx.average_clustering(G)
_____no_output_____
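Transitivity and average clustering weight triangles differently; a triangle with one pendant node (toy graph) shows the two measures disagreeing:

```python
import networkx as nx

# A triangle 1-2-3 plus a pendant node 4 attached to 3
G = nx.Graph([(1, 2), (2, 3), (1, 3), (3, 4)])

t = nx.transitivity(G)       # 3 * triangles / connected triples = 3/5
c = nx.average_clustering(G) # mean of per-node clustering = 7/12
print(t, round(c, 3))
```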
Alpha - Beta calculation with initial velocity determination

This notebook allows you to calculate the basic dimensionless $\alpha - \beta$ parameters for a set of fireball data after [Sansom et al. 2019](https://doi.org/10.3847/1538-4357/ab4516) and [Gritsevich 2012](https://doi.org/10.1134/S0010952512010017). This u...
# import astropy import scipy import numpy as np import matplotlib.pyplot as plt from scipy.optimize import minimize, basinhopping from astropy.table import Table, vstack from astropy import units as u from ipywidgets import interact, interactive, fixed, interact_manual import ipywidgets as widgets import os, glob from...
_____no_output_____
MIT
alpha_beta_v0_fun.ipynb
desertfireballnetwork/alpha_beta_modules
Function definitions
def Q4_min(vvals, yvals, err=[]): """ initiates and calls the Q4 minimisation given in Gritsevich 2007 - 'Validity of the photometric formula for estimating the mass of a fireball projectile' """ b0 = 1. a0 = np.exp(yvals[-1])/(2. * b0) x0 = [a0, b0] xmin = [0.01, 0.0001] xmax = [1...
_____no_output_____