Step 4. See the first 5 entries
food.head()
_____no_output_____
BSD-3-Clause
01_Getting_&_Knowing_Your_Data/World Food Facts/Exercises_with_solutions.ipynb
KarimaCha/pandas_exercises
Step 5. What is the number of observations in the dataset?
food.shape     # gives you both (observations/rows, columns)
food.shape[0]  # gives you only the number of observations/rows
_____no_output_____
BSD-3-Clause
01_Getting_&_Knowing_Your_Data/World Food Facts/Exercises_with_solutions.ipynb
KarimaCha/pandas_exercises
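The `shape` pattern from the last two steps can be checked on a tiny stand-in frame (hypothetical data; the real Food Facts frame has 356027 rows and 163 columns):

```python
import pandas as pd

# Toy frame standing in for `food` (hypothetical values)
df = pd.DataFrame({"a": [1, 2, 3], "b": [4.0, 5.0, 6.0]})

rows, cols = df.shape   # shape is a (rows, columns) tuple
```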
Step 6. What is the number of columns in the dataset?
print(food.shape)     # gives you both (observations/rows, columns)
print(food.shape[1])  # gives you only the number of columns
# OR
food.info()           # Columns: 163 entries
(356027, 163) 163 <class 'pandas.core.frame.DataFrame'> RangeIndex: 356027 entries, 0 to 356026 Columns: 163 entries, code to water-hardness_100g dtypes: float64(107), object(56) memory usage: 442.8+ MB
BSD-3-Clause
01_Getting_&_Knowing_Your_Data/World Food Facts/Exercises_with_solutions.ipynb
KarimaCha/pandas_exercises
Step 7. Print the names of all the columns.
food.columns
_____no_output_____
BSD-3-Clause
01_Getting_&_Knowing_Your_Data/World Food Facts/Exercises_with_solutions.ipynb
KarimaCha/pandas_exercises
Step 8. What is the name of the 105th column?
food.columns[104]
_____no_output_____
BSD-3-Clause
01_Getting_&_Knowing_Your_Data/World Food Facts/Exercises_with_solutions.ipynb
KarimaCha/pandas_exercises
Step 9. What is the type of the observations of the 105th column?
food.dtypes['-glucose_100g']
_____no_output_____
BSD-3-Clause
01_Getting_&_Knowing_Your_Data/World Food Facts/Exercises_with_solutions.ipynb
KarimaCha/pandas_exercises
Step 10. How is the dataset indexed?
food.index
_____no_output_____
BSD-3-Clause
01_Getting_&_Knowing_Your_Data/World Food Facts/Exercises_with_solutions.ipynb
KarimaCha/pandas_exercises
Step 11. What is the product name of the 19th observation?
food.values[18][7]  # equivalently: food.iloc[18, 7]
_____no_output_____
BSD-3-Clause
01_Getting_&_Knowing_Your_Data/World Food Facts/Exercises_with_solutions.ipynb
KarimaCha/pandas_exercises
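The solution above works, but `food.values` materializes the whole frame as a NumPy array first; `.iloc` does the same positional lookup directly. A sketch on a toy stand-in frame (hypothetical values, not the real dataset):

```python
import pandas as pd

# Toy stand-in for the Food Facts frame (hypothetical values)
food = pd.DataFrame({"code": [101, 102, 103],
                     "product_name": ["tea", "coffee", "cocoa"]})

via_values = food.values[1][1]  # converts the frame to a NumPy array first
via_iloc = food.iloc[1, 1]      # idiomatic positional indexing, same result
```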
Robot Class

In this project, we'll be localizing a robot in a 2D grid world. The basis for simultaneous localization and mapping (SLAM) is to gather information from a robot's sensors and motions over time, and then use information about measurements and motion to reconstruct a map of the world.

Uncertainty

As you've l...
# import some resources import numpy as np import matplotlib.pyplot as plt import random %matplotlib inline # the robot class class robot: # -------- # init: # creates a robot with the specified parameters and initializes # the location (self.x, self.y) to the center of the world # def __...
_____no_output_____
MIT
1. Robot Moving and Sensing.ipynb
dgander000/P3_Implement_SLAM
Define a world and a robot

Next, let's instantiate a robot object. As you can see in `__init__` above, the robot class takes in a number of parameters including a world size and some values that indicate the sensing and movement capabilities of the robot. In the next example, we define a small 10x10 square world, a meas...
world_size = 10.0 # size of world (square) measurement_range = 5.0 # range at which we can sense landmarks motion_noise = 0.2 # noise in robot motion measurement_noise = 0.2 # noise in the measurements # instantiate a robot, r r = robot(world_size, measurement_range, motion_noise, meas...
Robot: [x=5.00000 y=5.00000]
MIT
1. Robot Moving and Sensing.ipynb
dgander000/P3_Implement_SLAM
Visualizing the World

In the given example, we can see/print out that the robot is in the middle of the 10x10 world at (x, y) = (5.0, 5.0), which is exactly what we expect! However, it's kind of hard to imagine this robot in the center of a world without visualizing the grid itself, and so in the next cell we provide a...
# import helper function
from helpers import display_world

# define figure size
plt.rcParams["figure.figsize"] = (5,5)

# call display_world and display the robot in its grid world
print(r)
display_world(int(world_size), [r.x, r.y])
Robot: [x=5.00000 y=5.00000]
MIT
1. Robot Moving and Sensing.ipynb
dgander000/P3_Implement_SLAM
Movement

Now you can really picture where the robot is in the world! Next, let's call the robot's `move` function. We'll ask it to move some distance `(dx, dy)` and we'll see that this motion is not perfect by the placement of our robot `o` and by the printed out position of `r`. Try changing the values of `dx` and `dy...
# choose values of dx and dy (negative works, too)
dx = 1
dy = 2
r.move(dx, dy)

# print out the exact location
print(r)

# display the world after movement; note that this is the same call as before
# the robot tracks its own movement
display_world(int(world_size), [r.x, r.y])
Robot: [x=6.13189 y=6.86628]
MIT
1. Robot Moving and Sensing.ipynb
dgander000/P3_Implement_SLAM
Landmarks

Next, let's create landmarks, which are measurable features in the map. You can think of landmarks as things like notable buildings, or something smaller such as a tree, rock, or other feature. The robot class has a function `make_landmarks` which randomly generates locations for the number of specified landma...
# create any number of landmarks
num_landmarks = 3
r.make_landmarks(num_landmarks)

# print out our robot's exact location
print(r)

# display the world including these landmarks
display_world(int(world_size), [r.x, r.y], r.landmarks)

# print the locations of the landmarks
print('Landmark locations [x,y]: ', r.landmar...
Robot: [x=6.13189 y=6.86628]
MIT
1. Robot Moving and Sensing.ipynb
dgander000/P3_Implement_SLAM
Sense

Once we have some landmarks to sense, we need to be able to tell our robot to *try* to sense how far away they are from it. It will be up to you to code the `sense` function in our robot class. The `sense` function uses only internal class parameters and returns a list of the measured/sensed x and y distances t...
# try to sense any surrounding landmarks measurements = r.sense() # this will print out an empty list if `sense` has not been implemented print(measurements)
[[0, 1.9448626199060042, -1.0031616156828798], [2, 0.8032812034875207, 2.1645375343944453]]
MIT
1. Robot Moving and Sensing.ipynb
dgander000/P3_Implement_SLAM
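The notebook leaves `sense` for the reader to implement. As a minimal standalone sketch of the idea (a plain function rather than the class method; the `[index, dx, dy]` output shape and range filtering follow the description above, while the uniform-noise model and names are assumptions):

```python
import random

def sense(x, y, landmarks, measurement_range, measurement_noise):
    """Sketch: return [landmark_index, dx, dy] for landmarks within range."""
    measurements = []
    for i, (lx, ly) in enumerate(landmarks):
        # distance to the landmark plus uniform noise in [-noise, +noise]
        dx = lx - x + (random.random() * 2 - 1) * measurement_noise
        dy = ly - y + (random.random() * 2 - 1) * measurement_noise
        if abs(dx) <= measurement_range and abs(dy) <= measurement_range:
            measurements.append([i, dx, dy])
    return measurements

# With zero noise the output is deterministic; the far landmark is filtered out
m = sense(5.0, 5.0, [(6.0, 7.0), (20.0, 20.0)], 5.0, 0.0)
```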
**Refer back to the grid map above. Do these measurements make sense to you? Are all the landmarks captured in this list (why/why not)?**

---

Data

Putting it all together

To perform SLAM, we'll collect a series of robot sensor measurements and motions, in that order, over a defined period of time. Then we'll use only th...
data = []

# after a robot first senses, then moves (one time step)
# that data is appended like so:
data.append([measurements, [dx, dy]])

# for our example movement and measurement
print(data)

# in this example, we have only created one time step (0)
time_step = 0

# so you can access robot measurements:
print('Measu...
Measurements: [[0, 1.9448626199060042, -1.0031616156828798], [2, 0.8032812034875207, 2.1645375343944453]] Motion: [1, 2]
MIT
1. Robot Moving and Sensing.ipynb
dgander000/P3_Implement_SLAM
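The data log described above is just a nested list: one `[measurements, motion]` pair per time step. A self-contained sketch of that structure with made-up values:

```python
# One time step: sense first, then move (values here are made up)
measurements = [[0, 1.94, -1.00], [2, 0.80, 2.16]]
dx, dy = 1, 2

data = []
data.append([measurements, [dx, dy]])

time_step = 0
print('Measurements:', data[time_step][0])  # list of [index, dx, dy]
print('Motion:', data[time_step][1])        # [dx, dy]
```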
prepared by Maksim Dimitrijev (QLatvia)

This cell contains some macros. If there is a problem with displaying mathematical formulas, please run this cell to load these macros. $ \newcommand{\bra}[1]{\langle #1|} $ $ \newcommand{\ket}[1]{|#1\rangle} $ $ \newcommand{\...
# # your solution is here #
_____no_output_____
Apache-2.0
silver/C06_State_Conversion_And_Visualization.ipynb
asif-saad/qSilver
click for our solution

Visualization

We can visualize the state $\ket{\psi} = \cos{\frac{\theta}{2}} \ket{0} + e^{i\phi} \sin{\frac{\theta}{2}} \ket{1}$ by separately drawing the angles $\frac{\theta}{2}$ and $\phi$. In the next notebook we will combine the visualization of both angles. Suppose that we have $\theta = \frac...
theta = 90  # pi/2
myangle = theta / 2

from matplotlib.pyplot import figure, gca
from matplotlib.patches import Arc
from math import sin, cos, pi

%run qlatvia.py
%run drawing.py

# matplotlib.pyplot.subplot(1, 2, 1)
figure(figsize=(6,6), dpi=60)
draw_real_part()
gca().add_patch( Arc((0,0),2,2,angle=0,theta1=0,theta2=90,color...
_____no_output_____
Apache-2.0
silver/C06_State_Conversion_And_Visualization.ipynb
asif-saad/qSilver
After that we draw angle $\phi$ to see the local phase.
from matplotlib.pyplot import figure,gca from matplotlib.patches import Arc from math import sin,cos,pi %run qlatvia.py %run drawing.py phi = 240 draw_imaginary_part() gca().add_patch( Arc((0,0),2,2,angle=0,theta1=0,theta2=phi,color="blue",linewidth=2) ) myangle_in_radian = 2*pi*(phi/360) x = cos(myangle_in_radian) ...
_____no_output_____
Apache-2.0
silver/C06_State_Conversion_And_Visualization.ipynb
asif-saad/qSilver
As you can see, the visualization of a complex quantum state in two parts is quite illustrative. The visualization of the angle $\frac{\theta}{2}$ gives us a sense of which basis state has the larger amplitude, and therefore the higher probability of being observed. The angle $\phi$ clearly represents the local phase, and also gives an idea a...
# # your solution is here #
_____no_output_____
Apache-2.0
silver/C06_State_Conversion_And_Visualization.ipynb
asif-saad/qSilver
click for our solution

Task 3

Implement the code to visualize an arbitrary state $\ket{\psi} = \cos{\frac{\theta}{2}} \ket{0} + e^{i\phi} \sin{\frac{\theta}{2}} \ket{1}$. Test it with angles $\frac{\theta}{2} = \frac{\pi/2}{2}$ and $\phi = \frac{4\pi}{3}$.
# # your solution is here #
_____no_output_____
Apache-2.0
silver/C06_State_Conversion_And_Visualization.ipynb
asif-saad/qSilver
click for our solution

Task 4

Implement the code to visualize an arbitrary state $\ket{\psi} = \alpha \ket{0} + \beta \ket{1}$. You can do the conversion first, and then use the visualization from the previous task. Test it with the state $\frac{1}{\sqrt{2}} \ket{0} + \frac{1}{\sqrt{2}}i \ket{1}$.
# # your solution is here #
_____no_output_____
Apache-2.0
silver/C06_State_Conversion_And_Visualization.ipynb
asif-saad/qSilver
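As a worked sketch of the conversion Task 4 asks for (this derivation is not from the notebook; it holds up to a global phase):

```latex
% Any state \alpha\ket{0} + \beta\ket{1} can be written, up to a global phase,
% as \cos\frac{\theta}{2}\ket{0} + e^{i\phi}\sin\frac{\theta}{2}\ket{1} with
\theta = 2 \arccos |\alpha|, \qquad \phi = \arg(\beta) - \arg(\alpha).
% For the test state \alpha = \tfrac{1}{\sqrt{2}},\ \beta = \tfrac{i}{\sqrt{2}}:
%   \theta = 2 \arccos \tfrac{1}{\sqrt{2}} = \tfrac{\pi}{2}, \qquad \phi = \tfrac{\pi}{2}.
```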
Data Science 2

Numerical analysis - Differential equations

The following material is more elaborately covered in Chapter 7 - *Initial value problems* of the book *Numerical methods in engineering with Python 3* by Jaan Kiusalaas (see BlackBoard).

Introduction

A [differential equation](https://www.wikiwand.com/en/Differ...
import numpy as np def euler(f, y0, x0, x1, steps): """xs, ys = euler(f, y0, x0, x1, steps). Euler's method for solving the initial value problem {y}' = {f(x,{y})}, where {y} = {y[0],y[1],...,y[n-1]}. x0, y0 = initial conditions x1 = terminal value of x steps = number of integration st...
_____no_output_____
MIT
Numerical_analysis/Lessons/5 - Differential equations - st.ipynb
StevetheGreek97/Master_DSLS
We apply the method to an example where we look for a solution to the system of equations $y_0' = -2y_1$ and $y_1' = 2y_0$ with starting values $y_0(0) = 1$ and $y_1(0) = 0$. Because the exact solution equals $y_0(x) = \cos(2x)$ and $y_1(x) = \sin(2x)$ (i.e. it describes a two-dimensional circular motion) the solutio...
# Example: Solve {y}' = {-2*y1, 2*y0} with y(0) = {1, 0}
func = lambda x, y: np.array([-2.0 * y[1], 2.0 * y[0]])
x0, x1, y0 = 0.0, np.pi, np.array([1.0, 0.0])

%matplotlib inline
import matplotlib.pyplot as plt

grid = np.linspace(-1.5, 1.5, 16)
# for gx in grid:
#     for gy in grid:
#         print(func(x0, np.array([...
_____no_output_____
MIT
Numerical_analysis/Lessons/5 - Differential equations - st.ipynb
StevetheGreek97/Master_DSLS
We can investigate the behaviour of the error of the method by varying the step size $h$. Verify below that the order of Euler's method is $\mathcal{O}(h)$; therefore, it is a first-order method.
ns = [1, 10, 100, 1000, 10000, 100000] for n in ns: xs, ys = euler(func, y0, x0, x1, n) print(f'n = {n:6}: |Δy| = {np.linalg.norm(ys[0]-ys[-1]):8.1e}')
n = 1: |Δy| = 6.3e+00 n = 10: |Δy| = 4.5e+00 n = 100: |Δy| = 2.2e-01 n = 1000: |Δy| = 2.0e-02 n = 10000: |Δy| = 2.0e-03 n = 100000: |Δy| = 2.0e-04
MIT
Numerical_analysis/Lessons/5 - Differential equations - st.ipynb
StevetheGreek97/Master_DSLS
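The first-order behaviour can be verified on a scalar toy problem as well. A minimal sketch (pure Python, independent of the notebook's `euler`, using $y' = y$, $y(0) = 1$, whose exact solution at $x = 1$ is $e$): halving the step size should roughly halve the error.

```python
import math

def euler_scalar(f, y0, x0, x1, steps):
    """Minimal scalar Euler integrator: derivative taken at the start of each step."""
    h = (x1 - x0) / steps
    x, y = x0, y0
    for _ in range(steps):
        y = y + h * f(x, y)
        x = x + h
    return y

# y' = y, y(0) = 1  ->  exact value at x = 1 is e
errs = [abs(euler_scalar(lambda x, y: y, 1.0, 0.0, 1.0, n) - math.e)
        for n in (100, 200, 400)]
ratios = [e1 / e2 for e1, e2 in zip(errs, errs[1:])]
print(ratios)  # each ratio is close to 2, confirming O(h)
```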
Heun's method

As demonstrated in the figure above, the accuracy of Euler's method is limited. The reason is that the derivative $\boldsymbol{y}'$ is only calculated at the beginning of the step, whereas it is assumed to apply throughout the entire step. Therefore, it cannot account for any changes in $\boldsymbol{y}'$ ...
def heun(f, y0, x0, x1, steps): """xs, ys = heun(f, y0, x0, x1, steps). Heun's method for solving the initial value problem {y}' = {f(x,{y})}, where {y} = {y[0],y[1],...,y[n-1]}. x0, y0 = initial conditions x1 = terminal value of x steps = number of integration steps f = user-s...
n = 1: |Δy| = 2.1e+01 n = 10: |Δy| = 4.5e-01 n = 100: |Δy| = 4.1e-03 n = 1000: |Δy| = 4.1e-05 n = 10000: |Δy| = 4.1e-07 n = 100000: |Δy| = 4.1e-09
MIT
Numerical_analysis/Lessons/5 - Differential equations - st.ipynb
StevetheGreek97/Master_DSLS
4th-order Runge-Kutta Method

The idea of using intermediate evaluations of the derivative $\boldsymbol{y}'$ for the integration can be further extended. For example, the [fourth-order Runge-Kutta method](https://en.wikipedia.org/wiki/Runge%E2%80%93Kutta_methods) evaluates four different derivatives according to the fol...
def runge_kutta(f, y0, x0, x1, steps): """xs, ys = runge_kutta(f, y0, x0, x1, steps). 4th-order Runge-Kutta method for solving the initial value problem {y}' = {f(x,{y})}, where {y} = {y[0],y[1],...,y[n-1]}. x0, y0 = initial conditions x1 = terminal value of x steps = number of integrat...
n = 1: |Δy| = 5.7e+01 n = 10: |Δy| = 8.1e-03 n = 100: |Δy| = 8.2e-07 n = 1000: |Δy| = 8.2e-11 n = 10000: |Δy| = 7.0e-15 n = 100000: |Δy| = 1.6e-14
MIT
Numerical_analysis/Lessons/5 - Differential equations - st.ipynb
StevetheGreek97/Master_DSLS
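A single classical RK4 step can be sketched in isolation (pure Python, scalar; this is the standard scheme, not the notebook's truncated `runge_kutta`). On $y' = y$ with $h = 0.1$, ten steps reproduce $e$ to within a few times $10^{-6}$, illustrating the fourth-order accuracy:

```python
import math

def rk4_step(f, x, y, h):
    """One classical 4th-order Runge-Kutta step."""
    k0 = f(x, y)
    k1 = f(x + h / 2, y + h / 2 * k0)
    k2 = f(x + h / 2, y + h / 2 * k1)
    k3 = f(x + h, y + h * k2)
    return y + h / 6 * (k0 + 2 * k1 + 2 * k2 + k3)

# Integrate y' = y from x = 0 to 1 and compare with the exact value e
y, x, h = 1.0, 0.0, 0.1
for _ in range(10):
    y = rk4_step(lambda x, y: y, x, y, h)
    x += h
err = abs(y - math.e)
```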
The Runge-Kutta Method Family

All of the above algorithms form special cases in a more general [Runge-Kutta family of methods](https://en.wikipedia.org/wiki/List_of_Runge%E2%80%93Kutta_methods) that calculate any number of intermediate derivatives according to the following equations $$\begin{aligned}\boldsymbol{k}_0 &=...
def midpoint(f, y0, x0, x1, steps):
    h = (x1 - x0) / steps
    xs = np.linspace(x0, x1, steps + 1)
    y = y0
    ys = [y]
    for x in xs[:-1]:
        k1 = f(x, y)
        k2 = f(x + (h/2), y + (h/2)*k1)
        y = y + h*k2
        ys.append(y)
    return xs, ys

# Example: Solve {y}' = {-2*y...
n = 1: |Δy| = 2.1e+01 n = 10: |Δy| = 4.5e-01 n = 100: |Δy| = 4.1e-03 n = 1000: |Δy| = 4.1e-05 n = 10000: |Δy| = 4.1e-07 n = 100000: |Δy| = 4.1e-09
MIT
Numerical_analysis/Lessons/5 - Differential equations - st.ipynb
StevetheGreek97/Master_DSLS
**Exercise 5**

Pick one of the methods from the above table that has not been implemented yet. Write a function `my_method` that integrates a differential equation given by some function $\boldsymbol{f}(x, \boldsymbol{y})$ over the interval from `x0` to `x1` in a given number of steps using that method, starting from a ...
def ralston(f, y0, x0, x1, steps):
    """xs, ys = ralston(f, y0, x0, x1, steps)."""
    h = (x1 - x0) / steps
    xs = np.linspace(x0, x1, steps + 1)
    y = y0
    ys = [y]
    for x in xs[:-1]:
        # Initial calculation
        k1 = f(x, y)
        # Middle calculations
        k2 = f(x + ((3*h)/4), y + ((3*h)/4)*k1)
        ...
n = 1: |Δy| = 2.1e+01 n = 10: |Δy| = 4.5e-01 n = 100: |Δy| = 4.1e-03 n = 1000: |Δy| = 4.1e-05 n = 10000: |Δy| = 4.1e-07 n = 100000: |Δy| = 4.1e-09
MIT
Numerical_analysis/Lessons/5 - Differential equations - st.ipynb
StevetheGreek97/Master_DSLS
Exercises

**Exercise 6**

Solve the differential equation $y' = 3y - 4e^{-x}$ with initial value $y(0) = 1$ numerically from $x=0$ to $4$ in steps of $h=0.01$. Compare the result with the exact analytical solution $y = e^{-x}$. Does it make a visible difference which solver you use? Can you understand what is happening? (...
yo = lambda x, y: (3.0 * y) - (4 * np.e**(-x))

def y(x):
    return np.e**(-x)

x = np.linspace(-1., 5., 50)
y0 = 1.0
x0 = 0.0
x1 = 4.0
h = 0.01
steps = int((x1 - x0) / h)
steps

xeul, yeul = euler(yo, y0, x0, x1, steps)
xheun, yheun = heun(yo, y0, x0, x1, steps)
xkutta, ykutta = runge_kutta(yo, y0, x0, x1, steps)

ns = ...
_____no_output_____
MIT
Numerical_analysis/Lessons/5 - Differential equations - st.ipynb
StevetheGreek97/Master_DSLS
**Exercise 7**

The [Lotka-Volterra equations](https://en.wikipedia.org/wiki/Lotka%E2%80%93Volterra_equations) describe the dynamics of biological systems in which two species interact, one as a predator and the other as prey. The populations change through time according to the pair of equations $$\begin{aligned}\frac{dx...
from scipy.integrate import solve_ivp

def model(t, v):
    return [
        ((2/3)*v[0]) - ((4/3)*v[0]*v[1]),
        (v[0]*v[1]) - v[1]
    ]

fromto = (0.0, 60.0)
start = [2/3, 2/3]
times = np.linspace(0.0, 60.0, 501)
solution = solve_ivp(model, fromto, start, t_eval=times)

plt.plot(solution.t, solution.y[0], '...
_____no_output_____
MIT
Numerical_analysis/Lessons/5 - Differential equations - st.ipynb
StevetheGreek97/Master_DSLS
**Exercise 8**

The [Bessel function $J_0$](https://en.wikipedia.org/wiki/Bessel_function) is given by the 2nd-order differential equation $$J_0'' + \frac{1}{x} \cdot J_0' + J_0 = 0$$ with initial values $J_0(0) = 1$ and $J_0'(0) = 0$. How many zeroes does this function have in the range $x = 0$ to $25$? (Hint: To avoid the...
y0 = np.array([1.0, 0.0])
x0 = 1e-12
x1 = 25
h = 0.1
steps = int((x1 - x0) / h)
steps
_____no_output_____
MIT
Numerical_analysis/Lessons/5 - Differential equations - st.ipynb
StevetheGreek97/Master_DSLS
I eventually want to do text analysis with the Kickstarter data, but I'll need to do some data cleaning and text preprocessing before I can do so.
import psycopg2
import pandas as pd
import nltk
import re
_____no_output_____
MIT
src/ipynb/day49_text_preprocessing.ipynb
csiu/kick
Load data

Load data from the database. The list of columns was found on day44.
dbname = "kick"
tblname = "info"

# Connect to database
conn = psycopg2.connect(dbname=dbname)
cur = conn.cursor()

colnames = ["id", "name", "blurb"]
cur.execute("SELECT {col} FROM {tbl}".format(col=', '.join(colnames), tbl=tblname))
rows = cur.fetchall()

pd.DataFrame(rows, columns=colnames).head()
_____no_output_____
MIT
src/ipynb/day49_text_preprocessing.ipynb
csiu/kick
I want to combine `name` and `blurb`. We can use the `concat_ws` function in Postgres; note that its first argument is the separator.
# Treat name + blurb as 1 document
# (concat_ws takes the separator as its first argument)
cur.execute("SELECT id, concat_ws(' ', name, blurb) FROM info")
rows = cur.fetchall()
df = pd.DataFrame(rows, columns=["id", "document"])
df.head()

# close communication
cur.close()
conn.close()

# Number of documents
df.shape
_____no_output_____
MIT
src/ipynb/day49_text_preprocessing.ipynb
csiu/kick
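Because Postgres's `concat_ws(sep, val1, val2, ...)` treats its first argument as the separator and skips NULL values, its behaviour can be mimicked in plain Python to see what the combined document should look like (a sketch with made-up field values):

```python
def concat_ws(sep, *fields):
    """Mimic Postgres concat_ws: join non-NULL (non-None) fields with a separator."""
    return sep.join(str(f) for f in fields if f is not None)

# Combine a hypothetical name and blurb into one document
doc = concat_ws(' ', 'SciFi anthology', 'Save the world')
```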
Text processing for 1 document
text = df["document"][1]
text
_____no_output_____
MIT
src/ipynb/day49_text_preprocessing.ipynb
csiu/kick
To lower case
text = text.lower()
text
_____no_output_____
MIT
src/ipynb/day49_text_preprocessing.ipynb
csiu/kick
Bag of words & tokenization

Non-letter characters (including digits) are also removed.
words = nltk.wordpunct_tokenize(re.sub('[^a-zA-Z_ ]', '', text))
words
_____no_output_____
MIT
src/ipynb/day49_text_preprocessing.ipynb
csiu/kick
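The regex step can be checked on a made-up document (standard library only; `str.split` stands in for `nltk.wordpunct_tokenize`, which behaves the same here because punctuation has already been stripped):

```python
import re

# Hypothetical document text
text = "kickstart 2017: fund our sci-fi anime_series!"

# Keep only letters, underscores and spaces (digits go too), then tokenize
words = re.sub('[^a-zA-Z_ ]', '', text).split()
print(words)
```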
Remove stopwords

Reference: https://www.kaggle.com/c/word2vec-nlp-tutorial/details/part-1-for-beginners-bag-of-words
from nltk.corpus import stopwords

english_stopwords = stopwords.words("english")
print(len(english_stopwords))
print(english_stopwords)
153 ['i', 'me', 'my', 'myself', 'we', 'our', 'ours', 'ourselves', 'you', 'your', 'yours', 'yourself', 'yourselves', 'he', 'him', 'his', 'himself', 'she', 'her', 'hers', 'herself', 'it', 'its', 'itself', 'they', 'them', 'their', 'theirs', 'themselves', 'what', 'which', 'who', 'whom', 'this', 'that', 'these', 'those', 'a...
MIT
src/ipynb/day49_text_preprocessing.ipynb
csiu/kick
We have a list of 153 English stopwords.
# Remove stopwords from document
words = [w for w in words if w not in english_stopwords]
words
_____no_output_____
MIT
src/ipynb/day49_text_preprocessing.ipynb
csiu/kick
Stemming vs Lemmatization

Reference: http://stackoverflow.com/questions/771918/how-do-i-do-word-stemming-or-lemmatization
from nltk.stem import PorterStemmer, WordNetLemmatizer

port = PorterStemmer()
wnl = WordNetLemmatizer()

## Stemming
[port.stem(w) for w in words]

## Lemmatizing
[wnl.lemmatize(w) for w in words]
_____no_output_____
MIT
src/ipynb/day49_text_preprocessing.ipynb
csiu/kick
Putting it all together
def text_processing(text, method=None):
    # Lower case
    text = text.lower()

    # Remove non-letters & tokenize
    words = nltk.wordpunct_tokenize(re.sub('[^a-zA-Z_ ]', '', text))

    # Remove stop words
    words = [w for w in words if w not in stopwords.words("english")]

    # Stemming ...
_____no_output_____
MIT
src/ipynb/day49_text_preprocessing.ipynb
csiu/kick
- Some words are untouched:
    - scifi
    - save
    - world
- Some words are touched only in stemming:
    - fantasy->fantasi
    - anime->anim
    - styled->style
    - series->seri
    - trying->tri
    - probably->probabl
- Agreement of stemming and lemmatizing:
    - guys->guy

---

(Aside) How does stemming compare for o...
[port.stem(w) for w in ["trying", "triangle", "triple"]]
[port.stem(w) for w in ["series", "serious"]]
_____no_output_____
MIT
src/ipynb/day49_text_preprocessing.ipynb
csiu/kick
Run the following code and pay attention to the output.
import turtle
import random
from turtle import *
from time import sleep

# t = turtle.Turtle()
# w = turtle.Screen()

def tree(branchLen, t):
    if branchLen > 3:
        if 8 <= branchLen <= 12:
            if random.randint(0, 2) == 0:
                t.color('snow')
            else:
                t.color('light...
_____no_output_____
Apache-2.0
python/jupyternotebook/0.0 annaconda_test.ipynb
WhitePhosphorus4/xh-learning-code
uniBachelors
x = df.uniBachelors.value_counts()
pd.set_option('display.max_rows', len(x))
print(x)
pd.reset_option('display.max_rows')

df.uniBachelors.replace(u'دانشگاه', '', inplace=True)
df.uniBachelors = df.uniBachelors.str.lower()
df['uniBachelorsOLD'] = df.uniBachelors.copy()

def renamer(table):
    for i in df.index:
        f = df.get...
_____no_output_____
MIT
03_DataPreprocessing/09_Uni/2_Fill_FarsiUni.ipynb
yazdipour/DM17
uniMasters
df['uniMastersOLD'] = df.uniMasters.copy()
df.uniMasters.value_counts().head()
renamer('uniMasters')
df.uniMasters.value_counts()

for i in df.index:
    f = df.get_value(i, 'uniMasters').strip()
    if f in ['tehran','sharif','beheshti','amir-kabir','tarbiat-modares','isfahan','elmo-sanat','toosi','mashhad','tabriz']:
        ...
_____no_output_____
MIT
03_DataPreprocessing/09_Uni/2_Fill_FarsiUni.ipynb
yazdipour/DM17
getCikFilings

> Get filings list for a CIK using the SEC's RESTful API.
#hide
%load_ext autoreload
%autoreload 2
from nbdev import show_doc

#export
from secscan import utils
_____no_output_____
Apache-2.0
04_getCikFilings.ipynb
ikedim01/secscan
Download and parse a list of filings for a CIK:
#export
def getRecent(cik):
    cik = str(cik).lstrip('0')
    restFilingsUrl = f'/submissions/CIK{cik.zfill(10)}.json'
    filingsJson = utils.downloadSecUrl(restFilingsUrl, restData=True, toFormat='json')
    recentList = filingsJson['filings']['recent']
    accNos = recentList['accessionNumber']
    print(len(accN...
_____no_output_____
Apache-2.0
04_getCikFilings.ipynb
ikedim01/secscan
Test downloading list of filings for a CIK:
testF = getRecent(83350)
assert all(tup in testF for tup in (
    ('8-K', '0001437749-21-013386', '20210526'),
    ('10-Q', '0001437749-21-012377', '20210517')
)), "testing get recent CIK filings"

#hide
# uncomment and run to regenerate all library Python files
# from nbdev.export import notebook2script; not...
_____no_output_____
Apache-2.0
04_getCikFilings.ipynb
ikedim01/secscan
The set_index() and reset_index() methods
import pandas as pd

bond = pd.read_csv("jamesbond.csv")
bond.set_index(["Film"], inplace=True)
bond.head()
bond.reset_index().head()
_____no_output_____
MIT
Untitled23.ipynb
Mohan-lal1993/quickstart
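The round trip can be seen on a tiny stand-in frame (hypothetical values, since `jamesbond.csv` isn't available here): `set_index` promotes a column to the row index, and `reset_index` demotes it back to an ordinary column.

```python
import pandas as pd

# Toy frame standing in for the Bond data (hypothetical values)
bond = pd.DataFrame({"Film": ["Dr. No", "Goldfinger"],
                     "Year": [1962, 1964]})

bond.set_index("Film", inplace=True)  # "Film" becomes the row index
flat = bond.reset_index()             # the index moves back to a regular column
```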
Advanced Notebook
%matplotlib inline
import numpy as np
import pandas as pd
from pandas.plotting import scatter_matrix  # pandas.tools.plotting in older pandas
from sklearn.datasets import load_boston
import matplotlib as mpl
import matplotlib.pyplot as plt
import seaborn as sns

sns.set_context('poster')
sns.set_style('whitegrid')
plt.rcParams['figure.figsize'] = 12, 8
...
_____no_output_____
MIT
notebooks/Advanced-Notebook-Tricks.ipynb
ddeloss/jupyter-tips-and-tricks
BQPlot

Examples here are shamelessly stolen from the amazing: https://github.com/maartenbreddels/jupytercon-2017/blob/master/jupytercon2017-widgets.ipynb
# mixed feelings about this import
import bqplot.pyplot as plt
import numpy as np

x = np.linspace(0, 2, 50)
y = x**2
fig = plt.figure()
scatter = plt.scatter(x, y)
plt.show()

fig.animation_duration = 500
scatter.y = 2 * x**.5
scatter.selected_style = {'stroke':'red', 'fill': 'orange'}
plt.brush_selector();
scatter.sele...
_____no_output_____
MIT
notebooks/Advanced-Notebook-Tricks.ipynb
ddeloss/jupyter-tips-and-tricks
ipyvolume
import ipyvolume as ipv

N = 1000
x, y, z = np.random.random((3, N))
fig = ipv.figure()
scatter = ipv.scatter(x, y, z, marker='box')
ipv.show()

scatter.x = scatter.x - 0.5
scatter.x = x
scatter.color = "green"
scatter.size = 5
scatter.color = np.random.random((N,3))
scatter.size = 2
ex = ipv.datasets.animated_stream.fet...
_____no_output_____
MIT
notebooks/Advanced-Notebook-Tricks.ipynb
ddeloss/jupyter-tips-and-tricks
**Neural Network**

**Final Project**

Computer Systems Engineering LIS3082-1 Artificial Intelligence

By Victor Armando Canales Lima (162328)

Professor Gerardo Ayala San Martín

Department of Computing, Electronics and Mechatronics, Universidad de las Américas Puebla, San Andrés Cholula, Puebla, México

May 14, 2021

I...
!pip install numpy matplotlib scipy numba scikit-learn mne PyWavelets pandas
!pip install mne-features

from mne_features.univariate import compute_hjorth_complexity_spect as hjorthComp
from mne_features.univariate import compute_hjorth_mobility_spect as hjorthMob
from mne_features.univariate import compute_ptp_amp as p...
Drive already mounted at /content/drive; to attempt to forcibly remount, call drive.mount("/content/drive", force_remount=True).
MIT
FinalProjectAI.ipynb
VictorCanLima/NeuralNetworkEEG
We create a class to import and preprocess the signal, applying filters to clean the brain-signal samples. There are two classes: right-hand movement and left-hand movement.
class EEG_Signal_Handler():
    def __init__(self, datapath):
        self.data = load(datapath)             # Import data
        self.C1 = np.array(self.data['C1'])    # Left-hand movement (class 1)
        self.C2 = np.array(self.data['C2'])    # Right-hand movement (class 2)
        self.channels = len(self.C1[:,0,0])    # Number of chan...
_____no_output_____
MIT
FinalProjectAI.ipynb
VictorCanLima/NeuralNetworkEEG
Feature Extraction

Now we extract the features we are going to use. We have seven different features, but as we have 3 channels (EEG sensors), we have 3 different signals. In this class, we create our datasets to train and test our model. We will have 21 different inputs for our model, one for each channel and one for...
class Dataset_Creator():
    def __init__(self, class1, class2):
        featsC1 = self.get_features(class1)
        featsC2 = self.get_features(class2)
        labelsC1 = np.ones(mysig.experiments)
        labelsC2 = np.ones(mysig.experiments)
        self.x_complete = np.concatenate((featsC1, featsC2), axis=0)
        se...
_____no_output_____
MIT
FinalProjectAI.ipynb
VictorCanLima/NeuralNetworkEEG
Defining the Neural Network

Now we define our Neural Network, inspired by a tutorial on the official TensorFlow YouTube channel. Our model will have 2 hidden layers with the rectifier (ReLU) activation function and, as the decision function for classification, the sigmoid function. As the error function, we pick a binary cross-entropy funct...
from keras.models import Sequential
from keras.layers import Dense

classifier = Sequential()  # Initialising the ANN
classifier.add(Dense(units = 11, activation = 'relu', input_dim = 21))
classifier.add(Dense(units = 6, activation = 'relu'))
classifier.add(Dense(units = 3, activation = 'relu'))
classifier.add(Dense(uni...
_____no_output_____
MIT
FinalProjectAI.ipynb
VictorCanLima/NeuralNetworkEEG
Training
classifier.fit(myfeatures.x_train, myfeatures.y_train, batch_size = 1, epochs = 100)
Epoch 1/100 316/316 [==============================] - 1s 1ms/step - loss: 9634.6702 Epoch 2/100 316/316 [==============================] - 0s 1ms/step - loss: 20.1148 Epoch 3/100 316/316 [==============================] - 0s 1ms/step - loss: 0.6192 Epoch 4/100 316/316 [==============================] - 0s 1ms/step - l...
MIT
FinalProjectAI.ipynb
VictorCanLima/NeuralNetworkEEG
Testing
Y_pred = classifier.predict(myfeatures.x_test)
Y_pred = [1 if y >= 0.5 else 0 for y in Y_pred]

total = 0
correct = 0
wrong = 0
for i in range(len(Y_pred)):  # iterate over positions, not predicted labels
    total = total + 1
    if myfeatures.y_test[i] == Y_pred[i]:
        correct = correct + 1
    else:
        wrong = wrong + 1

print("Total " + str(total))
print("Correct " + str(correct))
print("...
Total 80 Correct 80 Wrong 0
MIT
FinalProjectAI.ipynb
VictorCanLima/NeuralNetworkEEG
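The threshold-and-score step can be written as a small standalone function and checked on made-up values (standard library only; comparing true and predicted labels positionally):

```python
def accuracy(y_true, y_prob, threshold=0.5):
    """Threshold sigmoid outputs at 0.5 and score against true labels positionally."""
    y_pred = [1 if p >= threshold else 0 for p in y_prob]
    correct = sum(1 for t, p in zip(y_true, y_pred) if t == p)
    return correct / len(y_true)

# Hypothetical labels and sigmoid outputs
acc = accuracy([1, 0, 1, 0], [0.9, 0.2, 0.6, 0.7])
```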
Define clipping functions
bagnet33 = bagnets.pytorch.bagnet33(pretrained=True, avg_pool=False).to(device)
bagnet33.eval()
print()

def clip_pm1(values, **kwargs):
    """Clip values to [-1, 1]

    Input:
    - values (torch tensor): values to be clipped
    Output:
    (torch tensor) clipped values
    """
    return torch.clamp_(values, -1., 1.)

de...
_____no_output_____
BSD-3-Clause
code/notebooks/2019-6-1_threshold_model.ipynb
davidwagner/bagnet-patch-defense
Data Extraction

Importing libraries and the NOAA Reef Bleaching dataset
%matplotlib inline
from numpy import arange
import numpy
from matplotlib import pyplot as plt
from scipy.stats import norm
import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.model_selection import KFold
from sklearn.linear_model import LogisticRegression
from sklearn import model_selec...
_____no_output_____
BSD-3-Clause
NOAA Reef Bleaching.ipynb
Aadya178/NOAA-Check-Reef-Bleaching
Removing the first row of the dataset. It will be used later.
# Total rows in the original dataframe
len(df.axes[0])

# Original dataframe
df.head(2)

# Assigning first row of the dataframe to 'row' variable
row = df.iloc[0]
row = list(row)
print(row)

# Removing first row from the dataframe
df = df.drop([0], axis=0)

# New dataframe
df.head(1)

# Total rows in the new dataframe
len(df.axes[0]...
_____no_output_____
BSD-3-Clause
NOAA Reef Bleaching.ipynb
Aadya178/NOAA-Check-Reef-Bleaching
Data Exploration
# Information about the dataset
df.info()

# Data types of the dataset columns
df.dtypes

# Memory used by each column in the dataset
df.memory_usage()

# Total memory used by the dataset
df.memory_usage().sum()
_____no_output_____
BSD-3-Clause
NOAA Reef Bleaching.ipynb
Aadya178/NOAA-Check-Reef-Bleaching
Data Cleaning
# Check if there are missing values in the dataset
df.isnull().sum().sum()

# Check if there are duplicate rows in the dataset
df.duplicated().sum()

# Removing duplicates from the dataset
df.drop_duplicates(keep="first", inplace=True)

# Check if duplicate rows have been removed successfully from the dataset
df.duplicated().s...
_____no_output_____
BSD-3-Clause
NOAA Reef Bleaching.ipynb
Aadya178/NOAA-Check-Reef-Bleaching
Label encoding columns having non-integer values
df['Human Impact'].replace({'none':0,'low':1,'moderate':2,'high':3}, inplace=True)
df['Siltation'].replace({'never':0,'occasionally':1,'often':2,'always':3}, inplace=True)
df['Dynamite'].replace({'none':0,'low':1,'moderate':2,'high':3}, inplace=True)
df['Poison'].replace({'none':0,'low':1,'moderate':2,'high':3}, inplace=Tr...
_____no_output_____
BSD-3-Clause
NOAA Reef Bleaching.ipynb
Aadya178/NOAA-Check-Reef-Bleaching
Data Visualization
# Boxplot
df['Depth'].plot.box(figsize=(8, 5));

# Boxplot of all the columns with numerical data
df.boxplot(figsize=(20, 20))

# Histogram
df['Depth'].hist(bins=30, figsize=(8, 5));

# Histogram with details
ax = df['Depth'].hist(bins=30, grid=False, color='green', figsize=(8, 5))  # grid turned off and colour changed
ax.set_xl...
_____no_output_____
BSD-3-Clause
NOAA Reef Bleaching.ipynb
Aadya178/NOAA-Check-Reef-Bleaching
Data Selection
# Column-wise correlation in the dataset
df.corr()

# Import seaborn library
import seaborn as sns

# Set the size of the heatmap
sns.set(rc={'figure.figsize':(15, 10)})

# Pearson correlation
sns.heatmap(df.corr('pearson'), annot=True)

# Spearman correlation
sns.heatmap(df.corr('spearman'), annot=True)

# Kendall correlation
sns.h...
_____no_output_____
BSD-3-Clause
NOAA Reef Bleaching.ipynb
Aadya178/NOAA-Check-Reef-Bleaching
Pearson, Spearman and Kendall all give similar results.

Target column: Bleaching

Pearson correlation results: columns Year, Siltation and Commercial are the least correlated to the target column Bleaching. They are dropped.
df=df.drop(['Year','Siltation','Commercial'],axis=1)
df.replace('', numpy.nan, inplace=True)
df.dropna(inplace=True)
df.head()
_____no_output_____
BSD-3-Clause
NOAA Reef Bleaching.ipynb
Aadya178/NOAA-Check-Reef-Bleaching
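How the least-correlated columns might be identified — a sketch on a made-up frame (not the reef data; only the method matches the text):

```python
import pandas as pd

# Made-up frame: 'Depth' tracks the target closely, 'Year' does not
df = pd.DataFrame({
    "Bleaching": [0, 1, 0, 1, 1, 0],
    "Depth":     [1, 4, 2, 5, 6, 1],
    "Year":      [3, 1, 4, 1, 5, 9],
})

# Absolute Pearson correlation with the target, weakest first;
# the weakest columns are the drop candidates
corr = df.corr("pearson")["Bleaching"].abs().sort_values()
print(corr.index[0])   # Year
```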
Data Splitting and Model Building (Logistic regression)

Logistic Regression using sklearn
#Logistic regression model using sklearn
X = df.iloc[:, 1:]
y = df.iloc[:,0]

#Split in training and testing sets
from sklearn.model_selection import train_test_split
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25)

#Scale
from sklearn.preprocessing import StandardScaler
X_sca = StandardScaler()...
_____no_output_____
BSD-3-Clause
NOAA Reef Bleaching.ipynb
Aadya178/NOAA-Check-Reef-Bleaching
Logistic regression equation:

y -> target variable i.e. Bleaching
a -> y-intercept of Bleaching
b0 -> co-efficient of Ocean
b1 -> co-efficient of Depth
b2 -> co-efficient of Storms
b3 -> co-efficient of Human Impact
b4 -> co-efficient of Dynamite
b5 -> co-efficient of Poison
b6 -> co-efficient of Sewage
b7 -> co-efficient of In...
#K-fold cross-validation
#Logistic Regression
X = df.iloc[:,1:]
y = df.iloc[:,0]
k = 5
kf = model_selection.KFold(n_splits=k, random_state=None)
model = LogisticRegression(solver='liblinear')
result = cross_val_score(model, X, y, cv=kf)
print("Avg accuracy: {}".format(result.mean()))

#Root mean square error
import ...
_____no_output_____
BSD-3-Clause
NOAA Reef Bleaching.ipynb
Aadya178/NOAA-Check-Reef-Bleaching
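The intercept `a` and coefficients `b0..b7` in the equation above come from the fitted model; here is a minimal sketch (toy one-feature data, not the reef dataset) of reading them off a sklearn LogisticRegression and applying the logistic function by hand:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Tiny synthetic data just to obtain a fitted model (not the reef data)
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([0, 0, 1, 1])
model = LogisticRegression(solver="liblinear").fit(X, y)

a = model.intercept_[0]        # y-intercept (a in the equation above)
b = model.coef_[0]             # coefficients (b0, b1, ... above)
z = a + b @ np.array([2.0])    # linear combination for one sample
p = 1.0 / (1.0 + np.exp(-z))   # logistic function -> P(target = 1)

# matches sklearn's own probability for the same sample
print(abs(p - model.predict_proba([[2.0]])[0, 1]) < 1e-9)
```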
Performing similar analysis for Pearson, Spearman and Kendall correlations gives the following results:

Accuracy through Logistic regression using sklearn:
Pearson: 0.968955223880597
Spearman: 0.9659701492537314
Kendall: 0.9707462686567164

Thus, Kendall correlation gives the most accurate results.

Average accuracy through ...
#Bleaching didn't occur - 0
#Bleaching occurred - 1
df.head(1)

if bool(clf.predict([[0,4,1,3,0,0,0,0]])):
    print("Bleaching will occur")
else:
    print("Bleaching will not occur")
Bleaching will not occur
BSD-3-Clause
NOAA Reef Bleaching.ipynb
Aadya178/NOAA-Check-Reef-Bleaching
Using the model to predict whether bleaching will occur for unseen data
#Using the "row" saved earlier
#It is a data sample never seen by the model before
actual_bleaching = row[0]
#Bleaching is encoded as 0 = didn't occur, 1 = occurred
if actual_bleaching:
    print("Bleaching actually occurs")
else:
    print("Bleaching doesn't actually occur")

#Preparing the 'row' for prediction
row.pop()
row=[row[1]]+row[3:6]+row[7:]

#Original row
print("...
No, Bleaching doesn't occur
BSD-3-Clause
NOAA Reef Bleaching.ipynb
Aadya178/NOAA-Check-Reef-Bleaching
SageMaker/DeepAR demo on electricity dataset

This notebook complements the [DeepAR introduction notebook](https://github.com/awslabs/amazon-sagemaker-examples/blob/master/introduction_to_amazon_algorithms/deepar_synthetic/deepar_synthetic.ipynb). Here, we will consider a real use case and show how to use DeepAR on Sage...
import sys

!{sys.executable} -m pip install s3fs

from __future__ import print_function

%matplotlib inline

import sys
from urllib.request import urlretrieve
import zipfile
from dateutil.parser import parse
import json
from random import shuffle
import random
import datetime
import os

import boto3
import s3fs
import ...
_____no_output_____
Apache-2.0
introduction_to_amazon_algorithms/deepar_electricity/DeepAR-Electricity.ipynb
EthanShouhanCheng/amazon-sagemaker-examples
Before starting, we can override the default values for the following:

- The S3 bucket and prefix that you want to use for training and model data. This should be within the same region as the Notebook Instance, training, and hosting.
- The IAM role arn used to give training and hosting access to your data. See the docum...
s3_bucket = sagemaker.Session().default_bucket()  # replace with an existing bucket if needed
s3_prefix = "deepar-electricity-demo-notebook"  # prefix used for all data stored within the bucket
role = sagemaker.get_execution_role()  # IAM role to use by SageMaker
region = sagemaker_session.boto_region_name

s3_data_pa...
_____no_output_____
Apache-2.0
introduction_to_amazon_algorithms/deepar_electricity/DeepAR-Electricity.ipynb
EthanShouhanCheng/amazon-sagemaker-examples
Next, we configure the container image to be used for the region that we are running in.
image_name = sagemaker.amazon.amazon_estimator.get_image_uri(region, "forecasting-deepar", "latest")
_____no_output_____
Apache-2.0
introduction_to_amazon_algorithms/deepar_electricity/DeepAR-Electricity.ipynb
EthanShouhanCheng/amazon-sagemaker-examples
Import electricity dataset and upload it to S3 to make it available for Sagemaker

As a first step, we need to download the original data set from the UCI data set repository.
DATA_HOST = "https://archive.ics.uci.edu"
DATA_PATH = "/ml/machine-learning-databases/00321/"
ARCHIVE_NAME = "LD2011_2014.txt.zip"
FILE_NAME = ARCHIVE_NAME[:-4]


def progress_report_hook(count, block_size, total_size):
    mb = int(count * block_size // 1e6)
    if count % 500 == 0:
        sys.stdout.write("\r{} MB dow...
_____no_output_____
Apache-2.0
introduction_to_amazon_algorithms/deepar_electricity/DeepAR-Electricity.ipynb
EthanShouhanCheng/amazon-sagemaker-examples
Then, we load and parse the dataset and convert it to a collection of Pandas time series, which makes common time series operations such as indexing by time periods or resampling much easier. The data is originally recorded at 15-minute intervals, which we could use directly. Here we want to forecast longer periods (one wee...
data = pd.read_csv(FILE_NAME, sep=";", index_col=0, parse_dates=True, decimal=",")
num_timeseries = data.shape[1]
data_kw = data.resample("2H").sum() / 8
timeseries = []
for i in range(num_timeseries):
    timeseries.append(np.trim_zeros(data_kw.iloc[:, i], trim="f"))
_____no_output_____
Apache-2.0
introduction_to_amazon_algorithms/deepar_electricity/DeepAR-Electricity.ipynb
EthanShouhanCheng/amazon-sagemaker-examples
Let us plot the resulting time series for the first ten customers for the time period spanning the first two weeks of 2014.
fig, axs = plt.subplots(5, 2, figsize=(20, 20), sharex=True)
axx = axs.ravel()
for i in range(0, 10):
    timeseries[i].loc["2014-01-01":"2014-01-14"].plot(ax=axx[i])
    axx[i].set_xlabel("date")
    axx[i].set_ylabel("kW consumption")
    axx[i].grid(which="minor", axis="x")
_____no_output_____
Apache-2.0
introduction_to_amazon_algorithms/deepar_electricity/DeepAR-Electricity.ipynb
EthanShouhanCheng/amazon-sagemaker-examples
Train and Test splits

Oftentimes one is interested in evaluating the model or tuning its hyperparameters by looking at error metrics on a hold-out test set. Here we split the available data into train and test sets for evaluating the trained model. For standard machine learning tasks such as classification and regress...
# we use 2 hour frequency for the time series
freq = "2H"

# we predict for 7 days
prediction_length = 7 * 12

# we also use 7 days as context length, this is the number of state updates accomplished before making predictions
context_length = 7 * 12
_____no_output_____
Apache-2.0
introduction_to_amazon_algorithms/deepar_electricity/DeepAR-Electricity.ipynb
EthanShouhanCheng/amazon-sagemaker-examples
We specify here the portion of the data that is used for training: the model sees data from 2014-01-01 to 2014-09-01.
start_dataset = pd.Timestamp("2014-01-01 00:00:00", freq=freq)
end_training = pd.Timestamp("2014-09-01 00:00:00", freq=freq)
_____no_output_____
Apache-2.0
introduction_to_amazon_algorithms/deepar_electricity/DeepAR-Electricity.ipynb
EthanShouhanCheng/amazon-sagemaker-examples
The DeepAR JSON input format represents each time series as a JSON object. In the simplest case each time series just consists of a start time stamp (``start``) and a list of values (``target``). For more complex cases, DeepAR also supports the fields ``dynamic_feat`` for time-series features and ``cat`` for categorica...
training_data = [
    {
        "start": str(start_dataset),
        "target": ts[
            start_dataset : end_training - timedelta(days=1)
        ].tolist(),  # We use -1, because pandas indexing includes the upper bound
    }
    for ts in timeseries
]
print(len(training_data))
_____no_output_____
Apache-2.0
introduction_to_amazon_algorithms/deepar_electricity/DeepAR-Electricity.ipynb
EthanShouhanCheng/amazon-sagemaker-examples
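One such record, serialized the way the jsonlines file will store it, can be sketched with toy values (made-up numbers, not the electricity data):

```python
import json

# Simplest DeepAR input form: a start timestamp plus a list of target values
record = {"start": "2014-01-01 00:00:00", "target": [2.5, 3.0, 2.75]}
line = json.dumps(record)
print(line)

# round-trip check: one JSON object per line is the jsonlines layout
assert json.loads(line)["target"] == [2.5, 3.0, 2.75]
```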
As test data, we will consider time series extending beyond the training range: these will be used for computing test scores, by using the trained model to forecast their trailing 7 days, and comparing predictions with actual values.To evaluate our model performance on more than one week, we generate test data that ext...
num_test_windows = 4

test_data = [
    {
        "start": str(start_dataset),
        "target": ts[start_dataset : end_training + timedelta(days=k * prediction_length)].tolist(),
    }
    for k in range(1, num_test_windows + 1)
    for ts in timeseries
]
print(len(test_data))
_____no_output_____
Apache-2.0
introduction_to_amazon_algorithms/deepar_electricity/DeepAR-Electricity.ipynb
EthanShouhanCheng/amazon-sagemaker-examples
Let's now write the dictionary to the `jsonlines` file format that DeepAR understands (it also supports gzipped jsonlines and parquet).
def write_dicts_to_file(path, data):
    with open(path, "wb") as fp:
        for d in data:
            fp.write(json.dumps(d).encode("utf-8"))
            fp.write("\n".encode("utf-8"))


%%time
write_dicts_to_file("train.json", training_data)
write_dicts_to_file("test.json", test_data)
_____no_output_____
Apache-2.0
introduction_to_amazon_algorithms/deepar_electricity/DeepAR-Electricity.ipynb
EthanShouhanCheng/amazon-sagemaker-examples
Now that we have the data files locally, let us copy them to S3 where DeepAR can access them. Depending on your connection, this may take a couple of minutes.
s3 = boto3.resource("s3")


def copy_to_s3(local_file, s3_path, override=False):
    assert s3_path.startswith("s3://")
    split = s3_path.split("/")
    bucket = split[2]
    path = "/".join(split[3:])
    buk = s3.Bucket(bucket)
    if len(list(buk.objects.filter(Prefix=path))) > 0:
        if not override:
            ...
_____no_output_____
Apache-2.0
introduction_to_amazon_algorithms/deepar_electricity/DeepAR-Electricity.ipynb
EthanShouhanCheng/amazon-sagemaker-examples
Let's have a look at what we just wrote to S3.
s3filesystem = s3fs.S3FileSystem()

with s3filesystem.open(s3_data_path + "/train/train.json", "rb") as fp:
    print(fp.readline().decode("utf-8")[:100] + "...")
_____no_output_____
Apache-2.0
introduction_to_amazon_algorithms/deepar_electricity/DeepAR-Electricity.ipynb
EthanShouhanCheng/amazon-sagemaker-examples
We are all set with our dataset processing; we can now call DeepAR to train a model and generate predictions.

Train a model

Here we define the estimator that will launch the training job.
estimator = sagemaker.estimator.Estimator(
    image_uri=image_name,
    sagemaker_session=sagemaker_session,
    role=role,
    train_instance_count=1,
    train_instance_type="ml.c4.2xlarge",
    base_job_name="deepar-electricity-demo",
    output_path=s3_output_path,
)
_____no_output_____
Apache-2.0
introduction_to_amazon_algorithms/deepar_electricity/DeepAR-Electricity.ipynb
EthanShouhanCheng/amazon-sagemaker-examples
Next we need to set the hyperparameters for the training job: for example, the frequency of the time series, the number of past data points the model will look at, and the number of data points to predict. The other hyperparameters concern the model to train (number of layers, number of cells per layer, likelihood function) ...
hyperparameters = {
    "time_freq": freq,
    "epochs": "400",
    "early_stopping_patience": "40",
    "mini_batch_size": "64",
    "learning_rate": "5E-4",
    "context_length": str(context_length),
    "prediction_length": str(prediction_length),
}

estimator.set_hyperparameters(**hyperparameters)
_____no_output_____
Apache-2.0
introduction_to_amazon_algorithms/deepar_electricity/DeepAR-Electricity.ipynb
EthanShouhanCheng/amazon-sagemaker-examples
We are ready to launch the training job. SageMaker will start an EC2 instance, download the data from S3, start training the model and save the trained model.If you provide the `test` data channel as we do in this example, DeepAR will also calculate accuracy metrics for the trained model on this test. This is done by p...
%%time
data_channels = {"train": "{}/train/".format(s3_data_path), "test": "{}/test/".format(s3_data_path)}

estimator.fit(inputs=data_channels, wait=True)
_____no_output_____
Apache-2.0
introduction_to_amazon_algorithms/deepar_electricity/DeepAR-Electricity.ipynb
EthanShouhanCheng/amazon-sagemaker-examples
Since you pass a test set in this example, accuracy metrics for the forecast are computed and logged (see bottom of the log).You can find the definition of these metrics from [our documentation](https://docs.aws.amazon.com/sagemaker/latest/dg/deepar.html). You can use these to optimize the parameters and tune your mode...
from sagemaker.serializers import IdentitySerializer


class DeepARPredictor(sagemaker.predictor.Predictor):
    def __init__(self, *args, **kwargs):
        super().__init__(
            *args,
            # serializer=JSONSerializer(),
            serializer=IdentitySerializer(content_type="application/json"),
            ...
_____no_output_____
Apache-2.0
introduction_to_amazon_algorithms/deepar_electricity/DeepAR-Electricity.ipynb
EthanShouhanCheng/amazon-sagemaker-examples
Now we can deploy the model and create an endpoint that can be queried using our custom DeepARPredictor class.
predictor = estimator.deploy(
    initial_instance_count=1, instance_type="ml.m5.large", predictor_cls=DeepARPredictor
)
_____no_output_____
Apache-2.0
introduction_to_amazon_algorithms/deepar_electricity/DeepAR-Electricity.ipynb
EthanShouhanCheng/amazon-sagemaker-examples
Make predictions and plot results

Now we can use the `predictor` object to generate predictions.
predictor.predict(ts=timeseries[120], quantiles=[0.10, 0.5, 0.90]).head()
_____no_output_____
Apache-2.0
introduction_to_amazon_algorithms/deepar_electricity/DeepAR-Electricity.ipynb
EthanShouhanCheng/amazon-sagemaker-examples
Below we define a plotting function that queries the model and displays the forecast.
def plot(
    predictor,
    target_ts,
    cat=None,
    dynamic_feat=None,
    forecast_date=end_training,
    show_samples=False,
    plot_history=7 * 12,
    confidence=80,
):
    freq = target_ts.index.freq
    print(
        "calling served model to generate predictions starting from {}".format(str(forecast_date)...
_____no_output_____
Apache-2.0
introduction_to_amazon_algorithms/deepar_electricity/DeepAR-Electricity.ipynb
EthanShouhanCheng/amazon-sagemaker-examples
We can interact with the function previously defined, to look at the forecast of any customer at any point in (future) time. For each request, the predictions are obtained by calling our served model on the fly.Here we forecast the consumption of an office after week-end (note the lower week-end consumption). You can s...
style = {"description_width": "initial"}


@interact_manual(
    customer_id=IntSlider(min=0, max=369, value=91, style=style),
    forecast_day=IntSlider(min=0, max=100, value=51, style=style),
    confidence=IntSlider(min=60, max=95, value=80, step=5, style=style),
    history_weeks_plot=IntSlider(min=1, max=20, value=1...
_____no_output_____
Apache-2.0
introduction_to_amazon_algorithms/deepar_electricity/DeepAR-Electricity.ipynb
EthanShouhanCheng/amazon-sagemaker-examples
Additional features

We have seen how to prepare a dataset and run DeepAR for a simple example.

In addition, DeepAR supports the following features:

* missing values: DeepAR can handle missing values in the time series during training as well as for inference.
* Additional time features: DeepAR provides a set of default time s...
def create_special_day_feature(ts, fraction=0.05):
    # First select random day indices (plus the forecast day)
    num_days = (ts.index[-1] - ts.index[0]).days
    rand_indices = list(np.random.randint(0, num_days, int(num_days * 0.1))) + [num_days]

    feature_value = np.zeros_like(ts)
    for i in rand_indices:
        ...
_____no_output_____
Apache-2.0
introduction_to_amazon_algorithms/deepar_electricity/DeepAR-Electricity.ipynb
EthanShouhanCheng/amazon-sagemaker-examples
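Combining the two capabilities described above — a missing target point and a dynamic feature — a single input record might look like this (toy values; the binary special-day indicator is a stand-in for the feature built above):

```python
import json

record = {
    "start": "2014-01-01 00:00:00",
    "target": [1.0, None, 3.0],    # None serializes to JSON null = missing point
    "dynamic_feat": [[0, 1, 0]],   # one feature series, same length as target
}
line = json.dumps(record)
print(line)
```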
We now create the up-lifted time series and randomly remove time points. The figures below show some example time series and the `special_day` feature value in green.
timeseries_uplift = [ts * (1.0 + feat) for ts, feat in zip(timeseries, special_day_features)]
time_series_processed = [drop_at_random(ts) for ts in timeseries_uplift]

fig, axs = plt.subplots(5, 2, figsize=(20, 20), sharex=True)
axx = axs.ravel()
for i in range(0, 10):
    ax = axx[i]
    ts = time_series_processed[i][:...
_____no_output_____
Apache-2.0
introduction_to_amazon_algorithms/deepar_electricity/DeepAR-Electricity.ipynb
EthanShouhanCheng/amazon-sagemaker-examples
As before, we spawn an endpoint to visualize our forecasts on examples we send on the fly.
%%time
predictor_new_features = estimator_new_features.deploy(
    initial_instance_count=1, instance_type="ml.m5.large", predictor_cls=DeepARPredictor
)

customer_id = 120
predictor_new_features.predict(
    ts=time_series_processed[customer_id][:-prediction_length],
    dynamic_feat=[special_day_features[customer_id]....
_____no_output_____
Apache-2.0
introduction_to_amazon_algorithms/deepar_electricity/DeepAR-Electricity.ipynb
EthanShouhanCheng/amazon-sagemaker-examples
As before, we can query the endpoint to see predictions for arbitrary time series and time points.
@interact_manual(
    customer_id=IntSlider(min=0, max=369, value=13, style=style),
    forecast_day=IntSlider(min=0, max=100, value=21, style=style),
    confidence=IntSlider(min=60, max=95, value=80, step=5, style=style),
    missing_ratio=FloatSlider(min=0.0, max=0.95, value=0.2, step=0.05, style=style),
    show_sa...
_____no_output_____
Apache-2.0
introduction_to_amazon_algorithms/deepar_electricity/DeepAR-Electricity.ipynb
EthanShouhanCheng/amazon-sagemaker-examples
Delete endpoints
predictor.delete_endpoint()
predictor_new_features.delete_endpoint()
_____no_output_____
Apache-2.0
introduction_to_amazon_algorithms/deepar_electricity/DeepAR-Electricity.ipynb
EthanShouhanCheng/amazon-sagemaker-examples
Baseline for the price-prediction model

- Build a model using a CNN.
- Since price prediction is a very different task from classification, first build a model that does not use weights trained on ImageNet.
- The training data are images posted on the site, so large rotations and similar transformations are considered unnecessary, and no such preprocessing is applied.
- Use mae or rmse as the loss function.

Building the model

- Extract features with an (untrained) EfficientNetB0.
- Concatenate num_sales and a one-hot vector of the collection name to the extracted features.
- Stack fully connected layers for the output.
- One pretrained on ImageNet ...
import os
from typing import List, Optional, Tuple, Dict
import math
import tempfile
import random

import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, mean_absolute_error
import cv2
import tensorflow as tf
import tensorflow.keras.l...
/usr/local/lib/python3.8/dist-packages/IPython/core/interactiveshell.py:3444: DtypeWarning: Columns (3,27,28,71,88,119) have mixed types.Specify dtype option on import or set low_memory=False. exec(code_obj, self.user_global_ns, self.user_ns) /usr/local/lib/python3.8/dist-packages/IPython/core/interactiveshell.py:344...
MIT
notebooks/training_model.ipynb
nft-appraiser/nft-appraiser-ml
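The two candidate losses mentioned above, mae and rmse, reduce to the following (toy numbers, NumPy only):

```python
import numpy as np

y_true = np.array([1.0, 2.0, 3.0])
y_pred = np.array([1.5, 2.0, 2.0])

mae = np.mean(np.abs(y_true - y_pred))           # mean absolute error: 0.5
rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))  # root mean squared error
print(mae, rmse)
```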
Helper Functions

DataLoader
class FullPathDataLoader(Sequence):
    """
    Data loader that loads images, meta data and targets.
    This class inherits from the Keras Sequence class.
    """

    def __init__(self,
                 path_list: np.ndarray,
                 target: Optional[np.ndarray],
                 meta_data: Optional[np.ndarray] = None,
                 batch_size: int = 16,
                 ...
_____no_output_____
MIT
notebooks/training_model.ipynb
nft-appraiser/nft-appraiser-ml
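The batching contract a Keras `Sequence` subclass must honor (`__len__` returns the number of batches, `__getitem__` returns one batch) can be sketched framework-free — a hypothetical minimal loader, not the notebook's class:

```python
import math
import numpy as np

class MiniBatchLoader:
    """Minimal Sequence-style batching sketch (hypothetical)."""

    def __init__(self, data: np.ndarray, batch_size: int = 16):
        self.data = data
        self.batch_size = batch_size

    def __len__(self) -> int:
        # number of batches, rounding up so the last partial batch is kept
        return math.ceil(len(self.data) / self.batch_size)

    def __getitem__(self, idx: int) -> np.ndarray:
        return self.data[idx * self.batch_size:(idx + 1) * self.batch_size]

loader = MiniBatchLoader(np.arange(10), batch_size=4)
print(len(loader))         # 3
print(loader[2].tolist())  # [8, 9]
```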
seed settings
def set_seed(random_state=6174):
    tf.random.set_seed(random_state)
    np.random.seed(random_state)
    random.seed(random_state)
    os.environ['PYTHONHASHSEED'] = str(random_state)
_____no_output_____
MIT
notebooks/training_model.ipynb
nft-appraiser/nft-appraiser-ml