In the figure above, the values 0–4 are stored in the label column. One approach is a `for` loop plus an `if` test; the mask approach does the same filtering in a single expression. Note that "neither 0 nor 4" requires negating the whole disjunction:

```python
# Keep rows whose label is neither 0 nor 4
dfx = df[~((df['label'] == 0) | (df['label'] == 4))]
df.shape, dfx.shape
dfx.plot(kind='scatter', x='Grocery', y='Frozen', c='label', cmap='Set1', figsize=(7, 7))
df.to_excel('./wholesale.xls')
```
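As a cross-check, the same "neither 0 nor 4" condition can be written without pandas. This is a minimal sketch over hypothetical toy rows, not the notebook's actual data:

```python
# Hypothetical toy rows standing in for the DataFrame
rows = [{'label': 0}, {'label': 2}, {'label': 4}, {'label': 3}]

# Same condition as the pandas mask: keep labels that are neither 0 nor 4
kept = [r['label'] for r in rows if not (r['label'] == 0 or r['label'] == 4)]
print(kept)  # [2, 3]
```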
Apache-2.0
0702_ML19_clustering_kmeans.ipynb
msio900/minsung_machinelearning
Simple Widget Introduction. What are widgets? Widgets are eventful Python objects that have a representation in the browser, often as a control like a slider or textbox. What can they be used for? You can use widgets to build **interactive GUIs** for your notebooks....
import ipywidgets as widgets
BSD-3-Clause
docs/source/examples/Widget Basics.ipynb
akhand1111/ipywidgets
repr Widgets have their own display `repr` which allows them to be displayed using IPython's display framework. Constructing and returning an `IntSlider` automatically displays the widget (as seen below). Widgets are displayed inside the output area below the code cell. Clearing cell output will also remove the widg...
widgets.IntSlider()
display() You can also explicitly display the widget using `display(...)`.
from IPython.display import display
w = widgets.IntSlider()
display(w)
Multiple display() calls If you display the same widget twice, the displayed instances in the front-end will remain in sync with each other. Try dragging the slider below and watch the slider above.
display(w)
Why does displaying the same widget twice work? Widgets are represented in the back-end by a single object. Each time a widget is displayed, a new representation of that same object is created in the front-end. These representations are called views.![Kernel & front-end diagram](images/WidgetModelView.png) Closing ...
display(w)
w.close()
Widget properties All of the IPython widgets share a similar naming scheme. To read the value of a widget, you can query its `value` property.
w = widgets.IntSlider()
display(w)
w.value
Similarly, to set a widget's value, you can set its `value` property.
w.value = 100
Keys In addition to `value`, most widgets share `keys`, `description`, and `disabled`. To see the entire list of synchronized, stateful properties of any specific widget, you can query the `keys` property.
w.keys
Shorthand for setting the initial values of widget properties While creating a widget, you can set some or all of the initial values of that widget by defining them as keyword arguments in the widget's constructor (as seen below).
widgets.Text(value='Hello World!', disabled=True)
Linking two similar widgets If you need to display the same value two different ways, you'll have to use two different widgets. Instead of attempting to manually synchronize the values of the two widgets, you can use the `link` or `jslink` function to link two properties together (the difference between these is dis...
a = widgets.FloatText()
b = widgets.FloatSlider()
display(a, b)
mylink = widgets.jslink((a, 'value'), (b, 'value'))
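Conceptually, the kernel-side `link` keeps two `value` properties in sync by observing changes on each side. A minimal pure-Python sketch of that idea (an observer pattern; this is NOT the real traitlets machinery ipywidgets uses):

```python
# Conceptual sketch of kernel-side linking, not ipywidgets internals.
class Synced:
    def __init__(self, value=0):
        self._value = value
        self._observers = []

    @property
    def value(self):
        return self._value

    @value.setter
    def value(self, v):
        self._value = v
        for cb in self._observers:
            cb(v)

    def observe(self, cb):
        self._observers.append(cb)

def link(a, b):
    # Write the partner's backing field directly to avoid an update loop
    a.observe(lambda v: setattr(b, '_value', v))
    b.observe(lambda v: setattr(a, '_value', v))

x, y = Synced(), Synced()
link(x, y)
x.value = 7
print(y.value)  # 7
```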
Unlinking widgets Unlinking the widgets is simple. All you have to do is call `.unlink` on the link object. Try changing one of the widgets above after unlinking to see that they can be independently changed.
mylink.unlink()
Data format conversion for WEASEL_MUSE. Input: two file types. Each **data file** represents a single sample; the **label file** contains the labels of all samples. ***Note:*** *both training and testing data should undergo the conversion.* **data files**: - file name: "sample_id.csv" - file contents: L * D, where L is the MTS length...
import numpy as np
MIT
Baselines/mtsc_weasel_muse/.ipynb_checkpoints/Preprocess_weasel_muse-checkpoint.ipynb
JingweiZuo/SMATE
On country (only MS)
df.fund = df.fund == 'TRUE'
df.gre = df.gre == 'TRUE'
df.highLevelBachUni = df.highLevelBachUni == 'TRUE'
df.highLevelMasterUni = df.highLevelMasterUni == 'TRUE'
df.uniRank.fillna(294, inplace=True)
df.columns
oldDf = df.copy()
df = df[['countryCoded','degreeCoded','engCoded', 'fieldGroup','fund','gpaBachelors','gre', 'highLevelBachUni...
('ball_tree', 'braycurtis', 55.507529507529512) ('ball_tree', 'canberra', 44.839072039072036) ('ball_tree', 'chebyshev', 53.738054538054541) ('ball_tree', 'cityblock', 55.735775335775337) ('ball_tree', 'euclidean', 55.793080993080991) ('ball_tree', 'dice', 46.14798534798534) ('ball_tree', 'hamming', 47.408547008547011)...
MIT
08_AfterAcceptance/06_KNN/knn.ipynb
yazdipour/DM17
On Fund (only MS)
bestAvg = []
for alg in algorithm:
    for dis in dist:
        k_fold = KFold(n=len(df), n_folds=5)
        scores = []
        try:
            clf = KNeighborsClassifier(n_neighbors=3, weights='distance', algorithm=alg, metric=dis)
        except Exception as err:
            continue
        for train_indices, test_in...
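The loop above evaluates each algorithm/metric pair with 5-fold cross-validation. A pure-Python sketch of what the fold splitting does (a stand-in for sklearn's `KFold`, shown only to illustrate the mechanics):

```python
# Pure-Python sketch of k-fold index splitting (illustrative, not sklearn).
def kfold_indices(n, n_folds=5):
    # Distribute n indices as evenly as possible over n_folds contiguous folds
    sizes = [n // n_folds + (1 if i < n % n_folds else 0) for i in range(n_folds)]
    folds, start = [], 0
    for size in sizes:
        folds.append(list(range(start, start + size)))
        start += size
    # Each fold takes a turn as the test set; the rest form the training set
    for i, test in enumerate(folds):
        train = [idx for j, fold in enumerate(folds) if j != i for idx in fold]
        yield train, test

splits = list(kfold_indices(10, n_folds=5))
print(len(splits))   # 5
print(splits[0][1])  # [0, 1]
```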
('ball_tree', 'braycurtis', 76.495400895400905) ('ball_tree', 'canberra', 75.354008954008961) ('ball_tree', 'chebyshev', 75.584533984533977) ('ball_tree', 'cityblock', 77.293935693935694) ('ball_tree', 'euclidean', 76.496703296703302) ('ball_tree', 'dice', 74.383557183557173) ('ball_tree', 'hamming', 76.152706552706562...
Best : ('kd_tree', 'cityblock', 77.692144892144896)
me = [0, 2, 0, 2.5, False, False, 1.5, 400]
n = bestClf.kneighbors([me])
n
for i in n[1]:
    print(xtr.iloc[i])
countryCoded engCoded fieldGroup gpaBachelors gre highLevelBachUni \ 664 0 2 0 2.5 False False 767 0 2 0 3.0 False False 911 0 2 0 3.0 False False...
Data period: six and a half cycles (+ 3 zero cycles)
df['date'].hist(bins=51, figsize=(10, 5))
plt.xlim(df['date'].min(), df['date'].max())
plt.title('Histograma de la Fecha de Envío del Mensaje')
plt.ylabel('Número de Mensajes')
plt.xlabel('Año-Mes')
plt.show()
# plt.savefig('hist_fecha.svg', format='svg')
df['month'] = df['date'].dt.month
df['dayofweek'] = df['date'].dt....
MIT
deep_learning/models/combine_processes/Data_Cleaning_NLP.ipynb
Claudio9701/mailbot
Email pairing algorithm: 1. Extract the messages sent by each student and the messages sent by internal users to each student, respectively. 2. Extract the subject of each message from step 1. If a message's subject equals the subject sent in the previous message, increment the counter of messages with the same subject. 3...
# Separate mails sent to each alumn
dfs = [send_by_internals[send_by_internals.recipient_email == alumn] for alumn in send_by_alumns.sender_email.unique()]
unique_alumns = send_by_alumns.sender_email.unique()
n = len(unique_alumns)
# Count causes of not being able to process a text
resp_date_bigger_than_input_date =...
100%|██████████| 3781/3781 [00:57<00:00, 65.81it/s]
Format data
total_unpaired_mails = repited_id + resp_date_bigger_than_input_date + responses_with_same_subject_lower_than_counter + subject_equal_none + n_obs_less_than_0
print()
print('Filtros del algoritmo de emparejamiento')
print('resp_date_bigger_than_input_date:', resp_date_bigger_than_input_date)
print('subject_equal_none:', subject_...
NLP
## Tokenization using NLTK
# Define input (x) and target (y) sequences variables
x = [word_tokenize(msg, language='spanish') for msg in paired_mails['input_body'].values]
y = [word_tokenize(msg, language='spanish') for msg in paired_mails['resp_body'].values]
# Variables to store lengths
hist_len_inp = []
hist_len_out...
Matplotlib. Matplotlib is a Python 2D plotting library which produces publication-quality figures in a variety of hardcopy formats and interactive environments across platforms. Matplotlib can be used in Python scripts, the Python and IPython shells, web application servers, and six graphical user interface toolkits. Mat...
# needed to display the graphs
%matplotlib inline
import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(0, 5, 10)
y = x ** 2

fig = plt.figure()
# left, bottom, width, height (range 0 to 1)
axes = fig.add_axes([0.1, 0.1, 0.8, 0.8])
axes.plot(x, y, 'r')
axes.set_xlabel('x')
axes.set_ylabel('y')
axes.set_...
MIT
Matplotlib-BEst.ipynb
imamol555/Machine-Learning
Copyright 2018 The TF-Agents Authors.
#@title Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under...
Apache-2.0
site/en-snapshot/agents/tutorials/2_environments_tutorial.ipynb
secsilm/docs-l10n
Environments. Introduction. The goal of Reinforcement Learning (RL) is to design agents that learn by interacting with an environment. In the standard RL setting, the agent receives an ...
!pip install tf-agents
!pip install 'gym==0.10.11'

from __future__ import absolute_import
from __future__ import division
from __future__ import print_function

import abc
import tensorflow as tf
import numpy as np

from tf_agents.environments import py_environment
from tf_agents.environments import tf_environment
from...
Python Environments Python environments have a `step(action) -> next_time_step` method that applies an action to the environment, and returns the following information about the next step:1. `observation`: This is the part of the environment state that the agent can observe to choose its actions at the next step.2. `r...
class PyEnvironment(object):

    def reset(self):
        """Return initial_time_step."""
        self._current_time_step = self._reset()
        return self._current_time_step

    def step(self, action):
        """Apply action and return new time_step."""
        if self._current_time_step is None:
            return self.reset()
        self._cu...
In addition to the `step()` method, environments also provide a `reset()` method that starts a new sequence and provides an initial `TimeStep`. It is not necessary to call the `reset` method explicitly. We assume that environments reset automatically, either when they get to the end of an episode or when step() is call...
environment = suite_gym.load('CartPole-v0')
print('action_spec:', environment.action_spec())
print('time_step_spec.observation:', environment.time_step_spec().observation)
print('time_step_spec.step_type:', environment.time_step_spec().step_type)
print('time_step_spec.discount:', environment.time_step_spec().discount)
So we see that the environment expects actions of type `int64` in [0, 1] and returns `TimeSteps` where the observations are a `float32` vector of length 4 and the discount factor is a `float32` in [0.0, 1.0]. Now, let's try to take a fixed action `(1,)` for a whole episode.
action = np.array(1, dtype=np.int32)
time_step = environment.reset()
print(time_step)
while not time_step.is_last():
    time_step = environment.step(action)
    print(time_step)
Creating your own Python EnvironmentFor many clients, a common use case is to apply one of the standard agents (see agents/) in TF-Agents to their problem. To do this, they have to frame their problem as an environment. So let us look at how to implement an environment in Python.Let's say we want to train an agent to ...
class CardGameEnv(py_environment.PyEnvironment):

    def __init__(self):
        self._action_spec = array_spec.BoundedArraySpec(
            shape=(), dtype=np.int32, minimum=0, maximum=1, name='action')
        self._observation_spec = array_spec.BoundedArraySpec(
            shape=(1,), dtype=np.int32, minimum=0, name='observation')...
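Stripped of the TF-Agents spec machinery, the reset/step contract of such an environment can be sketched in plain Python. This is a hypothetical toy (rules assumed: draw cards toward 21, going over busts), only meant to illustrate the shape of `reset()` and `step(action)`:

```python
import random

# Plain-Python sketch of a card-game environment's reset/step contract.
# NOT the TF-Agents CardGameEnv class above; rules here are assumed.
class CardGame:
    def __init__(self):
        self.reset()

    def reset(self):
        self.cards = 0
        self.done = False
        return self.cards

    def step(self, action):
        if self.done:
            return self.reset(), 0.0, False
        if action == 0:                        # ask for a new card
            self.cards += random.randint(1, 10)
        if action == 1 or self.cards > 21:     # end round (or bust)
            self.done = True
            reward = self.cards if self.cards <= 21 else -self.cards
            return self.cards, float(reward), True
        return self.cards, 0.0, False

env = CardGame()
obs, reward, done = env.step(1)  # end the round immediately
print(obs, reward, done)         # 0 0.0 True
```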
Let's make sure we did everything correctly defining the above environment. When creating your own environment you must make sure the observations and time_steps generated follow the correct shapes and types as defined in your specs. These are used to generate the TensorFlow graph and as such can create hard to debug p...
environment = CardGameEnv()
utils.validate_py_environment(environment, episodes=5)
Now that we know the environment is working as intended, let's run this environment using a fixed policy: ask for 3 cards and then end the round.
get_new_card_action = np.array(0, dtype=np.int32)
end_round_action = np.array(1, dtype=np.int32)

environment = CardGameEnv()
time_step = environment.reset()
print(time_step)
cumulative_reward = time_step.reward

for _ in range(3):
    time_step = environment.step(get_new_card_action)
    print(time_step)
    cumulative_rewa...
Environment Wrappers. An environment wrapper takes a Python environment and returns a modified version of the environment. Both the original environment and the modified environment are instances of `py_environment.PyEnvironment`, and multiple wrappers can be chained together. Some common wrappers can be found in `enviro...
env = suite_gym.load('Pendulum-v0')
print('Action Spec:', env.action_spec())
discrete_action_env = wrappers.ActionDiscretizeWrapper(env, num_actions=5)
print('Discretized Action Spec:', discrete_action_env.action_spec())
The wrapped `discrete_action_env` is an instance of `py_environment.PyEnvironment` and can be treated like a regular Python environment. TensorFlow Environments. The interface for TF environments is defined in `environments/tf_environment.TFEnvironment` and looks very similar to the Python environments. TF Environments...
class TFEnvironment(object):

    def time_step_spec(self):
        """Describes the `TimeStep` tensors returned by `step()`."""

    def observation_spec(self):
        """Defines the `TensorSpec` of observations provided by the environment."""

    def action_spec(self):
        """Describes the TensorSpecs of the action expected by `...
The `current_time_step()` method returns the current time_step and initializes the environment if needed.The `reset()` method forces a reset in the environment and returns the current_step.If the `action` doesn't depend on the previous `time_step` a `tf.control_dependency` is needed in `Graph` mode.For now, let us look...
env = suite_gym.load('CartPole-v0')
tf_env = tf_py_environment.TFPyEnvironment(env)

print(isinstance(tf_env, tf_environment.TFEnvironment))
print("TimeStep Specs:", tf_env.time_step_spec())
print("Action Specs:", tf_env.action_spec())
Note the specs are now of type: `(Bounded)TensorSpec`. Usage Examples Simple Example
env = suite_gym.load('CartPole-v0')
tf_env = tf_py_environment.TFPyEnvironment(env)
# reset() creates the initial time_step after resetting the environment.
time_step = tf_env.reset()
num_steps = 3
transitions = []
reward = 0
for i in range(num_steps):
    action = tf.constant([i % 2])
    # applies the action and returns...
Whole Episodes
env = suite_gym.load('CartPole-v0')
tf_env = tf_py_environment.TFPyEnvironment(env)
time_step = tf_env.reset()
rewards = []
steps = []
num_episodes = 5

for _ in range(num_episodes):
    episode_reward = 0
    episode_steps = 0
    while not time_step.is_last():
        action = tf.random.uniform([1], 0, 2, dtype=tf.int32)
        ...
`Python Programming Workshop` `Lesson 2: User-defined and built-in functions, iterators and generators` `Murat Apishev (mel-lain@yandex.ru)` `Moscow, 2021` `The range and enumerate functions`
r = range(2, 10, 3)
print(type(r))
for e in r:
    print(e, end=' ')

for index, element in enumerate(list('abcdef')):
    print(index, element, end=' ')
0 a 1 b 2 c 3 d 4 e 5 f
MIT
lectures/02-functions.ipynb
sir-rois/mipt-python
`The zip function`
z = zip([1, 2, 3], 'abc')
print(type(z))
for a, b in z:
    print(a, b, end=' ')

for e in zip('abcdef', 'abc'):
    print(e)

for a, b, c, d in zip('abc', [1,2,3], [True, False, None], 'xyz'):
    print(a, b, c, d)
a 1 True x b 2 False y c 3 None z
`Defining your own functions`
def function(arg_1, arg_2=None):
    print(arg_1, arg_2)

function(10)
function(10, 20)
10 None 10 20
A function is also an object, and its name is just a symbolic reference:
f = function
f(10)
print(function is f)
10 None True
`Defining your own functions`
retval = f(10)
print(retval)

def factorial(n):
    return n * factorial(n - 1) if n > 1 else 1  # recursion

print(factorial(1))
print(factorial(2))
print(factorial(4))
1 2 24
`Passing arguments to a function` In Python, arguments are passed by object reference: rebinding a parameter inside the function does not affect the caller, while mutating a mutable argument does
def function(scalar, lst):
    scalar += 10
    print(f'Scalar in function: {scalar}')
    lst.append(None)
    print(f'Scalar in function: {lst}')

s, l = 5, []
function(s, l)
print(s, l)
Scalar in function: 15 Scalar in function: [None] 5 [None]
`Passing arguments to a function`
def f(a, *args):
    print(type(args))
    print([v for v in [a] + list(args)])

f(10, 2, 6, 8)

def f(*args, a):
    print([v for v in [a] + list(args)])
    print()

f(2, 6, 8, a=10)

def f(a, *args, **kw):
    print(type(kw))
    print([v for v in [a] + list(args) + [(k, v) for k, v in kw.items()]])

f(2, *(6, 8),...
<class 'dict'> [2, 6, 8, ('arg1', 1), ('arg2', 2)]
`Variable scopes` Python has 4 main scope levels: - Built-in (builtins) - this level holds all built-in objects (functions, exception classes, etc.) - Global within a module (global) - everything defined at the top level of the module's code - Enclosing function (enclosed) - everything def...
def outer_func(x):
    def inner_func(x):
        return len(x)
    return inner_func(x)

print(outer_func([1, 2]))
2
Who defined the name `len`? - there is no such name at the level of the nested function, so we look higher - there is no such name at the level of the enclosing function, look higher - there is no such name at the module level, look higher - the name exists at the builtins level, so we use it. `You can take a look at builtins`
import builtins

counter = 0
lst = []
for name in dir(builtins):
    if name[0].islower():
        lst.append(name)
        counter += 1
    if counter == 5:
        break
lst
By the way, the same thing can be done with more Pythonic code:
list(filter(lambda x: x[0].islower(), dir(builtins)))[: 5]
`Local and global variables`
x = 2
def func():
    print('Inside: ', x)  # read
func()
print('Outside: ', x)

x = 2
def func():
    x += 1  # write
    print('Inside: ', x)
func()  # UnboundLocalError: local variable 'x' referenced before assignment
print('Outside: ', x)

x = 2
def func():
    x = 3
    x += 1
    print('Inside: ', x)
...
Inside: 4 Outside: 2
`The global keyword`
x = 2
def func():
    global x
    x += 1  # write
    print('Inside: ', x)
func()
print('Outside: ', x)

x = 2
def func(x):
    x += 1
    print('Inside: ', x)
    return x
x = func(x)
print('Outside: ', x)
Inside: 3 Outside: 3
`The nonlocal keyword`
a = 0
def out_func():
    b = 10
    def mid_func():
        c = 20
        def in_func():
            global a
            a += 100
            nonlocal c
            c += 100
            nonlocal b
            b += 100
            print(a, b, c)
        in_func()
    mid_func()...
100 110 120
__Main takeaway:__ do not overuse side effects on variables from outer scopes. `An example of nested functions: closures` - In most cases nested functions are not needed; a flat hierarchy is both simpler and clearer - One of the exceptions is factory functions (closures)
def function_creator(n):
    def function(x):
        return x ** n
    return function

f = function_creator(5)
f(2)
The function object referenced by `f` stores the value of `n` inside itself. `Anonymous functions` - `def` is not the only way to declare a function - `lambda` creates an anonymous (lambda) function. Such functions are often used where a definition via `def` is syntactically impossible
def func(x):
    return x ** 2
func(6)

lambda_func = lambda x: x ** 2  # should be an expression
lambda_func(6)

def func(x):
    print(x)
func(6)

lambda_func = lambda x: print(x ** 2)  # as print is a function in Python 3.*
lambda_func(6)
36
`The built-in function sorted`
lst = [5, 2, 7, -9, -1]

def abs_comparator(x):
    return abs(x)

print(sorted(lst, key=abs_comparator))
sorted(lst, key=lambda x: abs(x))
sorted(lst, key=lambda x: abs(x), reverse=True)
`The built-in function filter`
lst = [5, 2, 7, -9, -1]
f = filter(lambda x: x < 0, lst)  # True condition
type(f)  # iterator
list(f)
`The built-in function map`
lst = [5, 2, 7, -9, -1]
m = map(lambda x: abs(x), lst)
type(m)  # iterator
list(m)
`Comparing the two approaches once more` Let's write a dot product function in imperative and functional styles:
def dot_product_imp(v, w):
    result = 0
    for i in range(len(v)):
        result += v[i] * w[i]
    return result

dot_product_func = lambda v, w: sum(map(lambda x: x[0] * x[1], zip(v, w)))

print(dot_product_imp([1, 2, 3], [4, 5, 6]))
print(dot_product_func([1, 2, 3], [4, 5, 6]))
32 32
`The reduce function` `functools` is a standard module with other higher-order functions. For now let's consider only `reduce`:
from functools import reduce

lst = list(range(1, 10))
reduce(lambda x, y: x * y, lst)
`Iteration, the iter and next functions`
r = range(3)
for e in r:
    print(e)

it = iter(r)  # r.__iter__() - gives us an iterator
print(next(it))
print(it.__next__())
print(next(it))
print(next(it))
0 1 2
`Iterators are often used implicitly` What a `for` loop looks like to us:
for i in 'seq': print(i)
s e q
How it actually works:
iterator = iter('seq')
while True:
    try:
        i = next(iterator)
        print(i)
    except StopIteration:
        break
s e q
`Generators` - Generators, like iterators, are designed for iterating over a collection, but they are built somewhat differently - They are defined with functions containing the `yield` operator, or with comprehension-style generator expressions, rather than with `iter()` and `next()` calls - A generator has internal mutable state in the form of local variables, which ...
def my_range(n):
    yield 'You really want to run this generator?'
    i = -1
    while i < n:
        i += 1
        yield i

gen = my_range(3)
while True:
    try:
        print(next(gen), end=' ')
    except StopIteration:  # we want to catch this type of exceptions
        break

for e in my_range(3):
    print(e...
You really want to run this generator? 0 1 2 3
`A peculiarity of range` `range` is not a generator, although it looks like one, since it does not store the whole sequence
print('__next__' in dir(zip([], [])))
print('__next__' in dir(range(3)))
True False
Useful properties: - `range` objects are immutable (they can be dictionary keys) - they have useful attributes (`len`, `index`, `__getitem__`) - they can be iterated over multiple times `The itertools module` - The module is a set of tools for working with iterators and sequences - It contains three main types of i...
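The listed `range` properties can be demonstrated directly:

```python
r = range(10, 0, -2)          # 10, 8, 6, 4, 2
print(len(r))                 # 5
print(r[1])                   # 8  (__getitem__)
print(r.index(6))             # 2
d = {range(3): 'ok'}          # ranges are hashable, so usable as dict keys
print(list(r) == list(r))     # True: can be iterated repeatedly
```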
from itertools import count

for i in count(start=0):
    print(i, end=' ')
    if i == 5:
        break

from itertools import cycle

count = 0
for item in cycle('XYZ'):
    if count > 4:
        break
    print(item, end=' ')
    count += 1
X Y Z X Y
`The itertools module: examples`
from itertools import accumulate

for i in accumulate(range(1, 5), lambda x, y: x * y):
    print(i)

from itertools import chain

for i in chain([1, 2], [3], [4]):
    print(i)
1 2 3 4
`The itertools module: examples`
from itertools import groupby

vehicles = [('Ford', 'Taurus'), ('Dodge', 'Durango'),
            ('Chevrolet', 'Cobalt'), ('Ford', 'F150'),
            ('Dodge', 'Charger'), ('Ford', 'GT')]

sorted_vehicles = sorted(vehicles)

for key, group in groupby(sorted_vehicles, lambda x: x[0]):
    for maker, model in group:...
Cobalt is made by Chevrolet **** END OF THE GROUP *** Charger is made by Dodge Durango is made by Dodge **** END OF THE GROUP *** F150 is made by Ford GT is made by Ford Taurus is made by Ford **** END OF THE GROUP ***
***Introduction to Radar Using Python and MATLAB*** Andy Harrison - Copyright (C) 2019 Artech House. ***Coherent Detector*** The in-phase and quadrature signal components from a coherent detector may be written as (Equation 5.13) $$ x(t) = a(t) \cos(2\pi f_0 t) \cos(\phi(t)) - a(t) \sin(2 \pi f_0 t) \sin(\phi(t)) = ...
import lib_path
Apache-2.0
jupyter/Chapter05/coherent_detector.ipynb
miltondsantos/software
Set the sampling frequency (Hz), the start frequency (Hz), the end frequency (Hz), the amplitude modulation frequency (Hz) and amplitude (relative) for the sample signal
sampling_frequency = 100
start_frequency = 4
end_frequency = 25
am_amplitude = 0.1
am_frequency = 9
Calculate the bandwidth (Hz) and center frequency (Hz)
bandwidth = end_frequency - start_frequency
center_frequency = 0.5 * bandwidth + start_frequency
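With the parameter values above, the arithmetic works out as follows (a quick sanity check, restating the values locally):

```python
# Sanity check with the values set earlier
start_frequency = 4    # Hz
end_frequency = 25     # Hz

bandwidth = end_frequency - start_frequency            # 25 - 4 = 21 Hz
center_frequency = 0.5 * bandwidth + start_frequency   # 10.5 + 4 = 14.5 Hz
print(bandwidth, center_frequency)  # 21 14.5
```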
Set up the waveform
from numpy import arange, sin
from scipy.constants import pi
from scipy.signal import chirp

time = arange(sampling_frequency) / sampling_frequency
if_signal = chirp(time, start_frequency, time[-1], end_frequency)
if_signal *= (1.0 + am_amplitude * sin(2.0 * pi * am_frequency * time))
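For intuition, the linear chirp can also be computed without SciPy. This sketch assumes `scipy.signal.chirp`'s default 'linear' method, which evaluates cos(2π(f0·t + ½kt²)) with chirp rate k = (f1 − f0)/t1, using the parameter values from the cells above:

```python
import math

# Dependency-free sketch of a linear chirp with the same parameters as above
fs, f0, f1 = 100, 4, 25
t = [n / fs for n in range(fs)]
k = (f1 - f0) / t[-1]  # chirp rate in Hz/s
x = [math.cos(2 * math.pi * (f0 * ti + 0.5 * k * ti * ti)) for ti in t]
print(len(x), round(x[0], 6))  # 100 1.0 (cos(0) = 1 at t = 0)
```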
Set up the keyword args
kwargs = {'if_signal': if_signal,
          'center_frequency': center_frequency,
          'bandwidth': bandwidth,
          'sample_frequency': sampling_frequency,
          'time': time}
Calculate the baseband in-phase and quadrature signals
from Libs.receivers import coherent_detector

i_signal, q_signal = coherent_detector.iq(**kwargs)
Use the `matplotlib` routines to display the results
from matplotlib import pyplot as plt
from numpy import real, imag

# Set the figure size
plt.rcParams["figure.figsize"] = (15, 10)

# Display the results
plt.plot(time, real(i_signal), '', label='In Phase')
plt.plot(time, real(q_signal), '-.', label='Quadrature')

# Set the plot title and labels
plt.title('Cohe...
Visualizing and Analyzing Jigsaw
import pandas as pd
import re
import numpy as np
BSD-3-Clause
.ipynb_checkpoints/Visualizing and Analyzing Jigsaw-checkpoint.ipynb
dudaspm/LDA_Bias_Data
In the previous section, we explored how to generate topics from a textual dataset using LDA. But how can this be used in an application? In this section, we will look into possible ways to read the topics, as well as understand how they can be used. We will now import the preloaded data of the LDA result t...
df = pd.read_csv("https://raw.githubusercontent.com/dudaspm/LDA_Bias_Data/main/topics.csv")
df.head()
We will visualize these results to understand what major themes are present in them.
%%html <iframe src='https://flo.uri.sh/story/941631/embed' title='Interactive or visual content' class='flourish-embed-iframe' frameborder='0' scrolling='no' style='width:100%;height:600px;' sandbox='allow-same-origin allow-forms allow-scripts allow-downloads allow-popups allow-popups-to-escape-sandbox allow-top-navig...
An Overview of the analysis. From the above visualization, an anomaly that we come across is that the dataset we are examining is supposed to be related to people with physical, mental, and learning disabilities. Unfortunately, based on the topics that were extracted, we notice only a small subset of words that are rel...
headers = {"Authorization": f"Bearer api_XXXXXXXXXXXXXXXXXXXXXXXXXXX"}
To get access to this software, you will need to get an API KEY at https://huggingface.co/unitary/toxic-bert. Here is an example of what this would look like.```pythonheaders = {"Authorization": f"Bearer api_XXXXXXXXXXXXXXXXXXXXXXXXXXX"}```
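Rather than hard-coding the key, one common pattern is to read it from an environment variable and fall back to a placeholder (`HF_API_TOKEN` is a hypothetical variable name, not one the source prescribes):

```python
import os

# Hypothetical env-var name; falls back to the masked placeholder from the text
token = os.environ.get("HF_API_TOKEN", "api_XXXXXXXXXXXXXXXXXXXXXXXXXXX")
headers = {"Authorization": f"Bearer {token}"}
print(headers["Authorization"].startswith("Bearer "))  # True
```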
import requests

API_URL = "https://api-inference.huggingface.co/models/unitary/toxic-bert"

def query(payload):
    response = requests.post(API_URL, headers=headers, json=payload)
    return response.json()

query({"inputs": "addict"})
You can input words or sentences in place of `<insert word here>` in the code below to look at the results that are generated. This example can provide an idea as to how ML can be used for toxicity analysis.
query({"inputs": "<insert word here>"}) %%html <iframe src='https://flo.uri.sh/story/941681/embed' title='Interactive or visual content' class='flourish-embed-iframe' frameborder='0' scrolling='no' style='width:100%;height:600px;' sandbox='allow-same-origin allow-forms allow-scripts allow-downloads allow-popups allow-...
_____no_output_____
BSD-3-Clause
.ipynb_checkpoints/Visualizing and Analyzing Jigsaw-checkpoint.ipynb
dudaspm/LDA_Bias_Data
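For reference, a minimal sketch of pulling the top-scoring label out of a response like the one `query` returns. The nested list-of-dicts shape here is an assumption based on the usual Hugging Face text-classification output, and the scores below are mocked, not real model output:

```python
# Sketch: extract the highest-scoring label from a text-classification
# response. The [[{"label": ..., "score": ...}, ...]] shape is an
# assumption based on the standard Hugging Face pipeline output.
def top_label(response):
    scores = response[0]  # results for the first (only) input
    best = max(scores, key=lambda d: d["score"])
    return best["label"], best["score"]

# Mocked response standing in for query({"inputs": "addict"})
mock = [[{"label": "toxic", "score": 0.92},
         {"label": "insult", "score": 0.31}]]
label, score = top_label(mock)
print(label, score)  # toxic 0.92
```

A helper like this makes it easy to scan a whole list of topic words and flag the ones the model rates as most toxic.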
The BiasThe visualization shows how contextually toxic words are derived as important words within various topics related to this dataset. This can lead to any Natural Language Processing kernel learning this dataset to provide skewed analysis for the population in consideration, i.e. people with mental, physical and ...
%%html <iframe src='https://flo.uri.sh/visualisation/6867000/embed' title='Interactive or visual content' class='flourish-embed-iframe' frameborder='0' scrolling='no' style='width:100%;height:600px;' sandbox='allow-same-origin allow-forms allow-scripts allow-downloads allow-popups allow-popups-to-escape-sandbox allow-t...
_____no_output_____
BSD-3-Clause
.ipynb_checkpoints/Visualizing and Analyzing Jigsaw-checkpoint.ipynb
dudaspm/LDA_Bias_Data
It is hence important to be aware of the dataset that is being used to analyse a specific population. With LDA, we were able to understand that this dataset cannot be used as a good representation of the disabled community. To bring about a movement of unbiased AI, we need to perform such preliminary analysis and more,...
%%html <iframe src='https://flo.uri.sh/visualisation/6856937/embed' title='Interactive or visual content' class='flourish-embed-iframe' frameborder='0' scrolling='no' style='width:100%;height:600px;' sandbox='allow-same-origin allow-forms allow-scripts allow-downloads allow-popups allow-popups-to-escape-sandbox allow-...
_____no_output_____
BSD-3-Clause
.ipynb_checkpoints/Visualizing and Analyzing Jigsaw-checkpoint.ipynb
dudaspm/LDA_Bias_Data
Figure 1 - Overview
df = pd.read_csv(datdir / 'fig_1.csv') scores = df[list(map(str, range(20)))].values selected = ~np.isnan(df['Selected'].values) gens_sel = np.nonzero(selected)[0] scores_sel = np.array([np.max(scores[g]) for g in gens_sel]) ims_sel = [plt.imread(str(datdir / 'images' / 'overview' / f'gen{gen:03d}.png')) for...
_____no_output_____
MIT
figure_data/Make Plots.ipynb
willwx/XDream
Define Custom Violinplot
def violinplot2(data=None, x=None, y=None, hue=None, palette=None, linewidth=1, orient=None, order=None, hue_order=None, x_disp=None, palette_per_violin=None, hline_at_1=True, legend_palette=None, legend_kwargs=None, width=0.7, control_widt...
_____no_output_____
MIT
figure_data/Make Plots.ipynb
willwx/XDream
Figure 3 - Compare Target Nets, Layers
df = pd.read_csv(datdir/'fig_2.csv') df = df[~np.isnan(df['Rel_act'])] # remove invalid data df.head() nets = ('caffenet', 'resnet-152-v2', 'resnet-269-v2', 'inception-v3', 'inception-v4', 'inception-resnet-v2', 'placesCNN') layers = {'caffenet': ('conv2', 'conv4', 'fc6', 'fc8'), 'resnet-152-v2': ('res15_e...
_____no_output_____
MIT
figure_data/Make Plots.ipynb
willwx/XDream
Figure 5 - Compare Generators Compare representation "depth"
df = pd.read_csv(datdir / 'fig_5-repr_depth.csv') df = df[~np.isnan(df['Rel_act'])] df['Classifier, layer'] = [', '.join(tuple(a)) for a in df[['Classifier', 'Layer']].values] df.head() nets = ('caffenet', 'inception-resnet-v2') layers = {'caffenet': ('conv2', 'fc6', 'fc8'), 'inception-resnet-v2': ('classifi...
_____no_output_____
MIT
figure_data/Make Plots.ipynb
willwx/XDream
Compare training dataset
df = pd.read_csv(datdir / 'fig_5-training_set.csv') df = df[~np.isnan(df['Rel_act'])] df['Classifier, layer'] = [', '.join(tuple(a)) for a in df[['Classifier', 'Layer']].values] df.head() nets = ('caffenet', 'inception-resnet-v2') cs = ('caffenet', 'placesCNN', 'inception-resnet-v2') layers = {c: ('conv2', 'conv4', 'fc...
_____no_output_____
MIT
figure_data/Make Plots.ipynb
willwx/XDream
Figure 4 - Compare Inits
layers = ('conv2', 'conv4', 'fc6', 'fc8') layers_disp = tuple(v.capitalize() for v in layers)
_____no_output_____
MIT
figure_data/Make Plots.ipynb
willwx/XDream
Rand inits, fraction change
df = pd.read_csv(datdir/'fig_4-rand_init.csv').set_index(['Layer', 'Unit', 'Init_seed']) df = (df.drop(0, level='Init_seed') - df.xs(0, level='Init_seed')).mean(axis=0,level=('Layer','Unit')) df = df.rename({'Rel_act': 'Fraction change'}, axis=1) df = df.reset_index() df.head() palette = get_cmap('Blues')(np.linspace(0...
_____no_output_____
MIT
figure_data/Make Plots.ipynb
willwx/XDream
Rand inits, interpolation
df = pd.read_csv(datdir/'fig_4-rand_init_interp.csv').set_index(['Layer', 'Unit', 'Seed_i0', 'Seed_i1']) df = df.mean(axis=0,level=('Layer','Unit')) df2 = pd.read_csv(datdir/'fig_4-rand_init_interp-2.csv').set_index(['Layer', 'Unit']) # control conditions df2_normed = df2.divide(df[['Rel_act_loc_0.0','Rel_act_loc_1....
_____no_output_____
MIT
figure_data/Make Plots.ipynb
willwx/XDream
Per-neuron inits
df = pd.read_csv(datdir/'fig_4-per_neuron_init.csv') df.head() hue_order = ('rand', 'none', 'worst_opt', 'mid_opt', 'best_opt', 'worst_ivt', 'mid_ivt', 'best_ivt') palette = [get_cmap(main_c)(np.linspace(0.3,0.8,4)) for main_c in ('Blues', 'Greens', 'Purples')] palette = np.concatenate([[ pa...
_____no_output_____
MIT
figure_data/Make Plots.ipynb
willwx/XDream
Figure 6 - Compare Optimizers & Stoch Scales Compare optimizers
df = pd.read_csv(datdir/'fig_6-optimizers.csv') df['OCL'] = ['_'.join(v) for v in df[['Optimizer','Classifier','Layer']].values] df.head() opts = ('genetic', 'FDGD', 'NES') layers = {'caffenet': ('conv2', 'conv4', 'fc6', 'fc8'), 'inception-resnet-v2': ('classifier',)} cls = [(c, l) for c in layers for l in la...
_____no_output_____
MIT
figure_data/Make Plots.ipynb
willwx/XDream
Compare varying amounts of noise
df = pd.read_csv(datdir/'fig_6-stoch_scales.csv') df = df[~np.isnan(df['Rel_noise'])] df['Stoch_scale_plot'] = [str(int(v)) if ~np.isnan(v) else 'None' for v in df['Stoch_scale']] df.head() layers = ('conv2', 'conv4', 'fc6', 'fc8') stoch_scales = list(map(str, (5, 10, 20, 50, 75, 100, 250))) + ['None'] stoch_scales_dis...
_____no_output_____
MIT
figure_data/Make Plots.ipynb
willwx/XDream
Libraries and auxiliary functions
#load the libraries from time import sleep from kafka import KafkaConsumer import datetime as dt import pygeohash as pgh #fuctions to check the location based on the geo hash (precision =5) #function to check location between 2 data def close_location (data1,data2): print("checking location...of sender",data1.get("...
_____no_output_____
MIT
Assignment_TaskC_Streaming_Application.ipynb
tonbao30/Parallel-dataprocessing-simulation
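The location check described above compares geohashes at precision 5 (cells of roughly 4.9 km per side). A minimal sketch of that comparison, assuming each record carries a precomputed `geohash` string as in the streaming code:

```python
# Sketch: two events are considered "close" when their geohashes agree
# on the first `precision` characters (precision 5 ~ 4.9 km cells).
# Assumes records are dicts with a 'geohash' field.
def close_location(data1, data2, precision=5):
    gh1 = data1.get("geohash", "")[:precision]
    gh2 = data2.get("geohash", "")[:precision]
    return len(gh1) == precision and gh1 == gh2

a = {"sender": "sender_1", "geohash": "r1r0fs"}
b = {"sender": "sender_2", "geohash": "r1r0f2"}
print(close_location(a, b))  # True: first five characters match
```

Because geohash prefixes nest, this check needs no distance arithmetic, which keeps it cheap enough to run per record inside a streaming batch.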
Streaming Application
import os os.environ['PYSPARK_SUBMIT_ARGS'] = '--packages org.apache.spark:spark-streaming-kafka-0-8_2.11:2.3.0 pyspark-shell' import sys import time import json from pymongo import MongoClient from pyspark import SparkContext, SparkConf from pyspark.streaming import StreamingContext from pyspark.streaming.kafka impo...
------------------------------------------- Time: 2019-05-24 17:45:20 ------------------------------------------- sender_2 sender_3 sender_1 sender_1 ------------------------------------------- Time: 2019-05-24 17:45:30 ------------------------------------------- sender_3 sender_1 sender_1 ---------------------------...
MIT
Assignment_TaskC_Streaming_Application.ipynb
tonbao30/Parallel-dataprocessing-simulation
Text Summarization Sequence to Sequence Modelling Attention Mechanism Import Libraries
#import all the required libraries import numpy as np import pandas as pd import pickle from statistics import mode import nltk from nltk import word_tokenize from nltk.stem import LancasterStemmer nltk.download('wordnet') nltk.download('stopwords') nltk.download('punkt') from nltk.corpus import stopwords from tensorfl...
[nltk_data] Downloading package wordnet to /usr/share/nltk_data... [nltk_data] Package wordnet is already up-to-date! [nltk_data] Downloading package stopwords to /usr/share/nltk_data... [nltk_data] Package stopwords is already up-to-date! [nltk_data] Downloading package punkt to /usr/share/nltk_data... [nltk_data]...
MIT
text-summarization-attention-mechanism.ipynb
buddhadeb33/Text-Summarization-Attention-Mechanism
Parse the Data We’ll take a sample of 10,000 reviews to reduce the training time of our model.
#read the dataset file for text Summarizer df=pd.read_csv("../input/amazon-fine-food-reviews/Reviews.csv",nrows=10000) # df = pd.read_csv("../input/amazon-fine-food-reviews/Reviews.csv") #drop the duplicate and na values from the records df.drop_duplicates(subset=['Text'],inplace=True) df.dropna(axis=0,inplace=True) #d...
_____no_output_____
MIT
text-summarization-attention-mechanism.ipynb
buddhadeb33/Text-Summarization-Attention-Mechanism
Preprocessing Performing basic preprocessing steps is very important before we get to the model building part. Using messy and uncleaned text data is a potentially disastrous move. So in this step, we will drop all the unwanted symbols, characters, etc. from the text that do not affect the objective of our problem.Her...
contraction_mapping = {"ain't": "is not", "aren't": "are not","can't": "cannot", "'cause": "because", "could've": "could have", "couldn't": "could not", "didn't": "did not", "doesn't": "does not", "don't": "do not", "hadn't": "had not", "hasn't": "has not", "haven't": "have not", ...
_____no_output_____
MIT
text-summarization-attention-mechanism.ipynb
buddhadeb33/Text-Summarization-Attention-Mechanism
We can expand contractions in two ways: use the dictionary above, or keep the contractions file as a dataset and import it.
input_texts=[] # Text column target_texts=[] # summary column input_words=[] target_words=[] # contractions=pickle.load(open("../input/contraction/contractions.pkl","rb"))['contractions'] contractions = contraction_mapping #initialize stop words and LancasterStemmer stop_words=set(stopwords.words('english')) stemm=La...
_____no_output_____
MIT
text-summarization-attention-mechanism.ipynb
buddhadeb33/Text-Summarization-Attention-Mechanism
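The dictionary approach can be sketched on its own: lower-case the text, then replace any token found in the mapping (a two-entry sample mapping here stands in for the full `contraction_mapping` above):

```python
# Sketch of the dictionary approach: expand contractions token by token.
contractions = {"don't": "do not", "can't": "cannot"}

def expand_contractions(text, mapping):
    # Lower-case, split on whitespace, and swap in expansions where found.
    return " ".join(mapping.get(w, w) for w in text.lower().split())

print(expand_contractions("I don't know, I can't say", contractions))
# i do not know, i cannot say
```

Note that this simple split leaves punctuation attached to tokens (e.g. `know,`), which is why the notebook tokenizes and filters separately in the cleaning step.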
Data Cleaning
def clean(texts,src): texts = BeautifulSoup(texts, "lxml").text #remove the html tags words=word_tokenize(texts.lower()) #tokenize the text into words #filter words which contains \ #integers or their length is less than or equal to 3 words= list(filter(lambda w:(w.isalpha() and len(w)>=3),words)) #con...
number of input words : 10344 number of target words : 4169 maximum input length : 73 maximum target length : 17
MIT
text-summarization-attention-mechanism.ipynb
buddhadeb33/Text-Summarization-Attention-Mechanism
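The token filter inside `clean` (keep only alphabetic tokens at least three characters long) can be sketched in isolation:

```python
# Sketch of the token filter used in clean(): discard digits, punctuation,
# and very short words before stemming.
def filter_tokens(words):
    return [w for w in words if w.isalpha() and len(w) >= 3]

print(filter_tokens(["the", "a", "product", "was", "5", "great!"]))
# ['the', 'product', 'was']
```

Tokens like `great!` are dropped here only because of the attached punctuation; tokenizing with `word_tokenize` first, as the notebook does, splits the punctuation off so the word itself survives.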
Split it
#split the input and target text into 80:20 ratio or testing size of 20%. x_train,x_test,y_train,y_test=train_test_split(input_texts,target_texts,test_size=0.2,random_state=0) #train the tokenizer with all the words in_tokenizer = Tokenizer() in_tokenizer.fit_on_texts(x_train) tr_tokenizer = Tokenizer() tr_tokenizer.f...
_____no_output_____
MIT
text-summarization-attention-mechanism.ipynb
buddhadeb33/Text-Summarization-Attention-Mechanism
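What the Keras `Tokenizer` and `pad_sequences` steps do can be illustrated with a plain-Python sketch (index 0 is reserved for padding, as in Keras; this is a simplified stand-in, not the Keras implementation):

```python
# Sketch of Keras-style tokenization and padding: build a word index from
# the training texts, map texts to integer sequences, and right-pad them
# to a common length with zeros.
def fit_word_index(texts):
    index = {}
    for text in texts:
        for word in text.split():
            if word not in index:
                index[word] = len(index) + 1  # 0 is reserved for padding
    return index

def texts_to_padded(texts, index, maxlen):
    seqs = [[index[w] for w in t.split() if w in index] for t in texts]
    return [s[:maxlen] + [0] * (maxlen - len(s[:maxlen])) for s in seqs]

idx = fit_word_index(["good tasty food", "tasty snack"])
print(texts_to_padded(["tasty food", "good snack bar"], idx, 4))
# [[2, 3, 0, 0], [1, 4, 0, 0]]
```

Words absent from the training vocabulary (like `bar` above) are silently dropped, which mirrors the default `Tokenizer` behaviour when no out-of-vocabulary token is configured.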
Model Building
K.clear_session() latent_dim = 500 #create input object of total number of encoder words en_inputs = Input(shape=(max_in_len,)) en_embedding = Embedding(num_in_words+1, latent_dim)(en_inputs) #create 3 stacked LSTM layer with the shape of hidden dimension for text summarizer using deep learning #LSTM 1 en_lstm1= L...
_____no_output_____
MIT
text-summarization-attention-mechanism.ipynb
buddhadeb33/Text-Summarization-Attention-Mechanism
Decoder
# Decoder. dec_inputs = Input(shape=(None,)) dec_emb_layer = Embedding(num_tr_words+1, latent_dim) dec_embedding = dec_emb_layer(dec_inputs) #initialize decoder's LSTM layer with the output states of encoder dec_lstm = LSTM(latent_dim, return_sequences=True, return_state=True) dec_outputs, *_ = dec_lstm(dec_embed...
_____no_output_____
MIT
text-summarization-attention-mechanism.ipynb
buddhadeb33/Text-Summarization-Attention-Mechanism
Attention Layer
#Attention layer attention =Attention() attn_out = attention([dec_outputs,en_outputs3]) #Concatenate the attention output with the decoder outputs merge=Concatenate(axis=-1, name='concat_layer1')([dec_outputs,attn_out]) #Dense layer (output layer) dec_dense = Dense(num_tr_words+1, activation='softmax') dec_outputs =...
_____no_output_____
MIT
text-summarization-attention-mechanism.ipynb
buddhadeb33/Text-Summarization-Attention-Mechanism
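The attention layer scores each decoder step against every encoder step and returns a weighted sum of encoder states. A minimal dot-product attention sketch in NumPy (this illustrates the mechanism, not Keras' exact `Attention` implementation):

```python
import numpy as np

def dot_product_attention(queries, keys, values):
    # queries: (t_q, d) decoder states; keys/values: (t_k, d) encoder states
    scores = queries @ keys.T                       # (t_q, t_k) alignment scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over encoder steps
    return weights @ values                         # (t_q, d) context vectors

dec = np.ones((2, 4))   # toy decoder states
enc = np.ones((3, 4))   # toy encoder states
ctx = dot_product_attention(dec, enc, enc)
print(ctx.shape)  # (2, 4)
```

Each row of the result is a context vector for one decoder step; concatenating it with the decoder output, as the `Concatenate` layer above does, lets the dense output layer condition on both.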
Train the Model
#Model class and model summary for text Summarizer model = Model([en_inputs, dec_inputs], dec_outputs) model.summary() plot_model(model, to_file='model_plot.png', show_shapes=True, show_layer_names=True) model.compile(optimizer="rmsprop", loss="sparse_categorical_crossentropy", metrics=["accuracy"] ) history = model....
_____no_output_____
MIT
text-summarization-attention-mechanism.ipynb
buddhadeb33/Text-Summarization-Attention-Mechanism