| row_id (int64, 0–48.4k) | init_message (string, 1–342k chars) | conversation_hash (string, 32 chars) | scores (dict) |
|---|---|---|---|
40,157
|
class pswMng {
private:
    // private variables
    std::string passPhase;
public:
    // Setter
    void setpassPhase(std::string p) {
        passPhase = p;
    }
    // Getter
    std::string getpassPhase() const {
        return passPhase;
    }
};
how to make a deleter
|
97005b16872f65d05a60cbc85b116e16
|
{
"intermediate": 0.2596912682056427,
"beginner": 0.5239582061767578,
"expert": 0.21635054051876068
}
|
40,158
|
Python code to read data from a CSV file
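A minimal sketch of one way to do this with the standard library's csv module; the column names and the StringIO stand-in below are illustrative, not from the original request.

```python
import csv
import io

# StringIO stands in for a real file opened with open("data.csv", newline="")
sample = io.StringIO("name,age\nAda,36\nAlan,41\n")

reader = csv.DictReader(sample)   # maps each row to {column: value}
rows = list(reader)
print(rows[0]["name"])            # first row's "name" column
```

When pandas is available, pandas.read_csv("data.csv") is the usual shortcut.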
|
fd3b0ec04d77b853c96e98381a45081e
|
{
"intermediate": 0.536635160446167,
"beginner": 0.1791781336069107,
"expert": 0.2841867208480835
}
|
40,159
|
def add_sprite():
    """ Add cow sprite to the stage and set its speed """
    cow = codesters.Sprite("cow", -400, -150)
    cow.set_x_speed(3)
    return cow

def setup_stage():
    """ Set up the stage with background and sprites """
    stage.set_background("barn")
    stage.disable_right_wall()
    hay = codesters.Sprite("hay", -150, -150)
    hay = codesters.Sprite("hay", 0, -150)
    hay = codesters.Sprite("hay", 150, -150)

def collision(sprite, hit_sprite):
    """ Collision event for the cow to eat hay """
    if hit_sprite.get_image_name() == "hay":
        sprite.say("yum!", .2)
        hit_sprite.hide()

def main():
    """ Sets up the program and calls other functions """
    setup_stage()
    sprite = add_sprite()

main()
This program has a logic error. The cow is hungry and wants to eat the hay, but nothing happens!
RULE: An event handler links a sprite to a function that is called whenever the event is triggered.
Click Run and see what happens. Look in the code to see where the cow should be eating the hay.
Fix the program by adding an event handler to the main() function so that the collision event happens.
Click Run to test if you fixed the program. When it is fixed, click Submit and Next
|
521a52f0ff788a82d464e5e6160261ad
|
{
"intermediate": 0.42214643955230713,
"beginner": 0.2671257555484772,
"expert": 0.3107277750968933
}
|
40,160
|
Here is the ranking; why am I getting this error? [{'num_licence': 49866, 'victoires': {'V': 1, 'TD': Decimal('8'), 'TR': Decimal('15'), 'TD-TR': Decimal('-7')}, 'ratio_donner_recu': 0}, {'num_licence': 73096, 'victoires': 0, 'ratio_donner_recu': 0}, {'num_licence': 113727, 'victoires': {'V': 2, 'TD': Decimal('10'), 'TR': Decimal('10'), 'TD-TR': Decimal('0')}, 'ratio_donner_recu': 0}, {'num_licence': 126524, 'victoires': 0, 'ratio_donner_recu': 0}, {'num_licence': 166830, 'victoires': {'V': 2, 'TD': Decimal('10'), 'TR': Decimal('13'), 'TD-TR': Decimal('-3')}, 'ratio_donner_recu': 0}, {'num_licence': 177560, 'victoires': {'V': 3, 'TD': Decimal('15'), 'TR': Decimal('5'), 'TD-TR': Decimal('10')}, 'ratio_donner_recu': 0}, {'num_licence': 192891, 'victoires': 0, 'ratio_donner_recu': 0}, {'num_licence': 232027, 'victoires': 0, 'ratio_donner_recu': 0}, {'num_licence': 49866, 'victoires': 0, 'ratio_donner_recu': 0}, {'num_licence': 73096, 'victoires': {'V': 2, 'TD': Decimal('10'), 'TR': Decimal('10'), 'TD-TR': Decimal('0')}, 'ratio_donner_recu': 0}]
0
[{'num_licence': 49866, 'victoires': {'V': 1, 'TD': Decimal('8'), 'TR': Decimal('15'), 'TD-TR': Decimal('-7')}, 'ratio_donner_recu': 0}, {'num_licence': 73096, 'victoires': 0, 'ratio_donner_recu': 0}, {'num_licence': 113727, 'victoires': {'V': 2, 'TD': Decimal('10'), 'TR': Decimal('10'), 'TD-TR': Decimal('0')}, 'ratio_donner_recu': 0}, {'num_licence': 126524, 'victoires': 0, 'ratio_donner_recu': 0}, {'num_licence': 166830, 'victoires': {'V': 2, 'TD': Decimal('10'), 'TR': Decimal('13'), 'TD-TR': Decimal('-3')}, 'ratio_donner_recu': 0}, {'num_licence': 177560, 'victoires': {'V': 3, 'TD': Decimal('15'), 'TR': Decimal('5'), 'TD-TR': Decimal('10')}, 'ratio_donner_recu': 0}, {'num_licence': 192891, 'victoires': 0, 'ratio_donner_recu': 0}, {'num_licence': 232027, 'victoires': 0, 'ratio_donner_recu': 0}, {'num_licence': 49866, 'victoires': 0, 'ratio_donner_recu': 0}, {'num_licence': 73096, 'victoires': {'V': 2, 'TD': Decimal('10'), 'TR': Decimal('10'), 'TD-TR': Decimal('0')}, 'ratio_donner_recu': 0}, {'num_licence': 113727, 'victoires': 0, 'ratio_donner_recu': 0}]
0
[{'num_licence': 49866, 'victoires': {'V': 1, 'TD': Decimal('8'), 'TR': Decimal('15'), 'TD-TR': Decimal('-7')}, 'ratio_donner_recu': 0}, {'num_licence': 73096, 'victoires': 0, 'ratio_donner_recu': 0}, {'num_licence': 113727, 'victoires': {'V': 2, 'TD': Decimal('10'), 'TR': Decimal('10'), 'TD-TR': Decimal('0')}, 'ratio_donner_recu': 0}, {'num_licence': 126524, 'victoires': 0, 'ratio_donner_recu': 0}, {'num_licence': 166830, 'victoires': {'V': 2, 'TD': Decimal('10'), 'TR': Decimal('13'), 'TD-TR': Decimal('-3')}, 'ratio_donner_recu': 0}, {'num_licence': 177560, 'victoires': {'V': 3, 'TD': Decimal('15'), 'TR': Decimal('5'), 'TD-TR': Decimal('10')}, 'ratio_donner_recu': 0}, {'num_licence': 192891, 'victoires': 0, 'ratio_donner_recu': 0}, {'num_licence': 232027, 'victoires': 0, 'ratio_donner_recu': 0}, {'num_licence': 49866, 'victoires': 0, 'ratio_donner_recu': 0}, {'num_licence': 73096, 'victoires': {'V': 2, 'TD': Decimal('10'), 'TR': Decimal('10'), 'TD-TR': Decimal('0')}, 'ratio_donner_recu': 0}, {'num_licence': 113727, 'victoires': 0, 'ratio_donner_recu': 0}, {'num_licence': 126524, 'victoires': {'V': 2, 'TD': Decimal('10'), 'TR': Decimal('10'), 'TD-TR': Decimal('0')}, 'ratio_donner_recu': 0}]
0
[{'num_licence': 49866, 'victoires': {'V': 1, 'TD': Decimal('8'), 'TR': Decimal('15'), 'TD-TR': Decimal('-7')}, 'ratio_donner_recu': 0}, {'num_licence': 73096, 'victoires': 0, 'ratio_donner_recu': 0}, {'num_licence': 113727, 'victoires': {'V': 2, 'TD': Decimal('10'), 'TR': Decimal('10'), 'TD-TR': Decimal('0')}, 'ratio_donner_recu': 0}, {'num_licence': 126524, 'victoires': 0, 'ratio_donner_recu': 0}, {'num_licence': 166830, 'victoires': {'V': 2, 'TD': Decimal('10'), 'TR': Decimal('13'), 'TD-TR': Decimal('-3')}, 'ratio_donner_recu': 0}, {'num_licence': 177560, 'victoires': {'V': 3, 'TD': Decimal('15'), 'TR': Decimal('5'), 'TD-TR': Decimal('10')}, 'ratio_donner_recu': 0}, {'num_licence': 192891, 'victoires': 0, 'ratio_donner_recu': 0}, {'num_licence': 232027, 'victoires': 0, 'ratio_donner_recu': 0}, {'num_licence': 49866, 'victoires': 0, 'ratio_donner_recu': 0}, {'num_licence': 73096, 'victoires': {'V': 2, 'TD': Decimal('10'), 'TR': Decimal('10'), 'TD-TR': Decimal('0')}, 'ratio_donner_recu': 0}, {'num_licence': 113727, 'victoires': 0, 'ratio_donner_recu': 0}, {'num_licence': 126524, 'victoires': {'V': 2, 'TD': Decimal('10'), 'TR': Decimal('10'), 'TD-TR': Decimal('0')}, 'ratio_donner_recu': 0}, {'num_licence': 166830, 'victoires': 0, 'ratio_donner_recu': 0}]
0
[{'num_licence': 49866, 'victoires': {'V': 1, 'TD': Decimal('8'), 'TR': Decimal('15'), 'TD-TR': Decimal('-7')}, 'ratio_donner_recu': 0}, {'num_licence': 73096, 'victoires': 0, 'ratio_donner_recu': 0}, {'num_licence': 113727, 'victoires': {'V': 2, 'TD': Decimal('10'), 'TR': Decimal('10'), 'TD-TR': Decimal('0')}, 'ratio_donner_recu': 0}, {'num_licence': 126524, 'victoires': 0, 'ratio_donner_recu': 0}, {'num_licence': 166830, 'victoires': {'V': 2, 'TD': Decimal('10'), 'TR': Decimal('13'), 'TD-TR': Decimal('-3')}, 'ratio_donner_recu': 0}, {'num_licence': 177560, 'victoires': {'V': 3, 'TD': Decimal('15'), 'TR': Decimal('5'), 'TD-TR': Decimal('10')}, 'ratio_donner_recu': 0}, {'num_licence': 192891, 'victoires': 0, 'ratio_donner_recu': 0}, {'num_licence': 232027, 'victoires': 0, 'ratio_donner_recu': 0}, {'num_licence': 49866, 'victoires': 0, 'ratio_donner_recu': 0}, {'num_licence': 73096, 'victoires': {'V': 2, 'TD': Decimal('10'), 'TR': Decimal('10'), 'TD-TR': Decimal('0')}, 'ratio_donner_recu': 0}, {'num_licence': 113727, 'victoires': 0, 'ratio_donner_recu': 0}, {'num_licence': 126524, 'victoires': {'V': 2, 'TD': Decimal('10'), 'TR': Decimal('10'), 'TD-TR': Decimal('0')}, 'ratio_donner_recu': 0}, {'num_licence': 166830, 'victoires': 0, 'ratio_donner_recu': 0}, {'num_licence': 177560, 'victoires': 0, 'ratio_donner_recu': 0}]
0
[{'num_licence': 49866, 'victoires': {'V': 1, 'TD': Decimal('8'), 'TR': Decimal('15'), 'TD-TR': Decimal('-7')}, 'ratio_donner_recu': 0}, {'num_licence': 73096, 'victoires': 0, 'ratio_donner_recu': 0}, {'num_licence': 113727, 'victoires': {'V': 2, 'TD': Decimal('10'), 'TR': Decimal('10'), 'TD-TR': Decimal('0')}, 'ratio_donner_recu': 0}, {'num_licence': 126524, 'victoires': 0, 'ratio_donner_recu': 0}, {'num_licence': 166830, 'victoires': {'V': 2, 'TD': Decimal('10'), 'TR': Decimal('13'), 'TD-TR': Decimal('-3')}, 'ratio_donner_recu': 0}, {'num_licence': 177560, 'victoires': {'V': 3, 'TD': Decimal('15'), 'TR': Decimal('5'), 'TD-TR': Decimal('10')}, 'ratio_donner_recu': 0}, {'num_licence': 192891, 'victoires': 0, 'ratio_donner_recu': 0}, {'num_licence': 232027, 'victoires': 0, 'ratio_donner_recu': 0}, {'num_licence': 49866, 'victoires': 0, 'ratio_donner_recu': 0}, {'num_licence': 73096, 'victoires': {'V': 2, 'TD': Decimal('10'), 'TR': Decimal('10'), 'TD-TR': Decimal('0')}, 'ratio_donner_recu': 0}, {'num_licence': 113727, 'victoires': 0, 'ratio_donner_recu': 0}, {'num_licence': 126524, 'victoires': {'V': 2, 'TD': Decimal('10'), 'TR': Decimal('10'), 'TD-TR': Decimal('0')}, 'ratio_donner_recu': 0}, {'num_licence': 166830, 'victoires': 0, 'ratio_donner_recu': 0}, {'num_licence': 177560, 'victoires': 0, 'ratio_donner_recu': 0}, {'num_licence': 192891, 'victoires': {'V': 2, 'TD': Decimal('10'), 'TR': Decimal('10'), 'TD-TR': Decimal('0')}, 'ratio_donner_recu': 0}]
0
TypeError
TypeError: '<' not supported between instances of 'int' and 'dict'
Traceback (most recent call last)
File "C:\Users\alexa\AppData\Roaming\Python\Python310\site-packages\flask\app.py", line 1478, in __call__
return self.wsgi_app(environ, start_response)
File "C:\Users\alexa\AppData\Roaming\Python\Python310\site-packages\flask\app.py", line 1458, in wsgi_app
response = self.handle_exception(e)
File "C:\Users\alexa\AppData\Roaming\Python\Python310\site-packages\flask\app.py", line 1455, in wsgi_app
response = self.full_dispatch_request()
File "C:\Users\alexa\AppData\Roaming\Python\Python310\site-packages\flask\app.py", line 869, in full_dispatch_request
rv = self.handle_user_exception(e)
File "C:\Users\alexa\AppData\Roaming\Python\Python310\site-packages\flask\app.py", line 867, in full_dispatch_request
rv = self.dispatch_request()
File "C:\Users\alexa\AppData\Roaming\Python\Python310\site-packages\flask\app.py", line 852, in dispatch_request
return self.ensure_sync(self.view_functions[rule.endpoint])(**view_args)
File "C:\Users\alexa\Desktop\COMP_ESCRIME\appli\views.py", line 995, in classement_provisioire
etablir_classement_poule(id_comp)
File "C:\Users\alexa\Desktop\COMP_ESCRIME\appli\models.py", line 867, in etablir_classement_poule
TypeError: '<' not supported between instances of 'int' and 'dict'
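The dumps above show 'victoires' holding a dict for some players and a plain 0 for others, so any sort or comparison over those values raises exactly this TypeError. A sketch of a key function that normalizes both shapes before sorting; the field names come from the dump, but the chosen sort criteria (wins, then touch difference) are an assumption:

```python
from decimal import Decimal

classement = [
    {'num_licence': 49866,
     'victoires': {'V': 1, 'TD': Decimal('8'), 'TR': Decimal('15'), 'TD-TR': Decimal('-7')}},
    {'num_licence': 73096, 'victoires': 0},
]

def cle_victoires(joueur):
    """Map both shapes of 'victoires' to a comparable (wins, TD-TR) tuple."""
    v = joueur['victoires']
    if isinstance(v, dict):
        return (v['V'], v['TD-TR'])
    return (0, Decimal('0'))   # a bare 0 means no recorded wins

classement.sort(key=cle_victoires, reverse=True)
print([j['num_licence'] for j in classement])
```

The same key function would go into the sort inside etablir_classement_poule.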
|
da1a333553d66e14381def145ae79b41
|
{
"intermediate": 0.24898651242256165,
"beginner": 0.5372166633605957,
"expert": 0.21379688382148743
}
|
40,161
|
Refactor the following using the Playwright library:
import os
import time
import logging
from concurrent.futures import ProcessPoolExecutor
from selenium import webdriver
from selenium.common import TimeoutException, NoSuchElementException
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.chrome.service import Service as ChromeService
# Create a logs directory if it doesn't exist
if not os.path.exists('logs'):
    os.makedirs('logs')

start = time.time()

def threading_task(
    url_param, process_id, name_ids, residence_ids, birth_years, usernames, passwords
):
    process_log_path = f'logs/{time.strftime("%Y-%m-%d_%H-%M-%S")}/process_{process_id}.log'
    os.makedirs(os.path.dirname(process_log_path), exist_ok=True)
    logger = logging.getLogger(__name__)
    logger.setLevel(logging.INFO)
    file_handler = logging.FileHandler(process_log_path)
    formatter = logging.Formatter('%(asctime)s - %(levelname)s - %(message)s', datefmt='%Y-%m-%d %H:%M:%S')
    file_handler.setFormatter(formatter)
    logger.addHandler(file_handler)
    chrome_options = webdriver.ChromeOptions()
    service = ChromeService(executable_path="chromedriver.exe")
    driver = webdriver.Chrome(options=chrome_options, service=service)
    for name_id, residence_id, birth_year, username, password in zip(
        name_ids, residence_ids, birth_years, usernames, passwords
    ):
        threading(driver, url_param, process_id, name_id, residence_id, birth_year, username, password, logger)
    driver.quit()

def threading(
    driver, url_param, process_id, name_id, residence_id, birth_year, username, password, logger
):
    driver.get(url_param)
    start_time = time.time()
    try:
        username_field = WebDriverWait(driver, 10).until(
            EC.visibility_of_element_located(
                (
                    By.ID,
                    "EmaratechSG_Theme_wtLayoutBlock_block_wtMainContent_wtLoginBlock_WebPatterns_wtNavigationTabs_block_wtContent2_wtEstablishmentLogin_wtinpUsername",
                )
            )
        )
        logger.info(f"Clicked new button with id {username_field.get_attribute('id')}")
        username_field.send_keys(username)
        logger.info("Username entered successfully.")
        password_field = WebDriverWait(driver, 10).until(
            EC.visibility_of_element_located(
                (
                    By.ID,
                    "EmaratechSG_Theme_wtLayoutBlock_block_wtMainContent_wtLoginBlock_WebPatterns_wtNavigationTabs_block_wtContent2_wtEstablishmentLogin_wtinpPassword",
                )
            )
        )
        logger.info(f"Clicked new button with id {password_field.get_attribute('id')}")
        password_field.send_keys(password)
        logger.info("Password entered successfully.")
        submit_password = WebDriverWait(driver, 10).until(
            EC.element_to_be_clickable(
                (
                    By.ID,
                    "EmaratechSG_Theme_wtLayoutBlock_block_wtMainContent_wtLoginBlock_WebPatterns_wtNavigationTabs_block_wtContent2_wtEstablishmentLogin_wt15",
                )
            )
        )
        logger.info(f"Clicked checkbox with id {submit_password.get_attribute('id')}")
        submit_password.click()
        logger.info("Password submitted successfully.")
        new_button = WebDriverWait(driver, 10).until(
            EC.element_to_be_clickable(
                (By.ID, "EmaratechSG_Theme_wt11_block_wtMainContent_wt6_wtRow2")
            )
        )
        logger.info(f"Clicked new button with id {new_button.get_attribute('id')}")
        new_button.click()
        logger.info("Button clicked successfully.")
        dropdown = WebDriverWait(driver, 10).until(
            EC.element_to_be_clickable((By.ID, "select2-chosen-1"))
        )
        logger.info(f"Clicked dropdown with id {dropdown.get_attribute('id')}")
        dropdown.click()
        logger.info("Dropdown clicked successfully.")
        option_element = WebDriverWait(driver, 10).until(
            EC.visibility_of_element_located((By.ID, "select2-result-label-7"))
        )
        logger.info(f"Selected option from dropdown {option_element.get_attribute('id')}")
        option_element.click()
        logger.info("Selected option successfully.")
        residence_input = WebDriverWait(driver, 10).until(
            EC.visibility_of_element_located(
                (
                    By.ID,
                    "EmaratechSG_Theme_wt32_block_wtMainContent_wt50_wtSubmitterResidenceNumber",
                )
            )
        )
        logger.info(f"Entered residence ID with id {residence_input.get_attribute('id')}")
        residence_input.send_keys(residence_id)
        logger.info("Entered residence ID successfully.")
        country_dropdown = WebDriverWait(driver, 10).until(
            EC.element_to_be_clickable(
                (
                    By.ID,
                    "s2id_EmaratechSG_Theme_wt32_block_wtMainContent_wt50_wtApplicantNationality",
                )
            )
        )
        logger.info(f"Clicking country dropdown with id {country_dropdown.get_attribute('id')}")
        country_dropdown.click()
        logger.info("Clicked country dropdown ID successfully.")
        india_option = WebDriverWait(driver, 10).until(
            EC.element_to_be_clickable(
                (
                    By.XPATH,
                    "//div[contains(@id, 'select2-result-label-') and contains(text(), 'INDIA')]",
                )
            )
        )
        logger.info(f"Selected India from country dropdown with id {india_option.get_attribute('id')}")
        india_option.click()
        logger.info("Selected India from country dropdown successfully.")
        birth_year_input = WebDriverWait(driver, 10).until(
            EC.visibility_of_element_located(
                (
                    By.ID,
                    "EmaratechSG_Theme_wt32_block_wtMainContent_wt50_wtApplicantYearOfBirth",
                )
            )
        )
        logger.info(f"Entered birth year with id {birth_year_input.get_attribute('id')}")
        birth_year_input.send_keys(birth_year)
        logger.info("Entered birth year successfully.")
        login_container = WebDriverWait(driver, 20).until(
            EC.visibility_of_element_located((By.CLASS_NAME, "loginContainer"))
        )
        login_container.click()
        logout = WebDriverWait(driver, 20).until(
            EC.visibility_of_element_located(
                (
                    By.ID,
                    "EmaratechSG_Theme_wt32_block_wtHeader_SmartChannels_Th_wt45_block_wtwbThHeaderGdrfa_wtMenuItems_wtTopAccessibilityBar_wt53_WebPatterns_wt21_block_wtMenuLink6_wt15",
                )
            )
        )
        logout.click()
        end_time = time.time()
        duration = end_time - start_time
        logger.info(f"Thread ID: {name_id}, Duration: {duration:.2f} seconds")
        logger.info(
            f"Process ID: {process_id}, Thread Name ID: {name_id}, Residence ID: {residence_id}, Birth Year: {birth_year}, Username: {username}, Password: {password}"
        )
    except TimeoutException as te:
        logger.error(f"Timeout occurred in thread {name_id}: {te}")
    except NoSuchElementException as nse:
        logger.error(f"Element not found in thread {name_id}: {nse}")
    except Exception as e:
        logger.error(f"An error occurred in thread {name_id}: {e}")

if __name__ == "__main__":
    url = "https://smart.gdrfad.gov.ae/HomePage.aspx?GdfraLocale=en-US"
    num_threads = 3
    user_groups = {
        1: {
            "name_ids": ["Muskan 1"] * num_threads,
            "residence_ids": ["444"] * num_threads,
            "birth_years": ["2000"] * num_threads,
            "usernames": ["muskan123"] * num_threads,
            "passwords": ["Kumari@3011"] * num_threads
        },
        2: {
            "name_ids": ["Muskan 2"] * num_threads,
            "residence_ids": ["555"] * num_threads,
            "birth_years": ["1999"] * num_threads,
            "usernames": ["muskankum"] * num_threads,
            "passwords": ["Muskan@3012"] * num_threads
        },
        3: {
            "name_ids": ["Pratiksha"] * num_threads,
            "residence_ids": ["555"] * num_threads,
            "birth_years": ["1999"] * num_threads,
            "usernames": ["Pratiksha122"] * num_threads,
            "passwords": ["P@ss123455"] * num_threads
        }
        # Uncomment and add more groups as needed
        # 4: {
        #     "name_ids": ["Sara"] * num_threads,
        #     "residence_ids": ["555"] * num_threads,
        #     "birth_years": ["1999"] * num_threads,
        #     "usernames": ["SaraShahh"] * num_threads,
        #     "passwords": ["Abcd@123"] * num_threads
        # },
        # 5: {
        #     "name_ids": ["Kat"] * num_threads,
        #     "residence_ids": ["555"] * num_threads,
        #     "birth_years": ["1999"] * num_threads,
        #     "usernames": ["katz"] * num_threads,
        #     "passwords": ["P@ss12344444"] * num_threads
        # }
    }
    with ProcessPoolExecutor(max_workers=len(user_groups)) as exe:
        for group_id, data in user_groups.items():
            exe.submit(
                threading_task,
                url,
                group_id,
                data["name_ids"],
                data["residence_ids"],
                data["birth_years"],
                data["usernames"],
                data["passwords"]
            )
    end = time.time()
    print(end - start)
|
96444a9d6345635ad8eb0d4d8fac8c9f
|
{
"intermediate": 0.370364785194397,
"beginner": 0.48584407567977905,
"expert": 0.14379113912582397
}
|
40,162
|
Laravel flash message not working; it doesn't display on the front-end.
|
2ae108ed5bbde80456aa53dcb52cb06f
|
{
"intermediate": 0.3127191960811615,
"beginner": 0.38039112091064453,
"expert": 0.30688974261283875
}
|
40,163
|
Keep the following in mind; I will then give you my code, and you should alter it according to the requirement below.
State: The RL agent processes the circuit graph component by component. For a circuit with n components in topology graph G, the state s_k for the kth component is defined as s_k = (k, t, h),
where k is the one-hot representation of the transistor index, t is the one-hot representation of the component type, and h is the selected model feature vector for the component, which further distinguishes the different component types. For NMOS and PMOS, h holds the model parameters; for the bias current, capacitor and resistor, we set the model parameters to zeros. For instance, consider a circuit with ten components of five different kinds (NMOS, PMOS, R, C, Ib) and a five-dimensional model feature vector.
Action Space: The action vector varies across component types because the parameters to search are not the same. We use a continuous action space to determine the transistor sizes even though we will round them to discrete values; a discrete action space would lose the relative-order information and would also be too large.
Action Representation: Inspired by human designers, who iterate with fine-grained tuning steps to find optimal device parameters, we use a discrete action space to tune device parameters. For each tunable parameter x of a device (e.g., the width and finger number of a transistor), there are three possible actions at each step: increasing (x + dx), keeping (x + 0), or decreasing (x - dx) the parameter, where dx is the smallest unit by which the parameter can be updated within its bound [x_min, x_max]. Assuming n device parameters in total, the output of the policy network is an n x 3 probability distribution matrix, with each row corresponding to one parameter. The action is taken based on this probability distribution.
Reward Function: The reward is directly related to the design goal. We define the reward r_t at each time step t as r_t = q if q < 0, or r_t = R if q = 0, where q = sum_i min{(s_{i,t} - s_i*)/(s_{i,t} + s_i*), 0} is a normalized difference between the obtained specifications s_t and the given specifications s*. The upper bound of q is set to 0 to avoid over-optimizing the parameters once the given specifications are reached; all specifications are equally important. We also give a large reward (R = 10) to encourage the agent if the design goals are reached at some step. The episode return R_{s_0,g*} of searching optimal device parameters for the given goals g*, starting from an initial state s_0, is the accumulated reward of all steps: R_{s_0,g*} = sum_t r_t. Our goal is to train a good policy to maximize R_{s_0,g*}.
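The reward described above can be sketched directly; s_t are the obtained specifications, s_star the targets, and the bonus R = 10 applies once q reaches its upper bound of 0:

```python
def reward(s_t, s_star, bonus=10.0):
    """q = sum_i min((s_i - g_i) / (s_i + g_i), 0); bonus once all specs are met."""
    q = sum(min((s - g) / (s + g), 0.0) for s, g in zip(s_t, s_star))
    return bonus if q == 0 else q

print(reward([1.0, 2.0], [1.0, 2.0]))  # all specs met -> 10.0
print(reward([0.5, 2.0], [1.0, 2.0]))  # one spec short -> negative
```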
|
804b18005f3c46cb7646212c63e8f42b
|
{
"intermediate": 0.2395201027393341,
"beginner": 0.19606654345989227,
"expert": 0.56441330909729
}
|
40,164
|
@foreach (session('flash_notification', collect())->toArray() as $message)
AIZ.plugins.notify('{{ $message['level'] }}', '{{ $message['message'] }}');
@endforeach
The notify message does not pop up.
|
4a4c0c9593c78583df716cecfd8c1c4e
|
{
"intermediate": 0.3820097744464874,
"beginner": 0.28486326336860657,
"expert": 0.3331269323825836
}
|
40,165
|
# To clarify, dict is the dictionary in which I have stored my values (questions and answers)
questions_dict = {
    "What is the capital of France?": "Paris",
    "What color do you get when you mix blue and yellow?": "Green",
    "Who is the president of the USA?": "Joe Biden",
    "What is the name of the fairy in Peter Pan?": "Tinker Bell",
    "What do bees produce?": "Honey",
}

score = 0
total_questions = len(questions_dict)

for question, correct_answer in questions_dict.items():
    user_answer = input(question + " ").strip()
    if user_answer.lower() == correct_answer.lower():
        score += 1
        print("Correct!\n")
    else:
        print(f"Wrong! The correct answer is {correct_answer}.\n")

print(f"You got {score} out of {total_questions} questions correct.")
are there any variables in this code
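One mechanical way to answer this: Python's ast module can list every name the code assigns, i.e. its variables. A sketch; the embedded string is a trimmed stand-in for the program above:

```python
import ast

code = """
score = 0
total_questions = len(questions_dict)
for question, correct_answer in questions_dict.items():
    user_answer = input(question + " ").strip()
"""

names = set()
for node in ast.walk(ast.parse(code)):
    # A Name node with Store context is a variable being assigned
    if isinstance(node, ast.Name) and isinstance(node.ctx, ast.Store):
        names.add(node.id)

print(sorted(names))
```

This finds score, total_questions, question, correct_answer and user_answer; questions_dict is also a variable, defined earlier in the full program.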
|
d3b2635970eeac89c8832941dd6f05a1
|
{
"intermediate": 0.2658253014087677,
"beginner": 0.5195912718772888,
"expert": 0.21458348631858826
}
|
40,166
|
Explain this joke to me
|
f693023b454b29986f1a28fa60f1cca1
|
{
"intermediate": 0.2879602909088135,
"beginner": 0.5039299130439758,
"expert": 0.20810973644256592
}
|
40,167
|
**Thank you**
|
53f0f57cab2519409999b065d213803a
|
{
"intermediate": 0.34148427844047546,
"beginner": 0.2575589716434479,
"expert": 0.4009567201137543
}
|
40,168
|
**Generate longer text, but monospace text**
|
a2e567319367599fdfb2954fbf4bee8e
|
{
"intermediate": 0.3239867389202118,
"beginner": 0.17022740840911865,
"expert": 0.5057858228683472
}
|
40,169
|
**Generate longer text, but monospace text**
|
17d6f83b6c748ee747a07036ac781179
|
{
"intermediate": 0.3239867389202118,
"beginner": 0.17022740840911865,
"expert": 0.5057858228683472
}
|
40,170
|
Deepika wants her server to be free from intruders. She wants to detect whether a received client request is legitimate or an attack (normal/attack). Design an intrusion detection model to help Deepika using the AdaBoost classifier.
Give me a Python program.
FEATURE SELECTION
rfc = RandomForestClassifier();
Importance of feature in visual plots
# Classification report
# Plotting the confusion matrix with orange colormap
# Confusion Matrix
# Plotting training and test accuracy
# Evaluate the model on training and test accuracy
training and test accuracy and confusion matrix and heat map
Testcsv contains:
duration protocol_type service flag src_bytes dst_bytes land \
0 0 4 111 12 0 0 0
1 0 4 111 12 0 0 0
2 2 4 85 20 12983 0 0
3 0 3 79 20 20 0 0
4 1 4 121 13 0 15 0
wrong_fragment urgent hot ... dst_host_count dst_host_srv_count \
0 0 0 0 ... 255 10
1 0 0 0 ... 255 1
2 0 0 0 ... 134 86
3 0 0 0 ... 3 57
4 0 0 0 ... 29 86
dst_host_same_srv_rate dst_host_diff_srv_rate \
0 0.04 0.06
1 0.00 0.06
2 0.61 0.04
3 1.00 0.00
4 0.31 0.17
dst_host_same_src_port_rate dst_host_srv_diff_host_rate \
0 0.00 0.00
1 0.00 0.00
2 0.61 0.02
3 1.00 0.28
4 0.03 0.02
dst_host_serror_rate dst_host_srv_serror_rate dst_host_rerror_rate \
0 0.0 0.0 1.00
1 0.0 0.0 1.00
2 0.0 0.0 0.00
3 0.0 0.0 0.00
4 0.0 0.0 0.83
dst_host_srv_rerror_rate
0 1.00
1 1.00
2 0.00
3 0.00
4 0.71
[5 rows x 41 columns]
Traincsv contains:
duration protocol_type service flag src_bytes dst_bytes land \
0 0 1 11 10 491 0 0
1 0 2 36 10 146 0 0
2 0 1 41 6 0 0 0
3 0 1 15 10 232 8153 0
4 0 1 15 10 199 420 0
wrong_fragment urgent hot ... dst_host_srv_count \
0 0 0 0 ... 25
1 0 0 0 ... 1
2 0 0 0 ... 26
3 0 0 0 ... 255
4 0 0 0 ... 255
dst_host_same_srv_rate dst_host_diff_srv_rate \
0 0.17 0.03
1 0.00 0.60
2 0.10 0.05
3 1.00 0.00
4 1.00 0.00
dst_host_same_src_port_rate dst_host_srv_diff_host_rate \
0 0.17 0.00
1 0.88 0.00
2 0.00 0.00
3 0.03 0.04
4 0.00 0.00
dst_host_serror_rate dst_host_srv_serror_rate dst_host_rerror_rate \
0 0.00 0.00 0.05
1 0.00 0.00 0.00
2 1.00 1.00 0.00
3 0.03 0.01 0.00
4 0.00 0.00 0.00
dst_host_srv_rerror_rate class
0 0.00 normal
1 0.00 normal
2 0.00 anomaly
3 0.01 normal
4 0.00 normal
[5 rows x 42 columns]
class: normal (or) anomaly
dataset fetched from https://www.kaggle.com/code/aman1801/network-intrusion-detection-upgraded/input
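A real answer would use sklearn.ensemble.AdaBoostClassifier on the train/test CSVs shown, but the core AdaBoost idea (weighted weak learners, reweighting toward mistakes) can be sketched in pure Python on toy data; everything below is illustrative, not the NSL-KDD pipeline:

```python
import math

# Toy separable data: one feature, labels in {-1 (attack), +1 (normal)}
X = [[0.0], [1.0], [2.0], [3.0], [4.0], [5.0]]
y = [1, 1, 1, -1, -1, -1]

def stump_predict(x, feat, thresh, sign):
    """A decision stump: one feature, one threshold."""
    return sign if x[feat] <= thresh else -sign

def best_stump(X, y, w):
    """Pick the stump with the lowest weighted error."""
    best = None
    for feat in range(len(X[0])):
        for thresh in sorted({x[feat] for x in X}):
            for sign in (1, -1):
                err = sum(wi for xi, yi, wi in zip(X, y, w)
                          if stump_predict(xi, feat, thresh, sign) != yi)
                if best is None or err < best[0]:
                    best = (err, feat, thresh, sign)
    return best

def adaboost(X, y, rounds=5):
    n = len(X)
    w = [1.0 / n] * n                      # uniform sample weights
    ensemble = []
    for _ in range(rounds):
        err, feat, thresh, sign = best_stump(X, y, w)
        err = max(err, 1e-10)              # avoid log(0) on a perfect stump
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, feat, thresh, sign))
        # Increase the weight of misclassified samples, then renormalize
        w = [wi * math.exp(-alpha * yi * stump_predict(xi, feat, thresh, sign))
             for xi, yi, wi in zip(X, y, w)]
        total = sum(w)
        w = [wi / total for wi in w]
    return ensemble

def predict(ensemble, x):
    score = sum(a * stump_predict(x, f, t, s) for a, f, t, s in ensemble)
    return 1 if score >= 0 else -1

model = adaboost(X, y)
print([predict(model, x) for x in X])
```

With scikit-learn, the equivalent shape is AdaBoostClassifier(n_estimators=...).fit(X_train, y_train), followed by the classification report and confusion-matrix heatmap outlined above.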
|
2afdf133085e2eea993c780a0d82e8ee
|
{
"intermediate": 0.38951265811920166,
"beginner": 0.3043423891067505,
"expert": 0.3061448931694031
}
|
40,171
|
I have this script "import gradio as gr

def greet(name):
    return "Hello " + name + "!!"

iface = gr.Interface(fn=greet, inputs="text", outputs="text")
iface.launch()
" I want to add a video on the left and the YOLO prediction part on the right.
|
3ed609903b17b8de67524136fb72fbc5
|
{
"intermediate": 0.44310295581817627,
"beginner": 0.27042290568351746,
"expert": 0.28647416830062866
}
|
40,172
|
Please create the froshims.db for the following flask app:
# Implements a registration form, storing registrants in a SQLite database, with support for deregistration

from cs50 import SQL
from flask import Flask, redirect, render_template, request

app = Flask(__name__)

db = SQL("sqlite:///froshims.db")

SPORTS = [
    "Basketball",
    "Soccer",
    "Ultimate Frisbee"
]

@app.route("/")
def index():
    return render_template("index.html", sports=SPORTS)

@app.route("/deregister", methods=["POST"])
def deregister():
    # Forget registrant
    id = request.form.get("id")
    if id:
        db.execute("DELETE FROM registrants WHERE id = ?", id)
    return redirect("/registrants")

@app.route("/register", methods=["POST"])
def register():
    # Validate submission
    name = request.form.get("name")
    sport = request.form.get("sport")
    if not name or sport not in SPORTS:
        return render_template("failure.html")

    # Remember registrant
    db.execute("INSERT INTO registrants (name, sport) VALUES(?, ?)", name, sport)

    # Confirm registration
    return redirect("/registrants")

@app.route("/registrants")
def registrants():
    registrants = db.execute("SELECT * FROM registrants")
    return render_template("registrants.html", registrants=registrants)
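A sketch of one way to create froshims.db; the table name and its columns (id, name, sport) are inferred from the app's INSERT, DELETE and SELECT statements:

```python
import sqlite3

# Create froshims.db with the single table the app queries
conn = sqlite3.connect("froshims.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS registrants (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        name TEXT NOT NULL,
        sport TEXT NOT NULL
    )
""")
conn.commit()
conn.close()
```

Equivalently, run the same CREATE TABLE statement from a shell with sqlite3 froshims.db.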
|
1e21e4bcb72485c43a88ab2a3b50f2c3
|
{
"intermediate": 0.6464335918426514,
"beginner": 0.20223434269428253,
"expert": 0.1513320654630661
}
|
40,173
|
Fix this for me: <body bgcolor="#0000ff" background=url(https://cdn.hovia.com/app/uploads/Green-Tropical-Plant-Wallpaper-Mural-Plain-820x532.jpg")>
|
eb49fdd74fa7f1ff8acaba0df874da74
|
{
"intermediate": 0.21574677526950836,
"beginner": 0.40125760436058044,
"expert": 0.38299560546875
}
|
40,174
|
How can I display data of this type with indexing, like JSON?
[(INITIAL, END_PKI, {'subsystem_abbr': 'MGR', '@timestamp': '2024-02-20T09:42:56.000Z', 'event': {'original': 'Feb 20 09:42:56 15[MGR] <1> checkin of IKE_SA successful'}, 'message': '%{subsytem_abbr}: IKE_SA manager, handling synchronization for IKE_SA access', 'agent': {'ephemeral_id': '1c44fd87-6fa4-45b2-8ae3-59893122af63', 'name': 'laf', 'type': 'filebeat', 'version': '7.16.3', 'hostname': 'laf', 'id': '3b49d3fa-3fd4-45fc-9492-a5fa7f2439fe'}, 'description': '<1> checkin of IKE_SA successful', 'ecs': {'version': '1.0.0'}, 'input': {'type': 'log'}, 'timestamp': 'Feb 20 09:42:56', 'host': {'name': 'laf'}, 'thread': '15', '__patch__order__': -1}), ('Ike Security Association successful', 'IKE', {'subsystem_abbr': 'MGR', '@timestamp': '2024-02-20T09:42:56.000Z', 'event': {'original': 'Feb 20 09:42:56 15[MGR] <1> checkin of IKE_SA successful'}, 'message': '%{subsytem_abbr}: IKE_SA manager, handling synchronization for IKE_SA access', 'agent': {'ephemeral_id': '1c440687-6fa4-45b2-8ae3-59893122af63', 'name': 'laf', 'type': 'filebeat', 'version': '9.0.0', 'hostname': 'laf', 'id': '3b49d3fa-39b4-45fc-9492-adeea7f2439fe'}, 'description': '<1> checkin of IKE_SA successful', 'ecs': {'version': '1.0.0'}, 'input': {'type': 'log'}, 'timestamp': 'Feb 20 09:42:56', 'host': {'name': 'laf'}, 'thread': '15', '__patch__order__': -1})]
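One approach: json.dumps with indent renders the nested tuples and dicts with exactly this kind of indexing, and default=str covers values the json module cannot serialize (Decimal, datetime, and so on). The trimmed sample stands in for the full records above:

```python
import json

data = [
    ("INITIAL", "END_PKI", {"subsystem_abbr": "MGR",
                            "host": {"name": "laf"}, "thread": "15"}),
]

# Tuples are emitted as JSON arrays; default=str is a safety net for
# non-JSON values such as Decimal or datetime
print(json.dumps(data, indent=2, default=str, ensure_ascii=False))
```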
|
2eea6af7c6ca8114eb97faa0335017e3
|
{
"intermediate": 0.3309198319911957,
"beginner": 0.4549320340156555,
"expert": 0.21414814889431
}
|
40,175
|
Give me the code for an AOP annotation that I can put on a method so that it will log.info the inputs and the output, specifying the name of each.
|
7198d3d6f6524904485782a067edef5b
|
{
"intermediate": 0.5109885931015015,
"beginner": 0.21057452261447906,
"expert": 0.2784368395805359
}
|
40,176
|
bool level_3() {
    float total = 0;
    unsigned int *seed;
    vector<float> n_arr;

    // Random seed
    seed = (unsigned int *)getauxval(AT_RANDOM);
    srand(*seed);

    // Add user input
    add_user_input(&n_arr, "Number to add to array to equal zero: ");

    // Add many random integers
    for (int i = 0; i < 1024 * (8 + rand() % 1024); i++)
        n_arr.push_back((rand() % 1024) + 1);

    // Add user input
    add_user_input(&n_arr, "Number to add to array to equal zero: ");

    // Get sum
    for (int i = 0; i < n_arr.size(); i++)
        total += n_arr[i];
    cout << fixed << setprecision(20) << total; // Print with higher precision

    // Check if equal to zero
    return total == 0;
}
Is there a way to cause the total to be 0?
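In principle, yes: floating-point addition absorbs small addends next to a huge one, so a very large first input followed by its exact negative as the second input can cancel to exactly zero, with every random addition in between rounded away. A Python sketch of the effect (doubles here; the same absorption happens with the 32-bit float in the C++ code, where the ulp of 1e20 is far larger than 1024):

```python
total = 1e20          # first user input: huge
for _ in range(1000):
    total += 1024.0   # below half an ulp of 1e20, so each add rounds away
total += -1e20        # second user input: exact negative of the first
print(total)          # -> 0.0
```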
|
ba4d1b39e53dcad519f0e7faa71757fa
|
{
"intermediate": 0.2929113805294037,
"beginner": 0.39635804295539856,
"expert": 0.31073057651519775
}
|
40,177
|
This is my algorithm:
Step 1: Start
Step 2: Set score to 0
Step 3: Ask the user "What is the capital of France?"
Step 4: If correct, add a point
Step 5: Ask the user "What color do you get when you mix blue and yellow?"
Step 6: If correct, add a point
Step 7: Ask the user "Who is the president of the USA?"
Step 8: If correct, add a point
Step 9: Ask the user "What is the name of the fairy in Peter Pan?"
Step 10: If correct, add a point
Step 11: Ask the user "What do bees produce?"
Step 12: If correct, add a point
Step 13: Show the user the score
Step 14: Stop
I have to create a flowchart in draw.io; help me.
|
89c1b348e437ce8084902ca6cdbc2758
|
{
"intermediate": 0.2249426394701004,
"beginner": 0.1522650122642517,
"expert": 0.6227923631668091
}
|
40,178
|
My current code only finds the first video in this JSON. I want it to search all of jsonData.data.mediaDetails.
This is my current code : "const getMp4FileSizes = async (jsonData) => {
const mp4FileSizes = [];
if (jsonData.data.mediaDetails) {
for (const media of jsonData.data.mediaDetails) {
if (media.video_info && media.video_info.variants) {
for (const variant of media.video_info.variants) {
if (variant.content_type === "video/mp4") {
const fileSize = await getMp4FileSize(variant.url);
if (fileSize) {
mp4FileSizes.push({
url: variant.url,
fileSize: fileSize
});
}
}
}
}
}
} else {
for (const media of jsonData.data.quoted_tweet.mediaDetails) {
if (media.video_info && media.video_info.variants) {
for (const variant of media.video_info.variants) {
if (variant.content_type === "video/mp4") {
const fileSize = await getMp4FileSize(variant.url);
if (fileSize) {
mp4FileSizes.push({
url: variant.url,
fileSize: fileSize
});
}
}
}
}
}
}
return mp4FileSizes;
};"
This json has more than 1 mediaDetails : "{
"data": {
"__typename": "Tweet",
"lang": "ta",
"favorite_count": 15,
"possibly_sensitive": false,
"created_at": "2024-02-23T12:05:48.000Z",
"display_text_range": [
0,
215
],
"entities": {
"hashtags": [],
"urls": [],
"user_mentions": [],
"symbols": [],
"media": [
{
"display_url": "pic.twitter.com/HgdvxydCtH",
"expanded_url": "https://twitter.com/rubane3/status/1760999397951692825/video/1",
"indices": [
216,
239
],
"url": "https://t.co/HgdvxydCtH"
}
]
},
"id_str": "1760999397951692825",
"text": "เฎคเฎฟเฎฐเฎพเฎตเฎฟเฎเฎฎเฏ เฎคเฎฟเฎฐเฎพเฎตเฎฟเฎ เฎเฎเฏเฎเฎฟเฎเฎณเฏ เฎฎเฎฃเฏเฎฃเฎฟเฎฉเฏ เฎเฎพเฎช เฎเฏเฎเฏเฎเฎณเฏ.. เฎฎเฎฒเฏเฎเฏเฎเฏเฎณเฏเฎณเฏ เฎเฎฒเฏ เฎเฏเฎตเฎพเฎฐเฎฟ เฎเฎเฏเฎเฎฟ เฎตเฎเฏเฎเฎฟเฎฐเฏเฎเฏเฎเฎพเฎฉเฏเฎเฏเฎ.. เฎเฎชเฏเฎชเฎเฎฟ เฎชเฎเฏเฎ เฎคเฎฟเฎฐเฏเฎเฏเฎเฏ เฎเฏเฎเฏเฎเฎคเฏเฎเฏเฎเฏ เฎตเฎพเฎเฏเฎเฏ เฎเฏเฎฒเฏเฎคเฏเฎคเฎฟ เฎฎเฎฃเฏเฎฃเฏ เฎ
เฎดเฎฟเฎเฏเฎ เฎตเฏเฎฒเฏเฎฒ เฎตเฏเฎเฏเฎเฏเฎฎเฏ เฎชเฎเฎฟเฎคเฏเฎค เฎฎเฏเฎเฏเฎเฎพเฎณเฏ เฎตเฎพเฎดเฏเฎฎเฏ เฎเฏเฎฎเฎฐเฎฟ เฎฎเฎพเฎตเฎเฏเฎเฎฎเฏ.... https://t.co/HgdvxydCtH",
"user": {
"id_str": "551734383",
"name": "เฎฐเฏเฎชเฎฉเฏ เฎฐเฎพเฎเฏ",
"profile_image_url_https": "https://pbs.twimg.com/profile_images/1631857768553496580/4vPhrhSW_normal.jpg",
"screen_name": "rubane3",
"verified": false,
"is_blue_verified": false,
"profile_image_shape": "Circle"
},
"edit_control": {
"edit_tweet_ids": [
"1760999397951692825"
],
"editable_until_msecs": "1708693548000",
"is_edit_eligible": true,
"edits_remaining": "5"
},
"mediaDetails": [
{11 items},
{11 items}
],
"photos": [],
"video": {
"aspectRatio": [
16,
9
],
"contentType": "media_entity",
"durationMs": 68147,
"mediaAvailability": {
"status": "available"
},
"poster": "https://pbs.twimg.com/ext_tw_video_thumb/1760998855049428993/pu/img/_QNJ2aTpfUC_2tiX.jpg",
"variants": [
{
"type": "video/mp4",
"src": "https://video.twimg.com/ext_tw_video/1760998855049428993/pu/vid/avc1/480x270/YB-ZAON6aIbcoeyz.mp4?tag=12"
},
{
"type": "video/mp4",
"src": "https://video.twimg.com/ext_tw_video/1760998855049428993/pu/vid/avc1/640x360/hQ_eH-MGRWAv4xEB.mp4?tag=12"
},
{
"type": "video/mp4",
"src": "https://video.twimg.com/ext_tw_video/1760998855049428993/pu/vid/avc1/1280x720/F-HtKmjr_mqerw-E.mp4?tag=12"
},
{
"type": "application/x-mpegURL",
"src": "https://video.twimg.com/ext_tw_video/1760998855049428993/pu/pl/bBHXRY9ItFCLmL7W.m3u8?tag=12&container=cmaf"
}
],
"videoId": {
"type": "tweet",
"id": "1760999397951692825"
},
"viewCount": 0
},
"conversation_count": 0,
"news_action_type": "conversation",
"isEdited": false,
"isStaleEdit": false
}
}"
|
0fba8f06f8886f010d55caf056ac4b3a
|
{
"intermediate": 0.29113316535949707,
"beginner": 0.5130364894866943,
"expert": 0.19583027064800262
}
|
40,179
|
Please show me the complete code for collectMediaVariantsAndImageUrls function as I canโt to figure out which code to add from previous code
|
4dd2ae68286f68d4dfd9a01838c5ed87
|
{
"intermediate": 0.6177976727485657,
"beginner": 0.17567987740039825,
"expert": 0.2065224051475525
}
|
40,180
|
Could you make me a JavaScript bookmarklet to automatically add site:YouTube.com to a Google search
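A minimal sketch of the idea (assuming Google's standard `q` query parameter): build the search URL with `site:youtube.com` appended, then wrap it in a `javascript:` URL for the bookmarklet. The `prompt()`-based one-liner in the comment is an untested variant of the same logic.

```javascript
// Builds a Google search URL with site:youtube.com appended to the query.
function buildYouTubeSearchUrl(query) {
  return "https://www.google.com/search?q=" + encodeURIComponent(query + " site:youtube.com");
}

// A bookmarklet form might wrap this in javascript: and prompt() (untested sketch):
// javascript:(function(){var q=prompt('Search YouTube via Google:');if(q)location.href='https://www.google.com/search?q='+encodeURIComponent(q+' site:youtube.com');})();
```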
|
ebd360948880c3fced83f4766c313430
|
{
"intermediate": 0.43732398748397827,
"beginner": 0.18792301416397095,
"expert": 0.37475305795669556
}
|
40,181
|
this is my current code :
"
async function fetchAndSendMedia(number) {
try {
const response = await fetch(`https://react-tweet.vercel.app/api/tweet/${number}`);
const jsonData = await response.json();
if (jsonData.data === null) {
console.log("Protected content");
return new Response("Protected content", { status: 200 });
}
const allVideoUrls = new Set();
const allImageUrls = new Set();
// Collect all unique video and image URLs
await collectMediaUrls(jsonData.data, allVideoUrls, allImageUrls, 'mediaDetails');
// If quoted tweets exist, do the same
if (jsonData.data.quoted_tweet) {
await collectMediaUrls(jsonData.data.quoted_tweet, allVideoUrls, allImageUrls, 'mediaDetails');
}
// Process videos and images
const sendVideoPromises = Array.from(allVideoUrls).map(url => sendVideo(url));
const sendImagePromises = Array.from(allImageUrls).map(url => sendImageToTelegram(url));
// Wait for both video and image processing to complete
await Promise.all([...sendVideoPromises, ...sendImagePromises]);
return new Response("Media processing completed", { status: 200 });
} catch (error) {
console.error(error);
return new Response("Error processing media", { status: 500 });
}
}
async function modifyImageUrl(inputUrl) {
try {
const modifiedUrl = inputUrl.replace('.jpg', '?format=jpg&name=large');
return modifiedUrl;
} catch (error) {
// Handle any errors
console.error('An error occurred:', error);
return null;
}
}
async function collectMediaUrls(data, videoUrls, imageUrls, mediaDetailsKey) {
if (data[mediaDetailsKey] && data[mediaDetailsKey].length > 0) {
for (const mediaDetail of data[mediaDetailsKey]) {
if (mediaDetail.video_info && mediaDetail.video_info.variants) {
// Find the highest resolution video that is less than 50MB in size
const sortedVariants = mediaDetail.video_info.variants
.filter(variant => variant.content_type === "video/mp4")
.sort((a, b) => {
// Sort in descending order of bitrate; assuming higher bitrate is higher resolution
return b.bitrate - a.bitrate;
});
for (const variant of sortedVariants) {
const fileSize = await getMp4FileSize(variant.url);
const fileSizeInMB = fileSize / (1024 * 1024);
// Select the highest resolution variant that is less than 50MB
if (fileSizeInMB < 50) {
videoUrls.add(variant.url);
break; // We've found the optimal variant, so no need to check others
}
}
} else if (mediaDetail.type === "photo" && mediaDetail.media_url_https) {
imageUrls.add(mediaDetail.media_url_https);
}
}
}
}
async function sendVideo(url) {
try {
const fileSize = await getMp4FileSize(url);
const fileSizeInMB = fileSize / (1024 * 1024);
if (fileSizeInMB < 50) {
await sendVideoToTelegram(url);
} else {
console.log(`Video file size is too large: ${fileSizeInMB} MB for URL: ${url}`);
}
} catch (error) {
console.error(`Failed to send video ${url}: ${error}`);
}
}
async function sendImageToTelegram(url) {
try {
const largeJpgUrl = await modifyImageUrl(url);
await sendPhoto(largeJpgUrl);
console.log(`Image sent: ${url}`);
} catch (error) {
console.error(`Failed to send image ${url}: ${error}`);
}
}
async function sendVideoToTelegram(url) {
const botToken = "mybottoken";
const chatId = '5225794753';
let tgurl = `https://api.telegram.org/bot${botToken}/sendVideo?chat_id=${chatId}&video=${url}`;
// Send the request to the Telegram API to send the video
const response = await fetch(tgurl, { method: 'POST' });
const data = await response.json();
// Check the response and log accordingly
if (data.ok) {
console.log(`Video sent: ${url}`);
} else {
throw new Error(`Failed to send video: ${url}`);
}
}
async function sendPhoto(url) {
const botToken = "mybottoken";
const chatId = '5225794753';
let tgurl = `https://api.telegram.org/bot${botToken}/sendPhoto?chat_id=${chatId}&photo=${url}`;
// Send the request to the Telegram API to send the photo
const response = await fetch(tgurl, { method: 'POST' });
const data = await response.json();
// Check the response and log accordingly
if (!data.ok) {
throw new Error(`Failed to send photo: ${url}`);
}
}
async function getMp4FileSize(url) {
// Implement the MP4 file size check logic. Perform a HEAD request to the video URL and check the Content-Length header.
// If the Content-Length header is not available, or the request fails, return 0.
const response = await fetch(url, { method: 'HEAD' });
const length = response.headers.get('content-length');
if (length) {
return parseInt(length, 10);
} else {
return 0;
}
}"
this code is working perfectly, but sometimes the highest-resolution video available is not sent. In that case, I want to try sending the second-highest resolution video available. How can I do that? Please make the changes in my current code for that functionality, and show me the full code of the functions rather than suggesting I take it from my code. Thanks
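The fallback can be isolated into a small pure helper that walks the bitrate-sorted variants and returns the first one under the size cap; below is a sketch with a synchronous `sizeLookup` standing in for the async `getMp4FileSize` (the `content_type`/`bitrate`/`url` fields match the tweet JSON, everything else here is an assumption for illustration).

```javascript
// Picks the best sendable variant: tries the highest bitrate first and falls
// back to the next resolution whenever a variant's size is at or over the cap.
// sizeLookup maps a variant URL to its byte size (stands in for getMp4FileSize).
function pickSendableVariant(variants, sizeLookup, capBytes = 50 * 1024 * 1024) {
  const mp4s = variants
    .filter(v => v.content_type === "video/mp4")
    .sort((a, b) => b.bitrate - a.bitrate);
  for (const v of mp4s) {
    const size = sizeLookup(v.url);
    if (size > 0 && size < capBytes) return v; // first one under the cap wins
  }
  return null; // nothing sendable
}
```

In the real code the loop body would `await getMp4FileSize(variant.url)` exactly as `collectMediaUrls` already does, with the `break` replaced by returning the chosen variant.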
|
b81a0c795e978775af8eb8366dc062fe
|
{
"intermediate": 0.34728580713272095,
"beginner": 0.4313805103302002,
"expert": 0.22133372724056244
}
|
40,182
|
How do I replace Pinterest with WhatsApp in the code and ensure that when users click on the WhatsApp icon or link, their default WhatsApp client opens in a new tab and they can start a chat with me: import React from 'react';
import Link from "next/link";
import { Tooltip } from "react-tooltip";
import {
RiYoutubeLine,
RiInstagramLine,
RiFacebookLine,
RiDribbbleLine,
RiPinterestLine,
} from "react-icons/ri";
import { MdLocalPhone } from "react-icons/md";
export const socialData = [
{
name: "YouTube",
link: "https://youtube.com",
Icon: RiYoutubeLine,
},
{
name: "Instagram",
link: "https://instagram.com",
Icon: RiInstagramLine,
},
{
name: "Facebook",
link: "https://facebook.com",
Icon: RiFacebookLine,
},
{
name: "Dribbble",
link: "https://dribbble.com",
Icon: RiDribbbleLine,
},
{
name: "Pinterest",
link: "https://pinterest.com",
Icon: RiPinterestLine,
},
{
name: "If unavailable on phone, email me or use whatsapp",
link: "tel:<PRESIDIO_ANONYMIZED_PHONE_NUMBER>",
Icon: MdLocalPhone,
},
];
const Socials = () => {
return (
<div className="flex items-center gap-x-5 text-lg">
{socialData.map((social, i) => {
const Icon = social.Icon; // Extract the Icon component here
return (
<Link
key={i}
title={social.name}
href={social.link}
target="_blank"
rel="noreferrer noopener"
className={`${social.name === "If unavailable on phone, email me or use whatsapp"
? "bg-accent rounded-full p-[5px] hover:text-white"
: "hover:text-accent"
} transition-all duration-300`}
data-tooltip-id={`tooltip-${i}`}
data-tooltip-content={social.name}
>
<Icon aria-hidden="true" /> {/* Use the Icon component here */}
<Tooltip id={`tooltip-${i}`} place="bottom" className="custom-tooltip" />
<span className="sr-only">{social.name}</span>
</Link>
);
})}
</div>
);
};
export default Socials;
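One common way (an assumption, not the only one) is WhatsApp's click-to-chat links: `https://wa.me/<number>` with the number in international format, digits only, no `+` or dashes. A sketch of the replacement entry follows; the phone number below is a hypothetical placeholder, and the `react-icons` import is left as a comment since only the plain data object is exercised here. With the existing `target="_blank"` on the `<Link>`, clicking it opens WhatsApp in a new tab.

```javascript
// Builds a wa.me chat link; the number must be digits only, in international
// format without '+', spaces, or dashes (placeholder below is hypothetical).
function whatsappLink(phoneDigits) {
  return "https://wa.me/" + phoneDigits.replace(/\D/g, "");
}

// Drop-in replacement for the Pinterest entry in socialData.
const whatsappEntry = {
  name: "WhatsApp",
  link: whatsappLink("15551234567"), // hypothetical number
  // Icon: RiWhatsappLine, // from react-icons/ri, replacing RiPinterestLine
};
```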
|
0ff909ad70b3a6cc64128943211727f1
|
{
"intermediate": 0.320573091506958,
"beginner": 0.46100541949272156,
"expert": 0.21842150390148163
}
|
40,183
|
If i am adding a js game that uses jquery to a react app, do i have to import jquery to react and if yes how?
|
3155a9585d624f262edab740e30b834b
|
{
"intermediate": 0.5846304893493652,
"beginner": 0.24477653205394745,
"expert": 0.1705930382013321
}
|
40,184
|
If P and Q are logically equivalent, then the implication, inverse, converse, contrapositive are all logically equivalent.
1. True
2. False
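The answer is True: if P and Q are logically equivalent, then P→Q, its converse, inverse, and contrapositive all collapse to tautologies, hence are mutually equivalent. A finite brute-force check over a sample of compound propositions in two atoms, as a sketch:

```javascript
// Brute-force check: for every pair (P, Q) from a sample of compound
// propositions over two atoms, restricted to pairs where P and Q are logically
// equivalent, verify that implication, converse, inverse, and contrapositive
// all have identical truth tables.
const rows = [[false, false], [false, true], [true, false], [true, true]];
const fns = [
  (a, b) => a, (a, b) => b, (a, b) => a && b, (a, b) => a || b,
  (a, b) => !a, (a, b) => !b, (a, b) => true, (a, b) => false,
];
const imp = (x, y) => !x || y; // material implication

function allFourEquivalentWheneverPEquivQ() {
  for (const P of fns) for (const Q of fns) {
    const equivalent = rows.every(([a, b]) => P(a, b) === Q(a, b));
    if (!equivalent) continue;
    const ok = rows.every(([a, b]) => {
      const p = P(a, b), q = Q(a, b);
      const implication = imp(p, q), converse = imp(q, p);
      const inverse = imp(!p, !q), contrapositive = imp(!q, !p);
      return implication === converse && converse === inverse && inverse === contrapositive;
    });
    if (!ok) return false;
  }
  return true;
}
```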
|
d144e7505e93588901b27383c3fd9d1f
|
{
"intermediate": 0.3722848892211914,
"beginner": 0.3053441047668457,
"expert": 0.3223710358142853
}
|
40,185
|
Task 3
In this task, you will be learning how to scrape websites (extracting data from websites and
cleaning the scraped data) and data visualization using matplotlib.
a) Write a Python script to scrape the Top 250 TV-shows of all time from the IMDB
website. After scraping the data, save it to a MySQL database named โtop-250-showsโ for
further analysis. You must also use the data from the obtained database to plot the following
graphs:
i. A bar graph representing Genre (on x-axis) to no. of TV-shows belonging to that genre
(on y-axis). (Note: A TV Show might have multiple genre)
ii. A line graph representing the frequency count of TV-shows having n episodes, n varies
from 1 to maximum no. of episodes present. Represent no. of episodes (on x-axis) and
frequency count (on y-axis).
b) Write a Python Program that allows user to filter the TV Shows based on:-
1. Genre
2. IMDB rating
3. No. of episode
For each filter take user-input to choose the criteria. The user must be prompted a range
(inclusive of both the limits) for IMDB rating and No. of episode and Genre must be a
string input consisting of genres separated by spaces. Print the TV-show in the descending
order based on the user-filtering.
>>./q3 b.py
Comedy Thriller Drama Documentary
8.5 9.5
10 20
Reference: Helpful reference for learning about web-scraping using Python
I WANT TO RUN THIS IN JUPYTER NOTEBOOK WITH LOCAL INSTALLATION OF MYSQL
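A sketch of just the part-b filtering step (shown in JavaScript for brevity, even though the assignment itself asks for Python; the field names `genres`/`rating`/`episodes` are assumptions about the database rows): genre match on overlap with the space-separated input, inclusive ranges for rating and episode count, then a descending sort by rating.

```javascript
// Filters shows by genre overlap, inclusive rating range, and inclusive
// episode range, then sorts descending by rating (the part-b behaviour).
function filterShows(shows, genres, [rLo, rHi], [eLo, eHi]) {
  const wanted = new Set(genres.map(g => g.toLowerCase()));
  return shows
    .filter(s => s.genres.some(g => wanted.has(g.toLowerCase())))
    .filter(s => s.rating >= rLo && s.rating <= rHi)
    .filter(s => s.episodes >= eLo && s.episodes <= eHi)
    .sort((a, b) => b.rating - a.rating);
}
```

In the Python version the same predicate would sit in the `WHERE` clause of the MySQL query, or in a list comprehension over the fetched rows.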
|
076ad30f60cd425ba17219502443b365
|
{
"intermediate": 0.5386105179786682,
"beginner": 0.22310136258602142,
"expert": 0.23828813433647156
}
|
40,186
|
I have a prestylized text layer in After Effects. Say its name is "01".
I need a script which imports a txt file and creates multiple comps,
one for each block separated with "*****". (Comp names should be "Text01", "Text02", ...)
Such as:
This is text 01
*****
This is text 02
*****
So script should create 2 comps and replace text content with these strings.
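The After Effects side of this (duplicating comps via `app.project`) needs the host application, but the parsing and naming step can be sketched and checked on its own; the regex-based trim mirrors what ExtendScript's ES3 engine supports.

```javascript
// Splits the imported text on the "*****" delimiter and pairs each trimmed
// block with a zero-padded comp name ("Text01", "Text02", ...).
function buildCompSpecs(text) {
  return text
    .split("*****")
    .map(block => block.replace(/^\s+|\s+$/g, "")) // ES3-safe trim, as in ExtendScript
    .filter(block => block.length > 0)
    .map((block, i) => ({
      name: "Text" + (i + 1 < 10 ? "0" : "") + (i + 1),
      text: block,
    }));
}
```

Inside After Effects, each spec would then drive one `comp.duplicate()` call followed by setting the text layer's `sourceText` to `spec.text`.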
|
8d7b70c7e2fc4c6eb28017ea0f467349
|
{
"intermediate": 0.41955098509788513,
"beginner": 0.2192881852388382,
"expert": 0.36116087436676025
}
|
40,187
|
How can i send this every 3 seconds
|
517f7e6a0a3016ad17dd1314080873fd
|
{
"intermediate": 0.4262734055519104,
"beginner": 0.16603171825408936,
"expert": 0.40769490599632263
}
|
40,188
|
I have a txt file with the content
This is a test 01
-----
This is a test 02
-----
This is a test 03
-----
This is a test 04
I want a user interface where the user can select a txt file and run it. The script imports the contents as separate blocks using the delimiter "-----", duplicates the current After Effects comp (there is a text layer), and replaces the content of the text layer in each duplicated comp according to the txt file. For example TextComp01, TextComp02, TextComp03, etc. TextComp02's text layer, for example, should read "This is a test 02".
|
411a37e91dedebaad77ffb093df08fa3
|
{
"intermediate": 0.4180503785610199,
"beginner": 0.2705708146095276,
"expert": 0.3113787770271301
}
|
40,189
|
Can you find and fix the errors : function readTextFile(filePath) {
var file = new File(filePath);
var textArray = [];
if (file.exists) {
file.encoding = "UTF-8"; // Set the file encoding to UTF-8
file.open('r');
var text = file.read(); // Read the contents of the file
file.close();
var rawTextArray = text.split("\n");
$.writeln("Raw Text Array:"); $.writeln(rawTextArray);
for (var i = 0; i < rawTextArray.length; i++) {
var trimmedBlock = rawTextArray[i].trim();
if (trimmedBlock !== "") {
textArray.push(trimmedBlock);
} else { $.writeln("Empty block at index " + i); }
}
// Debugging: Output textArray to inspect its content
$.writeln("Text Array:");
$.writeln(textArray);
} else { alert("File not found at path: " + filePath); }
return textArray;
}
|
8e50c655517efbc6ca26a2425fc4af85
|
{
"intermediate": 0.7375653386116028,
"beginner": 0.15202149748802185,
"expert": 0.11041311919689178
}
|
40,190
|
Using after effects scripting I'm getting an error on this line: var trimmedBlock = rawTextArray[i].trim();
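ExtendScript's JavaScript engine is ES3-based, and `String.prototype.trim` (added in ES5) is typically missing there, so `rawTextArray[i].trim()` throws. A regex-based replacement, plus an optional polyfill so existing `.trim()` call sites keep working (a sketch):

```javascript
// Regex-based trim that works on ES3 engines such as ExtendScript's,
// where String.prototype.trim may be missing.
function trimString(s) {
  return String(s).replace(/^\s+|\s+$/g, "");
}

// Optional polyfill so existing `x.trim()` call sites keep working:
if (typeof String.prototype.trim !== "function") {
  String.prototype.trim = function () { return trimString(this); };
}
```

With the polyfill loaded at the top of the script, the original `readTextFile` line runs unchanged.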
|
64e59e316f42fac620b8d84858c8c7ea
|
{
"intermediate": 0.22330045700073242,
"beginner": 0.49658918380737305,
"expert": 0.28011029958724976
}
|
40,191
|
give me a professional image and video upload "<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Object Detection with YOLO</title>
<!-- Bootstrap CSS -->
<link href="https://cdn.jsdelivr.net/npm/bootstrap@5.3.0-alpha1/dist/css/bootstrap.min.css" rel="stylesheet">
<style>
body {
font-family: Arial, sans-serif;
background-color: #f8f9fa;
padding: 20px;
}
.container {
max-width: 600px;
margin: 0 auto;
}
.upload-btn {
margin-top: 20px;
}
#result {
margin-top: 20px;
}
</style>
</head>
<body>
<div class="container">
<h1 class="text-center mb-4">Object Detection with YOLO</h1>
<form id="uploadForm" enctype="multipart/form-data">
<div class="mb-3">
<label for="fileInput" class="form-label">Select Image or Video:</label>
<input type="file" class="form-control" id="fileInput" name="file" accept="image/*, video/*" required>
<div class="invalid-feedback">Please select an image or video file.</div>
</div>
<button type="submit" class="btn btn-primary upload-btn">Upload</button>
</form>
<div id="result"></div>
</div>
<!-- Bootstrap JS -->
<script src="https://cdn.jsdelivr.net/npm/bootstrap@5.3.0-alpha1/dist/js/bootstrap.bundle.min.js"></script>
<script>
document.getElementById('uploadForm').addEventListener('submit', function(event) {
event.preventDefault();
const fileInput = document.getElementById('fileInput');
const file = fileInput.files[0];
if (!file) return;
if (!file.type.startsWith('image/') && !file.type.startsWith('video/')) {
fileInput.classList.add('is-invalid');
return;
}
fileInput.classList.remove('is-invalid');
const formData = new FormData();
formData.append('file', file);
fetch('/upload/', {
method: 'POST',
body: formData,
})
.then(response => response.json())
.then(result => {
displayResult(result);
})
.catch(error => console.error('Error:', error));
});
function displayResult(result) {
const resultDiv = document.getElementById('result');
resultDiv.innerHTML = '';
for (const obj of result) {
const objDiv = document.createElement('div');
objDiv.textContent = `Object: ${obj.label}, Confidence: ${obj.confidence}`;
resultDiv.appendChild(objDiv);
}
}
</script>
</body>
</html>
|
fa7c5764394a1573b2270ccdfd9b4006
|
{
"intermediate": 0.35111096501350403,
"beginner": 0.5482949018478394,
"expert": 0.10059409588575363
}
|
40,192
|
Task 3
In this task, you will be learning how to scrape websites (extracting data from websites and
cleaning the scraped data) and data visualization using matplotlib.
a) Write a Python script to scrape the Top 250 TV-shows of all time from the IMDB
website. After scraping the data, save it to a MySQL database named โtop-250-showsโ for
further analysis. You must also use the data from the obtained database to plot the following
graphs:
i. A bar graph representing Genre (on x-axis) to no. of TV-shows belonging to that genre
(on y-axis). (Note: A TV Show might have multiple genre)
ii. A line graph representing the frequency count of TV-shows having n episodes, n varies
from 1 to maximum no. of episodes present. Represent no. of episodes (on x-axis) and
frequency count (on y-axis).
b) Write a Python Program that allows user to filter the TV Shows based on:-
1. Genre
2. IMDB rating
3. No. of episode
For each filter take user-input to choose the criteria. The user must be prompted a range
(inclusive of both the limits) for IMDB rating and No. of episode and Genre must be a
string input consisting of genres separated by spaces. Print the TV-show in the descending
order based on the user-filtering.
>>./q3 b.py
Comedy Thriller Drama Documentary
8.5 9.5
10 20
Reference: Helpful reference for learning about web-scraping using Python
|
0c1b4a1af42f87b56fe123df9c60bed0
|
{
"intermediate": 0.5702911019325256,
"beginner": 0.19412709772586823,
"expert": 0.23558175563812256
}
|
40,193
|
In this task, you will be learning how to scrape websites (extracting data from websites and
cleaning the scraped data) and data visualization using matplotlib.
a) Write a Python script to scrape the Top 250 TV-shows of all time from the IMDB
website. After scraping the data, save it to a MySQL database named โtop-250-showsโ for
further analysis. You must also use the data from the obtained database to plot the following
graphs:
i. A bar graph representing Genre (on x-axis) to no. of TV-shows belonging to that genre
(on y-axis). (Note: A TV Show might have multiple genre)
ii. A line graph representing the frequency count of TV-shows having n episodes, n varies
from 1 to maximum no. of episodes present. Represent no. of episodes (on x-axis) and
frequency count (on y-axis).
b) Write a Python Program that allows user to filter the TV Shows based on:-
1. Genre
2. IMDB rating
3. No. of episode
For each filter take user-input to choose the criteria. The user must be prompted a range
(inclusive of both the limits) for IMDB rating and No. of episode and Genre must be a
string input consisting of genres separated by spaces. Print the TV-show in the descending
order based on the user-filtering.
>>./q3 b.py
Comedy Thriller Drama Documentary
8.5 9.5
10 20
Reference: Helpful reference for learning about web-scraping using Python
JUST GIVE ME PYTHON CODE WITHOUT EXPLANATION, USING LOCALHOST DATABASE
|
373a4f9a9473126b81ef536617c73d38
|
{
"intermediate": 0.5031692981719971,
"beginner": 0.3128461241722107,
"expert": 0.18398454785346985
}
|
40,194
|
from statsforecast import StatsForecast
from statsforecast.models import AutoARIMA, AutoETS, AutoCES, DynamicOptimizedTheta
from statsforecast.utils import ConformalIntervals
import numpy as np
import polars as pl
# Polars option to display all rows
pl.Config.set_tbl_rows(None)
# Initialize the models
models = [
AutoARIMA(season_length=52),
AutoETS(season_length=52),
AutoCES(season_length=52),
DynamicOptimizedTheta(season_length=52)
]
# Initialize the StatsForecast model
sf = StatsForecast(models=models, freq='1w', n_jobs=-1)
# Perform cross-validation with a step size of 1 to mimic an expanding window
crossvalidation_df = sf.cross_validation(df=y_cl4, h=16, step_size=1, n_windows=18, sort_df=True)
# Calculate the ensemble mean
ensemble = crossvalidation_df[['AutoARIMA', 'AutoETS', 'CES', 'DynamicOptimizedTheta']].mean(axis=1)
# Create a Series for the ensemble mean
ensemble_series = pl.Series('Ensemble', ensemble)
# Add the ensemble mean as a new column to the DataFrame
crossvalidation_df = crossvalidation_df.with_columns(ensemble_series)
def wmape(y_true, y_pred):
return np.abs(y_true - y_pred).sum() / np.abs(y_true).sum()
# Calculate the WMAPE for the ensemble model
wmape_value = wmape(crossvalidation_df['y'], crossvalidation_df['Ensemble'])
print('Average WMAPE for Ensemble: ', round(wmape_value, 4))
# Calculate the errors for the ensemble model
errors = crossvalidation_df['y'] - crossvalidation_df['Ensemble']
# For an individual forecast
individual_accuracy = 1 - (abs(crossvalidation_df['y'] - crossvalidation_df['Ensemble']) / crossvalidation_df['y'])
individual_bias = (crossvalidation_df['Ensemble'] / crossvalidation_df['y']) - 1
# Add these calculations as new columns to DataFrame
crossvalidation_df = crossvalidation_df.with_columns([
individual_accuracy.alias("individual_accuracy"),
individual_bias.alias("individual_bias")
])
# Print the individual accuracy and bias for each week
for row in crossvalidation_df.to_dicts():
id = row['unique_id']
date = row['ds']
accuracy = row['individual_accuracy']
bias = row['individual_bias']
print(f"{id}, {date}, Individual Accuracy: {accuracy:.4f}, Individual Bias: {bias:.4f}")
# For groups of forecasts
group_accuracy = 1 - (errors.abs().sum() / crossvalidation_df['y'].sum())
group_bias = (crossvalidation_df['Ensemble'].sum() / crossvalidation_df['y'].sum()) - 1
# Print the average group accuracy and group bias over all folds for the ensemble model
print('Average Group Accuracy: ', round(group_accuracy, 4))
print('Average Group Bias: ', round(group_bias, 4))
# Fit the models on the entire dataset
sf.fit(y_cl4)
# Instantiate the ConformalIntervals class
prediction_intervals = ConformalIntervals()
# Generate 24 months forecasts
forecasts_df = sf.forecast(h=52*2, prediction_intervals=prediction_intervals, level=[95], id_col='unique_id', sort_df=True)
# Calculate the ensemble forecast
ensemble_forecast = forecasts_df.select(
[
pl.when(pl.col('AutoARIMA') < 0).then(0).otherwise(pl.col('AutoARIMA')).alias('AutoARIMA'),
pl.when(pl.col('AutoETS') < 0).then(0).otherwise(pl.col('AutoETS')).alias('AutoETS'),
pl.when(pl.col('CES') < 0).then(0).otherwise(pl.col('CES')).alias('CES'),
pl.when(pl.col('DynamicOptimizedTheta') < 0).then(0).otherwise(pl.col('DynamicOptimizedTheta')).alias('DynamicOptimizedTheta'),
]
).mean(axis=1)
# Calculate the lower and upper prediction intervals for the ensemble forecast
ensemble_lo_95 = forecasts_df.select(
[
pl.when(pl.col('AutoARIMA-lo-95') < 0).then(0).otherwise(pl.col('AutoARIMA-lo-95')).alias('AutoARIMA-lo-95'),
pl.when(pl.col('AutoETS-lo-95') < 0).then(0).otherwise(pl.col('AutoETS-lo-95')).alias('AutoETS-lo-95'),
pl.when(pl.col('CES-lo-95') < 0).then(0).otherwise(pl.col('CES-lo-95')).alias('CES-lo-95'),
pl.when(pl.col('DynamicOptimizedTheta-lo-95') < 0).then(0).otherwise(pl.col('DynamicOptimizedTheta-lo-95')).alias('DynamicOptimizedTheta-lo-95'),
]
).mean(axis=1)
ensemble_hi_95 = forecasts_df[['AutoARIMA-hi-95', 'AutoETS-hi-95', 'CES-hi-95', 'DynamicOptimizedTheta-hi-95']].mean(axis=1)
# Create Series for the ensemble forecast and its prediction intervals
ensemble_forecast_series = pl.Series('EnsembleForecast', ensemble_forecast)
ensemble_lo_95_series = pl.Series('Ensemble-lo-95', ensemble_lo_95)
ensemble_hi_95_series = pl.Series('Ensemble-hi-95', ensemble_hi_95)
# Add the ensemble forecast and its prediction intervals as new columns to the DataFrame
forecasts_df = forecasts_df.with_columns([ensemble_forecast_series, ensemble_lo_95_series, ensemble_hi_95_series])
# Round the ensemble forecast and prediction intervals and convert to integer
forecasts_df = forecasts_df.with_columns([
pl.col("EnsembleForecast").round().cast(pl.Int32),
pl.col("Ensemble-lo-95").round().cast(pl.Int32),
pl.col("Ensemble-hi-95").round().cast(pl.Int32)
])
# Reorder the columns
forecasts_df = forecasts_df.select([
"unique_id",
"ds",
"EnsembleForecast",
"Ensemble-lo-95",
"Ensemble-hi-95",
"AutoARIMA",
"AutoARIMA-lo-95",
"AutoARIMA-hi-95",
"AutoETS",
"AutoETS-lo-95",
"AutoETS-hi-95",
"CES",
"CES-lo-95",
"CES-hi-95",
"DynamicOptimizedTheta",
"DynamicOptimizedTheta-lo-95",
"DynamicOptimizedTheta-hi-95"
])
# Create an empty list
forecasts_list = []
# Append each row to the list
for row in forecasts_df.to_dicts():
forecasts_list.append(row)
# Print the list
for forecast in forecasts_list:
print(forecast)
output {'unique_id': '6922332', 'ds': datetime.datetime(2025, 6, 16, 0, 0), 'EnsembleForecast': 19444, 'Ensemble-lo-95': 16762, 'Ensemble-hi-95': 22125, 'AutoARIMA': 21242.763671875, 'AutoARIMA-lo-95': 19512.095703125, 'AutoARIMA-hi-95': 22973.431640625, 'AutoETS': 21279.15625, 'AutoETS-lo-95': 19626.21484375, 'AutoETS-hi-95': 22932.09765625, 'CES': 14011.3857421875, 'CES-lo-95': 8402.5244140625, 'CES-hi-95': 19620.248046875, 'DynamicOptimizedTheta': 21241.310546875, 'DynamicOptimizedTheta-lo-95': 19507.21875, 'DynamicOptimizedTheta-hi-95': 22975.40234375}
{'unique_id': '6922332', 'ds': datetime.datetime(2025, 6, 23, 0, 0), 'EnsembleForecast': 23127, 'Ensemble-lo-95': 20446, 'Ensemble-hi-95': 25809, 'AutoARIMA': 21242.763671875, 'AutoARIMA-lo-95': 19512.095703125, 'AutoARIMA-hi-95': 22973.431640625, 'AutoETS': 21279.15625, 'AutoETS-lo-95': 19626.21484375, 'AutoETS-hi-95': 22932.09765625, 'CES': 28746.3984375, 'CES-lo-95': 23137.537109375, 'CES-hi-95': 34355.26171875, 'DynamicOptimizedTheta': 21241.310546875, 'DynamicOptimizedTheta-lo-95': 19507.21875, 'DynamicOptimizedTheta-hi-95': 22975.40234375}
{'unique_id': '6922332', 'ds': datetime.datetime(2025, 6, 30, 0, 0), 'EnsembleForecast': 22671, 'Ensemble-lo-95': 19990, 'Ensemble-hi-95': 25353, 'AutoARIMA': 21242.763671875, 'AutoARIMA-lo-95': 19512.095703125, 'AutoARIMA-hi-95': 22973.431640625, 'AutoETS': 21279.15625, 'AutoETS-lo-95': 19626.21484375, 'AutoETS-hi-95': 22932.09765625, 'CES': 26922.3515625, 'CES-lo-95': 21313.490234375, 'CES-hi-95': 32531.21484375, 'DynamicOptimizedTheta': 21241.310546875, 'DynamicOptimizedTheta-lo-95': 19507.21875, 'DynamicOptimizedTheta-hi-95': 22975.40234375}
{'unique_id': '6922332', 'ds': datetime.datetime(2025, 7, 7, 0, 0), 'EnsembleForecast': 19052, 'Ensemble-lo-95': 16370, 'Ensemble-hi-95': 21733, 'AutoARIMA': 21242.763671875, 'AutoARIMA-lo-95': 19512.095703125, 'AutoARIMA-hi-95': 22973.431640625, 'AutoETS': 21279.15625, 'AutoETS-lo-95': 19626.21484375, 'AutoETS-hi-95': 22932.09765625, 'CES': 12443.15234375, 'CES-lo-95': 6834.291015625, 'CES-hi-95': 18052.013671875, 'DynamicOptimizedTheta': 21241.310546875, 'DynamicOptimizedTheta-lo-95': 19507.21875, 'DynamicOptimizedTheta-hi-95': 22975.40234375}
{'unique_id': '6922332', 'ds': datetime.datetime(2025, 7, 14, 0, 0), 'EnsembleForecast': 21149, 'Ensemble-lo-95': 18467, 'Ensemble-hi-95': 23831, 'AutoARIMA': 21242.763671875, 'AutoARIMA-lo-95': 19512.095703125, 'AutoARIMA-hi-95': 22973.431640625, 'AutoETS': 21279.15625, 'AutoETS-lo-95': 19626.21484375, 'AutoETS-hi-95': 22932.09765625, 'CES': 20832.93359375, 'CES-lo-95': 15224.0712890625, 'CES-hi-95': 26441.794921875, 'DynamicOptimizedTheta': 21241.310546875, 'DynamicOptimizedTheta-lo-95': 19507.21875, 'DynamicOptimizedTheta-hi-95': 22975.40234375}
{'unique_id': '6922332', 'ds': datetime.datetime(2025, 7, 21, 0, 0), 'EnsembleForecast': 20599, 'Ensemble-lo-95': 17918, 'Ensemble-hi-95': 23281, 'AutoARIMA': 21242.763671875, 'AutoARIMA-lo-95': 19512.095703125, 'AutoARIMA-hi-95': 22973.431640625, 'AutoETS': 21279.15625, 'AutoETS-lo-95': 19626.21484375, 'AutoETS-hi-95': 22932.09765625, 'CES': 18633.75390625, 'CES-lo-95': 13024.892578125, 'CES-hi-95': 24242.615234375, 'DynamicOptimizedTheta': 21241.310546875, 'DynamicOptimizedTheta-lo-95': 19507.21875, 'DynamicOptimizedTheta-hi-95': 22975.40234375}
{'unique_id': '6922332', 'ds': datetime.datetime(2025, 7, 28, 0, 0), 'EnsembleForecast': 20781, 'Ensemble-lo-95': 18099, 'Ensemble-hi-95': 23462, 'AutoARIMA': 21242.763671875, 'AutoARIMA-lo-95': 19512.095703125, 'AutoARIMA-hi-95': 22973.431640625, 'AutoETS': 21279.15625, 'AutoETS-lo-95': 19626.21484375, 'AutoETS-hi-95': 22932.09765625, 'CES': 19359.54296875, 'CES-lo-95': 13750.6826171875, 'CES-hi-95': 24968.40625, 'DynamicOptimizedTheta': 21241.310546875, 'DynamicOptimizedTheta-lo-95': 19507.21875, 'DynamicOptimizedTheta-hi-95': 22975.40234375}
{'unique_id': '6922332', 'ds': datetime.datetime(2025, 8, 4, 0, 0), 'EnsembleForecast': 18442, 'Ensemble-lo-95': 15761, 'Ensemble-hi-95': 21124, 'AutoARIMA': 21242.763671875, 'AutoARIMA-lo-95': 19512.095703125, 'AutoARIMA-hi-95': 22973.431640625, 'AutoETS': 21279.15625, 'AutoETS-lo-95': 19626.21484375, 'AutoETS-hi-95': 22932.09765625, 'CES': 10005.392578125, 'CES-lo-95': 4396.53125, 'CES-hi-95': 15614.2548828125, 'DynamicOptimizedTheta': 21241.310546875, 'DynamicOptimizedTheta-lo-95': 19507.21875, 'DynamicOptimizedTheta-hi-95': 22975.40234375}
{'unique_id': '6922332', 'ds': datetime.datetime(2025, 8, 11, 0, 0), 'EnsembleForecast': 20543, 'Ensemble-lo-95': 17862, 'Ensemble-hi-95': 23225, 'AutoARIMA': 21242.763671875, 'AutoARIMA-lo-95': 19512.095703125, 'AutoARIMA-hi-95': 22973.431640625, 'AutoETS': 21279.15625, 'AutoETS-lo-95': 19626.21484375, 'AutoETS-hi-95': 22932.09765625, 'CES': 18410.65625, 'CES-lo-95': 12801.794921875, 'CES-hi-95': 24019.517578125, 'DynamicOptimizedTheta': 21241.310546875, 'DynamicOptimizedTheta-lo-95': 19507.21875, 'DynamicOptimizedTheta-hi-95': 22975.40234375}
{'unique_id': '6922332', 'ds': datetime.datetime(2025, 8, 18, 0, 0), 'EnsembleForecast': 19830, 'Ensemble-lo-95': 17148, 'Ensemble-hi-95': 22511, 'AutoARIMA': 21242.763671875, 'AutoARIMA-lo-95': 19512.095703125, 'AutoARIMA-hi-95': 22973.431640625, 'AutoETS': 21279.15625, 'AutoETS-lo-95': 19626.21484375, 'AutoETS-hi-95': 22932.09765625, 'CES': 15556.056640625, 'CES-lo-95': 9947.1953125, 'CES-hi-95': 21164.91796875, 'DynamicOptimizedTheta': 21241.310546875, 'DynamicOptimizedTheta-lo-95': 19507.21875, 'DynamicOptimizedTheta-hi-95': 22975.40234375}
{'unique_id': '6922332', 'ds': datetime.datetime(2025, 8, 25, 0, 0), 'EnsembleForecast': 19568, 'Ensemble-lo-95': 16886, 'Ensemble-hi-95': 22249, 'AutoARIMA': 21242.763671875, 'AutoARIMA-lo-95': 19512.095703125, 'AutoARIMA-hi-95': 22973.431640625, 'AutoETS': 21279.15625, 'AutoETS-lo-95': 19626.21484375, 'AutoETS-hi-95': 22932.09765625, 'CES': 14507.791015625, 'CES-lo-95': 8898.9296875, 'CES-hi-95': 20116.65234375, 'DynamicOptimizedTheta': 21241.310546875, 'DynamicOptimizedTheta-lo-95': 19507.21875, 'DynamicOptimizedTheta-hi-95': 22975.40234375}
{'unique_id': '6922332', 'ds': datetime.datetime(2025, 9, 1, 0, 0), 'EnsembleForecast': 18793, 'Ensemble-lo-95': 16111, 'Ensemble-hi-95': 21474, 'AutoARIMA': 21242.763671875, 'AutoARIMA-lo-95': 19512.095703125, 'AutoARIMA-hi-95': 22973.431640625, 'AutoETS': 21279.15625, 'AutoETS-lo-95': 19626.21484375, 'AutoETS-hi-95': 22932.09765625, 'CES': 11407.0390625, 'CES-lo-95': 5798.177734375, 'CES-hi-95': 17015.900390625, 'DynamicOptimizedTheta': 21241.310546875, 'DynamicOptimizedTheta-lo-95': 19507.21875, 'DynamicOptimizedTheta-hi-95': 22975.40234375}
{'unique_id': '6922332', 'ds': datetime.datetime(2025, 9, 8, 0, 0), 'EnsembleForecast': 17686, 'Ensemble-lo-95': 15004, 'Ensemble-hi-95': 20367, 'AutoARIMA': 21242.763671875, 'AutoARIMA-lo-95': 19512.095703125, 'AutoARIMA-hi-95': 22973.431640625, 'AutoETS': 21279.15625, 'AutoETS-lo-95': 19626.21484375, 'AutoETS-hi-95': 22932.09765625, 'CES': 6979.3349609375, 'CES-lo-95': 1370.473388671875, 'CES-hi-95': 12588.1962890625, 'DynamicOptimizedTheta': 21241.310546875, 'DynamicOptimizedTheta-lo-95': 19507.21875, 'DynamicOptimizedTheta-hi-95': 22975.40234375}
{'unique_id': '6922332', 'ds': datetime.datetime(2025, 9, 15, 0, 0), 'EnsembleForecast': 21029, 'Ensemble-lo-95': 18347, 'Ensemble-hi-95': 23711, 'AutoARIMA': 21242.763671875, 'AutoARIMA-lo-95': 19512.095703125, 'AutoARIMA-hi-95': 22973.431640625, 'AutoETS': 21279.15625, 'AutoETS-lo-95': 19626.21484375, 'AutoETS-hi-95': 22932.09765625, 'CES': 20352.7578125, 'CES-lo-95': 14743.896484375, 'CES-hi-95': 25961.62109375, 'DynamicOptimizedTheta': 21241.310546875, 'DynamicOptimizedTheta-lo-95': 19507.21875, 'DynamicOptimizedTheta-hi-95': 22975.40234375}
{'unique_id': '6922332', 'ds': datetime.datetime(2025, 9, 22, 0, 0), 'EnsembleForecast': 20262, 'Ensemble-lo-95': 17580, 'Ensemble-hi-95': 22944, 'AutoARIMA': 21242.763671875, 'AutoARIMA-lo-95': 19512.095703125, 'AutoARIMA-hi-95': 22973.431640625, 'AutoETS': 21279.15625, 'AutoETS-lo-95': 19626.21484375, 'AutoETS-hi-95': 22932.09765625, 'CES': 17284.330078125, 'CES-lo-95': 11675.4677734375, 'CES-hi-95': 22893.19140625, 'DynamicOptimizedTheta': 21241.310546875, 'DynamicOptimizedTheta-lo-95': 19507.21875, 'DynamicOptimizedTheta-hi-95': 22975.40234375}
{'unique_id': '6922332', 'ds': datetime.datetime(2025, 9, 29, 0, 0), 'EnsembleForecast': 20816, 'Ensemble-lo-95': 18134, 'Ensemble-hi-95': 23498, 'AutoARIMA': 21242.763671875, 'AutoARIMA-lo-95': 19512.095703125, 'AutoARIMA-hi-95': 22973.431640625, 'AutoETS': 21279.15625, 'AutoETS-lo-95': 19626.21484375, 'AutoETS-hi-95': 22932.09765625, 'CES': 19500.451171875, 'CES-lo-95': 13891.58984375, 'CES-hi-95': 25109.3125, 'DynamicOptimizedTheta': 21241.310546875, 'DynamicOptimizedTheta-lo-95': 19507.21875, 'DynamicOptimizedTheta-hi-95': 22975.40234375}

unique_count = y_cl4['unique_id'].unique().shape[0]
print('Unique count of unique_id:', unique_count)
# Unique count of unique_id: 113

y_cl4.head()
unique_id  ds                   y
str        datetime[μs]         f64
"6665053"  2020-11-30 00:00:00  5484.0
"5338737"  2020-11-30 00:00:00  52310.0
"5339184"  2020-11-30 00:00:00  97457.0
"1131041"  2020-11-30 00:00:00  15202.588
"1131030"  2020-11-30 00:00:00  17838.0

Based on all the information I have given you, is there a way to see the y in the y_cl4 dataframe alongside the ensemble forecasted values? Normally, with just one series, I could create a plot and check whether the forecasted values make sense given the historical data, but there are 113 series here.
|
357dee9ae5ef8ccd5e386c5f84066c58
|
{
"intermediate": 0.31975889205932617,
"beginner": 0.3983381390571594,
"expert": 0.281902939081192
}
|
40,195
|
Write a Python script to scrape the Top 250 TV-shows of all time from the IMDB
website. After scraping the data, save it to a MySQL database named "top-250-shows" on localhost
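A minimal offline-testable sketch of the scraping half, with the parsing separated from I/O. The regex and the MySQL helper are illustrative assumptions — IMDb's real markup changes often and typically requires request headers or parsing the page's embedded JSON-LD instead:

```python
import re

def parse_chart(html):
    # Crude extraction of (rank, title, rating) from chart-like markup.
    rows = re.findall(
        r'<a[^>]*>(\d+)\.\s*(.*?)</a>.*?<span[^>]*>([\d.]+)</span>', html, re.S)
    return [(int(rank), title.strip(), float(rating)) for rank, title, rating in rows]

def save_to_mysql(rows):
    # Hypothetical persistence step (requires mysql-connector-python and an
    # existing 'shows' table); not executed here.
    import mysql.connector
    conn = mysql.connector.connect(host="localhost", user="root",
                                   password="", database="top-250-shows")
    cur = conn.cursor()
    cur.executemany(
        "INSERT INTO shows (rank_pos, title, rating) VALUES (%s, %s, %s)", rows)
    conn.commit()
    conn.close()

sample = '<li><a href="/title/tt0903747/">1. Breaking Bad</a> <span>9.5</span></li>'
print(parse_chart(sample))  # → [(1, 'Breaking Bad', 9.5)]
```

Keeping the parser pure makes it easy to test against a saved copy of the page before touching the network or the database.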
|
90453f39236d77ac34c89f32d852bb6d
|
{
"intermediate": 0.42733633518218994,
"beginner": 0.192007377743721,
"expert": 0.38065624237060547
}
|
40,196
|
Provide me a fully functional, error-free Python program for an AI Audio Stems Extractor.
|
4f2a82f104a2d6276dad0fbb72943745
|
{
"intermediate": 0.21118156611919403,
"beginner": 0.1059979647397995,
"expert": 0.6828204393386841
}
|
40,197
|
Write a Python script to scrape the Top 250 TV-shows of all time from the IMDB
website. After scraping the data, save it to a MySQL database named "top-250-shows" on localhost.
GIVE ME CODE WITHOUT EXPLANATION.
IT'S NOT AN ILLEGAL LINK; it is https://www.imdb.com/chart/toptv/
|
485416fdcf72ab61a490a2079c1ba407
|
{
"intermediate": 0.31903719902038574,
"beginner": 0.47857585549354553,
"expert": 0.2023869901895523
}
|
40,198
|
create me a javascript async function that extracts all the mp4 variants from json and saves it in a object : "{
"data": {
"__typename": "Tweet",
"lang": "ta",
"favorite_count": 1374,
"possibly_sensitive": false,
"created_at": "2024-02-08T02:14:37.000Z",
"display_text_range": [
0,
181
],
"entities": {
"hashtags": [
{
"indices": [
147,
155
],
"text": "SunNews"
},
{
"indices": [
158,
171
],
"text": "ElectionNews"
},
{
"indices": [
174,
181
],
"text": "Seeman"
}
],
"urls": [],
"user_mentions": [],
"symbols": [],
"media": [
{
"display_url": "pic.twitter.com/thIWNZAjD3",
"expanded_url": "https://twitter.com/sunnewstamil/status/1755414803613647221/video/1",
"indices": [
182,
205
],
"url": "https://t.co/thIWNZAjD3"
}
]
},
"id_str": "1755414803613647221",
"text": "Watch | "เฎฐเฎคเฏเฎคเฎคเฏเฎคเฏ เฎเฏเฎเฎพเฎเฏเฎเฏเฎฎเฏ เฎ
เฎณเฎตเฏเฎเฏเฎเฏ เฎชเฏเฎเฏเฎฎเฏ เฎเฏเฎฎเฎพเฎฉเฎฟเฎเฎฎเฏ เฎเฏเฎฏเฎฒเฎฟเฎฒเฏ เฎเฎฉเฏเฎฑเฏเฎฎเฏ เฎเฎฒเฏเฎฒเฏ" เฎจเฎพเฎฎเฏ เฎคเฎฎเฎฟเฎดเฎฐเฏ เฎเฎเฏเฎเฎฟเฎฏเฎฟเฎฒเฏ เฎเฎฐเฏเฎจเฏเฎคเฏ เฎตเฎฟเฎฒเฎเฎฟเฎฏ เฎจเฎฟเฎฐเฏเฎตเฎพเฎเฎฟ เฎเฎเฏเฎฎเฏ เฎเฏเฎฑเฏเฎฑเฎเฏเฎเฎพเฎเฏเฎเฏ! #SunNews | #ElectionNews | #Seeman https://t.co/thIWNZAjD3",
"user": {
"id_str": "1079310252",
"name": "Sun News",
"profile_image_url_https": "https://pbs.twimg.com/profile_images/1173895565215588352/Pcxpw82k_normal.jpg",
"screen_name": "sunnewstamil",
"verified": false,
"verified_type": "Business",
"is_blue_verified": true,
"profile_image_shape": "Square"
},
"edit_control": {
"edit_tweet_ids": [
"1755414803613647221"
],
"editable_until_msecs": "1707362077000",
"is_edit_eligible": true,
"edits_remaining": "5"
},
"mediaDetails": [
{
"additional_media_info": {},
"display_url": "pic.twitter.com/thIWNZAjD3",
"expanded_url": "https://twitter.com/sunnewstamil/status/1755414803613647221/video/1",
"ext_media_availability": {
"status": "Available"
},
"indices": [
182,
205
],
"media_url_https": "https://pbs.twimg.com/ext_tw_video_thumb/1755413108527636480/pu/img/2vVptgobx7qItB0m.jpg",
"original_info": {
"height": 1080,
"width": 1920,
"focus_rects": []
},
"sizes": {
"large": {
"h": 1080,
"resize": "fit",
"w": 1920
},
"medium": {
"h": 675,
"resize": "fit",
"w": 1200
},
"small": {
"h": 383,
"resize": "fit",
"w": 680
},
"thumb": {
"h": 150,
"resize": "crop",
"w": 150
}
},
"type": "video",
"url": "https://t.co/thIWNZAjD3",
"video_info": {
"aspect_ratio": [
16,
9
],
"duration_millis": 200160,
"variants": [
{
"bitrate": 2176000,
"content_type": "video/mp4",
"url": "https://video.twimg.com/ext_tw_video/1755413108527636480/pu/vid/avc1/1280x720/lTiC6YwiHWtV-f5N.mp4?tag=12"
},
{
"bitrate": 832000,
"content_type": "video/mp4",
"url": "https://video.twimg.com/ext_tw_video/1755413108527636480/pu/vid/avc1/640x360/YqqnF0zVOuBTwrcn.mp4?tag=12"
},
{
"content_type": "application/x-mpegURL",
"url": "https://video.twimg.com/ext_tw_video/1755413108527636480/pu/pl/gwS3TWGlodp2mZPw.m3u8?tag=12&container=cmaf"
},
{
"bitrate": 256000,
"content_type": "video/mp4",
"url": "https://video.twimg.com/ext_tw_video/1755413108527636480/pu/vid/avc1/480x270/VRos5z54HyKlMyI6.mp4?tag=12"
}
]
}
}
],
"photos": [],
"video": {
"aspectRatio": [
16,
9
],
"contentType": "media_entity",
"durationMs": 200160,
"mediaAvailability": {
"status": "available"
},
"poster": "https://pbs.twimg.com/ext_tw_video_thumb/1755413108527636480/pu/img/2vVptgobx7qItB0m.jpg",
"variants": [
{
"type": "video/mp4",
"src": "https://video.twimg.com/ext_tw_video/1755413108527636480/pu/vid/avc1/1280x720/lTiC6YwiHWtV-f5N.mp4?tag=12"
},
{
"type": "video/mp4",
"src": "https://video.twimg.com/ext_tw_video/1755413108527636480/pu/vid/avc1/640x360/YqqnF0zVOuBTwrcn.mp4?tag=12"
},
{
"type": "application/x-mpegURL",
"src": "https://video.twimg.com/ext_tw_video/1755413108527636480/pu/pl/gwS3TWGlodp2mZPw.m3u8?tag=12&container=cmaf"
},
{
"type": "video/mp4",
"src": "https://video.twimg.com/ext_tw_video/1755413108527636480/pu/vid/avc1/480x270/VRos5z54HyKlMyI6.mp4?tag=12"
}
],
"videoId": {
"type": "tweet",
"id": "1755414803613647221"
},
"viewCount": 0
},
"conversation_count": 130,
"news_action_type": "conversation",
"isEdited": false,
"isStaleEdit": false
}
}"
|
2d8ae1f820e536667388abe0079cb522
|
{
"intermediate": 0.41235071420669556,
"beginner": 0.38050350546836853,
"expert": 0.20714572072029114
}
|
40,199
|
how to repair: npm run build
> build
> next build
▲ Next.js 14.1.0
✓ Linting and checking validity of types
Creating an optimized production build ...
✓ Compiled successfully
✓ Collecting page data
Generating static pages (0/5) [= ]
TypeError: Class constructor a cannot be invoked without 'new'
at Wc (C:\Users\VICE\Documents\three-js-games-main\09-snake\node_modules\react-dom\cjs\react-dom-server.browser.production.min.js:68:44)
at Zc (C:\Users\VICE\Documents\three-js-games-main\09-snake\node_modules\react-dom\cjs\react-dom-server.browser.production.min.js:70:253)
at Z (C:\Users\VICE\Documents\three-js-games-main\09-snake\node_modules\react-dom\cjs\react-dom-server.browser.production.min.js:76:89)
at Zc (C:\Users\VICE\Documents\three-js-games-main\09-snake\node_modules\react-dom\cjs\react-dom-server.browser.production.min.js:70:481)
at Z (C:\Users\VICE\Documents\three-js-games-main\09-snake\node_modules\react-dom\cjs\react-dom-server.browser.production.min.js:76:89)
at Zc (C:\Users\VICE\Documents\three-js-games-main\09-snake\node_modules\react-dom\cjs\react-dom-server.browser.production.min.js:70:481)
at Z (C:\Users\VICE\Documents\three-js-games-main\09-snake\node_modules\react-dom\cjs\react-dom-server.browser.production.min.js:76:89)
at $c (C:\Users\VICE\Documents\three-js-games-main\09-snake\node_modules\react-dom\cjs\react-dom-server.browser.production.min.js:78:98)
at bd (C:\Users\VICE\Documents\three-js-games-main\09-snake\node_modules\react-dom\cjs\react-dom-server.browser.production.min.js:77:404)
at Z (C:\Users\VICE\Documents\three-js-games-main\09-snake\node_modules\react-dom\cjs\react-dom-server.browser.production.min.js:76:217)
Error occurred prerendering page "/lib/SceneInit". Read more: https://nextjs.org/docs/messages/prerender-error
TypeError: Class constructor a cannot be invoked without 'new'
at Wc (C:\Users\VICE\Documents\three-js-games-main\09-snake\node_modules\react-dom\cjs\react-dom-server.browser.production.min.js:68:44)
at Zc (C:\Users\VICE\Documents\three-js-games-main\09-snake\node_modules\react-dom\cjs\react-dom-server.browser.production.min.js:70:253)
at Z (C:\Users\VICE\Documents\three-js-games-main\09-snake\node_modules\react-dom\cjs\react-dom-server.browser.production.min.js:76:89)
at Zc (C:\Users\VICE\Documents\three-js-games-main\09-snake\node_modules\react-dom\cjs\react-dom-server.browser.production.min.js:70:481)
at Z (C:\Users\VICE\Documents\three-js-games-main\09-snake\node_modules\react-dom\cjs\react-dom-server.browser.production.min.js:76:89)
at Zc (C:\Users\VICE\Documents\three-js-games-main\09-snake\node_modules\react-dom\cjs\react-dom-server.browser.production.min.js:70:481)
at Z (C:\Users\VICE\Documents\three-js-games-main\09-snake\node_modules\react-dom\cjs\react-dom-server.browser.production.min.js:76:89)
at $c (C:\Users\VICE\Documents\three-js-games-main\09-snake\node_modules\react-dom\cjs\react-dom-server.browser.production.min.js:78:98)
at bd (C:\Users\VICE\Documents\three-js-games-main\09-snake\node_modules\react-dom\cjs\react-dom-server.browser.production.min.js:77:404)
at Z (C:\Users\VICE\Documents\three-js-games-main\09-snake\node_modules\react-dom\cjs\react-dom-server.browser.production.min.js:7
Generating static pages (3/5) [=== ]
TypeError: Class constructor a cannot be invoked without 'new'
at Wc (C:\Users\VICE\Documents\three-js-games-main\09-snake\node_modules\react-dom\cjs\react-dom-server.browser.production.min.js:68:44)
at Zc (C:\Users\VICE\Documents\three-js-games-main\09-snake\node_modules\react-dom\cjs\react-dom-server.browser.production.min.js:70:253)
at Z (C:\Users\VICE\Documents\three-js-games-main\09-snake\node_modules\react-dom\cjs\react-dom-server.browser.production.min.js:76:89)
at Zc (C:\Users\VICE\Documents\three-js-games-main\09-snake\node_modules\react-dom\cjs\react-dom-server.browser.production.min.js:70:481)
at Z (C:\Users\VICE\Documents\three-js-games-main\09-snake\node_modules\react-dom\cjs\react-dom-server.browser.production.min.js:76:89)
at Zc (C:\Users\VICE\Documents\three-js-games-main\09-snake\node_modules\react-dom\cjs\react-dom-server.browser.production.min.js:70:481)
at Z (C:\Users\VICE\Documents\three-js-games-main\09-snake\node_modules\react-dom\cjs\react-dom-server.browser.production.min.js:76:89)
at $c (C:\Users\VICE\Documents\three-js-games-main\09-snake\node_modules\react-dom\cjs\react-dom-server.browser.production.min.js:78:98)
at bd (C:\Users\VICE\Documents\three-js-games-main\09-snake\node_modules\react-dom\cjs\react-dom-server.browser.production.min.js:77:404)
at Z (C:\Users\VICE\Documents\three-js-games-main\09-snake\node_modules\react-dom\cjs\react-dom-server.browser.production.min.js:76:217)
Error occurred prerendering page "/lib/SnakeGame". Read more: https://nextjs.org/docs/messages/prerender-error
TypeError: Class constructor a cannot be invoked without 'new'
at Wc (C:\Users\VICE\Documents\three-js-games-main\09-snake\node_modules\react-dom\cjs\react-dom-server.browser.production.min.js:68:44)
at Zc (C:\Users\VICE\Documents\three-js-games-main\09-snake\node_modules\react-dom\cjs\react-dom-server.browser.production.min.js:70:253)
at Z (C:\Users\VICE\Documents\three-js-games-main\09-snake\node_modules\react-dom\cjs\react-dom-server.browser.production.min.js:76:89)
at Zc (C:\Users\VICE\Documents\three-js-games-main\09-snake\node_modules\react-dom\cjs\react-dom-server.browser.production.min.js:70:481)
at Z (C:\Users\VICE\Documents\three-js-games-main\09-snake\node_modules\react-dom\cjs\react-dom-server.browser.production.min.js:76:89)
at Zc (C:\Users\VICE\Documents\three-js-games-main\09-snake\node_modules\react-dom\cjs\react-dom-server.browser.production.min.js:70:481)
at Z (C:\Users\VICE\Documents\three-js-games-main\09-snake\node_modules\react-dom\cjs\react-dom-server.browser.production.min.js:76:89)
at $c (C:\Users\VICE\Documents\three-js-games-main\09-snake\node_modules\react-dom\cjs\react-dom-server.browser.production.min.js:78:98)
at bd (C:\Users\VICE\Documents\three-js-games-main\09-snake\node_modules\react-dom\cjs\react-dom-server.browser.production.min.js:77:404)
✓ Generating static pages (5/5)
> Export encountered errors on following paths:
/lib/SceneInit
/lib/SnakeGame
|
348ae88f2eaed7c3553b6a94c48cd9f2
|
{
"intermediate": 0.3489142954349518,
"beginner": 0.45996585488319397,
"expert": 0.19111984968185425
}
|
40,200
|
import subprocess
import sys
def install(package):
subprocess.check_call([sys.executable, "-m", "pip", "install", package])
# Ensure that spleeter is installed
try:
import spleeter
except ImportError:
install('spleeter')
from spleeter.separator import Separator
def extract_stems(input_file, output_path, stems=2):
"""
Extracts the audio stems from a given audio file using spleeter.
Parameters:
input_file (str): The path to the audio file you want to process.
output_path (str): The path where the stems will be saved.
stems (int): The number of stems to extract. Can be 2, 4, or 5.
"""
# Use the 'spleeter' command via subprocess to avoid any potential library clashes in the environment
try:
separator = Separator(f'spleeter:{stems}stems')
separator.separate_to_file(input_file, output_path)
print(f'Stems extracted successfully to {output_path}')
except Exception as e:
print(f'An error occurred: {e}')
# Example usage
extract_stems('childhood bedroom.m4a', 'output', stems=5)

After splitting the stems, the audio quality of each stem is very poor.
I need CD quality, 16-bit 44.1 kHz, from Deezer.
I will give my token if needed:
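One knob worth checking before blaming the models: depending on the spleeter version, `separate_to_file` may default to a lossy output codec, and forcing `codec='wav'` keeps 16-bit PCM at the pipeline's 44.1 kHz. Below is a small helper that just builds the equivalent CLI call — the flag names are an assumption from spleeter's CLI, so verify with `spleeter separate --help`:

```python
def spleeter_command(input_file, output_dir, stems=5, codec="wav"):
    # WAV output avoids a lossy re-encode on top of the separation;
    # the separation itself still runs at the model's native sample rate.
    return ["spleeter", "separate",
            "-p", f"spleeter:{stems}stems",
            "-o", output_dir,
            "-c", codec,
            input_file]

print(spleeter_command("childhood bedroom.m4a", "output"))
```

Also note that the standard pretrained 2/4/5-stem models band-limit output around 11 kHz (the `-16kHz` model variants extend this), so some dullness in the stems is inherent to the model rather than to the output codec or the source file.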
|
e88e534238c277aa8ca5fd706a6d80b4
|
{
"intermediate": 0.5118393301963806,
"beginner": 0.23881229758262634,
"expert": 0.24934834241867065
}
|
40,201
|
# Convert 'WeekDate' to datetime format
dataset_lowvolume = dataset_lowvolume.with_columns(
pl.col("WeekDate").str.strptime(pl.Datetime, "%Y-%m-%d")
)
# Group by 'MaterialID', 'SalesOrg', 'DistrChan', 'CL4' and 'WeekDate', then sum 'OrderQuantity'
y_cl4 = dataset_lowvolume.groupby(['MaterialID', 'SalesOrg', 'DistrChan', 'CL4', 'WeekDate']).agg(
pl.sum("OrderQuantity").alias("OrderQuantity")
)
# Sort by 'WeekDate'
y_cl4 = y_cl4.sort("WeekDate") # Get the number of rows in y
num_rows = y_cl4.shape[0]
print(f"Number of rows in y: {num_rows}") now fix this code, y_cl4 = y_cl4.rename({'WeekDate': 'ds', 'CL4': 'unique_id', 'OrderQuantity': 'y'}) the unique_id should be combination of these 4 'MaterialID', 'SalesOrg', 'DistrChan', 'CL4', how should I do it? may be concat? using polars only
|
5057c478475a672f34a701acdcd61317
|
{
"intermediate": 0.3618541359901428,
"beginner": 0.27716419100761414,
"expert": 0.3609815835952759
}
|
40,202
|
Imagine we have a text layer in After Effects, let's say "This is a ***test***". I want to highlight the word "test" by adding a rectangle shape behind the text layer, the same size as "test". How to do it with scripting? (Notice that the highlighted text will be in between "***" characters.)
|
8a4f0fa7a6b6726c45ce4bd20fd4cf84
|
{
"intermediate": 0.38686898350715637,
"beginner": 0.3634116053581238,
"expert": 0.24971947073936462
}
|
40,203
|
class VideoStream:
def __init__(self, video_bytes: bytes):
self.video_bytes = video_bytes
self.buffer = np.frombuffer(self.video_bytes, dtype=np.uint8)
self.frame_idx = 0
def read(self):
# Read the next frame from the video bytes buffer
frame = cv2.imdecode(self.buffer[self.frame_idx], cv2.IMREAD_COLOR)
self.frame_idx += 1
return True, frame
def detect_objects_in_video(video_bytes: bytes) -> List[List[dict]]:
# Load the YOLOv8 model
model = YOLO("best_seg_75.pt")
# Initialize an empty list to store detections for each frame
all_detections = []
# Create a video stream object from the video bytes
stream = VideoStream(video_bytes)
# Create a video capture object using the custom stream
cap = cv2.VideoCapture()
cap.open(stream)
# Loop through the video frames
while cap.isOpened():
# Read a frame from the video
ret, frame = cap.read()
if not ret:
break
# Perform object detection with YOLOv8 on the frame
detections = model(frame)
# Append the detections for the current frame to the list
all_detections.append(detections)
# Release the video capture object
cap.release()
return all_detections
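A likely culprit in the snippet above: `cv2.VideoCapture` cannot open a custom Python object, and `cv2.imdecode` on a raw byte slice does not step through video frames. The usual workaround is to spill the bytes to a temporary file and open that path. The file-handling part is sketched below (stdlib only), with the cv2/YOLO usage left as comments since it mirrors the original loop:

```python
import os
import tempfile

def bytes_to_temp_video(video_bytes: bytes, suffix: str = ".mp4") -> str:
    # Writes the in-memory video to a temp file and returns its path,
    # which cv2.VideoCapture can open directly.
    fd, path = tempfile.mkstemp(suffix=suffix)
    with os.fdopen(fd, "wb") as f:
        f.write(video_bytes)
    return path

# usage (assumes cv2 / ultralytics are installed):
# path = bytes_to_temp_video(video_bytes)
# cap = cv2.VideoCapture(path)
# ...same read/detect loop as above..., then os.remove(path) when done.
```

This keeps the detection loop unchanged while removing the custom `VideoStream` class entirely.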
|
06aab101c325bea44f49f86992b9b843
|
{
"intermediate": 0.35258668661117554,
"beginner": 0.3243612051010132,
"expert": 0.3230521082878113
}
|
40,204
|
In this exercise, youโll make some changes to Magpie.java to improve how it understands and responds.
Exercises
Teach Magpie to ask about your pets. Any time a statement contains "dog," "cat," or "pet," have Magpie ask a followup question, like "What do your pets eat?"
If you say something about your teacher, Magpie should say something nice about them.
Make the code check that you give it any input. You can do this by using the trim method to remove spaces from the beginning and end of the input, then checking the length. If there are no characters in the input, you should have Magpie say something like "Speak up," or "I can't hear you."
Add two more non-committal responses to the randomResponse method.
MagpieRunner.java
import java.util.Scanner;
public class MagpieRunner
{
public static void main(String[] args)
{
Magpie magpie = new Magpie();
System.out.println(magpie.greeting());
Scanner scanner = new Scanner(System.in);
String statement = scanner.nextLine();
while (!statement.equals("Bye"))
{
System.out.println(magpie.getResponse(statement));
statement = scanner.nextLine();
}
scanner.close();
}
}
Magpie.java
public class Magpie
{
/**
* Gets a default greeting.
* @return String
*/
public String greeting()
{
return "Hey, what's up?";
}
/**
* Returns a response to a user statement
*
* @param statement
* @return String
*/
public String getResponse(String statement)
{
String response = "";
if (statement.indexOf("no") >= 0)
{
response = "Don't be so negative!";
} else if (
statement.indexOf("mother") >= 0 ||
statement.indexOf("brother") >= 0 ||
statement.indexOf("sister") >= 0 ||
statement.indexOf("father") >= 0
)
{
response = "Tell me more about your family!";
} else if (
statement.indexOf("weather") >= 0 ||
statement.indexOf("sun") >= 0 ||
statement.indexOf("rain") >= 0
)
{
response = "The weather here is really nice.";
} else {
response = randomResponse();
}
return response;
}
/**
* Pick a default response to use if nothing else fits.
* @return String
*/
private String randomResponse()
{
int NUMBER_OF_RESPONSES = 6;
double responseIndex = Math.random();
int whichResponse = (int)(responseIndex * NUMBER_OF_RESPONSES);
String response = "";
if (whichResponse == 0)
{
response = "Very cool!";
}
else if (whichResponse == 1)
{
response = "Tell me more about that.";
}
else if (whichResponse == 2)
{
response = "That's really interesting!";
}
else if (whichResponse == 3)
{
response = "Can we talk about something else?";
}
else if (whichResponse == 4)
{
response = "Booooring.";
}
else if (whichResponse == 5)
{
response = "You really like to talk, don't you?";
}
return response;
}
}
|
d5a26359e35ef21dccd1c230696877d5
|
{
"intermediate": 0.31690531969070435,
"beginner": 0.2973731458187103,
"expert": 0.38572147488594055
}
|
40,205
|
make it professional "<html>
<head>
<title>FastAPI File Upload</title>
</head>
<body>
<form method="POST" action="/uploadfile/" enctype="multipart/form-data">
<input type="file" name="file_upload">
<input type="submit">
</form>
</body>
</html>
|
828742e951e2acaf5d908c7298c210f9
|
{
"intermediate": 0.49326732754707336,
"beginner": 0.2478330135345459,
"expert": 0.25889962911605835
}
|
40,206
|
I do have the code.
Where do you think the exception could possibly be caused?
internal ObjectWeaponItem DropWeaponItem(SFD.Weapons.WeaponItemType wpnToDrop, int directionToDrop, Microsoft.Xna.Framework.Vector2 power, bool ignoreHolsteredModel, Microsoft.Xna.Framework.Vector2 dropOffset, bool handleCoverObject, Player.DropWeaponItemSource dropWeaponItemSource = Player.DropWeaponItemSource.Other)
{
string mapObjectID = "";
bool pickupable = false;
bool flag = false;
SFD.Weapons.WeaponItem weaponItem = null;
if ((this.CurrentAction == PlayerAction.DrawWeapon || this.CurrentAction == PlayerAction.HipFire || this.CurrentAction == PlayerAction.ManualAim) && this.CurrentWeaponDrawn == wpnToDrop)
{
this.CurrentAction = PlayerAction.Idle;
this.ImportantUpdate = true;
}
switch (wpnToDrop)
{
case SFD.Weapons.WeaponItemType.NONE:
return null;
case SFD.Weapons.WeaponItemType.Handgun:
if (this.CurrentHandgunWeapon != null)
{
mapObjectID = this.CurrentHandgunWeapon.Properties.ModelID;
pickupable = true;
flag = (!this.CurrentHandgunWeapon.IsEmpty && this.CurrentHandgunWeapon.Properties.SpawnsInSheath);
weaponItem = new SFD.Weapons.WeaponItem(SFD.Weapons.WeaponItemType.Handgun, this.CurrentHandgunWeapon);
goto IL_1FE;
}
goto IL_1FE;
case SFD.Weapons.WeaponItemType.Rifle:
if (this.CurrentRifleWeapon != null)
{
mapObjectID = this.CurrentRifleWeapon.Properties.ModelID;
pickupable = true;
flag = (!this.CurrentRifleWeapon.IsEmpty && this.CurrentRifleWeapon.Properties.SpawnsInSheath);
weaponItem = new SFD.Weapons.WeaponItem(SFD.Weapons.WeaponItemType.Rifle, this.CurrentRifleWeapon);
goto IL_1FE;
}
goto IL_1FE;
case SFD.Weapons.WeaponItemType.Thrown:
if (this.CurrentThrownWeapon != null)
{
mapObjectID = this.CurrentThrownWeapon.Properties.ModelID;
pickupable = true;
flag = this.CurrentThrownWeapon.Properties.SpawnsInSheath;
weaponItem = new SFD.Weapons.WeaponItem(SFD.Weapons.WeaponItemType.Thrown, this.CurrentThrownWeapon);
goto IL_1FE;
}
goto IL_1FE;
case SFD.Weapons.WeaponItemType.Melee:
if (this.CurrentMeleeMakeshiftWeapon != null)
{
mapObjectID = this.CurrentMeleeMakeshiftWeapon.Properties.ModelID;
pickupable = true;
flag = this.CurrentMeleeMakeshiftWeapon.Properties.SpawnsInSheath;
weaponItem = new SFD.Weapons.WeaponItem(SFD.Weapons.WeaponItemType.Melee, this.CurrentMeleeMakeshiftWeapon);
goto IL_1FE;
}
if (this.CurrentMeleeWeapon != null)
{
mapObjectID = this.CurrentMeleeWeapon.Properties.ModelID;
pickupable = true;
flag = this.CurrentMeleeWeapon.Properties.SpawnsInSheath;
weaponItem = new SFD.Weapons.WeaponItem(SFD.Weapons.WeaponItemType.Melee, this.CurrentMeleeWeapon);
goto IL_1FE;
}
goto IL_1FE;
case SFD.Weapons.WeaponItemType.Powerup:
if (this.CurrentPowerupItem != null)
{
mapObjectID = this.CurrentPowerupItem.Properties.ModelID;
pickupable = true;
flag = this.CurrentPowerupItem.Properties.SpawnsInSheath;
weaponItem = new SFD.Weapons.WeaponItem(SFD.Weapons.WeaponItemType.Powerup, this.CurrentPowerupItem);
goto IL_1FE;
}
goto IL_1FE;
}
return null;
IL_1FE:
if (weaponItem == null)
{
return null;
}
this.WeaponAbortBurstMode();
this.m_reloadPrepared = false;
this.RemoveWeaponItem(wpnToDrop, true, false);
if (!weaponItem.BaseProperties.IsMakeshift && this.IsDead && this.ItemDropMode == Player.ItemDropModeEnum.RemoveOnDeath)
{
this.QueueRemovedWeaponCallback(weaponItem.Type, weaponItem.BaseProperties.WeaponID, 0, dropWeaponItemSource == Player.DropWeaponItemSource.ManuallyDropped, false);
return null;
}
SpawnObjectInformation spawnObjectInformation = new SpawnObjectInformation(this.GameWorld.IDCounter.NextObjectData(mapObjectID), this.Position + dropOffset, (weaponItem.BaseProperties.WeaponID == 36) ? (0.7853982f * -(float)this.LastDirectionX) : 0f, (wpnToDrop == SFD.Weapons.WeaponItemType.Powerup) ? 1 : ((short)this.LastDirectionX), new Microsoft.Xna.Framework.Vector2((float)directionToDrop * 3f * power.X, 4f * power.Y), Constants.RANDOM.NextFloat(-3.14f, 3.14f));
if (handleCoverObject && this.CoverObject != null && this.CoverObject.Body != null && this.CoverObjectCanShootThrough)
{
spawnObjectInformation.IgnoreBodyID = this.CoverObject.Body.BodyID;
}
flag = (flag && !ignoreHolsteredModel);
Body body = this.m_gameWorld.CreateWeaponItem(spawnObjectInformation, flag, pickupable, true);
ObjectData objectData = ObjectData.Read(body.GetFixtureList());
ObjectWeaponItem objectWeaponItem = (ObjectWeaponItem)objectData;
if (this.m_modifiers.InfiniteAmmo >= 1 && dropWeaponItemSource != Player.DropWeaponItemSource.ManuallyDropped && weaponItem.RWeaponData != null)
{
weaponItem.RWeaponData.RestoreAmmo();
}
objectWeaponItem.InternalData = weaponItem;
objectWeaponItem.DroppedByPlayerID = this.ObjectID;
objectWeaponItem.DroppedByPlayerSource = dropWeaponItemSource;
if (!weaponItem.BaseProperties.IsMakeshift)
{
objectWeaponItem.SetDespawnTime(Constants.RANDOM.NextFloat(19000f, 21000f), false);
if ((weaponItem.Type == SFD.Weapons.WeaponItemType.Rifle || weaponItem.Type == SFD.Weapons.WeaponItemType.Handgun) && weaponItem.RWeaponData.IsEmpty)
{
objectWeaponItem.QueueItemRemoval();
}
if (weaponItem.Type == SFD.Weapons.WeaponItemType.Thrown && weaponItem.TWeaponData is WpnC4Detonator && this.GameWorld.GetObjectDataByID(((WpnC4Detonator)weaponItem.TWeaponData).ConnectedC4ObjectID) == null)
{
objectWeaponItem.QueueItemRemoval();
}
}
if (weaponItem.Type == SFD.Weapons.WeaponItemType.Melee & weaponItem.MWeaponData != null)
{
objectWeaponItem.Health.Fullness = weaponItem.MWeaponData.Durability.Fullness;
}
this.GameWorld.WeaponSpawnManager.RemoveSupplyItemTracking(objectData);
if (objectData.Tile.CanBeMissile)
{
this.GameWorld.AddMissileObject(objectData, ObjectMissileStatus.Dropped, this);
if (objectData.MissileData != null)
{
objectData.MissileData.SetHitCooldown();
}
}
if (this.ItemDropMode == Player.ItemDropModeEnum.Break || this.ItemDropMode == Player.ItemDropModeEnum.RemoveOnDeath)
{
objectWeaponItem.QueueBreakOnDrop();
objectWeaponItem.QueueItemRemoval();
if (this.IsDead)
{
objectWeaponItem.Activateable = false;
objectWeaponItem.ActivateableHighlightning = false;
}
}
if (this.GameOwner == GameOwnerEnum.Server)
{
this.GameWorld.AddBodyToSync(objectData.BodyData);
}
this.QueueRemovedWeaponCallback(weaponItem.Type, weaponItem.BaseProperties.WeaponID, objectWeaponItem.ObjectID, dropWeaponItemSource == Player.DropWeaponItemSource.ManuallyDropped, this.InThrowingMode && dropWeaponItemSource == Player.DropWeaponItemSource.Other);
return objectWeaponItem;
}
public ObjectData DropThrowable(Microsoft.Xna.Framework.Vector2 velocity)
{
if (this.CurrentWeaponDrawn == SFD.Weapons.WeaponItemType.Thrown && (this.CurrentAction == PlayerAction.ManualAim || this.FullLanding || (!this.m_throwFullyPerformed && this.ThrowableIsActivated)))
{
ObjectData result = this.ReleaseThrow(this.GetThrowLocation(true), velocity, Constants.RANDOM.NextFloat(-3f, 3f), false);
this.ThrowCharging = false;
this.CurrentAction = PlayerAction.Idle;
return result;
}
return null;
}
public void ReleaseThrow()
{
if (this.InThrowingMode && this.GameOwner != GameOwnerEnum.Server)
{
SoundHandler.PlaySound("Throw", this.Position, this.GameWorld);
}
if (this.GameOwner == GameOwnerEnum.Client)
{
if (this.InThrowingMode)
{
return;
}
if (this.CurrentThrownWeapon != null)
{
TWeaponOnThrowArgs e = new TWeaponOnThrowArgs(this, this.CurrentThrownWeapon, null, 0f, this.ThrowableIsActivated);
this.CurrentThrownWeapon.OnThrow(e);
return;
}
}
else
{
float num = -(float)this.LastDirectionX * Constants.RANDOM.NextFloat(1.5f, 2f);
if (this.InThrowingMode)
{
num *= 14f;
}
this.ReleaseThrow(this.GetThrowLocation(false), this.GetThrowVector(), num, true);
}
}
public ObjectData ReleaseThrow(Microsoft.Xna.Framework.Vector2 throwLocation, Microsoft.Xna.Framework.Vector2 throwLinearVelocity, float angularVelocity, bool isThrowTOrDropF)
{
ObjectData result = null;
if (this.InThrowingMode)
{
float angle = (float)Math.Atan2((double)throwLinearVelocity.Y, (double)throwLinearVelocity.X) + ((this.LastDirectionX == 1) ? 0f : 3.1415927f);
throwLinearVelocity *= this.ThrowForceModifier;
ObjectWeaponItem objectWeaponItem = this.DropWeaponItem(this.CurrentWeaponDrawn, this.LastDirectionX, Microsoft.Xna.Framework.Vector2.Zero, true, Microsoft.Xna.Framework.Vector2.Zero, true, Player.DropWeaponItemSource.Other);
if (objectWeaponItem != null)
{
this.Statisticts.m_TotalItemsThrown++;
objectWeaponItem.Body.SetTransform(Converter.WorldToBox2D(throwLocation), angle);
objectWeaponItem.Body.SetLinearVelocity(throwLinearVelocity + this.AirControlBaseVelocity);
objectWeaponItem.Body.SetAngularVelocity(angularVelocity);
this.TimeSequence.TimeThrowCooldown = 425f;
this.AddMissileObjectAndIgnoreTeammates(objectWeaponItem, ObjectMissileStatus.Thrown, false);
SFD.Weapons.WeaponItem weaponItem = objectWeaponItem.GetWeaponItem();
switch (weaponItem.Type)
{
case SFD.Weapons.WeaponItemType.Handgun:
weaponItem.RWeaponData.OnThrowWeaponItem(this, objectWeaponItem);
break;
case SFD.Weapons.WeaponItemType.Rifle:
weaponItem.RWeaponData.OnThrowWeaponItem(this, objectWeaponItem);
break;
case SFD.Weapons.WeaponItemType.Thrown:
weaponItem.TWeaponData.OnThrowWeaponItem(this, objectWeaponItem);
break;
case SFD.Weapons.WeaponItemType.Melee:
weaponItem.MWeaponData.OnThrowWeaponItem(this, objectWeaponItem);
break;
case SFD.Weapons.WeaponItemType.Powerup:
weaponItem.PItemData.OnThrowWeaponItem(this, objectWeaponItem);
break;
}
}
result = objectWeaponItem;
this.InThrowingMode = false;
}
else if (this.CurrentThrownWeapon != null)
{
this.Statisticts.m_TotalItemsThrown++;
bool flag = isThrowTOrDropF || this.ThrowableIsActivated;
string mapObjectID = this.CurrentThrownWeapon.Properties.ThrowObjectID;
if (!flag)
{
mapObjectID = this.CurrentThrownWeapon.Properties.ModelID;
}
SpawnObjectInformation spawnObjectInformation = new SpawnObjectInformation(this.GameWorld.IDCounter.NextObjectData(mapObjectID), throwLocation, 0f, (short)this.LastDirectionX, throwLinearVelocity * this.ThrowForceModifier, angularVelocity);
if (this.CoverObject != null && this.CoverObject.Body != null && this.CoverObjectCanShootThrough)
{
spawnObjectInformation.IgnoreBodyID = this.CoverObject.Body.BodyID;
}
Body body;
if (flag)
{
body = this.m_gameWorld.CreateTile(spawnObjectInformation);
}
else
{
body = this.m_gameWorld.CreateWeaponItem(spawnObjectInformation, false, true, true);
}
ObjectData objectData = ObjectData.Read(body);
result = objectData;
TWeaponOnThrowArgs tweaponOnThrowArgs = new TWeaponOnThrowArgs(this, this.CurrentThrownWeapon, objectData, this.FireSequence.ThrowableDeadlineTimer, flag);
this.TimeSequence.TimeThrowCooldown = 425f;
if (isThrowTOrDropF)
{
if (objectData.Tile.CanBeMissile)
{
this.AddMissileObjectAndIgnoreTeammates(objectData, ObjectMissileStatus.Thrown, false);
}
}
else if (objectData.Tile.CanBeMissile)
{
this.AddMissileObjectAndIgnoreTeammates(objectData, ObjectMissileStatus.Dropped, false);
}
if (!flag)
{
TWeapon tweapon = this.CurrentThrownWeapon.Copy();
tweapon.NumberOfThrowablesLeft = 1;
objectData.InternalData = new SFD.Weapons.WeaponItem(SFD.Weapons.WeaponItemType.Thrown, tweapon);
}
this.SetCurrentThrownAmmo((int)(this.InfiniteAmmo ? this.CurrentThrownWeapon.NumberOfThrowablesLeft : (this.CurrentThrownWeapon.NumberOfThrowablesLeft - 1)), objectData);
if (isThrowTOrDropF)
{
tweaponOnThrowArgs.ThrowableWeapon.OnThrow(tweaponOnThrowArgs);
}
else
{
tweaponOnThrowArgs.ThrowableWeapon.OnDrop(tweaponOnThrowArgs);
}
}
this.FireSequence.ThrowableDeadlineTimer = 0f;
this.m_throwFullyPerformed = true;
return result;
}
|
28d73d90e0b683fbd237688c26883953
|
{
"intermediate": 0.39743611216545105,
"beginner": 0.34321412444114685,
"expert": 0.2593498229980469
}
|
40,207
|
analogRead(A0) returns the value 414. What’s the voltage on the A0 pin on an Arduino
|
98e6f3e9bfe9725ca529f702bd58d1c5
|
{
"intermediate": 0.42158129811286926,
"beginner": 0.2650371193885803,
"expert": 0.313381552696228
}
|
40,208
|
Hi I would like to split sales by item. How can I do this in sql while keeping the totals equal? I.e a transaction of 10$ into 3 should be 33.33,33.33 and 33.34
|
12c4abd69d30885ab00ec913eb19b5fc
|
{
"intermediate": 0.45713844895362854,
"beginner": 0.20664066076278687,
"expert": 0.336220920085907
}
|
40,209
|
Help
|
d159c4235d4e4509c16120f4640b25ab
|
{
"intermediate": 0.3571339249610901,
"beginner": 0.32243141531944275,
"expert": 0.32043468952178955
}
|
40,210
|
i got a string "3 c=32 r=4-9"
write a python string that extracts "4-9"
|
3d0d864f15cacc9d9edd732705378aa6
|
{
"intermediate": 0.2857537269592285,
"beginner": 0.4817226827144623,
"expert": 0.2325236052274704
}
|
40,211
|
convert list [3,4] to slice "(3,4)
|
35d85098b65c4d4b3d4ba560c61f9bcb
|
{
"intermediate": 0.3538084030151367,
"beginner": 0.28877562284469604,
"expert": 0.35741594433784485
}
|
40,212
|
I am trying to make a nice looking octagon in Minecraft. It currently exists 40 blocks from a center point in NESW fashion. How long should each side be for it to look nice?
|
929d974a99471b54bf4311f66e22d814
|
{
"intermediate": 0.4637402594089508,
"beginner": 0.2978905141353607,
"expert": 0.23836928606033325
}
|
40,213
|
const altObjects = {
sockets: [],
parties: [],
altId: 0,
mousePosition: { x: 0, y: 0 },
mouseMove: true,
keyboard: true,
states: {
controlled: true,
locked: false
},
sendPacketToAll: (opcode, packetData = {}) => {
altObjects.sockets.forEach((ws) => {
if (!altObjects.states.controlled) return;
ws.network.sendPacket(opcode, packetData, ws);
});
},
forEachSocket: (callback) => {
altObjects.sockets.forEach((ws) => {
callback(ws);
})
},
getDistanceSquared(point1, point2) {
const dx = point2.x - point1.x;
const dy = point2.y - point1.y;
return dx * dx + dy * dy;
},
getPlayerData(sockets) {
return Object.values(sockets)
.filter(socket => socket.myPlayer.entityClass === "PlayerEntity")
.map(socket => ({
x: socket.myPlayer.position.x,
y: socket.myPlayer.position.y,
uid: socket.myPlayer.uid
}));
},
findClosestPlayer(playerData, mousePosition) {
return playerData.reduce((closest, player) => {
const distanceSquared = this.getDistanceSquared(mousePosition, player);
if (!closest || distanceSquared < closest.distanceSquared) {
return { ...player, distanceSquared };
}
return closest;
}, null);
},
getClosestPlayerToMouse() {
const mousePosition = altObjects.mousePosition;
const players = this.getPlayerData(this.sockets);
const closestPlayer = this.findClosestPlayer(players, mousePosition);
return closestPlayer ? {
x: closestPlayer.x,
y: closestPlayer.y,
uid: closestPlayer.uid
} : null;
},
createParty(raidType) {
// TODO: This only works for 1by1 atm, need to make it work for xkey
if (this.sockets.length === 0) {
console.log("No connections available to form a party.");
return;
}
let numSocketsToAdd;
if (raidType === "1by1") {
numSocketsToAdd = 1;
} else if (raidType === "xkey") {
numSocketsToAdd = 3;
} else {
console.error("Invalid raid type.");
return;
}
let index = 0; // Keep track of the current index in the sockets array
let cloneIdSet = new Set(); // Set to track unique cloneIds
while (index < this.sockets.length) {
let party = [];
let socketsProcessed = 0;
while (socketsProcessed < numSocketsToAdd && index < this.sockets.length) {
const socket = this.sockets[index];
if (!cloneIdSet.has(socket.cloneId)) {
cloneIdSet.add(socket.cloneId);
party.push({ socket, psk: socket.psk, hasStash: false });
socketsProcessed++;
}
index++; // Move to the next socket in the array
}
if (party.length > 0) {
console.log(`Party ${this.parties.length + 1} created with ${party.length} connections.`);
this.parties.push(party);
} else {
// All remaining sockets have duplicate cloneIds, stop creating parties
break;
}
cloneIdSet.clear(); // Clear the set for the next party
}
}
};
class WebSocketHandler {
constructor(name = "", sid = "") {
this.name = name;
this.sid = sid;
this.enterworld2data = null;
this.worker = new Worker("worker.js");
// this.workerSever = new WebSocket('ws://localhost:3000');
this.createWebSocket();
this.setupWorker();
}
createWebSocket() {
const serverId = this.sid || game.options.serverId;
const server = game.options.servers[serverId];
this.ws = new WebSocket(`wss://${server.hostname}:443/`);
this.ws.binaryType = "arraybuffer";
this.ws.onopen = () => this.handleOpen();
this.ws.onmessage = (msg) => this.handleMessage(msg);
this.ws.onclose = (e) => this.handleClose(e);
this.ws.onRpc = () => this.handleRpc();
}
setupWorker() {
this.worker.onmessage = (event) => {
const {
status,
result,
extra,
enterworld2
} = event.data;
if (status === "finalizeOpcode10") {
this.ws.send(result);
} else if (status === "opcode5Complete") {
this.ws.network.sendPacket(4, {
displayName: this.name,
extra: extra
});
this.enterworld2data = enterworld2;
}
};
}
handleOpen() {
this.ws.network = new game.networkType();
this.ws.network.sendPacket = (e, t) => {
this.ws.send(this.ws.network.codec.encode(e, t));
};
this.ws.player = {};
this.ws.myPlayer = {uid: null, position: {x: 0, y: 0}, model: "GamePlayer"};
this.ws.inventory = {};
this.ws.buildings = {};
this.ws.reversedYaw = false;
}
handleRpc() {
switch (this.ws.data.name) {
case "Dead":
this.ws.network.sendPacket(3, { respawn: 1 });
break;
case "SetItem":
this.ws.inventory[this.ws.data.response.itemName] = this.ws.data.response;
!this.ws.inventory[this.ws.data.response.itemName].stacks ? delete this.ws.inventory[this.ws.data.response.itemName] : 0;
break;
case "PartyShareKey":
this.ws.psk = this.ws.data.response.partyShareKey;
break;
case "DayCycle":
this.ws.isDay = this.ws.data.response.isDay;
break;
case "LocalBuilding":
for (let i in this.ws.data.response) {
this.ws.buildings[this.ws.data.response[i].uid] = this.ws.data.response[i];
this.ws.buildings[this.ws.data.response[i].uid].dead ? delete this.ws.buildings[this.ws.data.response[i].uid] : 0;
}
break;
}
}
handleMessage(msg) {
const opcode = new Uint8Array(msg.data)[0];
if (opcode == 5) {
this.worker.postMessage({
action: "opcode5",
data: new Uint8Array(msg.data),
ipAddress: game.network.connectionOptions.ipAddress,
});
return;
}
if (opcode == 10) {
this.worker.postMessage({
action: "opcode10",
data: msg.data
});
return;
}
this.ws.data = this.ws.network.codec.decode(msg.data);
switch (this.ws.data.opcode) {
case 0:
if (!this.ws.isclosed) {
for (let uid in this.ws.data.entities[this.ws.myPlayer.uid]) {
uid !== "uid" ? this.ws.myPlayer[uid] = this.ws.data.entities[this.ws.myPlayer.uid][uid] : 0;
}
}
break;
case 4:
this.enterworld2data && this.ws.send(this.enterworld2data);
this.ws.myPlayer.uid = this.ws.data.uid;
this.ws.network.sendPacket(3, {
up: 1
});
if (game.world.inWorld) {
this.ws.network.sendPacket(9, {
name: "JoinPartyByShareKey",
partyShareKey: game.ui.playerPartyShareKey
});
}
// Load Leaderboard
for (let i = 0; i < 26; i++) this.ws.send(new Uint8Array([3, 17, 123, 34, 117, 112, 34, 58, 49, 44, 34, 100, 111, 119, 110, 34, 58, 48, 125]));
this.ws.send(new Uint8Array([7, 0]));
this.ws.send(new Uint8Array([9, 6, 0, 0, 0, 126, 8, 0, 0, 108, 27, 0, 0, 146, 23, 0, 0, 82, 23, 0, 0, 8, 91, 11, 0, 8, 91, 11, 0, 0, 0, 0, 0, 32, 78, 0, 0, 76, 79, 0, 0, 172, 38, 0, 0, 120, 155, 0, 0, 166, 39, 0, 0, 140, 35, 0, 0, 36, 44, 0, 0, 213, 37, 0, 0, 100, 0, 0, 0, 120, 55, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 100, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 134, 6, 0, 0]));
this.ws.cloneId = ++altObjects.altId;
this.ws.network.sendRpc({
name: "BuyItem",
itemName: "PetCARL",
tier: 1
});
this.ws.network.sendRpc({
name: "BuyItem",
itemName: "PetMiner",
tier: 1
});
this.ws.player = document.createElement("div");
this.ws.player.classList.add("hud-map-player");
this.ws.player.style.display = "block";
this.ws.player.dataset.index = "4";
document.getElementsByClassName("hud-map")[0].appendChild(this.ws.player);
altObjects.sockets.push(this.ws)
break;
case 9:
!this.ws.isclosed ? this.ws.onRpc(this.ws.data) : 0;
break;
}
}
handleClose() {
this.ws.isclosed = true;
this.ws.player.remove();
this.worker.terminate();
}
};
Optimize, improve, and compact this code as much as possible. Make sure it's modular and add any additional features you feel contribute to the codebase.
|
fd9ae28f1050e83854a29e74478b96fe
|
{
"intermediate": 0.3329053223133087,
"beginner": 0.3928236961364746,
"expert": 0.27427101135253906
}
|
40,214
|
quick question, is there any benefit of storing keys in a HashMap as bytes in rust?
|
4237a113bb428d23fa0b2cdbf0b3e0b2
|
{
"intermediate": 0.35638850927352905,
"beginner": 0.19172118604183197,
"expert": 0.4518902897834778
}
|
40,215
|
This is my code:
pub fn parse_tracks<'a>(contents: &'a str) -> Result<TranscriptMap, &'static str> {
// let pseudomap: Arc<DashMap<String, Vec<(u32, u32, String)>>> = Arc::new(DashMap::new());
let mut tracks = contents
.par_lines()
.filter(|x| !x.starts_with("#"))
.filter_map(|x| Record::new(x).ok())
.fold(
|| HashMap::new(),
|mut acc, record| {
// let mut ps_acc = pseudomap.entry(record.chrom.clone()).or_insert(Vec::new());
// ps_acc.push((record.tx_start, record.tx_end, record.id.clone()));
let k = acc.entry(record.chrom).or_insert(vec![]);
k.push(record.info);
acc
},
)
.reduce(
|| HashMap::new(),
|mut acc, map| {
for (k, v) in map {
let acc_v = acc.entry(k).or_insert(Vec::new());
acc_v.extend(v);
}
acc
},
);
tracks.par_iter_mut().for_each(|(_, v)| {
v.par_sort_unstable_by_key(|x| (x.0, x.1));
});
Ok(tracks)
}
where:
pub type TranscriptMap = HashMap<Chromosome, Vec<Transcript>>;
pub type Chromosome = String;
this is pretty fast but I just realized that real use cases will build the hashmap with millions of elements (if no thousands of millions). I need to make this more faster and more efficient to hold. Any ideas? You are free to use any crate, algorithm or data structure you want!
|
708d9859d5ccd973e714103d02280980
|
{
"intermediate": 0.3780178129673004,
"beginner": 0.3024728298187256,
"expert": 0.3195093274116516
}
|
40,216
|
This is my code:
pub fn parse_tracks<'a>(contents: &'a str) -> Result<TranscriptMap, &'static str> {
let mut tracks = contents
.par_lines()
.filter(|x| !x.starts_with("#"))
.filter_map(|x| Record::new(x).ok())
.fold(
|| HashMap::new(),
|mut acc, record| {
let k = acc.entry(record.chrom).or_insert(vec![]);
k.push(record.info);
acc
},
)
.reduce(
|| HashMap::new(),
|mut acc, map| {
for (k, v) in map {
let acc_v = acc.entry(k).or_insert(Vec::new());
acc_v.extend(v);
}
acc
},
);
tracks.par_iter_mut().for_each(|(_, v)| {
v.par_sort_unstable_by_key(|x| (x.0, x.1));
});
Ok(tracks)
}
where:
pub type TranscriptMap = HashMap<Chromosome, Vec<Transcript>>;
pub type Chromosome = String;
this is pretty fast but I just realized that real use cases will build the hashmap with millions of elements (if no thousands of millions). I need to make this more faster and more efficient to hold. Any ideas? You are free to use any crate, algorithm or data structure you want!
|
6e9f5b2b856f1adf3aa86c3f750d0620
|
{
"intermediate": 0.43713515996932983,
"beginner": 0.2696744203567505,
"expert": 0.2931903600692749
}
|
40,217
|
const altObjects = {
sockets: [],
parties: [],
altId: 0,
mousePosition: { x: 0, y: 0 },
mouseMove: true,
keyboard: true,
states: {
controlled: true,
locked: false
},
sendPacketToAll: (opcode, packetData = {}) => {
altObjects.sockets.forEach((ws) => {
if (!altObjects.states.controlled) return;
ws.network.sendPacket(opcode, packetData, ws);
});
},
forEachSocket: (callback) => {
altObjects.sockets.forEach((ws) => {
callback(ws);
})
},
getDistanceSquared(point1, point2) {
const dx = point2.x - point1.x;
const dy = point2.y - point1.y;
return dx * dx + dy * dy;
},
getPlayerData(sockets) {
return Object.values(sockets)
.filter(socket => socket.myPlayer.entityClass === "PlayerEntity")
.map(socket => ({
x: socket.myPlayer.position.x,
y: socket.myPlayer.position.y,
uid: socket.myPlayer.uid
}));
},
findClosestPlayer(playerData, mousePosition) {
return playerData.reduce((closest, player) => {
const distanceSquared = this.getDistanceSquared(mousePosition, player);
if (!closest || distanceSquared < closest.distanceSquared) {
return { ...player, distanceSquared };
}
return closest;
}, null);
},
getClosestPlayerToMouse() {
const mousePosition = altObjects.mousePosition;
const players = this.getPlayerData(this.sockets);
const closestPlayer = this.findClosestPlayer(players, mousePosition);
return closestPlayer ? {
x: closestPlayer.x,
y: closestPlayer.y,
uid: closestPlayer.uid
} : null;
},
setupEventHandler() {
document.addEventListener('wsEvents', (data) => {
console.log(data);
});
},
};
Move all the logic corresponding to getClosestPlayerToMouse into a compact fashion so they dont take up so much space
|
c7386cde97e9a243f442e1f9a4455c87
|
{
"intermediate": 0.3252173960208893,
"beginner": 0.35484084486961365,
"expert": 0.31994184851646423
}
|
40,218
|
const DirectionMapper = {
movements: [90, 225, 44, 314, 135, 359, 180, 270],
yawActions: {
90: { right: 1, left: 0, up: 0, down: 0 },
225: { down: 1, left: 1, up: 0, right: 0 },
44: { down: 0, left: 0, up: 1, right: 1 },
314: { down: 0, left: 1, up: 1, right: 0 },
135: { down: 1, left: 0, up: 0, right: 1 },
359: { up: 1, down: 0, right: 0, left: 0 },
180: { down: 1, up: 0, right: 0, left: 0 },
270: { left: 1, right: 0, up: 0, down: 0 }
},
typeToValue: {
"top": 359,
"top right": 44,
"right": 90,
"bottom right": 135,
"bottom": 180,
"bottom left": 225,
"left": 270,
"top left": 314
},
valueToType: {
359: "top",
44: "top right",
90: "right",
135: "bottom right",
180: "bottom",
225: "bottom left",
270: "left",
314: "top left"
},
aimToYaw: (num) => {
const tolerance = 23;
num = (num === 360 || num === 0) ? 359 : num;
for (let movement of DirectionMapper.movements) {
if (num >= movement - tolerance && num <= movement + tolerance) {
return movement;
}
}
return null;
}
};
Optimize, improve, compact this code along with reducing its complexity. Remember, keep the code as close to an O(1n) as possible
|
247bba6191ec95114ef343f8179b436d
|
{
"intermediate": 0.35921087861061096,
"beginner": 0.2175305187702179,
"expert": 0.42325860261917114
}
|
40,219
|
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.20;
contract SimpleContract {
uint256 public number = 5;
// Long-named function
function ThisIsAVeryLongFunctionNameThatDoesTheSameThingAsTheShortOne() public {
number++;
}
}
contract SimpleContract {
uint256 public numner = 5;
// Short-named function
function A() public {
number++;
}
}
Which of the following functions provided will cost more gas and why?
|
1caca902ff938914bbd55cf7f1a400c1
|
{
"intermediate": 0.40583518147468567,
"beginner": 0.36268532276153564,
"expert": 0.23147952556610107
}
|
40,220
|
import time
import matplotlib.pyplot as plt

def pseudo_rand_num_gen(seed=None, k=1):
if seed is None:
seed = int(time.time()) # Use current system time as seed
# Ensure we have an even length for the seed
seed_str = str(seed)
if len(seed_str) % 2 != 0:
seed_str = '0' + seed_str
n = len(seed_str)
rand_nums = []
for _ in range(k):
seed_squared = int(seed_str) ** 2
seed_squared_str = str(seed_squared).zfill(n * 2)
middle_digits = int(seed_squared_str[(n // 2): (n // 2 + n)])
rand_nums.append(middle_digits)
seed_str = str(middle_digits).zfill(n) # Update seed for the next iteration
# Normalize the numbers by dividing by 10^n
normalized_nums = [num / (10**n) for num in rand_nums]
# Plotting histogram
plt.hist(normalized_nums, bins=max(10, k//10), edgecolor='black')
plt.title('Pseudo-Random Numbers Distribution')
plt.xlabel('Number Range')
plt.ylabel('Frequency')
plt.show()
return normalized_nums
# Example usage:
random_numbers = pseudo_rand_num_gen(seed=9876, k=1000)
# Note that for each run, you should pass a different seed to get different sequences or omit the seed
# parameter to use the system time.
Write a Python function pseudo_rand_num_gen(seed, k) that generates k pseudo-random
numbers using the above process and uses your current system time as the seed to this function. The function returns a list containing all the k random numbers generated.
Update: The middle-square algorithm needs to be implemented for a general seed with
even number of digits. (Hint: A faster way to extract the middle digits is by converting the
number to a string first.) In case the seed after squaring has odd no. of digits pad it with a
zero in the front and then extract the middle digits.
Also plot a histogram of the generated random values (after dividing by 10^n where n is the
no. of digits in the seed) using matplotlib.
|
737fffee3579b7fa430b587156b6ee87
|
{
"intermediate": 0.373975545167923,
"beginner": 0.17715109884738922,
"expert": 0.4488734006881714
}
|
40,221
|
contract SimpleContract {
uint256 public number = 5;
// Long-named function
function ThisIsAVeryLongFunctionNameThatDoesTheSameThingAsTheShortOne() public {
number++;
}
}
contract SimpleContract {
uint256 public number = 5;
// Short-named function
function A() public {
number++;
}
}
Which of the following functions provided will cost more gas and why?
|
45f1e1805b12555a9a21ec2b40d3b2a9
|
{
"intermediate": 0.33560290932655334,
"beginner": 0.4528568983078003,
"expert": 0.21154020726680756
}
|
40,222
|
const DirectionMapper = {
yawActions: {
90: { right: 1, left: 0, up: 0, down: 0 },
225: { down: 1, left: 1, up: 0, right: 0 },
44: { down: 0, left: 0, up: 1, right: 1 },
314: { down: 0, left: 1, up: 1, right: 0 },
135: { down: 1, left: 0, up: 0, right: 1 },
359: { up: 1, down: 0, right: 0, left: 0 },
180: { down: 1, up: 0, right: 0, left: 0 },
270: { left: 1, right: 0, up: 0, down: 0 }
},
typeToValue: {
"top": 359,
"top right": 44,
"right": 90,
"bottom right": 135,
"bottom": 180,
"bottom left": 225,
"left": 270,
"top left": 314
},
aimToYaw: (num) => {
const tolerance = 23;
num = num % 360;
let matchingMovement = null;
for (let angle in DirectionMapper.yawActions) {
angle = parseInt(angle);
if (num >= angle - tolerance && num <= angle + tolerance ||
num + 360 >= angle - tolerance && num <= angle - 360 + tolerance) {
matchingMovement = angle;
break;
}
}
return matchingMovement;
}
};
Why is this so smooth and fast?
|
5e7f51e748812b009139db4acc9caac1
|
{
"intermediate": 0.42988497018814087,
"beginner": 0.25347787141799927,
"expert": 0.31663718819618225
}
|
40,223
|
const DirectionMapper = {
movements: [90, 225, 44, 314, 135, 359, 180, 270],
yawActions: {
90: { right: 1, left: 0, up: 0, down: 0 },
225: { down: 1, left: 1, up: 0, right: 0 },
44: { down: 0, left: 0, up: 1, right: 1 },
314: { down: 0, left: 1, up: 1, right: 0 },
135: { down: 1, left: 0, up: 0, right: 1 },
359: { up: 1, down: 0, right: 0, left: 0 },
180: { down: 1, up: 0, right: 0, left: 0 },
270: { left: 1, right: 0, up: 0, down: 0 }
},
typeToValue: {
"top": 359,
"top right": 44,
"right": 90,
"bottom right": 135,
"bottom": 180,
"bottom left": 225,
"left": 270,
"top left": 314
},
valueToType: {
359: "top",
44: "top right",
90: "right",
135: "bottom right",
180: "bottom",
225: "bottom left",
270: "left",
314: "top left"
},
aimToYaw: (num) => {
const tolerance = 23;
num = (num === 360 || num === 0) ? 359 : num;
for (let movement of DirectionMapper.movements) {
if (num >= movement - tolerance && num <= movement + tolerance) {
return movement;
}
}
return null;
}
};
Why is this 24% faster than
const DirectionMapper = {
yawActions: {
90: { right: 1, left: 0, up: 0, down: 0 },
225: { down: 1, left: 1, up: 0, right: 0 },
44: { down: 0, left: 0, up: 1, right: 1 },
314: { down: 0, left: 1, up: 1, right: 0 },
135: { down: 1, left: 0, up: 0, right: 1 },
359: { up: 1, down: 0, right: 0, left: 0 },
180: { down: 1, up: 0, right: 0, left: 0 },
270: { left: 1, right: 0, up: 0, down: 0 }
},
// Might be used in the future
typeToValue: {
"top": 359,
"top right": 44,
"right": 90,
"bottom right": 135,
"bottom": 180,
"bottom left": 225,
"left": 270,
"top left": 314
},
aimToYaw: (num) => {
const tolerance = 23;
num = num % 360;
let matchingMovement = null;
for (let angle in DirectionMapper.yawActions) {
angle = parseInt(angle);
if (num >= angle - tolerance && num <= angle + tolerance ||
num + 360 >= angle - tolerance && num <= angle - 360 + tolerance) {
matchingMovement = angle;
break; // Exit the loop early as we found a valid match.
}
}
return matchingMovement;
}
};
|
8f6bd422081f894381b0d3a378db84ad
|
{
"intermediate": 0.315583735704422,
"beginner": 0.4066173732280731,
"expert": 0.2777988314628601
}
|
40,224
|
b) Estimating π
Imagine a Cartesian coordinate plane and a square of side length 2 centered at the origin.
The square will have vertices at (-1, -1), (-1, 1), (1, 1), and (1, -1). Inside the square, a circle
of radius 1 unit is centered at the origin, as shown below.
Figure 1: Monte-Carlo Simulation
Now, we randomly generate a large number of points (x, y) within the square. The x and y
coordinates of these points should be between -1 and 1. For each generated point, we check
if the point falls within the circle (x² + y² ≤ 1) and keep a count of the no. of points falling
inside the circle and the total no. of random points generated.
Now P(random point lies in the circle) = Area of the circle / Area of the square = π/4. We empirically estimate
this probability by the random sampling done above. Therefore, π = 4 × (Area of the circle / Area of the square).
Update: Use Python to simulate the above process and compare the π value obtained
using the pseudo-random number generator from the previous subtask and random.random()
function from random module in python.
Note: You must import the pseudo_rand_num_gen(seed, k) function from part (a) .py file,
instead of copy-pasting the function code.
|
f27f14d9694c45c24b0532186e8cfdff
|
{
"intermediate": 0.3885692059993744,
"beginner": 0.2778486907482147,
"expert": 0.3335821330547333
}
|
40,225
|
print(f"Estimated π using pseudo RNG: {pi_estimate_pseudo_rng}”)
print(f"Estimated π using built-in random.random(): {pi_estimate_random}”)
The updated script should now look like this:
# Assuming the part (a) code is imported as pseudo_rand_num_gen
from q1_a import pseudo_rand_num_gen
import random
def monte_carlo_pi_estimation(total_points, rand_func):
inside_circle = 0
for _ in range(total_points):
x = rand_func() * 2 - 1 # Scaling to range [-1, 1]
y = rand_func() * 2 - 1 # Scaling to range [-1, 1]
if x2 + y2 <= 1: # Use ** for squaring
inside_circle += 1
return (inside_circle / total_points) * 4
# Function generator that uses the custom pseudo_rand_num_gen to yield one random number at a time
def my_pseudo_random(seed):
while True:
for number in pseudo_rand_num_gen(seed, k=1):
yield number
# Initialize the random number generator
seed = 123 # Set the seed, it can be any integer
random_generator = my_pseudo_random(seed)
# Perform the Monte Carlo simulation with a large number of points
total_points = 1000000 # The more points, the better the estimation
# We pass a lambda function to call “next()” on our generator, fetching new random numbers
pi_estimate_pseudo_rng = monte_carlo_pi_estimation(total_points, lambda: next(random_generator))
pi_estimate_random = monte_carlo_pi_estimation(total_points, random.random)
# Corrected print statements
print(f"Estimated π using pseudo RNG: {pi_estimate_pseudo_rng}”)
print(f"Estimated π using built-in random.random(): {pi_estimate_random}”)
something is wrong
|
8541cd1cf1e878b9b554e29ae5a00f06
|
{
"intermediate": 0.40674450993537903,
"beginner": 0.3619741201400757,
"expert": 0.23128141462802887
}
|
40,226
|
Write python code to read an image of your choice and display only the left half of the image.
|
8e78d173545c9430e977cc12dc08530d
|
{
"intermediate": 0.38590481877326965,
"beginner": 0.19657474756240845,
"expert": 0.4175204336643219
}
|
40,227
|
What does this mean in terms of building the website npm run build: Any file inside the folder pages/api is mapped to /api/* and will be treated as an API endpoint instead of a page. They are server-side only bundles and won't increase your client-side bundle size.
|
c8368c3cb2e2d6171fc31198162d2d60
|
{
"intermediate": 0.6124440431594849,
"beginner": 0.25844070315361023,
"expert": 0.1291152387857437
}
|
40,228
|
Hi, could you check out the code in this code base, and tell me how you would go about building this website locally on Win10? https://github.com/Esurielt/spirit-seeker-site
|
e987fb4b93e752f1f68963a5d22dc9af
|
{
"intermediate": 0.6201077103614807,
"beginner": 0.13331836462020874,
"expert": 0.24657389521598816
}
|
40,229
|
how to speed the process "from ultralytics import YOLO
import pandas as pd
import os
model = YOLO("/content/saqib/best_seg_75.pt") ## add the path of your model
## add the path of your video
results = model.predict('/content/out.mp4', save=True, save_txt=True, stream=True, save_frames=True)
list_entries = []
i = 1
for result in results:
boxes = result.boxes.cpu().numpy()
masks = result.masks
probs = result.probs
for box, mask in zip(boxes, masks):
cls = int(box.cls[0])
class_name = model.names[cls]
conf = int(box.conf[0]*100)
df = pd.DataFrame({'item_no': [i], 'class_name': [class_name], 'confidence': [conf]})
list_entries.append(df)
|
040296f64952670bd49cec5f61408c47
|
{
"intermediate": 0.5621098875999451,
"beginner": 0.22989876568317413,
"expert": 0.20799140632152557
}
|
40,230
|
speed up my soluution 'from ultralytics import YOLO
import pandas as pd
import os
model = YOLO("/content/saqib/best_seg_75.pt") ## add the path of your model
## add the path of your video
results = model.predict('/content/out.mp4', save=True, save_txt=True, stream=True, save_frames=True)
list_entries = []
i = 1
for result in results:
boxes = result.boxes.cpu().numpy()
masks = result.masks
probs = result.probs
for box, mask in zip(boxes, masks):
cls = int(box.cls[0])
class_name = model.names[cls]
conf = int(box.conf[0]*100)
df = pd.DataFrame({'item_no': [i], 'class_name': [class_name], 'confidence': [conf]})
list_entries.append(df)
|
b427afda32862819ce6f07f487c20396
|
{
"intermediate": 0.43421950936317444,
"beginner": 0.28959351778030396,
"expert": 0.2761869728565216
}
|
40,231
|
Write a full code in python to convert any 2d images into 3d model
|
64c2c6c41480a2aefa3e2746257c1361
|
{
"intermediate": 0.3002091646194458,
"beginner": 0.1092541366815567,
"expert": 0.5905367136001587
}
|
40,232
|
i have a detection model, i detect objects in video and save the result in labels, with class and bounding box. i want a script that can take many labels and extract just the object without repeating it, with the possibility of right, left, up, down
|
4e84a34675d02473f4b5b121b76384f8
|
{
"intermediate": 0.4494077265262604,
"beginner": 0.12411988526582718,
"expert": 0.42647236585617065
}
|
40,233
|
# Group by “unique_id” and calculate the length of each group
lengths = y_cl4.groupby('unique_id').agg(pl.count().alias('length'))
# Count the occurrences of each length
counts = lengths.groupby('length').agg(pl.count().alias('count'))
counts = counts.sort('length')
pl.Config.set_tbl_rows(200)
print(counts)
┌────────┬───────┐
│ length │ count │
│ ---    │ ---   │
│ u32    │ u32   │
╞════════╪═══════╡
│ 1      │ 1942  │
│ 2      │ 357   │
│ 3      │ 157   │
│ 4      │ 107   │
│ 5      │ 74    │
│ 6      │ 40    │
│ 7      │ 48    │
│ 8      │ 37    │
│ 9      │ 39    │
│ 10     │ 47    │
│ 11     │ 54    │
│ 12     │ 36    │
│ 13     │ 35    │
│ 14     │ 43    │
│ 15     │ 47    │
│ 16     │ 45    │
│ 17     │ 36    │
│ 18     │ 37    │
│ 19     │ 51    │
│ 20     │ 35    │
│ 21     │ 41    │
│ 22     │ 29    │
│ 23     │ 26    │
│ 24     │ 33    │
│ 25     │ 35    │
│ 26     │ 41    │
│ 27     │ 39    │
│ 28     │ 34    │
│ 29     │ 37    │
│ 30     │ 31    │
│ 31     │ 32    │
│ 32     │ 26    │
│ 33     │ 30    │
│ 34     │ 22    │
│ 35     │ 39    │
│ 36     │ 32    │
│ 37     │ 32    │
│ 38     │ 33    │
│ 39     │ 37    │
│ 40     │ 34    │
│ 41     │ 24    │
│ 42     │ 22    │
│ 43     │ 17    │
│ 44     │ 18    │
│ 45     │ 13    │
│ 46     │ 10    │
│ 47     │ 18    │
│ 48     │ 15    │
│ 49     │ 17    │
│ 50     │ 12    │
│ 51     │ 15    │
│ 52     │ 10    │
│ 53     │ 11    │
│ 54     │ 6     │
│ 55     │ 9     │
│ 56     │ 7     │
│ 57     │ 11    │
│ 58     │ 11    │
│ 59     │ 9     │
│ 60     │ 13    │
│ 61     │ 14    │
│ 62     │ 7     │
│ 63     │ 5     │
│ 64     │ 3     │
│ 65     │ 6     │
│ 66     │ 6     │
│ 67     │ 5     │
│ 68     │ 11    │
│ 69     │ 7     │
│ 70     │ 4     │
│ 71     │ 2     │
│ 72     │ 4     │
│ 73     │ 3     │
│ 74     │ 1     │
└────────┴───────┘
# Filter the lengths DataFrame for lengths greater than 17
lengths_filtered = lengths.filter(pl.col('length') > 18)
# y_soldto filtered with only values greater than 17 in length
y_cl4_filtered = y_cl4.join(
lengths_filtered.select(pl.col('unique_id')),
on='unique_id',
how='semi'
)
# Sort by 'WeekDate'
y_cl4_filtered = y_cl4_filtered.sort("ds")
print(y_cl4_filtered)
# Group by “unique_id” and calculate the length of each group
lengths = y_cl4_filtered.groupby('unique_id').agg(pl.count().alias('length'))
# Count the occurrences of each length
counts = lengths.groupby('length').agg(pl.count().alias('count'))
counts = counts.sort('length')
pl.Config.set_tbl_rows(200)
print(counts)
┌────────┬───────┐
│ length │ count │
│ ---    │ ---   │
│ u32    │ u32   │
╞════════╪═══════╡
│ 19     │ 51    │
│ 20     │ 35    │
│ 21     │ 41    │
│ 22     │ 29    │
│ 23     │ 26    │
│ 24     │ 33    │
│ 25     │ 35    │
│ 26     │ 41    │
│ 27     │ 39    │
│ 28     │ 34    │
│ 29     │ 37    │
│ 30     │ 31    │
│ 31     │ 32    │
│ 32     │ 26    │
│ 33     │ 30    │
│ 34     │ 22    │
│ 35     │ 39    │
│ 36     │ 32    │
│ 37     │ 32    │
│ 38     │ 33    │
│ 39     │ 37    │
│ 40     │ 34    │
│ 41     │ 24    │
│ 42     │ 22    │
│ 43     │ 17    │
│ 44     │ 18    │
│ 45     │ 13    │
│ 46     │ 10    │
│ 47     │ 18    │
│ 48     │ 15    │
│ 49     │ 17    │
│ 50     │ 12    │
│ 51     │ 15    │
│ 52     │ 10    │
│ 53     │ 11    │
│ 54     │ 6     │
│ 55     │ 9     │
│ 56     │ 7     │
│ 57     │ 11    │
│ 58     │ 11    │
│ 59     │ 9     │
│ 60     │ 13    │
│ 61     │ 14    │
│ 62     │ 7     │
│ 63     │ 5     │
│ 64     │ 3     │
│ 65     │ 6     │
│ 66     │ 6     │
│ 67     │ 5     │
│ 68     │ 11    │
│ 69     │ 7     │
│ 70     │ 4     │
│ 71     │ 2     │
│ 72     │ 4     │
│ 73     │ 3     │
│ 74     │ 1     │
└────────┴───────┘
from statsforecast import StatsForecast
from statsforecast.models import AutoARIMA, AutoETS, AutoCES, DynamicOptimizedTheta
from statsforecast.utils import ConformalIntervals
import numpy as np
import polars as pl
# Polars option to display all rows
pl.Config.set_tbl_rows(None)
# Initialize the models
models = [
AutoARIMA(season_length=12),
AutoETS(season_length=12),
AutoCES(season_length=12),
DynamicOptimizedTheta(season_length=12)
]
# Initialize the StatsForecast model
sf = StatsForecast(models=models, freq='1w', n_jobs=-1)
# Perform cross-validation with a step size of 1 to mimic an expanding window
crossvalidation_df = sf.cross_validation(df=y_cl4_filtered, h=3, step_size=1, n_windows=8, sort_df=True)
# Calculate the ensemble mean
ensemble = crossvalidation_df[['AutoARIMA', 'AutoETS', 'CES', 'DynamicOptimizedTheta']].mean(axis=1)
# Create a Series for the ensemble mean
ensemble_series = pl.Series('Ensemble', ensemble)
# Add the ensemble mean as a new column to the DataFrame
crossvalidation_df = crossvalidation_df.with_columns(ensemble_series)
def wmape(y_true, y_pred):
    return np.abs(y_true - y_pred).sum() / np.abs(y_true).sum()
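A quick sanity check of the WMAPE definition above, restated here so the snippet is self-contained, with made-up numbers:

```python
import numpy as np

def wmape(y_true, y_pred):
    # Weighted MAPE: total absolute error over total absolute actuals
    return np.abs(y_true - y_pred).sum() / np.abs(y_true).sum()

y_true = np.array([100.0, 200.0, 0.0])  # a zero actual is fine here, unlike plain MAPE
y_pred = np.array([110.0, 190.0, 5.0])

# |-10| + |10| + |-5| = 25, divided by 300
assert np.isclose(wmape(y_true, y_pred), 25 / 300)
```

Because the denominator is the sum of actuals rather than a per-point divisor, series with zero actuals do not blow up individual terms the way plain MAPE does.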
# Calculate the WMAPE for the ensemble model
wmape_value = wmape(crossvalidation_df['y'], crossvalidation_df['Ensemble'])
print('Average WMAPE for Ensemble: ', round(wmape_value, 4))
# Calculate the errors for the ensemble model
errors = crossvalidation_df['y'] - crossvalidation_df['Ensemble']
# For an individual forecast
individual_accuracy = 1 - (abs(crossvalidation_df['y'] - crossvalidation_df['Ensemble']) / crossvalidation_df['y'])
individual_bias = (crossvalidation_df['Ensemble'] / crossvalidation_df['y']) - 1
# Add these calculations as new columns to DataFrame
crossvalidation_df = crossvalidation_df.with_columns([
individual_accuracy.alias("individual_accuracy"),
individual_bias.alias("individual_bias")
])
# Print the individual accuracy and bias for each week
for row in crossvalidation_df.to_dicts():
    id = row['unique_id']
    date = row['ds']
    accuracy = row['individual_accuracy']
    bias = row['individual_bias']
    print(f"{id}, {date}, Individual Accuracy: {accuracy:.4f}, Individual Bias: {bias:.4f}")
# For groups of forecasts
group_accuracy = 1 - (errors.abs().sum() / crossvalidation_df['y'].sum())
group_bias = (crossvalidation_df['Ensemble'].sum() / crossvalidation_df['y'].sum()) - 1
# Print the average group accuracy and group bias over all folds for the ensemble model
print('Average Group Accuracy: ', round(group_accuracy, 4))
print('Average Group Bias: ', round(group_bias, 4))
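The group metrics above reduce to ratios of sums, so they can be checked by hand on made-up totals:

```python
import numpy as np

y = np.array([100.0, 50.0, 50.0])    # actuals, sum 200
yhat = np.array([90.0, 60.0, 40.0])  # ensemble forecasts, sum 190

errors = y - yhat                                       # 10, -10, 10
group_accuracy = 1 - (np.abs(errors).sum() / y.sum())   # 1 - 30/200
group_bias = (yhat.sum() / y.sum()) - 1                 # 190/200 - 1

assert np.isclose(group_accuracy, 0.85)
assert np.isclose(group_bias, -0.05)  # negative: under-forecasting overall
```

Note that group bias nets out opposite-signed errors (here +10 and -10 cancel), which is why it is reported alongside accuracy rather than instead of it.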
# Fit the models on the entire dataset
sf.fit(y_cl4_fit_1)
# Instantiate the ConformalIntervals class
prediction_intervals = ConformalIntervals()
# Generate 24 months forecasts
forecasts_df = sf.forecast(h=52*2, prediction_intervals=prediction_intervals, level=[95], id_col='unique_id', sort_df=True)
# Apply the non-negative constraint to the forecasts of individual models
forecasts_df = forecasts_df.with_columns([
pl.when(pl.col('AutoARIMA') < 0).then(0).otherwise(pl.col('AutoARIMA')).alias('AutoARIMA'),
pl.when(pl.col('AutoETS') < 0).then(0).otherwise(pl.col('AutoETS')).alias('AutoETS'),
pl.when(pl.col('CES') < 0).then(0).otherwise(pl.col('CES')).alias('CES'),
pl.when(pl.col('DynamicOptimizedTheta') < 0).then(0).otherwise(pl.col('DynamicOptimizedTheta')).alias('DynamicOptimizedTheta'),
])
# Calculate the ensemble forecast
ensemble_forecast = forecasts_df[['AutoARIMA', 'AutoETS', 'CES', 'DynamicOptimizedTheta']].mean(axis=1)
# Calculate the lower and upper prediction intervals for the ensemble forecast
ensemble_lo_95 = forecasts_df.select(
[
pl.when(pl.col('AutoARIMA-lo-95') < 0).then(0).otherwise(pl.col('AutoARIMA-lo-95')).alias('AutoARIMA-lo-95'),
pl.when(pl.col('AutoETS-lo-95') < 0).then(0).otherwise(pl.col('AutoETS-lo-95')).alias('AutoETS-lo-95'),
pl.when(pl.col('CES-lo-95') < 0).then(0).otherwise(pl.col('CES-lo-95')).alias('CES-lo-95'),
pl.when(pl.col('DynamicOptimizedTheta-lo-95') < 0).then(0).otherwise(pl.col('DynamicOptimizedTheta-lo-95')).alias('DynamicOptimizedTheta-lo-95'),
]
).mean(axis=1)
ensemble_hi_95 = forecasts_df[['AutoARIMA-hi-95', 'AutoETS-hi-95', 'CES-hi-95', 'DynamicOptimizedTheta-hi-95']].mean(axis=1)
# Create Series for the ensemble forecast and its prediction intervals
ensemble_forecast_series = pl.Series('EnsembleForecast', ensemble_forecast)
ensemble_lo_95_series = pl.Series('Ensemble-lo-95', ensemble_lo_95)
ensemble_hi_95_series = pl.Series('Ensemble-hi-95', ensemble_hi_95)
# Add the ensemble forecast and its prediction intervals as new columns to the DataFrame
forecasts_df = forecasts_df.with_columns([ensemble_forecast_series, ensemble_lo_95_series, ensemble_hi_95_series])
# Round the ensemble forecast and prediction intervals and convert to integer
forecasts_df = forecasts_df.with_columns([
pl.col("EnsembleForecast").round().cast(pl.Int32),
pl.col("Ensemble-lo-95").round().cast(pl.Int32),
pl.col("Ensemble-hi-95").round().cast(pl.Int32)
])
# Reorder the columns
forecasts_df = forecasts_df.select([
"unique_id",
"ds",
"EnsembleForecast",
"Ensemble-lo-95",
"Ensemble-hi-95",
"AutoARIMA",
"AutoARIMA-lo-95",
"AutoARIMA-hi-95",
"AutoETS",
"AutoETS-lo-95",
"AutoETS-hi-95",
"CES",
"CES-lo-95",
"CES-hi-95",
"DynamicOptimizedTheta",
"DynamicOptimizedTheta-lo-95",
"DynamicOptimizedTheta-hi-95"
])
# Create an empty list
forecasts_list = []
# Append each row to the list
for row in forecasts_df.to_dicts():
    forecasts_list.append(row)
# Print the list
for forecast in forecasts_list:
    print(forecast)

---------------------------------------------------------------------------
RemoteTraceback                           Traceback (most recent call last)
RemoteTraceback:
"""
Traceback (most recent call last):
  File "/Users/tungnguyen/anaconda3/lib/python3.10/multiprocessing/pool.py", line 125, in worker
    result = (True, func(*args, **kwds))
  File "/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/core.py", line 322, in cross_validation
    raise error
  File "/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/core.py", line 319, in cross_validation
    res_i = model.forecast(**forecast_kwargs)
  File "/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/models.py", line 1057, in forecast
    mod = auto_ces(y, m=self.season_length, model=self.model)
  File "/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ces.py", line 782, in auto_ces
    raise Exception("no model able to be fitted")
Exception: no model able to be fitted
"""

The above exception was the direct cause of the following exception:

Exception                                 Traceback (most recent call last)
Cell In[17], line 22
     19 sf = StatsForecast(models=models, freq='1w', n_jobs=-1)
     21 # Perform cross-validation with a step size of 1 to mimic an expanding window
---> 22 crossvalidation_df = sf.cross_validation(df=y_cl4_filtered, h=3, step_size=1, n_windows=8, sort_df=True)
     24 # Calculate the ensemble mean
     25 ensemble = crossvalidation_df[['AutoARIMA', 'AutoETS', 'CES', 'DynamicOptimizedTheta']].mean(axis=1)

File ~/anaconda3/lib/python3.10/site-packages/statsforecast/core.py:1616, in StatsForecast.cross_validation(self, h, df, n_windows, step_size, test_size, input_size, level, fitted, refit, sort_df, prediction_intervals, id_col, time_col, target_col)
   1598 def cross_validation(
   1599     self,
   1600     h: int,
   (...)
   1613     target_col: str = "y",
   1614 ):
   1615     if self._is_native(df=df):
-> 1616         return super().cross_validation(
   1617             h=h,
   1618             df=df,
   1619             n_windows=n_windows,
   1620             step_size=step_size,
   1621             test_size=test_size,
   1622             input_size=input_size,
   1623             level=level,
   1624             fitted=fitted,
   1625             refit=refit,
   1626             sort_df=sort_df,
   1627             prediction_intervals=prediction_intervals,
   1628             id_col=id_col,
   1629             time_col=time_col,
   1630             target_col=target_col,
   1631         )
   1632     assert df is not None
   1633     engine = make_execution_engine(infer_by=[df])

File ~/anaconda3/lib/python3.10/site-packages/statsforecast/core.py:1026, in _StatsForecast.cross_validation(self, h, df, n_windows, step_size, test_size, input_size, level, fitted, refit, sort_df, prediction_intervals, id_col, time_col, target_col)
   1012     res_fcsts = self.ga.cross_validation(
   1013         models=self.models,
   1014         h=h,
   (...)
   1023         target_col=target_col,
   1024     )
   1025 else:
-> 1026     res_fcsts = self._cross_validation_parallel(
   1027         h=h,

File ~/anaconda3/lib/python3.10/site-packages/statsforecast/core.py:1248, in <listcomp>(.0)
   1232 future = executor.apply_async(
   1233     ga.cross_validation,
   1234     (
   (...)
   1245     ),
   1246 )
   1247 futures.append(future)
-> 1248 out = [f.get() for f in futures]
   1249 fcsts = [d["forecasts"] for d in out]
   1250 fcsts = np.vstack(fcsts)

File ~/anaconda3/lib/python3.10/multiprocessing/pool.py:774, in ApplyResult.get(self, timeout)
    772     return self._value
    773 else:
--> 774     raise self._value

Exception: no model able to be fitted

What could have caused this error? Normally I get a tiny-dataset error, which I work around by adjusting h, n_windows, or the length filter, e.g.

# Filter the lengths DataFrame for lengths greater than 18
lengths_filtered = lengths.filter(pl.col('length') > 18)

together with h=3, step_size=1, n_windows=8, but I'm not sure how to fix this error.
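One thing worth checking first (a sketch, assuming statsforecast's expanding-window cross-validation reserves h + (n_windows - 1) * step_size trailing points for the test sets): with h=3, step_size=1, n_windows=8, the earliest window trains on only length - 10 points, so the shortest series kept by the length > 18 filter (19 points) leaves just 9 training points, fewer than season_length=12, which can leave a seasonal AutoCES with no fittable model.

```python
def min_train_length(series_length, h, step_size, n_windows):
    # Total points reserved for test sets across all CV windows
    test_size = h + step_size * (n_windows - 1)
    # Training points available to the earliest (shortest) window
    return series_length - test_size

shortest_kept = 19   # smallest length surviving the `length > 18` filter
season_length = 12

train_pts = min_train_length(shortest_kept, h=3, step_size=1, n_windows=8)
assert train_pts == 9
# Less than one full seasonal cycle of training data:
assert train_pts < season_length
```

If that is the cause, raising the filter so every series has at least season_length + h + (n_windows - 1) * step_size points (22 here), or lowering n_windows, should rule it out.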
<frozen importlib._bootstrap>:241: RuntimeWarning: scipy._lib.messagestream.MessageStream size changed, may indicate binary incompatibility. Expected 56 from C header, got 64 from PyObject
Cross Validation Time Series 1:   0%|          | 0/8 [00:00<?, ?it/s]
/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
  sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 1: 100%|██████████| 8/8 [00:41<00:00,  5.15s/it]
Cross Validation Time Series 2: 100%|██████████| 8/8 [00:02<00:00,  3.42it/s]
Cross Validation Time Series 3: 100%|██████████| 8/8 [00:05<00:00,  1.55it/s]
Cross Validation Time Series 4: 100%|██████████| 8/8 [00:01<00:00,  4.21it/s]
Cross Validation Time Series 5: 100%|██████████| 8/8 [00:05<00:00,  1.49it/s]
Cross Validation Time Series 6: 100%|██████████| 8/8 [00:02<00:00,  3.24it/s]
Cross Validation Time Series 7: 100%|██████████| 8/8 [00:02<00:00,  3.44it/s]
Cross Validation Time Series 8: 100%|██████████| 8/8 [00:11<00:00,  1.41s/it]
Cross Validation Time Series 9: 100%|██████████| 8/8 [00:02<00:00,  3.24it/s]
Cross Validation Time Series 10: 100%|██████████| 8/8 [00:09<00:00,  1.15s/it]
Cross Validation Time Series 11: 100%|██████████| 8/8 [00:05<00:00,  1.55it/s]
Cross Validation Time Series 12: 100%|██████████| 8/8 [00:04<00:00,  1.96it/s]
Cross Validation Time Series 13: 100%|██████████| 8/8 [00:01<00:00,  5.41it/s]
Cross Validation Time Series 14: 100%|██████████| 8/8 [00:02<00:00,  3.77it/s]
Cross Validation Time Series 15: 100%|██████████| 8/8 [00:03<00:00,  2.42it/s]
Cross Validation Time Series 16: 100%|██████████| 8/8 [00:02<00:00,  3.16it/s]
Cross Validation Time Series 17: 100%|██████████| 8/8 [00:02<00:00,  3.38it/s]
Cross Validation Time Series 18: 100%|██████████| 8/8 [00:00<00:00,  8.65it/s]
Cross Validation Time Series 19:  75%|███████▌  | 6/8 [00:01<00:00,  5.40it/s]
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 19: 100%|โโโโโโโโโโ| 8/8 [00:01<00:00, 5.22it/s]
Cross Validation Time Series 12: 100%|โโโโโโโโโโ| 8/8 [00:01<00:00, 5.58it/s]
Cross Validation Time Series 17: 100%|โโโโโโโโโโ| 8/8 [00:00<00:00, 13.27it/s]
Cross Validation Time Series 13: 100%|โโโโโโโโโโ| 8/8 [00:01<00:00, 5.15it/s]
Cross Validation Time Series 13: 62%|โโโโโโโ | 5/8 [00:01<00:00, 4.03it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 13: 75%|โโโโโโโโ | 6/8 [00:04<00:01, 1.28it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 20: 100%|โโโโโโโโโโ| 8/8 [00:01<00:00, 6.06it/s]
Cross Validation Time Series 18: 100%|โโโโโโโโโโ| 8/8 [00:01<00:00, 6.86it/s]
Cross Validation Time Series 19: 0%| | 0/8 [00:00<?, ?it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 13: 75%|โโโโโโโโ | 6/8 [00:01<00:00, 4.06it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 13: 100%|โโโโโโโโโโ| 8/8 [00:02<00:00, 3.94it/s]
Cross Validation Time Series 21: 100%|โโโโโโโโโโ| 8/8 [00:01<00:00, 7.25it/s]
Cross Validation Time Series 11: 100%|โโโโโโโโโโ| 8/8 [00:03<00:00, 2.05it/s]
Cross Validation Time Series 13: 100%|โโโโโโโโโโ| 8/8 [00:06<00:00, 1.27it/s]
Cross Validation Time Series 14: 0%| | 0/8 [00:00<?, ?it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 22: 100%|โโโโโโโโโโ| 8/8 [00:00<00:00, 11.46it/s]
Cross Validation Time Series 23: 0%| | 0/8 [00:00<?, ?it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 14: 100%|โโโโโโโโโโ| 8/8 [00:01<00:00, 4.24it/s]
Cross Validation Time Series 14: 100%|โโโโโโโโโโ| 8/8 [00:01<00:00, 4.63it/s]
Cross Validation Time Series 12: 100%|โโโโโโโโโโ| 8/8 [00:02<00:00, 3.72it/s]
Cross Validation Time Series 15: 12%|โโ | 1/8 [00:00<00:01, 4.95it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 15: 12%|โโ | 1/8 [00:01<00:07, 1.11s/it]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 23: 100%|โโโโโโโโโโ| 8/8 [00:02<00:00, 2.96it/s]
Cross Validation Time Series 13: 100%|โโโโโโโโโโ| 8/8 [00:01<00:00, 5.25it/s]
Cross Validation Time Series 14: 100%|โโโโโโโโโโ| 8/8 [00:06<00:00, 1.27it/s]
Cross Validation Time Series 19: 100%|โโโโโโโโโโ| 8/8 [00:06<00:00, 1.28it/s]
Cross Validation Time Series 15: 100%|โโโโโโโโโโ| 8/8 [00:03<00:00, 2.15it/s]
Cross Validation Time Series 20: 38%|โโโโ | 3/8 [00:01<00:01, 2.59it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 15: 100%|โโโโโโโโโโ| 8/8 [00:01<00:00, 4.03it/s]
Cross Validation Time Series 20: 50%|โโโโโ | 4/8 [00:01<00:01, 2.71it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 24: 62%|โโโโโโโ | 5/8 [00:03<00:02, 1.33it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 16: 100%|โโโโโโโโโโ| 8/8 [00:01<00:00, 5.86it/s]
Cross Validation Time Series 20: 100%|โโโโโโโโโโ| 8/8 [00:03<00:00, 2.46it/s]
Cross Validation Time Series 14: 100%|โโโโโโโโโโ| 8/8 [00:05<00:00, 1.57it/s]
Cross Validation Time Series 24: 100%|โโโโโโโโโโ| 8/8 [00:06<00:00, 1.20it/s]
Cross Validation Time Series 17: 100%|โโโโโโโโโโ| 8/8 [00:02<00:00, 2.79it/s]
Cross Validation Time Series 25: 62%|โโโโโโโ | 5/8 [00:01<00:00, 6.29it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 15: 100%|โโโโโโโโโโ| 8/8 [00:09<00:00, 1.22s/it]
Cross Validation Time Series 21: 100%|โโโโโโโโโโ| 8/8 [00:02<00:00, 2.77it/s]
Cross Validation Time Series 15: 100%|โโโโโโโโโโ| 8/8 [00:02<00:00, 3.29it/s]
Cross Validation Time Series 16: 12%|โโ | 1/8 [00:00<00:02, 2.78it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 25: 100%|โโโโโโโโโโ| 8/8 [00:01<00:00, 5.00it/s]
Cross Validation Time Series 16: 100%|โโโโโโโโโโ| 8/8 [00:02<00:00, 3.52it/s]
Cross Validation Time Series 16: 100%|โโโโโโโโโโ| 8/8 [00:02<00:00, 3.21it/s]
Cross Validation Time Series 17: 12%|โโ | 1/8 [00:00<00:02, 2.56it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 18: 100%|โโโโโโโโโโ| 8/8 [00:03<00:00, 2.09it/s]
Cross Validation Time Series 26: 100%|โโโโโโโโโโ| 8/8 [00:02<00:00, 3.50it/s]
Cross Validation Time Series 27: 12%|โโ | 1/8 [00:00<00:00, 7.63it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 19: 25%|โโโ | 2/8 [00:00<00:01, 4.18it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 17: 25%|โโโ | 2/8 [00:00<00:02, 2.77it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 27: 100%|โโโโโโโโโโ| 8/8 [00:01<00:00, 5.70it/s]
Cross Validation Time Series 22: 88%|โโโโโโโโโ | 7/8 [00:04<00:00, 1.93it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 19: 100%|โโโโโโโโโโ| 8/8 [00:01<00:00, 4.30it/s]
Cross Validation Time Series 22: 100%|โโโโโโโโโโ| 8/8 [00:04<00:00, 1.73it/s]
Cross Validation Time Series 28: 100%|โโโโโโโโโโ| 8/8 [00:00<00:00, 13.11it/s]
Cross Validation Time Series 29: 0%| | 0/8 [00:00<?, ?it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 17: 88%|โโโโโโโโโ | 7/8 [00:02<00:00, 3.34it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 23: 25%|โโโ | 2/8 [00:00<00:00, 6.11it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 17: 100%|โโโโโโโโโโ| 8/8 [00:02<00:00, 3.09it/s]
Cross Validation Time Series 18: 0%| | 0/8 [00:00<?, ?it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 20: 100%|โโโโโโโโโโ| 8/8 [00:00<00:00, 8.05it/s]
Cross Validation Time Series 16: 100%|โโโโโโโโโโ| 8/8 [00:10<00:00, 1.31s/it]
Cross Validation Time Series 29: 100%|โโโโโโโโโโ| 8/8 [00:01<00:00, 5.38it/s]
Cross Validation Time Series 23: 100%|โโโโโโโโโโ| 8/8 [00:01<00:00, 5.11it/s]
Cross Validation Time Series 24: 12%|โโ | 1/8 [00:00<00:01, 5.04it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 17: 100%|โโโโโโโโโโ| 8/8 [00:00<00:00, 10.52it/s]
Cross Validation Time Series 21: 100%|โโโโโโโโโโ| 8/8 [00:01<00:00, 7.21it/s]
Cross Validation Time Series 22: 0%| | 0/8 [00:00<?, ?it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 30: 100%|โโโโโโโโโโ| 8/8 [00:00<00:00, 15.02it/s]
Cross Validation Time Series 31: 0%| | 0/8 [00:00<?, ?it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 22: 25%|โโโ | 2/8 [00:00<00:00, 9.93it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 22: 75%|โโโโโโโโ | 6/8 [00:00<00:00, 10.07it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 22: 100%|โโโโโโโโโโ| 8/8 [00:00<00:00, 9.83it/s]
Cross Validation Time Series 24: 100%|โโโโโโโโโโ| 8/8 [00:01<00:00, 4.20it/s]
Cross Validation Time Series 23: 50%|โโโโโ | 4/8 [00:00<00:00, 5.83it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 18: 100%|โโโโโโโโโโ| 8/8 [00:01<00:00, 4.27it/s]
Cross Validation Time Series 19: 25%|โโโ | 2/8 [00:00<00:01, 4.96it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 31: 100%|โโโโโโโโโโ| 8/8 [00:02<00:00, 3.75it/s]
Cross Validation Time Series 32: 0%| | 0/8 [00:00<?, ?it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 25: 50%|โโโโโ | 4/8 [00:00<00:00, 4.34it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 23: 100%|โโโโโโโโโโ| 8/8 [00:01<00:00, 4.71it/s]
Cross Validation Time Series 25: 62%|โโโโโโโ | 5/8 [00:01<00:00, 3.74it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 17: 100%|โโโโโโโโโโ| 8/8 [00:07<00:00, 1.10it/s]
Cross Validation Time Series 18: 12%|โโ | 1/8 [00:00<00:01, 5.38it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 25: 100%|โโโโโโโโโโ| 8/8 [00:01<00:00, 4.09it/s]
Cross Validation Time Series 19: 100%|โโโโโโโโโโ| 8/8 [00:02<00:00, 3.92it/s]
Cross Validation Time Series 18: 62%|โโโโโโโ | 5/8 [00:00<00:00, 5.19it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 18: 100%|โโโโโโโโโโ| 8/8 [00:01<00:00, 5.18it/s]
Cross Validation Time Series 26: 100%|โโโโโโโโโโ| 8/8 [00:01<00:00, 6.61it/s]
Cross Validation Time Series 20: 100%|โโโโโโโโโโ| 8/8 [00:01<00:00, 4.56it/s]
Cross Validation Time Series 21: 0%| | 0/8 [00:00<?, ?it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 32: 38%|โโโโ | 3/8 [00:03<00:06, 1.28s/it]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 27: 75%|โโโโโโโโ | 6/8 [00:01<00:00, 3.63it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 27: 100%|โโโโโโโโโโ| 8/8 [00:02<00:00, 3.68it/s]
Cross Validation Time Series 19: 100%|โโโโโโโโโโ| 8/8 [00:02<00:00, 2.72it/s]
Cross Validation Time Series 21: 100%|โโโโโโโโโโ| 8/8 [00:02<00:00, 3.85it/s]
Cross Validation Time Series 24: 100%|โโโโโโโโโโ| 8/8 [00:07<00:00, 1.08it/s]
Cross Validation Time Series 20: 100%|โโโโโโโโโโ| 8/8 [00:02<00:00, 2.96it/s]
Cross Validation Time Series 22: 100%|โโโโโโโโโโ| 8/8 [00:02<00:00, 3.13it/s]
Cross Validation Time Series 28: 100%|โโโโโโโโโโ| 8/8 [00:03<00:00, 2.25it/s]
Cross Validation Time Series 25: 38%|โโโโ | 3/8 [00:00<00:00, 5.57it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 29: 12%|โโ | 1/8 [00:00<00:01, 5.94it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 21: 25%|โโโ | 2/8 [00:00<00:02, 2.32it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 21: 38%|โโโโ | 3/8 [00:01<00:03, 1.39it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 25: 100%|โโโโโโโโโโ| 8/8 [00:02<00:00, 3.34it/s]
Cross Validation Time Series 23: 88%|โโโโโโโโโ | 7/8 [00:02<00:00, 2.04it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 29: 88%|โโโโโโโโโ | 7/8 [00:02<00:00, 2.32it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 23: 100%|โโโโโโโโโโ| 8/8 [00:02<00:00, 2.71it/s]
Cross Validation Time Series 26: 50%|โโโโโ | 4/8 [00:00<00:00, 4.84it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 29: 100%|โโโโโโโโโโ| 8/8 [00:03<00:00, 2.64it/s]
Cross Validation Time Series 32: 100%|โโโโโโโโโโ| 8/8 [00:11<00:00, 1.47s/it]
Cross Validation Time Series 30: 38%|โโโโ | 3/8 [00:00<00:01, 4.86it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 26: 100%|โโโโโโโโโโ| 8/8 [00:02<00:00, 3.90it/s]
Cross Validation Time Series 27: 12%|โโ | 1/8 [00:00<00:01, 4.52it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 21: 100%|โโโโโโโโโโ| 8/8 [00:04<00:00, 1.71it/s]
Cross Validation Time Series 24: 100%|โโโโโโโโโโ| 8/8 [00:01<00:00, 4.61it/s]
Cross Validation Time Series 33: 88%|โโโโโโโโโ | 7/8 [00:01<00:00, 6.02it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 30: 100%|โโโโโโโโโโ| 8/8 [00:01<00:00, 4.32it/s]
Cross Validation Time Series 33: 100%|โโโโโโโโโโ| 8/8 [00:01<00:00, 5.97it/s]
Cross Validation Time Series 31: 12%|โโ | 1/8 [00:00<00:01, 5.96it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 25: 100%|โโโโโโโโโโ| 8/8 [00:00<00:00, 13.08it/s]
Cross Validation Time Series 27: 75%|โโโโโโโโ | 6/8 [00:01<00:00, 4.68it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 31: 50%|โโโโโ | 4/8 [00:00<00:00, 7.12it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 31: 62%|โโโโโโโ | 5/8 [00:00<00:00, 7.33it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 31: 75%|โโโโโโโโ | 6/8 [00:00<00:00, 6.55it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 22: 100%|โโโโโโโโโโ| 8/8 [00:01<00:00, 5.45it/s]
Cross Validation Time Series 31: 88%|โโโโโโโโโ | 7/8 [00:01<00:00, 6.55it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 34: 62%|โโโโโโโ | 5/8 [00:01<00:00, 5.11it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 26: 75%|โโโโโโโโ | 6/8 [00:00<00:00, 6.05it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 31: 100%|โโโโโโโโโโ| 8/8 [00:01<00:00, 6.60it/s]
Cross Validation Time Series 27: 100%|โโโโโโโโโโ| 8/8 [00:02<00:00, 3.83it/s]
Cross Validation Time Series 26: 100%|โโโโโโโโโโ| 8/8 [00:01<00:00, 5.60it/s]
Cross Validation Time Series 18: 100%|โโโโโโโโโโ| 8/8 [00:18<00:00, 2.31s/it]
Cross Validation Time Series 34: 100%|โโโโโโโโโโ| 8/8 [00:01<00:00, 4.65it/s]
Cross Validation Time Series 32: 62%|โโโโโโโ | 5/8 [00:00<00:00, 7.29it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 32: 75%|โโโโโโโโ | 6/8 [00:00<00:00, 6.63it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 27: 100%|โโโโโโโโโโ| 8/8 [00:00<00:00, 14.27it/s]
Cross Validation Time Series 32: 100%|โโโโโโโโโโ| 8/8 [00:01<00:00, 7.09it/s]
Cross Validation Time Series 28: 50%|โโโโโ | 4/8 [00:00<00:00, 14.21it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 35: 50%|โโโโโ | 4/8 [00:00<00:00, 4.62it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 28: 100%|โโโโโโโโโโ| 8/8 [00:00<00:00, 12.21it/s]
Cross Validation Time Series 28: 100%|โโโโโโโโโโ| 8/8 [00:01<00:00, 4.01it/s]
Cross Validation Time Series 35: 75%|โโโโโโโโ | 6/8 [00:01<00:00, 2.66it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 29: 100%|โโโโโโโโโโ| 8/8 [00:00<00:00, 9.08it/s]
Cross Validation Time Series 29: 25%|โโโ | 2/8 [00:00<00:01, 3.06it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 35: 88%|โโโโโโโโโ | 7/8 [00:02<00:00, 2.76it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 33: 100%|โโโโโโโโโโ| 8/8 [00:01<00:00, 4.51it/s]
Cross Validation Time Series 34: 25%|โโโ | 2/8 [00:00<00:00, 16.13it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 35: 100%|โโโโโโโโโโ| 8/8 [00:02<00:00, 3.04it/s]
Cross Validation Time Series 29: 50%|โโโโโ | 4/8 [00:01<00:01, 3.17it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 29: 62%|โโโโโโโ | 5/8 [00:01<00:00, 3.04it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 34: 100%|โโโโโโโโโโ| 8/8 [00:00<00:00, 9.43it/s]
Cross Validation Time Series 35: 38%|โโโโ | 3/8 [00:00<00:00, 6.26it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 19: 100%|โโโโโโโโโโ| 8/8 [00:04<00:00, 1.99it/s]
Cross Validation Time Series 23: 100%|โโโโโโโโโโ| 8/8 [00:05<00:00, 1.58it/s]
Cross Validation Time Series 24: 0%| | 0/8 [00:00<?, ?it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 30: 100%|โโโโโโโโโโ| 8/8 [00:02<00:00, 3.50it/s]
Cross Validation Time Series 24: 25%|โโโ | 2/8 [00:00<00:00, 11.19it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 35: 88%|โโโโโโโโโ | 7/8 [00:01<00:00, 4.99it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 20: 50%|โโโโโ | 4/8 [00:00<00:00, 5.10it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 35: 100%|โโโโโโโโโโ| 8/8 [00:01<00:00, 5.07it/s]
Cross Validation Time Series 36: 0%| | 0/8 [00:00<?, ?it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 31: 100%|โโโโโโโโโโ| 8/8 [00:00<00:00, 14.28it/s]
Cross Validation Time Series 24: 62%|โโโโโโโ | 5/8 [00:00<00:00, 7.77it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 29: 100%|โโโโโโโโโโ| 8/8 [00:03<00:00, 2.27it/s]
Cross Validation Time Series 24: 100%|โโโโโโโโโโ| 8/8 [00:01<00:00, 6.65it/s]
Cross Validation Time Series 32: 100%|โโโโโโโโโโ| 8/8 [00:00<00:00, 9.83it/s]
Cross Validation Time Series 20: 100%|โโโโโโโโโโ| 8/8 [00:01<00:00, 4.87it/s]
Cross Validation Time Series 25: 100%|โโโโโโโโโโ| 8/8 [00:02<00:00, 3.76it/s]
Cross Validation Time Series 21: 100%|โโโโโโโโโโ| 8/8 [00:03<00:00, 2.41it/s]
Cross Validation Time Series 36: 100%|โโโโโโโโโโ| 8/8 [00:06<00:00, 1.15it/s]
Cross Validation Time Series 26: 100%|โโโโโโโโโโ| 8/8 [00:02<00:00, 2.80it/s]
Cross Validation Time Series 36: 100%|โโโโโโโโโโ| 8/8 [00:05<00:00, 1.37it/s]
Cross Validation Time Series 33: 100%|โโโโโโโโโโ| 8/8 [00:06<00:00, 1.31it/s]
Cross Validation Time Series 34: 0%| | 0/8 [00:00<?, ?it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 22: 100%|โโโโโโโโโโ| 8/8 [00:03<00:00, 2.40it/s]
Cross Validation Time Series 27: 100%|โโโโโโโโโโ| 8/8 [00:01<00:00, 4.03it/s]
Cross Validation Time Series 37: 100%|โโโโโโโโโโ| 8/8 [00:02<00:00, 2.87it/s]
Cross Validation Time Series 38: 75%|โโโโโโโโ | 6/8 [00:00<00:00, 11.22it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 30: 100%|██████████| 8/8 [00:08<00:00, 1.01s/it]
Cross Validation Time Series 34: 100%|██████████| 8/8 [00:01<00:00, 5.76it/s]
Cross Validation Time Series 38: 100%|██████████| 8/8 [00:00<00:00, 10.38it/s]
Cross Validation Time Series 37: 100%|██████████| 8/8 [00:03<00:00, 2.51it/s]
Cross Validation Time Series 23: 100%|██████████| 8/8 [00:01<00:00, 4.54it/s]
Cross Validation Time Series 35: 100%|██████████| 8/8 [00:01<00:00, 6.21it/s]
Cross Validation Time Series 39: 25%|██▌       | 2/8 [00:01<00:04, 1.45it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 28: 100%|██████████| 8/8 [00:02<00:00, 3.38it/s]
Cross Validation Time Series 38: 75%|███████▌  | 6/8 [00:01<00:00, 3.44it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 36: 100%|██████████| 8/8 [00:01<00:00, 7.80it/s]
Cross Validation Time Series 38: 100%|██████████| 8/8 [00:02<00:00, 3.10it/s]
Cross Validation Time Series 29: 100%|██████████| 8/8 [00:02<00:00, 3.94it/s]
Cross Validation Time Series 30: 0%|          | 0/8 [00:00<?, ?it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 31: 62%|██████▎   | 5/8 [00:04<00:02, 1.29it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 37: 100%|██████████| 8/8 [00:01<00:00, 4.07it/s]
Cross Validation Time Series 30: 75%|███████▌  | 6/8 [00:00<00:00, 6.62it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 30: 100%|██████████| 8/8 [00:01<00:00, 7.02it/s]
Cross Validation Time Series 31: 0%|          | 0/8 [00:00<?, ?it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 38: 75%|███████▌  | 6/8 [00:00<00:00, 7.40it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 39: 100%|██████████| 8/8 [00:02<00:00, 4.00it/s]
Cross Validation Time Series 31: 88%|████████▊ | 7/8 [00:05<00:00, 1.46it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 38: 100%|██████████| 8/8 [00:01<00:00, 6.76it/s]
Cross Validation Time Series 39: 0%|          | 0/8 [00:00<?, ?it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 31: 100%|██████████| 8/8 [00:05<00:00, 1.35it/s]
Cross Validation Time Series 32: 0%|          | 0/8 [00:00<?, ?it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 39: 38%|███▊      | 3/8 [00:00<00:00, 6.11it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 24: 38%|███▊      | 3/8 [00:05<00:07, 1.56s/it]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 40: 100%|██████████| 8/8 [00:01<00:00, 5.50it/s]
Cross Validation Time Series 39: 100%|██████████| 8/8 [00:01<00:00, 5.55it/s]
Cross Validation Time Series 32: 100%|██████████| 8/8 [00:01<00:00, 6.31it/s]
Cross Validation Time Series 41: 25%|██▌       | 2/8 [00:00<00:01, 4.37it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 40: 50%|█████     | 4/8 [00:00<00:00, 7.69it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 33: 25%|██▌       | 2/8 [00:00<00:01, 3.54it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 40: 88%|████████▊ | 7/8 [00:01<00:00, 5.72it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 40: 100%|██████████| 8/8 [00:01<00:00, 6.02it/s]
Cross Validation Time Series 24: 75%|███████▌  | 6/8 [00:07<00:01, 1.03it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 39: 100%|██████████| 8/8 [00:08<00:00, 1.05s/it]
Cross Validation Time Series 41: 100%|██████████| 8/8 [00:02<00:00, 3.98it/s]
Cross Validation Time Series 41: 25%|██▌       | 2/8 [00:00<00:01, 3.45it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 33: 100%|██████████| 8/8 [00:02<00:00, 3.67it/s]
Cross Validation Time Series 24: 100%|██████████| 8/8 [00:09<00:00, 1.23s/it]
Cross Validation Time Series 41: 100%|██████████| 8/8 [00:03<00:00, 2.29it/s]
Cross Validation Time Series 42: 100%|██████████| 8/8 [00:03<00:00, 2.40it/s]
Cross Validation Time Series 42: 25%|██▌       | 2/8 [00:00<00:01, 4.42it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 25: 38%|███▊      | 3/8 [00:01<00:02, 1.78it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 43: 25%|██▌       | 2/8 [00:00<00:02, 2.39it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 42: 100%|██████████| 8/8 [00:01<00:00, 4.01it/s]
Cross Validation Time Series 25: 75%|███████▌  | 6/8 [00:03<00:00, 2.30it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 31: 100%|██████████| 8/8 [00:09<00:00, 1.14s/it]
Cross Validation Time Series 34: 100%|██████████| 8/8 [00:04<00:00, 1.64it/s]
Cross Validation Time Series 43: 75%|███████▌  | 6/8 [00:02<00:00, 2.68it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)
Cross Validation Time Series 43: 62%|██████▎   | 5/8 [00:00<00:00, 6.66it/s]/Users/tungnguyen/anaconda3/lib/python3.10/site-packages/statsforecast/ets.py:1038: RuntimeWarning: divide by zero encountered in double_scalars
sigma2 = np.sum(e**2) / (ny - np_ - 1)

What are ny and np_ here, and why does (ny - np_ - 1) cause a divide by zero?
|
dee9cd3c4c6b8ddaacdc9ce6540e07ec
|
{
"intermediate": 0.32800722122192383,
"beginner": 0.3323187232017517,
"expert": 0.33967408537864685
}
|
40,235
|
Hi
|
e08cc890dc56baf1eeba784bcb61b0f5
|
{
"intermediate": 0.33010533452033997,
"beginner": 0.26984941959381104,
"expert": 0.400045245885849
}
|
40,236
|
AI stems audio AI
write me python program to extract audio files to drums, guitar, vocals, etc in high quality lossless
|
acf7498aa2773a667c50bfa44d199adb
|
{
"intermediate": 0.16278554499149323,
"beginner": 0.06499689072370529,
"expert": 0.7722175717353821
}
|
40,237
|
Hi, I got a problem in my Unity project. NullReferenceException: Object reference not set to an instance of an object
FPSController.HandleMovement () (at Assets/Scripts/FPSController.cs:42)
FPSController.Update () (at Assets/Scripts/FPSController.cs:35)
Here is the code snippet it is referring to:
public class FPSController : MonoBehaviour
{
    [Header("Movement Speeds")]
    [SerializeField] private float _walkSpeed = 3.0f;
    [SerializeField] private float _sprintMultiplier = 2.0f;

    [Header("Jump Parameters")]
    [SerializeField] private float _jumpForce = 5.0f;
    [SerializeField] private float _gravity = 9.81f;

    [Header("Camera Parameters")]
    [SerializeField] private float _mouseSensitivity = 5.0f;
    [SerializeField] private float _upDownClampRange = 89f;

    private CharacterController characterController;
    private Camera mainCamera;
    private PlayerInputHandler inputHandler;
    private Vector3 currentMovement;
    private float verticalRotation;

    private void Awake()
    {
        characterController = GetComponent<CharacterController>();
        mainCamera = Camera.main;
        inputHandler = PlayerInputHandler.Instance;
    }

    private void Update()
    {
        HandleMovement();
        HandleRotation();
        HandleJumping();
    }

    void HandleMovement()
    {
        float speed = _walkSpeed * (inputHandler.SprintValue > 0 ? _sprintMultiplier : 1f);
        Vector3 inputDirection = new Vector3(inputHandler.MoveInput.x, 0f, inputHandler.MoveInput.y);
        Vector3 worldDirection = transform.TransformDirection(inputDirection);
        worldDirection.Normalize();
        currentMovement.x = worldDirection.x * speed;
        currentMovement.x = worldDirection.z * speed;
        characterController.Move(currentMovement * Time.deltaTime);
    }

    void HandleJumping()
    {
        if (characterController.isGrounded)
        {
            currentMovement.y = -0.5f;
            if (inputHandler.JumpTriggered)
            {
                currentMovement.y = _jumpForce;
            }
        }
        else
        {
            currentMovement.y -= _gravity * Time.deltaTime;
        }
    }

    void HandleRotation()
    {
        float mouseXRotation = inputHandler.LookInput.x * _mouseSensitivity;
        transform.Rotate(0f, mouseXRotation, 0f);
        verticalRotation -= inputHandler.LookInput.y * _mouseSensitivity;
        verticalRotation = Mathf.Clamp(verticalRotation, -_upDownClampRange, _upDownClampRange);
        mainCamera.transform.localRotation = Quaternion.Euler(verticalRotation, 0f, 0f);
    }
}
|
ae13473efa8a68511e30e3bb7848488c
|
{
"intermediate": 0.4108164310455322,
"beginner": 0.43335631489753723,
"expert": 0.15582719445228577
}
|
40,238
|
Traceback (most recent call last):
  File "D:\Рабочие файлы и программы\Хлебников_Movie_Code\MoviePyCodesNew.zip_expanded\projectMovie\loadusers.py", line 1, in <module>
    from py2neo import Graph, Node
ModuleNotFoundError: No module named 'py2neo'
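The error only means the package is not installed in the interpreter that ran the script (the usual fix is `pip install py2neo` from that same environment). As a small standard-library sketch, you can check whether a module is importable before attempting the import:

```python
import importlib.util

def module_available(name: str) -> bool:
    """Return True if `name` can be imported in the current environment."""
    return importlib.util.find_spec(name) is not None

# 'json' ships with Python, so this is always True; 'py2neo' becomes True
# only after installing it into the same interpreter that runs the script.
print(module_available("json"))    # True
print(module_available("py2neo"))
```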
|
39381df4bbd64951cec6984042e3d633
|
{
"intermediate": 0.45001861453056335,
"beginner": 0.2639007866382599,
"expert": 0.28608065843582153
}
|
40,239
|
Please convert this Batch file to linux shell file:
@echo off
python -m venv venv
call venv\Scripts\activate.bat (for linux it is just venv\Scripts\activate. Remember chmod +x)
pip install flask
pip install youtube-search-python
pip install pytube
pip install moviepy
pip install pydub
python server.py
pause
|
ad8197a57d94d70451797054c5ce8ec5
|
{
"intermediate": 0.32988032698631287,
"beginner": 0.30797824263572693,
"expert": 0.3621414005756378
}
|
40,240
|
Write a python function which sharpens an image. Try to find the associated kernel for sharpening images.
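A commonly used sharpening kernel is the 3×3 matrix [[0, -1, 0], [-1, 5, -1], [0, -1, 0]] (the identity plus a Laplacian edge term; its weights sum to 1, so flat regions are unchanged). A minimal pure-Python sketch applying it at a single pixel of a 2D list image:

```python
SHARPEN_KERNEL = [
    [ 0, -1,  0],
    [-1,  5, -1],
    [ 0, -1,  0],
]

def sharpen_pixel(img, r, c, kernel=SHARPEN_KERNEL):
    """Apply the 3x3 kernel centred on pixel (r, c) of a 2D list image."""
    total = 0
    for i in range(3):
        for j in range(3):
            total += kernel[i][j] * img[r + i - 1][c + j - 1]
    return total

# On a flat region the output equals the input (weights sum to 1)...
flat = [[10] * 3 for _ in range(3)]
print(sharpen_pixel(flat, 1, 1))  # 10
# ...while a bright pixel against a dark background is amplified.
spike = [[0, 0, 0], [0, 10, 0], [0, 0, 0]]
print(sharpen_pixel(spike, 1, 1))  # 50
```

For a full image you would slide this over every interior pixel and clamp results to the valid intensity range.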
|
0258767c7f7314d4c9d36bac1c74186e
|
{
"intermediate": 0.28332969546318054,
"beginner": 0.21367892622947693,
"expert": 0.5029913187026978
}
|
40,241
|
Write a python function which implements the convolution operation on an image given the kernel.

def 2D_convolution(img, kernel):
    # Write your code here

code compatible with jupyter notebook
|
c4dce6e297a47ae21033b9810a14278b
|
{
"intermediate": 0.3138531446456909,
"beginner": 0.24578097462654114,
"expert": 0.44036588072776794
}
|
40,242
|
Using after effects expressions I would like to select each word in a sentence of a text layer using Animator. I would like to control it using Controller Slider.
|
a5b011b28b5579340b73d5fcf8ec029f
|
{
"intermediate": 0.40825361013412476,
"beginner": 0.26353153586387634,
"expert": 0.3282148838043213
}
|
40,243
|
I have a PC. I need to make it only run one app: Skype. I'm thinking of using linux as a base, as I don't need any functionality nor a desktop interface, just the Skype app. How can I achieve this?
|
00d76a2a1ad3dc72ab63dec9e30d568f
|
{
"intermediate": 0.30434152483940125,
"beginner": 0.317080020904541,
"expert": 0.37857842445373535
}
|
40,244
|
import os
from discord.ext import commands
import discord
from discord import app_commands

json_files_folder = 'comfyUI-workflows'

class Bot(commands.Bot):
    def __init__(self):
        intents = discord.Intents.default()
        intents.message_content = True
        super().__init__(command_prefix='!', intents=intents, proxy="http://127.0.0.1:7890")

    async def setup_hook(self):
        await self.tree.sync(guild=discord.Object(id=1200468352464863382))
        print(f"Synced slash command for {self.user}")
        await self.setup_commands()

    async def on_command_error(self, ctx, error):
        await ctx.reply(error, ephemeral=True)

    async def setup_commands(self):
        # Iterate over all sub-folder names inside the folder
        folder_names = [f for f in os.listdir(json_files_folder) if os.path.isdir(os.path.join(json_files_folder, f))]
        for folder_name in folder_names:
            group = commands.HybridGroup(name=folder_name)
            # Iterate over the json files in this sub-folder
            json_files = [f for f in os.listdir(os.path.join(json_files_folder, folder_name)) if f.endswith('.json')]
            for json_file in json_files:
                command_name = os.path.splitext(json_file)[0]

                @group.hybrid_command(name=command_name)
                async def dynamic_command(ctx):
                    # Write the command-handling logic here
                    await ctx.send(f'Executed dynamic command: {command_name}')

                setattr(Bot, command_name, commands.hybrid_command(name=command_name, with_app_command=True, description="testing")(dynamic_command))

        # Handle json files that are not inside a sub-folder
        other_json_files = [f for f in os.listdir(json_files_folder) if f.endswith('.json') and not os.path.isdir(os.path.join(json_files_folder, f))]
        for json_file in other_json_files:
            command_name = os.path.splitext(json_file)[0]

            async def dynamic_command(ctx: commands.Context) -> None:
                await ctx.defer(ephemeral=True)
                await ctx.reply("hi")

            setattr(Bot, command_name, commands.hybrid_command(name=command_name, with_app_command=True, description="testing")(dynamic_command))

bot = Bot()

# @bot.hybrid_command(name="test", with_app)
# @bot.hybrid_command(name="test", with_app_command=True, description="testing")
# @app_commands.guilds(discord.Object(id=1200468352464863382))
# async def test(ctx: commands.Context) -> None:
#     await ctx.defer(ephemeral=True)
#     await ctx.reply("hi")

bot.run('MTIwMDQ1Njk0MTI2OTQyNjE4Nw.GpkQ4X.m_ga-5IVgwVSNE4crl7H-lTITLsS0V1-v5yi3w')

I want to generate hybrid_command dynamically. What is wrong with this code? Please help me fix it.
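One concrete problem in loops like the ones above is Python's late-binding closures: every dynamic_command defined in the loop reads command_name at call time, so all of them see the last value. This pure-Python sketch (no discord.py required; the function names are illustrative) demonstrates the bug and the usual default-argument fix:

```python
def make_commands_buggy(names):
    cmds = []
    for name in names:
        def cmd():
            return f"executed {name}"  # late-bound: reads `name` at call time
        cmds.append(cmd)
    return cmds

def make_commands_fixed(names):
    cmds = []
    for name in names:
        def cmd(name=name):  # default argument binds the current value now
            return f"executed {name}"
        cmds.append(cmd)
    return cmds

buggy = make_commands_buggy(["a", "b"])
fixed = make_commands_fixed(["a", "b"])
print([c() for c in buggy])  # ['executed b', 'executed b']
print([c() for c in fixed])  # ['executed a', 'executed b']
```

The same fix applies inside setup_commands: pass command_name into the coroutine (or a factory function) instead of closing over the loop variable.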
|
88a7d5977d4127bb1b0a529921863ba7
|
{
"intermediate": 0.25545668601989746,
"beginner": 0.6421927809715271,
"expert": 0.10235056281089783
}
|
40,245
|
I have a PC. I need to make it only run one app: Skype. I'm thinking of using linux as a base, as I don't need any functionality nor a desktop interface, just the Skype app. How can I achieve this? I want to achieve all this on a system without a graphical interface. What OS do you recommend for this (that has only very little system requirements, and command line only, with Skype running graphically)?
|
66202d9a593af7a00885488057464e7a
|
{
"intermediate": 0.30265429615974426,
"beginner": 0.3646959066390991,
"expert": 0.33264973759651184
}
|
40,246
|
I have a PC. I need to make it only run one app: Skype. I'm thinking of using linux as a base, as I don't need any functionality nor a desktop interface, just the Skype app. How can I achieve this? I want to achieve all this on a system without a graphical interface. What OS do you recommend for this (that has only very little system requirements, and command line only, with Skype running graphically)? Or maybe if it would be better, some kind of kiosk mode?
|
ec5dbfb42c145b6f5057f5d8c353db0e
|
{
"intermediate": 0.292106956243515,
"beginner": 0.3682247996330261,
"expert": 0.33966824412345886
}
|
40,247
|
def convolve_polynomial(poly1, poly2):
    # The size of the resultant polynomial will be
    # the sum of the sizes of the input polynomials minus 1
    result_size = len(poly1) + len(poly2) - 1
    # Initialize the result polynomial with zeros
    result_poly = [0] * result_size
    # Perform the convolution operation
    for i in range(len(poly1)):
        for j in range(len(poly2)):
            result_poly[i + j] += poly1[i] * poly2[j]
    return result_poly
Write a python function which implements the convolution operation on an image given the kernel. Please refer to this [video](https://www.youtube.com/watch?v=8rrHTtUzyZA) by Grant Sanderson to learn about 2D convolutions and how it is connected to the context of blurring images.
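Following the polynomial version above, the 2D analogue slides the kernel over every pixel position and accumulates products. A minimal pure-Python sketch in 'valid' mode (no padding; the kernel is not flipped, so strictly this is a cross-correlation, which is what most image libraries compute anyway):

```python
def convolve2d(img, kernel):
    """'Valid' 2D convolution of a 2D list `img` with a 2D list `kernel`."""
    kh, kw = len(kernel), len(kernel[0])
    oh, ow = len(img) - kh + 1, len(img[0]) - kw + 1
    out = [[0] * ow for _ in range(oh)]
    for r in range(oh):
        for c in range(ow):
            acc = 0
            for i in range(kh):
                for j in range(kw):
                    acc += img[r + i][c + j] * kernel[i][j]
            out[r][c] = acc
    return out

img = [[1, 2, 3],
       [4, 5, 6],
       [7, 8, 9]]
box = [[1, 1], [1, 1]]  # unnormalised 2x2 box filter (blur without the 1/4)
print(convolve2d(img, box))  # [[12, 16], [24, 28]]
```

Dividing the box kernel by its element count turns this into the mean-blur described in the video.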
|
63252010991e7932673f66cf6dad2ad1
|
{
"intermediate": 0.2739759683609009,
"beginner": 0.19089162349700928,
"expert": 0.5351323485374451
}
|
40,248
|
Imagine a Cartesian coordinate plane and a square of side length 2 centered at the origin.
The square will have vertices at (-1, -1), (-1, 1), (1, 1), and (1, -1). Inside the square is a circle
of radius 1 unit centered at the origin, as shown below.
Figure 1: Monte-Carlo Simulation
Now, we randomly generate a large number of points (x, y) within the square. The x and y
coordinates of these points should be between -1 and 1. For each generated point, we check
if the point falls within the circle (x² + y² ≤ 1) and keep a count of the no. of points falling
inside the circle and the total no. of random points generated.
Now, P(random point lies in the circle) = (Area of the circle) / (Area of the square) = π/4. We empirically estimate
this probability by the random sampling done above. Therefore, π = 4 × (Area of the circle) / (Area of the square).
Update: Use Python to simulate the above process and compare the π value obtained
using the pseudo-random number generator from the previous subtask and the random.random()
function from the random module in Python.
Note: You must import the pseudo_rand_num_gen(seed, k) function from the part (a) .py file,
instead of copy-pasting the function code.
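The ratio described above can be simulated directly with the standard library's random module (pseudo_rand_num_gen is the task's own part (a) function, so it is only referenced, not re-implemented here). A minimal sketch:

```python
import random

def estimate_pi(n_points, seed=0):
    """Monte-Carlo estimate of pi: the fraction of uniform random points in
    [-1, 1] x [-1, 1] that fall inside the unit circle, multiplied by 4."""
    rng = random.Random(seed)  # seeded for reproducibility
    inside = 0
    for _ in range(n_points):
        x = rng.uniform(-1.0, 1.0)
        y = rng.uniform(-1.0, 1.0)
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / n_points

print(estimate_pi(100_000))  # roughly 3.14 for a sample this large
```

Swapping rng.uniform for values drawn from pseudo_rand_num_gen(seed, k), rescaled to [-1, 1], gives the comparison the subtask asks for.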
|
5480783b5b72a52acf657b1cfc29ab98
|
{
"intermediate": 0.3849082887172699,
"beginner": 0.2665771245956421,
"expert": 0.3485146760940552
}
|
40,249
|
In Ubuntu Linux, how can I start an app in kiosk mode from the terminal (for example, Skype)?
|
dd5a4e91141d108bda6c76347e1c016e
|
{
"intermediate": 0.5480345487594604,
"beginner": 0.2133040577173233,
"expert": 0.23866139352321625
}
|
40,250
|
Tell me the differences between Python and Scratch, in points.
|
de03aec72bb4c03ec4a4d5ca77e4bc5d
|
{
"intermediate": 0.4497406780719757,
"beginner": 0.20001740753650665,
"expert": 0.35024186968803406
}
|
40,251
|
Can you analyse the Bitcoin market
|
e8ae99c5ccbc4e7ce653d4d904104793
|
{
"intermediate": 0.35270634293556213,
"beginner": 0.2720164358615875,
"expert": 0.37527719140052795
}
|
40,252
|
Give me Python code to scrape TikTok data
|
7e095c4d24864b580728a1c22d1411f5
|
{
"intermediate": 0.4307386875152588,
"beginner": 0.19375041127204895,
"expert": 0.37551090121269226
}
|
40,253
|
Hi
|
644b5e8383aa79f2bb1df6629ae0b350
|
{
"intermediate": 0.33010533452033997,
"beginner": 0.26984941959381104,
"expert": 0.400045245885849
}
|
40,254
|
I had a nightmare where I was in my school, but things were "off". But as I continue, the layout changes. Soon after I find one of my teachers, who apparently has moved into a new building, who leads me into the basement. The basement apparently leads to a long hallway; nothing's off about it, besides the fact that there are no lights. Soon after, I need to use the bathroom, and I go to the rooms which apparently have bathrooms. Inside are people who were probably just joking around, but I don't really know. As I continue looking for a bathroom, they're all occupied. I try to go to one down the hall; when I turn around, the room completely disappears. The layout then suddenly changes to a house, and everyone disappears. I then see a door that looks like it leads to a crawlspace and was open partially, and beyond that it was like a void. I was like, no, I'm not going in there, and I go to leave the house. The front door was wide open, and outside I was just standing outside this massive house. I then heard something and woke up.
|
fecc9915d1cd83f9349a17b005179c25
|
{
"intermediate": 0.31356656551361084,
"beginner": 0.3174292743206024,
"expert": 0.36900418996810913
}
|
40,255
|
How can I make it so my Python code checks for an XPath and, if it finds the element, clicks it, and if it doesn't find that element, it just continues?
|
dc0529d3e943dddf861470f70976f261
|
{
"intermediate": 0.6303231120109558,
"beginner": 0.06724792718887329,
"expert": 0.3024290204048157
}
|
40,256
|
Write a Python code snippet that checks a certain website (that is already opened in the webdriver) for the XPath of an element. As long as it's present, the code should wait, and after it's gone, the code should continue.
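In Selenium the idiomatic tool for this is WebDriverWait with expected_conditions.invisibility_of_element_located (names from Selenium's documented API; the driver setup around it is assumed). The underlying polling logic, shown here against a plain callable so the sketch runs standalone without a browser:

```python
import time

def wait_until_gone(is_present, timeout=10.0, poll=0.5):
    """Block while is_present() returns True; return once the element is gone.
    Raises TimeoutError if it is still present after `timeout` seconds."""
    deadline = time.monotonic() + timeout
    while is_present():
        if time.monotonic() >= deadline:
            raise TimeoutError("element still present")
        time.sleep(poll)

# Stand-in for checking driver.find_elements(By.XPATH, ...): the fake
# element is "present" for the first three polls, then disappears.
state = {"checks": 0}
def fake_presence():
    state["checks"] += 1
    return state["checks"] <= 3

wait_until_gone(fake_presence, timeout=5.0, poll=0.01)
print(state["checks"])  # 4: three 'present' polls, then one 'gone'
```

With a real driver, is_present would be something like `lambda: len(driver.find_elements(By.XPATH, xpath)) > 0`, or you would let WebDriverWait do the whole loop for you.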
|
a3479945da0127864b5d9ead25319e91
|
{
"intermediate": 0.6138734817504883,
"beginner": 0.1546512097120285,
"expert": 0.23147527873516083
}
|