| row_id (int64, 0–48.4k) | init_message (string, lengths 1–342k) | conversation_hash (string, length 32) | scores (dict) |
|---|---|---|---|
42,368
|
Create a Roblox script where a tool will throw any part
|
3127aed18952787b8d3c77bf7f004730
|
{
"intermediate": 0.3212747871875763,
"beginner": 0.37279248237609863,
"expert": 0.3059327304363251
}
|
42,369
|
(java) Create an MCQ for each essential knowledge statement that requires knowledge of the previous essential knowledge statement to solve the question. Use these names in the questions, as they are my friends and want to wish me, Srinath, a happy birthday! Names: Armaan, Ganesh, Srinath, and Daksh.
VAR-2.A.1
The use of array objects allows multiple related
items to be represented using a single variable.
VAR-2.A.2
The size of an array is established at the time of
creation and cannot be changed.
VAR-2.A.3
Arrays can store either primitive data or object
reference data.
VAR-2.A.4
When an array is created using the keyword
new, all of its elements are initialized with a
specific value based on the type of elements:
§ Elements of type int are initialized to 0
§ Elements of type double are initialized to 0.0
§ Elements of type boolean are initialized
to false
§ Elements of a reference type are initialized
to the reference value null. No objects are
automatically created
VAR-2.A.5
Initializer lists can be used to create and
initialize arrays.
VAR-2.A.6
Square brackets ([ ]) are used to access and
modify an element in a 1D array using an index.
VAR-2.A.7
The valid index values for an array are
0 through one less than the number of
elements in the array, inclusive. Using an index
value outside of this range will result in an
ArrayIndexOutOfBoundsException
being thrown.
VAR-2.B.1
Iteration statements can be used to access
all the elements in an array. This is called
traversing the array.
VAR-2.B.2
Traversing an array with an indexed for
loop or while loop requires elements to be
accessed using their indices.
VAR-2.B.3
Since the indices for an array start at
0 and end at the number of elements
− 1, “off by one” errors are easy to make
when traversing an array, resulting in an
ArrayIndexOutOfBoundsException
being thrown.
VAR-2.C.1
An enhanced for loop header includes a
variable, referred to as the enhanced for
loop variable.
VAR-2.C.2
For each iteration of the enhanced for loop,
the enhanced for loop variable is assigned a
copy of an element without using its index.
VAR-2.C.3
Assigning a new value to the enhanced for
loop variable does not change the value stored
in the array.
VAR-2.C.4
Program code written using an enhanced for
loop to traverse and access elements in an
array can be rewritten using an indexed for
loop or a while loop.
CON-2.I.1
There are standard algorithms that utilize array
traversals to:
§ Determine a minimum or maximum value
§ Compute a sum, average, or mode
§ Determine if at least one element has a
particular property
§ Determine if all elements have a particular
property
§ Access all consecutive pairs of elements
§ Determine the presence or absence of
duplicate elements
§ Determine the number of elements meeting
specific criteria
CON-2.I.2
There are standard array algorithms that utilize
traversals to:
§ Shift or rotate elements left or right
§ Reverse the order of the elements
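Two of the CON-2.I traversal algorithms listed above can be sketched directly; a minimal Java example (class, method, and variable names are illustrative, not from the prompt):

```java
// Sketch of two standard array-traversal algorithms from CON-2.I.
public class Traversals {
    public static void main(String[] args) {
        int[] vals = {3, 9, 4, 1};

        // CON-2.I.1: determine a maximum value.
        int max = vals[0];
        for (int v : vals) {
            if (v > max) { max = v; }
        }

        // CON-2.I.2: reverse the order of the elements in place.
        for (int i = 0; i < vals.length / 2; i++) {
            int tmp = vals[i];
            vals[i] = vals[vals.length - 1 - i];
            vals[vals.length - 1 - i] = tmp;
        }

        System.out.println(max);     // prints 9
        System.out.println(vals[0]); // prints 1 (the former last element)
    }
}
```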
You need to test these skills in your questions:
1.A Determine an
appropriate program
design to solve a
problem or accomplish
a task (not assessed).
1.B Determine code
that would be used
to complete code
segments.
1.C Determine code
that would be used to
interact with completed
program code.
2.A Apply the
meaning of specific
operators.
2.B Determine the
result or output
based on statement
execution order in a
code segment without
method calls (other
than output).
2.C Determine the
result or output
based on the
statement execution
order in a code
segment containing
method calls.
2.D Determine the
number of times
a code segment
will execute.
4.A Use test-cases
to find errors or
validate results.
4.B Identify errors in
program code.
4.C Determine if
two or more code
segments yield
equivalent results.
5.A Describe the
behavior of a
given segment of
program code.
5.B Explain why a
code segment will
not compile or work
as intended.
5.C Explain how
the result of program
code changes, given
a change to the
initial code.
5.D Describe the
initial conditions that
must be met for a
program segment
to work as intended
or described.
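The VAR-2.A statements in the prompt above lend themselves to a direct demonstration; a minimal Java sketch of default initialization (VAR-2.A.4) and index bounds (VAR-2.A.7), with illustrative class and variable names not taken from the prompt:

```java
// Default values assigned when an array is created with 'new',
// and the valid index range 0 .. length - 1.
public class ArrayDefaults {
    public static void main(String[] args) {
        int[] scores = new int[4];         // int elements start at 0
        double[] averages = new double[4]; // double elements start at 0.0
        boolean[] flags = new boolean[4];  // boolean elements start at false
        String[] names = new String[4];    // reference elements start at null

        System.out.println(scores[0] + " " + averages[0] + " "
                + flags[0] + " " + names[0]); // prints "0 0.0 false null"

        // Valid indices run 0 through scores.length - 1 (here 0..3);
        // evaluating scores[4] would throw ArrayIndexOutOfBoundsException.
    }
}
```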
|
0e206327e1522cc5876a8a308f95fef5
|
{
"intermediate": 0.31191015243530273,
"beginner": 0.38574790954589844,
"expert": 0.30234193801879883
}
|
42,370
|
(java) Create an MCQ for each essential knowledge statement that requires knowledge of the previous essential knowledge statement to solve the question. Use these names in the questions, as they are my friends and want to wish me, Srinath, a happy birthday! Names: Armaan, Ganesh, Srinath, and Daksh.
VAR-2.A.1
The use of array objects allows multiple related
items to be represented using a single variable.
VAR-2.A.2
The size of an array is established at the time of
creation and cannot be changed.
VAR-2.A.3
Arrays can store either primitive data or object
reference data.
VAR-2.A.4
When an array is created using the keyword
new, all of its elements are initialized with a
specific value based on the type of elements:
§ Elements of type int are initialized to 0
§ Elements of type double are initialized to 0.0
§ Elements of type boolean are initialized
to false
§ Elements of a reference type are initialized
to the reference value null. No objects are
automatically created
VAR-2.A.5
Initializer lists can be used to create and
initialize arrays.
VAR-2.A.6
Square brackets ([ ]) are used to access and
modify an element in a 1D array using an index.
VAR-2.A.7
The valid index values for an array are
0 through one less than the number of
elements in the array, inclusive. Using an index
value outside of this range will result in an
ArrayIndexOutOfBoundsException
being thrown.
You need to test these skills in your questions:
1.A Determine an
appropriate program
design to solve a
problem or accomplish
a task (not assessed).
1.B Determine code
that would be used
to complete code
segments.
1.C Determine code
that would be used to
interact with completed
program code.
2.A Apply the
meaning of specific
operators.
2.B Determine the
result or output
based on statement
execution order in a
code segment without
method calls (other
than output).
2.C Determine the
result or output
based on the
statement execution
order in a code
segment containing
method calls.
2.D Determine the
number of times
a code segment
will execute.
4.A Use test-cases
to find errors or
validate results.
4.B Identify errors in
program code.
4.C Determine if
two or more code
segments yield
equivalent results.
5.A Describe the
behavior of a
given segment of
program code.
5.B Explain why a
code segment will
not compile or work
as intended.
5.C Explain how
the result of program
code changes, given
a change to the
initial code.
5.D Describe the
initial conditions that
must be met for a
program segment
to work as intended
or described.
|
100cade7f9b3ea2f1b27d12ec2ddcd28
|
{
"intermediate": 0.46157535910606384,
"beginner": 0.2644326388835907,
"expert": 0.27399197220802307
}
|
42,371
|
(java) Create 2 MCQs for the content below. Use these names in the questions, as they are my friends and want to wish me, Srinath, a happy birthday! Names: Armaan, Ganesh, Srinath, and Daksh.
VAR-2.A.1
The use of array objects allows multiple related
items to be represented using a single variable.
VAR-2.A.2
The size of an array is established at the time of
creation and cannot be changed.
VAR-2.A.3
Arrays can store either primitive data or object
reference data.
VAR-2.A.4
When an array is created using the keyword
new, all of its elements are initialized with a
specific value based on the type of elements:
§ Elements of type int are initialized to 0
§ Elements of type double are initialized to 0.0
§ Elements of type boolean are initialized
to false
§ Elements of a reference type are initialized
to the reference value null. No objects are
automatically created
VAR-2.A.5
Initializer lists can be used to create and
initialize arrays.
VAR-2.A.6
Square brackets ([ ]) are used to access and
modify an element in a 1D array using an index.
VAR-2.A.7
The valid index values for an array are
0 through one less than the number of
elements in the array, inclusive. Using an index
value outside of this range will result in an
ArrayIndexOutOfBoundsException
being thrown.
You need to test these skills in your questions:
1.A Determine an
appropriate program
design to solve a
problem or accomplish
a task (not assessed).
1.B Determine code
that would be used
to complete code
segments.
1.C Determine code
that would be used to
interact with completed
program code.
2.A Apply the
meaning of specific
operators.
2.B Determine the
result or output
based on statement
execution order in a
code segment without
method calls (other
than output).
2.C Determine the
result or output
based on the
statement execution
order in a code
segment containing
method calls.
2.D Determine the
number of times
a code segment
will execute.
4.A Use test-cases
to find errors or
validate results.
4.B Identify errors in
program code.
4.C Determine if
two or more code
segments yield
equivalent results.
5.A Describe the
behavior of a
given segment of
program code.
5.B Explain why a
code segment will
not compile or work
as intended.
5.C Explain how
the result of program
code changes, given
a change to the
initial code.
5.D Describe the
initial conditions that
must be met for a
program segment
to work as intended
or described.
|
d5e403412076dea1aa0025a761e4790d
|
{
"intermediate": 0.39131757616996765,
"beginner": 0.32102730870246887,
"expert": 0.2876551151275635
}
|
42,372
|
(java) Create 2 MCQs for the content below.
VAR-2.A.1
The use of array objects allows multiple related
items to be represented using a single variable.
VAR-2.A.2
The size of an array is established at the time of
creation and cannot be changed.
VAR-2.A.3
Arrays can store either primitive data or object
reference data.
VAR-2.A.4
When an array is created using the keyword
new, all of its elements are initialized with a
specific value based on the type of elements:
§ Elements of type int are initialized to 0
§ Elements of type double are initialized to 0.0
§ Elements of type boolean are initialized
to false
§ Elements of a reference type are initialized
to the reference value null. No objects are
automatically created
VAR-2.A.5
Initializer lists can be used to create and
initialize arrays.
VAR-2.A.6
Square brackets ([ ]) are used to access and
modify an element in a 1D array using an index.
VAR-2.A.7
The valid index values for an array are
0 through one less than the number of
elements in the array, inclusive. Using an index
value outside of this range will result in an
ArrayIndexOutOfBoundsException
being thrown.
You need to test these skills in your questions:
1.A Determine an
appropriate program
design to solve a
problem or accomplish
a task (not assessed).
1.B Determine code
that would be used
to complete code
segments.
1.C Determine code
that would be used to
interact with completed
program code.
2.A Apply the
meaning of specific
operators.
2.B Determine the
result or output
based on statement
execution order in a
code segment without
method calls (other
than output).
2.C Determine the
result or output
based on the
statement execution
order in a code
segment containing
method calls.
2.D Determine the
number of times
a code segment
will execute.
4.A Use test-cases
to find errors or
validate results.
4.B Identify errors in
program code.
4.C Determine if
two or more code
segments yield
equivalent results.
5.A Describe the
behavior of a
given segment of
program code.
5.B Explain why a
code segment will
not compile or work
as intended.
5.C Explain how
the result of program
code changes, given
a change to the
initial code.
5.D Describe the
initial conditions that
must be met for a
program segment
to work as intended
or described.
|
741b38088f4688c0629c1aa26704e780
|
{
"intermediate": 0.3615092933177948,
"beginner": 0.32003727555274963,
"expert": 0.31845346093177795
}
|
42,373
|
(java) Create an MCQ for the content below.
VAR-2.A.1
The use of array objects allows multiple related
items to be represented using a single variable.
VAR-2.A.2
The size of an array is established at the time of
creation and cannot be changed.
VAR-2.A.3
Arrays can store either primitive data or object
reference data.
VAR-2.A.4
When an array is created using the keyword
new, all of its elements are initialized with a
specific value based on the type of elements:
§ Elements of type int are initialized to 0
§ Elements of type double are initialized to 0.0
§ Elements of type boolean are initialized
to false
§ Elements of a reference type are initialized
to the reference value null. No objects are
automatically created
VAR-2.A.5
Initializer lists can be used to create and
initialize arrays.
VAR-2.A.6
Square brackets ([ ]) are used to access and
modify an element in a 1D array using an index.
VAR-2.A.7
The valid index values for an array are
0 through one less than the number of
elements in the array, inclusive. Using an index
value outside of this range will result in an
ArrayIndexOutOfBoundsException
being thrown.
Your questions need to test these skills:
1.A Determine an
appropriate program
design to solve a
problem or accomplish
a task (not assessed).
1.B Determine code
that would be used
to complete code
segments.
1.C Determine code
that would be used to
interact with completed
program code.
2.A Apply the
meaning of specific
operators.
2.B Determine the
result or output
based on statement
execution order in a
code segment without
method calls (other
than output).
2.C Determine the
result or output
based on the
statement execution
order in a code
segment containing
method calls.
2.D Determine the
number of times
a code segment
will execute.
4.A Use test-cases
to find errors or
validate results.
4.B Identify errors in
program code.
4.C Determine if
two or more code
segments yield
equivalent results.
5.A Describe the
behavior of a
given segment of
program code.
5.B Explain why a
code segment will
not compile or work
as intended.
5.C Explain how
the result of program
code changes, given
a change to the
initial code.
5.D Describe the
initial conditions that
must be met for a
program segment
to work as intended
or described.
|
ea2ba77743af94473f63bbe63e1caa83
|
{
"intermediate": 0.3800930976867676,
"beginner": 0.25524264574050903,
"expert": 0.3646641969680786
}
|
42,374
|
(java) Create an MCQ for the content below.
VAR-2.A.1
The use of array objects allows multiple related
items to be represented using a single variable.
VAR-2.A.2
The size of an array is established at the time of
creation and cannot be changed.
VAR-2.A.3
Arrays can store either primitive data or object
reference data.
VAR-2.A.4
When an array is created using the keyword
new, all of its elements are initialized with a
specific value based on the type of elements:
§ Elements of type int are initialized to 0
§ Elements of type double are initialized to 0.0
§ Elements of type boolean are initialized
to false
§ Elements of a reference type are initialized
to the reference value null. No objects are
automatically created
VAR-2.A.5
Initializer lists can be used to create and
initialize arrays.
VAR-2.A.6
Square brackets ([ ]) are used to access and
modify an element in a 1D array using an index.
VAR-2.A.7
The valid index values for an array are
0 through one less than the number of
elements in the array, inclusive. Using an index
value outside of this range will result in an
ArrayIndexOutOfBoundsException
being thrown.
Your questions need to test these skills:
1.A Determine an
appropriate program
design to solve a
problem or accomplish
a task (not assessed).
1.B Determine code
that would be used
to complete code
segments.
1.C Determine code
that would be used to
interact with completed
program code.
2.A Apply the
meaning of specific
operators.
2.B Determine the
result or output
based on statement
execution order in a
code segment without
method calls (other
than output).
2.C Determine the
result or output
based on the
statement execution
order in a code
segment containing
method calls.
2.D Determine the
number of times
a code segment
will execute.
4.A Use test-cases
to find errors or
validate results.
4.B Identify errors in
program code.
4.C Determine if
two or more code
segments yield
equivalent results.
5.A Describe the
behavior of a
given segment of
program code.
5.B Explain why a
code segment will
not compile or work
as intended.
5.C Explain how
the result of program
code changes, given
a change to the
initial code.
5.D Describe the
initial conditions that
must be met for a
program segment
to work as intended
or described.
|
04e6cac568d56f3c1cad309cffb2859b
|
{
"intermediate": 0.3800930976867676,
"beginner": 0.25524264574050903,
"expert": 0.3646641969680786
}
|
42,375
|
cannot import name 'Optional' from 'langchain.llms.base'
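A likely resolution for the ImportError above (hedged, since it depends on the installed langchain version): `Optional` is a standard-library typing construct and is not re-exported by `langchain.llms.base` in newer releases, so it should be imported from `typing` instead. The `answer` helper below is purely hypothetical, only there to show the annotation in use:

```python
# 'Optional' lives in the standard-library typing module, not in langchain.
# Importing it from there avoids the ImportError regardless of langchain version.
from typing import Optional

def answer(prompt: str, stop: Optional[list] = None) -> str:
    # Hypothetical helper: truncate the prompt at the first stop token, if any.
    return prompt if stop is None else prompt.split(stop[0])[0]

print(answer("hello world", stop=[" "]))  # prints "hello"
```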
|
351527d6e72457fdcb06f097695252f2
|
{
"intermediate": 0.3746025860309601,
"beginner": 0.3173459470272064,
"expert": 0.3080514669418335
}
|
42,376
|
(java) Create an MCQ for the content below.
VAR-2.A.1
The use of array objects allows multiple related
items to be represented using a single variable.
VAR-2.A.2
The size of an array is established at the time of
creation and cannot be changed.
VAR-2.A.3
Arrays can store either primitive data or object
reference data.
VAR-2.A.4
When an array is created using the keyword
new, all of its elements are initialized with a
specific value based on the type of elements:
§ Elements of type int are initialized to 0
§ Elements of type double are initialized to 0.0
§ Elements of type boolean are initialized
to false
§ Elements of a reference type are initialized
to the reference value null. No objects are
automatically created
VAR-2.A.5
Initializer lists can be used to create and
initialize arrays.
VAR-2.A.6
Square brackets ([ ]) are used to access and
modify an element in a 1D array using an index.
VAR-2.A.7
The valid index values for an array are
0 through one less than the number of
elements in the array, inclusive. Using an index
value outside of this range will result in an
ArrayIndexOutOfBoundsException
being thrown.
Your questions need to test these skills:
1.A Determine an
appropriate program
design to solve a
problem or accomplish
a task (not assessed).
1.B Determine code
that would be used
to complete code
segments.
1.C Determine code
that would be used to
interact with completed
program code.
2.A Apply the
meaning of specific
operators.
2.B Determine the
result or output
based on statement
execution order in a
code segment without
method calls (other
than output).
2.C Determine the
result or output
based on the
statement execution
order in a code
segment containing
method calls.
2.D Determine the
number of times
a code segment
will execute.
4.A Use test-cases
to find errors or
validate results.
4.B Identify errors in
program code.
4.C Determine if
two or more code
segments yield
equivalent results.
5.A Describe the
behavior of a
given segment of
program code.
5.B Explain why a
code segment will
not compile or work
as intended.
5.C Explain how
the result of program
code changes, given
a change to the
initial code.
5.D Describe the
initial conditions that
must be met for a
program segment
to work as intended
or described.
|
39382a0c42fe3128fa49a9a8712a9cf9
|
{
"intermediate": 0.3800930976867676,
"beginner": 0.25524264574050903,
"expert": 0.3646641969680786
}
|
42,377
|
Report any faults in the definition of the transformer architecture, the gating network, or the positional encoding, as well as any hardcoded parameters of the model that can't be modified or represented by a variable like the already modifiable hyperparameters. **code**:
import torch
import torch.nn as nn
import torch.nn.functional as F
import json
import math
from torch.nn.utils.rnn import pad_sequence
from torch.utils.data import DataLoader, Dataset
from collections import Counter
from tqdm import tqdm
import matplotlib.pyplot as plt
from sklearn.metrics import precision_score, recall_score, f1_score, accuracy_score
from tokenizers import Tokenizer
# ---------- Device Configuration ----------
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
# ---------- Utility Functions ----------
def positional_encoding(seq_len, d_model, device):
pos = torch.arange(seq_len, dtype=torch.float, device=device).unsqueeze(1)
div_term = torch.exp(torch.arange(0, d_model, 2).float() * (-math.log(10000.0) / d_model)).to(device)
pe = torch.zeros(seq_len, d_model, device=device)
pe[:, 0::2] = torch.sin(pos * div_term)
pe[:, 1::2] = torch.cos(pos * div_term)
return pe.unsqueeze(0)
# -------- Performance ----------
def evaluate_model(model, data_loader, device):
model.eval()
all_preds, all_targets = [], []
with torch.no_grad():
for inputs, targets in data_loader:
inputs, targets = inputs.to(device), targets.to(device)
outputs = model(inputs)
predictions = torch.argmax(outputs, dim=-1).view(-1) # Flatten predicted indices
all_preds.extend(predictions.cpu().numpy())
all_targets.extend(targets.view(-1).cpu().numpy()) # Ensure targets are also flattened
# Calculate precision, recall, and F1 score after ensuring all_preds and all_targets are correctly aligned.
accuracy = accuracy_score(all_targets, all_preds)
precision = precision_score(all_targets, all_preds, average='macro', zero_division=0)
recall = recall_score(all_targets, all_preds, average='macro', zero_division=0)
f1 = f1_score(all_targets, all_preds, average='macro', zero_division=0)
print(f"Accuracy: {accuracy:.4f}")
print(f"Precision: {precision:.4f}")
print(f"Recall: {recall:.4f}")
print(f"F1 Score: {f1:.4f}")
return accuracy ,precision, recall, f1
# Function to plot loss over time
def plot_loss(loss_history):
plt.figure(figsize=(10, 5))
plt.plot(loss_history, label='Training Loss')
plt.xlabel('Batches')
plt.ylabel('Loss')
plt.title('Training Loss Over Time')
plt.legend()
plt.show()
# ---------- Model Definitions ----------
class TransformerExpert(nn.Module):
def __init__(self, input_size, d_model, output_size, nhead, dim_feedforward, num_encoder_layers=1):
super(TransformerExpert, self).__init__()
self.d_model = d_model
self.input_fc = nn.Linear(input_size, d_model)
self.pos_encoder = nn.Parameter(positional_encoding(1, d_model, device), requires_grad=True)
encoder_layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead,
dim_feedforward=dim_feedforward,
batch_first=True,
norm_first=True)
self.transformer_encoder = nn.TransformerEncoder(encoder_layer, num_layers=num_encoder_layers)
self.output_fc = nn.Linear(d_model, output_size)
self.norm = nn.LayerNorm(d_model)
def forward(self, x):
x = self.norm(self.input_fc(x)) + self.pos_encoder
transformer_output = self.transformer_encoder(x)
output = self.output_fc(transformer_output)
return output
class GatingNetwork(nn.Module):
def __init__(self, input_feature_dim, num_experts, hidden_dims=[6144, 3072, 1536, 768], dropout_rate=0.2):
super(GatingNetwork, self).__init__()
layers = []
last_dim = input_feature_dim
for hidden_dim in hidden_dims:
layers.extend([
nn.Linear(last_dim, hidden_dim),
nn.GELU(),
nn.Dropout(dropout_rate),
])
last_dim = hidden_dim
layers.append(nn.Linear(last_dim, num_experts))
self.fc_layers = nn.Sequential(*layers)
self.softmax = nn.Softmax(dim=1)
def forward(self, x):
x = x.mean(dim=1) # To ensure gating is based on overall features across the sequence
x = self.fc_layers(x)
return self.softmax(x)
class MixtureOfTransformerExperts(nn.Module):
def __init__(self, input_size, d_model, output_size, nhead, dim_feedforward, num_experts, num_encoder_layers=1):
super(MixtureOfTransformerExperts, self).__init__()
self.num_experts = num_experts
self.output_size = output_size
self.experts = nn.ModuleList([TransformerExpert(input_size, d_model, output_size, nhead, dim_feedforward, num_encoder_layers) for _ in range(num_experts)])
self.gating_network = GatingNetwork(d_model, num_experts)
def forward(self, x):
gating_scores = self.gating_network(x)
expert_outputs = [expert(x) for expert in self.experts]
stacked_expert_outputs = torch.stack(expert_outputs)
expanded_gating_scores = gating_scores.unsqueeze(2).unsqueeze(3)
expanded_gating_scores = expanded_gating_scores.expand(-1, -1, x.size(1), self.output_size)
expanded_gating_scores = expanded_gating_scores.transpose(0, 1)
mixed_output = torch.sum(stacked_expert_outputs * expanded_gating_scores, dim=0)
return mixed_output
class MoETransformerModel(nn.Module):
def __init__(self, vocab_size, d_model, moe):
super(MoETransformerModel, self).__init__()
self.embedding = nn.Embedding(num_embeddings=vocab_size, embedding_dim=d_model)
self.moe = moe
self.dropout = nn.Dropout(p=0.1)
def forward(self, x):
embedded = self.dropout(self.embedding(x))
return self.moe(embedded)
# ---------- Dataset Definitions ----------
class QAJsonlDataset(Dataset):
def __init__(self, path, seq_len, tokenizer_path):
# Load the trained tokenizer
self.tokenizer = Tokenizer.from_file(tokenizer_path)
self.seq_len = seq_len
self.pairs = self.load_data(path)
# Using BPE, so no need for manual vocab or idx2token.
# Tokenization will now happen using self.tokenizer.
self.tokenized_pairs = [(self.tokenize(q), self.tokenize(a)) for q, a in self.pairs]
def load_data(self, path):
pairs = []
with open(path, "r", encoding="utf-8") as f:
for line in f:
data = json.loads(line.strip())
question, answer = data.get("user", ""), data.get("content", "")
pairs.append((question, answer)) # Store questions and answers as raw strings
return pairs
def tokenize(self, text):
# Tokenizing using the BPE tokenizer
encoded = self.tokenizer.encode(text)
tokens = encoded.ids
# Padding/truncation
if len(tokens) < self.seq_len:
# Padding
tokens += [self.tokenizer.token_to_id("<pad>")] * (self.seq_len - len(tokens))
else:
# Truncation
tokens = tokens[:self.seq_len - 1] + [self.tokenizer.token_to_id("<eos>")]
return tokens
def __len__(self):
return len(self.tokenized_pairs)
def __getitem__(self, idx):
tokenized_question, tokenized_answer = self.tokenized_pairs[idx]
return torch.tensor(tokenized_question, dtype=torch.long), torch.tensor(tokenized_answer, dtype=torch.long)
def collate_fn(batch):
questions, answers = zip(*batch)
questions = pad_sequence(questions, batch_first=True, padding_value=0)
answers = pad_sequence(answers, batch_first=True, padding_value=0)
return questions, answers
# ---------- Training and Inference Functions ----------
def train_model(model, criterion, optimizer, num_epochs, data_loader):
model.train()
loss_history = [] # Initialize a list to keep track of losses
for epoch in range(num_epochs):
total_loss = 0
total_items = 0 # Keep track of total items processed
progress_bar = tqdm(enumerate(data_loader), total=len(data_loader), desc=f"Epoch {epoch+1}", leave=False)
for i, (inputs, targets) in progress_bar:
inputs, targets = inputs.to(device), targets.to(device)
optimizer.zero_grad()
# Predict
predictions = model(inputs)
predictions = predictions.view(-1, predictions.size(-1)) # Make sure predictions are the right shape
targets = targets.view(-1) # Flatten targets to match prediction shape if necessary
# Calculate loss
loss = criterion(predictions, targets)
loss.backward()
# Gradient clipping for stabilization
torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)
optimizer.step()
scheduler.step()
# Update total loss and the number of items
total_loss += loss.item() * inputs.size(0) # Multiply loss by batch size
total_items += inputs.size(0)
loss_history.append(loss.item())
progress_bar.set_postfix({"Loss": loss.item()})
average_loss = total_loss / total_items # Correctly compute average loss
print(f"Epoch {epoch+1}, Average Loss: {average_loss:.6f}")
return loss_history
class WarmupLR(torch.optim.lr_scheduler._LRScheduler):
def __init__(self, optimizer, warmup_steps, scheduler_step_lr):
self.warmup_steps = warmup_steps
self.scheduler_step_lr = scheduler_step_lr # The subsequent scheduler
super(WarmupLR, self).__init__(optimizer)
def get_lr(self):
if self._step_count <= self.warmup_steps:
warmup_factor = float(self._step_count) / float(max(1, self.warmup_steps))
for base_lr in self.base_lrs:
yield base_lr * warmup_factor
else:
self.scheduler_step_lr.step() # Update the subsequent scheduler
for param_group in self.optimizer.param_groups:
yield param_group['lr']
class GERU(nn.Module):
def __init__(self, in_features):
super(GERU, self).__init__()
self.alpha = nn.Parameter(torch.rand(in_features))
def forward(self, x):
return torch.max(x, torch.zeros_like(x)) + self.alpha * torch.min(x, torch.zeros_like(x))
def generate_text(model, tokenizer, seed_text, num_generate, temperature=1.0):
model.eval()
generated_tokens = []
# Encode the seed text using the tokenizer
encoded_input = tokenizer.encode(seed_text)
input_ids = torch.tensor(encoded_input.ids, dtype=torch.long).unsqueeze(0).to(device)
# Generate num_generate tokens
with torch.no_grad():
for _ in range(num_generate):
output = model(input_ids)
# Get the last logits and apply temperature
logits = output[:, -1, :] / temperature
probabilities = F.softmax(logits, dim=-1)
next_token_id = torch.multinomial(probabilities, num_samples=1).item()
# Append generated token ID and prepare the new input_ids
generated_tokens.append(next_token_id)
input_ids = torch.cat([input_ids, torch.tensor([[next_token_id]], dtype=torch.long).to(device)], dim=1)
# Decode the generated token IDs back to text
generated_text = tokenizer.decode(generated_tokens)
return generated_text
def count_tokens_in_dataset(dataset):
return sum([len(pair[0]) + len(pair[1]) for pair in dataset.pairs])
def count_parameters(model):
return sum(p.numel() for p in model.parameters() if p.requires_grad)
# ---------- Hyperparameters and Model Instantiation ----------
# Transformer :
d_model = 384
nhead = 12
dim_feedforward = 1536
num_encoder_layers = 12
num_experts = 2
# Training Parameters
batch_size = 384 # Adjustable batch size
warmup_steps = 2000 # Warmup steps for learning rate
optimizer_type = "AdamW" # Could be "SGD", "RMSprop", etc.
learning_rate = 1e-1
weight_decay = 1e-5 # For L2 regularization
num_epochs = 10
# Dataset :
path_to_dataset = "C:/Users/L14/Documents/Projets/Easy-MoE/Easy-MoE/data/Real_talk.jsonl"
tokenizer_path = "BPE_tokenizer(Real-Talk).json"
seq_len = 256
dataset = QAJsonlDataset(path_to_dataset, seq_len, tokenizer_path)
data_loader = DataLoader(dataset, batch_size=batch_size, shuffle=True, collate_fn=collate_fn, pin_memory=True)
num_tokens = count_tokens_in_dataset(dataset)
print(f"Total number of tokens in the dataset: {num_tokens}")
# Load the tokenizer
tokenizer = Tokenizer.from_file(tokenizer_path)
# Determine the vocabulary size
vocab_size = tokenizer.get_vocab_size()
moe = MixtureOfTransformerExperts(
input_size=d_model,
d_model=d_model,
output_size=vocab_size,
nhead=nhead,
dim_feedforward=dim_feedforward,
num_experts=num_experts,
num_encoder_layers=num_encoder_layers
).to(device)
moe_transformer_model = MoETransformerModel(vocab_size, d_model, moe).to(device)
# Count of total parameters :
total_params = count_parameters(moe_transformer_model)
print(f"Total trainable parameters: {total_params}")
# ---------- Training ----------
# Adjusting optimizer setup to include weight decay and allow switching between types
if optimizer_type == "AdamW":
optimizer = torch.optim.AdamW(moe_transformer_model.parameters(), lr=learning_rate, weight_decay=weight_decay)
elif optimizer_type == "SGD":
optimizer = torch.optim.SGD(moe_transformer_model.parameters(), lr=learning_rate, momentum=0.9, weight_decay=weight_decay)
elif optimizer_type == "Adam":
optimizer = torch.optim.Adam(moe_transformer_model.parameters(), lr=learning_rate, weight_decay=weight_decay)
# Instantiate your main scheduler (StepLR)
step_lr_scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.95)
# Wrap it with WarmupLR
scheduler = WarmupLR(optimizer, warmup_steps=warmup_steps, scheduler_step_lr=step_lr_scheduler)
criterion = nn.CrossEntropyLoss(label_smoothing=0.1)
# Train the model
loss_history = train_model(moe_transformer_model, criterion, optimizer, num_epochs, data_loader)
# Evaluating the model
plot_loss(loss_history)
train_accuracy = evaluate_model(moe_transformer_model, data_loader, device)
# ---------- Inference ----------
def interactive_text_generation(model, tokenizer, max_length=32, temperature=1.0):
    while True:
        try:
            # Get user input
            seed_text = input("Enter seed text (type 'quit' to exit and save the model): ").strip()
            # Check if the user wants to quit the interaction
            if seed_text.lower() == 'quit':
                print("Exiting text generation mode.")
                break
            # Generate text based on the seed text
            if seed_text:
                generated_text = generate_text(model, tokenizer, seed_text, max_length, temperature) # Modify max_length/temperature as needed
print("Generated Text:", generated_text)
else:
print("Seed text cannot be empty. Please enter some text.")
except Exception as e:
# Handle potential errors gracefully
print(f"An error occurred: {e}. Try again.")
interactive_text_generation(moe_transformer_model, tokenizer)
# ---------- Save Trained Model ----------
torch.save(moe_transformer_model.state_dict(), "Transformer-Alpha-v04.pth")
|
cbc3cc9857aa7b4709dfd5d6c3089dce
|
{
"intermediate": 0.44208094477653503,
"beginner": 0.29767751693725586,
"expert": 0.2602415382862091
}
|
42,378
|
Create a question to test the content below, make sure the question is similar to the ap test
VAR-2.A.1
The use of array objects allows multiple related
items to be represented using a single variable.
VAR-2.A.2
The size of an array is established at the time of
creation and cannot be changed.
VAR-2.A.3
Arrays can store either primitive data or object
reference data.
VAR-2.A.4
When an array is created using the keyword
new, all of its elements are initialized with a
specific value based on the type of elements:
§ Elements of type int are initialized to 0
§ Elements of type double are initialized to 0.0
§ Elements of type boolean are initialized
to false
§ Elements of a reference type are initialized
to the reference value null. No objects are
automatically created
VAR-2.A.5
Initializer lists can be used to create and
initialize arrays.
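The default-initialization rules (VAR-2.A.4) and initializer lists (VAR-2.A.5) listed above can be sketched in a short Java example; the array names here are illustrative, not from the source:

```java
public class ArrayDefaults {
    public static void main(String[] args) {
        int[] nums = new int[3];          // elements initialized to 0
        double[] vals = new double[2];    // elements initialized to 0.0
        boolean[] flags = new boolean[2]; // elements initialized to false
        String[] names = new String[2];   // elements initialized to null; no objects created
        int[] scores = {90, 85, 77};      // initializer list sets both size and values

        System.out.println(nums[0]);       // 0
        System.out.println(names[0]);      // null
        System.out.println(scores.length); // 3
    }
}
```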
|
3622914579f2b9bd9dff4aa868b48297
|
{
"intermediate": 0.3394216299057007,
"beginner": 0.32795172929763794,
"expert": 0.3326266407966614
}
|
42,379
|
Create a complete the code question to test the content below.
Question body
Responses (or correct response if it is a complete the code or find the error)
Explanation of the correct answer
Tip
Difficulty (easy, medium, hard)
VAR-2.A.1
The use of array objects allows multiple related
items to be represented using a single variable.
VAR-2.A.2
The size of an array is established at the time of
creation and cannot be changed.
VAR-2.A.3
Arrays can store either primitive data or object
reference data.
VAR-2.A.4
When an array is created using the keyword
new, all of its elements are initialized with a
specific value based on the type of elements:
§ Elements of type int are initialized to 0
§ Elements of type double are initialized to 0.0
§ Elements of type boolean are initialized
to false
§ Elements of a reference type are initialized
to the reference value null. No objects are
automatically created
VAR-2.A.5
Initializer lists can be used to create and
initialize arrays.
|
6a217e9ed6efd75b99707ab2c29b0202
|
{
"intermediate": 0.31314489245414734,
"beginner": 0.3997867703437805,
"expert": 0.2870683968067169
}
|
42,380
|
Create a complete the code question to test the content below in this format[ Question type- multiple choice (you get 4 choices), complete the code, find the error
Question body
Responses (or correct response if it is a complete the code or find the error)
Explanation of the correct answer
Tip
Difficulty (easy, medium, hard)
].
VAR-2.A.1
The use of array objects allows multiple related
items to be represented using a single variable.
VAR-2.A.2
The size of an array is established at the time of
creation and cannot be changed.
VAR-2.A.3
Arrays can store either primitive data or object
reference data.
VAR-2.A.4
When an array is created using the keyword
new, all of its elements are initialized with a
specific value based on the type of elements:
§ Elements of type int are initialized to 0
§ Elements of type double are initialized to 0.0
§ Elements of type boolean are initialized
to false
§ Elements of a reference type are initialized
to the reference value null. No objects are
automatically created
VAR-2.A.5
Initializer lists can be used to create and
initialize arrays.
|
02a5e74e7fee03993223314602310740
|
{
"intermediate": 0.34566760063171387,
"beginner": 0.36984097957611084,
"expert": 0.2844914197921753
}
|
42,381
|
Create questions to test the content below in this format[ Question type- multiple choice (you get 4 choices), complete the code, find the error
Question body
Responses (or correct response if it is a complete the code or find the error)
Explanation of the correct answer
Tip
Difficulty (easy, medium, hard)
].
VAR-2.A.1
The use of array objects allows multiple related
items to be represented using a single variable.
VAR-2.A.2
The size of an array is established at the time of
creation and cannot be changed.
VAR-2.A.3
Arrays can store either primitive data or object
reference data.
VAR-2.A.4
When an array is created using the keyword
new, all of its elements are initialized with a
specific value based on the type of elements:
§ Elements of type int are initialized to 0
§ Elements of type double are initialized to 0.0
§ Elements of type boolean are initialized
to false
§ Elements of a reference type are initialized
to the reference value null. No objects are
automatically created
VAR-2.A.5
Initializer lists can be used to create and
initialize arrays.
VAR-2.A.6
Square brackets ([ ]) are used to access and
modify an element in a 1D array using an index.
VAR-2.A.7
The valid index values for an array are
0 through one less than the number of
elements in the array, inclusive. Using an index
value outside of this range will result in an
ArrayIndexOutOfBoundsException
being thrown.
VAR-2.B.1
Iteration statements can be used to access
all the elements in an array. This is called
traversing the array.
VAR-2.B.2
Traversing an array with an indexed for
loop or while loop requires elements to be
accessed using their indices.
VAR-2.B.3
Since the indices for an array start at
0 and end at the number of elements
− 1, “off by one” errors are easy to make
when traversing an array, resulting in an
ArrayIndexOutOfBoundsException
being thrown.
VAR-2.C.1
An enhanced for loop header includes a
variable, referred to as the enhanced for
loop variable.
VAR-2.C.2
For each iteration of the enhanced for loop,
the enhanced for loop variable is assigned a
copy of an element without using its index.
VAR-2.C.3
Assigning a new value to the enhanced for
loop variable does not change the value stored
in the array.
VAR-2.C.4
Program code written using an enhanced for
loop to traverse and access elements in an
array can be rewritten using an indexed for
loop or a while loop.
CON-2.I.1
There are standard algorithms that utilize array
traversals to:
§ Determine a minimum or maximum value
§ Compute a sum, average, or mode
§ Determine if at least one element has a
particular property
§ Determine if all elements have a particular
property
§ Access all consecutive pairs of elements
§ Determine the presence or absence of
duplicate elements
§ Determine the number of elements meeting
specific criteria
CON-2.I.2
There are standard array algorithms that utilize
traversals to:
§ Shift or rotate elements left or right
§ Reverse the order of the elements
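As a concrete illustration of the standard traversal algorithms listed above (here, finding a maximum) and the `i < arr.length` loop bound that avoids the off-by-one `ArrayIndexOutOfBoundsException` from VAR-2.B.3, a minimal Java sketch; method and array names are hypothetical:

```java
public class Traversal {
    // Returns the largest value in a non-empty array.
    public static int max(int[] arr) {
        int best = arr[0];
        for (int i = 1; i < arr.length; i++) { // i <= arr.length would throw
            if (arr[i] > best) {
                best = arr[i];
            }
        }
        return best;
    }

    public static void main(String[] args) {
        int[] data = {4, 9, 2, 9, 1};
        System.out.println(max(data)); // 9
    }
}
```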
|
e67f0c0ad3a23083392f172d952f595e
|
{
"intermediate": 0.2536627948284149,
"beginner": 0.5700666308403015,
"expert": 0.17627054452896118
}
|
42,382
|
display functions in python
|
627cd80abbc2419de932d849c89d99ae
|
{
"intermediate": 0.34171032905578613,
"beginner": 0.2775222063064575,
"expert": 0.3807674050331116
}
|
42,383
|
Create questions to test the content below in this format[ Question type- multiple choice (you get 4 choices), complete the code, find the error
Question body
Responses (or correct response if it is a complete the code or find the error)
Explanation of the correct answer
Tip
Difficulty (easy, medium, hard)
].
VAR-2.A.1
The use of array objects allows multiple related
items to be represented using a single variable.
VAR-2.A.2
The size of an array is established at the time of
creation and cannot be changed.
VAR-2.A.3
Arrays can store either primitive data or object
reference data.
VAR-2.A.4
When an array is created using the keyword
new, all of its elements are initialized with a
specific value based on the type of elements:
§ Elements of type int are initialized to 0
§ Elements of type double are initialized to 0.0
§ Elements of type boolean are initialized
to false
§ Elements of a reference type are initialized
to the reference value null. No objects are
automatically created
VAR-2.A.5
Initializer lists can be used to create and
initialize arrays.
VAR-2.A.6
Square brackets ([ ]) are used to access and
modify an element in a 1D array using an index.
VAR-2.A.7
The valid index values for an array are
0 through one less than the number of
elements in the array, inclusive. Using an index
value outside of this range will result in an
ArrayIndexOutOfBoundsException
being thrown.
VAR-2.B.1
Iteration statements can be used to access
all the elements in an array. This is called
traversing the array.
VAR-2.B.2
Traversing an array with an indexed for
loop or while loop requires elements to be
accessed using their indices.
VAR-2.B.3
Since the indices for an array start at
0 and end at the number of elements
− 1, “off by one” errors are easy to make
when traversing an array, resulting in an
ArrayIndexOutOfBoundsException
being thrown.
VAR-2.C.1
An enhanced for loop header includes a
variable, referred to as the enhanced for
loop variable.
VAR-2.C.2
For each iteration of the enhanced for loop,
the enhanced for loop variable is assigned a
copy of an element without using its index.
VAR-2.C.3
Assigning a new value to the enhanced for
loop variable does not change the value stored
in the array.
VAR-2.C.4
Program code written using an enhanced for
loop to traverse and access elements in an
array can be rewritten using an indexed for
loop or a while loop.
CON-2.I.1
There are standard algorithms that utilize array
traversals to:
§ Determine a minimum or maximum value
§ Compute a sum, average, or mode
§ Determine if at least one element has a
particular property
§ Determine if all elements have a particular
property
§ Access all consecutive pairs of elements
§ Determine the presence or absence of
duplicate elements
§ Determine the number of elements meeting
specific criteria
CON-2.I.2
There are standard array algorithms that utilize
traversals to:
§ Shift or rotate elements left or right
§ Reverse the order of the elements
|
50d35d38679d7ad556e5dc866b4a105b
|
{
"intermediate": 0.2536627948284149,
"beginner": 0.5700666308403015,
"expert": 0.17627054452896118
}
|
42,384
|
Create questions to test the content below in this format[ Question type- multiple choice (you get 4 choices), complete the code, find the error
Question body
Responses (or correct response if it is a complete the code or find the error)
Explanation of the correct answer
Tip
Difficulty (hard)
].
VAR-2.A.1
The use of array objects allows multiple related
items to be represented using a single variable.
VAR-2.A.2
The size of an array is established at the time of
creation and cannot be changed.
VAR-2.A.3
Arrays can store either primitive data or object
reference data.
VAR-2.A.4
When an array is created using the keyword
new, all of its elements are initialized with a
specific value based on the type of elements:
§ Elements of type int are initialized to 0
§ Elements of type double are initialized to 0.0
§ Elements of type boolean are initialized
to false
§ Elements of a reference type are initialized
to the reference value null. No objects are
automatically created
VAR-2.A.5
Initializer lists can be used to create and
initialize arrays.
VAR-2.A.6
Square brackets ([ ]) are used to access and
modify an element in a 1D array using an index.
VAR-2.A.7
The valid index values for an array are
0 through one less than the number of
elements in the array, inclusive. Using an index
value outside of this range will result in an
ArrayIndexOutOfBoundsException
being thrown.
VAR-2.B.1
Iteration statements can be used to access
all the elements in an array. This is called
traversing the array.
VAR-2.B.2
Traversing an array with an indexed for
loop or while loop requires elements to be
accessed using their indices.
VAR-2.B.3
Since the indices for an array start at
0 and end at the number of elements
− 1, “off by one” errors are easy to make
when traversing an array, resulting in an
ArrayIndexOutOfBoundsException
being thrown.
VAR-2.C.1
An enhanced for loop header includes a
variable, referred to as the enhanced for
loop variable.
VAR-2.C.2
For each iteration of the enhanced for loop,
the enhanced for loop variable is assigned a
copy of an element without using its index.
VAR-2.C.3
Assigning a new value to the enhanced for
loop variable does not change the value stored
in the array.
VAR-2.C.4
Program code written using an enhanced for
loop to traverse and access elements in an
array can be rewritten using an indexed for
loop or a while loop.
CON-2.I.1
There are standard algorithms that utilize array
traversals to:
§ Determine a minimum or maximum value
§ Compute a sum, average, or mode
§ Determine if at least one element has a
particular property
§ Determine if all elements have a particular
property
§ Access all consecutive pairs of elements
§ Determine the presence or absence of
duplicate elements
§ Determine the number of elements meeting
specific criteria
CON-2.I.2
There are standard array algorithms that utilize
traversals to:
§ Shift or rotate elements left or right
§ Reverse the order of the elements
|
a26f957b77d9450c820f0e075846ad31
|
{
"intermediate": 0.2598467767238617,
"beginner": 0.5498799681663513,
"expert": 0.19027328491210938
}
|
42,385
|
Create hard questions to test the content below in this format[ Question type- multiple choice (you get 4 choices), complete the code, find the error
Question body
Responses (or correct response if it is a complete the code or find the error)
Explanation of the correct answer
Tip
].
VAR-2.A.1
The use of array objects allows multiple related
items to be represented using a single variable.
VAR-2.A.2
The size of an array is established at the time of
creation and cannot be changed.
VAR-2.A.3
Arrays can store either primitive data or object
reference data.
VAR-2.A.4
When an array is created using the keyword
new, all of its elements are initialized with a
specific value based on the type of elements:
§ Elements of type int are initialized to 0
§ Elements of type double are initialized to 0.0
§ Elements of type boolean are initialized
to false
§ Elements of a reference type are initialized
to the reference value null. No objects are
automatically created
VAR-2.A.5
Initializer lists can be used to create and
initialize arrays.
VAR-2.A.6
Square brackets ([ ]) are used to access and
modify an element in a 1D array using an index.
VAR-2.A.7
The valid index values for an array are
0 through one less than the number of
elements in the array, inclusive. Using an index
value outside of this range will result in an
ArrayIndexOutOfBoundsException
being thrown.
VAR-2.B.1
Iteration statements can be used to access
all the elements in an array. This is called
traversing the array.
VAR-2.B.2
Traversing an array with an indexed for
loop or while loop requires elements to be
accessed using their indices.
VAR-2.B.3
Since the indices for an array start at
0 and end at the number of elements
− 1, “off by one” errors are easy to make
when traversing an array, resulting in an
ArrayIndexOutOfBoundsException
being thrown.
VAR-2.C.1
An enhanced for loop header includes a
variable, referred to as the enhanced for
loop variable.
VAR-2.C.2
For each iteration of the enhanced for loop,
the enhanced for loop variable is assigned a
copy of an element without using its index.
VAR-2.C.3
Assigning a new value to the enhanced for
loop variable does not change the value stored
in the array.
VAR-2.C.4
Program code written using an enhanced for
loop to traverse and access elements in an
array can be rewritten using an indexed for
loop or a while loop.
CON-2.I.1
There are standard algorithms that utilize array
traversals to:
§ Determine a minimum or maximum value
§ Compute a sum, average, or mode
§ Determine if at least one element has a
particular property
§ Determine if all elements have a particular
property
§ Access all consecutive pairs of elements
§ Determine the presence or absence of
duplicate elements
§ Determine the number of elements meeting
specific criteria
CON-2.I.2
There are standard array algorithms that utilize
traversals to:
§ Shift or rotate elements left or right
§ Reverse the order of the elements
|
5f6bef8c681050aadacf580514a1185f
|
{
"intermediate": 0.2975737750530243,
"beginner": 0.46431389451026917,
"expert": 0.23811225593090057
}
|
42,386
|
Create hard questions to test the content below in this format[
Question body
Responses (or correct response if it is a complete the code or find the error)
Explanation of the correct answer
Tip
].
VAR-2.A.1
The use of array objects allows multiple related
items to be represented using a single variable.
VAR-2.A.2
The size of an array is established at the time of
creation and cannot be changed.
VAR-2.A.3
Arrays can store either primitive data or object
reference data.
VAR-2.A.4
When an array is created using the keyword
new, all of its elements are initialized with a
specific value based on the type of elements:
§ Elements of type int are initialized to 0
§ Elements of type double are initialized to 0.0
§ Elements of type boolean are initialized
to false
§ Elements of a reference type are initialized
to the reference value null. No objects are
automatically created
VAR-2.A.5
Initializer lists can be used to create and
initialize arrays.
VAR-2.A.6
Square brackets ([ ]) are used to access and
modify an element in a 1D array using an index.
VAR-2.A.7
The valid index values for an array are
0 through one less than the number of
elements in the array, inclusive. Using an index
value outside of this range will result in an
ArrayIndexOutOfBoundsException
being thrown.
VAR-2.B.1
Iteration statements can be used to access
all the elements in an array. This is called
traversing the array.
VAR-2.B.2
Traversing an array with an indexed for
loop or while loop requires elements to be
accessed using their indices.
VAR-2.B.3
Since the indices for an array start at
0 and end at the number of elements
− 1, “off by one” errors are easy to make
when traversing an array, resulting in an
ArrayIndexOutOfBoundsException
being thrown.
VAR-2.C.1
An enhanced for loop header includes a
variable, referred to as the enhanced for
loop variable.
VAR-2.C.2
For each iteration of the enhanced for loop,
the enhanced for loop variable is assigned a
copy of an element without using its index.
VAR-2.C.3
Assigning a new value to the enhanced for
loop variable does not change the value stored
in the array.
VAR-2.C.4
Program code written using an enhanced for
loop to traverse and access elements in an
array can be rewritten using an indexed for
loop or a while loop.
CON-2.I.1
There are standard algorithms that utilize array
traversals to:
§ Determine a minimum or maximum value
§ Compute a sum, average, or mode
§ Determine if at least one element has a
particular property
§ Determine if all elements have a particular
property
§ Access all consecutive pairs of elements
§ Determine the presence or absence of
duplicate elements
§ Determine the number of elements meeting
specific criteria
CON-2.I.2
There are standard array algorithms that utilize
traversals to:
§ Shift or rotate elements left or right
§ Reverse the order of the elements
|
f4b7519c921447bbfd3197a35673e0fa
|
{
"intermediate": 0.26963338255882263,
"beginner": 0.44767242670059204,
"expert": 0.2826941907405853
}
|
42,387
|
Create questions that are designed to trip you up and test the content below in this format[
Question body
Responses (or correct response if it is a complete the code or find the error)
Explanation of the correct answer
Tip
].
6.1
Make arrays- multiple ways
Set values in arrays- multiple ways
6.2
Traverse arrays with for loops
Traverse arrays with while loops
Finding a particular value
Finding an indices of a value
Finding “edge” cases
OutOfBounds Errors
6.3
Enhanced For Loops aka For each loops
Traversing arrays with for each loops
Updating some value NOT an array element (like summing values, finding average)
6.4
Algorithms for arrays- there are too many to list- here are the “biggies”
Reverse order an array
Find duplicates
Find the largest value
Find the smallest value
Find the indices of the largest/smallest value
Find the indices of the most frequent element
Insert some thing.. some where….
Find null elements
-------------------
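A small Java sketch of the 6.3 idea above (using an enhanced for loop to accumulate a value that is not an array element, such as a sum or average); the names are illustrative:

```java
public class ForEachSum {
    // Averages a non-empty array using an enhanced for loop.
    public static double average(int[] arr) {
        int sum = 0;
        for (int value : arr) { // value is a copy; reassigning it would not change arr
            sum += value;
        }
        return (double) sum / arr.length;
    }

    public static void main(String[] args) {
        int[] scores = {80, 90, 100};
        System.out.println(average(scores)); // 90.0
    }
}
```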
|
e75f946c169b06751d60496d752f24f4
|
{
"intermediate": 0.30636635422706604,
"beginner": 0.19061867892742157,
"expert": 0.5030150413513184
}
|
42,388
|
Create questions that are designed to trip you up and test the content below in this format[
Question body
Responses (or correct response if it is a complete the code or find the error)
Explanation of the correct answer
Tip
].
6.1
Make arrays- multiple ways
Set values in arrays- multiple ways
|
88725f36ea95464452c042167762b0d4
|
{
"intermediate": 0.43767809867858887,
"beginner": 0.23780542612075806,
"expert": 0.3245164155960083
}
|
42,389
|
hi
|
377a3e5c1155e10ab0655cf16ad92b34
|
{
"intermediate": 0.3246487081050873,
"beginner": 0.27135494351387024,
"expert": 0.40399640798568726
}
|
42,390
|
Create questions that are designed to trip you up and test the content below in this format[
Question body
Responses (or correct response if it is a complete the code or find the error)
Explanation of the correct answer
Tip
].
6.1
Make arrays- multiple ways
Set values in arrays- multiple ways
|
41ac30a6296e3de6d6d19632dd2b0035
|
{
"intermediate": 0.43767809867858887,
"beginner": 0.23780542612075806,
"expert": 0.3245164155960083
}
|
42,391
|
Create an mcq question that is hard and test the content below in this format[
Question body
Responses (or correct response if it is a complete the code or find the error)
Explanation of the correct answer
Tip
].
Make arrays- multiple ways
Set values in arrays- multiple ways
Traverse arrays with for loops
Traverse arrays with while loops
Finding a particular value
Finding an indices of a value
Finding “edge” cases
OutOfBounds Errors
Enhanced For Loops aka For each loops
Traversing arrays with for each loops
Updating some value NOT an array element (like summing values, finding average)
Algorithms for arrays- there are too many to list- here are the “biggies”
Reverse order an array
Find duplicates
Find the largest value
Find the smallest value
Find the indices of the largest/smallest value
Find the indices of the most frequent element
Insert some thing.. some where….
Find null elements
|
4d8e44fa495580c1d98da777454ee96a
|
{
"intermediate": 0.2704296112060547,
"beginner": 0.20966175198554993,
"expert": 0.519908607006073
}
|
42,392
|
Create an mcq questions and test the content below in this format[
Question type- multiple choice (you get 4 choices), complete the code, find the error
Question body
Responses (or correct response if it is a complete the code or find the error)
Explanation of the correct answer
Tip
Difficulty (easy, medium, hard)
].
Make arrays- multiple ways
Set values in arrays- multiple ways
Traverse arrays with for loops
Traverse arrays with while loops
Finding a particular value
Finding an indices of a value
Finding “edge” cases
OutOfBounds Errors
Enhanced For Loops aka For each loops
Traversing arrays with for each loops
Updating some value NOT an array element (like summing values, finding average)
Algorithms for arrays- there are too many to list- here are the “biggies”
Reverse order an array
Find duplicates
Find the largest value
Find the smallest value
Find the indices of the largest/smallest value
Find the indices of the most frequent element
Insert some thing.. some where….
Find null elements
|
dac6671523c95185ed1a006d66ce21a0
|
{
"intermediate": 0.3178768754005432,
"beginner": 0.2522469460964203,
"expert": 0.4298761785030365
}
|
42,393
|
Create an complete the code frq and test the content below in this format and these skills:
[
Question type- multiple choice (you get 4 choices), complete the code, find the error
Question body
Responses (or correct response if it is a complete the code or find the error)
Explanation of the correct answer
Tip
Difficulty (easy, medium, hard)
].
Content:
6.1
Make arrays- multiple ways
Set values in arrays- multiple ways
6.2
Traverse arrays with for loops
Traverse arrays with while loops
Finding a particular value
Finding an indices of a value
Finding “edge” cases
OutOfBounds Errors
6.3
Enhanced For Loops aka For each loops
Traversing arrays with for each loops
Updating some value NOT an array element (like summing values, finding average)
6.4
Algorithms for arrays- there are too many to list- here are the “biggies”
Reverse order an array
Find duplicates
Find the largest value
Find the smallest value
Find the indices of the largest/smallest value
Find the indices of the most frequent element
Insert some thing.. some where….
Find null elements
|
937751f1723c2badbc9bdb2040c72836
|
{
"intermediate": 0.36344391107559204,
"beginner": 0.29002025723457336,
"expert": 0.3465358316898346
}
|
42,394
|
Create an frq and test the content below in this format and these skills:
[
Question body
Responses (or correct response if it is a complete the code or find the error)
Explanation of the correct answer
Tip
Difficulty (easy, medium, hard)
].
Content:
6.1
Make arrays- multiple ways
Set values in arrays- multiple ways
6.2
Traverse arrays with for loops
Traverse arrays with while loops
Finding a particular value
Finding an indices of a value
Finding “edge” cases
OutOfBounds Errors
6.3
Enhanced For Loops aka For each loops
Traversing arrays with for each loops
Updating some value NOT an array element (like summing values, finding average)
6.4
Algorithms for arrays- there are too many to list- here are the “biggies”
Reverse order an array
Find duplicates
Find the largest value
Find the smallest value
Find the indices of the largest/smallest value
Find the indices of the most frequent element
Insert some thing.. some where….
Find null elements
|
fe8be41f8d6634e9f4315f1126241604
|
{
"intermediate": 0.3161256015300751,
"beginner": 0.21927259862422943,
"expert": 0.4646018147468567
}
|
42,395
|
Create a question and test the content below in this format.
[
Question body
Responses (or correct response if it is a complete the code or find the error)
Explanation of the correct answer
Tip
Difficulty (easy, medium, hard)
].
Content:
6.1
Make arrays- multiple ways
Set values in arrays- multiple ways
|
32252d3316a61708245b9d2626c9acfb
|
{
"intermediate": 0.438927561044693,
"beginner": 0.20593683421611786,
"expert": 0.35513561964035034
}
|
42,396
|
Create a question and test the content below in this format.
[
Question body
Responses (or correct response if it is a complete the code or find the error)
Explanation of the correct answer
Tip
Difficulty (easy, medium, hard)
].
Content:
Reverse order an array(java)
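For the "Reverse order an array (java)" topic above, a minimal in-place two-pointer reversal sketch; the class and variable names are illustrative:

```java
import java.util.Arrays;

public class ReverseArray {
    // Reverses arr in place by swapping symmetric pairs.
    public static void reverse(int[] arr) {
        for (int i = 0, j = arr.length - 1; i < j; i++, j--) {
            int tmp = arr[i];
            arr[i] = arr[j];
            arr[j] = tmp;
        }
    }

    public static void main(String[] args) {
        int[] a = {1, 2, 3, 4};
        reverse(a);
        System.out.println(Arrays.toString(a)); // [4, 3, 2, 1]
    }
}
```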
|
a8b0ae55075e0827ddd9c1fdd3dcb2c9
|
{
"intermediate": 0.3430096209049225,
"beginner": 0.37068742513656616,
"expert": 0.28630295395851135
}
|
42,397
|
How do I explain query in DB2 database ?
|
952cdef8d631fda9ef5bbbea3386ec74
|
{
"intermediate": 0.3354019224643707,
"beginner": 0.3642164170742035,
"expert": 0.3003816306591034
}
|
42,398
|
CURRENT_DIR=$(pwd)
echo "***** Current directory: $CURRENT_DIR *****"
export PYTHONPATH="${CURRENT_DIR}:$PYTHONPATH"
streamlit run ./webui/Main.py    Convert this into a .bat script
|
72f4ae6f569db090e9019efff86723da
|
{
"intermediate": 0.4256414771080017,
"beginner": 0.29834070801734924,
"expert": 0.2760177552700043
}
|
42,399
|
class YoutubePlugin(lightbulb.Plugin):
@property
def app(self) -> SnedBot:
return super().app # type: ignore
@app.setter
def app(self, val: SnedBot) -> None:
self._app = val
self.create_commands()
@property
def bot(self) -> SnedBot:
return super().bot # type: ignore. Please help me write code that uses langchain to search for YouTube videos and return the links
|
69e6cd60927f7de27b5c77ea0d9aa64f
|
{
"intermediate": 0.34178677201271057,
"beginner": 0.48672789335250854,
"expert": 0.17148539423942566
}
|
42,400
|
Could you write a pyrogram code for my telegram bot using mangadex api which can download desired manga using /search command, when uploading to telegram the bot needs to upload in pdf format
|
cb0634c956ef6b706ba610f9370b2685
|
{
"intermediate": 0.6439975500106812,
"beginner": 0.1375281661748886,
"expert": 0.21847429871559143
}
|
42,401
|
What is this error?
[centos@centos-std2-1-1-10gb ~]$ sudo dnf install python3
CentOS Linux 8 - AppStream 0.0 B/s | 0 B 00:00
Errors during downloading metadata for repository 'appstream':
- Curl error (6): Couldn't resolve host name for http://mirrorlist.centos.org/?release=8&arch=x86_64&repo=AppStream&infra=stock [Could not resolve host: mirrorlist.centos.org]
Error: Failed to download metadata for repo 'appstream': Cannot prepare internal mirrorlist: Curl error (6): Couldn't resolve host name for http://mirrorlist.centos.org/?release=8&arch=x86_64&repo=AppStream&infra=stock [Could not resolve host: mirrorlist.centos.org]
[centos@centos-std2-1-1-10gb ~]$ sudo dnf -y install python3-pip
CentOS Linux 8 - AppStream 0.0 B/s | 0 B 00:00
Errors during downloading metadata for repository 'appstream':
- Curl error (6): Couldn't resolve host name for http://mirrorlist.centos.org/?release=8&arch=x86_64&repo=AppStream&infra=stock [Could not resolve host: mirrorlist.centos.org]
Error: Failed to download metadata for repo 'appstream': Cannot prepare internal mirrorlist: Curl error (6): Couldn't resolve host name for http://mirrorlist.centos.org/?release=8&arch=x86_64&repo=AppStream&infra=stock [Could not resolve host: mirrorlist.centos.org]
|
7c0b2940230d98b37f5f815f758cc06f
|
{
"intermediate": 0.3815256357192993,
"beginner": 0.3501274287700653,
"expert": 0.268346905708313
}
|
42,402
|
What are callback functions in JavaScript and how they work. please explain with the help of an example
|
9ee91aeda531e33f5491124de2c81cd4
|
{
"intermediate": 0.29977405071258545,
"beginner": 0.5970683097839355,
"expert": 0.10315768420696259
}
|
42,403
|
help me here, please:
pub fn map_5_prime(exons: &Vec<(u32, u32)>, tx: Vec<&(u32, u32)>) -> FivePrimeMappingInfo {
let mut status = FivePrimeStatus::Complete;
let e = tx[0];
for (i, exon) in exons.iter().enumerate() {
// ahead | behind
if e.0 > exon.1 {
continue;
} else if e.1 < exon.0 {
break;
}
// truncated in exon
if e.0 > exon.0 && e.0 < exon.1 && e.1 > exon.1 {
status = FivePrimeStatus::TruncatedInExon;
break;
}
// if next exon is inside current exon, pass to the next and so on until find two
// non-overlapping exons | then -> evaluate truncated in intron
}
FivePrimeMappingInfo::new(status)
}
I am trying to do: if next exon is inside current exon, pass to the next and so on until find two
// non-overlapping exons
|
0dd10da35f9290401a025b76d06ad2a3
|
{
"intermediate": 0.5730741024017334,
"beginner": 0.2850145995616913,
"expert": 0.14191126823425293
}
|
42,404
|
In MQL5 I am receiving {"ok":true,"result":[{"update_id":397152442,
"message":{"message_id":1,"from":{"id":1256290344,"is_bot":false,"first_name":"\u0418\u0432\u0430\u043d","language_code":"ru"},"chat":{"id":1256290344,"first_name":"\u0418\u0432\u0430\u043d","type":"private"},"date":1710402519,"text":"https://t.me/+GGmqZ_4PKH5kMmVi","entities":[{"offset":0,"length":30,"type":"url"}],"link_preview_options":{"url":"https://t.me/+GGmqZ_4PKH5kMmVi"}}},{"update_id":397152443,
"message":{"message_id":2,"from":{"id":1256290344,"is_bot":false,"first_name":"\u0418\u0432\u0430\u043d","language_code":"ru"},"chat":{"id":1256290344,"first_name":"\u0418\u0432\u0430\u043d","type":"private"},"date":1710402690,"text":"https://t.me/+GGmqZ_4PKH5kMmVi","entities":[{"offset":0,"length":30,"type":"url"}],"link_preview_options":{"is_disabled":true}}},{"update_id":397152444,
"my_chat_member":{"chat":{"id":-1002113042792,"title":"bot_mt5","type":"channel"},"from":{"id":1256290344,"is_bot":false,"first_name":"\u0418\u0432\u0430\u043d","language_code":"en"},"date":1710402807,"old_chat_member":{"user":{"id":7152618530,"is_bot":true,"first_name":"MQL5send","username":"mql5send_bot"},"status":"left"},"new_chat_member":{"user":{"id":7152618530,"is_bot":true,"first_name":"MQL5send","username":"mql5send_bot"},"status":"administrator","can_be_edited":false,"can_manage_chat":true,"can_change_info":true,"can_post_messages":true,"can_edit_messages":true,"can_delete_messages":true,"can_invite_users":true,"can_restrict_members":true,"can_promote_members":false,"can_manage_video_chats":true,"can_post_stories":true,"can_edit_stories":true,"can_delete_stories":true,"is_anonymous":false,"can_manage_voice_chats":true}}},{"update_id":397152445,
"channel_post":{"message_id":7,"sender_chat":{"id":-1002113042792,"title":"bot_mt5","type":"channel"},"chat":{"id":-1002113042792,"title":"bot_mt5","type":"channel"},"date":1710413669,"text":"23432"}},{"update_id":397152446,
"channel_post":{"message_id":8,"sender_chat":{"id":-1002113042792,"title":"bot_mt5","type":"channel"},"chat":{"id":-1002113042792,"title":"bot_mt5","type":"channel"},"date":1710413675,"text":"kl"}},{"update_id":397152447,
"channel_post":{"message_id":9,"sender_chat":{"id":-1002113042792,"title":"bot_mt5","type":"channel"},"chat":{"id":-1002113042792,"title":"bot_mt5","type":"channel"},"date":1710413681,"text":"gfdhturtu"}},{"update_id":397152448,
"channel_post":{"message_id":11,"sender_chat":{"id":-1002113042792,"title":"bot_mt5","type":"channel"},"chat":{"id":-1002113042792,"title":"bot_mt5","type":"channel"},"date":1710413752,"text":"435"}},{"update_id":397152449,
"channel_post":{"message_id":15,"sender_chat":{"id":-1002113042792,"title":"bot_mt5","type":"channel"},"chat":{"id":-1002113042792,"title":"bot_mt5","type":"channel"},"date":1710414025,"text":"\u044b\u0432\u0430\u044b\u0432\u0430\u044b\u0432\u0430\u044b\u0432\u0430"}},{"update_id":397152450,
"channel_post":{"message_id":16,"sender_chat":{"id":-1002113042792,"title":"bot_mt5","type":"channel"},"chat":{"id":-1002113042792,"title":"bot_mt5","type":"channel"},"date":1710414173,"text":"API KEY 7152618530:AAGJJC3zdkmCce3B7i11Dn2JDMh7GqpamyM","entities":[{"offset":8,"length":10,"type":"phone_number"}]}}]} как из этого достать только текст
|
30a3dfbd079454e9d2b56ad40340d31e
|
{
"intermediate": 0.2733408212661743,
"beginner": 0.36784547567367554,
"expert": 0.35881367325782776
}
|
42,405
|
//+------------------------------------------------------------------+
//| send.mq5 |
//| Copyright 2024, MetaQuotes Ltd. |
//| https://www.mql5.com |
//+------------------------------------------------------------------+
#property copyright "Copyright 2024, MetaQuotes Ltd."
#property link "https://www.mql5.com"
#property version "1.00"
//+------------------------------------------------------------------+
//| Script program start function |
//+------------------------------------------------------------------+
void OnStart()
{
//---
string ID = "-1002113042792";
string token = "7152618530:AAGJJC3zdkmCce3B7i11Dn2JDMh7GqpamyM";
Print(getMessage(textID,token));
}
//+------------------------------------------------------------------+
//| |
//+------------------------------------------------------------------+
int getMessage(string chatID, string botToken)
{
string baseUrl = "https://api.telegram.org";
string headers = "";
string requestURL = "";
string requestHeaders = "";
char resultData[];
char posData[];
int timeout = 200;
requestURL = StringFormat("%s/bot%s/getUpdates?chat_id=%s&text=%s",baseUrl,botToken,chatID);
int response = WebRequest("GET",requestURL,headers,timeout,posData,resultData,requestHeaders);
string resultMessage = CharArrayToString(resultData);
Print(resultMessage["result"]);
return response;
}
//+------------------------------------------------------------------+
Fix this.
|
1957cbd330230a11bb43e925f0c25157
|
{
"intermediate": 0.32350122928619385,
"beginner": 0.4748746454715729,
"expert": 0.20162414014339447
}
|
42,406
|
hi
|
994a77980d199219e37108fb1d16ebb3
|
{
"intermediate": 0.3246487081050873,
"beginner": 0.27135494351387024,
"expert": 0.40399640798568726
}
|
42,407
|
(bianshen) G:\spzz\wav2lip-studio.v0.2.1>python wav2lip_studio.py
Traceback (most recent call last):
File "G:\spzz\wav2lip-studio.v0.2.1\wav2lip_studio.py", line 5, in <module>
from scripts.ui import on_ui_tabs
File "G:\spzz\wav2lip-studio.v0.2.1\scripts\ui.py", line 18, in <module>
from scripts.wav2lip.w2l import W2l
File "G:\spzz\wav2lip-studio.v0.2.1\scripts\wav2lip\w2l.py", line 17, in <module>
from imutils import face_utils
ModuleNotFoundError: No module named 'imutils'
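The traceback just means the `imutils` package is missing from the active (bianshen) environment; the usual fix is `pip install imutils` run with that environment's interpreter. A hedged sketch of building and running that command from Python itself, so it targets the right environment:

```python
import subprocess
import sys

def pip_install_cmd(name):
    # Build the install command for the interpreter currently running,
    # so the package lands in the same environment as this script.
    return [sys.executable, "-m", "pip", "install", name]

def ensure_package(name):
    # Run pip; raises CalledProcessError if the install fails.
    subprocess.check_call(pip_install_cmd(name))

# ensure_package("imutils")  # uncomment to actually install
print(pip_install_cmd("imutils")[-3:])  # ['pip', 'install', 'imutils']
```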
|
10c64a6ef75bbd7136b0439f521fca62
|
{
"intermediate": 0.42124590277671814,
"beginner": 0.35273295640945435,
"expert": 0.2260211557149887
}
|
42,408
|
scale_factors = np.array([0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.05, 0.05, 0.02])
deltas = np.multiply(raw_actions, [W_max - W_min, L_max - L_min] * 5 + [Io_max - Io_min, Cp_max - Cp_min, Vc_max - Vc_min])
I need to compute deltas * scale_factors. How do I perform this computation?
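Assuming `deltas` ends up with 13 elements like `scale_factors` (five W/L pairs plus Io, Cp, Vc), the computation is plain elementwise multiplication; a minimal sketch with dummy stand-in values:

```python
import numpy as np

scale_factors = np.array([0.1] * 10 + [0.05, 0.05, 0.02])
# Stand-in for the bounds-scaled action deltas; 13 values as in the question.
deltas = np.arange(13, dtype=float)

scaled = deltas * scale_factors          # elementwise product
# Equivalent: np.multiply(deltas, scale_factors)
print(scaled.shape)  # (13,)
```

If the shapes ever disagree (e.g. `raw_actions` is 2-D), NumPy broadcasting rules decide whether the product is valid; matching trailing dimensions are required.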
|
8ba111088f829b7b3b3678aa6c421c2f
|
{
"intermediate": 0.31902438402175903,
"beginner": 0.2619200050830841,
"expert": 0.41905561089515686
}
|
42,409
|
I have code to map bounding boxes from JSON onto an image using OCR Textract output in CSV format. But I am struggling to get correct bounding boxes for multi-token entities. A multi-token entity should be in one bounding box. I want a good clustering approach to address this issue, and a better solution to match the entity tokens from the dataframe.
Please modify my code according to my requirements. The code should avoid taking values from anywhere in the dataframe; it should always take the nearest word to the other token in the case of a multi-token entity. The code should be robust.
import json
import pandas as pd
import cv2
from thefuzz import fuzz
from thefuzz import process
import numpy as np
def read_textract_output(csv_path):
return pd.read_csv(csv_path)
def read_json_entities(json_path):
with open(json_path, 'r') as file:
return json.load(file)
def find_entity_bounding_boxes(entity_text, textract_df, image_size):
entity_tokens = entity_text.split()
results = pd.DataFrame()
for token in entity_tokens:
choices = textract_df["text"].dropna().tolist()
best_match, score = process.extractOne(token, choices, scorer=fuzz.token_sort_ratio)
if score > 70:
best_matches_df = textract_df[textract_df["text"] == best_match]
results = pd.concat([results, best_matches_df])
bounding_boxes = []
try:
# Sort by line and word number to cluster words that are on the same line
sorted_results = results.sort_values(by=["line_num", "word_num"])
# Group the resulting bounding boxes by line number
grouped_results = sorted_results.groupby("line_num")
for _, group in grouped_results:
# Calculate scaling factors based on image size and original image dimensions
image_width, image_height = group.iloc[0]["image_width"], group.iloc[0]["image_height"]
scale_x = image_size[0] / image_width
scale_y = image_size[1] / image_height
# Calculate the bounding box for the whole line
min_left = np.min(group["left"])
min_top = np.min(group["top"])
max_right = np.max(group["left"] + group["width"])
max_bottom = np.max(group["top"] + group["height"])
bbox = (min_left * scale_x, min_top * scale_y, (max_right - min_left) * scale_x, (max_bottom - min_top) * scale_y)
bounding_boxes.append(bbox)
except Exception as e:
print(f"An error occurred: {e}")
return bounding_boxes
# def draw_bounding_boxes(image_path, entities, textract_df):
# image = cv2.imread(image_path)
# image_size = image.shape[1], image.shape[0]
# for category, details in entities.items():
# if category == "invoice_details" or category == "Payment Details" or category == "amounts_and_tax":
# for entity, value in details.items():
# if value:
# bounding_boxes = find_entity_bounding_boxes(value, textract_df, image_size)
# try:
# for bbox in bounding_boxes:
# x, y, w, h = map(int, bbox)
# cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)
# except:
# pass
# else:
# pass
def draw_bounding_boxes(image_path, entities, textract_df):
image = cv2.imread(image_path)
image_size = image.shape[1], image.shape[0]
font = cv2.FONT_HERSHEY_SIMPLEX
font_scale = 0.5
font_color = (0, 0, 0)
line_type = 2
padding = 5 # Padding between the start of bbox and text
for category, details in entities.items():
if category == "invoice_details" or category == "Payment Details" or category == "amounts_and_tax":
for entity, value in details.items():
if value:
bounding_boxes = find_entity_bounding_boxes(value, textract_df, image_size)
try:
for bbox in bounding_boxes:
x, y, w, h = map(int, bbox)
cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)
# Text placement calculation
text_position = (x, y - padding) if y - padding > 10 else (x, y + h + 20)
# Drawing the text
cv2.putText(image, entity, text_position, font, font_scale, font_color, line_type)
except:
pass
else:
pass
cv2.imwrite('/home/ritik1s/Desktop/bbox_issues/temp_GPT/annotated_invoice.jpg', image)
IMAGE_PATH = '/home/ritik1s/Desktop/bbox_issues/temp_GPT/check.jpeg'
CSV_PATH = "/home/ritik1s/Desktop/bbox_issues/temp_GPT/check.csv"
JSON_PATH = "/home/ritik1s/Desktop/bbox_issues/temp_GPT/row_skip.json"
# Read Textract output and JSON entities
textract_df = read_textract_output(CSV_PATH)
entities = read_json_entities(JSON_PATH)
# Draw the bounding boxes
draw_bounding_boxes(IMAGE_PATH, entities, textract_df)
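One way to keep a multi-token entity in a single box, as a rough sketch independent of the code above (function names are hypothetical): anchor on the first matched token, pick each subsequent token's candidate by spatial proximity to the previous match, then union the individual boxes into one.

```python
def union_box(boxes):
    # boxes: list of (left, top, width, height); return one box covering all.
    lefts = [b[0] for b in boxes]
    tops = [b[1] for b in boxes]
    rights = [b[0] + b[2] for b in boxes]
    bottoms = [b[1] + b[3] for b in boxes]
    left, top = min(lefts), min(tops)
    return (left, top, max(rights) - left, max(bottoms) - top)

def nearest_candidate(candidates, anchor):
    # Pick the candidate box whose top-left corner is closest to the anchor's,
    # so later tokens cannot jump to a similar word elsewhere on the page.
    def dist(b):
        return (b[0] - anchor[0]) ** 2 + (b[1] - anchor[1]) ** 2
    return min(candidates, key=dist)

print(union_box([(0, 0, 10, 10), (20, 0, 10, 10)]))  # (0, 0, 30, 10)
```

Plugged into `find_entity_bounding_boxes`, this would replace the global `process.extractOne` pick for tokens after the first with a `nearest_candidate` pick among fuzzy matches, then `union_box` over the chosen boxes.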
|
b943eb973ba96bfd5b93ab22b0c60d7f
|
{
"intermediate": 0.4391675889492035,
"beginner": 0.2570965588092804,
"expert": 0.3037358522415161
}
|
42,410
|
I have code to map bounding boxes from JSON onto an image using OCR Textract output in CSV format. But I am struggling to get correct bounding boxes for multi-token entities. A multi-token entity should be in one bounding box. I want a good clustering approach to address this issue, and a better solution to match the entity tokens from the dataframe.
Please modify my code according to my requirements. The code should avoid taking values from anywhere in the dataframe; it should always take the nearest word to the other token in the case of a multi-token entity. The code should be robust.
You have to first check the value of the entity and then find its nearest word before printing the bounding box.
import json
import pandas as pd
import cv2
from thefuzz import fuzz
from thefuzz import process
import numpy as np
def read_textract_output(csv_path):
return pd.read_csv(csv_path)
def read_json_entities(json_path):
with open(json_path, 'r') as file:
return json.load(file)
def find_entity_bounding_boxes(entity_text, textract_df, image_size):
entity_tokens = entity_text.split()
results = pd.DataFrame()
for token in entity_tokens:
choices = textract_df["text"].dropna().tolist()
best_match, score = process.extractOne(token, choices, scorer=fuzz.token_sort_ratio)
if score > 70:
best_matches_df = textract_df[textract_df["text"] == best_match]
results = pd.concat([results, best_matches_df])
bounding_boxes = []
try:
# Sort by line and word number to cluster words that are on the same line
sorted_results = results.sort_values(by=["line_num", "word_num"])
# Group the resulting bounding boxes by line number
grouped_results = sorted_results.groupby("line_num")
for _, group in grouped_results:
# Calculate scaling factors based on image size and original image dimensions
image_width, image_height = group.iloc[0]["image_width"], group.iloc[0]["image_height"]
scale_x = image_size[0] / image_width
scale_y = image_size[1] / image_height
# Calculate the bounding box for the whole line
min_left = np.min(group["left"])
min_top = np.min(group["top"])
max_right = np.max(group["left"] + group["width"])
max_bottom = np.max(group["top"] + group["height"])
bbox = (min_left * scale_x, min_top * scale_y, (max_right - min_left) * scale_x, (max_bottom - min_top) * scale_y)
bounding_boxes.append(bbox)
except Exception as e:
print(f"An error occurred: {e}")
return bounding_boxes
# def draw_bounding_boxes(image_path, entities, textract_df):
# image = cv2.imread(image_path)
# image_size = image.shape[1], image.shape[0]
# for category, details in entities.items():
# if category == "invoice_details" or category == "Payment Details" or category == "amounts_and_tax":
# for entity, value in details.items():
# if value:
# bounding_boxes = find_entity_bounding_boxes(value, textract_df, image_size)
# try:
# for bbox in bounding_boxes:
# x, y, w, h = map(int, bbox)
# cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)
# except:
# pass
# else:
# pass
def draw_bounding_boxes(image_path, entities, textract_df):
image = cv2.imread(image_path)
image_size = image.shape[1], image.shape[0]
font = cv2.FONT_HERSHEY_SIMPLEX
font_scale = 0.5
font_color = (0, 0, 0)
line_type = 2
padding = 5 # Padding between the start of bbox and text
for category, details in entities.items():
if category == "invoice_details" or category == "Payment Details" or category == "amounts_and_tax":
for entity, value in details.items():
if value:
bounding_boxes = find_entity_bounding_boxes(value, textract_df, image_size)
try:
for bbox in bounding_boxes:
x, y, w, h = map(int, bbox)
cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)
# Text placement calculation
text_position = (x, y - padding) if y - padding > 10 else (x, y + h + 20)
# Drawing the text
cv2.putText(image, entity, text_position, font, font_scale, font_color, line_type)
except:
pass
else:
pass
cv2.imwrite('/home/ritik1s/Desktop/bbox_issues/temp_GPT/annotated_invoice.jpg', image)
IMAGE_PATH = '/home/ritik1s/Desktop/bbox_issues/temp_GPT/check.jpeg'
CSV_PATH = "/home/ritik1s/Desktop/bbox_issues/temp_GPT/check.csv"
JSON_PATH = "/home/ritik1s/Desktop/bbox_issues/temp_GPT/row_skip.json"
# Read Textract output and JSON entities
textract_df = read_textract_output(CSV_PATH)
entities = read_json_entities(JSON_PATH)
# Draw the bounding boxes
draw_bounding_boxes(IMAGE_PATH, entities, textract_df)
|
a88151eb979648e30de7eabe713b10cf
|
{
"intermediate": 0.4331812858581543,
"beginner": 0.330917626619339,
"expert": 0.2359011024236679
}
|
42,411
|
I have code to map bounding boxes from JSON onto an image using OCR Textract output in CSV format. But I am struggling to get correct bounding boxes for multi-token entities. A multi-token entity should be in one bounding box. I want a good clustering approach to address this issue, and a better solution to match the entity tokens from the dataframe.
Please modify my code according to my requirements. The code should avoid taking values from anywhere in the dataframe; it should always take the nearest word to the other token in the case of a multi-token entity. The code should be robust.
You have to first check the value of the entity and then find its nearest word before printing the bounding box.
Write separate functions for single-token and multi-token entities; a multi-token entity should come out as one single bounding box. Calculate distances before aggregating bounding box values.
|
8115aeb35779110183df322e534ea20c
|
{
"intermediate": 0.36320987343788147,
"beginner": 0.40418753027915955,
"expert": 0.23260259628295898
}
|
42,412
|
With the above page finished, it's time to move on. I am building a page that is supposed to use bootstrap's cards to show up to 20 photos per page as clickable items. It is supposed to retrieve whatever photos are in the database and display that many items as a paginated view.
Here is some more code I am working with.
(ns jimmystore.ajax
(:require
[ajax.core :as ajax]
[luminus-transit.time :as time]
[cognitect.transit :as transit]
[re-frame.core :as rf]))
(defn local-uri? [{:keys [uri]}]
(not (re-find #"^\w+?://" uri)))
(defn default-headers [request]
(if (local-uri? request)
(-> request
(update :headers #(merge {"x-csrf-token" js/csrfToken} %)))
request))
;; injects transit serialization config into request options
(defn as-transit [opts]
(merge {:format (ajax/transit-request-format
{:writer (transit/writer :json time/time-serialization-handlers)})
:response-format (ajax/transit-response-format
{:reader (transit/reader :json time/time-deserialization-handlers)})}
opts))
(defn load-interceptors! []
(swap! ajax/default-interceptors
conj
(ajax/to-interceptor {:name "default headers"
:request default-headers})))
(ns jimmystore.events
(:require
[re-frame.core :as rf]
[ajax.core :as ajax]
[reitit.frontend.easy :as rfe]
[reitit.frontend.controllers :as rfc]))
;;dispatchers
(rf/reg-event-db
:common/navigate
(fn [db [_ match]]
(def foo match)
(let [old-match (:common/route db)
new-match (assoc match :controllers
(rfc/apply-controllers (:controllers old-match) match))]
(assoc db :common/route new-match))))
(rf/reg-fx
:common/navigate-fx!
(fn [[k & [params query]]]
(rfe/push-state k params query)))
(rf/reg-event-fx
:common/navigate!
(fn [_ [_ url-key params query]]
{:common/navigate-fx! [url-key params query]}))
(rf/reg-event-db :set-docs
(fn [db [_ docs]]
(assoc db :docs docs)))
(rf/reg-event-fx
:fetch-docs
(fn [_ _]
{:http-xhrio {:method :get
:uri "/docs"
:response-format (ajax/raw-response-format)
:on-success [:set-docs]}}))
(rf/reg-event-fx :test-get-api
(fn [_ _]
{:http-xhrio {:method :get
:uri "/test-get"
:response-format (ajax/json-response-format {:keywords? true})
:on-success [:set-docs]}}))
(rf/reg-event-fx :test-post-api
(fn [_ _]
{:http-xhrio {:method :post
:uri "/test-post"
:params {:test-post {:data 1 :foo :bar 2 "ASDASD"}}
:format (ajax/json-request-format)
:response-format (ajax/json-response-format {:keywords? true})
:on-success [:no-op]}}))
(rf/reg-event-fx :test-upload-file
(fn [{:keys [db]} [_ reference-id type]]
(let [form-data (js/FormData.)
{:keys [file size]} (get db :upload)]
(.append form-data "file" file)
(.append form-data "reference-id" reference-id)
(.append form-data "type" (name type)) ; Keywords are converted to strings
{:db (assoc-in db [:api-service :block-ui] true)
:http-xhrio {:method :post
:uri "/test-upload"
:timeout 60000
:body form-data
:response-format (ajax/json-response-format {:keywords? true})
:on-success [:no-op]
:on-failure [:no-op]}})))
(defn- get-file-size [file]
(.-size file))
(defn- get-file-name [file]
(.-name file))
(defn- get-file-type [file]
(.-type file))
(rf/reg-event-db :set-file-to-upload
(fn [db [_ file]] ;; Local url (for previews etc.)
(assoc db :upload {:object-url (js/window.webkitURL.createObjectURL file)
:file file
:size (get-file-size file)
:name (get-file-name file)
:type (get-file-type file)})))
(rf/reg-sub :upload
(fn [db _]
(-> db :upload)))
(rf/reg-event-db
:common/set-error
(fn [db [_ error]]
(assoc db :common/error error)))
(rf/reg-event-fx
:page/init-home
(fn [_ _]
{:dispatch [:fetch-docs]}))
(rf/reg-event-db
:no-op
(fn [db _]
db))
;;subscriptions
(rf/reg-sub
:common/route
(fn [db _]
(-> db :common/route)))
(rf/reg-sub
:common/page-id
:<- [:common/route]
(fn [route _]
(-> route :data :name)))
(rf/reg-sub
:common/page
:<- [:common/route]
(fn [route _]
(-> route :data :view)))
(rf/reg-sub
:docs
(fn [db _]
(:docs db)))
(rf/reg-sub
:common/error
(fn [db _]
(:common/error db)))
(ns jimmystore.core
(:require
[day8.re-frame.http-fx]
[reagent.dom :as rdom]
[reagent.core :as r]
[re-frame.core :as rf]
[goog.events :as events]
[goog.history.EventType :as HistoryEventType]
[markdown.core :refer [md->html]]
[jimmystore.ajax :as ajax]
[jimmystore.events]
[reitit.core :as reitit]
[reitit.frontend.easy :as rfe]
[jimmystore.page-handler :as page-handler])
(:import goog.History))
(defn nav-link [uri title page]
[:a.navbar-item
{:href uri
:class (when (= page @(rf/subscribe [:common/page-id])) :is-active)}
title])
(defn navbar-item [key name]
[:li {:class "nav-item"}
[:a {:class "nav-link"
:aria-current :page
:href (rfe/href key)}
name]])
(defn navbar []
[:header {:class "bg-body-secondary"}
[:nav {:class "navbar navbar-expand-sm container-sm"}
[:div {:class "container-fluid"}
[:a {:class "navbar-brand" :href "#"} "Jimmy Store :3"]
[:button.navbar-toggler {:type :button
:data-bs-toggle :collapse
:data-bs-target "#navbarSupportedContent"
:aria-controls "navbarSupportedContent"
:aria-expanded false
:aria-label "Toggle navigation"}
[:span.navbar-toggler-icon]]
[:div {:class "collapse navbar-collapse" :id "navbarSupportedContent"}
[:ul {:class "navbar-nav
me-auto
mb-2
mb-lg-0"}
(navbar-item :home "Home")
(navbar-item :about "About")
(navbar-item :photo "Photo Print")
(navbar-item :test-page "Test Page")]]]]])
(defn page []
(if-let [page @(rf/subscribe [:common/page])]
[:div
[navbar]
[page]]
#_(rfe/push-state :home)))
(defn navigate! [match _]
(rf/dispatch [:common/navigate match]))
(def router (reitit/router page-handler/pages))
(defn start-router! []
(rfe/start!
router
navigate!
{}))
;; -------------------------
;; Initialize app
(defn ^:dev/after-load mount-components []
(rf/clear-subscription-cache!)
(rdom/render [#'page] (.getElementById js/document "app")))
(defn init! []
(start-router!)
(ajax/load-interceptors!)
(mount-components))
And the start of the page I'm building.
(ns jimmystore.pages.photo-page
(:require
[reagent.core :as r]
[re-frame.core :as rf]))
(defn photos-element []
)
(defn page []
[:div.container
[:h3 "Select a photo to print."]
[photos-element]])
(def page-info {:page-id :photo
:view #'page})
|
eafe228557000aa23da115127da03181
|
{
"intermediate": 0.3125537931919098,
"beginner": 0.4817443788051605,
"expert": 0.20570184290409088
}
|
42,413
|
I am building a page in ClojureScript that uses re-frame and reagent. It is supposed to use bootstrap's cards to show up to 20 photos per page as clickable items. It is supposed to retrieve whatever photos are in the database and display that many items as a paginated view.
Here is some more code I am working with.
(ns jimmystore.ajax
(:require
[ajax.core :as ajax]
[luminus-transit.time :as time]
[cognitect.transit :as transit]
[re-frame.core :as rf]))
(defn local-uri? [{:keys [uri]}]
(not (re-find #"^\w+?://" uri)))
(defn default-headers [request]
(if (local-uri? request)
(-> request
(update :headers #(merge {"x-csrf-token" js/csrfToken} %)))
request))
;; injects transit serialization config into request options
(defn as-transit [opts]
(merge {:format (ajax/transit-request-format
{:writer (transit/writer :json time/time-serialization-handlers)})
:response-format (ajax/transit-response-format
{:reader (transit/reader :json time/time-deserialization-handlers)})}
opts))
(defn load-interceptors! []
(swap! ajax/default-interceptors
conj
(ajax/to-interceptor {:name "default headers"
:request default-headers})))
(ns jimmystore.events
(:require
[re-frame.core :as rf]
[ajax.core :as ajax]
[reitit.frontend.easy :as rfe]
[reitit.frontend.controllers :as rfc]))
;;dispatchers
(rf/reg-event-db
:common/navigate
(fn [db [_ match]]
(def foo match)
(let [old-match (:common/route db)
new-match (assoc match :controllers
(rfc/apply-controllers (:controllers old-match) match))]
(assoc db :common/route new-match))))
(rf/reg-fx
:common/navigate-fx!
(fn [[k & [params query]]]
(rfe/push-state k params query)))
(rf/reg-event-fx
:common/navigate!
(fn [_ [_ url-key params query]]
{:common/navigate-fx! [url-key params query]}))
(rf/reg-event-db :set-docs
(fn [db [_ docs]]
(assoc db :docs docs)))
(rf/reg-event-fx
:fetch-docs
(fn [_ _]
{:http-xhrio {:method :get
:uri "/docs"
:response-format (ajax/raw-response-format)
:on-success [:set-docs]}}))
(rf/reg-event-fx :test-get-api
(fn [_ _]
{:http-xhrio {:method :get
:uri "/test-get"
:response-format (ajax/json-response-format {:keywords? true})
:on-success [:set-docs]}}))
(rf/reg-event-fx :test-post-api
(fn [_ _]
{:http-xhrio {:method :post
:uri "/test-post"
:params {:test-post {:data 1 :foo :bar 2 "ASDASD"}}
:format (ajax/json-request-format)
:response-format (ajax/json-response-format {:keywords? true})
:on-success [:no-op]}}))
(rf/reg-event-fx :test-upload-file
(fn [{:keys [db]} [_ reference-id type]]
(let [form-data (js/FormData.)
{:keys [file size]} (get db :upload)]
(.append form-data "file" file)
(.append form-data "reference-id" reference-id)
(.append form-data "type" (name type)) ; Keywords are converted to strings
{:db (assoc-in db [:api-service :block-ui] true)
:http-xhrio {:method :post
:uri "/test-upload"
:timeout 60000
:body form-data
:response-format (ajax/json-response-format {:keywords? true})
:on-success [:no-op]
:on-failure [:no-op]}})))
(defn- get-file-size [file]
(.-size file))
(defn- get-file-name [file]
(.-name file))
(defn- get-file-type [file]
(.-type file))
(rf/reg-event-db :set-file-to-upload
(fn [db [_ file]] ;; Local url (for previews etc.)
(assoc db :upload {:object-url (js/window.webkitURL.createObjectURL file)
:file file
:size (get-file-size file)
:name (get-file-name file)
:type (get-file-type file)})))
(rf/reg-sub :upload
(fn [db _]
(-> db :upload)))
(rf/reg-event-db
:common/set-error
(fn [db [_ error]]
(assoc db :common/error error)))
(rf/reg-event-fx
:page/init-home
(fn [_ _]
{:dispatch [:fetch-docs]}))
(rf/reg-event-db
:no-op
(fn [db _]
db))
;;subscriptions
(rf/reg-sub
:common/route
(fn [db _]
(-> db :common/route)))
(rf/reg-sub
:common/page-id
:<- [:common/route]
(fn [route _]
(-> route :data :name)))
(rf/reg-sub
:common/page
:<- [:common/route]
(fn [route _]
(-> route :data :view)))
(rf/reg-sub
:docs
(fn [db _]
(:docs db)))
(rf/reg-sub
:common/error
(fn [db _]
(:common/error db)))
(ns jimmystore.core
(:require
[day8.re-frame.http-fx]
[reagent.dom :as rdom]
[reagent.core :as r]
[re-frame.core :as rf]
[goog.events :as events]
[goog.history.EventType :as HistoryEventType]
[markdown.core :refer [md->html]]
[jimmystore.ajax :as ajax]
[jimmystore.events]
[reitit.core :as reitit]
[reitit.frontend.easy :as rfe]
[jimmystore.page-handler :as page-handler])
(:import goog.History))
(defn nav-link [uri title page]
[:a.navbar-item
{:href uri
:class (when (= page @(rf/subscribe [:common/page-id])) :is-active)}
title])
(defn navbar-item [key name]
[:li {:class "nav-item"}
[:a {:class "nav-link"
:aria-current :page
:href (rfe/href key)}
name]])
(defn navbar []
[:header {:class "bg-body-secondary"}
[:nav {:class "navbar navbar-expand-sm container-sm"}
[:div {:class "container-fluid"}
[:a {:class "navbar-brand" :href "#"} "Jimmy Store :3"]
[:button.navbar-toggler {:type :button
:data-bs-toggle :collapse
:data-bs-target "#navbarSupportedContent"
:aria-controls "navbarSupportedContent"
:aria-expanded false
:aria-label "Toggle navigation"}
[:span.navbar-toggler-icon]]
[:div {:class "collapse navbar-collapse" :id "navbarSupportedContent"}
[:ul {:class "navbar-nav
me-auto
mb-2
mb-lg-0"}
(navbar-item :home "Home")
(navbar-item :about "About")
(navbar-item :photo "Photo Print")
(navbar-item :test-page "Test Page")]]]]])
(defn page []
(if-let [page @(rf/subscribe [:common/page])]
[:div
[navbar]
[page]]
#_(rfe/push-state :home)))
(defn navigate! [match _]
(rf/dispatch [:common/navigate match]))
(def router (reitit/router page-handler/pages))
(defn start-router! []
(rfe/start!
router
navigate!
{}))
;; -------------------------
;; Initialize app
(defn ^:dev/after-load mount-components []
(rf/clear-subscription-cache!)
(rdom/render [#'page] (.getElementById js/document "app")))
(defn init! []
(start-router!)
(ajax/load-interceptors!)
(mount-components))
And the start of the page I'm building.
(ns jimmystore.pages.photo-page
(:require
[reagent.core :as r]
[re-frame.core :as rf]))
(defn photos-element []
)
(defn page []
[:div.container
[:h3 "Select a photo to print."]
[photos-element]])
(def page-info {:page-id :photo
:view #'page})
|
878e69708301cbc0650e0515ca1d6fe8
|
{
"intermediate": 0.40119752287864685,
"beginner": 0.37262293696403503,
"expert": 0.2261795550584793
}
|
42,414
|
async function updateOnMaster(postObject, transaction) {
try {
let condition = { record_id: postObject.master_record_id, custId: postObject.custId }
const record = await db.tbbudmst.findOne({ where: { ...condition }, attributes: ['revised_amount'], transaction })
let paramData = {};
if (postObject.source_code == 'Revision') {
paramData = {
revision_amount_in: postObject.transaction_amount,
revised_amount: parseFloat(record.revised_amount) + parseFloat(postObject.transaction_amount),
revision_date: postObject.createOn,
}
} else {
paramData = {
adjust_amount_in: postObject.transaction_amount,
adjust_amount_out: postObject.transaction_amount,
adjust_amount: parseFloat(record.adjust_amount_in) + parseFloat(postObject.transaction_amount),
adjust_date: postObject.createOn,
}
}
let [data] = await db.tbbudmst.update(paramData, { where: condition, transaction });
if (!data) throw new Error("Error on updating master data");
} catch (error) {
util.handleResponse(1, false, [], error.message, callback);
}
}
optimize the above function
|
b1f5dbfa88716de4e5d1399d359ad3a6
|
{
"intermediate": 0.41596826910972595,
"beginner": 0.24789200723171234,
"expert": 0.3361397683620453
}
|
42,415
|
Hi there
|
377959d7003cafbdabe3eb218d9a45e2
|
{
"intermediate": 0.32728445529937744,
"beginner": 0.24503648281097412,
"expert": 0.42767903208732605
}
|
42,416
|
print(round(zpref.mean() / zpkeep.mean(), 2))
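Assuming `zpref` and `zpkeep` are pandas Series (or NumPy arrays), the line prints the ratio of their means rounded to two decimal places; a self-contained sketch with dummy data:

```python
import pandas as pd

zpref = pd.Series([2.0, 4.0, 6.0])   # mean 4.0 (dummy values)
zpkeep = pd.Series([1.0, 2.0, 3.0])  # mean 2.0 (dummy values)

ratio = round(zpref.mean() / zpkeep.mean(), 2)
print(ratio)  # 2.0
```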
|
ef679ec68cd7db994f017899845fbf26
|
{
"intermediate": 0.2506967782974243,
"beginner": 0.36972928047180176,
"expert": 0.37957391142845154
}
|
42,417
|
In Chinese
stderr:
ERROR: Ignored the following versions that require a different python version: 0.0.10.2 Requires-Python >=3.6.0, <3.9; 0.0.10.3 Requires-Python >=3.6.0, <3.9; 0.0.11 Requires-Python >=3.6.0, <3.9; 0.0.12 Requires-Python >=3.6.0, <3.9; 0.0.13.1 Requires-Python >=3.6.0, <3.9; 0.0.13.2 Requires-Python >=3.6.0, <3.9; 0.0.14.1 Requires-Python >=3.6.0, <3.9; 0.0.15 Requires-Python >=3.6.0, <3.9; 0.0.15.1 Requires-Python >=3.6.0, <3.9; 0.0.9 Requires-Python >=3.6.0, <3.9; 0.0.9.1 Requires-Python >=3.6.0, <3.9; 0.0.9.2 Requires-Python >=3.6.0, <3.9; 0.0.9a10 Requires-Python >=3.6.0, <3.9; 0.0.9a9 Requires-Python >=3.6.0, <3.9; 0.52.0 Requires-Python >=3.6,<3.9; 0.52.0rc3 Requires-Python >=3.6,<3.9
ERROR: Could not find a version that satisfies the requirement tb-nightly (from gfpgan) (from versions: none)
ERROR: No matching distribution found for tb-nightly
[notice] A new release of pip is available: 23.0.1 -> 24.0
[notice] To update, run: python.exe -m pip install --upgrade pip
|
ae7fa649a9bdeb39a6803b04d94cd0cb
|
{
"intermediate": 0.3747827112674713,
"beginner": 0.3301914632320404,
"expert": 0.29502585530281067
}
|
42,418
|
Create an eBay clone in Python
|
95f77894430ac1abf27838408adb91fe
|
{
"intermediate": 0.36391353607177734,
"beginner": 0.2195633500814438,
"expert": 0.4165230393409729
}
|
42,419
|
Is there a way of amending the VBA code below to do the following:
If the last active sheet was 'PREQUEST' then use this line in the code 'filePath = ThisWorkbook.Path & "\zzzz ServProvDocs\PurchaseRequest.docm" 'New docm for this code call'
Else use this line 'filePath = ThisWorkbook.Path & "\zzzz ServProvDocs\TaskRequest.docm" 'Original docm for this code call
Public Sub TaskRequest()
'USED TO OPEN WORD DOC FOR TASK REQUEST - DO NOT DELETE
If Not Module9.sheetJobRequestActive Then Exit Sub ' if TRUE it allows the module to run
'USED TO OPEN WORD DOC FOR TASK REQUEST - DO NOT DELETE
Dim wdApp As Object
Dim wdDoc As Object
Dim WsdShell As Object
Set wdApp = CreateObject("Word.Application")
Dim filePath As String
'filePath = ThisWorkbook.Path & "\zzzz ServProvDocs\TaskRequest.docm" 'Original docm for this code call
filePath = ThisWorkbook.Path & "\zzzz ServProvDocs\PurchaseRequest.docm" 'New docm for this code call
Set wdDoc = wdApp.Documents.Open(filePath)
wdApp.Visible = True
'wordApp.Visible = True
'Application.Wait (Now + TimeValue("0:00:01"))
'doc.Activate
Set WsdShell = CreateObject("WScript.Shell")
WsdShell.AppActivate wdApp.Caption
End Sub
|
f163fda7b16c1d6321b88fc54b0b5fb2
|
{
"intermediate": 0.537050187587738,
"beginner": 0.30301305651664734,
"expert": 0.15993674099445343
}
|
42,420
|
create a Gradio chat interface where there is a text box for writing the query, a textbox where intermediate results from the query can be shown, and a chat window where the query and the results processed from the intermediate-results window can be shown, in Gradio
|
42c1c9f0acbba99edf0b062f543e5c50
|
{
"intermediate": 0.3834381699562073,
"beginner": 0.1653442531824112,
"expert": 0.45121756196022034
}
|
42,421
|
Complete the code import pandas as pd
df = pd.read_csv('StudentsPerformance.csv')
def filter(str_data):
if str_data[7] + str_data[6] + str_data[5] > 150:
return 'good'
else:
return 'bad'
df['new_column'] = df.apply(lambda row: filter(row), axis=1)
df
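The positional indices in `filter(row)` above are fragile (they break whenever column order changes, and `filter` shadows a builtin). A minimal sketch of the likely intent, summing the three score columns by name; the column names and sample values below are assumptions based on the usual StudentsPerformance.csv layout, not taken from the actual file:

```python
import pandas as pd

# Hypothetical sample mirroring the score columns of StudentsPerformance.csv
df = pd.DataFrame({
    "math score": [70, 40],
    "reading score": [80, 50],
    "writing score": [75, 45],
})

def label(row):
    # Sum the three score columns by name instead of positional indices
    total = row["math score"] + row["reading score"] + row["writing score"]
    return "good" if total > 150 else "bad"

df["new_column"] = df.apply(label, axis=1)
print(df["new_column"].tolist())  # ['good', 'bad']
```

Using column names also keeps the threshold rule readable: 70+80+75 = 225 is "good", 40+50+45 = 135 is "bad".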
|
5755738847fc8283de393b3825e05953
|
{
"intermediate": 0.36280179023742676,
"beginner": 0.36558249592781067,
"expert": 0.27161574363708496
}
|
42,422
|
I'm building a store website with clojure, reagent, react, and bootstrap. Here is the code for a page.
(ns jimmystore.pages.photo-page
(:require
[reagent.core :as r]
[re-frame.core :as rf]))
(def dummy-photo-data {:path "/img/atsui.jpg"
:title "test"})
(defn photo-card [photo]
[:div {:class "card
col-lg-2
col-sm-2"}
[:img {:src (:path photo)
:class "card-img-top"}]
[:div.card-body
[:h5.card-title (:title photo)]]])
(defn photos-element []
[:div {:class "container
d-flex
flex-wrap
align-items-center
justify-content-center"}
(repeat 18 (photo-card dummy-photo-data))
; TODO: logic for inserting objects from backend
])
(defn page []
[:div.container
[:h3 "Select a photo."]
[photos-element]])
(def page-info {:page-id :photo
:view #'page})
I want the photo cards to have a bit of spacing between them. How would I best accomplish this with bootstrap?
|
00d212f93752bcf31cc0ddf867515077
|
{
"intermediate": 0.7405539751052856,
"beginner": 0.20264117419719696,
"expert": 0.0568048395216465
}
|
42,423
|
I have code to map bounding boxes from JSON onto an image using OCR Textract output in CSV format, but I am struggling to get correct bounding boxes for multi-token entities. A multi-token entity should be in one bounding box. I want a good clustering approach to address this issue, and a better way to match the entity tokens against the dataframe.
Please modify my code according to my requirements. The code should avoid taking values from anywhere in the dataframe; in the case of a multi-token entity it should always take the word nearest to the other token. The code should be robust.
First check the value of the entity, then find its nearest word before printing the bounding box.
Write separate functions for single-token and multi-token entities; a multi-token entity should come out as one single bounding box. Calculate distances before aggregating the bounding-box values.
import json
import pandas as pd
import cv2
from thefuzz import fuzz
from thefuzz import process
import numpy as np
def read_textract_output(csv_path):
return pd.read_csv(csv_path)
def read_json_entities(json_path):
with open(json_path, 'r') as file:
return json.load(file)
def find_entity_bounding_boxes(entity_text, textract_df, image_size):
entity_tokens = entity_text.split()
results = pd.DataFrame()
for token in entity_tokens:
choices = textract_df["text"].dropna().tolist()
best_match, score = process.extractOne(token, choices, scorer=fuzz.token_sort_ratio)
if score > 70:
best_matches_df = textract_df[textract_df["text"] == best_match]
results = pd.concat([results, best_matches_df])
bounding_boxes = []
try:
# Sort by line and word number to cluster words that are on the same line
sorted_results = results.sort_values(by=["line_num", "word_num"])
# Group the resulting bounding boxes by line number
grouped_results = sorted_results.groupby("line_num")
for _, group in grouped_results:
# Calculate scaling factors based on image size and original image dimensions
image_width, image_height = group.iloc[0]["image_width"], group.iloc[0]["image_height"]
scale_x = image_size[0] / image_width
scale_y = image_size[1] / image_height
# Calculate the bounding box for the whole line
min_left = np.min(group["left"])
min_top = np.min(group["top"])
max_right = np.max(group["left"] + group["width"])
max_bottom = np.max(group["top"] + group["height"])
bbox = (min_left * scale_x, min_top * scale_y, (max_right - min_left) * scale_x, (max_bottom - min_top) * scale_y)
bounding_boxes.append(bbox)
except Exception as e:
print(f"An error occurred: {e}")
return bounding_boxes
# def draw_bounding_boxes(image_path, entities, textract_df):
# image = cv2.imread(image_path)
# image_size = image.shape[1], image.shape[0]
# for category, details in entities.items():
# if category == "invoice_details" or category == "Payment Details" or category == "amounts_and_tax":
# for entity, value in details.items():
# if value:
# bounding_boxes = find_entity_bounding_boxes(value, textract_df, image_size)
# try:
# for bbox in bounding_boxes:
# x, y, w, h = map(int, bbox)
# cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)
# except:
# pass
# else:
# pass
def draw_bounding_boxes(image_path, entities, textract_df):
image = cv2.imread(image_path)
image_size = image.shape[1], image.shape[0]
font = cv2.FONT_HERSHEY_SIMPLEX
font_scale = 0.5
font_color = (0, 0, 0)
line_type = 2
padding = 5 # Padding between the start of bbox and text
for category, details in entities.items():
if category == "invoice_details" or category == "Payment Details" or category == "amounts_and_tax":
for entity, value in details.items():
if value:
bounding_boxes = find_entity_bounding_boxes(value, textract_df, image_size)
try:
for bbox in bounding_boxes:
x, y, w, h = map(int, bbox)
cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)
# Text placement calculation
text_position = (x, y - padding) if y - padding > 10 else (x, y + h + 20)
# Drawing the text
cv2.putText(image, entity, text_position, font, font_scale, font_color, line_type)
except:
pass
else:
pass
cv2.imwrite('/home/ritik1s/Desktop/bbox_issues/temp_GPT/annotated_invoice.jpg', image)
IMAGE_PATH = '/home/ritik1s/Desktop/bbox_issues/temp_GPT/check.jpeg'
CSV_PATH = "/home/ritik1s/Desktop/bbox_issues/temp_GPT/check.csv"
JSON_PATH = "/home/ritik1s/Desktop/bbox_issues/temp_GPT/row_skip.json"
# Read Textract output and JSON entities
textract_df = read_textract_output(CSV_PATH)
entities = read_json_entities(JSON_PATH)
# Draw the bounding boxes
draw_bounding_boxes(IMAGE_PATH, entities, textract_df)
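Not a full rewrite of the code above, but a minimal pure-Python sketch of the nearest-word idea for multi-token entities. The box keys `left`, `top`, `width`, `height` are assumed to mirror the CSV columns; the idea is to match the first token anywhere, then restrict each following token to the fuzzy-match candidate closest to the previous pick, and finally merge the chosen per-token boxes into one rectangle:

```python
import math

def center(box):
    # Geometric centre of a word box given as a dict of pixel values
    return (box["left"] + box["width"] / 2, box["top"] + box["height"] / 2)

def pick_nearest(candidates, anchor):
    # Among several fuzzy-match candidates, keep the one closest to the
    # previously chosen token, so the entity stays spatially contiguous
    ax, ay = center(anchor)
    return min(candidates,
               key=lambda b: math.hypot(center(b)[0] - ax, center(b)[1] - ay))

def merge_boxes(boxes):
    # Aggregate the per-token boxes into one enclosing bounding box
    left = min(b["left"] for b in boxes)
    top = min(b["top"] for b in boxes)
    right = max(b["left"] + b["width"] for b in boxes)
    bottom = max(b["top"] + b["height"] for b in boxes)
    return (left, top, right - left, bottom - top)

# Toy example: two candidates for the second token; the nearer one wins
first = {"left": 100, "top": 50, "width": 40, "height": 10}
cands = [{"left": 145, "top": 50, "width": 30, "height": 10},   # same line
         {"left": 100, "top": 600, "width": 30, "height": 10}]  # far away
second = pick_nearest(cands, first)
print(merge_boxes([first, second]))  # (100, 50, 75, 10)
```

Plugging this into the existing code would mean having `find_entity_bounding_boxes` keep *all* fuzzy matches per token (not just `extractOne`) and filter them through `pick_nearest` before the groupby/aggregation step.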
|
47948ff52486514f3777b1ec1b5b1320
|
{
"intermediate": 0.4576093852519989,
"beginner": 0.3276471495628357,
"expert": 0.21474340558052063
}
|
42,424
|
John works as an anti-spam researcher and also has to escape from spam himself. Help him analyze and classify the spam data using an XGBoost classifier.
Email No. the to ect and for of a you hou in on is this enron i be that will have with your at we s are it by com as from gas or not me deal if meter hpl please re e any our corp can d all has was know need an forwarded new t may up j mmbtu should do am get out see no there price daren but been company l these let so would m into xls farmer attached us information they message day time my one what only http th volume mail contract which month more robert sitara about texas nom energy pec questions www deals volumes pm ena now their file some email just also call change other here like b flow net following p production when over back want original them below o ticket c he could make inc report march contact were days list nomination system who april number sale don its first thanks business help per through july forward font free daily use order today r had fw set plant statements go gary oil line sales w effective well tenaska take june x within nbsp she how north america being under next week than january last two service purchase name less height off agreement k work tap group year based transport after think made each available changes due f h services smith send management stock sent ll co office needs cotten did actuals u money before looking then pills online request look desk ami his same george chokshi point delivery friday does size august product pat width iv noms address above sure give october future find market n mary vance melissa said internet still account those down link hsc rate people pipeline best actual very end home houston tu high her team products many currently spot receive good such going process feb monday info david lloyd again both click subject jackie december total na lisa ve september hours until resources because aol february where g investment issue duke since pay show way global computron further most place offer natural activity eastrans graves right prices date john utilities november clynes jan securities meeting susan hplc 
julie able received align term id revised thursday pg fee hplno trading additional site txu data wellhead reply taylor news unify michael provide note much access lannou every between keep tuesday review great tom put done long save section must v part nd million check trade bob created steve prior copy continue numbers via world demand hanks contracts phone transaction customer possible pefs meyers months special without used regarding software howard support buy young meters thru believe gcs cec entered control dec face create weissman st color come supply brian hplo own correct customers web allocation soon using development mark low power problem once however tickets border performance manager rates center companies risk details needed international field even someone doc fuel lee paid while start index include nominations act pricing scheduled gathering type href during aimee anything feel fuels getting advice why increase path sell works issues three enronxgate camp either form security interest financial family xp plan current top another src spreadsheet allen wednesday read him working wynne add deliveries buyback allocated firm james marketing tx results got stocks calpine might operations position logistics fax cost party zero pops old pt scheduling flowed dollars update gco katy including follow yahoo already suite error past page stop changed book program few better operating equistar move cotton aep y state ees rita provided employees period morning cd hotmail entex swing real exchange tomorrow lst counterparty parker person follows valid visit little professional quality confirm something megan brenda around windows im storage accounting called ranch tax problems case teco fact always too unsubscribe amount coastal never rodriguez love acton shut pipe project hope limited invoice credit full survey ray carlos anyone wanted yet ic scott years charlie soft notice advise addition donald lsk wish katherine website hplnl schumack prescription cover shares 
cash imbalance united handle big everyone style clear producer weekend city requested stone left payment mobil shows small confirmed technology meet extend life intended sherlyn schedule else letter box bill richard lamphier complete ever release newsletter anita clem having herod beginning papayoti try mike enter estimates location cut question things personal feedback cialis found area dow terms central necessary man run reason third midcon charge president de listed meds thomas thought capital added ask weeks investing commercial star several easy view cannot extended lauri beaumont union times open cause monthly action offers industry states side mailto probably neal second stephanie download flash agree mcf transfer doing important basis different final koch exxon remove microsoft interested application sept mg write lp east requirements code value thank together exploration mid dfarmer everything receipt thu afternoon late enserch coming bank response tell shipping night events cynthia lsp close legal country direct expected ces corporation options really voip nominated etc latest potential priced edward valero material stack victor redeliveries loss remember baumbach option private longer aware included drugs public reinhardt version hesse discuss related asked say viagra revision bgcolor kind pro completed health ready plans registered regards carthage zone fill away computer systems industrial mentioned told therefore growth sold track reports south rd jim costs image expect return physical el browser donna stacey begin china duty approximately showing unit jones hard verify updated eol cs orders talk trying base given server source pathed strong bryan directly risks whole major users purchases oo karen luong level required delivered portfolio riley ali easttexas poorman bellamy assistance nothing gif thing retail didn valley department cleburne allow gpgfin answer items paste avila taken mm nguyen ensure reference hall later lone user methanol facility 
network spoke though tabs taking status considered purchased says yourself paliourg dy jeff businesses fred transportation apache morris nov ltd brand federal statement oasis reflect assets lamadrid general bridge ability oct play enrononline compliance spam availability king understanding chance quick effort points reliantenergy fixed short hill cheryl aepin key understand valign capacity game took bring guys god green care withers property hub johnson employee wants albrecht meaning expectations mx moved cernosek matter devon calls worldwide records removed lose large referenced walker iferc enw ponton eileen ship upon enerfin jennifer looks staff pc target waha making cp impact partner immediately shall channel takes sat others hear went travel listing approved processing early enough sally starting distribution tejas transactions stay earl superty doesn reserves includes choose adobe publisher paso cornhusker training markets content solution shell jpg print drive pain password half herrera saturday moopid hotlist balance super vacation sex happy excess existing fund stella share sign wells won four text card tisdale fwd appreciate non experience savings settlements draft couple informed biz watch plus sun expense images land occur flowing mar terry darren cheap weight dynegy activities become mr format attention entire photoshop williams instructions neon janet contains ago friends against boas music certain liz svcs record fast dave held mind ua publication differ comments fun rest instant agent communications director partners investors expedia kevin assist safe approval allocate black none intrastate document eric hakemack expired lower active secure cc five determine press colspan missing jill discussion relief respect specific technologies al holmes white yesterday medical pinion sorry men leave pass video gomes doctor projects limit air knle pharmacy confirmation opportunity involve notify gtc class ken started outage confidential room blue estimated 
officer reach messages database words prc tracked transition light national hot offering gulf provides iit demokritos mckay average wide heard files dan billed mccoy rc exactly middle select bruce louisiana receiving california event roll mops william appear perfect html features join greater sunday pick featured cdnow prize reveffo olsen expects estimate near common package title whether bought evergreen difference elizabeth history monitor advised result sources school unaccounted paragraph turn kimberly increased communication members concerns uncertainties associated reduce committed wi asap goes trader waiting canada worth representative claim ceo london discussions php brazos trevino calling involved la gift southern groups hour tufco previously voice normally resolve efforts nor recent purchasing county ok express generic according respond situation hold lot interconnect word came west role opportunities corporate remain similar readers suggestions subscribers projections lead learn resolved agreed sec head enjoy img rnd responsible outstanding member panenergy american cass register promotions parties winfree selling usage appropriate assignment media believes require submit model spinnaker copano facilities opinion factors identified beverly ews gdp deliver job profile across neuweiler suggest girls manage usa local bad greg vs fees digital cf strangers registration delta rolex goliad hesco success primary quarter course chairman petroleum notes medications ei instead fine lake pre force seek recipient gain placed age least body asking discussed hanson emails nominate ext known ones ed assigned htmlimg means present various invoices gd agency along located reflects solutions ex house cds br owner apr sullivan basin linda worked car seen properties booked higher store est revenue wait women far met wholesale range kcs recorded brown lots match input grant providing huge investor kelly apply paths handling pipes advantage analysis focus draw red origination 
connection planning wilson golf summary item bankruptcy expenses pgev encina beaty memo initial thousand mills penis friend conversation multiple martin names bit dth talked behalf preliminary button herein gisb coupon sa oi appears door texaco csikos arrangements cpr expires popular sending research conditions gb board ca applications tried paying acquisition reporting normal maintenance resume announced attachment buyer objectives prod represent sandi hplnol government committee running tetco discount jo holding earlier positions happen mailing decided recently chris xanax valium broadband individual station td financing somehow pena critical attend kristen inform highly hl phillips minutes titles affiliate wife lonestar charlotte quickly paper test comes mobile internal privacy ideas live gotten floor benefit percent ms dr ebs msn gave dallas enterprise rx spring ftar ooking hawkins exclusive selected baxter actually single shop nominates guarantee minute correctly unique bid building stated accept assumptions centana senior pill kinsey sap immediate goals category mitchell acceptance termination sweeney facts amazon arrangement josey funds among accuracy mean rather kim egmnom indicate updates extra adjustment accounts lowest gold purposes remaining talking entry road load simply europe lindley understood logos hi speed profit notified jackson z vols serve additionally shipped connor fontfont q kept dollar jr almost fri paul documents analyst crude cap shopping aug clearance schneider ftworth father anticipated resellers congress counterparties epgt buying san invest cartwheel brandywine wrong mtbe split submitted hull gra children leader TRUE baseload mb letters billion rights mtr heidi clean historical asset foreign gr entity developed maybe jeffrey transmission outside lost membership invitation ocean legislation hernandez pep payments wallis rev kenneth seaman annual guess bammel lines guadalupe zivley exception example pathing revisions pipelines equity 
budget wed dealers window juno claims bottom standard alternative merchant braband topica telephone reliant speculative yes en morgan cable edmondson participate usb throughout checked myself contents fat investments six build giving calendar inherent edition darial hr trip pull moving concern proposed rm deer enquiries alt tammy front reduction evening concerning gets effect isn haven cowboy sea dvd launch minimum changing built avoid chief stephen chad manual finally strategy executive thousands conflict resulting policy commission stand positive quantity programs airmail texoma prepared austin matt intent uae citibank jaquet hol harris min hplr advance weather terminated whom sheet venturatos cellpadding hotel leading guaranteed idea announce pleased award operational prepare schedulers child sum quote adjusted warning issued ga cross detail pertaining tess owe crow availabilities griffin christy crosstex eel itoy heart licensed overnight cal otherwise luck stretch generation broker construed except traders carry column approx main alert charges step revenues games gottlob looked individuals beck stuff welcome port glover description daniel quantities park managing town seller summer tina dates eff dudley ferc robin charles customerservice zonedubai emirates aeor clickathome materia island vaughn sexual eiben forms delete realize tailgate behind villarreal lon benoit simple tech ahead double ordering se miss law eb post outlook equipment leslie reeves org tools cold adjustments contained saw edit deciding finance patti listbot river kathryn holiday successful unable advisor pool bryce outages adjust screen otc brent helps auto foot region links contain knowledge yvette dial pressure detailed indicated charged sites makes female mcmills cook mazowita meredith allocations meetings particular environment drug search mailings designed rock measurement art corrected kids benefits tv seems husband fix grow decision wireless mo conference interview levels copies cindy 
urgent regular payroll shown consumers reliable tr indicating coast greif severson tri vicodin liquids significant intend usd pager avails spencer ce charset verdana fully flynn da personnel multi closed vice administration gmt midstream eye speckels studio cilco likely managers structure sit parent preparation mix mmbtus timing happening lottery killing acquire mack pcx fares internationa notification swift identify areas separate unless producers allows pretty waste joanie drop taxes premium teams choice largest addressed dolphin ngo self davis htm ad graphics hit competitive thus incorrect ti acts previous edu proven electric pictures charlene benedict chevron treatment lesson player sds wc intraday assurance sdsnom rebecca quit netco intra whatever lyondell reviewed solicitation filings log noon locations joe completely rivers language street automatically ft powerful specials alone fyi properly proper explode decrease medication desks impacted anywhere completion banking consider certificate exercise zeroed websites tonight diligence education club vegas affordable sports predictions billing diamond posted prayer actions nomad resuits jason purpose deposit entertainment materially blank resolution anderson nat rom soma organization aquila solid affected transco spend responsibilities assume header accountant functionality meant killed analysts rick rolled noted discovered offices torch often york joint briley competition guide intercompany son settlement presently cart tim entries russ valadez rules molly apple atleast scheduler pi hector dell opm hottlist yap gone heal llc setting reached proposal hundred trust official table mcgee written operation cellspacing laptop feature ram victoria larry units requests continued external pack couldn lateral strictly resource although sr commodity pulled protocol bed generated redmond girl apparently tool reviews released movies inside shareholder rr compensation beliefs foresee lease rule marta chemical hillary hp 
tongue adonis advises master eight wasn itself documentation xl humble elsa pics hughes brokered distribute consultation sheri lists cannon treated factor putting verified releases enhanced controls craig worksheet conversion max hrs helpful hand producing dl developing design woman understands standards promotion sarco hospital ffffff respective richmond conoco driver easily sean den gateway holdings brad college gains adult dated em mcloughlin anticipates henderson julia negotiations sofftwaares garrick comstock trochta imceanotes ecom larger nommensen coordinate partnership otcbb announces louis dealer reliance season agua dulce offshore gathered forever function happened sample easier aim pa expensive thinks maximum war mining drilling owned todd advanced provider pending providers silver cherry hundreds thoughts addresses beach baby requires caused variance extension carbide anytime adding triple dawn martinez entering login bretz ls writeoff locker wiil block blood romeo responsibility brennan btu venture connected nascar opinions executed cell flag doctors invoiced marlin coffey nice amazing ii determined handled keeping touch upgrade shipment brought forwarding confidence hesitate seem electronic appreciated deadline franklin heather reasons passed safety procedures payback networks utility count africa exact creating loading processed court tier sender att mailbox glad buddy profiles portion protection compressor okay oba finding heads bar turned remote illustrator oem noticed mails darron nick urbanek jerry barrett ehronline und abdv egm couid technoiogies owns improved eat moment owners develop installed videos frank hearing inches busy ref valuable et un url shawna iso capture extremely ya causing consent anyway round discrepancies cheapest confidentiality disclosure prohibited vol correction communicate processes spain shareholders supported smoking mine biggest erections platform miles exciting association die restricted ma income goal bane collection 
nathan wind piece familiar gore experiencing pico mai dewpoint tessie hair bussell diane delivering originally accurate began seven tracking randall gay emerging prescriptions story arial florida space ownership european sutton concerned male spent agreements industries picture filled continues death choate majeure device hence ten campaign massive eyes requesting lives reminder eliminate copied consemiu died sound offered expressed anti duplicate steps books improve implementation gives ac peggy proprietary ways advertisement published earnings mortgage consumer ct tape fl cia organizational agenda rental carriere moshou church trouble medium aggressive smart zajac ail participants gap earthlink wire trades messaging ut wil richardson blvd glo seneca pubiisher imited isc contacts sleep kyle cooperation possibly leaving motor hopefully tie speak mi suggested canadian uses connect pvr rich places auction po spacer client recommended royalty amended default living regardless human bringing focused stores variety netherlands leaders bowen salary signed penny loan desktop chase pleasure compare session overall stranger length planned sp darrel raise palestinian expiration serial premiere suzanne reduced players applicable impotence buckley wayne hansen indicative sabrae dating winners marshall highest ea presentation allowed square danny gepl hydrocarbon alpine christmas muscle souza relating begins ecf forth answers audit approve lunch types starts difficult le lasts series till edge growing covered shipper sometime republic filter sooner increasing nelson percentage returned pop interface kin experienced prime merger obtain ryan servers attachments achieve effects gov examples procedure explore caribbean rally amounts comfort attempt greatly amelia engel delay fare der cove filing fletcher leth undervalued cents esther hlavaty reid lls troy palmer metals las carter luis migration brief hess therein ur pond joanne community tglo eogi ml wysak felipe errors affect 
convenient minimal boost incremental decide reserve superior kerr willing quite wild unlimited sans mother computers unfortunately ordered satisfaction priority traded testing portal ward lets aren knows refer shot fda tue saying cancel forecast cousino bass permanent phones technical whose objective cards distributed learning fire drill towards forget explosion gloria formula redelivery audio visual encoding approach doubt staffing excite corel tm enronavailso contacting alland heavy economic nigeria milwaukee phillip curve returns padre kathy buttons sir vary sounds disclose authority flw straight worldnet beemer ooo defs thorough officers flight prefer awesome macintosh feet constitutes formosa porn armstrong driscoll watches newsietter twenty tommy fields method setup allocating initially missed clarification especially dorcheus del millions insurance pooling trial tennessee ellis direction bold catch performing accepted matters batch continuing winning symbol offsystem decisions produced ended greatest degree solmonson imbalances fall fear hate fight reallocated debt reform australia plain prompt remains ifhsc enhancements connevey jay valued lay infrastructure military allowing ff dry Prediction
Email 1 0 0 1 0 0 0 2 0 0 0 0 1 0 0 2 0 0 0 0 0 0 0 0 3 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 2 4 0 0 0 0 0 0 0 0 0 0 0 0 0 4 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 3 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 4 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
[large numeric array dump omitted: ~1,500 values, almost all zeros with a handful of ones]
give me a Python program with parameters like:
# Calculate accuracy, precision, and F1-score
# Show classification report
# Confusion matrix with plots
plt.title('Confusion Matrix')
plt.xlabel('Predicted')
plt.ylabel('Actual')
# ROC curve
fpr, tpr, _ = roc_curve(y_test, y_score)
roc_auc = roc_auc_score(y_test, y_score)
plt.figure()
plt.plot(fpr, tpr, color='darkorange', lw=2, label=f'ROC curve (area = {roc_auc:.2f})')
plt.plot([0, 1], [0, 1], color='navy', lw=2, linestyle=":") # Correctly specify linestyle
plt.xlim([0.0, 1.0])
plt.ylim([0.0, 1.05])
plt.xlabel('False Positive Rate')
plt.ylabel('True Positive Rate')
plt.title('Receiver Operating Characteristic')
plt.legend(loc="lower right")
plt.show()
# Visualize feature correlations (consider doing this on a subset or using dimensionality reduction)
corr = X_train.corr() # Can be very intensive and not very informative for high-dimensional data
sns.heatmap(corr, cmap='coolwarm', xticklabels=False, yticklabels=False)
plt.title('Feature Correlation Matrix')
plt.show()
# Density plot of feature values by class
# Assuming a binary classification, 1 for spam and 0 for ham
spam_indices = y_train[y_train == 1].index
ham_indices = y_train[y_train == 0].index
plt.figure(figsize=(10, 6))
for feature in X_train.columns[:5]:  # Example: plot first 5 features
    sns.kdeplot(X_train.loc[spam_indices, feature], label=f'Spam - {feature}')
    sns.kdeplot(X_train.loc[ham_indices, feature], label=f'Ham - {feature}')
plt.title(f'Density Plot of {feature} by Class')
plt.legend()
plt.show()
# Actual vs. Predicted labels plot
plt.figure(figsize=(8, 6))
sns.countplot(y_test, label='Actual', color='blue', alpha=0.5)
sns.countplot(y_pred, label='Predicted', color='red', alpha=0.5)
plt.title('Actual vs. Predicted Label Counts')
plt.legend()
plt.show()
help me write the full python program with all these outputs and without errors:-
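Before assembling the full script, the headline metrics can be sanity-checked with a small pure-Python sketch (a hand-rolled illustration; in the real program sklearn's accuracy_score and classification_report would normally be used):

```python
def binary_metrics(y_true, y_pred):
    """Accuracy, precision, recall and F1 for binary 0/1 labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return accuracy, precision, recall, f1

print(binary_metrics([1, 0, 1, 1, 0], [1, 0, 0, 1, 1]))
```

sklearn's metric functions should agree with these values on the same inputs, which makes this a handy cross-check.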
|
cd7c62cbe314160582eae3fe0c3914c3
|
{
"intermediate": 0.435916930437088,
"beginner": 0.2769816815853119,
"expert": 0.2871014177799225
}
|
42,425
|
sql query: Now let’s calculate the conversion of park visitors to pizza orders. Let's remember that conversion is the ratio of the number of unique visitors who ordered pizza to the total number of unique visitors. The marketing department reports that the park received 1,000 visitors in three months. SELECT
count(distinct(bracelet_id) * 1.0) / count(bracelet_id)
as cr
FROM PIZZA;
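As written, the query divides the distinct buyer count by the number of rows in pizza, not by the 1,000 visitors reported by marketing. A runnable sqlite3 illustration of the intended conversion (the table layout and toy data are invented for the example; only `bracelet_id` and the 1,000 figure come from the question):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE pizza (bracelet_id INTEGER)")
# Toy data: 3 unique visitors placed 5 pizza orders in total
conn.executemany("INSERT INTO pizza VALUES (?)", [(1,), (1,), (2,), (3,), (3,)])

total_visitors = 1000  # reported by marketing; not derivable from the pizza table
(unique_buyers,) = conn.execute(
    "SELECT COUNT(DISTINCT bracelet_id) FROM pizza"
).fetchone()
cr = unique_buyers * 1.0 / total_visitors
print(cr)  # 0.003
```

The key point: the denominator has to come from outside the `pizza` table, since non-buying visitors never appear in it.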
|
c1ec7426386c732a6163d8000ff84337
|
{
"intermediate": 0.36374175548553467,
"beginner": 0.30891889333724976,
"expert": 0.3273393213748932
}
|
42,426
|
import socket
from flask import Flask, request, render_template
from werkzeug.exceptions import HTTPException
app = Flask(__name__)
def is_port_open(ip, port):
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.settimeout(5)
    try:
        sock.connect((ip, port))
        return True
    except socket.error:
        return False
    finally:
        sock.close()

@app.route('/', methods=['GET', 'POST'])
def index():
    result = None
    ip = ""
    port = ""
    if request.method == 'POST':
        ip = request.form.get('ip')
        port = request.form.get('port')
        try:
            port = int(port)
            if 0 < port < 65536:  # Port number should be between 1 and 65535
                result = is_port_open(ip, port)
            else:
                result = "Invalid port number."
        except ValueError:
            result = "Invalid IP address or port number."
        except Exception as e:
            result = f"An error occurred: {str(e)}"
    return render_template('index.html', result=result, ip=ip, port=port)

@app.errorhandler(Exception)
def handle_exception(e):
    # pass through HTTP errors
    if isinstance(e, HTTPException):
        return e
    # now you're handling non-HTTP exceptions only
    return render_template("500_generic.html", e=e), 500

if __name__ == "__main__":
    app.run(debug=False)
create a vercel.json for this code
|
425317c6a7b52290a8f0c9363ca61e3e
|
{
"intermediate": 0.5305714011192322,
"beginner": 0.2509883940219879,
"expert": 0.2184402495622635
}
|
42,427
|
Select name, radius,
COUNT(DISTINCT bracelet_id) * 1.0
/ COUNT(bracelet_id) * 1.0 AS conversion_rate
from pizza
where bracelet_id
not in (145738, 145759, 145773,
145807, 145815, 145821, 145873,
145880) and conversion_rate > 0.03
group by name; Display the name and radius of the pizza. Calculate conversion. Don't count company employees. Leave only those lines where the conversion is above 3% - this is the target indicator for park managers.
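One likely problem with the query above: `conversion_rate` is an alias defined in the SELECT list, so it is not visible in WHERE; filters on per-group aggregates belong in HAVING. A small sqlite3 sketch of that shape (toy rows are invented; the employee filter is omitted for brevity; the 3% threshold is from the question):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE pizza (name TEXT, radius REAL, bracelet_id INTEGER)")
# Toy data: 'margherita' -> 1 unique buyer over 40 orders (conversion 0.025),
#           'pepperoni'  -> 1 unique buyer over 10 orders (conversion 0.1)
conn.executemany("INSERT INTO pizza VALUES (?, ?, ?)",
                 [("margherita", 25, 1)] * 40 + [("pepperoni", 30, 2)] * 10)

rows = conn.execute("""
    SELECT name, radius,
           COUNT(DISTINCT bracelet_id) * 1.0 / COUNT(bracelet_id) AS conversion_rate
    FROM pizza
    GROUP BY name, radius
    HAVING conversion_rate > 0.03
""").fetchall()
print(rows)
```

Only the pepperoni group survives the HAVING filter here; referencing the alias in HAVING works in SQLite, while some engines require repeating the full aggregate expression.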
|
1e46b2435b523d3821e684c617ff3691
|
{
"intermediate": 0.3605749309062958,
"beginner": 0.278704971075058,
"expert": 0.36072012782096863
}
|
42,428
|
SELECT name, radius
FROM pizza
WHERE bracelet_id NOT IN (145738, 145759, 145773, 145807, 145815, 145821, 145873, 145880)
GROUP BY name, radius
HAVING COUNT(DISTINCT(bracelet_id)) * 1.0
/ COUNT(bracelet_id) * 1.0 > 0.03; "Or maybe the conversion is related to the size of the pizza? Let's check! Display the name and radius of the pizza. Calculate conversion. Don't count company employees. Leave only those lines where the conversion is above 3% - this is the target indicator for park managers."
|
ac1f9c15acac2467410aa54a8e45a77a
|
{
"intermediate": 0.3813015818595886,
"beginner": 0.2006973922252655,
"expert": 0.4180009663105011
}
|
42,429
|
The raw action values [L1 L3 L5 L6 L7 W1 W3 W5 W6 W7 Io Cp Vc] produced by the RL algorithm at each timestep within the first episode are given below:
timesteps_1:
raw actions [[1.80000001e-07 2.00000000e-07 2.00000000e-07 2.00000000e-07 1.80000001e-07 4.99999987e-05 5.00000000e-07 5.00000000e-07 4.99999987e-05 4.99999987e-05 1.50000000e-05 9.99999996e-12 1.39999998e+00]]
timesteps_2:
raw actions [[1.80000001e-07 1.80000001e-07 2.00000000e-07 1.80000001e-07 2.00000000e-07 4.99999987e-05 5.00000000e-07 4.99999987e-05 5.00000000e-07 5.00000000e-07 2.99999992e-05 1.00000000e-13 8.00000012e-01]]
timesteps_3:
raw actions [[1.80000001e-07 2.00000000e-07 2.00000000e-07 2.00000000e-07 1.80000001e-07 5.00000000e-07 4.99999987e-05 5.00000000e-07 4.99999987e-05 5.00000000e-07 2.99999992e-05 9.99999996e-12 8.00000012e-01]]
timesteps_4:
raw actions [[2.00000000e-07 2.00000000e-07 2.00000000e-07 2.00000000e-07 1.80000001e-07 5.00000000e-07 5.00000000e-07 4.99999987e-05 5.00000000e-07 5.00000000e-07 1.50000000e-05 1.00000000e-13 8.00000012e-01]]
bounds_low = np.array([0.18e-6, 0.18e-6, 0.18e-6, 0.18e-6, 0.18e-6, 0.5e-6, 0.5e-6, 0.5e-6, 0.5e-6, 0.5e-6, 15e-6, 0.1e-12, 0.8])
bounds_high = np.array([0.2e-6, 0.2e-6, 0.2e-6, 0.2e-6, 0.2e-6, 50e-6, 50e-6, 50e-6, 50e-6, 50e-6, 30e-6, 10e-12, 1.4])
The action space is continuous and the environment is an analog two-stage operational amplifier circuit simulator. From the results above you can see that the 'raw actions' output of my code below varies only between the edges of the corresponding bounds; it does not produce a range of values within the bounds. What should we do: modify the existing actor and critic network implementation, apply scaling to the raw actions, or is there another technique for getting better variation in the results?
# Training Loop
for episode in range(num_episodes):
    # Define the Training Loop
    state, performance_metrics = env.reset()
    # Run one episode
    for t in range(max_timesteps):
        action, log_prob, value, perf_metrics = select_action(state, env.actor, env.critic, bounds_low, bounds_high, performance_metrics)
        # Clip the action to ensure it's within the action space bounds
        raw_actions = np.clip(action, bounds_low, bounds_high)
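If clipping keeps landing the actions on the edges, the usual suspicion is that the network's raw outputs are far outside the bounds, so np.clip saturates. A common alternative (an illustration, not the author's code) is to squash unbounded outputs smoothly into the bounds with tanh before scaling; a minimal pure-Python sketch (the bounds values echo the question; math.tanh stands in for a framework op):

```python
import math

def squash_to_bounds(raw, low, high):
    """Map unbounded raw actions into (low, high) via tanh instead of clipping."""
    out = []
    for r, lo, hi in zip(raw, low, high):
        unit = (math.tanh(r) + 1.0) / 2.0   # (-inf, inf) -> (0, 1), smooth
        out.append(lo + unit * (hi - lo))   # (0, 1) -> (lo, hi)
    return out

low, high = [0.18e-6, 0.8], [0.2e-6, 1.4]
print(squash_to_bounds([0.0, 0.3], low, high))
```

Because tanh is smooth, small changes in the policy output move the action inside the range rather than pinning it to an endpoint, which tends to restore variation.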
|
f22821d1cd06b988d34c46812509b1c6
|
{
"intermediate": 0.2254476100206375,
"beginner": 0.5163356065750122,
"expert": 0.25821685791015625
}
|
42,430
|
SELECT name, radius
FROM pizza
WHERE bracelet_id NOT IN (145738, 145759, 145773, 145807, 145815, 145821, 145873, 145880)
GROUP BY name, radius
HAVING COUNT(DISTINCT(bracelet_id)) * 1.0
/ COUNT(bracelet_id) * 1.0 > 0.03; “Or maybe the conversion is related to the size of the pizza? Let’s check! Display the name and radius of the pizza. Calculate conversion. Don’t count company employees. Leave only those lines where the conversion is above 3% - this is the target indicator for park managers.”
|
b644f95384db410b45e5a9a6112b2766
|
{
"intermediate": 0.3724411427974701,
"beginner": 0.2554686367511749,
"expert": 0.372090220451355
}
|
42,431
|
Are there any apps in Windows 10 that can read the captions of open tabs in the Firefox browser?
|
17e19f9184f43877e3d7f944edf75b0e
|
{
"intermediate": 0.4645843207836151,
"beginner": 0.21047234535217285,
"expert": 0.32494327425956726
}
|
42,432
|
Hi
|
ba6b415d236f7f24be2bb9a27d0339bb
|
{
"intermediate": 0.33010533452033997,
"beginner": 0.26984941959381104,
"expert": 0.400045245885849
}
|
42,433
|
import socket
from flask import Flask, request, render_template
from werkzeug.exceptions import HTTPException
app = Flask(__name__)
def is_port_open(ip, port):
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.settimeout(5)
    try:
        sock.connect((ip, port))
        return True
    except socket.error:
        return False
    finally:
        sock.close()

@app.route('/', methods=['GET', 'POST'])
def index():
    result = None
    ip = ""
    port = ""
    if request.method == 'POST':
        ip = request.form.get('ip')
        port = request.form.get('port')
        try:
            port = int(port)
            if 0 < port < 65536:  # Port number should be between 1 and 65535
                result = is_port_open(ip, port)
            else:
                result = "Invalid port number."
        except ValueError:
            result = "Invalid IP address or port number."
        except Exception as e:
            result = f"An error occurred: {str(e)}"
    return render_template('index.html', result=result, ip=ip, port=port)

@app.errorhandler(Exception)
def handle_exception(e):
    # pass through HTTP errors
    if isinstance(e, HTTPException):
        return e
    # now you're handling non-HTTP exceptions only
    return render_template("500_generic.html", e=e), 500

if __name__ == "__main__":
    app.run(debug=False)
create a procfile to deploy on railway.app
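For reference, a commonly used Procfile for a Flask app on Railway runs gunicorn bound to the platform-injected $PORT (this assumes the module is app.py exposing `app`, and that gunicorn is listed in requirements.txt):

```procfile
web: gunicorn app:app --bind 0.0.0.0:$PORT
```

Using gunicorn rather than the Flask development server is the usual choice for production deployments.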
|
f5fc37984e1d6d293b90d253f1827845
|
{
"intermediate": 0.5382727384567261,
"beginner": 0.25142815709114075,
"expert": 0.2102990448474884
}
|
42,434
|
public BuffCardOut AbsorbCardsAmbushToCost(LuaCardsIn param)
{
    foreach (var absorb in param.cards)
    {
        Card absorbObject = GetActorObject(absorb);
        Player player = GetActorObject(absorbObject.GetOwnerPlayer());
        // On the opponent's side of the field
        List<ActorRef<Card>> ambushCards = new List<ActorRef<Card>>();
        BattleField oppoBattleField =
            GetBattleFieldObject(GetOppositePlayerPosition(player.GetPlayerPosition()));
        BattleField battleField = GetBattleFieldObject(player.GetPlayerPosition());
        ambushCards.AddRange(battleField.GetDeployedCardsByEffect(CardKeyWordType.Ambush).ToList());
        ambushCards.AddRange(oppoBattleField.GetDeployedCardsByEffect(CardKeyWordType.Ambush).ToList());
        foreach (var ambushCard in ambushCards)
        {
            Card ambushCardObject = GetActorObject(ambushCard);
            ambushCardObject.RemoveEffect(CardKeyWordType.Ambush);
        }
        PlayerCost playerCost = player.GetPlayerCost();
        playerCost += ambushCards.Count * param.value;
        if (playerCost > player.GetMaxPlayerCost())
        {
            playerCost = player.GetMaxPlayerCost();
        }
        player.SetPlayerCost(playerCost);
    }
    return default;
} Is there still room to optimize this code?
|
98f7f776735cd049b01987814e2227fe
|
{
"intermediate": 0.3996141254901886,
"beginner": 0.3479176163673401,
"expert": 0.2524682283401489
}
|
42,435
|
private IEnumerator AttackCoroutine(Vector3 target)
{
    sorting.sortingOrder += 1;
    isAttackAnimating = true;
    arrowAttack.gameObject.SetActive(false);
    Vector3 startPos = display.transform.position;
    Vector3 startLocalScale = display.transform.localScale;
    Vector3 targetIgnoreZ = new Vector3(target.x, target.y, display.transform.position.z);
    Vector3 attackDir = (targetIgnoreZ - startPos).normalized;
    targetIgnoreZ -= attackDir * 2;
    Vector3 backstepTarget = display.transform.position - attackDir * 3;
    // Scale up
    //Debug.Log($" begin attack Coroutine {display.transform.position}");
    yield return display.transform.TweenLocalScale(display.transform.localScale,
        display.transform.localScale * 1.2f, 0.1f);
    // Step back
    yield return display.transform.TweenPos(display.transform.position, backstepTarget, 0.1f);
    // Lunge forward
    //Debug.Log($" begin attack target {display.transform.position}");
    yield return display.transform.TweenPos(display.transform.position, targetIgnoreZ, 0.1f);
    // Restore scale
    //Debug.Log($" begin attack backscale {display.transform.position}");
    yield return display.transform.TweenLocalScale(display.transform.localScale, startLocalScale, 0.15f);
    // Return to the original position
    //Debug.Log($" begin attack backpos {display.transform.position}");
    yield return display.transform.TweenPos(display.transform.position, startPos, 0.3f);
    arrowAttack.gameObject.SetActive(true);
    isAttackAnimating = false;
    sorting.sortingOrder = origLayer;
} Please help optimize this code.
|
adcd1000bdbbfebc1372cc85fffeb893
|
{
"intermediate": 0.3697323799133301,
"beginner": 0.3072896897792816,
"expert": 0.3229779899120331
}
|
42,436
|
I have collected a historical dataset of cryptocurrencies in which each row contains the following features:
Symbol Open High Low Close Volume Volume USDT tradecount volume_adi volume_obv volume_cmf volume_fi volume_em volume_sma_em volume_vpt volume_vwap volume_mfi volume_nvi volatility_bbm volatility_bbh volatility_bbl volatility_bbw volatility_bbp volatility_bbhi volatility_bbli volatility_kcc volatility_kch volatility_kcl volatility_kcw volatility_kcp volatility_kchi volatility_kcli volatility_dcl volatility_dch volatility_dcm volatility_dcw volatility_dcp volatility_atr volatility_ui trend_macd trend_macd_signal trend_macd_diff trend_sma_fast trend_sma_slow trend_ema_fast trend_ema_slow trend_vortex_ind_pos trend_vortex_ind_neg trend_vortex_ind_diff trend_trix trend_mass_index trend_dpo trend_kst trend_kst_sig trend_kst_diff trend_ichimoku_conv trend_ichimoku_base trend_ichimoku_a trend_ichimoku_b trend_stc trend_adx trend_adx_pos trend_adx_neg trend_cci trend_visual_ichimoku_a trend_visual_ichimoku_b trend_aroon_up trend_aroon_down trend_aroon_ind trend_psar_up trend_psar_down trend_psar_up_indicator trend_psar_down_indicator momentum_rsi momentum_stoch_rsi momentum_stoch_rsi_k momentum_stoch_rsi_d momentum_tsi momentum_uo momentum_stoch momentum_stoch_signal momentum_wr momentum_ao momentum_roc momentum_ppo momentum_ppo_signal momentum_ppo_hist momentum_pvo momentum_pvo_signal momentum_pvo_hist momentum_kama others_dr others_dlr others_cr T10YIE_extra T10Y2Y_extra DPRIME_extra oil_close_extra DeFi_cap_extra DEXCHUS_extra DEXJPUS_extra DEXUSEU_extra DJIA_extra EFFR_extra eth_gas_wei_extra fear_greed_extra gold_close_extra BAMLH0A0HYM2_extra DGS10_extra NASDAQCOM_extra silver_close_extra SP500_extra Tether_cap_extra Label
the Label column is as follows:
label as 2: If the price is ascending in the next 5 days and increases more than 5%
label as 1: If the price is ascending in the next 5 days and increases more than 2%
label as 0: If the price increases or decreases less than 2% in the next 5 days, or the trend over the next five days can't be determined
label as -1: If the price is descending in the next 5 days and decreases more than 2%
label as 4: If the price is descending in the next 5 days and decreases more than 2%
give me the proper code to implement an XGBoost model on my dataset
split the dataset into train set, dev set, test set by 97%, 1.5%, 1.5%
consider proper normalization and feature scaling on the dataset, and also consider Grid Search, k-fold cross-validation, and a confusion matrix to help me pick the best model
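The 97% / 1.5% / 1.5% split itself can be sketched in pure Python (in practice sklearn's train_test_split applied twice is typical; this just illustrates the arithmetic):

```python
import random

def split_dataset(rows, train=0.97, dev=0.015, seed=42):
    """Shuffle and split rows into train/dev/test by the given fractions."""
    rows = rows[:]                      # copy so the caller's list is untouched
    random.Random(seed).shuffle(rows)
    n = len(rows)
    n_train = int(n * train)
    n_dev = int(n * dev)
    return rows[:n_train], rows[n_train:n_train + n_dev], rows[n_train + n_dev:]

train_set, dev_set, test_set = split_dataset(list(range(1000)))
print(len(train_set), len(dev_set), len(test_set))  # 970 15 15
```

For time-series financial data, a chronological (non-shuffled) split is often preferred to avoid look-ahead leakage; the shuffle here only matches what a generic train_test_split would do.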
|
2d650c792f4950ca3f17b13b2b982c41
|
{
"intermediate": 0.37496936321258545,
"beginner": 0.3602074682712555,
"expert": 0.26482316851615906
}
|
42,437
|
hi
|
9f99e4348bca41b41c5635e521eeb0c5
|
{
"intermediate": 0.3246487081050873,
"beginner": 0.27135494351387024,
"expert": 0.40399640798568726
}
|
42,438
|
import socket
from flask import Flask, request, render_template
from werkzeug.exceptions import HTTPException

app = Flask(__name__)

def is_port_open(ip, port):
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.settimeout(5)
    try:
        sock.connect((ip, port))
        return True
    except socket.error:
        return False
    finally:
        sock.close()

@app.route('/', methods=['GET', 'POST'])
def index():
    result = None
    ip = ""
    port = ""
    if request.method == 'POST':
        ip = request.form.get('ip')
        port = request.form.get('port')
        try:
            port = int(port)
            if 0 < port < 65536:  # Port number should be between 1 and 65535
                result = is_port_open(ip, port)
            else:
                result = "Invalid port number."
        except ValueError:
            result = "Invalid IP address or port number."
        except Exception as e:
            result = f"An error occurred: {str(e)}"
    return render_template('index.html', result=result, ip=ip, port=port)

@app.errorhandler(Exception)
def handle_exception(e):
    # pass through HTTP errors
    if isinstance(e, HTTPException):
        return e
    # now you're handling non-HTTP exceptions only
    return render_template("500_generic.html", e=e), 500

if __name__ == "__main__":
    app.run(debug=False)
procfile
web: flask run --host=0.0.0.0
upon deploying on railway.app i get "Application failed to respond"
|
13c00de2ad9ff0367cd0412239fa9f6e
|
{
"intermediate": 0.4260581433773041,
"beginner": 0.23376721143722534,
"expert": 0.3401746153831482
}
|
42,439
|
Do you know GDScript?
|
bd54e2967195faa1ba5213ef0e139eb1
|
{
"intermediate": 0.14806152880191803,
"beginner": 0.7549827098846436,
"expert": 0.0969557985663414
}
|
42,440
|
app.py
import os
import socket
from flask import Flask, request, render_template
from werkzeug.exceptions import HTTPException

app = Flask(__name__)

def is_port_open(ip, port):
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.settimeout(5)
    try:
        sock.connect((ip, port))
        return True
    except socket.error:
        return False
    finally:
        sock.close()

@app.route('/', methods=['GET', 'POST'])
def index():
    result = None
    ip = ""
    port = ""
    if request.method == 'POST':
        ip = request.form.get('ip')
        port = request.form.get('port')
        try:
            port = int(port)
            if 0 < port < 65536:  # Port number should be between 1 and 65535
                result = is_port_open(ip, port)
            else:
                result = "Invalid port number."
        except ValueError:
            result = "Invalid IP address or port number."
        except Exception as e:
            result = f"An error occurred: {str(e)}"
    return render_template('index.html', result=result, ip=ip, port=port)

@app.errorhandler(Exception)
def handle_exception(e):
    # pass through HTTP errors
    if isinstance(e, HTTPException):
        return e
    # now you're handling non-HTTP exceptions only
    return render_template("500_generic.html", e=e), 500

if __name__ == "__main__":
    port = int(os.environ.get("PORT", 5000))
    app.run(host='0.0.0.0', port=port, debug=False)
procfile
web: flask run --host=0.0.0.0
upon deploying on railway.app i get “Application failed to respond”
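"Application failed to respond" on Railway usually means the app is not listening on the port the platform injects via the PORT environment variable. A minimal sketch of reading it (the 5000 fallback is an assumption for local runs):

```python
import os

def resolve_port(default=5000):
    """Return the port the platform assigned via $PORT, or a local default."""
    return int(os.environ.get("PORT", default))

port = resolve_port()
print(port)
```

Note that a Procfile of `web: flask run --host=0.0.0.0` bypasses the `app.run(...)` block entirely, so the PORT handling must reach whichever command actually starts the server (e.g. `flask run --port $PORT` or a gunicorn `--bind`).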
|
806a3d411d2933f933053d48ca92bdbc
|
{
"intermediate": 0.5545110106468201,
"beginner": 0.20040130615234375,
"expert": 0.24508768320083618
}
|
42,441
|
1_ Translate the following legal text into colloquial Farsi 2_ Place the Persian and English text side by side in the table 3_ From the beginning to the end of the text, there should be an English sentence on the left side and a Persian sentence on the right side.
4- Using legal language for Persian translation
…parents unreasonably refuse consent, the children may apply to the magistrates for
permission to marry. A marriage is void if either party is under 16 at the time of the
marriage, even if it is discovered many years later and it was a genuine mistake. The fact
that they had parental permission would not change the situation, but if 17-year-olds marry
without parental consent the marriage is valid.
It may seem unreasonable to declare a marriage void after many years, but there could
be a situation when one of the 'married' couple died and another person could benefit on
an intestacy (see p. 259).
6. Litigation
A minor may not personally bring a civil action, but must do so through a "next friend"
(usually a parent) and defend an action through a "guardian ad litem". Minors under 17
may not apply for legal aid, although their parents or guardians may do so on their behalf.
7. Other Restrictions on the Rights of Minors
(a) Voting and Jury Service
Young persons cannot vote at a local or parliamentary election, or be selected for jury
service until they are 18 and their names appear on the electoral roll.
(b) Wills
Generally, persons under 18 cannot make a valid formal will, although they may be
competent to sign as a witness to a will.
(c) Passports
A British passport may be obtained at the age of 16, although children under this age are
usually included on their parents' passports.
(d) Driving
A person must be 17 to obtain a licence to drive a car or motorcycle, although a person of
16 may obtain a licence to drive a moped.
(e) Drinking
Persons under 14 are not allowed in bars of licensed premises, and persons over 14 and
under 18, although allowed to enter a bar, may not be served with, or drink, alcohol.
Persons over 16 may consume certain drinks if served with a meal.
|
1f0f59d9b9b74d342c79aa209be0bb38
|
{
"intermediate": 0.2549813985824585,
"beginner": 0.5182058811187744,
"expert": 0.22681273519992828
}
|
42,442
|
# python/taichi/examples/simulation/fractal.py
import taichi as ti
ti.__init__(arch=ti.cpu)
n = 320
pixels = ti.field(dtype=float, shape=(n * 2, n))
@ti.func
def complex_sqr(z):
    return ti.Vector([z[0]**2 - z[1]**2, z[1] * z[0] * 2])

@ti.kernel
def paint(t: float):
    for i, j in pixels:  # Parallelized over all pixels
        c = ti.Vector([-0.8, ti.cos(t) * 0.2])
        z = ti.Vector([i / n - 1, j / n - 0.5]) * 2
        iterations = 0
        while z.norm() < 20 and iterations < 50:
            z = complex_sqr(z) + c
            iterations += 1
        pixels[i, j] = 1 - iterations * 0.02

gui = ti.GUI("Julia Set", res=(n * 2, n))
for i in range(1000000):
    paint(i * 0.03)
    gui.set_image(pixels)
    gui.show()
returns
line 5, in <module>
ti.__init__(arch=ti.cpu)
^^^^^^
AttributeError: partially initialized module 'taichi' has no attribute 'cpu' (most likely due to a circular import)
|
69ea7d2e66e478933b285dc9c34acb87
|
{
"intermediate": 0.4379878640174866,
"beginner": 0.20134752988815308,
"expert": 0.36066460609436035
}
|
42,443
|
I have collected a historical dataset of cryptocurrencies in which each row contains the following features:
Symbol Open High Low Close Volume Volume USDT tradecount volume_adi volume_obv volume_cmf volume_fi volume_em volume_sma_em volume_vpt volume_vwap volume_mfi volume_nvi volatility_bbm volatility_bbh volatility_bbl volatility_bbw volatility_bbp volatility_bbhi volatility_bbli volatility_kcc volatility_kch volatility_kcl volatility_kcw volatility_kcp volatility_kchi volatility_kcli volatility_dcl volatility_dch volatility_dcm volatility_dcw volatility_dcp volatility_atr volatility_ui trend_macd trend_macd_signal trend_macd_diff trend_sma_fast trend_sma_slow trend_ema_fast trend_ema_slow trend_vortex_ind_pos trend_vortex_ind_neg trend_vortex_ind_diff trend_trix trend_mass_index trend_dpo trend_kst trend_kst_sig trend_kst_diff trend_ichimoku_conv trend_ichimoku_base trend_ichimoku_a trend_ichimoku_b trend_stc trend_adx trend_adx_pos trend_adx_neg trend_cci trend_visual_ichimoku_a trend_visual_ichimoku_b trend_aroon_up trend_aroon_down trend_aroon_ind trend_psar_up trend_psar_down trend_psar_up_indicator trend_psar_down_indicator momentum_rsi momentum_stoch_rsi momentum_stoch_rsi_k momentum_stoch_rsi_d momentum_tsi momentum_uo momentum_stoch momentum_stoch_signal momentum_wr momentum_ao momentum_roc momentum_ppo momentum_ppo_signal momentum_ppo_hist momentum_pvo momentum_pvo_signal momentum_pvo_hist momentum_kama others_dr others_dlr others_cr T10YIE_extra T10Y2Y_extra DPRIME_extra oil_close_extra DeFi_cap_extra DEXCHUS_extra DEXJPUS_extra DEXUSEU_extra DJIA_extra EFFR_extra eth_gas_wei_extra fear_greed_extra gold_close_extra BAMLH0A0HYM2_extra DGS10_extra NASDAQCOM_extra silver_close_extra SP500_extra Tether_cap_extra Label
the Label column is as follows:
label as 2: If the price is ascending in the next 5 days and increases more than 5%
label as 1: If the price is ascending in the next 5 days and increases more than 2%
label as 0: If the price increases or decreases less than 2% in the next 5 days, or the trend over the next five days can't be determined
label as 3: If the price is descending in the next 5 days and decreases more than 2%
label as 4: If the price is descending in the next 5 days and decreases more than 2%
I want to train a deep learning model on this dataset.
What architecture do you suggest I use based on the number of features and rows (how many layers, neurons, and so on)?
|
894dd42ddc6479c5e1e4e01a8c9334df
|
{
"intermediate": 0.43756774067878723,
"beginner": 0.18961085379123688,
"expert": 0.3728214204311371
}
|
42,444
|
I have an image file (invoice) and a JSON with a list of extracted entities. And I have a CSV file containing OCR Textract output with these columns (page_num, block_num, line_num, word_num, left, right, top, bottom, width, height, conf, text, image_height, image_width).
JSON Format: {
  "invoice_details": {
    "invoice_number": "",
    "invoice_date": "",
    "invoice_due_date": "",
    "order_id": "",
    "vendor_name": "",
    "buyer_name": "",
    "shipto_name": "",
    "billto_name": ""
  },
  "Payment Details": {
    "vendor_ifsccode": "",
    "vendor_bankname": "",
    "account_number": "",
    "bank_payment_terms": ""
  },
  "amounts_and_tax": {
    "Subtotal_or_taxable_amount": "",
    "total_sgst_amount": "",
    "total_cgst_amount": "",
    "total_igst_amount": "",
    "total_amount_after_tax": "",
    "billto_GSTIN": "",
    "vendor_GSTIN": "",
    "shipto_GSTIN": ""
  }
}
I want to map the bounding boxes from the JSON to the image and draw the bounding boxes on the image.
Write highly advanced Python code to map the bounding boxes.
There are some important conditions the code should handle effectively:
1. I want separate functions to handle single-token and multi-token entities.
2. In the case of a multi-token entity: check whether all the words in the multi-token entity are near each other or not. They should be on the same line, or only on the next line. You can also check the distance between words before picking them up from the dataframe.
3. Repetition of any entity is not allowed. If we have already got the bounding box for an entity, then we should not consider that entity again.
4. In the case of a numerical value or entity, we have to remove the comma from both the dataframe and the extracted entity, and only then match the two.
afcc7c7c5dc74012f957aedaaf2a7940
|
{
"intermediate": 0.4149230718612671,
"beginner": 0.30761244893074036,
"expert": 0.27746447920799255
}
|
42,445
|
pip check taichi version command
|
a475b59d124175ef6ba00a57a17c6c41
|
{
"intermediate": 0.41582101583480835,
"beginner": 0.2610974609851837,
"expert": 0.3230815529823303
}
|
42,446
|
I can see taichi in pip, but I can't import taichi as ti.
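When pip lists a package but the import fails, the pip command and the interpreter often belong to different environments. A quick diagnostic sketch (the package name 'taichi' is taken from the question; any name works):

```python
import importlib.util
import sys

def locate(package):
    """Report which interpreter is running and where (if anywhere) it would import the package from."""
    spec = importlib.util.find_spec(package)
    return sys.executable, None if spec is None else spec.origin

exe, origin = locate("taichi")
print(exe)     # the interpreter actually running this code
print(origin)  # None means this interpreter cannot see the package
```

If `origin` is None, install into this exact interpreter with `python -m pip install taichi` (using the printed executable) rather than a bare `pip install`.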
|
a866628e443a57f30535fdf6317b4f20
|
{
"intermediate": 0.42270657420158386,
"beginner": 0.28793060779571533,
"expert": 0.2893628478050232
}
|
42,447
|
check py directory
|
a9adc4aa56cc95667404967936f56775
|
{
"intermediate": 0.36092808842658997,
"beginner": 0.27288588881492615,
"expert": 0.3661859929561615
}
|
42,448
|
write a script in powershell to find proxies with the local domain .loc in all shared mailboxes in exchange organization and remove them
|
e563a9af57008f9c031f3c6b122ac590
|
{
"intermediate": 0.3761841058731079,
"beginner": 0.1709899604320526,
"expert": 0.45282599329948425
}
|
42,449
|
I have collected a historical dataset of cryptocurrencies in which each row contains the following features:
Symbol Open High Low Close Volume Volume USDT tradecount volume_adi volume_obv volume_cmf volume_fi volume_em volume_sma_em volume_vpt volume_vwap volume_mfi volume_nvi volatility_bbm volatility_bbh volatility_bbl volatility_bbw volatility_bbp volatility_bbhi volatility_bbli volatility_kcc volatility_kch volatility_kcl volatility_kcw volatility_kcp volatility_kchi volatility_kcli volatility_dcl volatility_dch volatility_dcm volatility_dcw volatility_dcp volatility_atr volatility_ui trend_macd trend_macd_signal trend_macd_diff trend_sma_fast trend_sma_slow trend_ema_fast trend_ema_slow trend_vortex_ind_pos trend_vortex_ind_neg trend_vortex_ind_diff trend_trix trend_mass_index trend_dpo trend_kst trend_kst_sig trend_kst_diff trend_ichimoku_conv trend_ichimoku_base trend_ichimoku_a trend_ichimoku_b trend_stc trend_adx trend_adx_pos trend_adx_neg trend_cci trend_visual_ichimoku_a trend_visual_ichimoku_b trend_aroon_up trend_aroon_down trend_aroon_ind trend_psar_up trend_psar_down trend_psar_up_indicator trend_psar_down_indicator momentum_rsi momentum_stoch_rsi momentum_stoch_rsi_k momentum_stoch_rsi_d momentum_tsi momentum_uo momentum_stoch momentum_stoch_signal momentum_wr momentum_ao momentum_roc momentum_ppo momentum_ppo_signal momentum_ppo_hist momentum_pvo momentum_pvo_signal momentum_pvo_hist momentum_kama others_dr others_dlr others_cr T10YIE_extra T10Y2Y_extra DPRIME_extra oil_close_extra DeFi_cap_extra DEXCHUS_extra DEXJPUS_extra DEXUSEU_extra DJIA_extra EFFR_extra eth_gas_wei_extra fear_greed_extra gold_close_extra BAMLH0A0HYM2_extra DGS10_extra NASDAQCOM_extra silver_close_extra SP500_extra Tether_cap_extra Label
the Label column is as follows:
label as 2: If the price is ascending in the next 5 days and increases more than 5%
label as 1: If the price is ascending in the next 5 days and increases more than 2%
label as 0: If the price increases or decreases less than 2% in the next 5 days, or the trend over the next five days can't be determined
label as 3: If the price is descending in the next 5 days and decreases more than 2%
label as 4: If the price is descending in the next 5 days and decreases more than 2%
I want to train a deep learning model on this dataset.
What architecture do you suggest I use based on the number of features (365) and rows (350,000) (how many layers, neurons, and so on)?
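For tabular data of this shape, a modest fully connected network is a common starting point; exact sizes are judgment calls, not derivable from the row/feature counts alone. A pure-Python helper for comparing candidate layouts by parameter count (the 365 → 512 → 256 → 128 → 5 layout is just one example):

```python
def mlp_param_count(layer_sizes):
    """Total weights + biases for a fully connected stack of the given widths."""
    total = 0
    for fan_in, fan_out in zip(layer_sizes, layer_sizes[1:]):
        total += fan_in * fan_out + fan_out  # weight matrix + bias vector
    return total

sizes = [365, 512, 256, 128, 5]
print(mlp_param_count(sizes))  # 352261
```

With ~350,000 rows, keeping the parameter count on this order (roughly one parameter per row) plus dropout/batch norm is a reasonable, if rough, overfitting guard.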
|
ff71de7e7e95238337ef9303efaf90f9
|
{
"intermediate": 0.20941025018692017,
"beginner": 0.1270407736301422,
"expert": 0.66354900598526
}
|
42,450
|
i get this error: Epoch 1: 0%| | 0/1 [00:00<?, ?it/s]C:\Users\L14\AppData\Local\Programs\Python\Python310\lib\site-packages\torch\optim\lr_scheduler.py:152: UserWarning: The epoch parameter in `scheduler.step()` was not necessary and is being deprecated where possible. Please use `scheduler.step()` to step the scheduler. During the deprecation, if epoch is different from None, the closed form is used instead of the new chainable form, where available. Please open an issue if you are unable to replicate your use case: https://github.com/pytorch/pytorch/issues/new/choose.
warnings.warn(EPOCH_DEPRECATION_WARNING, UserWarning)
Epoch 1, Average Loss: 5.066798
when I run this code:
import os
import torch
import torch.nn as nn
import torch.nn.functional as F
import json
import math
from torch.nn.utils.rnn import pad_sequence
from torch.utils.data import DataLoader, Dataset
from tqdm import tqdm
import matplotlib.pyplot as plt
from sklearn.metrics import precision_score, recall_score, f1_score, accuracy_score
from tokenizers import Tokenizer
from torch.optim.lr_scheduler import SequentialLR, StepLR, LinearLR
# ---------- Device Configuration ----------
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
# ---------- Utility Functions ----------
def positional_encoding(seq_len, d_model, device):
pos = torch.arange(seq_len, dtype=torch.float, device=device).unsqueeze(1)
div_term = torch.exp(torch.arange(0, d_model, 2).float() * (-math.log(10000.0) / d_model)).to(device)
pe = torch.zeros(seq_len, d_model, device=device)
pe[:, 0::2] = torch.sin(pos * div_term)
pe[:, 1::2] = torch.cos(pos * div_term)
return pe.unsqueeze(0)
# -------- Performance ----------
def evaluate_model(model, data_loader, device):
model.eval()
all_preds, all_targets = [], []
with torch.no_grad():
for inputs, targets in data_loader:
inputs, targets = inputs.to(device), targets.to(device)
outputs = model(inputs)
predictions = torch.argmax(outputs, dim=-1).view(-1) # Flatten predicted indices
all_preds.extend(predictions.cpu().numpy())
all_targets.extend(targets.view(-1).cpu().numpy()) # Ensure targets are also flattened
# Calculate precision, recall, and F1 score after ensuring all_preds and all_targets are correctly aligned.
accuracy = accuracy_score(all_targets, all_preds)
precision = precision_score(all_targets, all_preds, average='macro', zero_division=0)
recall = recall_score(all_targets, all_preds, average='macro', zero_division=0)
f1 = f1_score(all_targets, all_preds, average='macro', zero_division=0)
print(f"Accuracy: {accuracy:.4f}")
print(f"Precision: {precision:.4f}")
print(f"Recall: {recall:.4f}")
print(f"F1 Score: {f1:.4f}")
return accuracy ,precision, recall, f1
# Function to plot loss over time
def plot_loss(loss_history):
plt.figure(figsize=(10, 5))
plt.plot(loss_history, label='Training Loss')
plt.xlabel('Batches')
plt.ylabel('Loss')
plt.title('Training Loss Over Time')
plt.legend()
plt.show()
# ---------- Model Definitions ----------
class TransformerExpert(nn.Module):
def __init__(self, input_size, d_model, output_size, nhead, dim_feedforward, num_encoder_layers=1):
super(TransformerExpert, self).__init__()
self.d_model = d_model
self.input_fc = nn.Linear(input_size, d_model)
self.pos_encoder = nn.Parameter(positional_encoding(1, d_model, device), requires_grad=True)
encoder_layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead,
dim_feedforward=dim_feedforward,
batch_first=True,
norm_first=True)
self.transformer_encoder = nn.TransformerEncoder(encoder_layer, num_layers=num_encoder_layers)
self.output_fc = nn.Linear(d_model, output_size)
self.norm = nn.LayerNorm(d_model)
def forward(self, x):
seq_len = x.shape[1]
pos_encoder = positional_encoding(seq_len, self.d_model, device)
x = self.norm(self.input_fc(x)) + pos_encoder
transformer_output = self.transformer_encoder(x)
output = self.output_fc(transformer_output)
return output
class GatingNetwork(nn.Module):
def __init__(self, input_feature_dim, num_experts, hidden_dims=[256], dropout_rate=0.2):
super(GatingNetwork, self).__init__()
layers = []
last_dim = input_feature_dim
for hidden_dim in hidden_dims:
layers.extend([
nn.Linear(last_dim, hidden_dim),
nn.GELU(),
nn.Dropout(dropout_rate),
])
last_dim = hidden_dim
layers.append(nn.Linear(last_dim, num_experts))
self.fc_layers = nn.Sequential(*layers)
self.softmax = nn.Softmax(dim=1)
def forward(self, x):
x = x.mean(dim=1) # To ensure gating is based on overall features across the sequence
x = self.fc_layers(x)
return self.softmax(x)
class MixtureOfTransformerExperts(nn.Module):
def __init__(self, input_size, d_model, output_size, nhead, dim_feedforward, num_experts, num_encoder_layers=1):
super(MixtureOfTransformerExperts, self).__init__()
self.num_experts = num_experts
self.output_size = output_size
self.experts = nn.ModuleList([TransformerExpert(input_size, d_model, output_size, nhead, dim_feedforward, num_encoder_layers) for _ in range(num_experts)])
self.gating_network = GatingNetwork(d_model, num_experts)
def forward(self, x):
gating_scores = self.gating_network(x)
expert_outputs = [expert(x) for expert in self.experts]
stacked_expert_outputs = torch.stack(expert_outputs)
expanded_gating_scores = gating_scores.unsqueeze(2).unsqueeze(3)
expanded_gating_scores = expanded_gating_scores.expand(-1, -1, x.size(1), self.output_size)
expanded_gating_scores = expanded_gating_scores.transpose(0, 1)
mixed_output = torch.sum(stacked_expert_outputs * expanded_gating_scores, dim=0)
return mixed_output
class MoETransformerModel(nn.Module):
def __init__(self, vocab_size, d_model, moe):
super(MoETransformerModel, self).__init__()
self.embedding = nn.Embedding(num_embeddings=vocab_size, embedding_dim=d_model)
self.moe = moe
self.dropout = nn.Dropout(p=0.1)
def forward(self, x):
embedded = self.dropout(self.embedding(x))
return self.moe(embedded)
# ---------- Dataset Definitions ----------
class QAJsonlDataset(Dataset):
def __init__(self, path, seq_len, tokenizer_path):
# Load the trained tokenizer
self.tokenizer = Tokenizer.from_file(tokenizer_path)
self.seq_len = seq_len
self.pairs = self.load_data(path)
# Using BPE, so no need for manual vocab or idx2token.
# Tokenization will now happen using self.tokenizer.
self.tokenized_pairs = [(self.tokenize(q), self.tokenize(a)) for q, a in self.pairs]
def load_data(self, path):
pairs = []
with open(path, "r", encoding="utf-8") as f:
for line in f:
data = json.loads(line.strip())
question, answer = data.get("input", ""), data.get("output", "")
pairs.append((question, answer)) # Store questions and answers as raw strings
return pairs
def tokenize(self, text):
# Tokenizing using the BPE tokenizer
encoded = self.tokenizer.encode(text)
tokens = encoded.ids
# Padding/truncation
if len(tokens) < self.seq_len:
# Padding
tokens += [self.tokenizer.token_to_id("<pad>")] * (self.seq_len - len(tokens))
else:
# Truncation
tokens = tokens[:self.seq_len - 1] + [self.tokenizer.token_to_id("<eos>")]
return tokens
def __len__(self):
return len(self.tokenized_pairs)
def __getitem__(self, idx):
tokenized_question, tokenized_answer = self.tokenized_pairs[idx]
return torch.tensor(tokenized_question, dtype=torch.long), torch.tensor(tokenized_answer, dtype=torch.long)
def collate_fn(batch):
questions, answers = zip(*batch)
questions = pad_sequence(questions, batch_first=True, padding_value=0)
answers = pad_sequence(answers, batch_first=True, padding_value=0)
return questions, answers
# ---------- Training and Inference Functions ----------
def train_model(model, criterion, optimizer, num_epochs, data_loader, label_smoothing=0.1):
criterion = nn.CrossEntropyLoss(label_smoothing=label_smoothing)
model.train()
loss_history = [] # Initialize a list to keep track of losses
for epoch in range(num_epochs):
total_loss = 0
total_items = 0 # Keep track of total items processed
progress_bar = tqdm(enumerate(data_loader), total=len(data_loader), desc=f"Epoch {epoch+1}", leave=False)
for i, (inputs, targets) in progress_bar:
inputs, targets = inputs.to(device), targets.to(device)
optimizer.zero_grad()
# Predict
predictions = model(inputs)
predictions = predictions.view(-1, predictions.size(-1)) # Make sure predictions are the right shape
targets = targets.view(-1) # Flatten targets to match prediction shape if necessary
# Calculate loss
loss = criterion(predictions, targets)
loss.backward()
# Gradient clipping for stabilization
torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)
optimizer.step()
scheduler.step()
# Update total loss and the number of items
total_loss += loss.item() * inputs.size(0) # Multiply loss by batch size
total_items += inputs.size(0)
loss_history.append(loss.item())
progress_bar.set_postfix({"Loss": loss.item()})
average_loss = total_loss / total_items # Correctly compute average loss
print(f"Epoch {epoch+1}, Average Loss: {average_loss:.6f}")
return loss_history
class WarmupLR(torch.optim.lr_scheduler._LRScheduler):
    def __init__(self, optimizer, warmup_steps, scheduler_step_lr):
        self.warmup_steps = warmup_steps
        self.scheduler_step_lr = scheduler_step_lr  # The subsequent scheduler
        super(WarmupLR, self).__init__(optimizer)

    def get_lr(self):
        # get_lr must return a list of learning rates, not act as a generator
        if self._step_count <= self.warmup_steps:
            warmup_factor = float(self._step_count) / float(max(1, self.warmup_steps))
            return [base_lr * warmup_factor for base_lr in self.base_lrs]
        self.scheduler_step_lr.step()  # Advance the subsequent scheduler once warmup is over
        return [param_group['lr'] for param_group in self.optimizer.param_groups]
class GERU(nn.Module):
def __init__(self, in_features):
super(GERU, self).__init__()
self.alpha = nn.Parameter(torch.rand(in_features))
def forward(self, x):
return torch.max(x, torch.zeros_like(x)) + self.alpha * torch.min(x, torch.zeros_like(x))
def generate_text(model, tokenizer, seed_text, num_generate, temperature=1.0):
model.eval()
generated_tokens = []
# Encode the seed text using the tokenizer
encoded_input = tokenizer.encode(seed_text)
input_ids = torch.tensor(encoded_input.ids, dtype=torch.long).unsqueeze(0).to(device)
# Generate num_generate tokens
with torch.no_grad():
for _ in range(num_generate):
output = model(input_ids)
# Get the last logits and apply temperature
logits = output[:, -1, :] / temperature
probabilities = F.softmax(logits, dim=-1)
next_token_id = torch.multinomial(probabilities, num_samples=1).item()
# Append generated token ID and prepare the new input_ids
generated_tokens.append(next_token_id)
input_ids = torch.cat([input_ids, torch.tensor([[next_token_id]], dtype=torch.long).to(device)], dim=1)
# Decode the generated token IDs back to text
generated_text = tokenizer.decode(generated_tokens)
return generated_text
def count_tokens_in_dataset(dataset):
    # Sum token ids over the tokenized pairs (summing over the raw strings would count characters)
    return sum(len(q) + len(a) for q, a in dataset.tokenized_pairs)
def count_parameters(model):
return sum(p.numel() for p in model.parameters() if p.requires_grad)
# ---------- Hyperparameters and Model Instantiation ----------
# Transformer :
d_model = 32
nhead = 2
dim_feedforward = 128
num_encoder_layers = 1
num_experts = 1
# Training Parameters
batch_size = 32 # Adjustable batch size
optimizer_type = "AdamW"  # Could be "SGD", "RMSprop", etc.
learning_rate = 1e-6
weight_decay = 1e-5 # For L2 regularization
num_epochs = 10
# Dataset :
path_to_dataset = "C:/Users/L14/Documents/Projets/Easy-MoE/Easy-MoE/data/basic.jsonl"
tokenizer_path = "BPE_tokenizer(basic).json"
seq_len = 32
dataset = QAJsonlDataset(path_to_dataset, seq_len, tokenizer_path)
data_loader = DataLoader(dataset, batch_size=batch_size, shuffle=True, collate_fn=collate_fn, pin_memory=True)
num_tokens = count_tokens_in_dataset(dataset)
print(f"Total number of tokens in the dataset: {num_tokens}")
# Load the tokenizer
tokenizer = Tokenizer.from_file(tokenizer_path)
# Determine the vocabulary size
vocab_size = tokenizer.get_vocab_size()
moe = MixtureOfTransformerExperts(
input_size=d_model,
d_model=d_model,
output_size=vocab_size,
nhead=nhead,
dim_feedforward=dim_feedforward,
num_experts=num_experts,
num_encoder_layers=num_encoder_layers
).to(device)
moe_transformer_model = MoETransformerModel(vocab_size, d_model, moe).to(device)
# Count of total parameters :
total_params = count_parameters(moe_transformer_model)
print(f"Total trainable parameters: {total_params}")
# ---------- Training ----------
# Adjusting optimizer setup to include weight decay and allow switching between types
if optimizer_type == "AdamW":
optimizer = torch.optim.AdamW(moe_transformer_model.parameters(), lr=learning_rate, weight_decay=weight_decay)
elif optimizer_type == "SGD":
optimizer = torch.optim.SGD(moe_transformer_model.parameters(), lr=learning_rate, momentum=0.9, weight_decay=weight_decay)
elif optimizer_type == "Adam":
optimizer = torch.optim.Adam(moe_transformer_model.parameters(), lr=learning_rate, weight_decay=weight_decay)
# Setup optimizers just like before
warmup_epochs = 1
scheduler1 = LinearLR(optimizer, start_factor=1e-5, total_iters=warmup_epochs)
scheduler2 = StepLR(optimizer, step_size=10, gamma=0.95)
scheduler = SequentialLR(optimizer, schedulers=[scheduler1, scheduler2], milestones=[warmup_epochs])
criterion = nn.CrossEntropyLoss(label_smoothing=0.1)
# Train the model
loss_history = train_model(moe_transformer_model, criterion, optimizer, num_epochs, data_loader)
# Evaluating the model
plot_loss(loss_history)
train_accuracy, train_precision, train_recall, train_f1 = evaluate_model(moe_transformer_model, data_loader, device)
# ---------- Inference ----------
def interactive_text_generation(model, tokenizer, max_length=32, temperature=1.0):
    while True:
        try:
            # Get user input
            seed_text = input("Enter seed text (type 'quit' to exit and save the model): ").strip()
            # Check if the user wants to quit the interaction
            if seed_text.lower() == 'quit':
                print("Exiting text generation mode.")
                break
            # Generate text based on the seed text
            if seed_text:
                generated_text = generate_text(model, tokenizer, seed_text, max_length, temperature)  # Modify max_length/temperature as needed
                print("Generated Text:", generated_text)
            else:
                print("Seed text cannot be empty. Please enter some text.")
        except Exception as e:
            # Handle potential errors gracefully
            print(f"An error occurred: {e}. Try again.")
interactive_text_generation(moe_transformer_model, tokenizer)
# ---------- Save Trained Model ----------
def save_model_with_config(model, config, save_dir, model_name):
"""
Saves the model weights, configuration, and performance metrics.
Parameters:
- model: the PyTorch model to save.
- config: a dictionary with the model's configuration.
- metrics: a dictionary with the model's performance metrics.
- save_dir: the root directory to save the model and its info.
- model_name: the name of the model, used to create a subdirectory.
"""
model_path = os.path.join(save_dir, model_name)
os.makedirs(model_path, exist_ok=True)
    # Save model weights
    torch.save(model.state_dict(), os.path.join(model_path, 'model.pth'))
# Save configuration
with open(os.path.join(model_path, 'config.json'), 'w') as config_file:
json.dump(config, config_file, indent=4)
# Save metrics
#with open(os.path.join(model_path, 'metrics.json'), 'w') as metrics_file:
# json.dump(metrics, metrics_file, indent=4)
    print(f"Model and configuration saved in {model_path}")
config = {
'd_model': d_model,'nhead': nhead,'dim_feedforward': dim_feedforward,'num_encoder_layers': num_encoder_layers,
'num_experts': num_experts,'seq_len': seq_len,'batch_size': batch_size,'learning_rate': learning_rate,
'weight_decay': weight_decay,'num_epochs': num_epochs,
}
save_model_with_config(moe_transformer_model, config, "Trained_models", "Transformer-Alpha-v04")
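About the UserWarning in the traceback: it is raised from inside SequentialLR when it hands off between its child schedulers at a milestone. The training loop above already calls `scheduler.step()` with no arguments, so there is nothing wrong with the loop itself. If the message is noisy, one option (a sketch, not the only fix) is to filter it before training:

```python
import warnings

# Suppress only the deprecation notice emitted internally by SequentialLR;
# the message pattern is matched against the start of the warning text.
warnings.filterwarnings(
    "ignore",
    message=r"The epoch parameter in `scheduler\.step\(\)`",
    category=UserWarning,
)
```

Separately, note that because `scheduler.step()` is called once per batch, `StepLR(step_size=10)` here decays the learning rate every 10 batches, not every 10 epochs.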
|
e260ad8243da33c1e96513b8df84e2b5
|
{
"intermediate": 0.3810877203941345,
"beginner": 0.4453001618385315,
"expert": 0.17361211776733398
}
|
42,451
|
from playwright.sync_api import sync_playwright
import concurrent.futures
import time
# user_data = [
# {
# "username": "muskan123",
# "password": "Kumari@3011",
# "passport": "123456789",
# "country": "India",
# "birth_year": "2001",
# },
# {
# "username": "muskankum",
# "password": "Muskan@3012",
# "passport": "123456789",
# "country": "India",
# "birth_year": "2001",
# },
# {
# "username": "katz",
# "password": "P@ss12344444",
# "passport": "123456789",
# "country": "India",
# "birth_year": "2001",
# }
# ]
def run() -> bool:
with sync_playwright() as p:
s = time.time()
# Launch a Chromium browser
browser = p.chromium.launch(headless=False, slow_mo=800)
context = browser.new_context()
# Start tracing for screenshots, snapshots, and sources
context.tracing.start(screenshots=True, snapshots=True, sources=True)
# Create a new page and navigate to the website
page = context.new_page()
page.goto("https://smart.gdrfad.gov.ae/HomePage.aspx?GdfraLocale=en-US")
# Fill in username and password, then click on the login button
page.get_by_label("lblUsername").fill("john9898")
page.get_by_label("lblPassword").fill("Kalpesh@08")
page.get_by_role("button", name="Login").click()
# Click on the link to create a new application
page.get_by_role("link", name="New Application Create new").click()
# Click on the link to create a new 5 years tourism entry application
page.get_by_role("link", name="New 5 Years Tourism Entry").click()
# Fill Emirate in address inside UAE
page.locator(
"#s2id_EmaratechSG_Theme_wt789_block_wtFormContent_SmartChannels_Application_CW_wt629_block_WebPatterns_wtcntAppSimpleSecExpandable_block_wtContent_wtContent_wtcmbAddressInsideEmiratesId").get_by_role(
"link", name="-- Select -- ").click()
page.locator("li").filter(has_text="DUBAI").click()
# Select purpose of visit
page.locator(
"#s2id_EmaratechSG_Theme_wt789_block_wtFormContent_SmartChannels_Application_CW_wt586_block_WebPatterns_wtcntAppSimpleSecExpandable_block_wtContent_wtContent_wtcmbVisitPurpose").get_by_role(
"link", name="-- Select -- ").click()
page.locator("li").filter(has_text="Tourism").click()
# Select passport type
page.locator(
"#s2id_EmaratechSG_Theme_wt789_block_wtFormContent_SmartChannels_Application_CW_wt217_block_WebPatterns_wtcntAppSimpleSecExpandable_block_wtContent_wtContent_wtcmbPassportTypeId").get_by_role(
"link", name="-- Select -- ").click()
page.locator("li").filter(has_text="Normal").click()
# Fill in passport number
page.get_by_label("Passport Number").fill("P100000")
# Select current nationality
page.locator(
"#s2id_EmaratechSG_Theme_wt789_block_wtFormContent_SmartChannels_Application_CW_wt217_block_WebPatterns_wtcntAppSimpleSecExpandable_block_wtContent_wtContent_wtcmbApplicantNationality").get_by_role(
"link", name="-- Select -- ").click()
page.get_by_role("combobox", name="Current Nationality", exact=True).fill("india")
page.get_by_text("INDIA", exact=True).click()
# Select previous nationality
page.locator(
"#s2id_EmaratechSG_Theme_wt789_block_wtFormContent_SmartChannels_Application_CW_wt217_block_WebPatterns_wtcntAppSimpleSecExpandable_block_wtContent_wtContent_wtcmbApplicantPreviousNationality").get_by_role(
"link", name="-- Select -- ").click()
page.get_by_role("combobox", name="Previous Nationality", exact=True).fill("india")
page.get_by_text("INDIA", exact=True).click()
# Click Search Data button
page.get_by_role("button", name="Search Data").click()
# Enter mother's name in english
page.get_by_label("Mother Name En").fill("SIYA DEO")
# Click button to translate mother's name to arabic
page.locator('css=div[onclick="translateInputText(\'inpMotherNameEn\');"] img').click()
        # Re-enter mother's name in English after the Arabic translation
        page.get_by_label("Mother Name En").fill("SIYA DEO")
# Fill religion
page.locator(
"#s2id_EmaratechSG_Theme_wt789_block_wtFormContent_SmartChannels_Application_CW_wt437_block_WebPatterns_wtcntAppSimpleSecExpandable_block_wtContent_wtContent_wtcmbApplicantReligion").get_by_role(
"link", name="-- Select -- ").click()
page.locator("li").filter(has_text="HINDHU").click()
# Fill marital status
page.locator(
"#s2id_EmaratechSG_Theme_wt789_block_wtFormContent_SmartChannels_Application_CW_wt437_block_WebPatterns_wtcntAppSimpleSecExpandable_block_wtContent_wtContent_wtcmbApplicantMaritalStatus").get_by_role(
"link", name="-- Select -- ").click()
page.locator("li").filter(has_text="SINGLE").click()
# Fill education
page.locator(
"#s2id_EmaratechSG_Theme_wt789_block_wtFormContent_SmartChannels_Application_CW_wt437_block_WebPatterns_wtcntAppSimpleSecExpandable_block_wtContent_wtContent_wtcmbApplicantEducation").get_by_role(
"link", name="-- Select -- ").click()
page.locator("li").filter(has_text="B.SC. COMMERCE & BUSINESS ADMN").click()
# Fill language
page.locator(
"#s2id_EmaratechSG_Theme_wt789_block_wtFormContent_SmartChannels_Application_CW_wt437_block_WebPatterns_wtcntAppSimpleSecExpandable_block_wtContent_wtContent_wtcmbApplicantLanguage1").get_by_role(
"link", name="-- Select -- ").click()
page.locator("li").filter(has_text="ENGLISH").click()
# Fill coming from country
page.locator(
"#s2id_EmaratechSG_Theme_wt789_block_wtFormContent_SmartChannels_Application_CW_wt437_block_WebPatterns_wtcntAppSimpleSecExpandable_block_wtContent_wtContent_wtcmbComingFromCountry").get_by_role(
"link", name="-- Select -- ").click()
page.get_by_role("combobox", name="Coming From Country", exact=True).fill("india")
page.get_by_text("INDIA", exact=True).click()
# Fill bank statement country
page.locator(
"#s2id_EmaratechSG_Theme_wt789_block_wtFormContent_SmartChannels_Application_CW_wt437_block_WebPatterns_wtcntAppSimpleSecExpandable_block_wtContent_wtContent_wtcmbBankStatementCountry").get_by_role(
"link", name="-- Select -- ").click()
page.get_by_role("combobox", name="Bank Satement Country", exact=True).fill("india")
page.get_by_text("INDIA", exact=True).click()
# Fill profession
page.get_by_placeholder("Type or click for list").click()
page.get_by_placeholder("Type or click for list").fill("MANAGER")
# Fill email
page.get_by_label("Email", exact=True).fill("rpa.support@visaero.com")
page.get_by_label("Approval Email Copy Recipient").fill("rpa.support@visaero.com")
# Fill contact details mobile number
page.locator(
"#EmaratechSG_Theme_wt789_block_wtFormContent_SmartChannels_Application_CW_wt629_block_WebPatterns_wtcntAppSimpleSecExpandable_block_wtContent_wtContent_wtinpMobileNumber").fill(
"0559856719")
# Fill preferred sms langauge
page.locator(
"#s2id_EmaratechSG_Theme_wt789_block_wtFormContent_SmartChannels_Application_CW_wt629_block_WebPatterns_wtcntAppSimpleSecExpandable_block_wtContent_wtContent_wtcmbPreferredLanguage").get_by_role(
"link", name="-- Select -- ").click()
page.locator("li").filter(has_text="ENGLISH").click()
# Click address inside the UAE city
page.locator(
"#s2id_EmaratechSG_Theme_wt789_block_wtFormContent_SmartChannels_Application_CW_wt629_block_WebPatterns_wtcntAppSimpleSecExpandable_block_wtContent_wtContent_wtcmbAddressInsideCityId").get_by_role(
"link", name="-- Select -- ").click()
page.locator("li").filter(has_text="Dubai").click()
# Fill address inside the UAE street, building/villa, floor and flat/villa number
page.get_by_label("Street").fill("SHAIKH ZAYED ROAD")
page.get_by_label("Building/Villa").fill("LATIFA TOWER")
page.get_by_label("Floor").fill("3")
page.get_by_label("Flat/Villa no.").fill("310")
# Fill address outside the UAE country, city
page.locator(
"#s2id_EmaratechSG_Theme_wt789_block_wtFormContent_SmartChannels_Application_CW_wt629_block_WebPatterns_wtcntAppSimpleSecExpandable_block_wtContent_wtContent_wtcmbApplicantOutsideCountry").get_by_role(
"link", name="-- Select -- ").click()
page.get_by_role("combobox", name="Country", exact=True).fill("india")
page.get_by_text("INDIA", exact=True).click()
page.get_by_role("textbox", name="City *").fill("MUMBAI")
# Fill address outside the UAE address
page.get_by_label("Address").fill("MALAPPURAM DIST, KERALA -676507.^H301318430/03/2009 JEDDAH")
# Fill address outside the UAE mobile number
page.locator(
"#EmaratechSG_Theme_wt789_block_wtFormContent_SmartChannels_Application_CW_wt629_block_WebPatterns_wtcntAppSimpleSecExpandable_block_wtContent_wtContent_wtinpAddressOutsideMobileNumber").click()
page.locator(
"#EmaratechSG_Theme_wt789_block_wtFormContent_SmartChannels_Application_CW_wt629_block_WebPatterns_wtcntAppSimpleSecExpandable_block_wtContent_wtContent_wtinpAddressOutsideMobileNumber").press_sequentially(
"0501234567")
# Fill applicant faith
page.locator(
"#s2id_EmaratechSG_Theme_wt789_block_wtFormContent_SmartChannels_Application_CW_wt437_block_WebPatterns_wtcntAppSimpleSecExpandable_block_wtContent_wtContent_wtcmbApplicantFaith").get_by_role(
"link", name="-- Select -- ").click()
page.locator("li").filter(has_text="UnKnown").click()
# Address Inside UAE area
page.get_by_role("link", name="Select Area ").click()
page.locator("li").filter(has_text="Trade Centre 1").click()
# Go on next page
# page.get_by_role("button", name="Continue").click()
# Stop tracing and close the context and browser
context.tracing.stop(path="trace.zip")
# ---------------------
context.close()
browser.close()
e = time.time()
print(e - s)
return True
# Run for each user concurrently
with concurrent.futures.ThreadPoolExecutor() as executor:
executor.submit(run)
modify this code to run for 20 processes
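To run 20 sessions at once, the single `executor.submit(run)` call can be replaced by a loop over 20 submissions. A minimal shape of that change, using a stand-in task here since launching 20 real browsers is environment-dependent; in the real script each worker must create its own `sync_playwright()` instance, browser, and context, because the sync API objects cannot be shared across threads:

```python
from concurrent.futures import ThreadPoolExecutor

def run_session(session_id):
    # Stand-in for run(); each real call would open its own sync_playwright()
    # instance, browser, and context inside the worker.
    return session_id

with ThreadPoolExecutor(max_workers=20) as executor:
    futures = [executor.submit(run_session, i) for i in range(20)]
    results = [future.result() for future in futures]
```

If genuinely separate processes (rather than threads) are wanted, `ProcessPoolExecutor` has the same interface, but the submitted function must be defined at module top level so it can be pickled.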
|
d5a1da294de6a56a212a4aaaa52cfff4
|
{
"intermediate": 0.3493892550468445,
"beginner": 0.3939765989780426,
"expert": 0.2566341161727905
}
|
42,452
|
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import seaborn as sns
from sklearn.model_selection import train_test_split
from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score
from sklearn.metrics import classification_report, confusion_matrix, roc_curve, roc_auc_score
from xgboost import XGBClassifier
# Generating synthetic data for demonstration
X, y = make_classification(n_samples=1000, n_features=20, n_informative=2, n_redundant=10, n_classes=2, random_state=42)
X = pd.DataFrame(X)
y = pd.Series(y)
# Splitting dataset into training and testing sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)
# Initializing and training the XGBoost classifier
model = XGBClassifier(use_label_encoder=False, eval_metric='logloss')
model.fit(X_train, y_train)
# Predicting on the test set
y_pred = model.predict(X_test)
y_score = model.predict_proba(X_test)[:, 1] # Probability estimates for ROC curve
# Calculate accuracy, precision, and F1-score
accuracy = accuracy_score(y_test, y_pred)
precision = precision_score(y_test, y_pred)
recall = recall_score(y_test, y_pred)
f1 = f1_score(y_test, y_pred)
print(f"Accuracy: {accuracy:.2f}, Precision: {precision:.2f}, Recall: {recall:.2f}, F1 Score: {f1:.2f}")
# Show classification report
print("Classification Report:")
print(classification_report(y_test, y_pred))
# Confusion matrix with plots
plt.figure(figsize=(6, 5))
cm = confusion_matrix(y_test, y_pred)
sns.heatmap(cm, annot=True, fmt='d', cmap='Blues')
plt.title('Confusion Matrix')
plt.xlabel('Predicted')
plt.ylabel('Actual')
plt.show()
# ROC curve
fpr, tpr, _ = roc_curve(y_test, y_score)
roc_auc = roc_auc_score(y_test, y_score)
plt.figure()
plt.plot(fpr, tpr, color='darkorange', lw=2, label=f'ROC curve (area = {roc_auc:.2f})')
plt.plot([0, 1], [0, 1], color='navy', lw=2, linestyle=":")
plt.xlim([0.0, 1.0])
plt.ylim([0.0, 1.05])
plt.xlabel('False Positive Rate')
plt.ylabel('True Positive Rate')
plt.title('Receiver Operating Characteristic')
plt.legend(loc="lower right")
plt.show()
# Visualize feature correlations
plt.figure(figsize=(10, 8))
corr = X_train.corr()
sns.heatmap(corr, cmap='coolwarm', annot=True)
plt.title('Feature Correlation Matrix')
plt.show()
# Density plot of feature values by class
spam_indices = y_train[y_train == 1].index
ham_indices = y_train[y_train == 0].index
for feature in X_train.columns[:5]: # For demonstration, plot the first 5 features
plt.figure(figsize=(8, 6))
sns.kdeplot(X_train.loc[spam_indices, feature], label=f'Spam - {feature}')
sns.kdeplot(X_train.loc[ham_indices, feature], label=f'Ham - {feature}')
plt.title(f'Density Plot of {feature} by Class')
plt.legend()
plt.show()
# Actual vs. Predicted labels plot
plt.figure(figsize=(8, 6))
sns.countplot(x=y_test, color='blue', alpha=0.5, label='Actual')
sns.countplot(x=y_pred, color='red', alpha=0.5, label='Predicted')
plt.title('Actual vs. Predicted Label Counts')
plt.legend()
plt.show()
your_dataset = pd.read_csv('Emails(modified)reduced.csv')
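As a cross-check on the `precision_score`, `recall_score`, and `f1_score` calls above, the binary-case metrics reduce to simple counts of true/false positives and negatives. A dependency-free sketch:

```python
def precision_recall_f1(y_true, y_pred):
    """Binary precision/recall/F1 from raw label lists (positive class = 1)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    return precision, recall, f1

# Example: TP=2, FP=1, FN=1, so all three metrics equal 2/3
p, r, f1 = precision_recall_f1([1, 1, 0, 0, 1], [1, 0, 0, 1, 1])
```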
|
a87ab23f6e9bd547ea4e3807b20b3e53
|
{
"intermediate": 0.3854224383831024,
"beginner": 0.431572288274765,
"expert": 0.18300528824329376
}
|
42,453
|
from playwright.sync_api import sync_playwright
import concurrent.futures
import time
# user_data = [
# {
# "username": "muskan123",
# "password": "Kumari@3011",
# "passport": "123456789",
# "country": "India",
# "birth_year": "2001",
# },
# {
# "username": "muskankum",
# "password": "Muskan@3012",
# "passport": "123456789",
# "country": "India",
# "birth_year": "2001",
# },
# {
# "username": "katz",
# "password": "P@ss12344444",
# "passport": "123456789",
# "country": "India",
# "birth_year": "2001",
# }
# ]
def run_browser_session() -> bool:
with sync_playwright() as p:
# Start timer
s = time.time()
# Launch a Chromium browser
browser = p.chromium.launch(headless=False, slow_mo=800)
context = browser.new_context()
# Start tracing for screenshots, snapshots, and sources
context.tracing.start(screenshots=True, snapshots=True, sources=True)
# Create a new page and navigate to the website
page = context.new_page()
page.goto("https://smart.gdrfad.gov.ae/HomePage.aspx?GdfraLocale=en-US")
# Fill in username and password, then click on the login button
page.get_by_label("lblUsername").fill("john9898")
page.get_by_label("lblPassword").fill("Kalpesh@08")
page.get_by_role("button", name="Login").click()
# Click on the link to create a new application
page.get_by_role("link", name="New Application Create new").click()
# Click on the link to create a new 5 years tourism entry application
page.get_by_role("link", name="New 5 Years Tourism Entry").click()
# Fill Emirate in address inside UAE
page.locator(
"#s2id_EmaratechSG_Theme_wt789_block_wtFormContent_SmartChannels_Application_CW_wt629_block_WebPatterns_wtcntAppSimpleSecExpandable_block_wtContent_wtContent_wtcmbAddressInsideEmiratesId").get_by_role(
"link", name="-- Select -- ").click()
page.locator("li").filter(has_text="DUBAI").click()
# Select purpose of visit
page.locator(
"#s2id_EmaratechSG_Theme_wt789_block_wtFormContent_SmartChannels_Application_CW_wt586_block_WebPatterns_wtcntAppSimpleSecExpandable_block_wtContent_wtContent_wtcmbVisitPurpose").get_by_role(
"link", name="-- Select -- ").click()
page.locator("li").filter(has_text="Tourism").click()
# Select passport type
page.locator(
"#s2id_EmaratechSG_Theme_wt789_block_wtFormContent_SmartChannels_Application_CW_wt217_block_WebPatterns_wtcntAppSimpleSecExpandable_block_wtContent_wtContent_wtcmbPassportTypeId").get_by_role(
"link", name="-- Select -- ").click()
page.locator("li").filter(has_text="Normal").click()
# Fill in passport number
page.get_by_label("Passport Number").fill("P100000")
# Select current nationality
page.locator(
"#s2id_EmaratechSG_Theme_wt789_block_wtFormContent_SmartChannels_Application_CW_wt217_block_WebPatterns_wtcntAppSimpleSecExpandable_block_wtContent_wtContent_wtcmbApplicantNationality").get_by_role(
"link", name="-- Select -- ").click()
page.get_by_role("combobox", name="Current Nationality", exact=True).fill("india")
page.get_by_text("INDIA", exact=True).click()
# Select previous nationality
page.locator(
"#s2id_EmaratechSG_Theme_wt789_block_wtFormContent_SmartChannels_Application_CW_wt217_block_WebPatterns_wtcntAppSimpleSecExpandable_block_wtContent_wtContent_wtcmbApplicantPreviousNationality").get_by_role(
"link", name="-- Select -- ").click()
page.get_by_role("combobox", name="Previous Nationality", exact=True).fill("india")
page.get_by_text("INDIA", exact=True).click()
# Click Search Data button
page.get_by_role("button", name="Search Data").click()
# Enter mother's name in english
page.get_by_label("Mother Name En").fill("SIYA DEO")
# Click button to translate mother's name to arabic
page.locator('css=div[onclick="translateInputText(\'inpMotherNameEn\');"] img').click()
        # Re-enter mother's name in English after the Arabic translation
        page.get_by_label("Mother Name En").fill("SIYA DEO")
# Fill religion
page.locator(
"#s2id_EmaratechSG_Theme_wt789_block_wtFormContent_SmartChannels_Application_CW_wt437_block_WebPatterns_wtcntAppSimpleSecExpandable_block_wtContent_wtContent_wtcmbApplicantReligion").get_by_role(
"link", name="-- Select -- ").click()
page.locator("li").filter(has_text="HINDHU").click()
# Fill marital status
page.locator(
"#s2id_EmaratechSG_Theme_wt789_block_wtFormContent_SmartChannels_Application_CW_wt437_block_WebPatterns_wtcntAppSimpleSecExpandable_block_wtContent_wtContent_wtcmbApplicantMaritalStatus").get_by_role(
"link", name="-- Select -- ").click()
page.locator("li").filter(has_text="SINGLE").click()
# Fill education
page.locator(
"#s2id_EmaratechSG_Theme_wt789_block_wtFormContent_SmartChannels_Application_CW_wt437_block_WebPatterns_wtcntAppSimpleSecExpandable_block_wtContent_wtContent_wtcmbApplicantEducation").get_by_role(
"link", name="-- Select -- ").click()
page.locator("li").filter(has_text="B.SC. COMMERCE & BUSINESS ADMN").click()
# Fill language
page.locator(
"#s2id_EmaratechSG_Theme_wt789_block_wtFormContent_SmartChannels_Application_CW_wt437_block_WebPatterns_wtcntAppSimpleSecExpandable_block_wtContent_wtContent_wtcmbApplicantLanguage1").get_by_role(
"link", name="-- Select -- ").click()
page.locator("li").filter(has_text="ENGLISH").click()
# Fill coming from country
page.locator(
"#s2id_EmaratechSG_Theme_wt789_block_wtFormContent_SmartChannels_Application_CW_wt437_block_WebPatterns_wtcntAppSimpleSecExpandable_block_wtContent_wtContent_wtcmbComingFromCountry").get_by_role(
"link", name="-- Select -- ").click()
page.get_by_role("combobox", name="Coming From Country", exact=True).fill("india")
page.get_by_text("INDIA", exact=True).click()
# Fill bank statement country
page.locator(
"#s2id_EmaratechSG_Theme_wt789_block_wtFormContent_SmartChannels_Application_CW_wt437_block_WebPatterns_wtcntAppSimpleSecExpandable_block_wtContent_wtContent_wtcmbBankStatementCountry").get_by_role(
"link", name="-- Select -- ").click()
page.get_by_role("combobox", name="Bank Satement Country", exact=True).fill("india")
page.get_by_text("INDIA", exact=True).click()
# Fill profession
page.get_by_placeholder("Type or click for list").click()
page.get_by_placeholder("Type or click for list").fill("MANAGER")
# Fill email
page.get_by_label("Email", exact=True).fill("rpa.support@visaero.com")
page.get_by_label("Approval Email Copy Recipient").fill("rpa.support@visaero.com")
# Fill contact details mobile number
page.locator(
"#EmaratechSG_Theme_wt789_block_wtFormContent_SmartChannels_Application_CW_wt629_block_WebPatterns_wtcntAppSimpleSecExpandable_block_wtContent_wtContent_wtinpMobileNumber").fill(
"0559856719")
# Fill preferred SMS language
page.locator(
"#s2id_EmaratechSG_Theme_wt789_block_wtFormContent_SmartChannels_Application_CW_wt629_block_WebPatterns_wtcntAppSimpleSecExpandable_block_wtContent_wtContent_wtcmbPreferredLanguage").get_by_role(
"link", name="-- Select -- ").click()
page.locator("li").filter(has_text="ENGLISH").click()
# Click address inside the UAE city
page.locator(
"#s2id_EmaratechSG_Theme_wt789_block_wtFormContent_SmartChannels_Application_CW_wt629_block_WebPatterns_wtcntAppSimpleSecExpandable_block_wtContent_wtContent_wtcmbAddressInsideCityId").get_by_role(
"link", name="-- Select -- ").click()
page.locator("li").filter(has_text="Dubai").click()
# Fill address inside the UAE street, building/villa, floor and flat/villa number
page.get_by_label("Street").fill("SHAIKH ZAYED ROAD")
page.get_by_label("Building/Villa").fill("LATIFA TOWER")
page.get_by_label("Floor").fill("3")
page.get_by_label("Flat/Villa no.").fill("310")
# Fill address outside the UAE country, city
page.locator(
"#s2id_EmaratechSG_Theme_wt789_block_wtFormContent_SmartChannels_Application_CW_wt629_block_WebPatterns_wtcntAppSimpleSecExpandable_block_wtContent_wtContent_wtcmbApplicantOutsideCountry").get_by_role(
"link", name="-- Select -- ").click()
page.get_by_role("combobox", name="Country", exact=True).fill("india")
page.get_by_text("INDIA", exact=True).click()
page.get_by_role("textbox", name="City *").fill("MUMBAI")
# Fill address outside the UAE address
page.get_by_label("Address").fill("MALAPPURAM DIST, KERALA -676507.^H301318430/03/2009 JEDDAH")
# Fill address outside the UAE mobile number
page.locator(
"#EmaratechSG_Theme_wt789_block_wtFormContent_SmartChannels_Application_CW_wt629_block_WebPatterns_wtcntAppSimpleSecExpandable_block_wtContent_wtContent_wtinpAddressOutsideMobileNumber").click()
page.locator(
"#EmaratechSG_Theme_wt789_block_wtFormContent_SmartChannels_Application_CW_wt629_block_WebPatterns_wtcntAppSimpleSecExpandable_block_wtContent_wtContent_wtinpAddressOutsideMobileNumber").press_sequentially(
"0501234567")
# Fill applicant faith
page.locator(
"#s2id_EmaratechSG_Theme_wt789_block_wtFormContent_SmartChannels_Application_CW_wt437_block_WebPatterns_wtcntAppSimpleSecExpandable_block_wtContent_wtContent_wtcmbApplicantFaith").get_by_role(
"link", name="-- Select -- ").click()
page.locator("li").filter(has_text="UnKnown").click()
# Address Inside UAE area
page.get_by_role("link", name="Select Area ").click()
page.locator("li").filter(has_text="Trade Centre 1").click()
# Go on next page
# page.get_by_role("button", name="Continue").click()
# Stop tracing and close the context and browser
context.tracing.stop(path="trace.zip")
# ---------------------
context.close()
browser.close()
# End timer
e = time.time()
# Print time taken
print(e - s)
return True
def main():
with concurrent.futures.ProcessPoolExecutor(max_workers=4) as executor:
futures = [executor.submit(run_browser_session) for _ in range(4)]
# Optionally, wait for all futures to complete if needed.
# concurrent.futures.wait(futures)
if __name__ == "__main__":
main()
Modify the code so that a separate trace.zip file is created for each process; the file name should be trace_firstname_lastname_processnumber.
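A minimal sketch of one way to do this, assuming `run_browser_session` gains an index and name parameters (the names below are placeholders, and the Playwright body is elided):

```python
import concurrent.futures

def run_browser_session(process_number, first_name, last_name):
    # ... Playwright setup and form filling as above ...
    # Per-process trace file name: trace_firstname_lastname_processnumber
    trace_path = f"trace_{first_name}_{last_name}_{process_number}.zip"
    # context.tracing.stop(path=trace_path)
    return trace_path

def main():
    # Placeholder applicant names, one per worker process
    names = [("Asha", "Rao"), ("John", "Doe"), ("Mary", "Ann"), ("Li", "Wei")]
    with concurrent.futures.ProcessPoolExecutor(max_workers=4) as executor:
        futures = [
            executor.submit(run_browser_session, i + 1, first, last)
            for i, (first, last) in enumerate(names)
        ]
        concurrent.futures.wait(futures)
```

Passing the index and names as arguments (rather than reading a global) keeps each worker's trace path independent under `ProcessPoolExecutor`, which pickles the arguments into each child process.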
|
e4c28d9ad012d43ce03a1adc81922eca
|
{
"intermediate": 0.39257267117500305,
"beginner": 0.4312073588371277,
"expert": 0.17621999979019165
}
|
42,454
|
When a user selects the Application Development group in a reference field referring to sys_user_group, show a popup at the top of the screen saying this user can't be added, and clear the group field value, using a catalog client script in ServiceNow.
|
45663947f3f9f0892daab16d635ed180
|
{
"intermediate": 0.30742594599723816,
"beginner": 0.27272701263427734,
"expert": 0.4198470413684845
}
|
42,455
|
How do I make a Python project with a main.py file that defines some functions?
|
431d3523b9a43e04f6e0c2df9ecc4512
|
{
"intermediate": 0.3887399137020111,
"beginner": 0.37658682465553284,
"expert": 0.23467335104942322
}
|
42,456
|
import openai  # required for the ChatCompletion call below

# Define the system message
system_msg = 'You are a helpful assistant who understands data science.'
# Define the user message
user_msg = 'Create a small dataset about total sales over the last year. The format of the dataset should be a data frame with 12 rows and 2 columns. The columns should be called "month" and "total_sales_usd". The "month" column should contain the shortened forms of month names from "Jan" to "Dec". The "total_sales_usd" column should contain random numeric values taken from a normal distribution with mean 100000 and standard deviation 5000. Provide Python code to generate the dataset, then provide the output in the format of a markdown table.'
# Create a dataset using GPT
response = openai.ChatCompletion.create(model="gpt-3.5-turbo",
messages=[{"role": "system", "content": system_msg},
{"role": "user", "content": user_msg}])
|
9aed41707fb7e57dc354d03cf02ad085
|
{
"intermediate": 0.4407106041908264,
"beginner": 0.20841233432292938,
"expert": 0.35087698698043823
}
|
42,457
|
How do I make a Python project with a main.py file that defines some functions, and a pytest test in tests/integration/test_integration.py that imports a function from main.py and tests it with some asserts?
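A minimal sketch of such a layout (the `add` function is a placeholder; the import line is shown commented because in the real layout the two snippets live in separate files):

```python
# main.py
def add(a, b):
    """Return the sum of two numbers."""
    return a + b

# tests/integration/test_integration.py
# (in the real layout this file would start with: from main import add,
#  which resolves when pytest is run from the project root)
def test_add():
    assert add(1, 2) == 3
    assert add(-1, 1) == 0

test_add()  # pytest would discover and run this by its test_ prefix
```

Running `pytest` from the project root discovers `tests/integration/test_integration.py` automatically; adding an empty `conftest.py` at the root is one common way to make `from main import add` resolve.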
|
cc5f5daddacd74046b8c56c9e00691b2
|
{
"intermediate": 0.4543306231498718,
"beginner": 0.33637064695358276,
"expert": 0.2092987447977066
}
|
42,458
|
I just retrieved a script.rpy file from a game. Can you comment it?
default persistent.chance_fantasy = 0.25
define transition_time = 0.5
label common_start_level:
$ temp_level_number += 1
$ level_number = 0
pause 0.1
$ level_number = temp_level_number
$ start_level_setup()
show screen block
show screen gameplay
$ renpy.pause(0.5)
if len(active_deck) > 0:
call draw from _call_draw
$ renpy.pause(0.4)
if len(active_deck) > 0:
call draw from _call_draw_1
$ renpy.pause(0.4)
if len(active_deck) > 0:
call draw from _call_draw_2
$ renpy.pause(0.5)
hide screen block
return
label start:
jump expression "level_1_" + persistent.selected_girl
label level_1_Miriam:
scene university_bg with Dissolve (transition_time)
$ renpy.pause(0.5)
show lv1_particles
show screen player_interface with Dissolve (transition_time)
$ save_game("last_save")
#voice "voices/mc/mc_lv0.ogg"
mc "It's the start of a new day... time to get dressed and go to University."
$ basic_shirt.choose_clothes()
$ basic_pants.choose_clothes()
$ basic_shoes.choose_clothes()
$ start_level_setup()
$ renpy.pause(0.5)
$ set_level_images("lv1")
$ nr = str(renpy.random.randint(1, 4))
call expression "d_lv1_" + nr + "_" + persistent.selected_girl from _call_expression
$ cards_to_choose = choose_cards(hand_lv1)
call screen get_a_card(cards_to_choose)
$ renpy.pause(0.3)
$ cards_to_choose = choose_cards(hand_lv1)
call screen get_a_card(cards_to_choose)
$ renpy.pause(0.3)
$ cards_to_choose = choose_cards(hand_lv1)
call screen get_a_card(cards_to_choose)
$ renpy.pause(1)
$ stage_number = 1
call common_start_level from _call_common_start_level
$ random_list = [commonmate, friend, john, louis, rich_dude]
$ sure_list = [deskjob, basic_store, basic_hobby]
$ level_list = set_level_list(random_list, sure_list, 5)
$ end_level_setup()
jump new_turn
|
6a7ea53016cc38d5e25f82c070f38715
|
{
"intermediate": 0.2119644582271576,
"beginner": 0.528012216091156,
"expert": 0.2600233256816864
}
|
42,459
|
Here is a passage from a script.rpy file. Can you comment it?
label level_1_Miriam:
scene university_bg with Dissolve (transition_time)
$ renpy.pause(0.5)
show lv1_particles
show screen player_interface with Dissolve (transition_time)
$ save_game("last_save")
#voice "voices/mc/mc_lv0.ogg"
mc "It's the start of a new day... time to get dressed and go to University."
$ basic_shirt.choose_clothes()
$ basic_pants.choose_clothes()
$ basic_shoes.choose_clothes()
|
08d31813ee6dc7deaec4ce205a7e1ccc
|
{
"intermediate": 0.2734963893890381,
"beginner": 0.40021634101867676,
"expert": 0.32628723978996277
}
|
42,460
|
Here is a passage from a script.rpy file. Does it look correct to you?
$ start_level_setup()
$ renpy.pause(0.5)
$ set_level_images("lv1")
$ nr = str(renpy.random.randint(1, 4))
call expression "d_lv1_" + nr + "_" + persistent.selected_girl from _call_expression
$ cards_to_choose = choose_cards(hand_lv1)
call screen get_a_card(cards_to_choose)
$ renpy.pause(0.3)
$ cards_to_choose = choose_cards(hand_lv1)
call screen get_a_card(cards_to_choose)
$ renpy.pause(0.3)
$ cards_to_choose = choose_cards(hand_lv1)
call screen get_a_card(cards_to_choose)
$ renpy.pause(1)
|
061e59fe15ae28bab859ecd2fd894391
|
{
"intermediate": 0.3296571671962738,
"beginner": 0.49955153465270996,
"expert": 0.17079125344753265
}
|
42,461
|
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-9-a94a5e2a2c44> in <cell line: 22>()
     20 # Initializing and training the XGBoost classifier
     21 model = XGBClassifier(use_label_encoder=False, eval_metric='logloss')
---> 22 model.fit(X_train, y_train)
     23
     24 # Predicting on the test set
15 frames
/usr/local/lib/python3.10/dist-packages/xgboost/data.py in _invalid_dataframe_dtype(data)
    306     type_err = "DataFrame.dtypes for data must be int, float, bool or category."
    307     msg = f"""{type_err} {_ENABLE_CAT_ERR} {err}"""
--> 308     raise ValueError(msg)
    309
    310
ValueError: DataFrame.dtypes for data must be int, float, bool or category. When categorical type is supplied, the experimental DMatrix parameter `enable_categorical` must be set to True. Invalid columns: Email No.: object
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import seaborn as sns
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score
from sklearn.metrics import classification_report, confusion_matrix, roc_curve, roc_auc_score
from xgboost import XGBClassifier
# Load your dataset
df = pd.read_csv('Emails(modified)reduced.csv') # Replace 'your_dataset_path' with the actual path
# Assuming the last column is the target variable. Adjust column names as per your dataset.
X = df.iloc[:, :-1] # Feature variables
y = df.iloc[:, -1] # Target variable
# Splitting dataset into training and testing sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)
# Initializing and training the XGBoost classifier
model = XGBClassifier(use_label_encoder=False, eval_metric='logloss')
model.fit(X_train, y_train)
# Predicting on the test set
y_pred = model.predict(X_test)
y_score = model.predict_proba(X_test)[:, 1] # Probability estimates for ROC curve
# Calculate accuracy, precision, and F1-score
accuracy = accuracy_score(y_test, y_pred)
precision = precision_score(y_test, y_pred)
recall = recall_score(y_test, y_pred)
f1 = f1_score(y_test, y_pred)
print(f"Accuracy: {accuracy:.2f}, Precision: {precision:.2f}, Recall: {recall:.2f}, F1 Score: {f1:.2f}")
# Show classification report
print("Classification Report:")
print(classification_report(y_test, y_pred))
# Confusion matrix with plots
plt.figure(figsize=(6, 5))
cm = confusion_matrix(y_test, y_pred)
sns.heatmap(cm, annot=True, fmt='d', cmap='Blues')
plt.title('Confusion Matrix')
plt.xlabel('Predicted')
plt.ylabel('Actual')
plt.show()
# ROC curve
fpr, tpr, _ = roc_curve(y_test, y_score)
roc_auc = roc_auc_score(y_test, y_score)
plt.figure()
plt.plot(fpr, tpr, color='darkorange', lw=2, label=f'ROC curve (area = {roc_auc:.2f})')
plt.plot([0, 1], [0, 1], color='navy', lw=2, linestyle=":")
plt.xlim([0.0, 1.0])
plt.ylim([0.0, 1.05])
plt.xlabel('False Positive Rate')
plt.ylabel('True Positive Rate')
plt.title('Receiver Operating Characteristic')
plt.legend(loc="lower right")
plt.show()
# Visualize feature correlations
plt.figure(figsize=(10, 8))
corr = X_train.corr()
sns.heatmap(corr, cmap='coolwarm', annot=True)
plt.title('Feature Correlation Matrix')
plt.show()
# Density plot of feature values by class
spam_indices = y_train[y_train == 1].index
ham_indices = y_train[y_train == 0].index
for feature in X_train.columns[:5]: # For demonstration, plot the first 5 features
plt.figure(figsize=(8, 6))
sns.kdeplot(X_train.loc[spam_indices, feature], label=f'Spam - {feature}')
sns.kdeplot(X_train.loc[ham_indices, feature], label=f'Ham - {feature}')
plt.title(f'Density Plot of {feature} by Class')
plt.legend()
plt.show()
# Actual vs. Predicted labels plot
plt.figure(figsize=(8, 6))
sns.countplot(x=y_test, color='blue', alpha=0.5, label='Actual')
sns.countplot(x=y_pred, color='red', alpha=0.5, label='Predicted')
plt.title('Actual vs. Predicted Label Counts')
plt.legend()
plt.show()
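The traceback points at the object-typed "Email No." column. A minimal sketch of one fix on a toy frame, assuming that column is just a row identifier and can be dropped before training (column names other than "Email No." are placeholders):

```python
import pandas as pd

df = pd.DataFrame({
    "Email No.": ["Email 1", "Email 2", "Email 3"],  # object dtype: XGBoost rejects it
    "word_count": [10, 25, 7],
    "Prediction": [0, 1, 0],
})

# Drop the non-numeric identifier column before splitting features/target,
# instead of feeding every column to iloc[:, :-1]
X = df.drop(columns=["Email No.", "Prediction"])
y = df["Prediction"]

print(X.dtypes.unique())  # only numeric dtypes remain
```

If the column carried real information, encoding it (e.g. `pd.get_dummies` or a categorical dtype with `enable_categorical=True`) would be the alternative to dropping it.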
|
b7ef9a6dd0403320756d65a4c6b65d66
|
{
"intermediate": 0.36474481225013733,
"beginner": 0.42431703209877014,
"expert": 0.21093814074993134
}
|
42,462
|
Take reference from this Gradio code and provide the same without any HTML, CSS, etc.
with gr.Blocks(css = """#col_container { margin-left: auto; margin-right: auto;}
#chatbot {height: 520px; overflow: auto;}""",
theme=theme) as demo:
gr.HTML(title)
#gr.HTML("""<h3 align="center">This app provides you full access to GPT-4 Turbo (128K token limit). You don't need any OPENAI API key.</h1>""")
#gr.HTML("""<h3 align="center" style="color: red;">If this app is too busy, consider trying our GPT-3.5 app, which has a much shorter queue time. Visit it below:<br/><a href="https://huggingface.co/spaces/yuntian-deng/ChatGPT">https://huggingface.co/spaces/yuntian-deng/ChatGPT</a></h3>""")
gr.HTML("""<h3 align="center" style="color: red;">If this app doesn't respond, it's likely due to our API key hitting the daily limit of our organization. Consider trying our GPT-3.5 app:<br/><a href="https://huggingface.co/spaces/yuntian-deng/ChatGPT">https://huggingface.co/spaces/yuntian-deng/ChatGPT</a></h3>""")
#gr.HTML('''<center><a href="https://huggingface.co/spaces/ysharma/ChatGPT4?duplicate=true"><img src="https://bit.ly/3gLdBN6" alt="Duplicate Space"></a>Duplicate the Space and run securely with your OpenAI API Key</center>''')
with gr.Column(elem_id = "col_container", visible=False) as main_block:
#GPT4 API Key is provided by Huggingface
#openai_api_key = gr.Textbox(type='password', label="Enter only your GPT4 OpenAI API key here")
chatbot = gr.Chatbot(elem_id='chatbot') #c
inputs = gr.Textbox(placeholder= "Hi there!", label= "Type an input and press Enter") #t
state = gr.State([]) #s
with gr.Row():
with gr.Column(scale=7):
b1 = gr.Button(visible=not DISABLED).style(full_width=True)
with gr.Column(scale=3):
server_status_code = gr.Textbox(label="Status code from OpenAI server", )
#inputs, top_p, temperature, top_k, repetition_penalty
with gr.Accordion("Parameters", open=False):
top_p = gr.Slider( minimum=-0, maximum=1.0, value=1.0, step=0.05, interactive=True, label="Top-p (nucleus sampling)",)
temperature = gr.Slider( minimum=-0, maximum=5.0, value=1.0, step=0.1, interactive=True, label="Temperature",)
#top_k = gr.Slider( minimum=1, maximum=50, value=4, step=1, interactive=True, label="Top-k",)
#repetition_penalty = gr.Slider( minimum=0.1, maximum=3.0, value=1.03, step=0.01, interactive=True, label="Repetition Penalty", )
chat_counter = gr.Number(value=0, visible=False, precision=0)
with gr.Column(elem_id = "user_consent_container") as user_consent_block:
# Get user consent
accept_checkbox = gr.Checkbox(visible=False)
js = "(x) => confirm('By clicking \"OK\", I agree that my data may be published or shared.')"
with gr.Accordion("User Consent for Data Collection, Use, and Sharing", open=True):
gr.HTML("""
<div>
<p>By using our app, which is powered by OpenAI's API, you acknowledge and agree to the following terms regarding the data you provide:</p>
<ol>
<li><strong>Collection:</strong> We may collect information, including the inputs you type into our app, the outputs generated by OpenAI's API, and certain technical details about your device and connection (such as browser type, operating system, and IP address) provided by your device's request headers.</li>
<li><strong>Use:</strong> We may use the collected data for research purposes, to improve our services, and to develop new products or services, including commercial applications, and for security purposes, such as protecting against unauthorized access and attacks.</li>
<li><strong>Sharing and Publication:</strong> Your data, including the technical details collected from your device's request headers, may be published, shared with third parties, or used for analysis and reporting purposes.</li>
<li><strong>Data Retention:</strong> We may retain your data, including the technical details collected from your device's request headers, for as long as necessary.</li>
</ol>
<p>By continuing to use our app, you provide your explicit consent to the collection, use, and potential sharing of your data as described above. If you do not agree with our data collection, use, and sharing practices, please do not use our app.</p>
</div>
""")
accept_button = gr.Button("I Agree")
def enable_inputs():
return user_consent_block.update(visible=False), main_block.update(visible=True)
accept_button.click(None, None, accept_checkbox, _js=js, queue=False)
accept_checkbox.change(fn=enable_inputs, inputs=[], outputs=[user_consent_block, main_block], queue=False)
inputs.submit(reset_textbox, [], [inputs, b1], queue=False)
inputs.submit(predict, [inputs, top_p, temperature, chat_counter, chatbot, state], [chatbot, state, chat_counter, server_status_code, inputs, b1],) #openai_api_key
b1.click(reset_textbox, [], [inputs, b1], queue=False)
b1.click(predict, [inputs, top_p, temperature, chat_counter, chatbot, state], [chatbot, state, chat_counter, server_status_code, inputs, b1],) #openai_api_key
demo.queue(max_size=20, concurrency_count=NUM_THREADS, api_open=False).launch(share=False)
|
bdda309857e1deb1602a6cfa0788fa02
|
{
"intermediate": 0.29122188687324524,
"beginner": 0.5689813494682312,
"expert": 0.13979674875736237
}
|
42,463
|
Make me a Python script that takes a code directory and gives it a score based on multiple tools, like Bandit and others, to find best-practice violations, errors, and warnings.
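A minimal sketch of the aggregation idea. The tool list and the scoring heuristic (one point off per line of tool output) are assumptions standing in for real per-finding parsing, and each tool is skipped if it is not installed:

```python
import shutil
import subprocess
import tempfile

def score_directory(path, tools=None):
    """Run each available linter on `path` and derive a naive 0-100 score."""
    if tools is None:
        # Extra args per tool: bandit needs -r to recurse into a directory
        tools = {"bandit": ["-r"], "flake8": [], "pylint": []}
    scores = {}
    for tool, extra_args in tools.items():
        if shutil.which(tool) is None:
            continue  # tool not installed; skip it
        result = subprocess.run(
            [tool, *extra_args, path], capture_output=True, text=True
        )
        # Crude heuristic: subtract one point per reported line of output
        findings = len(result.stdout.splitlines())
        scores[tool] = max(0, 100 - findings)
    return scores

with tempfile.TemporaryDirectory() as tmp:
    print(score_directory(tmp))  # {} if none of the tools are installed
```

A production version would parse each tool's machine-readable output (e.g. `bandit -f json`) and weight findings by severity instead of counting lines.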
|
6fad504d6b779f7c366eaa9aaad3080d
|
{
"intermediate": 0.4753071069717407,
"beginner": 0.11364329606294632,
"expert": 0.41104957461357117
}
|
42,464
|
Implement a basic chatbot UI in Streamlit which takes user input and shows it in a chat interface. Apart from this, I need a button; on clicking it, a text box should appear containing the SQL query generated during the process.
|
3193473b455b2a5a262442e4afd99c6b
|
{
"intermediate": 0.3113897740840912,
"beginner": 0.3052053451538086,
"expert": 0.3834048807621002
}
|
42,465
|
improve
using System.Numerics;
using CounterStrikeSharp.API.Core;
namespace CurrencyApi;
public interface ICurrencyValue
{
public object Value { get; }
bool MoreThan(ICurrencyValue otherValue);
bool LessThan(ICurrencyValue otherValue);
bool Equals(ICurrencyValue otherValue);
ICurrencyValue Multiply(float scale);
bool Take(CCSPlayerController player);
bool Add(CCSPlayerController player);
bool Set(CCSPlayerController player);
}
|
7f2a4f5c2c3e8de73d84ccb41a59d80b
|
{
"intermediate": 0.4147530496120453,
"beginner": 0.31959956884384155,
"expert": 0.26564741134643555
}
|
42,466
|
Does this function take the highest-probability prediction from the model?
def generate_text(model, tokenizer, seed_text, num_generate, temperature=1.0):
model.eval()
generated_tokens = []
# Encode the seed text using the tokenizer
encoded_input = tokenizer.encode(seed_text)
input_ids = torch.tensor(encoded_input.ids, dtype=torch.long).unsqueeze(0).to(device)
# Generate num_generate tokens
with torch.no_grad():
for _ in range(num_generate):
output = model(input_ids)
# Get the last logits and apply temperature
logits = output[:, -1, :] / temperature
probabilities = F.softmax(logits, dim=-1)
next_token_id = torch.multinomial(probabilities, num_samples=1).item()
# Append generated token ID and prepare the new input_ids
generated_tokens.append(next_token_id)
input_ids = torch.cat([input_ids, torch.tensor([[next_token_id]], dtype=torch.long).to(device)], dim=1)
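The function above does not always take the highest-probability token: `torch.multinomial` samples from the softmax distribution, so lower-probability tokens can be chosen. A pure-Python sketch of the difference on a toy distribution (no model or torch required):

```python
import random
from collections import Counter

probabilities = [0.7, 0.2, 0.1]  # toy softmax output over a 3-token vocabulary

# Greedy decoding: always the highest-probability token (what argmax would do)
greedy_id = max(range(len(probabilities)), key=probabilities.__getitem__)

# Sampling, as torch.multinomial does: any token can be chosen
draws = Counter(
    random.choices(range(len(probabilities)), weights=probabilities)[0]
    for _ in range(1000)
)

print(greedy_id)       # always 0 for these probabilities
print(sorted(draws))   # typically all of 0, 1 and 2 appear
```

To make the original function greedy instead, `torch.argmax(probabilities, dim=-1)` would replace the `torch.multinomial` call; dividing logits by a temperature below 1.0 merely sharpens the distribution without eliminating randomness.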
|
1d391ddb3d05f58772bcc449aa38f9d1
|
{
"intermediate": 0.3886115550994873,
"beginner": 0.0797046646475792,
"expert": 0.5316838026046753
}
|
42,467
|
Can you give me Python code so that whenever I press the space key it writes "hello" to the console?
|
b375c5fef96e0b98250822672630435c
|
{
"intermediate": 0.4822418987751007,
"beginner": 0.2597501277923584,
"expert": 0.2580079734325409
}
|