I appreciate it if you can provide me with one-line code.
I used np.flip but I want a different approach to make it generalized.
This was my code: `np.flip(image, 1)`
I also used `np.fliplr(image)`.
Note: The image to be flipped has three channels. |
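For reference, a slicing-only horizontal flip can be sketched like this (the 2x2 `image` below is a hypothetical stand-in for a real H x W x 3 array):

```python
import numpy as np

# Hypothetical 2x2 RGB image, just for illustration.
image = np.arange(12).reshape(2, 2, 3)

# Reverse the column (width) axis with a slice; rows and channels are untouched.
flipped = image[:, ::-1]

# Should match the np.flip / np.fliplr results mentioned above.
assert (flipped == np.fliplr(image)).all()
```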
I want to flip an image (with three channels RGB) horizontally just using array slicing. How can I do it with python? |
|python|arrays|numpy-slicing| |
I am trying to create a structure `Student` which contains `q` substructures named `Course`. Each `Course` is a structure with `int` credit and point values.
How do I set up my `Student` structure to hold a variable number of `Course` structures within it? Thanks
```
struct Student{
    int q;
    Course course[q];
};
struct Course{
    int credit;
    int point;
};
```
I tried this but VSC is telling me it is wrong.
Edit: here is the full code and the error:
```
#include <iostream>
#include <math.h>
#include <stdlib.h>
#include <vector>
using namespace std;
struct Course{
    string name;
    int credit;
    int point;
};
typedef struct Student{
    vector<Course> courses;
};
int main() {
    Student student;
    system("cls");
    int input;
    int input2;
    string strinput;
    cout<<"--------------------------------------------------------------------------"<<endl;
    cout<<" GPA & CGPA Calculator (Developed by Ohid) "<<endl;
    cout<<"--------------------------------------------------------------------------\n"<<endl;
    cout<<"Set up the student!"<<endl;
sub:
    cout<<"Add a course"<<endl;
    cout<<"Course name: ";
    cin>>strinput;
    cout<<"Course points: ";
    cin>>input;
    cout<<"Course credits ";
    cin>>input2;
    student.courses.push_back({strinput,input,input2});
    cout<<"Add another course (1/0)?: ";
    cin>>input;
    switch(input)
    {
    case 1:
        goto sub;
        break;
    case 2:
        break;
    }
    int numCourses = student.courses.size();
    cout<<"--------------------------------------------------------------------------\n"<<endl;
    cout<<"Student created with "<<student.q<<" courses!"<<endl;
    cout<<"--------------------------------------------------------------------------\n"<<endl;
    cout<<" MENU:"<<endl;
    cout<<" 1. Calculate GPA (Grade Point Average)"<<endl;
    cout<<" 2. Calculate CGPA (Cummulative Grade Point Average)"<<endl;
    cout<<" 3. Method that is applied here for calclating GPA & CGPA"<<endl;
    cout<<" 4. Exit Application"<<endl;
    cout<<"--------------------------------------------------------------------------"<<endl;
sub:
    cout<<"Enter your choice: ";
    cin>>input;
    switch(input)
    {
    case 1:
        cout<<"You chose calculate GPA";
        calculateGPA();
        break;
    case 2:
        cout<<"You chose calculate CGPA";
        calculateCGPA();
        break;
    case 3:
        cout<<"You chose method";
        method();
        break;
    case 4:
        cout<<"You chose to exit";
        exit(EXIT_SUCCESS);
        break;
    default:
        cout<<"You are stupid";
        goto sub;
        break;
    }
}
void calculateGPA()
{
}
void calculateCGPA()
{
}
void method()
{
}
```
Error text!
```
'Course': undeclared identifier cpp(C2065)
'std::vector': 'Course' is not a valid template type argument for parameter '_Ty' cpp(C2923)
'std::vector': too few template arguments cpp(C2976)
class std::vector<Course>
```
|
In JSONata, how do I print the date 60 calendar days after today's date? |
I think the problem is in the query your post adds to Firebase. Check your query again, or provide it in your question. |
I need to validate English words. Please suggest a good JS NPM library. Thanks. |
Which is the best NPM JS library to use for an English dictionary lookup? |
|node.js|typescript|npm| |
I've just installed Wordpress 6.4.3 & MAMP and managed to edit functions.php and saved my changes.
When I tried to make more changes to the file, Wordpress started displaying the following error message:
> Something went wrong. Your change may not have been saved. Please try again. There is also a chance that you may need to manually fix and upload the file over FTP.
This is what I've added to functions.php
```php
if ($_SERVER["REQUEST_METHOD"] === "POST") {
    $name = sanitize_text_field($_POST["name"]);
    $email = sanitize_email($_POST["email"]);
    $message = sanitize_textarea_field($_POST["description"]);
    // Add code to save the form data to the database
    global $wpdb;
    $table_name = $wpdb->prefix . 'request_form';
    $data = array(
        'name' => $name,
        'email' => $email,
        'description' => $message,
        'submission_time' => current_time('mysql')
    );
    $insert_result = $wpdb->insert($table_name, $data);
    if ($insert_result === false) {
        $response = array(
            'success' => false,
            'message' => 'Error saving the form data.',
        );
    } else {
        $response = array(
            'success' => true,
            'message' => 'Form data saved successfully.'
        );
    }
    // Return the JSON response
    header('Content-Type: application/json');
    echo json_encode($response);
    exit;
}
```
What I have done:
- I've checked permissions of the file and gave read and write access to everyone on my localhost.
- Restarted the server
- When I changed http to https for wordpress Address (URL) and Site Address (URL) under Settings > General and saved the changes, I was redirected to the following address http://localhost:8888/wp-admin/options.php and can see the following:
```
Notice: Undefined index: name in
/Applications/MAMP/htdocs/wp-content/themes/twentytwentyfour/functions.php on line 209

Notice: Undefined index: email in
/Applications/MAMP/htdocs/wp-content/themes/twentytwentyfour/functions.php on line 210

Notice: Undefined index: description in
/Applications/MAMP/htdocs/wp-content/themes/twentytwentyfour/functions.php on line 211

WordPress database error: [Unknown column 'submission_time' in 'field list']
INSERT INTO `wp_request_form` (`name`, `email`, `description`, `submission_time`)
VALUES ('', '', '', '2024-03-30 07:10:51')

{"success":false,"message":"Error saving the form data."}
```
|
CSS selector to match an element without attribute x |
This would be one approach to get the weekday as a number:
```javascript
const day = new Intl.DateTimeFormat('en-US', {
  timeZone: 'America/New_York',
  weekday: 'long',
});
const weekdays = ["Sunday", "Monday", "Tuesday", "Wednesday", "Thursday", "Friday", "Saturday"];
const dayNumber = weekdays.indexOf(day.format(Date.now()));
console.log(dayNumber);
```
|
Use the procedure below.
1. Give your elements unique ids, like:
```html
<div id="tab1"></div>
<div id="tab2"></div>
<div id="tab3"></div>
<div id="tab4"></div>
```
2. Append the id of the tab you need to focus to your URL:
```text
url_path#tab2
```
3. Use JS to focus the tab whose id is in the URL hash. Note that `window.location.hash` includes the leading `#`, which has to be stripped first:
```javascript
const id = window.location.hash.substring(1); // "tab2" rather than "#tab2"
document.getElementById(id).click();
// or
document.getElementById(id).focus();
``` |
I had react-native-document-picker; the below worked for me:
Remove the package to start with: `npm remove react-native-document-picker`, then install version 8.0.0:
```shell
npm install react-native-document-picker@8.0.0
```
You can also try:
`npx react-native-asset react-native-document-picker`
If you get an error saying react-native.config.js was not found, create a new file with that name in the root folder and add this:
```javascript
module.exports = {
  project: { ios: {}, android: {} },
  assets: ['./src/assets/fonts/'], // path of your assets folder
};
```
then `npm start -- --reset-cache` |
I have an Elastic Beanstalk application that is intermittently not responding, and I'm unable to find out why. What Happens:
1. The app will periodically respond with 200s to my health checks. And then, it will just stop. It will then come back on its own.
2. Subsequent API calls return 200, when the app is in a good mood. And then suddenly all calls fail (until they don't anymore).
3. In the logs, I don't see any indications of crashing, but I'm new to this. I do see this peculiarity which shows up many times, and shows up corresponding to my api calls that I make to the app:
```
Mar 31 05:15:47 ip-172-31-28-174 systemd[1]: Starting refresh-policy-routes@ens5.service - Refresh policy routes for ens5...
Mar 31 05:15:47 ip-172-31-28-174 ec2net[2485]: Starting configuration for ens5
Mar 31 05:15:48 ip-172-31-28-174 systemd[1]: refresh-policy-routes@ens5.service: Deactivated successfully.
Mar 31 05:15:48 ip-172-31-28-174 systemd[1]: Finished refresh-policy-routes@ens5.service - Refresh policy routes for ens5.
Mar 31 05:15:48 ip-172-31-28-174 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:init_t:s0 msg='unit=refresh-policy-routes@ens5 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Mar 31 05:15:48 ip-172-31-28-174 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:init_t:s0 msg='unit=refresh-policy-routes@ens5 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
```
Also, here's the setup:
1. FastAPI python app, deployed originally through eb cli with classical load balancer. Load balancer was later migrated.
2. 2 min instances, 4 max instances (all t3 micro)
3. All instances are healthy.
4. EB environment is healthy
5. https:// listener is set up on EB configuration, using cert from AWS.
6. CNAME configuration for the SSL on subdomain.
7. Default VPC with two subnets in two separate zones.
8. Subnets are mapped to route table that maps to IGW
[![enter image description here][1]][1]
[1]: https://i.stack.imgur.com/VSaLt.png
9. Procfile: `web: gunicorn main:app --workers=4 --worker-class=uvicorn.workers.UvicornWorker`
What could it be? Is it a networking configuration issue? Load balancer? Or something with the application environment? I was also able to deploy my code to a single instance EBS application and had no issues with the downtime. I was not able to easily get https on that instance, so I can't identify if the issue was at the Load balancer level, or not. |
1) Make your URL names unique.
2) Define your URL patterns as below.
```python
path("products/",product_list_view,name="product_list"),
path("product/<int:pid>/",product_detail_view,name="product_detail"),
path("product/<slug:slug>/",get_product,name="get_product")
```
[ref][1]
[1]: https://docs.djangoproject.com/en/5.0/topics/http/urls/#example
3)
Make an error page template and show this page on exception
```python
def get_product(request, slug):
    try:
        product = Product.objects.get(slug=slug)
        if request.GET.get('size'):
            size = request.GET.get('size')
            print(size)
        return render(request, "core/product_detail.html")
    except Exception as e:
        print(e)
        return render(request, "core/error_page.html")
```
4) Obviously, your model has no field called `slug`, so the code below wouldn't work.
And `print(size)` won't print, because the line below fails and raises an exception first.
```python
product = Product.objects.get(slug=slug)
```
You can only get objects using fields values in the model.
Like below.
```python
product = Product.objects.get(title="product_title")
#or
product = Product.objects.get(id=4)
``` |
I missed the `content-type` part in the headers, and the language. Also, itemid should be a uint32.
The final code example:
```python
import http.client
import uuid

conn = http.client.HTTPSConnection("partner.steam-api.com")
headers = {'Content-Type': 'application/x-www-form-urlencoded'}
orderid = uuid.uuid4().int & (1 << 64) - 1
print("orderid = ", orderid)
key = "xxxxxxxxxxxxxxxxxxx"  # omitted for security reasons
steamid = "xxxxxxxxxxxxxxxxxxx"  # omitted for security reasons
itemid = 100001
appid = "480"
itemcount = 1
currency = 'CNY'
amount = 350
language = 'zh-CN'
description = 'testing_description'
urlSandbox = "/ISteamMicroTxnSandbox/"
s = f'key={key}&orderid={orderid}&appid={appid}&steamid={steamid}&itemcount={itemcount}&language={language}&currency={currency}&itemid[0]={itemid}&qty[0]={1}&amount[0]={amount}&description[0]={description}'
conn.request('POST', url=f'{urlSandbox}InitTxn/v3/', headers=headers, body=s)
r = conn.getresponse()
print("InitTxn result = ", r.read())
```
|
I want to use Rust to move a file (or folder) from folder A to another folder. This is what my code looks like:
```rust
use std::{fs, path::PathBuf};

fn main() {
    let mut file_path = PathBuf::from("/Users/xiaoqiangjiang/apps/a");
    let mut destination_path = PathBuf::from("/Users/xiaoqiangjiang/apps/test/");
    let result = fs::rename(file_path, destination_path);
    if let Err(err) = result {
        print!("{}", err);
    }
}
```
`a` is a plain text file. When I run this code on Linux (CentOS 7.8) or macOS (13.3.1 (a)), it shows the error:
Is a directory (os error 21)
Am I missing something? I have read the docs and examples from the web, but it seems it cannot work like this. What should I do to fix this issue? This is the Rust version info:
> rustup -V
rustup 1.27.0 (bbb9276d2 2024-03-08)
info: This is the version for the rustup toolchain manager, not the rustc compiler.
info: The currently active `rustc` version is `rustc 1.77.1 (7cf61ebde 2024-03-27)`
|
JPA Hibernate OneToOne Mapping |
|java|hibernate|jpa|orm|hibernate-mapping| |
For me it worked by setting the diacritical mark between brackets and double quotes:
First, the declaration of index properties still seems necessary; otherwise the diacritical mark will simply appear as a subscript. E.g., if I want to have A' instead of A:
declare_index_properties (A, [postsuperscript])$
A["'"];
A could have other indices, for example:
declare_index_properties (A, [postsubscript,postsubscript,postsuperscript])$
A[g,u,"'"];
(for more fancy stuff see the help for declare_index_properties).
However, I didn't find a way to add the mark above a variable.
|
Here is a simple example of a generator used with `.send()`. It returns the counter and at the same time lets you set it. The `or i` handles the case where no value is provided. The first call must be `next()` or `.send(None)`.
```python
def gen():
    """Generator summing by 1 with the possibility of setting a counter"""
    i = 0
    while True:
        i = (yield i) or i
        i += 1

if __name__ == '__main__':
    g = gen()
    print(f"{next(g)=}")
    print(f"{g.send(3)=}")
    print(f"{next(g)=}")
    print(f"{g.send(None)=}")
    print(f"{next(g)=}")
```
Output:
```python
next(g)=0
g.send(3)=4
next(g)=5
g.send(None)=6
next(g)=7
``` |
I have a `SwipeDismissableNavHost` created with `rememberSwipeDismissableNavController()` and I want to add animations when entering a new page. I tried it the following way:
```
val navController = rememberSwipeDismissableNavController()
SwipeDismissableNavHost(
navController = navController,
startDestination = "startDestination",
enterTransition = //enterTransition here
){
// composables here
}
```
But it does not take an `enterTransition` parameter or similar.
Any ideas?
|
Use Animations with a Navigator in WearOS |
|kotlin|animation|wear-os|navcontroller| |
|inno-setup|mysql-connector| |
Facing an issue with a .NET 8 Web API application.
While running the project in the console with `dotnet run`, it neither runs completely nor shows any error. The console screen is as follows:
[![Console Log][1]][1]
[1]: https://i.stack.imgur.com/K3TzF.png
But the application is running in debug mode from the Visual Studio 2022 without any issue.
Here's my program.cs
```csharp
using Institutor.Data;
using Microsoft.AspNetCore.Authentication.JwtBearer;
using Microsoft.AspNetCore.Identity;
using Microsoft.EntityFrameworkCore;
using Microsoft.IdentityModel.Tokens;
using Microsoft.OpenApi.Models;
using System.Text;
System.Environment.SetEnvironmentVariable("DOTNET_SYSTEM_GLOBALIZATION_INVARIANT", "0");
// Create a WebApplication builder with the arguments passed to the program
var builder = WebApplication.CreateBuilder(args);
// Add services to the container.
// Add the DbContext and configure it to use SQLite with the connection string from the configuration
builder.Services.AddDbContext<InstitutorDataContext>(options =>
options.UseSqlServer(builder.Configuration.GetConnectionString("DefaultConnection")));
// Add Identity services to the DI container and configure it to use the DbContext for storage
builder.Services.AddIdentity<IdentityUser, IdentityRole>() // Add IdentityRole here
.AddEntityFrameworkStores<InstitutorDataContext>()
.AddDefaultTokenProviders(); // Add this to add the default token providers
// Add JWT authentication services to the DI container and configure it
builder.Services.AddAuthentication(x=>
{
x.DefaultAuthenticateScheme = JwtBearerDefaults.AuthenticationScheme;
x.DefaultChallengeScheme = JwtBearerDefaults.AuthenticationScheme;
})
.AddJwtBearer(options =>
{
options.TokenValidationParameters = new TokenValidationParameters
{
ValidateIssuerSigningKey = true,
IssuerSigningKey = new SymmetricSecurityKey(Encoding.UTF8.GetBytes(builder.Configuration.GetSection("AppSettings:Token").Value)),
ValidateIssuer = false,
ValidateAudience = false
};
options.IncludeErrorDetails = true;
});
// Add controllers to the DI container
builder.Services.AddControllers();
// Add API explorer services to the DI container
builder.Services.AddEndpointsApiExplorer();
//Adding Azure Services logging
builder.Services.AddLogging(builder =>
{
builder.AddAzureWebAppDiagnostics();
});
// Add Swagger services to the DI container and configure it
builder.Services.AddSwaggerGen(c =>
{
c.SwaggerDoc("v1", new() { Title = "Institutor API", Version = "v1" });
c.AddSecurityDefinition("Bearer", new OpenApiSecurityScheme
{
Description = "JWT Authorization header using the Bearer scheme. Just paste the token (No need to add Bearer)",
Name = "Authorization",
In = ParameterLocation.Header,
Type = SecuritySchemeType.Http,
Scheme = "bearer"
});
c.AddSecurityRequirement(new OpenApiSecurityRequirement
{
{
new OpenApiSecurityScheme
{
Reference = new OpenApiReference
{
Type = ReferenceType.SecurityScheme,
Id = "Bearer"
}
},
new string[] {}
}
});
});
// Build the application
var app = builder.Build();
// Configure the HTTP request pipeline.
// Use Swagger in development environment
if (app.Environment.IsDevelopment())
{
app.UseSwagger();
app.UseSwaggerUI();
}
// Use HTTPS redirection middleware
app.UseHttpsRedirection();
// Use authentication middleware
app.UseAuthentication();
// Use authorization middleware
app.UseAuthorization();
// Map controller routes
app.MapControllers();
// Run the application
app.Run();
```
Here's the csproj file for the same:
```xml
<Project Sdk="Microsoft.NET.Sdk.Web">
<PropertyGroup>
<TargetFramework>net8.0</TargetFramework>
<Nullable>enable</Nullable>
<ImplicitUsings>enable</ImplicitUsings>
<InvariantGlobalization>false</InvariantGlobalization>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="Microsoft.AspNetCore.Authentication.JwtBearer" Version="8.0.3" />
<PackageReference Include="Microsoft.AspNetCore.Identity.EntityFrameworkCore" Version="8.0.3" />
<PackageReference Include="Microsoft.EntityFrameworkCore" Version="8.0.3" />
<PackageReference Include="Microsoft.EntityFrameworkCore.Design" Version="8.0.3">
<PrivateAssets>all</PrivateAssets>
<IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
</PackageReference>
<PackageReference Include="Microsoft.EntityFrameworkCore.Sqlite" Version="8.0.3" />
<PackageReference Include="Microsoft.EntityFrameworkCore.SqlServer" Version="8.0.3" />
<PackageReference Include="Microsoft.Extensions.Logging.AzureAppServices" Version="8.0.3" />
<PackageReference Include="Swashbuckle.AspNetCore" Version="6.4.0" />
<PackageReference Include="System.IdentityModel.Tokens.Jwt" Version="7.4.1" />
</ItemGroup>
<ItemGroup>
<ProjectReference Include="..\Data\Data.csproj" />
<ProjectReference Include="..\Model\Model.csproj" />
<ProjectReference Include="..\Services\Services.csproj" />
</ItemGroup>
</Project>
```
Please let me know if any more details are required in this regard. |
Dotnet Run is not working but the application is running in visual studio |
|.net|visual-studio|asp.net-web-api|.net-8.0| |
|python|django|indexing|pytest-django|pg-trgm| |
Should your assignment demand a "purely functional" approach, you might want to look into recursive solutions or into the `fold` function. Below is the same algorithm in three versions.
```
let f1 eq1 vars =
    let mutable x = eq1
    for v in vars do
        x <- A (B v) x
    x

let rec f2 x vars =
    match vars with
    | [] -> x
    | h::t -> f2 (A (B h) x) t

let f3 eq1 vars =
    (eq1, vars) ||> List.fold (fun x v -> A (B v) x)
```
Both the recursive solution `f2` and the `fold` in `f3` are quite common in the functional world. But, as mentioned by @smoothdeveloper, in F# you are free to also use other paradigms.
(The above assumes that `eq1` is a bool value. You mentioned "equation"; I'm not sure whether you meant "function", but that would not make sense in the code context you have shown.) |
Enable Gzip compression to reduce the size of transmitted data.
Modify the `http` block of your nginx configuration:
```nginx
http {
    gzip on;
    gzip_types text/plain text/css application/json application/javascript text/xml application/xml application/xml+rss text/javascript;
    gzip_min_length 1000;
    gzip_comp_level 2;
    gzip_proxied any;
    gzip_vary on;
    gzip_buffers 16 8k;
    gzip_disable "MSIE [1-6]\.(?!.*SV1)";
    gzip_http_version 1.1;
}
```
|
After opening an [RStudio enhancement request](https://github.com/rstudio/rstudio/issues/14094), I got following [answer from @kevinushey](https://github.com/rstudio/rstudio/issues/14094#issuecomment-1881696927) :
> It's unfortunately deprecated, but you might have better luck using
`options(download.file.method = "wininet")` on Windows.
Turns out this still works with `R 4.3.1` and `R 4.3.2` under Windows and is exactly what I was looking for :
```
options(download.file.method = "wininet")
```
Allowed me under RStudio to use package update / install menu without specifying `https_proxy` and without exposing my Windows user credentials.
[EDIT] As `wininet` is now superseded and might disappear in coming R versions, final solution with default `curl` method is (again credit [@kevinushey](https://github.com/rstudio/rstudio/issues/14094#issuecomment-1887784080)):
```
Sys.setenv(https_proxy_user = ":")
Sys.setenv(https_proxy = 'proxy.<my.domain>:proxyport')
```
|
Is a directory (os error 21) when using rust to move a file |
|rust| |
I have a webserver hosted on Shuttle.io. When I try to access it from a Wix site via JavaScript I get "(Reason: CORS header ‘Access-Control-Allow-Origin’ missing)" errors.
I set up cors for my webserver in 2 different ways:
1.
```
let config = move |cfg: &mut ServiceConfig| {
let cors = Cors::permissive();
cfg.service(web::scope("/").service(get_advice).wrap(cors).wrap(Governor::new(&governor_conf)));
};
```
2.
```
let config = move |cfg: &mut ServiceConfig| {
let cors = Cors::default().allow_any_header().allow_any_method().allow_any_origin().send_wildcard();
cfg.service(web::scope("/").service(get_advice).wrap(cors).wrap(Governor::new(&governor_conf)));
};
```
I get the same error with both configurations.
|
"(Reason: CORS header ‘Access-Control-Allow-Origin’ missing)" while trying to access Actix webserver from Wix site |
|rust|cors|actix-web| |
Aren't you meant to be passing a **blob URL** for the image instead of just the plain blob?
Because it doesn't look like you're converting the blob to a URL with this...
```javascript
canvas.toBlob(function (blob){
    jQuery.ajax({
        url: "//domain.com/wp-admin/admin-ajax.php",
        type: "POST",
        data: {action: "addblobtodb", image: blob},
        success: function(id) {
            console.log("Successfully inserted into DB: " + id);
        }
    });
});
```
So I would change it to this...
```javascript
canvas.toBlob((blob)=>{
    let Url = URL.createObjectURL(blob);
    jQuery.ajax({
        url: "//domain.com/wp-admin/admin-ajax.php",
        type: "POST",
        data: {action: "addblobtodb", image: Url}, // <-- image blob URL passed here
        success: function(id) { console.log("Successfully inserted into DB: " + id); }
    });
}, 'image/png', 1); // <-- specify the type & quality of the image here
```
I hope it helps...
|
> 1. Why after call "squareChan <- testNum" scheduler first check cube goroutine instead of square goroutine?
The scheduler isn't "checking" the `cube` routine because of the write to `squareChan`. Both goroutines are simply starting up, and have `fmt.Println` as the first logic.
If you only want statements that say when something has been read, you'd need to place that after receiving from the channel.
You can't rely on consistent starting order for goroutines. There are no guarantees there.
> 2. Why after call "cubeChan <- testNum" main goroutine not blocked?
If `squareChan <- testNum` didn't block, why would this one?
The `cube` goroutine is waiting, blocked on receiving from the channel. When main writes to this channel, it's picked up immediately.
If you're genuinely curious about what events are being sequenced, collecting a `trace` profile may be of interest. |
I found help at the Godot forum. I originally tried to increase the sprite size in Photoshop, which didn't work out well because I couldn't choose in which direction the sprite should increase in size. Now I used Aseprite to increase the size of the spritesheet and it works now :) Thank you for your help! |
Elasticbeanstalk FastAPI application is intermittently not responding to https requests |
|amazon-web-services|deployment|amazon-elastic-beanstalk|devops|fastapi| |
Jetbrains Intellij Works but Fleet does not : "is not recognized as an internal or external command, operable program or batch file." |
I'm a newbie in OCaml. This is about the ocamlyacc example at https://v2.ocaml.org/manual/lexyacc.html#s:lexyacc-example. I can compile and run the calc program, but how do I run calc.ml ***in utop***? When I do this in utop:
```
open Parser;;
open Lexer;;
#use "calc.ml";;
```
I get this error
```
File "calc.ml", line 5, characters 19-30:
5 |       let result = Parser.main Lexer.token lexbuf in
                       ^^^^^^^^^^^
Error: Unbound value Parser.main
```
Here are the files from the above website.
parser.mly:
```
%token <int> INT
%token PLUS MINUS TIMES DIV
%token LPAREN RPAREN
%token EOL
%left PLUS MINUS          /* lowest precedence */
%left TIMES DIV           /* medium precedence */
%nonassoc UMINUS          /* highest precedence */
%start main               /* the entry point */
%type <int> main
%%
main:
    expr EOL                  { $1 }
;
expr:
    INT                       { $1 }
  | LPAREN expr RPAREN        { $2 }
  | expr PLUS expr            { $1 + $3 }
  | expr MINUS expr           { $1 - $3 }
  | expr TIMES expr           { $1 * $3 }
  | expr DIV expr             { $1 / $3 }
  | MINUS expr %prec UMINUS   { - $2 }
;
```
lexer.mll:
```
{
  open Parser (* The type token is defined in parser.mli *)
  exception Eof
}
rule token = parse
    [' ' '\t']           { token lexbuf } (* skip blanks *)
  | ['\n' ]              { EOL }
  | ['0'-'9']+ as lxm    { INT(int_of_string lxm) }
  | '+'                  { PLUS }
  | '-'                  { MINUS }
  | '*'                  { TIMES }
  | '/'                  { DIV }
  | '('                  { LPAREN }
  | ')'                  { RPAREN }
  | eof                  { raise Eof }
```
calc.ml:
```
let _ =
  try
    let lexbuf = Lexing.from_channel stdin in
    while true do
      let result = Parser.main Lexer.token lexbuf in
      print_int result; print_newline(); flush stdout
    done
  with Lexer.Eof ->
    exit 0
```
|
how to run ocamlyacc generated code in utop |
|utop|ocamlyacc| |
I would like to mutate a column in a dataframe based on a column in another dataframe (similar to the XLOOKUP() function in Excel) in R.
So I have this dataframe (df):
```
df <- structure(list(orientasi = c("gay", "gay", "gay", "gay", "gay",
"gay", "gay", "gay", "gay", "gay", "gay", "gay", "gay", "gay",
"gay", "gay", "gay", "gay", "gay", "gay", "gay", "gay", "gay",
"gay", "gay", "gay", "gay", "gay", "gay", "gay", "gay", "gay",
"gay", "gay", "gay", "gay", "gay", "gay", "gay", "gay", "gay",
"gay", "gay", "gay", "gay", "gay", "gay", "gay", "gay", "gay",
"gay", "gay", "gay", "gay", "gay", "gay", "gay", "gay", "gay",
"gay", "gay", "gay", "gay", "gay", "gay", "gay", "gay", "gay",
"gay", "gay", "gay", "gay", "gay", "gay", "gay", "gay", "gay",
"gay", "gay", "gay", "gay", "gay", "gay", "gay", "gay", "gay",
"gay", "gay", "gay", "gay", "gay", "gay", "gay", "gay", "gay",
"gay", "gay", "gay", "gay", "gay"), tahun = c(1980L, 1981L, 1982L,
1983L, 1984L, 1985L, 1986L, 1987L, 1988L, 1989L, 1990L, 1991L,
1991L, 1991L, 1991L, 1991L, 1991L, 1991L, 1991L, 1991L, 1991L,
1991L, 1991L, 1991L, 1991L, 1991L, 1991L, 1992L, 1993L, 1993L,
1993L, 1993L, 1993L, 1993L, 1993L, 1994L, 1994L, 1994L, 1994L,
1994L, 1994L, 1994L, 1994L, 1994L, 1994L, 1994L, 1995L, 1995L,
1995L, 1995L, 1995L, 1995L, 1995L, 1995L, 1995L, 1995L, 1995L,
1995L, 1995L, 1995L, 1995L, 1995L, 1995L, 1995L, 1995L, 1995L,
1995L, 1995L, 1995L, 1995L, 1995L, 1995L, 1995L, 1995L, 1995L,
1995L, 1995L, 1995L, 1995L, 1996L, 1996L, 1996L, 1996L, 1996L,
1996L, 1996L, 1996L, 1996L, 1996L, 1996L, 1996L, 1996L, 1996L,
1996L, 1996L, 1996L, 1996L, 1996L, 1996L, 1996L), token = c("абагайтуйская",
"абагайтуйская", "абагайтуйские", "абагайтуйский", "абагайтуйским", "абагайтуйских",
"абагайтуйское", "абагайтуйской", "абагайтуйскому", "абагайтуйскую", "cгей", "славяне", "", "эхо", "татарстане",
"донесшийся", "менска", "лихой", "клич",
"гей", "славяне", "вызвал", "многонациональной",
"республике", "ответного", "энтузиазма",
"character", "своему", "сексуальному", "поведению",
"ягей", "комбайн", "садомазохист", "",
"рассказывают", "двери", "женских", "туалетов",
"журфака", "изложена", "слезная", "история",
"девушки", "влюбившейся", "гея", "cих",
"брак", "закончился", "течение", "элтон",
"джон", "понял", "бисексуал", "настоящий",
"гей", "отныне", "женщины", "интересуют",
"похоже", "начинает", "обрушиваться",
"вторая", "пирамида", "мавроди", "сексменьшинствами",
"оказалось", "значительно", "пресловутая",
"", "права", "отчаянно", "бьются", "активистыгеи",
"лесбиянкитолько", "", "мужчин", "", "женщин",
"cгеи", "счастливо", "сложилась", "песнь",
"матерого", "гея", "элтона", "джона", "дуэте",
"именитым", "трансвеститом", "рупол",
"клип", "жизнерадостную", "парочку",
"пытается", "разлучить", "какнибудь",
"ущемить", "голубоватые", "права")), row.names = c(NA,
100L), class = "data.frame")
```
And this is the second dataframe (df2):
```
df2 <- structure(list(token = c("абалмасовыми", "абалмасовых",
"абалымов", "абалымова", "абалымове",
"абалымовой", "абалымову", "абалымовы",
"абалымовым", "абалымовыми", "абалымовых",
"абальян", "абальяна", "абальянам", "абальянами",
"абальянах", "абальяне", "абальянов",
"абальяном", "абальяну", "абальяны",
"абаляев", "абаляева", "абаляеве", "абаляевой",
"абаляеву", "абаляевы", "абаляевым",
"абаляевыми", "абаляевых", "абам", "абами",
"абанагроавтотранс", "абанагроавтотранса",
"абанагроавтотрансам", "абанагроавтотрансами",
"абанагроавтотрансах", "абанагроавтотрансе",
"абанагроавтотрансов", "абанагроавтотрансом",
"абанагроавтотрансу", "абанагроавтотрансы",
"абанагропромхимией", "абанагропромхимии",
"абанагропромхимию", "абанагропромхимия",
"абандон", "абандона", "абандонам", "абандонами",
"абандонах", "абандоне", "абандонов",
"абандоном", "абандону", "абандоны",
"абанская", "абанские", "абанский", "абанским",
"абанскими", "абанских", "абанского",
"абанское", "абанской", "абанском", "абанскому",
"абанскою", "абанскую", "абарин", "абарина",
"абарине", "абариной", "абарину", "абарины",
"абариным", "абариными", "абариных",
"абарчук", "абарчука", "абарчукам", "абарчуками",
"абарчуках", "абарчуке", "абарчуки",
"абарчуков", "абарчуком", "абарчуку",
"абаршалин", "абаршалина", "абаршалине",
"абаршалиной", "абаршалину", "абаршалины",
"абаршалиным", "абаршалиными", "абаршалиных",
"абас", "абаса", "абасали", "абасалиевен"
), lemma = c("абалмасовы", "абалмасовы",
"абалымов", "абалымов", "абалымов", "абалымова",
"абалымов", "абалымовы", "абалымов",
"абалымовы", "абалымовы", "абальян",
"абальян", "абальяны", "абальяны", "абальяны",
"абальян", "абальяны", "абальян", "абальян",
"абальяны", "абаляев", "абаляев", "абаляев",
"абаляева", "абаляев", "абаляевы", "абаляев",
"абаляевы", "абаляевы", "аба", "аба", "абанагроавтотранс",
"абанагроавтотранс", "абанагроавтотранс",
"абанагроавтотранс", "абанагроавтотранс",
"абанагроавтотранс", "абанагроавтотранс",
"абанагроавтотранс", "абанагроавтотранс",
"абанагроавтотранс", "абанагропромхимия",
"абанагропромхимия", "абанагропромхимия",
"абанагропромхимия", "абандон", "абандон",
"абандон", "абандон", "абандон", "абандон",
"абандон", "абандон", "абандон", "абандон",
"абанский", "абанский", "абанский", "абанский",
"абанский", "абанский", "абанский", "абанский",
"абанский", "абанский", "абанский", "абанский",
"абанский", "абарин", "абарин", "абарин",
"абарина", "абарин", "абарины", "абарин",
"абарины", "абарины", "абарчук", "абарчук",
"абарчуки", "абарчуки", "абарчуки", "абарчук",
"абарчуки", "абарчуки", "абарчук", "абарчук",
"абаршалин", "абаршалин", "абаршалин",
"абаршалина", "абаршалин", "абаршалины",
"абаршалин", "абаршалины", "абаршалины",
"абас", "абас", "абасали", "абасалиевна"
)), sorted = "token", row.names = 1000:1100, class = c("data.table",
"data.frame"))
```
How can I add the lemma column to df by matching its token column against the token column in df2?
This is the result that I expect:
```
orientatie jaar lemma
1 gay 1980 абагайтуйский
2 gay 1981 абагайтуйский
3 gay 1982 абагайтуйский
4 gay 1983 абагайтуйский
5 gay 1984 абагайтуйский
...
```
I have tried to use the join function:
```
df_gelemmatiseerd <- df %>%
inner_join(df2, by = "token", relationship = "many-to-many")
```
But it returned an unexpected result.
Your help is appreciated, thank you! |
My NestJS code looks like this:
```
constructor(resource,action, @InjectModel(Role.name) public roleModel?: Model<Role>) {
this.resource = resource;
this.action = action;
}
```
I get an error when I try to use the model.
Error:
> Role model is not injected yet
> at RolesGuard.canActivate (/home/siva/projects/prac360/appointment-api/src/role/role.guard.ts:28:13)
> at GuardsConsumer.tryActivate (/home/siva/projects/prac360/appointment-api/node_modules/@nestjs/core/guards/guards-consumer.js:15:34)
How can I access the model? |
Initialise it outside the constructor: declare the model as an injected class property (e.g. `@InjectModel(Role.name) private roleModel: Model<Role>`) rather than an optional constructor parameter, so Nest can resolve it before the guard runs.
I have a Steam Deck and would like to install GLFW on it so that VSCode can find the library; I use CMake. I used CMake on another Linux machine with no problems, so I think the issue is related to the Steam Deck and SteamOS.
I have a directory /usr/lib/cmake/glfw3 containing the following files:
- glfw3Targets.cmake,
- glfw3Targets-noconfig.cmake,
- glfw3ConfigVersion.cmake,
- glfw3Config.cmake
Also there are two header files in /usr/include/GLFW called:
- glfw3.h
- glfw3native.h
I also have these .so files:
- /usr/lib/libglfw.so
- /usr/lib/libglfw.so.3
- /usr/lib/libglfw.so.3.3
In the main.cpp file, the `#include <GLFW/glfw3.h>` does not work.
My Cmake file looks now like this:
```
cmake_minimum_required(VERSION 3.12)
project(test)
# Add executable
add_executable(test src/main.cpp)
# Specify the path to GLFW include directory
set(GLFW_INCLUDE_DIR "/usr/include" CACHE PATH "Path to GLFW include directory")
# Specify the path to GLFW library directory
set(GLFW_LIBRARY_DIR "/usr/lib" CACHE PATH "Path to GLFW library directory")
# Add GLFW include directory
target_include_directories(test PRIVATE ${GLFW_INCLUDE_DIR})
# Link GLFW library to your executable
target_link_libraries(test ${GLFW_LIBRARY_DIR}/libglfw.so)
```
Maybe someone with a better understanding of CMake, C and Linux can help me.
Thank you in advance.
I tried different combinations of CMake files with find() and so on, searched the internet, and used ChatGPT to find a solution.
I reinstalled GLFW on the Steam Deck with:
```
sudo pacman -S glfw-x11
```
EDIT:
Here is the full error when I start the build:
```
[main] Building folder: game_engine all
[build] Starting build
[proc] Executing command: /usr/bin/cmake --build /home/deck/Programming/game_engine/build --config Debug --target all --
[build] ninja: error: '/usr/lib/libglfw.so', needed by 'test', missing and no known rule to make it
[proc] The command: /usr/bin/cmake --build /home/deck/Programming/game_engine/build --config Debug --target all -- exited with code: 1
[driver] Build completed: 00:00:00.029
[build] Build finished with exit code 1
```
EDIT2:
When I save the CMakeLists.txt, I get the following message in VSCode:
```
[main] Configuring project: game_engine
[proc] Executing command: /usr/bin/cmake --no-warn-unused-cli -DCMAKE_BUILD_TYPE:STRING=Debug -DCMAKE_EXPORT_COMPILE_COMMANDS:BOOL=TRUE -DCMAKE_C_COMPILER:FILEPATH=/usr/bin/gcc -DCMAKE_CXX_COMPILER:FILEPATH=/usr/bin/g++ -S/home/deck/Programming/game_engine -B/home/deck/Programming/game_engine/build -G Ninja
[cmake] Not searching for unused variables given on the command line.
[cmake] -- Configuring done (0.0s)
[cmake] -- Generating done (0.0s)
[cmake] -- Build files have been written to: /home/deck/Programming/game_engine/build
```
Permissions of /usr/lib/libglfw.so, from the command ```ls -l libglfw.so```:

    lrwxrwxrwx 1 root root 12 Jul 23  2022 libglfw.so -> libglfw.so.3
EDIT3:
If I change my ```target_link_libraries(test ${GLFW_LIBRARY_DIR}/libglfw.so)``` to ```target_link_libraries(test $usr/lib/libglfw.so.3.3)```,
I get this error in the build process:
```
[main] Building folder: game_engine
[build] Starting build
[proc] Executing command: /usr/bin/cmake --build /home/deck/Programming/game_engine/build --config Debug --target all --
[build] [1/2 50% :: 0.022] Building CXX object CMakeFiles/test.dir/src/main.cpp.o
[build] FAILED: CMakeFiles/test.dir/src/main.cpp.o
[build] /usr/bin/g++ -g -std=gnu++11 -MD -MT CMakeFiles/test.dir/src/main.cpp.o -MF CMakeFiles/test.dir/src/main.cpp.o.d -o CMakeFiles/test.dir/src/main.cpp.o -c /home/deck/Programming/game_engine/src/main.cpp
[build] /home/deck/Programming/game_engine/src/main.cpp:3:10: fatal error: GLFW/glfw3.h: No such file or directory
[build] 3 | #include <GLFW/glfw3.h>
[build] | ^~~~~~~~~~~~~~
[build] compilation terminated.
[build] ninja: build stopped: subcommand failed.
[proc] The command: /usr/bin/cmake --build /home/deck/Programming/game_engine/build --config Debug --target all -- exited with code: 1
[driver] Build completed: 00:00:00.054
[build] Build finished with exit code 1
``` |
Nothing is showing on the page.
library/pdf.php
    <?php
    defined('BASEPATH') OR exit('No direct script access allowed');

    // Dompdf namespace
    use Dompdf\Dompdf;

    class Pdf {
        public function __construct() {
            require_once dirname(__FILE__).'/dompdf/autoload.inc.php';
            $pdf = new DOMPDF();
            $CI = & get_instance();
            $CI->dompdf = $pdf;
        }
    }
Controller
    function invoice() {
        $data_user = array();
        $data_user['users'] = $this->db->get('users')->limit(2)->result();
        $this->load->view('invoice', $data_user);
        $html = $this->output->get_output();

        $this->load->library('pdf');
        $this->dompdf->loadHtml($html);
        $this->dompdf->setPaper('A4', 'landscape');
        $this->dompdf->render();
        $this->dompdf->stream("welcome.pdf", array("Attachment" => 0));
    }
To check whether the HTML was causing the issue, I reduced the view to a single h3 heading, but I still got a blank page.
Dompdf is showing a blank page
|codeigniter|dompdf| |
This is an extensible but not-so-clean solution based on tweaking Gradle [`sourceSet`](https://docs.gradle.org/8.6/dsl/org.gradle.api.tasks.SourceSet.html)s.
In `build.gradle.kts`:
1. Add flags for manually specifying which JDK version to use:
```kotlin
var only8 = false
var only21 = false
System.getenv("only")?.let {
when
{
it == "8" -> only8 = true
it == "21" -> only21 = true
}
}
```
2. Add both source sets for 8 and 21:
```kotlin
sourceSets {
val java8 by creating
val java21 by creating
}
```
3. Add dependencies for `main` `sourceSet`:
```kotlin
dependencies {
if (only8) implementation(sourceSets.getByName("java8").output)
if (only21) implementation(sourceSets.getByName("java21").output)
}
```
4. Remove useless `sourceSet` while packaging jar:
```kotlin
tasks.withType<Jar> {
if (only21 and sourceSets.any { it.name.equals("java8", ignoreCase = true) }) {
sourceSets.remove(sourceSets.getByName("java8"))
}
if (only8 and sourceSets.any { it.name.equals("java21", ignoreCase = true) }) {
sourceSets.remove(sourceSets.getByName("java21"))
}
}
```
Then, by specifying `only=8` or `only=21` environment variable while building jar, I can specify whether the compilation is done by (`src/main/java` + `src/java8/java`) or (`src/main/java` + `src/java21/java`).
- Pros: Not so hard to implement.
- Cons: Isolation at the file level may be too coarse-grained.
Official documentation: [`CStrBufT`][1]
I stumbled on this class. I did a search and found no examples. Is it a case of:
CString str1 = L"XYZ";
CStrBufT str1buf = CStrBufT(str1);
? Has anyone actually used this wrapper class?
I am not sure why there are close votes; the question is clear. I am asking for a worked example of using this class instead of:
str1.GetBuffer
str1.GetBufferSetLength
str1.ReleaseBuffer
[1]: https://learn.microsoft.com/en-us/cpp/atl-mfc-shared/reference/cstrbuft-class?view=msvc-170
|
My understanding is that once an element is given the `position: fixed` property, it is removed from the normal document flow. But in the example below, the fixed element respects the `margin-left: 300px` property of its parent.
<!-- begin snippet: js hide: false console: true babel: false -->
<!-- language: lang-html -->
<div style="height: 100vh; background-color: lightblue; margin-left: 300px;">
<div style="position: fixed; background-color: tomato; width: 100px; height: 100px;">
</div>
</div>
<!-- end snippet -->
And once I provide the `top: 0px` and `left: 0px` properties, it stays at the top left as expected. Can anyone explain why the fixed element respects the parent's `margin-left` when the `top` and `left` properties are not provided?
<!-- begin snippet: js hide: false console: true babel: false -->
<!-- language: lang-html -->
<div style="height: 100vh; background-color: lightblue; margin-left: 300px;">
<div style="position: fixed; background-color: tomato; width: 100px; height: 100px; top: 0px; left: 0px;">
</div>
</div>
<!-- end snippet -->
|
When I click the button, the console prints `render` **once**.
Expected: because of `StrictMode`, it should print `render` **twice**.
```html
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<title>Document</title>
<script
src="https://unpkg.com/react@17/umd/react.development.js"
crossorigin
></script>
<script
src="https://unpkg.com/react-dom@17/umd/react-dom.development.js"
crossorigin
></script>
<script src="https://unpkg.com/babel-standalone@6/babel.min.js"></script>
<script type="text/babel">
console.log(React.version, ReactDOM.version);
class App extends React.Component {
state = {
count: 0,
};
render() {
console.log("render");
return (
<div>
<button
onClick={() => {
this.setState({
count: this.state.count + 1,
});
}}
>
click
</button>
<div>count {this.state.count}</div>
</div>
);
}
}
ReactDOM.render(
<React.StrictMode>
<App />
</React.StrictMode>,
document.getElementById("root")
);
</script>
</head>
<body>
<div id="root"></div>
</body>
</html>
```
---
> If I change the version to 16, the console prints `render` twice
```
https://unpkg.com/react@16/umd/react.development.js
https://unpkg.com/react-dom@16/umd/react-dom.development.js
``` |
Why does StrictMode not render twice in React 17?
|reactjs| |
I've been trying to set up a reinforcement learning environment on my local PC (Intel macOS), but I keep having an issue with the pygame window: the system does not let me close it with the `env.close()` function.
Here is my main.py.
```
import os
import gym
from stable_baselines3 import PPO
from stable_baselines3.common.vec_env import DummyVecEnv
from stable_baselines3.common.evaluation import evaluate_policy
env_name = 'CartPole-v1'
env = gym.make(env_name, render_mode="human")
for episode in range(1, 11):
score = 0
state = env.reset()
done = False
while not done:
env.render()
action = env.action_space.sample()
n_state, reward, done, info, _ = env.step(action)
score += reward
print('Episode:', episode, 'Score:', score)
env.close()
```
I've tested multiple IDEs and different versions of Python, but I'm at a loss about what to do. The only ways I was able to close the window were restarting the kernel in Jupyter Notebook or re-running the script in PyCharm. I'm looking for advice on how to resolve this issue.
I was able to obtain the score values without using `render_mode="human"`, but the display would not show up unless I set that specific render mode. I'm not quite sure how all the other tutorials on YouTube were able to pop up the rendering display.
Here's a sample of the code's output from PyCharm.
```
/opt/anaconda3/envs/cartpole/lib/python3.8/site-packages/gym/utils/passive_env_checker.py:233: DeprecationWarning: `np.bool8` is a deprecated alias for `np.bool_`. (Deprecated NumPy 1.24)
if not isinstance(terminated, (bool, np.bool8)):
Episode: 1 Score: 25.0
Episode: 2 Score: 21.0
Episode: 3 Score: 24.0
Episode: 4 Score: 37.0
Episode: 5 Score: 33.0
Episode: 6 Score: 25.0
Episode: 7 Score: 20.0
Episode: 8 Score: 22.0
Episode: 9 Score: 10.0
Episode: 10 Score: 21.0
env.close()
```
Although I've added another `env.close()` call, the pygame window never closes. Please let me know if I'm missing anything. I really appreciate the help.
You can connect from Postman, but once the WebSocket connection is established you also have to perform the subsequent STOMP events:
1. Establish the connection using the CONNECT event; on success you get an acknowledgement and heartbeat from the server [![Connect event][1]][1]
2. Then subscribe to the destination to listen for messages
[![Subscribe][2]][2]
3. Once subscribed, you will receive messages from the server
4. Send a message using the SEND event [![Send message][3]][3]

For the NUL character, set a global variable:

    postman.setGlobalVariable("NULL_CHAR", '\0');

Then, whenever you need it for WS, just reference the variable with `{{NULL_CHAR}}`.
[1]: https://i.stack.imgur.com/sojyD.png
[2]: https://i.stack.imgur.com/nbBam.png
[3]: https://i.stack.imgur.com/19O5M.png |
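To see why the NUL character matters: every STOMP frame is terminated by a NUL (`\0`) byte, which is exactly what the `NULL_CHAR` Postman variable holds. A minimal sketch in Python (the function name and destination are illustrative, not part of any library) that builds a raw SEND frame by hand:

```python
def stomp_send_frame(destination: str, body: str) -> str:
    """Build a raw STOMP SEND frame by hand; illustrative only, not a full client."""
    headers = (
        "SEND\n"
        f"destination:{destination}\n"
        f"content-length:{len(body.encode('utf-8'))}\n"
        "\n"  # a blank line separates the headers from the body
    )
    # Every STOMP frame ends with a NUL byte -- without it the server
    # keeps waiting for the rest of the frame.
    return headers + body + "\x00"

frame = stomp_send_frame("/app/chat", "hello")
print(repr(frame))  # 'SEND\ndestination:/app/chat\ncontent-length:5\n\nhello\x00'
```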
Import your JSON file, add each element to a dictionary keyed by its 1-based index, then convert the dictionary to a JSON object and write it to a JSON file.
Here is sample code:
    import json

    # Load the JSON array from file (the with-block also closes the file)
    with open('data.json') as f:
        data = json.load(f)

    # Build a dictionary keyed by each element's 1-based index
    updated_data = dict()
    for index, item in enumerate(data, start=1):
        updated_data[index] = item

    # Serialize the dictionary and write it to a new JSON file
    json_object = json.dumps(updated_data)
    with open("updated_data.json", "w") as outfile:
        outfile.write(json_object)
Output:
{"1": {"B4": 14, "B5": 12}, "2": {"B4": 58, "B5": 54}, "3": {"B4": 26, "B5": 65}} |
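The loop above can also be collapsed into a single dict comprehension; note that `json.dumps` converts the integer keys to strings automatically:

```python
import json

# Sample data matching the output shown above
data = [{"B4": 14, "B5": 12}, {"B4": 58, "B5": 54}, {"B4": 26, "B5": 65}]

# Same re-keying as the loop version, as a one-liner
updated_data = {i: item for i, item in enumerate(data, start=1)}

print(json.dumps(updated_data))
# {"1": {"B4": 14, "B5": 12}, "2": {"B4": 58, "B5": 54}, "3": {"B4": 26, "B5": 65}}
```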
I am trying to use an API for the Dolphin emulator and keep getting the error: "The "dolphin" repository has 18 submodules which won't be opened automatically. You can still open each one individually by opening a file within." How do I make them open automatically? And how do I find which submodules aren't opening, and then open them (opening a file inside isn't working)? The submodules not opening causes problems when I try to build the .sln file. I am using Git. There is also an error message at the end of the clone output that I am not sure how to interpret or fix. Does anyone know what it means?
Here is the output for cloning the repo:
C:\Users\License_Name\Downloads\Dolphin modded>git clone --recurse-submodules https://github.com/Felk/dolphin.git
Cloning into 'dolphin'...
remote: Enumerating objects: 436701, done.
remote: Counting objects: 100% (7/7), done.
remote: Compressing objects: 100% (3/3), done.
remote: Total 436701 (delta 4), reused 4 (delta 4), pack-reused 436694
Receiving objects: 100% (436701/436701), 433.07 MiB | 5.72 MiB/s, done.
Resolving deltas: 100% (344859/344859), done.
Updating files: 100% (6680/6680), done.
Submodule 'Externals/FFmpeg-bin' (https://github.com/dolphin-emu/ext-win-ffmpeg.git) registered for path 'Externals/FFmpeg-bin'
Submodule 'Externals/Qt' (https://github.com/dolphin-emu/ext-win-qt.git) registered for path 'Externals/Qt'
Submodule 'SDL' (https://github.com/libsdl-org/SDL.git) registered for path 'Externals/SDL/SDL'
Submodule 'Externals/VulkanMemoryAllocator' (https://github.com/GPUOpen-LibrariesAndSDKs/VulkanMemoryAllocator.git) registered for path 'Externals/VulkanMemoryAllocator'
Submodule 'Externals/cubeb/cubeb' (https://github.com/mozilla/cubeb.git) registered for path 'Externals/cubeb/cubeb'
Submodule 'Externals/curl/curl' (https://github.com/curl/curl.git) registered for path 'Externals/curl/curl'
Submodule 'Externals/fmt/fmt' (https://github.com/fmtlib/fmt.git) registered for path 'Externals/fmt/fmt'
Submodule 'Externals/gtest' (https://github.com/google/googletest.git) registered for path 'Externals/gtest'
Submodule 'Externals/implot/implot' (https://github.com/epezent/implot.git) registered for path 'Externals/implot/implot'
Submodule 'Externals/libadrenotools' (https://github.com/bylaws/libadrenotools.git) registered for path 'Externals/libadrenotools'
Submodule 'Externals/libspng/libspng' (https://github.com/randy408/libspng.git) registered for path 'Externals/libspng/libspng'
Submodule 'libusb' (https://github.com/libusb/libusb.git) registered for path 'Externals/libusb/libusb'
Submodule 'Externals/lz4/lz4' (https://github.com/lz4/lz4) registered for path 'Externals/lz4/lz4'
Submodule 'Externals/mGBA/mgba' (https://github.com/mgba-emu/mgba.git) registered for path 'Externals/mGBA/mgba'
Submodule 'Externals/python' (https://github.com/Felk/ext-python.git) registered for path 'Externals/python'
Submodule 'Externals/rcheevos/rcheevos' (https://github.com/RetroAchievements/rcheevos.git) registered for path 'Externals/rcheevos/rcheevos'
Submodule 'Externals/spirv_cross/SPIRV-Cross' (https://github.com/KhronosGroup/SPIRV-Cross.git) registered for path 'Externals/spirv_cross/SPIRV-Cross'
Submodule 'Externals/zlib-ng/zlib-ng' (https://github.com/zlib-ng/zlib-ng.git) registered for path 'Externals/zlib-ng/zlib-ng'
Cloning into 'C:/Users/License_Name/Downloads/Dolphin modded/dolphin/Externals/FFmpeg-bin'...
remote: Enumerating objects: 152, done.
remote: Counting objects: 100% (152/152), done.
remote: Compressing objects: 100% (123/123), done.
remote: Total 152 (delta 29), reused 152 (delta 29), pack-reused 0
Receiving objects: 100% (152/152), 11.75 MiB | 7.21 MiB/s, done.
Resolving deltas: 100% (29/29), done.
Cloning into 'C:/Users/License_Name/Downloads/Dolphin modded/dolphin/Externals/Qt'...
remote: Enumerating objects: 2215, done.
remote: Counting objects: 100% (2215/2215), done.
remote: Compressing objects: 100% (1413/1413), done.
remote: Total 2215 (delta 350), reused 2179 (delta 326), pack-reused 0
Receiving objects: 100% (2215/2215), 89.60 MiB | 4.68 MiB/s, done.
Resolving deltas: 100% (350/350), done.
Cloning into 'C:/Users/License_Name/Downloads/Dolphin modded/dolphin/Externals/SDL/SDL'...
remote: Enumerating objects: 23950, done.
remote: Counting objects: 100% (23950/23950), done.
remote: Compressing objects: 100% (10674/10674), done.
remote: Total 23950 (delta 17342), reused 16805 (delta 11724), pack-reused 0
Receiving objects: 100% (23950/23950), 24.42 MiB | 6.23 MiB/s, done.
Resolving deltas: 100% (17342/17342), done.
Cloning into 'C:/Users/License_Name/Downloads/Dolphin modded/dolphin/Externals/VulkanMemoryAllocator'...
remote: Enumerating objects: 10405, done.
remote: Counting objects: 100% (2005/2005), done.
remote: Compressing objects: 100% (353/353), done.
remote: Total 10405 (delta 1733), reused 1771 (delta 1612), pack-reused 8400
Receiving objects: 100% (10405/10405), 26.42 MiB | 6.38 MiB/s, done.
Resolving deltas: 100% (8209/8209), done.
Cloning into 'C:/Users/License_Name/Downloads/Dolphin modded/dolphin/Externals/cubeb/cubeb'...
remote: Enumerating objects: 627, done.
remote: Counting objects: 100% (627/627), done.
remote: Compressing objects: 100% (450/450), done.
remote: Total 627 (delta 413), reused 277 (delta 137), pack-reused 0
Receiving objects: 100% (627/627), 515.56 KiB | 3.71 MiB/s, done.
Resolving deltas: 100% (413/413), done.
Cloning into 'C:/Users/License_Name/Downloads/Dolphin modded/dolphin/Externals/curl/curl'...
remote: Enumerating objects: 209969, done.
remote: Counting objects: 100% (5425/5425), done.
remote: Compressing objects: 100% (613/613), done.
remote: Total 209969 (delta 4931), reused 5147 (delta 4812), pack-reused 204544
Receiving objects: 100% (209969/209969), 64.58 MiB | 6.36 MiB/s, done.
Resolving deltas: 100% (166357/166357), done.
Cloning into 'C:/Users/License_Name/Downloads/Dolphin modded/dolphin/Externals/fmt/fmt'...
remote: Enumerating objects: 35049, done.
remote: Counting objects: 100% (35049/35049), done.
remote: Compressing objects: 100% (8543/8543), done.
remote: Total 35049 (delta 23840), reused 34816 (delta 23725), pack-reused 0
Receiving objects: 100% (35049/35049), 14.41 MiB | 6.08 MiB/s, done.
Resolving deltas: 100% (23840/23840), done.
Cloning into 'C:/Users/License_Name/Downloads/Dolphin modded/dolphin/Externals/gtest'...
remote: Enumerating objects: 27328, done.
remote: Counting objects: 100% (367/367), done.
remote: Compressing objects: 100% (194/194), done.
remote: Total 27328 (delta 223), reused 211 (delta 153), pack-reused 26961
Receiving objects: 100% (27328/27328), 12.86 MiB | 5.52 MiB/s, done.
Resolving deltas: 100% (20257/20257), done.
Cloning into 'C:/Users/License_Name/Downloads/Dolphin modded/dolphin/Externals/implot/implot'...
remote: Enumerating objects: 2574, done.
remote: Counting objects: 100% (710/710), done.
remote: Compressing objects: 100% (76/76), done.
remote: Total 2574 (delta 657), reused 638 (delta 634), pack-reused 1864
Receiving objects: 100% (2574/2574), 1.61 MiB | 6.66 MiB/s, done.
Resolving deltas: 100% (1671/1671), done.
Cloning into 'C:/Users/License_Name/Downloads/Dolphin modded/dolphin/Externals/libadrenotools'...
remote: Enumerating objects: 223, done.
remote: Counting objects: 100% (36/36), done.
remote: Compressing objects: 100% (27/27), done.
remote: Total 223 (delta 10), reused 17 (delta 5), pack-reused 187
Receiving objects: 100% (223/223), 138.08 KiB | 10.62 MiB/s, done.
Resolving deltas: 100% (109/109), done.
Cloning into 'C:/Users/License_Name/Downloads/Dolphin modded/dolphin/Externals/libspng/libspng'...
remote: Enumerating objects: 1048, done.
remote: Counting objects: 100% (1048/1048), done.
remote: Compressing objects: 100% (683/683), done.
remote: Total 1048 (delta 566), reused 748 (delta 330), pack-reused 0
Receiving objects: 100% (1048/1048), 404.12 KiB | 4.87 MiB/s, done.
Resolving deltas: 100% (566/566), done.
Cloning into 'C:/Users/License_Name/Downloads/Dolphin modded/dolphin/Externals/libusb/libusb'...
remote: Enumerating objects: 4544, done.
remote: Counting objects: 100% (4544/4544), done.
remote: Compressing objects: 100% (1575/1575), done.
remote: Total 4544 (delta 3509), reused 3792 (delta 2900), pack-reused 0
Receiving objects: 100% (4544/4544), 2.02 MiB | 7.91 MiB/s, done.
Resolving deltas: 100% (3509/3509), done.
Cloning into 'C:/Users/License_Name/Downloads/Dolphin modded/dolphin/Externals/lz4/lz4'...
remote: Enumerating objects: 15889, done.
remote: Counting objects: 100% (1778/1778), done.
remote: Compressing objects: 100% (691/691), done.
remote: Total 15889 (delta 1187), reused 1480 (delta 1067), pack-reused 14111
Receiving objects: 100% (15889/15889), 6.93 MiB | 4.25 MiB/s, done.
Resolving deltas: 100% (10975/10975), done.
Cloning into 'C:/Users/License_Name/Downloads/Dolphin modded/dolphin/Externals/mGBA/mgba'...
remote: Enumerating objects: 12580, done.
remote: Counting objects: 100% (12580/12580), done.
remote: Compressing objects: 100% (8204/8204), done.
remote: Total 12580 (delta 7920), reused 7970 (delta 4146), pack-reused 0
Receiving objects: 100% (12580/12580), 27.92 MiB | 6.33 MiB/s, done.
Resolving deltas: 100% (7920/7920), done.
Cloning into 'C:/Users/License_Name/Downloads/Dolphin modded/dolphin/Externals/python'...
remote: Enumerating objects: 833, done.
remote: Counting objects: 100% (474/474), done.
remote: Compressing objects: 100% (308/308), done.
remote: Total 833 (delta 204), reused 344 (delta 166), pack-reused 359
Receiving objects: 100% (833/833), 48.85 MiB | 4.89 MiB/s, done.
Resolving deltas: 100% (326/326), done.
Cloning into 'C:/Users/License_Name/Downloads/Dolphin modded/dolphin/Externals/rcheevos/rcheevos'...
remote: Enumerating objects: 4639, done.
remote: Counting objects: 100% (2299/2299), done.
remote: Compressing objects: 100% (672/672), done.
remote: Total 4639 (delta 1954), reused 1830 (delta 1621), pack-reused 2340
Receiving objects: 100% (4639/4639), 2.01 MiB | 6.27 MiB/s, done.
Resolving deltas: 100% (3155/3155), done.
Cloning into 'C:/Users/License_Name/Downloads/Dolphin modded/dolphin/Externals/spirv_cross/SPIRV-Cross'...
remote: Enumerating objects: 10282, done.
remote: Counting objects: 100% (10282/10282), done.
remote: Compressing objects: 100% (5064/5064), done.
remote: Total 10282 (delta 6940), reused 7687 (delta 5104), pack-reused 0
Receiving objects: 100% (10282/10282), 3.32 MiB | 9.03 MiB/s, done.
Resolving deltas: 100% (6940/6940), done.
Cloning into 'C:/Users/License_Name/Downloads/Dolphin modded/dolphin/Externals/zlib-ng/zlib-ng'...
remote: Enumerating objects: 5113, done.
remote: Counting objects: 100% (5113/5113), done.
remote: Compressing objects: 100% (2601/2601), done.
remote: Total 5113 (delta 3273), reused 4008 (delta 2420), pack-reused 0
Receiving objects: 100% (5113/5113), 5.46 MiB | 8.46 MiB/s, done.
Resolving deltas: 100% (3273/3273), done.
Submodule path 'Externals/FFmpeg-bin': checked out '9bc087fbca36ce5a85eb4fd73f0c73813593e5a2'
Submodule path 'Externals/Qt': checked out '495517af2b922c10c24f543e0fd6ea3ddf774e50'
Submodule path 'Externals/SDL/SDL': checked out 'ac13ca9ab691e13e8eebe9684740ddcb0d716203'
Submodule path 'Externals/VulkanMemoryAllocator': checked out '498e20dfd1343d99b9115201034bb0219801cdec'
remote: Enumerating objects: 6764, done.
remote: Counting objects: 100% (6258/6258), done.
remote: Compressing objects: 100% (1766/1766), done.
Receiving objects: 100% (6090/6090), 2.43 MiB | 2.74 MiB/s, done.
remote: Total 6090 (delta 4420), reused 5785 (delta 4176), pack-reused 0
Resolving deltas: 100% (4420/4420), completed with 81 local objects.
From https://github.com/mozilla/cubeb
* branch 27d2a102b0b75d9e49d43bc1ea516233fb87d778 -> FETCH_HEAD
Submodule path 'Externals/cubeb/cubeb': checked out '27d2a102b0b75d9e49d43bc1ea516233fb87d778'
Submodule 'cmake/sanitizers-cmake' (https://github.com/arsenm/sanitizers-cmake) registered for path 'Externals/cubeb/cubeb/cmake/sanitizers-cmake'
Submodule 'googletest' (https://github.com/google/googletest) registered for path 'Externals/cubeb/cubeb/googletest'
Cloning into 'C:/Users/License_Name/Downloads/Dolphin modded/dolphin/Externals/cubeb/cubeb/cmake/sanitizers-cmake'...
remote: Enumerating objects: 252, done.
remote: Counting objects: 100% (96/96), done.
remote: Compressing objects: 100% (19/19), done.
remote: Total 252 (delta 84), reused 79 (delta 77), pack-reused 156
Receiving objects: 100% (252/252), 55.82 KiB | 11.16 MiB/s, done.
Resolving deltas: 100% (160/160), done.
Cloning into 'C:/Users/License_Name/Downloads/Dolphin modded/dolphin/Externals/cubeb/cubeb/googletest'...
remote: Enumerating objects: 27328, done.
remote: Counting objects: 100% (367/367), done.
remote: Compressing objects: 100% (194/194), done.
remote: Total 27328 (delta 223), reused 211 (delta 153), pack-reused 26961
Receiving objects: 100% (27328/27328), 12.86 MiB | 3.80 MiB/s, done.
Resolving deltas: 100% (20257/20257), done.
Submodule path 'Externals/cubeb/cubeb/cmake/sanitizers-cmake': checked out 'aab6948fa863bc1cbe5d0850bc46b9ef02ed4c1a'
Submodule path 'Externals/cubeb/cubeb/googletest': checked out '800f5422ac9d9e0ad59cd860a2ef3a679588acb4'
Submodule path 'Externals/curl/curl': checked out '7ab9d43720bc34d9aa351c7ca683c1668ebf8335'
Submodule path 'Externals/fmt/fmt': checked out 'f5e54359df4c26b6230fc61d38aa294581393084'
Submodule path 'Externals/gtest': checked out '58d77fa8070e8cec2dc1ed015d66b454c8d78850'
Submodule path 'Externals/implot/implot': checked out 'cc5e1daa5c7f2335a9460ae79c829011dc5cef2d'
Submodule path 'Externals/libadrenotools': checked out 'f4ce3c9618e7ecfcdd238b17dad9a0b888f5de90'
Submodule 'lib/linkernsbypass' (https://github.com/bylaws/liblinkernsbypass/) registered for path 'Externals/libadrenotools/lib/linkernsbypass'
Cloning into 'C:/Users/License_Name/Downloads/Dolphin modded/dolphin/Externals/libadrenotools/lib/linkernsbypass'...
remote: Enumerating objects: 26, done.
remote: Counting objects: 100% (21/21), done.
remote: Compressing objects: 100% (14/14), done.
remote: Total 26 (delta 11), reused 14 (delta 7), pack-reused 5
Receiving objects: 100% (26/26), 10.28 KiB | 10.28 MiB/s, done.
Resolving deltas: 100% (11/11), done.
Submodule path 'Externals/libadrenotools/lib/linkernsbypass': checked out '2f8df932534999196751dde3e0302f83ef1f4513'
remote: Enumerating objects: 154, done.
remote: Counting objects: 100% (105/105), done.
remote: Compressing objects: 100% (31/31), done.
remote: Total 54 (delta 31), reused 40 (delta 23), pack-reused 0
Unpacking objects: 100% (54/54), 7.30 KiB | 43.00 KiB/s, done.
From https://github.com/randy408/libspng
* branch dc5b1032c08efac68ad30170f7ccbf0aa8dd55c9 -> FETCH_HEAD
Submodule path 'Externals/libspng/libspng': checked out 'dc5b1032c08efac68ad30170f7ccbf0aa8dd55c9'
remote: Enumerating objects: 86, done.
remote: Counting objects: 100% (72/72), done.
remote: Compressing objects: 100% (7/7), done.
remote: Total 16 (delta 10), reused 12 (delta 6), pack-reused 0
Unpacking objects: 100% (16/16), 2.99 KiB | 49.00 KiB/s, done.
From https://github.com/libusb/libusb
* branch ba698478afc3d3a72644eef9fc4cd24ce8383a4c -> FETCH_HEAD
Submodule path 'Externals/libusb/libusb': checked out 'ba698478afc3d3a72644eef9fc4cd24ce8383a4c'
Submodule path 'Externals/lz4/lz4': checked out '5fc0630a0ed55e9755b0fe3990cd021ae1c3edc1'
remote: Enumerating objects: 4382, done.
remote: Counting objects: 100% (3205/3205), done.
remote: Compressing objects: 100% (695/695), done.
Receiving objects: 100% (2408/2408), 350.09 KiB | 7.00 MiB/s, done.
remote: Total 2408 (delta 1922), reused 2172 (delta 1698), pack-reused 0
Resolving deltas: 100% (1922/1922), completed with 258 local objects.
From https://github.com/mgba-emu/mgba
* branch 8739b22fbc90fdf0b4f6612ef9c0520f0ba44a51 -> FETCH_HEAD
Submodule path 'Externals/mGBA/mgba': checked out '8739b22fbc90fdf0b4f6612ef9c0520f0ba44a51'
Submodule path 'Externals/python': checked out '56bae2976e9618b3b75533a6a50bd926dea6704c'
Submodule path 'Externals/rcheevos/rcheevos': checked out 'd9e990e6d13527532b7e2bb23164a1f3b7f33bb5'
Submodule path 'Externals/spirv_cross/SPIRV-Cross': checked out '50b4d5389b6a06f86fb63a2848e1a7da6d9755ca'
remote: Enumerating objects: 4202, done.
remote: Counting objects: 100% (3222/3222), done.
remote: Compressing objects: 100% (816/816), done.
remote: Total 2569 (delta 1981), reused 2269 (delta 1691), pack-reused 0
Receiving objects: 100% (2569/2569), 541.15 KiB | 8.20 MiB/s, done.
Resolving deltas: 100% (1981/1981), completed with 214 local objects.
From https://github.com/zlib-ng/zlib-ng
* branch ce01b1e41da298334f8214389cc9369540a7560f -> FETCH_HEAD
Submodule path 'Externals/zlib-ng/zlib-ng': checked out 'ce01b1e41da298334f8214389cc9369540a7560f'
' is not recognized as an internal or external command,
operable program or batch file.
|
This is my source code; I don't know why I'm getting the error.
I tried rewriting the code from scratch, but it still didn't work.
```
import pickle
def writeFile():
with open('main.dat', mode='rb+') as fh:
stu = {}
n = int(input('How many records you wanna enter(press 0 if no records to be added):'))
for i in range(n):
name = input('Enter the name of the student:')
roll = int(input('Enter the roll no of the student:'))
age = int(input('Enter the age of the student:'))
marks = int(input('Enter the marks of the student:'))
# updating record
stu['name'] = name
stu['age'] = age
stu['roll'] = roll
stu['marks'] = marks
# dumping in the file
updateRec = pickle.dump(stu, fh)
try:
view = True
while view:
data = pickle.load(fh)
print(data)
except Exception as err:
print(err)
writeFile()
``` |
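For reference, this is the write-then-read flow I am aiming for, reduced to a minimal sketch (the file name and record fields are placeholders): each record is appended as its own pickle frame, and reading starts from a freshly opened handle so the file position begins at 0:

```python
import pickle

def write_records(path, records):
    # append each record to the file as a separate pickle frame
    with open(path, 'ab') as fh:
        for rec in records:
            pickle.dump(rec, fh)

def read_records(path):
    # read pickle frames from the start of the file until EOF
    out = []
    with open(path, 'rb') as fh:
        while True:
            try:
                out.append(pickle.load(fh))
            except EOFError:
                break
    return out
```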
`TProperty` exists only in the C# source code. It is always resolved to a concrete type before the method can run. If you have a method

```
void Test<T>(T arg)
{
}
```

and call it like this

```
Test("hello");
Test(3);
```

the runtime ends up with code for two methods, as if you had written:

```
void Test(string arg)
{
}

void Test(int arg)
{
}
```

(Strictly speaking, the C# compiler emits a single generic method in IL; the JIT then produces specialized native code for each value type and shared code for reference types.) This means that you have to supply concrete types for your generic parameters if you want to have an invokable method.
|
I'm deploying a zip package as an Azure function with the use of Terraform. The relevant code fragments below:
```
resource "azurerm_storage_account" "mtr_storage" {
  name                     = "mtrstorage${random_string.random_storage_account_suffix.result}"
  resource_group_name      = azurerm_resource_group.mtr_rg.name
  location                 = azurerm_resource_group.mtr_rg.location
  account_kind             = "BlobStorage"
  account_tier             = "Standard"
  account_replication_type = "LRS"

  network_rules {
    default_action             = "Deny"
    ip_rules                   = ["127.0.0.1", "my.id.addr.here"]
    virtual_network_subnet_ids = [azurerm_subnet.mtr_subnet.id]
    bypass                     = ["AzureServices", "Metrics"]
  }

  tags = {
    environment = "local"
  }
}

resource "azurerm_storage_blob" "mtr_hello_function_blob" {
  name                   = "MTR.ListBlobsFunction.publish.zip"
  storage_account_name   = azurerm_storage_account.mtr_storage.name
  storage_container_name = azurerm_storage_container.mtr_hello_function_container.name
  type                   = "Block"
  source                 = "./example_code/MTR.ListBlobsFunction/MTR.ListBlobsFunction.publish.zip"
}

resource "azurerm_service_plan" "mtr_hello_function_svc_plan" {
  name                = "mtr-hello-function-svc-plan"
  location            = azurerm_resource_group.mtr_rg.location
  resource_group_name = azurerm_resource_group.mtr_rg.name
  os_type             = "Linux"
  sku_name            = "B1"

  tags = {
    environment = "local"
  }
}

data "azurerm_storage_account_blob_container_sas" "storage_account_blob_container_sas_for_hello" {
  connection_string = azurerm_storage_account.mtr_storage.primary_connection_string
  container_name    = azurerm_storage_container.mtr_hello_function_container.name
  start             = timeadd(timestamp(), "-5m")
  expiry            = timeadd(timestamp(), "5m")

  permissions {
    read   = true
    add    = false
    create = false
    write  = false
    delete = false
    list   = false
  }
}

resource "azurerm_linux_function_app" "mtr_hello_function" {
  name                       = "mtr-hello-function"
  location                   = azurerm_resource_group.mtr_rg.location
  resource_group_name        = azurerm_resource_group.mtr_rg.name
  service_plan_id            = azurerm_service_plan.mtr_hello_function_svc_plan.id
  storage_account_name       = azurerm_storage_account.mtr_storage.name
  storage_account_access_key = azurerm_storage_account.mtr_storage.primary_access_key

  app_settings = {
    "FUNCTIONS_WORKER_RUNTIME"    = "dotnet"
    "WEBSITE_RUN_FROM_PACKAGE"    = "https://${azurerm_storage_account.mtr_storage.name}.blob.core.windows.net/${azurerm_storage_container.mtr_hello_function_container.name}/${azurerm_storage_blob.mtr_hello_function_blob.name}${data.azurerm_storage_account_blob_container_sas.storage_account_blob_container_sas_for_hello.sas}"
    "AzureWebJobsStorage"         = azurerm_storage_account.mtr_storage.primary_connection_string
    "AzureWebJobsDisableHomepage" = "true"
  }

  site_config {
    always_on                              = true
    application_insights_connection_string = azurerm_application_insights.mtr_ai.connection_string

    application_stack {
      dotnet_version              = "8.0"
      use_dotnet_isolated_runtime = true
    }

    cors {
      allowed_origins = ["*"]
    }
  }

  tags = {
    environment = "local"
  }
}
```
The zip file is uploaded to the storage account, it is downloadable and has the correct structure (or so I think). Function App is created as well, however when I scroll down to see the functions, it tells me there was an error loading functions: `Encountered an error (ServiceUnavailable) from host runtime.` It's probably some configuration error, but I just can't see it. Also I went to the advanced tools tab and in wwwroot there's only one file named `run_from_package_failure.log`. Its contents are:
RunFromPackage> Failed to download package from https://mtrstorageqbr56n79h4.blob.core.windows.net/hello-function-releases/MTR.ListBlobsFunction.publish.zip?sv=2018-11-09&sr=c&st=2024-03-31T12:09:08Z&se=2024-03-31T12:19:08Z&sp=r&spr=https&sig=c6dhOCRPL%2BEQujbuq0589D2MXFN5eXfBmhow1QWSfHU%3D. Return code: 1 - (this URL is malformed, I swapped some characters ;) ). However when I go to the url the log contains, the zip file gets downloaded successfully... |
app.js
```
import React from "react";
import { createRoot } from "react-dom/client";
import { Provider } from "react-redux";
import configureStore from "./store/configureStore";
import { startSetExpenses } from "./actions/expenses";
import 'normalize.css/normalize.css';
import './styles/styles.scss';
import AppRouter from "./routers/AppRouter";
import 'react-dates/lib/css/_datepicker.css';
import { auth } from './firebase/firebase';
import { onAuthStateChanged } from "firebase/auth";
import { useNavigate } from "react-router-dom";
const root = createRoot(document.getElementById("app"));
const navigate = useNavigate();
const store = configureStore();
const jsx = (
<Provider store={store}>
<AppRouter />
</Provider>
);
const loadingComponent = <div><p>Loading...</p></div>;
root.render(loadingComponent);
store.dispatch(startSetExpenses())
.then(() => {
root.render(jsx);
})
.catch((error) => {
console.error("Error fetching expenses:", error);
});
onAuthStateChanged(auth, (user) => {
if(user){
console.log("LoggedIN");
navigate("/dashboard");
}else{
console.log("LoggedOUT");
navigate("/");
}
})
```
AppRouter.js:
```
import React from 'react';
import LoginPage from '../components/LoginPage';
import Header from '../components/Header';
import NotFound from '../components/NotFound';
import EditExpense from '../components/EditExpense';
import AddExpense from '../components/AddExpense';
import ExpenseDashboardPage from '../components/ExpenseDashboard';
import HelpPage from '../components/HelpPage';
import { Routes, Route, BrowserRouter } from 'react-router-dom';
const AppRouter = () => (
<BrowserRouter >
<Header/>
<Routes >
<Route path="/" element={<LoginPage />} />
<Route path="/dashboard" element={<ExpenseDashboardPage />} />
<Route path="/create" element={<AddExpense />} />
<Route path="/edit/:id" element={<EditExpense />} />
<Route path="/help" element={<HelpPage />} />
<Route path="*" element={<NotFound />} />
</Routes>
</BrowserRouter>
);
export default AppRouter;
```
The `store.dispatch(startSetExpenses())` call fetches the initial data from Firebase. Until it resolves, I render the loading component on screen.
What I expected by using `useNavigate` from react-router-dom was to navigate, inside `onAuthStateChanged`, to the "/" path when no user is logged in, and to "/dashboard" when a user is logged in.
But it isn't working. How do I achieve that? |
MacOS Bash-Script: while read p and echo |
|bash|macos| |
Coding a two-dimensional, dynamically allocated array of `int` is a non-trivial task. If you use a _double pointer_, there are significant memory allocation details. In particular, you need to check every allocation attempt, and, if any one fails, back out of the ones that didn't fail, without leaking memory. And, finally, to take a term from the C++ lexicon, you need to provide some equivalent of the _Rule of Three_ functions used by a C++ _RAII_ class.
A two-dimensional array of `int` with `n_rows` and `n_cols` should probably be allocated with a single call to `malloc`:
```lang-c
int* data = malloc(n_rows * n_cols * sizeof(int));
```
With this approach, you have to write indexing functions that accept both `row` and `col` subscripts.
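Such an indexing function can be a one-liner. A sketch (the name `at` is illustrative, not from the OP's code):

```lang-c
#include <stddef.h>

/* Map a (row, col) pair onto the single flat buffer. */
int* at(int* data, size_t n_cols, size_t row, size_t col)
{
    return &data[row * n_cols + col];
}
```

Callers then write `*at(data, n_cols, r, c)` where `data[r][c]` would have appeared.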
The program in the OP, however, takes the _double pointer_ route, and allocates an _array of pointers to int_. The elements in such an array are `int*`. Thus, the argument to the `sizeof` operator must be `int*`.
```lang-c
int** data = malloc(n_rows * sizeof (int*)); // step 1.
```
Afterwards, you run a loop to allocate the columns of each row. Conceptually, each row is an array of `int`, with `malloc` returning a pointer to the first element of each row.
```lang-c
for (size_t r = 0; r < n_rows; ++r)
    data[r] = malloc(n_cols * sizeof(int)); // step 2.
```
The program in the OP errs in step 1.
```lang-c
// from the OP:
ret = (int **)malloc(sizeof(int) * len); // should be sizeof(int*)
```
Taking guidance from the [answer by @Eric Postpischil](https://stackoverflow.com/a/78248309/22193627), this can be coded as:
```lang-c
int** ret = malloc(len * sizeof *ret);
```
The program below, however, uses (the equivalent of):
```lang-c
int** ret = malloc(len * sizeof(int*));
```
#### Fixing memory leaks
The program in the OP is careful to check each call to `malloc`, and abort the allocation process if any one of them fails. After a failure, however, it does not call `free` to release the allocations that were successful. Potentially, therefore, it leaks memory. It should keep track of the row where a failure occurs, and call `free` on the preceding rows. It should also call `free` on the original array of pointers.
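That back-out sequence can be sketched in isolation. In the sketch below, the allocator is passed in as a function pointer — an illustrative device that is not part of the OP's program — purely so that the failure path can be exercised:

```lang-c
#include <stddef.h>
#include <stdlib.h>

/* Allocate an n_rows x n_cols array of int. On any failure, free
   every allocation that succeeded and return NULL instead of leaking. */
int** alloc_rows(size_t n_rows, size_t n_cols, void* (*alloc)(size_t))
{
    int** data = alloc(n_rows * sizeof(int*));
    if (data == NULL)
        return NULL;
    for (size_t r = 0; r < n_rows; ++r) {
        data[r] = alloc(n_cols * sizeof(int));
        if (data[r] == NULL) {
            while (r--)              /* back out of rows [0, r) */
                free(data[r]);
            free(data);
            return NULL;
        }
    }
    return data;
}
```

Passing `malloc` itself reproduces the normal behavior; passing an allocator that fails after a set number of calls exercises the cleanup branch.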
There is enough detail to warrant refactoring the code to create a separate header with functions to manage a two-dimensional array. This separates the business of array management from the application itself.
Header `tbx.int_array2D.h`, defined below, provides the following functions.
- _Define_ – `struct int_array2D_t` holds a `data` pointer, along with variables for `n_rows` and `n_cols`.
- _Make_ – Function `make_int_array2D` handles allocations, and returns a `struct int_array2D_t` object.
- _Free_ – Function `free_int_array2D` handles deallocations.
- _Clone_ – Function `clone_int_array2D` returns a _deep copy_ of a `struct int_array2D_t` object. It can be used both in initialization expressions and assignments.
- _Swap_ – Function `swap_int_array2D` swaps two `int_array2D_t` objects.
- _Copy assign_ – Function `copy_assign_int_array2D` replaces the existing value of an `int_array2D_t` object with a _deep copy_ of another. It performs allocation and deallocation, as needed.
- _Move assign_ – Function `move_assign_int_array2D` deallocates an existing `int_array2D_t` object, and replaces it with a _shallow copy_ of another. After assignment, it zeroes out the source.
- _Equals_ – Function `equals_int_array2D` performs a _deep comparison_ of two `int_array2D_t` objects, returning `1` when they are equal, and `0`, otherwise.
```lang-c
#pragma once
// tbx.int_array2D.h
#include <stddef.h>
struct int_array2D_t
{
int** data;
size_t n_rows;
size_t n_cols;
};
void free_int_array2D(
struct int_array2D_t* a);
struct int_array2D_t make_int_array2D(
const size_t n_rows,
const size_t n_cols);
struct int_array2D_t clone_int_array2D(
const struct int_array2D_t* a);
void swap_int_array2D(
struct int_array2D_t* a,
struct int_array2D_t* b);
void copy_assign_int_array2D(
struct int_array2D_t* a,
const struct int_array2D_t* b);
void move_assign_int_array2D(
struct int_array2D_t* a,
struct int_array2D_t* b);
int equals_int_array2D(
const struct int_array2D_t* a,
const struct int_array2D_t* b);
// end file: tbx.int_array2D.h
```
With these functions, code for the application becomes almost trivial:
```lang-c
// main.c
#include <stddef.h>
#include <stdio.h>
#include "tbx.int_array2D.h"
size_t ft_strlen(char* str)
{
size_t count = 0;
while (*str++)
count++;
return (count);
}
struct int_array2D_t parray(char* str)
{
size_t len = ft_strlen(str);
struct int_array2D_t ret = make_int_array2D(len, 2);
for (size_t r = ret.n_rows; r--;) {
ret.data[r][0] = str[r] - '0';
ret.data[r][1] = (int)(len - 1 - r);
}
return ret;
}
int main()
{
char* str = "1255555555555555";
struct int_array2D_t ret = parray(str);
for (size_t r = 0; r < ret.n_rows; ++r) {
printf("%d %d \n", ret.data[r][0], ret.data[r][1]);
}
free_int_array2D(&ret);
}
// end file: main.c
```
#### tbx.int_array2D.c
Function `free_int_array2D` has been designed so that it can be used for the normal deallocation of an array, such as happens in function `main`, and also so that it can be called from function `make_int_array2D`, when an allocation fails.
Either way, it sets the `data` pointer to `NULL`, and both `n_rows` and `n_cols` to zero. Applications that use header `tbx.int_array2D.h` can check the `data` pointer of objects returned by function `make_int_array2D` and function `clone_int_array2D`. If it is `NULL`, then the allocation failed.
```lang-c
// tbx.int_array2D.c
#include <stddef.h>
#include <stdlib.h>
#include "tbx.int_array2D.h"
//======================================================================
// free_int_array2D
//======================================================================
void free_int_array2D(struct int_array2D_t* a)
{
for (size_t i = a->n_rows; i--;)
free(a->data[i]);
free(a->data);
a->data = NULL;
a->n_rows = 0;
a->n_cols = 0;
}
//======================================================================
// make_int_array2D
//======================================================================
struct int_array2D_t make_int_array2D(
const size_t n_rows,
const size_t n_cols)
{
struct int_array2D_t a = {
malloc(n_rows * sizeof(int*)),
n_rows,
n_cols
};
if (!n_rows || !n_cols)
{
free(a.data);
a.data = NULL;
a.n_rows = 0;
a.n_cols = 0;
}
else if (a.data == NULL) {
a.n_rows = 0;
a.n_cols = 0;
}
else {
for (size_t i = 0u; i < n_rows; ++i) {
a.data[i] = malloc(n_cols * sizeof(int));
if (a.data[i] == NULL) {
a.n_rows = i;
free_int_array2D(&a);
break;
}
}
}
return a;
}
//======================================================================
// clone_int_array2D
//======================================================================
struct int_array2D_t clone_int_array2D(const struct int_array2D_t* a)
{
struct int_array2D_t clone = make_int_array2D(a->n_rows, a->n_cols);
for (size_t r = clone.n_rows; r--;) {
for (size_t c = clone.n_cols; c--;) {
clone.data[r][c] = a->data[r][c];
}
}
return clone;
}
//======================================================================
// swap_int_array2D
//======================================================================
void swap_int_array2D(
struct int_array2D_t* a,
struct int_array2D_t* b)
{
struct int_array2D_t t = *a;
*a = *b;
*b = t;
}
//======================================================================
// copy_assign_int_array2D
//======================================================================
void copy_assign_int_array2D(
struct int_array2D_t* a,
const struct int_array2D_t* b)
{
if (a->data != b->data) {
if (a->n_rows == b->n_rows && a->n_cols == b->n_cols) {
for (size_t r = a->n_rows; r--;) {
for (size_t c = a->n_cols; c--;) {
a->data[r][c] = b->data[r][c];
}
}
}
else {
free_int_array2D(a);
*a = clone_int_array2D(b);
}
}
}
//======================================================================
// move_assign_int_array2D
//======================================================================
void move_assign_int_array2D(
struct int_array2D_t* a,
struct int_array2D_t* b)
{
if (a->data != b->data) {
free_int_array2D(a);
*a = *b;
b->data = NULL;
b->n_rows = 0;
b->n_cols = 0;
}
}
//======================================================================
// equals_int_array2D
//======================================================================
int equals_int_array2D(
const struct int_array2D_t* a,
const struct int_array2D_t* b)
{
if (a->n_rows != b->n_rows ||
a->n_cols != b->n_cols) {
return 0;
}
for (size_t r = a->n_rows; r--;) {
for (size_t c = a->n_cols; c--;) {
if (a->data[r][c] != b->data[r][c]) {
return 0;
}
}
}
return 1;
}
// end file: tbx.int_array2D.c
```
|
I have an application that uses email addresses for user authentication.
I know that some universities use Shibboleth for user authentication, and I was wondering what the process is for being able to read emails from the university database that is used for Shibboleth. Note that I do not care about authentication through Shibboleth, I **only** need to be able to read the email addresses.
Is the process the same for all universities that use Shibboleth, or is each one a unique case?
Is there any documentation on how to do this process?
|
How to obtain an email address from a Shibboleth auth system |
I am making a program that solves a sudoku game, but I received an error from my compiler.
```
error: invalid use of undefined type ‘struct Square’
30 | if(box->squares[x]->possible[number-1] == 0){
^~
```
the function is:
```
int updateBoxes(Square *** sudoku, int row, int column){
int x;
int number = sudoku[row][column]->number;
Box * box;
box = sudoku[row][column]->box;
for(x = 0; x < SIZE_ROWS; x++){
if(box->squares[x]->possible[number-1] == 0){
box->squares[x]->solvable--;
box->squares[x]->possible[number-1] = 1;
}
}
return 1;
}
```
The error also applies to the next two lines.
the definition of Square and Box is:
```
typedef struct box
{
struct Square ** squares; /* array of Squares */
int numbers;
int possible[9];
int solvable;
struct Box * next;
} Box;
/* square is a single number in the board */
typedef struct square
{
int number;
int possible[9];
/*
the array is only made of 0 and 1
each index corresponds to a number: ex: index 4 == number 5 ...
1 tells me that 5 cannot go into this square
0 tells me that 5 can go into this square
*/
int solvable; /* decremented by 1 each time; when it reaches 1, the square is solvable */
Box * box; /* tells me the corresponding box */
int row;
int column;
} Square;
```
I am following a YouTube tutorial, and its author doesn't get the error that I do. The video is:
https://www.youtube.com/watch?v=CnzrCBLCDhc&t=1326s
|
invalid use of undefined type ‘struct Square’? |
|c| |
I have a domain on AWS, and I have successfully created a certificate.
I have an A record in the hosted zone pointing to my ELB DNS name.
When I use the ELB URL, the website loads, but as not secure.
When I use the domain, the website won't load at all.
When I run nslookup on my domain name, it returns an IP, and that IP serves my website.
How can I fix navigation from my domain to the website? |
|javascript|function|module|frontend|web-frontend| |
[enter image description here](https://i.stack.imgur.com/2PeZH.png)
```java
driver.findElement(By.xpath("//*[@id=\"myModal\"]/div/div/div[2]/div/div[1]/div/div/div/div[1]/input")).sendKeys("Andre Felipe");
```
I use the code above to fill in the field, but the value doesn't stay in it. I tried simulating a click and pressing Enter, and it didn't work. Selenium manages to type the text, but I couldn't find a way to make the data stick in the form field. |
I'm facing an issue with an ASP.NET Core 8 Web API.
When I run the project in the console with `dotnet run`, it does not finish starting up, nor does it show any error. The console screen is as follows:
[![Console Log][1]][1]
[1]: https://i.stack.imgur.com/K3TzF.png
But the application runs in debug mode from Visual Studio 2022 without any issue.
Here's my `program.cs`:
```
using Institutor.Data;
using Microsoft.AspNetCore.Authentication.JwtBearer;
using Microsoft.AspNetCore.Identity;
using Microsoft.EntityFrameworkCore;
using Microsoft.IdentityModel.Tokens;
using Microsoft.OpenApi.Models;
using System.Text;
System.Environment.SetEnvironmentVariable("DOTNET_SYSTEM_GLOBALIZATION_INVARIANT", "0");
// Create a WebApplication builder with the arguments passed to the program
var builder = WebApplication.CreateBuilder(args);
// Add services to the container.
// Add the DbContext and configure it to use SQLite with the connection string from the configuration
builder.Services.AddDbContext<InstitutorDataContext>(options =>
options.UseSqlServer(builder.Configuration.GetConnectionString("DefaultConnection")));
// Add Identity services to the DI container and configure it to use the DbContext for storage
builder.Services.AddIdentity<IdentityUser, IdentityRole>() // Add IdentityRole here
.AddEntityFrameworkStores<InstitutorDataContext>()
.AddDefaultTokenProviders(); // Add this to add the default token providers
// Add JWT authentication services to the DI container and configure it
builder.Services.AddAuthentication(x=>
{
x.DefaultAuthenticateScheme = JwtBearerDefaults.AuthenticationScheme;
x.DefaultChallengeScheme = JwtBearerDefaults.AuthenticationScheme;
})
.AddJwtBearer(options =>
{
options.TokenValidationParameters = new TokenValidationParameters
{
ValidateIssuerSigningKey = true,
IssuerSigningKey = new SymmetricSecurityKey(Encoding.UTF8.GetBytes(builder.Configuration.GetSection("AppSettings:Token").Value)),
ValidateIssuer = false,
ValidateAudience = false
};
    options.IncludeErrorDetails = true;
});
// Add controllers to the DI container
builder.Services.AddControllers();
// Add API explorer services to the DI container
builder.Services.AddEndpointsApiExplorer();
//Adding Azure Services logging
builder.Services.AddLogging(builder =>
{
builder.AddAzureWebAppDiagnostics();
});
// Add Swagger services to the DI container and configure it
builder.Services.AddSwaggerGen(c =>
{
c.SwaggerDoc("v1", new() { Title = "Institutor API", Version = "v1" });
c.AddSecurityDefinition("Bearer", new OpenApiSecurityScheme
{
Description = "JWT Authorization header using the Bearer scheme. Just paste the token (No need to add Bearer)",
Name = "Authorization",
In = ParameterLocation.Header,
Type = SecuritySchemeType.Http,
Scheme = "bearer"
});
c.AddSecurityRequirement(new OpenApiSecurityRequirement
{
{
new OpenApiSecurityScheme
{
Reference = new OpenApiReference
{
Type = ReferenceType.SecurityScheme,
Id = "Bearer"
}
},
new string[] {}
}
});
});
// Build the application
var app = builder.Build();
// Configure the HTTP request pipeline.
// Use Swagger in development environment
if (app.Environment.IsDevelopment())
{
app.UseSwagger();
app.UseSwaggerUI();
}
// Use HTTPS redirection middleware
app.UseHttpsRedirection();
// Use authentication middleware
app.UseAuthentication();
// Use authorization middleware
app.UseAuthorization();
// Map controller routes
app.MapControllers();
// Run the application
app.Run();
```
Here's the `.csproj` file for project:
```
<Project Sdk="Microsoft.NET.Sdk.Web">
<PropertyGroup>
<TargetFramework>net8.0</TargetFramework>
<Nullable>enable</Nullable>
<ImplicitUsings>enable</ImplicitUsings>
<InvariantGlobalization>false</InvariantGlobalization>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="Microsoft.AspNetCore.Authentication.JwtBearer" Version="8.0.3" />
<PackageReference Include="Microsoft.AspNetCore.Identity.EntityFrameworkCore" Version="8.0.3" />
<PackageReference Include="Microsoft.EntityFrameworkCore" Version="8.0.3" />
<PackageReference Include="Microsoft.EntityFrameworkCore.Design" Version="8.0.3">
<PrivateAssets>all</PrivateAssets>
<IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
</PackageReference>
<PackageReference Include="Microsoft.EntityFrameworkCore.Sqlite" Version="8.0.3" />
<PackageReference Include="Microsoft.EntityFrameworkCore.SqlServer" Version="8.0.3" />
<PackageReference Include="Microsoft.Extensions.Logging.AzureAppServices" Version="8.0.3" />
<PackageReference Include="Swashbuckle.AspNetCore" Version="6.4.0" />
<PackageReference Include="System.IdentityModel.Tokens.Jwt" Version="7.4.1" />
</ItemGroup>
<ItemGroup>
<ProjectReference Include="..\Data\Data.csproj" />
<ProjectReference Include="..\Model\Model.csproj" />
<ProjectReference Include="..\Services\Services.csproj" />
</ItemGroup>
</Project>
```
Please let me know if any more details are required. |
Dotnet Run is not working but the application is running in Visual Studio |
|visual-studio|asp.net-core-webapi|.net-8.0| |
How can I design entity auditing in a Spring Boot application?
```
@Getter
@Setter
@Entity
@AllArgsConstructor
@NoArgsConstructor
@Builder
@Table(name = "user")
public class User implements Serializable {
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
@Column(name = "id", nullable = false)
private Long id;
@Column(name = "username", nullable = false)
private String username;
@Column(name = "password", nullable = false)
private String password;
@ManyToMany(fetch = FetchType.EAGER)
@JoinTable(name = "user_roles",
joinColumns = @JoinColumn(name = "user_id"),
inverseJoinColumns = @JoinColumn(name = "role_id"))
private List<Role> roles = new ArrayList<>();
}
```
```
@MappedSuperclass
@EntityListeners(AuditingEntityListener.class)
public abstract class AuditableEntity {
@CreatedDate
@Column(name = "created_on")
private LocalDateTime createdOn;
@CreatedBy
@Column(name = "created_by")
private User createdBy;
@LastModifiedDate
@Column(name = "updated_on")
private LocalDateTime updatedOn;
@LastModifiedBy
@Column(name = "updated_by")
private User updatedBy;
}
```
I want the following data to be stored in each entity that is subject to audit:
- createdBy
- createdDate
- updatedBy
- updatedDate
Is it possible to make the `createdBy` and `updatedBy` in `AuditableEntity` have a one-to-many relationship with the `User` entity, since one `User` can create many records of different entities, but the entity can have only one creator? I'm not sure if this can be done in `MappedSuperclass`, maybe it makes sense to just store the Id of user in `AuditableEntity`?
In the `getCurrentAuditor` implementation of `AuditorAware` I would get the current `principal` from `Authentication`, find a `User` entity from the database with the same id and return it:
```
Authentication authentication = SecurityContextHolder.getContext().getAuthentication();
CustomUserDetails user = (CustomUserDetails) authentication.getPrincipal();
return userService.findUserById(user.getId());
```
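To make the id-only variant concrete, here is the shape I have in mind, stripped of Spring so it stands alone (the names are placeholders; in the real app the auditor id would come from the `SecurityContextHolder` lookup above):

```java
import java.time.LocalDateTime;
import java.util.function.Supplier;

// Audit fields hold the creator's id rather than a full User entity.
class AuditStamp {
    Long createdBy;
    LocalDateTime createdOn;
    Long updatedBy;
    LocalDateTime updatedOn;

    // Fill the creation fields once, and the update fields on every save.
    void touch(Supplier<Long> currentAuditorId) {
        LocalDateTime now = LocalDateTime.now();
        if (createdBy == null) {
            createdBy = currentAuditorId.get();
            createdOn = now;
        }
        updatedBy = currentAuditorId.get();
        updatedOn = now;
    }
}
```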
```
@Data
public class CustomUserDetails implements UserDetails {
private Long id;
private String username;
private String password;
private Collection<? extends GrantedAuthority> authorities;
private boolean accountNonExpired;
private boolean accountNonLocked;
private boolean credentialsNonExpired;
private boolean enabled;
public CustomUserDetails(Long id, String username, String password, Collection<? extends GrantedAuthority> authorities) {
this.id = id;
this.username = username;
this.password = password;
this.authorities = authorities;
this.accountNonExpired = true;
this.accountNonLocked = true;
this.credentialsNonExpired = true;
this.enabled = true;
}
@Override
public Collection<? extends GrantedAuthority> getAuthorities() {
return authorities;
}
public Long getId() {
return id;
}
@Override
public String getPassword() {
return password;
}
@Override
public String getUsername() {
return username;
}
@Override
public boolean isAccountNonExpired() {
return this.accountNonExpired;
}
@Override
public boolean isAccountNonLocked() {
return this.accountNonLocked;
}
@Override
public boolean isCredentialsNonExpired() {
return this.credentialsNonExpired;
}
@Override
public boolean isEnabled() {
return this.enabled;
}
}
```
```
@Service
public class CustomUserDetailsService implements UserDetailsService {
private final UserRepository userRepository;
@Autowired
public CustomUserDetailsService(UserRepository userRepository) {
this.userRepository = userRepository;
}
@Override
public UserDetails loadUserByUsername(String username) throws UsernameNotFoundException {
User user = userRepository.findByUsername(username).orElseThrow(() -> new UsernameNotFoundException("User not found: " + username));
return new CustomUserDetails(
user.getId(),
user.getUsername(),
user.getPassword(),
user.getRoles().stream().map((role) -> new SimpleGrantedAuthority(role.getName()))
.collect(Collectors.toList())
);
}
}
```
Then I would inherit my other entities from `AuditableEntity`, for example:
```
@Getter
@Setter
@Entity
@Builder
@AllArgsConstructor
@NoArgsConstructor
@Table(name = "patient")
public class Patient extends AuditableEntity {
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
@Column(name = "id", nullable = false)
private Long id;
@Column(name = "name", nullable = false)
private String name;
@Column(name = "surname", nullable = false)
private String surname;
@Column(name = "middle_name", nullable = false)
private String middleName;
@Column(name = "address", nullable = false)
private String address;
@Column(name = "phone", nullable = false)
private String phone;
@Column(name = "messenger_contact", nullable = true)
private String messengerContact;
@Column(name = "birth_date", nullable = false)
private LocalDate birthDate;
@Column(name = "preferential_category", nullable = true)
private String preferentialCategory;
@OneToMany(mappedBy = "patient", orphanRemoval = true)
private List<Appointment> appointments = new ArrayList<>();
}
```
I understand that I will need to create fields
- createdBy
- createdDate
- updatedBy
- updatedDate
in each table of the database to be audited.
These are my thoughts; please advise me how I could do this better and more correctly. |
Spring JPA Data Auditing - How to design it? |
|java|spring|spring-boot|spring-security|spring-data-jpa| |
I am trying to upload a new app to the App Store. It was built with Expo, and I am now trying to build it using EAS.
I uploaded my first and only app a few months ago.
From the root of my project I run:
`eas build --platform ios`
- Do you want to log in to your Apple account - I choose yes
- Apple ID (presented and correct) - I click yes
Then the following
```
Restoring session C:\Users\Yanay\.app-store\auth\yanaycodes@gmail.com\cookie
Team My Name (My ID)
Provider My Name (Some Numbers maybe another ID)
√ Logged in Local session
✔ Bundle identifier registered MY.ACTUAL.BUNDLE.IDENTIFIER
✔ Synced capabilities: Enabled: Push Notifications | Disabled: Sign In with Apple
✔ Synced capability identifiers: No updates
✔ Fetched Apple distribution certificates
```
- Reuse this distribution certificate?
Cert ID: SOME ID, Serial number: SOME SERIAL NUMBER, Team ID: MY TEAM ID, Team name: MY NAME (Individual) - I choose yes
- Generate a new Apple Provisioning Profile? - I choose yes
And then I get:
```
✖ Failed to create Apple provisioning profile
Failed to set up credentials.
There is a problem with the request entity - You are not allowed to create 'iOS' profile with App ID 'APP ID'.
Error: build command failed.
```
What is happening?
I went to my developer account, and I can't even find the App ID it shows me (in the last error log) in any of the apps (there are two there; one was just created to support Sign in with Apple).
How can I fix this?
This is my app.json, if that helps:
```json
{
  "expo": {
    "name": "My Apps Name",
    "slug": "MyAppsName",
    "version": "1.0.3",
    "orientation": "portrait",
    "icon": "./assets/icon.png",
    "userInterfaceStyle": "light",
    "splash": {
      "image": "./assets/splash.png",
      "resizeMode": "contain",
      "backgroundColor": "#202022"
    },
    "assetBundlePatterns": ["**/*"],
    "ios": {
      "supportsTablet": true,
      "bundleIdentifier": "my.package.url",
      "buildNumber": "1",
      "infoPlist": {
        "NSMicrophoneUsageDescription": "This app uses the microphone to allow you to capture audio as a note.",
        "NSPhotoLibraryUsageDescription": "This app needs access to the photo library to allow you to upload photos as notes.",
        "NSCameraUsageDescription": "This app uses the camera to allow you to capture photos as notes.",
        "NSLocationWhenInUseUsageDescription": "This app needs access to your location to allow you to capture your location as a note."
      }
    },
    "android": {
      "package": "my.package.url",
      "versionCode": 5,
      "adaptiveIcon": {
        "foregroundImage": "./assets/adaptive-icon.png",
        "backgroundColor": "#202022"
      },
      "permissions": [
        "android.permission.INTERNET",
        "android.permission.READ_EXTERNAL_STORAGE",
        "android.permission.WRITE_EXTERNAL_STORAGE",
        "android.permission.MODIFY_AUDIO_SETTINGS",
        "android.permission.AUDIO_CAPTURE",
        "android.permission.CAMERA",
        "android.permission.RECORD_AUDIO",
        "android.permission.ACCESS_FINE_LOCATION",
        "android.permission.ACCESS_COARSE_LOCATION",
        "android.permission.FOREGROUND_SERVICE"
      ]
    },
    "web": {
      "favicon": "./assets/favicon.png"
    },
    "extra": {
      "eas": {
        "projectId": "my-project-id"
      }
    }
  }
}
```
|