You cannot write anything to the DOM when there is no DOM.
Just like you cannot put a sofa in your living room when there is no house yet.
That means you have to check with ``this.isConnected`` whether the element is attached to the DOM, inside the ``set record`` method.
It's an instance of a JavaScript class/object; you can of course attach _anything_ you want to it and have the ``constructor/attributeChangedCallback/connectedCallback`` process it (but really think through the house/living-room scenario)
Note _**shadowDOM**_ allows you to create any DOM you want and **not** have it display in the main DOM (yet); the same applies to a DocumentFragment
<!-- begin snippet: js hide: false console: true babel: false -->
<!-- language: lang-js -->
const createElement = (tag, props = {}) => Object.assign(document.createElement(tag), props);
customElements.define('my-track', class extends HTMLElement {
  connectedCallback() {
    console.warn("connectedCallback", this.id);
    this.append(
      this.recordDIV = createElement("div", {
        innerHTML: `Default Record content`
      })
    );
  }
  set record(html) {
    if (this.isConnected)
      this.recordDIV.innerHTML = html;
    else
      console.warn(this.id, "Not in the DOM yet");
  }
});
const el1 = createElement('my-track', { id: "el1" });
el1.record = "Record " + el1.id;
document.body.append(el1);
const el2 = createElement('my-track', { id: "el2" });
document.body.append(el2);
el2.record = "Record " + el2.id;
<!-- end snippet -->
## Added shadowDOM example
<!-- begin snippet: js hide: false console: true babel: false -->
<!-- language: lang-js -->
const createElement = (tag, props = {}) => Object.assign(document.createElement(tag), props);
customElements.define('my-track', class extends HTMLElement {
  constructor() {
    super().attachShadow({ mode: "open" })
      .append(
        this.recordDIV = createElement("div", {
          innerHTML: `Default Record content ` + this.id
        })
      );
  }
  set record(html) {
    this.recordDIV.innerHTML = html;
  }
});
const el1 = createElement('my-track', { id: "el1" });
el1.record = "Record " + el1.id;
document.body.append(el1);
const el2 = createElement('my-track', { id: "el2" });
document.body.append(el2);
el2.record = "Record " + el2.id;
<!-- end snippet -->
|
I am creating a Spring Boot application with aspects, but the aspect is not working; it never gets picked up.
I have defined `@EnableAspectJAutoProxy` in a custom configuration class.
<!-- language: java -->
@Configuration
@EnableAspectJAutoProxy(proxyTargetClass = true)
public class AppConfig {

    @Bean
    public ModelMapper modelMapper() {
        return new ModelMapper();
    }
}
Going through multiple solutions on Stack Overflow, I found that adding `@Component` might solve my issue, but when I add `@Component`, my application fails to start.
Please find the error below:
```none
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'projectingArgumentResolverBeanPostProcessor' defined in class path resource [org/springframework/data/web/config/ProjectingArgumentResolverRegistrar.class]: BeanPostProcessor before instantiation of bean failed
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:517) ~[spring-beans-6.1.4.jar:6.1.4]
at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:325) ~[spring-beans-6.1.4.jar:6.1.4]
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:234) ~[spring-beans-6.1.4.jar:6.1.4]
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:323) ~[spring-beans-6.1.4.jar:6.1.4]
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:204) ~[spring-beans-6.1.4.jar:6.1.4]
at org.springframework.context.support.PostProcessorRegistrationDelegate.registerBeanPostProcessors(PostProcessorRegistrationDelegate.java:277) ~[spring-context-6.1.4.jar:6.1.4]
at org.springframework.context.support.AbstractApplicationContext.registerBeanPostProcessors(AbstractApplicationContext.java:805) ~[spring-context-6.1.4.jar:6.1.4]
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:608) ~[spring-context-6.1.4.jar:6.1.4]
at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.refresh(ServletWebServerApplicationContext.java:146) ~[spring-boot-3.2.3.jar:3.2.3]
at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:754) ~[spring-boot-3.2.3.jar:3.2.3]
at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:456) ~[spring-boot-3.2.3.jar:3.2.3]
at org.springframework.boot.SpringApplication.run(SpringApplication.java:334) ~[spring-boot-3.2.3.jar:3.2.3]
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1354) ~[spring-boot-3.2.3.jar:3.2.3]
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1343) ~[spring-boot-3.2.3.jar:3.2.3]
at com.dimensions.inc.personalTrackerbank.PersonalTrackerbankApplication.main(PersonalTrackerbankApplication.java:10) ~[classes/:na]
Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'org.springframework.transaction.annotation.ProxyTransactionManagementConfiguration': warning no match for this type name: com.dimensions.inc.personalTrackerbank.bankInfoInterface [Xlint:invalidAbsoluteTypeName]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:607) ~[spring-beans-6.1.4.jar:6.1.4]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:522) ~[spring-beans-6.1.4.jar:6.1.4]
at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:325) ~[spring-beans-6.1.4.jar:6.1.4]
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:234) ~[spring-beans-6.1.4.jar:6.1.4]
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:323) ~[spring-beans-6.1.4.jar:6.1.4]
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199) ~[spring-beans-6.1.4.jar:6.1.4]
at org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:409) ~[spring-beans-6.1.4.jar:6.1.4]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateUsingFactoryMethod(AbstractAutowireCapableBeanFactory.java:1335) ~[spring-beans-6.1.4.jar:6.1.4]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1165) ~[spring-beans-6.1.4.jar:6.1.4]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:562) ~[spring-beans-6.1.4.jar:6.1.4]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:522) ~[spring-beans-6.1.4.jar:6.1.4]
at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:325) ~[spring-beans-6.1.4.jar:6.1.4]
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:234) ~[spring-beans-6.1.4.jar:6.1.4]
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:323) ~[spring-beans-6.1.4.jar:6.1.4]
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:204) ~[spring-beans-6.1.4.jar:6.1.4]
at org.springframework.aop.framework.autoproxy.BeanFactoryAdvisorRetrievalHelper.findAdvisorBeans(BeanFactoryAdvisorRetrievalHelper.java:91) ~[spring-aop-6.1.4.jar:6.1.4]
at org.springframework.aop.framework.autoproxy.AbstractAdvisorAutoProxyCreator.findCandidateAdvisors(AbstractAdvisorAutoProxyCreator.java:111) ~[spring-aop-6.1.4.jar:6.1.4]
at org.springframework.aop.aspectj.annotation.AnnotationAwareAspectJAutoProxyCreator.findCandidateAdvisors(AnnotationAwareAspectJAutoProxyCreator.java:92) ~[spring-aop-6.1.4.jar:6.1.4]
at org.springframework.aop.aspectj.autoproxy.AspectJAwareAdvisorAutoProxyCreator.shouldSkip(AspectJAwareAdvisorAutoProxyCreator.java:101) ~[spring-aop-6.1.4.jar:6.1.4]
at org.springframework.aop.framework.autoproxy.AbstractAutoProxyCreator.postProcessBeforeInstantiation(AbstractAutoProxyCreator.java:280) ~[spring-aop-6.1.4.jar:6.1.4]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsBeforeInstantiation(AbstractAutowireCapableBeanFactory.java:1130) ~[spring-beans-6.1.4.jar:6.1.4]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.resolveBeforeInstantiation(AbstractAutowireCapableBeanFactory.java:1105) ~[spring-beans-6.1.4.jar:6.1.4]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:511) ~[spring-beans-6.1.4.jar:6.1.4]
... 14 common frames omitted
Caused by: java.lang.IllegalArgumentException: warning no match for this type name: com.dimensions.inc.personalTrackerbank.bankInfoInterface [Xlint:invalidAbsoluteTypeName]
at org.aspectj.weaver.tools.PointcutParser.parsePointcutExpression(PointcutParser.java:319) ~[aspectjweaver-1.9.21.jar:na]
at org.springframework.aop.aspectj.AspectJExpressionPointcut.buildPointcutExpression(AspectJExpressionPointcut.java:223) ~[spring-aop-6.1.4.jar:6.1.4]
at org.springframework.aop.aspectj.AspectJExpressionPointcut.obtainPointcutExpression(AspectJExpressionPointcut.java:194) ~[spring-aop-6.1.4.jar:6.1.4]
at org.springframework.aop.aspectj.AspectJExpressionPointcut.getClassFilter(AspectJExpressionPointcut.java:173) ~[spring-aop-6.1.4.jar:6.1.4]
at org.springframework.aop.support.AopUtils.canApply(AopUtils.java:233) ~[spring-aop-6.1.4.jar:6.1.4]
at org.springframework.aop.support.AopUtils.canApply(AopUtils.java:295) ~[spring-aop-6.1.4.jar:6.1.4]
at org.springframework.aop.support.AopUtils.findAdvisorsThatCanApply(AopUtils.java:327) ~[spring-aop-6.1.4.jar:6.1.4]
at org.springframework.aop.framework.autoproxy.AbstractAdvisorAutoProxyCreator.findAdvisorsThatCanApply(AbstractAdvisorAutoProxyCreator.java:128) ~[spring-aop-6.1.4.jar:6.1.4]
at org.springframework.aop.framework.autoproxy.AbstractAdvisorAutoProxyCreator.findEligibleAdvisors(AbstractAdvisorAutoProxyCreator.java:97) ~[spring-aop-6.1.4.jar:6.1.4]
at org.springframework.aop.framework.autoproxy.AbstractAdvisorAutoProxyCreator.getAdvicesAndAdvisorsForBean(AbstractAdvisorAutoProxyCreator.java:78) ~[spring-aop-6.1.4.jar:6.1.4]
at org.springframework.aop.framework.autoproxy.AbstractAutoProxyCreator.wrapIfNecessary(AbstractAutoProxyCreator.java:366) ~[spring-aop-6.1.4.jar:6.1.4]
at org.springframework.aop.framework.autoproxy.AbstractAutoProxyCreator.postProcessAfterInitialization(AbstractAutoProxyCreator.java:318) ~[spring-aop-6.1.4.jar:6.1.4]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsAfterInitialization(AbstractAutowireCapableBeanFactory.java:438) ~[spring-beans-6.1.4.jar:6.1.4]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1789) ~[spring-beans-6.1.4.jar:6.1.4]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:600) ~[spring-beans-6.1.4.jar:6.1.4]
```
My code -
resource file -
<!-- language: java -->
@RestController
@RequestMapping("/data")
public class PtbBankResource {

    @GetMapping("/ret")
    @ValidateUser(headerName = "Authorization")
    public ResponseEntity<?> getCall()
    {
        return ResponseEntity.ok().body("data Successfull");
    }
}
custom annotation -
@Component
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
public @interface ValidateUser {
    String headerName() default "";
}
Aspect class -
<!-- language: java -->
@Aspect
public class AuthAspect {

    private static final Logger logger = LoggerFactory.getLogger(AuthAspect.class);

    @Autowired
    PtbUtils ptbUtils;

    @Around(value = "execution(* *(..)) && @annotation(ValidateUser)")
    public Object authValidation(ProceedingJoinPoint joinPoint, ValidateUser validateUser) throws Throwable
    {
        logger.info("Inside authValidation class");
        String headerName = validateUser.headerName().isEmpty() ? HttpHeaders.AUTHORIZATION : validateUser.headerName();
        HttpServletRequest request = ((ServletRequestAttributes) RequestContextHolder.currentRequestAttributes()).getRequest();
        logger.info("headerName = " + headerName);
        logger.info("request = " + request);
        String token = request.getHeader(headerName);
        logger.info("token = " + token);
        Boolean value = ptbUtils.validateJwtToken(token);
        System.out.println("value = " + value);
        return joinPoint.proceed();
    }
}
|
I get a TypeError when calling `new SourceTextModule()` with a `cachedData` option.
It says `TypeError: A dynamic import callback was not specified.`
But the `importModuleDynamically` option is clearly there.
```ts
function importModuleDynamically(spec: string, ref: any, attributes: any) {
  return linker(spec, ref, { attributes })
}

async function SrcModule(url: string, code: string) {
  let cachedData = v8Cache.get(url)
  let mod = new vm.SourceTextModule(code, {
    cachedData,
    identifier: url,
    initializeImportMeta,
    importModuleDynamically
  })
  if (!cachedData) {
    cachedData = mod.createCachedData()
    cachedData.code = code
    v8Cache.set(url, cachedData)
  }
  await mod.link(linker)
  await mod.evaluate()
  return mod
}
```
Once `cachedData` is set, does that mean `import()` can no longer be used?
|
vm.SourceTextModule with cachedData |
|node.js|v8| |
I have the code below, and I expect to first see `hi` in my console, then the `error`, and finally `why`. But the actual order is `hi`, then `why`, and the error last. Why is this happening?
code:
```js
const test = new Promise((resolve, reject) => {
  console.log("hi");
  throw new Error("error");
})

test.finally(() => {
  console.log("why")
})
```
|
[The error that I'm struggling, with some examples][1]
I have written code to find prime numbers using the sieve of Eratosthenes algorithm. The problem is that it only works for some inputs: for some numbers it runs perfectly, while for others it appears to loop infinitely. As I only recently started studying C in depth, I can't find what is going wrong. I would appreciate help understanding what the mistake is, why it happens, and how to prevent it.
Here's the code (modified to avoid garbage values):
```c
#include <stdio.h>

int main()
{
    int n;
    printf("enter number: ");
    scanf("%d", &n);
    int arr[n], pr = 2;
    for (int i = 0; pr <= n; i++)
    {
        arr[i] = pr;
        pr++;
    }
    int j, k = 0;
    while (arr[k] <= n)
    {
        for (j = 2; j < n; j++)
        {
            for (k = 0; k < n; k++)
            {
                if (arr[k] % j == 0 && arr[k] > j)
                    arr[k] = 0;
            }
        }
    }
    for (int i = 0; i < n; i++)
    {
        if (arr[i] > 0)
            printf(" %d", arr[i]);
    }
    printf("\n");
    return 0;
}
```
[1]: https://i.stack.imgur.com/r8LC0.png |
I have a Spring Boot 3.1.4 application running on Java 17.
This application has a dependency in its pom.xml that is a library written in Java 8.
The Java 8 dependency uses the `@PostConstruct` annotation from the import below:
`import javax.annotation.PostConstruct`
The problem is that my Spring Boot 3 application running on Java 17 never executes the function marked with `@PostConstruct` in the dependency.
This issue does not occur when I downgrade my application from Spring Boot 3 to Spring Boot 2 and run it on Java 17.
How can I fix this issue? |
@PostConstruct annotation in java 8 jar not being called by spring boot 3 application |
|java|spring|spring-boot|spring-boot-3|spring-boot-2| |
How to load two cppflow models in memory and make inference with `auto prediction = (*model)(input)` |
<s>As you have not shared the schema of your dataframe, I am assuming your df has an "id" column. You can update it accordingly to any other column as per your requirement.</s>
You can simply apply the [`row_number()`](https://spark.apache.org/docs/latest/api/python/reference/pyspark.sql/api/pyspark.sql.functions.row_number.html) window function as follows to get the desired result:
```lang-py
from pyspark.sql.window import Window
from pyspark.sql.functions import row_number, monotonically_increasing_id
# 100 in the denominator is the batch size
df = df.withColumn(
    "unique_batch_id",
    ((row_number().over(Window.orderBy(monotonically_increasing_id())) - 1) / 100)
    .cast("integer")
)
```
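The batch-index arithmetic itself can be sanity-checked in plain Python, without Spark (a sketch; `batch_id` here is just an illustrative helper, not part of the PySpark API):

```python
# Plain-Python sketch of the batch-id arithmetic used above: Spark's
# row_number() is 1-based, so subtract 1 before integer-dividing by the
# batch size (100 here, matching the denominator in the Spark expression).
def batch_id(row_number, batch_size=100):
    return (row_number - 1) // batch_size

print(batch_id(1))    # 0  -> rows 1..100 share batch 0
print(batch_id(100))  # 0
print(batch_id(101))  # 1  -> the next 100 rows get batch 1
```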
**Update 1**: As per your reply - you don't have an "id" like column so I have added a [`monotonically_increasing_id()`](https://spark.apache.org/docs/latest/api/python/reference/pyspark.sql/api/pyspark.sql.functions.monotonically_increasing_id.html) to make the above code work properly and fulfil your requirement.
**Update 2**: So, as per the comments, you need `uuid` and not integers as the batch ID - therefore I came up with the following workaround - extending the previous code:
```lang-py
from pyspark.sql import Window
from pyspark.sql.functions import row_number, col, monotonically_increasing_id, udf
import hashlib
import uuid
df = df.withColumn(
    "batch_id",
    ((row_number().over(Window.orderBy(monotonically_increasing_id())) - 1) / 100)
    .cast("integer").cast("string")
)

def generate_uuid(batch_id):
    return str(uuid.UUID(bytes=hashlib.md5(batch_id.encode()).digest()))

uuid_udf = udf(generate_uuid)
df = df.withColumn("uuid", uuid_udf(df["batch_id"])).drop("batch_id")
``` |
PHP FFI: How to pass PHP class object to C++ |
|php|c++|pointers|ffi| |
|c#|httpclient|dotnet-httpclient|sendasync| |
This may help you: https://developer.hashicorp.com/terraform/cli/commands/show

    terraform show -json

> *The `terraform show` command is used to provide human-readable output from a state or plan file. This can be used to inspect a plan to ensure that the planned operations are expected, or to inspect the current state as Terraform sees it.
> Machine-readable output is generated by adding the `-json` command-line flag.*
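Once you have the JSON, it can be post-processed with any scripting language. A hedged sketch in Python, using a hand-made stub of the plan document (the real `terraform show -json` output contains many more fields than shown here):

```python
import json

# Hypothetical, trimmed-down stand-in for `terraform show -json plan.tfplan`
# output; only the resource_changes/actions shape is used below.
plan_json = '''
{
  "resource_changes": [
    {"address": "aws_s3_bucket.logs", "change": {"actions": ["create"]}},
    {"address": "aws_instance.web",  "change": {"actions": ["no-op"]}}
  ]
}
'''

plan = json.loads(plan_json)
# Keep only the resources the plan would actually touch.
pending = [rc["address"] for rc in plan["resource_changes"]
           if rc["change"]["actions"] != ["no-op"]]
print(pending)  # ['aws_s3_bucket.logs']
```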
|
I've played around with both the `triggers.Trigger` solution and the `customResources.AwsCustomResource` solution, but ultimately, while both of these solutions work, they have some minor drawbacks.
The former's drawback is that it doesn't take in an existing `lambda.Function`, and the latter's drawback is that the Lambda function itself needs to be written as a custom resource event handler, and what's more, we end up with a nonempty `cdk` diff every time.
I've come up with this solution, which uses an `EventBridge` rule to trigger `myFunction` once the `CloudFormation` stack deployment is complete, and with which I am quite happy:
<!-- begin snippet: js hide: false console: true babel: false -->
<!-- language: lang-ts -->
import * as cdk from "aws-cdk-lib/core";
import * as events from "aws-cdk-lib/aws-events";
import * as eventsTargets from "aws-cdk-lib/aws-events-targets";
new events.Rule(this, "DeploymentHook", {
  eventPattern: {
    detailType: ["CloudFormation Stack Status Change"],
    source: ["aws.cloudformation"],
    detail: {
      "stack-id": [cdk.Stack.of(this).stackId],
      "status-details": {
        status: ["CREATE_COMPLETE", "UPDATE_COMPLETE"],
      },
    },
  },
  targets: [new eventsTargets.LambdaFunction(myFunction)],
});
<!-- end snippet --> |
Very simple: do the compression, then snip the result.

```python
import gzip

plain = b"Stuff"
compressed = gzip.compress(plain)
bad_compressed = compressed[:-1]
gzip.decompress(bad_compressed)  # EOFError
```
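If you want to assert that behaviour in a test rather than just observe it, a minimal check:

```python
import gzip

compressed = gzip.compress(b"Stuff")
try:
    gzip.decompress(compressed[:-1])  # last byte snipped off
    raised = False
except EOFError:
    raised = True
print(raised)  # True
```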
Even easier: just two bytes are enough for the `gzip` module to recognise the gzip format, but they are obviously not a complete compressed file.

```python
bad_compressed = b'\x1f\x8b'
gzip.decompress(bad_compressed)  # EOFError
```

This is in-memory for the simplicity of demonstration; it would work the same if you manipulated a file instead of the bytes. For example:

```sh
echo Stuff | gzip | head -c 2 >file.gz
```
|
I'm looking for help on a bad editing experience. As you can see in the image below, when I try to edit an xaml file, the edit windows are corrupted. This is for a WPF project.
[EDIT] I have since discovered the same issue with Winforms design views.
[](https://i.stack.imgur.com/dSYPm.png)
The screen review and source code are in the left corner, overlaying the tabs. There is nothing in the preview window even though I have defined a window and some controls. There are no scroll bars or other editor controls. The mouse doesn't work in this window. The view and problems are the same if I dock it back in the main window.
I've updated to the latest Visual Studio (VS22 17.9.5 64-bit Community). I've checked for new display drivers and I have no display hardware acceleration.
The file is editable in Blend, but this is a difficult workflow when I'm using Copilot to decorate the controls with styles and resources. I have to switch back and forth between Blend and VS, and can't do it inline. |
How can I pass the secrets to the ngrok.yml file in the container, and in the FTP container to an environment? |
|docker|docker-compose|ftp|ngrok|secrets| |
|mongodb|aggregation-framework| |
I subscribe to several nodes at the same time, so that I only receive a notification when values change. I need to store the data in a CSV file, writing rows with the timestamp plus all node values. I implemented an asyncio.Queue to serialize the writes, but if I get more than one notification at the same time, the file is updated with the first notification while the others are lost.
Here is the output:
```
sv_writer_task started
Queue list:
<Queue maxsize=0>
Datachange notification received.
29032024 14:10:16Data change notification was received and queued: ns=2;s=Unit_SB.UC3.TEMP.TEMP_v 19.731399536132812
Datachange notification received.
29032024 14:10:16Data change notification was received and queued: ns=2;s=Unit_SB.UC3.HUMI.HUMI_v 36.523399353027344
Dequeue value: 19.731399536132812
val name: TEMP
Prefix UC1 is no match for node: Unit_SB.UC3.TEMP.TEMP_v. Skipping.
Prefix UC2 is no match for node: Unit_SB.UC3.TEMP.TEMP_v. Skipping.
csv file: UC3.csv
Appending new row
Data written to file UC3.csv succesfully
Dequeue value: 36.523399353027344
val name: HUMI
Prefix UC1 is no match for node: Unit_SB.UC3.HUMI.HUMI_v. Skipping.
Prefix UC2 is no match for node: Unit_SB.UC3.HUMI.HUMI_v. Skipping.
csv file: UC3.csv
Updating existing row
Data written to file UC3.csv succesfully
```
In the CSV file you can see in the last line that only the TEMP value, which arrived first, was updated, but not the HUMI one:
```
Timestamp,OUT_G,OUT_U,HUMI,TEMP
29032024 14:08:42,True,True,47.38769912719727,15.043899536132812
29032024 14:10:16,True,True,47.38769912719727,19.731399536132812
```
Here is the code so far. Since I'm subscribing to several subsystems and each of them has its own data file, I check which file to write to:
```python
# this is the consumer which consumes items from the queue
async def csv_writer_task(queue):
    print("csv_writer_task started")
    print('Queue list: ')
    print(queue)
    print('')
    while True:
        try:
            dte, node, val = await queue.get()
            print('Dequeue value: ', val)
        except Exception as e:
            print(f"Error in csv_writer_task while retrieving value fromt he queue: {e}")
        node_id_str = str(node.nodeid.Identifier)
        node_parts = node_id_str[len("Unit_SB."):].split('.')
        val_name = node_parts[-1].replace('_v', '')
        print('val name: ', val_name)
        for key, header_row in prefix_mapping.items():
            if f"Unit_SB.{key}" in node_id_str:
                csv_file = f"{key}.csv"
                print('csv file: ', csv_file)
                break
            else:
                print(f"Prefix {key} is no match for node: {node_id_str}. Skipping.")
        df = pd.read_csv(csv_file)
        last_row = df.iloc[-1].copy()
        if last_row['Timestamp'] == dte:
            print("Updating existing row")
            last_row[val_name] = val
        else:
            print("Appending new row")
            new_row = last_row.copy()
            new_row['Timestamp'] = dte
            new_row[val_name] = val
            df = pd.concat([df, new_row.to_frame().T], ignore_index=True)
        df.to_csv(csv_file, index=False)
        print(f'Data written to file {csv_file} succesfully')
        queue.task_done()

class SubscriptionHandler(object):
    def __init__(self, wsconn):
        self.wsconn = wsconn
        self.q = asyncio.Queue()
        self.queuwriter = asyncio.create_task(csv_writer_task(self.q))

    # this is the producer which produces items and puts them into a queue
    async def datachange_notification(self, node, val, data):
        print("Datachange notification received.")
        dte = data.monitored_item.Value.ServerTimestamp.strftime("%d%m%Y %H:%M:%S")
        await self.q.put((dte, node, val))
        print(dte + "Data change notification was received and queued: ", node, val)

class SBConnection():
    def __init__(self):
        self.listOfWSNode = []
        self.dpsList = ...

    async def connectAndSubscribeToServer(self):
        self.csv_file = ''
        async with Client(url=self.url) as self.client:
            for element in self.dpsList:
                node = "ns=" + element["NS"] + ";s=" + element["Name"]
                var = self.client.get_node(node)
                self.listOfWSNode.append(var)
            print("Subscribe to ", self.listOfWSNode)
            handler = SubscriptionHandler(self)
            sub = await self.client.create_subscription(period=10, handler=handler)
            await sub.subscribe_data_change(self.listOfWSNode)
            print('subscription created')
            # create infinite loop
            while True:
                await asyncio.sleep(0.1)

async def main():
    uc = SBConnection()
    await uc.connectAndSubscribeToServer()

if __name__ == '__main__':
    asyncio.run(main())
```
Can anyone help me find the problem?
Thanks |
The following implementation will label each point with its respective species:
```python
import seaborn as sns
import matplotlib.pyplot as plt
import pandas as pd
df = sns.load_dataset("iris")
ax = sns.lmplot(x='sepal_length',  # Horizontal axis
                y='sepal_width',   # Vertical axis
                data=df,           # Data source
                fit_reg=False,     # Don't fit a regression line
                aspect=2)          # size and dimension

plt.title('Example Plot')
# Set x-axis label
plt.xlabel('Sepal Length')
# Set y-axis label
plt.ylabel('Sepal Width')

def label_point(x, y, val, ax):
    a = pd.concat({'x': x, 'y': y, 'val': val}, axis=1)
    for i, point in a.iterrows():
        ax.text(point['x'] + .02, point['y'], str(point['val']))

label_point(df.sepal_length, df.sepal_width, df.species, plt.gca())
```
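The `pd.concat({...}, axis=1)` call inside `label_point` does the real work: it lines up the three Series as columns, so each row carries a point's coordinates plus its label. A standalone illustration with made-up values:

```python
import pandas as pd

x = pd.Series([5.1, 5.9], name="sepal_length")
y = pd.Series([3.5, 3.0], name="sepal_width")
val = pd.Series(["setosa", "virginica"], name="species")

# Concatenating a dict of Series column-wise renames the columns to the
# dict keys, which label_point then reads per row.
a = pd.concat({"x": x, "y": y, "val": val}, axis=1)
print(list(a.columns))   # ['x', 'y', 'val']
print(a.iloc[0]["val"])  # setosa
```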
[![enter image description here][1]][1]
[1]: https://i.stack.imgur.com/hgCDu.png |
The following works for cases where you want to copy a file whose name ends with a number:

    my-application-"*[0-9]".jar

|
This is not correct: `A` is not created but assigned a pointer to the corresponding element in the collection.
[![enter image description here][1]][1]
[1]: https://i.stack.imgur.com/yDTua.png |
I'd like to know how to insert a column into an array. I know the `push` function to add a row, but not how to add a column, i.e. going from:
A ; 1
B ; 2
C ; 3
D ; 4
to:
A ; **A1** ; 1
B ; **B2** ; 2
C ; **C3** ; 3
D ; **D4** ; 4
I guess I could loop over each row to insert an item between two items, but is there a way to do it in bulk?
Thanks in advance!
|
Add this at the top of your controller:
```php
/**
 * @OA\Info(
 *     version="1.0.0",
 *     title="Organization",
 *     description="test description",
 *     @OA\Contact(
 *         name="test",
 *         email="test@test"
 *     ),
 * ),
 * @OA\Server(
 *     url="/api/v1",
 * ),
 */
class YourController extends Controller
```
Then you have to write documentation like this for each of your controller methods.
> **Suggestion:** if you are using PHP 8.1 or higher and Laravel 8.x or higher, instead of `L5-Swagger` use `dedoc/scramble`. It generates the documentation automatically for you, and it's built on top of Swagger.
[https://scramble.dedoc.co][1]
[1]: https://scramble.dedoc.co/installation |
Your `normal.xz` calculates the *gradient* of the heightmap correctly. The gradient points uphill, i.e. where the height values *increase*. The normal, when projected on the horizontal plane, would point backwards.
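A hedged numerical sketch of the same idea (plain Python; the height function and axis conventions are assumed for illustration): the normal needs the *negated* gradient components.

```python
# Heightmap y = h(x, z); a simple plane rising along +x (assumed test surface).
def height(x, z):
    return 0.5 * x

def normal(x, z, eps=1e-4):
    # Central-difference gradient of the heightmap (points uphill).
    dhdx = (height(x + eps, z) - height(x - eps, z)) / (2 * eps)
    dhdz = (height(x, z + eps) - height(x, z - eps)) / (2 * eps)
    # The normal takes the negated slopes (the minus signs in Equation 6b).
    n = (-dhdx, 1.0, -dhdz)
    length = sum(c * c for c in n) ** 0.5
    return tuple(c / length for c in n)

nx, ny, nz = normal(0.0, 0.0)
print(nx < 0 and ny > 0)  # True: the projected normal points back down the slope
```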
This can be derived mathematically through the use of cross-product of two tangents, which is what that article does in the "Normals and Tangents" section. In particular the relevant result is the formula for the normal in "Equation 6b", which includes the negative signs:
[![Equation 6b][1]][1]
[1]: https://i.stack.imgur.com/a7Ih8.png |
I have a command in .NET that creates some entities and returns a result. During the command, it sends an email with information about the entity, using a service obtained via DI; the call is async. When testing, I see that sending this email can make the request take twice as long. The code is something like this:
```csharp
public async Task<Guid> CreateInvoice(string email, string name, ...)
{
    var transaction = await _context.Database.BeginTransactionAsync();
    try
    {
        var newInvoice = new Invoice(email, name);
        await _context.Invoice.AddAsync(newInvoice);
        // Some other CRUD operations here
        await _emailService.SendEmail(email, name); // This here adds the latency
        // Some other CRUD
        transaction.Commit();
        return newInvoice.Id;
    }
    catch (Exception e)
    {
        transaction.Rollback();
        throw;
    }
}
```
I have tried removing the `await` from the email call. This obviously makes the request faster, but then I don't know whether the email was actually sent, which means I cannot roll back the transaction if there is a problem. I want to roll back all of the CRUD operations if the email is not sent.
What is the best approach here? Should I add a retry policy for the sending so I don't have to worry about rolling back? Should the email service notify the user if sending fails? I could also split them and send the email in the background on its own. I am open to any ideas on how to tackle this. Thanks in advance. |
Async call to third party slows down a request |
|c#|.net|performance|rest|asynchronous| |
I want to create Luckman, a Pac-Man-like game, for Android phones and tablets, but I can't implement collision detection for the labyrinth.
I tried to do it as in these files: https://drive.google.com/file/d/1pwCDpU1UPzB7V7TV7-WmbyDLkpIDODry/view. But because my labyrinth is not an image but an object drawn in the .kv file, it didn't work.
Here is my program code
**main.py**
```python
from kivy.app import App
from kivy.uix.screenmanager import Screen
from kivy.uix.widget import Widget
from kivy.uix.anchorlayout import AnchorLayout
from player import *
from kivy.clock import Clock
from kivy.core.window import Window

class GamePlay(Screen):
    width = Window.width
    height = Window.height
    pacman = Player()

    def __init__(self, **kwargs):
        super(GamePlay, self).__init__(**kwargs)

    def on_touch_down(self, touch):
        self.start_pos = touch.pos
        return True

    def on_touch_move(self, touch):
        # Calculate swipe distance and direction
        swipe_distance_x = abs(touch.pos[0] - self.start_pos[0])
        swipe_distance_y = abs(touch.pos[1] - self.start_pos[1])
        swipe_threshold = 50  # Adjust threshold based on your needs
        # Detect swipe directions
        if swipe_distance_y > swipe_threshold:
            if touch.pos[1] > self.start_pos[1]:
                self.pacman.velocity = (0, 1)
                self.pacman.number_update_pac_img = "up"
            else:
                self.pacman.velocity = (0, -1)
                self.pacman.number_update_pac_img = "down"
        elif swipe_distance_x > swipe_threshold:
            if touch.pos[0] > self.start_pos[0]:
                self.pacman.velocity = (1, 0)
                self.pacman.number_update_pac_img = "right"
            else:
                self.pacman.velocity = (-1, 0)
                self.pacman.number_update_pac_img = "left"
        return True

    def update(self, dt):
        self.pacman.move()

class LuckmanApp(App):
    def build(self):
        game = GamePlay()
        Clock.schedule_interval(game.update, 1.0/60.0)
        return game

if __name__ == '__main__':
    LuckmanApp().run()
```
**luckman.kv**
```
<Player>:
Image:
id: "pacman"
source: root.pac_img
allow_stretch: True
pos: root.pos
size: self.parent.width / 18, self.parent.height / 18
<GamePlay>:
canvas:
Rectangle:
source: "png/bg.png"
size: self.size
pacman: pacman_player
FloatLayout:
        # Top entrance
canvas:
Color:
rgb: [0, 0, 1]
            # Top entrance - right wall
Rectangle:
pos: self.width / 1.79, self.height
size: self.width / -43.2, self.height / -6
            # Top entrance - left wall
Rectangle:
pos: self.width / 2.274, self.height
size: self.width / 43.2, self.height / -6
            # Top-left wall square
Rectangle:
pos: self.width / 2.274, self.height / 1.2
size: self.width / -15, self.height / 96
Rectangle:
pos: self.width / 2.52, self.height / 1.2
size: self.width / -43.2, self.height / 8
Rectangle:
pos: self.width / 2.52, self.height / 1.055
size: self.width / -2.85, self.height / 96
Rectangle:
pos: self.width / 15, self.height / 1.0444
size: self.width / -43.2, self.height / -4.7
Rectangle:
pos: self.width / 23.1, self.height / 1.34
size: self.width / 7.3, self.height / -96
Rectangle:
pos: self.width / 2.6, self.height / 1.34
size: self.width / -7.7, self.height / -96
Rectangle:
pos: self.width / 2.68, self.height / 1.359
size: self.width / 11, self.height / 16.2
            # Top-left middle square
Rectangle:
pos: self.width / 7.29, self.height / 1.0948
size: self.width / 22, self.height / -7.6
Rectangle:
pos: self.width / 3.92, self.height / 1.0948
size: self.width / 22, self.height / -7.6
            # Top-right wall square
Rectangle:
pos: self.width / 1.8, self.height / 1.2
size: self.width / 15, self.height / 96
Rectangle:
pos: self.width / 1.631, self.height / 1.2
size: self.width / 43.2, self.height / 8
Rectangle:
pos: self.width / 1.631, self.height / 1.055
size: self.width / 2.85, self.height / 96
Rectangle:
pos: self.width / 1.06, self.height / 1.0444
size: self.width / 43.2, self.height / -4.7
Rectangle:
pos: self.width / 1.035, self.height / 1.34
size: self.width / -7.3, self.height / -96
Rectangle:
pos: self.width / 1.6043, self.height / 1.34
size: self.width / 7.7, self.height / -96
Rectangle:
pos: self.width / 1.57, self.height / 1.359
size: self.width / -10, self.height / 16.2
            # Top-right middle square
Rectangle:
pos: self.width / 1.207, self.height / 1.0948
size: self.width / 22, self.height / -7.6
Rectangle:
pos: self.width / 1.324, self.height / 1.0948
size: self.width / -22, self.height / -7.6
            # Bottom entrance - right wall
Rectangle:
pos: self.width / 1.79, self.height * 0
size: self.width / -43.2, self.height / 6
            # Bottom entrance - left wall
Rectangle:
pos: self.width / 2.274, self.height * 0
size: self.width / 43.2, self.height / 6
            # Bottom-left wall square
Rectangle:
pos: self.width / 2.274, self.height / 6.4
size: self.width / -15, self.height / 96
Rectangle:
pos: self.width / 2.52, self.height / 6
size: self.width / -43.2, self.height / -8
Rectangle:
pos: self.width / 2.52, self.height / 24
size: self.width / -2.85, self.height / 96
Rectangle:
pos: self.width / 15, self.height / 24
size: self.width / -43.2, self.height / 4.7
Rectangle:
pos: self.width / 23.1, self.height / 3.8
size: self.width / 7.25, self.height / -96
Rectangle:
pos: self.width / 2.6, self.height / 3.8
size: self.width / -7.7, self.height / -96
Rectangle:
pos: self.width / 2.68, self.height / 3.8
size: self.width / 11, self.height / -16.2
            # Bottom-left middle square
Rectangle:
pos: self.width / 7.29, self.height / 4.61
size: self.width / 22, self.height / -7.6
Rectangle:
pos: self.width / 3.92, self.height / 4.61
size: self.width / 22, self.height / -7.6
            # Bottom-right wall square
Rectangle:
pos: self.width / 1.8, self.height / 6.4
size: self.width / 15, self.height / 96
Rectangle:
pos: self.width / 1.631, self.height / 6
size: self.width / 43.2, self.height / -8
Rectangle:
pos: self.width / 1.631, self.height / 24
size: self.width / 2.85, self.height / 96
Rectangle:
pos: self.width / 1.06, self.height / 24
size: self.width / 43.2, self.height / 4.7
Rectangle:
pos: self.width / 1.035, self.height / 3.8
size: self.width / -7.25, self.height / -96
Rectangle:
pos: self.width / 1.6043, self.height / 3.8
size: self.width / 7.6, self.height / -96
Rectangle:
id: stop
pos: self.width / 1.57, self.height / 3.8
size: self.width / -10, self.height / -16.2
            # Bottom-right middle square
Rectangle:
pos: self.width / 1.207, self.height / 4.61
size: self.width / 22, self.height / -7.6
Rectangle:
pos: self.width / 1.324, self.height / 4.61
size: self.width / -22, self.height / -7.6
            # Central sections
            # Central left horizontal wall
Rectangle:
pos: self.width / 6.31, self.height / 1.34
size: self.width / 43.2, self.height / -2.05
            # Central right horizontal wall
Rectangle:
pos: self.width / 1.208, self.height / 1.34
size: self.width / 43.2, self.height / -2.05
            # Central top wall
Rectangle:
pos: self.width / 3.92, self.height / 1.43
size: self.width / 2., self.height / -96
            # Ghost spawn center
Rectangle:
pos: self.width / 2.68, self.height / 1.555
size: self.width / 11, self.height / 96
Rectangle:
pos: self.width / 1.57, self.height / 1.555
size: self.width / -10, self.height / 96
Rectangle:
pos: self.width / 2.68, self.height / 1.65
size: self.width / 43.2, self.height / 27
Rectangle:
pos: self.width / 1.57, self.height / 1.65
size: self.width / -43.2, self.height / 27
Rectangle:
pos: self.width / 2.68, self.height / 1.65
size: self.width / 3.79, self.height / -96
            # Central middle wall
Rectangle:
pos: self.width / 3.92, self.height / 1.786
size: self.width / 2, self.height / -96
            # Central right vertical wall
Rectangle:
pos: self.width / 1.324, self.height / 1.68
size: self.width / -22, self.height / 17.5
            # Central left vertical wall
Rectangle:
pos: self.width / 3.92, self.height / 1.68
size: self.width / 22, self.height / 17.5
            # Central left bottom wall
Rectangle:
pos: self.width / 6.31, self.height / 1.945
size: self.width / 3.25, self.height / -96
            # Central right bottom wall
Rectangle:
pos: self.width / 1.208, self.height / 1.945
size: self.width / -3.5, self.height / -96
            # Central bottom square
Rectangle:
pos: self.width / 3.92, self.height / 3.8
size: self.width / 2, self.height / 4.85
Player:
id: pacman_player
pos: self.width / 2.13, self.height / 5.7
```
**player.py**
```
from kivy.uix.widget import Widget
from kivy.properties import StringProperty, NumericProperty, ReferenceListProperty
from kivy.vector import Vector
class Player(Widget):
pac_img = StringProperty("png/pacman/pacman_down.gif")
number_update_pac_img = "None"
velocity_x = NumericProperty(0)
velocity_y = NumericProperty(0)
velocity = ReferenceListProperty(velocity_x, velocity_y)
def move(self):
self.pos = Vector(*self.velocity) + self.pos
        # update pacman gifs
if self.number_update_pac_img == "up":
self.pac_img = "png/pacman/pacman_up.gif"
if self.number_update_pac_img == "down":
self.pac_img = "png/pacman/pacman_down.gif"
if self.number_update_pac_img == "left":
self.pac_img = "png/pacman/pacman_left.gif"
if self.number_update_pac_img == "right":
self.pac_img = "png/pacman/pacman_right.gif"
``` |
How to make object collision in python kivy? |
|python|kivy|kivy-language|pacman| |
null |
![Error screenshot](https://i.stack.imgur.com/6bm05.png)
I also followed the following step, but it's still not working:
If SWC continues to fail to load you can opt-out by disabling swcMinify in your next.config.js or by adding a .babelrc to your project with the following content
Please help me out of this problem. |
Failed to load SWC binary for win32/x64 |
|next.js| |
null |
Imagine I have a Python function called `foobar` with the following call signature:
```
def baz(x):
    return x + 1

def foobar(y, f=baz):
    return f(y)
```
Now I am using Sphinx with autodoc and napoleon to produce some documentation for this function. The call signature generated in the docs then looks something like:
`foobar(y, f=<function baz>)`
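One workaround I'm aware of is to use a `None` sentinel so the rendered signature stays clean (this slightly changes the API, which is why I'd prefer a proper cross-reference):

```python
def baz(x):
    return x + 1

def foobar(y, f=None):
    """Apply ``f`` to ``y``.

    With this pattern autodoc renders ``foobar(y, f=None)``, and the real
    default can be documented by hand as :func:`baz` in the docstring.
    """
    if f is None:
        f = baz
    return f(y)
```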
My question is: is there a way to make this a bit prettier, so that it just displays the name of the function? Or ideally a way to cross-reference baz? |
Sphinx cross-reference function as default keyword argument in call signature |
|python|python-sphinx|sphinx|autodoc|sphinx-napoleon| |
There is code that calculates a centrifugal injector for a liquid-propellant engine; each formula is a separate function. The code has a main class, `Injector`, in which all the basic calculations are performed, as well as two others, `CentrifugalInjector` and `ScrewInjector`; these contain the calculations the user needs to choose between before starting, and the calculation is adjusted depending on which class is used.
But after running the code, we get the following:
```
Traceback (most recent call last):
File "G:\LARIS\tests\test_centrifugal_injector.py", line 50, in <module>
test()
File "G:\LARIS\tests\test_centrifugal_injector.py", line 34, in test
print(injector.equivalent_geometric_characteristic_injector)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "G:\LARIS\src\centrifugal_injector_calc.py", line 95, in equivalent_geometric_characteristic_injector
return geometric_characteristics / (1 + self.coefficient_friction / 2 * self.radius_tangential_inlet *
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: unsupported operand type(s) for /: 'property' and 'complex'
```
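As far as I can tell, the message means I'm reading `geometric_characteristics` off the class itself, so I get the `property` descriptor instead of a computed number; a stripped-down reproduction of that behaviour:

```python
class Demo:
    @property
    def value(self) -> int:
        return 42

# Accessing the property on the class returns the descriptor object itself:
print(type(Demo.value))  # <class 'property'>
# Only an instance access actually runs the getter:
print(Demo().value)  # 42
```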
Program code:
```
import math
from dataclasses import dataclass
from enum import Enum
class AngularValues(Enum):
RIGHT_ANGLE = 90
class ConstructiveTypes(Enum):
CENTRIFUGAL_INJECTOR = "CENTRIFUGAL"
SCREW_INJECTOR = "SCREW"
@dataclass(frozen=True, slots=True)
class Injector:
outer_diameter_injector: float
side_wall_thickness_injector: float
number_input_tangential_holes: float
diameter_input_tangential_holes: float
length_input_tangential_holes: float
relative_length_twisting_chamber: float
diameter_injector_nozzle: float
relative_length_injector_nozzle: float
angle_nozzle_axis: float
mass_flow_rate: float
viscosity: float
injector_type: str
@property
def diameter_twisting_chamber_injector(self) -> float:
        """Returns the diameter of the swirl chamber of the centrifugal injector"""
return self.outer_diameter_injector - 2 * self.side_wall_thickness_injector
@property
def relative_length_tangential_hole(self) -> float:
        """Returns the ratio of the tangential inlet hole's length to its diameter"""
return self.length_input_tangential_holes / self.diameter_input_tangential_holes
@property
def length_twisting_chamber(self) -> float:
        """Returns the length of the swirl chamber of the centrifugal injector"""
return self.relative_length_twisting_chamber * self.diameter_twisting_chamber_injector
@property
def radius_twisting_chamber_injector(self) -> float:
        """Returns the radius of the swirl chamber of the centrifugal injector"""
return self.diameter_twisting_chamber_injector / 2
@property
def radius_input_tangential_holes(self) -> float:
        """Returns the radius of the tangential inlet holes"""
return self.diameter_input_tangential_holes / 2
@property
def radius_tangential_inlet(self) -> float:
        """Returns the radius at which the axis of the tangential inlet hole is located relative to the injector axis"""
return self.radius_twisting_chamber_injector - self.radius_input_tangential_holes
@property
def length_injector_nozzle(self) -> float:
        """Returns the length of the injector nozzle"""
return self.relative_length_injector_nozzle * self.diameter_injector_nozzle
@property
def radius_injector_nozzle(self) -> float:
        """Returns the radius of the injector nozzle"""
return self.diameter_injector_nozzle / 2
@property
def reynolds_number(self) -> float:
        """Returns the Reynolds number"""
return (4 * self.mass_flow_rate) / (math.pi * self.viscosity * self.diameter_input_tangential_holes
* math.sqrt(self.number_input_tangential_holes))
@property
def coefficient_friction(self) -> float:
        """Returns the friction coefficient"""
return 10 ** ((25.8 / (math.log(self.reynolds_number, 10))**2.58) - 2)
@property
def equivalent_geometric_characteristic_injector(self) -> float:
        """Returns the equivalent geometric characteristic"""
if self.injector_type == ConstructiveTypes.SCREW_INJECTOR.value:
geometric_characteristics = ScrewInjector.geometric_characteristics
elif self.injector_type == ConstructiveTypes.CENTRIFUGAL_INJECTOR.value:
geometric_characteristics = CentrifugalInjector.geometric_characteristics
return geometric_characteristics / (1 + self.coefficient_friction / 2 * self.radius_tangential_inlet *
(self.radius_tangential_inlet + self.diameter_input_tangential_holes -
self.radius_injector_nozzle))
class ScrewInjector(Injector):
cross_sectional_area_one_passage_channel: float
@property
def geometric_characteristics(self) -> float:
        """Returns the geometric characteristic of the screw injector"""
return (math.pi * self.radius_tangential_inlet * self.radius_injector_nozzle) / \
(self.number_input_tangential_holes * self.cross_sectional_area_one_passage_channel)
class CentrifugalInjector(Injector):
@property
def geometric_characteristics(self) -> float:
        """Returns the geometric characteristic of the centrifugal injector"""
if self.angle_nozzle_axis == AngularValues.RIGHT_ANGLE.value:
return (self.radius_tangential_inlet * self.radius_injector_nozzle) / \
(self.number_input_tangential_holes * self.radius_input_tangential_holes**2)
else:
return (self.radius_tangential_inlet * self.radius_injector_nozzle) / \
(self.number_input_tangential_holes * self.radius_input_tangential_holes**2) * self.angle_nozzle_axis
```
Verification file code:
```
from src.centrifugal_injector_calc import Injector, CentrifugalInjector, ScrewInjector
def test() -> None:
common = {
"outer_diameter_injector": 15,
"side_wall_thickness_injector": 1,
"number_input_tangential_holes": 10,
"diameter_input_tangential_holes": 10,
"length_input_tangential_holes": 10,
"relative_length_twisting_chamber": 10,
"diameter_injector_nozzle": 10,
"relative_length_injector_nozzle": 10,
"angle_nozzle_axis": 10,
"mass_flow_rate": 10,
"viscosity": 1,
"injector_type": "CENTRIFUGAL"
}
injector = Injector(
**common
)
print(injector.diameter_twisting_chamber_injector)
print(injector.relative_length_tangential_hole)
print(injector.length_twisting_chamber)
print(injector.radius_twisting_chamber_injector)
print(injector.radius_input_tangential_holes)
print(injector.radius_tangential_inlet)
print(injector.length_injector_nozzle)
print(injector.radius_injector_nozzle)
print(injector.reynolds_number)
print(injector.coefficient_friction)
print(injector.equivalent_geometric_characteristic_injector)
centrifugal = CentrifugalInjector(
**common
)
print(centrifugal.geometric_characteristics)
screw = ScrewInjector(
**common, cross_sectional_area_one_passage_channel=5.0
)
print(screw.geometric_characteristics)
if __name__ == "__main__":
test()
```
|
Your quicksort function is fine; let me fix the binary search. My version returns the correct index into the original array:
```
def binSearch(sorted_arr, target, left=0):
    if len(sorted_arr) == 0:
        return -1
    cen = len(sorted_arr) // 2
    if sorted_arr[cen] == target:
        return left + cen
    elif sorted_arr[cen] < target:
        return binSearch(sorted_arr[cen+1:], target, left + cen + 1)
    else:
        return binSearch(sorted_arr[:cen], target, left)
``` |
Change the `APP_URL` in your `.env` file to `http://127.0.0.1:8000`; that should do the trick. |
|qt|cmake|mysql-connector|qsqldatabase| |
When I have problems changing Xcode versions and the pods aren't being seen, I just do:
- `pod deintegrate`
- `pod cache clean --all`
- `pod update`
Furthermore, it's possible that the target into which you are importing this module doesn't interact with ObjectMapper; if you are using different modules, add a separate import for each of them. If there's more info or the pod commands don't work, let me know ;) |
|assembly|x86-16|dos|dosbox| |
I am trying to run a JupyterLab environment with Nginx using Docker Compose. However, I am encountering two persistent errors:
```
$ docker compose up
[+] Running 2/0
✔ Container docker-web-1 Created 0.0s
✔ Container docker-nginx-1 Created 0.0s
Attaching to nginx-1, web-1
web-1 | [ERROR] Nginx configuration file (./nginx.conf) not found.
web-1 exited with code 1
nginx-1 | /docker-entrypoint.sh: /docker-entrypoint.d/ is not empty, will attempt to perform configuration
nginx-1 | /docker-entrypoint.sh: Looking for shell scripts in /docker-entrypoint.d/
nginx-1 | /docker-entrypoint.sh: Launching /docker-entrypoint.d/10-listen-on-ipv6-by-default.sh
nginx-1 | 10-listen-on-ipv6-by-default.sh: info: IPv6 listen already enabled
nginx-1 | /docker-entrypoint.sh: Sourcing /docker-entrypoint.d/15-local-resolvers.envsh
nginx-1 | /docker-entrypoint.sh: Launching /docker-entrypoint.d/20-envsubst-on-templates.sh
nginx-1 | /docker-entrypoint.sh: Launching /docker-entrypoint.d/30-tune-worker-processes.sh
nginx-1 | /docker-entrypoint.sh: Configuration complete; ready for start up
nginx-1 | 2024/03/30 14:31:26 [emerg] 1#1: cannot load certificate "/etc/nginx/certs/_wildcard.quantumworkspace.dev+3.pem": BIO_new_file() failed (SSL: error:80000002:system library::No such file or directory:calling fopen(/etc/nginx/certs/_wildcard.quantumworkspace.dev+3.pem, r) error:10000080:BIO routines::no such file)
nginx-1 | nginx: [emerg] cannot load certificate "/etc/nginx/certs/_wildcard.quantumworkspace.dev+3.pem": BIO_new_file() failed (SSL: error:80000002:system library::No such file or directory:calling fopen(/etc/nginx/certs/_wildcard.quantumworkspace.dev+3.pem, r) error:10000080:BIO routines::no such file)
nginx-1 exited with code 1
```
Additional files for reference:
- Dockerfile:
```
# Version 3.0.0
# Base image
FROM ubuntu:latest
# Copy custom hosts file to /etc/hosts
COPY hosts /etc/hosts
# Set environment variables
ENV LANG C.UTF-8
ENV LC_ALL C.UTF-8
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONFAULTHANDLER 1
# Install necessary packages
RUN apt-get update && apt-get install -y \
python3.11 \
python3.11-venv \
python3-apt \
python3-pip \
python3-setuptools \
python3-wheel \
python3-distutils \
binutils \
build-essential \
curl \
vim \
vim-common \
vim-runtime \
nano \
flameshot \
keepassxc \
bleachbit \
nodejs \
g++ \
git \
libblas-dev \
libffi-dev \
liblapack-dev \
libssl-dev \
texlive-latex-base \
latexmk \
make \
wget \
zlib1g-dev \
bash \
tree \
nginx \
libnss3-tools \
&& apt-get clean
# Install Rust, update it, create a directory, and set the working directory to the new Rust application directory
RUN curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y && \
$HOME/.cargo/bin/rustup update
ENV PATH="/root/.cargo/bin:${PATH}"
ENV PROJECT_DIR /usr/local/src/webapp
RUN mkdir -p ${PROJECT_DIR}/rust
WORKDIR ${PROJECT_DIR}/rust
# Install wasm-pack
RUN cargo install wasm-pack
# Set up work directory for Python application
WORKDIR /home/appuser/app
# Copy Pipfile and Pipfile.lock
COPY Pipfile Pipfile.lock /home/appuser/app/
# Create and activate the virtual environment
RUN python3.11 -m venv /venv
ENV PATH="/venv/bin:${PATH}"
RUN . /venv/bin/activate && \
pip install --upgrade pip && \
pip install pipenv
# Install dependencies using pipenv
RUN PIPENV_IGNORE_VIRTUALENVS=1 . /venv/bin/activate && pipenv install --deploy
# Install Jupyter within the virtual environment
RUN . /venv/bin/activate && pipenv run pip install jupyter notebook jupyterlab voila
# Copy pre-configured Jupyter Notebook config
COPY jupyter_notebook_config.py /home/appuser/.jupyter/jupyter_notebook_config.py
# Generate Jupyter Notebook config
RUN . /venv/bin/activate && jupyter notebook --generate-config -y
# Modify Jupyter Notebook config
RUN mkdir -p /home/appuser/.jupyter && echo "c.NotebookApp.token = ''" >> /home/appuser/.jupyter/jupyter_notebook_config.py
# Install mkcert and generate SSL certificate
RUN mkdir -p /etc/nginx/certs && \
apt-get update && \
curl -JLO "https://dl.filippo.io/mkcert/latest?for=linux/amd64" && \
chmod +x mkcert-v*-linux-amd64 && \
mv mkcert-v*-linux-amd64 /usr/local/bin/mkcert && \
mkcert -install && \
mkcert -CAROOT -cert-file /etc/nginx/certs/mkcert-ca.pem -key-file /etc/nginx/certs/mkcert-ca-key.pem && \
mkcert -cert-file /etc/nginx/certs/_wildcard.quantumworkspace.dev+3.pem -key-file /etc/nginx/certs/_wildcard.quantumworkspace.dev+3-key.pem "*.quantumworkspace.dev" localhost 127.0.0.1 ::1
# Create Nginx user and group
RUN groupadd -r nginx && useradd -r -g nginx nginx
# Setting permissions and ownership
RUN chown nginx:nginx /etc/nginx/certs/*.pem && \
chmod 600 /etc/nginx/certs/*.pem
# Remove default Nginx configuration
RUN rm /etc/nginx/sites-enabled/default
# Expose ports
EXPOSE 8888
EXPOSE 5678
EXPOSE 8080
EXPOSE 443
EXPOSE 80
# Copy the main Nginx configuration file
# COPY ./nginx.conf /etc/nginx/nginx.conf
COPY nginx.conf /etc/nginx/nginx.conf
# Create non-root user for security
RUN adduser -u 5678 --disabled-password --gecos "" appuser
# Create notebooks directory
RUN mkdir -p /notebooks
COPY docker-entrypoint.sh /docker-entrypoint.sh
RUN chmod +x /docker-entrypoint.sh
ENTRYPOINT ["/docker-entrypoint.sh"]
```
- docker-compose.yaml
```
version: '3.8'
services:
web:
build: ./
image: elijun9831/quantumworkspace:3.0.0 # Haven't pushed into Docker Hub yet
networks:
- mynetwork
volumes:
- /notebooks:/notebooks # Adjust as needed
nginx:
image: nginx:latest
ports:
- "8080:8080"
- "80:80"
- "8443:443"
- "443:443"
- "8888:8888"
- "5678:5678"
volumes:
- ./certs:/etc/nginx/certs
- ./nginx.conf:/etc/nginx/nginx.conf
networks:
- mynetwork
depends_on:
- web
environment:
- NGX_HTTP_PORT=80
- NGX_HTTPS_PORT=443
- JUPYTER_HOST=web # Or internal IP of 'web' service
- JUPYTER_PORT=8888
networks:
mynetwork:
driver: bridge
```
- docker-entrypoint.sh
```
#!/bin/bash
set -e # Exit script on error
# ================ Environment Variables ================
# Add or modify as needed, set default values here
export VIRTUAL_ENV="/venv"
export NOTEBOOKS_DIRECTORY="/notebooks"
export NGINX_CONF_FILE="./nginx.conf"
# ================ Validations ================
# Check if Nginx configuration file exists
if [[ ! -f $NGINX_CONF_FILE ]]; then
echo "[ERROR] Nginx configuration file ($NGINX_CONF_FILE) not found." >&2
exit 1
fi
# Check if Virtual Environment directory exists
if [[ ! -d $VIRTUAL_ENV ]]; then
echo "[ERROR] Virtual environment directory ($VIRTUAL_ENV) not found." >&2
exit 1
fi
# Check if JupyterLab notebooks directory exists
if [[ ! -d $NOTEBOOKS_DIRECTORY ]]; then
echo "[ERROR] JupyterLab notebooks directory ($NOTEBOOKS_DIRECTORY) not found." >&2
exit 1
fi
# ================ Apply Nginx Configuration ================
echo "[INFO] Applying Nginx configuration..."
envsubst < $NGINX_CONF_FILE > /etc/nginx/nginx.conf
# ================ Start Nginx ================
echo "[INFO] Starting Nginx in foreground mode..."
nginx -g "daemon off;" &
NGINX_PID=$!
# ================ Activate Virtual Environment ================
echo "[INFO] Activating Python virtual environment ($VIRTUAL_ENV)..."
. $VIRTUAL_ENV/bin/activate
# ================ Start JupyterLab ================
echo "[INFO] Starting JupyterLab..."
pipenv run jupyter lab --notebook-dir=$NOTEBOOKS_DIRECTORY --ip=0.0.0.0 --port=80 --no-browser --allow-root &
JUPYTER_PID=$!
# ================ Graceful Shutdown ================
# Cleanup resources on script exit (EXIT) or signal reception
function cleanup() {
echo "[INFO] Received shutdown signal. Cleaning up..."
kill -TERM $NGINX_PID $JUPYTER_PID
wait $NGINX_PID $JUPYTER_PID
echo "[INFO] Nginx and JupyterLab stopped."
exit 0
}
trap cleanup EXIT INT TERM
# ================ Main Execution Loop ================
echo "[INFO] Container startup complete. Ready to serve requests."
# Infinite loop to handle SIGINT, SIGTERM signals
while true; do
sleep 1 &
wait $!
done
```
- nginx.conf
```
events {
worker_connections 1024;
}
http {
server {
listen 80;
listen [::]:80;
server_name *.quantumworkspace.dev;
location / {
return 308 https://$host$request_uri;
}
}
server {
listen [::]:443 ssl;
http2 on;
server_name *.quantumworkspace.dev;
ssl_certificate /etc/nginx/certs/_wildcard.quantumworkspace.dev+3.pem;
ssl_certificate_key /etc/nginx/certs/_wildcard.quantumworkspace.dev+3-key.pem;
ssl_protocols TLSv1 TLSv1.1 TLSv1.2;
ssl_ciphers HIGH:!aNULL:!MD5;
root /var/www/html;
index index.html index.htm index.nginx-debian.html;
# Enable SSL session caching
ssl_session_cache shared:SSL:10m;
# Add headers to serve security related headers
add_header Strict-Transport-Security "max-age=31536000; includeSubDomains" always;
# this is the internal Docker DNS, cache only for 30s
resolver 127.0.0.53 valid=30s;
location / {
set $upstream *.quantumworkspace.dev:8888 ;
proxy_pass $upstream ;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
proxy_redirect off;
proxy_buffering off;
proxy_http_version 1.1;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection "upgrade";
proxy_read_timeout 86400;
}
location ~ ^/(api/kernels/|terminals/|api/security/ws-user/|terminals/websocket/|api/batch|api/spawn) {
set $upstream *.quantumworkspace.dev:8888 ;
proxy_pass $upstream ;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
proxy_redirect off;
proxy_buffering off;
proxy_http_version 1.1;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection "upgrade";
proxy_read_timeout 86400;
}
location ~* \.(js|css|png|jpg|jpeg|gif|ico|svg)$ {
expires 1d;
add_header Cache-Control "public";
}
add_header X-Frame-Options "DENY";
add_header X-Content-Type-Options "nosniff";
add_header Content-Security-Policy "default-src 'self';";
access_log /var/log/nginx/access.log;
error_log /var/log/nginx/error.log;
}
}
```
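I also put together a minimal reproduction of what I suspect is happening with the relative `NGINX_CONF_FILE="./nginx.conf"` path in the entrypoint: it is resolved against the container's working directory (`/home/appuser/app`), not against the location the file was copied to (the paths below are just for illustration):

```shell
# Simulate the layout: the config lives under etc/nginx, but the
# entrypoint runs from a different working directory.
tmp=$(mktemp -d)
mkdir -p "$tmp/etc/nginx" "$tmp/app"
echo "events {}" > "$tmp/etc/nginx/nginx.conf"
cd "$tmp/app"

# The relative lookup fails even though the file exists elsewhere:
[ -f ./nginx.conf ] && echo "relative: found" || echo "relative: missing"

# An absolute path finds it:
[ -f "$tmp/etc/nginx/nginx.conf" ] && echo "absolute: found"
```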
I have been unable to resolve these errors after several attempts, so any help or suggestions would be greatly appreciated!
What I have tried:
- Verified that the Nginx configuration file exists and has the correct permissions.
- Verified that the SSL certificate files exist and are in the correct location.
- Checked the Dockerfile to ensure that the files are being copied correctly to the container.
- Restarted the containers multiple times. |
Nginx configuration file and SSL certificate errors in Docker |
|docker|nginx|ssl| |
null |
For CEST, change your code as follows:
```
pwsh: |
$utcTimestamp = Get-Date -Format "yyyyMMdd-HHmm"
$utcDateTimeObj = [DateTime]::ParseExact($utcTimestamp, "yyyyMMdd-HHmm", $null)
$cetTimeZone = [System.TimeZoneInfo]::FindSystemTimeZoneById("Central European Standard Time")
$cetDateTimeObj = [System.TimeZoneInfo]::ConvertTimeFromUtc($utcDateTimeObj, $cetTimeZone)
$formattedCETTimestamp = $cetDateTimeObj.ToString("yyyyMMdd-HHmm")
Write-Host "##vso[task.setvariable variable=currentCETTimestamp]$formattedCETTimestamp"
displayName: 'Getting timestamp'
```
For CET, change your code as follows:
```
pwsh: |
$utcTimestamp = Get-Date -Format "yyyyMMdd-HHmm"
$utcDateTimeObj = [DateTime]::ParseExact($utcTimestamp, "yyyyMMdd-HHmm", $null)
$cetTimeZone = [System.TimeZoneInfo]::FindSystemTimeZoneById("Central European Standard Time")
$cetDateTimeObj = [System.TimeZoneInfo]::ConvertTimeFromUtc($utcDateTimeObj, $cetTimeZone)
$cetDateTimeObj = $cetDateTimeObj.AddHours(-1)
$formattedCETTimestamp = $cetDateTimeObj.ToString("yyyyMMdd-HHmm")
Write-Host "##vso[task.setvariable variable=currentCETTimestamp]$formattedCETTimestamp"
displayName: 'Getting timestamp'
``` |
I have assembly A which has a reference to type T in another assembly B. But at runtime that assembly doesn't contain the type, so I get `TypeLoadException`.
Is there any way to load T from my own assembly without **(!)** recompiling A or B?
The runtime calls neither `AppDomain.AssemblyResolve` nor `AppDomain.TypeResolve` in this case.
I'm looking for a managed solution (any runtime flavor) and without IL rewriting.
*Note: this is not an X Y problem.* |
My colleagues and I have developed a library using NestJS, and part of our module's code is as follows:
cache.service.ts
```typescript
@Injectable()
export class CacheService implements OnModuleInit {
constructor(
private readonly discoveryService: DiscoveryService,
private readonly scanner: MetadataScanner,
private readonly reflector: Reflector,
) {}
```
cache.module.ts
```typescript
@Module({
imports: [DiscoveryModule],
providers: [CacheService],
})
export class CacheModule {}
```
However, one of the users of our library has encountered the following error:
```
ERROR [ExceptionHandler] Nest can't resolve dependencies of the CacheService (DiscoveryService, MetadataScanner, ?). Please make sure that the argument Reflector at index [2] is available in the CacheModule context.
Potential solutions:
- Is CacheModule a valid NestJS module?
- If Reflector is a provider, is it part of the current CacheModule?
- If Reflector is exported from a separate @Module, is that module imported within CacheModule?
@Module({
  imports: [ /* the Module containing Reflector */ ]
})
```
The user has structured their project using Bun, and their app.module.ts looks like this:
```typescript
@Module({
imports: [CacheModule],
controllers: [AppController],
providers: [AppService],
})
export class AppModule {}
```
I have attempted the following solutions:
1. Considering it might be a compatibility issue with Bun, I created and built a simple project using Bun, which worked fine.
2. Suspecting a problem with the tsconfig, I changed the settings to those of a working configuration, but the issue persisted.
3. When I replicated the structure of cache.service.ts within the problematic project and modularized it, then imported it into app.module.ts, the Reflector was successfully injected and the application ran as expected. This leads me to believe that the issue arises when using the modularized module externally.
Is this the root cause? If so, why? If not, how can we resolve this issue? In other projects, it mostly builds and runs fine.
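For what it's worth, my current hypothesis is a duplicated `@nestjs/core` install in the user's project: Nest resolves providers by class identity, so a `Reflector` class loaded from two copies of the package is two different injection tokens. The identity issue in miniature (plain Node, no Nest):

```javascript
// Two "copies" of the same class, as if require()d from two different
// node_modules paths -- same shape, different identity.
class ReflectorA {}
class ReflectorB {}

const providers = new Map();
providers.set(ReflectorA, new ReflectorA());

// Looking the provider up via the *other* copy of the class fails,
// which is essentially what "can't resolve dependencies" boils down to.
console.log(providers.has(ReflectorA)); // true
console.log(providers.has(ReflectorB)); // false
```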
I've attached the GitHub link to the code the user said they were having issues with: https://github.com/saint6839/sample/blob/main/apps/nestjs-boilerplate/modules/app/app.module.ts |
|node.js|google-cloud-functions|google-cloud-storage|firebase-admin| |
null |
Answer by @Jon Skeet (in comments):
"You can add a parameter to the constructor, then call `new SubForm(this)`. Or if you only need to know about unitConnected, pass that into the constructor instead."
Sub-form:
```
private readonly MainForm mainForm;

public SubForm(MainForm mainForm)
{
    InitializeComponent();
    this.mainForm = mainForm; // store the passed reference instead of creating a new form
}
```
Main form:
```
public bool unitConnected = false;
private void buttonOpenSubForm_Click(object sender, EventArgs e)
{
unitConnected = true;
SubForm subForm= new SubForm(this);
if (subForm.ShowDialog(this) == DialogResult.OK)
{
// Do something
}
}
``` |
You want one result row per CaseID, so `GROUP BY CaseID` only. Then use conditional aggregation, i.e. put the conditions inside `MAX()`.
```
SELECT
     CaseID
    ,MAX(CASE WHEN Stage = 'A' THEN EventDate END) AS A
    ,MAX(CASE WHEN Stage = 'B' THEN EventDate END) AS B
    ,MAX(CASE WHEN Stage = 'C' THEN EventDate END) AS C
FROM StageTable
GROUP BY CaseID
ORDER BY CaseID;
``` |
I am using Highcharts. Everything seems fine, but I am not sure why the series text (LLLL!, XYZM) has a white background or shadow. I tried to remove it with `textShadow: 'none'` and also `backgroundColor: 'rgba(0, 0, 0, 0)'`, but neither seems to work. Any other suggestions?
<!-- begin snippet: js hide: false console: true babel: false -->
<!-- language: lang-js -->
// container7
var chartOptions = {
chart: {
type: 'column',
inverted: true,
height:200
},
title: {
text: ''
},
xAxis: {
lineColor: 'white',
categories: ['ABCD'],
labels: {
padding: 40, // Add padding of 40 to the x-axis labels
style: {
fontSize: '12px' // Adjust the font size as needed
}
}
},
yAxis: {
allowDecimals: false,
max: 100,
title: {
text: ''
},
labels: {
enabled: false // Hide y-axis labels
},
gridLineWidth: 0 // Remove grid lines
},
plotOptions: {
column: {
stacking: 'normal', // this will be our default
dataLabels: {
enabled: true,
style: {
color: 'black', // Set font color to black
fontSize: '13px', // Set font size to 13px
textShadow: 'none' // Remove text shadow
},
backgroundColor: 'rgba(0, 0, 0, 0)', // Set background color to transparent
formatter: function() {
return this.series.name + ': ' + Highcharts.numberFormat(this.y, 0) + '%';
}
}
}
},
colors: ['#d37295', '#fabfd2'],
series: [{
name: 'LLLL!',
data: [57]
}, {
name: 'XYZM',
data: [43]
}],
legend: {
enabled: false // Hide the legend
},
credits: {
enabled: false // Hide the credits
}
};
// Render the chart using Highcharts
Highcharts.chart('container7', chartOptions);
<!-- language: lang-html -->
<script src="https://code.highcharts.com/highcharts.js"></script>
<h2>... look like this:</h2>
<div id="container7" style=" border: 1px solid lightgray;"></div>
<!-- end snippet -->
|
null |
I use an `IActionResult` to return a `List<Object>`.
If I project into an anonymous type, the result is OK. If I use a `List<User>`, the elements of the result are empty.
With an anonymous type it works:
```
public IActionResult GetData()
{
var erg1 = _database.table.Select(x => new { x.Id, x.firstname, x.lastname });
return Ok(erg1);
}
```
I get a list with all elements:
```
[
{
"id": "732fdbd7-c878-45e8-8c43-5f795a697f6d",
"firstname": "Uwe",
"lastname": "Gabbert"
},
{
"id": "5288f9ea-25a2-4ffc-a711-7c0b2cf49c38",
"firstname": "User",
"lastname": "Test"
}
]
```
But if I use a separate class for the elements, I get a list of empty objects:
```
public IActionResult GetData()
{
var erg2 = _database.table.Select(x => new User( x.Id, x.firstname, x.lastname ));
return Ok(erg2);
}
```
`erg2` contains 2 `User` items, but the return of `GetData` is:
```
[
{},
{}
]
``` |
.NET 8 Web-API Returns empty List |
|c#|.net|api|rest| |
null |
I'm facing an issue while trying to run the `yarn install` command in my React project. I've globally installed Yarn (`npm install --global yarn`), but when I attempt to execute any Yarn command in the terminal, including `yarn install`, I encounter the following error:
```
The term 'yarn' is not recognized as the name of a cmdlet, function, script file,
or operable program. Check the spelling of the name, or if a path was included, verify th
at the path is correct and try again.
```
I'd like to highlight that this project is built with React. To address this issue, I've tried the following troubleshooting steps:
- Reinstalling Yarn and verifying the computer's PATH variable.
- Restarting both the IDE and the computer.
- Confirming the installation of Yarn and checking its version.
Despite these efforts, I'm still encountering this error. I'm seeking assistance to understand the root cause of this issue and any potential solutions. Thank you. |
Error when running 'yarn install' command in React project: 'The term 'yarn' is not recognized |
|reactjs|typescript|npm|vite|yarnpkg| |
null |
```
import React, { useEffect } from "react";
import mqtt, { MqttClient } from "mqtt";
import { v4 as uuidv4 } from "uuid";
const Subscribe = () => {
useEffect(() => {
// MQTT Topic to subscribe to
const topic: string = "jobstatus";
const mqttBrokerUrl = 'wss://myipaddress:8080/mqtt';
// Connect options for MQTT client
const options = {
clientId: 'Testing',
username: 'admin',
password: 'admin',
};
const client: MqttClient = mqtt.connect(mqttBrokerUrl, options);
function handleMessage(topic: string, message: Buffer) {
const messageString: string = message.toString();
console.log(`Received message on topic "${topic}": ${messageString}`);
updateUI(messageString);
}
client.on("connect", () => {
console.log("Connected to MQTT broker");
client.subscribe(topic, (err) => {
if (err) {
console.error(`Error subscribing to topic "${topic}": ${err}`);
} else {
console.log(`Subscribed to topic "${topic}"`);
}
});
});
//To Handle incoming MQTT messages
client.on("message", handleMessage);
function updateUI(message: string) {
console.log("Updating UI with message:", message);
}
// Cleanup function
return () => {
client.unsubscribe(topic, (err) => {
if (err) {
console.error(`Error unsubscribing from topic "${topic}": ${err}`);
} else {
console.log(`Unsubscribed from topic "${topic}"`);
}
client.end(); // Close MQTT connection
});
};
}, []);
return (
<div>
<div>MQTT Messages...</div>
</div>
);
};
```
This is the component which I created to subscribe to a topic in MQTT and to show the message in the front end. While running this I get a **WebSocket connection to 'wss://20.198.14.124/mqtt'** error when I inspect the page.
I have tried altering my `mosquitto.conf` file as below.
```
listener 1883
listener 8080
protocol websockets
allow_anonymous true
```
while changing the Mosquitto config and running, I get the error as:
```
1702358221: mosquitto version 1.6.9 starting
1702358221: Config loaded from /etc/mosquitto/mosquitto.conf.
1702358221: Opening ipv4 listen socket on port 1883.
1702358221: Error: Cannot assign requested address
1702358231: mosquitto version 1.6.9 starting
1702358231: Config loaded from /etc/mosquitto/mosquitto.conf.
1702358231: Opening ipv4 listen socket on port 8080.
1702358231: Error: Cannot assign requested address.
```
My Mosquitto server runs on an azure vm with ip-> **20.193.14.124**.
I need to subscribe to topic and display those messages in frontend. Could you all help me with this problem? |
It is impossible to achieve using a single Lua pattern, but you can chain a few of them:
```
local s = "/<\\/>//b\\\\/c" -- 4 payloads here (the second one is empty)
for x in s
:gsub("/", "/\1") -- make every payload non-empty by prepending string.char(1)
:gsub("\\(.)", "\2%1") -- replace magic backslashes with string.char(2)
:gsub("%f[/\2]/", "\0") -- replace non-escaped slashes with string.char(0)
:gsub("[\1\2]", "") -- remove temporary symbols string.char(1) and string.char(2)
:gmatch"%z(%Z*)" -- split by string.char(0)
do
print(x)
end
```
Output:
```
</>
b\
c
```
Or, if you want a single statement instead of a loop:
```
local s = "/<\\/>/b/c" -- 3 payloads here
local a, b, c = s
:gsub("/", "/\1") -- make every payload non-empty by prepending string.char(1)
:gsub("\\(.)", "\2%1") -- replace magic backslashes with string.char(2)
:gsub("%f[/\2]/", "\0") -- replace non-escaped slashes with string.char(0)
:gsub("[\1\2]", "") -- remove temporary symbols string.char(1) and string.char(2)
:match"%z(%Z*)%z(%Z*)%z(%Z*)" -- split by string.char(0)
```
|
You expect that, since you passed the correct enum value, only the case that matters is called, and you are correct in that. The issue is that if the enum value did not match the correct case, you would be calling the wrong function. Since none of this checking is done at compile time, the compiler is forced to check the entire function to make sure all paths compile, and in your case they do not.
Since all of your `search_in` functions are the same except for the `pee` parameter, there is no reason to even use a switch statement; instead you can rely on overload resolution to call the correct function for you, like:
```
template <typename T>
void analyze(vegetable& place, const std::vector<std::uint8_t>& dee, T& pee, int ole, std::uint16_t cee)
{
using beta::search_in;
using gemma::search_in;
using zeta::search_in;
search_in(dee, pee, ole, cee);
}
```
You can see this working with your example code in this [live example][1].
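For a stripped-down, self-contained illustration of the technique, consider the sketch below. The one-argument overloads here are invented for illustration; they are not the question's actual `search_in` signatures:

```cpp
#include <string>

// Hypothetical stand-ins for the question's beta/gemma::search_in
// overloads -- the signatures are invented for this sketch:
namespace beta  { std::string search_in(int)    { return "beta";  } }
namespace gemma { std::string search_in(double) { return "gemma"; } }

template <typename T>
std::string analyze(T& pee) {
    // Pull all candidates into scope; overload resolution then picks
    // the best match for T at compile time -- no switch needed.
    using beta::search_in;
    using gemma::search_in;
    return search_in(pee);
}
```

Calling `analyze` with an `int` lvalue selects the `beta` overload; with a `double`, the `gemma` one.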
[1]: https://godbolt.org/z/KbY7v6WPK |
Sorry in advance for my question; I'm still fairly new to deep learning and PyTorch.
I'm looking for strategies to increase inference speed, and I came upon Pruning. If I understood correctly, this is an iterative process:
1. Train model
2. Prune
3. Set the non-pruned weights of the original model back to their initial values (rewinding)
4. Train again this model
5. Repeat 2 - 4
Can anyone give an example, or guide me on how to perform these steps, in particular rewinding, in Pytorch?
I also have a second question: How can only setting some weights to 0 increase inference speed without modifying the model architecture itself?
I've been searching around but I can't find examples on how to perform rewinding.
Note that currently I have no minimal code example to share, since I'm still trying to figure out how to do this.
Thank you! |
Pruning models in Pytorch: How to rewind weights? |
|deep-learning|pytorch|neural-network| |
When I run the simulation (the simulator I use is mms by Mackorone), the mouse does mark the walls, I am sure of that, but after every loop the maze still does not update. At first I thought the mouse was not recording a wall after marking it, but when I log it out, it does mark the wall. Then I checked whether the flood fill works: on every loop the flood fill does not take the newly marked walls into account, so the mouse keeps spinning around after passing a few walls. However, when I mark the walls by hand, it works fine.
I want the algorithm to update the maze on every loop.
#include <iostream>
#include <string>
#include <vector>
#include <stack>
#include <queue>
#include "API.h"
std::pair<int, int> west_wall[1000];
std::pair<int, int> east_wall[1000];
std::pair<int, int> north_wall[1000];
std::pair<int, int> south_wall[1000];
int w = 0, e = 0, n = 0, s = 0;
std::string direct_mouse[5];
std::string wall_l, wall_r, wall_f;
void log(const std::string& text) {
std::cerr << text << std::endl;
}
void mark_wall(int mx, int my, std::string &direct){
if(direct == "w"){
west_wall[w].first = mx;
west_wall[w].second = my;
w++;
}
else if(direct == "e"){
east_wall[e].first = mx;
east_wall[e].second = my;
e++;
}
else if(direct == "n"){
north_wall[n].first = mx;
north_wall[n].second = my;
n++;
}
else if(direct == "s"){
south_wall[s].first = mx;
south_wall[s].second = my;
s++;
}
}
bool check_wall(int mx, int my, std::string direct){
int k = 0;
if(direct == "w"){
for(int i = 0; i < w; i++)
if(west_wall[i].first == mx && west_wall[i].second == my)
k++;
}
else if(direct == "e"){
for(int i = 0; i < e; i++)
if(east_wall[i].first == mx && east_wall[i].second == my){
k++;
}
}
else if(direct == "n"){
for(int i = 0; i < n; i++)
if(north_wall[i].first == mx && north_wall[i].second == my)
k++;
}
else if(direct == "s"){
for(int i = 0; i < s; i++)
if(south_wall[i].first == mx && south_wall[i].second == my)
k++;
}
if(k == 0)
return false;
else
return true;
}
void mouse_way(int d){
direct_mouse[1] = "n";
direct_mouse[2] = "e";
direct_mouse[3] = "s";
direct_mouse[4] = "w";
if(d == 1)
{
wall_f = "n";
wall_l = "w";
wall_r = "e";
}
else if(d == 2){
wall_f = "e";
wall_l = "n";
wall_r = "s";
}
else if(d == 3){
wall_f = "s";
wall_l = "e";
wall_r = "w";
}
else if(d == 4){
wall_f = "w";
wall_l = "s";
wall_r = "n";
}
}
void bfs(std::vector<std::vector<int>>& maze, int mx, int my){
std::queue<std::pair<int, int>> q;
//base case
if(!check_wall(mx - 1, my, "s")){
maze[mx - 1][my] = 1;q.push({mx - 1, my});
}
if(!check_wall(mx - 1, my + 1, "s")){
maze[mx - 1][my + 1] = 1;q.push({mx - 1, my + 1});
}
if(!check_wall(mx, my + 2, "w")){
maze[mx][my + 2] = 1;q.push({mx, my + 2});
}
if(!check_wall(mx + 1, my + 2, "w")){
maze[mx + 1][my + 2] = 1;q.push({mx + 1, my + 2});
}
if(!check_wall(mx + 2, my + 1, "n")){
maze[mx + 2][my + 1] = 1;q.push({mx + 2, my + 1});
}
if(!check_wall(mx + 2, my, "n")){
maze[mx + 2][my] = 1;q.push({mx + 2, my});
}
if(!check_wall(mx + 1, my - 1, "e")){
maze[mx + 1][my - 1] = 1;q.push({mx + 1, my - 1});
}
if(!check_wall(mx, my - 1, "e")){
maze[mx][my - 1] = 1; q.push({mx, my - 1});
}
while(!q.empty()){
int nx, ny;
nx = q.front().first;
ny = q.front().second;
q.pop();
if(!check_wall(nx + 1, ny, "n") && nx + 1 < 17 && maze[nx + 1][ny] == 0){ // down
q.push({nx + 1, ny});
maze[nx + 1][ny] = maze[nx][ny] + 1;
}
if(!check_wall(nx, ny - 1, "e") && ny - 1 >= 1 && maze[nx][ny - 1] == 0){ // left
q.push({nx, ny - 1});
maze[nx][ny - 1] = maze[nx][ny] + 1;
}
if(nx - 1 >= 1 && maze[nx - 1][ny] == 0 && !check_wall(nx - 1, ny, "s")){ // up
q.push({nx - 1, ny});
maze[nx - 1][ny] = maze[nx][ny] + 1;
}
if(ny + 1 < 17 && maze[nx][ny + 1] == 0 && !check_wall(nx, ny + 1, "w")){ // right
q.push({nx, ny + 1});
maze[nx][ny + 1] = maze[nx][ny] + 1;
}
}
maze[mx][my] = maze[mx + 1][my] = maze[mx + 1][my + 1] = maze[mx][my + 1] = 0;
}
int main(int argc, char* argv[]) {
log("Running...");
API::setColor(0, 0, 'G');
API::setText(0, 0, "Start");
int d = 1;
int mx = 16, my = 1;
std::vector<std::vector<int>> maze(18,std::vector<int>(18, 0));
while (true) {
char str[1000];
sprintf(str, "%d", mx);
log(str);
if(d < 1)
d = 3;
if(d > 4)
d = 1;
mouse_way(d);
if(API::wallFront()) mark_wall(mx, my, wall_f);
if(API::wallLeft()) mark_wall(mx, my, wall_l);
if(API::wallRight()) mark_wall(mx, my, wall_r);
bfs(maze, 8, 8);
if(direct_mouse[d] == "n"){
if(maze[mx][my + 1] <= maze[mx][my] && !check_wall(mx, my, "e") && my + 1 < 17){
my += 1;
API::turnRight();
API::moveForward();
d++;
}
else if(maze[mx][my - 1] <= maze[mx][my] && !check_wall(mx, my, "w") && my - 1 >= 1){
my -= 1;
API::turnLeft();
API::moveForward();
d--;
}
else if(maze[mx - 1][my] <= maze[mx][my] && !check_wall(mx, my, "n") && mx - 1 >= 1){
mx -= 1;
API:: moveForward();
}
else {
API::turnLeft();
API::turnLeft();
d -= 2;
}
}
else if(direct_mouse[d] == "e"){
if(maze[mx + 1][my] <= maze[mx][my] && !check_wall(mx, my, "s") && mx + 1 < 17){
mx += 1;
API::turnRight();
API::moveForward();
d++;
}
else if(maze[mx - 1][my] <= maze[mx][my] && !check_wall(mx, my, "n") && mx - 1 >= 1){
mx -= 1;
API::turnLeft();
API::moveForward();
d--;
}
else if(maze[mx][my + 1] <= maze[mx][my] && !check_wall(mx, my, "e") && my + 1 < 17){
my += 1;
API::moveForward();
}
else{
API::turnLeft();
API::turnLeft();
d -= 2;
}
}
else if(direct_mouse[d] == "s"){
if(maze[mx][my - 1] <= maze[mx][my] && !check_wall(mx, my, "w") && my - 1 >= 1){
my -= 1;
API::turnRight();
API::moveForward();
d++;
}
else if(maze[mx][my + 1] <= maze[mx][my] && !check_wall(mx, my, "e") && my + 1 < 17){
my += 1;
API::turnLeft();
API::moveForward();
d--;
}
else if(maze[mx + 1][my] <= maze[mx][my] && !check_wall(mx, my, "s") && mx + 1 < 17){
mx += 1;
API::moveForward();
}
else{
API::turnLeft();
API::turnLeft();
d -= 2;
}
}
else if(direct_mouse[d] == "w"){
if(maze[mx - 1][my] <= maze[mx][my] && !check_wall(mx, my, "n") && mx - 1 >= 1){
mx -= 1;
API::turnRight();
API::moveForward();
d++;
}
else if(maze[mx + 1][my] <= maze[mx][my] && !check_wall(mx, my, "s") && mx + 1 < 17){
mx += 1;
API::turnLeft();
API::moveForward();
d--;
}
else if(maze[mx][my - 1] <= maze[mx][my] && !check_wall(mx, my, "w") && my - 1 >= 1){
my -= 1;
API::moveForward();
}
else{
API::turnLeft();
API::turnLeft();
d -= 2;
}
}
/* for (const auto& row : maze) {
std::string st = "";
for (int cell : row) {
sprintf(str, "%d", cell);
st = st + str + " ";
}
log(st);
} */
bfs(maze, 8, 8);
for(int i = 0; i < e; i++){
std::string st;
sprintf(str, "%d", north_wall[i].first);
st = st + str;
sprintf(str, "%d", north_wall[i].second);
st = st + " " + str;
log(st);
}
log(".......");
}
} |
I run a Micromouse simulation (mms by Mackorone) using a BFS flood-fill algorithm, but it is not going well |
How to get rid of white background/shadow appearing in the series text? |
|javascript|css|highcharts| |
I have some data from parquet files in my S3 bucket that we use an AWS Glue crawler to expose to Athena. The problem is, the parquet files' data types don't match the data types we have in our SQL Server database. For example, one column is “program_code”: in SQL Server the data type is “char”, but in our parquet file it is “int64” and in Athena it is “bigint”. I want to change it from “bigint” to “char”.
I tried changing the data type in the Data Catalog table in Glue, but it always reverts back when I re-run the Glue crawler, which we have to run every day. How can I change the data type permanently in Athena? |
How do I change the data type in a Glue Crawler? |
|aws-glue|amazon-athena| |
null |
I have a below table in Power BI. This table will have the details of campaigns which are ran on specific period.
```
Campaign Name StartDate Enddate
AAA 01-05-2022 30-04-2022
BBB 01-04-2022 30-04-2022
CCC 01-04-2022 30-04-2022
DDD 01-04-2022 30-09-2022
EEE 01-03-2022 30-09-2022
FFF 01-03-2022 30-09-2022
```
Now I am using the start date in the slicer. So if I select a date range of Apr-22 to Jun-22, the table should display whatever campaigns are active during the selected period. In this case it should display all the values from the above table, because all the campaigns are active between Apr and Jun. But in my case the last two rows are not displayed, since their start date is in March, even though these campaigns are also active during Apr-22 to Jun-22. Is there a way to create a measure that shows campaigns even if their start date is before the selected slicer range, as long as their end date makes them active within the selected date range? |
I've tried to set up Django with Channels to provide notifications to React.
https://github.com/axilaris/react-django-channels <-- I have put my project code here.
in backend/backend/settings.py
INSTALLED_APPS = [
..
'channels',
]
CHANNEL_LAYERS = {
'default': {
'BACKEND': 'channels.layers.InMemoryChannelLayer'
}
}
ASGI_APPLICATION = 'backend.routing.application'
in backend/backend/asgi.py
import os
from django.core.asgi import get_asgi_application
from channels.routing import ProtocolTypeRouter, URLRouter
#import backend.routing
import user_api.routing
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'backend.settings')
application = ProtocolTypeRouter({
"http": get_asgi_application(),
"websocket": URLRouter(
user_api.routing.websocket_urlpatterns
#backend.routing.websocket_urlpatterns <-- not sure if should be this
),
})
in backend/user_api/routing.py
from channels.routing import ProtocolTypeRouter, URLRouter
from django.urls import path
from . import consumers
application = ProtocolTypeRouter({
"websocket": URLRouter([
path("ws/notifications/", consumers.NotificationConsumer.as_asgi()),
]),
})
in backend/user_api/consumers.py
from channels.generic.websocket import AsyncWebsocketConsumer
import json
class NotificationConsumer(AsyncWebsocketConsumer):
async def connect(self):
print("XXX connect")
await self.accept()
async def disconnect(self, close_code):
print("XXX disconnect")
pass
async def receive(self, text_data):
print("XXX receive")
text_data_json = json.loads(text_data)
message = text_data_json['message']
await self.send(text_data=json.dumps({
'message': message
}))
Finally in React, App.js
useEffect(() => {
const ws = new WebSocket('ws://localhost:8000/ws/notification/');
ws.onopen = () => {
console.log('Connected to notification websocket');
};
ws.onmessage = e => {
const data = JSON.parse(e.data);
setMessage(data.message);
};
ws.onerror = e => {
console.error('WebSocket error', e);
};
ws.onclose = e => {
console.error('WebSocket closed', e);
};
return () => {
ws.close();
};
}, []);
From the browser console logs, it seems it cannot connect
App.js:59 WebSocket connection to 'ws://localhost:8000/ws/notification/' failed:
App.js:71 WebSocket error Event
ws.onerror @ App.js:71
App.js:75 WebSocket closed CloseEvent
Note that React is running on port 3000 and Django is on port 8000
% npm start <-- React
% python manage.py runserver <-- Django
logs from django and react https://gist.github.com/axilaris/2e0a5ae1887f7d12a226565efa85dd6f
react logs
App.js:59 WebSocket connection to 'ws://localhost:8000/ws/notification/' failed:
App.js:71 WebSocket error Event {isTrusted: true, type: 'error', target: WebSocket, currentTarget: WebSocket, eventPhase: 2, …}
App.js:75 WebSocket closed CloseEvent {isTrusted: true, wasClean: false, code: 1006, reason: '', type: 'close', …}
django logs
WARNING:django.request:Not Found: /ws/notification/
Not Found: /ws/notification/
WARNING:django.request:Not Found: /ws/notification/
[27/Mar/2024 14:19:44] "GET /ws/notification/ HTTP/1.1" 404 2230
[27/Mar/2024 14:19:44,843] - Broken pipe from ('127.0.0.1', 60943)
[27/Mar/2024 14:19:44] "GET /ws/notification/ HTTP/1.1" 404 2230
[27/Mar/2024 14:19:44,843] - Broken pipe from ('127.0.0.1', 60947)
My feeling is that something is missing on the Django side: the channel layer is not set up properly and is not listening for the connection. What am I missing in my setup?
Some suspects:
- the channel consumer is not called because the websocket routing setup lives in `user_api`
- CORS?
|
/Users/macbookair/apps/ussd/node_modules/sequelize/lib/dialects/abstract/connection-manager.js:55
throw new Error(`Please install ${moduleName} package manually`);
^
Error: Please install mysql2 package manually
at ConnectionManager._loadDialectModule (/Users/macbookair/apps/ussd/node_modules/sequelize/lib/dialects/abstract/connection-manager.js:55:15)
at new ConnectionManager (/Users/macbookair/apps/ussd/node_modules/sequelize/lib/dialects/mysql/connection-manager.js:30:21)
at new MysqlDialect (/Users/macbookair/apps/ussd/node_modules/sequelize/lib/dialects/mysql/index.js:13:30)
at new Sequelize (/Users/macbookair/apps/ussd/node_modules/sequelize/lib/sequelize.js:194:20)
at Object.<anonymous> (/Users/macbookair/apps/ussd/model/db.js:30:15)
at Module._compile (node:internal/modules/cjs/loader:1256:14)
at Module._extensions..js (node:internal/modules/cjs/loader:1310:10)
at Module.load (node:internal/modules/cjs/loader:1119:32)
at Module._load (node:internal/modules/cjs/loader:960:12)
at Module.require (node:internal/modules/cjs/loader:1143:19)
Node.js v18.17.1
[nodemon] app crashed - waiting for file changes before starting...
rs
Tried `npm i mysql2` but it fails to run |
|algorithm|api|simulation| |
null |
I'd like to know how to insert a column into an array. I know the push function to add a row, but not how to add a column, going from:
A ; 1
B ; 2
C ; 3
D ; 4
to:
A ; A1 ; 1
B ; B2 ; 2
C ; C3 ; 3
D ; D4 ; 4
I guess I will have to loop over each row and insert an item between the two existing items, but is there a way to do it in bulk?
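For reference, here is the row-by-row version I had in mind (a sketch; `data` is a hypothetical 2-D array holding the rows above):

```javascript
const data = [["A", 1], ["B", 2], ["C", 3], ["D", 4]];

// Insert the combined value as a new middle column, mutating each row
// in place (splice at index 1, deleting 0 elements):
for (const row of data) {
  row.splice(1, 0, row[0] + row[1]); // e.g. ["A", 1] -> ["A", "A1", 1]
}

console.log(data);
```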
Thanks in advance !
|
How can I convert these Android BLE code lines to Xcode Objective-C?
```
public void setCharacteristicNotification(BluetoothGattCharacteristic characteristic, boolean enabled) {
boolean bleRes = tstBluetoothGatt.setCharacteristicNotification(characteristic, enabled);
BluetoothGattDescriptor descriptor = characteristic.getDescriptor(
UUID.fromString("00001108-0000-2000-9000-00906fab34fb"));
if (descriptor != null) {
descriptor.setValue(BluetoothGattDescriptor.ENABLE_NOTIFICATION_VALUE);
bleRes = tstBluetoothGatt.writeDescriptor(descriptor);
}
}
```
I am trying BLE usage for reading descriptor UUID data, in iOS and iOS Unity |
null |
We could add a temporary column `".rm"` (for remove), created from unlisting the `"id"` column and scanning it for `duplicated`. This gives a vector that can be `split`ted along `rep`eated consecutive integers each `nrow` times for each sub-list and added to the sub-lists using `` `[<-`() ``. Finally we `subset` for not `TRUE`s in `".rm"` and remove that temporary column.
> fn <- \(x, idcol) {
+ Map(`[<-`, x, '.rm', value=
+ lapply(x, `[[`, idcol) |>
+ unlist() |>
+ duplicated() |>
+ split(sapply(x, nrow) |> {
+ \(.) mapply(rep.int, seq_along(.), .)
+ }())) |>
+ lapply(subset, !.rm, select=-.rm)
+ }
>
> fn(my_list, 'id')
[[1]]
id country
1 xxxyz USA
3 zzuio Canada
[[2]]
id country
2 ppuip Canada
This removes "zzuio" in second sub-list instead of the first, but that actually makes more sense to me. |
Seeing the other answers, let me take a guess at what happened historically:
- The notion of a *tentative definition* was introduced for historical reasons (there are many possible references one could provide, e.g.: https://stackoverflow.com/q/33200738/2057969).
- The keyword `_Thread_local` was introduced with C11 (which C17 is a corrected version of, without substantial additions).
- If the definition of 'tentative definition' (C17 draft, 6.9.2, ¶2) is taken literally, the answer is clear – though when asking I was wondering whether there are other parts of the standard that encourage a different interpretation or outright contradict the definition, because the wording looks like there was an oversight.
- If `_Thread_local` was orthogonal to the notion of a "tentative definition", `i4` and `i6` would be tentative definitions, while `i5` would not.
Let me summarize everything in a table:
| Tentative definition? | C17 draft, 6.9.2 ¶2;<br>literal interpretation | C17 draft, 6.9.2 ¶2;<br>modified to assume<br>that `_Thread_local`<br>is orthogonal to the<br>definition | C23 draft (`n3096.pdf`),<br>6.9.2 ¶2 |
|-----------------------------:|------------------------------------------------|--------------------------------------------------------------------------------------------------------|---------------------------------|
| `int i1;` | ✓ | ✓ | ✓ |
| `extern int i2;` | ✗ | ✗ | ✗ |
| `static int i3;` | ✓ | ✓ | ✓ |
| `_Thread_local int i4;` | ✗ | ✓ | ✗ |
| `_Thread_local extern int i5;` | ✗ | ✗ | ✗ |
| `_Thread_local static int i6;` | ✓ | ✓ | ✗ |
The left-hand column indicates what some of our language lawyers argue for regarding C17 – I acknowledge this as a reasonable view. The middle column represents one possible interpretation I suspected when formulating my question, namely that `_Thread_local` has no bearing on the issue. The right-hand column represents what C23 will likely be like.
Given what the C23 (or C2x) standard draft `n3096.pdf` says, it is safe to assume that the C17 standard was carelessly edited/worded in this regard.
As for _why_, please allow me to speculate: Originally different compilers handled multiple declarations/definitions differently, which necessitated the notion of a "tentative definition" in the standard. 'Thread-local' storage duration was introduced so long after this that C23 perhaps makes the reasonable assumption that the `_Thread_local` storage class specifier appears mainly in code that doesn't rely on tentative definitions. Or the standards committee just wanted to make life easier for compiler writers. |