|r|algorithmic-trading|performanceanalytics|blotter| |
Create a container component that is reusable and responsive:
```js
// src/Components/Container.jsx
import { Row, Col } from 'antd'
const Container = (props) => {
const { children } = props
return (<Row className='container'>
<Col xs={1} sm={2}></Col>
<Col xs={22} sm={20}>
{ children }
</Col>
<Col xs={1} sm={2}></Col>
</Row>)
}
export default Container
```
Usage
```js
// App.js
import Container from "@/Components/Container";
const App = () => {
return(<Container>
Content goes here
</Container>)
}
export default App
``` |
For Swagger, all public methods of your API controller should be marked with an [HttpGet]/[HttpPost]/[HttpPut]/[HttpDelete] or [NonAction] attribute.
If your controller inherits from ApiController, make sure Swagger is not complaining about the **ExecuteAsync** method.
[![enter image description here][1]][1]
In this case you just need to override the ExecuteAsync method in your controller and add the [NonAction] attribute to it.
[![enter image description here][2]][2]
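A minimal sketch of that override (the controller name here is illustrative, not from the original post):

```csharp
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;
using System.Web.Http;
using System.Web.Http.Controllers;

public class MyController : ApiController
{
    // Hide ExecuteAsync from Swagger by overriding it and marking it [NonAction];
    // the override simply delegates to the base implementation.
    [NonAction]
    public override Task<HttpResponseMessage> ExecuteAsync(
        HttpControllerContext controllerContext, CancellationToken cancellationToken)
    {
        return base.ExecuteAsync(controllerContext, cancellationToken);
    }
}
```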
[1]: https://i.stack.imgur.com/R4eif.png
[2]: https://i.stack.imgur.com/MO1xp.png |
I am working with Spring MVC. All Spring dependencies are added. The app needs micrometer-core for io/micrometer/observation/transport/RequestReplyReceiverContext, so I added it.
My pom.xml
```xml
<dependency>
    <groupId>io.micrometer</groupId>
    <artifactId>micrometer-core</artifactId>
    <version>1.12.2</version>
</dependency>
```
The library is added to my project, but at execution I get:
```
Context initialization failed
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerMapping': Lookup method resolution failed
    at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor.checkLookupMethods(AutowiredAnnotationBeanPostProcessor.java:497)
    at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor.determineCandidateConstructors(AutowiredAnnotationBeanPostProcessor.java:367)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.determineConstructorsFromBeanPostProcessors(AbstractAutowireCapableBeanFactory.java:1294)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1189)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:562)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:522)
    at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:326)
    at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:234)
    at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:324)
    at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:200)
    at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:975)
    at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:962)
    at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:624)
    at org.springframework.web.context.ContextLoader.configureAndRefreshWebApplicationContext(ContextLoader.java:394)
    at org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:274)
    at org.springframework.web.context.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:102)
    at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4438)
    at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:4876)
    at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:171)
    at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1332)
    at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1322)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
    at org.apache.tomcat.util.threads.InlineExecutorService.execute(InlineExecutorService.java:75)
    at java.base/java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:145)
    at org.apache.catalina.core.ContainerBase.startInternal(ContainerBase.java:866)
    at org.apache.catalina.core.StandardHost.startInternal(StandardHost.java:845)
    at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:171)
    at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1332)
    at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1322)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
    at org.apache.tomcat.util.threads.InlineExecutorService.execute(InlineExecutorService.java:75)
    at java.base/java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:145)
    at org.apache.catalina.core.ContainerBase.startInternal(ContainerBase.java:866)
    at org.apache.catalina.core.StandardEngine.startInternal(StandardEngine.java:240)
    at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:171)
    at org.apache.catalina.core.StandardService.startInternal(StandardService.java:433)
    at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:171)
    at org.apache.catalina.core.StandardServer.startInternal(StandardServer.java:917)
    at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:171)
    at org.apache.catalina.startup.Catalina.start(Catalina.java:795)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.base/java.lang.reflect.Method.invoke(Method.java:568)
    at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:347)
    at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:478)
Caused by: java.lang.IllegalStateException: Failed to introspect Class [org.springframework.web.servlet.mvc.method.RequestMappingInfoHandlerMapping] from ClassLoader [ParallelWebappClassLoader
  context: GameZone
  delegate: false
  Parent Classloader:
  java.net.URLClassLoader@100fc185
]
    at org.springframework.util.ReflectionUtils.getDeclaredMethods(ReflectionUtils.java:483)
    at org.springframework.util.ReflectionUtils.doWithLocalMethods(ReflectionUtils.java:320)
    at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor.checkLookupMethods(AutowiredAnnotationBeanPostProcessor.java:475)
    ... 45 more
Caused by: java.lang.NoClassDefFoundError: io/micrometer/observation/transport/RequestReplyReceiverContext
    at java.base/java.lang.ClassLoader.defineClass1(Native Method)
    at java.base/java.lang.ClassLoader.defineClass(ClassLoader.java:1017)
    at java.base/java.security.SecureClassLoader.defineClass(SecureClassLoader.java:150)
    at org.apache.catalina.loader.WebappClassLoaderBase.findClassInternal(WebappClassLoaderBase.java:2352)
    at org.apache.catalina.loader.WebappClassLoaderBase.findClass(WebappClassLoaderBase.java:800)
    at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1317)
    at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1165)
    at java.base/java.lang.Class.getDeclaredMethods0(Native Method)
    at java.base/java.lang.Class.privateGetDeclaredMethods(Class.java:3402)
    at java.base/java.lang.Class.getDeclaredMethods(Class.java:2504)
    at org.springframework.util.ReflectionUtils.getDeclaredMethods(ReflectionUtils.java:465)
    ... 47 more
Caused by: java.lang.ClassNotFoundException: io.micrometer.observation.transport.RequestReplyReceiverContext
    at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1353)
    at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1165)
    ... 58 more
```
What is the problem here?
|
For loading CSS styles asynchronously, **you should declare it as a preload stylesheet**. **Whenever you use `rel="preload"` alone, it won't transform into a stylesheet on page load; it will remain a `preload`**. To indicate that it is a stylesheet, you should use another attribute, `as`, which **indicates the type of element; in your case, it should be `style`**. Then you need to tell the browser what to do when loading finishes: **to have it treated as a stylesheet, define an `onload` attribute that sets its real relation**.
So you have to import it like this:
```html
<link rel="preload" as="style" onload="this.rel='stylesheet'" href="http://www.example.com/style.css">
```
**NOTE:** You can read more about it in [web.dev][1].
# Update 2024
### Basic Usage of `preload`
Thanks to @Hewlett, we have an update regarding browser support for `preload`. According to [Caniuse][2], the [`preload`][3] feature is now widely supported across all major browsers. This development makes `<link rel="preload">` a reliable choice for prefetching resources without immediately executing them. For preloading stylesheets, the basic implementation would look like this:
```html
<link rel="preload" as="style" href="http://www.example.com/style.css">
```
### Applying Preloaded Stylesheet
However, as highlighted by @MichaelFreidgeim, and per [MDN Web Docs][3], `preload` merely prioritizes downloading and caching the resource; it doesn't directly apply the stylesheet. To ensure that a preloaded stylesheet is utilized as a `stylesheet` once loaded, a JavaScript-based solution is ideal. This method allows us to preload the stylesheet, reducing its initial render-blocking impact, and then to apply it after the full page load.
Implement it as follows:
```html
<link rel="preload" as="style" href="http://www.example.com/style.css" id="preloadStyle">
<script>
window.addEventListener('load', function() {
var preloadLink = document.getElementById('preloadStyle');
preloadLink.rel = 'stylesheet';
});
</script>
```
In this setup, the `<link>` tag initially preloads the stylesheet with `rel="preload"`. The JavaScript snippet is designed to wait for the `window.load` event, which is triggered when the entire page, including all dependent resources, is fully loaded. Once this event occurs, the script changes the `rel` attribute of our preloaded link to `stylesheet`, thus applying the styles to the page. This approach strikes a balance between efficient resource loading and optimal rendering performance.
## Handling Specific Positioning in DOM
In cases where you need to preload stylesheets but append them at a specific position in the DOM, you can utilize JavaScript to dynamically insert the stylesheet after its preload phase. This method is especially useful when you want to maintain a specific load order for your styles. For instance, to preload a stylesheet and insert it below an existing style tag, you can do the following:
```html
<!-- Your first style tag -->
<style id="firstStyle">
/* Your CSS here */
</style>
<!-- Preload your CSS file -->
<link rel="preload" as="style" href="http://www.example.com/style.css" id="preloadStyle">
<script>
function loadAndInsertStyle(href, insertAfterId) {
var link = document.createElement('link');
link.href = href;
link.rel = 'stylesheet';
var ref = document.getElementById(insertAfterId);
ref.parentNode.insertBefore(link, ref.nextSibling);
}
window.addEventListener('load', function() {
loadAndInsertStyle('http://www.example.com/style.css', 'firstStyle');
});
</script>
```
This method ensures that your stylesheet is preloaded and then inserted in the desired position within the DOM, allowing for more granular control over stylesheet loading and application.
## Previous Update (obsolete)
Since **`preload` was not supported in Firefox** at the time (according to [this][4]), the only way to do it was to declare it twice, either in one `rel` attribute or in two separate tags.
```html
<link rel="preload" as="style" href="http://www.example.com/style.css">
<link rel="stylesheet" href="http://www.example.com/style.css">
```
<s>Or</s>
<s>
```html
<link rel="stylesheet" rel="preload" as="style" href="http://www.example.com/style.css">
```
</s>
**NOTE:** As @JohnyFree tested, the second one *(the one with a line through it)* is not recognized as a valid `preload` style by Google PageSpeed, even though the format is correct according to [W3.org][5].
[1]: https://web.dev/defer-non-critical-css/
[2]: https://caniuse.com/link-rel-preload
[3]: https://developer.mozilla.org/en-US/docs/Web/HTML/Attributes/rel/preload
[4]: https://developer.mozilla.org/en-US/docs/Web/HTML/Preloading_content
[5]: https://www.w3.org/TR/REC-html40/struct/links.html |
I have 2 worksheets. The first one is the working file and the second is the master file sheet. The master file sheet has 3 columns: Fund Number, Name, and Portfolio Number. Column 2 of the working file sheet contains values from the Fund Number column of the master file sheet. What I would like to do is perform a VLOOKUP in column 2 of the working file sheet that gets the corresponding values from the Portfolio Number column of the master file sheet, and autofill it down to the last row. Is the VLOOKUP function appropriate for this?
Any help will be highly appreciated.
```
Sub VLOOKUP_Formula_1()
    Dim sh As Worksheet
    Set sh = ThisWorkbook.Sheets("Working File")
    Dim lr As Long
    lr = sh.Range("B" & Application.Rows.Count).End(xlUp).Row
    ' Write the formula into column C so the lookup values in column B are preserved
    sh.Range("C2").Formula = "=VLOOKUP(B2,Database!A:C,3,0)"
    ' "C2:C" & lr builds the full range address (e.g. C2:C100); "C2" & lr would give C2100
    sh.Range("C2:C" & lr).FillDown
    sh.Range("C2:C" & lr).Copy
    sh.Range("C2:C" & lr).PasteSpecial xlPasteValues
    Application.CutCopyMode = False
End Sub
``` |
Vlookup Function Excel VBA |
|excel|vba|macros| |
null |
I'm having permission problems when I'm trying to do a query on Firebase Firestore.
This is the code that performs the query.
```dart
Future<void> fetchData() async {
FirebaseFirestore db = FirebaseFirestore.instance;
CollectionReference diariosRef = db.collection("diarios");
QuerySnapshot feveireiro = await diariosRef
.where("data_publicacao",
isGreaterThanOrEqualTo: DateTime(1, DateTime.november, 2023))
.get();
for (var doc in feveireiro.docs) {
print(doc.data());
}
}
```
And I get this error:
> E/flutter ( 7158): [ERROR:flutter/runtime/dart_vm_initializer.cc(41)] Unhandled Exception: [cloud_firestore/permission-denied] The caller does not have permission to execute the specified operation.
And this is how the permissions rules are set up in my Firebase:
```
rules_version = '2';
service cloud.firestore {
match /databases/{database}/documents {
match /{document=**} {
allow read, write: if true;
}
}
}
```
I'm also using Cloud Storage and Firebase Auth; Cloud Storage has the same rules.
What is wrong?
|
How to find mean value of k nearest points when some values are NaN? |
|geopandas| |
From the [C23 draft](https://open-std.org/JTC1/SC22/WG14/www/docs/n3220.pdf):
7.23.6.1 The `fprintf` function
> 8. The conversion specifiers and their meanings are:
>
> * `p` The argument shall be a pointer to void or a pointer to a character type. The value of the pointer is converted to a sequence of printing characters, in an **implementation-defined** manner.
That means that different implementations may produce different output, even if the actual pointer values happen to be the same.
If you want the same format on all platforms and implementations, you could cast the pointer to `uintptr_t` and print that instead.
Example:
```c
#include <inttypes.h> // PRIxPTR
#include <stdint.h> // uintptr_t
//...
printf("&n stores the address of n = 0x%016" PRIxPTR "\n", (uintptr_t)&n);
```
Possible output:
```none
&n stores the address of n = 0x000000016fa13224
``` |
{"Voters":[{"Id":5741073,"DisplayName":"TomServo"},{"Id":2395282,"DisplayName":"vimuth"},{"Id":874188,"DisplayName":"tripleee"}]} |
You could write the pattern like this if there is no single dangling `——` at the bottom, but that would give you a single match of all the lines:
```
(?s).*^——$\R\K.*$
```
[Regex demo](https://regex101.com/r/7g3gji/1)
If you want separate matches (to count them), you could write the pattern as:
```
(?:[\s\S]*^——$|\G(?!^))\R\K.+
```
The pattern matches:
- `(?:` Non capture group
- `[\s\S]*` Match all the way to the end of the file
- `^——$` Match `——` on a single line
- `|` Or
- `\G(?!^)` Assert the current position at the end of the previous match, not at the start
- `)` Close the non capture group
- `\R\K` Match a newline and then forget what is matched so far
- `.+` Match a whole line with one or more characters
[Regex demo](https://regex101.com/r/At2M7O/1)
Or if there can also be a single `——` at the bottom (the lookahead prevents matching it):
```
(?:[\s\S]*^——$|\G(?!^))\R\K(?!^——$).+
```
[Regex demo](https://regex101.com/r/eDQzuB/1)
For example in a tool like Notepad++:
[![enter image description here][1]][1]
[1]: https://i.stack.imgur.com/MTvyO.png |
So when checking for duplicates in an array (that's what I think the algorithm is for; correct me if I'm wrong), you want to check each element against each other element once. Let's have a 2D array of all possible pairs of elements you can check.
For an array with 4 elements, the 2D array would look like this:
```
[(0,0),(0,1),(0,2),(0,3),
(1,0),(1,1),(1,2),(1,3),
(2,0),(2,1),(2,2),(2,3),
(3,0),(3,1),(3,2),(3,3)]
```
We can traverse this 2D array like you said.
```
double[] array = new double[] {1, 2, 3, 4};
for (int i = 0; i < array.length; i++) {
    for (int j = 0; j < array.length; j++) {
        // compare array[i] with array[j] here
    }
}
```
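A runnable sketch of the fixed traversal (class and method names are mine, not from the original post): starting the inner loop at `i + 1` skips the diagonal and the mirrored pairs, so each unordered pair is checked exactly once.

```java
public class DuplicateCheck {
    // Checks each unordered pair exactly once by starting j at i + 1,
    // which skips both the diagonal (i, i) and the mirrored (j, i) pairs.
    public static boolean hasDuplicate(double[] array) {
        for (int i = 0; i < array.length; i++) {
            for (int j = i + 1; j < array.length; j++) {
                if (array[i] == array[j]) {
                    return true;
                }
            }
        }
        return false;
    }

    public static void main(String[] args) {
        System.out.println(hasDuplicate(new double[] {1, 2, 3, 4})); // false
        System.out.println(hasDuplicate(new double[] {1, 2, 2, 4})); // true
    }
}
```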
However, this checks both (1,0) and (0,1), which is redundant. All we have to do is check the elements above the diagonal that runs from the top left to the bottom right. To do that, we start j at i+1. |
You can [choose which modules to export to the external code](https://stackoverflow.com/a/78251561/8138591). Additionally, you can *clearly* communicate the **intention** that code exported from a file should only be used inside a package and not outside of it (**package-scoped**).
My solution is based on the idea from another programming language. According to the [convention](https://dart.dev/language) adopted by the Dart programming language created by Google:
> Unlike Java, Dart doesn't have the keywords public, protected, and private. If an identifier starts with an underscore (_), it's private to its library.
So, for a file that *only* exports **package-private code**, its name should be prefixed with an underscore `_`, for example `_packagePrivate.js`. *All* the **named exports** from this file (e.g., constants, functions) should be prefixed with an underscore `_` as well.
**JSDoc** has a special block tag [`@package`](https://jsdoc.app/tags-package):
> This symbol is meant to be **package-private**. Typically, this tag indicates that a symbol is available only to code in the same directory as the source file for this symbol.
The `@package` tag in JSDoc is the same as `@access package` (see [`@access`](https://jsdoc.app/tags-access)).
**TSDoc** provides a modifier tag [`@internal`](https://tsdoc.org/pages/tags/internal/):
> Designates that an API item is not planned to be used by third-party developers. The tooling may trim the declaration from a public release. In some implementations, certain designated packages may be allowed to consume **internal API items**, e.g. because the packages are components of the same product.
So as a result, the `_packagePrivate.ts` file might look like this:
```typescript
/**
* @package
* @internal
*/
export function _returnAPIKey(): string {
return '...';
}
/**
* @package
* @internal
*/
export default _returnAPIKey();
```
According to the [JSDoc example for ES 2015 Modules](https://jsdoc.app/howto-es2015-modules)
> In most cases, you can simply add a JSDoc comment to the export statement that defines the exported value. If you are exporting a value under another name, you can document the exported value within its export block. |
|vb.net|filtering| |
{"Voters":[{"Id":6600940,"DisplayName":"Storax"},{"Id":9214357,"DisplayName":"Zephyr"},{"Id":8422953,"DisplayName":"braX"}]} |
You should build the `a` tag in your render; the `symbol` in your map should be the href string:
```js
const stocksArray = store.stocks.map((stock, index) => {
  return {
    id: index + 1,
    symbol: `/symbol/${stock.symbol}`,
    volAvg: stock.volAvg,
    mktCap: stock.mktCap,
  };
});
```
Also, you don't have to push into a new array; instead you can return the object from your map, just as I did.
You must define `renderCell` in the `symbol` column:
```js
const columns = [
  {
    field: "symbol",
    headerName: "Symbol",
    width: 130,
    disableColumnMenu: true,
    resizable: false,
    headerClassName: "datagrid-header-style",
    // params.value holds the href string built in the map above
    renderCell: (params) => <a href={params.value}>{params.value}</a>,
  },
  {
    field: "volAvg",
    headerName: "volAvg",
    width: 130,
    disableColumnMenu: true,
    resizable: false,
    headerClassName: "datagrid-header-style",
  },
  {
    field: "mktCap",
    headerName: "mktCap",
    width: 130,
    disableColumnMenu: true,
    resizable: false,
    headerClassName: "datagrid-header-style",
  },
];
```
|
I have a bbappend file for one of the recipes in meta-tegra. The appends depend on dtc, which has been added to DEPENDS:append. On bitbaking this recipe, dtc is not getting built.
Below is tegra-bootfiles\_%.bbappend
```
# Hack: The fetch task is disabled on this recipe, so the following is just for the task signature.
FILESEXTRAPATHS:prepend := "${THISDIR}/${BPN}:"
SRC_URI:append:p3768-0000-p3767-0000 = "\
file://tegra234-mb1-bct-gpio-p3767-hdmi-a03.dtsi \
file://tegra234-mb1-bct-pinmux-p3767-hdmi-a03.dtsi \
"
DEPENDS:append = " dtc dtc-native"
addtask do_process_eeprom before do_install
do_process_eeprom() {
dtc -I dtb -O dts ${S}/bootloader/t186ref/tegra234-bpmp-3767-0001-3509-a02.dtb > ${S}/bootloader/t186ref/tegra234-bpmp-3767-0001-3509-a02.dts
sed -i 's/gbe-uphy-config = <0x08>/gbe-uphy-config = <0x09>/g' ${S}/bootloader/t186ref/tegra234-bpmp-3767-0001-3509-a02.dts
dtc -I dts -O dtb -o ${S}/bootloader/t186ref/tegra234-bpmp-3767-0001-3509-a02.dtb ${S}/bootloader/t186ref/tegra234-bpmp-3767-0001-3509-a02.dts
rm -rf ${S}/bootloader/t186ref/tegra234-bpmp-3767-0001-3509-a02.dts
sed -i 's/cvb_eeprom_read_size = <0x100>/cvb_eeprom_read_size = <0x0>/g' ${S}/bootloader/t186ref/BCT/tegra234-mb2-bct-misc-p3767-0000.dts
sed -i 's/cvb_eeprom_read_size = <0x100>/cvb_eeprom_read_size = <0x0>/g' ${S}/bootloader/t186ref/BCT/tegra234-mb2-bct-misc-p3701-0002-p3711-0000.dts
sed -i 's/cvb_eeprom_read_size = <0x100>/cvb_eeprom_read_size = <0x0>/g' ${S}/bootloader/t186ref/BCT/tegra234-mb2-bct-misc-p3701-0002-p3740-0002.dts
# Define the text to add
text_to_add=' reg@322 { /* GPIO_M_SCR_00_0 */\n exclusion-info = <2>;\n value = <0x38009696>;\n };\n'
# Use sed to add the text after "tfs {" in the file
sed -i '/tfc {/a\'"$text_to_add"'' ${S}/bootloader/t186ref/BCT/tegra234-mb2-bct-scr-p3767-0000.dts
}
# Hack: As the fetch task is disabled for this recipe, we have to directly access the files.
CUSTOM_DTSI_DIR := "${THISDIR}/${BPN}"
do_install:append:p3768-0000-p3767-0000() {
install -m 0644 ${CUSTOM_DTSI_DIR}/tegra234-mb1-bct-gpio-p3767-hdmi-a03.dtsi ${D}${datadir}/tegraflash/
install -m 0644 ${CUSTOM_DTSI_DIR}/tegra234-mb1-bct-pinmux-p3767-hdmi-a03.dtsi ${D}${datadir}/tegraflash/
}
```
On executing `bitbake -v tegra-bootfiles`, following error occurs -
```
dtc -I dtb -O dts /home/user/poky/build/tmp/work-shared/L4T-tegra-35.4.1-r0/Linux_for_Tegra/bootloader/t186ref/tegra234-bpmp-3767-0001-3509-a02.dtb
/home/user/poky/build/tmp/work/p3768_0000_p3767_0000-poky-linux/tegra-bootfiles/35.4.1-r0/temp/run.do_process_eeprom.1462074: 141: dtc: not found
+ bb_sh_exit_handler
+ ret=127
+ [ 127 != 0 ]
+ echo WARNING: exit code 127 from a shell command.
+ exit 127
+ ls -A
+ [ deploy-source-date-epoch ]
+ set +e
+ tar -I pzstd -8 -p 12 -cS -f /home/user/poky/build/sstate-cache/4e/90/sstate:tegra-bootfiles:p3768_0000_p3767_0000-poky-linux:35.4.1:r0:p3768_0000_p3767_0000:10:4e90edf4eefa9b94a2e7fc008d6fa4b569642b6903ff9a967da1236cec8b2dc1_deploy_source_date_epoch.tar.zst.Ij3TcnB1 deploy-source-date-epoch
ERROR: tegra-bootfiles-35.4.1-r0 do_process_eeprom: ExecutionError('/home/user/poky/build/tmp/work/p3768_0000_p3767_0000-poky-linux/tegra-bootfiles/35.4.1-r0/temp/run.do_process_eeprom.1462074', 127, None, None)
ERROR: Logfile of failure stored in: /home/user/poky/build/tmp/work/p3768_0000_p3767_0000-poky-linux/tegra-bootfiles/35.4.1-r0/temp/log.do_process_eeprom.1462074
ERROR: Task (/home/user/poky/meta-tegra/recipes-bsp/tegra-binaries/tegra-bootfiles_35.4.1.bb:do_process_eeprom) failed with exit code '1'
```
I am expecting tegra-bootfiles to build dtc as part of its process, but the dtc build is not getting initiated.
Appreciate any pointers. Thanks |
I'm trying to retrieve a parameter value from a config file.
```
(Get-Content C:\<ConfigFolder>\basic.conf | ConvertFrom-StringData)."FileLocation"
```
The value of the "FileLocation" is "C:\temp".
And the value printed on screen is `"C: emp"`.
Looks like `\t` is parsed as a tab.
Is there a way to avoid this and print the original value as-is?
The file content is as below:
```
UpdatedTime="2024-03-26 09:55:00"
FileLocation="C:\temp"
```
|
I found a workaround shared by Damian from AppDynamics support that helped me adjust the log levels for both the Proxy and the Watchdog in AppDynamics. Here's a summary of the steps:
- **Proxy Log Level:** The `log4j2.xml` file controls this. You can find it within the appdynamics_bindeps module. For example, in my WSL setup, it's located at `/home/wsl/.pyenv/versions/3.11.6/lib/python3.11/site-packages/appdynamics_bindeps/proxy/conf/logging/log4j2.xml`. In the Docker image python:3.9, the path is `/usr/local/lib/python3.9/site-packages/appdynamics_bindeps/proxy/conf/logging/log4j2.xml`. Modify the `<AsyncLogger>` level within the `<Loggers>` section to one of the following: `debug`, `info`, `warn`, `error`, or `fatal`.
- **Watchdog Log Level:** This can be adjusted in the `proxy.py` file found within the appdynamics Python module. For example, in my WSL setup, it's located at `/home/wsl/.pyenv/versions/3.11.6/lib/python3.11/site-packages/appdynamics/scripts/pyagent/commands/proxy.py`. In the Docker image python:3.9, the path is `/usr/local/lib/python3.9/site-packages/appdynamics/scripts/pyagent/commands/proxy.py`. You will need to hardcode the log level in the `configure_proxy_logger` and `configure_watchdog_logger` functions by changing the `level` variable.
## My versions
```bash
$ pip freeze | grep appdynamics
appdynamics==24.2.0.6567
appdynamics-bindeps-linux-x64==24.2.0
appdynamics-proxysupport-linux-x64==11.68.3
```
## Original files
### log4j2.xml
```xml
<Loggers>
<!-- Modify each <AsyncLogger> level as needed -->
<AsyncLogger name="com.singularity" level="info" additivity="false">
<AppenderRef ref="Default"/>
<AppenderRef ref="RESTAppender"/>
<AppenderRef ref="Console"/>
</AsyncLogger>
</Loggers>
```
### proxy.py
```python
def configure_proxy_logger(debug):
logger = logging.getLogger('appdynamics.proxy')
level = logging.DEBUG if debug else logging.INFO
pass
def configure_watchdog_logger(debug):
logger = logging.getLogger('appdynamics.proxy')
level = logging.DEBUG if debug else logging.INFO
pass
```
## My Script to create environment variables to log4j2.xml and proxy.py
### update_appdynamics_log_level.sh
```bash
#!/bin/sh
# Check if PYENV_ROOT is not set
if [ -z "$PYENV_ROOT" ]; then
# If PYENV_ROOT is not set, then set it to the default value
export PYENV_ROOT="/usr/local/lib"
echo "PYENV_ROOT was not set. Setting it to default: $PYENV_ROOT"
else
echo "PYENV_ROOT is already set to: $PYENV_ROOT"
fi
echo "=========================== log4j2 - appdynamics_bindeps module ========================="
# Find the appdynamics_bindeps directory
APP_APPD_BINDEPS_DIR=$(find "$PYENV_ROOT" -type d -name "appdynamics_bindeps" -print -quit)
if [ -z "$APP_APPD_BINDEPS_DIR" ]; then
echo "Error: appdynamics_bindeps directory not found."
exit 1
fi
echo "Found appdynamics_bindeps directory at $APP_APPD_BINDEPS_DIR"
# Find the log4j2.xml file within the appdynamics_bindeps directory
APP_LOG4J2_FILE=$(find "$APP_APPD_BINDEPS_DIR" -type f -name "log4j2.xml" -print -quit)
if [ -z "$APP_LOG4J2_FILE" ]; then
echo "Error: log4j2.xml file not found within the appdynamics_bindeps directory."
exit 1
fi
echo "Found log4j2.xml file at $APP_LOG4J2_FILE"
# Modify the log level in the log4j2.xml file
echo "Modifying log level in log4j2.xml file"
sed -i 's/level="info"/level="${env:APP_APPD_LOG4J2_LOG_LEVEL:-info}"/g' "$APP_LOG4J2_FILE"
echo "log4j2.xml file modified successfully."
echo "=========================== watchdog - appdynamics module ==============================="
# Find the appdynamics directory
APP_APPD_DIR=$(find "$PYENV_ROOT" -type d -name "appdynamics" -print -quit)
if [ -z "$APP_APPD_DIR" ]; then
echo "Error: appdynamics directory not found."
exit 1
fi
echo "Found appdynamics directory at $APP_APPD_DIR"
# Find the proxy.py file within the appdynamics directory
APP_PROXY_PY_FILE=$(find "$APP_APPD_DIR" -type f -name "proxy.py" -print -quit)
if [ -z "$APP_PROXY_PY_FILE" ]; then
echo "Error: proxy.py file not found within the appdynamics directory."
exit 1
fi
echo "Found proxy.py file at $APP_PROXY_PY_FILE"
# Modify the log level in the proxy.py file
echo "Modifying log level in proxy.py file"
sed -i 's/logging.DEBUG if debug else logging.INFO/os.getenv("APP_APPD_WATCHDOG_LOG_LEVEL", "info").upper()/g' "$APP_PROXY_PY_FILE"
echo "proxy.py file modified successfully."
```
## Dockerfile
Dockerfile to run pyagent with FastAPI and run this script
```dockerfile
# Use a specific version of the python image
FROM python:3.9
# Set the working directory in the container
WORKDIR /app
# First, copy only the requirements file and install dependencies to leverage Docker cache
COPY requirements.txt ./
RUN python3 -m pip install --no-cache-dir -r requirements.txt
# Now copy the rest of the application to the container
COPY . .
# Make the update_log4j2.sh and update_watchdog.sh scripts executable and run them
RUN chmod +x update_appdynamics_log_level.sh && \
./update_appdynamics_log_level.sh
# Set environment variables
ENV APP_APPD_LOG4J2_LOG_LEVEL="warn" \
APP_APPD_WATCHDOG_LOG_LEVEL="warn"
EXPOSE 8000
# Command to run the FastAPI application with pyagent
CMD ["pyagent", "run", "uvicorn", "main:app", "--proxy-headers", "--host","0.0.0.0", "--port","8000"]
```
## Files changed by the script
### log4j2.xml
```xml
<Loggers>
<!-- Modify each <AsyncLogger> level as needed -->
<!-- Exemple -->
<AsyncLogger name="com.singularity" level="${env:APP_APPD_LOG4J2_LOG_LEVEL:-info}" additivity="false">
<AppenderRef ref="Default"/>
<AppenderRef ref="RESTAppender"/>
<AppenderRef ref="Console"/>
</AsyncLogger>
</Loggers>
```
### proxy.py
```python
def configure_proxy_logger(debug):
logger = logging.getLogger('appdynamics.proxy')
level = os.getenv("APP_APPD_WATCHDOG_LOG_LEVEL", "info").upper()
pass
def configure_watchdog_logger(debug):
logger = logging.getLogger('appdynamics.proxy')
level = os.getenv("APP_APPD_WATCHDOG_LOG_LEVEL", "info").upper()
pass
```
## Warning
Please note, these paths and methods may vary based on your AppDynamics version and environment setup. Always backup files before making changes and be aware that updates to AppDynamics may overwrite your customizations.
I hope this helps! |
I have a registration form, and I want to validate age so that only users 18 years or older are allowed.
```dart
IconButton(
  icon: Icon(
      color: Colors.grey[250],
      Icons.calendar_month_outlined),
  onPressed: () async {
    DateTime? pickedDate = await showDatePicker(
        context: context,
        initialDate: DateTime.now(),
        firstDate: DateTime(1930),
        lastDate: DateTime.now());
    setState(() => pickedDate = pickedDate);
    if (pickedDate != null) {
      _dateOfBirth.text = pickedDate.toString().substring(0, 10);
    }
  },
),
```
I have this date picker; please tell me how to validate that the picked date corresponds to an age of at least 18 years.
Your container's ID contains a space, but spaces are illegal in ID values.
<div id="Sign-In TempUI"
Get rid of it. Also...
document.addEventListener('DOMContentLoaded',
Why are you in such a hurry? Use LOAD instead...
document.addEventListener('load',
The only time one would need to listen to *DOMContentLoaded* is if they’re writing a utility that does performance timing on pages!
|
{"Voters":[{"Id":22180364,"DisplayName":"Jan"},{"Id":9214357,"DisplayName":"Zephyr"},{"Id":354577,"DisplayName":"Chris"}],"SiteSpecificCloseReasonIds":[13]} |
{"OriginalQuestionIds":[7524057],"Voters":[{"Id":1043380,"DisplayName":"gunr2171","BindingReason":{"GoldTagBadge":"c#"}}]} |
<!-- language-all: sh -->
[zett42](https://stackoverflow.com/users/7571258/zett42) has provided the crucial pointer:
From the [`ConvertFrom-StringData`](https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.utility/convertfrom-stringdata) help topic (emphasis added):
> `ConvertFrom-StringData` supports escape character sequences that are allowed by conventional machine translation tools. That is, the **cmdlet can interpret backslashes (`\`) *as escape characters* in the string data** [...] You **can also preserve a literal backslash in your results by escaping it with a preceding backslash, like this: `\\`**. Unescaped backslash characters, such as those that are commonly used in file paths, can render as illegal escape sequences in your results.
Therefore, assuming that all `\` characters in your input file are meant to be interpreted _literally_:
```
((Get-Content -Raw C:\<ConfigFolder>\basic.conf).Replace('\', '\\') |
    ConvertFrom-StringData).FileLocation
```
Note the use of `-Raw` with [`Get-Content`](https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.management/get-content) to ensure that the file content is read _in full_, as a single, (typically) _multiline_ string, which in turn ensures that `ConvertFrom-StringData` outputs a _single_ `[hashtable]` instance. |
I used [this][1] to generate a slideshow. Instead of putting the pictures as background-images, I put them as images in the html inside the div tags, which works very well.
<input type="radio" id="trigger3" name="slider">
<label for="trigger3"><span class="sr-only">Slide 3</span></label>
<div class="slide bg3">
<img src="/images/slide-6.png">
</div>
Now I would also like to include a video. When I insert it as follows:
<input type="radio" id="trigger4" name="slider">
<label for="trigger4"><span class="sr-only">Slide 4</span></label>
<div class="slide bg4">
<video width="320" height="240" controls>
<source src="/images/1.mp4" type="video/mp4">
Your browser does not support the video tag.
</video>
</div>
the video shows up, but the controls do not react, also after adding
video {z-index: 1000;}
What do I need to do here? I suppose the `<input>` feature has priority over / covers the video controls in some way? Thanks!
[1]: https://codepen.io/KamilDyrek/pen/ejmRxV |
Video controls are hidden behind input feature? |
|html|input|video| |
I updated the maven-surefire-plugin version, and it resolved both issues:
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.22.1</version>
Now my test cases run perfectly both locally and in Jenkins.
I am looking for a solution to sort by relation (hasMany) fields, using the maximum value of two columns.
I prepared a starter kit: https://www.db-fiddle.com/f/m371fzUJuQp9dX1yKm4yEm/3
The question is: how do I display results from the `parents` table with the GREATEST(MAX, MAX) value of two fields, ordered from high to low, in Laravel Eloquent ORM?
Laravel order with GREATEST with MAX by two columns |
|laravel|max|has-many| |
Replace
this.id.replace("trigger-".$teamid;", "")
By
this.id.replace("trigger-" + <?php echo $teamid; ?>, "") |
JpaSpecificationExecutor's method 'findBy()' does not use Sort object from provided Pageable object.
I am trying to use the **findBy()** method of **JpaSpecificationExecutor** to be able to mix a specification and a projection, e.g.:
```
var someSort = Sort.by(Sort.Direction.DESC, "someField");
var somePageable = PageRequest.of(0, 100, someSort);
someRepository.findBy(someSpecification, q -> q.as(SomeProjection.class).page(somePageable));
```
But the code above ignores my custom sorting.
I can see from the source code that it really doesn't use the Sort object from the provided Pageable object.
**FetchableFluentQueryBySpecification**'s **page()** method uses the **createSortedAndProjectedQuery()** method under the hood, which uses the default sorting (**Sort.unsorted()**) instead of the provided one.
Everything works as expected if I use the '**sortBy()**' method with the custom Sort object provided, but why should I provide the Sort object separately instead of just providing it with the Pageable object?
Should it work like this, or is it a bug?
JpaSpecificationExecutor's method 'findBy()' does not use Sort object from provided Pageable object |
The relevant part of [the documentation](https://doc.rust-lang.org/std/fs/fn.rename.html) is this:
> On Unix, if […] `from` is not a directory, `to` must also be not a directory. In contrast, on Windows, `from` can be anything, but `to` must not be a directory.
I.e. `to` *may not be* a directory, but you've passed the name of a directory to the second parameter.
So you'll have to append the file name to the directory name yourself. |
According to the code snippet you provided, `file_path` is a file while `destination_path` is a directory. This is not permitted in Rust on Linux. According to the [docs](https://doc.rust-lang.org/std/fs/fn.rename.html#platform-specific-behavior):
> On Unix, if `from` is a directory, `to` must also be an (empty) directory. If `from` is not a directory, `to` must also be not a directory.
Instead you can do this:
```rust
use std::{fs, path::PathBuf};

fn main() {
    let file_path = PathBuf::from("/Users/xiaoqiangjiang/apps/a");
    let destination_path = PathBuf::from("/Users/xiaoqiangjiang/apps/test/a"); // added file name at the end
    let result = fs::rename(file_path, destination_path);
    if let Err(err) = result {
        println!("{}", err);
    }
}
``` |
|flutter|firebase|firebase-storage| |
null |
```json
[
  {
    "B4": 14,
    "B5": 12
  },
  {
    "B4": 58,
    "B5": 54
  },
  {
    "B4": 26,
    "B5": 65
  }
]
```
I want to create an index ID in my uploaded JSON file. The JSON file looks like the example above. I want it to be like the following:
```json
[
  1: {
    "B4": 14,
    "B5": 12
  },
  2: {
    "B4": 58,
    "B5": 54
  },
  3: {
    "B4": 26,
    "B5": 65
  }
]
```
It's just to do some calculations for each set and display the results. |
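For reference, the desired output above is not valid JSON, because arrays cannot carry keys like `1:`; a JSON object keyed by the index is the valid equivalent. A minimal Python sketch (reusing the sample values above) that builds it:

```python
import json

records = [
    {"B4": 14, "B5": 12},
    {"B4": 58, "B5": 54},
    {"B4": 26, "B5": 65},
]

# Key each record by a 1-based index; JSON object keys must be strings.
indexed = {str(i): rec for i, rec in enumerate(records, start=1)}
print(json.dumps(indexed, indent=2))
```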
null |
If you have a list of the buttons then you can select a list entry at random. Always remember that the first entry in a list or an array has an index of 0.
If you put all the buttons in a Panel, it can be easier to keep them organised: for example, you can get VB.NET to select all the buttons within that panel. Then you can also get it to count them for you, so if you ever change how many buttons there are, you won't need to go through the code changing every applicable instance of 15.
For a random number generator, you'll be better off using the .NET [Random](https://learn.microsoft.com/en-us/dotnet/api/system.random) class instead of the old VB Rnd() function. You only need one instance of it, so it can be created at the same time as the main form.
- Note: the random-generator in the code in the question will produce floating-point numbers like 11.2342 - I doubt you have a button with that number ;)
Please start a new VB.NET Windows Forms project, add a Panel named "TheButtonPanel" to the default form, and as many buttons as you like, just for testing, to the panel.
I have added a simple handler to each of the buttons, so you can see some of the possibilities of getting the code to do work instead of you.
```
Public Class Form1
    Public rand As New Random()
    Public theButtons As List(Of Button)

    Sub Game()
        Dim nButtons = theButtons.Count()
        Dim n = rand.Next(0, nButtons)
        Dim selectedButton = theButtons(n)
        selectedButton.PerformClick()
    End Sub

    Private Sub bnClick(sender As Object, e As EventArgs)
        Dim bn = DirectCast(sender, Button)
        MessageBox.Show(bn.Text)
    End Sub

    Private Sub Init()
        theButtons = TheButtonPanel.Controls.OfType(Of Button).ToList()
        ' An example of doing something with the buttons:
        For Each bn In theButtons
            AddHandler bn.Click, AddressOf bnClick
        Next
        ' Other initialisation code.
    End Sub

    Private Sub Form1_Shown(sender As Object, e As EventArgs) Handles MyBase.Shown
        Game()
    End Sub

    Private Sub Form1_Load(sender As Object, e As EventArgs) Handles MyBase.Load
        Init()
    End Sub
End Class
```
When you run it, it will show the form and automatically click a random button, and the click handler will show you the value of the button's .Text property (because I wanted to put *something* in the `Game()` method). You can still click the buttons to run the click handler.
Finally, try to give controls more descriptive names than things like "TheButtonPanel"—I have no idea what would be a good name in your program. |
Check your llama_index version. If you are using the current version, use:
```
from llama_index.core.node_parser import SimpleNodeParser
```
I have been facing this problem for a while now. When I add a library or package to a Flutter project, I then run
`flutter pub get`
But when I import that package, it is not available (as if it were not included in pubspec).
As far as I know, running the command
`flutter pub get`
actually downloads that package, but for some reason it is not available to be used in the project files.
I have to restart Android Studio every time to use the new library.
I tried the following but they did not help.
1. Open Android Studio in administrator mode
2. Delete the pubspec.lock file
3. Upgraded the flutter to latest version
4. Upgraded the android studio to latest version
5. Open pubspec.yaml and then press `Get Dependencies`
None of these help; I have to restart Android Studio, and then everything works.
**Additional Info**
I wonder if it has something to do with my project structure, because I created a new Flutter project and there is no such problem.
Adding a new package in a Flutter project does not work until Android Studio is restarted
|flutter|android-studio|flutter-dependencies|flutter-packages| |
null |
I am working on a Node.js application that uses Mongoose to work with a MongoDB database. Whenever I try to call any function from the MongoDB driver, such as find(), updateOne(), or deleteMany(), I get a massive error output in the terminal, but it does not provide any specific error message or useful information about the cause of the issue.
The error output appears to be a large object containing various MongoDB connection details, configurations, and function references. However, there is no clear indication of what is causing the error or how to resolve it.
Here's an example of the error output I'm receiving:
code:
```
const allForms = async (req, res) => {
  try {
    let data = await formModal.find();
    if (!data) {
      return res.status(400).send("Data is not found");
    } else {
      return res.send({ data: data });
    }
  } catch (error) {
    console.error(error.message);
    return res.status(500).json({ error: 'Internal server error' });
  }
}
```
model code:
```
const mongoose = require('mongoose');

const formSchema = mongoose.Schema({
  destination: {
    type: String,
    required: true
  },
  className: {
    type: String,
    enum: ['First', 'Second'],
    required: true
  },
  duration: {
    type: String,
    enum: ['Monthly', 'Quarterly'],
    required: true
  },
  line: {
    type: String,
    enum: ['Harbour', 'Western', 'Central'],
    required: true
  },
  student: {
    type: mongoose.Schema.Types.ObjectId,
    ref: 'Student'
  }
});

const Form = mongoose.model('form', formSchema);
module.exports = Form;
```
output:
```
},
host: 'localhost',
port: 27017,
name: 'arcs'
},
queue: [],
buffer: false,
emitter: EventEmitter {
_events: [Object: null prototype] {},
_eventsCount: 0,
_maxListeners: undefined,
[Symbol(kCapture)]: false
}
},
'$__collection': <ref *6> Collection {
collection: Collection {
s: {
db: Db { s: [Object], client: [MongoClient] },
options: {
raw: false,
useBigInt64: false,
promoteLongs: true,
promoteValues: true,
promoteBuffers: false,
ignoreUndefined: false,
bsonRegExp: false,
serializeFunctions: false,
fieldsAsRaw: {},
enableUtf8Validation: true,
readPreference: [ReadPreference]
},
namespace: MongoDBCollectionNamespace {
db: 'arcs',
collection: 'students'
},
pkFactory: { createPk: [Function: createPk] },
readPreference: ReadPreference {
mode: 'primary',
tags: undefined,
hedge: undefined,
maxStalenessSeconds: undefined,
minWireVersion: undefined
},
bsonOptions: {
raw: false,
useBigInt64: false,
promoteLongs: true,
promoteValues: true,
promoteBuffers: false,
ignoreUndefined: false,
bsonRegExp: false,
serializeFunctions: false,
fieldsAsRaw: {},
enableUtf8Validation: true
},
readConcern: undefined,
writeConcern: undefined
me: 'student',
_closed: false,
opts: {
autoIndex: true,
autoCreate: true,
autoSearchIndex: false,
schemaUserProvidedOptions: {},
capped: false,
Promise: undefined,
'$wasForceClosed': undefined
},
name: 'students',
collectionName: 'students',
conn: <ref *3> NativeConnection {
base: <ref *1> Mongoose {
connections: [ [Circular *3] ],
nextConnectionId: 1,
models: <ref *4> {
student: [Circular *2],
form: [Function],
admin: [Function]
},
events: EventEmitter {
_events: [Object: null prototype] {},
_eventsCount: 0,
_maxListeners: undefined,
[Symbol(kCapture)]: false
},
__driver: {
Collection: [Function: NativeCollection],
Connection: [Function]
},
options: {
pluralization: true,
autoIndex: true,
autoCreate: true,
autoSearchIndex: false,
[Symbol(mongoose:default)]: true
},
_pluralize: [Function: pluralize],
Schema: [Function: Schema] {
reserved: [Object: null prototype],
Types: [Object],
ObjectId: [Function]
},
model: [Function (anonymous)],
plugins: [ [Array], [Array], [Array], [Array] ],
default: [Circular *1],
mongoose: [Circular *1],
cast: [Function: cast],
STATES: [Object: null prototype] {
'0': 'disconnected',
'1': 'connected',
'2': 'connecting',
'3': 'disconnecting',
'99': 'uninitialized',
disconnected: 0,
connected: 1,
connecting: 2,
disconnecting: 3,
uninitialized: 99
},
setDriver: [Function: setDriver],
set: [Function (anonymous)],
get: [Function (anonymous)],
createConnection: [Function (anonymous)],
connect: [AsyncFunction: connect],
disconnect: [AsyncFunction: disconnect],
startSession: [Function (anonymous)],
pluralize: [Function (anonymous)],
deleteModel: [Function (anonymous)],
modelNames: [Function (anonymous)],
plugin: [Function (anonymous)],
version: '8.2.4',
Mongoose: [Function: Mongoose],
SchemaType: [Function: SchemaType] {
cast: [Function: cast],
set: [Function: set],
get: [Function (anonymous)],
_isRef: [Function (anonymous)],
checkRequired: [Function (anonymous)],
CastError: [class CastError extends MongooseError],
ValidatorError: [class ValidatorError extends MongooseError]
},
SchemaTypes: {
Array: [Function],
BigInt: [Function],
Boolean: [Function],
Buffer: [Function],
Date: [Function],
Decimal: [Function],
Decimal128: [Function],
DocumentArray: [Function],
Map: [Function],
Mixed: [Function],
Number: [Function],
ObjectId: [Function],
String: [Function],
Subdocument: [Function],
UUID: [Function],
Oid: [Function],
Object: [Function],
Bool: [Function],
ObjectID: [Function]
},
VirtualType: [Function: VirtualType],
Types: {
Array: [Function: MongooseArray],
Buffer: [Function],
Embedded: [Function],
Document: [Function],
DocumentArray: [Function: MongooseDocumentArray],
Decimal128: [class Decimal128 extends BSONValue],
ObjectId: [Function],
Map: [class MongooseMap extends Map],
Subdocument: [Function: Subdocument],
UUID: [class UUID extends Binary]
},
Query: [Function: Query] { base: [Object], 'use$geoWithin': true },
Model: [Function: Model] {
exists: [Function: exists],
discriminator: [Function (anonymous)],
_events: undefined,
_eventsCount: 0,
_maxListeners: undefined,
setMaxListeners: [Function: setMaxListeners],
getMaxListeners: [Function: getMaxListeners],
emit: [Function: emit],
addListener: [Function: addListener],
on: [Function: addListener],
prependListener: [Function: prependListener],
once: [Function: once],
prependOnceListener: [Function: prependOnceListener],
removeListener: [Function: removeListener],
off: [Function: removeListener],
removeAllListeners: [Function: removeAllListeners],
listeners: [Function: listeners],
rawListeners: [Function: rawListeners],
listenerCount: [Function: listenerCount],
eventNames: [Function: eventNames],
init: [Function: init],
createCollection: [AsyncFunction: createCollection],
syncIndexes: [AsyncFunction: syncIndexes],
createSearchIndex: [AsyncFunction: createSearchIndex],
updateSearchIndex: [AsyncFunction: updateSearchIndex],
dropSearchIndex: [AsyncFunction: dropSearchIndex],
diffIndexes: [AsyncFunction: diffIndexes],
cleanIndexes: [AsyncFunction: cleanIndexes],
listIndexes: [AsyncFunction: listIndexes],
ensureIndexes: [AsyncFunction: ensureIndexes],
createIndexes: [AsyncFunction: createIndexes],
translateAliases: [Function: translateAliases],
deleteOne: [Function: deleteOne],
deleteMany: [Function: deleteMany],
find: [Function: find],
findById: [Function: findById],
findOne: [Function: findOne],
estimatedDocumentCount: [Function: estimatedDocumentCount],
countDocuments: [Function: countDocuments],
distinct: [Function: distinct],
where: [Function: where],
'$where': [Function: $where],
findOneAndUpdate: [Function (anonymous)],
findByIdAndUpdate: [Function (anonymous)],
findOneAndDelete: [Function (anonymous)],
findByIdAndDelete: [Function (anonymous)],
findOneAndReplace: [Function (anonymous)],
create: [AsyncFunction: create],
watch: [Function (anonymous)],
startSession: [Function (anonymous)],
insertMany: [AsyncFunction: insertMany],
'$__insertMany': [Function (anonymous)],
bulkWrite: [AsyncFunction: bulkWrite],
bulkSave: [AsyncFunction: bulkSave],
applyDefaults: [Function: applyDefaults],
castObject: [Function: castObject],
buildBulkWriteOperations: [Function: buildBulkWriteOperations],
hydrate: [Function (anonymous)],
updateMany: [Function: updateMany],
updateOne: [Function: updateOne],
replaceOne: [Function: replaceOne],
aggregate: [Function: aggregate],
validate: [AsyncFunction: validate],
populate: [AsyncFunction: populate],
compile: [Function: compile],
__subclass: [Function: subclass],
recompileSchema: [Function: recompileSchema],
inspect: [Function (anonymous)],
_applyQueryMiddleware: [Function: _applyQueryMiddleware],
[Symbol(nodejs.util.inspect.custom)]: [Function (anonymous)]
},
Document: [Function: Document] {
_events: undefined,
_eventsCount: 0,
_maxListeners: undefined,
setMaxListeners: [Function: setMaxListeners],
getMaxListeners: [Function: getMaxListeners],
emit: [Function: emit],
addListener: [Function: addListener],
on: [Function: addListener],
prependListener: [Function: prependListener],
once: [Function: once],
prependOnceListener: [Function: prependOnceListener],
removeListener: [Function: removeListener],
off: [Function: removeListener],
removeAllListeners: [Function: removeAllListeners],
listeners: [Function: listeners],
rawListeners: [Function: rawListeners],
listenerCount: [Function: listenerCount],
eventNames: [Function: eventNames],
ValidationError: [class ValidationError extends MongooseError]
},
ObjectId: [Function: SchemaObjectId] {
schemaName: 'ObjectId',
defaultOptions: {},
get: [Function (anonymous)],
set: [Function: set],
setters: [],
_checkRequired: [Function (anonymous)],
_cast: [Function: castObjectId],
cast: [Function: cast],
_defaultCaster: [Function (anonymous)],
checkRequired: [Function (anonymous)]
},
isValidObjectId: [Function (anonymous)],
isObjectIdOrHexString: [Function (anonymous)],
syncIndexes: [Function (anonymous)],
Decimal128: [Function: SchemaDecimal128] {
schemaName: 'Decimal128',
defaultOptions: {},
_cast: [Function: castDecimal128],
set: [Function: set],
setters: [],
get: [Function (anonymous)],
cast: [Function: cast],
_defaultCaster: [Function (anonymous)],
_checkRequired: [Function (anonymous)],
checkRequired: [Function (anonymous)]
},
Mixed: [Function: SchemaMixed] {
schemaName: 'Mixed',
UUID: [Getter],
MongoBulkWriteError: [Getter],
ClientEncryption: [Getter],
ChangeStreamCursor: [Getter],
MongoAPIError: [Getter],
MongoAWSError: [Getter],
MongoAzureError: [Getter],
MongoBatchReExecutionError: [Getter],
MongoChangeStreamError: [Getter],
MongoCompatibilityError: [Getter],
MongoCursorExhaustedError: [Getter],
},
mquery: [Function: Query] {
permissions: [Object],
_isPermitted: [Function (anonymous)],
canMerge: [Function (anonymous)],
setGlobalTraceFunction: [Function (anonymous)],
utils: [Object],
env: [Object],
Collection: [class NodeCollection extends Collection],
BaseCollection: [Function]
},
sanitizeFilter: [Function: sanitizeFilter],
trusted: [Function: trusted],
skipMiddlewareFunction: [Function: skipWrappedFunction],
overwriteMiddlewareResult: [Function: overwriteResult]
},
collections: {
students: [Circular *6],
forms: Collection {
collection: [Collection],
Promise: [Function: Promise],
modelName: 'form',
_closed: false,
opts: [Object],
name: 'forms',
collectionName: 'forms',
conn: [Circular *3],
queue: [],
buffer: false,
emitter: [EventEmitter]
},
admins: Collection {
collection: [Collection],
Promise: [Function: Promise],
modelName: 'admin',
_closed: false,
opts: [Object],
name: 'admins',
collectionName: 'admins',
conn: [Circular *3],
queue: [],
buffer: false,
emitter: [EventEmitter]
}
},
models: <ref *4> {
student: [Circular *2],
form: [Function: model] {
hooks: [Kareem],
base: [Mongoose],
modelName: 'form',
model: [Function: model],
db: [Circular *3],
discriminators: undefined,
events: [EventEmitter],
'$appliedMethods': true,
'$appliedHooks': true,
_middleware: [Kareem],
'$__insertMany': [Function (anonymous)],
schema: [Schema],
collection: [Collection],
'$__collection': [Collection],
Query: [Function],
'$init': [Promise],
'$caught': true,
[Symbol(mongoose#Model)]: true
},
admin: [Function: model] {
hooks: [Kareem],
base: [Mongoose],
modelName: 'admin',
model: [Function: model],
db: [Circular *3],
discriminators: undefined,
events: [EventEmitter],
'$appliedMethods': true,
'$appliedHooks': true,
_middleware: [Kareem],
'$__insertMany': [Function (anonymous)],
schema: [Schema],
collection: [Collection],
'$__collection': [Collection],
Query: [Function],
'$init': [Promise],
'$caught': true,
[Symbol(mongoose#Model)]: true
}
},
config: {},
replica: false,
options: null,
otherDbs: [],
relatedDbs: {},
states: [Object: null prototype] {
'0': 'disconnected',
'1': 'connected',
'2': 'connecting',
'3': 'disconnecting',
'99': 'uninitialized',
disconnected: 0,
connected: 1,
connecting: 2,
disconnecting: 3,
uninitialized: 99
},
_readyState: 1,
_closeCalled: undefined,
_hasOpened: true,
plugins: [],
id: 0,
_queue: [],
_listening: false,
_connectionOptions: { driverInfo: { name: 'Mongoose', version: '8.2.4' } },
_connectionString: 'mongodb://localhost:27017/arcs',
client: <ref *5> MongoClient {
_events: [Object: null prototype] {
serverDescriptionChanged: [Function (anonymous)]
},
_eventsCount: 1,
_maxListeners: 0,
mongoLogger: MongoLogger {
error: [Function: bound log],
warn: [Function: bound log],
info: [Function: bound log],
debug: [Function: bound log],
trace: [Function: bound log],
componentSeverities: [Object],
maxDocumentLength: 1000,
logDestination: [Object]
},
s: {
url: 'mongodb://localhost:27017/arcs',
bsonOptions: [Object],
namespace: [MongoDBNamespace],
hasBeenClosed: false,
sessionPool: [ServerSessionPool],
activeSessions: Set(0) {},
options: [Getter],
readConcern: [Getter],
writeConcern: [Getter],
readPreference: [Getter],
isMongoClient: [Getter]
},
topology: Topology {
_events: [Object: null prototype],
_eventsCount: 26,
_maxListeners: undefined,
client: [Circular *5],
selectServerAsync: [Function (anonymous)],
s: [Object],
[Symbol(kCapture)]: false,
[Symbol(waitQueue)]: [List]
},
connectionLock: undefined,
[Symbol(kCapture)]: false,
[Symbol(options)]: [Object: null prototype] {
hosts: [Array],
compressors: [Array],
connectTimeoutMS: 30000,
dbName: 'arcs',
directConnection: false,
driverInfo: [Object],
enableUtf8Validation: true,
forceServerObjectId: false,
heartbeatFrequencyMS: 10000,
loadBalanced: false,
localThresholdMS: 15,
maxConnecting: 2,
maxIdleTimeMS: 0,
maxPoolSize: 100,
minPoolSize: 0,
minHeartbeatFrequencyMS: 500,
monitorCommands: false,
noDelay: true,
pkFactory: [Object],
raw: false,
readPreference: [ReadPreference],
retryReads: true,
retryWrites: true,
serverMonitoringMode: 'auto',
serverSelectionTimeoutMS: 30000,
socketTimeoutMS: 0,
srvMaxHosts: 0,
srvServiceName: 'mongodb',
waitQueueTimeoutMS: 0,
zlibCompressionLevel: 0,
userSpecifiedAuthSource: false,
userSpecifiedReplicaSet: false,
mongoLoggerOptions: [Object],
metadata: [Object],
[Symbol(@@mdb.enableMongoLogger)]: false
}
},
'$initialConnection': Promise { [Circular *3] },
db: Db {
s: {
options: [Object],
readPreference: [ReadPreference],
bsonOptions: [Object],
pkFactory: [Object],
readConcern: undefined,
writeConcern: undefined,
namespace: [MongoDBNamespace]
},
client: <ref *5> MongoClient {
_events: [Object: null prototype],
_eventsCount: 1,
_maxListeners: 0,
mongoLogger: [MongoLogger],
s: [Object],
topology: [Topology],
connectionLock: undefined,
[Symbol(kCapture)]: false,
[Symbol(options)]: [Object: null prototype]
}
},
host: 'localhost',
port: 27017,
name: 'arcs'
},
queue: [],
buffer: false,
emitter: EventEmitter {
_events: [Object: null prototype] {},
_eventsCount: 0,
_maxListeners: undefined,
[Symbol(kCapture)]: false
}
},
Query: [Function (anonymous)] {
base: {
toConstructor: [Function: toConstructor],
setOptions: [Function (anonymous)],
collection: [Function: collection],
collation: [Function (anonymous)],
'$where': [Function (anonymous)],
where: [Function (anonymous)],
equals: [Function: equals],
eq: [Function: eq],
or: [Function: or],
nor: [Function: nor],
and: [Function: and],
gt: [Function (anonymous)],
gte: [Function (anonymous)],
lt: [Function (anonymous)],
lte: [Function (anonymous)],
ne: [Function (anonymous)],
in: [Function (anonymous)],
nin: [Function (anonymous)],
all: [Function (anonymous)],
regex: [Function (anonymous)],
size: [Function (anonymous)],
maxDistance: [Function (anonymous)],
minDistance: [Function (anonymous)],
mod: [Function (anonymous)],
exists: [Function (anonymous)],
elemMatch: [Function (anonymous)],
within: [Function: within],
box: [Function (anonymous)],
polygon: [Function (anonymous)],
circle: [Function (anonymous)],
near: [Function: near],
intersects: [Function: intersects],
geometry: [Function: geometry],
select: [Function: select],
slice: [Function (anonymous)],
sort: [Function (anonymous)],
limit: [Function (anonymous)],
skip: [Function (anonymous)],
batchSize: [Function (anonymous)],
comment: [Function (anonymous)],
maxTimeMS: [Function (anonymous)],
maxTime: [Function (anonymous)],
hint: [Function (anonymous)],
j: [Function: j],
slaveOk: [Function (anonymous)],
setReadPreference: [Function (anonymous)],
read: [Function (anonymous)],
r: [Function (anonymous)],
readConcern: [Function (anonymous)],
tailable: [Function (anonymous)],
w: [Function: writeConcern],
writeConcern: [Function: writeConcern],
wTimeout: [Function: wtimeout],
wtimeout: [Function: wtimeout],
merge: [Function (anonymous)],
find: [Function (anonymous)],
_find: [AsyncFunction: _find],
cursor: [Function (anonymous)],
findOne: [Function (anonymous)],
_findOne: [AsyncFunction: _findOne],
count: [Function (anonymous)],
_count: [AsyncFunction: _count],
distinct: [Function (anonymous)],
_distinct: [AsyncFunction: _distinct],
updateMany: [Function: updateMany],
_updateMany: [AsyncFunction (anonymous)],
updateOne: [Function: updateOne],
_updateOne: [AsyncFunction (anonymous)],
replaceOne: [Function: replaceOne],
_replaceOne: [AsyncFunction (anonymous)],
deleteOne: [Function (anonymous)],
_deleteOne: [AsyncFunction (anonymous)],
deleteMany: [Function (anonymous)],
_deleteMany: [AsyncFunction (anonymous)],
findOneAndUpdate: [Function (anonymous)],
_findOneAndUpdate: [AsyncFunction (anonymous)],
findOneAndDelete: [Function (anonymous)],
findOneAndRemove: [Function (anonymous)],
_findOneAndRemove: [AsyncFunction (anonymous)],
setTraceFunction: [Function (anonymous)],
exec: [AsyncFunction: exec],
then: [AsyncFunction (anonymous)],
selected: [Function: selected],
selectedInclusively: [Function: selectedInclusively],
selectedExclusively: [Function: selectedExclusively],
_mergeUpdate: [Function (anonymous)],
_optionsForExec: [Function (anonymous)],
_fieldsForExec: [Function (anonymous)],
_updateForExec: [Function (anonymous)],
_ensurePath: [Function (anonymous)],
_validate: [Function (anonymous)]
}
},
'$init': Promise { undefined, catch: [Function (anonymous)] },
'$caught': true,
[Symbol(mongoose#Model)]: true
}
}
Node.js v20.10.0
[nodemon] app crashed - waiting for file changes before starting...
```
I have tried various troubleshooting steps, such as reinstalling the MongoDB driver, updating Node.js, and checking my connection string, but none of these have resolved the issue.
When calling Mongoose functions, I expect to either receive the desired result or a clear and descriptive error message that can help me identify and resolve the issue.
Can anyone guide me on how to resolve this problem and get meaningful error messages when calling MongoDB functions? Any help would be greatly appreciated. |
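For reference, a minimal sketch of logging only selected error fields (generic Node.js, not specific to this codebase), which keeps the terminal readable instead of dumping the whole object graph:

```javascript
// Logging the raw error object can dump every attached property (for a
// Mongoose error, the whole connection graph). Logging selected fields
// keeps the output short and readable.
function logError(err) {
  if (err instanceof Error) {
    console.error(`${err.name}: ${err.message}`);
  } else {
    // Non-Error throwables: stringify a short summary instead.
    console.error('Non-Error thrown:', String(err));
  }
}

logError(new Error('example failure'));   // Error: example failure
logError({ some: 'object' });             // Non-Error thrown: [object Object]
```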
Getting a Large Error Output When Calling MongoDB/Mongoose Functions Without an Error Message |
|node.js|mongodb|express|mongoose| |
null |
It looks like the issue might be related to the way Istio is handling the redirection of URLs in the VirtualService configuration. One thing you can try is to explicitly set the path prefix in the rewrite field for the /login path to /airflow/login instead of just /login. This way, when the redirection happens, it will go to the correct path within your airflow application.
Here is an updated version of your VirtualService configuration with this change:
```
- match:
  - uri:
      prefix: /login
  rewrite:
    uri: /airflow/login
  route:
  - destination:
      host: airflow-service
      port:
        number: 8080
```
With this change, the redirection should now point to /airflow/login instead of just /login, which should resolve the 404 error you are facing.
Additionally, you may also want to check if the base URL in your airflow.cfg file (base_url = https://myapp.example.com/airflow/) matches the path prefixes in your VirtualService configuration to ensure consistency.
I hope this helps resolve the redirection issue and allows you to access the airflow UI correctly. |
Here is what I implemented from last time. This code is working for most of my test cases, but it is not working for these cases:
```
#include <climits>
#include <cmath>
#include <iostream>
#include <limits>
#include <stdexcept>
#include <string>
#include <vector>

long long ConvertStringToNumber(std::string number) {
    long long int finalNumber = 0;
    bool isNegative = false;
    int i = 0;
    if (number.at(0) == '-') {
        isNegative = true;
        i++;
    }
    for (; i < number.length(); i++) {
        if (number.at(i) < '0' || number.at(i) > '9') {
            throw std::invalid_argument("Some error...");
        }
        if (finalNumber >
            (std::numeric_limits<long long int>::max() - (number.at(i) - '0')) / 10) {
            throw std::overflow_error("Input string contains a number that is too large!!!");
            throw std::range_error("Number is too large");
        }
        finalNumber = finalNumber * 10 + (number.at(i) - '0');
    }
    return isNegative ? -finalNumber : finalNumber;
}

std::vector<long long int> NumbersInString(std::string s) {
    std::vector<long long int> numbers;
    std::string currentNumber = "";
    bool isNumber = true;
    for (int i = 0; i < s.length(); i++) {
        char c = s.at(i);
        if ((c >= '0' && c <= '9') || c == '-') {
            currentNumber += c;
            isNumber = true;
        } else if (isNumber) {
            if (!(currentNumber.size() == 0) &&
                ((currentNumber != "-" && currentNumber.at(0) == '-') ||
                 (currentNumber.at(0) >= '0' && currentNumber.at(0) <= '9'))) {
                if (i == s.length() || ((s.at(i) < 'a' || s.at(i) > 'z') &&
                                        (s.at(i) < 'A' || s.at(i) > 'Z'))) {
                    try {
                        long long int number = ConvertStringToNumber(currentNumber);
                        numbers.push_back(number);
                    } catch (std::domain_error &k) {
                        // skip it
                    }
                }
            }
            currentNumber = "";
            isNumber = false;
        }
    }
    if (isNumber && !currentNumber.empty()) {
        try {
            long long int number = ConvertStringToNumber(currentNumber);
            numbers.push_back(number);
        } catch (std::domain_error &e) {
        }
        // skip again
    }
    return numbers;
}

int main() {
    std::string input;
    std::cout << "Enter a string: ";
    std::getline(std::cin, input);
    try {
        std::vector<long long int> numbers = NumbersInString(input);
        if (numbers.empty()) {
            std::cout << "The entered string does not contain any numbers!\n";
        } else {
            std::cout << "Numbers within the string: ";
            for (long long int number : numbers) {
                std::cout << number << " ";
            }
            std::cout << std::endl;
        }
    } catch (std::overflow_error &e) {
        std::cout << "PROBLEM: " << e.what() << std::endl;
    } catch (std::range_error &k) {
        std::cout << "Exception with message:" << k.what();
    }
    return 0;
}
```
Can someone please explain and help with these cases? Here is the output:
Enter a string: -123 123-123 123- 123
terminate called after throwing an instance of 'std::invalid_argument'
what(): Some error...
Aborted
=== Code Exited With Errors ===
Here is the correct output:
-123 123-123 123- 123
and
Enter a string: Ovo je 123456789012345678901234563 -90 test.
PROBLEM: Input string contains a number that is too large!!!
=== Code Execution Successful ===
But output needs to be: Number too big!
Can someone explain how to catch the exception and print a different message instead?
|
There are a lot of solutions you could implement; based on my experience, **working with Transform's Tags is way easier** than anything else.
What I would do is create a **Player tag** and *assign it to the* **Player prefab** that you mentioned.
(I just saw you did it so you are already good to go for the next step)
Then, I would **create a script that handles the tracking of the players**, it is *way more efficient than having it on every single zombie*.
```csharp
GameObject[] playerArray;

private void Start() {
    InvokeRepeating(nameof(GetPlayers), 0, 1); // not optimized but not expensive, easy way of updating the players reference
}

private void GetPlayers() { // You must call this function when a player joins or exits the lobby
    playerArray = GameObject.FindGameObjectsWithTag("Player");
}

public GameObject FindClosestPlayerTo(Transform zombie) {
    GameObject closestPlayer = null;
    float distance = Mathf.Infinity;
    foreach (GameObject player in playerArray) {
        if (player == null) continue; // maybe a player quits?
        Vector3 difference = player.transform.position - zombie.position;
        float currentDistance = difference.sqrMagnitude;
        if (currentDistance < distance) {
            closestPlayer = player;
            distance = currentDistance;
        }
    }
    return closestPlayer;
}
```
With this done, from your zombie script you could do:
```csharp
private void FindTarget() { // You should call this every second using InvokeRepeating, just like before
    GameObject closestPlayer = referenceToScript.FindClosestPlayerTo(transform);
    if (closestPlayer != null)
        referenceAgent.SetDestination(closestPlayer.transform.position);
}
```
And there you go, easy zombie system setup. |
I am working on FlashCards app and I want to use json-ld for importing/exporting data from the app. I created simple structure but I am not sure if what I have done is valid and the links between the data are properly set.
1. I am not sure if the link between `FlashCardSide:language` and `Dictionary:availableLanguage` is actually created.
2. `FlashCard` is of type `Thing`, and I have `FlashCard:contains`, which is supposed to not be a valid property, yet validation still passes.
3. How should I define the types and properties that are not from schema.org, such as `ex:Dictionary` and `ex:FlashCardSide`?
4. Should I put some JSON-LD definitions on the site, something like `https://example.com/ontology.jsonld`?
5. How should `acceptedValue` be defined?
Here is sample of the json-ld:
```
{
"@context": {
"ex": "https://example.com/",
"schema": "http://schema.org/",
"id": "@id",
"type": "@type",
"Dictionary": "ex:Dictionary",
"Language": "schema:Language",
"FlashCard": "schema:Thing",
"FlashCardSide": "ex:FlashCardSide",
"availableLanguage": "schema:availableLanguage",
"name": "schema:name",
"value": "schema:value",
"description": "schema:description",
"isPartOf": "schema:isPartOf",
"itemListElement": "schema:itemListElement"
},
"id": "ex:dictionary/0117649e-f5af-486e-9d4d-756c335f7e53",
"type": "Dictionary",
"availableLanguage": [
{
"id": "ex:language/norwegian",
"type": "Language",
"name": "norsk"
},
{
"id": "ex:language/english",
"type": "Language",
"name": "english"
}
],
"itemListElement": [
{
"id": "ex:flashcard/0117649e-f5af-486e-9d4d-756c335f7e51",
"type": "FlashCard",
"contains": [
{
"type": "FlashCardSide",
"value": "hund",
"isPartOf": "ex:language/norwegian",
"acceptedValue": ["random-123"],
"description": "<p>additional details</p>"
},
{
"type": "FlashCardSide",
"value": "dog",
"isPartOf": "ex:language/english",
"acceptedValue": ["random-123"],
"description": "<p>additional details</p>"
}
]
},
{
"id": "ex:flashcard/0117649e-f5af-486e-9d4d-756c335f7e52",
"type": "FlashCard",
"contains": [
{
"type": "FlashCardSide",
"value": "eple",
"isPartOf": "ex:language/norwegian",
"acceptedValue": ["random-321"],
"description": "<p>additional details</p>"
},
{
"type": "FlashCardSide",
"value": "apple",
"isPartOf": "ex:language/english",
"acceptedValue": ["random-321"],
"description": "<p>additional details</p>"
}
]
}
]
}
```
I have tested it with both: https://validator.schema.org/ and https://json-ld.org/playground/
and it appears to be valid.
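On points 2–4: JSON-LD processors silently drop terms that are not mapped in the context rather than rejecting them, which is why `contains` still validates even though it is not a real `Thing` property. One approach (a sketch; the `ex:` IRIs are assumptions, not a published ontology) is to map every custom term explicitly in the context, and optionally host that context at a URL like the `ontology.jsonld` you mention so the terms are dereferenceable:

```json
{
  "@context": {
    "ex": "https://example.com/ontology#",
    "schema": "http://schema.org/",
    "Dictionary": "ex:Dictionary",
    "FlashCard": "ex:FlashCard",
    "FlashCardSide": "ex:FlashCardSide",
    "contains": { "@id": "ex:contains", "@container": "@set" },
    "acceptedValue": { "@id": "ex:acceptedValue", "@container": "@set" },
    "isPartOf": { "@id": "schema:isPartOf", "@type": "@id" }
  }
}
```

With every term mapped, expanding the document in the JSON-LD playground keeps `contains` and `acceptedValue` as triples instead of dropping them, which makes round-trip import/export lossless.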
|
How to manage Hibernate transactions in a standalone application
|java|hibernate|entitymanager| |
When I run `pod --version` from vscode terminal or macos integrated terminal I get version 1.13.0 which is a "correct" version for me, but when I run the same script as a vscode task I get 1.11.3.
Also `which pod` print `/Users/__/.rbenv/shims/pod` from terminal but `/usr/local/bin/pod` when running as a task.
Why is there such a difference?
vscode uses different cocoapods version when running a task |
|visual-studio-code|vscode-tasks| |
Given a matrix like the following:
```
                      A      User1  User2  …  User9
2015-01-01 01:00:00   8,72   58     3         2
2015-01-01 02:00:00   24,46  57     4         3
2015-01-01 03:00:00   37,02  62     4         2
…                     42,89  59     3         2
2015-12-31 23:00:00   46,80  59     4         2
```
I have a linear optimization problem where I have to distribute the amount in column A among a variable number of users, so that after the distribution I have as little left over as possible, and never give each user more than he can receive (more than what is in his column).
We have the following objective function, the following constraints, and the following bounds:
target_function = minimize[(User1 – beta1*A)+(User2-beta2*A)+…+(User9-beta9*A)]
constraint1 -> (User1 – beta1*A) >= 0
constraint2 -> (User2 – beta2*A) >= 0
…
constraint9 -> (User9 – beta9*A) >= 0
beta1+beta2+…+beta9 = 1
beta1 = (0,1); beta2 = (0,1)…beta9=(0,1)
I have created a program that solves the above problem on an hourly, row-by-row basis, so I run 8,760 optimization problems, and get beta for each user and each hour. This is the optimal solution for my problem; the one that optimizes the allocation and minimizes the surplus.
However, I need to solve the same problem as an annual optimization. That is, I need to obtain a unique beta for each user for the whole year, which minimizes the annual surplus; this is the best that could be done with Excel Solver, and I need to compare my solution with this one so that it can be seen that in each analysis my solution is better (if I have 3 users, I would have 3 betas).
I have obtained the beta for **minimize[(sum(User1)-beta1*sum(A))+(sum(User2)-beta2*sum(A))+…+(sum(User9)-beta9*sum(A))]**, but the Solver solution over the matrices and vectors is better than the one over the total sums.
Here is my attempt, based on the program that solves the hourly optimization.
```python
from datetime import datetime

import numpy as np
import pandas as pd
import os
import scipy.optimize as opt

ruta = 'C://Users//F1K3G//Documents//Py//PRUEBAS DE OPTIMIZACIÓN//'

def f_objetivo(x):
    funcion_objetivo = 0
    for i in range(0, len(DF.columns)-2):
        funcion_objetivo = funcion_objetivo + (DF[DF.columns[i]]-x(i-1)*DF[DF.columns[0]])
    return funcion_objetivo

def g1(x): return ((DF[DF.columns[1]]-x(0)*DF[DF.columns[0]]))
def g2(x): return ((DF[DF.columns[2]]-x(1)*DF[DF.columns[0]]))
def g3(x): return ((DF[DF.columns[3]]-x(2)*DF[DF.columns[0]]))
def g4(x): return ((DF[DF.columns[4]]-x(2)*DF[DF.columns[0]]))
def g5(x): return ((DF[DF.columns[5]]-x(4)*DF[DF.columns[0]]))
def g6(x): return ((DF[DF.columns[6]]-x(5)*DF[DF.columns[0]]))
def g7(x): return ((DF[DF.columns[7]]-x(6)*DF[DF.columns[0]]))
def g8(x): return ((DF[DF.columns[8]]-x(7)*DF[DF.columns[0]]))
def g9(x): return ((DF[DF.columns[9]]-x(8)*DF[DF.columns[0]]))

def g51(x):
    restriccion_betas = 0
    for i in range(0, len(DF.columns)-2):
        restriccion_betas = restriccion_betas + x[i]
    restriccion_betas = restriccion_betas - 1
    return restriccion_betas

def optimizar():
    semilla = []
    cons = list()
    bnds = list()
    for i in range(0, len(DF.columns)-2):
        semilla.append(1/(len(DF.columns)-2))
    for i in range(0, len(DF.columns)):
        if i == (len(DF.columns)-2):
            restriccion = {'type': 'eq', 'fun': g51}
        else:
            funcion = globals()["g"+str(i+1)]
            restriccion = {'type': 'ineq', 'fun': funcion}
        cons.append(restriccion)
    cons = tuple(cons)
    for i in range(0, len(DF.columns)-2):
        bnds.append((0,1))
    bnds = tuple(bnds)
    result = opt.minimize(f_objetivo, semilla, bounds=bnds, constraints=cons)
    print(result.message)
    print(result.x)
    print(f_objetivo(result.x))
    return result.x

DF = pd.read_excel(
    ruta + 'BBDD.xlsx',
    sheet_name='CURVAS DE CONSUMO', skiprows=(0), usecols=range(0, 8), index_col=None,
    keep_default_na=False, na_values=(""))

beta = optimizar()
```
Here is my DF to simulate 24 hours:
```python
{'Unnamed: 0': {0: Timestamp('2015-01-01 01:00:00'), 1: Timestamp('2015-01-01 02:00:00'), 2: Timestamp('2015-01-01 03:00:00'), 3: Timestamp('2015-01-01 04:00:00'), 4: Timestamp('2015-01-01 05:00:00'), 5: Timestamp('2015-01-01 06:00:00'), 6: Timestamp('2015-01-01 07:00:00'), 7: Timestamp('2015-01-01 08:00:00'), 8: Timestamp('2015-01-01 09:00:00'), 9: Timestamp('2015-01-01 10:00:00'), 10: Timestamp('2015-01-01 11:00:00'), 11: Timestamp('2015-01-01 12:00:00'), 12: Timestamp('2015-01-01 13:00:00'), 13: Timestamp('2015-01-01 14:00:00'), 14: Timestamp('2015-01-01 15:00:00'), 15: Timestamp('2015-01-01 16:00:00'), 16: Timestamp('2015-01-01 17:00:00'), 17: Timestamp('2015-01-01 18:00:00'), 18: Timestamp('2015-01-01 19:00:00'), 19: Timestamp('2015-01-01 20:00:00'), 20: Timestamp('2015-01-01 21:00:00'), 21: Timestamp('2015-01-01 22:00:00'), 22: Timestamp('2015-01-01 23:00:00'), 23: Timestamp('2015-01-01 00:00:00')},
 'A': {0: 0.0, 1: 0.0, 2: 0.0, 3: 0.0, 4: 0.0, 5: 0.0, 6: 0.0, 7: 0.0, 8: 8.717808, 9: 24.463867999999998, 10: 37.023932, 11: 42.885964, 12: 46.796772, 13: 44.611467999999995, 14: 32.705155999999995, 15: 7.388567999999999, 16: 0.23240799999999995, 17: 0.0, 18: 0.0, 19: 0.0, 20: 0.0, 21: 0.0, 22: 0.0, 23: 0.0},
 'User 1': {0: 33.702759400187, 1: 33.702759400187, 2: 34.4354280827998, 3: 39.5641088610891, 4: 39.5641088610891, 5: 41.0294462263146, 6: 43.2274522741529, 7: 56.4154885611826, 8: 57.8808259264081, 9: 57.1481572437953, 10: 61.5441693394719, 11: 59.3461632916336, 12: 58.6134946090209, 13: 60.8115006568592, 14: 61.5441693394719, 15: 65.9401814351485, 16: 72.5341995786633, 17: 73.9995369438888, 18: 76.9302116743399, 19: 71.0688622134378, 20: 71.8015308960506, 21: 72.5341995786633, 22: 57.1481572437953, 23: 35.1680967654125},
 'User 2': {0: 4, 1: 4, 2: 3, 3: 4, 4: 4, 5: 3, 6: 4, 7: 4, 8: 3, 9: 4, 10: 4, 11: 3, 12: 4, 13: 3, 14: 4, 15: 4, 16: 3, 17: 4, 18: 4, 19: 3, 20: 4, 21: 4, 22: 3, 23: 4},
 'User 3': {0: 3, 1: 2, 2: 2, 3: 2, 4: 3, 5: 2, 6: 2, 7: 2, 8: 2, 9: 3, 10: 2, 11: 2, 12: 2, 13: 2, 14: 3, 15: 1, 16: 2, 17: 3, 18: 2, 19: 2, 20: 2, 21: 2, 22: 2, 23: 2},
 'User 4': {0: 0.018, 1: 0.017, 2: 0.018, 3: 0.018, 4: 0.018, 5: 0.017, 6: 0.018, 7: 0.018, 8: 0.018, 9: 0.017, 10: 0.018, 11: 0.017, 12: 0.382, 13: 0.478, 14: 0.017, 15: 0.018, 16: 0.017, 17: 0.017, 18: 0.207, 19: 0.22, 20: 0.221, 21: 0.221, 22: 0.222, 23: 0.222},
 'User 5': {0: 12.0, 1: 10.0, 2: 10.0, 3: 10.0, 4: 10.0, 5: 10.0, 6: 10.0, 7: 10.0, 8: 10.0, 9: 15.0, 10: 18.0, 11: 20.0, 12: 21.0, 13: 22.0, 14: 21.0, 15: 22.0, 16: 21.0, 17: 26.0, 18: 22.0, 19: 18.0, 20: 19.0, 21: 18.0, 22: 19.0, 23: 18.0},
 'User 6': {0: 4, 1: 3, 2: 4, 3: 3, 4: 3, 5: 4, 6: 4, 7: 4, 8: 3, 9: 3, 10: 3, 11: 3, 12: 2, 13: 3, 14: 3, 15: 3, 16: 3, 17: 2, 18: 4, 19: 5, 20: 4, 21: 4, 22: 4, 23: 4}}
```
In Sharepoint, everytime a new email (.msg or .eml) file is added to a list, I want to open it and extract properties of the message and update some custom columns for the list item. Can you provide a high level view of doing this? |
How to automatically update a column in Sharepoint when an email item is added |
|sharepoint| |
null |
null |
null |
null |
null |
I want to use several Linux environment variables inside a SQL script that I run with this command:
```
psql -h XXX -p 5432 -U XXX -d XXX -f ./script.sql
```
The script is a succession of lines like this, but the referencing doesn't work, i.e. the value is not substituted, even though the environment variables exist:
```
alter user airflow with password '$PASSWORD';
```
Do you have a way to do this?
I tried running a script with references to Linux environment variables, but psql interprets them as literal strings.
How to use the system's environment variables in a SQL script
|sql|linux|psql| |
null |
I am unsure if you also configured the ChatAPI and forgot to mention here. But in case you haven't, you'll need to.
To do so, go to Google Chat API, then click Manage > Configuration. And use [this][1] for reference.
[1]: https://developers.google.com/workspace/chat/quickstart/gcf-app#publish-app-to-chat |
{"Voters":[{"Id":3001761,"DisplayName":"jonrsharpe"},{"Id":2395282,"DisplayName":"vimuth"},{"Id":839601,"DisplayName":"gnat"}]} |
After a lot of searching, it turned out to be just a problem in the security.yaml:
```yaml
main:
    two_factor:
        auth_form_path: 2fa_login        # The route name you have used in the routes.yaml
        check_path: 2fa_login_check      # The route name you have used in the routes.yaml
        provider: app_user_provider
        enable_csrf: true
```
I put the provider under two_factor and now it works.
I had to create a PR to a development branch from my own branch in a team project, but I mistakenly created a PR to the main branch. What can I do now? Can I redirect this PR to the dev branch somehow? (GitHub)
For me, the problem was that I had created a reusable package from an app. I renamed the app but forgot to rename an import.
The import was:
from blog.models import Post
and it had to be:
from django_blog.models import Post
Renaming the import solved the problem.
{"Voters":[{"Id":466862,"DisplayName":"Mark Rotteveel"},{"Id":9214357,"DisplayName":"Zephyr"},{"Id":354577,"DisplayName":"Chris"}],"SiteSpecificCloseReasonIds":[13]} |
React Native application is throwing the following error when building in Xcode. I am building on Mac M1.
Undefined symbols:
Linker command failed with exit code 1 (use -v to see invocation)
I tried pod install after removing the Pods folder. It doesn't seem to help.
The code builds fine on Android.
React Native : 11.3.10
Node: v18.18.0
iOS: 17
additional errors in build
vtable for facebook::jsi::JSIException, referenced from:
facebook::jsi::JSError::JSError(facebook::jsi::JSError const&) in librealm-js-ios.a[arm64][3](jsi_init.o)
vtable for facebook::jsi::JSError, referenced from:
facebook::jsi::JSError::JSError(facebook::jsi::JSError const&) in librealm-js-ios.a[arm64][3](jsi_init.o)
clang: error: linker command failed with exit code 1 (use -v to see invocation)
Tried deleting the Pods folder and the pod install.
Removed the node_modules and reinstalled all dependencies.
Nothing seems to work. |
React Native - RealmJS - Linker command failed with exit code 1 |
|ios|xcode|react-native|macos| |
null |
Here, the word pg_available_extensions is selected:

The selection is not visible
I tried to find a configuration parameter that would address this issue and could not find one. I lost hours trying to fix this, as there are behaviors associated with selected text that I use all the time. This issue makes the whole application unusable for me.
Next, the word configuration is selected with a blue background:
![enter image description here][1]
[1]: https://i.stack.imgur.com/7Y9ed.png |
To correctly implement client-side authentication checks with a JWT token stored in `sessionStorage` (or any client-side storage), you need to ensure that all interactions with `sessionStorage` happen in the browser, after the component has mounted. This is what `useEffect` allows you to do. In the following code I have improved the approach a little.
```js
import { useEffect, useState } from 'react';
import { useRouter } from 'next/navigation';
import { valJwt } from '@/libs/jwtSec';

export default function isAuth(Component) {
  return function AuthenticatedComponent(props) {
    const router = useRouter();
    const [isAuthenticated, setIsAuthenticated] = useState(false);
    const [isLoading, setIsLoading] = useState(true);

    useEffect(() => {
      const token = sessionStorage.getItem("token_test");
      if (!token) {
        // No token found, redirect to login or home
        router.push('/');
      } else {
        // Validate the JWT token
        const isValidToken = valJwt(token);
        setIsAuthenticated(isValidToken);
        if (!isValidToken) {
          // Token is invalid, redirect to login or home
          router.push('/');
        }
      }
      setIsLoading(false);
    }, [router]);

    // Show a loading state, or nothing, or a skeleton component until we know the auth state
    if (isLoading) {
      return <div>Loading...</div>; // Customize your loading component
    }

    // If the token is valid, render the component passed into this HOC
    if (isAuthenticated) {
      return <Component {...props} />;
    }

    // Optionally, render nothing or some error state if not authenticated, but you should be redirected already
    return null;
  };
}
``` |
I have this page component for each product's information:
```
export default function Product() {
  const { data } = useContext(ShopContext);
  const { id } = useParams();

  if (!data) {
    return <div>Loading product...</div>;
  }

  const product = data.find((item) => item.id === id);

  return (
    <div>
      {product.title}
    </div>
  );
}
```
Somehow product is undefined, though data and id can be logged to the console and their values are available. I made sure of it like this:
```
export default function Product() {
  const { data } = useContext(ShopContext);
  const { id } = useParams();

  if (!data) {
    return <div>Loading product...</div>;
  }

  console.log(id);

  return (
    <div>
      {data.map(item => (
        <div key={item.id}>
          {item.title}
        </div>
      ))}
    </div>
  );
}
```
I really can't find a solution when I don't even understand what's going on, but I tried it this way too and still nothing shows on the page (the URL is correct and the loading div is displayed):
```
export default function Product() {
  const { data } = useContext(ShopContext);
  const { id } = useParams();

  if (!data) {
    return <div>Loading product...</div>;
  }

  return (
    <div>
      {data.map(item => (
        <div key={item.id}>
          {item.id === id ? item.title : null}
        </div>
      ))}
    </div>
  );
}
```
For additional information, I'm fetching the data from [fakestoreapi.com](https://fakestoreapi.com) in the app component and it works fine in other components. Here's the fetching piece:
```
useEffect(() => {
  async function FetchData() {
    try {
      const response = await fetch("https://fakestoreapi.com/products");
      if (!response.ok) {
        throw new Error(`HTTP error: Status ${response.status}`);
      }
      let postsData = await response.json();
      postsData.sort((a, b) => {
        const nameA = a.title.toLowerCase();
        const nameB = b.title.toLowerCase();
        return nameA.localeCompare(nameB);
      });
      setData(postsData);
      setError(null);
    } catch (err) {
      setData(null);
      setError(err.message);
    } finally {
      setLoading(false);
    }
  }
  FetchData();
}, []);
```
this is the context:
```
import { createContext } from "react";

export const ShopContext = createContext({
  data: [],
  loading: true,
  error: "",
  setData: () => {},
  cart: [],
  addToCart: () => {},
  removeFromCart: () => {},
});
```
and this is its states in app component:
```
const [data, setData] = useState(null);
const [loading, setLoading] = useState(true);
const [error, setError] = useState(null);
const [cart, setCart] = useState([]);

const addToCart = (productId) => {
  ....
};

const removeFromCart = (productId) => {
  .....
};
```
I really don't know what the problem is or what to search for.
|
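One thing worth checking (an assumption about the cause, not a certainty): `useParams()` always returns route segments as strings, while the `id` values from fakestoreapi.com are numbers, so the strict `===` comparison in `data.find` can never match. A minimal sketch of the mismatch and one fix:

```javascript
// useParams() yields strings, the API yields numeric ids, so strict
// equality never matches unless one side is converted:
const products = [{ id: 1, title: "Dog Food" }];
const id = "1"; // what useParams() would give for /product/1

console.log(products.find((item) => item.id === id));         // undefined
console.log(products.find((item) => item.id === Number(id))); // the product
```

If that is the issue, `data.find((item) => item.id === Number(id))` (or comparing `String(item.id) === id`) should make `product` defined again.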
Here's one approach:
**Sample df**
```python
import pandas as pd
import numpy as np

np.random.seed(0)  # for reproducibility

df = pd.DataFrame({f'col_{i}': np.random.choice(range(1, 10),
                                                size=4,
                                                replace=False)
                   for i in range(1, 4)})

df

   col_1  col_2  col_3
0      6      3      6
1      3      7      8
2      4      8      5
3      5      6      7
```
**Code**
```python
max_value = df.values.max()

out = df.stack()
idx = pd.MultiIndex.from_arrays([out.index.get_level_values(1),
                                 out.values])
out = (out
       .set_axis(idx)
       .unstack(0)
       .reindex(range(1, max_value + 1))
       )

out

   col_1  col_2  col_3
1    NaN    NaN    NaN
2    NaN    NaN    NaN
3    3.0    3.0    NaN
4    4.0    NaN    NaN
5    5.0    NaN    5.0
6    6.0    6.0    6.0
7    NaN    7.0    7.0
8    NaN    8.0    8.0

# write away to excel with `df.to_excel`
```
**Explanation**
* Create a variable (`max_value`) for the max value in the `df` (to be used later for reindexing).
* Apply [`df.stack`](https://pandas.pydata.org/pandas-docs/stable/reference/api/pandas.DataFrame.stack.html) to get all your columns as one stacked `pd.Series` (here: `out`).
* Now, create a new index (`idx`) using [`pd.MultiIindex.from_arrays`](https://pandas.pydata.org/docs/reference/api/pandas.MultiIndex.from_arrays.html). For the first array pick the second level from the index (use: [`pd.MultiIndex.get_level_values`](https://pandas.pydata.org/pandas-docs/stable/reference/api/pandas.MultiIndex.get_level_values.html#pandas.MultiIndex.get_level_values)); for the second array, use the *values*.
* Now, set `idx` as the index with [`Series.set_axis`](https://pandas.pydata.org/pandas-docs/stable/reference/api/pandas.Series.set_axis.html), apply [`Series.unstack`](https://pandas.pydata.org/pandas-docs/stable/reference/api/pandas.Series.unstack.html), and use [`Series.reindex`](https://pandas.pydata.org/docs/reference/api/pandas.Series.reindex.html) to add the missing index values (with `np.nan` values). |
|spring|jpa|spring-data-jpa| |
null |
This is how you can query JSON to retrieve .msg emails within a specified date range.

```json
{
"search": "*",
"filter": "DateSent ge 'start_date' and DateSent le 'end_date'",
"select": "From, To, CC, BCC, DateSent, Subject, Body"
}
```
```json
{
"search": "*",
"filter": "DateSent ge '2023-01-01T08:00:00Z' and DateSent le '2023-01-02T08:00:00Z'",
"select": "From, To, CC, BCC, DateSent, Subject, Body"
}
```

I added indexing like this.
```json
{
"name": "email-index",
"fields": [
{"name": "MessageId", "type": "Edm.String", "key": true, "searchable": false},
{"name": "From", "type": "Edm.String", "searchable": true, "filterable": true, "sortable": true, "facetable": true},
{"name": "To", "type": "Collection(Edm.String)", "searchable": true, "filterable": true, "facetable": true},
{"name": "CC", "type": "Collection(Edm.String)", "searchable": true, "filterable": true, "facetable": true},
{"name": "BCC", "type": "Collection(Edm.String)", "searchable": true, "filterable": true, "facetable": true},
{"name": "DateSent", "type": "Edm.String", "searchable": true, "filterable": true, "sortable": false, "facetable": true},
{"name": "Subject", "type": "Edm.String", "searchable": true, "filterable": true, "sortable": true, "facetable": true},
{"name": "Body", "type": "Edm.String", "searchable": true, "filterable": true, "sortable": true, "facetable": true}
]
}
```
I sent a sample data using this [Doc](https://learn.microsoft.com/en-us/python/api/overview/azure/search-documents-readme?view=azure-python).
```json
{
"MessageId": "1",
"From": "sender@example.com",
"To": ["recipient1@example.com", "recipient2@example.com"],
"CC": ["cc1@example.com", "cc2@example.com"],
"BCC": ["bcc1@example.com", "bcc2@example.com"],
"DateSent": "2023-01-01T08:00:00Z",
"Subject": "Sample email 1",
"Body": "This is the body of sample email 1."
},
{
"MessageId": "2",
"From": "sender@example.com",
"To": ["recipient3@example.com"],
"CC": ["cc3@example.com"],
"BCC": [],
"DateSent": "2023-01-02T08:00:00Z",
"Subject": "Sample email 2",
"Body": "This is the body of sample email 2."
},
{
"MessageId": "3",
"From": "another_sender@example.com",
"To": ["recipient4@example.com"],
"CC": [],
"BCC": [],
"DateSent": "2023-01-03T08:00:00Z",
"Subject": "Sample email 3",
"Body": "This is the body of sample email 3."
}
```
[![enter image description here][1]][1]
[1]: https://i.stack.imgur.com/sAQFu.png |
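A side note on the index design above: the `ge`/`le` filter works on `DateSent` even though it is declared as `Edm.String`, because ISO-8601 timestamps in the same zone and layout compare correctly as plain strings. Declaring the field as `Edm.DateTimeOffset` would let the service treat it as a real timestamp (and make it sortable), but the string comparison itself is sound:

```python
# ISO-8601 timestamps sort lexicographically, so string ge/le filters
# line up with chronological order as long as the format is uniform:
a = "2023-01-01T08:00:00Z"
b = "2023-01-02T08:00:00Z"
assert a < b  # lexicographic order matches chronological order here
```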
null |
{"OriginalQuestionIds":[36526282],"Voters":[{"Id":2901002,"DisplayName":"jezrael","BindingReason":{"GoldTagBadge":"pandas"}}]} |
The code might not work now, as MS introduced a separate module for beta.
You need to install the module separately using
*Install-Module -Name Microsoft.Graph.Beta* |
I have a question that seems rather simple and obvious but for the life of me I can't make it work.
For starters my Observability stack is comprised of:
- Prometheus
- Thanos
- Loki
- Grafana
- Alertamanager
All running on kubernetes. For deployment/update I'm using Helm.
Now I want to have multiple rules files for Loki, one for each service, so that the alerts are more easily managed. Having one "*rules.yaml*" file with hundreds or thousands of lines doesn't sit right with me.
My current Loki backend & read configuration includes this:
```yaml
extraVolumeMounts:
  - name: loki-rules
    mountPath: "/etc/loki/rules/fake/loki"
  - name: service1-rules
    mountPath: "/etc/loki/rules/fake/service1"

extraVolumes:
  - name: service1-rules
    configMap:
      name: loki-service1-rules
  - name: loki-rules
    configMap:
      name: loki-rules
```
And I have both these files for the rules:
- loki-rules.yaml:
```
kind: ConfigMap
apiVersion: v1
metadata:
  name: loki-rules
  namespace: monitoring
data:
  rules.yaml: |-
    groups:
      - name: loki-alerts
        interval: 1m
        rules:
          - alert: LokiInternalHighErrorRate
            expr: sum(rate({cluster="loki"} | logfmt | level="error"[1m])) by (pod) > 1
            for: 1m
            labels:
              severity: warning
            annotations:
              summary: Loki high internal error rate
              message: Loki internal error rate over last minute is {{ $value }} for pod '{{ $labels.pod }}'
```
And I have this one:
- rules-loki-service1.yml:
```
kind: ConfigMap
apiVersion: v1
metadata:
  name: loki-service1-rules
  namespace: monitoring
data:
  service1-rules.yaml: |-
    groups:
      - name: service1_alerts
        rules:
          - alert: "[service1] - Log level set to debug {{ $labels.instance }} - Warning"
            expr: |
              sum by(instance) (count_over_time({job="service1"} |= `[DEBUG]` [1m])) > 0
            for: 2h
            labels:
              severity: warning
            annotations:
              summary: "[service1] - Log level set to debug {{ $labels.instance }}"
              description: "The number of service1 debug logs has been high for the last 2 hours on instance: {{ $labels.instance }}."
```
When I make the deployment of these rules I get no errors and everything looks good, but on Grafana's UI only the *rules.yaml* rules appear.
Does Loki not support multiple rules files, or am I missing something? Any help is greatly appreciated because, like I said, managing a single file with hundreds or thousands of alert lines seems to be a nightmare.
Thank you! |
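One thing that may be worth double-checking is the ruler's storage configuration (the snippet below is a sketch with assumed values, not taken from your deployment). With the `local` rule store, the ruler loads rule files from `<directory>/<tenant>`, where the tenant is `fake` when auth is disabled:

```
ruler:
  storage:
    type: local
    local:
      directory: /etc/loki/rules
  rule_path: /tmp/loki/rules-temp
```

If the ruler does not recurse into per-service subdirectories like `/etc/loki/rules/fake/service1` (an assumption worth verifying against your Loki version), mounting both ConfigMaps directly into `/etc/loki/rules/fake` instead — which Kubernetes allows via `subPath` or a projected volume, since the file names already differ — would be a safer layout than one subdirectory per service.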
How to structure json-ld for export/import |
|rdf|ontology|json-ld|structured-data|vocabulary| |