I am trying to reverse-engineer a device (an 8-bit microcontroller) that communicates with the PC. Each device has a serial number of two decimal digits that is used to calculate a checksum: for the same data message, the checksum changes per device. Each message starts with 0x02 and ends with 0x03, and the checksum is the two characters between the 0x04 and the 0x03. These are the messages in hex, with a translation to ASCII:
` 02 41 42 42 04 31 30 03 .ABB.10. //device ID :00
02 41 42 42 04 44 44 03 .ABB.DD. //device ID : 01
02 41 42 42 04 31 39 03 .ABB.19. //device ID : 56
02 41 42 42 04 35 34 03 .ABB.54. //device ID : 99
02 41 42 31 30 04 37 39 03 .AB10.79. //device ID :00
02 41 42 31 30 04 34 45 03 .AB10.4E. //device ID : 01
02 41 42 31 30 04 45 35 03 .AB10.E5. //device ID : 56
02 41 42 31 30 04 35 45 03 .AB10.5E. //device ID : 99
02 41 42 70 04 41 43 03 .ABp.AC. //device ID :00
02 41 42 70 04 36 31 03 .ABp.61. //device ID : 01
02 41 42 70 04 45 38 03 .ABp.E8. //device ID : 56
02 41 42 70 04 41 35 03 .ABp.A5. //device ID : 99
02 41 42 30 46 46 46 46 04 46 30 03 .AB0FFFF.F0. //device ID :00
02 41 42 30 46 46 46 46 04 35 34 03 .AB0FFFF.54. //device ID : 01
02 41 42 30 46 46 46 46 04 36 33 03 .AB0FFFF.63. //device ID : 56
02 41 42 30 46 46 46 46 04 31 45 03 .AB0FFFF.1E. //device ID : 99`
I think it is doing some kind of XOR, but I can't figure out how. Also, as you can see, the checksum changes for different device IDs (which I detailed in decimal). I need to identify how the whole checksum is calculated from the data + device ID, or at least identify how the device ID changes the checksum between devices. The microcontroller is an 8-bit 8051. I will be very grateful for any clue on how it works. |
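Here is a brute-force sketch I can run against the captures. The framing is an assumption on my part (payload bytes plus the two ASCII ID digits appended); if nothing matches, the search would need to be widened to reflected CRCs, the ID digits first, or the 0x02/0x04 framing bytes included.

```python
# Brute-force sketch: search CRC-8 parameter space against the captured frames.
# Assumption (mine, not confirmed): the checksum covers the payload bytes
# plus the two ASCII digits of the device ID.

def crc8(data: bytes, poly: int, init: int) -> int:
    """Plain (non-reflected) CRC-8 with the given polynomial and init value."""
    crc = init
    for b in data:
        crc ^= b
        for _ in range(8):
            crc = ((crc << 1) ^ poly) & 0xFF if crc & 0x80 else (crc << 1) & 0xFF
    return crc

# (payload, device ID, checksum) triples taken from the captures above
captures = [
    (b"ABB", "00", 0x10), (b"ABB", "01", 0xDD),
    (b"ABB", "56", 0x19), (b"ABB", "99", 0x54),
    (b"AB10", "00", 0x79), (b"AB10", "01", 0x4E),
    (b"ABp", "00", 0xAC), (b"ABp", "99", 0xA5),
]

# Try every polynomial/init pair; print any that explains all captures
for poly in range(256):
    for init in range(256):
        if all(crc8(data + dev_id.encode(), poly, init) == want
               for data, dev_id, want in captures):
            print(f"candidate: poly=0x{poly:02X} init=0x{init:02X}")
```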
Identify the checksum algorithm |
|algorithm|checksum|puzzle|8051|identify| |
null |
I had the same problem. It happened because of an nvm misconfiguration. Follow these steps to fix it:
- Check `node -v` in the WSL terminal; the command won't work.
- Check the installed Node versions with `nvm ls`. See `default -> N/A`? That is the issue.
- Bind a Node version to default with `nvm alias default <node_version>`.
- Check `nvm ls` again to see that it's added.
- Check `node -v` again; now it will work (if not, check in a new terminal window). |
I have this integral, the semi-standard deviation for calculating the Sortino ratio; however, I cannot seem to get an analytical solution to it.
[The mathematical expression of the integral](https://i.stack.imgur.com/8iKoS.png)
```
import sympy as sy
c = sy.Symbol('B')
x = sy.Symbol('x')
a= sy.Symbol('-sy.oo')
x = sy.Symbol('x')
f=sy.Symbol('f')
d=sy.Symbol('d')
f1 = sy.integrate(sy.sqrt((c-x)**2* f(x)*dx),(x, -sy.oo, c))
```
I get the below error
```
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
/var/folders/hc/13tll2g535x553nmqlfcyztr0000gn/T/ipykernel_70280/1712613650.py in <module>
----> 1 f1 = sy.integrate(sy.sqrt((c-x)**2* f(x)*dx),(x, -sy.oo, 0.2))
TypeError: 'Symbol' object is not callable
``` |
|math|sympy|symbolic-integration| |
Create one HTML table from multiple arrays |
I have ~250 gzipped JSONL files (150 MB each) sitting in an S3 bucket that I'm unnesting into a flattened table (Table 1). I'm then using CTAS to write the results into a table WITH format=Parquet (Table 2) so that I'm not unnesting at the query layer. My files in S3 are just subsets of a larger file; they aren't organized by date or anything similar.
Sample cols I'm importing :
- User ID (string)
- Department (string)
- Start Date (date)
Problem: I'm trying to partition the results so I can better query them, but my understanding of partitioning is that I can't partition columns that I'm loading into Table 1, because I'll get the "column repeated in partitioning columns" error, which I've been getting. During Table 1 creation, I can add partitions to unimported columns like Phone Number, Email, etc, but I don't think that helps me.
Questions:
- Am I confusing partitioning for indexing? Do I WANT to add partitions
for my unimported columns so that my imported columns are slightly
more grouped together?
- Should I be BUCKETING BY my imported columns? Tried that as well and
had no success
- Is AWS Glue something I need here?
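For context, my understanding of what the partitioned CTAS (Table 2) should look like is below. Table and column names are simplified, and the S3 location is assumed; the key constraints are that partition columns go only in `partitioned_by` (not repeated in the column definitions, which is where my "column repeated in partitioning columns" error comes from) and must come last in the SELECT:

```sql
-- Sketch, names assumed; partition at the CTAS step (Table 2)
CREATE TABLE table2
WITH (
  format = 'PARQUET',
  external_location = 's3://my-bucket/table2/',  -- assumed path
  partitioned_by = ARRAY['department']
) AS
SELECT user_id, start_date, department  -- partition column listed LAST
FROM table1;
```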
Thanks in advance for any help! |
How to understand partitions vs indexing in AWS Athena? |
|amazon-web-services|indexing|amazon-athena|partitioning| |
I'm using EAS to build my Expo app, and today I started getting this error when attempting to run an iOS build, either local or on EAS servers. I'm thinking it's an issue with Apple servers, so I'm hoping it gets fixed soon.
Output from build command:
```
✔ Select platform › iOS
✔ Using remote iOS credentials (Expo server)
If you provide your Apple account credentials we will be able to generate all necessary build credentials and fully validate them.
This is optional, but without Apple account access you will need to provide all the missing values manually and we can only run minimal validation on them.
✔ Do you want to log in to your Apple account? … yes
› Log in to your Apple Developer account to continue
✔ Apple ID: … ...
› Restoring session /Users/.../.app-store/auth/.../cookie
› Session expired Local session
› Using password for ... from your local Keychain
Learn more
✔ Logged in New session
Authentication with Apple Developer Portal failed!
Error: Cookie not in this host's domain. Cookie:developer-mdn.apple.com Request:developer.apple.com
```
Anyone else have this problem and have any ideas on how to resolve it?
I've tried removing the stored authentication cookie and signing in again. I also tried signing in on App Store Connect to see if there was any issue with my account but I couldn't find one.
**EDIT:** The issue resolved itself after a few days. It was most likely a temporary issue with the EAS servers' connection to Apple. |
I have been facing a problem for several days and have not found a solution in the various forums and documentation I have read.
I need to create a SQL query that retrieves data from an Excel file. This file has a first column that I always have to retrieve, and 38 other columns which correspond to integers. I need to retrieve only one of those 38 columns, using a user-entered parameter in the SSRS report.
For example, the Excel file has the following columns: Col1, 547, 589, 512, 596, ...
I have to get "Col1" and "512" if the user entered 512 in the parameter field
The query I wrote works in SSMS, but as soon as I use it in the report the parameter no longer works
```
DECLARE @Param AS INT = 552
DECLARE @SQL NVARCHAR(MAX)
SET @SQL = 'SELECT * FROM OPENROWSET
(''Microsoft.ACE.OLEDB.12.0'',''Excel 12.0 Xml;HDR=YES;Database=W:\Test\Test.xlsx''
,''SELECT [Year] + 1 AS [Year] , ROUND([' + cast(@Param as varchar(10)) + ']/1000,0) AS Val FROM [Sheet1$]
;'') '
DECLARE @ParmDefinition NVARCHAR(500) = '@Param NVARCHAR(20)';
EXECUTE sp_executesql @SQL
,@ParmDefinition
,@Param = @Param
```
[image1][1]
If I leave the parameter in the SQL query as stated above, it works; but when I comment out the declaration and use the SSRS parameter, it is recognized as null.
[screen2][2]
[screen3][3]
This window appears and the parameters doesn't work
[screen4][4]
Does anyone have an idea why parameter passing doesn't work (it seems to be related to the fact that it's a dynamic query with an openrowset, is there an alternative?)
Thanks in advance for the help
I tried to create a user-defined scalar function, but it doesn't work.
[1]: https://i.stack.imgur.com/aq0o3.png
[2]: https://i.stack.imgur.com/kCj7V.png
[3]: https://i.stack.imgur.com/8qm7d.png
[4]: https://i.stack.imgur.com/5yukK.jpg |
I am using wiremock-standalone-3.4.2, trying to emulate a third party that, upon a request from me, responds with a certain field set to UUID value X and sends the same value X to a webhook. The value X needs to change from request to request, so it needs to be randomized. I can't seem to find a way to have the emulator return in the response the same random value it sends to the webhook. Any ideas?
I tried using `{{randomValue type='UUID'}}` in both places, but this expectedly resulted in 2 different random values.
```
{
  "priority": 1,
  "request": {
    "method": "POST",
    "urlPath": "/v1/businesses"
  },
  "response": {
    "status": 200,
    "jsonBody": {
      "id": "{{randomValue type='UUID'}}"
    }
  },
  "serveEventListeners": [
    {
      "name": "webhook",
      "parameters": {
        "method": "POST",
        "url": "http://some-url/",
        "headers": {
          "Content-Type": "application/json"
        },
        "body": "{\"id\": \"{{randomValue type='UUID'}}\"}"
      }
    }
  ]
}
```
|
Wiremock webhook and response won't return same random value |
|wiremock-standalone| |
null |
|php|stripe-payments| |
I'm using roproxy instead of roblox.com to get around CORS measures. The request body is username, ctype, password, and userId, and for the headers I'm just sending an X-CSRF token. It's just returning 403. Is this an X-CSRF token thing? I couldn't find anything about it online.
Ideally, it would print the incorrect-password error, or anything but a bare 403.
Here's the code I'm using:
```
const passwordcheck = {
  ctype: "Username",
  cvalue: "usernametomyaccount",
  password: "passwordtomyaccount",
  captchaToken: "CaptchaFromFuncaptcha",
  captchaId: "Captcha",
  captchaprovider: "PROVIDER_ARKOSE_LABS",
  userId: 889124,
};
const request = {
  method: "POST",
  headers: {
    "Content-type": "application/json",
    "X-csrf-token": "wiXzYTbS/xF3",
    Cookie: "",
  },
  body: JSON.stringify(passwordcheck),
};
```
the url is roproxy v2 login
I've tried using different proxies, but nothing came off it. Filling out the requestbody with all the stuff in the Roblox auth documentation didn't work either. |
|php|laravel|curl|xampp|laravel-11| |
I think the first question you should answer is:
> *Am I going to do this myself or am I going to hire somebody to help me?*
This choice will, to some extent, determine the other choices you have.
If you do hire a professional, I would suggest discussing these things with that person. It can be hard to find someone you can work with for a longer time, and the choices you make will have to be compatible with their capabilities.
If you're going to do this by yourself, you can only do what you know and feel comfortable with. WordPress is fantastic to get you started quickly, and can do a lot, but it also has some clear disadvantages (bloated, slow, vulnerable, costly, etc.). Going with something like Laravel is more complicated, but still gives you a nice head start. Creating forms yourself means you need to be a web designer, PHP programmer, database manager, etc. It's a full-stack job. **Not easy** by any means.
Just keep in mind that these questions never stop coming. Every few years you need to radically update your code, and sometimes you need to shift to something completely new. Don't regard this as a one-off, but as a continuously evolving matter. |
null |
You can create your own Widget to do what you need.
I created a sample for you:
**Result**
[![enter image description here][1]][1]
[1]: https://i.stack.imgur.com/wHDkg.gif
**Explanation**
First, I created this code, where the main value is "show text", when the user press the button, the value will be updated to "hide text" and the animation will start.
There is a function `fadeBuilder` that you can use to validate when you want to hide the text, in this sample the condition to hide the text is `value == 'hide text'`
```dart
class _SampleViewState extends State<SampleView> {
String _myText = 'show text';
@override
Widget build(BuildContext context) {
return Column(
mainAxisAlignment: MainAxisAlignment.center,
children: [
MyAnimatedTextOpacity(
value: _myText,
fadeBuilder: (value) => value == 'hide text',
),
ElevatedButton(
onPressed: () {
setState(() {
_myText = 'hide text';
});
},
child: const Text('change text'),
)
],
);
}
}
```
Now this is the custom widget:
```dart
typedef FadeBuilder = bool Function(String value);
class MyAnimatedTextOpacity extends StatefulWidget {
const MyAnimatedTextOpacity({
required this.value,
required this.fadeBuilder,
super.key,
});
final String value;
final FadeBuilder fadeBuilder;
@override
State<MyAnimatedTextOpacity> createState() => _MyAnimatedTextOpacityState();
}
class _MyAnimatedTextOpacityState extends State<MyAnimatedTextOpacity> {
late final String _value = widget.value;
@override
Widget build(BuildContext context) {
return AnimatedOpacity(
duration: const Duration(seconds: 1),
opacity: widget.fadeBuilder(widget.value) ? 0 : 1,
child: Text(_value),
);
}
}
``` |
In git bash, `/usr/bin/start` is a shell script that invokes `"$COMSPEC" //c start "${@//&/^&}"`; invoking it with a `.txt` file as an argument hands that file to the Windows `START` command. You can examine the script itself; it's an 8-line text file.
It's specific to git bash under Windows. It's a way to provide the equivalent of the `cmd.exe` `START` command, or double-clicking on a Windows icon, in a Unix-like environment. (Cygwin and WSL, which are also Unix-like environments under Windows, don't provide the same thing; Cygwin has `cygstart`, and WSL is more isolated from the Windows environment.)
If you invoke it as `start foo.txt`, it should invoke your default handler for `*.txt` files, determined by your Windows settings. That might be Notepad by default, but perhaps you've configured it to be some other editor.
Try `echo "$COMSPEC"`. The result should be `C:\WINDOWS\system32\cmd.exe`. The `/usr/bin/start` script seems to be complaining that the command that `$COMSPEC` expands to does not exist; either `$COMSPEC` has been changed, or `C:\WINDOWS\system32\cmd.exe` is unavailable (the latter could indicate a damaged Windows system).
If you type `"$COMSPEC"` (with the double quotes) at the git bash prompt, it should give you a Windows `cmd` prompt; if so, you can leave it by typing `exit`.
```
$ type -a start
start is /usr/bin/start
start is /bin/start
start is /usr/bin/start
$ cat /usr/bin/start
#!/usr/bin/env bash
# Copyright (C) 2014, Alexey Pavlov
# mailto:alexpux@gmail.com
# This file is part of Minimal SYStem version 2.
# https://sourceforge.net/p/msys2/wiki/MSYS2%20installation/
# File: start
"$COMSPEC" //c start "${@//&/^&}"
$ echo "$COMSPEC"
C:\WINDOWS\system32\cmd.exe
$ "$COMSPEC"
Microsoft Windows [Version 10.0.22631.3155]
(c) Microsoft Corporation. All rights reserved.
C:\Users\username>exit
exit
$
``` |
On iOS, the keyboard does not offer a 6-character SMS code |
|javascript|ios|reactjs|react-native|keyboard| |
null |
I think a demo would be easier than going back and forth: https://github.com/quyentho/submodule-demo
You can check my `dist/` folder in the `placeholder-lib` to see if your generated build has similar structure. I have no problem to import like this in my `consumer`:
```
import { Button } from "placeholder-lib/components";
import useMyHook from "placeholder-lib/shared";
```
I guess your problem could be missing these export lines in `package.json`:
```
"exports": {
  ".": "./dist/index.js",
  "./components": "./dist/components/index.js",
  "./shared": "./dist/shared/index.js"
},
```
|
I am setting up a project with `Next.js 14` and `shadcn/ui`. I have just started the project by constructing the menu navigation.
https://ui.shadcn.com/docs/installation/next : that is how I set up `shadcn/ui`.
This is my nav component:
```tsx
import * as React from "react";
import Link from "next/link";
import { cn } from "@/lib/utils";
import {
NavigationMenu,
NavigationMenuContent,
NavigationMenuIndicator,
NavigationMenuItem,
NavigationMenuLink,
NavigationMenuList,
NavigationMenuTrigger,
NavigationMenuViewport,
navigationMenuTriggerStyle,
} from "@/components/ui/navigation-menu";
const MainNav = () => {
return (
<NavigationMenu>
<NavigationMenuList>
<NavigationMenuItem>
<Link href="/" legacyBehavior passHref>
<NavigationMenuLink className={navigationMenuTriggerStyle()}>
Home
</NavigationMenuLink>
</Link>
</NavigationMenuItem>
<NavigationMenuItem>
<NavigationMenuTrigger>Item One</NavigationMenuTrigger>
<NavigationMenuContent>
<ul className="grid w-[400px] gap-3 p-4 md:w-[500px] md:grid-cols-2 lg:w-[600px] ">
<ListItem href="/docs" title="Introduction">
Re-usable components built using Radix UI and Tailwind CSS
</ListItem>
</ul>
</NavigationMenuContent>
</NavigationMenuItem>
</NavigationMenuList>
</NavigationMenu>
);
};
export default MainNav;
```
And this is component from `Shadcn UI`:
```tsx
"use client"
import * as React from "react"
import { ChevronDownIcon } from "@radix-ui/react-icons"
import * as NavigationMenuPrimitive from "@radix-ui/react-navigation-menu"
import { cva } from "class-variance-authority"
import { cn } from "@/lib/utils"
const NavigationMenu = React.forwardRef<
React.ElementRef<typeof NavigationMenuPrimitive.Root>,
React.ComponentPropsWithoutRef<typeof NavigationMenuPrimitive.Root>
>(({ className, children, ...props }, ref) => (
<NavigationMenuPrimitive.Root
ref={ref}
className={cn(
"relative z-10 flex max-w-max flex-1 items-center justify-center",
className
)}
{...props}
>
{children}
<NavigationMenuViewport />
</NavigationMenuPrimitive.Root>
))
NavigationMenu.displayName = NavigationMenuPrimitive.Root.displayName
const NavigationMenuList = React.forwardRef<
React.ElementRef<typeof NavigationMenuPrimitive.List>,
React.ComponentPropsWithoutRef<typeof NavigationMenuPrimitive.List>
>(({ className, ...props }, ref) => (
<NavigationMenuPrimitive.List
ref={ref}
className={cn(
"group flex flex-1 list-none items-center justify-center space-x-1",
className
)}
{...props}
/>
))
NavigationMenuList.displayName = NavigationMenuPrimitive.List.displayName
const NavigationMenuItem = NavigationMenuPrimitive.Item
const navigationMenuTriggerStyle = cva(
"group inline-flex h-9 w-max items-center justify-center rounded-md bg-background px-4 py-2 text-sm font-medium transition-colors hover:bg-accent hover:text-accent-foreground focus:bg-accent focus:text-accent-foreground focus:outline-none disabled:pointer-events-none disabled:opacity-50 data-[active]:bg-accent/50 data-[state=open]:bg-accent/50"
)
const NavigationMenuTrigger = React.forwardRef<
React.ElementRef<typeof NavigationMenuPrimitive.Trigger>,
React.ComponentPropsWithoutRef<typeof NavigationMenuPrimitive.Trigger>
>(({ className, children, ...props }, ref) => (
<NavigationMenuPrimitive.Trigger
ref={ref}
className={cn(navigationMenuTriggerStyle(), "group", className)}
{...props}
>
{children}{" "}
<ChevronDownIcon
className="relative top-[1px] ml-1 h-3 w-3 transition duration-300 group-data-[state=open]:rotate-180"
aria-hidden="true"
/>
</NavigationMenuPrimitive.Trigger>
))
NavigationMenuTrigger.displayName = NavigationMenuPrimitive.Trigger.displayName
const NavigationMenuContent = React.forwardRef<
React.ElementRef<typeof NavigationMenuPrimitive.Content>,
React.ComponentPropsWithoutRef<typeof NavigationMenuPrimitive.Content>
>(({ className, ...props }, ref) => (
<NavigationMenuPrimitive.Content
ref={ref}
className={cn(
"left-0 top-0 w-full data-[motion^=from-]:animate-in data-[motion^=to-]:animate-out data-[motion^=from-]:fade-in data-[motion^=to-]:fade-out data-[motion=from-end]:slide-in-from-right-52 data-[motion=from-start]:slide-in-from-left-52 data-[motion=to-end]:slide-out-to-right-52 data-[motion=to-start]:slide-out-to-left-52 md:absolute md:w-auto ",
className
)}
{...props}
/>
))
NavigationMenuContent.displayName = NavigationMenuPrimitive.Content.displayName
const NavigationMenuLink = NavigationMenuPrimitive.Link
const NavigationMenuViewport = React.forwardRef<
React.ElementRef<typeof NavigationMenuPrimitive.Viewport>,
React.ComponentPropsWithoutRef<typeof NavigationMenuPrimitive.Viewport>
>(({ className, ...props }, ref) => (
<div className={cn("absolute left-0 top-full flex justify-center")}>
<NavigationMenuPrimitive.Viewport
className={cn(
"origin-top-center relative mt-1.5 h-[var(--radix-navigation-menu-viewport-height)] w-full overflow-hidden rounded-md border bg-popover text-popover-foreground shadow data-[state=open]:animate-in data-[state=closed]:animate-out data-[state=closed]:zoom-out-95 data-[state=open]:zoom-in-90 md:w-[var(--radix-navigation-menu-viewport-width)]",
className
)}
ref={ref}
{...props}
/>
</div>
))
NavigationMenuViewport.displayName =
NavigationMenuPrimitive.Viewport.displayName
const NavigationMenuIndicator = React.forwardRef<
React.ElementRef<typeof NavigationMenuPrimitive.Indicator>,
React.ComponentPropsWithoutRef<typeof NavigationMenuPrimitive.Indicator>
>(({ className, ...props }, ref) => (
<NavigationMenuPrimitive.Indicator
ref={ref}
className={cn(
"top-full z-[1] flex h-1.5 items-end justify-center overflow-hidden data-[state=visible]:animate-in data-[state=hidden]:animate-out data-[state=hidden]:fade-out data-[state=visible]:fade-in",
className
)}
{...props}
>
<div className="relative top-[60%] h-2 w-2 rotate-45 rounded-tl-sm bg-border shadow-md" />
</NavigationMenuPrimitive.Indicator>
))
NavigationMenuIndicator.displayName =
NavigationMenuPrimitive.Indicator.displayName
export {
navigationMenuTriggerStyle,
NavigationMenu,
NavigationMenuList,
NavigationMenuItem,
NavigationMenuContent,
NavigationMenuTrigger,
NavigationMenuLink,
NavigationMenuIndicator,
NavigationMenuViewport,
}
```
This is the `Error` I get:
```
⨯ components/MainNav.tsx (22:69) @ navigationMenuTriggerStyle
⨯ TypeError: (0 , _components_ui_navigation_menu__WEBPACK_IMPORTED_MODULE_4__.navigationMenuTriggerStyle) is not a function
at MainNav (./components/MainNav.tsx:28:130)
at stringify (<anonymous>)
20 | <NavigationMenuItem>
21 | <Link href="/" legacyBehavior passHref>
> 22 | <NavigationMenuLink className={navigationMenuTriggerStyle()}>
| ^
23 | Home
24 | </NavigationMenuLink>
25 | </Link>
```
I could not find any information regarding this error.
All the `imports` and `exports` seem to be good, and `navigationMenuTriggerStyle` is used as a function in the component without any errors.
How do I solve this? Or do I have some silly mistake somewhere?
|
null |
Clean architecture/OOP and optimization: how to organize for classes with same logic |
|java|oop|clean-architecture| |
I am using node server for the backend. Connection to Cassandra is done using cassandra-driver nodejs.
Connection is done as follows:
```
const client = new cassandra.Client({
  contactPoints: ['h1', 'h2'],
  localDataCenter: 'datacenter1',
  keyspace: 'ks1'
});
```
1. In contactPoints, do I just need to add 'seed' nodes or can I can add any nodes from the datacenter?
2. Do I need to run separate backend service for each datacenter? Or is there a way to connect multiple datacenter from the same nodejs backend service?
3. Any recommended way for setting backend server such that bandwidth can be minimized between Cassandra nodes and backend server? Should backend server run on the same machine where one of the Cassandra node is running so that data won't need to travel between multiple machines? Or is it fine if backend server runs on a completely separate machine than Cassandra node? Here, for example, if AWS EC2 is used, then data transfer charges might increase.
|
I use Docker Compose with a very basic Node image configuration. I can run `npm install` after the container is built, but I want to do so during the build, and the volumes I've bound in docker-compose.yaml don't seem to be available then, only after the build.
Is this the expected behavior? If yes, should I just COPY the package.json file during the build so I can install from it?
What bothers me here is that I would need to make the build context a lot wider to get the file from a different folder (../frontend/package.json), which seems wrong; but is it a big deal?
The error: `Could not read package.json: Error: ENOENT: no such file or directory, open '/var/www/html/frontend/package.json'`
docker-compose.yaml
```
node:
container_name: ${PROJECT_NAME}_node
build:
context: ./node
command:
- npm run dev -- --host
ports:
- '8012:5173'
volumes:
- ./frontend:/var/www/html/frontend
```
DockerFile
```
FROM node:latest
WORKDIR /var/www/html/frontend
RUN npm install
RUN npm run build
``` |
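For reference, the build-time variant I'm considering looks like this. It assumes the compose build context is widened to the parent folder so the Dockerfile can see `frontend/package.json`; paths are illustrative (volumes are only mounted at run time, so the original Dockerfile cannot see the bind mount during the build):

```dockerfile
# Assumes docker-compose is changed to:
#   build:
#     context: ..
#     dockerfile: node/Dockerfile
FROM node:latest
WORKDIR /var/www/html/frontend
# Copy only the manifest first so the npm install layer is cached
COPY frontend/package*.json ./
RUN npm install
# The source itself is bind-mounted at run time by docker-compose
```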
|r| |
null |
To me it looks like you are not waiting for each character to be sent out: you are updating uartData on every clock cycle without checking the uartRdy signal.
So I assume the TX_SEND_WAIT state you commented out is actually what you need. In that state (or in a second one) you just have to check whether you're at the end of the string or there are more characters to send. |
```plugins {
id("com.android.application")
id("org.jetbrains.kotlin.android")
id("com.google.devtools.ksp")
id("com.google.dagger.hilt.android") version "2.49" apply false
}
android {
namespace = "com.example.notesapp"
compileSdk = 34
defaultConfig {
multiDexEnabled = true
applicationId = "com.example.notesapp"
minSdk = 30
targetSdk = 34
versionCode = 1
versionName = "1.0"
testInstrumentationRunner = "androidx.test.runner.AndroidJUnitRunner"
vectorDrawables {
useSupportLibrary = true
}
}
buildTypes {
release {
isMinifyEnabled = false
proguardFiles(
getDefaultProguardFile("proguard-android-optimize.txt"),
"proguard-rules.pro"
)
}
}
compileOptions {
sourceCompatibility = JavaVersion.VERSION_1_8
targetCompatibility = JavaVersion.VERSION_1_8
}
kotlinOptions {
jvmTarget = "1.8"
}
buildFeatures {
compose = true
}
composeOptions {
kotlinCompilerExtensionVersion = "1.5.1"
}
packaging {
resources {
excludes += "/META-INF/{AL2.0,LGPL2.1}"
}
}
}
dependencies {
implementation("androidx.compose.material3:material3-android:1.2.1")
implementation("androidx.compose.material3:material3-desktop:1.2.1")
implementation("androidx.tv:tv-material:1.0.0-alpha10")
implementation("androidx.wear.compose:compose-material:1.3.0")
implementation("androidx.wear.compose:compose-material3:1.0.0-alpha19")
val room_version = "2.6.1"
val multidex_version = "2.0.1"
implementation("androidx.core:core-ktx:1.12.0")
implementation("androidx.lifecycle:lifecycle-runtime-ktx:2.7.0")
implementation("androidx.activity:activity-compose:1.8.2")
implementation(platform("androidx.compose:compose-bom:2023.08.00"))
implementation("androidx.compose.ui:ui")
implementation("androidx.compose.ui:ui-graphics")
implementation("androidx.compose.ui:ui-tooling-preview")
implementation("androidx.compose.material3:material3")
testImplementation("junit:junit:4.13.2")
androidTestImplementation("androidx.test.ext:junit:1.1.5")
androidTestImplementation("androidx.test.espresso:espresso-core:3.5.1")
androidTestImplementation(platform("androidx.compose:compose-bom:2023.08.00"))
androidTestImplementation("androidx.compose.ui:ui-test-junit4")
debugImplementation("androidx.compose.ui:ui-tooling")
debugImplementation("androidx.compose.ui:ui-test-manifest")
implementation("androidx.multidex:multidex:$multidex_version")
// Compose dependencies
implementation("androidx.lifecycle:lifecycle-viewmodel-compose:2.7.0")
implementation("androidx.navigation:navigation-compose:2.7.7")
implementation("androidx.compose.material:material-icons-extended:")
implementation("androidx.hilt:hilt-navigation-compose:1.2.0")
implementation("androidx.compose.material:material:1.6.4")
// Coroutines
implementation("org.jetbrains.kotlinx:kotlinx-coroutines-android:1.7.3")
// Dagger - Hilt
implementation("com.google.dagger:hilt-android:2.49")
ksp("com.google.dagger:hilt-android-compiler:2.44")
// Room
implementation("androidx.room:room-runtime:$room_version")
ksp("androidx.room:room-compiler:$room_version")
// optional - Kotlin Extensions and Coroutines support for Room
implementation("androidx.room:room-ktx:$room_version")
}
```
Every time I run the project in Android Studio, it gives me this error:
> Caused by:
> org.gradle.workers.internal.DefaultWorkerExecutor$WorkExecutionException:
> A failure occurred while executing
> com.android.build.gradle.internal.tasks.CheckDuplicatesRunnable
I tried checking for duplicate dependencies and toggling offline mode, but still nothing works.
|
*Bar chart* panel expects data in format
|Categories|Column1|Column2|
|--|--|--|
|Category1|value|value|
|Category2|value|value|
|Category3|value|value|
Your initial query is just fine in terms of providing needed data. But it requires some massaging to put it into expected format.
First, to leave only latest values and present them in an easier form for following steps change query options to *Format*: **Table**, *Type*: **Instant** (Options block under query editor).
Then to "transpose" table into expected format add Transformation *Grouping to Matrix* with *Column* set to `sellable`, *Row* to `location`, and *Cell value* left as `Value`.
After that in Table view you should see something like
|location\sellable|true|false|
|--|--|--|
|Warehouse0|45|0|
|Warehouse1|36|102|
|Warehouse2|19|23|
And the bar chart view should present groups of bars per `location`, with two bars in each group: `true` and `false`. If you want the grouping to happen the other way around, you can swap the *Row* and *Column* options in the transformation. |
|python|http-post|fastapi|pydantic|http-status-code-422| |
```
TITLE
——
1 nuit
1 nuit
1 nuit
1 nuit
1 nuit
total : 5 nuits
——
1 nuit
1 nuit
1 nuit
total : 3 nuits
——
1 nuit
1 nuit
1 nuit
1 nuit
1 nuit
1 nuit
```
and so on ...
I have this paragraph, in which I'd like to select the last lines, i.e. those after the last occurrence of `——`.
It should match and group the 6 lines right after that `——`. I've tried pretty much everything that crossed my mind so far, but I must be missing something.
I tested `(?s:.*\s)\K——`, which matches the last `——` of the document, but I can't seem to select the lines after that match.
The point is to count those lines, so it's fine if I'm only able to select the "1" or the "nuit".
The expected capture:
```
1 nuit
1 nuit
1 nuit
1 nuit
1 nuit
1 nuit
``` |
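Outside the regex engine, a script sketch that does the counting (Python here; it assumes the `——` delimiter always sits on its own line):

```python
import re

text = """TITLE
——
1 nuit
1 nuit
total : 2 nuits
——
1 nuit
1 nuit
1 nuit
"""

# Split on delimiter-only lines and keep what follows the last one
tail = re.split(r'(?m)^——\s*$', text)[-1]
lines = [line for line in tail.splitlines() if line.strip()]
print(len(lines))  # → 3
```

If a single pattern is required, the `\K` anchoring already found for the last `——` is the right direction, but counting the matched lines is usually easier in the host language, as above.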
Select all lines after last occurrence of a certain character |
So, I am new to Rust but recently came across a problem I don't know how to solve: working with nested (and multi-dimensional) key-value pair structures in Rust that are dynamically generated by string splitting.
The sample dataset looks something like the example below:
Species | Category
Dog | Eukaryota, Animalia, Chordata, Mammalia, Carnivora, Canidae, Canis, C. familiaris
Cat | Eukaryota, Animalia, Chordata, Mammalia, Carnivora, Feliformia, Felidae, Felinae, Felis, F. catus
Bear. | Eukaryota, Animalia, Chordata, Mammalia, Carnivora, Ursoidea, Ursidae, Ursus
...
The goal would be to split the comma delimitation and a create a map or vector. Essentially, creating "layers" of nested keys (either as a Vector array or a key to a *final value*).
From my understanding, Rust has a crate called "serde_json" which can be *used* to create key-value pairings like so:
```
let mut array = Map::new();
for (k, v) in data.into_iter() {
    array.insert(k, Value::String(v));
}
```
As for comma delimited string splitting, it might look something like this:
```
let categories = "a, b, c, d, e, f".split(", ");
let category_data = categories.collect::<Vec<&str>>();
```
However, the end goal would be to create a recursively nested map or vector that follows the *Category* column for the array which can ultimately be serialised to a json output. How would this be implemented in Rust?
In addition, while we might know the number of rows in the sample dataset, isn't it quite resource intensive to calculate all the "comma-delimited layers" in the *Category* column to know the final size of the array as required by Rust's memory safe design to "initialize" an array by a defined or specified size? Would this need to specifically implemented as way to know the maximum number of layers in order to be doable? Or can we implement an infinitely nested multi-dimensional array without having to specify or initialise a defined map or vector size?
For further reference, in PHP, this might be implemented so:
```
$output_array = array();
foreach($data_rows as $data_row) {
    $temp =& $output_array;
    foreach(explode(', ', $data_row["Category"]) as $key) {
        $temp =& $temp[$key];
    }
    // Check if array is already initialized, if not create new array with new data
    if(!isset($temp)) {
        $temp = array($data_row["Species"]);
    } else {
        array_push($temp, $data_row["Species"]);
    }
}
```
How would a similar solution like this be implemented in Rust? Thanks in advance! |
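For what it's worth, the sizing question has a short answer: `Vec` and the map types grow dynamically, so neither the final size nor the maximum depth needs to be known up front. Below is a std-only sketch of the recursive structure (`Node` is a hypothetical type introduced here for illustration; `serde_json::Value::Object` could play the same role and then be serialized directly):

```rust
use std::collections::BTreeMap;

// A recursive tree: each taxonomy level maps a name to further levels,
// and the node reached by the last key collects the species names.
#[derive(Debug, Default)]
struct Node {
    children: BTreeMap<String, Node>,
    species: Vec<String>,
}

fn insert(root: &mut Node, species: &str, categories: &str) {
    let mut cur = root;
    for key in categories.split(", ") {
        // entry().or_default() creates missing levels on demand
        cur = cur.children.entry(key.to_string()).or_default();
    }
    cur.species.push(species.to_string());
}

fn main() {
    let mut root = Node::default();
    insert(&mut root, "Dog", "Eukaryota, Animalia, Chordata");
    insert(&mut root, "Cat", "Eukaryota, Animalia, Chordata");
    let node = &root.children["Eukaryota"].children["Animalia"].children["Chordata"];
    assert_eq!(node.species, ["Dog", "Cat"]);
    println!("{:#?}", root);
}
```

Walking the tree recursively (or deriving `serde::Serialize` on `Node`) would then produce the nested JSON output.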
```
ValueError: Kernel shape must have the same length as input, but received kernel of shape (3, 3, 1, 32) and input of shape (None, None, 48, 48, 1).
```

```python
from keras.models import load_model
from tensorflow.keras.utils import img_to_array
from keras.preprocessing import image
import cv2
import numpy as np

face_classifier = cv2.CascadeClassifier('./haarcascade_frontalface_default.xml')
classifier = load_model("./Emotion_Detection.h5")
class_labels = ['Angry', 'Happy', 'Neutral', 'Sad', 'Surprise']
```
When I try to run the .py file, this ValueError is raised by the load_model() call. How can I resolve the issue? I can't even get the model loaded, so I can't go any further.
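Decoding the message itself may help: the kernel `(3, 3, 1, 32)` has rank 4, so the `Conv2D` layer expects rank-4 input `(batch, height, width, channels)`, but the loaded model reports rank-5 input `(None, None, 48, 48, 1)`, one dimension too many (possibly a symptom of the `.h5` file being saved under a different Keras/TensorFlow version than the one loading it). A tiny stand-alone illustration of the rank rule Keras is enforcing (plain Python, not Keras code):

```python
def conv2d_ranks_match(input_shape, kernel_shape):
    """Mirror of the check that fails: a conv kernel and its input
    must have the same number of dimensions."""
    return len(input_shape) == len(kernel_shape)

# The shapes from the traceback: rank-5 input vs. rank-4 kernel -> mismatch.
print(conv2d_ranks_match((None, None, 48, 48, 1), (3, 3, 1, 32)))  # False
# Dropping the extra leading dimension restores the expected rank 4.
print(conv2d_ranks_match((None, 48, 48, 1), (3, 3, 1, 32)))        # True
```

If a version mismatch is the cause, trying `load_model("./Emotion_Detection.h5", compile=False)` or re-saving the model with the library versions it was trained under may be worth testing (assumptions, not a confirmed fix).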
Thanks for your feedback. I changed my message. Cheers. |
null |
I am taking a look at the Speech-to-Text API and I had some questions:
1. What is the difference between v1 and v1p1?
2. Does the chirp model in stt v2 support transcribing audio from a streaming input? |
I'm using roproxy instead of roblox to get around CORS measures. The request body is username, ctype, password and userid, and for the headers I'm just sending an X-CSRF token. It's just returning 403. Is this a CSRF token thing? I couldn't find anything about it online.
Ideally, it would print the incorrect-password error, or anything but 403.
Here's the code I'm using:
```js
const passwordcheck = {
  ctype: "Username",
  cvalue: "usernametomyaccount",
  password: "passwordtomyaccount",
  captchaToken: "CaptchaFromFuncaptcha",
  captchaId: "Captcha",
  captchaprovider: "PROVIDER_ARKOSE_LABS",
  userId: 889124,
};

const request = {
  method: "POST",
  headers: {
    "Content-type": "application/json",
    "X-csrf-token": "wiXzYTbS/xF3",
    Cookie: "",
  },
  body: JSON.stringify(passwordcheck),
};
```
The URL is the roproxy v2 login endpoint.
I've tried using different proxies, but nothing came of it. Filling out the request body with all the fields in the Roblox auth documentation didn't work either.
I'm using roproxy instead of roblox to get around CORS measures. The request body is username, ctype, password and userid, and for the headers I'm just sending an X-CSRF token. It's just returning 403. Is this a CSRF token thing? I couldn't find anything about it online.
Ideally, it would print the incorrect-password error, or anything but 403.
Here's the code I'm using:
```js
const passwordcheck = {
  ctype: "Username",
  cvalue: "usernametomyaccount",
  password: "passwordtomyaccount",
  captchaToken: "CaptchaFromFuncaptcha",
  captchaId: "Captcha",
  captchaprovider: "PROVIDER_ARKOSE_LABS",
  userId: 889124,
};

const request = {
  method: "POST",
  headers: {
    "Content-type": "application/json",
    "X-csrf-token": "wiXzYTbS/xF3",
    Cookie: "",
  },
  body: JSON.stringify(passwordcheck),
};
```
The URL is the roproxy v2 login endpoint.
I've tried using different proxies, but nothing came of it. Filling out the request body with all the fields in the Roblox auth documentation didn't work either.
I think you should use state for the list of words.
```js
import { useState } from "react";
import wordsArray from "./components/wordsArray";

export default function App() {
  const [wordlist, setWordlist] = useState(wordsArray);
  const [gameOver, setGameOver] = useState(false);
  // Lazy initializer: pick the first word from the full array
  const [word, setWord] = useState(
    () => wordsArray[Math.floor(Math.random() * wordsArray.length)]
  );

  const getRandomWord = (list) =>
    list[Math.floor(Math.random() * list.length)];

  const removeWord = () => {
    const updatedWordlist = wordlist.filter((w) => w !== word);
    setWordlist(updatedWordlist);
    if (updatedWordlist.length === 0) {
      setGameOver(true);
    } else {
      setWord(getRandomWord(updatedWordlist));
    }
  };
  // ...
}
``` |
I am trying to reverse-engineer a device (an 8-bit microcontroller) that communicates with the PC. Each device has a serial number of two digits (in decimal) that is used to calculate a checksum, so for the same data message the checksum changes. Each message starts with 0x02 and ends with 0x03. The checksum is the number between 0x04 and 0x03. These are the messages in hex with a translation to ASCII:
02 41 42 42 04 31 30 03 .ABB.10. //device ID :00
02 41 42 42 04 44 44 03 .ABB.DD. //device ID : 01
02 41 42 42 04 31 39 03 .ABB.19. //device ID : 56
02 41 42 42 04 35 34 03 .ABB.54. //device ID : 99
02 41 42 31 30 04 37 39 03 .AB10.79. //device ID :00
02 41 42 31 30 04 34 45 03 .AB10.4E. //device ID : 01
02 41 42 31 30 04 45 35 03 .AB10.E5. //device ID : 56
02 41 42 31 30 04 35 45 03 .AB10.5E. //device ID : 99
02 41 42 70 04 41 43 03 .ABp.AC. //device ID :00
02 41 42 70 04 36 31 03 .ABp.61. //device ID : 01
02 41 42 70 04 45 38 03 .ABp.E8. //device ID : 56
02 41 42 70 04 41 35 03 .ABp.A5. //device ID : 99
02 41 42 30 46 46 46 46 04 46 30 03 .AB0FFFF.F0. //device ID:00
02 41 42 30 46 46 46 46 04 35 34 03 .AB0FFFF.54. //device ID : 01
02 41 42 30 46 46 46 46 04 36 33 03 .AB0FFFF.63. //device ID : 56
02 41 42 30 46 46 46 46 04 31 45 03 .AB0FFFF.1E. //device ID :99
I think it is doing a kind of XOR but I can't figure out how. Also, as you can see, the checksum changes for different device IDs (which I detailed in decimal). I need to identify how the whole checksum is calculated from the data + device ID, or at least identify how the device ID changes the checksum between different devices. The microcontroller is an 8-bit 8051. I will be very grateful if someone can give me any clue on how it is working.
Your container's ID contains a space, and spaces are illegal in an `id` value:

    <div id="Sign-In TempUI"

Get rid of it. Also...
    document.addEventListener('DOMContentLoaded',

Why are you in such a hurry? Wait for the load event instead (note that load fires on window, not document):

    window.addEventListener('load',

The only time you would need to listen for *DOMContentLoaded* is when writing a utility that does performance timing on pages!
|
{"Voters":[{"Id":1563833,"DisplayName":"Wyck"}],"DeleteType":1} |
I am trying to reverse-engineer a device (an 8-bit microcontroller) that communicates with the PC. Each device has a serial number of two digits (in decimal) that is used to calculate a checksum, so for the same data message the checksum changes. Each message starts with 0x02 and ends with 0x03. The checksum is the number that appears between 0x04 and 0x03. These are the messages in hex with a translation to ASCII:
02 41 42 42 04 31 30 03 .ABB.10. //device ID :00
02 41 42 42 04 44 44 03 .ABB.DD. //device ID : 01
02 41 42 42 04 31 39 03 .ABB.19. //device ID : 56
02 41 42 42 04 35 34 03 .ABB.54. //device ID : 99
02 41 42 31 30 04 37 39 03 .AB10.79. //device ID :00
02 41 42 31 30 04 34 45 03 .AB10.4E. //device ID : 01
02 41 42 31 30 04 45 35 03 .AB10.E5. //device ID : 56
02 41 42 31 30 04 35 45 03 .AB10.5E. //device ID : 99
02 41 42 70 04 41 43 03 .ABp.AC. //device ID :00
02 41 42 70 04 36 31 03 .ABp.61. //device ID : 01
02 41 42 70 04 45 38 03 .ABp.E8. //device ID : 56
02 41 42 70 04 41 35 03 .ABp.A5. //device ID : 99
02 41 42 30 46 46 46 46 04 46 30 03 .AB0FFFF.F0. //device ID:00
02 41 42 30 46 46 46 46 04 35 34 03 .AB0FFFF.54. //device ID : 01
02 41 42 30 46 46 46 46 04 36 33 03 .AB0FFFF.63. //device ID : 56
02 41 42 30 46 46 46 46 04 31 45 03 .AB0FFFF.1E. //device ID :99
I think it is doing a kind of XOR but I can't figure out how. Also, as you can see, the checksum changes for different device IDs. I need to identify how the whole checksum is calculated from the data + device ID, or at least identify how the device ID changes the checksum between different devices. The microcontroller is an 8-bit 8051. I will be very grateful if someone can give me any clue on how it is working.
Why doesn't docker-compose volume binding work during build? Should I always COPY the files necessary for the build?
|docker|npm|docker-compose|docker-volume|docker-build| |
null |
I have the following query in mysql,
```
SELECT COALESCE(sum(COALESCE(amount, 0)), 0) as sum_amount, COALESCE(sum(COALESCE(fees_amount, 0)), 0) as sum_fees_amount
FROM incoming_operation
WHERE status = 'CLEARED' and merchant_id = ? and shop_id IN (?) and currency = ? and cycle_id is null
FOR UPDATE
```
where `shop_id,currency,status` and `cycle_id` are indexed
```sql
CREATE INDEX payout_operation_update
    ON incoming_operation (shop_id, currency, status);

CREATE INDEX cycle_id_index ON incoming_operation (cycle_id);
```
the value of cycle_id is mostly NULL in the DB.
the values of `shop_id,currency,status` are unique in the DB
When running the above query, it ran for a very long time (2 hours) and I got a lock where other operations couldn't update any row in the DB.
From running the query with EXPLAIN I've seen that it used both indexes: `payout_operation_update (shop_id, currency, status)` and `cycle_id_index`.
Can I assume that using only the `payout_operation_update (shop_id, currency, status)` index will make the query quicker?
My assumption is that because `cycle_id_index` was used, all the other rows (whose cycle_id is null) were also locked, and that using only the `payout_operation_update` index would lock only the relevant rows (those matching the `shop_id, currency, status` filter).
Is that assumption valid?
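One way to test the assumption directly is to force the optimizer onto the composite index and compare the `EXPLAIN` output (index and table names as in the question; whether this actually removes the lock contention still needs to be verified under load):

```sql
SELECT COALESCE(SUM(COALESCE(amount, 0)), 0)      AS sum_amount,
       COALESCE(SUM(COALESCE(fees_amount, 0)), 0) AS sum_fees_amount
FROM incoming_operation FORCE INDEX (payout_operation_update)
WHERE status = 'CLEARED'
  AND merchant_id = ?
  AND shop_id IN (?)
  AND currency = ?
  AND cycle_id IS NULL
FOR UPDATE;
```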
MySQL query takes too long because of wrong index usage
|mysql| |
I'm migrating a Spring 5.2/Spring Security 5.8 backend to Spring 6.1/Spring Security 6.2, ditching spring-security-oauth2 and adding spring-security-oauth2-authorization-server along the way, and it looks a bit complicated.
First of all, current architecture:
1. Users are mobile apps that authenticates/authorises using username/password.
2. Auth/authz is performed by backend against LDAP Server (AD basically) using provided user credentials
3. Access/refresh tokens are generated at the same time and saved into local DB
4. There are additional simplified auth flows based on auth_id/api_keys where passwords are not involved.
So there are no redirects, nothing complex. OAuth2 is used in a very simplified way.
Also:
5. Everything is defined in spring-security.xml
6. No Spring Boot is involved and I would like not to add in unless it strictly necessary.
What I would like to have:
1. The same simplicity: no redirects, access token and refresh token obtained with a single request, and keep using LDAP.
2. Be able to easily use the whole power of spring-security-oauth2-authorization-server when needed if I add another resource/auth server.
So I did some research through the documentation, samples, code and ended up with the following assumptions and config (see below):
1. I would probably need to stick to Authorization Grant = Client Credentials
2. I should add LDAP password check to authentication providers of 'token endpoint'
I'm using the following config and I'm able to generate/save access token by performing request to /oauth2/token endpoint but I have a few issues there:
1. I'm not able to switch to ClientAuthenticationMethod.NONE. I don't need it.
2. Refresh token is not generated along with access token
3. I'm not able to perform a call with obtained access token (Authorization: Bearer ...) — I'm getting 401
4. It's not clear how to provide user details (UserDetailsService) fetched from LDAP during auth. Should I save them into oauth2_authorization table together with access token etc.?
Here is my config. Any help is appreciated.
```java
package accessmanager;
import com.nimbusds.jose.jwk.JWKSet;
import com.nimbusds.jose.jwk.RSAKey;
import com.nimbusds.jose.jwk.source.ImmutableJWKSet;
import com.nimbusds.jose.jwk.source.JWKSource;
import com.nimbusds.jose.proc.SecurityContext;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.annotation.Order;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.security.config.Customizer;
import org.springframework.security.config.annotation.web.builders.HttpSecurity;
import org.springframework.security.config.annotation.web.configuration.EnableWebSecurity;
import org.springframework.security.config.annotation.web.configurers.CsrfConfigurer;
import org.springframework.security.core.Authentication;
import org.springframework.security.core.GrantedAuthority;
import org.springframework.security.core.context.SecurityContextHolder;
import org.springframework.security.core.userdetails.UserDetails;
import org.springframework.security.core.userdetails.UserDetailsService;
import org.springframework.security.crypto.password.PasswordEncoder;
import org.springframework.security.oauth2.core.AuthorizationGrantType;
import org.springframework.security.oauth2.core.ClientAuthenticationMethod;
import org.springframework.security.oauth2.jose.jws.JwsAlgorithm;
import org.springframework.security.oauth2.jose.jws.SignatureAlgorithm;
import org.springframework.security.oauth2.jwt.*;
import org.springframework.security.oauth2.server.authorization.JdbcOAuth2AuthorizationService;
import org.springframework.security.oauth2.server.authorization.OAuth2TokenType;
import org.springframework.security.oauth2.server.authorization.authentication.OAuth2ClientCredentialsAuthenticationToken;
import org.springframework.security.oauth2.server.authorization.client.InMemoryRegisteredClientRepository;
import org.springframework.security.oauth2.server.authorization.client.RegisteredClient;
import org.springframework.security.oauth2.server.authorization.client.RegisteredClientRepository;
import org.springframework.security.oauth2.server.authorization.config.annotation.web.configuration.OAuth2AuthorizationServerConfiguration;
import org.springframework.security.oauth2.server.authorization.config.annotation.web.configurers.OAuth2AuthorizationServerConfigurer;
import org.springframework.security.oauth2.server.authorization.settings.AuthorizationServerSettings;
import org.springframework.security.oauth2.server.authorization.settings.ClientSettings;
import org.springframework.security.oauth2.server.authorization.settings.OAuth2TokenFormat;
import org.springframework.security.oauth2.server.authorization.token.*;
import org.springframework.security.web.SecurityFilterChain;
import org.springframework.util.StringUtils;
import javax.sql.DataSource;
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.security.interfaces.RSAPrivateKey;
import java.security.interfaces.RSAPublicKey;
import java.time.Instant;
import java.util.Collection;
import java.util.Collections;
import java.util.Map;
import java.util.UUID;
import java.util.stream.Collectors;
import static org.springframework.security.config.Customizer.withDefaults;
@Configuration
@EnableWebSecurity
public class SecurityConfig {
private final CheckRolesLdapAuthenticationProvider checkRolesLdapAuthenticationProvider;
private final DataSource dataSource;
private final UserRepository userRepository;
@Autowired
public SecurityConfig(CheckRolesLdapAuthenticationProvider checkRolesLdapAuthenticationProvider,
DataSource dataSource, UserRepository userRepository) {
this.checkRolesLdapAuthenticationProvider = checkRolesLdapAuthenticationProvider;
this.dataSource = dataSource;
this.userRepository = userRepository;
}
@Bean
public JdbcTemplate jdbcTemplate() {
return new JdbcTemplate(dataSource);
}
@Bean
public JdbcOAuth2AuthorizationService authorizationService(JdbcTemplate jdbcTemplate,
RegisteredClientRepository registeredClientRepository) {
return new JdbcOAuth2AuthorizationService(jdbcTemplate, registeredClientRepository);
}
@Bean
public PasswordEncoder noOpPasswordEncoder() {
return new PasswordEncoder() {
@Override
public String encode(CharSequence rawPassword) {
return rawPassword.toString();
}
@Override
public boolean matches(CharSequence rawPassword, String encodedPassword) {
return rawPassword.equals(encodedPassword);
}
};
}
@Bean
@Order(1)
public SecurityFilterChain authorizationServerSecurityFilterChain(HttpSecurity http) throws Exception {
OAuth2AuthorizationServerConfiguration.applyDefaultSecurity(http);
http.getConfigurer(OAuth2AuthorizationServerConfigurer.class)
.tokenEndpoint(tokenEndpoint ->
tokenEndpoint
.accessTokenRequestConverter(
request -> {
final Authentication clientPrincipal = SecurityContextHolder.getContext().getAuthentication();
return
new OAuth2ClientCredentialsAuthenticationToken(
clientPrincipal,
Collections.singleton("write"),
request.getParameterMap()
.entrySet()
.stream()
.collect(
Collectors.toMap(Map.Entry::getKey, p -> p.getValue()[0])));
})
)
.clientAuthentication(auth ->
auth.authenticationProviders(list -> list.add(checkRolesLdapAuthenticationProvider)
))
.oidc(Customizer.withDefaults()); // Enable OpenID Connect 1.0
http
// .exceptionHandling((exceptions) -> exceptions
// .defaultAuthenticationEntryPointFor(
// new BasicAuthenticationEntryPoint(),
// new MediaTypeRequestMatcher(MediaType.ALL)
// )
// )
.oauth2ResourceServer((resourceServer) -> resourceServer.jwt(Customizer.withDefaults()));
return http.build();
}
@Bean
@Order(2)
public SecurityFilterChain securityFilterChain(HttpSecurity http) throws Exception {
http
.csrf(CsrfConfigurer::disable)
.authorizeHttpRequests((authorize) ->
authorize
.requestMatchers("/oauth2/**", "/api/v*/auth").permitAll()
.requestMatchers("/api/v*/**").authenticated()
)
.httpBasic(withDefaults());
return http.build();
}
@Bean
OAuth2TokenGenerator<?> tokenGenerator(JWKSource<SecurityContext> jwkSource) {
final OAuth2TokenGenerator<Jwt> jwtGenerator = new JwtGenerator(new NimbusJwtEncoder(jwkSource));
final OAuth2AccessTokenGenerator accessTokenGenerator = new OAuth2AccessTokenGenerator();
final OAuth2RefreshTokenGenerator refreshTokenGenerator = new OAuth2RefreshTokenGenerator();
return new DelegatingOAuth2TokenGenerator(
jwtGenerator, accessTokenGenerator, refreshTokenGenerator);
}
@Bean
public RegisteredClientRepository registeredClientRepository() {
RegisteredClient oidcClient = RegisteredClient.withId(UUID.randomUUID().toString())
.clientId("mobile")
.clientSecret("{noop}secret")
.clientAuthenticationMethod(ClientAuthenticationMethod.CLIENT_SECRET_BASIC)
.authorizationGrantType(AuthorizationGrantType.CLIENT_CREDENTIALS)
.authorizationGrantType(AuthorizationGrantType.AUTHORIZATION_CODE)
.authorizationGrantType(AuthorizationGrantType.REFRESH_TOKEN)
.redirectUri("http://127.0.0.1:8081/login/oauth2/code/mobile")
.postLogoutRedirectUri("http://127.0.0.1:8081/")
.scope("write")
.clientSettings(ClientSettings.builder().requireAuthorizationConsent(false).build())
.build();
return new InMemoryRegisteredClientRepository(oidcClient);
}
//https://docs.spring.io/spring-authorization-server/reference/getting-started.html
@Bean
public JWKSource<SecurityContext> jwkSource() {
KeyPair keyPair = generateRsaKey();
RSAPublicKey publicKey = (RSAPublicKey) keyPair.getPublic();
RSAPrivateKey privateKey = (RSAPrivateKey) keyPair.getPrivate();
RSAKey rsaKey = new RSAKey.Builder(publicKey)
.privateKey(privateKey)
.keyID(UUID.randomUUID().toString())
.build();
JWKSet jwkSet = new JWKSet(rsaKey);
return new ImmutableJWKSet<>(jwkSet);
}
private static KeyPair generateRsaKey() {
KeyPair keyPair;
try {
KeyPairGenerator keyPairGenerator = KeyPairGenerator.getInstance("RSA");
keyPairGenerator.initialize(2048);
keyPair = keyPairGenerator.generateKeyPair();
}
catch (Exception ex) {
throw new IllegalStateException(ex);
}
return keyPair;
}
@Bean
public JwtDecoder jwtDecoder(JWKSource<SecurityContext> jwkSource) {
return OAuth2AuthorizationServerConfiguration.jwtDecoder(jwkSource);
}
@Bean
public AuthorizationServerSettings authorizationServerSettings() {
return AuthorizationServerSettings.builder().build();
}
@Bean
public UserDetailsService userDetailsService() {
// how to route this to details received from LDAP on authorization?
return username -> {
return new UserDetails() {
@Override
public Collection<? extends GrantedAuthority> getAuthorities() {
return Collections.emptyList();
}
@Override
public String getPassword() {
return null;
}
@Override
public String getUsername() {
return username;
}
@Override
public boolean isAccountNonExpired() {
return true;
}
@Override
public boolean isAccountNonLocked() {
return true;
}
@Override
public boolean isCredentialsNonExpired() {
return true;
}
@Override
public boolean isEnabled() {
return true;
}
};
};
}
}
``` |
null |
I was wondering how you can do a really hard GPU stress test using only HTML.
I tried a stress test by getting my device to draw 510K random points using complex math, but that did not really work. I do think that photon mapping would be the best option.
How can you do a heavy GPU test using only HTML? Like a real hard stress test for your device?
|html| |
null |
I have a task and no way to contact the client.
The task is:
"You need to download, via a browser or a program, the page https://nameOfTheSide (I removed the site name just in case) and, in the saved copy, change the part of the page below the logo, starting with the element with the class "collection-title", to match the design from Figma. The price block and everything below it, as well as the products on the page, do not need to be changed."
The problem now is that when you download the site, the data is already compiled, which is hard to work with; I have no access to the site's structure, and the client hasn't sent a link to it either.
I'm still a beginner in this field, so maybe I'm misunderstanding something; please advise.
Comprehension in order |
|javascript|html|css| |
You forgot the `return` statement
```
{
  title: 'Tags',
  key: 'tags',
  dataIndex: 'tags',
  render: (_, { tags }) => {
    return (
      <>
        {tags.map((tag) => {
          return <Tag key={tag}>{tag}</Tag>;
        })}
      </>
    );
  }
}
``` |
Explain the below problem:
```
#include <stdio.h>

int main(void) {
    static char s[25] = "vector";
    int i = 0;
    char ch;

    ch = s[++i];
    printf("%c", ch);
    ch = s[i++];
    printf("%c", ch);
    ch = i++[s];
    printf("%c", ch);
    ch = ++i[s];
    printf("%c", ch);
    return 0;
}
```
OUTPUT :: eecu
I'm trying to figure out how the word 'eec' came to be, but I'm having some trouble. Would you mind sharing your thoughts on this with me? |
I'm trying to solve a non-linear equation with nleqslv, and I know a priori if I want a positive or negative solution (it's a dataset on choice under risk, and I'm trying to compute the risk aversion coefficient for each individual under CRRA assumption. Since I can observe the DMs' choices, I already know if each DM is risk averse or not). Is there any way to enforce it?
I know I could try with different initial values; however, I'd like to find another way as I'm using nleqslv in a for loop (I must compute one solution for each observation) and I can't find an initial guess that works for everyone.
My code is the following:
```r
library(nleqslv)

bernoulli <- function(x, r) {
  ifelse(x >= 0, ((x + 1)^(1 - r) - 1) / (1 - r), -((-x + 1)^(1 - r) - 1) / (1 - r))
}

bernoulli.log <- function(x) {
  ifelse(x >= 0, log(x + 1), -log(-x + 1))
}

mydata$alpha_crra <- NA

for (i in 1:nrow(mydata)) {
  indiff.eq.crra <- function(r) {
    return(bernoulli(mydata$CE[i], r) - mydata$p[i] * bernoulli(mydata$win[i], r)
           - (1 - mydata$p[i]) * bernoulli(mydata$lose[i], r))
  }
  mydata$alpha_crra[i] <- ifelse(mydata$riskneutral[i] == 1, 0,
    ifelse(abs(bernoulli.log(mydata$CE[i]) -
               mydata$p[i] * bernoulli.log(mydata$win[i]) -
               (1 - mydata$p[i]) * bernoulli.log(mydata$lose[i])) < 0.001 &
             mydata$riskaverse[i] == 1, 1,
           nleqslv(-5, indiff.eq.crra)$x))
}
```
where:

- `mydata$win[i]` = the high payoff of the lottery (can change depending on the observation)
- `mydata$lose[i]` = the low payoff of the lottery
- `mydata$p[i]` = the probability of winning the high payoff
- `mydata$CE[i]` = the certainty equivalent stated by DM i
- `mydata$riskneutral[i]` = a dummy variable, 1 if i is risk neutral, 0 otherwise
- `mydata$riskaverse[i]` = a dummy variable, 1 if i is risk averse, 0 otherwise
|
Is there a way to force nleqslv to find positive solutions? |
|r| |
null |
So when checking for duplicates in an array (that's what I think the algorithm is for; correct me if I'm wrong), you want to check each element against each other element once. Let's have a 2D array `pairs` of all possible pairs of elements you can check.
For an array with 4 elements, the 2D array would look like this:
```
[(0,0),(0,1),(0,2),(0,3),
(1,0),(1,1),(1,2),(1,3),
(2,0),(2,1),(2,2),(2,3),
(3,0),(3,1),(3,2),(3,3)]
```
We can traverse this 2D array like you said.
```
double[] array = new double[] {1, 2, 3, 4};
for (int i = 0; i < array.length; i++) {
    for (int j = 0; j < array.length; j++) {
        // compare array[i] with array[j] here
    }
}
```
However, this checks both (1,0) and (0,1), which is redundant (and it also compares every element with itself along the diagonal). All we have to do is check the elements strictly above the diagonal that runs from the top left to the bottom right, which means starting j at i+1.
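As a sketch, the `j = i + 1` version visits each unordered pair exactly once, n*(n-1)/2 comparisons instead of n*n (the class and method names below are just for illustration):

```java
public class PairScan {
    // Counts the pairs visited; in a real duplicate check you would
    // compare array[i] with array[j] inside the inner loop instead.
    static int countPairs(double[] array) {
        int pairs = 0;
        for (int i = 0; i < array.length; i++) {
            for (int j = i + 1; j < array.length; j++) {
                pairs++;
            }
        }
        return pairs;
    }

    public static void main(String[] args) {
        // 4 elements -> (0,1) (0,2) (0,3) (1,2) (1,3) (2,3) = 6 pairs
        System.out.println(countPairs(new double[] {1, 2, 3, 4}));
    }
}
```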
I've set up my _app.js and store.js in what seems to be the expected format. This is my first attempt at using both, so if I'm missing something obvious, please let me know.
store.js:
```
import { configureStore } from '@reduxjs/toolkit';
import { createWrapper } from 'next-redux-wrapper';
import authReducer from '../store/slices/authSlice';
import searchReducer from '../store/slices/searchSlice';
const makeStore = () => {
console.log('config store: ', configureStore({ reducer: { auth, search } }));
return configureStore({
reducer: {
auth: authReducer,
search: searchReducer,
},
});
}
export const wrapper = createWrapper(makeStore, { debug: process.env.NODE_ENV === 'development' });
```
_app.js:
```
import { Provider } from 'react-redux';
import { wrapper } from '../store/store';
function MyApp({ Component, pageProps }) {
console.log('wrapper: ', wrapper);
return (
<Provider store={wrapper}>
<Component {...pageProps} />
</Provider>
);
}
export default MyApp;
```
The console.log of my wrapper is:
```
wrapper: {
getServerSideProps: [Function: getServerSideProps],
getStaticProps: [Function: getStaticProps],
getInitialAppProps: [Function: getInitialAppProps],
getInitialPageProps: [Function: getInitialPageProps],
withRedux: [Function: withRedux],
useWrappedStore: [Function: useWrappedStore]
}
```
The reason I've done all this is that I was originally getting a legacy error when using withRedux, which said I had to use the createWrapper option.
I've tried looking this up on here and asking ChatGPT and Gemini, and nothing explains why I'm getting TypeError: store.getState is not a function. It also looks like makeStore is never called, because the console.log in that function never logs.
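If it helps to see why the error occurs: `createWrapper` returns the wrapper object you logged, not a store, so `<Provider store={wrapper}>` hands React Redux something without a `getState` method, and nothing in the code shown ever calls `makeStore`. Your own log shows the wrapper exposes `useWrappedStore`, which is the piece that calls `makeStore` and returns the real store. A rough sketch of that wiring (untested against your exact versions):

```js
import { Provider } from 'react-redux';
import { wrapper } from '../store/store';

function MyApp({ Component, ...rest }) {
  // useWrappedStore invokes makeStore and returns the actual store
  const { store, props } = wrapper.useWrappedStore(rest);
  return (
    <Provider store={store}>
      <Component {...props.pageProps} />
    </Provider>
  );
}

export default MyApp;
```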
One way, using a Lua pattern (Lua's own flavor of regular expressions), is:

```lua
function replaceFraction(str)
  str = string.gsub(str, "(%S+)/(%S+)", "\\frac{%1}{%2}")
  return str
end
```

Examples:

```lua
print(replaceFraction("Hello a/b hi")) -- Hello \frac{a}{b} hi
print(replaceFraction("a/b hi"))       -- \frac{a}{b} hi
print(replaceFraction("Hello a/b"))    -- Hello \frac{a}{b}
```
I'm trying to have an element project a shadow / blurred edge onto an element behind it, causing the latter to "dissolve", without impacting the background. Hopefully the following images can better illustrate my problem.
This is what I was able to get:

While this is the result I'm trying to achieve:

This is the code I used to make the first image:
```css
html {
  height: 500px;
  background: linear-gradient(#9198E5, #E66465);
}

#back {
  position: absolute;
  top: 100px;
  left: 100px;
  width: 200px;
  height: 200px;
  background-color: #FFBBBB;
}

#front {
  position: absolute;
  top: 200px;
  left: 200px;
  width: 200px;
  height: 200px;
  background-color: #BBFFBB;
  box-shadow: -30px -30px 15px #BF7B9F;
}
```

```html
<div id="back"></div>
<div id="front"></div>
```
|
I am trying to reverse-engineer a device (an 8-bit microcontroller) that communicates with the PC. Each microcontroller has a serial number of two digits (in decimal) that is used to calculate a checksum, so for the same data message the checksum changes. Each message starts with 0x02 and ends with 0x03. The checksum is the number that appears between 0x04 and 0x03. These are the messages in hex with a translation to ASCII:
02 41 42 42 04 31 30 03 .ABB.10. //device ID :00
02 41 42 42 04 44 44 03 .ABB.DD. //device ID : 01
02 41 42 42 04 31 39 03 .ABB.19. //device ID : 56
02 41 42 42 04 35 34 03 .ABB.54. //device ID : 99
02 41 42 31 30 04 37 39 03 .AB10.79. //device ID :00
02 41 42 31 30 04 34 45 03 .AB10.4E. //device ID : 01
02 41 42 31 30 04 45 35 03 .AB10.E5. //device ID : 56
02 41 42 31 30 04 35 45 03 .AB10.5E. //device ID : 99
02 41 42 70 04 41 43 03 .ABp.AC. //device ID :00
02 41 42 70 04 36 31 03 .ABp.61. //device ID : 01
02 41 42 70 04 45 38 03 .ABp.E8. //device ID : 56
02 41 42 70 04 41 35 03 .ABp.A5. //device ID : 99
02 41 42 30 46 46 46 46 04 46 30 03 .AB0FFFF.F0. //device ID:00
02 41 42 30 46 46 46 46 04 35 34 03 .AB0FFFF.54. //device ID : 01
02 41 42 30 46 46 46 46 04 36 33 03 .AB0FFFF.63. //device ID : 56
02 41 42 30 46 46 46 46 04 31 45 03 .AB0FFFF.1E. //device ID :99
I think it is doing a kind of XOR but I can't figure out how. Also, as you can see, the checksum changes for different device IDs. I need to identify how the whole checksum is calculated from the data + device ID, or at least identify how the device ID changes the checksum between different devices. The microcontroller is an 8-bit 8051. I will be very grateful if someone can give me any clue on how it is working.
I made it work by executing the init_async function and using asynccontextmanager from FastAPI:
```python
import py_eureka_client.eureka_client as eureka_client
from fastapi import FastAPI
from contextlib import asynccontextmanager
from landus.controladores import controladorRouter

@asynccontextmanager
async def lifespan(app: FastAPI):
    await eureka_client.init_async(
        eureka_server="http://eureka-primary:8011/eureka/,http://eureka-secondary:8012/eureka/,http://eureka-tertiary:8013/eureka/",
        app_name="msprueba",
        instance_port=8000,
        instance_host="localhost"
    )
    yield

app = FastAPI(lifespan=lifespan)
app.include_router(controladorRouter)
```
The problem was that I couldn't run this async initialization code alongside FastAPI any other way; with the asynccontextmanager, the lifespan function executes the registration before FastAPI starts serving.
I have no issue with ANSI color codes and how they work but I have no idea how to exactly color the entire background with such color. I want to make it so that the entire background of the console changes to the ANSI color code. Is there a way I can do this?
The answer does not have to be the same on all platforms - I am looking for something that works for Windows. |
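One common trick (assuming a terminal that honors VT sequences; on Windows 10+ this may first require enabling virtual terminal processing, e.g. via `SetConsoleMode`) is to set the background SGR code and then erase the display, so the whole screen buffer is repainted with the current background:

```csharp
using System;

class AnsiBackground
{
    static void Main()
    {
        // \x1b[44m -> SGR: blue background for subsequently written cells
        // \x1b[2J  -> erase the whole display; most terminals fill the
        //             cleared cells with the currently selected background
        // \x1b[H   -> move the cursor to the top-left corner
        Console.Write("\x1b[44m\x1b[2J\x1b[H");
        Console.WriteLine("Hello on a blue background");
        Console.Write("\x1b[0m"); // reset attributes when done
    }
}
```

Whether `ESC[2J` repaints with the current background is terminal-dependent, so treat this as a sketch to verify on your target console.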
Is there a way I can color the entire console (like setting the background color) using ANSI color codes? |
|c#| |
I am trying to reverse-engineer a device (an 8-bit microcontroller) that communicates with the PC. Each microcontroller has a serial number of two digits (in decimal) that is used to calculate a checksum, so for the same data message the checksum changes from device to device. Each message starts with 0x02 and ends with 0x03. The checksum is the number that appears between 0x04 and 0x03. These are the messages in hex with a translation to ASCII:
02 41 42 42 04 31 30 03 .ABB.10. //device ID :00
02 41 42 42 04 44 44 03 .ABB.DD. //device ID : 01
02 41 42 42 04 31 39 03 .ABB.19. //device ID : 56
02 41 42 42 04 35 34 03 .ABB.54. //device ID : 99
02 41 42 31 30 04 37 39 03 .AB10.79. //device ID :00
02 41 42 31 30 04 34 45 03 .AB10.4E. //device ID : 01
02 41 42 31 30 04 45 35 03 .AB10.E5. //device ID : 56
02 41 42 31 30 04 35 45 03 .AB10.5E. //device ID : 99
02 41 42 70 04 41 43 03 .ABp.AC. //device ID :00
02 41 42 70 04 36 31 03 .ABp.61. //device ID : 01
02 41 42 70 04 45 38 03 .ABp.E8. //device ID : 56
02 41 42 70 04 41 35 03 .ABp.A5. //device ID : 99
02 41 42 30 46 46 46 46 04 46 30 03 .AB0FFFF.F0. //device ID:00
02 41 42 30 46 46 46 46 04 35 34 03 .AB0FFFF.54. //device ID : 01
02 41 42 30 46 46 46 46 04 36 33 03 .AB0FFFF.63. //device ID : 56
02 41 42 30 46 46 46 46 04 31 45 03 .AB0FFFF.1E. //device ID :99
I think it is doing a kind of XOR but I can't figure out how. Also, as you can see, the checksum changes for different device IDs. I need to identify how the whole checksum is calculated from the data + device ID, or at least identify how the device ID changes the checksum between different devices. The microcontroller is an 8-bit 8051. I will be very grateful if someone can give me any clue on how it is working.
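Not the algorithm itself, but a starting point for testing the XOR hypothesis: the script below (hypothesis-testing scaffolding only, the real algorithm is unknown) XORs the payload bytes of each capture and prints how that relates to the transmitted checksum per device ID; patterns in the last column are what you would look for when guessing how the ID is mixed in:

```python
def xor_bytes(data: bytes) -> int:
    """XOR-accumulate all bytes (a common 8-bit checksum building block)."""
    acc = 0
    for b in data:
        acc ^= b
    return acc

# (payload, device ID, checksum value) taken from the captures above;
# the checksum is the two ASCII hex digits parsed as one byte.
SAMPLES = [
    ("ABB", 0, 0x10), ("ABB", 1, 0xDD), ("ABB", 56, 0x19), ("ABB", 99, 0x54),
    ("ABp", 0, 0xAC), ("ABp", 1, 0x61), ("ABp", 56, 0xE8), ("ABp", 99, 0xA5),
]

for payload, dev_id, chk in SAMPLES:
    x = xor_bytes(payload.encode("ascii"))
    print(f"{payload!r} id={dev_id:2d} xor=0x{x:02X} chk=0x{chk:02X} xor^chk=0x{x ^ chk:02X}")
```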
I'm working with a quadruped robot learning locomotion behavior with DRL, using ROS 2 and Drake in C++, and I want to apply domain randomization during training.
I would like some advice from the community on how I can change the physical properties of the robot (like mass, inertia, joint friction) and of the plane dynamically during training.
Thank you.
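As a point of structure (every name below is hypothetical — this is not Drake API), domain randomization usually amounts to sampling a fresh set of physical parameters at the start of each episode and pushing them into the simulator before the reset:

```python
import random

# Hypothetical sketch: parameter names and ranges are illustrative only.
# The idea is to draw one set of randomized physical parameters per
# training episode, then write them into the simulator before resetting.
PARAM_RANGES = {
    "body_mass_scale": (0.8, 1.2),  # multiply nominal link masses
    "joint_friction":  (0.0, 0.2),  # viscous friction coefficient
    "ground_friction": (0.4, 1.0),  # plane friction coefficient
}

def sample_episode_params(rng: random.Random) -> dict:
    """Draw one randomized parameter set for an episode."""
    return {name: rng.uniform(lo, hi) for name, (lo, hi) in PARAM_RANGES.items()}

rng = random.Random(0)
params = sample_episode_params(rng)
print(params)
```

In Drake itself, if I recall correctly, per-episode values like these can be written through the plant's `Context` (e.g. `RigidBody::SetMass` takes a `Context`), which avoids rebuilding the `MultibodyPlant` each episode — but verify those setters against the Drake version you're using.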
|
How to change the physical properties of the robot and the plane to apply domain randomization? - Drake |
|drake| |
null |
|stripe-payments|checkout| |
I'm using roproxy instead of roblox.com to get around CORS measures. The request body contains ctype, the username, the password and the userId, and for the headers I'm just sending an x-csrf-token. It just returns 403. Is this an x-csrf-token thing? I couldn't find anything about it online.
Ideally, it would print the incorrect-password error, or anything but 403.
Here's the code I'm using:
```
const passwordcheck = {
  ctype: "Username",
  cvalue: "usernametomyaccount",
  password: "passwordtomyaccount",
  captchaToken: "CaptchaFromFuncaptcha",
  captchaId: "Captcha",
  captchaprovider: "PROVIDER_ARKOSE_LABS",
  userId: 889124,
};

const request = {
  method: "POST",
  headers: {
    "Content-type": "application/json",
    "X-csrf-token": "wiXzYTbS/xF3",
    Cookie: "",
  },
  body: JSON.stringify(passwordcheck),
};
```
The URL is the roproxy v2 login endpoint.
I've tried using different proxies, but nothing came of it. Filling out the request body with all the fields in the Roblox auth documentation didn't work either. |
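For what it's worth, Roblox's web API is known to use a rolling CSRF token: a POST with a missing or stale token is rejected with 403, and a fresh token comes back in the `x-csrf-token` *response* header, so clients retry once with it. The sketch below is a hypothetical wrapper (names are mine; `doFetch` stands in for `fetch`) illustrating that retry pattern:

```javascript
// Sketch of the common "retry once with the fresh CSRF token" pattern.
// `doFetch` is whatever fetch-like function you use (fetch, node-fetch, ...).
async function postWithCsrfRetry(doFetch, url, init) {
  let res = await doFetch(url, init);
  if (res.status === 403) {
    // The rejection carries a fresh token in the response headers.
    const token = res.headers.get("x-csrf-token");
    if (token) {
      init = {
        ...init,
        headers: { ...init.headers, "X-CSRF-TOKEN": token },
      };
      res = await doFetch(url, init); // retry once with the fresh token
    }
  }
  return res;
}
```

If the retried request still returns 403 with no `x-csrf-token` header in the response, the block is likely something else (captcha, cookie, or the proxy itself), not the token.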
I'm getting the error "'RNAppAuthAuthorizationFlowManager.h' file not found" on AppDelegate.h every time I build my app after migrating from react-native 0.61.5 to 0.69.12.
AppDelegate.h:
```
#import <React/RCTBridgeDelegate.h>
#import <UIKit/UIKit.h>
#import <React/RCTLinkingManager.h>
#import <AppAuth/AppAuth.h>
#import "RNAppAuthAuthorizationFlowManager.h"
@interface AppDelegate : UIResponder <UIApplicationDelegate, RCTBridgeDelegate, RNAppAuthAuthorizationFlowManager>
@property (nonatomic, strong) UIWindow *window;
@property(nonatomic, weak)id<RNAppAuthAuthorizationFlowManagerDelegate>authorizationFlowManagerDelegate;
@end
```
AppDelegate.mm:
```
#import <React/RCTBridge.h>
#import "AppDelegate.h"
#import <React/RCTBundleURLProvider.h>
#import <React/RCTRootView.h>
#import <React/RCTAppSetupUtils.h>
#if RCT_NEW_ARCH_ENABLED
#import <React/CoreModulesPlugins.h>
#import <React/RCTCxxBridgeDelegate.h>
#import <React/RCTFabricSurfaceHostingProxyRootView.h>
#import <React/RCTSurfacePresenter.h>
#import <React/RCTSurfacePresenterBridgeAdapter.h>
#import <ReactCommon/RCTTurboModuleManager.h>
#import <react/config/ReactNativeConfig.h>
#import <Firebase.h>
#import <PushIOManager/PushIOManager.h>
#import <UserNotifications/UserNotifications.h>
#import <FBSDKCoreKit/FBSDKCoreKit.h>
#import <React/RCTLinkingManager.h>
static NSString *const kRNConcurrentRoot = @"concurrentRoot";
@interface AppDelegate () <UNUserNotificationCenterDelegate, RCTCxxBridgeDelegate, RCTTurboModuleManagerDelegate> {
RCTTurboModuleManager *_turboModuleManager;
RCTSurfacePresenterBridgeAdapter *_bridgeAdapter;
std::shared_ptr<const facebook::react::ReactNativeConfig> _reactNativeConfig;
facebook::react::ContextContainer::Shared _contextContainer;
}
@end
#endif
@implementation AppDelegate
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
RCTAppSetupPrepareApp(application);
if ([FIRApp defaultApp] == nil) {
[FIRApp configure];
}
RCTBridge *bridge = [[RCTBridge alloc] initWithDelegate:self launchOptions:launchOptions];
#if RCT_NEW_ARCH_ENABLED
_contextContainer = std::make_shared<facebook::react::ContextContainer const>();
_reactNativeConfig = std::make_shared<facebook::react::EmptyReactNativeConfig const>();
_contextContainer->insert("ReactNativeConfig", _reactNativeConfig);
_bridgeAdapter = [[RCTSurfacePresenterBridgeAdapter alloc] initWithBridge:bridge contextContainer:_contextContainer];
bridge.surfacePresenter = _bridgeAdapter.surfacePresenter;
#endif
NSDictionary *initProps = [self prepareInitialProps];
UIView *rootView = RCTAppSetupDefaultRootView(bridge, @"stix", initProps);
if (@available(iOS 13.0, *)) {
rootView.backgroundColor = [UIColor systemBackgroundColor];
} else {
rootView.backgroundColor = [UIColor whiteColor];
}
self.window = [[UIWindow alloc] initWithFrame:[UIScreen mainScreen].bounds];
UIViewController *rootViewController = [UIViewController new];
rootViewController.view = rootView;
self.window.rootViewController = rootViewController;
[self.window makeKeyAndVisible];
[UNUserNotificationCenter currentNotificationCenter].delegate= self;
#ifdef DEBUG
[[PushIOManager sharedInstance] setLoggingEnabled:YES];
[[PushIOManager sharedInstance] setLogLevel:PIOLogLevelInfo];
#else
[[PushIOManager sharedInstance] setLoggingEnabled:NO];
#endif
[[PushIOManager sharedInstance] didFinishLaunchingWithOptions:launchOptions];
[PushIOManager sharedInstance].notificationPresentationOptions = UNNotificationPresentationOptionAlert|UNNotificationPresentationOptionSound|UNNotificationPresentationOptionBadge;
[[FBSDKApplicationDelegate sharedInstance] application:application
didFinishLaunchingWithOptions:launchOptions];
return YES;
}
- (BOOL)application:(UIApplication *)application
openURL:(NSURL *)url
options:(NSDictionary<UIApplicationOpenURLOptionsKey,id> *)options
{
if ([self.authorizationFlowManagerDelegate resumeExternalUserAgentFlowWithURL:url]) {
return YES;
}
return [RCTLinkingManager application:application openURL:url options:options];
}
- (BOOL) application:(UIApplication *)application
continueUserActivity:(nonnull NSUserActivity *)userActivity
restorationHandler:(nonnull void (^)(NSArray<id<UIUserActivityRestoring>> * _Nullable))restorationHandler
{
if ([userActivity.activityType isEqualToString:NSUserActivityTypeBrowsingWeb]) {
if (self.authorizationFlowManagerDelegate) {
BOOL resumableAuth = [self.authorizationFlowManagerDelegate resumeExternalUserAgentFlowWithURL:userActivity.webpageURL];
if (resumableAuth) {
return YES;
}
}
}
return [RCTLinkingManager application:application
continueUserActivity:userActivity
restorationHandler:restorationHandler];
}
/// This method controls whether the `concurrentRoot` feature of React 18 is turned on or off.
///
/// @see: https://reactjs.org/blog/2022/03/29/react-v18.html
/// @note: This requires rendering on Fabric (i.e. on the New Architecture).
/// @return: `true` if the `concurrentRoot` feature is enabled. Otherwise, it returns `false`.
- (BOOL)concurrentRootEnabled
{
// Switch this bool to turn on and off the concurrent root
return true;
}
- (NSDictionary *)prepareInitialProps
{
NSMutableDictionary *initProps = [NSMutableDictionary new];
#ifdef RCT_NEW_ARCH_ENABLED
initProps[kRNConcurrentRoot] = @([self concurrentRootEnabled]);
#endif
return initProps;
}
- (NSURL *)sourceURLForBridge:(RCTBridge *)bridge
{
#if DEBUG
return [[RCTBundleURLProvider sharedSettings] jsBundleURLForBundleRoot:@"index"];
#else
return [[NSBundle mainBundle] URLForResource:@"main" withExtension:@"jsbundle"];
#endif
}
#if RCT_NEW_ARCH_ENABLED
#pragma mark - RCTCxxBridgeDelegate
- (std::unique_ptr<facebook::react::JSExecutorFactory>)jsExecutorFactoryForBridge:(RCTBridge *)bridge
{
_turboModuleManager = [[RCTTurboModuleManager alloc] initWithBridge:bridge
delegate:self
jsInvoker:bridge.jsCallInvoker];
return RCTAppSetupDefaultJsExecutorFactory(bridge, _turboModuleManager);
}
#pragma mark RCTTurboModuleManagerDelegate
- (Class)getModuleClassFromName:(const char *)name
{
return RCTCoreModulesClassProvider(name);
}
- (std::shared_ptr<facebook::react::TurboModule>)getTurboModule:(const std::string &)name
jsInvoker:(std::shared_ptr<facebook::react::CallInvoker>)jsInvoker
{
return nullptr;
}
- (std::shared_ptr<facebook::react::TurboModule>)getTurboModule:(const std::string &)name
initParams:
(const facebook::react::ObjCTurboModule::InitParams &)params
{
return nullptr;
}
- (id<RCTTurboModule>)getModuleInstanceFromClass:(Class)moduleClass
{
return RCTAppSetupDefaultModuleFromClass(moduleClass);
}
#endif
- (void)application:(UIApplication *)application didRegisterForRemoteNotificationsWithDeviceToken:
(NSData *)deviceToken
{
[[PushIOManager sharedInstance] didRegisterForRemoteNotificationsWithDeviceToken:deviceToken];
}
- (void)application:(UIApplication *)application didFailToRegisterForRemoteNotificationsWithError:(NSError *)error
{
[[PushIOManager sharedInstance] didFailToRegisterForRemoteNotificationsWithError:error];
}
- (void)application:(UIApplication *)application didReceiveRemoteNotification:(NSDictionary *)userInfo {
[[PushIOManager sharedInstance] didReceiveRemoteNotification:userInfo];
NSDictionary *payload = [userInfo objectForKey:@"aps"];
NSString *alertMessage = [payload objectForKey:@"alert"];
UIAlertView *alertView = [[UIAlertView alloc] initWithTitle:nil message:alertMessage delegate:self cancelButtonTitle:@"OK" otherButtonTitles:nil];
[alertView show];
}
-(void) userNotificationCenter:(UNUserNotificationCenter *)center didReceiveNotificationResponse:
(UNNotificationResponse *)response withCompletionHandler:(void(^)())completionHandler
{
[[PushIOManager sharedInstance] userNotificationCenter:center didReceiveNotificationResponse:response
withCompletionHandler:completionHandler];
}
-(void) userNotificationCenter:(UNUserNotificationCenter *)center willPresentNotification:
(UNNotification *)notification withCompletionHandler:(void (^)(UNNotificationPresentationOptions options))completionHandler
{
[[PushIOManager sharedInstance] userNotificationCenter:center willPresentNotification:notification
withCompletionHandler:completionHandler];
}
@end
```
Podfile:
```
require_relative '../node_modules/react-native/scripts/react_native_pods'
require_relative '../node_modules/@react-native-community/cli-platform-ios/native_modules'
platform :ios, '12.4'
install! 'cocoapods', :deterministic_uuids => false
def shared_pods
$RNFirebaseAsStaticFramework = true
pod 'Firebase', :modular_headers => true
pod 'FirebaseCore', :modular_headers => true
pod 'FirebaseCoreInternal', :modular_headers => true
pod 'GoogleUtilities', :modular_headers => true
pod 'Permission-AppTrackingTransparency', :path => "../node_modules/react-native-permissions/ios/AppTrackingTransparency"
pod 'React-RCTLinking', :path => '../node_modules/react-native/Libraries/LinkingIOS'
end
target 'stix dev' do
config = use_native_modules!
# Flags change depending on the env values.
flags = get_default_flags()
use_react_native!(
:path => config[:reactNativePath],
# to enable hermes on iOS, change `false` to `true` and then install pods
:hermes_enabled => false,
:fabric_enabled => flags[:fabric_enabled],
# :flipper_configuration => FlipperConfiguration.enabled,
# An absolute path to your application root.
:app_path => "#{Pod::Config.instance.installation_root}/.."
)
use_frameworks! :linkage => :static
# Pods for stix
shared_pods
def find_and_replace(dir, findstr, replacestr)
Dir[dir].each do |name|
text = File.read(name)
replace = text.gsub(findstr,replacestr)
if text != replace
puts "Fix: " + name
File.open(name, "w") { |file| file.puts replace }
STDOUT.flush
end
end
Dir[dir + '*/'].each(&method(:find_and_replace))
end
post_install do |installer|
react_native_post_install(installer)
__apply_Xcode_12_5_M1_post_install_workaround(installer)
end
end
target 'stix dev (embedded)' do
# Pods for stix
shared_pods
end
target 'stix staging' do
# Pods for stix
shared_pods
end
target 'stix staging (embedded)' do
# Pods for stix
shared_pods
end
target 'stix' do
# Pods for stix
shared_pods
end
target 'stix (embedded)' do
# Pods for stix
shared_pods
end
```
package.json:
```
{
"name": "@stix/stix-embedded",
"version": "2.5.118",
"private": false,
"scripts": {
"build": "npm run clean && npm run build:js && npm run copy",
"clean": "node ./tasks/clean",
"build:types": "echo tsc --emitDeclarationOnly",
"build:js": "tsc --build",
"copy": "node ./tasks/copy",
"android": "npx react-native run-android",
"ios": "npx react-native run-ios",
"ios-dev": "ENVFILE=.env.development npx react-native run-ios --scheme='stix dev'",
"ios-dev-embedded": "ENVFILE=.env.development-embedded npx react-native run-ios --scheme='stix dev (embedded)'",
"ios-staging": "ENVFILE=.env.staging npx react-native run-ios --scheme='stix staging'",
"ios-SE-staging": "ENVFILE=.env.staging npx react-native run-ios --scheme='stix staging' --simulator='iPhone SE (2nd generation)'",
"ipad-mini-staging": "ENVFILE=.env.staging npx react-native run-ios --scheme='stix staging' --simulator='iPad mini (6th generation)'",
"ipad-staging": "ENVFILE=.env.staging npx react-native run-ios --scheme='stix staging' --simulator='iPad Pro (11-inch) (3rd generation)'",
"ios-SE-staging-embedded": "ENVFILE=.env.staging-embedded npx react-native run-ios --scheme='stix staging (embedded)' --simulator='iPhone SE (2nd generation)'",
"ios-staging-embedded": "ENVFILE=.env.staging-embedded npx react-native run-ios --scheme='stix staging (embedded)'",
"ios-8-staging": "ENVFILE=.env.staging npx react-native run-ios --scheme='stix staging' --simulator='iPhone 8'",
"ios-8-staging-embedded": "ENVFILE=.env.staging-embedded npx react-native run-ios --scheme='stix staging (embedded)' --simulator='iPhone 8'",
"ios-8-dev-embedded": "ENVFILE=.env.development-embedded npx react-native run-ios --scheme='stix dev (embedded)' --simulator='iPhone 8'",
"ios-production": "echo 'Esta build funciona apenas aparelhos físicos' && ENVFILE=.env.production npx react-native run-ios --scheme='stix'",
"ios-production-embedded": "echo 'Esta build funciona apenas aparelhos físicos' && ENVFILE=.env.production-embedded npx react-native run-ios --scheme='stix (embedded)'",
"android-dev": "ENVFILE=.env.development npx react-native run-android",
"android-dev-embedded": "ENVFILE=.env.development-embedded npx react-native run-android",
"android-staging": "ENVFILE=.env.staging npx react-native run-android",
"android-staging-for-windows": "SET ENVFILE=.env.staging && npx react-native run-android",
"android-staging-embedded": "ENVFILE=.env.staging-embedded npx react-native run-android",
"android-production": "ENVFILE=.env.production npx react-native run-android",
"android-production-embedded": "ENVFILE=.env.production-embedded npx react-native run-android",
"clean-project": "rm -rf node_modules/ && npm cache clean -f && npm install && cd ios && pod install && cd ..",
"android-apk-development": "cd android && ENVFILE=.env.development ./gradlew assembleRelease && cd ..",
"android-apk-staging": "cd android && ENVFILE=.env.staging ./gradlew assembleRelease && open ./app/build/outputs/apk/release/ && cd ..",
"android-apk-production": "cd android && ENVFILE=.env.production ./gradlew assembleRelease && cd ..",
"android-aab-production": "cd android && ENVFILE=.env.production ./gradlew bundleRelease && cd ..",
"publish-embedded": "npm run publish-embedded-gpa && npm run publish-embedded-rd",
"publish-embedded-gpa": "node --max-old-space-size=8000 `which npm` publish",
"publish-embedded-rd": "cat package_rd.json > package.json && node --max-old-space-size=8000 `which npm` publish && git checkout package.json",
"android-apk-embedded": "cd android && ./gradlew assembleRelease && cd ..",
"start": "react-native start",
"test": "jest",
"test:dev": "jest --watchAll",
"test:cov": "jest src --coverage --coverageReporters=cobertura --coverageReporters=lcov",
"lint": "eslint",
"package-staging": "ENVFILE=.env.staging"
},
"rnpm": {
"ios": {},
"android": {},
"assets": [
"./src/assets/fonts"
]
},
"dependencies": {
"@fortawesome/fontawesome-svg-core": "^1.2.35",
"@fortawesome/free-regular-svg-icons": "^5.15.3",
"@fortawesome/free-solid-svg-icons": "^5.15.3",
"@fortawesome/react-native-fontawesome": "^0.2.6",
"@oracle/react-native-pushiomanager": "6.52.2",
"@react-native-async-storage/async-storage": "^1.22.2",
"@react-native-community/clipboard": "^1.5.1",
"@react-native-community/geolocation": "^2.0.2",
"@react-native-firebase/dynamic-links": "18.7.3",
"@react-navigation/bottom-tabs": "^5.11.11",
"@reduxjs/toolkit": "^1.9.7",
"@sentry/react-native": "5.16.0",
"@stix/authentication": "1.1.1",
"@stix/react-native-sec4u": "1.0.5",
"@types/styled-components-react-native": "^5.1.1",
"axios": "^1.6.7",
"js-sha256": "^0.9.0",
"jwt-decode": "^2.2.0",
"moment": "^2.29.1",
"obfuscator-io-metro-plugin": "2.1.3",
"react-dom": "18.0.0",
"react-native-app-auth": "7.1.3",
"react-native-config": "^1.0.0",
"react-native-device-info": "^8.1.7",
"react-native-fbsdk-next": "^7.0.1",
"react-native-masked-text": "^1.12.5",
"react-native-modal": "^11.10.0",
"react-native-permissions": "3.10.1",
"react-native-reanimated": "2.17.0",
"react-native-recaptcha-that-works": "^1.2.0",
"react-native-render-html": "^5.1.0",
"react-native-swiper": "^1.6.0",
"react-native-webview": "11.26.1",
"react-redux": "8.1.3",
"redux-persist": "6.0.0",
"styled-components": "^5.2.3"
},
"devDependencies": {
"@babel/core": "^7.12.9",
"@babel/runtime": "^7.12.5",
"@react-native-community/masked-view": "^0.1.10",
"@react-native-firebase/analytics": "18.7.3",
"@react-native-firebase/app": "18.7.3",
"@react-native-firebase/remote-config": "18.7.3",
"@react-navigation/native": "^5.3.0",
"@react-navigation/stack": "^5.3.2",
"@testing-library/jest-native": "^5.4.3",
"@testing-library/react-native": "^12.4.3",
"@types/jest": "^29.2.2",
"@types/node": "^14.14.37",
"@types/react": "^18.2.48",
"@types/react-native": "^0.64.2",
"@types/react-redux": "^7.1.33",
"@types/styled-components": "^5.1.9",
"@typescript-eslint/eslint-plugin": "^4.22.0",
"@typescript-eslint/parser": "^4.22.0",
"babel-jest": "^26.6.3",
"babel-plugin-root-import": "^6.6.0",
"eslint": "^7.32.0",
"eslint-config-airbnb": "^18.2.1",
"eslint-config-prettier": "^8.2.0",
"eslint-import-resolver-typescript": "^2.4.0",
"eslint-plugin-import": "^2.22.1",
"eslint-plugin-jsx-a11y": "^6.4.1",
"eslint-plugin-prettier": "^3.4.0",
"eslint-plugin-react": "^7.21.5",
"eslint-plugin-react-hooks": "^4",
"eslint-plugin-simple-import-sort": "^7.0.0",
"jail-monkey": "2.8.0",
"jest": "^29.7.0",
"jest-styled-components": "^7.1.1",
"jest-svg-transformer": "^1.0.0",
"metro-react-native-babel-preset": "^0.70.3",
"patch-package": "^6.4.7",
"prettier": "^2.2.1",
"react": "18.0.0",
"react-native": "0.69.12",
"react-native-adjust": "^4.32.1",
"react-native-gesture-handler": "^1.10.3",
"react-native-rate": "^1.2.6",
"react-native-safe-area-context": "3.2.0",
"react-native-screens": "2.18.1",
"react-native-svg": "^14.1.0",
"react-native-svg-transformer": "^1.3.0",
"react-test-renderer": "18.0.0",
"reactotron-react-native": "^5.0.0",
"typescript": "^4.2.3"
},
"publishConfig": {
"@stix:registry": "https://pkgs.dev.azure.com/StixFidelidade/_packaging/Stix-package/npm/registry/"
},
"jest": {
"preset": "react-native"
}
}
```
I've already tried to manually link the dependency following [this](https://github.com/FormidableLabs/react-native-app-auth/issues/80) but none of the solutions worked. Any ideas, please? |
'RNAppAuthAuthorizationFlowManager.h' file not found |
|ios|xcode|react-native|appdelegate|appauth| |
null |
I want the date and time to be formatted based on the criteria (the objects below) for each datetime component, with the separators between them taken from the locale. Is there a way to achieve this in Java?
{ year: numeric, month: long, day: twoDigit, weekday: short }
{ year: numeric, month: twoDigit, day: twoDigit }
For example, consider the second object with Locale.US and Locale.FRANCE.
For Locale.US the date should be 26/03/2024.
For Locale.FRANCE the date should be 26.03.2024.
I know I can use `DateTimeFormatter.ofLocalizedDate()`, which gives me the choice of four format styles for each locale, from short to full. I am after the greater flexibility of specifying for each field whether it should be numeric or textual and how long, the same as I can when specifying a format pattern string, but still with the right delimiters for the locale.
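A stdlib-only sketch of one approach: take the locale's own SHORT date pattern (which carries the locale's field order and separators) and widen the fields to the widths you want. The regex-based widening is my assumption, not a JDK facility, and it is naive about quoted literal text inside patterns:

```java
import java.time.LocalDate;
import java.time.chrono.IsoChronology;
import java.time.format.DateTimeFormatter;
import java.time.format.DateTimeFormatterBuilder;
import java.time.format.FormatStyle;
import java.util.Locale;

public class LocaleDatePattern {
    // Widen the locale's SHORT date pattern to two-digit day/month
    // and four-digit year, keeping the locale's own separators.
    static String twoDigitPattern(Locale locale) {
        String p = DateTimeFormatterBuilder.getLocalizedDateTimePattern(
                FormatStyle.SHORT, null, IsoChronology.INSTANCE, locale);
        return p.replaceAll("d+", "dd")
                .replaceAll("M+", "MM")
                .replaceAll("y+", "yyyy");
    }

    public static void main(String[] args) {
        LocalDate d = LocalDate.of(2024, 3, 26);
        for (Locale loc : new Locale[] {Locale.US, Locale.FRANCE}) {
            String pattern = twoDigitPattern(loc);
            DateTimeFormatter f = DateTimeFormatter.ofPattern(pattern, loc);
            System.out.println(loc + ": " + pattern + " -> " + d.format(f));
        }
    }
}
```

For true skeleton-based formatting (closest to the JavaScript `Intl.DateTimeFormat` options above), ICU4J's `DateTimePatternGenerator.getBestPattern` maps a field skeleton such as `yyMMdd` to the locale's preferred pattern, separators included.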
|
|java|datetime|localization|date-formatting| |
The error is as follows:
```
C:\Users\user\Desktop\attachment_perfomance_munashe-main\attachment_perfomance_munashe-main>composer update
Loading composer repositories with package information
Updating dependencies
Your requirements could not be resolved to an installable set of packages.

  Problem 1
    - laravel/framework[v10.10.0, ..., v10.48.4] require league/flysystem ^3.8.0 -> satisfiable by league/flysystem[3.8.0, ..., 3.26.0].
    - league/flysystem[3.3.0, ..., 3.14.0] require league/mime-type-detection ^1.0.0 -> satisfiable by league/mime-type-detection[1.0.0, ..., 1.15.0].
    - league/flysystem[3.15.0, ..., 3.26.0] require league/flysystem-local ^3.0.0 -> satisfiable by league/flysystem-local[3.15.0, ..., 3.25.1].
    - league/mime-type-detection[1.0.0, ..., 1.3.0] require php ^7.2 -> your php version (8.2.7) does not satisfy that requirement.
    - league/mime-type-detection[1.4.0, ..., 1.15.0] require ext-fileinfo * -> it is missing from your system. Install or enable PHP's fileinfo extension.
    - league/flysystem-local[3.15.0, ..., 3.25.1] require ext-fileinfo * -> it is missing from your system. Install or enable PHP's fileinfo extension.
    - Root composer.json requires laravel/framework ^10.10 -> satisfiable by laravel/framework[v10.10.0, ..., v10.48.4].

To enable extensions, verify that they are enabled in your .ini files:
    - C:\php\php.ini
You can also run `php --ini` in a terminal to see which files are used by PHP in CLI mode.
Alternatively, you can run Composer with `--ignore-platform-req=ext-fileinfo` to temporarily ignore these required extensions.
```
I enabled the fileinfo extension in the config file. |
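Since Composer reads the CLI's php.ini, it's worth confirming which ini the CLI actually loads (`php --ini`) and that the `extension=fileinfo` line in *that* file is uncommented. The commands below demonstrate the edit on a throwaway sample file — point the same `sed` at the real php.ini reported by `php --ini` (paths here are illustrative):

```shell
# On Windows the stock php.ini ships fileinfo commented out with ";".
# Demonstrate the fix on a sample file; apply the sed to your real php.ini.
printf '%s\n' ';extension=fileinfo' ';extension=curl' > sample-php.ini

# Remove the leading ";" so PHP loads the extension.
sed -i 's/^;extension=fileinfo/extension=fileinfo/' sample-php.ini

grep '^extension=fileinfo' sample-php.ini  # the line is now active
```

After editing the real php.ini, verify with `php -m | grep -i fileinfo` (or `php -m | findstr fileinfo` on Windows) before re-running `composer update`.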