|android|gpio|raspberry-pi4| |
I am trying to make a landing page for a t-shirt store using Vite + React + React Three Fiber (R3F). I just want to make my model pop up or out of its container. I set my div to flex to get the basic layout.
[enter image description here](https://i.stack.imgur.com/G1yEa.png) This is what I have done so far |
How to make React Three Fiber (R3f) model Pop out? |
|reactjs|react-three-fiber|react-three-drei| |
1. Are you sure you did not accidentally update to 4.27.**3** or later? I got exactly your problem after I installed the 4.28.0 version - see below...
2. You need Hyper-V enabled for this: is it working correctly on your machine? If you are using Windows Home Edition there is no chance: upgrade your Windows to Professional Edition - see maybe [tag:docker-for-windows]?
From my point of view, at this time Docker Desktop version 4.28.0 seems to have a problem with at least Windows 10, because after I uninstalled 4.28.0 and replaced it with a fresh install of Docker Desktop version 4.27.2 (see [Docker Desktop release notes][2]), everything works fine for me with VS 2022 and ASP.NET 8.
... don't update Docker Desktop until this is fixed! ;)
In [GitHub, docker/for-win: ERROR: request returned Internal Server Error for API route and version...][1] there is a hint about upgrading WSL2, which might help too.
[1]: https://github.com/docker/for-win/issues/13909
[2]: https://docs.docker.com/desktop/release-notes/#4272 |
I have Error Code 33 Shown By the Motherboard LED.
This is my system information:
Operating System: Windows 10 Pro 64-bit (10.0, Build 19045)
System Manufacturer: Gigabyte Technology Co., Ltd.
System Model: AX370-Gaming K7
BIOS: F51i (type: UEFI)
Processor: AMD Ryzen 3 1200 Quad-Core Processor (4 CPUs), ~3.1GHz
Memory: 16384MB RAM
Available OS Memory: 16306MB RAM
DirectX Version: DirectX 12
This is my display information:
Card name: NVIDIA GeForce GTX 1070
Manufacturer: NVIDIA
Chip type: NVIDIA GeForce GTX 1070
Recently I started to have unexpected OS shutdowns (Kernel Power Error 41), so I started to diagnose the cause.
I installed a fresh Windows 10 and updated all drivers and the motherboard's BIOS, but the problem still happens randomly. Sometimes the PC works for hours and sometimes it shuts down within a few minutes.
When the PC shuts down, it won't turn on when I press the power button on the case. I can see the motherboard has power since the power button on the motherboard is red. I have to switch off the power supply button and, once the motherboard power is gone, switch it back on and press the power button on the case.
Any suggestion?
Thanks in advance |
How do I find the Workflow corresponding to a specific Fiori-App? |
{"OriginalQuestionIds":[43632],"Voters":[{"Id":3832970,"DisplayName":"Wiktor Stribiżew","BindingReason":{"GoldTagBadge":"regex"}}]} |
I used a react-select multi-select to choose a value in React and made a POST request to an API. The label of the select displays a name, while what is sent to the database is an id that looks like this: "65a531c7e04e780a3a702fdd". Now I want to edit that same item and make a PUT request, so I need to preload the select input, but what comes back from the API is just that id without the name.
Something like this -
```
"data": {
    "_id": "65db636adc11e7260bd8f27e",
    "nameOfFood": "Hanae Vance",
    "kitchenId": "65a52c333972416e3ef579af",
    "dishTypeId": "65a52cbf3972416e3ef57ba8",
    "image": "https://pocketfood.s3.us-east-2.amazonaws.com/image/undefined/1708876620.jpeg",
    "description": "Nulla ipsam nihil la",
    "pricePerServing": 2000,
    "available": true,
    "isRecommendation": false,
    "mustHaveSideFood": true,
    "sideFoods": [
        "65a531c6e04e780a3a702fc6"
    ],
    "compulsoryExtras": true,
    "extras": [
        "65a531c7e04e780a3a702fcc"
    ],
    "menuIds": [
        "65a52d183972416e3ef57d3a"
    ],
    "foodTags": [
        "65a52c373972416e3ef57a30"
    ],
    "allergies": [],
    "isSpecial": false,
    "createdAt": "2024-02-25T15:57:30.931Z",
    "updatedAt": "2024-02-25T15:57:30.931Z",
    "__v": 0
}
```
In this response, the properties sideFoods, extras, menuIds and foodTags return an array of just the ids. I figured I could work with this, since the options of the select input are the same data the POST request uses, so I thought getting the id would make the name show in the select - but I was wrong. What is preloading in the input field is the id instead of the name.
```
const handleChange = (newValue: any, actionMeta: ActionMeta<Options>) => {
  if (actionMeta.name === "foodTags") {
    setFoodInput((prevDetails: any) => ({
      ...prevDetails,
      foodTags: newValue ? newValue.map((option: any) => option.value) : [],
    }));
  } else if (actionMeta.name === "menuIds") {
    setFoodInput((prevDetails: any) => ({
      ...prevDetails,
      menuIds: newValue ? newValue.map((option: any) => option.value) : [],
    }));
  } else if (actionMeta.name === "extras") {
    setFoodInput((prevDetails: any) => ({
      ...prevDetails,
      extras: newValue.map((option: any) => option.label),
    }));
  } else if (actionMeta.name === "sideFoods") {
    setFoodInput((prevDetails: any) => ({
      ...prevDetails,
      sideFoods: newValue.map((option: any) => option.value),
    }));
  }
};
```
Above is the onChange handler of the select I am using, and this is the react-select component:
```
<Select
  options={menuOption}
  key={menu?._id}
  className="basic-multi-select"
  classNamePrefix="select"
  name="menuIds"
  isMulti
  onChange={handleChange}
  // value={foodInput?.menuIds?.map((menu: any) => ({
  //   label: menu,
  //   value: menu,
  // }))}
  // value={menuOption.filter((option) =>
  //   foodInput?.menuIds?.includes(option.value)
  // )}
  value={foodInput?.menuIds?.map((menuId: any) => ({
    label: menu.find((m: any) => m._id === menuId)?.title,
    value: menuId,
  }))}
  styles={customStyles}
/>
```
As you can see, I have tried several ways to preload the value so it shows the name instead of the id. The first commented value is the one that preloads the id, while the last one (uncommented) is what ChatGPT suggested to get the name - but I got an empty string instead. I also reached out to the backend engineer to return both the name and the id, but when I try to access data.title I get the same empty string as with the ChatGPT suggestion, which made me realize I am doing something wrong. Can someone please help me out with this?
[![In this image, menu can be seen as just selecting something without it being shown. this is the result of when I use chatgpt suggestion and this was how it looked like when I also told the backend engineer to help return the label][1]][1]
[![This image shows when the value is being preloaded with the id][2]][2]
[1]: https://i.stack.imgur.com/65eRf.png
[2]: https://i.stack.imgur.com/ayUHf.png
In addition, when the situation is as in image 2, if I select something, it is the id that shows - so apart from preloading the id, even selecting an option shows the id. I can change the onChange function to use `option.label` instead of `option.value`, but the database does not accept the name; it accepts the id instead.
I don't know if I have explained my issue accurately, but I would appreciate any form of advice. |
|javascript|reactjs|react-select|preloading| |
From `zeromq` v3 onwards, all subscriptions are filtered on the publisher side. When you subscribe on your `ZMQ_SUB` socket, the filter information is sent to the `ZMQ_PUB` socket and applied there.
So the answer is no: your `ZMQ_SUB` process, socket, network etc. will not see the first or second frame if they are not subscribed to "huge_blob".
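A minimal pyzmq sketch of that behaviour (the `inproc://demo` endpoint is just for illustration; topic filtering is a prefix match on the first frame):

```python
import time
import zmq  # pyzmq

ctx = zmq.Context.instance()

pub = ctx.socket(zmq.PUB)
pub.bind("inproc://demo")  # with inproc, bind must happen before connect

sub = ctx.socket(zmq.SUB)
sub.connect("inproc://demo")
sub.setsockopt(zmq.RCVTIMEO, 2000)                  # don't block forever
sub.setsockopt_string(zmq.SUBSCRIBE, "huge_blob")   # prefix filter

time.sleep(0.2)  # give the subscription time to reach the PUB side

pub.send_string("other_topic dropped at the publisher, never hits the wire")
pub.send_string("huge_blob this one is delivered")

received = sub.recv_string()  # only the matching message arrives
more = sub.poll(timeout=200)  # nothing else is pending
print(received, more)
```

Messages whose first bytes don't match any subscription are discarded by the PUB socket itself, so the subscriber never sees them.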
|
We can iterate through rows with `sapply`, then `paste0` the string "Pop" onto your `Party` columns for indexing and summation.
```r
df$PartyRel <- sapply(
1:nrow(df),
\(x) ifelse(df[x, 1] == df[x, 2],
df[x, paste0(df[x, 1], "Pop")],
df[x, paste0(df[x, 1], "Pop")] + df[x, paste0(df[x, 2], "Pop")])
)
df
PartyA PartyB ChristianPop MuslimPop JewishPop SikhPop BuddhistPop PartyRel
1 Christian Jewish 12 71 9 0 1 21
2 Muslim Muslim 1 93 2 0 0 93
3 Muslim Christian 74 5 12 1 2 79
4 Jewish Muslim 14 86 0 0 0 86
5 Sikh Buddhist 17 13 4 10 45 55
```
|
Use below
```
select Colm_A, Colm_B, Colm_C, Colm_D
from your_table,
unnest(split(Colm_B)) Colm_B with offset as B
join unnest(split(Colm_C)) Colm_C with offset as C on B = C
join unnest(split(Colm_D)) Colm_D with offset as D on B = D
```
with output
[![enter image description here][1]][1]
[1]: https://i.stack.imgur.com/A6G71.png |
|php|regex|preg-replace| |
I am calling a .NET 7 web API POST endpoint from another C# command line application.
The endpoint consistently returns a 502 error after 2 minutes, but it continues processing even after the 502 is returned. This doesn't happen when I call a debug version locally, but it does when I call the deployed version, configured via IIS.
I have tried the following things but none seem to resolve the issue:
- Large timeout on the http client in the c# application that is calling the API
- Increasing the Connection timeout limit in IIS
- Setting keep alive timeout during start up of the web API |
|c#|.net-core|timeout|.net-7.0|http-status-code-502| |
I have encountered something new to me.
I have an object with a type parameter
```
public class myObj<T>
{
...
}
```
Now, what gives me trouble is this kind of code not being valid:
```
public myObj<IEnumerable<string>> myFunc()
{
return new myObj<List<string>>();
}
```
I initially expected this code to be valid, but after thinking about it I get why it's not.
My question is:
Is there any way to change the class declaration to make that function valid?
I Googled a bit and played with `where` constraints on the class declaration, but couldn't find anything helpful.
I suspect this can't be done, but wanted to ask more knowledgeable people |
SwiftUI NavigationBar Title using UIHostingController does not Display correctly until NSLayoutConstraint is broken |
|ios|swift|swiftui|expo|uihostingcontroller| |
This is my first project with SwiftUI - a home inventory tracker - and I have run into a problem I don't know how to solve. My code runs and functions fine, but when pressing a button to submit and save data from a form to SwiftData, the console spits out two copies of the same warning: "Modifying state during view update, this will cause undefined behavior." **I managed to narrow down the source of the issue to the TextFields in Form{}.** When I commented out the TextFields in the form closure, the warning messages in the console disappeared. I can't find anything on how to handle this.
Essentially, I have an AddItemsView file that is the screen to add an item. It contains fields to add:
- name (String)
- location (String)
- image (using PhotoPicker() )
- category (String using Picker() )
- notes (String)
I have another file called TabBarView that contains a TabView to switch between 3 tabs/screens, using an Int to set the screen:
``` lang-swift
import SwiftUI
import SwiftData
struct TabBarView: View {
@State var selection = 2 //this is used with binding to switch screens from other files
var body: some View {
TabView(selection:$selection){
AddItemView(selection: $selection)
.tabItem{
Label("Add", systemImage: "plus")
}
.tag(1)
BrowseView()
.tabItem {
Label("Browse", systemImage: "list.dash")
}
.tag(2)
SearchView()
.tabItem {
Label("Search", systemImage: "magnifyingglass")
}
.tag(3)
}
}
}
#Preview {
let container = try! ModelContainer(for: CategoryDataModel.self, ItemDataModel.self, configurations: ModelConfiguration(isStoredInMemoryOnly: true))
let tempArray = ["Miscellaneous"]
let newCategory = CategoryDataModel(categoryList: tempArray)
container.mainContext.insert(newCategory)
return TabBarView()
.modelContainer(container)
}
```
This is AddItemView:
```lang-swift
import SwiftUI
import SwiftData
import PhotosUI
struct AddItemView: View {
//SwiftData
@Query var items: [ItemDataModel]
@Query var categories: [CategoryDataModel]
@Environment(\.modelContext) var modelContext
@Binding var selection: Int //this is used to change the screen
//for the photo picker feature
@State private var photoPickerItem: PhotosPickerItem?
@State var avatarImage: UIImage?
//these will be saved using SwiftData using ItemDataModel
@State private var name = ""
@State private var category = ""
@State private var location = ""
@State private var notes = ""
@State private var imageData: Data?
var body: some View {
NavigationView{
Form{
//This section is for the required data fields
Section(header: Text("Required")){
TextField("Name", text: $name)
TextField("Location", text: $location)
}
//this section is for the optional data fields
Section(header: Text("Optional")){
//--------this lets users choose an image they own---------------------
// MARK: Photo Picker Section
PhotosPicker(selection: $photoPickerItem, matching: .images){
let chosenImage: UIImage? = avatarImage
if chosenImage != nil{
Image(uiImage: avatarImage!)
.resizable()
.aspectRatio(contentMode: .fill)
.frame(maxWidth: 80)
} else{
Text("Choose Image")
}
}
.onChange(of: photoPickerItem){ _, _ in
Task{
if let photoPickerItem,
let data = try? await photoPickerItem.loadTransferable(type: Data.self){
if let image = UIImage(data: data){
avatarImage = image
imageData = data
}
}
photoPickerItem = nil
}
}
.frame(maxWidth: .infinity)
.alignmentGuide(.listRowSeparatorLeading) { viewDimensions in
return 0
} //this is cleared
//-------------END PHOTO PICKER SECTION-------------------------------
// Category Picker
Picker("Choose Category", selection: $category){
ForEach(categories[0].categoryList, id: \.self) { cat in
Text(cat)
}
}
TextField("Notes", text: $notes, axis: .vertical)
.padding()
}
//save button
HStack{
Spacer()
Button ("Save Item"){
let item = ItemDataModel(name: name, location: location, category: category, notes: notes)
item.image = imageData
//modelContext.insert(item) //commented out for debugging
//clears form after saving
name = ""
category = ""
location = ""
self.notes = ""
imageData = nil
avatarImage = nil
//sends user to BrowseView
selection = 2 //ever since this line was added, the modifying state error has appeared. UPDATE: SOURCE FOUND - it was the textfields in form closure. commenting them out revealed the source of the error.
}
//input validation to ensure name and location are filled out
.disabled(name.isEmpty || location.isEmpty)
Spacer()
}
}
.navigationTitle("Add Item")
}
}// end body
}
#Preview {
let container = try! ModelContainer(for: CategoryDataModel.self, configurations: ModelConfiguration(isStoredInMemoryOnly: true))
let tempArray = ["Miscellaneous"]
let newCategory = CategoryDataModel(categoryList: tempArray)
container.mainContext.insert(newCategory)
return AddItemView(selection: .constant(2))
.modelContainer(container)
}
```
[Image of the AddItemView screen](https://i.stack.imgur.com/zrsZh.png)
[Image of the BrowseView screen which the AddItemView screen displays after submitting item](https://i.stack.imgur.com/zJM38.png)
I tried using a function to change the TabView value, defined outside the body, but that did not work. I also tried changing `selection = 2` to `self.selection = 2`. I know I have to change the screen outside of the view so as not to interfere while the view updates, but I do not know how.
The error still appears when commenting out the SwiftData components, so I don't think posting my SwiftData model would help, but if it would, I'll gladly show it. This has genuinely stumped me; I have never posted a question online before now. I considered just leaving it alone since the app runs fine, but I read that Apple could update SwiftUI in a way that makes this warning come back to bite me later, so I decided to fix it now.
I saw a post saying that I have to make changes outside the view so they don't interfere with SwiftUI rebuilding the view in the background; they used onAppear to contain their example code that triggered the error, but I don't know how to get that to work with TextField.
I really like the way the TabView looks in my app but I am open to other ways to handle it just to avoid this error. |
I'm using the `filtfilt` function.
Based on Warren Weckesser comment, I went to documentation and it states that
> This function applies a linear digital filter twice, once forward and
> once backwards. The combined filter has zero phase and a filter order
> twice that of the original.
https://docs.scipy.org/doc/scipy/reference/generated/scipy.signal.filtfilt.html |
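As a quick check of that zero-phase behaviour, you can compare `filtfilt` against a single-pass `lfilter` on a noisy sine (a sketch assuming SciPy is installed; the 20 Hz cutoff and 4th order are arbitrary choices for illustration):

```python
import numpy as np
from scipy.signal import butter, filtfilt, lfilter

# Noisy 5 Hz sine sampled at 1 kHz
fs = 1000.0
t = np.arange(0, 1.0, 1 / fs)
x = np.sin(2 * np.pi * 5 * t) + 0.2 * np.random.randn(t.size)

# 4th-order low-pass; filtfilt applies it twice (forward + backward),
# so the combined response has order 8 and zero phase shift.
b, a = butter(4, 20, btype="low", fs=fs)
y_zero_phase = filtfilt(b, a, x)  # no phase delay
y_one_pass = lfilter(b, a, x)     # ordinary single pass, phase-delayed

print(y_zero_phase.shape, y_one_pass.shape)
```

Plotting both against `x` makes the difference obvious: the `filtfilt` output stays aligned with the input, while the `lfilter` output lags behind it.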
## I Got the Same Error Message ##
**SOLVED**
----------
Follow these steps:
1. Upgrade to the latest version of protobuf: `pip install --upgrade protobuf`
Now you will get another error similar to: [![error][1]][1]
This is because the steps given by **https://tensorflow-object-detection-api-tutorial.readthedocs.io/en/latest/install.html** install an older version of TensorFlow that can only work with protobuf below 3.20.
2. I am assuming you are using an Anaconda environment as well. Check the path where your environment is saved using `conda info --envs`.
3. Inside your environment folder, copy **builder.py** from `.../Lib/site-packages/google/protobuf/internal` to any other folder on your computer temporarily.
4. Downgrade protobuf to a version compatible with TensorFlow. For me that's below 3.20; the last version before it is **3.19.6**.
5. Finally, copy the builder.py file back to the `.../Lib/site-packages/google/protobuf/internal` folder.
[1]: https://i.stack.imgur.com/vLP1H.png |
You could tweak it with the [Styler][1], using a *hacky* [`bar`][2]:
```
st = (
df.assign(
bar=df["Orchid"].fillna(df["Rose"]),
tmp=df["Orchid"] + df["Rose"],
)
.style.set_caption("Orders in 2020")
.bar(subset=["bar", "tmp"], color="#fb9c04", axis=1)
.set_table_styles(
[
{
"selector": "td.col2",
"props": "background-color: #668ed2"
},
{
"selector": "caption",
"props": "font-size: large; font-weight: bold",
},
],
)
.hide(subset="tmp", axis=1)
.format("", na_rep="", subset="bar")
.format("${:,.2f}", na_rep="No Data", subset=["Orchid", "Rose"])
.format_index(lambda c: c if c != "bar" else "", axis=1)
.map_index(lambda i: "font-weight: bold")
.format_index(lambda i: i + 1)
)
```
Output :
[![enter image description here][3]][3]
Used input (`df`) :
```
df = pd.DataFrame(
{
"Orchid": [
3500.0, 4800.0, 1400.0, None, 7800.0, 6800.0,
8500.0, 6200.0, 9000.0, 7300.0, 8300.0, 11300.0,
],
"Rose": [
3200.0, 2500.0, 3700.0, 5600.0, 8000.0, 3800.0,
None, 7720.0, 8380.0, 9100.0, 9700.0, 10360.0,
],
}
)
```
[1]: https://pandas.pydata.org/docs/reference/api/pandas.io.formats.style.Styler.html#
[2]: https://pandas.pydata.org/pandas-docs/stable/reference/api/pandas.io.formats.style.Styler.bar.html
[3]: https://i.stack.imgur.com/roKYO.png |
I have the following ```layout.tsx``` file:
```
import { ClerkProvider, currentUser } from "@clerk/nextjs";
import { DevMode, Footer, LogRocketInit, Nav } from "./components";
import "./global.scss";
import { Inter } from "next/font/google";
import { User, findUser } from "@/mongo/functions/user";
const inter = Inter({ subsets: ["latin"] });
export const metadata = {
title: "Commerce Karma",
};
export const runtime = "nodejs";
if (typeof setImmediate === "undefined") {
global.setImmediate = function (fn) {
setTimeout(fn, 0);
};
}
export default async function RootLayout({
children,
}: {
children: React.ReactNode;
}) {
const user = await currentUser();
let isSignedInUser = false;
let userStore: User[] | null = null;
if (user) {
isSignedInUser = true
userStore = await findUser({filters: {userId: user.id}})
};
return (
<ClerkProvider
appearance={{
variables: {
colorPrimary: "rgb(59,121,173)",
},
}}
>
<html lang="en">
<body className={inter.className}>
<LogRocketInit userProp={JSON.stringify(user)}/>
<Nav isSignedIn={isSignedInUser}/>
{children}
{userStore[0]?.isDevUser ? <DevMode /> : null}
</body>
</html>
</ClerkProvider>
);
}
```
After lots of debugging I have determined that ```userStore = await findUser({filters: {userId: user.id}})``` is the problem. Locally (```npm run dev``` and ```npm run build && npm run start```) everything is fine, but when deployed on Vercel I'm getting a blank screen across all routes of my application.
Here is my ```.swcrc``` file:
```
{
"jsc": {
"parser": {
"syntax": "typescript",
"tsx": true
},
"experimental": {
"plugins": [["swc-plugin-add-display-name", {}]]
}
}
}
```
My ```next.config.js``` file:
```
/* eslint-disable */
/** @type {import('next').NextConfig} */
const nextConfig = {
async headers() {
return [
{
source: "/api/:path*",
headers: [
{ key: "Access-Control-Allow-Credentials", value: "true" },
{ key: "Access-Control-Allow-Origin", value: "*" },
{
key: "Access-Control-Allow-Methods",
value: "GET,DELETE,PATCH,POST,PUT,OPTIONS",
},
{
key: "Access-Control-Allow-Headers",
value:
"X-CSRF-Token, X-Requested-With, Accept, Accept-Version, Content-Length, Content-MD5, Content-Type, Date, X-Api-Version, Authorization, id, filters, Api-Key",
},
],
},
];
},
webpack: (
config,
{ buildId, dev, isServer, defaultLoaders, nextRuntime, webpack }
) => {
nextRuntime = "nodejs";
return config;
},
eslint: {
ignoreDuringBuilds: true,
},
typescript: {
ignoreBuildErrors: true,
},
};
module.exports = nextConfig;
```
Finally, my ```package.json```:
```
{
"name": "commerce-karma",
"version": "1.2.0",
"private": true,
"scripts": {
"dev": "next dev",
"build": "next build",
"start": "next start",
"lint": "next lint"
},
"dependencies": {
"@clerk/nextjs": "^4.29.3",
"@fortawesome/fontawesome-svg-core": "^6.5.1",
"@fortawesome/free-solid-svg-icons": "^6.5.1",
"@fortawesome/react-fontawesome": "^0.2.0",
"@lexical/react": "^0.13.1",
"@lexical/utils": "^0.13.1",
"@radix-ui/react-accordion": "^1.1.2",
"@radix-ui/react-dialog": "^1.0.5",
"@radix-ui/react-popover": "^1.0.6",
"@radix-ui/react-switch": "^1.0.3",
"@radix-ui/react-tabs": "^1.0.4",
"@radix-ui/react-tooltip": "^1.0.7",
"aws4": "^1.12.0",
"clsx": "^2.1.0",
"cookies": "^0.8.0",
"crypto-random-string": "^5.0.0",
"http-method-enum": "^1.0.0",
"jose": "^5.1.3",
"lexical": "^0.13.1",
"logrocket": "^7.0.0",
"logrocket-react": "^6.0.3",
"mongodb": "^6.4.0",
"mongoose": "^8.2.1",
"next": "^14.1.3",
"next-client-cookies": "^1.1.0",
"nuqs": "^1.15.3",
"react": "18.2.0",
"react-dom": "18.2.0",
"react-simple-wysiwyg": "^3.0.2",
"sanitize-html": "^2.11.0",
"sass": "^1.62.1",
"sharp": "^0.33.1",
"ssl-checker": "^2.0.9",
"svix": "^1.5.0",
"zod": "^3.21.4"
},
"devDependencies": {
"@types/cookies": "^0.7.10",
"@types/jsonwebtoken": "^9.0.5",
"@types/logrocket-react": "^3.0.3",
"@types/node": "20.2.5",
"@types/react": "18.2.7",
"@types/react-copy-to-clipboard": "^5.0.7",
"@types/react-dom": "18.2.4",
"@types/sanitize-html": "^2.11.0",
"@typescript-eslint/eslint-plugin": "^5.59.7",
"@typescript-eslint/parser": "^5.59.7",
"eslint": "^8.41.0",
"eslint-config-next": "^13.4.4",
"eslint-config-prettier": "^8.8.0",
"eslint-plugin-prettier": "^4.2.1",
"prettier": "^2.8.8",
"swc-plugin-add-display-name": "^0.3.2",
"typescript": "^5.3.3"
}
}
```
- All pages are blank
- Only Next.js scripts are included in the DOM not the content in the `main` tag
- Removing the find call fixes the issue
I'm using Next.js 14.1.3 and Mongoose 8.2.1. |
Why might react-router navigate(-1) go back two routes/pages? |
|reactjs|react-router|navigation| |
I have a trigger like this:
```
ALTER TRIGGER dbo.TG_a
ON dbo.a
FOR DELETE
AS
BEGIN
    SELECT *
    INTO dbo.b
    FROM deleted
END
```
When I delete multiple rows in the edit tab, there will be a problem:
> There is already an object named 'b' in the database
When the trigger looks like this:
```
ALTER TRIGGER dbo.TG_a
ON dbo.a
FOR DELETE
AS
BEGIN
    INSERT INTO dbo.b
    SELECT *
    FROM deleted
END
```
When I delete multiple rows in the edit tab, there will be multiple duplicate values.
Problems:
- I realise that the trigger fires for each row when I delete multiple rows in the edit tab.
- If I delete multiple rows by query, the trigger fires once for all rows.
- This only happens on delete, because you cannot insert or update multiple rows in the edit tab.
Question:
- I usually edit values in the edit tab, so I need some way to make the trigger fire once for all the rows that are deleted in the edit tab.
|
The following code creates a cosecant-squared radiation pattern. There are `phases` and `amplitude_norm_V` vectors, which are the coefficients of the AF expression. I am looking to create an iterative GA algorithm that, given known AF data, will recreate the phase and amplitude coefficients. I was given a link to a possible method, shown below.
How do I implement it in MATLAB?
Thanks.
https://en.wikipedia.org/wiki/Least_mean_squares_filter#:%7E:text=Least%20mean%20squares%20(LMS)%20algorithms,desired%20and%20the%20actual%20signal
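The LMS update from that page boils down to w ← w + μ·e·x (error times input, scaled by a step size). A tiny sketch of that loop - shown in Python/NumPy rather than MATLAB, with made-up "true" coefficients standing in for your amplitude/phase vector, and an arbitrary step size μ = 0.05:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "true" coefficients the LMS loop should recover
w_true = np.array([0.5, -0.3, 0.8])

w = np.zeros(3)  # current coefficient estimate
mu = 0.05        # step size (assumed; must be small enough to converge)

for _ in range(5000):
    x = rng.standard_normal(3)  # input sample vector
    d = w_true @ x              # desired output (your known AF data)
    e = d - w @ x               # error between desired and actual output
    w = w + mu * e * x          # LMS coefficient update

print(w)  # should end up close to w_true
```

The same loop translates line-for-line into MATLAB; the harder part for your problem is that amplitudes and phases enter the AF nonlinearly, which is why people often reach for a GA or other global optimizer instead of plain LMS.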

I wrote the following MATLAB code:
```
amplitude_norm_V=[0.3213,0.4336,0.7539,1,0.7817,0.3201,0.32,0.3261];
x=sqrt(1/(sum(amplitude_norm_V.^2)));
real_voltage=amplitude_norm_V.*x;
real_power=(real_voltage.^2);
sum(real_power);
phases=[129.9121,175.4215,-144.6394,-116.9071,-93.7603,-60.0165,55.2841,89.4477]
norm_phases=phases-175.4215;
theta=linspace(0,pi,1000);
theta_0=pi*(135.315/180);
f=5.6;
N=10;
lambda=300/f;
k=(2*pi)/lambda;
d=0.8*lambda;
total=0;
tot=0;
for z=1:8
    % AF term for element z (assumed: uniform linear array with spacing d;
    % the original snippet used AF without defining it)
    AF=real_voltage(z)*exp(1i*(k*d*(z-1)*cos(theta)+deg2rad(norm_phases(z))));
    total=total+AF;
    tot=tot+AF.^2;
end
plot((theta/pi)*180,20*log10(abs(total)));
``` |
I'm currently developing Python code to solve a VRP problem using OR-Tools. I'm stuck because I can't set a strict precedence constraint between a pair of nodes. For instance, let's assume I have 10 different nodes (depot-1-2-3-4-5-6-7-8-9-10) and two vehicles with varying travel speeds.
My objective is that in any solution, node 3 is visited immediately before node 7 by either of the two vehicles (it doesn't matter which one). An example solution could be:
Vehicle 1: depot-1-4-10-6-2-depot
Vehicle 2: depot-9-5-3-7-8-depot
The varying speeds of the vehicles prevent me from applying the standard PDP example, which involves changing from `cumulVar(pickup_index) <= cumulVar(delivery_index)` to `cumulVar(pickup_index) == cumulVar(delivery_index) + "distance between pickup and delivery index"`. To address this issue, I attempted to use AddNodePrecedence. However, the code still does not produce the desired result; node 3 is not positioned exactly before node 7 as intended.
If you could provide me with a straightforward explanation and a code example, that would be very helpful. |
CPU PEI initialization |
|windows-10|kernel|cpu|amd|shutdown| |
If you copied this solution from another solution, then please check the namespace in your project. For example, a conflicting project namespace can cause this. |
The issue you are facing is likely due to how the language switch is implemented in your `LocalSwitcher` component.
The `` router.replace(`/${nextLocale}`); `` call in your `onClick` function effectively navigates the user to the root of the site (the homepage) for the selected locale, but it does not retain the rest of the current URL path beyond changing the locale. To fix this, you should modify this logic to dynamically replace only the locale part of the URL, keeping the rest of the path intact.
I updated your `onClick` function:
```js
const onClick: MenuProps['onClick'] = ({ key }) => {
const nextLocale = key;
// Build the new path by replacing the current locale with the nextLocale
// while keeping the rest of the path unchanged.
const newPath = router.asPath.replace(/^\/[a-z]{2}(\/|$)/, `/${nextLocale}/`);
startTransition(() => {
// Use the newPath for navigation instead of just `/${nextLocale}`
router.replace(newPath, newPath, { locale: nextLocale });
});
};
``` |
I am developing a Flutter app where I want to schedule OneSignal notifications from the app itself based on some conditions related to the user, I was not able to find something useful in the [OneSignal Flutter package](https://pub.dev/packages/onesignal_flutter).
I have seen this [question](https://stackoverflow.com/questions/56827952/sending-onesignal-notification-from-flutter) on stackoverflow that shows people are actually able to schedule notifications using OneSignal Flutter package, but, the `shared` property used in that example is not exposed anymore, and I couldn't find something useful in the [documentation](https://documentation.onesignal.com/docs/flutter-sdk-setup) of the OneSignal Flutter SDK.
Does anyone know if this functionality was removed from the OneSignal Flutter package or only changed to something else?
I have installed the OneSignal package on my app, and it is perfectly working when I send a notification from OneSignal website, but, my goal is to schedule that notification from the app itself through OneSignal package. |
How can I schedule OneSignal notifications programmatically from a Flutter app |
I am running a project on GitHub Codespaces, but it automatically terminates my code within a few seconds. I have run this project locally, and the whole thing takes about 2 minutes on an average system, but here it gets terminated as in the screenshot below. The "Terminated" line is not a debugging message from my end but from Codespaces itself. This issue has also been posted here, with no response for a year:
https://github.com/orgs/community/discussions/59357
[Screenshot of Codespace Terminal](https://i.stack.imgur.com/a8UTi.png)
I was expecting the code to run normally like it does locally, but it terminated really quickly. I tried using a notebook to cross-check, but it gave the following error there as well:
The Kernel crashed while executing code in the current cell or a previous cell.
Please review the code in the cell(s) to identify a possible cause of the failure.
Click here for more info.
View Jupyter log for further details. |
When deploying Next.js to Vercel using Mongoose to fetch data I'm getting a blank screen |
|typescript|mongoose|next.js|vercel|next.js14| |
Instead of `pyreadstat` you can use `pyspssio`.
If it's the first time you're using this library, you may need to install it first. To read the data in and export it, you can use syntax similar to `pyreadstat`. This library allows you to use the `metadata=...` argument to export with the metadata. Here is some code for reference:
```
pip install pyspssio
```
```
import pyspssio

df, meta = pyspssio.read_sav("C:/my_doc.sav")
pyspssio.write_sav("C:/my_doc v2.sav", df, metadata=meta)
```
You can find more details here:
[https://pyspssio.readthedocs.io/en/stable/readme.html][1]
[1]: https://pyspssio.readthedocs.io/en/stable/readme.html |
With scipy, I did some experiments with a Butterworth filter like this:
    sos = butter(order, normal_cutoff, btype='low', analog=False, output="sos")
I expect `sos` to be the coefficients of the filter.
I need to port this filter to an ARM platform. There are several filter functions implemented in the CMSIS library, but I don't understand whether Butterworth falls into one of the filter families implemented in CMSIS.
My question is: should I implement Butterworth myself, or is there a CMSIS function for that? |
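For what it's worth, CMSIS-DSP has no function named "Butterworth"; it provides generic biquad-cascade IIR filters (e.g. `arm_biquad_cascade_df2T_f32`), and a Butterworth designed with `output="sos"` is exactly such a cascade. CMSIS stores each stage as `{b0, b1, b2, -a1, -a2}` with a0 normalised to 1 (note the negated feedback terms relative to SciPy's convention). A sketch of that conversion, assuming an SOS array in SciPy's `[b0, b1, b2, a0, a1, a2]` row format (the coefficients below are made up, just to show the layout):

```python
import numpy as np

def sos_to_cmsis(sos):
    """Convert SciPy-style SOS rows [b0, b1, b2, a0, a1, a2] to the flat
    CMSIS-DSP layout [b0, b1, b2, -a1, -a2] per stage (float32)."""
    sos = np.asarray(sos, dtype=np.float32)
    b = sos[:, :3] / sos[:, 3:4]        # normalise by a0 (usually already 1)
    neg_a = -sos[:, 4:6] / sos[:, 3:4]  # CMSIS expects negated a1, a2
    return np.hstack([b, neg_a]).ravel()

# One example stage; with real data, use the output of butter(..., output="sos")
sos = [[0.1, 0.2, 0.1, 1.0, -1.5, 0.6]]
coeffs = sos_to_cmsis(sos)
print(coeffs)
```

The flattened array can then be passed to `arm_biquad_cascade_df2T_init_f32` on the ARM side, with `numStages` equal to the number of SOS rows.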
Using VLCKit I am able to display subtitles by setting the current video subtitle index. But I can't find an option in the documentation that allows changing the font and font size, so I have no idea how to increase or decrease the font size.
```Swift
import VLCKit
let player = VLCMediaPlayer()
let media = VLCMedia(url: mediaURL)
player.media = media
player.play()
//Set subtitle
player.currentVideoSubTitleIndex = 1
//Change font size
??
```
EDIT (March 13, 00:22)
```Swift
player = VLCMediaPlayer(options: ["--sub-text-scale=10"])!
```
This changed the font size, but can this also be done while the video is playing or would that require re-initializing the player? |
In SwiftUI is there a way to create an overlay view that has a 'down' state and an 'up' state that can be slid up/down but locks in the up or down position? The view should always be on the screen but either locked in the up or down position
For example this chat view is the inspiration:
[![enter image description here][1]][1]
[![enter image description here][2]][2]
[1]: https://i.stack.imgur.com/iRsUc.png
[2]: https://i.stack.imgur.com/2UCQv.png |
SwiftUI Modal View Overlay |
|swift|swiftui|overlay| |
{"OriginalQuestionIds":[546433],"Voters":[{"Id":9473764,"DisplayName":"Nick"},{"Id":625403,"DisplayName":"amalloy"},{"Id":2943403,"DisplayName":"mickmackusa"}]} |
|node.js|firebase|google-cloud-platform|google-cloud-functions|firebase-tools| |
null |
I have written code to insert data into a linked list, but there is a problem outputting the data. Here is the code:
#include <stdio.h>
#include <math.h>
#include <string.h>
#include <stdlib.h>
struct node {
char *name;
int age;
struct node *next;
};
struct node *linkeslist1head= NULL;
struct node *linkedlist1tail= NULL;
void llinsertend(const char *a,const int *b){
struct node *current = malloc(sizeof(struct node));
if(current == NULL){
printf("Current creation failed.\n");
}
current->name = malloc(strlen(a)+1);
if(current->name == NULL) {
printf("String allocation failed\n");
}
strcpy(current->name,a);
current->age = *b;
if(linkeslist1head == NULL){
linkeslist1head = current;
linkedlist1tail = current;
}else{
//If the list is not empty, append the new node to the end
linkedlist1tail->next = current;
// Update tail to point to the new last node
linkedlist1tail = current;
}
}
void llinsertbegin(const char *a, const int *b) {
struct node *newnode = malloc(sizeof(struct node));
if (newnode == NULL) {
printf("Memory allocation failed\n");
return;
}
newnode->name = malloc(strlen(a) + 1);
if (newnode->name == NULL) {
printf("String allocation failed\n");
free(newnode);
return;
}
strcpy(newnode->name, a);
newnode->age = *b;
if (linkeslist1head == NULL) {
// If the list is empty
newnode->next = NULL;
linkeslist1head = newnode;
linkedlist1tail = newnode;
} else {
// If the list is not empty
newnode->next = linkeslist1head;
linkeslist1head = newnode;
}
}
void llinsertaftern(const char *a, const int *b, int n) {
struct node *current = linkeslist1head;
int i;
for (i = 1; current != NULL && i < n; i++){
current = current->next; // Iterate until the (n-1)th node or until current becomes NULL
}
if (current == NULL){
printf("LL short\n");
return; // Exit the function if current is NULL
}
printf("Reached node %d\n", i);
struct node *newnode = malloc(sizeof(struct node));
if (newnode == NULL) {
printf("Memory allocation failed\n");
return;
}
newnode->name = malloc(strlen(a) + 1);
if (newnode->name == NULL) {
printf("String allocation failed\n");
free(newnode);
return;
}
strcpy(newnode->name, a);
newnode->age = *b;
if (current == NULL) {
printf("LL is shorter than %d\n", n);
free(newnode->name);
free(newnode);
return;
}
newnode->next = current->next;
current->next = newnode;
}
void outputLinkedList(struct node *head){
struct node *p = head;
while(p != NULL){
printf("Name:%s Age:%d\n",p->name,p->age);
p = p->next;
}
printf("\n");
}
int main() {
printf("How many persons' details you want to add\n");
int t;
scanf("%d",&t);
getchar();
for(int i=1;i<=t;i++){
int x;
char name[50];
scanf("%s",name);
getchar();
scanf("%d",&x);
llinsertend(name,&x);
}
int x=10,y=20,z=30;
llinsertbegin("facebook",&x );//(const int *) 10
llinsertbegin("instragram", &y);
llinsertbegin("whatsapp", &z);
outputLinkedList(linkeslist1head);
int code=1200,pos = 3;
llinsertaftern("Dhaka",&code,pos);
outputLinkedList(linkeslist1head);
}
The second `outputLinkedList` call does not work in either CLion or Code::Blocks, but it works in Compiler Explorer. I mean: after adding some elements through the `llinsertend()` function and then the `llinsertbegin()` function, the output function works perfectly. But when I use the `llinsertaftern()` function, the debugger shows a segmentation fault. Suppose I add three elements through the `llinsertend()` function and then add 3 elements through the `llinsertbegin()` function; what is the problem with `llinsertaftern()` with the value pos = 3? It should work perfectly fine. |
I'm trying to implement both PCA and t-SNE algorithms to cluster the embeddings resulting from using ImageBind. For this, using scikit-learn v1.3.2 in Python v3.8.18, I'm trying to use the sklearn library, but the error mentioned in the title keeps popping up at:
```python
from sklearn.decomposition import PCA
from sklearn.manifold import TSNE
```
This is the traceback of the error:
```python
ModuleNotFoundError Traceback (most recent call last)
Cell In[1], line 4
1 import numpy as np
2 import pandas as pd
----> 4 from sklearn.decomposition import PCA
5 from sklearn.manifold import TSNE
7 import matplotlib.pyplot as plt
File ~\AppData\Roaming\Python\Python38\site-packages\sklearn\__init__.py:83
69 # We are not importing the rest of scikit-learn during the build
70 # process, as it may not be compiled yet
71 else:
(...)
77 # later is linked to the OpenMP runtime to make it possible to introspect
78 # it and importing it first would fail if the OpenMP dll cannot be found.
79 from . import (
80 __check_build, # noqa: F401
81 _distributor_init, # noqa: F401
82 )
---> 83 from .base import clone
84 from .utils._show_versions import show_versions
86 __all__ = [
87 "calibration",
88 "cluster",
(...)
129 "show_versions",
130 ]
File ~\AppData\Roaming\Python\Python38\site-packages\sklearn\base.py:19
17 from ._config import config_context, get_config
18 from .exceptions import InconsistentVersionWarning
---> 19 from .utils import _IS_32BIT
20 from .utils._estimator_html_repr import estimator_html_repr
21 from .utils._metadata_requests import _MetadataRequester
ModuleNotFoundError: No module named 'sklearn.utils'
```
I've checked that I'm using the latest available version of sklearn, and that the package `.utils` exists in my version of sklearn. I've tried installing the isolated library using both pip and conda in my environment, and installing the previous version (1.2.2). Nothing seems to work. Does anyone have any idea what the problem might be?

Edit: Hey, I'm new to both Python and Stack Overflow, so please try to be nice :) |
Type arguments: accept types that inherit declared type |
|c#|inheritance| |
null |
Login directly to the Postgres server and execute the following:
PGPASSWORD=abcd1234 psql "host=101.1.2.154 sslmode=prefer sslrootcert=./Keys/server-ca.pem sslcert=./Keys/client-cert.pem sslkey=./Keys/client-key.pem port=5432 user=postgres" --command="ALTER DATABASE source_db_name RENAME TO new_target_db_name;"
output is:
ALTER DATABASE
|
You can delete an endpoint using the Python Azure ML SDK:
from azureml.core import Workspace, Webservice
service = Webservice(workspace=ws, name='your-service-name')
service.delete()
Then, if you want to recreate it, you can redeploy the model:
from azureml.core.model import InferenceConfig
from azureml.core.webservice import AciWebservice
from azureml.core.model import Model
service_name = 'my-custom-env-service'
inference_config = InferenceConfig(entry_script='score.py', environment=environment)
aci_config = AciWebservice.deploy_configuration(cpu_cores=1, memory_gb=1)
service = Model.deploy(workspace=ws,
name=service_name,
models=[model],
inference_config=inference_config,
deployment_config=aci_config,
overwrite=True)
service.wait_for_deployment(show_output=True)
There is currently no way to schedule or temporarily disable the endpoint. The only way would be to delete and recreate it using the azureml SDK. The other option would be to use an Azure Function App for deploying ML models; this way you only pay for the requests made. |
I'm using coreui/react CModal to display an alert box.
The problem I have is that when I shrink the window, I need to use the scrollbar to see all the content of the alert box, and when I click the scrollbar the alert box closes.
How to avoid closing the alert box when clicking on the scrollbar?
Here is my code:
<CModal
alignment={"center"}
visible={showSignUp}
onClose={() => setShowSignUp(false)}
className="custom-modal"
>
<CModalHeader onClose={() => setShowSignUp(false)}>
<CModalTitle
style={{ textAlign: "center", flex: 1 }}
id="LiveDemoExampleLabel"
>
Sign up
</CModalTitle>
</CModalHeader>
<CModalBody>
<>
CModalBody
</>
</CModalBody>
</CModal>
I've tried to use the event passed to `onClose`, but the event is always undefined when clicking on the scrollbar. |
CModal of @coreui/react closing when clicking on the scrollbar |
|reactjs|core-ui| |
Here is a quick trick you can use to alternate between the subnets:
- Put all subnets in a list using a local
`subnets = [var.public_subnet_a_id, var.public_subnet_b_id]`
- Use the [remainder of dividing][1] the index of the VM by the number of subnets
`local.subnets[index(keys(var.web_vm_attribute), each.key) % length(local.subnets)]`
Here is some sample code
``` lang-hcl
variable "web_vm_attribute" {
default = {
"a" = {}
"b" = {}
"c" = {}
"d" = {}
}
}
variable "public_subnet_a_id" {
default = "abc11"
}
variable "public_subnet_b_id" {
default = "def22"
}
locals {
subnets = [var.public_subnet_a_id, var.public_subnet_b_id]
}
resource "null_resource" "test" {
for_each = var.web_vm_attribute
triggers = {
id = index(keys(var.web_vm_attribute), each.key)
key = each.key
subnet = local.subnets[index(keys(var.web_vm_attribute), each.key) % length(local.subnets)]
}
}
```
the output of `terraform plan` on that code is:
``` lang-txt
Terraform will perform the following actions:
# null_resource.test["a"] will be created
+ resource "null_resource" "test" {
+ id = (known after apply)
+ triggers = {
+ "id" = "0"
+ "key" = "a"
+ "subnet" = "abc11"
}
}
# null_resource.test["b"] will be created
+ resource "null_resource" "test" {
+ id = (known after apply)
+ triggers = {
+ "id" = "1"
+ "key" = "b"
+ "subnet" = "def22"
}
}
# null_resource.test["c"] will be created
+ resource "null_resource" "test" {
+ id = (known after apply)
+ triggers = {
+ "id" = "2"
+ "key" = "c"
+ "subnet" = "abc11"
}
}
# null_resource.test["d"] will be created
+ resource "null_resource" "test" {
+ id = (known after apply)
+ triggers = {
+ "id" = "3"
+ "key" = "d"
+ "subnet" = "def22"
}
}
```
You can see in the output how the `subnet` alternates between the values of `public_subnet_a_id` and `public_subnet_b_id`.

And of course these are just sample values, and I'm using a `null_resource`; you can do the same with your `resource "aws_instance"`.
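For intuition, here is the same round-robin selection sketched in plain Python (using the sample values above; Terraform's `keys()` returns the map keys in sorted order, which `sorted()` mirrors here):

```python
subnets = ["abc11", "def22"]            # public_subnet_a_id, public_subnet_b_id
web_vm_attribute = {"a": {}, "b": {}, "c": {}, "d": {}}

assignment = {
    key: subnets[i % len(subnets)]      # index(keys(...), each.key) % length(local.subnets)
    for i, key in enumerate(sorted(web_vm_attribute))
}
print(assignment)  # {'a': 'abc11', 'b': 'def22', 'c': 'abc11', 'd': 'def22'}
```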
[1]: https://developer.hashicorp.com/terraform/language/expressions/operators#arithmetic-operators
|
Add node precedence |
|or-tools|vehicle-routing| |
null |
In my case, the problem was that I was in the wrong folder, i.e. in a folder that is inside the main git repo. |
`{0:30}` means that argument `0` is printed in a field with a minimum width of `30`, padded with spaces.

For the separator (`|`) to be aligned you can use `{0:<30}`:
```c++20
std::format("|{0:<30}|{1:30.5f}|\n", "p", std::numbers::pi)
```
This aligns argument `0` to the left and adds padding so that the field width is `30`. ([cppreference](https://en.cppreference.com/w/cpp/utility/format/spec#Fill_and_align))
Live on [Compiler Explorer](https://godbolt.org/z/Y9Kjx8zTo).
|
I am trying to configure `pgbouncer` as a connection pooler for my Postgres instances. For my use case I tried to work with the `auth_query` authentication method, to let users connect to the Postgres instances without explicitly adding their credentials to `userlist.txt`.
Here is my `pgbouncer.ini`:
```ini
[databases]
; PostgreSQL database named 'dbone' running on 'postgres1' service
pokemons = host=postgres1 port=5432 dbname=pokemon
super = host=postgres2 port=5432 dbname=supersaiyans
[pgbouncer]
listen_addr = *
listen_port = 6432
auth_type = scram-sha-256
auth_file = /etc/pgbouncer/userlist.txt
auth_query = select usename, passwd FROM user_search($1)
auth_user = pgbuser
logfile = /var/log/pgbouncer/pgbouncer.log
pidfile = /var/run/pgbouncer/pgbouncer.pid
admin_users = postgres
```
Here is my `userlist.txt`:
```txt
"pgbuser" "pgbouncer"
```
Here is the function and `auth_user` for pgbouncer:
```sql
CREATE OR REPLACE FUNCTION user_search(uname TEXT) RETURNS TABLE (usename name, passwd text) as
$$
WITH myuser as (SELECT $1 as fullusername) SELECT fullusername as usename, passwd FROM pg_catalog.pg_shadow INNER JOIN myuser ON usename = substring(fullusername from '[^@]*')
$$
LANGUAGE sql SECURITY DEFINER;
CREATE ROLE pgbuser WITH LOGIN PASSWORD 'pgbouncer';
GRANT EXECUTE ON FUNCTION user_search(text) TO pgbuser;
```
Here I created a non-superuser role `pgbuser` which can execute the function on Postgres and verify the users.

Then I created db users with view access on specific databases. I checked by manually logging in with these users, and I could see the data I was permitted to view.

Issue: When I log in to pgAdmin using the host and port of pgbouncer and enter the credentials of my created db users, I can log in successfully. To verify that the login check was working, I also entered wrong passwords. I can log in and see the databases, but when I try to view the contents I get a SASL error:
Logs:
```
pgbouncer-1 | 2024-03-30 11:10:13.531 UTC [1] LOG S-0x7f8133c733e0: pokemons/pgbuser@192.168.16.4:5432 new connection to server (from 192.168.16.3:35530)
pgbouncer-1 | 2024-03-30 11:10:13.540 UTC [1] LOG C-0x7f8133c7d3c0: pokemons/ash@192.168.16.1:56156 login attempt: db=pokemons user=ash tls=no
pgbouncer-1 | 2024-03-30 11:10:13.550 UTC [1] LOG S-0x7f8133c73690: pokemons/ash@192.168.16.4:5432 new connection to server (from 192.168.16.3:35544)
pgbouncer-1 | 2024-03-30 11:10:16.337 UTC [1] LOG C-0x7f8133c7d670: (nodb)/(nouser)@192.168.16.1:56168 no such user: ash
pgbouncer-1 | 2024-03-30 11:10:16.337 UTC [1] LOG C-0x7f8133c7d670: (nodb)/ash@192.168.16.1:56168 login attempt: db=dbone user=ash tls=no
pgbouncer-1 | 2024-03-30 11:10:16.337 UTC [1] LOG C-0x7f8133c7d920: (nodb)/(nouser)@192.168.16.1:56178 no such user: ash
pgbouncer-1 | 2024-03-30 11:10:16.337 UTC [1] LOG C-0x7f8133c7d920: (nodb)/ash@192.168.16.1:56178 login attempt: db=dbone user=ash tls=no
pgbouncer-1 | 2024-03-30 11:10:16.347 UTC [1] ERROR C-0x7f8133c7d670: (nodb)/ash@192.168.16.1:56168 password authentication failed
pgbouncer-1 | 2024-03-30 11:10:16.347 UTC [1] LOG C-0x7f8133c7d670: (nodb)/ash@192.168.16.1:56168 closing because: SASL authentication failed (age=0s)
pgbouncer-1 | 2024-03-30 11:10:16.347 UTC [1] WARNING C-0x7f8133c7d670: (nodb)/ash@192.168.16.1:56168 pooler error: SASL authentication failed
pgbouncer-1 | 2024-03-30 11:10:16.348 UTC [1] ERROR C-0x7f8133c7d920: (nodb)/ash@192.168.16.1:56178 password authentication failed
```
Here the first login attempt is successful, but after that, when I click on the databases, I get authentication failed errors. |
Codespace terminates my code within seconds? |
|terminate|github-codespaces| |
null |
1. Are you sure you did not accidentally update to 4.27.**3** or later? I got exactly your problem after I installed the 4.28.0 version - see below...

2. You need Hyper-V enabled for this: is it working correctly on your machine? If you are using Windows Home Edition there is no chance: upgrade your Windows to Professional Edition - see maybe [tag:docker-for-windows]?

From my view, at this time Docker Desktop version 4.28.0 (at least) seems to have a problem with Windows 10, because after I uninstalled 4.28.0 and replaced it with a fresh install of Docker Desktop version 4.27.2 (see [Docker Desktop release notes][2]), everything works fine for me with VS 2022 and ASP.NET 8.
... don't update Docker Desktop until this is fixed! ;)
In [GitHub, docker/for-win: ERROR: request returned Internal Server Error for API route and version...][1] there is a hint upgrading the WSL2 which might help too.
[1]: https://github.com/docker/for-win/issues/13909
[2]: https://docs.docker.com/desktop/release-notes/#4272 |
I am new to React.js and MUI components.

I am working on the following design where a Stack contains multiple boxes. Each box has its own style, which is the same across all boxes within the Stack. Is there a way to define one style for a box that is then applied to all boxes?
https://codesandbox.io/p/sandbox/angry-gwen-c986mp?file=%2Fsrc%2FDemo.tsx |
You'll need to pass the config when instantiating a Cloudinary object so that the `imageTag` call will generate the image URL for an existing account. For example:

```php
<?php
// Assuming the v2 PHP SDK; these are the use statements it typically needs.
use Cloudinary\Cloudinary;
use Cloudinary\Configuration\Configuration;

// The image's public ID on Cloudinary
$image_on_cloudinary = 'student_images/carter-ski-jump-big.jpg';

$config = Configuration::instance();
$config->cloud->cloudName = 'my_cloud_name';
$config->cloud->apiKey = 'my_key';
$config->cloud->apiSecret = 'my_secret';
$config->url->secure = true;

$cld_image = new Cloudinary($config);
echo $cld_image->imageTag($image_on_cloudinary);
```
The above should generate the image URL. |
After following this doc https://github.com/Azure/azure-search-vector-samples/blob/main/demo-python/code/azure-search-vector-python-sample.ipynb
I created a schema
```python
fields = [
    SimpleField(name="id", type="Edm.String", key=True, sortable=True, filterable=True, facetable=True),
    SearchableField(name="text", type="Edm.String"),
    SearchField(name="embedding", type=SearchFieldDataType.Collection(SearchFieldDataType.Single),
                searchable=True, vector_search_dimensions=1536, vector_search_profile_name="myHnswProfile"),
]
```
I am trying to upload documents that are stored in a json format.
> "[{\"text\":\"blah blah\",\"embedding\":\"[-0.01130323 0.00934206
> -0.01263496 ... -0.02174513 -0.00830184\\n -0.0122043 ]\"}
When trying to run this command
search_client = SearchClient(endpoint=service_endpoint, index_name=index_name, credential=credential)
result = search_client.upload_documents(documents)
print(f"Uploaded {len(documents)} documents")
I am running into this issue
Message: The request is invalid. Details: A null value was found with the expected type 'search.documentFields[Nullable=False]'. The expected type 'search.documentFields[Nullable=False]' does not allow null values.
Tried the public doc https://github.com/Azure/azure-search-vector-samples/blob/main/demo-python/code/azure-search-vector-python-sample.ipynb
|
You have set a speaker count of 2, and you are seeing speaker tags {0, 1, 2}.
Speaker 0 is the entire script and is not part of the diarized output. You can discard speaker 0 since it simply repeats the entire script and provides no unique information regarding the diarization.
With a speaker count of 2, speakers 1 and 2 will contain the entire script. I hope this helps.
|
@Tom's answer was close enough, but when only showing one item it didn't behave properly: it produced some decimals if I did NOT show the last item in the filter. So here is a fixed version:
> =SUMPRODUCT(((Table13[Person]<>"")*SUBTOTAL(3,OFFSET(Table13[[#Headers],[Person]],ROW(Table13[Person])-MIN(ROW(Table13[Person]))+1,0)))/COUNTIFS(Table13[Person],Table13[Person]))
Notice the plus 1 to fix the decimals.
I also don't know why he was looking at column B either. |
I use `nohup` and bash scripts to help manage my python programs running on a local server. I have a bash script (`tmp.sh`) that invokes several python programs in a row. I try to kill the bash script and the python scripts started within it with `kill $PID`, where `PID` is the process ID of the command `nohup bash tmp.sh &`, but only the bash script is terminated, and the python script keeps running. I don't want to export the process IDs of those python scripts, because the bash script runs multiple python scripts within it, and in that case I would have to export the process ID of each python script.
I have created an example to reproduce the problem I have encountered.
Basically, I usually start my program by `source run2.sh`, which first determines if the same program is currently running to avoid duplicated running, and if not, submit a new job and change the `PID` to the new job in `~/.bashrc`.
`run2.sh`
```bash
submit_a_job()
{
nohup bash tmp.sh &
export PID=$! # get the process ID of the above submitted job
echo "job $PID submitted at $(date)"
echo "job $PID submitted at $(date)" >> output.log
echo "export PID=$!" >> ~/.bashrc
}
if [ -n "$PID" ]; then
# PID set, safe to run any job
if ps -p $PID > /dev/null; then
# the job is still running
echo "$PID is running, new job not submitted"
else
# the job has finished, delete previous PID, and submit a new job
echo "$PID is finished, new job submitted"
sed -i '/PID/d' ~/.bashrc
submit_a_job
fi
else
# PID not set, the job might still be running or have finished
echo "helloworld"
submit_a_job
fi
```
If you do not want to modify `~/.bashrc`, you can comment out the following lines in `run2.sh`. And make sure you run `run2.sh` with `source`; otherwise the environment variable is not exported to the current working shell.
```bash
echo "export PID=$!" >> ~/.bashrc
sed -i '/PID/d' ~/.bashrc
```
`tmp.sh` is the script that runs python jobs
```bash
time python3 while.py
```
`while.py` is just a meaningless dead loop
```python
import time
counter = 0
while True:
print(f"This is an infinite loop! Iteration: {counter}", flush=True)
counter += 1
time.sleep(1) # Sleep for 1 second between iterations
```
As I have exported the ID of the process that runs `bash tmp.sh` as `PID`, I can kill the bash script `tmp.sh` with the command `kill $PID`. The problem is, even when `tmp.sh` is no longer running, the python script that was running at the time I killed `tmp.sh` keeps running in the background. I can confirm this with the command `ps aux | grep python3`, which clearly shows `while.py` is running.

Which command should I use to kill the bash script `tmp.sh` as well as the python program that is running at the moment I kill `tmp.sh`? |
Can login with Pgbouncer but cannot access any database |
|postgresql|pgbouncer| |
I'm new to GitHub and I'm trying to set up my SSH keys by following the steps in <https://docs.github.com/en/authentication/connecting-to-github-with-ssh/working-with-ssh-key-passphrases#auto-launching-ssh-agent-on-git-for-windows> to auto-start ssh-agent in Windows.
But I'm already stuck in the first step "You can run `ssh-agent` automatically when you open bash or Git shell. Copy the following lines and paste them into your `~/.profile` or `~/.bashrc` file in Git shell" because my .ssh folder only has the following files:
- id_ed25519
- id_ed25519.pub
- known_hosts
- known_hosts.old
I tried to look for similar files outside of my C:/Users/\<user\>/.ssh folder, and I found these:
- .bash_history
- .gitconfig
Is that correct? Or should I create the .profile or .bashrc file by myself? I'm new to this, so any extra explanations would be really helpful. Thank you! |
I can't find ~/.profile or ~/.bashrc in C:/Users/<user>/.ssh folder |
|git|github| |
null |
When I use the same Keras pipeline to get the results, I'm getting different results:
```
results = pipeline.recognize([ultralytics_crop_objects[5]])
print(results)
--> ji856931
results = pipeline.recognize(ultralytics_crop_objects)
print(results[5])
--> ji8569317076
```
Does anybody have an explanation for that?
[ultralytics_crop_objects[5]](https://i.stack.imgur.com/x6QvJ.png)
I checked a couple of times whether I accidentally used another pipeline or whether the input pictures were different, but they aren't. I also googled a lot. |
Keras OCR - Getting different results from Keras |
|python|keras|ocr| |
null |
Here is the final method:
public int ParseCSVText(string textToParse, string delimiter, out string[] tokens)
{
List<string> listTokens = new List<string>();
try
{
using (var stringReader = new StringReader(textToParse))
{
var config = new CsvConfiguration(CultureInfo.InvariantCulture);
config.Delimiter = delimiter;
config.HasHeaderRecord = false;
using (var reader = new CsvReader(stringReader, config))
{
while (reader.Read())
{
for (int column = 0; column < reader.ColumnCount; column++)
{
listTokens.Add(reader.GetField(column));
}
}
}
}
}
catch (Exception ex)
{
SimpleLog.Log(ex);
}
tokens = listTokens.ToArray();
return listTokens.Count;
}
|
I am implementing a JavaScript function to validate email addresses using regular expressions. The function needs to check whether the provided email address follows the standard format.
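A minimal sketch of such a validator (the exact pattern below is an assumption: a pragmatic approximation, not a full RFC 5322 check):

```javascript
// Validate that an email roughly matches "local@domain.tld":
// no whitespace, exactly one "@", and at least one dot in the domain part.
function isValidEmail(email) {
  const pattern = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;
  return pattern.test(email);
}

console.log(isValidEmail("user@example.com")); // true
console.log(isValidEmail("not-an-email"));     // false
```

For anything beyond a quick client-side check, consider the browser's built-in `<input type="email">` validation instead, since fully validating RFC-compliant addresses with a regex is notoriously tricky.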
You would typically use regular expressions in JavaScript to create a pattern that matches the standard format of an email address. You then test this pattern against the given email address and expect either a match (indicating a valid email address) or no match (indicating an invalid email address). |
Have one css class or theme for multiple boxes inside the stack - MUI |
|css|reactjs|material-ui| |
{"Voters":[{"Id":1255289,"DisplayName":"miken32"},{"Id":3074564,"DisplayName":"Mofi"},{"Id":4267244,"DisplayName":"Dalija Prasnikar"}],"SiteSpecificCloseReasonIds":[13]} |
I'm trying an OTA firmware update of an ESP32 using a Cavli C16QS, which provides an LTE network connection. The basic issue is that I'm not getting internet connectivity using the `AT+PPPSTART` command and the Cavli C16QS modem. Here is the documentation for the AT command details: https://cavli.atlassian.net/servicedesk/customer/portal/8/topic/6c7a6bd9-155b-4438-84bb-4807e4f8db19/article/343015460. What should I change in the code below so that the ESP32 can get internet connectivity through the Cavli C16QS, so that I can fetch the firmware file from the server?
Aim: ESP32 OTA using an external modem's internet connectivity, with the HTTPClient.h and Update.h libraries.
Arduino IDE Version: 1.8.19
ESP32 Board Version: 2.0.0
Serial is used for debug output, and Serial1 is used for serial communication with the Cavli C16QS.
```
#include <HTTPClient.h>
#include <Update.h>
/* Cavli_C16QS */
String response = "";
const char *ipAddr;
String st1;
int value1, value2;
int clientID;
void setup() {
// Power up the ESP32 by toggling the PWRKEY pin
pinMode(2, OUTPUT);
digitalWrite(2, LOW);
delay(100);
digitalWrite(2, HIGH);
Serial.begin(115200);
Serial1.begin(115200, SERIAL_8N1, 32, 33); // RX, TX
//////////////////////////////////
// Send AT command via Serial to Cavli module
Serial.println("Sending ATI command to Cavli module");
Serial.println("==================================");
Serial1.print("ATI"); // Send command via Serial2
// FeedBack();
delay(1000);
while (Serial1.available()) {
char c = Serial1.read();
Serial.print(c);
}
Serial.println("Sending AT+COPS?");
//////////////////////////////////
// Read and print responses from Cavli module via Serial2
while (!(response.indexOf("+ATREADY") != -1 || response.indexOf("OK") != -1)) {
Serial1.print("AT+COPS?\r\n");
while (Serial1.available()) {
char c = Serial1.read();
response += c;
}
Serial.print("response-->");
Serial.println(response);
}
delay(1000); // Add a delay after sending the command
response = "";
//////////////////////////////////
value1 = value2 = -1;
Serial.println("Sending AT command... +CEREG? Checking network connectivity");
while (!(value1 == 0 && (value2 == 1 || value2 == 5))) {
Serial1.write("AT+CEREG?\r\n");
delay(500);
while (Serial1.available()) {
char c = Serial1.read();
response += c;
}
Serial.print("response-->");
Serial.println(response);
response = "";
}
delay(5000); // Add a delay after sending the command
response = "";
//////////////////////////////////
value1 = value2 = -1;
Serial.println("Sending AT command... +CGACT? Checking internet connectivity");
while (!(value1 == 1 && value2 == 1)) {
Serial1.write("AT+CGACT?\r\n");
delay(500);
while (Serial1.available()) {
char c = Serial1.read();
response += c;
}
Serial.print("response-->");
Serial.println(response);
response = "";
}
//////////////////////////////////
delay(500); // Add a delay after sending the command
response = "";
Serial.println("Sending AT+NETIF?");
Serial1.print("AT+NETIF?\r\n"); // Send command via Serial2
// Read and print responses from Cavli module via Serial2
while (Serial1.available()) {
char c = Serial1.read();
response += c;
}
Serial.print("response-->");
Serial.println(response);
response = "";
Serial.println("Sending AT+PPPSTART");
Serial1.print("AT+PPPSTART=?\r\n"); // Send command via Serial2
// Read and print responses from Cavli module via Serial2
while (Serial1.available()) {
char c = Serial1.read();
response += c;
}
Serial.print("response-->");
Serial.println(response);
response = "";
String _url = url_OTA_code_head + code_ver + url_OTA_code_tail;
delay(2000);
HTTPClient http;
http.begin(_url);
int httpCode = http.GET();
Serial.println(httpCode);
Serial.println("----------------------------End Setup----------------------------");
}
void loop() {
// put your main code here, to run repeatedly:
}
``` |
Trying to connect an ESP32 to the internet using a Cavli C16QS, but the ESP32 is not getting internet connectivity |
|esp32|gsm|at-command|arduino-esp32|internet-connection| |
null |
`INFORMATION_SCHEMA` system views should be avoided, as they are generalized views intended for clients that are not specific to SQL Server. Instead, use the `sys` schema's tables and views.

Unfortunately there is no `sys.functions`, so we have to filter `sys.objects` instead:
```tsql
select o.name, p.*, t.name as type_name
from sys.parameters p
join sys.types t on t.user_type_id = p.user_type_id
join sys.objects o on o.object_id = p.object_id
where o.[type] in ('IF', 'TF') -- IF is inline, TF is multi-statement
``` |
I have an issue related to the package `@babel/runtime-corejs3` importing `core-js-pure` in my Vue application.
My browser console throws the following error:
`Uncaught SyntaxError: The requested module '/@fs/Users/xxx/Documents/projects/xxx/node_modules/core-js-pure/features/promise/index.js?v=570da9ba' does not provide an export named 'default' (at asyncToGenerator.js:1:8)`
This is caused by the following line of code in `node_modules/@babel/runtime-corejs3/helpers/esm/asyncToGenerator.js`:
`import _Promise from "/@fs/Users/a.akbulut/Documents/projects/embeddables/node_modules/core-js-pure/features/promise/index.js?v=76e5cea3";`
The issue is that there is no default export for `_Promise`, as the file contains the following:

```js
'use strict';
module.exports = require('../../full/promise');
```
I confirmed `@adyen/adyen-web` is the only package in my application that uses `@babel/runtime-corejs3` & `core-js-pure`.
I noticed there has been a new tag release 3 days ago: [5.61.0](https://www.npmjs.com/package/@adyen/adyen-web/v/5.61.0).
However I tried to install `"@adyen/adyen-web": "5.60.0"` but it did not solve the issue.
Some extra info:
- Node version 21.7.1
- NPM version 10.5.0
Help is appreciated. |
Runtime error by `babel/runtime-corejs3` importing `core-js-pure` |
|babeljs|node-modules|babel-loader|adyen| |