Thread-safe lock-free min where both operands can change c++
|c++|thread-safety|atomic|compare-and-swap|bellman-ford|
```
<template>
  <div class="flex">
    <SidebarComponent />
    <slot />
  </div>
</template>

<style></style>

<script setup lang="ts">
</script>
```
Not sure what your real data looks like, but in the case shown, you could `sub` 19-01 to 19-23, then `-` to `:`, and `eval`uate the sequences to `cbind`, i.e. something like:

```
> with(df1,
+      Map(cbind,
+          lapply(lapply(sub('-', ':', sub('-01', '-23', time)), str2lang), eval),
+          temperature)) |> do.call(what='rbind')
      [,1] [,2]
 [1,]    0    0
 [2,]    1    0
 [3,]    2    1
 [4,]    3    1
 [5,]    4    2
 [6,]    5    2
 [7,]    6    2
 [8,]    7    3
 [9,]    8    3
[10,]    9    3
[11,]   10    3
[12,]   11    3
[13,]   12    3
[14,]   13    3
[15,]   13    4
[16,]   14    4
[17,]   15    4
[18,]   16    4
[19,]   17    4
[20,]   18    4
[21,]   19    4
[22,]   19    1
[23,]   20    1
[24,]   21    1
[25,]   22    1
[26,]   23    1
```

----

*Data:*

```
> dput(df1)
structure(list(time = c("00", "01", "02", "03", "04", "05", "06",
"07-13", "13-19", "19-01"), temperature = c(0L, 0L, 1L, 1L, 2L,
2L, 2L, 3L, 4L, 1L)), class = "data.frame", row.names = c(NA, -10L))
```
Since I updated 2FA in order to add a trusted device, I get this error after login and entering the security code:

> Key provided is shorter than 256 bits, only 64 bits provided

I updated my `User` entity to add:

```
/**
 * @ORM\Column(type="integer")
 */
private int $trustedVersion;

...

public function getTrustedTokenVersion(): int
{
    return $this->trustedVersion;
}
```

I updated my database to add the new column `trusted_version` and updated `security.yaml` in order to use **scheb/2fa-trusted-device**.

I don't think the problem is about 2fa-trusted-device, but I'm a beginner with Symfony and I can't find the solution to the problem.
I have a simple Android application that I created using Xamarin. I want to retrieve information from a specific location in my database whenever a user presses a certain button. I'm using the Realtime Database. Is this possible? I searched online for a few hours but I only found information about how to retrieve data when the data changes; I just need to retrieve it when a button is pressed.

![The structure of my database for context][1]

I want to retrieve the strings from a specific ID when a user presses a certain button. How do I retrieve the strings?

EDIT: I followed the advice of one of the comments that provided me with Java code. I'm trying to create a C# equivalent that I can use in Xamarin; this is what I have so far:

```
btnSearch.Click += async (sender, e) =>
{
    DatabaseReference dbref = AppDataHelper.GetDatabase().GetReference("trip");
    dbref.ValueChanged += HandleValueChanged;

    void HandleValueChanged(object sender, ValueChangedEventArgs args)
    {
        if (args.DatabaseError != null)
        {
            System.Diagnostics.Debug.WriteLine(args.DatabaseError.Message);
            return;
        }
    }
};
```

`ValueChanged` and `ValueChangedEventArgs` aren't recognized; what should I do?

EDIT 2: I ended up using an event listener; when the event listener is initialized it retrieves data from Firebase.

[1]: https://i.stack.imgur.com/MrSDt.jpg
My two cents on the issue: I don't know why this happens, but I ran the `docker compose up --build` command from my IDE (WebStorm) terminal.

Tried: upgrading Docker Desktop (to 4.28), adding big folders to the `.dockerignore` file.

Strangely: it doesn't happen when WebStorm is not running (and I run it from cmd). **So try running from a command line after shutting down your IDE.** It also doesn't happen when I run it from the PyCharm terminal (PS). So in my case the IDE was the problem.
For Kali Linux or other Debian-based distros, remove the existing VS Code build with `sudo apt purge code-oss`, then add Microsoft's repository:

```bash
sudo apt-get install wget gpg
wget -qO- https://packages.microsoft.com/keys/microsoft.asc | gpg --dearmor > packages.microsoft.gpg
sudo install -D -o root -g root -m 644 packages.microsoft.gpg /etc/apt/keyrings/packages.microsoft.gpg
sudo sh -c 'echo "deb [arch=amd64,arm64,armhf signed-by=/etc/apt/keyrings/packages.microsoft.gpg] https://packages.microsoft.com/repos/code stable main" > /etc/apt/sources.list.d/vscode.list'
rm -f packages.microsoft.gpg
```

then:

```bash
sudo apt install apt-transport-https
sudo apt update
sudo apt install code # or code-insiders
```
> how my default constructor should look like for a char*.

You can make a null-terminated array of length one. Also note that you should use a *member initializer list* as shown below:

```
// use member initializer list
String() : name(new char[1]())
//------------------------^^----> note these parentheses to null terminate
{
}

// use member initializer list
String(char* str) : name(new char[strlen(str) + 1])
{
    strcpy(name, str);
}
```
I only know that when I add the numbers for a square or rectangle, it shows that my shape is a parallelogram
|javascript|if-statement|shapes|
**Disclaimer**: This is my first question, so any helpful critique on how to make future ones better is welcome.

**My Problem**: I have an Express.js application with a `/CreateShipment` endpoint that takes in JSON data to create a specific shipment for a client's order. The issue I am running into is that the clients have different account credentials that we want to use. So I created a class to initialise the axios instance for each individual request, based on the client name, so that the authorization headers passed match those of the client's keys.

**courierProvider.js**

```javascript
const axios = require("axios").default
require("dotenv").config()
const { hashifyClient } = require('../utils/hashify-client')

class Post {
    constructor(clientName) {
        const clientHash = hashifyClient(clientName)
        this.clientNumber = clientHash
    }

    getHeaders() {
        return {
            "Account-Number": process.env[`${this.clientNumber}_ACC_NUMBER`] || process.env.ACC_NUMBER,
            "Authorization": process.env[`${this.clientNumber}_AUTH`] || process.env.AUTH,
        }
    }

    postBase() {
        return axios.create({
            baseURL: 'https://api.shippingendpoint.com/shipping/v1/',
            headers: {
                "Content-Type": "application/json",
                "Accept": "application/json",
                ...this.getHeaders()
            }
        })
    }
}
```

The code above is the class I use to set up the axios instance for each request depending on the client name. When I test this in my local setup with Node.js using different client names, the env variables in the headers change and everything is all good. But when I run this in Heroku and log the values, they are returned as `undefined` and it defaults to my fallback variables. The code that calls this is a POST endpoint, `CreateShipment`.
**courier.js**

```javascript
const { Post } = require('../endpoints/courierProvider')

router.post("/CreateShipment", apiAuth, async (req, res) => {
    let AP_Shipment_Body = AP_Shipment_Cx(req.body)
    let post = new Post(clientName);
    let postBase = post.postBase()
    let results = await postBase.post('shipments', AP_Shipment_Body)
})
```

Now when this class is initialised and the request made, it is made to my fallback account instead of the client's account. Yes, the config variables are set inside Heroku; I have checked several times from the CLI, inside the console in Heroku, and also in the GUI. I can't for the life of me figure out why in Heroku the env vars are `undefined` when passed a client that has shipping credentials in the env, while in my local testing I run into no issues.

I have tried checking the env variables multiple times through both the Heroku CLI tool and their GUI; each time the variables are there. I have also logged the resulting dynamic env, reloaded the dyno, and more. None of this changes the outcome. I was expecting the headers to change when a client that does have saved variables in our environment was used in the request instead of the fallback variables.
Deploying to Heroku: Dynamic Env variables are undefined
|node.js|express|heroku|environment-variables|
Key provided is shorter than 256 bits, only 64 bits provided
|php|symfony|symfony5|
In Ada, it is perfectly possible and reasonable to declare variables in the middle of your function. Just like in many other programming languages (at least Java, C, C++, C#, Python), you can open a scope explicitly anywhere:

```pascal
declare
   A : My_Type := Val;
begin
   Use(A);
end;
```

which means your constant can be declared after receiving the runtime value:

```pascal
WITH Ada.Text_IO;
WITH Ada.Long_Float_Text_IO;
WITH Material_Data;
USE Material_Data;

PROCEDURE sample IS
   Stiffness_Ratio : Long_Float;
BEGIN -- main program
   Ada.Text_IO.Put("Enter stiffness ratio: ");
   Ada.Long_Float_Text_IO.Get(Item => Stiffness_Ratio);
   Ada.Long_Float_Text_IO.Put(Item => Stiffness_Ratio);
   --Ada.Text_IO.New_Line;
   declare
      Actual_Stiffness : CONSTANT Long_Float := Stiffness_Ratio * Stiffness_Total;
   begin
      Ada.Long_Float_Text_IO.Put(Item => Actual_Stiffness);
   end;
   --Ada.Text_IO.New_Line;
   --Ada.Long_Float_Text_IO.Put(Item => Stiffness_Total);
END sample;
```

For another example of a good use of block-scoped variables, consider the following non-recursive implementation of Euclid's greatest common divisor algorithm, with new X and Y declarations hiding the formal parameters:

```pascal
function GCD(X, Y : Integer) return Integer is
begin
   declare
      X : Integer := GCD.X;
      Y : Integer := GCD.Y;
   begin
      while Y /= 0 loop
         declare
            temp : Integer := X;
         begin
            X := Y;
            Y := temp mod Y;
         end;
      end loop;
      return X;
   end;
end GCD;
```

Inside GCD, the formal parameters X and Y are always constant, because they are declared as `in`. This avoids confusion: if we allowed assigning to the parameter, the programmer might think they were causing a side effect, as would be the case if the parameter were marked `out`. So we cannot simply modify the existing X and Y, as we could in many other programming languages. *Programming in Ada 2012* by John Barnes recommends in this case declaring a variable with a different name in the scope of the function itself.
On page 194, section 11.2:

```
function Sum(List: Cell_Ptr) return Integer is
   Local: Cell_Ptr := List;
   (...)
```

This leaves the programmer able to refer to List, the constant and unchanged initial value. In the case of the GCD program given above, referring to that value in the body of the loop would lead to incorrect behaviour. Hiding X and Y by adding an inner scope with variables of the same name means that we would need to explicitly include the scope of the variable to make that mistake. Try changing `X := Y` to `X := GCD.Y` in the loop body, and observe that GCD(91,21) now returns 21 rather than 7.

Finally, note that the inner scope can be given a name by prepending a label:

```pascal
Inner: declare
   X : Integer := 0;
begin
   Use(Inner.X); -- explicitly the X in that scope
end Inner;
```
My server, which is invoked when a user finishes the OAuth consent flow, does not receive the full URL that the callback is invoked with. Everything after the `#` is removed.

Callback full URL seen in the browser:

```html
https://<domain>/google-drive/callback
#access_token=<token>
&token_type=Bearer&expires_in=3599
&scope=https://www.googleapis.com/auth/drive.file%20https://www.googleapis.com/auth/drive.install
```

APIGateway event received by the server:

```yaml
{
  version: '2.0',
  routeKey: 'GET /google-drive/callback',
  rawPath: '/google-drive/callback',
  rawQueryString: '',
  headers: {
    accept: 'text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,*/*;q=0.8',
    'accept-encoding': 'gzip, deflate, br',
    'accept-language': 'en-US,en;q=0.5',
    'content-length': '0',
    host: 'jmvd2hngq3.execute-api.us-east-1.amazonaws.com',
    'sec-fetch-dest': 'document',
    'sec-fetch-mode': 'navigate',
    'sec-fetch-site': 'none',
    'sec-fetch-user': '?1',
    'upgrade-insecure-requests': '1',
    'user-agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:124.0) Gecko/20100101 Firefox/124.0',
    'x-amzn-trace-id': 'Root=1-6609a719-148a9dd90fa586d725643c88',
    'x-forwarded-for': '68.23.54.13',
    'x-forwarded-port': '443',
    'x-forwarded-proto': 'https'
  },
  queryStringParameters: {},
  requestContext: {
    accountId: '87438545',
    apiId: 'kjndf98n49v',
    domainName: 'kjndf98n49v.execute-api.us-east-1.amazonaws.com',
    domainPrefix: 'kjndf98n49v',
    http: {
      method: 'GET',
      path: '/google-drive/callback',
      protocol: 'HTTP/1.1',
      sourceIp: '65.30.11.25',
      userAgent: 'Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:124.0) Gecko/20100101 Firefox/124.0'
    },
    requestId: 'lkljSDLVK=',
    routeKey: 'GET /google-drive/callback',
    stage: '$default',
    time: '31/Mar/2024:18:10:33 +0000',
    timeEpoch: 1711908633334
  },
  isBase64Encoded: false
}
```

OAuth start URL:

```html
https://accounts.google.com/o/oauth2/v2/auth
?client_id=<client-id>
&redirect_uri=https://kjndf98n49v.execute-api.us-east-1.amazonaws.com/google-drive/callback
&response_type=access_token
&scope=https://www.googleapis.com/auth/drive.file https://www.googleapis.com/auth/drive.install
```
I want Nokogiri to return all the divs with the following class: `styled__PostItemWrapper-sc-e4ns84-7`. On my webpage I can see many divs with this class; here is a sample:

[Webpage with divs containing the class](https://i.stack.imgur.com/ekYD2.png)

Yet when I do this:

```
Nokogiri.HTML(res.body).css('div.styled__PostItemWrapper-sc-e4ns84-7').size
```

I only get 5 results. Why is that? I was expecting all of the divs with that class, but I only got the first 5.
Nokogiri only returning 5 results
|css|ruby|xpath|nokogiri|
The QuestDB implementation of the InfluxDB Line Protocol client supports two different transports: TCP and HTTP. What are the differences between these transports? Is there any guidance available on selecting the appropriate protocol for my use case?
What transports should I use for QuestDB ingestion over the ILP protocol?
|questdb|influx-line-protocol|
Please add a for loop in the JavaScript code. Generate the array only up to the current row.

~~~sql
CREATE TEMP FUNCTION conditional_sum(X ARRAY<FLOAT64>) RETURNS FLOAT64
LANGUAGE js AS r"""
  var cur = 0;
  var out = [];
  for (let x of X) {
    cur += x * 1;
    if (cur < 0) cur = 0;
    //out.push(cur)
  }
  return cur;
  // return out;
""";

WITH tbl AS (
  SELECT '2020-01-01' AS Date, 1 AS ID, 8 AS Number, 8 AS Desired UNION ALL
  SELECT '2020-01-02', 1, 11.5, 19.5 UNION ALL
  SELECT '2020-01-03', 1, -20, 0 UNION ALL
  SELECT '2020-01-04', 1, 10, 10 UNION ALL
  SELECT '2020-01-05', 1, -5, 5 UNION ALL
  SELECT '2020-01-06', 2, -9, 0 UNION ALL
  SELECT '2020-01-07', 2, 26, 26 UNION ALL
  SELECT '2020-01-08', 2, 5, 31 UNION ALL
  SELECT '2020-01-09', 2, -23, 8 UNION ALL
  SELECT '2020-01-10', 2, -10.5, 0 UNION ALL
  SELECT '2020-01-11', 2, 5, 5
)
SELECT *,
  conditional_sum([1.1, 2, 1]),
  string_agg(CAST(number AS float64) || '') OVER win1,
  conditional_sum(array_agg(CAST(number AS float64)) OVER win1) AS run_sum
FROM tbl
WINDOW win1 AS (PARTITION BY ID ORDER BY Date ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW)
~~~
As mentioned in the comments of the question, now this issue has been resolved already by `Clang 16` with both `libc++` and `libstdc++` [[LIVE](https://compiler-explorer.com/#z:OYLghAFBqd5QCxAYwPYBMCmBRdBLAF1QCcAaPECAMzwBtMA7AQwFtMQByARg9KtQYEAysib0QXACx8BBAKoBnTAAUAHpwAMvAFYTStJg1DIApACYAQuYukl9ZATwDKjdAGFUtAK4sGe1wAyeAyYAHI%2BAEaYxBJcpAAOqAqETgwe3r56icmOAkEh4SxRMVxxdpgOqUIETMQE6T5%2BZZj2uQzVtQT5YZHRsbY1dQ2ZZYNdwT1FfaUAlLaoXsTI7BzmAMzByN5YANQma27EhsCYCvvYJhoAgpdXwQQ7LEzBEDO3JgDsVtc7vzvMbAU8SYyx2RyMpz2awAIjsFAR0CAQOCTmc1t8rn8dvcdrViCYAKxWAmwz4WHalOI7Mw00g7NYMz7Q/YYrEo05IhReCLsiC3LFs44ckBRYAvPFzdkKJGuCASnZvdH8v5Sznc3nmABseMJFg0hOhpC1OqJXANioxTNuHDmtE4BN4fg4WlIqE4bms1jhCyWmD2ZjWPFIBE0NrmAGsQDSAHQADjW8c1a0kaw%2BBIAnEm1vpOJJHaHXZxeNKNMHQ3M4LAYFBqxAkGgWPE6NFyJQG036DEtscuASNKWaLQCNFpRAIgWIsFagBPThByfMYjTgDyEW0FRD3F4DbYgmXDFos%2BdvCwES8wDcYlo0q3pCwTyM4mPd7wxA3eAAbhzn5hVBUvMOc68PcLQFrQeA8jOHhYAWBDEHgLBAXMVAGMACgAGp4JgADuy7xIwQEyIIIhiOwUhEfIShqAWuhxAYRgoJ6lj6BB0qwAC7ARKgSQEKQX4xNwHx5hWHEoPE8RoMQqB8X0cFeAw4Yiaw7AOG%2BMkxHJClKWwIBUCwvH8SAmmKaQECiVKH7ZoZxkzHMqDxG0N4ALTwug4ERPs0LuaYlhei5CKeT5ViWGYGg7E5y4AF7hU8SwIJ5CjhtOBjhpgTlMB%2BqgElwZjhQ5CBvkw6C8Kg/HwVgbGvAkhhhMpnA7Kosaak5mqSDs3ZGBSBLRhoPU7BA3lMRYMzFi077OBArjDE0pCBBMhTFFkSQpAI01LTkqTdAt0y2GNlQCB0QyeI0ejlPt7RjFtvQlAMnRraMnRXVMJRzAoPrLBItr2vmz5uhwDVNS1bUdcAXU9X1A0QUF1gjWWx4Vkgv7/kQZAUHKxBocoNW0EICCoNhTpBu2dBMG0WMhDjeMEwWxOdig9HAAA%2BsZpC09EoR1Rw26oI2zbEMuAG4/jToukjyBXBjN68GL1T4E6vD8MRojiORiuUSo6jPrR%2BjHIxvnMe5lVmXVXE8epc5CTapmicg4mSdJ1nEPJJnGzpqmYObNnW3VekGbJTtad7OkWVZ/vO7ZroOakzmuYFQ2heFy5cDFtTIPFMKJclTCpelmXZblTn5YVxWumVeAVfAkryY4bAACrcdelfzIsH2jHL5MtEL1PPthRzxEhOYcA6pAiyVnDYH%2ByAASQAPNa17UM/1Nn9R6%2Bs2DsuCEDP6xcLDm5aBHCCYEVfRVZGfaD3mpCIRfo%2BFlztggKW%2B9W5WiAgAsBDxABraTfgKPlz0GrEiKtpBq0UBrGiehe5MH7luL6Q8fouj%2BgLL%2BAEdioCoLPIGC9jhg16mFCAHheadn9IGPe5Y5hHxPjEKqdoOBXxvqWO%2Bf1ixPzhgfCMIACRrGjAyfhAjBGakHmsJBY8H4vwrLWWs9YeYdhbGjNmXYGa9n7HwOgw5iCjnHM%2BBcM5CJ6KXKudcDhCI7kYAQfch4CynnPJeWgjdbz3mOE%2BF0%2BA3yVC/DeUWk8AIrCDCBehLp3J
HCXNBFYLo4IIQHihJgaFMI4TwgRW8wDlZkTAbICB1EtbcJ1gxaGBtWLwFdpxbi8JzaCWEkHFSdsSAOzDoHEpKA4Ie0duHapul9KewDi7cyQoFCWW6eHOyUcBAxwCjCApFgE4RWik5WKacEpJRSmlDKWUcp5QIAVY%2BJdSrRHKhySAcxgQFE5tg%2BeIN8EQ0NgiKZsMzptBcAwdwx0RizWeU9RacRsgrTSK8maPy2ifJ2g8qoYx7q7VaGCx681rqnXBf8%2BFMKChwt3s3X0n1B7DxYfVRqc9gaL17ODQhNz0B3N4JIqhOzT5cIvvQxh3DmEFlYY/Z%2B5ZTIyI/gBb%2BvFFFyL5hzHSHA8U4MuUSgh0t/4kEAXEVJpEJAEgolkzWLpdCKpgXAngCDsXMs4KgnlGCsEiouYS7qBCV78tIesAkFD4ZUpoZQBBDLb66ofiWDhYZSCRkkJqaMmp/UBsDYG6Q9DREj1dRSyhXqQBrG6pqMwpQaRcA0PGjQOVYwhs4GYXgiFk1Mt%2BkWD1UjinSOKbIkhCi2yWr6MAZNZh1FDhHJQHRLpDFHnnFOIxa4NxmJ5ruSxB4jxuMwGeC8V4bxBmcY%2BCJJ5Xzvi8QWMWfjCKBLAhBUJ05wmESiYheBfBUIYSwrhfChMFayBAekpVVEVU6CjHk4wQ0WIREqiMxynB/LoDjmvGZy4FApziksrOOc1n502dsoqJUy4VyOZC8afhJrPIhXNFFz11q/IhYCzasLUPNChQdBFGQZqgvw8iyYXzbpHUI0iuowKXrotbtqsR99zltWAMgZAFJeq5QgJvFGZDd6RrtaQahWBaFcLMLGP1kgCTxkEQI4R9Kc2MvDQWt17DJEcrLSAJdKNf61ExtjLup7WaWtJqkDulNha9orTEWtGh61KMFSsbmNnUFGcXZPCWaFC0ywRMEQtcrQFXsgTk%2BtDM9bBRsIbJu9k30cA/V%2ByLP6/3zNTunaEmcVm53WQXIuOzIP7PLocqqAca6YHrp4F99GyIDHboZqmxmNUD3oTq1TE9kYz2Nax9jnHozcdXpFukvHt4BgE0W%2B1onHWXyUy61TbC2VCcjMmaMGYuBrA0E1db/qCQEgU5wMNOKJHsrfid8t8jUZVps/THsmo1GDk0doicnb228DbcYntt5zF7kHTYkddjx2ESncAVxs6PGOAXT%2BXxgFbyrufCEqCGAZ3BngrurV%2B64mHsSSewigXL3gOvVA7M4WplPqNqJU25TDKVNfjbWpUkhmNJti0hnvSfZdLaYzuqIcWcR1i9Hd9rl3KeUGmvcKsdJnxzCrM/9iyM7LOzqsvOGzC5bOLgV4gByqsnNqkKljuDOpcF9eayGERyWwfOk8l5VG4jIbIztTDq1EXfOWkC7D5HiMXTuk783bRDrjBQ%2B7gjJ0Ho0bd9MV670yKMZU8g3FgMTV4MN8S/qIvIu2s4cJ6lYno0pj9RoD4Zh0zpqEhoQMaw9sMJm/m2PamFsH007ARGUOSB6clhZ9zn3TNk3q1ZzvV2QaG7UY5zmLnztuYax5ioXmpazUnrLfzD9ccKuC9k1VMb70ResKT4p5Oyl%2BwElwS22kakSTqTzjp7tz9NN9lfvpEIBmhw0j03noyGDjLchBYXUMhpi4mdCKZP6cyCy6WmWCu2WoGKu4GuyUGxWVcggCE5WDcVWb0LcNWrkwQ7eE%2Bt4LS7ApYTWe6LWTGf07WU8fGXW%2BuoMSexuA2W%2BG8UqxA/G6enqImNK0218ymh282420aZgHw0YkgKYEmResa9m6YsYsYIiRBhaGmJ2nKSiv8Si12Rgg%2BA4GiTaY4T2i4L2pAb23apife/aViQ6J4f2Y6DiE6JhD4wOSO7i8634PiyM/iwEggoEcO66COMEz4O6MSB6CSx6ySQYS%2BqsmSBOOSIausJO0W7EJse%2BFSh%2BVSTStsp%2B9OHOrObszOqRx%2BnS%2B%2BRkz
%2BHS3OmRr6/O8Wgun%2BMIpKUyv%2Bn6Eu36UuUUMuIB8uwGSuuWqu%2BWpchW0GVU2uTmce%2BKFBFIRu1yrEtyQ09ye0jyCGVuwe7y6AtGaGbQGGLuWGAeIKEx0KlGMxHufu8xIe9Q3u8IpG20dGKBGKaKhBMe4ieuYqQxJKIxZKYxgmGeLB2eS23U6Y6YGgkgHwmoXAnx3xvx6Ymale7Bs2NeXBMhnKOmLeaM%2BmCgmBveRMXe5mPe3cSJ/eKit2pYw%2BQqo%2BfM4%2BveDh4sksPmc%2Bfm8sFEF6y%2B%2BOIWa%2B4R%2BSj6URUAu%2BZsVO8RNOdUSR9st%2BnJGRDSaR7AN%2BmR%2BR/SgyhRkccWH6Qu5R9xlRCWNRSWdRQBaWgGWWIGyueWEGHRGuRWVWpWCBFWjcMGpxrctWGBqJxmOB7C%2BBaOFxh2JB08DB5BNxyeRCj6dBW8DBO8TBh8WeU2imoJ1e4iEJUaS2fBmYBIkg9mAYKYXxO2iqoaUhR2i2IAkgXAK2UZ2UAhu2oUAYQkg82a7Bealx98Txnq9ChZnB3B/EyQzgkgQAA%3D)]
When attempting to convert data obtained from an XML file, stored in a list, into a byte array, I encounter an error. Could someone provide guidance on how to convert this data without encountering errors?

```
private fun convertFileToByteArray(file: File): ByteArray {
    val inputStream = BufferedInputStream(FileInputStream(file))
    val buffer = ByteArrayOutputStream()
    val bufferSize = 1024
    val tempBuffer = ByteArray(bufferSize)
    var bytesRead: Int
    try {
        while (inputStream.read(tempBuffer, 0, bufferSize).also { bytesRead = it } != -1) {
            buffer.write(tempBuffer, 0, bytesRead)
        }
        buffer.flush()
    } catch (e: IOException) {
        e.printStackTrace()
    } finally {
        inputStream.close()
    }
    return buffer.toByteArray()
}
```

error:

```
Process: com.kader.testclient, PID: 23504
java.lang.OutOfMemoryError: Failed to allocate a 134217744 byte allocation with 25149440 free bytes and 124MB until OOM, target footprint 95582488, growth limit 201326592
    at java.util.Arrays.copyOf(Arrays.java:3161)
    at java.io.ByteArrayOutputStream.grow(ByteArrayOutputStream.java:118)
    at java.io.ByteArrayOutputStream.ensureCapacity(ByteArrayOutputStream.java:93)
    at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:153)
    at com.kader.testclient.activities.DataManagerActivity.convertFileToByteArray(DataManagerActivity.kt:157)
    at com.kader.testclient.activities.DataManagerActivity.sendXMLToSDCard(DataManagerActivity.kt:114)
    at com.kader.testclient.activities.DataManagerActivity.onCreate$lambda$1(DataManagerActivity.kt:71)
    at com.kader.testclient.activities.DataManagerActivity.$r8$lambda$kG6UQAIpGVwaMzA6MHqKPJlhIVs(Unknown Source:0)
    at com.kader.testclient.activities.DataManagerActivity$$ExternalSyntheticLambda1.onClick(Unknown Source:2)
    at android.view.View.performClick(View.java:7448)
    at com.google.android.material.button.MaterialButton.performClick(MaterialButton.java:1218)
    at android.view.View.performClickInternal(View.java:7425)
    at android.view.View.access$3600(View.java:810)
    at android.view.View$PerformClick.run(View.java:28305)
    at android.os.Handler.handleCallback(Handler.java:938)
    at android.os.Handler.dispatchMessage(Handler.java:99)
    at android.os.Looper.loop(Looper.java:223)
    at android.app.ActivityThread.main(ActivityThread.java:7656)
    at java.lang.reflect.Method.invoke(Native Method)
    at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:592)
    at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:947)
```

I increased the buffer size, but it didn't help.
Pandas: Create dtype string (not object) column from single string value without having to cast it
|pandas|dataframe|
|typescript|riot.js|tsd|
Here's what you are missing:

```
String newStr = str.substring(0, i) + str.substring(i + 1);
```

This will skip the character at index `i` in `newStr`. The full string (which you sometimes want if you want to insert something at a point in a String) would be:

```
String newStr = str.substring(0, i) + str.substring(i);
```

Read the definition of this function and try creating some test examples to clarify your understanding.
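To make the difference concrete, here is a small self-contained sketch (the class and method names are mine, not from the question) showing both substring combinations side by side:

```java
// Sketch illustrating the two substring combinations discussed above.
public class SubstringDemo {
    // Drops the character at index i by joining the parts before and after it.
    static String removeCharAt(String str, int i) {
        return str.substring(0, i) + str.substring(i + 1);
    }

    // Splits at i and rejoins without dropping anything -- a useful base
    // when inserting something at position i.
    static String splitAndRejoin(String str, int i) {
        return str.substring(0, i) + str.substring(i);
    }

    public static void main(String[] args) {
        System.out.println(removeCharAt("hello", 1));   // hllo
        System.out.println(splitAndRejoin("hello", 1)); // hello
    }
}
```

Running it shows that only the first variant actually removes a character; the second reproduces the original string unchanged.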
By [default][1] Kestrel will listen on a random port between 5000-5300 (7000-7300 for HTTPS). In addition to adding the hostname to the hosts file, you need to instruct Kestrel to listen on port 80. Something like this:

```
{
  // Kestrel
  "AllowedHosts": "*",
  "Kestrel": {
    "Endpoints": {
      "Http": {
        "Url": "http://[::]:80",
        "Protocols": "Http1AndHttp2"
      }
    }
  },
  // YARP
  "ReverseProxy": {
    "Routes": {
      "api-route": {
        "ClusterId": "api-cluster",
        "Match": {
          "Host": "www.mybeautifultestsite.com",
          "Path": "{**catch-all}"
        }
      }
    },
    "Clusters": {
      "api-cluster": {
        "Destinations": {
          "api-destination": {
            "Address": "http://127.0.0.1:5278"
          }
        }
      }
    }
  }
}
```

***Disclaimer: Not tested***

[1]: https://learn.microsoft.com/en-us/aspnet/core/fundamentals/servers/kestrel/endpoints?view=aspnetcore-8.0
I have a function and an if statement as below. How can I lower the cognitive complexity?

```
const renderInput = (element, a) => {
  return (
    <Input
      key={a}
      name={element.label ? element.label : ''}   // Cognitive Complexity
      type={element.type ? element.type : ''}     // Cognitive Complexity
      label={element.label ? element.label : ''}  // Cognitive Complexity
    />
  )
}

const renderTag = (element, a) => {
  if (element.element === 'input') {   // Cognitive Complexity
    if (element.type === 'submit') {   // Cognitive Complexity
      return testFunction1()
    }
    if (element.type !== 'file') {     // Cognitive Complexity
      return testFunction2()
    }
  }
}
```
How to reduce cognitive complexity in an input field and if condition in ReactJS
|cognitive-complexity|
I don't have admin rights on my computer, and want to download an extension like Vibrancy. I think there is another extension like it, but I can't find it. I tried downloading it, but you need admin rights for it. So what should I do? Thanks in advance. Faizan
How do you get the Vibrancy extension in VS Code without admin rights?
|vscode-extensions|
|active-directory|sas|office365|power-automate|
This works ...

```
=IF(A2="","",IF(AND(B2="IN",C2<=D2),"NORMAL",IF(AND(B2="OUT",E2<=C2),"NORMAL","LATE")))
```

... or this ...

```
=IF(A2="","",IF(OR(AND(B2="IN",C2<=D2),AND(B2="OUT",E2<=C2)),"NORMAL","LATE"))
```
|java|android|
I'm currently training an RNNModel using Python darts. To compare different trained models I want to extract the train_loss and val_loss from the fit method. How would I do this? I've read something about a metric collection but can't figure out how to use it. This is my current code:

```
from darts.models import RNNModel
from darts import TimeSeries

train = # training data as TimeSeries
model = RNNModel(model="LSTM", input_chunk_length=self.past_samples)
model.fit(train)
```

The loss is shown in the console during training, but I don't know how to access it. So far I've tried to find docs online and asked Bing Chat and ChatGPT for help; they told me to use `model.history.history["loss"]`, which doesn't exist.
This may be an old question, but it might be useful for others. A way to achieve this is to use a debouncer. Make the delay about 1 second, so if the user taps the button multiple times within that second, the previous pending action is cancelled. You then only need to locally update the button's like/dislike state, and only call the API once the user has stopped tapping.
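As a sketch of the idea (the helper name and the 1-second delay are illustrative, not from any specific library), a debouncer can be just a wrapper that resets a timer on every call:

```typescript
// A minimal debouncer: each new call cancels the previous pending one, so the
// wrapped function only fires once the caller has been quiet for `delayMs`.
function debounce<T extends (...args: any[]) => void>(fn: T, delayMs: number) {
  let timer: ReturnType<typeof setTimeout> | undefined;
  return (...args: Parameters<T>): void => {
    if (timer !== undefined) clearTimeout(timer); // drop the previous pending action
    timer = setTimeout(() => fn(...args), delayMs);
  };
}

// Hypothetical usage for the like/dislike case: update the UI immediately on tap,
// but only send the final state to the API after 1 second of no taps.
const sendLikeStateToApi = debounce((liked: boolean) => {
  // e.g. fetch('/api/like', { method: 'POST', body: JSON.stringify({ liked }) })
}, 1000);
```

Rapid repeated taps then collapse into a single API call carrying the last state.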
Use [$match][1] to filter the records you want, then use [$group][2] to group by date.

Test it here: [playground][3]

```
db.collection.aggregate([
  {
    "$match": {
      "source": "here"
    }
  },
  {
    "$group": {
      "_id": "$date",
      "count": { "$sum": 1 }
    }
  },
  {
    "$project": {
      "_id": 0,
      "date": "$_id",
      "count": 1
    }
  }
])
```

[1]: https://www.mongodb.com/docs/manual/reference/operator/aggregation/match/
[2]: https://www.mongodb.com/docs/manual/reference/operator/aggregation/group/
[3]: https://mongoplayground.net/p/xR57s_Bx-3w
I extracted a table from a PDF using:

```
pdf_file <- "UBI Kenya paper.pdf"
for (i in 59:68) {
  table_i <- extract_tables(pdf_file, pages = i)
  tname <- paste0("table_E.", i - 58, "_full")
  assign(tname, table_i)
}
```

I didn't use `as.data.frame` because it returned an error message. Instead I did this:

```
table_E.9_full <- data.table::as.data.table(table_E.9_full)
table_E.9 <- table_E.9_full[-c(10:16),]
```

This gives me a table which looks like this:

[![enter image description here][1]][1]

where the Enterprises and Net Revenues columns are combined into one for every category (Retail, Manufacturing, Transportation, Services), separated by a space, when what I want is for it to look like the original here:

[![enter image description here][2]][2]

How do I split rows 2-9 into two columns each for V2-V5, so that the column titles would be "Retail Trade - # Enterprises", "Retail Trade - Net Revenues", "Manufacturing - # Enterprises", "Manufacturing - Net Revenues", etc., with the correct values in the correct columns?

`dput(head(table_E.9))` returns:

```
structure(list(V1 = c("", "", "", "Long Term Arm", "", "Short Term Arm"
), V2 = c("Retail Trade", "# Enterprises Net Revenues", "(1) (2)",
"3.89�\u0088\u0097�\u0088\u0097�\u0088\u0097 1601.42�\u0088\u0097", "[1.28] [824.74]",
"2.34�\u0088\u0097�\u0088\u0097 464.60�\u0088\u0097"), V3 = c("Manufacturing",
"# Enterprises Net Revenues", "(3) (4)", "0.02 51.90", "[.27] [120.79]",
"0.03 17.82"), V4 = c("Transportation", "# Enterprises Net Revenues",
"(5) (6)", "0.53�\u0088\u0097 100.76", "[.29] [85.37]", "-0.12 -3.85"),
V5 = c("Services", "# Enterprises Net Revenues", "(7) (8)", "0.23 198.64",
"[.33] [205.6]", "-0.04 70.95")), row.names = c(NA, -6L),
class = c("data.table", "data.frame"), .internal.selfref = <pointer: 0x000001716fd35930>)
```

[1]: https://i.stack.imgur.com/mpJqF.png
[2]: https://i.stack.imgur.com/xSV1g.png
Lambda endpoint for the Google OAuth callback does not recieve the access_token
|aws-lambda|oauth|google-oauth|aws-api-gateway|
The callback URL that's being received contains a `#` fragment. [These are not sent to the server](https://stackoverflow.com/questions/14462218/is-the-url-fragment-identifier-sent-to-the-server).

The reason Google is including the fragment in the URL is that the OAuth start URL's `response_type` is set to `access_token`. If you change the value to `code`, Google will format the callback to use query string parameters, and the ApiGatewayProxyEvent will contain an authorization code that your server can exchange for the access token.

`response_type=code` is meant for [server integrations](https://developers.google.com/identity/protocols/oauth2/web-server#httprest), while the implicit `access_token` style is meant for [web applications](https://developers.google.com/identity/protocols/oauth2/javascript-implicit-flow).

The OAuth spec also has [documentation](https://auth0.com/docs/authenticate/protocols/oauth) on the values you can set in an OAuth authorization flow request.
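For illustration, the start URL from the question, rewritten for the server-side flow, might look like this (the client ID and API Gateway domain are placeholders taken from the question, not real values):

```
https://accounts.google.com/o/oauth2/v2/auth
  ?client_id=<client-id>
  &redirect_uri=https://kjndf98n49v.execute-api.us-east-1.amazonaws.com/google-drive/callback
  &response_type=code
  &scope=https://www.googleapis.com/auth/drive.file https://www.googleapis.com/auth/drive.install
```

With `response_type=code`, the `code` arrives in `queryStringParameters` of the API Gateway event instead of a `#` fragment the server never sees.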
I was following a tutorial on Web API where the creator made his service method nullable (`Task<Comment?> Update(CommentUpdateDto comment)`), and he later used it like the following, which made sense:

```
[HttpPut("{id:int}")]
public async Task<IActionResult> Update([FromRoute] int id, [FromBody] CommentUpdateDto commentUpdateDto)
{
    if (!ModelState.IsValid)
        return BadRequest(ModelState);

    var comment = await _commentRepo.UpdateAsync(id, commentUpdateDto.ToCommentFromUpdate());

    if (comment == null)
    {
        return NotFound("Comment not found!");
    }

    return Ok(comment);
}
```

Now I want to use the same approach in my ASP.NET Core MVC app, but for some reason it doesn't make sense. It is probably because I am using Fluent Validation with AutoMapper.

My CategoryService:

```
public async Task<Category?> UpdateCategoryAsync(UpdateCategoryDto updateCategoryDto)
{
    var category = await GetCategoryByIdAsync(updateCategoryDto.Id);
    if (category == null)
        return null;

    mapper.Map(updateCategoryDto, category);
    await unitOfWork.GetRepository<Category>().UpdateAsync(category);
    await unitOfWork.SaveChangesAsync();
    return category;
}
```

My controller action:

```
[HttpPost]
public async Task<IActionResult> Update(UpdateCategoryDto updateCategoryDto)
{
    var category = mapper.Map<Category>(updateCategoryDto);
    var result = validator.Validate(category);
    var exists = await categoryService.Exists(category);
    if (exists)
        result.Errors.Add(new ValidationFailure("Name", "This category name already exists"));

    if (result.IsValid)
    {
        await categoryService.UpdateCategoryAsync(updateCategoryDto);
        return RedirectToAction("Index", "Category", new { Area = "Admin" });
    }

    result.AddToModelState(ModelState);
    return View(updateCategoryDto);
}
```

I tried to modify it like this:

```
public async Task<IActionResult> Update(UpdateCategoryDto updateCategoryDto)
{
    var category = mapper.Map<Category>(updateCategoryDto);
    var result = validator.Validate(category);
    var exists = await categoryService.Exists(category);
    if (exists)
        result.Errors.Add(new ValidationFailure("Name", "This category name already exists"));

    if (result.IsValid)
    {
        var value = await categoryService.UpdateCategoryAsync(updateCategoryDto);
        if (value == null)
            return NotFound();
        return RedirectToAction("Index", "Category", new { Area = "Admin" });
    }

    result.AddToModelState(ModelState);
    return View(updateCategoryDto);
}
```

The thing is, if my `updateCategoryDto` is null, I cannot even pass validation, as my category will be null, so my modification doesn't change anything. I want to know what changes I should make in order to have a logical flow. Should I just declare my service method as `Task<Category>` instead of `Task<Category?>`, or do I have to make changes in my controller action?

Note that I am a self-taught beginner, so any suggestions or advice are valuable to me. If you think I can change my code for the better, please share it with me. Thanks in advance!
You can get a list of the paths of all of the device's photos/videos by using [photo_manager][1]. For example:

```
final List<AssetPathEntity> paths = await PhotoManager.getAssetPathList();
paths.forEach((e) {
  // Here you can get the path of files
});
```

[1]: https://pub.dev/packages/photo_manager
Step 1: Open "CMD" (Administrator)

Step 2: `npm install -g json-server@0.17.4`

Step 3: `json-server --watch db.json`

Source: [https://github.com/typicode/json-server/tree/v0][1]

[1]: https://github.com/typicode/json-server/tree/v0
|window|nsis|repair|updaters|
I am using a Bootstrap responsive table. The problem I am facing is that the text in a table column is overflowing. As far as I know, a responsive table adjusts its column widths automatically. But here I have an input field inside a `td`, and that input field has a placeholder, and the placeholder text is getting overflowed. How can I fix this for this requirement?

**Placeholder: Drag and Drop files here**

**Note: This problem happens on mobile devices only.**

```
<div class="table-responsive border">
 <table class="table" id="dataTable" width="100%" style="overflow-x:auto">
 <thead>
 <tr>
 <th>Document Type</th>
 <th class="text-center">Upload</th>
 <th>Open</th>
 <th>Resolved</th>
 <th>Not Required</th>
 </tr>
 </thead>
 <tbody id="missingBody">
 <tr>
 <td>Front and Back - Passport/Driver’s License/Photo Car</td>
 <td>
 <div class="file-field">
 <div class="file-path-wrapper"><input type="file" name="file1" missval="dcn_56" multiple=""><input style="vertical-align:top;padding-bottom:8px;" class="file-path validate" type="text" id="dcn_56" doctype="56" multiple="" placeholder="Drag and Drop files here"></div>
 </div>
 </td>
 <td><input type="radio" id="open56" missinginfoid="56" checked="" value="71" name="Missing0" class="custom-control-input"><label class="custom-control-label" for="open56"></label></td>
 <td><input type="radio" id="resolve56" missinginfoid="56" value="72" name="Missing0" class="custom-control-input"><label class="custom-control-label" for="resolve56"></label></td>
 <td><input type="radio" id="close56" missinginfoid="56" value="73" name="Missing0" class="custom-control-input"><label class="custom-control-label" for="close56"></label></td>
 </tr>
 </tbody>
 </table>
</div>
```

**Output: https://ibb.co/TrCSrk4**

The placeholder "Drag and Drop files here" should stay on a single line.
Text overflow in Bootstrap responsive table
|html|css|bootstrap-4|
I have this sample dataset of product reviews. I'm trying to get users who rated certain products positively. Ratings are numbers 1-5, 5 being the best and 1 being the worst.

[![dataset][1]][1]

I tried the following queries:

```
SELECT user, product, rating
FROM reviews
WHERE user != 1
(product = 173 AND rating > 3) AND (product = 50 AND rating > 3);

SELECT user
FROM reviews
WHERE user != 1
GROUP BY user
HAVING (product = 173 AND rating > 3) AND (product = 50 AND rating > 3);

SELECT user, product, rating
FROM reviews
WHERE user != 1
AND (product, rating) IN ((173, 4), (50, 4))
```

None of them work. I'm looking for users who rated product 173 a 4 or 5 AND rated product 50 a 4 or 5. It's easy to get results where one of the combinations matches, but I need to match all of the product/rating conditions at once.

I prepared a fiddle: [db-fiddle.com][2]

[1]: https://i.stack.imgur.com/UMVZM.png
[2]: https://www.db-fiddle.com/f/gGmLCpQMkDoeFQC5TZ8YKM/0
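To pin down the matching logic independent of SQL syntax, here it is spelled out in plain Python against made-up sample rows (not my real data); only user 2 should match, since they rated both 173 and 50 above 3:

```python
# Toy rows (user, product, rating) -- made up, not my real data.
reviews = [
    (1, 173, 5), (1, 50, 5),  # user 1 must be excluded explicitly
    (2, 173, 5), (2, 50, 4),  # user 2 rated both products above 3
    (3, 173, 5), (3, 50, 2),  # user 3 only qualifies for product 173
]

users = {u for u, _, _ in reviews if u != 1}
matches = [
    u for u in sorted(users)
    if any(p == 173 and r > 3 for uu, p, r in reviews if uu == u)
    and any(p == 50 and r > 3 for uu, p, r in reviews if uu == u)
]
print(matches)  # [2]
```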
Which approach is right while creating a service for your update method?
|c#|asp.net|asp.net-mvc|asp.net-core|
I found the code below on Microsoft; it sorts in-text citations into alphabetical order. For example, (Gamma 2019; Beta 2011; Alpha 2009) --> (Alpha 2009; Beta 2011; Gamma 2019). When I run the code I get a "Compile error: Wrong number of arguments or invalid property assignment".

```
Sub AlphaCites()
Application.ScreenUpdating = False
Dim ArrTxt() As String, i As Long, j As Long
With ActiveDocument.Content
  With .Find
    .ClearFormatting
    .Text = "\([!\(]@\)"
    .Format = False
    .Forward = True
    .MatchWildcards = True
    .Wrap = wdFindStop
  End With
  Do While .Find.Execute
    If InStr(.Text, "; ") > 0 Then
      .Start = .Start + 1
      .End = .End - 1
      j = UBound(Split(.Text, "; "))
      ReDim ArrTxt(j)
      For i = 0 To j
        ArrTxt(i) = Split(.Text, "; ")(i)
      Next
      WordBasic.SortArray ArrTxt()
      .Text = Join(ArrTxt(), "; ")
    End If
    .Collapse wdCollapseEnd
    DoEvents
  Loop
End With
End Sub
```

Can anyone point me to how I can troubleshoot this? The error seems to be on

    j = UBound(Split(.Text, "; "))

which, if I'm correct, should find the upper bound of the split array and store it in `j`. Perhaps the array is empty?

Regards, Steve
VBA query - sort text in alphabetical order in Word
|vba|ms-word|
I can't seem to get Log4j2 logging to work. I've read many articles and posts about configuring the log4j2.properties file, yet nothing I've tried works. Here is my `log4j2.properties` file: appender.file.type = File appender.file.name = fileLogger appender.file.fileName=ExampleClass.log appender.file.layout.type = PatternLayout appender.file.layout.Pattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n logger.ExampleClass.name = ExampleClass logger.ExampleClass.level = trace logger.ExampleClass.appenderRef.file.ref = fileLogger Here is the Java source code that attempts to log: import org.apache.commons.logging.Log; import org.apache.commons.logging.LogFactory; public class ExampleClass { private static Log log = LogFactory.getLog(ExampleClass.class); public static void main(String[] args) { log.info("info in the main method"); log.error("error in the main method"); } } Here are the VM arguments passed to the JVM: `-Dlog4j.configurationFile=log4j2.properties -Dlog4j2.debug` Here are the JARs on the CLASSPATH: commons-logging-1.2.jar log4j-api-2.17.2.jar log4j-core-2.17.2.jar log4j-jcl-2.17.2.jar Here is the Console log (from Eclipse) when the Java code is run: DEBUG StatusLogger Using ShutdownCallbackRegistry class org.apache.logging.log4j.core.util.DefaultShutdownCallbackRegistry DEBUG StatusLogger Took 0.271365 seconds to load 222 plugins from jdk.internal.loader.ClassLoaders$AppClassLoader@73d16e93 DEBUG StatusLogger PluginManager 'Lookup' found 16 plugins DEBUG StatusLogger AsyncLogger.ThreadNameStrategy=UNCACHED (user specified null, default is UNCACHED) TRACE StatusLogger Using default SystemClock for timestamps. DEBUG StatusLogger org.apache.logging.log4j.core.util.SystemClock supports precise timestamps. 
DEBUG StatusLogger PluginManager 'Lookup' found 16 plugins DEBUG StatusLogger PluginManager 'Converter' found 45 plugins DEBUG StatusLogger Starting OutputStreamManager SYSTEM_OUT.false.false-1 DEBUG StatusLogger Starting LoggerContext[name=73d16e93, org.apache.logging.log4j.core.LoggerContext@4d826d77]... DEBUG StatusLogger Reconfiguration started for context[name=73d16e93] at URI null (org.apache.logging.log4j.core.LoggerContext@4d826d77) with optional ClassLoader: null DEBUG StatusLogger PluginManager 'Lookup' found 16 plugins DEBUG StatusLogger PluginManager 'ConfigurationFactory' found 4 plugins DEBUG StatusLogger PluginManager 'Lookup' found 16 plugins DEBUG StatusLogger PluginManager 'Lookup' found 16 plugins DEBUG StatusLogger Missing dependencies for Yaml support, ConfigurationFactory org.apache.logging.log4j.core.config.yaml.YamlConfigurationFactory is inactive DEBUG StatusLogger PluginManager 'Lookup' found 16 plugins DEBUG StatusLogger Missing dependencies for Json support, ConfigurationFactory org.apache.logging.log4j.core.config.json.JsonConfigurationFactory is inactive DEBUG StatusLogger PluginManager 'Lookup' found 16 plugins DEBUG StatusLogger Using configurationFactory org.apache.logging.log4j.core.config.ConfigurationFactory$Factory@75f9eccc TRACE StatusLogger Trying to find [log4j2.properties] using context class loader jdk.internal.loader.ClassLoaders$AppClassLoader@73d16e93. DEBUG StatusLogger PluginManager 'Lookup' found 16 plugins DEBUG StatusLogger Apache Log4j Core 2.17.2 initializing configuration org.apache.logging.log4j.core.config.properties.PropertiesConfiguration@2cbb3d47 DEBUG StatusLogger PluginManager 'Core' found 127 plugins DEBUG StatusLogger PluginManager 'Level' found 0 plugins DEBUG StatusLogger PluginManager 'Lookup' found 16 plugins DEBUG StatusLogger Building Plugin[name=AppenderRef, class=org.apache.logging.log4j.core.config.AppenderRef]. TRACE StatusLogger TypeConverterRegistry initializing. 
DEBUG StatusLogger PluginManager 'TypeConverter' found 26 plugins DEBUG StatusLogger createAppenderRef(ref="fileLogger", level="null", Filter=null) DEBUG StatusLogger Building Plugin[name=logger, class=org.apache.logging.log4j.core.config.LoggerConfig]. DEBUG StatusLogger LoggerConfig$Builder(additivity="null", level="TRACE", levelAndRefs="null", name="ExampleClass", includeLocation="null", ={fileLogger}, ={}, Configuration, Filter=null) DEBUG StatusLogger Building Plugin[name=loggers, class=org.apache.logging.log4j.core.config.LoggersPlugin]. DEBUG StatusLogger createLoggers(={ExampleClass}) DEBUG StatusLogger Building Plugin[name=layout, class=org.apache.logging.log4j.core.layout.PatternLayout]. DEBUG StatusLogger PatternLayout$Builder(pattern="%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n", PatternSelector=null, Configuration, Replace=null, charset="null", alwaysWriteExceptions="null", disableAnsi="null", noConsoleNoAnsi="null", header="null", footer="null") DEBUG StatusLogger PluginManager 'Converter' found 45 plugins DEBUG StatusLogger Building Plugin[name=appender, class=org.apache.logging.log4j.core.appender.FileAppender]. DEBUG StatusLogger FileAppender$Builder(fileName="ExampleClass.log", append="null", locking="null", advertise="null", advertiseUri="null", createOnDemand="null", filePermissions="null", fileOwner="null", fileGroup="null", bufferedIo="null", bufferSize="null", immediateFlush="null", ignoreExceptions="null", PatternLayout(%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n), name="fileLogger", Configuration, Filter=null, ={}) DEBUG StatusLogger Starting FileManager ExampleClass.log DEBUG StatusLogger Building Plugin[name=appenders, class=org.apache.logging.log4j.core.config.AppendersPlugin]. 
DEBUG StatusLogger createAppenders(={fileLogger}) WARN StatusLogger No Root logger was configured, creating default ERROR-level Root logger with Console appender DEBUG StatusLogger Starting OutputStreamManager SYSTEM_OUT.false.false-2 DEBUG StatusLogger Configuration org.apache.logging.log4j.core.config.properties.PropertiesConfiguration@2cbb3d47 initialized DEBUG StatusLogger Starting configuration org.apache.logging.log4j.core.config.properties.PropertiesConfiguration@2cbb3d47 DEBUG StatusLogger Started configuration org.apache.logging.log4j.core.config.properties.PropertiesConfiguration@2cbb3d47 OK. TRACE StatusLogger Stopping org.apache.logging.log4j.core.config.DefaultConfiguration@10a035a0... TRACE StatusLogger DefaultConfiguration notified 1 ReliabilityStrategies that config will be stopped. TRACE StatusLogger DefaultConfiguration stopping root LoggerConfig. TRACE StatusLogger DefaultConfiguration notifying ReliabilityStrategies that appenders will be stopped. TRACE StatusLogger DefaultConfiguration stopping remaining Appenders. DEBUG StatusLogger Shutting down OutputStreamManager SYSTEM_OUT.false.false-1 DEBUG StatusLogger OutputStream closed DEBUG StatusLogger Shut down OutputStreamManager SYSTEM_OUT.false.false-1, all resources released: true DEBUG StatusLogger Appender DefaultConsole-1 stopped with status true TRACE StatusLogger DefaultConfiguration stopped 1 remaining Appenders. TRACE StatusLogger DefaultConfiguration cleaning Appenders from 1 LoggerConfigs. DEBUG StatusLogger Stopped org.apache.logging.log4j.core.config.DefaultConfiguration@10a035a0 OK TRACE StatusLogger Reregistering MBeans after reconfigure. 
Selector=org.apache.logging.log4j.core.selector.ClassLoaderContextSelector@53aad5d5 TRACE StatusLogger Reregistering context (1/1): '73d16e93' org.apache.logging.log4j.core.LoggerContext@4d826d77 TRACE StatusLogger Unregistering but no MBeans found matching 'org.apache.logging.log4j2:type=73d16e93' TRACE StatusLogger Unregistering but no MBeans found matching 'org.apache.logging.log4j2:type=73d16e93,component=StatusLogger' TRACE StatusLogger Unregistering but no MBeans found matching 'org.apache.logging.log4j2:type=73d16e93,component=ContextSelector' TRACE StatusLogger Unregistering but no MBeans found matching 'org.apache.logging.log4j2:type=73d16e93,component=Loggers,name=*' TRACE StatusLogger Unregistering but no MBeans found matching 'org.apache.logging.log4j2:type=73d16e93,component=Appenders,name=*' TRACE StatusLogger Unregistering but no MBeans found matching 'org.apache.logging.log4j2:type=73d16e93,component=AsyncAppenders,name=*' TRACE StatusLogger Unregistering but no MBeans found matching 'org.apache.logging.log4j2:type=73d16e93,component=AsyncLoggerRingBuffer' TRACE StatusLogger Unregistering but no MBeans found matching 'org.apache.logging.log4j2:type=73d16e93,component=Loggers,name=*,subtype=RingBuffer' DEBUG StatusLogger Registering MBean org.apache.logging.log4j2:type=73d16e93 DEBUG StatusLogger Registering MBean org.apache.logging.log4j2:type=73d16e93,component=StatusLogger DEBUG StatusLogger Registering MBean org.apache.logging.log4j2:type=73d16e93,component=ContextSelector DEBUG StatusLogger Registering MBean org.apache.logging.log4j2:type=73d16e93,component=Loggers,name=ExampleClass DEBUG StatusLogger Registering MBean org.apache.logging.log4j2:type=73d16e93,component=Appenders,name=DefaultConsole-2 DEBUG StatusLogger Registering MBean org.apache.logging.log4j2:type=73d16e93,component=Appenders,name=fileLogger TRACE StatusLogger Using default SystemClock for timestamps. 
DEBUG StatusLogger org.apache.logging.log4j.core.util.SystemClock supports precise timestamps. TRACE StatusLogger Using DummyNanoClock for nanosecond timestamps. DEBUG StatusLogger Reconfiguration complete for context[name=73d16e93] at URI C:\Users\mbmas_000\workspace\TestCommonsLogging2\bin\log4j2.properties (org.apache.logging.log4j.core.LoggerContext@4d826d77) with optional ClassLoader: null DEBUG StatusLogger Shutdown hook enabled. Registering a new one. DEBUG StatusLogger LoggerContext[name=73d16e93, org.apache.logging.log4j.core.LoggerContext@4d826d77] started OK. 11:22:55.347 [main] INFO ExampleClass - info in the main method 11:22:55.352 [main] ERROR ExampleClass - error in the main method DEBUG StatusLogger Stopping LoggerContext[name=73d16e93, org.apache.logging.log4j.core.LoggerContext@4d826d77] DEBUG StatusLogger Stopping LoggerContext[name=73d16e93, org.apache.logging.log4j.core.LoggerContext@4d826d77]... TRACE StatusLogger Unregistering 1 MBeans: [org.apache.logging.log4j2:type=73d16e93] TRACE StatusLogger Unregistering 1 MBeans: [org.apache.logging.log4j2:type=73d16e93,component=StatusLogger] TRACE StatusLogger Unregistering 1 MBeans: [org.apache.logging.log4j2:type=73d16e93,component=ContextSelector] TRACE StatusLogger Unregistering 1 MBeans: [org.apache.logging.log4j2:type=73d16e93,component=Loggers,name=ExampleClass] TRACE StatusLogger Unregistering 2 MBeans: [org.apache.logging.log4j2:type=73d16e93,component=Appenders,name=fileLogger, org.apache.logging.log4j2:type=73d16e93,component=Appenders,name=DefaultConsole-2] TRACE StatusLogger Unregistering but no MBeans found matching 'org.apache.logging.log4j2:type=73d16e93,component=AsyncAppenders,name=*' TRACE StatusLogger Unregistering but no MBeans found matching 'org.apache.logging.log4j2:type=73d16e93,component=AsyncLoggerRingBuffer' TRACE StatusLogger Unregistering but no MBeans found matching 'org.apache.logging.log4j2:type=73d16e93,component=Loggers,name=*,subtype=RingBuffer' TRACE StatusLogger 
Stopping org.apache.logging.log4j.core.config.properties.PropertiesConfiguration@2cbb3d47... TRACE StatusLogger PropertiesConfiguration notified 2 ReliabilityStrategies that config will be stopped. TRACE StatusLogger PropertiesConfiguration stopping 1 LoggerConfigs. TRACE StatusLogger PropertiesConfiguration stopping root LoggerConfig. TRACE StatusLogger PropertiesConfiguration notifying ReliabilityStrategies that appenders will be stopped. TRACE StatusLogger PropertiesConfiguration stopping remaining Appenders. DEBUG StatusLogger Shutting down FileManager ExampleClass.log DEBUG StatusLogger OutputStream closed DEBUG StatusLogger Shut down FileManager ExampleClass.log, all resources released: true DEBUG StatusLogger Appender fileLogger stopped with status true DEBUG StatusLogger Shutting down OutputStreamManager SYSTEM_OUT.false.false-2 DEBUG StatusLogger OutputStream closed DEBUG StatusLogger Shut down OutputStreamManager SYSTEM_OUT.false.false-2, all resources released: true DEBUG StatusLogger Appender DefaultConsole-2 stopped with status true TRACE StatusLogger PropertiesConfiguration stopped 2 remaining Appenders. TRACE StatusLogger PropertiesConfiguration cleaning Appenders from 2 LoggerConfigs. DEBUG StatusLogger Stopped org.apache.logging.log4j.core.config.properties.PropertiesConfiguration@2cbb3d47 OK DEBUG StatusLogger Stopped LoggerContext[name=73d16e93, org.apache.logging.log4j.core.LoggerContext@4d826d77] with status true Note the following: 1. The actual log entries created (you can see this in the above output) are: `11:22:55.347 [main] INFO ExampleClass - info in the main method` `11:22:55.352 [main] ERROR ExampleClass - error in the main method` are not in the format specified in `log4j2.properties`. 2. The log went to the console (obviously utilizing the built in default root logger) and not the file. In fact, no file was created. What is wrong with my `log4j2.properties` file?
Very simple application of Log4j2 appears to ignore log4j2.properties file
|java|log4j2|apache-commons-logging|
How to dynamically update a selectizeInput?
Here is one possible option using [`cKDTree.query`][1] from [tag:scipy]: ```py from scipy.spatial import cKDTree def knearest(gdf, **kwargs): notna = gdf["PPM_P"].notnull() coordinates = gdf.get_coordinates().to_numpy() dist, idx = cKDTree(coordinates[notna]).query(coordinates[~notna], **kwargs) k = kwargs.get("k") _ser = pd.Series( gdf.loc[notna, "PPM_P"].to_numpy()[idx].tolist(), index=(~notna)[lambda s: s].index, ) gdf.loc[~notna, "PPM_P"] = _ser[~notna].map(np.mean) return gdf N = 2 # feel free to make it 5, or whatever.. out = knearest(gdf.to_crs(3662), k=range(1, N + 1))#.to_crs(4326) ``` Output (*with `N=2`*): ***NB**: Each red point (an FID having a null PPM_P), is associated with the N nearest green points*. [![enter image description here][2]][2] Final GeoDataFrame (*with intermediates*): ```py FID PPM_P (OP) PPM_P (INTER) PPM_P geometry 0 0 34.919571 NaN 34.919571 POINT (842390.581 539861.877) 1 1 NaN 37.480218 37.480218 POINT (842399.476 539861.532) 2 2 NaN 35.567003 35.567003 POINT (842408.370 539861.187) 3 3 NaN 35.567003 35.567003 POINT (842420.229 539860.726) 4 4 36.214436 NaN 36.214436 POINT (842429.124 539860.381) 5 5 NaN 38.127651 38.127651 POINT (842438.018 539860.036) 6 6 NaN 40.431946 40.431946 POINT (842446.913 539859.691) 7 7 40.823028 NaN 40.823028 POINT (842458.913 539862.868) 8 8 NaN 37.871299 37.871299 POINT (842378.298 539851.425) 9 9 40.823028 NaN 40.823028 POINT (842390.158 539850.965) 10 10 40.040865 NaN 40.040865 POINT (842399.052 539850.620) 11 11 36.214436 NaN 36.214436 POINT (842407.947 539850.275) 12 12 34.919571 NaN 34.919571 POINT (842419.947 539853.452) 13 13 NaN 38.127651 38.127651 POINT (842428.841 539853.107) 14 14 40.040865 NaN 40.040865 POINT (842437.736 539852.761) 15 15 NaN 40.431946 40.431946 POINT (842449.595 539852.301) 16 16 NaN 40.431946 40.431946 POINT (842458.489 539851.956) 17 17 NaN 40.431946 40.431946 POINT (842467.384 539851.611) 18 18 NaN 40.431946 40.431946 POINT (842476.278 539851.266) 19 19 NaN 37.871299 
37.871299 POINT (842368.981 539840.859) ``` [1]: https://docs.scipy.org/doc/scipy/reference/generated/scipy.spatial.cKDTree.query.html [2]: https://i.stack.imgur.com/KefCq.png
Although other answers offer great ways to include the current directory, I think the *actual* answer to the original question may be in this: automatically adding the current directory to sys.path is governed by the "safe path" concept. But look:

> int safe_path<br>
> ...<br>
> Default: 0 in Python config, 1 in isolated config.

https://docs.python.org/3/c-api/init_config.html#c.PyConfig.safe_path

> The Isolated Configuration can be used to embed Python into an application. It isolates Python from the system. For example, environment variables are ignored, the LC_CTYPE locale is left unchanged and no signal handler is registered.

https://docs.python.org/3/c-api/init_config.html#init-config

So, as it seems, embedded Python runs with the isolated configuration, which specifically disables this behavior by default. This answer is for Python 3, where this behavior is still observable.
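You can inspect this from Python code too. A quick sketch (note: `sys.flags.safe_path` only exists on Python 3.11+, so the lookup is guarded rather than assumed):

```python
import sys

# Under the default (non-isolated) config, sys.path[0] is the script's
# directory (or '' in interactive mode); under an isolated/safe-path
# config it is not prepended automatically.
print("sys.path[0]:", repr(sys.path[0]) if sys.path else "<empty>")

# The flag was introduced in Python 3.11; guarded so the snippet
# also runs on older interpreters.
print("safe_path flag:", getattr(sys.flags, "safe_path", "not available"))
```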
    <link rel="icon" href="https://example.com/wp-content/uploads/2017/11/cropped-smiling-sun-1.png" sizes="32x32" />
    <link rel="icon" href="https://example.com/wp-content/uploads/2017/11/cropped-smiling-sun-1.png" sizes="192x192" />
    <link rel="apple-touch-icon" href="example.com/wp-content/uploads/2017/11/cropped-smiling-sun-1.png" />

This answer from another, similar question seems to have worked. I'm going to try it myself soon and will edit then.
I know this is old, but in case it helps someone: just a Clean Solution followed by Rebuild Solution fixed this for me.
Try it here: https://github.com/Mantano/iridium?tab=readme-ov-file. Currently, there is no implementation for "Highlights/annotations", but it is listed in the to-do list.
I have strings that contain scalars, vectors, and matrices, e.g.

    "[0,1],[[1,2],[3,4]],42"

Following the same example, that input in R should give me this:

```
[[1]]
[1] 0 1

[[2]]
     [,1] [,2]
[1,]    1    3
[2,]    2    4

[[3]]
[1] 42
```

So the notation is similar to that of Python, where matrices are actually lists of lists. Should I just write a parser for this, or has somebody already done that, or is there a simple and neat trick to achieve this with little or no effort?
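For reference, the strings are exactly Python literal syntax; in Python they round-trip directly with `ast.literal_eval` (shown here only to pin down the structure I want to reproduce in R):

```python
import ast

s = "[0,1],[[1,2],[3,4]],42"
# A bare comma-separated expression parses as a tuple of literals:
# scalars stay scalars, [..] stays a list, [[..],[..]] a list of lists.
parsed = ast.literal_eval(s)
print(parsed)  # ([0, 1], [[1, 2], [3, 4]], 42)
```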
How to read scalar, vector and matrix information in string format resembling Python syntax in R?
|python|r|parsing|matrix|
I'm trying to get information from the **ECU**. The connection works, and I see that the request has been sent, but I don't receive any information. The application (which is flashed on the unit I am working against) checks that the **FD flag** exists, and therefore I need it. I added the flag, but it seems to be dropped and nothing is received. In any case, I did not receive the information I requested. (I am using a **VN5620** as a **CAN** interface.)

my code for the **connection**:

~~~
# Create connection
bus = Bus(interface='vector', app_name='CANalyzer', channel=0, bitrate=250000)

# Network layer addressing scheme
tp_addr = isotp.Address(isotp.AddressingMode.Normal_29bits, txid=0x1C440019, rxid=0x1C460019)

# Network/Transport layer (IsoTP protocol)
stack = isotp.CanStack(bus=bus, address=tp_addr, params=isotp_params)

# Speed First (do not sleep)
stack.set_sleep_timing(0, 0)

# Set up the ISO-TP connection using the appropriate CAN bus interface and parameters
tp_connection = PythonIsoTpConnection(stack)
~~~

my code for the **request**:

~~~
with Client(tp_connection, request_timeout=1, config=config) as client:
    try:
        response = client.tester_present()
        if response.valid:
            # Extract the data from the response
            data = response.service_data.values[data_identifier]
            # Print the response data
            print(f"Response data: {data}")
        else:
            print("Error: Invalid response")
    finally:
        client.close()
~~~

the log:

~~~
2024-02-13 12:39:13 [INFO] Connection: Connection opened
2024-02-13 12:39:13 [INFO] UdsClient: TesterPresent<0x3e> - Sending TesterPresent request
2024-02-13 12:39:13 [DEBUG] Connection: Sending 2 bytes : [3e00]
Traceback (most recent call last):
  File "C:\workspace\companion_chip_automation\Tests\UdsOnCanTest\tring_test.py", line 86, in <module>
    response = client.tester_present()
  File "C:\workspace\venv_3.8\lib\site-packages\udsoncan\client.py", line 174, in decorated
    return func(self, *args, **kwargs)
  File "C:\workspace\venv_3.8\lib\site-packages\udsoncan\client.py", line 391, in
tester_present response = self.send_request(req) File "C:\workspace\venv_3.8\lib\site-packages\udsoncan\client.py", line 2184, in send_request raise TimeoutException('Did not receive response in time. %s time has expired (timeout=%.3f sec)' % udsoncan.exceptions.TimeoutException: Did not receive response in time. Global request timeout time has expired (timeout=1.000 sec) 2024-02-13 12:39:14 [DEBUG] Connection: No data received: [TimeoutException] - Did not receive IsoTP frame from the Transport layer in time (timeout=1.0 sec) 2024-02-13 12:39:14 [ERROR] UdsClient: [TimeoutException] : Did not receive response in time. Global request timeout time has expired (timeout=1.000 sec) ~~~ Could you help me to get the first response and solve this issue?
I am new to PostgreSQL. I want to understand exactly what a page looks like internally. I have gone through some articles, e.g. about the 8KB page, but the internals are quite difficult to understand. As a newcomer, can someone show what a page looks like inside, and if any simulation tool exists, please let me know so I can use it for reference. I have gone through many links, but as a beginner I find them very difficult to understand; if someone can point me to an easier way, that would help me.
How does a PostgreSQL data page look internally?
|postgresql|
# Idea In your `for` loop, when you start iterating from index `i`, you actually don't need to fetch all occurrence of high/low hits, but just stop the iteration once the **first hit** (either for high or low) happens at `j` for `j >= i`. Then you refresh your starting point with `i+1` and repeat the process above. In other words, your origin approach `min(which(...))` is unnecessarily inefficient since it gives all indices info and takes the first one only, while it could be much faster if you terminate the iteration as long as you find the first hit, then enter the next round. # Code Maybe you can try the `for` loops like below ``` L <- length(x) hit <- vector(length = L) for (i in 1:L) { for (j in i:L) { d <- x[j] - x[i] if (d >= con) { hit[i] <- 1 break } if (d <= -con) { break } } } ``` # benchmark Defining candidates like below ``` f1 <- function() { hit <- NULL for (i in 1:length(x)) { hithi <- min(which(x[i:length(x)] >= x[i] + con)) hithi <- ifelse(is.infinite(hithi), length(x), hithi) hitlo <- min(which(x[i:length(x)] <= x[i] - con)) hitlo <- ifelse(is.infinite(hitlo), length(x), hitlo) hit[i] <- ifelse(hithi < hitlo, 1, 0) } hit } f2 <- function() { L <- length(x) hit <- vector(length = L) for (i in 1:L) { xi <- x[i] for (j in i:L) { d <- x[j] - xi if (d >= con) { hit[i] <- 1 break } if (d <= -con) { break } } } hit } ``` we run the following benchmarking test ``` set.seed(0) x <- cumsum(rnorm(10000)) con <- 5 microbenchmark( f1 = f1(), f2 = f2(), times = 10L, unit = "relative", check = "equivalent" ) ``` and we see that ``` Unit: relative expr min lq mean median uq max neval f1 23.86287 24.56557 23.66208 24.22205 23.91301 21.11874 10 f2 1.00000 1.00000 1.00000 1.00000 1.00000 1.00000 10 ``` which shows `f2` is `23x` faster than `f1`. If you want to boost the speed even further, you can rewrite the code in `Rcpp`.
[Image showing output from the code below][1]

    homes_by_state = df_south.groupby(["state"])["property_type"].value_counts()

I want to output only the `state` and the total number of `property_type` entries per state.

[1]: https://i.stack.imgur.com/DwhmA.png
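To make the expected result concrete, here is the shape of output I'm after on a tiny made-up sample (plain Python standing in for the DataFrame rows, since my real data is larger):

```python
from collections import Counter

# Made-up (state, property_type) pairs standing in for df_south rows.
rows = [("TX", "house"), ("TX", "condo"), ("FL", "house")]

# One total per state -- not a per-property-type breakdown.
totals = Counter(state for state, _ in rows)
print(dict(totals))  # {'TX': 2, 'FL': 1}
```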
For Swagger in .NET 6 (possibly earlier), use standard HTML list tags: ``` /// <summary> /// ... /// </summary> /// <param name="...">_some_text_like_"Available Values:" /// <ul> /// <li>item 1</li> /// <li>item 2</li> /// <li>item 3</li> /// <li>...</li> /// </ul> /// </param> ``` *Note: Indentation not required.*
I have a Java Discord bot running JDA version 5.0.0-beta.20 and Lavaplayer version 1.3.77. This bot worked very well, but I decided to update the command system. I created a new branch in my GitHub project so I could update the system without changing the working bot. After updating the command system (I implemented slash commands) I tested the new version of the bot. It joins my channel, but it doesn't play music. Then I tested whether my old jar with the previously working bot still works, and it now shows the same problem, even though I changed nothing in that .jar file. I already tried to upgrade/downgrade both the JDA and Lavaplayer versions. I also removed the bot from my test Discord server and re-added it (with administrator permission). But it still doesn't work. Could it be a bug in Lavaplayer/JDA? Or do I have to change anything?
JDA Lavaplayer bot stopped working without any change to the .jar file
|java|discord|bots|lavaplayer|
I need to batch delete from a table, filtering from another related table. There can be millions of rows, so I use `.Take()` in a loop, so as not to fill the transaction log. I'm getting:

    SqlException: The column 'InvoiceId' was specified multiple times for 't1'.

This code:

    var x = ctx.Invoices.Where(p => p.InvoiceDate <= new DateTime(2023, 9, 14)).SelectMany(p => p.InvoiceDetails);
    var y = x.ToLinqToDB().Take(100000);
    y.Delete();

converts to

    info: Microsoft.EntityFrameworkCore.Infrastructure[10403]
    Entity Framework Core 6.0.13 initialized 'Context' using provider 'Microsoft.EntityFrameworkCore.SqlServer:6.0.13' with options: SensitiveDataLoggingEnabled
    info: LinqToDB[0]
    -- SqlServer.2008
    DECLARE @take Int -- Int32
    SET @take = 10
    DECLARE @InvoiceDate Date
    SET @InvoiceDate = CAST('2023-09-14T00:00:00.0000000' AS DATETIME2)

    DELETE [c]
    FROM
    (
        SELECT TOP (@take) *
        FROM [Invoices] [p]
            INNER JOIN [InvoiceDetails] [c_1] ON [p].[InvoiceId] = [c_1].[InvoiceId]
        WHERE [p].[InvoiceDate] = @InvoiceDate
    ) [t1]

    fail: LinqToDB[0]
    Failed executing command.
    Microsoft.Data.SqlClient.SqlException (0x80131904): The column 'InvoiceId' was specified multiple times for 't1'.

I have tried many different ways of coding this: query syntax, method syntax. If it is just one table, `.Delete()` works fine. I also tried `.ToLinq2DbTable()` with the same error.

Linq2db version 6.7.1
EF Core version 6.0.13
Linq2db batch delete not working for 2 tables
|batch-file|entity-framework-core|linq2db|