|rest|apache-kafka|content-type|acr|azure-webhooks| |
Just append `.git` to the project path.
For example: `C:\DotNetAngularProject\MyApp\.git`
You will see a list of files and folders such as `HEAD` and `FETCH_HEAD`: [screenshot][1]
[1]: https://i.stack.imgur.com/ImSzj.png
Delete the `index.lock` file manually.
Close your project and restart it.
The issue will be solved. |
In a DB stored procedure I am trying to insert data into multiple tables together, using ordinary INSERT statements. But when several service calls arrive at the same time, the data is not inserted into all the tables; rows are skipped as the procedure moves on to the next request. I need a solution at the procedure level.
INSERT INTO RepurchaseMaster_Reg (RegId, SprRegId, SalesTo, FId, BillNo, BillDate, TotQty, TotPV, Status, SesId, UqId, IpAddress, TotalDP, TotUnitPrice, CommDate, Remarks, CBPayNo, MemType, BillType, OrdType,
TaxType, TaxSubType, SaleType, TotGross, TotNet, TotValue, TotCGST, TotSGST, TotIGST, SplOffCode, SplOffAmt, TotalBP, TaxAmt, NetAmt, UpdatedDate, PayableAmount, TotMRP, RepType)
SELECT
@RegId,
@SprRegId,
@SalesTo,
@FId,
@BillNo,
@BillDate,
isnull(@TotQty,0),
@TotPV,
@Status,
@SessId,
@UqId,
@IpAddress,
@TotalDP,
@TotUnitPrice,
@CommDate,
@Remarks,
@CBPayNo,
@MemType,
@BillType,
@OrdType,
@TaxType,
@TaxSubType,
@SaleType,
@TotGross,
@TotNet,
@TotValue,
@TotCGST,
@TotSGST,
@TotIGST,
@SplOffCode,
@SplOffAmt,
0,
0,
0,
GETDATE(),
@PayableAmt,
@TotMRP,
@RepType
SET @RMid = @@identity
IF @BillType = 2
BEGIN
INSERT INTO ProductCredits (RegId, Sprno, InAmt, OutAmt, Dated, Descr, Remarks, TypeOfInc, billno, rmid, ExpiredDate, LockedDate)
SELECT
0 RegId,
@RegId Sprno,
@TotPV InAmt,
0 OutAmt,
GETDATE() Dated,
'Product Purchase thru ID : ' + @BillNo Descr,
'Thru Product Purchase' Remarks,
'Product Purchase' TypeOfInc,
@BillNo billno,
@RMid,
DATEADD(yyyy, 1, GETDATE()) ExpiredDate,
DATEADD(yyyy, 1, GETDATE()) LockedDate
UPDATE RepurchaseMaster_Reg
SET TotPV = 0
WHERE rmid = @RMid
END
INSERT INTO RepurchaseItems_Reg (rmid, pid, qty, MRP, DP, UnitPrice, GrossValue, DisCount, NetValue, PV, GST, IGSTAmt, CGST, CGSTAmt, SGST, SGSTAmt, TotValue, BP, amount, BV
, TaxAmt, CGSTTax, CGSTTaxAmt, SGSTTax, SGSTTaxAmt)
SELECT
@rmid,
ISNULL(ic.ProdItemId, t.pid) pid,
qty,
MRP,
DP,
UnitPrice,
GrossValue,
DisCount,
NetValue,
PV,
GST,
IGSTAmt,
CGST,
CGSTAmt,
SGST,
SGSTAmt,
TotValue,
ISNULL(BP, 0),
ISNULL(Amount, 0),
BV,
ISNULL(IGSTAmt, 0),
CGST,
CGSTAmt,
SGST,
SGSTAmt
FROM tmpRPProductsItems t
OUTER APPLY (SELECT TOP 1
ProdItemId
FROM ProductItemCodes
WHERE pid = t.pid) ic
WHERE UQID = @UqId
AND sessionid = @SessId
AND RegId = @RegId
AND FId = @FId
INSERT INTO RepurchasePayments_Reg (Regid, RMid, Mop, MopNo, MopDate, MopBank, Amount, Sesid, DDNo, ChequeNo, vouchercode)
SELECT
@regid,
@RMid,
MOP,
@MopRefNo,
@Billdate,
'',
Amount,
@sessid,
DDNo,
'',
''
FROM TmpPayments
WHERE Uniqid = @UqId
AND SessId = @SessId
AND RegId = @RegId
AND FId = @FId
IF (@ReqToType = 'Stores')
BEGIN
INSERT INTO RepurchaseReqToAddress_Reg (Rmid, Code, Name, Addr, State, District, City, Pincode, Mobile, Email, GSTNo)
SELECT
@RMid,
'Stores',
Name,
Address,
State,
District,
City,
LEFT(Pincode, 6),
LEFT(Mobile, 15),
Email,
LEFT(GSTNo, 15)
FROM StoreProfileAddress(NOLOCK)
WHERE CountryId = @CountryId
INSERT INTO RepurchaseAddress_Reg (Rmid, Name, Addr, State, District, City, Mobile, Email, Pincode, GSTNo)
SELECT
@RMid,
@SName,
@Addr,
@StId,
@District,
@City,
@Mobile,
Email,
@Pin,
LEFT(GSTNo, 15)
FROM StoreProfileAddress(NOLOCK)
WHERE CountryId = @CountryId
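Given the symptoms described, one procedure-level sketch (my assumption of a fix, not code from the question) is to make the whole write atomic: wrap every INSERT in a single transaction with `SET XACT_ABORT ON`, and capture the new identity with `SCOPE_IDENTITY()` instead of `@@identity`, which can return an identity generated by a trigger on a different table:

```sql
-- Hedged sketch: run all the INSERTs above as one atomic unit.
SET NOCOUNT ON;
SET XACT_ABORT ON;   -- any run-time error aborts and rolls back the transaction

BEGIN TRY
    BEGIN TRANSACTION;

    -- INSERT INTO RepurchaseMaster_Reg (...) SELECT ...  (unchanged from above)

    SET @RMid = SCOPE_IDENTITY();   -- safer than @@identity when triggers exist

    -- ... the remaining INSERTs from the procedure, unchanged ...

    COMMIT TRANSACTION;
END TRY
BEGIN CATCH
    IF XACT_STATE() <> 0
        ROLLBACK TRANSACTION;
    THROW;   -- re-raise so the caller sees the failure instead of silently skipped rows
END CATCH;
```

With this shape, each request either writes all tables or none of them, so concurrent calls can no longer leave partial rows behind.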
|
The following code behaves differently between Hibernate 6.1 and 6.4 on PostgreSQL:
```java
var rows =
session
.createNativeQuery("SELECT CAST('[1,2,3]' as jsonb) as data, 2 as id", Object[].class)
.addScalar("data")
.addScalar("id", StandardBasicTypes.INTEGER)
.list();
rows.forEach(row -> System.out.println(row[0].getClass() + " -> " + row[0]));
```
Output with 6.1:
```
class java.util.ArrayList -> [1, 2, 3]
```
Output with 6.4:
```
class java.lang.String -> [1, 2, 3]
```
In 6.1, it converts the jsonb column into an `ArrayList<Integer>`, but in 6.4 it just reads it as a `String`. I tried passing in a number of different `StandardBasicTypes`, and none of them made 6.4 behave like 6.1. **So, how can I do this in 6.4?**
Here is code to reproduce this:
https://github.com/chadselph/hibernate-json-native-query |
I have a function responsible for constructing URLs using relative paths, such as `../../assets/images/content/recipe/`. I want to replace the `../../assets/images` part with a Vite alias, but it doesn't work.
Here's my function:
```
const getSrc = ext => {
  return new URL(
    `@img/content/recipe/${props.recipe.image}${ext}`,
    import.meta.url
  ).href;
};
```
And here's Vite config for aliases:
```
'@img': fileURLToPath(
  new URL('./src/assets/images', import.meta.url)
),
```
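For what it's worth, Vite only rewrites aliases in static import specifiers; inside a dynamic `new URL(..., import.meta.url)` call the `@img/...` string reaches the URL constructor literally. The usual workaround is to keep a module-relative path in the call. A minimal sketch (the `baseUrl` parameter stands in for `import.meta.url`, and the file names are hypothetical):

```javascript
// Sketch: resolve the asset relative to the importing module, not via an alias.
// baseUrl stands in for import.meta.url; image/ext are hypothetical inputs.
const getSrc = (baseUrl, image, ext) =>
  new URL(`../../assets/images/content/recipe/${image}${ext}`, baseUrl).href;

// Example: from a component under src/components/recipe/, the path climbs to src/.
const href = getSrc('file:///proj/src/components/recipe/Card.vue', 'pie', '.png');
```

In the actual component you would write the relative path literally inside `new URL(..., import.meta.url)`, since Vite's build-time analysis only recognizes that literal form.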
I'm looking for a possible solution. |
Using Vite Aliases in a JavaScript Function |
|javascript|node.js|vue.js|vite|alias| |
[enter image description here](https://i.stack.imgur.com/JYxhM.png)
I am trying to learn this, but I have no idea what is going on, and it is bothering me.
Can anyone explain it? |
When I remove the 3 from `new Array( )` nothing changes, and this confuses me. What is happening? |
|arrays|new-operator| |
I created a chart in Vega-Lite and added a scroller to it (vconcat-1). The scroller works fine, but the text disappears when I bind the scroller to the x axis using `"scale": {"domain": {"param": "brush"}}` under the x axis.
I want the text to remain visible while the scroller keeps working properly. [![When I bind the scroller to the x axis the text disappears][1]][1] [Editor link][2]
[1]: https://i.stack.imgur.com/hlEIC.png
[2]: https://vega.github.io/editor/#/url/vega-lite/N4IgRghgxg1g5gJwPYFcB2ATEAuUBTADwAcEcQAhaeZdDAYQjQDcIBnOpAGyVIF8AaEFCRoAZgEs4OUBALjW0kJwhg8nAGIiALgGUtAT055FhEmQCCc1gBkVazWl0GjIASC3itRh08PHcIKak2CCW8gAqnt7aen6ubqLaJsTBID7h+kTGbkZweJiKwpwoALZoCgFBZNZ4eZgcxWWugsqqnJSsauJo-vgp1bX5GLZtHV09zUp2nA08yWYhNXXD07N8LdM+OuIAXr2B-YuDmCP2MbvZG21bzvtVR8unGjG3k61q1uIlnvOpS0NPThfH45aYAeVEok6Wl+A0e4Mh0Le4KI0E8+lhDwBKPEUHRyP0qBhATASC0WiQJUUjCgAAs5iFvhgMC43BSiNS0HSGSAmSzsm4eOJ8sS+gsQP9MGCEMLHJNWPoSqTOGDUXiDJiJccMDpFcrVbj8bw2VF9olHLEXJVDiBIl48Dc4m5WCgwB57Y6rWLUjpXe7ohbXsa3BgIFoIIoWMU8BUANqgBhGTAQBAAAgAstpaamAJIAETIACYAAzFgCcxYAjCBBDm0ABHFB4JsAfWE6GJhcrAA4ACyCACixDwUC0eAwbdQcuwhYArMWAGwCBMQJOhtOZxzZ-NF0sVws1kB1xvNvCTjs4LsAdm7g+Ho-H5+nhd73cLgnT4gQeCIFQXxdnStBAAOSQVMkFEVMACU8BKJAowqWdZwAZn7EAAAUvx-RDewXZDlxARMhhTDMs1zAsQhLcti2Qw9jybVt22fZDKwXO8sgfCcmM7K8F17AiiOTDcyJ3Si92LXs6IbBiz24y9K1nW8QCHDixy4qdOwXLSBNXYjhK3cjd2o2cpJPRiNMvJC0JUkc1KfYlkIk-D+BXNcSM3LRtwokAqIrBdTJk+zLy0st2Nsx85OwZDHLLHS3P0zzDLE6irwC08gpnK9C1C5T7zsyLoqvK84r00iDNEnzxO7NLzIvGctPfXLVIiiyook-iXMI3ShLKxKKt88satk1rC0LVCws4jKXyvZzXNKjyvKMstK2LIapoXbslJsyaCpLSsSp6haksq8tK2rWtpPSyLrxy7b8pG5DAIO9desW5LloPC6zOGuqxokib7t+q8aN4ABdNxUQQCASjjUA0Gh-wtWWBpSjQQ8oybHAOrhhGyAADTCGxpnMNA4BcQQMf8ABaCsCPhkpEclDBpVlGEKdXTGQlJclKWaHGGbhbFRjYcZjHZ6MyD5VlOvpxntSeNZ0Y5xGAGJS3V1a6dxrETk2c49iViXsErWaQFlwXdbaIFvjZkBKZwYstYFnWVjaCEoTwW37ZnWcnbl+E3aIQ0NXFzn9pl7WQAANTwBAPCgVdPh6NZzrt5WyDVmioCvMA+bNyOY7j3FE+6PA9GQGA8DzNhaVT73YyA4twYj53o9j+OS56cukErgB1cQME8uv0+Nv2yELjvOCTvA1k+tOjZATPkOz3Ox5CCfi6n0vu8r6vWFpOf6-7Xtm-5xGN4Treu60Cu8H7weD8NznTfN9f2836e1lo0PVacsAr0kmvNuRdL7Tx3lXGu3956cwbvwJuQCL6dzLjfHud8B6eSgd7cOZ9x7v1AaXNYkkf4Zz-gAvOr9gGTzASg3eNciHQP8LA+BLdz54KQeA++nl6FYKAUzXUSouAGnVBiYho8WHVFLnwvUXAn6I1YI2FMlwcGLEkdqfhyptgG1EbTcRIRwgpjyFoKRAjOCyLIPIlAijyGR30QgQxxiNEXDMdgHRyiQAAE1CZPBJmTMWDCHZAN9GAO04gjCK1ESAHgjA8jWNbnaAMWhwn+K5soWAsTEbxIdPrPxWCly6NtKacwQI4BowiVAEUsd0lkEyeEQgXsR4gCqXowpXJ6SkAiVLJR+c4mmjGECCYET2RNIKfaPMBBnGO3yUE-0WS5QROVFgIBmTPg2wmUs007skSiJLIEv0
ppnGNN2W6U0WwnGiNYkA9MKYYCqjRCHZJ2DumM1Lpw2kzjCxAM8VYTJBz6JXQskAgm3z9kRMEi9I6O5AVeOmCsn4ojJluK+fIJ4sL6lGwRU8sgSKiZtE0TkkeJsoVWCeHi5xhL8lAuRasLgcx5mpJgMM7FCsaXtOSeAelwzKWsEiLAUl8LPmEx5TAPlySMUUKZiSs5DyxWR2xZkpJ3t2VUE5YK00CqGlgA5QK4FHpslkryW4rlmTTlaIeQazFIRKCwEQFOegLKDkqwHMhAc6h1D+SJfIAA4jKZmap8T8vydi71A9bnCLJR61gwa7XcFZYqlWGBkIJoTYywmUb1UL3jYmrNKarB5kpBAbozjYzFn4AuU+FqJQEPtRElWvYMALlnCoYZOgRwiGGFWmNDrVC9jAK+YZ6EkDdC0HWHoCB02c0XnWhtTbAmtswAOodI7Y7jtVt23t1UgELscGCFAY4x3VrZbW+tjbV5TLnRgLdWgd17pXRnNdfalkGM9inB1UAzp4ELIs-JtSCCJJZcPDNb7Kwfq-YiqC0TYKyGcWgFAnBOCfPA6TWChbREwbg4+uxz6WWHwabW1iyEIC0SWXU2eDrez4cI8Mq5CBhW0ggFkA5N9xDROlm46jtH6Nlyld7YG2r5ADtYB4EQaz8nUNvgwDkESaAFF4dvGhED95FsbuW8Vpc+mlwOf0vAVNuYUipLJ6+t9XnhtE6XWp5RxDCfOQZvAob-UPNncIE4cnxP0YOdJ0DFaW1OfbYZ1Be83miKYSpyO3m23T3UwMtlWmdNkj08289YnUHGes2enz09zOsEs6UhzaXwulzs-cnhFLoWBzuSI6VGHDHgICwBmBymquew4eg2uZKAB0Hzv1PqvX6gwdX-CPIobY6r8mAs4aNsFxrTgjMtfG2HDrU3Cv6DmwNiNAmhM5e9jKuJdSTX4qNgpTdg7HA+PJqK4jv6fBjOcbx79u3tB5gq1txzbaf1aD22S32eXMBvau+M0Rt23FhZ+-dxwj2RNuPSJkfbE7EziDADKYZxqXhxHmVwTzQ2TnaF7ngSQtI0UToWc2vZurAyo7ZUTo5MyfA47xwTxGlP8mlbOGTs7irGduOZ88RwtO4D44ORzitqrSe+DZxq9HwzhcJN5-ztHnAMeRwlXrVnMOGcS6AVa6gtqGDMDYLekIKtXVG-UK4UGggZnUmKWja04pMlFMkGjNwNI2majt60uYbhICdC0673pIsfduCKAyb01SseOEViGcZNvUiZOuwkPV0fQ8i9JTkYEooDi29NKiyYEEPbp-uCMowmzPbyhJ16DPPoy+XBAC6Y59pFaJ5CEEkJYT7XOirx9xvIBplh90E4tkdTff2je-ECmTmE7EnjIELkSAMDdCkAEKPMgrCKEQAPHAN8myCDX9G4PFf8app9RH7fPqluai5VGs-oI2inbuDaLl3jSbSymG0BvIeQgP+pZ26-LO+8Gy75-riv3lcB8Gnuflztnj-iqL1hiAAVzlfoIEQEgFlhtuAVYOttlpMPHMKk4nAcCrysAe4PsngREPsiaPXvaiQdymqm3ubr3qSlQcjhaP3gkMKPLmQGCu5CJAWIIKwDwOnhIGoFgCEOvs6JfPsBgPmoWgEJDNDGQAjigIpsaObtDmQGgJSN0KuKPiALSLjnzp2POBsPoJUtgFPvkMIHPqTIUJQaAKGFoKUGQH8rVHMkoNqNYY0OUJqEzCjE0FAZFnfuKErsLN7hplAW-vvi7Myt-iAdzn-gEX8PLMrnEciG0Nnl3kEWoNbCCDEcXvnjaBkSqIiCXlAWfukYkYHMHBiM6NItAeVl4WojUUIkaLwTUQwe-pWj0A4lwCni0SYuAuEQXpetesurQTXjURkAxukaossOojIsoTXuIYoFDEhjgGYTaJekuvut-s3EQfaGQJMLAcvvIKvj6hvggFviADvgMTaEGkfqMTvqUe0bcSGjASkWoLfpqEysTE-tXu8DMDYRER4lzsfi-moCpKkHYa
UG1n8QANQADkAApHCYeH8Z3k8Vzj0aCVvKsl3l8akWARDMgZ4JgbiYTBgcJiaAQf-uifgTgQbOQeXgXnKmQXQRQZ2qSTqq3tEbsQkm0YCcycniwYIIIewSEE4T9C4awIsQEJISUAWmgFckvvyYhnkHKeMiGFIQqdIU8Sqcho7oIHsMgDgKIKuJ0M6PwTgGhpwCoZMSAAoo4J4GGOIEwAKIIHKTRooF7qLF4Wpv7qEcKaEqYiEDfIwKwJDCKIeBIHBuOGcRcUgK8VMT0AgSAEgUOpyL4pqJeh8awXBpmcdsOmgKOiCfvJxpqOxjoHRgxs6Lge0eWZWVxvSbwfJnmUOsMVsR7tWdSYCdPMZs6M2YmTPKMYJq5hyAOf0W5n2bfAFj6X5rQkoU2TNg-DOcgouZ5FgfkFlsJgOZlpgWyEgFwB4KOZvngDaYjAHogSmNDLDBWgofvKtLwWoLZCYDPpYXAHGCAOMmbu4KoSEKmXKMaOWuYbPvPu4XvpCVSCEHdC1BeC0G4QEEUKjBUO0d4VwKjG8e0H6RMGUQHGoP4ehdcYEeUWoCCaiQnshURbEZiX8WkeRThXBgSTkUUXkYRXRbkehY8d2RRU0RqNUSYhxQXl0dAZUfKK0TWd2dMUMLMZwJiQqH0fJgRT6OekMbuiMdybJcqBMfERIp0Q0SYtoVKbpEsRBqse0cDhevmZsRHjsRbiEAcdSCvgEH8eCWQOBdCdMJMCKcIU1OFOpDBQsYZQEIaUgMaaaXgOaXHJabBtad+bafae6E6S6doe6QyiSJhVpU3olr6SERMDmUGe4MsWGYoi4VGUYMIceYIPGXUV3mZdPMmX+enixulRhPmdmQGbmdVUpRZYWapR7rwfWWWdchWaWZ2U1XWZxjJf2aZZ1a2Spe2XwCNcub2QuagpqDVR2r1TXvJW5h1eli5qghJvKKNjXKtZlbOQpm8pOcli1idbtWdUte4BuSSVNbdXgDuRSebvuZwIebGSeTFWef6SmZeTDCZTeecfvKnN7k+ZUC+fPu+Z+aeWQPVfEIBTDVYfBQCeBdUt1nPLkEMJFXBrwdKaAMsTEqYe0cNlhjGtWGDAyYjJMEvm4IcSAJjVFFRPMclYoJVWGl3hTT1uVtTctXOW8jzd1jVjXALZtaucLeTaLfJq8hLRSAeeIEeecb9QYLaecc-rIcDWTaDYobSFApDaOM+RYbDasR+SAF+erYjEjQBQREBa+YoGOL+ooCzbaN1qmAeG4AzYIEzSzchGzW4BzejeyeTSRv+pMBgEqQXr9g9uqYIBgEzTHaDloODvHswV2cnZdnqmyD+Q9S7YIFHiAAAO7XXKH4Co0L6gDO3p5u3hiYZaAS0+0gB+1hgOEzg9j8RB3XKgXBBh2-qkaR43WvYp1x4J1J02hmWx1g5VHClkWAlT0p2YnW1kA12HhF2l1Lnl3T6m1o2gBB593M1t0QUgB5jFipi80olwVWmE0BXE3GW60x3daD3WXEGHKF3SCM2u3H3yR9js090BBc34gi0N1LZe2C3nWu6y1Tk1zgOS1XVLkgMjZS1wOK1fXK0-UI0hCa3V7a3XkUK3m0j0JG356V1w2W1YMpn5nI322V292KC40FDYA308l03e3f32En0DQmRf24D-0ekh174F6IJXyDlU3yj9UhA+4QPTld4iNJZC0S3DkINrlyNsKiPNYPwK2fXfXYDlV-VkA4OHhAMahiKA1Qw61T4ENg20j+QPlGDG3Q271vnm3w0GO-nUN22dQO0gWCOH2MPCEsM2Xv0W0yk-3vRnR2V8Pd0CP70AnCPqOfzYYSOllSMaYyPHVqMgLsJHX7xwPKN9zXVZNUJ7WFMPyoM6MYN6Oq2UNGMVUJnhzmNXkg3WP60mT2NQ0V3OPkNW1522003eN0N+MMPX1RWslWghNKlu0DQS2HH8MpVxOh2AnyPrUIBEZ9WpNKDpPwNC2agrNnUBbrM7NoKIPtH7MrkqMG1YGVMq0XEr3YOwa4NA
34ORyEPVQdOONdPAWkw9OUP9Mo3OP0OOWjME2sP7HsNhOcNLRVhRPADzNAtnOJOrOAIbO2nSPHOyOIvZMaO5NEOHVS17NIt3UtYovuA3OYPuMgBGMQzPMtOvM2OpQfOkPdOuMUOUv-NgyCDfh8GcCJUBAGXl4H1kDdAYA-hDARmouIwitiuYARnzEhnlCJAIAQVT5Rl7qKDa3yE2PI0UzCjF2KAlWaia42q0AR5uCb2qNCDaDymVK0OAvDMBCKrFL47eoQCz2hNHEVAsOeUcHdTgrcGHh8ERUBA+siFfqUvqHfDwymLzG6F05Ywlq8gAOgAmOwFASUuKIRg0sWMvOtyEOBuPmfM73fMuOmEW1W0FXKBjgUv3MgBDqxxRg6sl1FNWuOA2t8DgxAA |
Text disappears when using a scroller in Vega-Lite |
|vega-lite|vega|vega-embed|vega-lite-api| |
I was trying to convert a Code::Blocks project to compile with a makefile, but I'm having problems linking the OpenGL libraries.
I've been trying for days and haven't been able to get it to work yet.
In my project directory I have the following structure:
- include/
- All headers of GL and GLFW
- lib/
- libfreeglut32.a
- libglfw3.a
- libglfw3dll.a
- libglu32.a
- libopengl32.a
- src/
- main.cpp
- other .cpp files
My codeblocks project:
```
<?xml version="1.0" encoding="UTF-8" standalone="yes" ?>
<CodeBlocks_project_file>
<FileVersion major="1" minor="6" />
<Project>
<Option title="Canvas 2D" />
<Option pch_mode="2" />
<Option compiler="gcc" />
<Build>
<Target title="Debug">
<Option output="../__bin/Debug/canvas2d" prefix_auto="1" extension_auto="1" />
<Option working_dir="../" />
<Option object_output="../__obj/Debug/" />
<Option type="1" />
<Option compiler="gcc" />
<Compiler>
<Add option="-g" />
</Compiler>
</Target>
<Target title="Release">
<Option output="../__bin/Release/canvas2d" prefix_auto="1" extension_auto="1" />
<Option working_dir="../" />
<Option object_output="../__obj/Release/" />
<Option type="1" />
<Option compiler="gcc" />
<Compiler>
<Add option="-O2 -Wall" />
</Compiler>
<Linker>
<Add option="-s" />
</Linker>
</Target>
</Build>
<Compiler>
<Add option="-Wall" />
<Add option="-fexceptions" />
<Add option="-std=c++11" />
<Add directory="./include" />
</Compiler>
<Linker>
<Add library="./lib/libfreeglut32.a" />
<Add library="./lib/libopengl32.a"/>
<Add library="./lib/libglu32.a"/>
</Linker>
<Unit filename="src/Bola.h" />
<Unit filename="src/Botao.h" />
<Unit filename="src/Relogio.cpp" />
<Unit filename="src/Relogio.h" />
<Unit filename="src/Vector2.h" />
<Unit filename="src/gl_canvas2d.cpp" />
<Unit filename="src/gl_canvas2d.h" />
<Unit filename="src/main.cpp" />
</Project>
</CodeBlocks_project_file>
```
Makefile:
```
cpp_source_files = $(shell find -name *.cpp)
all:
g++ -Wall -fexceptions -std=c++11 -L lib -l freeglut32 -l glu32 -l opengl32 -l glfw3 -l glfw3dll -I./include $(cpp_source_files) -o run.exe
```
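Two details of this rule are worth noting (general GNU toolchain behavior, not project-specific): `ld` resolves `-l` libraries left to right, so libraries listed before the source files cannot satisfy references from objects compiled after them, and the unquoted `*.cpp` pattern may be expanded by the shell before `find` sees it. A hedged revision:

```
# Sketch: quote the find pattern; list libraries after the sources.
cpp_source_files = $(shell find . -name '*.cpp')

all:
	g++ -Wall -fexceptions -std=c++11 -I./include $(cpp_source_files) -o run.exe \
	    -L lib -lfreeglut32 -lglu32 -lopengl32 -lglfw3 -lglfw3dll
```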
Error:
```
g++ -Wall -fexceptions -std=c++11 -L lib -l freeglut32 -l glu32 -l opengl32 -l glfw3 -l glfw3dll -I./include ./src/gl_canvas2d.cpp ./src/main.cpp ./src/Relogio.cpp -o run.exe
./src/main.cpp: In function 'int main()':
./src/main.cpp:164:52: warning: ISO C++ forbids converting a string constant to 'char*' [-Wwrite-strings]
bt = new Botao(200, 400, 140, 50, "Sou um botao");
^
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x1c): undefined reference to `_imp____glutInitWithExit@12'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x3f): undefined reference to `_imp____glutCreateWindowWithExit@8'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x61): undefined reference to `_imp____glutCreateMenuWithExit@8'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x7c): undefined reference to `glBegin@4'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x93): undefined reference to `glVertex2d@16'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x9b): undefined reference to `glEnd@0'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0xb0): undefined reference to `glBegin@4'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0xc7): undefined reference to `glVertex2d@16'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0xcf): undefined reference to `glEnd@0'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0xe4): undefined reference to `glBegin@4'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0xfb): undefined reference to `glVertex2d@16'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x112): undefined reference to `glVertex2d@16'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x11a): undefined reference to `glEnd@0'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x130): undefined reference to `glBegin@4'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x147): undefined reference to `glVertex2d@16'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x15e): undefined reference to `glVertex2d@16'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x175): undefined reference to `glVertex2d@16'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x18c): undefined reference to `glVertex2d@16'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x194): undefined reference to `glEnd@0'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x1aa): undefined reference to `glBegin@4'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x1c1): undefined reference to `glVertex2d@16'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x1d8): undefined reference to `glVertex2d@16'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x1ef): undefined reference to `glVertex2d@16'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x206): undefined reference to `glVertex2d@16'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x20e): undefined reference to `glEnd@0'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x224): undefined reference to `glBegin@4'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x23b): undefined reference to `glVertex2d@16'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x252): undefined reference to `glVertex2d@16'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x269): undefined reference to `glVertex2d@16'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x280): undefined reference to `glVertex2d@16'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x288): undefined reference to `glEnd@0'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x29e): undefined reference to `glBegin@4'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x2e0): undefined reference to `glVertex2d@16'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x2ee): undefined reference to `glEnd@0'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x304): undefined reference to `glBegin@4'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x346): undefined reference to `glVertex2d@16'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x354): undefined reference to `glEnd@0'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x3c4): undefined reference to `glRasterPos2i@8'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x3e5): undefined reference to `_imp__glutBitmapCharacter@8'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x419): undefined reference to `glClearColor@16'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x445): undefined reference to `glBegin@4'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x493): undefined reference to `glVertex2d@16'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x4aa): undefined reference to `glEnd@0'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x4d3): undefined reference to `glBegin@4'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x521): undefined reference to `glVertex2d@16'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x538): undefined reference to `glEnd@0'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x54e): undefined reference to `glMatrixMode@4'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x556): undefined reference to `glLoadIdentity@0'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x570): undefined reference to `glTranslated@24'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x588): undefined reference to `glMatrixMode@4'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x590): undefined reference to `glLoadIdentity@0'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x5aa): undefined reference to `glTranslated@24'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x5d1): undefined reference to `glColor3d@24'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x5f7): undefined reference to `glColor3fv@4'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x628): undefined reference to `glColor4d@32'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x7b9): undefined reference to `glViewport@16'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x7c8): undefined reference to `glMatrixMode@4'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x7d0): undefined reference to `glLoadIdentity@0'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x7f0): undefined reference to `gluOrtho2D@32'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x7ff): undefined reference to `glMatrixMode@4'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x807): undefined reference to `glLoadIdentity@0'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x82c): undefined reference to `glClearColor@16'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x843): undefined reference to `glPolygonMode@8'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x85b): undefined reference to `glClear@4'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x86a): undefined reference to `glMatrixMode@4'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x872): undefined reference to `glLoadIdentity@0'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x87c): undefined reference to `glFlush@0'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x881): undefined reference to `_imp__glutSwapBuffers@0'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x8cd): undefined reference to `_imp__glutSetOption@8'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x8de): undefined reference to `_imp__glutInitDisplayMode@4'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x8f5): undefined reference to `_imp__glutInitWindowSize@8'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x90e): undefined reference to `_imp__glutInitWindowPosition@8'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x932): undefined reference to `_imp__glutReshapeFunc@4'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x943): undefined reference to `_imp__glutDisplayFunc@4'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x954): undefined reference to `_imp__glutKeyboardFunc@4'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x965): undefined reference to `_imp__glutKeyboardUpFunc@4'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x976): undefined reference to `_imp__glutSpecialUpFunc@4'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x987): undefined reference to `_imp__glutSpecialFunc@4'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x998): undefined reference to `_imp__glutIdleFunc@4'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x9a9): undefined reference to `_imp__glutMouseFunc@4'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x9ba): undefined reference to `_imp__glutPassiveMotionFunc@4'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x9cb): undefined reference to `_imp__glutMotionFunc@4'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x9dc): undefined reference to `_imp__glutMouseWheelFunc@4'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0x9ed): undefined reference to `glGetString@4'
C:\Users\ADMINI~1\AppData\Local\Temp\ccL56CtJ.o:gl_canvas2d.cpp:(.text+0xa0f): undefined reference to `_imp__glutMainLoop@0'
C:\Users\ADMINI~1\AppData\Local\Temp\ccljMOAH.o:main.cpp:(.text+0x1c): undefined reference to `_imp____glutInitWithExit@12'
C:\Users\ADMINI~1\AppData\Local\Temp\ccljMOAH.o:main.cpp:(.text+0x3f): undefined reference to `_imp____glutCreateWindowWithExit@8'
C:\Users\ADMINI~1\AppData\Local\Temp\ccljMOAH.o:main.cpp:(.text+0x61): undefined reference to `_imp____glutCreateMenuWithExit@8'
C:\Users\ADMINI~1\AppData\Local\Temp\ccSmVNUV.o:Relogio.cpp:(.text+0x1c): undefined reference to `_imp____glutInitWithExit@12'
C:\Users\ADMINI~1\AppData\Local\Temp\ccSmVNUV.o:Relogio.cpp:(.text+0x3f): undefined reference to `_imp____glutCreateWindowWithExit@8'
C:\Users\ADMINI~1\AppData\Local\Temp\ccSmVNUV.o:Relogio.cpp:(.text+0x61): undefined reference to `_imp____glutCreateMenuWithExit@8'
collect2.exe: error: ld returned 1 exit status
make: *** [all] Error 1
``` |
Compiling a C++ program with OpenGL and GLUT on Windows |
|c++|c|opengl|linker| |
This should probably be considered an implementation detail of the `json` package, and also it might be a Python version thing. In any case, it becomes crucial with your implementation:
- The `dump()` function internally calls the `iterencode()` method on your encoder (see lines 169 and 176 [in the actual source code][1]).
- Yet, the `dumps()` function internally calls `encode()` (see lines 231 and 238).
You can verify this by adjusting `encode()` and overriding `iterencode()` in your `JSONSerializer` like so:
```python
class JSONSerializer(json.JSONEncoder):
def encode(self, obj):
print("encode called")
... # Your previous code here
return super().encode(obj)
def iterencode(self, *args, **kwargs):
print("iterencode called")
return super().iterencode(*args, **kwargs)
```
… and you will see that only "iterencode called" will be printed with your test code, but not "encode called".
The other Stack Overflow question that you linked in your question seems to have the same issue by the way, at least when using a rather recent version of Python (I am currently on 3.11 for writing this) – see my [comment][2] to the corresponding answer.
I have two solutions:
1. *Either* use `dumps()` in your `Agent.memorize()` method, e.g. like so:
```python
def memorize(self):
with open("memory.json", "w") as w:
w.write(json.dumps(self.G, cls=JSONSerializer))
```
2. *Or* move your own implementation from `encode()` to `iterencode()`, e.g. like so:
```python
class JSONSerializer(json.JSONEncoder):
def iterencode(self, obj, *args, **kwargs):
if isinstance(obj, MutableMapping):
for key in list(obj.keys()):
if isinstance(key, tuple):
strkey = "%d:%d" % key
obj[strkey] = obj.pop(key)
yield from super().iterencode(obj, *args, **kwargs)
```
This 2nd solution seems to have the benefit that it works with both `dump()` and `dumps()` (see note below).
A note for completion: The `dumps()` function later seems to result in a call of `iterencode()`, as well (I did not track the source code so far as to see where exactly that happens, but from the printouts that I added it definitely happens). This has the following effects: (1) In the 1st proposed solution, as `encode()` is called first and we can make all adjustments for making our data JSON-serializable there, at this later point, calling `iterencode()` will not result in an error, any more. (2) In the 2nd proposed solution, as we reimplemented `iterencode()`, our data will be made JSON-serializable at this point.
**Update**: There is actually a third solution, thanks to the comment of @AbdulAzizBarkat: we can override the [`default()`][3] method. However, we have to make sure that we hand over an object type for serialization that is not handled by the regular encoder, or otherwise we will never reach the `default()` method with it. In the given code, we can for example pass the `Agent` instance itself to `dump()` or `dumps()`, rather than its dictionary field `G`. So we have to make two adjustments:
1. Adjust `memorize()` to pass `self`, rather than `self.G`, e.g. like so:
```python
def memorize(self):
with open("memory.json", "w") as w:
json.dump(self, w, cls=JSONSerializer)
```
2. Adjust `JSONSerializer.default()` to handle the `Agent` instance, e.g. like so (we won't need `encode()` and `iterencode()`, any more):
```python
class JSONSerializer(json.JSONEncoder):
def default(self, obj):
if isinstance(obj, Agent):
new_obj = {}
for k in obj.G.keys():
new_obj[("%d:%d" % k) if isinstance(k, tuple) else k] = obj.G[k]
return new_obj # Return the adjusted dictionary
return super().default(obj)
```
Final side note: In the first two solutions, the keys of the original dictionary are altered. You probably don't want to have this as actually your instance should not change just because you dump a copy of it. So you might rather want to create a new dictionary with the altered keys and original values. I took care of this in the 3rd solution.
[1]: https://github.com/python/cpython/blob/6c8ac8a32fd6de1960526561c44bc5603fab0f3e/Lib/json/__init__.py#L169
[2]: https://stackoverflow.com/questions/72931719/json-serializer-class-why-json-dumps-works-while-json-dump-does-not/72932299#comment137929561_72932299
[3]: https://docs.python.org/3/library/json.html#json.JSONEncoder.default |
Using `libgpiod` version 1 with C++ bindings, I'm waiting for an event using `event_read` to get line event changes.
I understand that the event comes with a timestamp as `std::chrono::nanoseconds`:
```
auto event = line.event_read();
std::cout << "EVENT TYPE CODE: " << event.event_type << std::endl;
std::cout << "EVENT TIMESTAMP: " << event.timestamp.count() << std::endl;
```
So, I'm getting output like:
```
EVENT TYPE CODE: 1
EVENT TIMESTAMP: 6603108649419
```
Now I need to convert that timestamp to a Linux epoch date and time in seconds, to record the exact moment the event occurred.
I've tried the following function:
```
unsigned long State::convert_timestamp_to_epoch(std::chrono::nanoseconds timestamp)
{
    // Extract total number of seconds from nanoseconds
    auto seconds = std::chrono::duration_cast<std::chrono::seconds>(timestamp);
    // Convert seconds to unsigned long
    unsigned long epoch_time = static_cast<unsigned long>(seconds.count());

    std::cout << "CONVERTING:" << std::endl;
    std::cout << timestamp.count() << std::endl;
    std::cout << seconds.count() << std::endl;
    std::cout << epoch_time << std::endl;

    return epoch_time;
}
```
But I'm getting the following result:
```
CONVERTING:
6603108649419
6603
6603
```
None of them are epoch date and times...
How can I convert the `std::chrono::nanoseconds` timestamp that comes from `libgpiod` event to epoch date and time?
|
When the project had one data source, native queries ran fine. Now that there are two data sources, Hibernate cannot determine the schema for native queries; non-native queries work fine.
**application.yaml**
```
spring:
autoconfigure:
exclude: org.springframework.boot.autoconfigure.jdbc.DataSourceAutoConfiguration
flyway:
test:
testLocations: classpath:/db_test/migration,classpath:/db_test/migration_test
testSchemas: my_schema
locations: classpath:/db_ems/migration
baselineOnMigrate: true
schemas: my_schema
jpa:
packages-to-scan: example.example1.example2.example3.example4
show-sql: false
properties:
hibernate.dialect: org.hibernate.dialect.PostgreSQL10Dialect
hibernate.format_sql: false
hibernate.jdbc.batch_size: 50
hibernate.order_inserts: true
hibernate.order_updates: true
hibernate.generate_statistics: false
hibernate.prepare_connection: false
hibernate.default_schema: my_schema
org.hibernate.envers:
audit_table_prefix: log_
audit_table_suffix:
hibernate.javax.cache.uri: classpath:/ehcache.xml
hibernate.cache:
use_second_level_cache: true
region.factory_class: org.hibernate.cache.ehcache.internal.SingletonEhcacheRegionFactory
hibernate:
connection:
provider_disables_autocommit: true
handling_mode: DELAYED_ACQUISITION_AND_RELEASE_AFTER_TRANSACTION
hibernate.ddl-auto: validate
# todo:
open-in-view: false
database-platform: org.hibernate.dialect.H2Dialect
#database connections
read-only:
datasource:
url: jdbc:postgresql://localhost:6432/db
username: postgres
password: postgres
configuration:
pool-name: read-only-pool
read-only: true
auto-commit: false
schema: my_schema
read-write:
datasource:
url: jdbc:postgresql://localhost:6433/db
username: postgres
password: postgres
configuration:
pool-name: read-write-pool
auto-commit: false
schema: my_schema
```
**Datasources config:**
```
@Configuration
public class DataSourceConfig {
@Bean
@ConfigurationProperties("spring.read-write.datasource")
public DataSourceProperties readWriteDataSourceProperties() {
return new DataSourceProperties();
}
@Bean
@ConfigurationProperties("spring.read-only.datasource")
public DataSourceProperties readOnlyDataSourceProperties() {
return new DataSourceProperties();
}
@Bean
@ConfigurationProperties("spring.read-only.datasource.configuration")
public DataSource readOnlyDataSource(DataSourceProperties readOnlyDataSourceProperties) {
return readOnlyDataSourceProperties.initializeDataSourceBuilder().type(HikariDataSource.class).build();
}
@Bean
@ConfigurationProperties("spring.read-write.datasource.configuration")
public DataSource readWriteDataSource(DataSourceProperties readWriteDataSourceProperties) {
return readWriteDataSourceProperties.initializeDataSourceBuilder().type(HikariDataSource.class).build();
}
@Bean
@Primary
public RoutingDataSource routingDataSource(DataSource readWriteDataSource, DataSource readOnlyDataSource) {
RoutingDataSource routingDataSource = new RoutingDataSource();
Map<Object, Object> dataSourceMap = new HashMap<>();
dataSourceMap.put(DataSourceType.READ_WRITE, readWriteDataSource);
dataSourceMap.put(DataSourceType.READ_ONLY, readOnlyDataSource);
routingDataSource.setTargetDataSources(dataSourceMap);
routingDataSource.setDefaultTargetDataSource(readWriteDataSource);
return routingDataSource;
}
@Bean
public BeanPostProcessor dialectProcessor() {
return new BeanPostProcessor() {
@Override
public Object postProcessBeforeInitialization(Object bean, String beanName) throws BeansException {
if (bean instanceof HibernateJpaVendorAdapter) {
((HibernateJpaVendorAdapter) bean).getJpaDialect().setPrepareConnection(false);
}
return bean;
}
};
}
}
```
**Routing data sources**
```
public class RoutingDataSource extends AbstractRoutingDataSource {
@Override
protected Object determineCurrentLookupKey() {
return DataSourceTypeContextHolder.getTransactionType();
}
@Override
public void setTargetDataSources(Map<Object, Object> targetDataSources) {
super.setTargetDataSources(targetDataSources);
afterPropertiesSet();
}
}
```
**depending on the type of transaction readOnly or not, the datasource is selected**
```
public class DataSourceTypeContextHolder {
private static final ThreadLocal<DataSourceType> contextHolder = new ThreadLocal<>();
public static void setTransactionType(DataSourceType dataSource) {
contextHolder.set(dataSource);
}
public static DataSourceType getTransactionType() {
return contextHolder.get();
}
public static void clearTransactionType() {
contextHolder.remove();
}
}
```
```
@Aspect
@Component
@Slf4j
public class TransactionAspect {
@Before("@annotation(transactional) && execution(* *(..))")
public void setTransactionType(Transactional transactional) {
if (transactional.readOnly()) {
DataSourceTypeContextHolder.setTransactionType(DataSourceType.READ_ONLY);
} else {
DataSourceTypeContextHolder.setTransactionType(DataSourceType.READ_WRITE);
}
}
@AfterReturning("@annotation(transactional) && execution(* *(..))")
public void clearTransactionType(Transactional transactional) {
DataSourceTypeContextHolder.clearTransactionType();
}
}
```
**Error**
```
org.springframework.jdbc.BadSqlGrammarException: PreparedStatementCallback; bad SQL grammar [UPDATE my_table SET lock_until = timezone('utc', CURRENT_TIMESTAMP) + cast(? as interval), locked_at = timezone('utc', CURRENT_TIMESTAMP), locked_by = ? WHERE my_table.name = ? AND my_table.lock_until <= timezone('utc', CURRENT_TIMESTAMP)]; nested exception is org.postgresql.util.PSQLException: ERROR: relation "my_table" does not exist
Position: 8
at org.springframework.jdbc.support.SQLErrorCodeSQLExceptionTranslator.doTranslate(SQLErrorCodeSQLExceptionTranslator.java:235)
at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:72)
at org.springframework.jdbc.core.JdbcTemplate.translateException(JdbcTemplate.java:1443)
at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:633)
at org.springframework.jdbc.core.JdbcTemplate.update(JdbcTemplate.java:862)
at org.springframework.jdbc.core.JdbcTemplate.update(JdbcTemplate.java:883)
at org.springframework.jdbc.core.namedparam.NamedParameterJdbcTemplate.update(NamedParameterJdbcTemplate.java:321)
at org.springframework.jdbc.core.namedparam.NamedParameterJdbcTemplate.update(NamedParameterJdbcTemplate.java:326)
at net.javacrumbs.shedlock.provider.jdbctemplate.JdbcTemplateStorageAccessor.lambda$execute$0(JdbcTemplateStorageAccessor.java:115)
at org.springframework.transaction.support.TransactionTemplate.execute(TransactionTemplate.java:140)
at net.javacrumbs.shedlock.provider.jdbctemplate.JdbcTemplateStorageAccessor.execute(JdbcTemplateStorageAccessor.java:115)
at net.javacrumbs.shedlock.provider.jdbctemplate.JdbcTemplateStorageAccessor.updateRecord(JdbcTemplateStorageAccessor.java:81)
at net.javacrumbs.shedlock.support.StorageBasedLockProvider.doLock(StorageBasedLockProvider.java:91)
at net.javacrumbs.shedlock.support.StorageBasedLockProvider.lock(StorageBasedLockProvider.java:65)
at jdk.internal.reflect.GeneratedMethodAccessor328.invoke(Unknown Source)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:344)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:205)
at com.sun.proxy.$Proxy139.lock(Unknown Source)
at net.javacrumbs.shedlock.core.DefaultLockingTaskExecutor.executeWithLock(DefaultLockingTaskExecutor.java:63)
at net.javacrumbs.shedlock.spring.aop.MethodProxyScheduledLockAdvisor$LockingInterceptor.invoke(MethodProxyScheduledLockAdvisor.java:86)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:186)
at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:747)
at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:689)
at example1.example2.example3.example3.example3.example3.example3.scheduler.ExampleContentSheduler$$EnhancerBySpringCGLIB$$631d68e1.loadData(<generated>)
at jdk.internal.reflect.GeneratedMethodAccessor320.invoke(Unknown Source)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at org.springframework.scheduling.support.ScheduledMethodRunnable.run(ScheduledMethodRunnable.java:84)
at org.springframework.scheduling.support.DelegatingErrorHandlingRunnable.run(DelegatingErrorHandlingRunnable.java:54)
at org.springframework.scheduling.concurrent.ReschedulingRunnable.run(ReschedulingRunnable.java:93)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: org.postgresql.util.PSQLException: ERROR: relation "my_table" does not exist
```
When I change the native query and specify the schema before the table name, the query runs normally:
UPDATE my_schema.my_table SET lock_until = timezone('utc', CURRENT_TIMESTAMP) + cast(? as interval), locked_at = timezone('utc', CURRENT_TIMESTAMP), locked_by = ? WHERE my_schema.my_table.name = ? AND my_schema.my_table.lock_until <= timezone('utc', CURRENT_TIMESTAMP);
|
As stated in the comments, PowerShell is not a great language for async programming: the issue is that `.ShowDialog()` blocks the thread and prevents your events from executing normally. The solution is to register the events in a separate runspace. Below is a minimal example of how this can be accomplished (as minimal as I could make it). I have added a few pointer comments to help you understand the logic, though the code is clearly not easy; as stated before, PowerShell is not a language designed for this, and C# would give you a much easier time.
__Demo:__
[![demo][1]][1]
__Code:__
```powershell
Add-Type -Assembly System.Windows.Forms
[System.Windows.Forms.Application]::EnableVisualStyles()
$form = [System.Windows.Forms.Form]@{
Size = '500, 150'
FormBorderStyle = 'Fixed3d'
}
$btn = [System.Windows.Forms.Button]@{
Name = 'MyButton'
Text = 'Click Me!'
Size = '90, 30'
Location = '370, 70'
Anchor = 'Bottom, Right'
}
$btn.Add_Click({
# disable the button here to allow a single download at a time
$this.Enabled = $false
# hardcoded link here for demo
$downloader.DownloadFileAsync('https://www.7-zip.org/a/7z2301-x64.exe', "$pwd\7Zip.exe")
})
$progress = [System.Windows.Forms.ProgressBar]@{
Name = 'MyProgressBar'
Size = '460, 40'
Location = '10, 10'
}
$form.Controls.AddRange(($btn, $progress))
# create a WebClient instance
$downloader = [System.Net.WebClient]::new()
# create a new runspace where the Download Events will execute
# this new runspace will have the PSHost hooked so that things like
# `Write-Host`, `Out-Host`, `Write-Warning`, etc. goes straight to the console
# its easier for troubleshooting but not mandatory
$rs = [runspacefactory]::CreateRunspace($Host)
$rs.Open()
# add the `$form` instance to the runspace scope
$rs.SessionStateProxy.PSVariable.Set([psvariable]::new('form', $form))
# the code that will initiate the events in the new runspace
$ps = [powershell]::Create().AddScript({
$registerObjectEventSplat = @{
InputObject = $args[0] # $args[0] = $downloader
EventName = 'DownloadProgressChanged'
SourceIdentifier = 'WebMainDownloadProgressChanged'
Action = {
$progress = $form.Controls.Find('MyProgressBar', $false)[0]
# for demo, in addition to increment the progress bar,
# show the percentage to the console
$eventArgs.ProgressPercentage | Out-Host
# increment the progress bar
$progress.Value = $eventArgs.ProgressPercentage
}
}
Register-ObjectEvent @registerObjectEventSplat
$registerObjectEventSplat['EventName'] = 'DownloadFileCompleted'
$registerObjectEventSplat['SourceIdentifier'] = 'WebMainDownloadFileCompleted'
$registerObjectEventSplat['Action'] = {
[System.Threading.Monitor]::Enter($form)
# when the download is completed, enable the button
$form.Controls.Find('MyButton', $false)[0].Enabled = $true
# and show this to the console
Write-Host 'Download Completed!'
[System.Threading.Monitor]::Exit($form)
}
Register-ObjectEvent @registerObjectEventSplat
}).AddArgument($downloader)
$ps.Runspace = $rs
$task = $ps.BeginInvoke()
$form.ShowDialog()
```
[1]: https://i.stack.imgur.com/RYIzj.gif |
How should I back up an SQLite database with EF Core? |
|sqlite|entity-framework-core| |
|typescript|mustache|monaco-editor| |
I have Airflow and Spark on different hosts.
I am trying to submit a job, but I get
`{standard_task_runner.py:107} ERROR - Failed to execute job 223 for task spark_job (Cannot execute: spark-submit --master spark://spark-host --proxy-user hdfs --name arrow-spark --queue default --deploy-mode client /opt/airflow/dags/plugins/plug_code.py. Error code is: 1.; 27605)`
I also noticed this error in the logs:
` Spark-Submit cmd: spark-submit --master spark://spark-host --name arrow-spark --queue default --deploy-mode client /opt/airflow/dags/plugins/plug_code.py`
I tried setting the proxy user, but it made no difference. |
Apache Airflow sparksubmit |
|apache-spark|pyspark|airflow|spark-submit| |
null |
Hello!
As a starter here, make sure you take the [tour] and read [ask]. Then, please write questions in separate sentences. Your question above contains a lot of text, but there is no structuring period "." between the different statements, which makes it hard to read and understand.
Now, concerning your issue, search for the error message ("PHP Warning: PHP Startup: Unable to load dynamic library 'imagick'") online. You will find a bunch of similar questions here and on other sites as well.
Lastly, I would have asked this as a proper question. This is the Discussions section, which is rather for things that can't be answered precisely and which are thus off-topic as questions.
|
null |
Which approach to using `HttpClient` in .NET Framework 4.7 is better among the following two,
in terms of socket exhaustion and thread safety?
Both reuse the `HttpClient` instance, right?
I can't implement `IHttpClientFactory` right now.
**First Code : -**
```
class Program
{
private static HttpClient httpClient;
static void Main(string[] args)
{
httpClient = new HttpClient();
httpClient.BaseAddress = new Uri("your base url");
// add any default headers here also
httpClient.Timeout = new TimeSpan(0, 2, 0); // 2 minute timeout
MainAsync().Wait();
}
static async Task MainAsync()
{
await new MyClass(httpClient).StartAsync();
}
}
public class MyClass
{
private Dictionary<string, string> myDictionary;
private readonly HttpClient httpClient;
public MyClass(HttpClient httpClient)
{
this.httpClient = httpClient;
}
public async Task StartAsync()
{
myDictionary = await UploadAsync("some file path");
}
public async Task<Dictionary<string, string>> UploadAsync(string filePath)
{
byte[] fileContents;
using (FileStream stream = File.Open(filePath, FileMode.Open))
{
fileContents = new byte[stream.Length];
await stream.ReadAsync(fileContents, 0, (int)stream.Length);
}
HttpRequestMessage requestMessage = new HttpRequestMessage();
// your request stuff here
HttpResponseMessage httpResponse = await httpClient.SendAsync(requestMessage, HttpCompletionOption.ResponseContentRead, CancellationToken.None);
// parse response and return the dictionary
}
}
```
**Second Code:**
```
class Program
{
static void Main(string[] args)
{
MainAsync().Wait();
}
static async Task MainAsync()
{
await new MyClass().StartAsync();
}
}
public class MyClass
{
private Dictionary<string, string> myDictionary;
private static readonly HttpClient httpClient = new HttpClient() { BaseAddress = new Uri("your base url"), Timeout = new TimeSpan(0, 2, 0) };
public async Task StartAsync()
{
myDictionary = await UploadAsync("some file path");
}
public async Task<Dictionary<string, string>> UploadAsync(string filePath)
{
byte[] fileContents;
using (FileStream stream = File.Open(filePath, FileMode.Open))
{
fileContents = new byte[stream.Length];
await stream.ReadAsync(fileContents, 0, (int)stream.Length);
}
HttpRequestMessage requestMessage = new HttpRequestMessage();
// your request stuff here
HttpResponseMessage httpResponse = await httpClient.SendAsync(requestMessage, HttpCompletionOption.ResponseContentRead, CancellationToken.None);
// parse response and return the dictionary
}
}
```
I'm expecting to resolve the socket exhaustion problem. |
I am trying to update from Angular 8 to 9.
I get this:
Using package manager: 'npm'
Collecting installed dependencies...
Found 85 dependencies.
Fetching dependency metadata from registry...
Updating package.json with dependency @angular/material @ "9.2.4" (was "8.2.3")...
Updating package.json with dependency @angular/cdk @ "9.2.4" (was "8.2.3")...
UPDATE package.json (6742 bytes)
⠸ Installing packages...npm ERR! code ERESOLVE
npm ERR! ERESOLVE could not resolve
npm ERR!
npm ERR! While resolving: @agm/core@1.1.0
npm ERR! Found: @angular/common@9.1.13
npm ERR! node_modules/@angular/common
npm ERR! @angular/common@"^9.1.13" from the root project
npm ERR! peer @angular/common@"^9.0.0" from @angular-material-extensions/password-strength@6.0.0
npm ERR! node_modules/@angular-material-extensions/password-strength
npm ERR! @angular-material-extensions/password-strength@"^6.0.0" from the root project
npm ERR! 15 more (@angular/flex-layout, @angular/forms, ...)
npm ERR!
npm ERR! Could not resolve dependency:
npm ERR! peer @angular/common@"^6.0.0 || ^7.0.0 || ^8.0.0" from @agm/core@1.1.0
npm ERR! node_modules/@agm/core
npm ERR! @agm/core@"^1.0.0" from the root project
npm ERR! peer @agm/core@"^1.0.0-beta.7" from @agm/js-marker-clusterer@1.1.0
npm ERR! node_modules/@agm/js-marker-clusterer
npm ERR! @agm/js-marker-clusterer@"^1.1.0" from the root project
npm ERR!
npm ERR! Conflicting peer dependency: @angular/common@8.2.14
npm ERR! node_modules/@angular/common
npm ERR! peer @angular/common@"^6.0.0 || ^7.0.0 || ^8.0.0" from @agm/core@1.1.0
npm ERR! node_modules/@agm/core
npm ERR! @agm/core@"^1.0.0" from the root project
npm ERR! peer @agm/core@"^1.0.0-beta.7" from @agm/js-marker-clusterer@1.1.0
npm ERR! node_modules/@agm/js-marker-clusterer
npm ERR! @agm/js-marker-clusterer@"^1.1.0" from the root project
npm ERR!
npm ERR! Fix the upstream dependency conflict, or retry
npm ERR! this command with --force or --legacy-peer-deps
npm ERR! to accept an incorrect (and potentially broken) dependency resolution.
npm ERR!
npm ERR!
When I run
NG_DISABLE_VERSION_CHECK=1 npx @angular/cli@9 update @angular/material@9
I get this:
Using package manager: 'npm'
Collecting installed dependencies...
Found 85 dependencies.
Fetching dependency metadata from registry...
Updating package.json with dependency @angular/material @ "9.2.4" (was "8.2.3")...
Updating package.json with dependency @angular/cdk @ "9.2.4" (was "8.2.3")...
UPDATE package.json (6742 bytes)
⠸ Installing packages...npm ERR! code ERESOLVE
npm ERR! ERESOLVE could not resolve
npm ERR!
npm ERR! While resolving: @agm/core@1.1.0
npm ERR! Found: @angular/common@9.1.13
npm ERR! node_modules/@angular/common
npm ERR! @angular/common@"^9.1.13" from the root project
npm ERR! peer @angular/common@"^9.0.0" from @angular-material-extensions/password-strength@6.0.0
npm ERR! node_modules/@angular-material-extensions/password-strength
npm ERR! @angular-material-extensions/password-strength@"^6.0.0" from the root project
npm ERR! 15 more (@angular/flex-layout, @angular/forms, ...)
npm ERR!
npm ERR! Could not resolve dependency:
npm ERR! peer @angular/common@"^6.0.0 || ^7.0.0 || ^8.0.0" from @agm/core@1.1.0
npm ERR! node_modules/@agm/core
npm ERR! @agm/core@"^1.0.0" from the root project
npm ERR! peer @agm/core@"^1.0.0-beta.7" from @agm/js-marker-clusterer@1.1.0
npm ERR! node_modules/@agm/js-marker-clusterer
npm ERR! @agm/js-marker-clusterer@"^1.1.0" from the root project
npm ERR!
npm ERR! Conflicting peer dependency: @angular/common@8.2.14
npm ERR! node_modules/@angular/common
npm ERR! peer @angular/common@"^6.0.0 || ^7.0.0 || ^8.0.0" from @agm/core@1.1.0
npm ERR! node_modules/@agm/core
npm ERR! @agm/core@"^1.0.0" from the root project
npm ERR! peer @agm/core@"^1.0.0-beta.7" from @agm/js-marker-clusterer@1.1.0
npm ERR! node_modules/@agm/js-marker-clusterer
npm ERR! @agm/js-marker-clusterer@"^1.1.0" from the root project
npm ERR!
npm ERR! Fix the upstream dependency conflict, or retry
npm ERR! this command with --force or --legacy-peer-deps
npm ERR! to accept an incorrect (and potentially broken) dependency resolution.
npm ERR!
npm ERR! |
Update Angular 8 to 9 |
|angular|angular8|angular9|agm-core| |
null |
const { EventEmitter } = require('events');
const { execFile } = require('child_process');

class ItemGetter extends EventEmitter {
constructor () {
super();
this.on('item', item => this.handleItem(item));
}
handleItem (item) {
console.log('Receiving Data: ' + item);
}
getAllItems () {
for (let i = 0; i < 15; i++) {
this
.getItem(i)
.then(item => this.emit('item', item))
.catch(console.error);
}
console.log('=== Loop ended ===')
}
async getItem (item = '') {
console.log('Getting data:', item);
return new Promise((resolve, reject) => {
// execFile accepts an argument array; plain exec does not
execFile('echo', [String(item)], (error, stdout, stderr) => {
if (error) {
reject(error); // throwing here would not reject the promise
return;
}
resolve(item);
});
});
}
}
(new ItemGetter()).getAllItems()
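As a side note, here is a minimal, self-contained demo (separate from the class above) of the ordering at play: promise callbacks are queued as microtasks and only run after the current synchronous code, so the loop always finishes first:

```javascript
const order = [];

for (let i = 0; i < 3; i++) {
  // already-resolved promises still defer their .then callbacks
  Promise.resolve(i).then(item => order.push(`item ${item}`));
}
order.push('loop ended');

// runs after the three .then callbacks above have been drained
queueMicrotask(() => console.log(order.join(', ')));
// prints: loop ended, item 0, item 1, item 2
```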
Your logic first runs the loop, calling every `getItem`, then prints '=== Loop ended ===', and only after that do the promise resolutions run. So, if you want the result of each `getItem` call handled independently of the others, just don't abuse asynchronous logic; frequently the right solution is much simpler than it seems ;)
Note: with this solution you will get the same output, because the loop calling `getItem` runs faster than the promises backed by `execFile`, but each item will be handled exactly after its own promise resolves, instead of waiting for all promises to resolve. |
I'm having the same issue. I'm reading that to get the permissions, the call has to include a `projectId`:
token = await Notifications.getExpoPushTokenAsync({
projectId: Constants.expoConfig.extra.eas.projectId,
});
but TypeScript doesn't find `expoConfig` in `Constants`:
[{
"resource": "/Users/user/code/name_of_app/app/navigaton/AppNavigator.tsx",
"owner": "typescript",
"code": "2339",
"severity": 8,
"message": "Property 'expoConfig' does not exist on type 'typeof import(\"/Users/user/code/name_of_app/node_modules/expo-constants/build/Constants\")'.",
"source": "ts",
"startLineNumber": 57,
"startColumn": 28,
"endLineNumber": 57,
"endColumn": 38
}]
I tried hardcoding the `projectId`; it's not working either.
This is the docs page:
[https://docs.expo.dev/push-notifications/push-notifications-setup/][1]
[1]: https://docs.expo.dev/push-notifications/push-notifications-setup/
**Update:** solved. The types issue was fixed by using `import Constants from "expo-constants"` instead of
`import * as Constants from "expo-config"` |
You made a mistake when importing `Query`:
import { Query, UseGuards } from '@nestjs/common'
You need to import it from `@nestjs/graphql`:
import { Query } from '@nestjs/graphql';
[Here][1] is a file from **nestjs**' sample repository that imports `Query`.
[1]: https://github.com/nestjs/nest/blob/master/sample/23-graphql-code-first/src/recipes/recipes.resolver.ts |
I have a `TfidfVectorizer` that was fitted on English text data to predict the sentiment of English calls. The task is to adapt this to Spanish. I want to keep the weights of this vectorizer and translate the feature names from English to Spanish, e.g. "thank you" becomes "gracias", while reusing the old weights. Essentially, I want the same TF-IDF vectorizer but with modified feature names. Can someone suggest a way to do that in Python?
A code example with the solution would be appreciated. |
In your code, the way you pass context doesn't make sense, especially `'events': Event.objects.filter(Event_ID=pk), 'event': event`.
The Django ORM `get()` method always expects exactly one result; if there is more than one, it raises an error. So I'm not sure what your idea is with `events` and `event` here. If you expect more than one row with the same event ID, use `filter()`; otherwise use `get()`.
**Try it like this:**
def single_event(request, pk):
try:
event = Event.objects.get(Event_ID=pk) #Should get only 1 results according to your model
submissions = Submission.objects.filter(event=event).select_related('participant')
submitted_details = {submission.participant.id: submission.details for submission in submissions}
except Event.DoesNotExist:
# When the event does not exist
return render(request, 'events/single-event.html', {"success": False}, status=404)
context = {
'event': event,
'submitted_details': submitted_details,
}
return render(request, 'events/single-event.html', context)
**In your template file, adjust like below** (Django templates cannot index a dict with a variable key, so a small custom `get_item` filter registered in your template tags is needed for the lookup):
{% for user in event.Event_participants.all %}
<tr>
<td>{{ user.profile.title }}</td>
<td>{{ user.first_name }} {{ user.last_name }}</td>
<td>{{ submitted_details|get_item:user.id|default:"No details submitted." }}</td>
</tr>
{% endfor %}
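Since Django templates cannot index a dictionary with a variable key, the lookup needs a small custom template filter. Here is a minimal sketch; the filter name `get_item` and the fallback text are my assumptions, and the Django registration lines are shown as comments so the snippet stays self-contained:

```python
# templatetags/event_extras.py  (hypothetical module name)
# from django import template
# register = template.Library()

# @register.filter(name='get_item')
def get_item(dictionary, key):
    """Return dictionary[key], or a placeholder when the key is missing."""
    return dictionary.get(key, "No details submitted.")

print(get_item({1: "Some details"}, 1))  # Some details
print(get_item({1: "Some details"}, 2))  # No details submitted.
```

After loading the tag library in the template, the cell would be written as `{{ submitted_details|get_item:user.id }}`.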
|
|html|schema.org|json-ld| |
if (tid < n)
{
    gain = in_degree[neigh]*out_degree[tid] + out_degree[neighbour]*in_degree[tid]/total_weight;
    // here let's say node 0 moves to 2
    atomicAdd(&in_degree[2], in_degree[0]);   // because node 0 is in node 2 now
    atomicAdd(&out_degree[2], out_degree[0]); // because node 0 is in node 2 now
}

This is the process. The problem: during the calculation of `gain`, all threads should see the updated values of node 2 (its old values plus the values of node 0), but the threads only see the previous values of node 2. How can I solve that? Here is the output:

    node is: 0
    node is: 1
    node is: 2
    node is: 3
    node is: 4
    node is: 5
    updated_node is: 0            // this should be 2
    updated_values are: 48, 37.9  // this should be (48+15 (values of 2)) and (37+12 (values of 0))

I have tried using `__syncthreads()`, `__threadfence()` and shared memory for reading/writing values. Can anyone tell me what the issue could be? |
You can switch to the legacy checkout page.
Go to Pages -> Checkout -> Edit.
Remove the Checkout block and replace it with the shortcode `[woocommerce_checkout]`. |
I am trying to create a directory and a file inside it in Java (Spring). It works fine when I execute it on Windows, but after deploying the WAR file on WildFly on Linux it throws an error:
**code snippet:**
```
String filePath = "reports/";
File f = new File(filePath);
if (!f.exists()) {
boolean b=f.mkdir();
if(b) {
File xlFile = new File(filePath+ File.separator + "report.xlsx");
}
}
```
Error after deploying it on wildfly that is runnig on Ubuntu:
```
{
"data": "java.io.FileNotFoundException: reports/Attendance.xlsx (No such file or directory)",
"message": "Error while writing file",
"status": 400,
"isSuccess": true
}
```
Please guide.
I tried many ways to do this but nothing works. |
Reading jsonb from Hibernate 6 native query |
|java|hibernate|jsonb|hibernate-native-query| |
It depends on the device you're using. Try adding both `click` and `touchstart` event listeners.
In your case:
    <div onclick="toggleForm()" ontouchstart="toggleForm()" id="toggleButton">
        <h3>TITLE</h3>
    </div> |
I want to send push and pull events of images to Kafka using the Azure ACR Webhook function.
'Event Hub' is not available due to cost.
The webhook is set to be thrown to the Kafka Rest Proxy URI with the header 'Content-Type: application/vnd.kafka.json.v2+json'.
When an event occurs
Headers:
'User-Agent: AzureContainerRegistry/1.0
traceparent: 00-1318068fc8e0ca8904b19865b65d551b-d98ef2405fa798bf-01
Content-Type: application/vnd.kafka.json.v2+json; charset=utf-8
Content-Length: 104'
Payload:
'
{
"id": "a00c973c-a717-425d-9db6-874a6288728c",
"timestamp": "2024-03-13T06:56:49.2277133Z",
"action": "ping"
}
'
The message is sent, but the response is `{"error_code":422,"message":"Unrecognized field: id"}`.
This seems to be because the Kafka REST Proxy expects the payload to be wrapped in `records`/`value` keys, as follows:
{ "records":[{"value": { *ACTUAL DATA* }}]}
In this environment, where the payload format of the webhook message cannot be edited, is there a way to adapt the message to the expected format before it reaches Kafka, or a more efficient REST-based component that can receive such a simple message and forward it to Kafka?
Please let me know if there is a configuration that allows sending messages to Kafka using an Azure webhook without Event Hub.
Thank you. |
PostgreSQL doesn't respond to `select * from claims where case_number='22222'`, but it responds to `select * from claims`. I don't understand the reason; can anyone help, please?
I have restarted PostgreSQL, and it works normally for 5-10 minutes afterwards.
PostgreSQL version 12 |
One option is with [pivot_longer](https://pyjanitor-devs.github.io/pyjanitor/api/functions/#janitor.functions.pivot.pivot_longer), where you pass the new header names to `names_to` and a list of regexes to `names_pattern`:
```py
# pip install pyjanitor
import pandas as pd
import janitor
df.pivot_longer(index=ilist,names_to=stublist,names_pattern=stublist)
g1 g2 g3 st1 st2 ft
0 1 2 3 1 2 1
1 1 2 3 1 2 0
2 1 2 3 1 2 1
3 1 2 3 1 2 1
4 1 2 3 1 2 1
5 1 2 3 1 2 0
6 1 2 3 1 2 0
7 1 2 3 1 2 1
```
Another option is a reshaping of the columns, followed by pd.stack:
```py
reshaped = df.set_index(ilist)
reshaped.columns = reshaped.columns.str.split('_',expand=True).set_names([None,'drop'])
reshaped.stack(level='drop').droplevel('drop').reset_index()
g1 g2 g3 st1 st2 ft
0 1 2 3 1 2 1
1 1 2 3 1 2 0
2 1 2 3 1 2 1
3 1 2 3 1 2 1
4 1 2 3 1 2 1
5 1 2 3 1 2 0
6 1 2 3 1 2 0
7 1 2 3 1 2 1
```
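To make the stack-based option above reproducible, here is a self-contained run on a tiny frame; the wide column names and the `ilist` values below are my own assumptions mimicking the question's layout, not the asker's data:

```python
import pandas as pd

# assumed wide layout: id columns plus per-group suffixed stub columns
df = pd.DataFrame({
    'g1': [1, 1], 'g2': [2, 2], 'g3': [3, 3],
    'st1_0': [1, 1], 'st2_0': [2, 2], 'ft_0': [1, 0],
    'st1_1': [1, 1], 'st2_1': [2, 2], 'ft_1': [1, 1],
})
ilist = ['g1', 'g2', 'g3']

reshaped = df.set_index(ilist)
# split 'st1_0' -> ('st1', '0'); the suffix level is named 'drop'
reshaped.columns = reshaped.columns.str.split('_', expand=True).set_names([None, 'drop'])
out = reshaped.stack(level='drop').droplevel('drop').reset_index()
print(out)  # 4 rows with columns g1, g2, g3 plus ft, st1, st2
```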
The `names_pattern` relies on regexes; under the hood, `pd.Series.str.contains` and `np.select` are used to extract and pair the columns with the regexes. As such, the regexes have to be crafted properly to match the columns:
```py
# pip install pyjanitor
import pandas as pd
import janitor
# note the inclusion of digits within the regexes
names_pattern = [r'st1_\d+',r'st1_Next',r'st2_\d+', 'ft']
df.pivot_longer(index=ilist,names_to=stublist,names_pattern=names_pattern)
g1 g2 g3 st1 st1_Next st2 ft
0 1 2 3 1 8 2 1
1 1 2 3 1 8 2 0
2 1 2 3 1 8 2 1
3 1 2 3 1 8 2 1
4 1 2 3 5 9 2 1
5 1 2 3 5 9 2 0
6 1 2 3 5 9 2 0
7 1 2 3 5 9 2 1
``` |
How to convert a timestamp from libgpiod to epoch date and time? |
|c++|epoch|c++-chrono|libgpiod| |
I keep getting an error when I try to create a stored function that returns the average Netflix IMDb score across three tables. The code works on its own but not as a stored function. The answer comes back as 7.9999, so I wanted to reduce the decimal output to only two decimals, such as 7.9.
```
CREATE FUNCTION Average_Netflix_Score (
Score INT
)
RETURNS DECIMAL (2, 2)
DETERMINISTIC
BEGIN
DECLARE average_score DECIMAL (2, 2);
SELECT AVG(SCORE) INTO average_score
FROM (
SELECT SCORE FROM NETFLIX_19
UNION ALL
SELECT SCORE FROM NETFLIX_20
UNION ALL
SELECT SCORE FROM NETFLIX_21
) AS CombinedScores;
RETURN avg_score;
END;
```
Expected stored function to save with a value of 7.9 as outcome. But got syntax error code 1067. |
Creating a stored function on SQL of average scores, error code 1064 |
|sql-server|function|stored-procedures|syntax| |
null |
The business is on BPC 11.1 and uses Analysis for Office (AFO) in Excel, but they are facing an issue: when business users try to execute the macro, their system is not able to read it. This appears to be because their Office language is French (or perhaps Spanish or Italian).
The macro, written in English, should work for French users just as it worked for Japanese users. On the French users' systems, the macro logic in English is not readable, whereas on the Japanese systems the same English macro could be read.
The logic that is not readable for French users is:
Result = Application.Run("SAPGetProperty", "IsConnected", "DS_1") |
Macro written in English working for Japan users but not for French users |
|excel|vba|macros|french| |
null |
We are using the Menu component to show a navigation menu at the top. It works fine; however, for a menu item the clickable area is restricted to where the text is. If we click on the remaining area, the menu just closes. I inspected the element and found that the hyperlink is only applied to the text.
Any suggestions?
Here is the component:
```
return (
<>
<Menu trigger="hover" className={`${classes.menuItem} ${classes.navmenu}`} position='bottom' offset='8' withArrow>
{hasDropdown && (
<Menu.Dropdown key={'Dd_' + level}>
{dropdownItems.map(({ title, href, dropdownItems, index, isOffice }) => (
<React.Fragment key={index}>
{Boolean(dropdownItems) && (
<NavSubItem
key={'NV+' + level + '_' + index}
title={title}
/>
)}
{!Boolean(dropdownItems) && !Boolean(isOffice) && (
<Menu.Item
key={level + '_' + index}
className={classes.menuItem} // Apply custom CSS class to MenuItem
>
{title}
</Menu.Item>
)}
</React.Fragment>
))}
</Menu.Dropdown>
)}
</Menu>
</>
);
```
here is how it is being called: (navDisplay is an array of items)
```
<div className="col-md-auto ml-md-auto order-1 order-md-3" style={{alignContent:'center'}}>
<Nav>
{navDisplay.map(({ title, dropdownItems, index }) => (
<NavItem
key={title + index}
title={title}
hasDropdown={Boolean(dropdownItems)}
dropdownItems={dropdownItems}
setCounty={setCounty}
county={county}
level={0}
/>
))}
</Nav>
</div>
```
here is what is in the array:
[![enter image description here][1]][1]
[![enter image description here][2]][2]
For example, if we click on the space near the green dot, the menu closes. The hyperlink only works if we click on the blue-line area.
See the image below showing what we get when we inspect the element: as we can see, the button element doesn't have any click event, and the hyperlink element (`a`) sits well below it.
[![enter image description here][3]][3]
[1]: https://i.stack.imgur.com/rmi7i.png
[2]: https://i.stack.imgur.com/KtlYT.png
[3]: https://i.stack.imgur.com/qMiyu.png |
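One approach worth trying (a sketch based on Mantine's polymorphic `component` prop; `href` here is the same value mapped from `dropdownItems`): render the `Menu.Item` itself as the anchor, so the whole item area carries the link instead of just the text:

```
<Menu.Item
  key={level + '_' + index}
  component="a"
  href={href}
  className={classes.menuItem}
>
  {title}
</Menu.Item>
```

If the Mantine version in use does not support the polymorphic prop on `Menu.Item`, a CSS fallback is to make the inner anchor `display: block; width: 100%` so it fills the item.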
Can not create file in specific directory inside the project (war file) Java deployed on wildfly on ubuntu(The system cannot find the path specified) |
|java|spring|spring-boot| |
null |
```
if (tid < n)
{
    gain = in_degree[neigh] * out_degree[tid] + out_degree[neigh] * in_degree[tid] / total_weight;

    // here, say node 0 moves to node 2
    atomicAdd(&in_degree[2],  in_degree[0]);  // because node 0 is in node 2 now
    atomicAdd(&out_degree[2], out_degree[0]); // because node 0 is in node 2 now
}
```
This is the process. In this problem, during the calculation of gain, all threads should see the updated value of node 2 (the values of 2 plus the values of 0), but threads only see the previous value of 2.
How do I solve that? Here is the output:
node is: 0
node is: 1
node is: 2
node is: 3
node is: 4
node is: 5
updated_node is: 0 // this should be 2
updated_values are: 48,37. // this should be (48+15 (values of 2)) + (37+12 (values of 0))
I have tried using `__syncthreads()`, `__threadfence()` and shared memory for reading/writing values. Can anyone tell me what the issue could be? |
In VS Code, how can you automatically add const when you press (<kbd>Ctrl</kbd> + <kbd>S</kbd>) ?
|
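One common approach (assuming the Dart/Flutter extension is installed; treat the exact setting value as an assumption for your VS Code version) is to run fix-all code actions on save in `settings.json`:

```json
{
  "editor.codeActionsOnSave": {
    "source.fixAll": true
  }
}
```

With lint rules such as `prefer_const_constructors` enabled in `analysis_options.yaml`, saving then inserts the missing `const` keywords automatically.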
How can I auto add 'const' in Flutter in VS Code? |
{"OriginalQuestionIds":[12470665],"Voters":[{"Id":550094,"DisplayName":"Thierry Lathuille"},{"Id":794749,"DisplayName":"gre_gor"},{"Id":523612,"DisplayName":"Karl Knechtel","BindingReason":{"GoldTagBadge":"python"}}]} |
If you want to follow the link, then you need to use `driver.get(url)`. |
If your use case fits the description of a 'variant' or a 'mod' of the original app then you can avoid the hassle of renaming your packages, and can leave alone the `namespace` _and_ the `applicationId`, and instead define the **`applicationIdSuffix`** in your `build.gradle`, e.g.:
```groovy
defaultConfig {
namespace "com.original.app"
applicationId "com.original.app"
versionCode 42
versionName "0.42"
applicationIdSuffix '.insidersBuild'
//                  ^^^^^^^^^^^^^^^^
}
```
*Keep in mind*: If you don’t explicitly define the `applicationId` in your `build.gradle`'s `defaultConfig`, then the value is inferred from the package attribute in your `AndroidManifest.xml`.
<sub>*See also:*</sub>
- https://developer.android.com/build/build-variants#groovy |
MySQL 8.3.0
UnableToConnectException: Unsupported protocol version: 11. Likely connecting to an X Protocol port.
You are trying to connect to port 33060, whereas you should connect using port 3306. |
How to modify features of tfidfvectorizer from English to Spanish |
|python|machine-learning|nlp|translation|tfidfvectorizer| |
null |
**EDITED FOLLOWING COMMENTS**
Once you have applied your `meshgrid` commands, the layout of your `z_values` corresponds to the following (print `x_grid`, `y_grid` and `z_values` out if you want to see):

```
x ------->
y
|     [dependent-]
|     [variable  ]
\|/   [ array    ]
```
Your poly-fitted figure corresponds to this, but your heat-map doesn't. I think that's because you **transposed z** in the heatmap plot but you **didn't transpose z** in the polyfit.
Whatever you do, you need to use the **SAME dependent-variable array** in both figures.
So, the obvious answer, if you want x to vary with row rather than column is to **transpose z_values before doing either plot**.
Full code below. Note that `z_values` is transposed very early on, and NOT in the line

```
contour1 = ax1.contourf(x_grid, y_grid, np.log(z_values.T + 1))
```

which has been changed to

```
contour1 = ax1.contourf(x_grid, y_grid, np.log(z_values + 1))
```
The peaks of the diagrams now correspond. They are at high-x/low-y.
```
import matplotlib.pyplot as plt
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

x_values = np.linspace(1, 8, 8)    #### This means each row of the dependent variable will need 8 columns
y_values = np.linspace(1, 10, 10)  #### This means that the dependent variable will require 10 rows

#### As defined, your dependent variable is 8 rows, 10 columns ....
z_values = [[0.0128, 0.0029, 0.0009, 0.0006, 0.0000, 0.0000, 0.0000, 0.0000, 0.0000, 0.0000],
            [0.0049, 0.0157, 0.0067, 0.0003, 0.0000, 0.0000, 0.0000, 0.0000, 0.0000, 0.0000],
            [0.0203, 0.0096, 0.0055, 0.0096, 0.0012, 0.0023, 0.0000, 0.0000, 0.0000, 0.0000],
            [0.0229, 0.0191, 0.0020, 0.0073, 0.0055, 0.0026, 0.0022, 0.0000, 0.0000, 0.0000],
            [0.0218, 0.0357, 0.0035, 0.0133, 0.0073, 0.0145, 0.0000, 0.0029, 0.0000, 0.0000],
            [0.0261, 0.0232, 0.0365, 0.0200, 0.0212, 0.0107, 0.0036, 0.0007, 0.0022, 0.0007],
            [0.0305, 0.0244, 0.0284, 0.0786, 0.0226, 0.0160, 0.0000, 0.0196, 0.0007, 0.0007],
            [0.0171, 0.0189, 0.0598, 0.0215, 0.0218, 0.0464, 0.0399, 0.0051, 0.0000, 0.0000]]

#### ... so you need to TRANSPOSE IT HERE
z_values = np.array(z_values).T

x_grid, y_grid = np.meshgrid(x_values, y_values)

fig = plt.figure(figsize=(12, 5))
ax1 = fig.add_subplot(121)
contour1 = ax1.contourf(x_grid, y_grid, np.log(z_values + 1))
fig.colorbar(contour1, ax=ax1)
ax1.set_xlabel('x values')
ax1.set_ylabel('y values')

x_flat = x_grid.flatten()
y_flat = y_grid.flatten()
z_flat = z_values.flatten()

degree = 4
poly_features = PolynomialFeatures(degree=degree)
X_poly = poly_features.fit_transform(np.column_stack((x_flat, y_flat)))
model = LinearRegression()
model.fit(X_poly, z_flat)
z_pred = model.predict(X_poly)
z_pred_grid = z_pred.reshape(x_grid.shape)

ax2 = fig.add_subplot(122, projection='3d')
ax2.plot_surface(x_grid, y_grid, np.log(z_pred_grid + 1), cmap='viridis')
ax2.set_xlabel('x values')
ax2.set_ylabel('y values')
ax2.set_zlabel('z values')
plt.show()
```
Output:
[![enter image description here][1]][1]
[1]: https://i.stack.imgur.com/motaj.png |
PartialType is not working for my UpdateDto: it is not making my nested object's variables optional in the update DTO. The main thing is that it only fails for boolean values; boolean properties are not being made optional in UpdateDto.
Here is the create DTO:
```
class PoolDTO {
@IsBoolean()
present: boolean;
@IsOptional()
@IsString()
fence: string;
@IsOptional()
@IsString({ each: true })
location: string;
}
class UtilitiesDTO {
@IsBoolean()
sprinklerSystem: boolean;
@IsBoolean()
sumpPump: boolean;
@IsBoolean()
sumpPit: boolean;
@ValidateNested()
@Type(() => PoolDTO)
pool: PoolDTO;
@IsBoolean()
spaHot: boolean;
@IsBoolean()
water: boolean;
@IsBoolean()
gas: boolean;
@IsBoolean()
electric: boolean;
@IsOptional()
@IsString()
comment: string;
}
export class CreateDto{
@IsString()
@IsNotEmpty()
number: string;
@ValidateNested()
@Type(() => PoolDTO)
pool: PoolDTO;
@ValidateNested({ each: true })
@Type(() => UtilitiesDTO)
utilities: UtilitiesDTO;
}
```
And here is the update DTO:
```
import { PartialType } from '@nestjs/mapped-types';
import { CreateDto } from './create.dto';
export class UpdateDto extends PartialType(CreateDto) {}
```
It gives me this validation error:
```
"messages": [
"utilities.sprinklerSystem must be a boolean value",
"utilities.sumpPump must be a boolean value",
"utilities.sumpPit must be a boolean value",
"utilities.water must be a boolean value",
"utilities.gas must be a boolean value",
"utilities.electric must be a boolean value"
]
```
But I used PartialType, which is documented as follows: "The PartialType() function returns a type (class) with all the properties of the input type set to optional."
**Still, I'm getting the error. I don't want to hand-write UpdateDto; I want to reuse CreateDto for UpdateDto.** |
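For context, a possible workaround (a hedged sketch; `UpdateUtilitiesDTO` is a name introduced here, and it assumes `UtilitiesDTO` is exported from create.dto): `PartialType` only makes the top-level properties of `CreateDto` optional, while the nested `UtilitiesDTO` keeps its required `@IsBoolean()` validators, so as soon as `utilities` appears in the payload, every boolean inside it is still required. Declaring a nested partial explicitly avoids that:

```
import { PartialType } from '@nestjs/mapped-types';
import { IsOptional, ValidateNested } from 'class-validator';
import { Type } from 'class-transformer';
import { CreateDto, UtilitiesDTO } from './create.dto';

class UpdateUtilitiesDTO extends PartialType(UtilitiesDTO) {}

export class UpdateDto extends PartialType(CreateDto) {
  @IsOptional()
  @ValidateNested()
  @Type(() => UpdateUtilitiesDTO)
  utilities?: UpdateUtilitiesDTO;
}
```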
i got an error for PartialType is not working for updateDto |
|javascript|node.js|typescript|nestjs|typeorm| |
{"OriginalQuestionIds":[14512562],"Voters":[{"Id":4108803,"DisplayName":"blackgreen"}]} |
While learning and playing around with asyncio and aiohttp, I faced a strange issue and was wondering if this is an occurrence limited to VS Code only.
This one keeps leaving residues from previous loops:
```
while True:
loop.run_until_complete(get_data())
```
[](https://i.stack.imgur.com/OiuBe.png)
This runs without any residual tasks.
```
while True:
asyncio.run(get_data())
```
[](https://i.stack.imgur.com/Ju2RG.png)
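For what it's worth, the residue can be reproduced without aiohttp at all (a minimal sketch; the long sleep stands in for whatever background work the coroutine spawns):

```python
import asyncio

async def spawn_background():
    # start a task that outlives this coroutine
    asyncio.create_task(asyncio.sleep(3600))

loop = asyncio.new_event_loop()
loop.run_until_complete(spawn_background())

# run_until_complete() stops the loop but does not close it, so the
# still-pending sleep task lingers as a "residue" in the call stack
leftover = asyncio.all_tasks(loop)
print(len(leftover))  # 1

# asyncio.run(), by contrast, cancels pending tasks and closes its
# loop on every call; doing the same cleanup by hand:
for t in leftover:
    t.cancel()
loop.run_until_complete(asyncio.gather(*leftover, return_exceptions=True))
loop.close()
```

That cleanup is exactly what `asyncio.run()` performs for you, which would explain why the second variant shows no residual tasks.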
```
session = aiohttp.ClientSession()
async def fetch(url, params=None):
async with semaphore:
async with session.get(url, params=params) as response:
return await response.json()
```
I tried moving the while loop inside the called method itself, but got the same result. I also tried using semaphores, with no better luck.
I was trying to manage multiple REST API call methods using aiohttp from within a single loop, but with this happening, I guess I am stuck with having to use asyncio.run separately for each API-related method. So the only way right now is to keep closing and reopening separate loops.
Do I even need to worry about the forever-created tasks? I read that these are just residues from the previous loops and the impact is negligible even as they constantly stack up. Could this possibly be a VS Code related issue? |
I am trying to run a nodejs based Docker container on a k8s cluster.
The code refuses to run, and I continuously get these errors:
`Navigation frame was detached`
`Requesting main frame too early`
I cut down to the minimal code that should do some work:
```
const puppeteer = require('puppeteer');
const os = require('os');
function delay(time) {
return new Promise(resolve => setTimeout(resolve, time));
}
const platform = os.platform();
(async () => {
console.log('started')
let browserConfig = {}
if (platform === 'linux') {
browserConfig = {
executablePath: '/usr/bin/google-chrome',
headless: true,
args:['--no-sandbox','--disable-web-security','--disable-features=IsolateOrigins,site-per-process','--disable-gpu', '--disable-dev-shm-usage']
}
}
else {
browserConfig = {
headless: true,
args:['--no-sandbox','--disable-web-security','--disable-features=IsolateOrigins,site-per-process','--disable-gpu', '--disable-dev-shm-usage']
}
}
console.log('create browser')
const browser = await puppeteer.launch(browserConfig)
console.log('create page')
const page = await browser.newPage()
await page.setViewport({ width: 1920, height: 926 })
console.log('browsing to url')
await page.goto("https://www.example.com", {
waitUntil: 'load',
timeout: 3000000
})
console.log('waiting')
await delay(5000)
console.log('get content')
const s = await page.content();
console.log('content', s);
console.log('close browser')
await browser.close()
console.log('finished')
})();
```
That code is running on the local nodejs cli, and also when packed to a docker image using this Dockerfile:
```
# Install dependencies only when needed
FROM node:20.11.1-slim AS deps
ENV PUPPETEER_SKIP_CHROMIUM_DOWNLOAD true
RUN apt-get update && \
apt-get install -y libc6 && \
apt-get install -y git && \
rm -rf /var/lib/apt/lists/*
WORKDIR /app
COPY package.json package-lock.json ./
RUN npm ci --legacy-peer-deps
# Rebuild the source code only when needed
FROM node:20.11.1-slim AS builder
WORKDIR /app
COPY . .
COPY --from=deps /app/node_modules ./node_modules
ARG NODE_ENV=production
RUN echo ${NODE_ENV}
# RUN NODE_ENV=${NODE_ENV} npm run build
# Production image, copy all the files and run next
FROM node:20.11.1-slim AS runner
ENV PUPPETEER_SKIP_CHROMIUM_DOWNLOAD true
WORKDIR /app
RUN apt-get update && apt-get install -y gnupg wget && \
wget --quiet --output-document=- https://dl-ssl.google.com/linux/linux_signing_key.pub | gpg --dearmor > /etc/apt/trusted.gpg.d/google-archive.gpg && \
echo "deb [arch=amd64] http://dl.google.com/linux/chrome/deb/ stable main" > /etc/apt/sources.list.d/google-chrome.list && \
apt-get update && \
apt-get install -y google-chrome-stable --no-install-recommends && \
rm -rf /var/lib/apt/lists/*
COPY --from=builder /app/node_modules ./node_modules
COPY --from=builder /app/package.json ./package.json
COPY --from=builder /app/app.js ./app.js
# Expose
EXPOSE 3000
# CMD ["node", "app.js"]
CMD node app.js
```
Running this image on local Docker is doing the job.
However when I try to deploy to Kubernetes cluster, it fails with those errors.
[k8s error message](https://i.stack.imgur.com/t2Fpw.png)
This is very frustrating :( help will be much appreciated. |
I'm trying to setup auto-scaling for an ECS Fargate Service based on SQS Queue depth.
The idea is to stay at 0 tasks until there are messages in the queue and then apply a step scaling policy to gradually increase the number of tasks using cloudwatch alarms.
I'm mostly following this article here - https://adamtuttle.codes/blog/2022/scaling-fargate-based-on-sqs-queue-depth/
The problem is that step scaling itself is disabled when I try to update my service.
[![enter image description here][1]][1]
I don't understand what I'm doing wrong here.
[1]: https://i.stack.imgur.com/e7l27.png |
Python 3.7.4 asyncio and aiohttp leaving multiple tasks with run_until_complete in VS Code debug call stack |
|python|rest|visual-studio-code|python-asyncio| |
null |
With reference to your mention - "Since the yaml will be used in multiple environments, where some will use the pvc and others don't...":
If you mean "make the PVC available on-demand, only as and when the (application in the) container needs it": you might want to review [dynamic provisioning of PVCs][1] via Storage Classes.
For static PVCs:
When a PVC is declared in a Kubernetes yaml file, it is a declaration indicating it will be used **for** the deployment, keeping it available for the container.
Assuming the PVC and the associated PV definitions in your yaml are valid and point to working, available storage, it is up to the (application in the) container whether to use the PVC for storing the logs or whatever else it might be. So it is optional as per the needs of the container image, and not as per the Kubernetes platform, because Kubernetes cannot know ahead of time whether or not the application needs it.
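For the dynamic route, a minimal StorageClass sketch (the provisioner shown is only an example; substitute the one for your cluster's storage backend):

```yaml
apiVersion: storage.k8s.io/v1
kind: StorageClass
metadata:
  name: on-demand
provisioner: kubernetes.io/aws-ebs        # example provisioner, cluster-specific
volumeBindingMode: WaitForFirstConsumer   # provision/bind only when a pod needs it
```

With `volumeBindingMode: WaitForFirstConsumer`, the volume is provisioned and bound only when a pod referencing the PVC is actually scheduled, which matches the on-demand behaviour described above.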
[1]: https://kubernetes.io/docs/concepts/storage/dynamic-provisioning/ |
Why when I run the command...
aws --profile default sts get-caller-identity
it works and I get the expected result back. But when I attempt to run...
aws sts get-caller-identity
It fails with the error "*An error occurred (ExpiredToken) when calling the GetCallerIdentity operation: The security token included in the request is expired*"
I've configured AWS SSO correctly and have successfully given my machine permission. I've ensured that there are no AWS environment variables set in my user or system environment variables (no environment variable prefixed with 'AWS_') |
AWS sts error: The security token included in the request is expired |
|amazon-web-services|aws-cli| |
Why does it provide two different outputs? |
I have two matters I need help with in using superset.
The client wants to see the top 10 products from the Sales table (i.e. product count > 300), and the rest of them displayed as "Others" (in a pie chart). How do I get the other products to be grouped into "Others", dynamically?
If they filter by one of the products (which would fall under "Others" category), they want to be able to see the data. Any ideas?
Many thanks in advance.
I tried the following query, but it is not doing what I expected:
```
SELECT
CASE
WHEN t.product IN (SELECT product FROM Sales GROUP BY product HAVING COUNT(*) >= 30) THEN t.product
ELSE 'Others'
END AS grouped_product,
COUNT(*) AS product_count
FROM
Sales AS t
GROUP BY
grouped_product;
```
Expected output: selection of the top 10 ranking products without hardcoding 300, with the rest grouped as "Others".
Using MariaDB. |
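For the first part, a sketch that avoids hardcoding the count threshold (assuming MariaDB 10.2 or later for CTEs and window functions; table and column names taken from the question):

```
WITH ranked AS (
    SELECT
        product,
        COUNT(*) AS product_count,
        RANK() OVER (ORDER BY COUNT(*) DESC) AS rnk
    FROM Sales
    GROUP BY product
)
SELECT
    CASE WHEN rnk <= 10 THEN product ELSE 'Others' END AS grouped_product,
    SUM(product_count) AS product_count
FROM ranked
GROUP BY grouped_product;
```

Note this addresses only the grouping; filtering on a product hidden inside "Others" would still have to be applied in Superset against the underlying dataset rather than against the aggregated label.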
Apparently your custom `AuthenticationHandler` has already written to the response. It is probably sending a forward or redirect to a different page all by itself.
In that case you'll need to inform Faces that the response is complete.
```java
FacesContext.getCurrentInstance().responseComplete();
```
This way Faces won't touch the response during render response phase. Because the default behavior is to redisplay/render the same page from where the Faces action was invoked. |
After setting up the GitHub path in the Jenkins pipeline definition SCM (using a GitHub webhook), can the Jenkinsfile script path below be set to a different repository on GitHub?
[enter image description here](https://i.stack.imgur.com/WCok8.jpg)
Thanks.
For example:
Webhook repository (https://github.example.com/example/test.git)
Jenkinsfile repository (https://github.example.com/jenkinsfile/jenkinsfile-test.git) |
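For what it's worth: with "Pipeline script from SCM", the SCM configured in the job only tells Jenkins where to fetch the Jenkinsfile, so the application repository can be checked out explicitly inside the pipeline. A sketch using the repositories above (the branch name is an assumption):

```
// This Jenkinsfile lives in jenkinsfile-test.git; the job's
// "Pipeline script from SCM" setting points at that repository.
pipeline {
    agent any
    stages {
        stage('Checkout app repo') {
            steps {
                // fetch the repository that fires the webhook
                git url: 'https://github.example.com/example/test.git', branch: 'main'
            }
        }
    }
}
```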
Jenkins pipeline script path |