Notification Using REST API
Disable annoying source code modification indication (CodeLens and text decorations)
How to enable sound for background notifications while keeping foreground notifications silent in Capacitor with @capacitor-firebase/messaging?
This is my code to read the CSV file asynchronously using the `ReadLineAsync()` function from the [StreamReader][1] class, but it reads only the first line of the [csv file][2]:

```csharp
private async Task ReadAndSendJointDataFromCSVFileAsync(CancellationToken cancellationToken)
{
    Stopwatch sw = new Stopwatch();
    sw.Start();
    string filePath = @"/home/adwait/azure-iot-sdk-csharp/iothub/device/samples/solutions/PnpDeviceSamples/Robot/Data/Robots_data.csv";
    using (StreamReader oStreamReader = new StreamReader(File.OpenRead(filePath)))
    {
        string sFileLine = await oStreamReader.ReadLineAsync();
        string[] jointDataArray = sFileLine.Split(',');

        // Assuming the joint data is processed in parallel
        var tasks = new List<Task>();

        // Process joint pose
        tasks.Add(Task.Run(async () =>
        {
            var jointPose = jointDataArray.Take(7).Select(Convert.ToSingle).ToArray();
            var jointPoseJson = JsonSerializer.Serialize(jointPose);
            await SendTelemetryAsync("JointPose", jointPoseJson, cancellationToken);
        }));

        // Process joint velocity
        tasks.Add(Task.Run(async () =>
        {
            var jointVelocity = jointDataArray.Skip(7).Take(7).Select(Convert.ToSingle).ToArray();
            var jointVelocityJson = JsonSerializer.Serialize(jointVelocity);
            await SendTelemetryAsync("JointVelocity", jointVelocityJson, cancellationToken);
        }));

        // Process joint acceleration
        tasks.Add(Task.Run(async () =>
        {
            var jointAcceleration = jointDataArray.Skip(14).Take(7).Select(Convert.ToSingle).ToArray();
            var jointAccelerationJson = JsonSerializer.Serialize(jointAcceleration);
            await SendTelemetryAsync("JointAcceleration", jointAccelerationJson, cancellationToken);
        }));

        // Process external wrench
        tasks.Add(Task.Run(async () =>
        {
            var externalWrench = jointDataArray.Skip(21).Take(6).Select(Convert.ToSingle).ToArray();
            var externalWrenchJson = JsonSerializer.Serialize(externalWrench);
            await SendTelemetryAsync("ExternalWrench", externalWrenchJson, cancellationToken);
        }));

        await Task.WhenAll(tasks);
    }
    sw.Stop();
    _logger.LogDebug(String.Format("Elapsed={0}", sw.Elapsed));
}
```
Basically, the csv file has 10128 lines. I want to read the latest line which gets added to the csv file. How do I do it? [1]: https://learn.microsoft.com/en-us/dotnet/api/system.io.streamreader?view=net-8.0 [2]: https://github.com/addy1997/azure-iot-sdk-csharp/blob/main/iothub/device/samples/solutions/PnpDeviceSamples/Robot/Data/Robots_data.csv
PyCharm crashing with "multiprocessing" error in single threaded application only when debugger is attached
|python|pycharm|
I'm developing a webapp with Next.js and I am at a point where I need to decide on an authentication strategy. My options seem to boil down to two main categories: - **Self-Hosted Authentication:** using libraries like [Auth.js][1] or [Lucia Auth][2] that are hosted on the same server. - **Third-Party Authentication Services:** utilizing services like [Supabase][3] or [Clerk][4]. I’m wondering about the benefits of choosing one over the other. Ideally, I prefer authentication hosted on the same server so I don’t need to rely on third-party services, but I’m concerned there might be problems I’m not aware of. *Last question, if I need a simple authentication system using only email and password, implemented with JWT (without the need for providers like Google or Github), can I implement it? Or is it a bad idea?* Thanks! [1]: https://authjs.dev/ [2]: https://lucia-auth.com/ [3]: https://supabase.com/ [4]: https://clerk.com/
Next.js Auth: Self-Hosted vs Third-Party Provider
|authentication|next.js|next-auth|
{"OriginalQuestionIds":[5755506],"Voters":[{"Id":4108803,"DisplayName":"blackgreen"}]}
Thank you for all the explanations and the very quick response. I got around this issue with the following code: I basically convert the value to a string and then back to a decimal. Maybe not the correct way, but the cleanest for me.

    Decimal(np.format_float_positional(abs(dist))) <= t.min_dbv
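The round-trip through a string works because constructing a `Decimal` directly from a float captures the float's full binary expansion, while formatting it first yields the shortest decimal representation. A minimal stdlib sketch (using `repr()` in place of `np.format_float_positional`, which behaves similarly for this purpose):

```python
from decimal import Decimal

x = 0.1
# Directly from the float: the exact binary value, not what you typed
direct = Decimal(x)
# Through a string first: the shortest decimal that round-trips
via_str = Decimal(repr(x))

print(direct)   # 0.1000000000000000055511151231257827021181583404541015625
print(via_str)  # 0.1
```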
You have to do two things to start with the unsecure protocol HTTP:

1. Remove the `https://localhost:7248` part from `launchSettings.json` under the Properties folder, in both the Client and the Server project.
2. Comment out the line `app.UseHttpsRedirection()` in `Program.cs` under the Server project.

You also need to know how you start the Blazor app for these changes to take effect: do not start it with the IIS profile, but with the Server profile instead. Try via Microsoft Visual Studio if possible.
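For reference, a `launchSettings.json` profile that serves only over HTTP might look something like this (the port number and profile contents here are illustrative, not taken from the question):

```json
{
  "profiles": {
    "Server": {
      "commandName": "Project",
      "launchBrowser": true,
      "applicationUrl": "http://localhost:5248"
    }
  }
}
```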
Since January 2024 Edge (and all major browsers) support AVIF files
We have an ASP.NET MVC 4 web app that uses SQL Server 2012, Entity Framework as ORM, and Unity for IoC. The web app is hosted on Amazon EC2. I started getting a "Physical connection is not usable" exception. It happens a few times a day. I searched many articles and forums and tried all the possible suggestions:

- Tried removing pooling from the connection string ("Polling=False")
- Tried limiting pool size and connection lifetime
- Tried changing the Unity LifetimeManager to HierarchicalLifetimeManager and PerRequestLifetimeManager. Also made sure the entities context is disposed at the end of the request
- Removed all TransactionScope references

When this exception happens, the only way to restore the application is to restart the server, which is very bad! This is the full exception:

> A transport-level error has occurred when sending the request to the
> server. (provider: Session Provider, error: 19 - Physical connection
> is not usable)
I am using the Material date picker in my Android project, in a fragment. On a specific click I want to show the user the date dialog, but on click nothing is showing. Here is my code: [Image link](https://i.stack.imgur.com/CyZRZ.png.imgur.com/5saGg.png) I am expecting the user to select a date range and that I then get those two dates.
I want to create a SnackBar in Flutter. I want it to have rounded corners on all 4 sides and a border of a given width, with a different color, only on the bottom side. [Snackbar required][1] I couldn't find a way to add both properties together. Can someone suggest a way to do it without using external Flutter libraries or a Container inside the `content` parameter of the SnackBar widget? I am able to create only one of the required properties at a time.

1. I am able to apply a border width at the bottom with a color:

```dart
var snackBar = SnackBar(
  behavior: SnackBarBehavior.floating,
  shape: const Border(
    bottom: BorderSide(
      color: Colors.green,
      width: 4,
    ),
  ),
  content: Flex(
    direction: Axis.horizontal,
    children: [
      Padding(
        padding: const EdgeInsets.only(right: 12),
        child: Icon(Icons.add),
      ),
      Text("Toast message"),
    ],
  ),
);
```

[SnackBar with bottom border width][2]

2. I am able to apply a corner radius at all 4 corners:

```dart
var snackBar = SnackBar(
  behavior: SnackBarBehavior.floating,
  shape: const RoundedRectangleBorder(
    borderRadius: BorderRadius.all(Radius.circular(8)),
  ),
  content: Flex(
    direction: Axis.horizontal,
    children: [
      Padding(
        padding: const EdgeInsets.only(right: 12),
        child: Icon(Icons.add),
      ),
      Text("Toast message"),
    ],
  ),
);
```

[Snackbar with border radius][3]

I want to apply both properties together, but I am unable to do it.

[1]: https://i.stack.imgur.com/opvJS.jpg
[2]: https://i.stack.imgur.com/doAxz.jpg
[3]: https://i.stack.imgur.com/k6GUG.jpg
I want to create feature information (like a feature matrix) based only on the pairwise similarity information of each drug. Is it possible to use truncated PCA for this without problems? If there is a problem, what are the other options? I've tried embedding using random walks, but I'm looking for other methods because I don't seem to get the results I want.
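One common way to do this, sketched below with plain NumPy (the toy similarity matrix and dimensions are made up for illustration): treat the drug-drug similarity matrix itself as a feature matrix and keep its top-k singular directions, which is essentially what truncated PCA/SVD does.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy symmetric drug-drug similarity matrix (n_drugs x n_drugs)
n_drugs, k = 20, 5
a = rng.random((n_drugs, n_drugs))
similarity = (a + a.T) / 2          # make it symmetric
np.fill_diagonal(similarity, 1.0)   # self-similarity

# Truncated SVD: keep the top-k components as a feature embedding
u, s, _ = np.linalg.svd(similarity)
embedding = u[:, :k] * s[:k]        # one k-dimensional feature vector per drug

print(embedding.shape)  # (20, 5)
```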
Without a path overlay, this is my image with 2 custom markers: [![][1]][1]

In my Android application, in the Navigation activity's route observer, here is my code:

```java
List<Point> list = PolylineUtils.decode(routeProgress.getRoute().geometry(), 6);
List<Double> coordinates = new ArrayList<>();
for (int i = 0; i < list.size(); i++) {
    coordinates.addAll(list.get(i).coordinates());
}
Log.e("p_des_org", routeProgress.getRoute().routeOptions().coordinates());
Log.e("p_geometry", routeProgress.getRoute().geometry());
Log.e("p_point", list.toString());
Log.e("p_coordinates", coordinates.toString());
Log.e("p_poly", PolylineUtils.encode(list, 6));
```

and this is the output:

```
p_des_org: 105.8287217,21.0163447;105.8317352,21.0195122
p_geometry: wqvag@ysgzhEaa@kScBy@uL_GySgKiLoGsKsGqF}BiNsHuCwAyBgAePoGwCkAkGsNOm@cBsFqGqR}AgBiA}Ca@gAYu@oDsJ}AyEaA}Eg@{EW}DE{E?O
p_point: [Point{type=Point, bbox=null, coordinates=[105.828685, 21.016364]}, Point{type=Point, bbox=null, coordinates=[105.829011, 21.016909]}, Point{type=Point, bbox=null, coordinates=[105.82904, 21.016959]}, Point{type=Point, bbox=null, coordinates=[105.829168, 21.017178]}, Point{type=Point, bbox=null, coordinates=[105.829364, 21.017511]}, Point{type=Point, bbox=null, coordinates=[105.8295, 21.017724]}, Point{type=Point, bbox=null, coordinates=[105.829638, 21.017926]}, Point{type=Point, bbox=null, coordinates=[105.829701, 21.018047]}, Point{type=Point, bbox=null, coordinates=[105.829855, 21.018292]}, Point{type=Point, bbox=null, coordinates=[105.829899, 21.018367]}, Point{type=Point, bbox=null, coordinates=[105.829935, 21.018428]}, Point{type=Point, bbox=null, coordinates=[105.830071, 21.018703]}, Point{type=Point, bbox=null, coordinates=[105.830109, 21.018779]}, Point{type=Point, bbox=null, coordinates=[105.830359, 21.018913]}, Point{type=Point, bbox=null, coordinates=[105.830382, 21.018921]}, Point{type=Point, bbox=null, coordinates=[105.830504, 21.018971]}, Point{type=Point, bbox=null, coordinates=[105.830817, 21.019108]}, Point{type=Point, bbox=null, coordinates=[105.830869, 21.019155]}, Point{type=Point, bbox=null, coordinates=[105.830948, 21.019192]}, Point{type=Point, bbox=null, coordinates=[105.830984, 21.019209]}, Point{type=Point, bbox=null, coordinates=[105.831011, 21.019222]}, Point{type=Point, bbox=null, coordinates=[105.831197, 21.01931]}, Point{type=Point, bbox=null, coordinates=[105.831306, 21.019357]}, Point{type=Point, bbox=null, coordinates=[105.831417, 21.01939]}, Point{type=Point, bbox=null, coordinates=[105.831527, 21.01941]}, Point{type=Point, bbox=null, coordinates=[105.831622, 21.019422]}, Point{type=Point, bbox=null, coordinates=[105.831732, 21.019425]}, Point{type=Point, bbox=null, coordinates=[105.83174, 21.019425]}]
p_coordinates: [105.828685, 21.016364, 105.829011, 21.016909, 105.82904, 21.016959, 105.829168, 21.017178, 105.829364, 21.017511, 105.8295, 21.017724, 105.829638, 21.017926, 105.829701, 21.018047, 105.829855, 21.018292, 105.829899, 21.018367, 105.829935, 21.018428, 105.830071, 21.018703, 105.830109, 21.018779, 105.830359, 21.018913, 105.830382, 21.018921, 105.830504, 21.018971, 105.830817, 21.019108, 105.830869, 21.019155, 105.830948, 21.019192, 105.830984, 21.019209, 105.831011, 21.019222, 105.831197, 21.01931, 105.831306, 21.019357, 105.831417, 21.01939, 105.831527, 21.01941, 105.831622, 21.019422, 105.831732, 21.019425, 105.83174, 21.019425]
p_poly: wqvag@ysgzhEaa@kScBy@uL_GySgKiLoGsKsGqF}BiNsHuCwAyBgAePoGwCkAkGsNOm@cBsFqGqR}AgBiA}Ca@gAYu@oDsJ}AyEaA}Eg@{EW}DE{E?O
```

When I use the above encoded polyline, the static image has no error, but the path overlay is not shown. I tried different path parameters, but that didn't help (path-{strokeWidth}+{strokeColor}-{strokeOpacity}+{fillColor}-{fillOpacity}({polyline})).

With path overlay: [![][2]][2]

How can I get a static image with an encoded polyline string from routeProgress.getRoute().geometry()?

[1]: https://i.stack.imgur.com/dshnm.png
[2]: https://i.stack.imgur.com/tKVt1.png
null
null
null
null
null
|azure|powershell|azure-powershell|microsoft-partner-center|
Very simple: do the compression, then snip the result.

```python
import gzip

plain = b"Stuff"
compressed = gzip.compress(plain)
bad_compressed = compressed[:-1]  # drop the last byte
gzip.decompress(bad_compressed)   # EOFError
```

(This is in-memory for the simplicity of demonstration; it would work the same if you manipulated the file instead of the string.)
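The same idea with an actual file on disk, as a sketch (the temp-file path is arbitrary): write a valid gzip file, chop off its last byte, and reading it back raises `EOFError`.

```python
import gzip
import os
import tempfile

# Write a valid gzip file, then truncate its last byte
path = os.path.join(tempfile.mkdtemp(), "data.gz")
with gzip.open(path, "wb") as f:
    f.write(b"Stuff")

with open(path, "r+b") as f:
    f.truncate(os.path.getsize(path) - 1)

# Reading the truncated file now raises EOFError
try:
    with gzip.open(path, "rb") as f:
        f.read()
    truncation_detected = False
except EOFError:
    truncation_detected = True

print(truncation_detected)  # True
```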
I am getting the below Java exception for an API which I am triggering from Eclipse.

Java version: 1.8
IDE: Eclipse

Exception:

    javax.net.ssl.SSLHandshakeException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target

Precondition: it is a 3rd-party tool which we use; to log in to this URL we need to log in to ZScaler, and then basic auth is called.

I tried the following:

1. Installed JDK 1.8 and configured the Java home path to JDK 1.8
2. Eclipse project run configuration: Java Build Path set to JDK 1.8
3. Added the root cert to cacerts using keytool (`C:\Program Files\Java\jre8\lib\security\cacerts`)
4. Restarted my system and ran it again

To import the cert I used the below command:

    keytool -importcert -file location_of_cert -alias cd12 -keystore "C:\Program Files\Java\jre8\lib\security\cacerts"

Just for reference, I am trying the below code in REST Assured, but getting the above exception:

    RestAssured.proxy("ipaddress", portno);
    //RestAssured.useRelaxedHTTPSValidation("TLSv1.2");
    RestAssured
        .given()
        .auth().basic("username", "Pwd")
        .contentType(ContentType.JSON)
        .baseUri("https://apiURL")
        .when()
        .get()
        .then()
        .assertThat()
        //.body(null, null, null)
        .statusCode(200);
I want to search for a date (which is a struct) in an array of dates to see if it is in it. This is the first time I am using `bsearch` and it always returns the same result, `0`, whereas it should either return `null` or a pointer to the date found. I am using the same comparing function I used to sort the array, and the sorting works fine. I'm guessing if the function returns `0` it means it has found the date in the array. What have I done wrong? If the mistake is not obvious I can post the full code.

```c
#define MIN_SIZE 0
#define MAX_SIZE 10000
#define MAX_MONTH_STR 9
#define SWAP 1
#define NO_SWAP -1
#define EQUAL 0

//Define a struct data type
typedef struct {
    //Declaration of struct members
    char* month;
    int day;
    int year;
} date;

//Method to allocate memory for a list of date structures
date* allocateStruct(int size) {
    //Declaration of variables
    date *array;
    int i;

    //Allocate memory for array to store 'size' many 'date' struct data types
    array = malloc(size*sizeof(date));

    //For-loop to allocate memory for each struct's members and initialize them to zero
    for (i=0; i<size; i++) {
        array[i].month = calloc(MAX_MONTH_STR,sizeof(char));
        array[i].day = (int) calloc(1,sizeof(int));
        array[i].year = (int) calloc(1,sizeof(int));
    }
    return array;
}

//Method to free memory allocated
void freeStruct(date* array, int size) {
    //Declaration of variable
    int i;

    //For-loop to free up struct members
    for (i=0; i<size; i++) {
        free(array[i].month);
        free(&array[i].day);
        free(&array[i].year);
    }
    //Free up structs
    free(array);
}

//Method to compare two dates
int cmpDates (const void *a, const void *b) {
    //Declaration and dereference of variables
    date first = *(date*)a;
    date second = *(date*)b;
    int y_result, m_result, d_result;

    //Calculate results
    y_result = second.year-first.year;
    m_result = second.month-first.month;
    d_result = second.day-first.day;

    //If-statements to determine whether to swap dates based on year
    //If-statement to determine whether both years are in 90s group
    if (first.year>=90 && first.year<=99 && second.year>=90 && second.year<=99) {
        //If-statement to determine whether years are equal
        if (y_result!=0) {
            return (y_result);
        }
    }
    //Else-if-statement to determine whether both years are in 00-12 group
    else if (first.year>=0 && first.year<=12 && second.year>=0 && second.year<=12) {
        //If-statement to determine whether years are equal
        if (y_result!=0) {
            return (y_result);
        }
    }
    //Otherwise the two years belong to different year groups
    else {
        //If-statement to determine whether first year belongs to 00-12 group
        if (first.year>=0 && first.year<=12) {
            return NO_SWAP;
        }
        else {
            return SWAP;
        }
    }

    //If-statement to determine whether to swap dates based on months
    if (m_result!=0) {
        return m_result;
    }
    //If-statement to determine whether to swap dates based on days
    if (d_result!=0) {
        return d_result;
    }
    //If dates are exactly the same
    return EQUAL;
}

enum months {
    January=1, February, March, April, May, June,
    July, August, September, October, November, December
};

int main() {
    //Declaration of variables
    int n;              //number of dates in array
    date* date_list;    //array of dates
    date *key_date;     //date to search for
    date *q_result;     //result of search

    //Read input
    do {
        //printf("Enter number of dates you want to enter (between 1 and 10000):\n");
        scanf("%d", &n);
    } while(n<MIN_SIZE || n>MAX_SIZE);

    //Allocate memory for an array of n dates
    date_list = allocateStruct(n);

    //For-loop to store values in 'date_list'
    for (i=0; i<n; i++) {
        //printf("Enter the date (month day year) in the following format <text number number>:");
        scanf("%s", date_list[i].month);
        scanf("%d", &date_list[i].day);
        scanf("%d", &date_list[i].year);
    }

    //Allocate memory for one date
    key_date = allocateStruct(1);

    //Read date for query
    //printf("Enter date you want to query:");
    scanf("%s", key_date->month);
    scanf("%d", &key_date->day);
    scanf("%d", &key_date->year);

    //Sort the array with built-in function qsort
    qsort(date_list, n, sizeof(date), cmpDates);

    //Print list of sorted dates
    for (i=0; i<n; i++) {
        //printf("Enter the date (month day year) in the following format: text number number");
        printf("%s ", date_list[i].month);
        printf("%d ", date_list[i].day);
        printf("%02d\n", date_list[i].year); //need & ?
    }

    //Query with bsearch --> I TRIED BOTH OF THESE LINES BUT THE RESULT WAS THE SAME
    q_result = (date*) bsearch(&key_date, date_list, n, sizeof(date), cmpDates);
    // q_result = bsearch(&key_date, date_list, n, sizeof(date), cmpDates);

    //Printing answer to query
    if (q_result!=NULL) {
        printf("Yes in list");
    }
    else {
        printf("No not in list");
    }
}
```
{"Voters":[{"Id":3001761,"DisplayName":"jonrsharpe"},{"Id":8054998,"DisplayName":"Bench Vue"},{"Id":6463558,"DisplayName":"Lin Du"}]}
{"OriginalQuestionIds":[2953834],"Voters":[{"Id":16343464,"DisplayName":"mozway","BindingReason":{"GoldTagBadge":"dataframe"}}]}
{"Voters":[{"Id":9599344,"DisplayName":"uber.s1"}]}
pnpm manages dependencies with a content-addressable store. Dev Containers often mount project directories from the host machine's filesystem (e.g., `/workspaces/nx`), relying on the host's directory structure being available. These dependencies may therefore end up in the mounted directory, which can be confirmed with the `pnpm store path` command. To avoid compatibility issues or file-access permission errors, pnpm can be configured to use a local directory within the container itself for its store:

```sh
pnpm config set store-dir ~/.local/share/pnpm/store
```

*Note*: `node_modules` will need to be deleted and the dependencies reinstalled (`pnpm install`).

Additional information:

- This issue is discussed in https://github.com/pnpm/pnpm/issues/5803.
- pnpm's [FAQ][1] also recommends keeping the package store on the same filesystem as installations:

>Does pnpm work across multiple drives or filesystems?
>
>The package store should be on the same drive and filesystem as installations, otherwise packages will be copied, not linked. This is due to a limitation in how hard linking works, in that a file on one filesystem cannot address a location in another.

[1]: https://pnpm.io/faq#does-pnpm-work-across-multiple-drives-or-filesystems
[2]: https://pnpm.io/motivation#:~:text=dependency%20will%20be%20stored%20in%20a%20content%2Daddressable%20store
I am trying to download a file from my ftp server. As soon as the file is larger than 300KB, the download fails. Below that limit, it works. The ftp server is running in a Docker container (image: delfer/alpine-ftp-server). Here is my method: ```java public InputStream getInputStreamFromURI(String uri) throws IOException { try { ftp.addProtocolCommandListener(new PrintCommandListener( new PrintWriter(new OutputStreamWriter(System.out, "UTF-8")), true)); assureConnection(false); ftp.setBufferSize(2048000); ftp.setAutodetectUTF8(true); ftp.setDataTimeout(Duration.ofSeconds(6000)); ftp.enterLocalPassiveMode(); InputStream is; logger.trace("Retrieve file " + uri + " from FTP server..."); is = ftp.retrieveFileStream(uri); ftp.completePendingCommand(); logger.trace("FTP response was: " + ftp.getReplyString()); return is; } catch (FTPConnectionClosedException e1) { logger.error("Server has closed connection, we will try again..."); return getInputStreamFromURI(uri); } catch (ConnectException e2) { logger.error("The server could not be connected, let's try again..."); return getInputStreamFromURI(uri); } catch (NoRouteToHostException e3) { logger.error("Server was not routable, let's try again..."); return getInputStreamFromURI(uri); } } ``` Here are the logs from the ftp container: ``` [pid 1336] OK LOGIN: Client "172.12.123.12" [pid 1338] [ppdw] FAIL DOWNLOAD: Client "172.12.123.12", "file.csv", 366760 byte ``` What could be the problem? So far I have tried to save the file temporarily. <s>However, this temp file is not created.</s> Another idea would be to split the file, but that didn't work either. *Update:* Here is the log for the small file: ``` OK DOWNLOAD: Client "172.12.123.12", "file.csv", 228893 bytes, 0.74Kbyte/sec ``` *Update#2:* Adding the "ftp.addProtocolCommandListener" did not change anything in the logs of ftp container. Only the application logs show now: ``` Log in configured FTP server... NOOP 200 NOOP ok. 
PASV 227 Entering Passive Mode (172,12,123,98,82,17). RETR files/config/tpch1.properties 150 Opening BINARY mode data connection for file.properties (842 bytes). 226 Transfer complete. ```
Whatever the bug was in 2.0.8, it no longer occurs in 2.0.31, which is the current version at the time of writing, and can be downloaded through Maven or at https://pdfbox.apache.org/download.html
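For reference, the Maven dependency would look like this (using the standard `org.apache.pdfbox:pdfbox` coordinates):

```xml
<dependency>
    <groupId>org.apache.pdfbox</groupId>
    <artifactId>pdfbox</artifactId>
    <version>2.0.31</version>
</dependency>
```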
{"Voters":[{"Id":21363224,"DisplayName":"markalex"},{"Id":10008173,"DisplayName":"David Maze"},{"Id":213269,"DisplayName":"Jonas"}],"SiteSpecificCloseReasonIds":[18]}
I'm optimizing some work I did moving from Orange to Python code, but I'm having some problems with image embedders. I'm trying to recreate my work using Tensorflow/Keras, but the outputs of the VGG16 network (the 4096 activations of the penultimate FC layer of this architecture) differ between Orange and Keras. [In the Orange documentation it is written:](https://i.stack.imgur.com/X1Kya.png)

For Python:

```python
model_vgg = VGG16(include_top=True, weights='imagenet', pooling=None, input_shape=(224, 224, 3))
model_vgg16 = Model(inputs=model_vgg.input, outputs=model_vgg.layers[-2].output)
```

[Keras Model](https://i.stack.imgur.com/31umC.png)

To reshape the images to 224x224 pixels I use the same package and code (the `load_image_or_none` function) -> https://github.com/biolab/orange3-imageanalytics/blob/master/orangecontrib/imageanalytics/utils/embedder_utils.py and I get the same 224x224 resized image used for VGG16 in Orange, via the Save Image widget. [My resized images and Orange images are the same](https://i.stack.imgur.com/GBO7o.png)

Perhaps I'm making a mistake during preprocessing, since the Orange documentation says they use the original weights of the model. To preprocess the images I tried Keras's `preprocess_input` for VGG16, and manually:

```python
def process_vgg16(imgs):
    output = np.zeros(imgs.shape)
    VGG_MEAN = np.array([103.939, 116.779, 123.68], dtype=np.float32)
    for i in range(0, imgs.shape[0]):
        b = np.array(imgs[i,:,:,2], dtype=np.float32)
        g = np.array(imgs[i,:,:,1], dtype=np.float32)
        r = np.array(imgs[i,:,:,0], dtype=np.float32)
        output[i,:,:,0] = b - VGG_MEAN[0]
        output[i,:,:,1] = g - VGG_MEAN[1]
        output[i,:,:,2] = r - VGG_MEAN[2]
        #output = output/255
    return output
```

Note: the images are in grayscale, so all channels are the same.

Results:
[First three outputs of an image in Orange (VGG16)](https://i.stack.imgur.com/Jex5Z.png)
[First three outputs of an image in Keras (VGG16)](https://i.stack.imgur.com/T29ph.png)

Would anyone know the reason?
How to read the latest line from a CSV file using the ReadLineAsync method?
We are facing a weird issue with Flutter on iOS. I have a ListView which shows heterogeneous widgets: some of the widgets are network images rendered using CachedNetworkImage, and some make API calls (using Dio) and render a widget on success. On iOS, the widgets with API calls are rendered randomly: they sometimes show up after a successful API call, and sometimes they just initiate the API call and keep showing the loading widget. I have tried calling the API using FutureBuilder, but it still sometimes gets stuck in the loading state. This issue is not happening on Android, nor on a debug build of iOS; it can only be reproduced on an iOS release build. The project uses Riverpod & hooks; I tried calling the API via useEffect, FutureProvider, and FutureBuilder, but all of them show the same behaviour. The expected behaviour is that the ListView with these widgets loads consistently, the same as on Android and the debug iOS build.
Flutter iOS Listview/ScrollView API call issue
|ios|flutter|dart|riverpod|
null
```python
import logging
import time

import requests


def create_media_container_for_reel(video_url, access_token, caption, user_id):
    graph_url = f'https://graph.facebook.com/v19.0/{user_id}/media'
    payload = {
        'media_type': 'REELS',  # Updated as per the new requirement
        'video_url': video_url,
        'caption': caption,
        'access_token': access_token
    }
    response = requests.post(graph_url, json=payload)
    if response.status_code in [200, 201]:
        media_container_id = response.json().get('id')
        return media_container_id
    else:
        logging.error(f"Failed to create media container. Status code: {response.status_code}, Response: {response.text}")
        return None


def publish_media_container_as_reel(media_container_id, access_token, user_id, retries=5, delay=10):
    print(media_container_id)
    publish_endpoint = f'https://graph.facebook.com/v19.0/{user_id}/media_publish'
    publish_payload = {
        'creation_id': media_container_id,
        'access_token': access_token
    }
    while retries > 0:
        response = requests.post(publish_endpoint, data=publish_payload)
        if response.status_code in [200, 201]:
            logging.info("Reel published successfully.")
            return True
        else:
            error = response.json().get('error', {})
            if error.get('code') == 9007 and retries > 0:
                logging.warning(f"Media not ready for publishing, waiting {delay} seconds before retrying...")
                time.sleep(delay)
                retries -= 1
            else:
                logging.error(f"Failed to publish reel. Status code: {response.status_code}, Response: {response.text}")
                return False
    logging.error("Failed to publish reel after retries. Giving up.")
    return False


def post_reel_to_instagram(video_url, access_token, caption, user_id):
    # Step 1: Create Media Container
    media_container_id = create_media_container_for_reel(video_url, access_token, caption, user_id)
    if not media_container_id:
        return False
    # Step 2: Publish the Media Container as a Reel
    return publish_media_container_as_reel(media_container_id, access_token, user_id)
```

I'm using the above code; the token and ID are the same for images and reels, but the image upload works flawlessly while the reel one doesn't. `create_media_container_for_reel` also returns a media ID; it only gives an error when I try to post it to my feed using `publish_media_container_as_reel`. What am I doing wrong?
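One common cause with reels (hedged, since the error payload isn't shown) is publishing before the video container has finished processing: the Graph API exposes a `status_code` field on the container (`FINISHED`, `IN_PROGRESS`, `ERROR`) that can be polled before calling `media_publish`. A stdlib-only sketch; the helper names are mine, not from the Graph API docs:

```python
import json
import time
import urllib.request


def container_status_url(container_id, access_token):
    """Build the Graph API URL that reports a container's processing status."""
    return (f"https://graph.facebook.com/v19.0/{container_id}"
            f"?fields=status_code&access_token={access_token}")


def wait_until_finished(container_id, access_token, retries=10, delay=10):
    """Poll the container until its status_code is FINISHED (or give up)."""
    for _ in range(retries):
        with urllib.request.urlopen(container_status_url(container_id, access_token)) as resp:
            status = json.load(resp).get("status_code")
        if status == "FINISHED":
            return True
        if status == "ERROR":
            return False
        time.sleep(delay)  # still IN_PROGRESS, wait and retry
    return False
```

You would call `wait_until_finished(...)` between the container-creation step and the publish step.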
null
|redux|rxjs|blazor-webassembly|system.reactive|fluxor|
I have written my own HTML DOM parser in pure Go, with no dependencies. The project is still under development, but it is already usable, and I will keep developing it in the future. Project link: https://github.com/pejman-hkh/gdp/

```go
package main

import (
	"fmt"

	"github.com/pejman-hkh/gdp/gdp"
)

func main() {
	document := gdp.Default(`<!DOCTYPE html>
<html>
<head>
<title>
Title of the document
</title>
</head>
<body>
body content
<p>more content</p>
</body>
</html>`)
	body := document.Find("body").Eq(0)
	fmt.Print(body.OuterHtml())
}
```
I am creating an API that searches and reverses an entry in the account.move model. I am able to find the correct entry and reverse it using the refund_moves() method. However, whenever I try to confirm the reversed entry using the action_post() method, I get a "Expected singleton: res.company()" error. I've used the action_post() method before on other models such as sale.order/account.move and it works fine. Code: ```python @http.route('/update_invoice', website="false", auth='custom_auth', type='json', methods=['POST']) #Searching for entry invoice = request.env['account.move'].sudo().search([('matter_id','=',matterID),('account_id','=',accountID),('move_type','=','out_invoice'),('company_id','=',creditor.id)]) if invoice: #Create Reversal move_reversal = request.env['account.move.reversal'].with_context(active_model="account.move", active_ids=invoice.id).sudo().create({ 'date': intakeDate, 'reason': 'Balance Adjustment', 'journal_id': invoice.journal_id.id, }) #Reverse Entry move_reversal.refund_moves() #Search for created reversed entry refundInvoice = request.env['account.move'].sudo().search([('name','=',"/"),('company_id','=',creditor.id),('move_type','=','out_refund')]) if refundInvoice: _logger.info("Refund Invoice Found") #Error occurs refundInvoice.action_post() ``` Custom Authorization: ```python @classmethod def _auth_method_custom_auth(cls): #access_token = request.httprequest.header.get('Authorization') _logger.info("+++++++++++++++++++++++++++++++++++") access_token = request.httprequest.headers.get('Authorization') _logger.info(access_token) if not access_token: _logger.info('Access Token Missing') raise BadRequest('Missing Access Token') if access_token.startswith('Bearer '): access_token = access_token[7:] _logger.info(access_token) user_id = request.env["res.users.apikeys"]._check_credentials(scope='odoo.restapi', key=access_token) if not user_id: _logger.info('No user with api key found') raise BadRequest('Access token Invalid') 
request.update_env(user=user_id) #users = request.env["res.users"].search([]) _logger.info("+++++++++++++++++++++++++++++++++++") ``` Traceback: ``` Traceback (most recent call last): File "/home/odoo/src/odoo/odoo/models.py", line 5841, in ensure_one _id, = self._ids ValueError: not enough values to unpack (expected 1, got 0) During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/home/odoo/src/odoo/odoo/http.py", line 2189, in __call__ response = request._serve_db() File "/home/odoo/src/odoo/odoo/http.py", line 1765, in _serve_db return service_model.retrying(self._serve_ir_http, self.env) File "/home/odoo/src/odoo/odoo/service/model.py", line 133, in retrying result = func() File "/home/odoo/src/odoo/odoo/http.py", line 1792, in _serve_ir_http response = self.dispatcher.dispatch(rule.endpoint, args) File "/home/odoo/src/odoo/odoo/http.py", line 1996, in dispatch result = self.request.registry['ir.http']._dispatch(endpoint) File "/home/odoo/src/odoo/addons/website/models/ir_http.py", line 235, in _dispatch response = super()._dispatch(endpoint) File "/home/odoo/src/odoo/odoo/addons/base/models/ir_http.py", line 222, in _dispatch result = endpoint(**request.params) File "/home/odoo/src/odoo/odoo/http.py", line 722, in route_wrapper result = endpoint(self, *args, **params_ok) File "/home/odoo/src/user/account_ext/controllers/main.py", line 411, in update_invoice refundInvoice.action_post() File "/home/odoo/src/odoo/addons/sale/models/account_move.py", line 63, in action_post res = super(AccountMove, self).action_post() File "/home/odoo/src/enterprise/account_accountant/models/account_move.py", line 76, in action_post res = super().action_post() File "/home/odoo/src/odoo/addons/account/models/account_move.py", line 4072, in action_post other_moves._post(soft=False) File "/home/odoo/src/enterprise/sale_subscription/models/account_move.py", line 13, in _post posted_moves = super()._post(soft=soft) File 
"/home/odoo/src/enterprise/account_asset/models/account_move.py", line 109, in _post posted = super()._post(soft) File "/home/odoo/src/odoo/addons/sale/models/account_move.py", line 99, in _post posted = super()._post(soft) File "/home/odoo/src/enterprise/account_reports/models/account_move.py", line 48, in _post return super()._post(soft) File "/home/odoo/src/enterprise/account_avatax/models/account_move.py", line 15, in _post res = super()._post(soft=soft) File "/home/odoo/src/enterprise/account_invoice_extract/models/account_invoice.py", line 262, in _post posted = super()._post(soft) File "/home/odoo/src/enterprise/account_inter_company_rules/models/account_move.py", line 14, in _post posted = super()._post(soft) File "/home/odoo/src/enterprise/account_external_tax/models/account_move.py", line 53, in _post return super()._post(soft=soft) File "/home/odoo/src/enterprise/account_accountant/models/account_move.py", line 68, in _post posted = super()._post(soft) File "/home/odoo/src/odoo/addons/account/models/account_move.py", line 3876, in _post draft_reverse_moves.reversed_entry_id._reconcile_reversed_moves(draft_reverse_moves, self._context.get('move_reverse_cancel', False)) File "/home/odoo/src/odoo/addons/account/models/account_move.py", line 3694, in _reconcile_reversed_moves lines.with_context(move_reverse_cancel=move_reverse_cancel).reconcile() File "/home/odoo/src/odoo/addons/account/models/account_move_line.py", line 2935, in reconcile return self._reconcile_plan([self]) File "/home/odoo/src/odoo/addons/account/models/account_move_line.py", line 2345, in _reconcile_plan self._reconcile_plan_with_sync(plan_list, all_amls) File "/home/odoo/src/odoo/addons/account/models/account_move_line.py", line 2492, in _reconcile_plan_with_sync exchange_diff_values = exchange_lines_to_fix._prepare_exchange_difference_move_vals( File "/home/odoo/src/odoo/addons/account/models/account_move_line.py", line 2603, in _prepare_exchange_difference_move_vals 
accounting_exchange_date = journal.with_context(move_date=exchange_date).accounting_date File "/home/odoo/src/odoo/odoo/fields.py", line 1207, in __get__ self.compute_value(recs) File "/home/odoo/src/odoo/odoo/fields.py", line 1389, in compute_value records._compute_field_value(self) File "/home/odoo/src/odoo/addons/mail/models/mail_thread.py", line 424, in _compute_field_value return super()._compute_field_value(field) File "/home/odoo/src/odoo/odoo/models.py", line 4867, in _compute_field_value fields.determine(field.compute, self) File "/home/odoo/src/odoo/odoo/fields.py", line 102, in determine return needle(*args) File "/home/odoo/src/odoo/addons/account/models/account_journal.py", line 366, in _compute_accounting_date journal.accounting_date = temp_move._get_accounting_date(move_date, has_tax) File "/home/odoo/src/odoo/addons/account/models/account_move.py", line 4358, in _get_accounting_date lock_dates = self._get_violated_lock_dates(invoice_date, has_tax) File "/home/odoo/src/odoo/addons/account/models/account_move.py", line 4389, in _get_violated_lock_dates return self.company_id._get_violated_lock_dates(invoice_date, has_tax) File "/home/odoo/src/odoo/addons/account/models/company.py", line 369, in _get_violated_lock_dates self.ensure_one() File "/home/odoo/src/odoo/odoo/models.py", line 5844, in ensure_one raise ValueError("Expected singleton: %s" % self) ValueError: Expected singleton: res.company() ```
I am rendering a component inside an iframe to print. I have tested it in Firefox, where it works, but a collaborator on the project has pointed out that there are issues in both Google Chrome and Internet Explorer. The error I'm getting: *TypeError: Cannot read properties of undefined (reading 'nativeElement')* The code: ``` export class CheckoutComponent { @ViewChild('iframe', {read: ElementRef, static: false}) iframe: ElementRef | undefined; doc: any; onLoad() { this.doc = this.iframe!.nativeElement.contentDocument ||this.iframe!.nativeElement.contentWindow; this.createComponent(); } createComponent() { const component = this.viewContainerRef.createComponent(ReceiptComponent); component.setInput('order', this.currentOrder); component.location.nativeElement.id = 'receipt'; this.doc.body.appendChild(component.location.nativeElement); } } ``` In the template, it looks like this: ``` <iframe #iframe id="iframe" name="iframe" (load)="onLoad()"></iframe> ``` I am kind of stumped on this since it works in Firefox - I have tried various methods of making it work, but I seem to be getting nowhere. Any help would be much appreciated! :)
Angular 16: ElementRef error in Chrome, but not in Firefox
|angular|
The result of my simple SQL query

```
SELECT ID, STARDATE, ENDDATE
FROM A
INNER JOIN B ON A.ID = B.ID AND B.CONTRACTID = 572786
WHERE A.ISACTIVE = 1
```

is:

| ID | STARDATE | ENDDATE |
| --- | --- | --- |
| 394539 | 2024-03-01 | 2025-12-31 |
| 394540 | 2026-01-01 | 2026-12-31 |

But now I would like to convert/split the result into three rows by year (desired result):

| YEAR | ID | STARTDATE | ENDDATE |
| --- | --- | --- | --- |
| 2024 | 394539 | 2024-03-01 | 2024-12-31 |
| 2025 | 394539 | 2025-01-01 | 2025-12-31 |
| 2026 | 394540 | 2026-01-01 | 2026-12-31 |

Can anyone explain how to tackle this in SQL? I tried to find a solution on Stack Overflow, but I couldn't find any answers.
How to split a row with a date range into multiple rows by year
|sql|split|date-range|group|
Recently I was asked to maintain an old image processing project (5 years old) at my company, and it uses OpenCL. There is a piece of code which works like below:

```
if (oneKernelFlag == true)
    launch a gamma correction kernel on the whole image
else
    break the image into grids (e.g. 2*2)
    for loop (....)  // iterate for each grid
        launch the same gamma correction kernel on each grid
```

Similar logic is used for applying kernels in a few other functions. The oneKernelFlag is hardcoded, and the project is built separately for each hardware product. I noticed that execution is much faster when we launch a single kernel (oneKernelFlag == true) compared to multiple kernel launches, almost a 30% reduction in timing.

**Now, I am confused: what is the use of launching the same kernel multiple times on smaller problem spaces? When is this useful?**

Please help. The original developer and documentation are unavailable, and I could not find concrete details online.
I have a performance-critical tight loop which, for a large number of keys, needs to look up the key in a map and do something with that key if it exists in the map: Map<Integer, SomeClass> map = ...; for (int i : someLargeIterable) if (map.containsKey(i)) doSomething(i, map.get(i)); If possible, I'd like to avoid looking up the key in the map twice (once with `containsKey` and then again with `get`). Is it possible? If I could guarantee that the map does not contain null values, I could do this: var value = map.get(i); if (value != null) doSomething(i, value); Unfortunately, I can't do that because sometimes the map can contain null values, which I want to pass to `doSomething`. One way I had thought of was to use `computeIfPresent`: map.computeIfPresent(i, (key, value) -> { doSomething(key, value); return value; }); But that re-inserts `value` back into the map, so again we're doing a double lookup. Is there, maybe, something similar to `computeIfPresent` that doesn't try to modify the map? Maybe something that involves `Optional`?
Avoiding double lookup in Map
|java|
I want my Shiny app to remain active continuously so that anyone who accesses the link can instantly use the app without waiting for it to start. According to the [documentation](https://docs.posit.co/shiny-server/#application-timeouts), setting app_idle_timeout to a large value should solve my problem, but it doesn't. My app is running on Ubuntu 22.04. I tried to change the default.config file, which is in the "/opt/shiny-server/config/" directory, as advised [here](https://groups.google.com/g/shiny-discuss/c/tUcEMk1Av8Y).
Setting app_idle_timeout for shiny changes nothing
|r|ubuntu|shiny|shiny-server|ubuntu-22.04|
When a user logs in via the Gigya screen-set, the lang parameter is truncated from de-at to de, and we have problems regarding consents. When the user was registered, the correct language (de-at) was set by setting

```js
window.__gigyaConf = {toggles: {useFullLangCode: true}};
```

Is there any option to also use the full language code on the login screen?
How can I implement nested columns in a Streamlit data_editor?
I have two jobs in the same GitHub workflow. The first job produces an artifact and uploads it. Here is the log for that:

```
Run actions/upload-artifact@v4
With the provided path, there will be 58 files uploaded
Artifact name is valid!
Root directory input is valid!
Beginning upload of artifact content to blob storage
Uploaded bytes 6049904
Finished uploading artifact content to blob storage!
SHA256 hash of uploaded artifact zip is 234760fcd4148780c7f4cbb3a0296eca3a3c5ae0d54457fbd5d2f97c7411d00a
Finalizing artifact upload
Artifact DIS_1.0.5.zip successfully finalized. Artifact ID 1366404513
Artifact DIS_1.0.5 has been successfully uploaded! Final size is 6049904 bytes. Artifact ID is 1366404513
Artifact download URL: https://github.com/my-company-url/DIS.API/actions/runs/8466719969/artifacts/1366404513
```

The second job, for some reason, cannot find the artifact, even when I intentionally remove the name, which should force the step to download all artifacts. Here is the log for the second job:

```
Run actions/download-artifact@v3
No artifact name specified, downloading all artifacts
Creating an extra directory for each artifact that is being downloaded
Unable to find any artifacts for the associated workflow
There were 0 artifacts downloaded
Artifact download has finished successfully
```

Any idea what might be causing that? Here is the whole pipeline script:

```
name: build-and-deploy

# todo: delete this and use the one below
on: [workflow_dispatch]

# todo: use this and delete the one above
# on:
#   push:
#     branches: ["main"]
#   pull_request:
#     branches: ["main"]

env:
  DOTNET_INSTALL_DIR: "./.dotnet"
  VERSION: 1.0.${{ github.run_number }}
  PACKAGE_STORE_NAME: Nexus # Package store name in the nuget.config file.
  PACKAGE_STORE_URI: https://nexus.milestone.dev # The URI
  PACKAGE_STORE_REPO: experimental-nuget # and repo
  BUILD_CONFIGURATION: -c Release

jobs:
  build:
    runs-on: gha-runner-scale-set-linux
    steps:
      - uses: actions/checkout@v3
      - name: Apt update
        run: sudo apt-get update
      - name: Install CURL
        run: sudo apt-get install -y curl
      - name: Setup .NET
        uses: actions/setup-dotnet@v3
        with:
          dotnet-version: 8.0.x
      - name: Update Nuget Source
        run: dotnet nuget update source nexus --username ${{secrets.NEXUS_PIPELINE_SERVICE_TOKEN_USER}} --password ${{ secrets.NEXUS_PIPELINE_SERVICE_TOKEN_PASSWORD}} --store-password-in-clear-text
      - name: Restore dependencies
        run: dotnet restore
      - name: Build
        run: dotnet build ${{ env.BUILD_CONFIGURATION }} --no-restore
      - name: Test
        run: dotnet test ${{ env.BUILD_CONFIGURATION }} --no-build --verbosity normal
      - name: Dotnet Publish API
        if: github.ref == 'refs/heads/main'
        run: dotnet publish ./DIS.Web/DIS.Web.csproj ${{ env.BUILD_CONFIGURATION }} --no-build -o ./publish
      - name: "Upload Artifact"
        if: github.ref == 'refs/heads/main'
        uses: actions/upload-artifact@v4
        with:
          name: DIS_${{ env.VERSION }}
          path: ./publish
          retention-days: 5

  publish-docker-image:
    runs-on: gha-runner-scale-set-linux
    needs: build
    steps:
      - name: Download Artifact
        uses: actions/download-artifact@v3
        with:
          name: DIS_${{ env.VERSION }}
      - name: List Artifact Contents
        run: ls -R
```
So I am trying to deploy Laravel on Vercel based on [php-laravel](https://github.com/juicyfx/vercel-examples/tree/master/php-laravel), but I am getting the error below:

```
--> Downloading user files
11:24:03.561 | Downloading PHP runtime files
11:24:03.564 | Installing Composer dependencies [START]
11:24:03.567 | php: error while loading shared libraries: libssl.so.10: cannot open shared object file: No such file or directory
11:24:03.569 | Error: Exited with 127
11:24:03.569 | at ChildProcess.<anonymous> (/vercel/path1/.vercel/builders/node_modules/vercel-php/dist/utils.js:178:24)
11:24:03.569 | at ChildProcess.emit (node:events:518:28)
11:24:03.571 | at ChildProcess.emit (node:domain:488:12)
11:24:03.571 | at ChildProcess._handle.onexit (node:internal/child_process:294:12)
11:24:03.592 | Error: Command "vercel build" exited with 1
11:24:03.647 | Command "vercel build" exited with 1
```

Can someone help me fix this? I tried other methods but I am getting the same error. I also tried putting in custom installation commands, but that didn't work. :(
Error while deploying Laravel App on Vercel Project
|laravel|deployment|vite|vercel|
If you're encountering the error, you can resolve it by wrapping the `TextField` inside an `Expanded` or `Flexible` widget within the `Row`:

    Row(
      children: [
        Expanded(
          child: TextField(
            controller: controller,
            onSubmitted: (String value) {
              setState(() {
                text = controller.text;
              });
            },
          ),
        ),
      ],
    )
I don’t have sudo access and contacting sys-admin takes a non trivial amount of time. Here is the output of `nvcc -V` > nvcc: NVIDIA (R) Cuda compiler driver > Copyright (c) 2005-2024 NVIDIA Corporation > Built on Tue_Feb_27_16:19:38_PST_2024 > Cuda compilation tools, release 12.4, V12.4.99 > Build cuda_12.4.r12.4/compiler.33961263_0 Output of `nvidia-smi` ``` +-----------------------------------------------------------------------------------------+ | NVIDIA-SMI 550.67 Driver Version: 550.67 CUDA Version: 12.4 | |-----------------------------------------+------------------------+----------------------+ | GPU Name Persistence-M | Bus-Id Disp.A | Volatile Uncorr. ECC | | Fan Temp Perf Pwr:Usage/Cap | Memory-Usage | GPU-Util Compute M. | | | | MIG M. | |=========================================+========================+======================| | 0 NVIDIA RTX A6000 Off | 00000000:1C:00.0 Off | Off | | 30% 32C P8 19W / 300W | 23MiB / 49140MiB | 0% Default | | | | N/A | +-----------------------------------------+------------------------+----------------------+ | 1 NVIDIA RTX A6000 Off | 00000000:1E:00.0 Off | Off | | 30% 33C P8 20W / 300W | 11MiB / 49140MiB | 0% Default | | | | N/A | +-----------------------------------------+------------------------+----------------------+ | 2 NVIDIA RTX A6000 Off | 00000000:3D:00.0 Off | Off | | 30% 32C P8 27W / 300W | 11MiB / 49140MiB | 0% Default | | | | N/A | +-----------------------------------------+------------------------+----------------------+ | 3 NVIDIA RTX A6000 Off | 00000000:3E:00.0 Off | Off | | 30% 34C P8 25W / 300W | 11MiB / 49140MiB | 0% Default | | | | N/A | +-----------------------------------------+------------------------+----------------------+ | 4 NVIDIA RTX A6000 Off | 00000000:3F:00.0 Off | Off* | |ERR! 49C P5 ERR! 
/ 300W | 11MiB / 49140MiB | 0% Default | | | | N/A | +-----------------------------------------+------------------------+----------------------+ | 5 NVIDIA RTX A6000 Off | 00000000:40:00.0 Off | Off | | 30% 31C P8 6W / 300W | 11MiB / 49140MiB | 0% Default | | | | N/A | +-----------------------------------------+------------------------+----------------------+ | 6 NVIDIA RTX A6000 Off | 00000000:41:00.0 Off | Off | | 30% 31C P8 16W / 300W | 11MiB / 49140MiB | 0% Default | | | | N/A | +-----------------------------------------+------------------------+----------------------+ | 7 NVIDIA RTX A6000 Off | 00000000:5E:00.0 Off | Off | | 30% 29C P8 6W / 300W | 11MiB / 49140MiB | 0% Default | | | | N/A | +-----------------------------------------+------------------------+----------------------+ +-----------------------------------------------------------------------------------------+ | Processes: | | GPU GI CI PID Type Process name GPU Memory | | ID ID Usage | |=========================================================================================| | 0 N/A N/A 4216 G /usr/libexec/Xorg 9MiB | | 0 N/A N/A 4466 G /usr/bin/gnome-shell 4MiB | | 1 N/A N/A 4216 G /usr/libexec/Xorg 4MiB | | 2 N/A N/A 4216 G /usr/libexec/Xorg 4MiB | | 3 N/A N/A 4216 G /usr/libexec/Xorg 4MiB | | 4 N/A N/A 4216 G /usr/libexec/Xorg 4MiB | | 5 N/A N/A 4216 G /usr/libexec/Xorg 4MiB | | 6 N/A N/A 4216 G /usr/libexec/Xorg 4MiB | | 7 N/A N/A 4216 G /usr/libexec/Xorg 4MiB | +-----------------------------------------------------------------------------------------+ ``` when I try to run ``` cuda_available = torch.cuda.is_available() print("CUDA Available:", cuda_available) if cuda_available: print("CUDA version:", torch.version.cuda) print("cuDNN version:", torch.backends.cudnn.version()) else: print("CUDA not available") ``` I get the following error: > /home/user_name/anaconda3/envs/llm2/lib/python3.10/site-packages/torch/cuda/__init__.py:141: UserWarning: CUDA initialization: CUDA driver initialization 
failed, you might not have a CUDA gpu. (Triggered internally at ../c10/cuda/CUDAFunctions.cpp:108.) > return torch._C._cuda_getDeviceCount() > 0 > CUDA Available: False > CUDA not available
|python|pytorch|computer-vision|anaconda|cuda|
- I use conditional resource creation for my lambda function. - When I try to build the template I get presented with the error `InvalidTemplateException Every Condition member must be a string.`. The CLI does not provide more details as to *where* something is wrong. - As my condition is a string, I'm not sure what is wrong. ##### Code: ``` Parameters: LocalExecution: Type: String Default: "false" AllowedValues: Conditions: IsLocalExecution: !Equals [ !Ref LocalExecution, 'true' ] Resources: ProductionDeploy: Type: AWS::Serverless::Function Condition: !Not [ IsLocalExecution ] Properties: ... LocalExecution: Type: AWS::Serverless::Function Condition: IsLocalExecution Properties: ... ```
AWS Cloudformation: InvalidTemplateException `Every Condition member must be a string.`
|aws-cloudformation|
##### Problem This is the problem: `Condition: !Not [ IsLocalExecution ]` ##### Solution Create a new condition ``` IsNotLocalExecution: !Equals [ !Ref LocalExecution, 'false' ] ``` Use that condition instead: ``` ProductionDeploy: Type: AWS::Serverless::Function Condition: IsNotLocalExecution Properties: ... ``` ##### Notes > As my condition is a string, I'm not sure what is wrong. This was a misconception. `!Not` will turn the condition into a boolean value. It's not a string anymore.
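As an alternative to duplicating the `!Equals`, the negation can also be declared directly in the `Conditions` section, where `Fn::Not` is allowed (unlike on a resource's `Condition` attribute, which must be a condition name, i.e. a string). This is only a sketch reusing the names from the question's template:

```yaml
Conditions:
  IsLocalExecution: !Equals [ !Ref LocalExecution, 'true' ]
  # Fn::Not referencing another condition is legal here,
  # inside the Conditions section
  IsNotLocalExecution: !Not [ !Condition IsLocalExecution ]
```

The resource then references `Condition: IsNotLocalExecution` by name, which satisfies the "must be a string" rule.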
403 Permission Error When Trying To Upload Reel To Instagram Using Graph API, But Image works?
|python|instagram|instagram-reels|
Since MultiIndexes are iterables, an easy way would be to use [`zip`](https://docs.python.org/3/library/functions.html#zip): ``` out = list(zip(*idx)) ``` Output: `[('A', 'A', 'B', 'B'), ('C', 'D', 'C', 'D')]` As lists: ``` out = list(map(list, zip(*idx))) # [['A', 'A', 'B', 'B'], ['C', 'D', 'C', 'D']] ```
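Equivalently, each level's values can be pulled with the public `Index.get_level_values` API. This sketch builds a sample MultiIndex matching the output above, since the question's `idx` isn't shown here:

```python
import pandas as pd

# Sample MultiIndex: (A, C), (A, D), (B, C), (B, D)
idx = pd.MultiIndex.from_product([["A", "B"], ["C", "D"]])

# One list per level, in level order
out = [list(idx.get_level_values(i)) for i in range(idx.nlevels)]
print(out)  # [['A', 'A', 'B', 'B'], ['C', 'D', 'C', 'D']]
```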
Good morning, I'm having several problems playing a video with the exoplayer. More precisely, I'm trying to create an Android app to be able to deploy it on Pepper's tablet which interfaces with a website that allows you to create and upload social stories with images, videos and audio. We checked the site to see if there were any problems with php, but nothing. Therefore, while I was debugging the app I noticed that when we start the social story with the video present inside, the player loads correctly but after a second the player crashes, causing the audio to play after it returns to the main screen, and after that ends the audio of the video completely crashes the entire app. We believe it is a race condition problem but we are not very sure of this as the exception passed to us by debug is of type RuntimeException. We look forward to your advice and thank you in advance! I attach the handler script below ``` private void getParagraph() { Log.d("prova", "valore array paragrafi: " +story.size()); Log.d("prova", "valore array immagini: " +imageList.size()); Log.d("prova", "prova bitmap: " + imageList.get(index)); if (imageList.get(index) != null) { imageView.setBackgroundColor(255); imageView.setImageBitmap(imageList.get(index)); } else if (!videoName.get(index).isEmpty()) { //SE IL NOMEVIDEO NELLA COLONNA DEL DATABASE E' PRESENTE //SE NON SOSTITUISCO GLI SPAZI CON I CARATTERI %20, IL VIDEO NON VIENE VISUALIZZATO String storyTableNoSpace = PepperStory.storyTitle; storyTableNoSpace = storyTableNoSpace.replaceAll(" ", "%20"); Log.d("prova video", "prova stringa storyTableNoSpace: " + storyTableNoSpace); String string = "https://pepper4socialstory.altervista.org/get_video2.php?table=" + storyTableNoSpace + "&id=" + index; Log.d("prova video", "prova stringa connessione: " + string); //String string = "https://pepper4storytelling.altervista.org/get_video2.php?table=" + PepperStory.storyTitle + "&id=" + index; //Log.d("prova video", "prova stringa connessione: " + 
string); imageView.setVisibility(imageView.INVISIBLE); simpleVideoExoPlayer = new SimpleExoPlayer.Builder(this).build(); //null videoView.setPlayer(simpleVideoExoPlayer); DataSource.Factory dataSourceFactory = new DefaultDataSourceFactory(this, Util.getUserAgent(this, "app")); MediaSource dataSource = new ProgressiveMediaSource.Factory(dataSourceFactory).createMediaSource(Uri.parse(string)); videoView.setLayoutParams(new FrameLayout.LayoutParams(ViewGroup.LayoutParams.MATCH_PARENT, ViewGroup.LayoutParams.WRAP_CONTENT)); videoView.setControllerHideOnTouch(true); simpleVideoExoPlayer.prepare(dataSource); simpleVideoExoPlayer.setPlayWhenReady(true); //FIXME simpleVideoExoPlayer.addListener(new Player.EventListener() { @Override public void onPlayerStateChanged(boolean playWhenReady, int playbackState) { if(playbackState == Player.STATE_ENDED) { Log.d("prova video", "IS PLAYING: " +simpleVideoExoPlayer.isPlaying()); Log.d("prova video", "SONO NEL LISTENER STATO FINITO"); //TODO: aggiunta adesso if(simpleAudioExoPlayer == null || simpleAudioExoPlayer.isPlaying() == false) { Log.d("flusso", "sono nel getParagraph dell'if dell'end simpleVideoPlayer"); nextParagraph.setVisibility(nextParagraph.VISIBLE); imageView.setBackgroundColor(255); videoView.setLayoutParams(new FrameLayout.LayoutParams(1, 1)); index = index +1; imageView.setVisibility(imageView.VISIBLE); nextParagraph.setVisibility(nextParagraph.INVISIBLE); getParagraph(); } } } }); } else { imageView.setImageBitmap(null); imageView.setBackgroundColor(Color.parseColor(color.get(index))); } if (!audioName.get(index).isEmpty()) { //SE IL NOMEAUDIO NELLA COLONNA DEL DATABASE E' PRESENTE //SE NON SOSTITUISCO GLI SPAZI CON I CARATTERI %20, L'AUDIO NON VIENE RIPRODOTTO String storyTableNoSpace = PepperStory.storyTitle; storyTableNoSpace = storyTableNoSpace.replaceAll(" ", "%20"); Log.d("prova video", "prova stringa storyTableNoSpace: " + storyTableNoSpace); String string = 
"https://pepper4socialstory.altervista.org/get_audio.php?table=" + storyTableNoSpace + "&id=" + index; Log.d("prova video", "prova stringa connessione: " + string); //String string = "https://pepper4storytelling.altervista.org/get_audio.php?table=" + PepperStory.storyTitle + "&id=" + index; //Log.d("prova audio", "prova stringa connessione: " + string); simpleAudioExoPlayer = new SimpleExoPlayer.Builder(this).build(); audioView.setPlayer(simpleAudioExoPlayer); DataSource.Factory dataSourceFactory = new DefaultDataSourceFactory(this, Util.getUserAgent(this, "app")); MediaSource dataSource = new ProgressiveMediaSource.Factory(dataSourceFactory).createMediaSource(Uri.parse(string)); //audioView.setLayoutParams(new FrameLayout.LayoutParams(ViewGroup.LayoutParams.MATCH_PARENT, ViewGroup.LayoutParams.WRAP_CONTENT)); audioView.setControllerHideOnTouch(true); simpleAudioExoPlayer.prepare(dataSource); simpleAudioExoPlayer.setPlayWhenReady(true); simpleVideoExoPlayer.addListener(new Player.EventListener() { @Override public void onPlayerStateChanged(boolean playWhenReady, int playbackState) { if (playbackState == Player.STATE_ENDED) { Log.d("prova video", "IS PLAYING: " + simpleVideoExoPlayer.isPlaying()); Log.d("prova video", "SONO NEL LISTENER STATO FINITO"); // TODO: aggiunta adesso if (simpleAudioExoPlayer == null || !simpleAudioExoPlayer.isPlaying()) { Log.d("flusso", "sono nel getParagraph dell'if dell'end simpleVideoPlayer"); runOnUiThread(new Runnable() { @Override public void run() { nextParagraph.setVisibility(View.VISIBLE); imageView.setBackgroundColor(Color.WHITE); // Imposta il colore di sfondo bianco videoView.setLayoutParams(new FrameLayout.LayoutParams(1, 1)); index = index + 1; imageView.setVisibility(View.VISIBLE); nextParagraph.setVisibility(View.INVISIBLE); getParagraph(); } }); } } } }); } startTalk(); } ``` Assuming the race condition problem, we tried to use Syncronize() when it increments the index, but nothing
Crash video on SimpleVideoExoPlayer
|android|multithreading|race-condition|exoplayer2.x|
|java|spring-boot|redis|threadpool|jedis|
As already pointed out, in the OAuth 2.0 Authorization Code Flow, the authorization code not only expires shortly after being generated but can also be used only once, as stated in section 4.1.2 of the document linked in the previous answer: https://www.rfc-editor.org/rfc/rfc6749#section-4.1.2

Now, because the authorization code is generated once the user logs in after the first GET request, if you missed capturing the data of the response object, you'll need to log in again to get a fresh code. Note that by default the user's permission is granted only once, so you'll need to override this by including 'show_dialog=true' in your query parameters, or possibly by simply clearing your browser's cache.

show_dialog is mentioned in the API method documentation here: https://developer.spotify.com/documentation/web-api/tutorials/code-flow
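To illustrate, here is a minimal sketch of building the authorize URL with `show_dialog=true`; the client ID, redirect URI, and scope are placeholders, not real credentials:

```python
from urllib.parse import urlencode

# Placeholder values -- substitute your real app credentials
CLIENT_ID = "your_client_id"
REDIRECT_URI = "https://example.com/callback"

params = {
    "client_id": CLIENT_ID,
    "response_type": "code",
    "redirect_uri": REDIRECT_URI,
    "scope": "user-read-private",
    "show_dialog": "true",  # force the consent dialog even if already approved
}
authorize_url = "https://accounts.spotify.com/authorize?" + urlencode(params)
print(authorize_url)
```

Opening this URL in the browser forces the login/consent dialog again, so a fresh code is sent to the redirect URI.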
My React application is connected to a backend written in Python and fetches dropdown data from a database. Below is the UI code:

```
useEffect(() => {
    const fetchLocations = () => {
      try {
        const response = fetch('http://localhost:8000/locations');
        const data = response.json();
        console.log(data);
        setLocations(data);
      } catch (error) {
        console.error('Error fetching locations:', error);
      }
    };

    const fetchQualifications = async () => {
      try {
        const response = await fetch('http://localhost:8000/qualifications');
        const data = await response.json();
        setQualifications(data);
      } catch (error) {
        console.error('Error fetching qualifications:', error);
      }
    };
})
```

And the backend code:

```
@app.route('/submit-form', methods=['POST'])
def submit_form():
    data = request.json
    print(data)
    collection.insert_one({
        'name': data.get('name', ''),
        'age': data.get('age', 0),
        'location': data.get('location', ''),
        'qualification': data.get('qualification', '')
    })
    return jsonify({'message': 'Form data submitted successfully!'}), 201
```

The server is running, and in the URL I am able to see the data in JSON format. The issue is on the front end, where I am not getting the data. Error in console: `GET http://localhost:5000/qualifications net::ERR_FAILED 200 (OK)`
Unable to find any artifacts for the associated workflow - github actions
|github-actions|
I started using psycopg2-binary instead, and it works.
I'd like to know whether a GameObject that is a UI element (a Button, for example) in a screen-space canvas is visible on screen or not. Checking `activeSelf` is not good enough, because if the element is active but has an inactive parent, it won't be visible. I've been googling away for an hour without success. Should I check whether all of its parents are active? That can be costly... Can you kindly help? Thanks
Unity : Check if UI Element in screen space canvas is visible or not
|unity-game-engine|
When using Spring Cloud Stream, I need to set the group value for the binders dynamically. How can I do this?
Spring Cloud Stream Rabbit Binder Dynamic Group Name
|rabbitmq|spring-cloud-stream|spring-rabbit|
The same issue is answered [here][1]. Just wrap your text field with `CompositionLocalProvider`, like below:

    CompositionLocalProvider(
        LocalTextInputService provides null
    ) {
        OutlinedTextField(
            label = { Text("TEXT1") },
            value = fieldValue,
            singleLine = true,
            modifier = Modifier
                .fillMaxWidth()
                .focusRequester(focusRequester),
            onValueChange = { enteredValue ->
                fieldValue = enteredValue
            },
            keyboardOptions = KeyboardOptions.Default.copy(
                imeAction = ImeAction.Done
            ),
            keyboardActions = KeyboardActions(
                onDone = {
                    fieldValue = ""
                    showOutlinedTextField1 = false
                }
            )
        )
    }

  [1]: https://stackoverflow.com/a/69358766
I am using Selenium to automatically log in to the website https://fiverraffiliates.com/loginaffiliate/, but Selenium does not load the page. It just displays a white page and nothing happens (image below). This is my code:

```python
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.common.exceptions import NoSuchElementException
import time

driver = webdriver.Chrome()
driver.implicitly_wait(10)
url = "https://fiverraffiliates.com/loginaffiliate/"

while True:
    try:
        driver.get(url)
        time.sleep(10)
        username_input = driver.find_element(By.NAME, "user")
        print("Success")
        break
    except NoSuchElementException:
        time.sleep(10)
        driver.refresh()
```

[blank website](https://i.stack.imgur.com/RD6p5.png)

Please guide me to a solution for this problem. I need to use Selenium to access this website and log in automatically.
I'm running Unity 2022.3.20f1 and can't find any PolySpatial options anywhere. I've gone through the startup guides and Unity's official videos, but found nothing about why the options wouldn't be here. I am using Unity Pro.

[![enter image description here][1]][1]

[![enter image description here][2]][2]

  [1]: https://i.stack.imgur.com/TI5LM.png
  [2]: https://i.stack.imgur.com/Y0zcW.png
No PolySpatial option under 'window', project settings xr says there's no VisionOS plugins in XR Plug-In Management either
|unity-game-engine|apple-vision|
When the project had one data source, native queries ran fine, now when there are two data sources, hibernate cannot determine the schema for receiving native queries, non-native queries work fine. **application.yaml** ``` spring: autoconfigure: exclude: org.springframework.boot.autoconfigure.jdbc.DataSourceAutoConfiguration flyway: test: testLocations: classpath:/db_test/migration,classpath:/db_test/migration_test testSchemas: my_schema locations: classpath:/db_ems/migration baselineOnMigrate: true schemas: my_schema jpa: packages-to-scan: example.example1.example2.example3.example4 show-sql: false properties: hibernate.dialect: org.hibernate.dialect.PostgreSQL10Dialect hibernate.format_sql: false hibernate.jdbc.batch_size: 50 hibernate.order_inserts: true hibernate.order_updates: true hibernate.generate_statistics: false hibernate.prepare_connection: false hibernate.default_schema: my_schema org.hibernate.envers: audit_table_prefix: log_ audit_table_suffix: hibernate.javax.cache.uri: classpath:/ehcache.xml hibernate.cache: use_second_level_cache: true region.factory_class: org.hibernate.cache.ehcache.internal.SingletonEhcacheRegionFactory hibernate: connection: provider_disables_autocommit: true handling_mode: DELAYED_ACQUISITION_AND_RELEASE_AFTER_TRANSACTION hibernate.ddl-auto: validate # todo: open-in-view: false database-platform: org.hibernate.dialect.H2Dialect #database connections read-only: datasource: url: jdbc:postgresql://localhost:6432/db username: postgres password: postgres configuration: pool-name: read-only-pool read-only: true auto-commit: false schema: my_schema read-write: datasource: url: jdbc:postgresql://localhost:6433/db username: postgres password: postgres configuration: pool-name: read-write-pool auto-commit: false schema: my_schema ``` **Datasources config:** ``` @Configuration public class DataSourceConfig { @Bean @ConfigurationProperties("spring.read-write.datasource") public DataSourceProperties readWriteDataSourceProperties() { 
return new DataSourceProperties(); } @Bean @ConfigurationProperties("spring.read-only.datasource") public DataSourceProperties readOnlyDataSourceProperties() { return new DataSourceProperties(); } @Bean @ConfigurationProperties("spring.read-only.datasource.configuration") public DataSource readOnlyDataSource(DataSourceProperties readOnlyDataSourceProperties) { return readOnlyDataSourceProperties.initializeDataSourceBuilder().type(HikariDataSource.class).build(); } @Bean @ConfigurationProperties("spring.read-write.datasource.configuration") public DataSource readWriteDataSource(DataSourceProperties readWriteDataSourceProperties) { return readWriteDataSourceProperties.initializeDataSourceBuilder().type(HikariDataSource.class).build(); } @Bean @Primary public RoutingDataSource routingDataSource(DataSource readWriteDataSource, DataSource readOnlyDataSource) { RoutingDataSource routingDataSource = new RoutingDataSource(); Map<Object, Object> dataSourceMap = new HashMap<>(); dataSourceMap.put(DataSourceType.READ_WRITE, readWriteDataSource); dataSourceMap.put(DataSourceType.READ_ONLY, readOnlyDataSource); routingDataSource.setTargetDataSources(dataSourceMap); routingDataSource.setDefaultTargetDataSource(readWriteDataSource); return routingDataSource; } @Bean public BeanPostProcessor dialectProcessor() { return new BeanPostProcessor() { @Override public Object postProcessBeforeInitialization(Object bean, String beanName) throws BeansException { if (bean instanceof HibernateJpaVendorAdapter) { ((HibernateJpaVendorAdapter) bean).getJpaDialect().setPrepareConnection(false); } return bean; } }; } } ``` **Routing data sources** ``` public class RoutingDataSource extends AbstractRoutingDataSource { @Override protected Object determineCurrentLookupKey() { return DataSourceTypeContextHolder.getTransactionType(); } @Override public void setTargetDataSources(Map<Object, Object> targetDataSources) { super.setTargetDataSources(targetDataSources); afterPropertiesSet(); } } ``` 
**Depending on whether the transaction is read-only or not, the data source is selected:**

```
public class DataSourceTypeContextHolder {

    private static final ThreadLocal<DataSourceType> contextHolder = new ThreadLocal<>();

    public static void setTransactionType(DataSourceType dataSource) {
        contextHolder.set(dataSource);
    }

    public static DataSourceType getTransactionType() {
        return contextHolder.get();
    }

    public static void clearTransactionType() {
        contextHolder.remove();
    }
}
```

```
@Aspect
@Component
@Slf4j
public class TransactionAspect {

    @Before("@annotation(transactional) && execution(* *(..))")
    public void setTransactionType(Transactional transactional) {
        if (transactional.readOnly()) {
            DataSourceTypeContextHolder.setTransactionType(DataSourceType.READ_ONLY);
        } else {
            DataSourceTypeContextHolder.setTransactionType(DataSourceType.READ_WRITE);
        }
    }

    @AfterReturning("@annotation(transactional) && execution(* *(..))")
    public void clearTransactionType(Transactional transactional) {
        DataSourceTypeContextHolder.clearTransactionType();
    }
}
```

**Error**

```
org.springframework.jdbc.BadSqlGrammarException: PreparedStatementCallback; bad SQL grammar
[UPDATE my_table SET lock_until = timezone('utc', CURRENT_TIMESTAMP) + cast(? as interval),
 locked_at = timezone('utc', CURRENT_TIMESTAMP), locked_by = ?
 WHERE my_table.name = ? AND my_table.lock_until <= timezone('utc', CURRENT_TIMESTAMP)];
nested exception is org.postgresql.util.PSQLException: ERROR: relation "shedlock" does not exist
Position: 8
	at org.springframework.jdbc.support.SQLErrorCodeSQLExceptionTranslator.doTranslate(SQLErrorCodeSQLExceptionTranslator.java:235)
	at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:72)
	at org.springframework.jdbc.core.JdbcTemplate.translateException(JdbcTemplate.java:1443)
	at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:633)
	at org.springframework.jdbc.core.JdbcTemplate.update(JdbcTemplate.java:862)
	at org.springframework.jdbc.core.JdbcTemplate.update(JdbcTemplate.java:883)
	at org.springframework.jdbc.core.namedparam.NamedParameterJdbcTemplate.update(NamedParameterJdbcTemplate.java:321)
	at org.springframework.jdbc.core.namedparam.NamedParameterJdbcTemplate.update(NamedParameterJdbcTemplate.java:326)
	at net.javacrumbs.shedlock.provider.jdbctemplate.JdbcTemplateStorageAccessor.lambda$execute$0(JdbcTemplateStorageAccessor.java:115)
	at org.springframework.transaction.support.TransactionTemplate.execute(TransactionTemplate.java:140)
	at net.javacrumbs.shedlock.provider.jdbctemplate.JdbcTemplateStorageAccessor.execute(JdbcTemplateStorageAccessor.java:115)
	at net.javacrumbs.shedlock.provider.jdbctemplate.JdbcTemplateStorageAccessor.updateRecord(JdbcTemplateStorageAccessor.java:81)
	at net.javacrumbs.shedlock.support.StorageBasedLockProvider.doLock(StorageBasedLockProvider.java:91)
	at net.javacrumbs.shedlock.support.StorageBasedLockProvider.lock(StorageBasedLockProvider.java:65)
	at jdk.internal.reflect.GeneratedMethodAccessor328.invoke(Unknown Source)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
	at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:344)
	at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:205)
	at com.sun.proxy.$Proxy139.lock(Unknown Source)
	at net.javacrumbs.shedlock.core.DefaultLockingTaskExecutor.executeWithLock(DefaultLockingTaskExecutor.java:63)
	at net.javacrumbs.shedlock.spring.aop.MethodProxyScheduledLockAdvisor$LockingInterceptor.invoke(MethodProxyScheduledLockAdvisor.java:86)
	at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:186)
	at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:747)
	at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:689)
	at ru.russianpost.ems.core.assembly.service.impl.scheduler.RpoContentSheduler$$EnhancerBySpringCGLIB$$631d68e1.loadData(<generated>)
	at jdk.internal.reflect.GeneratedMethodAccessor320.invoke(Unknown Source)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
	at org.springframework.scheduling.support.ScheduledMethodRunnable.run(ScheduledMethodRunnable.java:84)
	at org.springframework.scheduling.support.DelegatingErrorHandlingRunnable.run(DelegatingErrorHandlingRunnable.java:54)
	at org.springframework.scheduling.concurrent.ReschedulingRunnable.run(ReschedulingRunnable.java:93)
	at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
	at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: org.postgresql.util.PSQLException: ERROR: relation "my_table" does not exist
```

When I change the native query and qualify the table name with the schema, the query runs normally:

```
UPDATE my_schema.my_table
SET lock_until = timezone('utc', CURRENT_TIMESTAMP) + cast(? as interval),
    locked_at = timezone('utc', CURRENT_TIMESTAMP),
    locked_by = ?
WHERE my_schema.my_table.name = ?
  AND my_schema.my_table.lock_until <= timezone('utc', CURRENT_TIMESTAMP);
```
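For context on why `hibernate.default_schema` does not help here: it only applies to SQL that Hibernate itself generates, while the failing statement (per the stack trace) comes from ShedLock's `NamedParameterJdbcTemplate`, which bypasses Hibernate entirely, so unqualified table names are resolved against the connection's default `search_path`. One possible direction is to set the schema on the connection itself, e.g. via the PostgreSQL driver's `currentSchema` URL parameter, so every JDBC client sees it. A minimal sketch of a hypothetical helper (the `JdbcUrlSchema` and `appendCurrentSchema` names are mine, not from any library):

```java
public class JdbcUrlSchema {

    /**
     * Appends the PostgreSQL JDBC driver's currentSchema parameter to a
     * connection URL, so unqualified table names resolve against that schema
     * for every client of the connection (Hibernate, JdbcTemplate, ShedLock).
     */
    public static String appendCurrentSchema(String jdbcUrl, String schema) {
        // Use '?' for the first parameter, '&' if the URL already has parameters.
        String separator = jdbcUrl.contains("?") ? "&" : "?";
        return jdbcUrl + separator + "currentSchema=" + schema;
    }
}
```

With this applied to both pool URLs (e.g. `jdbc:postgresql://localhost:6432/db?currentSchema=my_schema`), the native query should not need the `my_schema.` prefix; this is a sketch under the assumption that both databases expose the same schema name.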
I would not suggest storing the data in JSON, because you need the values at compile time: something like `string value = businessMessagesModel.MSG_1.en` has to be available to the compiler.

As a different approach, here is an example of a class used as an enum. You add the messages at compile time and can pick them dynamically by key name:

```C#
using System;
using System.Collections.Generic;
using System.Linq;
using System.Reflection;

public sealed class RejectReasons
{
    public static RejectReasons SI { get; } = new RejectReasons("SI", "INVALID SELLER ID");
    public static RejectReasons SD { get; } = new RejectReasons("SD", "INVALID SELLER DBA NAME");
    public static RejectReasons SM { get; } = new RejectReasons("SM", "INVALID SELLER MCC");
    public static RejectReasons SS { get; } = new RejectReasons("SS", "INVALID SELLER STREET ADDRESS 1");
    public static RejectReasons SN { get; } = new RejectReasons("SN", "INVALID SELLER CITY NAME");
    public static RejectReasons SR { get; } = new RejectReasons("SR", "INVALID SELLER REGION CODE");
    public static RejectReasons SP { get; } = new RejectReasons("SP", "INVALID SELLER POSTAL CODE");
    public static RejectReasons SC { get; } = new RejectReasons("SC", "INVALID SELLER COUNTRY CODE");
    public static RejectReasons SU { get; } = new RejectReasons("SU", "INVALID SELLER CURRENCY CODE");
    public static RejectReasons SL { get; } = new RejectReasons("SL", "INVALID SELLER LANGUAGE (Canada)");
    public static RejectReasons AX { get; } = new RejectReasons("AX", "AMEX ISSUE. Please contact Premium Partner Servicing for details.");

    private RejectReasons(string name, string description)
    {
        Name = name;
        Description = description;
    }

    public string Name { get; }
    public string Description { get; }

    public override string ToString() => Name;

    public static IEnumerable<string> GetNames() =>
        GetValues().Select(reason => reason.Name);

    public static string GetValue(string name) =>
        GetValues().FirstOrDefault(reason => reason.Name == name)?.Description;

    public static IReadOnlyList<RejectReasons> GetValues() =>
        typeof(RejectReasons).GetProperties(BindingFlags.Public | BindingFlags.Static)
            .Select(property => (RejectReasons)property.GetValue(null))
            .ToList();
}
```

Usage:

```C#
RejectReasons.GetValues();                       // returns all reasons as IReadOnlyList<RejectReasons>
Console.WriteLine(RejectReasons.GetValue("AX")); // prints the description text
```

**UPDATE**: Another option is a class used as an enumerator, which is closer to how you want to use your solution, without JSON:

```C#
public static class JobSettings
{
    public static JobOptions ExcessiveReattempts { get; } = new JobOptions("ExcessiveReattempts msg");
    public static JobOptions VisaDataConsistency { get; } = new JobOptions("DataConsistency msg");
    public static JobOptions VisaExcessiveFallbacks { get; } = new JobOptions("ExcessiveFallbacks msg");
}

public class JobOptions
{
    public string Msg { get; set; }

    public JobOptions() { }

    public JobOptions(string msg)
    {
        Msg = msg;
    }
}
```

Usage:

```C#
Console.WriteLine(JobSettings.ExcessiveReattempts.Msg);
```
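As an aside, the class-as-enum pattern exists in C# mainly because C# enums cannot carry extra payload. In languages whose enums do (Java, for instance), the same idea collapses into a plain enum with a field; a rough sketch for comparison (names invented for illustration, using a subset of the codes above):

```java
public enum RejectReason {
    SI("INVALID SELLER ID"),
    SD("INVALID SELLER DBA NAME"),
    SM("INVALID SELLER MCC");
    // remaining codes omitted for brevity

    private final String description;

    RejectReason(String description) {
        this.description = description;
    }

    public String getDescription() {
        return description;
    }

    // Dynamic lookup by key name, analogous to GetValue(name) above;
    // returns null for an unknown code instead of throwing.
    public static String getValue(String name) {
        for (RejectReason reason : values()) {
            if (reason.name().equals(name)) {
                return reason.getDescription();
            }
        }
        return null;
    }
}
```

No reflection is needed here because `values()` is generated by the compiler, which is the main simplification over the reflection-based `GetValues()` in the C# version.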