I have a basic Maven/Spring test project with some simple JUnit tests and Selenium/Cucumber tests that I later want to run using GitHub Actions, but first I want to make sure they run locally. This is the JUnit "test":

```
package com.example.CucumberSelenium;

import org.junit.jupiter.api.Test;
import org.springframework.boot.test.context.SpringBootTest;

import static org.junit.jupiter.api.Assertions.assertTrue;

@SpringBootTest
public class GithubActionsTests {

    @Test
    public void test() {
        assertTrue(true);
    }
}
```

This is my Cucumber test runner file:

```
package com.example.CucumberSelenium;

import io.cucumber.junit.Cucumber;
import io.cucumber.junit.CucumberOptions;
import org.junit.runner.RunWith;
import org.springframework.boot.test.context.SpringBootTest;

@RunWith(Cucumber.class)
@CucumberOptions(features = "src/test/java/com/example/CucumberSelenium/resources/features", glue = "com.example.CucumberSelenium.stepdefs")
public class CucumberTestRunnerTest {
}
```

This is my pom.xml file:

```
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>3.2.4</version>
        <relativePath/> <!-- lookup parent from repository -->
    </parent>
    <groupId>com.example</groupId>
    <artifactId>CucumberSelenium</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <name>CucumberSelenium</name>
    <description>testing</description>
    <properties>
        <java.version>21</java.version>
    </properties>
    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>io.cucumber</groupId>
            <artifactId>cucumber-java</artifactId>
            <version>7.11.2</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>io.cucumber</groupId>
            <artifactId>cucumber-junit</artifactId>
            <version>7.11.2</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.seleniumhq.selenium</groupId>
            <artifactId>selenium-java</artifactId>
            <version>4.17.0</version>
        </dependency>
    </dependencies>
    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-surefire-plugin</artifactId>
                <version>2.19.1</version>
                <configuration>
                    <includes>
                        <include>**/*Tests.java</include>
                        <include>**/*Test.java</include>
                        <include>**/GithubActionsTests.java</include>
                        <include>**/CucumberTestRunnerTest.java</include>
                    </includes>
                </configuration>
            </plugin>
        </plugins>
    </build>
</project>
```

This is my file structure: [![enter image description here](https://i.stack.imgur.com/EDBsm.png)](https://i.stack.imgur.com/EDBsm.png)

Both tests run fine by themselves in IntelliJ, but when running `mvn test` only the Cucumber tests run. Changing the `maven-surefire-plugin` version to 2.22.2 reverses the result: only the JUnit test runs and the Cucumber tests don't. So I guess there is some dependency issue or an issue with the plugin, but I can't figure out what. Please advise.
Match multiple combinations of two columns
|mariadb|
I'm trying to store the selected option (value) of a dropdown menu in a variable, but I'm getting errors saying it doesn't exist. I understand that HTML can have ordering issues, but because the popup's HTML elements are all created inside a function in a JS file, I'm not sure how to work around that. ![picture of popup](https://i.stack.imgur.com/8pP54.png) Essentially, I'm trying to use something like this at a specific point to save the selected option: `var chosenGender = document.getElementById("mySelect").value;` This gives me a `TypeError: document.getElementById(...) is null`, which I believe means it can't find my mySelect element, which agrees with all the other tests I've run. This happens even when I check for an element I created and id'ed immediately before, leading me to believe there's just something I'm missing here. How would I go about detecting completely new HTML elements that were inserted only with JS?
There isn't necessarily an error. If the p-value is smaller than the smallest double precision floating point number, it [underflows](https://en.wikipedia.org/wiki/Arithmetic_underflow) and you get a zero. This would not be a bug in your code or in SciPy; it's just a fundamental limitation of floating point arithmetic. If your sample size is large, it doesn't take much of a difference in sample means to get a zero p-value. ```python3 import numpy as np from scipy import stats rng = np.random.default_rng(83469358365936) x = rng.random(1000) stats.ttest_ind(x, x + 1) # TtestResult(statistic=-76.66392731424226, pvalue=0.0, df=1998.0) ```
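For a sense of where that limit lies: the smallest positive (subnormal) double is about `5e-324`, so any p-value whose true value falls below that underflows to exactly zero:

```python
import numpy as np

# Smallest positive (subnormal) double-precision value
tiny = np.nextafter(0.0, 1.0)
print(tiny)        # 5e-324
print(tiny / 2.0)  # 0.0 -- anything smaller underflows to zero
```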
Here is my HTML, CSS, JAVASCRIPT (THREE.JS) code. Can you please analyze the code and check for my desired output that i put in my title. ``` //**HTMLCODE** <!DOCTYPE html> <html lang="en"> <head> <meta charset="utf-8" /> <link rel="icon" href="%PUBLIC_URL%/favicon.ico" /> <meta name="viewport" content="width=device-width, initial-scale=1" /> <meta name="theme-color" content="#000000" /> <link rel="stylesheet" href="/src/index.css" /> <script type="importmap"> { "imports": { "three": "https://unpkg.com/three@v0.150.1/build/three.module.js", "three/addons/": "https://unpkg.com/three@v0.150.1/examples/jsm/" } } </script> <meta name="description" content="Web site created using create-react-app" /> <link rel="manifest" href="%PUBLIC_URL%/manifest.json" /> <title>EXAMPLE</title> </head> <body> <div id="root"></div> <div class="globe-render" id="globe-render"> <script type="module" src="./globe.js"></script> </div> </body> </html> **//CSS CODE:** @tailwind base; @tailwind components; @tailwind utilities; body { background-color: #000f14; margin: 0; position: relative; font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", "Roboto", "Oxygen", "Ubuntu", "Cantarell", "Fira Sans", "Droid Sans", "Helvetica Neue", sans-serif; -webkit-font-smoothing: antialiased; -moz-osx-font-smoothing: grayscale; } code { font-family: source-code-pro, Menlo, Monaco, Consolas, "Courier New", monospace; } #globe-render { position: absolute; top: -20%; left: 20%; background-color: rgb(255, 255, 255); } #myCanvas { position: absolute; background-color: green; } //JS CODE import * as THREE from "three"; import { OrbitControls } from "three/addons/controls/OrbitControls.js"; import { EffectComposer } from "three/addons/postprocessing/EffectComposer.js"; import { RenderPass } from "three/addons/postprocessing/RenderPass.js"; import { UnrealBloomPass } from "three/addons/postprocessing/UnrealBloomPass.js"; //Radius define let ParticleSurfaceLayer = 7.5; let GlobeRadius = 6.6; const scene = new 
THREE.Scene(); const camera = new THREE.PerspectiveCamera( 75, window.innerWidth / window.innerHeight, 0.1, 1000 ); const renderer = new THREE.WebGLRenderer({ alpha: true }); renderer.setSize(window.innerWidth, window.innerHeight); const container = document.querySelector(".globe-render"); renderer.setClearColor(0x000000,0); container.appendChild(renderer.domElement); // Group for globe and particle system const group = new THREE.Group(); scene.add(group); // Bloom pass for the globe const renderScene = new RenderPass(scene, camera); const globeBloomPass = new UnrealBloomPass( new THREE.Vector2(window.innerWidth, window.innerHeight), 1.5, 0.4, 0.85 ); globeBloomPass.threshold = 0; globeBloomPass.strength = 2; globeBloomPass.radius = 0; const globeComposer = new EffectComposer(renderer); globeComposer.setSize(window.innerWidth, window.innerHeight); globeComposer.addPass(renderScene); globeComposer.addPass(globeBloomPass); // Bloom pass for particles const particleComposer = new EffectComposer(renderer); particleComposer.setSize(window.innerWidth, window.innerHeight); particleComposer.addPass(new RenderPass(group, camera)); // Assuming 'group' contains particles const particleBloomPass = new UnrealBloomPass( new THREE.Vector2(window.innerWidth, window.innerHeight), 1.5, 0.4, 0.85 ); particleBloomPass.threshold = 0; particleBloomPass.strength = 1.3; particleBloomPass.radius = 0.8; particleComposer.addPass(particleBloomPass); // Update size function function updateSize() { const newWidth = window.innerWidth; const newHeight = window.innerHeight; camera.aspect = newWidth / newHeight; camera.updateProjectionMatrix(); renderer.setSize(newWidth, newHeight); const sphereRadius = Math.min(newWidth, newHeight) * 0.1; globe.geometry = new THREE.SphereGeometry(sphereRadius, 32, 32); const particleSizeMin = Math.min(newWidth, newHeight) * 0.001; const particleSizeMax = Math.min(newWidth, newHeight) * 0.004; group.children.forEach((particle) => { const randomSize = 
THREE.MathUtils.randFloat( particleSizeMin, particleSizeMax ); particle.scale.set(randomSize, randomSize, randomSize); }); // Update composer sizes globeComposer.setSize(newWidth, newHeight); particleComposer.setSize(newWidth, newHeight); } //orbit controls const controls = new OrbitControls(camera, renderer.domElement); const loader = new THREE.TextureLoader(); controls.enableZoom = false; // controls.enabled=false // Initial setup //////// const geometry = new THREE.SphereGeometry(GlobeRadius, 80, 80); const material1 = new THREE.MeshBasicMaterial({ map: loader.load("./8k_earth_nightmap_underlayer.jpg"), //transparent: true, opacity: 1, }); const material2 = new THREE.MeshBasicMaterial({ color: 0xff047e, transparent: true, opacity: 0.1, }); const multimaterial = [material1, material2]; const globe = new THREE.Mesh(geometry, material1); globe.layers.set(1); group.add(globe); // Particle System const particleCount = 600; const color = new THREE.Color("#fc2414"); for (let i = 0; i < particleCount; i++) { // ... 
(same as your code) const phi = Math.random() * Math.PI * 2; const theta = Math.random() * Math.PI - Math.PI / 2; const randomradius = 0.01 + Math.random() * 0.04; const radius = ParticleSurfaceLayer; // Radius of the sphere const x = radius * Math.cos(theta) * Math.cos(phi); const y = radius * Math.cos(theta) * Math.sin(phi); const z = radius * Math.sin(theta); const particleGeometry = new THREE.SphereGeometry(randomradius, 30, 25); // Initial size const particleMaterial = new THREE.MeshBasicMaterial({ color: "#00FFFF", }); const particle = new THREE.Mesh(particleGeometry, particleMaterial); particle.position.set(x, y, z); particle.layers.set(1); group.add(particle); } const ambientLight = new THREE.AmbientLight(0xffffff, 100); scene.add(ambientLight); // Camera position camera.position.z = 15; // Rotation animation const rotationSpeed = 0.001; // Animation function const animate = function () { requestAnimationFrame(animate); group.rotation.y += rotationSpeed; // Render globe with bloom effect camera.layers.set(1); globeComposer.render(); // Render particles with bloom effect particleComposer.render(); }; ///////orbit controls///* let isDragging = false; let originalRotation = group.rotation.y; // Event listener for mouse down renderer.domElement.addEventListener("mousedown", () => { isDragging = true; }); // Event listener for mouse up renderer.domElement.addEventListener("mouseup", () => { isDragging = false; // Reset the rotation to its original position group.rotation.y = originalRotation; }); // Event listener for mouse leave (in case mouse leaves the canvas while dragging) renderer.domElement.addEventListener("mouseleave", () => { if (isDragging) { isDragging = false; // Reset the rotation to its original position group.rotation.y = originalRotation; } }); // Handle window resize window.addEventListener("resize", updateSize); // Start animation animate(); const canvas = document.querySelector("canvas"); canvas.id = "myCanvas"; 
canvas.classList.add("myCanvasClass");
```

![Screen Shot of my rendered page](https://i.stack.imgur.com/4evOJ.png)

I tried `setClearColor` in the JS and `transparent` in the CSS block for the canvas. I want the canvas background to change from black to transparent. Can someone please help with this?
import { useCallback, useMemo, useState, useEffect } from 'react'; import Head from 'next/head'; import ArrowDownOnSquareIcon from '@heroicons/react/24/solid/ArrowDownOnSquareIcon'; import ArrowUpOnSquareIcon from '@heroicons/react/24/solid/ArrowUpOnSquareIcon'; import PlusIcon from '@heroicons/react/24/solid/PlusIcon'; import { Box, Button, Container, Stack, SvgIcon, Typography } from '@mui/material'; import { useSelection } from 'src/hooks/use-selection'; import { Layout as DashboardLayout } from 'src/layouts/dashboard/layout'; import { applyPagination } from 'src/utils/apply-pagination'; import { Modal, Backdrop, Fade, TextField } from '@mui/material'; import AddProductModal from 'src/components/AddProductModal'; import AddItemsModal from 'src/components/AddItemModal'; import { useDispatch, useSelector } from 'react-redux'; import { getAllProducts } from 'src/redux/productSlice'; import AddCategoryModal from 'src/components/AddCategoryModal'; import { MaterialReactTable, useMaterialReactTable } from 'material-react-table'; import { getAllCategories } from 'src/redux/categorySlice'; import { formatDistanceToNow } from 'date-fns'; import { parseISO } from 'date-fns'; import { Table, TableBody, TableCell, TableContainer, TableHead, TableRow, Paper } from '@mui/material'; const useCategories = () => { const categoriesState = useSelector((state) => state.category.categories); const { user } = useSelector((state) => state.auth); return useMemo(() => { if (user.isVendor) { const filteredCategories = categoriesState.filter((category) => category.item._id === user.vendorDetails.item ); return filteredCategories; } return categoriesState; }, [categoriesState, user, user]); }; const useCategoryIds = (categories) => { return useMemo(() => { if (categories) { return categories.map((category) => category._id); } return []; }, [categories]); }; const renderDetailPanel = ({ row }) => { const products = row.original.products; if (!products || products.length === 0) { return 
null; } return ( <TableContainer component={Paper}> <Table size="small" aria-label="a dense table"> <TableHead> <TableRow> <TableCell>Product Name</TableCell> <TableCell align="right">Location</TableCell> <TableCell align="right">Price</TableCell> <TableCell align="right">Dimension</TableCell> <TableCell align="right">Unit</TableCell> <TableCell align="right">Created At</TableCell> </TableRow> </TableHead> <TableBody> {products.map((product) => ( <TableRow key={product._id}> <TableCell component="th" scope="row"> {product.name} </TableCell> <TableCell align="right">{product.location}</TableCell> <TableCell align="right">{product.price}</TableCell> <TableCell align="right">{product.dimension}</TableCell> <TableCell align="right">{product.unit}</TableCell> <TableCell align="right"> {formatDistanceToNow(parseISO(product.createdAt), { addSuffix: true })} </TableCell> </TableRow> ))} </TableBody> </Table> </TableContainer> ); }; const ProductsPage = () => { const dispatch = useDispatch(); const { user } = useSelector((state) => state.auth); const { categoryStatus } = useSelection((state) => state.category) const [page, setPage] = useState(0); const [rowsPerPage, setRowsPerPage] = useState(5); const categories = useCategories(page, rowsPerPage); const categoryIds = useCategoryIds(categories); const categoriesSelection = useSelection(categoryIds); const [isModalOpen, setIsModalOpen] = useState(false); const [isAddProductModalOpen, setIsAddProductModalOpen] = useState(false); const [isAddItemsModalOpen, setIsAddItemsModalOpen] = useState(false); const [isAddCategoryModalOpen, setIsAddCategoryModalOpen] = useState(false); useEffect(() => { dispatch(getAllCategories()); }, [dispatch, isModalOpen]); const handlePageChange = useCallback( (event, value) => { setPage(value); }, [] ); const handleRowsPerPageChange = useCallback( (event) => { setRowsPerPage(event.target.value); }, [] ); const handleOpenModal = () => { setIsModalOpen(true); }; const handleCloseModal = () => { 
setIsModalOpen(false); }; const handleOpenAddProductModal = () => { setIsAddProductModalOpen(true); }; const handleCloseAddProductModal = () => { setIsAddProductModalOpen(false); }; const handleOpenAddItemsModal = () => { setIsAddItemsModalOpen(true); }; const handleCloseAddItemsModal = () => { setIsAddItemsModalOpen(false); }; const handleOpenAddCategoryModal = () => { setIsAddCategoryModalOpen(true); }; const handleCloseAddCategoryModal = () => { setIsAddCategoryModalOpen(false); }; const columns = useMemo( () => [ { accessorKey: 'item.img', header: 'Image', Cell: ({ row }) => ( <Box sx={{ display: 'flex', alignItems: 'center', gap: '1rem', }} > <img alt="avatar" height={50} src={row.original.item.img} loading="lazy" style={{ borderRadius: '50%' }} /> {/* using renderedCellValue instead of cell.getValue() preserves filter match highlighting */} </Box> ), }, { accessorKey: 'item.name', header: 'Category Item', size: 200, }, { accessorKey: 'name', header: 'Category Name', size: 200, }, { accessorKey: 'unit', header: 'Unit', size: 150, }, { accessorKey: 'createdAt', header: 'Created At', size: 150, Cell: ({ row }) => { const formattedDate = formatDistanceToNow(parseISO(row.original.createdAt), { addSuffix: true }); return <span>{formattedDate}</span>; }, }, ], [], ); const table = useMaterialReactTable({ columns, data: categories, renderDetailPanel, }); return ( <> <Head> <title>Categories | Your App Name</title> </Head> <Box component="main" sx={{ flexGrow: 1, py: 8, }} > <Container maxWidth="xl"> <Stack spacing={3}> <Stack direction="row" justifyContent="space-between" spacing={4} > <Stack spacing={1} direction="row"> <Typography variant="h4">Product/Categories</Typography> <Stack alignItems="center" direction="row" spacing={1} > </Stack> </Stack> <Stack spacing={1} direction="row"> <Button startIcon={( <SvgIcon fontSize="small"> <PlusIcon /> </SvgIcon> )} variant="contained" onClick={handleOpenAddProductModal} > Add Product </Button> <AddProductModal 
isOpen={isAddProductModalOpen} onClose={handleCloseAddProductModal} /> {!user.isVendor && (<Button startIcon={( <SvgIcon fontSize="small"> <PlusIcon /> </SvgIcon> )} variant="contained" onClick={handleOpenAddItemsModal} > Add Items </Button>)} <AddItemsModal isOpen={isAddItemsModalOpen} onClose={handleCloseAddItemsModal} /> {!user.isVendor && (<Button startIcon={( <SvgIcon fontSize="small"> <PlusIcon /> </SvgIcon> )} variant="contained" onClick={handleOpenAddCategoryModal} > Add Categories </Button>)} <AddCategoryModal isOpen={isAddCategoryModalOpen} onClose={handleCloseAddCategoryModal} /> </Stack> </Stack> {categories && Array.isArray(categories) && categories.length > 0 && <MaterialReactTable table={table} />} </Stack> </Container> </Box> </> ); }; ProductsPage.getLayout = (page) => ( <DashboardLayout> {page} </DashboardLayout> ); export default ProductsPage; The above is my products page in a Next.js app; there are other pages too, like orders. All page routing works fine, but once I navigate to the products page and then click a link to another page in the side nav, the page doesn't change, and there is no error in the console. All other pages work, and the products page itself works fine; only navigation away from it stops working after I've visited it.
next js routing issue
|javascript|reactjs|next.js|next.js13|
I am using a Bootstrap responsive table. The problem I am facing is that the text in one table column is overflowing. As far as I know, a responsive table adjusts its column widths automatically, but here I have an input field inside a `td` with a placeholder, and the placeholder text is overflowing. How can I fix this?

**Placeholder: Drag and Drop here**

**Note: This problem happens on mobile devices only.**

```
<div class="table-responsive border">
  <table class="table" id="dataTable" width="100%" style="overflow-x:auto">
    <thead>
      <tr>
        <th>Document Type</th>
        <th class="text-center">Upload</th>
        <th>Open</th>
        <th>Resolved</th>
        <th>Not Required</th>
      </tr>
    </thead>
    <tbody id="missingBody">
      <tr>
        <td>Front and Back - Passport/Driver’s License/Photo Car</td>
        <td>
          <div class="file-field">
            <div class="file-path-wrapper"><input type="file" name="file1" missval="dcn_56" multiple=""><input style="vertical-align:top;padding-bottom:8px;" class="file-path validate" type="text" id="dcn_56" doctype="56" multiple="" placeholder="Drag and Drop files here"></div>
          </div>
        </td>
        <td><input type="radio" id="open56" missinginfoid="56" checked="" value="71" name="Missing0" class="custom-control-input"><label class="custom-control-label" for="open56"></label></td>
        <td><input type="radio" id="resolve56" missinginfoid="56" value="72" name="Missing0" class="custom-control-input"><label class="custom-control-label" for="resolve56"></label></td>
        <td><input type="radio" id="close56" missinginfoid="56" value="73" name="Missing0" class="custom-control-input"><label class="custom-control-label" for="close56"></label></td>
      </tr>
    </tbody>
  </table>
</div>
```

**Output: https://ibb.co/TrCSrk4**

The placeholder "Drag and Drop here" should stay on one line.
There are two options:

1. Install the driver using `chromedriver-autoinstaller`. First install the package:

   `pip install chromedriver-autoinstaller`

   ```
   from selenium import webdriver
   from selenium.webdriver.chrome.options import Options
   from selenium.webdriver.chrome.service import Service
   import chromedriver_autoinstaller

   chromedriver_autoinstaller.install()

   options = webdriver.ChromeOptions()
   options.add_argument('--start-maximized')
   options.add_experimental_option('excludeSwitches', ['enable-logging'])
   driver = webdriver.Chrome(service=Service(), options=options)
   ```

2. Pass the `chromedriver.exe` path to the service directly:

   ```
   options = webdriver.ChromeOptions()
   options.add_argument("--start-maximized")
   options.add_experimental_option("excludeSwitches", ["enable-logging"])
   driver = webdriver.Chrome(
       service=Service(
           "C:/Users/yourusername/.cache/selenium/chromedriver/win32/112.0.5615.49/chromedriver.exe"
       ),
       options=options,
   )
   ```
I am having some difficulty deciding on the best way to structure a large test. This is more of an end-to-end test of a particular workflow within the application. I am using Python 3.11 and pytest. My current test looks like this:

```
def test_workflow(fixture_1, fixture_2, fixture_3):
    # Some arrange code
    # Some act code
    # Lots of assert statements
```

I dislike having so many asserts in one test, but I am not sure how else to structure it.

- I cannot move the arrange and act code into fixtures and put the assert statements in separate tests, because this code is expensive to run (especially in CI). I want the arrange and act steps to run just once. This is an end-to-end workflow; everything being checked is part of this one workflow.
- I also cannot do the above and declare these fixtures as `scope="module"`. My tests/fixtures depend on other fixtures that are defined with `function` scope, and they are external to my part of the application, so I can't easily edit them.

So I am stuck with my one large test. Are there any solutions to structure this better? Thanks!
What's the best way to breakup a large test in pytest
|python-3.x|unit-testing|pytest|fixtures|end-to-end|
I just started to work with LED matrix (16*16) and attiny85. The current purpose is to switch on a led on each row, where led number is the number of row (I know that led strip is like a snake, it does not matter for now). So, I written an `byte matrix[16][16]` and manually put a digit into target cells. It worked well. After that I replace `byte matrix[16][16]` into a `rgb matrix[16][16]` where `struct rgb {byte r, byte g, byte b}` and it doesn\`t work correctly (see photos below). The LED functions: ``` #define LED PB0 #define byte unsigned char struct rgb { byte r; byte g; byte b; }; // @see https://agelectronica.lat/pdfs/textos/L/LDWS2812.PDF // HIGH 0.8mks +/- 150ns and 0.45mks +/- 150ns // LOW 0.4mks +/- 150ns and 0.85mks +/- 150ns // 1s/8000000 = 125ns for 1 tact void setBitHigh(byte pin) { PORTB |= _BV(pin); // 1 tactDuration = 125ns asm("nop"); asm("nop"); asm("nop"); asm("nop"); asm("nop"); // 0.75mks PORTB &= ~_BV(pin); // 1 tactDuration asm("nop"); asm("nop"); asm("nop"); // 0.5mks } void setBitLow(byte pin) { PORTB |= _BV(pin); // 1 tactDuration asm("nop"); asm("nop"); // 0.375mks PORTB &= ~_BV(pin); // 1 tactDuration asm("nop"); asm("nop"); asm("nop"); asm("nop"); asm("nop"); asm("nop"); // 0.875mks } void trueByte(byte pin, byte intensity) { for (int i = 7; i >= 0; i--) { intensity & _BV(i) ? setBitHigh(pin) : setBitLow(pin); } } void falseByte(byte pin) { for (int i = 0; i < 8; i++) { setBitLow(pin); } } void setPixel(byte pin, rgb color) { DDRB |= _BV(pin); color.g > 0 ? trueByte(pin, color.g) : falseByte(pin); color.r > 0 ? trueByte(pin, color.r) : falseByte(pin); color.b > 0 ? 
trueByte(pin, color.b) : falseByte(pin); } ``` working well code: ``` void display(byte matrix[WIDTH][HEIGHT]) { for (byte rowIdx = 0; rowIdx < WIDTH; rowIdx++) { for (byte cellIdx = 0; cellIdx < HEIGHT; cellIdx++) { setPixel(LED, {matrix[rowIdx][cellIdx], 0, 0}); } } } byte m[WIDTH][HEIGHT] = { { 15, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0}, { 0, 15, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0}, { 0, 0, 15, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0}, ... { 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 15, 0}, { 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 15} }; int main() { display(m); } ``` result: [![enter image description here][1]][1] NOT working well code: ``` void display(rgb matrix[WIDTH][HEIGHT]) { for (byte rowIdx = 0; rowIdx < WIDTH; rowIdx++) { for (byte cellIdx = 0; cellIdx < HEIGHT; cellIdx++) { setPixel(LED, matrix[rowIdx][cellIdx]); } } } rgb m[WIDTH][HEIGHT] = { { {15, 0, 0}, {0, 0, 0}, {0, 0, 0}, {0, 0, 0}, {0, 0, 0}, {0, 0, 0}, {0, 0, 0}, {0, 0, 0}, {0, 0, 0}, {0, 0, 0}, {0, 0, 0}, {0, 0, 0}, {0, 0, 0}, {0, 0, 0}, {0, 0, 0}, {0, 0, 0} }, ... }; int main() { display(m); } ``` result: [![enter image description here][2]][2] I will be glad to any advice... [1]: https://i.stack.imgur.com/c7dW8.jpg [2]: https://i.stack.imgur.com/0ykNX.jpg
It's immensely hacky and time-consuming, and so suitable only for a chart that you really care about, but you can annotate a plot with rectangles to produce the desired effect. First, you have to work out (by trial and error) where ggplot is placing each dodged bar's centre point (a manual process, but for factor variables it's not too painful). Once you have that and the relevant maximum and minimum values (derived from the underlying plot), creating a df with the relevant rectangle coordinates and placing it over the plot (with a suitable alpha) seems to work. In the image, the orange bars have been partly covered by white rectangles with alpha 0.6, to de-emphasise them. The effect is quite nice IMO.

[![enter image description here][1]][1]

[1]: https://i.stack.imgur.com/Wk7jm.png
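A minimal sketch of the approach (the data, and the xmin/xmax values standing in for the trial-and-error bar centres, are entirely hypothetical; you'd substitute the coordinates you worked out for your own plot):

```r
library(ggplot2)

# Hypothetical dodged bar chart (stand-in data)
df <- data.frame(grp  = rep(c("A", "B"), each = 2),
                 fill = rep(c("x", "y"), 2),
                 val  = c(3, 5, 2, 6))
p <- ggplot(df, aes(grp, val, fill = fill)) +
  geom_col(position = "dodge")

# Rectangle coordinates found by trial and error around each dodged
# bar's centre; ymin/ymax come from the underlying values
shade <- data.frame(xmin = c(0.55, 1.55), xmax = c(1.00, 2.00),
                    ymin = 0, ymax = c(3, 2))

# Semi-transparent white rectangles laid over the bars to de-emphasise them
p + geom_rect(data = shade,
              aes(xmin = xmin, xmax = xmax, ymin = ymin, ymax = ymax),
              fill = "white", alpha = 0.6, inherit.aes = FALSE)
```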
The login page is the only one in the application, and you should *not* use a dynamic result returned from the action with a `location` hardcoded in the code. There should be a getter for the parameter used in `${page}`, such as

```
public String getPage() {
    return page;
}
```

The JSP file should be contained in the web application folder: after the WAR file is unpacked on the server, it must be available to the web application inside the web content folder. You should not use an *absolute* path from the hard drive when specifying the location; after the application is deployed, paths are resolved as absolute from the context root. So put your JSP in the web content folder and use a path that is absolute from there. The context path is added automatically if you redirect to the login page. Results that are common to all actions can be configured globally.

```xml
<global-results>
    <result name="login" type="redirect">/opacLogin_irdxc.jsp</result>
</global-results>
```

You can learn more if you read [Global results across different packages defined in struts configuration file][1].

[1]: https://stackoverflow.com/a/16875911/573032
|deep-learning|large-language-model|huggingface|llama|peft|
I'm creating my own links manually within a module of a divi theme, since divi doesn't support three links side by side. It's supposed to look like this: [Three side-by-side links](https://i.stack.imgur.com/RZ5iM.png) But when the screen is resized it looks like this: [Split looking link](https://i.stack.imgur.com/RNPZF.png) Divi breaks its responsive at certain points and this happens before it gets to the tablet size. This is the code I've placed in the module's css: ` a.work { background-color:#b7ad68; font-size: .5em; font-family: "helvetica", san-serif; letter-spacing: 2px; font-weight: 500; padding: 10px 20px 10px 20px; border-radius: 50px; color:white; } a:hover.work { background-color: #ffffff; border: solid 2px #b7ad68; color: #b7ad68; } a.podcast { background-color:#f1b36e; font-size: .5em; font-family: "helvetica", san-serif; letter-spacing: 2px; font-weight: 500; padding: 10px 20px 10px 20px; border-radius: 50px; color:white; } a:hover.podcast { background-color: #ffffff; border: solid 2px #f1b36e; color: #f1b36e; } a.speak { background-color:#e58059; font-size: .5em; font-family: "helvetica", san-serif; letter-spacing: 2px; font-weight: 500; padding: 10px 20px 10px 20px; border-radius: 50px; color:white; } a:hover.speak { background-color: #ffffff; border: solid 2px #e58059; color: #e58059; } HTML: <p style="text-align: center;"> <a class="work" href="/work-with-me"> <span style="font-family: ETmodules; font-size: 1.5em; font-weight: 300; padding-top: 10px; position: relative; top: .15em;">&#xe0ef;</span>&nbsp;WORK WITH ME</a>&nbsp;&nbsp;<a class="podcast" href="/work-with-me"><span style="font-family: ETmodules; font-size: 1.5em; font-weight: 300; position: relative; top: .15em;">&#xe01b;</span>&nbsp;PODCAST</a>&nbsp;&nbsp;<a class="speak"><span style="font-family: ETmodules; font-size: 1.5em; font-weight: 300;position: relative; top: .15em;">&#xe031;</span>&nbsp;SPEAKING</a></p> `
Since you are doing a left join, it's possible that `TABLE2.DATE` and `TABLE2.ProductCode` will be null, so you need to include that possibility in your WHERE clause. You will also need to update the SUM arguments as shown:

```
SELECT TABLE1.Date, TABLE1.ProductCode,
       SUM(CASE WHEN TABLE2.ProductCode IS NULL THEN 0 ELSE 1 END) AS X
FROM TABLE1
LEFT JOIN TABLE2 ON TABLE1.ProductCode = TABLE2.PRODUCT
WHERE (TABLE2.DATE = '1/1' OR TABLE2.ProductCode IS NULL)
  AND TABLE1.ProductCode IN ('AAA','BBB','CCC','DDD','EEE')
GROUP BY TABLE1.Date, TABLE1.ProductCode
```
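To see why the `OR ... IS NULL` matters, here is a minimal in-memory SQLite demo (the table shapes and data are hypothetical, mirroring TABLE1/TABLE2 from the question): without the null check, the unmatched `BBB` row would be filtered out entirely; with it, it survives and sums to 0.

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Hypothetical minimal tables mirroring TABLE1 and TABLE2
cur.execute("CREATE TABLE TABLE1 (Date TEXT, ProductCode TEXT)")
cur.execute("CREATE TABLE TABLE2 (DATE TEXT, PRODUCT TEXT, ProductCode TEXT)")
cur.executemany("INSERT INTO TABLE1 VALUES (?, ?)",
                [("1/1", "AAA"), ("1/1", "BBB")])
# Only AAA has a matching row in TABLE2; BBB produces NULLs in the left join
cur.execute("INSERT INTO TABLE2 VALUES ('1/1', 'AAA', 'AAA')")

rows = cur.execute("""
    SELECT TABLE1.Date, TABLE1.ProductCode,
           SUM(CASE WHEN TABLE2.ProductCode IS NULL THEN 0 ELSE 1 END) AS X
    FROM TABLE1
    LEFT JOIN TABLE2 ON TABLE1.ProductCode = TABLE2.PRODUCT
    WHERE (TABLE2.DATE = '1/1' OR TABLE2.ProductCode IS NULL)
      AND TABLE1.ProductCode IN ('AAA','BBB','CCC','DDD','EEE')
    GROUP BY TABLE1.Date, TABLE1.ProductCode
""").fetchall()

print(sorted(rows))  # [('1/1', 'AAA', 1), ('1/1', 'BBB', 0)]
```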
I want to implement Spring Cloud Kubernetes. I created this test project: https://github.com/rcbandit111/mockup/tree/master

Configuration:

```yaml
spring:
  application:
    name: mockup
  cloud:
    kubernetes:
      discovery-server-url: "http://spring-cloud-kubernetes-discoveryserver"
```

I also tried `spring-cloud-kubernetes-discoveryserver.default.svc:80` as the server URL. I have deployed the discovery server using this deployment file:

```yaml
---
apiVersion: v1
kind: List
items:
  - apiVersion: v1
    kind: Service
    metadata:
      labels:
        app: spring-cloud-kubernetes-discoveryserver
      name: spring-cloud-kubernetes-discoveryserver
    spec:
      ports:
        - name: http
          port: 80
          targetPort: 8761
      selector:
        app: spring-cloud-kubernetes-discoveryserver
      type: ClusterIP
  - apiVersion: v1
    kind: ServiceAccount
    metadata:
      labels:
        app: spring-cloud-kubernetes-discoveryserver
      name: spring-cloud-kubernetes-discoveryserver
  - apiVersion: rbac.authorization.k8s.io/v1
    kind: RoleBinding
    metadata:
      labels:
        app: spring-cloud-kubernetes-discoveryserver
      name: spring-cloud-kubernetes-discoveryserver:view
    roleRef:
      kind: Role
      apiGroup: rbac.authorization.k8s.io
      name: namespace-reader
    subjects:
      - kind: ServiceAccount
        name: spring-cloud-kubernetes-discoveryserver
  - apiVersion: rbac.authorization.k8s.io/v1
    kind: Role
    metadata:
      namespace: default
      name: namespace-reader
    rules:
      - apiGroups: ["", "extensions", "apps"]
        resources: ["services", "endpoints"]
        verbs: ["get", "list", "watch"]
  - apiVersion: apps/v1
    kind: Deployment
    metadata:
      name: spring-cloud-kubernetes-discoveryserver-deployment
    spec:
      selector:
        matchLabels:
          app: spring-cloud-kubernetes-discoveryserver
      template:
        metadata:
          labels:
            app: spring-cloud-kubernetes-discoveryserver
        spec:
          serviceAccount: spring-cloud-kubernetes-discoveryserver
          containers:
            - name: spring-cloud-kubernetes-discoveryserver
              image: springcloud/spring-cloud-kubernetes-discoveryserver:2.1.0-M3
              imagePullPolicy: IfNotPresent
              readinessProbe:
                httpGet:
                  port: 8761
                  path: /actuator/health/readiness
              livenessProbe:
                httpGet:
                  port: 8761
                  path: /actuator/health/liveness
              ports:
                - containerPort: 8761
```

I get this error:

```
2024-03-30 23:31:43.254  WARN 1 --- [           main] ConfigServletWebServerApplicationContext : Exception encountered during context initialization - cancelling refresh attempt: org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'compositeDiscoveryClient' defined in class path resource [org/springframework/cloud/client/discovery/composite/CompositeDiscoveryClientAutoConfiguration.class]: Unsatisfied dependency expressed through method 'compositeDiscoveryClient' parameter 0; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'kubernetesDiscoveryClient' defined in class path resource [org/springframework/cloud/kubernetes/discovery/KubernetesDiscoveryClientAutoConfiguration$Servlet.class]: Bean instantiation via factory method failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [org.springframework.cloud.client.discovery.DiscoveryClient]: Factory method 'kubernetesDiscoveryClient' threw exception; nested exception is org.springframework.cloud.kubernetes.discovery.DiscoveryServerUrlInvalidException: spring.cloud.kubernetes.discovery-server-url must be specified and a valid URL.
2024-03-30 23:31:43.548  INFO 1 --- [           main] o.apache.catalina.core.StandardService   : Stopping service [Tomcat]
2024-03-30 23:31:43.855  INFO 1 --- [           main] ConditionEvaluationReportLoggingListener : Error starting ApplicationContext. To display the conditions report re-run your application with 'debug' enabled.
```
```
2024-03-30 23:31:44.454 ERROR 1 --- [           main] o.s.boot.SpringApplication               : Application run failed

org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'compositeDiscoveryClient' defined in class path resource [org/springframework/cloud/client/discovery/composite/CompositeDiscoveryClientAutoConfiguration.class]: Unsatisfied dependency expressed through method 'compositeDiscoveryClient' parameter 0; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'kubernetesDiscoveryClient' defined in class path resource [org/springframework/cloud/kubernetes/discovery/KubernetesDiscoveryClientAutoConfiguration$Servlet.class]: Bean instantiation via factory method failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [org.springframework.cloud.client.discovery.DiscoveryClient]: Factory method 'kubernetesDiscoveryClient' threw exception; nested exception is org.springframework.cloud.kubernetes.discovery.DiscoveryServerUrlInvalidException: spring.cloud.kubernetes.discovery-server-url must be specified and a valid URL.
	at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:800) ~[spring-beans-5.3.27.jar!/:5.3.27]
	at org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:541) ~[spring-beans-5.3.27.jar!/:5.3.27]
	... (further Spring bean-factory and context-refresh frames omitted)
	at org.springframework.boot.SpringApplication.run(SpringApplication.java:1306) ~[spring-boot-2.6.15.jar!/:2.6.15]
	at com.mockup.mockup.MockupApplication.main(MockupApplication.java:34) ~[classes!/:na]
	at org.springframework.boot.loader.JarLauncher.main(JarLauncher.java:88) ~[mockup.jar:na]
Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'kubernetesDiscoveryClient' defined in class path resource [org/springframework/cloud/kubernetes/discovery/KubernetesDiscoveryClientAutoConfiguration$Servlet.class]: Bean instantiation via factory method failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [org.springframework.cloud.client.discovery.DiscoveryClient]: Factory method 'kubernetesDiscoveryClient' threw exception; nested exception is org.springframework.cloud.kubernetes.discovery.DiscoveryServerUrlInvalidException: spring.cloud.kubernetes.discovery-server-url must be specified and a valid URL.
	... 25 common frames omitted
Caused by: org.springframework.beans.BeanInstantiationException: Failed to instantiate [org.springframework.cloud.client.discovery.DiscoveryClient]: Factory method 'kubernetesDiscoveryClient' threw exception; nested exception is org.springframework.cloud.kubernetes.discovery.DiscoveryServerUrlInvalidException: spring.cloud.kubernetes.discovery-server-url must be specified and a valid URL.
	... 42 common frames omitted
Caused by: org.springframework.cloud.kubernetes.discovery.DiscoveryServerUrlInvalidException: spring.cloud.kubernetes.discovery-server-url must be specified and a valid URL.
	at org.springframework.cloud.kubernetes.discovery.KubernetesDiscoveryClient.<init>(KubernetesDiscoveryClient.java:40) ~[spring-cloud-kubernetes-discovery-2.1.9.jar!/:2.1.9]
	at org.springframework.cloud.kubernetes.discovery.KubernetesDiscoveryClientAutoConfiguration$Servlet.kubernetesDiscoveryClient(KubernetesDiscoveryClientAutoConfiguration.java:67) ~[spring-cloud-kubernetes-discovery-2.1.9.jar!/:2.1.9]
	at java.base/jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:103) ~[na:na]
	at java.base/java.lang.reflect.Method.invoke(Method.java:580) ~[na:na]
	at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:154) ~[spring-beans-5.3.27.jar!/:5.3.27]
	... 43 common frames omitted
```

Discovery server is successfully deployed, but again I get this issue. Do you know what value I need to set?

**EDIT**: I tried to deploy `image: springcloud/spring-cloud-kubernetes-discoveryserver:3.1.1` (just changed the version number in the above YAML config).
I get this error stack (the same "Health check failed" stack then repeats several times):

```
java.lang.RuntimeException: io.kubernetes.client.openapi.ApiException:
	at org.springframework.cloud.kubernetes.client.KubernetesClientPodUtils.internalGetPod(KubernetesClientPodUtils.java:112) ~[spring-cloud-kubernetes-client-autoconfig-3.1.1.jar:3.1.1]
	at org.springframework.cloud.kubernetes.commons.LazilyInstantiate.get(LazilyInstantiate.java:47) ~[spring-cloud-kubernetes-commons-3.1.1.jar:3.1.1]
	at org.springframework.cloud.kubernetes.client.KubernetesClientHealthIndicator.getDetails(KubernetesClientHealthIndicator.java:44) ~[spring-cloud-kubernetes-client-autoconfig-3.1.1.jar:3.1.1]
	at org.springframework.cloud.kubernetes.commons.AbstractKubernetesHealthIndicator.doHealthCheck(AbstractKubernetesHealthIndicator.java:72) ~[spring-cloud-kubernetes-commons-3.1.1.jar:3.1.1]
	at org.springframework.boot.actuate.health.AbstractHealthIndicator.health(AbstractHealthIndicator.java:82) ~[spring-boot-actuator-3.2.4.jar:3.2.4]
	at reactor.core.publisher.MonoCallable.call(MonoCallable.java:72) ~[reactor-core-3.6.4.jar:3.6.4]
	at reactor.core.publisher.FluxSubscribeOnCallable$CallableSubscribeOnSubscription.run(FluxSubscribeOnCallable.java:228) ~[reactor-core-3.6.4.jar:3.6.4]
	at reactor.core.scheduler.SchedulerTask.call(SchedulerTask.java:68) ~[reactor-core-3.6.4.jar:3.6.4]
	at reactor.core.scheduler.SchedulerTask.call(SchedulerTask.java:28) ~[reactor-core-3.6.4.jar:3.6.4]
	at java.base/java.util.concurrent.FutureTask.run(Unknown Source) ~[na:na]
	at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(Unknown Source) ~[na:na]
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source) ~[na:na]
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source) ~[na:na]
	at java.base/java.lang.Thread.run(Unknown Source) ~[na:na]
Caused by: io.kubernetes.client.openapi.ApiException:
	at io.kubernetes.client.openapi.ApiClient.handleResponse(ApiClient.java:989) ~[client-java-api-19.0.1.jar:na]
	at io.kubernetes.client.openapi.ApiClient.execute(ApiClient.java:905) ~[client-java-api-19.0.1.jar:na]
	at io.kubernetes.client.openapi.apis.CoreV1Api.readNamespacedPodWithHttpInfo(CoreV1Api.java:26769) ~[client-java-api-19.0.1.jar:na]
	at io.kubernetes.client.openapi.apis.CoreV1Api.readNamespacedPod(CoreV1Api.java:26747) ~[client-java-api-19.0.1.jar:na]
	at org.springframework.cloud.kubernetes.client.KubernetesClientPodUtils.internalGetPod(KubernetesClientPodUtils.java:107) ~[spring-cloud-kubernetes-client-autoconfig-3.1.1.jar:3.1.1]
	... 13 common frames omitted
2024-03-31T18:35:06.737Z  WARN 1 --- [oundedElastic-4] .s.c.k.c.KubernetesClientHealthIndicator : Health check failed
(identical stack trace repeated at 18:35:16.420Z, 18:35:16.425Z, 18:35:16.454Z, ...)
```

Full log: https://pastebin.com/gZ7cyJPZ
[CSS layers][1] are awesome, except for one (in my opinion) unintuitive behavior: they have a lower priority than unlayered styles:

```css
/* site style */
div { color: red; }

/* user style */
@layer myflavor {
  div { color: blue; }
}

/* The div is still red :-( */
```

So I cannot, for example, use layers in user styles, because the user style may have no effect if a rule for that selector is already defined. It also makes it complex to gradually migrate a site's styles to a layer-based approach, unless the complete original CSS is wrapped in a layer first, which may be difficult to achieve.

Is there a feature which makes a layer have a higher priority than unlayered styles? Something like `@layer! foo {...}` or `@option layer-priority layered unlayered`?

[1]: https://developer.mozilla.org/en-US/docs/Web/CSS/@layer
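For context, the "wrap the original CSS in a layer first" workaround mentioned above can be sketched like this (the layer names `base` and `myflavor` are made up for illustration; later-declared layers win over earlier ones):

```css
/* Declare layer order up front: myflavor is declared after base,
   so rules in myflavor beat rules in base at equal specificity. */
@layer base, myflavor;

/* Original site CSS, now wrapped in the lowest-priority layer. */
@layer base {
  div { color: red; }
}

/* User/override styles: these now take effect. */
@layer myflavor {
  div { color: blue; } /* the div is blue */
}
```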
currently i am working on a Java question that is related to Java thread and concurrency. The thread should handles Multiple Processes, Multiple Processors and Single Priority Queue. It has 5 main functions which are set_number_of_processors, reg, start, schedule, and terminate. My code works fine with 3 sessions. However, when it has 4 sessions. It shows errors. I need some help in figuring what is the problems with my code. Thank you in advance. ``` import java.util.concurrent.locks.ReentrantLock; import java.util.concurrent.locks.Condition; //Note that the 'notifyAll' method or similar import java.util.ArrayList; import java.util.LinkedList; public class OS implements OS_sim_interface { int pid = 0; int numProcessors; int numAvailableProcessors; ArrayList<Process> processes = new ArrayList<Process>(); //An ArrayList to store the processes ArrayList<Process>[] runningProcesses; LinkedList<Process> readyQueue = new LinkedList<Process>(); //A LinkedList to store the waiting processes //lock and condition variables ReentrantLock lock = new ReentrantLock(); Condition condition = lock.newCondition(); OS(){ //default constructor } @Override public void set_number_of_processors(int nProcessors) { if(nProcessors < 1){ System.out.println("Number of processors must be greater than 1"); } runningProcesses = new ArrayList[nProcessors]; for (int i = 0; i < nProcessors; i++) { runningProcesses[i] = new ArrayList<Process>(); } numAvailableProcessors = nProcessors; numProcessors = nProcessors; } @Override public int reg(int priority) { if(priority <= 0){ System.out.println("Please give a valid priority (minimum 1)"); return -1; }else{ //Create a new process Process process = new Process(pid, priority); //Add the process to the list of processes processes.add(process); pid++; //Increment the process ID recorded by OS return process.ID; } } @Override public void start(int ID) { if(ID < 0 || ID >= processes.size()){ System.out.println("Invalid process ID"); }else{ Process 
currProcess = processes.get(ID); int currProcessor = 0; for (int i = 0; i < numProcessors; i++) { if (runningProcesses[i].isEmpty()) { currProcessor = i; break; } } //Check if there are available processors if(numAvailableProcessors > 0){ lock.lock(); try{ currProcess.run(); runningProcesses[currProcessor].add(currProcess); System.out.println(currProcess.name + " is running on Processor-" + currProcessor); numAvailableProcessors--; }catch(Exception e){ System.out.println("Hi, I'm your exception: " + e); }finally { lock.unlock(); } }else { lock.lock(); try{ readyQueue.add(currProcess); condition.await(); System.out.println(currProcess.name + " is added in readyQueueStart"); }catch(Exception e){ System.out.println("Hi, I'm your exception: " + e); }finally { lock.unlock(); } } } } @Override public void schedule(int ID) { if(ID < 0 || ID >= processes.size()){ System.out.println("Invalid process ID"); }else{ Process currProcess = processes.get(ID); int currProcessor = 0; for (int i = 0; i < numProcessors; i++) { if (runningProcesses[i].contains(currProcess)) { currProcessor = i; break; } // else do nothing } if(!readyQueue.isEmpty()){ lock.lock(); try{ //add the current process to readyQueue readyQueue.add(currProcess); System.out.println(currProcess.name + " is added in readyQueue"); runningProcesses[currProcessor].remove(currProcess); //wake up the waiting process condition.signal(); lock.unlock(); //wait for the process to finish lock.lock(); //run next process in readyQueue Process newProcess = readyQueue.poll(); newProcess.run(); runningProcesses[currProcessor].add(newProcess); System.out.println(newProcess.name + " is scheduled running on Processor-" + currProcessor); //wait for the process to finish condition.await(); }catch(Exception e){ System.out.println("Hi, I'm your exception: " + e); }finally { lock.unlock(); } return; }else{ System.out.println(currProcess.name + " is already running on Processor-" + currProcessor); } } } @Override public void terminate(int 
ID) { if(ID < 0 || ID >= processes.size()){ System.out.println("Invalid process ID"); }else{ Process currProcess = processes.get(ID); int currProcessor = 0; for (int i = 0; i < numProcessors; i++) { if (runningProcesses[i].contains(currProcess)) { currProcessor = i; break; } // else do nothing } if(readyQueue.isEmpty()){ System.out.println(currProcess.name + " is terminated"); return; }else{ lock.lock(); try{ //remove the current process from runningProcesses runningProcesses[currProcessor].remove(currProcess); System.out.println(currProcess.name + " is terminated"); //wake up the waiting process condition.signal(); //run next process in readyQueue Process newProcess = readyQueue.poll(); newProcess.run(); runningProcesses[currProcessor].add(newProcess); System.out.println(newProcess.name + " is scheduled running on Processor-" + currProcessor); }catch(Exception e){ System.out.println("Hi, I'm your exception: " + e); }finally { lock.unlock(); } } } } class Process extends Thread{ int ID; int priority; String name; Process(int ID, int priority) { this.ID = ID; this.priority = priority; this.name = "Process-" + ID; } } } ``` Thread Class: Default ``` class ProcessSimThread2 extends Thread { int pid = -1; int start_session_length=0; OS os; ProcessSimThread2(OS os){this.os = os;} //Constructor stores reference to os for use in run() public void run(){ os.start(pid); events.add("pid="+pid+", session=0"); try {Thread.sleep(start_session_length);} catch (InterruptedException e) {e.printStackTrace();} os.schedule(pid); events.add("pid="+pid+", session=1"); os.schedule(pid); events.add("pid="+pid+", session=2"); os.terminate(pid); events.add("pid="+pid+", session=3"); //Error occured at this session. 
It works fine if i commented this out }; }; ``` Test Class: Default ``` public void test(){ System.out.println("\n\n\n*********** test *************"); events = new ConcurrentLinkedQueue<String>(); //List of process events //Instantiate OS simulation for two processors OS os = new OS(); os.set_number_of_processors(2); int priority1 = 1; //Create two process simulation threads: int pid0 = os.reg(priority1); ProcessSimThread2 p0 = new ProcessSimThread2(os); p0.start_session_length = 250; p0.pid = pid0; //p0 grabs first processor and keeps it for 250ms int pid1 = os.reg(priority1); ProcessSimThread2 p1 = new ProcessSimThread2(os); p1.start_session_length = 50; p1.pid = pid1; //p1 grabs 2nd processor and keeps it for 50ms int pid2 = os.reg(priority1); ProcessSimThread2 p2 = new ProcessSimThread2(os); p2.start_session_length = 0; p2.pid = pid2; //p2 tries to get processor straight away but has to wait for p1 os.schedule call //Start the treads making sure that p0 will get to its first os.start() p0.start(); sleep(20); p1.start(); sleep(25); //make sure that p1 has grabbed a processor before starting p2 p2.start(); //Give time for all the process threads to complete: sleep(test_timeout); String[] expected = { "pid=0, session=0", "pid=1, session=0", "pid=2, session=0", "pid=1, session=1", "pid=2, session=1", "pid=1, session=2", "pid=2, session=2","pid=1, session=3", "pid=2, session=3", "pid=0, session=1", "pid=0, session=2", "pid=0, session=3"}; System.out.println("\nUR4 - NOW CHECKING"); //Check expected events against actual: String test_status = "UR4 PASSED"; if (events.size() == expected.length) { Iterator <String> iterator = events.iterator(); int index=0; while (iterator.hasNext()) { String event = iterator.next(); if (event.equals(expected[index])) System.out.println("Expected event = "+ expected[index] + ", actual event = " + event + " --- MATCH"); else { test_status = "UR3 FAILED - NO MARKS"; System.out.println("Expected event = "+ expected[index] + ", actual 
event = " + event + " --- ERROR"); } index++; } } else { System.out.println("Number of events expected = " + expected.length + ", number of events reported = " + events.size()); test_status = "UR4 FAILED - NO MARKS"; } System.out.println("\n" + test_status); } ``` Current Output: ``` Process-0 is running on Processor-0 Process-1 is running on Processor-1 Process-1 is added in readyQueue Process-2 is scheduled running on Processor-1 Process-2 is added in readyQueueStart Process-2 is added in readyQueue Process-1 is scheduled running on Processor-1 Process-1 is added in readyQueue Process-2 is scheduled running on Processor-1 Process-2 is added in readyQueue Process-1 is scheduled running on Processor-1 Process-1 is terminated Process-2 is scheduled running on Processor-1 Process-2 is terminated Process-0 is already running on Processor-0 Process-0 is already running on Processor-0 Process-0 is terminated UR4 - NOW CHECKING Expected event = pid=0, session=0, actual event = pid=0, session=0 --- MATCH Expected event = pid=1, session=0, actual event = pid=1, session=0 --- MATCH Expected event = pid=2, session=0, actual event = pid=2, session=0 --- MATCH Expected event = pid=1, session=1, actual event = pid=1, session=1 --- MATCH Expected event = pid=2, session=1, actual event = pid=2, session=1 --- MATCH Expected event = pid=1, session=2, actual event = pid=1, session=2 --- MATCH Expected event = pid=2, session=2, actual event = pid=1, session=3 --- ERROR Expected event = pid=1, session=3, actual event = pid=2, session=2 --- ERROR Expected event = pid=2, session=3, actual event = pid=2, session=3 --- MATCH Expected event = pid=0, session=1, actual event = pid=0, session=1 --- MATCH Expected event = pid=0, session=2, actual event = pid=0, session=2 --- MATCH Expected event = pid=0, session=3, actual event = pid=0, session=3 --- MATCH ``` 1. Thread safe' and 'synchronized' classes (e.g. those in java.util.concurrent) other than the two imported above MUST not be used. 2. 
the keyword 'synchronized', or any other thread-safe classes or mechanisms, must not be used. 3. Delays or 'busy waiting' (spin-lock) methods are not allowed.
Multiple Processes, Multiple Processors, Single Priority Queue - Java Thread Safety and Concurrency
|java|session|concurrency|thread-safety|
The code `char *line; line = strncpy(line, req, size);` has undefined behavior: `line` is an uninitialized pointer, so you cannot copy anything to it. My recommendation is [**you should never use `strncpy`**][1]: it does not do what you think. In your code, you should instead use `char *line = strndup(req, size);` which allocates memory and copies the string fragment to it. The memory should be freed after use with `free(line)`. `strndup()` was first standardized in POSIX and finally became part of the C Standard in the latest version, so it is available on most systems; if your target does not have it, it can be defined this way:

```
#include <stdlib.h>
#include <string.h>

char *strndup(const char *s, size_t n) {
    char *p;
    size_t i;

    for (i = 0; i < n && s[i] != '\0'; i++)
        continue;
    p = malloc(i + 1);
    if (p != NULL) {
        memcpy(p, s, i);
        p[i] = '\0';
    }
    return p;
}
```

There are other problems in your code:

- `strcmp(req, "GET")` returns `0` if the strings have the same characters, so you should write:

```
if (strcmp(req, "GET") == 0) {
    return GET;
}
```

or

```
if (!strcmp(req, "GET")) {
    return GET;
}
```

- you should reverse the order of the tests in `while((*(req + size) != '\n') && (size < req_size))` to avoid accessing `req[req_size]`.
- in `accept_incoming_request` you should allocate the destination array with an extra byte for the null terminator and set this null byte at the end of the received packet with: `buffer[recvd_bytes] = '\0';`

[1]: https://randomascii.wordpress.com/2013/04/03/stop-using-strncpy-already/
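Putting those fixes together, the line-extraction step might look like the sketch below (the function and buffer names are illustrative, not taken from the original code):

```c
#include <stdlib.h>
#include <string.h>

/* Portable strndup() fallback as defined above (renamed so it cannot
   clash with the libc version where it already exists). */
static char *my_strndup(const char *s, size_t n) {
    size_t i;
    char *p;

    for (i = 0; i < n && s[i] != '\0'; i++)
        continue;
    p = malloc(i + 1);
    if (p != NULL) {
        memcpy(p, s, i);
        p[i] = '\0';
    }
    return p;
}

/* Extract the first line of req (at most req_size bytes; the buffer
   need not be null-terminated). The bound is tested BEFORE the
   dereference, fixing the test-order problem called out above.
   The caller must free() the result. */
char *extract_first_line(const char *req, size_t req_size) {
    size_t size = 0;

    while (size < req_size && req[size] != '\n')
        size++;
    return my_strndup(req, size);
}
```

Note that the bounds check `size < req_size` comes first in the `&&`, so `req[size]` is never read out of range.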
I have a simple animation where I have a static rectangle slowly moving horizontally so that it pushes a circle across the 'floor'. How can I make the circle roll instead of slide? I tried adding friction to the circle and the floor, and removing friction from the pushing rectangle, but it makes no difference.

```
var boxX = 10;
var boxY = 390;

const ball = Bodies.circle(100, 400, 80, { friction: 100 });
const box = Bodies.rectangle(boxX, boxY, 20, 20, { friction: 0, isStatic: true });
const floor = Bodies.rectangle(500, 480, 1000, 20, { isStatic: true });

move();

function move() {
    boxX += 1;
    Matter.Body.setPosition(box, {x: boxX, y: boxY}, [updateVelocity=true]);
    window.requestAnimationFrame(move);
};
```
How to make a circle roll when pushed, using Matter.js?
|javascript|matter.js|
Here is my code:

```
function ScrollToActiveTab(item, id, useraction) {
  if (item !== null && item !== undefined && useraction) {
    dispatch(addCurrentMenu(item));
  }

  requestAnimationFrame(() => {
    // Ensure this runs after any pending layout changes
    var scrollableDiv = document.getElementById('scrollableDiv');
    let tempId = 'targetId-' + id;
    var targetElement = document.getElementById(tempId);

    if (targetElement) {
      var targetPosition = targetElement.offsetLeft + targetElement.clientWidth / 2 - window.innerWidth / 2;
      // Perform the scroll
      scrollableDiv.scrollLeft = targetPosition;
    }
  });
}
```

Please guide me on why `scrollableDiv.scrollLeft = targetPosition;` does not work on Safari. I want `scrollableDiv.scrollLeft = targetPosition` to work in Safari as well. Thanks.
Horizontal scroll works in Chrome, but not in Safari
|javascript|google-chrome|scroll|safari|mobile-safari|
**My current setup is:**

- Multipage application
- JSP, JS and CSS
- PWA with URLs precached

**I have kind of a unique use case here:**

- Phones that are used to connect to the app might be shared
- Connections are very unstable (sometimes no connection for half a day)
- Data should be accessible through the interface only by an authenticated user
- The data should be accessible after the first login for each user
- Users are not really tech-savvy
- PWAs use JavaScript and therefore have restricted possibilities for encryption

**What could be a good idea to store user credentials on the device, even if they are encrypted, especially when using JavaScript? What kind of flow is recommended?**

I thought about creating a unique token once the user has successfully logged in. It is stored encrypted together with the username. This token, in combination with the username, can then be used to re-login as long as the application is offline. As soon as the app is online again, the user is asked to log in with their real credentials. If this succeeds, the token is deleted and a new one is created (and shown to the user) when the user logs out.
PWA Offline Login Procedure
|authentication|browser|progressive-web-apps|offline|
The key takeaway from [Harshita](https://stackoverflow.com/users/19648279/harshitha)'s [link](https://stackoverflow.com/a/77263657/6309) is the flexibility of the `RoutePrefix` setting in Swagger UI options, which allows you to tailor Swagger's accessibility based on the environment (local vs. Azure). See "[Customize the HTTP endpoint](https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-http-webhook-trigger?tabs=python-v2%2Cisolated-process%2Cnodejs-v4%2Cfunctionsv2&pivots=programming-language-csharp#customize-the-http-endpoint)". So you should adapt the Swagger and Swagger UI configuration in your ASP.NET Core application to dynamically set the [`RoutePrefix`](https://learn.microsoft.com/en-us/dotnet/api/microsoft.azure.management.network.models.parameter.routeprefix?view=azure-dotnet) based on whether the application is running locally or in Azure. That would make sure the Swagger UI works correctly in both environments without manual adjustments for each deployment. ```csharp // Determine the running environment var isAzure = Environment.GetEnvironmentVariable("ASPNETCORE_ENVIRONMENT") == "Production"; // Configure Swagger app.UseSwagger(options => options.RouteTemplate = "swagger/v1/swagger.json"); // Configure Swagger UI app.UseSwaggerUI(x => { x.SwaggerEndpoint("/swagger/v1/swagger.json", "Web API V1"); // Set the RoutePrefix based on the environment x.RoutePrefix = isAzure ? "" : "swagger"; }); ``` That code snippet checks the environment using the `ASPNETCORE_ENVIRONMENT` variable ([this you can set here](https://stackoverflow.com/a/34622196/6309)), a common way to differentiate between development and production environments in ASP.NET Core applications. It assumes that the `Production` value indicates deployment on Azure. - When running locally (not in `Production`), `RoutePrefix` is set to `"swagger"`, making the Swagger UI accessible at `/swagger/index.html`. 
- When deployed to Azure (`Production`), `RoutePrefix` is set to an empty string, making the Swagger UI accessible directly at the root (`/index.html`). ---- That should also address the concern of making Swagger accessible at the desired path without causing cross-origin token redemption issues related to the OAuth2 flow with Azure AD (`AADSTS9002326`), which comes from Azure AD's handling of OAuth2 redirects, particularly in scenarios where the redirect seems to originate from a different origin than expected. When serving Swagger from the root (`/`) on Azure, the redirect URI used for OAuth2 authentication changes, affecting how Azure AD processes these requests. If Azure AD perceives these requests as cross-origin from an unauthorized origin, it will block them due to security policies designed to prevent unauthorized token redemption. By dynamically setting the `RoutePrefix` based on the deployment environment (local development versus Azure), you make sure the OAuth2 redirect URIs remain consistent with the expectations of Azure AD, whether the application is accessed locally or hosted on Azure. - Local development: The Swagger UI is accessed at `/swagger/index.html`, and OAuth2 redirects use paths consistent with local development settings. Since there is no cross-origin concern locally, authentication flows work as expected. - Azure deployment: With `RoutePrefix` set to an empty string, Swagger UI is served from the root (`/index.html`), and OAuth2 redirects are expected to originate from the root. That setup would match Azure's security policies for OAuth2 redirects, avoiding the cross-origin issue since the redirects no longer appear to come from a different origin. ----- > What I failed to understand is the case of the prefix for the local run to be set to empty string. How can I achieve that? And if I can't, what's the reason? > > The question isn't "how to make it work locally?" but rather "how to make it work locally with the empty string as prefix?". 
In ASP.NET Core, [setting the `RoutePrefix` to an empty string (`""`)](https://learn.microsoft.com/en-us/aspnet/core/tutorials/getting-started-with-swashbuckle?view=aspnetcore-8.0&tabs=visual-studio#add-and-configure-swagger-middleware) means you want to serve Swagger UI directly from the application's base URL (e.g., `https://localhost:5001/`). That setup is straightforward when deploying to Azure or any environment where you have control over the base URL and can make sure there are no conflicts with other routes. However, doing this locally is more complex, because of how the application routes are resolved and potential conflicts with other endpoints in your application:

- Your application might have other routes that could conflict with Swagger's assets when served from the root. For example, if you have a controller action mapped to the root, it might take precedence or conflict with the Swagger UI route, leading to the Swagger UI not being accessible.
- ASP.NET Core [processes middleware in the order](https://learn.microsoft.com/en-us/aspnet/core/fundamentals/middleware/?view=aspnetcore-8.0#middleware-order) they are added to the pipeline in `Startup.Configure`. If Swagger middleware (`UseSwaggerUI`) is configured *before* static files middleware (`UseStaticFiles`), there might be issues serving the Swagger UI assets from the root because the request might be intercepted by the static files middleware first, especially if there are files in the `wwwroot` directory or other middleware that handles requests to the root path.

To make Swagger work locally with the `RoutePrefix` set to an empty string, you would have to:

- confirm that `UseSwagger` and `UseSwaggerUI` are correctly ordered in your middleware pipeline. Ideally, `UseSwaggerUI` should come after `UseRouting` and before any `UseEndpoints` calls.
- make sure there are no other routes or controllers that handle requests to the root (`/`).
That might require adjusting your application's routing to accommodate Swagger UI at the root. For instance: ```csharp public void Configure(IApplicationBuilder app, IWebHostEnvironment env) { // Other middleware (e.g., error handling) can go here app.UseRouting(); app.UseSwagger(); app.UseSwaggerUI(c => { c.SwaggerEndpoint("/swagger/v1/swagger.json", "My API V1"); c.RoutePrefix = string.Empty; }); // More configurations, like UseAuthorization(), UseEndpoints(), etc. } ```
The error message indicates that the **Simple-QRCode** package requires the **PHP GD extension**, which is currently missing or not enabled on your system. Here are the steps to enable the GD extension:

**1.** Open your PHP configuration file (**php.ini**). In your case, it seems to be located at `C:\xampp\php\php.ini`.

**2.** Search for the following line in the php.ini file:

```
;extension=gd
```

Remove the semicolon at the beginning of the line to uncomment it:

```
extension=gd
```

**3.** Save the php.ini file and restart your web server (e.g., Apache) for the changes to take effect.

After enabling the GD extension, you should be able to install the Simple-QRCode package without any issues.
I am trying to migrate from the legacy FCM (Firebase Cloud Messaging) API to the new FCM API V1. For this I have done a few things on the FCM side as per the FCM documentation: I have created the service account and downloaded the JSON file for authenticating requests. My code is in C#, so on the C# side I have made the following changes:

1. Used OAuth 2.0 for authentication.
2. Used the downloaded JSON file to get the credentials and used those credentials to generate the access token.
3. Updated the FCM endpoint and used the new endpoint https://fcm.googleapis.com/v1/yourprojectid/messages:send
4. Updated the JSON payload structure.

Below is the code that I have written for achieving this:

```
PersUser persUser = PersUser.GetPersUser();
string pushToken = persUser.PushNotificationToken;

if (persUser != null && !String.IsNullOrEmpty(pushToken))
{
    string senderId = string.Empty;

    // Load credentials from the JSON key file
    GoogleCredential credential;
    using (var stream = new FileStream(@"JSON_FILE_PATH", FileMode.Open, FileAccess.Read))
    {
        credential = GoogleCredential.FromStream(stream)
            .CreateScoped("https://www.googleapis.com/auth/firebase.messaging");
    }

    string accessToken = String.Empty;
    // Obtain an access token
    if (credential != null)
    {
        var token = await credential.UnderlyingCredential.GetAccessTokenForRequestAsync();
        accessToken = Convert.ToString(token);
    }

    senderId = ConfigurationManager.AppSettings["FIREBASESENDERID"];

    var httpWebRequest = (HttpWebRequest)System.Net.WebRequest.Create("https://fcm.googleapis.com/v1/projects/myproject-id/messages:send");
    httpWebRequest.ContentType = "application/json";
    httpWebRequest.Headers.Add(string.Format("Authorization: Bearer {0}", accessToken));
    httpWebRequest.Method = "POST";

    var body = new object();
    body = new
    {
        token = pushToken,
        notification = new
        {
            title = "Patient Flow",
            body = message,
            sound = soundFileName
        },
        data = new
        {
            type = notificationType
        }
    };

    string jsonPayload = String.Empty;
    var serializer = new JavaScriptSerializer();
    using (var streamWriter = new StreamWriter(httpWebRequest.GetRequestStream()))
    {
        string json = jsonPayload = serializer.Serialize(body);
        streamWriter.Write(json);
        streamWriter.Flush();
    }

    var httpResponse = (HttpWebResponse)httpWebRequest.GetResponse();
    using (var streamReader = new StreamReader(httpResponse.GetResponseStream()))
    {
        string httpResult = streamReader.ReadToEnd();
        if (httpResponse.StatusCode != HttpStatusCode.OK)
        {
            Logger.Write(TraceTyp.DEBUG, String.Empty, "PushNotificationProcessor.SendNotification",
                string.Format("Push notification for {0} was not successfully delivered.", httpResponse), String.Empty);
        }
    }
    return true;
}
return false;
```

When I debug this, I see a 400 Bad request error at:

```
var httpResponse = (HttpWebResponse)httpWebRequest.GetResponse();
```

I have also tried the solution suggested in https://stackoverflow.com/questions/76520279/firebase-cloud-messaging-device-group-management-via-http-v1-auth, but that hasn't helped. Please suggest what I may be missing here. Where am I possibly going wrong?
I have a pandas dataframe with this input:

```
data = {
    'document_section_id': ['1', '', '1.2', '1.3', '1.3.1', '1.3.2', '2', '2.1', '2.2', '2.3',
                            '', '2.3.2', '2.3.3', '3', '4', '4.1', '4.1.1', '4.2', '4.3', '4.4',
                            '5', '5.1', '5.2', '5.3', '5.3.1', '5.3.2', '5.3.3', '5.3.4', '5.3.5',
                            '5.4', '5.5', '6', '6.1', '6.1.1', '6.2', '6.3', '6.4', '6.5', '6.6',
                            '6.6.1', '6.6.2', '6.6.3', '6.7', '6.8', '6.9', '6.9.1', '', '', '6.9.2'],
    'paragraph_type': ['Heading1', 'Heading2', 'Heading2', 'Heading2', 'Heading3', 'Heading3',
                       'Heading1', 'Heading2', 'Heading2', 'Heading2', 'Heading3', 'Heading3',
                       'Heading3', 'Heading1', 'Heading1', 'Heading2', 'Heading3', 'Heading2',
                       'Heading2', 'Heading2', 'Heading1', 'Heading2', 'Heading2', 'Heading2',
                       'Heading3', 'Heading3', 'Heading3', 'Heading3', 'Heading3', 'Heading2',
                       'Heading2', 'Heading1', 'Heading2', 'Heading3', 'Heading2', 'Heading2',
                       'Heading2', 'Heading2', 'Heading2', 'Heading3', 'Heading3', 'Heading3',
                       'Heading2', 'Heading2', 'Heading2', 'Heading3', 'Heading4', 'Heading4',
                       'Heading3']
}
df = pd.DataFrame(data)
```

The problem is to populate the blank `document_section_id` values with accurate section ID values, using the preceding ones as references.

Conditions:

1. The number of digits is determined by the "paragraph_type" column. For example, for "Heading3," there should be 4 digits and 3 dots, like so: 1.2.3.1.
2. For each empty value, it should reference the preceding available "paragraph_type" and increment by 1 accordingly.

Example 1: Given the input, the section ID for the 12th row can be derived from the previous one, resulting in the computed value of 2.3.1.
Example 2: For the 48th and 49th rows, the section ID needs to be derived as 6.9.1.1 and 6.9.1.2, respectively.

There can be max 10 levels of subsection, so that should be taken care of irrespective of the number of subsections.
Output:

```
document_section_id = ['1', '1.1', '1.2', '1.3', '1.3.1', '1.3.2', '2', '2.1', '2.2', '2.3',
                       '2.3.1', '2.3.2', '2.3.3', '3', '4', '4.1', '4.1.1', '4.2', '4.3', '4.4',
                       '5', '5.1', '5.2', '5.3', '5.3.1', '5.3.2', '5.3.3', '5.3.4', '5.3.5',
                       '5.4', '5.5', '6', '6.1', '6.1.1', '6.2', '6.3', '6.4', '6.5', '6.6',
                       '6.6.1', '6.6.2', '6.6.3', '6.7', '6.8', '6.9', '6.9.1', '6.9.1.1',
                       '6.9.1.2', '6.9.2']

paragraph_type = ['Heading1', 'Heading2', 'Heading2', 'Heading2', 'Heading3', 'Heading3',
                  'Heading1', 'Heading2', 'Heading2', 'Heading2', 'Heading3', 'Heading3',
                  'Heading3', 'Heading1', 'Heading1', 'Heading2', 'Heading3', 'Heading2',
                  'Heading2', 'Heading2', 'Heading1', 'Heading2', 'Heading2', 'Heading2',
                  'Heading3', 'Heading3', 'Heading3', 'Heading3', 'Heading3', 'Heading2',
                  'Heading2', 'Heading1', 'Heading2', 'Heading3', 'Heading2', 'Heading2',
                  'Heading2', 'Heading2', 'Heading2', 'Heading3', 'Heading3', 'Heading3',
                  'Heading3', 'Heading2', 'Heading2', 'Heading2', 'Heading3', 'Heading4',
                  'Heading4', 'Heading3']
```

This is what I tried, but it's not giving accurate output:

```
current_section_id = ""
current_level = 0

for index, row in df.iterrows():
    if row['document_section_id'] == '':
        current_level += 1
        section_id = current_section_id.split('.')
        section_id[current_level - 1] = str(int(section_id[current_level - 1]) + 1)
        section_id = '.'.join(section_id[:current_level])
        current_section_id = section_id
        df.at[index, 'document_section_id'] = section_id
    else:
        current_section_id = row['document_section_id']
        current_level = row['paragraph_type'].count('Heading') - 1
```
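For reference, the increment rules described in the conditions above can be sketched as a plain-Python helper (`fill_section_ids` is a hypothetical name, not a pandas API); the result could then be assigned back with `df['document_section_id'] = fill_section_ids(df['document_section_id'], df['paragraph_type'])`:

```python
def fill_section_ids(ids, types):
    """Fill blank section IDs from the preceding ID and the heading level."""
    filled = []
    current = []  # current section ID as a list of ints, e.g. [6, 9, 1]
    for sid, ptype in zip(ids, types):
        level = int(ptype.replace("Heading", ""))  # "Heading3" -> 3
        if sid:
            # Explicit ID: adopt it as the new reference.
            current = [int(part) for part in sid.split(".")]
        elif level > len(current):
            # First blank at a deeper level starts at 1 (e.g. 6.9.1 -> 6.9.1.1).
            current = current + [1] * (level - len(current))
        else:
            # Same or shallower level: truncate, then increment the last
            # component (e.g. 2.3.1 -> 2.3.2).
            current = current[:level]
            current[-1] += 1
        filled.append(".".join(str(n) for n in current))
    return filled
```

Because `current` is just a list of ints, this works for any nesting depth, including the stated maximum of 10 levels.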
I have created the following Photoshop script designed to convert selected layers into smart objects:

```
// Function to convert selected layers to smart objects
function convertToSmartObject() {
    var doc = app.activeDocument;
    var selectedLayers = getSelectedLayers(doc);

    // Iterate through selected layers
    for (var i = 0; i < selectedLayers.length; i++) {
        var layer = selectedLayers[i];
        // Convert the layer to a smart object
        convertLayerToSmartObject(doc, layer);
    }
}

// Function to get selected layers
function getSelectedLayers(doc) {
    var selectedLayers = [];
    var layers = doc.layers;
    for (var i = 0; i < layers.length; i++) {
        var layer = layers[i];
        if (layer.selected) {
            selectedLayers.push(layer);
        }
    }
    return selectedLayers;
}

// Function to convert a layer to a smart object
function convertLayerToSmartObject(doc, layer) {
    doc.activeLayer = layer;
    createSmartObject();
}

// Function to create a smart object
function createSmartObject() {
    var idnewPlacedLayer = stringIDToTypeID('newPlacedLayer');
    executeAction(idnewPlacedLayer, undefined, DialogModes.NO);
}

// Call the function to convert selected layers to smart objects
convertToSmartObject();
```

Why isn't this script functioning properly? What might be the issue with my script?
Convert selected layers into smart objects via script in Photoshop
|javascript|jsx|photoshop|
Update: perhaps it's a bug in the package, but out of curiosity I swapped the REST API client implementation out to use asyncio and aiohttp instead of http3, retried the request, and it works perfectly. So if anyone encounters the same thing: try that instead.
As said in other answers, a `.webmanifest` file is recognized as a simple JSON file by editors. But using the `.webmanifest` extension allows you to set up a specific process in your development tools (linting, deployment). For example, you can configure the VSCode editor to check your `.webmanifest` file against the [Web App Manifest JSON schema][1] and provide autocompletion.

[1]: https://json.schemastore.org/web-manifest-combined.json
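For example, something along these lines in VSCode's `settings.json` (a sketch; the schema URL is the one linked above) maps `.webmanifest` files to JSON and validates them against the schema:

```json
{
  "files.associations": {
    "*.webmanifest": "json"
  },
  "json.schemas": [
    {
      "fileMatch": ["*.webmanifest"],
      "url": "https://json.schemastore.org/web-manifest-combined.json"
    }
  ]
}
```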
Is it possible to combine two plots and make it one plot? These are the plots I am trying to combine. I have tried the "combined_plot" command, but I get the same result - two separate plots.

[![enter image description here](https://i.stack.imgur.com/rNHuA.png)][1]

Here are my codes:

Plot 1:

```
plot1 <- ggplot() +
  geom_bar(data = complete_august_data, aes(x = DATE, y = Count, fill = MANUAL.ID.),
           stat = "identity", width = 0.7, position = "dodge") +
  geom_bar(data = complete_july_data, aes(x = DATE, y = Count, fill = MANUAL.ID.),
           stat = "identity", width = 0.7, position = "dodge") +
  labs(title = "Kumulative kurver for juli og august ved Borupgård 2018",
       x = "Dato", y = "Kumulative observationer", fill = "Art") +
  scale_x_date(date_labels = "%Y-%m-%d") +
  theme_minimal()
```

Plot 2:

```
plot2 <- ggplot(result_df, aes(x = Dato, y = Belysning)) +
  geom_line(color = "gold2") +          # Add a line
  geom_point(color = "goldenrod", size = 2) +  # Add points
  labs(x = "Dato", y = "Belysning", title = "Graf over belysning 25 juli - 29 august 2018") +
  scale_x_date(limits = as.Date(c("2018-07-26", "2018-08-29")),
               date_breaks = "5 day", date_labels = "%Y-%m-%d") +
  theme(axis.text.x = element_text(angle = 45, hjust = 1)) +
  theme_bw()
```

[1]: https://i.stack.imgur.com/9D7ma.png
Problems running both JUnit tests and Selenium/Cucumber tests at the same time
|java|spring|selenium-webdriver|junit|cucumber|
I only have access to `libusbdi` documentation for QNX 6.5 and prior. At that time, QNX libusbdi did not supply synchronous messaging functions, only an asynchronous completion callback interface via `usbd_io`. You will need to write your own shim function. `usb_bulk_msg` itself is actually implemented using an underlying asynchronous USB API very similar to libusbdi; you can use the kernel source code there as a guide on how to implement it. On QNX, `struct completion` could be implemented with a condvar. Note that you'll also need to track outstanding requests so as to manage cleanup in the device removal callback.
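A sketch of such a completion primitive on top of POSIX condvars (illustrative only — this is not the QNX or Linux kernel API) might look like this:

```c
#include <pthread.h>

/* Minimal stand-in for the kernel's "struct completion", built from a
   mutex + condition variable as suggested above. */
struct completion {
    pthread_mutex_t lock;
    pthread_cond_t  cond;
    int             done;
};

static void completion_init(struct completion *c) {
    pthread_mutex_init(&c->lock, NULL);
    pthread_cond_init(&c->cond, NULL);
    c->done = 0;
}

/* Called from the asynchronous I/O completion callback (usbd_io). */
static void complete(struct completion *c) {
    pthread_mutex_lock(&c->lock);
    c->done = 1;
    pthread_cond_signal(&c->cond);
    pthread_mutex_unlock(&c->lock);
}

/* Called by the synchronous shim to block until the callback fires. */
static void wait_for_completion(struct completion *c) {
    pthread_mutex_lock(&c->lock);
    while (!c->done)            /* loop guards against spurious wakeups */
        pthread_cond_wait(&c->cond, &c->lock);
    pthread_mutex_unlock(&c->lock);
}

/* Demo: a worker thread plays the role of the USB completion callback. */
static void *fake_callback(void *arg) {
    complete((struct completion *)arg);
    return NULL;
}

int run_demo(void) {
    struct completion c;
    pthread_t t;

    completion_init(&c);
    pthread_create(&t, NULL, fake_callback, &c);
    wait_for_completion(&c);    /* blocks until the "callback" runs */
    pthread_join(t, NULL);
    return c.done;              /* 1 once completed */
}
```

A synchronous `usb_bulk_msg`-style shim would initialize a completion, submit the transfer with the completion as the callback argument, then call `wait_for_completion()` before returning the transfer status. Compile with `-lpthread`.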
I tried in this way:

```
library(ggplot2)
library(reshape2)

# Melt the data
melted_data <- melt(data, id.vars = c("gene", "Class"))
colnames(melted_data) <- c("gene", "Class", "Sample", "Presence")

# Define a color palette for the classes
class_colors <- c("Aminoglycoside" = "red", "Beta-lactam" = "blue",
                  "Macrolide" = "green", "Tetracycline" = "purple")

# Create the ggplot object for the heatmap
heatmap <- ggplot(melted_data, aes(x = Sample, y = gene, fill = Presence)) +
  geom_tile(color = "gray50") +
  scale_fill_gradient(low = "white", high = "black") +
  theme_minimal() +
  theme(axis.text.x = element_text(angle = 45, hjust = 1, vjust = 1),  # Rotate x-axis labels
        panel.grid.major = element_blank(),
        panel.grid.minor = element_blank()) +
  labs(x = NULL, y = NULL) +
  coord_flip()

# Create the ggplot object for the annotation bar without facetting
annotation_bar <- ggplot(melted_data, aes(x = Sample, y = gene, fill = Class)) +
  geom_tile() +
  scale_fill_manual(values = c("Aminoglycoside" = "blue", "Beta-lactam" = "red",
                               "Macrolide" = "green", "Tetracycline" = "orange")) +  # Specify colors for each class
  theme_void() +  # Remove unnecessary elements
  coord_flip()

# Combine both plots using patchwork
library(patchwork)
heatmap_with_annotation <- (annotation_bar / heatmap) +
  plot_layout(heights = c(0.1, 1))

# Display the combined plot
heatmap_with_annotation
```

[![enter image description here][1]][1]

[1]: https://i.stack.imgur.com/uFk4M.png
Here is a complete [MVCE](https://stackoverflow.com/help/minimal-reproducible-example):

```
package com.example.countcharacters;

/**
 * EXAMPLE OUTPUT:
 * Enter some words here:
 * How now brown cow
 * Enter a character here:
 * abc
 * Please enter one character. Try again.
 * Enter a character here:
 * o
 * There are 4 occurrence(s) of o in the text How now brown cow.
 */
import java.util.Scanner;

public class CountCharacters {

    public static void main(String[] args) {
        Scanner scan = new Scanner(System.in);

        // Captures word(s)
        String inputEntry;
        System.out.println("Enter some words here: ");
        inputEntry = scan.nextLine();

        // Captures char
        char inputCharacter;
        while (true) {
            System.out.println("Enter a character here: ");
            String line = scan.nextLine();
            if (line.length() == 1) {
                inputCharacter = line.charAt(0);
                break;
            } else {
                // if user is not in compliance
                System.out.println("Please enter one character. Try again.");
            }
        }

        // iterates through word(s)
        int charCountDisplay = 0;
        int i = 0;
        while (i < inputEntry.length()) {
            char c = inputEntry.charAt(i++);
            if (c == inputCharacter) {
                ++charCountDisplay;
            }
        }

        // Print results
        System.out.print("There are " + charCountDisplay + " occurrence(s) of " +
                inputCharacter + " in the text " + inputEntry + ".");
    }
}
```

NOTES:

* You can use `char` and `String.charAt()` to simplify your code.
* In general, it's preferable to declare variables close to where you use them (rather than at the top).
* You can put your test for "one character only" in its own loop.
How to combine two plots?
|r|ggplot2|
```
from bs4 import BeautifulSoup as bs
import requests as rq

# Replace <your url> with the url you want to scrape
url = '<your url>'

r = rq.get(url)  # note: requests was imported as rq
soup = bs(r.content, "html.parser")
links = soup.find_all("a")

# Create an empty dict
dct = {}
for x in links:
    # Keys of the dict are the clickable texts, values are the links
    key = x.string
    val = x.get("href")
    dct[key] = val

print(dct)
```

The output will be a dictionary in which the keys are the clickable texts and the values are the links these texts lead to if clicked.
I'm evaluating some Plan Cache behaviors and this is my scenario. I'm running the following two queries separately:

```
SELECT TOP 10 * FROM dbo.Countries CT LEFT JOIN dbo.Continents CN ON CT.ContinentId=CN.ContinentId WHERE CountryId='AR'
```

```
SELECT TOP 10 * FROM dbo.Countries CT LEFT JOIN dbo.Continents CN ON CT.ContinentId=CN.ContinentId WHERE CountryId='BR'
```

After running both queries, I'm getting this plan cache view:

![Plan Cache View](https://i.stack.imgur.com/yR5BR.png)

My understanding is:

- Different sql_handle: expected
- Different plan_handle: **unexpected**
- Same query_hash: expected
- Same query_plan_hash: expected

Question: I really don't get why I'm getting a different plan_handle for each execution, even when the query is basically the same, and the query_hash and query_plan_hash do match. What could be the reason for this?

This is a comparison of both cached plans:

[Comparison of cached plans](https://i.stack.imgur.com/rbaGp.png)

I get the difference in the statement but I don't think that should count. Otherwise, we would always have one plan_handle per sql_handle since it would always change.

Some additional settings already checked:

- Optimization level: FULL
- No plan warnings: (both are Good Enough Plans found)
- SET options match, both queries are executed in the same SSMS Window
- Compat Level: 140
- Optimize for Ad Hoc: false
- Query Optimizer Fixes: Off
- Legacy CE: Off
- No Database-scoped configuration
- Parameterization: Simple
- Resource Governor: Disabled

I checked all potential properties affecting this behavior, with no luck. I would expect both queries to reuse the same plan, hence pointing to the same plan_handle. Is my expectation incorrect?

- Plan A: https://www.brentozar.com/pastetheplan/?id=ByP5gwIkA
- Plan B: https://www.brentozar.com/pastetheplan/?id=Bk2ybw81C

Thanks
Since you're discarding things after 11pm, we can do this: ```r library(dplyr) library(tidyr) # unnest doseq <- function(z) { z <- as.integer(z) if (length(z) > 1) { if (z[2] < z[1]) z[2] <- 24 z <- z[1]:max(z[1], z[2]-1) } z } df1 |> mutate(time = lapply(strsplit(time, "-"), doseq)) |> unnest(time) |> print(n=99) # # A tibble: 24 × 2 # time temperature # <int> <int> # 1 0 0 # 2 1 0 # 3 2 1 # 4 3 1 # 5 4 2 # 6 5 2 # 7 6 2 # 8 7 3 # 9 8 3 # 10 9 3 # 11 10 3 # 12 11 3 # 13 12 3 # 14 13 4 # 15 14 4 # 16 15 4 # 17 16 4 # 18 17 4 # 19 18 4 # 20 19 1 # 21 20 1 # 22 21 1 # 23 22 1 # 24 23 1 ```
Currently you are blocking the thread where the events are coming in, and IO operations can take some time. You could use something like this: when data comes in from an event, you immediately give it to a handler, which keeps track of the data in a queue that is safe to access from many threads. This handler runs a loop in a separate thread which either saves data or waits for more data to come in.

```
import android.util.Log;

import java.util.concurrent.LinkedBlockingQueue;

public class DataHandler implements Runnable {

    private final LinkedBlockingQueue<Data> queue = new LinkedBlockingQueue<>();
    private Thread thread;

    void start() {
        Log.i("handler", "Trying to start");
        thread = new Thread(this);
        thread.start();
    }

    void stop() {
        Log.i("handler", "Trying to stop");
        thread.interrupt();
    }

    void addData(Data data) {
        Log.i("handler", "Adding data");
        queue.add(data);
    }

    @Override
    public void run() {
        Log.i("handler", "Starting");
        while (true) {
            try {
                if (queue.peek() != null) {
                    saveData(queue.take());
                } else {
                    Log.i("handler", "Waiting");
                    Thread.sleep(1000);
                }
            } catch (InterruptedException e) {
                break;
            }
        }
        Log.i("handler", "Exiting");
    }

    private void saveData(Data data) {
        Log.i("handler", "Saving Data");
    }
}
```

Use it like:

```
void fakeData() throws InterruptedException {
    DataHandler handler = new DataHandler();
    handler.start();

    Thread.sleep(3000);
    handler.addData(new Data());
    Thread.sleep(500);
    handler.addData(new Data());
    Thread.sleep(3000);
    handler.addData(new Data());
    Thread.sleep(2000);

    handler.stop();
}
```
For this I would simply:

* Concatenate everything to a single line:

      $OneLine = -Join (Get-Content .\VerticalFile.txt)

* Then split the rows based on the commas followed ([LookAhead](https://www.regular-expressions.info/lookaround.html)) by a date string:

      $Csv = $OneLine -Split ',(?=\d\d ... \d\d\d\d)', [Environment]::NewLine

* Optionally, if you need the fields quoted:

      $Csv | ConvertFrom-Csv | ConvertTo-Csv
I was facing the same issue, and none of the above answers worked for me. I received the following error:

```
docker run -d -p9090:9090 --name prometheus --network my-network -v ./prometheus.yml:/etc/prometheus/prometheus.yml prom/prometheus
docker: Error response from daemon: source /var/lib/docker/overlay2/36ea9303de63182ffed055cca6d7f59880400e03a29e3c1ecf770f80beb51f24/merged/etc/prometheus/prometheus.yml is not directory.
```

The problem was the path syntax on Windows. It started working after I used "\\" separators for the Windows host path and wrapped it in double quotes, keeping "/" for the Unix container path:

```
docker run -d -p9090:9090 --name prometheus --network my-network -v "D:\\projects\\docker\\prometheus.yml":/etc/prometheus/prometheus.yml prom/prometheus
```
I'm looking through this [SNES controller][1] made with HTML and CSS. The background here is taken almost directly from this. I was wondering what the [num] [num] / [num] [num] `no-repeat` following the `linear-gradient()` does. I can't seem to find any documentation on numbers following gradients outside their parentheses.

```
body {
  height: 100%
  width: 100%
  margin: 0px;
}

div {
  height: 100vh;
  width: 100vw;
  background:
    radial-gradient(circle at 17vmin 17vmin, grey 16.95vmin, rgba(0, 0, 0, 0) 17vmin),
    radial-gradient(circle at 63vmin 17vmin, grey 16.95vmin, rgba(0, 0, 0, 0) 17vmin),
    linear-gradient(grey 0 0) 50% 0/46vmin 30vmin no-repeat;
}
```

```
<body>
  <div></div>
</body>
```

It doesn't work here for some reason though. Any thoughts?

[1]: https://codepen.io/alvaromontoro/pen/PoEgRPG
No need to add an extra element; add the background coloration to the outer element:

```
.container {
  width: 300px;
  height: 200px;
  background-color: blue;
  position: relative;
}

.outer {
  position: absolute;
  top: 0px;
  right: 0px;
  cursor: pointer;
  background:
    /* -2px then +2px (4px thickness) */
    linear-gradient(45deg, #0000 calc(50% - 2px), black 0 calc(50% + 2px), #0000 0) padding-box,
    linear-gradient(-45deg, #0000 calc(50% - 2px), black 0 calc(50% + 2px), #0000 0) padding-box,
    yellow;
  width: 10%;
  aspect-ratio: 1;
  border: 10px solid #0000; /* to control the margin */
}
```

```
<div class='container'>
  <div class='outer'>
  </div>
</div>
```

Another syntax where you can adjust `background-size`/`background-position`:

```
.container {
  width: 300px;
  height: 200px;
  background-color: blue;
  position: relative;
}

.outer {
  position: absolute;
  top: 0px;
  right: 0px;
  cursor: pointer;
  background:
    /* -2px then +2px (4px thickness) */
    linear-gradient(45deg, #0000 calc(50% - 2px), black 0 calc(50% + 2px), #0000 0) 50% padding-box no-repeat,
    linear-gradient(-45deg, #0000 calc(50% - 2px), black 0 calc(50% + 2px), #0000 0) 50% padding-box no-repeat,
    yellow;
  background-size: calc(100% - 20px) calc(100% - 20px);
  width: 15%;
  aspect-ratio: 1;
}
```

```
<div class='container'>
  <div class='outer'>
  </div>
</div>
```
I'm new to React Native and I'm creating a simple test app. I followed the Firebase documentation and added the dependencies and the google-services.json file. The problem is that now I would like to read the data from my Realtime Database, and following the documentation I did something like this:

    ...
    import { firebase } from '@react-native-firebase/app';

    const Categories = () => {
      const reference = firebase
        .app()
        .database('https://testapp-app-default-rtdb.euorpe-west1.firebasedatabase.app/')
        .ref('/categories');
    ...

But if I run the command `npx expo start` from the terminal to test the application on my Android phone, the console gives me this error:

    ERROR  Error: You attempted to use a firebase module that's not installed on your Android project by calling firebase.app(). Ensure you have: 1) imported the 'io.invertase.firebase.app.ReactNativeFirebaseAppPackage' module in your 'MainApplication.java' file. 2) Added the 'new ReactNativeFirebaseAppPackage()' line inside of the RN 'getPackages()' method list.

How can I solve this so I can test my app with the database?
I'm developing an app with Expo and facing an issue with push notifications where the custom sound only plays when the app is in the foreground. When the app is backgrounded or closed, the notifications default to the system sound instead of the custom sound I've specified, and they also use a miscellaneous channelId. I can't understand why the behavior is different in the background than in the foreground; the version I'm testing is a production APK build.

I'm using Expo's managed workflow and sending notifications through their backend (expo-server-sdk).

Here's how I configure the notification channel on the frontend:

```
await Notifications.setNotificationChannelAsync("ltsChannel", {
  name: "ltsChannel",
  sound: "sound.wav", // Custom sound file
  importance: Notifications.AndroidImportance.MAX,
  audioAttributes: {
    usage: Notifications.AndroidAudioUsage.ALARM,
    contentType: Notifications.AndroidAudioContentType.SONIFICATION,
  },
});
```

And this is how I send notifications from the backend:

```
const messages = [];
messages.push({
  to: pushToken,
  sound: 'default',
  title: 'Test Title',
  body: 'Test Body',
  channelId: "ltsChannel",
});
```

Any help would be appreciated.
Custom Sound for Expo Push Notifications Only Works in Foreground
|android|react-native|push-notification|expo|expo-notifications|
More of a general question than anything. I am authenticating during login and storing the JWT token response in local storage, and on each route change I run `checkLogin`, which checks that the token is valid by hitting `/token/validate`. I'm doing this asynchronously for unprotected pages to help with performance, and requiring it to finish for protected pages before proceeding. I'm doing the same for grabbing fresh user data in case it has been updated in the meantime.

My question is: do I need to validate on each route change? They've authenticated on login, and each page has many protected REST endpoints that check against the token anyway, so if any of those come back invalid I can force a logout there instead. I'm trying to speed things up, as each route change is quite slow because it performs these lookups before fetching any endpoint data for the page.

Also, for any endpoints that are quite data heavy, is it worth prefetching those after login so that when you visit the route it already has that data?

    async handleRouteChange(newRoute, oldRoute) {
      if (newRoute.path === '/edit-profile') {
        await this.checkLogin();
      } else {
        this.checkLogin();
      }
      if (this.$store.state.user_data) {
        this.updateUserData();
      } else {
        await this.updateUserData();
      }
    },
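One way to cut the per-route round trips is to validate locally by reading the JWT's `exp` claim and only hitting `/token/validate` when the token is actually close to expiring. A rough sketch (the helper names and the 60-second margin are made up, and base64url edge cases are ignored):

```javascript
// Hypothetical helper: decode the JWT payload locally and only hit
// /token/validate when the token is close to expiry.
function decodeBase64(s) {
  return typeof atob === "function"
    ? atob(s)                                    // browser (and modern Node)
    : Buffer.from(s, "base64").toString("utf8"); // older Node
}

function tokenExpiresSoon(token, marginSeconds = 60) {
  // JWT format: header.payload.signature; the payload is base64-encoded JSON
  const payload = JSON.parse(decodeBase64(token.split(".")[1]));
  return payload.exp - Date.now() / 1000 < marginSeconds;
}

// In handleRouteChange, the idea would be to only await checkLogin() when
// tokenExpiresSoon(token) is true, and otherwise trust the local expiry and
// let a 401 from any data endpoint force the logout instead.
```

This never replaces server-side checks on the protected endpoints; it only avoids a blocking validation call on every navigation.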
Authenticating vue app on each route change
|javascript|vue.js|api|rest|
|powershell|inno-setup|synchronous|start-process|
```
p1 = new Promise((res, rej) => {
  console.log("p1 setTimeout");
  setTimeout(() => {
    res(17);
  }, 10000);
});

p2 = new Promise((res, rej) => {
  console.log("p2 setTimeout");
  setTimeout(() => {
    res(36);
  }, 2000);
});

function checkIt() {
  console.log("Started");
  let val1 = this.p1;
  console.log("P1");
  val1.then((data) => { console.log("P1 => " + data) });
  let val2 = this.p2;
  console.log("P2");
  val2.then((data) => { console.log("P2 => " + data) });
  console.log("End");
}

checkIt();
```

My understanding of the above JavaScript code was:

1: In the callback queue, p2's setTimeout callback will be first and then p1's (and they will execute in FIFO order)

2: The callback queue won't execute before the microtask queue

3: In the microtask queue, p1's callback function will be first and then p2's (and they will execute in FIFO order)

4: Hence this should be a deadlock.

But instead I am getting this output:

1: p1 setTimeout

2: p2 setTimeout

3: Started

4: P1

5: P2

6: End

7: (After 2 seconds) P2 => 36

8: (After 10 seconds) P1 => 17

**Doubt:** How are lines 7 and 8 executing? I have run the code and get the output shown above.
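The synchronous part of the behavior can be checked deterministically with a stripped-down sketch (hypothetical values, not the original code): a `.then` callback never runs during the current synchronous pass, even on an already-resolved promise; it is queued on the microtask queue and only runs after the call stack empties.

```javascript
// Minimal sketch: .then callbacks are deferred to the microtask queue
// even for a promise that is already resolved.
const log = [];

log.push("Started");

const p = Promise.resolve(17);
p.then((data) => log.push("P1 => " + data)); // queued, not run yet

log.push("End");

// At this point only the synchronous lines have run; "P1 => 17" is
// appended later, once the current script finishes and the microtask
// queue is drained.
```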
How to connect Firebase to a React Native app?
|reactjs|firebase|react-native|
Here's a good discussion on CDECL vs. STDCALL: https://stackoverflow.com/questions/3404372/stdcall-and-cdecl

Regardless of calling convention, the callee will typically save the current stack pointer to the frame pointer (EBP) so it can push and pop local variables to/from the stack at will. When it's ready to return, the callee *MUST* restore the stack pointer (ESP) in order for the "return" to succeed.

Additional information
---

There are two issues: 1) calling the subroutine (this part is "stdcall" vs. "cdecl" (among others - those aren't the only two options)), and 2) returning from the subroutine.

The main difference between CDECL and STDCALL lies in who is responsible for popping the *arguments* off the stack upon "return". The callee *ALWAYS* restores the stack pointer past its own locals; that's the only way "return" can work correctly. For STDCALL, the callee *ALSO* removes the arguments the caller pushed; for CDECL, the caller does that after the call returns.

Roughly speaking:

* STDCALL might use slightly less space, because the "cleanup code" lives in only one place: in the callee. For CDECL calls, the cleanup must be repeated everywhere the subroutine is called. Your example sub_12 is "STDCALL".
* CDECL is more flexible: it allows you to pass a variable number of parameters into the subroutine. Your example sub_48 is "CDECL".
You can just execute commands in bash/other and change directory yourself! That's what worked best for me. For example:

```yaml
- repo: local
  hooks:
    - id: go-unit-tests
      name: run go tests (go test)
      language: system
      entry: bash -c 'cd subdir && exec go test ./...'
      pass_filenames: false
      types: [go]
      files: ^subdir/
```

or if you're running Python tests:

```yaml
- repo: local
  hooks:
    - id: pytest
      name: pytest
      entry: sh -c 'cd text2sql-backend && PYTHONPATH=. poetry run pytest .'
      language: system
      types: [python]
      pass_filenames: false
      stages: [commit]
      exclude: ".*pb2(_grpc)?.py(i)?$|__init__.py"
```
Mac M1, Docker version 25.0.3

I enjoy a good keyboard shortcut. The most useful of my shortcuts so far has actually turned out to be simply opening specific applications with one press: "hyperkey + V" opens VS Code, "hyperkey + T" opens Terminal, etc. I have found this approach to be much more useful than using "cmd + tab" to cycle through the running applications, because I can choose the application I want with one press regardless of where it is in the list of running applications. I achieved this using Automator and Keyboard > Shortcuts > Services.

The only application that is giving me trouble is Docker. Ironically, I much prefer Docker Desktop over the CLI for viewing logs, exec shells, etc.

Here is my Automator application for opening Docker:

```
launch application "Docker.app"
```

For all other applications, like Google Chrome for example:

- if the application IS NOT running, it runs the application and opens a new window.
- if the application IS running and has windows open, it activates those windows and brings them to the front.
- if the application IS running and has no window open, it opens a new window and brings it to the front.

Great! This works as expected in all scenarios.

With Docker:

- if the application is NOT running, it runs Docker (my Docker preferences are set to "open dashboard when Docker starts") and a new Docker Dashboard window opens up.
- if Docker IS running, which is the vast majority of the time when I am building something, I press the keyboard shortcut and nothing happens. Docker is not activated, it is not brought to the front, and no dashboard is opened. So I must grab my mouse, scroll to the icon in the menu bar, and then click the "open dashboard" menu item. You would think it would annoy me less considering I'm about to use the GUI anyway, lol, but I really hate the two extra clicks.
- I have tried "open -a Docker" in the terminal,
- I have tried opening "Docker.app" in Spotlight,
- I have tried an AppleScript to click menu items,
- I have tried an AppleScript to activate.

None of them will open a new dashboard. Let me know if you can think of any other ways I might be able to achieve this.
I have a dataframe which looks like below:

[![enter image description here][1]][1]

As you can see, in the "Schema" column there are values like

    PCP / LCP string, Data Load Supplier string, Cost Type string, Latest Cost CPN,SU,Loc,Med double

This column actually holds the schema of a particular dataset. What I want here is to change all the column values: if there is any space in between, or any special character, it should be replaced by "_". So in the above example the values will become

    PCP_LCP string, Data_Load_Supplier string, Cost_Type string, Latest_Cost_CPN_SU_Loc_Med double

The code I have written is:

    from pyspark.sql.functions import col, udf
    import re

    def process_part(part):
        # Replace slashes with underscores
        processed_part = re.sub(r'/', '_', part)
        # Replace spaces with underscores only if they are between words
        processed_part = re.sub(r'(\b\w+)\s+(/|,)\s+(\w+\b)', r'\1\2\3', processed_part)
        # Replace spaces with underscores between words
        processed_part = re.sub(r'(\b\w+)\s+(\w+\b)', r'\1_\2', processed_part)
        return processed_part

    process_part_udf = udf(process_part)

    processed_schema_df = schema_df.withColumn('Schema', process_part_udf(col('Schema')))
    display(processed_schema_df)

With this, the output came out as

[![enter image description here][2]][2]

Can someone rectify the mistake here?

[1]: https://i.stack.imgur.com/SSnPd.png
[2]: https://i.stack.imgur.com/BpOVQ.png
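For what it's worth, the renaming rule can be sketched in plain Python first and only then wrapped in a `udf`. This sketch assumes each field ends with a space-separated type token, and that fields are separated by comma-plus-space (commas inside names like `CPN,SU,Loc,Med` have no trailing space):

```python
import re

def clean_field(field):
    # "Latest Cost CPN,SU,Loc,Med double" -> name part + " " + type token
    name, _, dtype = field.rpartition(" ")
    # collapse any run of spaces/slashes/commas (non-word chars) into one "_"
    cleaned = re.sub(r"\W+", "_", name).strip("_")
    return f"{cleaned} {dtype}"

def clean_schema(schema):
    # assumption: ", " (comma plus space) separates fields;
    # bare commas only appear inside a column name
    return ", ".join(clean_field(f) for f in schema.split(", "))
```

The `clean_schema` function could then be registered as the UDF in place of `process_part` above.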
Transformation of a DataFrame column in PySpark
|pyspark|databricks|azure-databricks|
> [![enter image description here](https://i.stack.imgur.com/kxnaD.png)](https://i.stack.imgur.com/kxnaD.png)

Mac mini M1, 2020
Xcode 14.0
react-native 0.73.2

Can you help me, please? I have tried many ways but couldn't solve it. At one point I tried deleting the two lines with errors and it was able to build, but when I came back everything was the same as before.
Error: Expected a type in FBSDKConversionValueUpdating file while installing react-native-fbsdk-next
|react-native|fbsdk|react-native-fbsdk|fbsdksharekit|react-native-fbsdk-next|
I am getting null in the **ImageView** object while inflating **activity_dialog**. The context is an activity in **ReadActivityMVCView**, based on the MVC pattern. Any idea how to fix this?

    java.lang.NullPointerException: Attempt to invoke virtual method 'void android.widget.ImageView.setImageBitmap(android.graphics.Bitmap)' on a null object reference

<!-- -->

    public ReadActivityMVCView(LayoutInflater inflator, ViewGroup parent, Activity readActivity) {
        activity = readActivity;
        rootView = inflator.inflate(R.layout.activity_read_main, parent, false);
        setRootView(rootView);
    }

    private void loadPhoto(Bitmap bitmap) {
        LayoutInflater inflater = this.activity.getLayoutInflater();
        this.activity.getWindow().addContentView(inflater.inflate(R.layout.activity_dialog, null),
                new ViewGroup.LayoutParams(
                        ViewGroup.LayoutParams.MATCH_PARENT,
                        ViewGroup.LayoutParams.MATCH_PARENT));

        ImageView image = rootView.findViewById(R.id.fullimage);
        if (bitmap != null) {
            image.setImageBitmap(bitmap);
        }
    }
Callback and microtask queues in JavaScript
|javascript|asynchronous|event-loop|asynccallback|
|c|linux|docker|ubuntu|clang|
Here is one possible option using [`cKDTree.query`][1] from [tag:scipy]:

```py
from scipy.spatial import cKDTree

def knearest(gdf, **kwargs):
    notna = gdf["PPM_P"].notnull()
    coordinates = gdf.get_coordinates().to_numpy()
    dist, idx = (
        cKDTree(coordinates[notna]).query(
            coordinates[~notna], **kwargs)
    )
    _ser = pd.Series(
        gdf.loc[notna, "PPM_P"].to_numpy()[idx].tolist(),
        index=(~notna)[lambda s: s].index,
    )
    gdf.loc[~notna, "PPM_P"] = _ser[~notna].map(np.mean)
    return gdf

N = 2  # feel free to make it 5, or whatever..
out = knearest(gdf.to_crs(3662), k=range(1, N + 1))  # .to_crs(4326)
```

Output (*with `N=2`*):

[![enter image description here][2]][2]

***NB**: Each red point (an `FID` having no `PPM_P`), is associated with the `N` nearest green points*.

Final GeoDataFrame (*with intermediates*):

```py
    FID  PPM_P (OP)  PPM_P (INTER)      PPM_P                       geometry
0     0   34.919571            NaN  34.919571  POINT (842390.581 539861.877)
1     1         NaN      37.480218  37.480218  POINT (842399.476 539861.532)
2     2         NaN      35.567003  35.567003  POINT (842408.370 539861.187)
3     3         NaN      35.567003  35.567003  POINT (842420.229 539860.726)
4     4   36.214436            NaN  36.214436  POINT (842429.124 539860.381)
5     5         NaN      38.127651  38.127651  POINT (842438.018 539860.036)
6     6         NaN      40.431946  40.431946  POINT (842446.913 539859.691)
7     7   40.823028            NaN  40.823028  POINT (842458.913 539862.868)
8     8         NaN      37.871299  37.871299  POINT (842378.298 539851.425)
9     9   40.823028            NaN  40.823028  POINT (842390.158 539850.965)
10   10   40.040865            NaN  40.040865  POINT (842399.052 539850.620)
11   11   36.214436            NaN  36.214436  POINT (842407.947 539850.275)
12   12   34.919571            NaN  34.919571  POINT (842419.947 539853.452)
13   13         NaN      38.127651  38.127651  POINT (842428.841 539853.107)
14   14   40.040865            NaN  40.040865  POINT (842437.736 539852.761)
15   15         NaN      40.431946  40.431946  POINT (842449.595 539852.301)
16   16         NaN      40.431946  40.431946  POINT (842458.489 539851.956)
17   17         NaN      40.431946  40.431946  POINT (842467.384 539851.611)
18   18         NaN      40.431946  40.431946  POINT (842476.278 539851.266)
19   19         NaN      37.871299  37.871299  POINT (842368.981 539840.859)
```

[1]: https://docs.scipy.org/doc/scipy/reference/generated/scipy.spatial.cKDTree.query.html
[2]: https://i.stack.imgur.com/KefCq.png
Given an array `Array<VkDeviceQueueCreateInfo>` I want to convert it to a C Array of C values (or pointer to the first item), i. e. `VkDeviceQueueCreateInfo*`. This is where I'm getting a type mismatch (obviously): [![type mismatch][1]][1] How do I convert this `Array<VkDeviceQueueCreateInfo>` to `CPointer<VkDeviceQueueCreateInfo>`? I don't want to create an array of pointers. [1]: https://i.stack.imgur.com/XSi0A.png
I know `serialVersionUID` is used for serialization and deserialization. I read https://stackoverflow.com/questions/285793/what-is-a-serialversionuid-and-why-should-i-use-it

So if we are using a distributed cache like Hazelcast, Coherence, or Redis, we need `serialVersionUID`; for example, entity or VO classes need it. My question goes a little bit further: do we need to define it for an app-specific **exception** in a Spring Boot application (no EJB, no WebLogic, no WebSphere, just regular Tomcat)? Example:

```Java
public class MyBusinessException extends Exception {

    private static final long serialVersionUID = 7718828512143293558L;

    public MyBusinessException() {
        super();
    }
}
```

I have never seen an app-specific exception deserialized in prod or anywhere. So do we really need `serialVersionUID` here?
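For background, the reason the question arises at all is that `Throwable` implements `Serializable`, so every custom exception inherits serializability whether or not the field is declared. That can be checked directly (a tiny sketch, illustrative only):

```java
class SerializableExceptionCheck {
    public static void main(String[] args) {
        // Throwable implements Serializable, so every Exception subclass
        // inherits it; serialVersionUID only matters if an instance is
        // actually serialized (session replication, remoting, caching, ...).
        boolean serializable =
                java.io.Serializable.class.isAssignableFrom(Exception.class);
        System.out.println("Exception is Serializable: " + serializable);
    }
}
```

If no exception instance ever crosses a serialization boundary, the declared `serialVersionUID` is inert; it only suppresses the compiler/IDE warning.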
|java|spring-boot|
I created two services in .NET Web API. One service is for Identity and generates a JWT token. The second service gets the JWT token from Identity. These two services are connected using HTTP.

This is the code from the second service, which gets the JWT token from the Identity service:

```
builder.Services.AddAuthentication(JwtBearerDefaults.AuthenticationScheme)
    .AddJwtBearer(options =>
    {
        options.Authority = builder.Configuration["IdentityServiceUrl"];
        options.RequireHttpsMetadata = false;
        options.SaveToken = true;
        options.TokenValidationParameters.ValidateAudience = false;
        options.TokenValidationParameters.NameClaimType = "username";
        options.Configuration = new OpenIdConnectConfiguration();
    });

var app = builder.Build();

// Configure the HTTP request pipeline.
if (app.Environment.IsDevelopment())
{
    app.UseSwagger();
    app.UseSwaggerUI();
}

app.UseAuthentication();
app.UseAuthorization();
```

```
[Authorize]
[HttpPut("create-age-category")]
public async Task<ActionResult> CreateAgeCategory(AgeCategoryDto ageCategory)
{
    try
    {
        await _ageCategoryService.CreateAgeCategory(ageCategory);
        return Ok("success");
    }
    catch (Exception ex)
    {
        return BadRequest(ex);
    }
}
```

When I try to access this protected route I receive 401 Unauthorized, even though I sent the token in the Authorization tab in Postman. I'm new to microservices and don't know where to start to fix this issue. First I changed both services' URLs to HTTP, and I still receive the 401.
401 Unauthorized in .NET Web API microservices
|asp.net|api|microservices|