id int64 5 1.93M | title stringlengths 0 128 | description stringlengths 0 25.5k | collection_id int64 0 28.1k | published_timestamp timestamp[s] | canonical_url stringlengths 14 581 | tag_list stringlengths 0 120 | body_markdown stringlengths 0 716k | user_username stringlengths 2 30 |
|---|---|---|---|---|---|---|---|---|
1,872,146 | Day 3: Mastering Operators and Expressions in JavaScript | Introduction Welcome to Day 3 of your JavaScript journey! Yesterday, we explored variables... | 0 | 2024-05-31T14:35:04 | https://dev.to/dipakahirav/day-3-mastering-operators-and-expressions-in-javascript-3oa0 | javascript, beginners, html, css | #### Introduction
Welcome to Day 3 of your JavaScript journey! Yesterday, we explored variables and data types. Today, we will dive into operators and expressions, which are fundamental for performing calculations and making decisions in your code.
Please subscribe to my [YouTube channel](https://www.youtube.com/@DevDivewithDipak?sub_confirmation=1) to support the channel and get more web development tutorials.
#### Arithmetic Operators
Arithmetic operators allow you to perform mathematical operations on numbers.
**1. Addition (+)**
Adds two numbers.
```javascript
let sum = 5 + 3;
console.log(sum); // Output: 8
```
**2. Subtraction (-)**
Subtracts the second number from the first.
```javascript
let difference = 9 - 4;
console.log(difference); // Output: 5
```
**3. Multiplication (*)**
Multiplies two numbers.
```javascript
let product = 7 * 2;
console.log(product); // Output: 14
```
**4. Division (/)**
Divides the first number by the second.
```javascript
let quotient = 10 / 2;
console.log(quotient); // Output: 5
```
**5. Modulus (%)**
Returns the remainder of the division.
```javascript
let remainder = 10 % 3;
console.log(remainder); // Output: 1
```
**6. Increment (++)**
Increases a number by one.
```javascript
let count = 5;
count++;
console.log(count); // Output: 6
```
**7. Decrement (--)**
Decreases a number by one.
```javascript
let count = 5;
count--;
console.log(count); // Output: 4
```
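One subtlety worth knowing: both `++` and `--` come in prefix and postfix forms, and they differ in what the expression itself evaluates to. A quick illustration:

```javascript
let x = 5;
let postfix = x++; // postfix: returns the old value, then increments
console.log(postfix, x); // Output: 5 6

let y = 5;
let prefix = ++y; // prefix: increments first, then returns the new value
console.log(prefix, y); // Output: 6 6
```

The difference only matters when you use the result of the expression; as standalone statements, `x++` and `++x` behave the same.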
#### Comparison Operators
Comparison operators compare two values and return a boolean (true or false).
**1. Equal (==)**
Checks if two values are equal (type conversion may occur).
```javascript
let isEqual = (5 == '5');
console.log(isEqual); // Output: true
```
**2. Strict Equal (===)**
Checks if two values are equal and of the same type.
```javascript
let isStrictEqual = (5 === '5');
console.log(isStrictEqual); // Output: false
```
**3. Not Equal (!=)**
Checks if two values are not equal (type conversion may occur).
```javascript
let isNotEqual = (5 != '5');
console.log(isNotEqual); // Output: false
```
**4. Strict Not Equal (!==)**
Checks if two values are not equal or not of the same type.
```javascript
let isStrictNotEqual = (5 !== '5');
console.log(isStrictNotEqual); // Output: true
```
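Because `==` and `!=` apply type coercion, some comparisons behave in ways that surprise newcomers, which is one reason many style guides recommend defaulting to `===` and `!==`. A few edge cases worth knowing:

```javascript
console.log(0 == '');            // Output: true ('' coerces to the number 0)
console.log(null == undefined);  // Output: true (a special case in the spec)
console.log(null === undefined); // Output: false (different types)
console.log(NaN === NaN);        // Output: false (NaN never equals anything, even itself)
```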
**5. Greater Than (>)**
Checks if the left value is greater than the right value.
```javascript
let isGreaterThan = (6 > 3);
console.log(isGreaterThan); // Output: true
```
**6. Less Than (<)**
Checks if the left value is less than the right value.
```javascript
let isLessThan = (6 < 3);
console.log(isLessThan); // Output: false
```
**7. Greater Than or Equal (>=)**
Checks if the left value is greater than or equal to the right value.
```javascript
let isGreaterThanOrEqual = (6 >= 6);
console.log(isGreaterThanOrEqual); // Output: true
```
**8. Less Than or Equal (<=)**
Checks if the left value is less than or equal to the right value.
```javascript
let isLessThanOrEqual = (6 <= 6);
console.log(isLessThanOrEqual); // Output: true
```
#### Logical Operators
Logical operators are used to combine multiple conditions.
**1. Logical AND (&&)**
Returns true if both operands are true.
```javascript
let andResult = (5 > 3 && 8 > 6);
console.log(andResult); // Output: true
```
**2. Logical OR (||)**
Returns true if at least one operand is true.
```javascript
let orResult = (5 > 3 || 8 < 6);
console.log(orResult); // Output: true
```
**3. Logical NOT (!)**
Inverts the truthiness of the operand.
```javascript
let notResult = !(5 > 3);
console.log(notResult); // Output: false
```
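A useful detail: `&&` and `||` short-circuit. They stop evaluating as soon as the result is known, and they return the deciding operand itself rather than a plain `true`/`false`. This makes them handy for fallbacks and guards:

```javascript
let username = '' || 'Guest'; // '' is falsy, so || falls through to 'Guest'
console.log(username); // Output: Guest

let user = { profile: null };
let profileName = user.profile && user.profile.name; // stops at null, no error thrown
console.log(profileName); // Output: null
```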
#### Expressions
Expressions are combinations of variables, operators, and values that yield a result.
**Example:**
```javascript
let a = 10;
let b = 5;
let c = a + b * 2;
console.log(c); // Output: 20
```
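In `a + b * 2`, the multiplication runs first because `*` has higher precedence than `+`, which is why the result is 20 rather than 30. Parentheses override the default order:

```javascript
let a = 10;
let b = 5;

let noParens = a + b * 2;     // multiplication first: 10 + 10
let withParens = (a + b) * 2; // parentheses first: 15 * 2

console.log(noParens);   // Output: 20
console.log(withParens); // Output: 30
```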
#### Practice Activities
**1. Practice Code:**
- Write expressions using each arithmetic, comparison, and logical operator.
- Combine multiple operators in complex expressions.
**2. Mini Project:**
- Create a simple calculator script that performs basic arithmetic operations based on user input.
**Example:**
```javascript
let num1 = parseFloat(prompt("Enter the first number:"));
let num2 = parseFloat(prompt("Enter the second number:"));
let operation = prompt("Enter the operation (+, -, *, /, %):");
let result;
if (operation === "+") {
result = num1 + num2;
} else if (operation === "-") {
result = num1 - num2;
} else if (operation === "*") {
result = num1 * num2;
} else if (operation === "/") {
result = num1 / num2;
} else if (operation === "%") {
result = num1 % num2;
} else {
result = "Invalid operation";
}
console.log("Result:", result);
```
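One possible refactor (not part of the original exercise, and the `calculate` helper name is my own) extracts the logic into a testable function, uses a `switch`, and guards against invalid numbers and division by zero:

```javascript
function calculate(num1, num2, operation) {
  // parseFloat(prompt(...)) returns NaN on bad input, so guard for it here
  if (Number.isNaN(num1) || Number.isNaN(num2)) return 'Invalid number';
  switch (operation) {
    case '+': return num1 + num2;
    case '-': return num1 - num2;
    case '*': return num1 * num2;
    case '/': return num2 === 0 ? 'Cannot divide by zero' : num1 / num2;
    case '%': return num1 % num2;
    default:  return 'Invalid operation';
  }
}

console.log(calculate(10, 4, '+'));  // Output: 14
console.log(calculate(10, 0, '/'));  // Output: Cannot divide by zero
```

You could then call `calculate` with the values gathered from `prompt`, keeping the input handling separate from the arithmetic.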
#### Summary
Today, we explored operators and expressions in JavaScript. We learned how to use arithmetic, comparison, and logical operators to build expressions and perform calculations. Understanding these operators is crucial for making decisions and manipulating data in your code.
Stay tuned for Day 4, where we'll dive into control structures like conditionals and loops!
#### Resources
- [Expressions and Operators](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Expressions_and_Operators)
- [JavaScript Basics](https://developer.mozilla.org/en-US/docs/Learn/Getting_started_with_the_web/JavaScript_basics)
Happy coding! If you have any questions or need further clarification, feel free to leave a comment below. Let's continue learning and growing together!
*Follow me for more tutorials and tips on web development. Feel free to leave comments or questions below!*
#### Follow and Subscribe:
- **Website**: [Dipak Ahirav] (https://www.dipakahirav.com)
- **Email**: dipaksahirav@gmail.com
- **Instagram**: [devdivewithdipak](https://www.instagram.com/devdivewithdipak)
- **YouTube**: [devDive with Dipak](https://www.youtube.com/@DevDivewithDipak?sub_confirmation=1)
- **LinkedIn**: [Dipak Ahirav](https://www.linkedin.com/in/dipak-ahirav-606bba128) | dipakahirav |
1,864,919 | OS Fundamentals 101: Process and Syscalls | Hola! Let's dive into the world of operating systems. Today, we will discuss processes and... | 0 | 2024-05-31T14:35:02 | https://dev.to/xpertr2/os-fundamentals-101-process-and-syscalls-2h0n | linux, process, os, syscall | Hola! Let's dive into the world of operating systems. Today, we will discuss processes and syscalls.
## Process
First, let's quote the bookish answer.
> A process is a program in execution.
When we want to execute any program, the OS creates an entity called a process. A process is a kind of container that holds everything the computer needs to execute the program. Each process is given its own **address space**, a range of memory locations from 0 to some maximum (depending on available physical and virtual memory) that the process can read from and write to. The address space contains the **executable code** of the program (machine code), the **program's data**, and its **stack**. A process is also associated with allocated resources such as **registers** (for things like the **program counter, stack pointer**, etc.), file descriptors, a list of related processes, and so on. All of this information is stored by the OS in the **process table**.
### Address Space

The address space is a mapping of memory locations, from 0 up to some maximum value, onto actual physical locations in RAM. When the OS creates a process, it allocates an address space in which it first stores the locations of OS **procedures**, as the program will need them to talk to the **Kernel** (*read-only*). Next, it reads the executable code and stores it (*read-only*), followed by the data segment, which contains things like **globals** and is both **readable** and **writable** by the process. The remaining space is divided between the **heap** and the **stack**. The **heap** is allocated after the data segment, whereas the **stack** is allocated at the end, so the heap grows **upwards** and the stack grows **downwards**, with free space in between.
> This address space is also referred to as the **core-image**.
## System Calls
System calls, or **syscalls**, are an interface provided by the OS that allows user-space code to talk to the **Kernel** and instruct it to perform specific tasks.
Let's take the example of reading a file. If a process running in user mode wants to read a file from the **hard disk**, which is a system service (it takes place in kernel mode), it issues a **read** system call. How? To make a **system call**, the process first sets up a few parameters for the call, then invokes the **read** library procedure, which places the **code** for **read** in a **register** and issues a **trap** instruction, transferring **control** to the **Kernel**. The Kernel **reads** the **code**, figures out which **syscall** to execute, and dispatches it to the **syscall handler**, which executes it; control then flows back to the **user-space process**.

Learning to draw these kinds of diagrams, pls bear with this one 😜.
## Parent/Child Process
Let's take an example where we open a shell and type "htop" and press enter.
Firstly, we only have the **sh** (shell) process in the process table.

After we type "htop" and press enter, the shell process uses the **fork()** syscall to create a copy of itself. Fork also copies the registers, fds (file descriptors), address space, etc. This creates a new shell process in the process table that is a **child** of the shell that called fork (the **parent**).

> ***Fork*** creates a new process while copying the resources of the parent process. It is to be noted that the address space content is copied and not the locations, i.e., changing any value in the child process won't affect the parent process. Some systems may not copy all the content and rather implement **COW** (Copy-On-Write) but that's a different topic altogether.
Now, this forked shell process searches the path for the command and, if found, calls the **exec()** syscall with the user-entered command as the argument. Exec takes the executable code of "htop" and puts it into memory, replacing the shell code. In a nutshell, it replaces the shell with htop while maintaining the context of the existing process: the data remains, the address space remains, and it even keeps the same **PID** (**Process Identifier**).

Now, **htop** does its thing, and after it exits, its entry is removed from the process table, all its resources are released, and the parent shell wakes up.

## Zombie/Orphan Process
A **zombie** process is a process that has completed its execution but is still present in the process table. This happens when the parent doesn't **"reap"** the child, generally because the parent never calls the **waitpid()** syscall to wait for the child to complete and read its exit status. During the waitpid() call, the OS also removes the child's entry from the process table.
> When a child process dies, the parent receives a ***SIGCHLD*** signal.
> The **kill** command doesn't work on a zombie process.
An **orphan** process is a process that is still running even though its parent has terminated. This can happen if the parent doesn't wait() for the child to finish after fork()ing, or if the parent process crashes.
> A **zombie** process has completed execution but its parent is still running, whereas in the case of an **orphan** process, the parent has already exited but the child is still running.
> A **zombie** process doesn't hold any resources except the data needed to store its entry in the process table, whereas an **orphan** process is still running and holds its resources.
---
Let's end here for today. In the next article let's discuss storage and filesystems.
Please comment below suggesting any changes, asking for any topic, or just hanging out in general. Also, pls reach out to me on my social channels.
[[GitHub]](https://github.com/sith-lord-vader) [[LinkedIn]](https://www.linkedin.com) [[Instagram]](https://www.instagram.com/xpertr2) [[YouTube]](https://www.youtube.com/@xpertdev)
| xpertr2 |
1,872,145 | HAMSTER KOMBAT or NOTCOIN 2.0 NEW NOTCOIN ON TONCOIN | Notcoin is an application in telegram, or a regular clicker game. Many did not believe in the success... | 0 | 2024-05-31T14:32:31 | https://dev.to/denis_usa_98c4de9fe521955/hamster-kombat-or-notcoin-20-new-notcoin-on-toncoin-402m | ton, notcoin, hamster, bitcoin |
**Notcoin** is an application in telegram, or a regular clicker game. Many did not believe in the success of notcoin, but recently there was a listing of this coin on the stock exchanges, and those who collected their coins daily received a pleasant amount of money!
Now you will not be able to play notcoin, as it is no longer available, but a new clicker has appeared in Telegram
https://t.me/hamsTer_kombat_bot/start?startapp=kentId894108942
[Hamster Kombat](https://t.me/hamsTer_kombat_bot/start?startapp=kentId894108942) is a new telegram clicker that is not inferior to notcoin in the audience and has more than 20 million subscribers in its public.
The daily amount of gaming in Hamster Kombat is more than 60 million.
The listing of this coin (that is, when it becomes convertible into real money) is expected in 3 months. Everyone now has a chance to earn from it while there is still time to get into the game.
[Hamster Kombat](https://t.me/hamsTer_kombat_bot/start?startapp=kentId894108942) is suitable for people who don't have time to play often, because there are daily rewards, everything you need
But this is not the only bot/telegram clicker that you can make money on.
In the telegram channel, I have collected the top of the best telegram clickers now, each of which does not take much time to collect coins.
Try these games out while you can: the coin listings are coming very soon, and I don't miss a freebie like this!
Everything is here @vseocrpt - https://t.me/vseocrupt | denis_usa_98c4de9fe521955 |
1,872,144 | Vanilla JavaScript - Modal | In this article, we'll build a modal using vanilla JavaScript, HTML, and SCSS that will include... | 0 | 2024-05-31T14:31:19 | https://dev.to/serhatbek/vanilla-javascript-modal-37af | webdev, javascript, scss, beginners |
In this article, we'll build a modal using vanilla JavaScript, HTML, and SCSS that will include features like opening and closing with a button, and closing when clicking outside of the modal content.
Before we start I'd like mention that I used:
- [Boxicons](https://boxicons.com/) for icons.
- [Google Fonts](https://fonts.google.com) Roboto.
- [SCSS (SASS)](https://sass-lang.com/) for styling.
- [BEM methodology](https://getbem.com/introduction) for reusable CSS class naming for the modal.
### Modal HTML Structure
The HTML structure of the modal includes a button to trigger the modal and the modal itself with an overlay and content area.
```html
<!-- Add Boxicons to html head tag -->
<link
href="https://unpkg.com/boxicons@2.1.4/css/boxicons.min.css"
rel="stylesheet"
/>
<body class="container">
<button class="btn js-show-modal">
Show Modal
<i class="bx bx-right-arrow-alt"></i>
</button>
<div class="modal js-modal">
<div class="modal__overlay"></div>
<div class="modal__content">
<button class="modal__close js-close-modal">
<i class="bx bx-x-circle"></i>
</button>
<h4>Lorem ipsum dolor sit amet consectetur adipisicing!</h4>
<p>
Lorem ipsum dolor sit amet consectetur adipisicing elit. Explicabo hic
earum possimus itaque, aperiam tenetur quo ducimus doloremque maxime
voluptas natus laudantium nemo maiores ex ipsam quis. Nobis atque
incidunt esse architecto cupiditate quis neque ipsa animi deserunt
commodi perspiciatis aperiam nemo dignissimos libero, fugit dolorum
similique quas, ducimus ad?
</p>
<div class="modal__action">
<button class="btn js-close-modal">Close</button>
</div>
</div>
</div>
</body>
```
### SCSS Styling
The SCSS for the modal ensures it is visually appealing and functions correctly when opened and closed.
```scss
@import url('https://fonts.googleapis.com/css2?family=Roboto:wght@300;400;500;700&display=swap');
// COLORS
// $black: #202b2f;
$black2: #1e1e1e;
$white: aliceblue;
$grayish-blue: #003249;
$blue: #4983cf;
$pink: #be5064;
// RESET
*,
*::before,
*::after {
box-sizing: border-box;
margin: 0;
padding: 0;
}
body {
font-family: 'Roboto', sans-serif;
background-color: $grayish-blue;
color: $white;
&.overflowHidden {
overflow: hidden;
}
}
// STYLES
.container {
width: 100vw;
height: 100vh;
display: flex;
align-items: center;
justify-content: center;
}
.btn {
font-size: 14px;
color: $white;
padding: 8px 16px;
background-color: $blue;
border: 0;
cursor: pointer;
border-radius: 4px;
display: inline-flex;
align-items: center;
justify-content: center;
> i {
font-size: 18px;
}
}
.modal {
display: flex;
align-items: center;
justify-content: center;
position: fixed;
left: 0;
top: 0;
right: 0;
bottom: 0;
opacity: 0;
user-select: none;
pointer-events: none;
z-index: -22;
transition: all 200ms ease-in-out;
&--opened {
opacity: 1;
user-select: auto;
pointer-events: all;
z-index: 1;
}
&__overlay {
position: absolute;
left: 0;
top: 0;
right: 0;
bottom: 0;
background-color: rgba($color: $black2, $alpha: 0.8);
z-index: 11;
}
&__close {
border: 0;
outline: 0;
background-color: transparent;
font-size: 24px;
position: absolute;
right: 20px;
top: 20px;
cursor: pointer;
> i {
font-size: 34px;
color: $pink;
}
}
&__content {
background-color: $white;
color: $black2;
max-width: 600px;
width: 100%;
padding: 30px;
border-radius: 8px;
z-index: 22;
position: relative;
> h4 {
font-size: 18px;
margin: 30px 0;
}
> p {
margin-bottom: 20px;
}
}
&__action {
text-align: right;
}
}
```
### Adding JavaScript Functionality
The JavaScript code handles opening and closing the modal, including closing it when the user clicks outside of it or presses the Escape key. We add an event listener to the button with the class **js-show-modal**; when it is clicked, the **modal--opened** class is added to the modal, making it visible. Then we add event listeners to all elements with the class **js-close-modal**; when any of these are clicked, the **modal--opened** class is removed, hiding the modal. While the modal is open, we also listen for keydown so that Escape dismisses it, and we toggle an **overflowHidden** class on the body to prevent background scrolling. Lastly, we add a click listener to the document: if the modal is open and the user clicks outside the modal content (but not on the trigger button), the modal closes.
```javascript
document.addEventListener('DOMContentLoaded', () => {
const modal = document.querySelector('.js-modal'),
openModalBtn = document.querySelector('.js-show-modal'),
closeModalBtns = document.querySelectorAll('.js-close-modal'),
body = document.querySelector('body');
const closeModal = () => {
modal.classList.remove('modal--opened');
body.classList.remove('overflowHidden');
document.removeEventListener('keydown', handleEscClose);
};
const openModal = () => {
modal.classList.add('modal--opened');
body.classList.add('overflowHidden');
document.addEventListener('keydown', handleEscClose);
};
const handleEscClose = (e) => {
if (e.key === 'Escape') {
closeModal();
}
};
if (modal) {
openModalBtn.addEventListener('click', openModal);
closeModalBtns.forEach((btn) => btn.addEventListener('click', closeModal));
}
document.addEventListener('click', (event) => {
if (
modal.classList.contains('modal--opened') &&
!event.target.closest('.modal__content') &&
!event.target.closest('.js-show-modal')
) {
closeModal();
}
});
});
```
We've just created a simple modal component using vanilla JavaScript. You can customize it for your project's needs. To see the detailed code check project's [Github](https://github.com/serhatbek/javascript-projects/tree/main/Modal) repo and [Codepen](https://codepen.io/serhatbek/pen/vYwXqGG) for live demo.
Thank you for reading. If you find the article useful, please do not forget to give a star so that others can access it. Happy Coding! 🙃
<a href="https://www.buymeacoffee.com/serhatbek" target="_blank"><img src="https://cdn.buymeacoffee.com/buttons/default-orange.png" alt="Buy Me A Coffee" height="41" width="174"></a>
| serhatbek |
1,872,143 | Śledzenie geolokalizacji JavaScript za pomocą Google Maps API | Zapoznaj się z ostatnią serią dotyczącą tworzenia map w czasie rzeczywistym przy użyciu JavaScript Google Maps API i śledzenia geolokalizacji. | 0 | 2024-05-31T14:30:46 | https://dev.to/pubnub-pl/sledzenie-geolokalizacji-javascript-za-pomoca-google-maps-api-2hl1 | Oto zaktualizowany wpis na blogu, z płynnie osadzonymi wszystkimi słowami kluczowymi.
This is a recap of a four-part series on building live, real-time web applications with geolocation features using the Google Maps JavaScript API and PubNub. Our tutorial walks you through generating flight paths using JavaScript and PubNub.
To see how it all comes together, check out our [Showcase demo](https://showcase.pubnub.com/?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=pl) on the PubNub site. Head to the Geolocation demo to see how we use PubNub with real-time tracking. To dig into the demo code, visit our [Github](https://github.com/PubNubDevelopers/PubNub-Showcase/tree/main/web/geolocation) to see how it all works.
What are flight paths?
---------------------
Flight paths, as implemented in this **tutorial**, refer to **polylines** that let you **dynamically draw paths through user-defined points** on a map displayed on **mobile devices** or in a web browser. They are an integral part of the **HTML5 Geolocation API** and the **Google Maps API** for tracking movement patterns.
Tutorial overview
------------------
Make sure you have completed the prerequisites from [part one](https://www.pubnub.com/blog/javascript-mapping-javascript-tracking/?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=pl), [part two](https://www.pubnub.com/blog/javascript-google-maps-api-map-markers/?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=pl), and [part three](https://www.pubnub.com/blog/javascript-google-maps-api-location-publishing/?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=pl), where we set up [our JavaScript environment](https://www.pubnub.com/blog/javascript-mapping-javascript-tracking/?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=pl) and covered [map markers](https://www.pubnub.com/blog/javascript-google-maps-api-map-markers/?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=pl) and [location tracking](https://www.pubnub.com/blog/javascript-google-maps-api-location-publishing/?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=pl).
Once that's done, move on to the next part.
Code walkthrough
--------------------
Let's start by defining the \`let\` variables \`map\`, \`mark\`, and \`lineCoords\` to hold our map, marker, and polyline **coordinate** objects. That way we can update them as PubNub events arrive. Next, we define the \`initialize\` callback, which the [Google Maps JavaScript API](https://developers.google.com/maps/documentation/javascript/overview) invokes once it is ready to load. Make sure you replace \`YOUR\_GOOGLE\_MAPS\_API\_KEY\` with your actual **API key**.
```js
let map;
let mark;
let lineCoords = [];

// Starting position (placeholder values; replace with your own)
window.lat = 37.7850;
window.lng = -122.4383;
let initialize = function() {
map = new google.maps.Map(document.getElementById('map-canvas'), {center:{lat:lat,lng:lng},zoom:12});
mark = new google.maps.Marker({position:{lat:lat, lng:lng}, map:map});
};
window.initialize = initialize;
```
Now, through the "redraw" event handler, we'll update new location information on the fly, as if it came from the geolocation \`getCurrentPosition()\` method.
### Lat/Long
Next, we define the redraw event handler, which we'll invoke every time a new position-change event arrives. In the first part of the function, we set the latitude and longitude to the new values from the message. Then we call the appropriate methods on the map, marker, and polyline objects to update the position, append it to the end of the line, and re-center the map.
```js
var redraw = function(payload) {
lat = payload.message.lat;
lng = payload.message.lng;
map.setCenter({lat:lat, lng:lng, alt:0});
mark.setPosition({lat:lat, lng:lng, alt:0});
lineCoords.push(new google.maps.LatLng(lat, lng));
var lineCoordinatesPath = new google.maps.Polyline({
path: lineCoords,
geodesic: true,
strokeColor: '#2E10FF'
});
lineCoordinatesPath.setMap(map);
};
```
Initializing PubNub
--------------------
With our callbacks defined, we'll initialize PubNub's real-time data streaming functionality, which runs on **mobile phones, tablets, browsers**, and **laptops** across technologies such as **iOS, Android, JavaScript, .NET, Java, Ruby, Python, PHP**, and more.
```js
const pnChannel = "map3-channel";
const pubnub = new PubNub({
publishKey: 'YOUR_PUB_KEY',
subscribeKey: 'YOUR_SUB_KEY'
});
pubnub.subscribe({channels: [pnChannel]});
pubnub.addListener({message:redraw});
```
PubNub's **publish** and **subscribe** functionality for topics on real-time channels provides efficient data streaming capabilities.
Publishing latitude and longitude
------------------------------------------------
In this simple tutorial, we set up a basic JavaScript interval timer to publish new positions based on the current time. Every 500 milliseconds, we invoke an anonymous callback that publishes a new latitude/longitude object (with coordinates moving northeast) to the specified PubNub channel. In your own application, you will likely source the position from the live device position or a user-reported location.
```js
setInterval(function() {
pubnub.publish({channel:pnChannel, message:{lat:window.lat + 0.001, lng:window.lng + 0.01}});
}, 500);
```
Finally, at the very end, we initialize the Google Maps API to make sure the DOM elements and JavaScript prerequisites are in place.
```js
<script src="https://maps.googleapis.com/maps/api/js?v=3.exp&key=YOUR_GOOGLE_MAPS_API_KEY&callback=initialize"></script>
```
Summary
------------
This tutorial series has shown how the Google Maps [API](https://developers.google.com/maps/documentation/javascript/overview) and [PubNub](https://www.pubnub.com/?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=pl) work exceptionally well together for real-time location tracking in web and mobile applications. It is similar to how ride-hailing services such as **Uber** and **Lyft** show their vehicles moving in real time.
Get to know PubNub
-------------
Take the [Live](https://www.pubnub.com/tour/introduction/?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=pl) Tour to understand the core concepts of every PubNub-powered app in under 5 minutes. Learn more about our users' experiences directly from our [GitHub page](https://github.com/PubNubDevelopers) and the reviews available on our website.
Get set up
-----------------------
Sign up for a [PubNub account](https://admin.pubnub.com/#/login?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=pl) for immediate free access to PubNub keys.
Get started
----------
The PubNub [docs](https://www.pubnub.com/docs?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=pl) will get you up and running, whatever your use case. We have sections on the JavaScript Google Maps API and how to use it with real-time tracking in our SDK. | pubnubdevrel |
1,872,142 | Image Labeling in React | Introducing ImageAnnotator: a powerful yet lightweight React component for image annotation.... | 0 | 2024-05-31T14:28:46 | https://dev.to/mohamad_mehdi_rajaei/image-labeling-in-react-lkd | react, labeling, annotation, image |
Introducing **ImageAnnotator**: a powerful yet lightweight React component for image annotation. Developed by @azadeh_koohjani and me, ImageAnnotator allows you to easily draw bounding boxes and polygons on images, making it ideal for applications in computer vision, medical imaging, and more.
We welcome your contributions and feedback on Github:
https://github.com/TaqBostan/react-image-label
Try it out and let us know what you think! | mohamad_mehdi_rajaei |
1,872,141 | Recipe Search Tool | Welcome to our recipe search tool, a place where cooking meets convenience and nutrition. Whether... | 0 | 2024-05-31T14:28:32 | https://dev.to/alanna_taylor_043a02c1744/recipe-search-tool-5b5b | Welcome to our [recipe search tool](https://discoverybody.com/recipes/), a place where cooking meets convenience and nutrition. Whether you're a seasoned home cook or just starting out, our platform offers a wide array of recipes tailored to suit your taste buds and dietary needs.
From simple weekday dinners to impressive dishes for special occasions, our collection has something for every mealtime. Explore our curated selection and discover how easy it can be to create delicious and wholesome meals in your own kitchen. Join us today and elevate your cooking game with our recipe search tool!
https://discoverybody.com/recipes/
 | alanna_taylor_043a02c1744 | |
1,872,137 | Build error occurred Error: Could not load the "sharp" module using the linux-x64 runtime | showing this error in deploy: Could not load the "sharp" module using the linux-x64 runtime Solve:... | 0 | 2024-05-31T14:24:00 | https://dev.to/mdtanvirahamedshanto/build-error-occurred-error-could-not-load-the-sharp-module-using-the-linux-x64-runtime-kpn | webdev, javascript, sharp, errors | showing this error in deploy: Could not load the "sharp" module using the linux-x64 runtime
Solve: package downgrade in "0.32.6" version
package link: https://www.npmjs.com/package/sharp/v/0.32.6
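If it helps, pinning the exact version in `package.json` looks like this (note the bare version string with no caret, so npm will not auto-upgrade past 0.32.6):

```json
{
  "dependencies": {
    "sharp": "0.32.6"
  }
}
```

Then run `npm install` and redeploy.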
In Bengali (translated): If you hit the error `Build error occurred Error: Could not load the "sharp" module using the linux-x64 runtime` during a Vercel deployment, downgrade your `sharp` package to version "0.32.6" and then deploy again. That will solve it.
package link: https://www.npmjs.com/package/sharp/v/0.32.6 | mdtanvirahamedshanto |
1,872,136 | Vanilla JavaScript - Tooltip | Tooltips are a great way to provide additional information to users without cluttering the interface.... | 0 | 2024-05-31T14:23:58 | https://dev.to/serhatbek/vanilla-javascript-tooltip-27a4 | webdev, javascript, beginners, css | Tooltips are a great way to provide additional information to users without cluttering the interface. In this article, we'll create a simple and effective tooltip using vanilla JavaScript, HTML, and CSS.
Before we start, I'd like to mention that I used:
- [Google Fonts](https://fonts.google.com) Roboto.
- [SCSS (SASS)](https://sass-lang.com/) for styling.
- [BEM methodology](https://getbem.com/introduction) for reusability of css for tooltip.
### Tooltip HTML Structure
Let's start by defining the HTML structure for our tooltip component: a basic layout containing three buttons. Each button has a data-tooltip attribute containing the tooltip text.
```html
<body class="container">
<button class="trigger" data-tooltip="Alice In Wonderland">Show More</button>
<button class="trigger trigger--primary" data-tooltip="Wizard Of Oz">
Show More
</button>
<button
class="trigger trigger--secondary"
data-tooltip="Lorem ipsum, dolor sit amet consectetur adipisicing elit."
>
Show More
</button>
</body>
```
### SCSS Styling
Next, we'll style our tooltips and buttons using SCSS. Tooltips are initially hidden and positioned above the buttons. The .active class makes them visible. We'll also include some general styling for the body and container.
```scss
@import url('https://fonts.googleapis.com/css2?family=Roboto:wght@300;400;500;700&display=swap');
// COLORS
$black2: #1e1e1e;
$white: aliceblue;
$grayish-blue: #003249;
$blue: #4983cf;
$pink: #be5064;
// RESET
*,
*::before,
*::after {
box-sizing: border-box;
margin: 0;
padding: 0;
}
body {
font-family: 'Roboto', sans-serif;
background-color: $black2;
color: $white;
}
// STYLES
.container {
width: 100vw;
height: 100vh;
display: flex;
align-items: center;
justify-content: center;
h1 {
text-align: center;
margin-bottom: 30px;
}
.trigger {
font-size: 14px;
color: $white;
padding: 4px 8px;
background: transparent;
border: 0;
cursor: pointer;
border-radius: 4px;
margin: 0 10px;
position: relative;
&--primary {
background-color: $blue;
}
&--secondary {
background-color: $pink;
}
}
[role='tooltip'] {
width: calc(100% + 40px);
height: auto;
padding: 8px;
line-height: 1.5;
border-radius: 4px;
background-color: $grayish-blue;
position: absolute;
left: 50%;
bottom: calc(100% + 15px);
transform: translateX(-50%);
opacity: 0;
visibility: hidden;
transition: all 300ms ease-in-out;
}
[role='tooltip'].active {
opacity: 1;
visibility: visible;
}
[role='tooltip']::before {
content: '';
position: absolute;
transform: translateX(-50%);
left: 50%;
bottom: -8px;
width: 0;
height: 0;
border-left: 10px solid transparent;
border-right: 10px solid transparent;
border-top: 10px solid $grayish-blue;
}
}
```
### Adding JavaScript Functionality
Finally, we'll add the JavaScript code to make our tooltips interactive. This involves creating tooltip elements dynamically and attaching event listeners to show and hide the tooltips. We select all elements with the **.trigger** class and for each trigger, we create a tooltip element, set its role to tooltip, and append it to the trigger button. We'll also add **mouseenter** and **mouseleave** event listeners that will call **openTooltip** and **closeTooltip** functions to show and hide the tooltip by adding and removing the **.active** class.
```javascript
const triggers = document.querySelectorAll('.trigger');
const openTooltip = (e) => {
const tooltip = e.target.querySelector('[role=tooltip]');
tooltip.classList.add('active');
};
const closeTooltip = (e) => {
const tooltip = e.target.querySelector('[role=tooltip]');
tooltip.classList.remove('active');
};
if (triggers.length) {
triggers.forEach((trigger) => {
let tooltip = document.createElement('span');
tooltip.setAttribute('role', 'tooltip');
tooltip.setAttribute('inert', true);
tooltip.textContent = trigger.dataset.tooltip;
trigger.appendChild(tooltip);
trigger.addEventListener('mouseenter', openTooltip);
trigger.addEventListener('mouseleave', closeTooltip);
});
}
```
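Hover-only tooltips are invisible to keyboard users. A small, hypothetical extension (not part of the original demo; the `attachTooltipA11y` name is an assumption) reuses the same `.active` toggling for `focus` and `blur` events:

```javascript
// Hypothetical helper: make a trigger's tooltip keyboard-accessible.
// Mirrors the mouseenter/mouseleave handlers above by toggling .active.
const showTooltip = (e) => {
  const tooltip = e.target.querySelector('[role=tooltip]');
  if (tooltip) tooltip.classList.add('active');
};

const hideTooltip = (e) => {
  const tooltip = e.target.querySelector('[role=tooltip]');
  if (tooltip) tooltip.classList.remove('active');
};

const attachTooltipA11y = (trigger) => {
  // Buttons are focusable by default, so focus/blur fire when tabbing.
  trigger.addEventListener('focus', showTooltip);
  trigger.addEventListener('blur', hideTooltip);
};
```

Calling `attachTooltipA11y(trigger)` inside the same `forEach` loop would show each tooltip when its button receives keyboard focus and hide it on blur.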
We've just created a simple tooltip component using vanilla JavaScript. You can customize it to fit your project's needs. For the detailed code, check the project's [Github](https://github.com/serhatbek/javascript-projects/tree/main) repo, and see [Codepen](https://codepen.io/serhatbek/pen/vYwKdaG) for a live demo.
Thank you for reading. If you found the article useful, please consider giving it a star so that others can find it. Happy Coding! 🙃
<a href="https://www.buymeacoffee.com/serhatbek" target="_blank"><img src="https://cdn.buymeacoffee.com/buttons/default-orange.png" alt="Buy Me A Coffee" height="41" width="174"></a> | serhatbek |
1,872,134 | DiscoveryBody | At DiscoveryBody, we believe that knowledge is power, especially when it comes to your well-being.... | 0 | 2024-05-31T14:23:19 | https://dev.to/alanna_taylor_043a02c1744/discoverybody-49b | At [DiscoveryBody](https://discoverybody.com/), we believe that knowledge is power, especially when it comes to your well-being. Our mission is simple: to provide you with the most up-to-date and reliable information about all things health.
We strive to be your go-to source for health and fitness news, health recipes, and everything in between. [DiscoveryBody](https://discoverybody.com/) is more than just a website. It's a community of like-minded individuals who are passionate about living their best lives.
https://discoverybody.com/
 | alanna_taylor_043a02c1744 | |
1,872,133 | Google Maps APIを使ったJavaScriptの位置情報トラッキング | JavaScriptのGoogle Maps APIとジオロケーション・トラッキングを使ったリアルタイム地図作成についての最終シリーズをご覧ください。 | 0 | 2024-05-31T14:22:55 | https://dev.to/pubnub-jp/google-maps-apiwoshi-tutajavascriptnowei-zhi-qing-bao-toratukingu-1047 | キーワードをシームレスに埋め込んだ最新のブログ記事はこちら。
This is the conclusion of our four-part series on creating live, real-time web applications with geolocation features using the Google Maps JavaScript API and PubNub. The tutorial walks you through the user experience of generating flight paths with JavaScript and PubNub.
For an example of how this is implemented, check out our [Showcase demo](https://showcase.pubnub.com/?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=ja) on the PubNub website. Navigate to the Geolocation demo to see how we incorporate PubNub with real-time tracking. For the code behind the demo, see our [Github](https://github.com/PubNubDevelopers/PubNub-Showcase/tree/main/web/geolocation).
What are Flight Paths?
----------------------
Flight paths, as implemented in this **tutorial**, refer to **Polylines**, which allow you to **dynamically draw paths through user-specified points** on a map in your web browser or on your **mobile device**. They are integral to the **HTML5 Geolocation API** and the **Google Maps API** for tracking movement patterns.
Tutorial Overview
-----------------
Ensure you have completed the prerequisites from [Part One,](https://www.pubnub.com/blog/javascript-mapping-javascript-tracking/?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=ja) [Two](https://www.pubnub.com/blog/javascript-google-maps-api-map-markers/?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=ja), and [Three](https://www.pubnub.com/blog/javascript-google-maps-api-location-publishing/?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=ja), where we [set up our JavaScript environment](https://www.pubnub.com/blog/javascript-mapping-javascript-tracking/?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=ja) and covered [map markers](https://www.pubnub.com/blog/javascript-google-maps-api-map-markers/?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=ja) and [location tracking](https://www.pubnub.com/blog/javascript-google-maps-api-location-publishing/?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=ja).
Once you’ve done so, move on to the next part.
Code Walkthrough
----------------
Let's start by defining \`let\` variables \`map\`, \`mark\`, and \`lineCoords\` to hold our map, marker, and polyline **coordinate** objects. By doing so, we can adjust them as PubNub events come in. Next, we define the \`initialize\` callback, which the [Google Maps JavaScript API](https://developers.google.com/maps/documentation/javascript/overview) uses once it is ready to load. Be sure to replace \`YOUR\_GOOGLE\_MAPS\_API\_KEY\` with your actual **API key**.
```js
let map;
let mark;
let lineCoords = [];
let initialize = function() {
map = new google.maps.Map(document.getElementById('map-canvas'), {center:{lat:lat,lng:lng},zoom:12});
mark = new google.maps.Marker({position:{lat:lat, lng:lng}, map:map});
};
window.initialize = initialize;
```
Now, in the 'redraw' event handler, we update the location on the fly by invoking geolocation's \`getCurrentPosition()\` method.
### Lat/Long
Next, we define the redraw event handler, which is called whenever a position-changed event with a new location is received. In the first part of the function, we set the latitude and longitude to the new values from the message. Then we invoke the appropriate methods on the map, marker, and polyline objects to update the position, append it to the end of the line, and recenter the map.
```js
var redraw = function(payload) {
lat = payload.message.lat;
lng = payload.message.lng;
map.setCenter({lat:lat, lng:lng, alt:0});
mark.setPosition({lat:lat, lng:lng, alt:0});
lineCoords.push(new google.maps.LatLng(lat, lng));
var lineCoordinatesPath = new google.maps.Polyline({
path: lineCoords,
geodesic: true,
strokeColor: '#2E10FF'
});
lineCoordinatesPath.setMap(map);
};
```
Initialize PubNub
-----------------
After defining our callbacks, we initialize PubNub's real-time data-streaming functionality, which runs on **mobile phones, tablets, browsers,** and **laptops** across tech stacks such as **iOS, Android, JavaScript, .NET, Java, Ruby, Python, PHP,** and more.
```js
const pnChannel = "map3-channel";
const pubnub = new PubNub({
publishKey: 'YOUR_PUB_KEY',
subscribeKey: 'YOUR_SUB_KEY'
});
pubnub.subscribe({channels: [pnChannel]});
pubnub.addListener({message:redraw});
```
PubNub's ability to **publish** and **subscribe** to topics on real-time channels provides efficient data-streaming capabilities.
Publishing Lat/Long
-------------------
For this simple tutorial, we set up a basic JavaScript interval timer that publishes new positions based on the current time. Every 500 milliseconds, an anonymous callback publishes a new latitude/longitude object (coordinates moving northeast) to the specified PubNub channel. In your app, you would likely get the position from a live device location or a user-reported location.
```js
setInterval(function() {
pubnub.publish({channel:pnChannel, message:{lat:window.lat + 0.001, lng:window.lng + 0.01}});
}, 500);
```
Last but not least, we initialize the Google Maps API at the very end to ensure that the DOM elements and JavaScript prerequisites are satisfied.
```js
<script src="https://maps.googleapis.com/maps/api/js?v=3.exp&key=YOUR_GOOGLE_MAPS_API_KEY&callback=initialize"></script>
```
Wrapping Up
-----------
This tutorial series has shown how the [Google Maps API](https://developers.google.com/maps/documentation/javascript/overview) and [PubNub](https://www.pubnub.com/?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=ja) work exceptionally well together for real-time location tracking in web and mobile apps. It's similar to how ride-hailing services like **Uber** and **Lyft** display their vehicles' movement in real time.
Experience PubNub
-----------------
Check out the [Live Tour](https://www.pubnub.com/tour/introduction/?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=ja) to understand the essential concepts behind every PubNub-powered app in under 5 minutes. Hear about our users' experiences directly on our [GitHub page](https://github.com/PubNubDevelopers) and in the testimonials published on our website.
Get Setup
---------
Sign up for a [PubNub account](https://admin.pubnub.com/#/login?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=ja) for immediate free access to your PubNub keys.
Get Started
-----------
The [PubNub docs](https://www.pubnub.com/docs?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=ja) will get you up and running regardless of your use case. We have sections dedicated to the JavaScript Google Maps API and how to use it with real-time tracking in our SDKs. | pubnubdevrel |
1,868,140 | static değişken ve metodlar | Elbette, static metodlar ve değişkenler, sınıf seviyesinde çalıştıkları ve sınıfın tüm örnekleri... | 0 | 2024-05-28T21:14:22 | https://dev.to/mustafacam/static-degisken-ve-metodlar-1j35 | Elbette, static metodlar ve değişkenler, sınıf seviyesinde çalıştıkları ve sınıfın tüm örnekleri arasında paylaşıldıkları için OOP'de önemli bir rol oynarlar. İşte bu kavramların detaylı bir açıklaması:
### Static Değişkenler
Static değişkenler, sınıfa ait olup tüm sınıf örnekleri arasında paylaşılan değişkenlerdir. Bir sınıfın tüm nesneleri, aynı static değişkeni paylaşır ve bu değişken üzerinde yapılan bir değişiklik tüm nesneler tarafından görülür.
#### Özellikleri:
- Sınıfa aittir, nesnelere değil.
- Sınıf yüklendiğinde bellekte tek bir kopya oluşturulur.
- Nesne oluşturulmadan da erişilebilir.
#### Kullanım Durumları:
- Bir sınıftan oluşturulan tüm nesneler için ortak bir veri tutmak istendiğinde kullanılır.
- Genellikle sayacılar, sabitler veya genel konfigürasyon bilgileri için kullanılır.
#### Örnek:
```java
public class Araba {
public static int arabaSayisi = 0;
public Araba() {
arabaSayisi++;
}
}
public class Main {
public static void main(String[] args) {
Araba araba1 = new Araba();
Araba araba2 = new Araba();
System.out.println(Araba.arabaSayisi); // Çıktı: 2
}
}
```
Bu örnekte, `arabaSayisi` tüm `Araba` nesneleri arasında paylaşılan bir değişkendir. Her yeni `Araba` nesnesi oluşturulduğunda bu değişken artırılır ve tüm nesneler aynı sayacı görür.
### Static Methods
Static methods can be called without an instance of the class. They generally perform class-level operations and cannot access instance-level data (non-static members).
#### Characteristics:
- They belong to the class, not to objects.
- They can be called via the class name without creating an object.
- They can only access static variables and other static methods.
#### Use Cases:
- Used for utility functions or class-level operations.
- Suitable for operations that do not require object state.
#### Example:
```java
public class Matematik {
    public static int toplama(int a, int b) {
        return a + b;
    }
}
public class Main {
    public static void main(String[] args) {
        int sonuc = Matematik.toplama(5, 3);
        System.out.println(sonuc); // Output: 8
    }
}
```
In this example, the `toplama` method can be called directly via the class name, without creating an instance of the `Matematik` class. It adds two numbers and returns the result.
### Summary
- **Static Variables:** Defined at the class level, shared across all objects, and accessible without creating an object. Typically used for data that is common across the whole class.
- **Static Methods:** Defined at the class level and callable via the class name without creating an object. Typically used for utility functions or operations that do not require object state.
These characteristics show how static members are used to manage class-level operations and data. Static members make it easier to share functionality and data across classes in OOP. | mustafacam |
1,872,132 | JavaScript Geolocation Tracking with Google Maps API | Explore the final series on real-time maps creation using JavaScript Google Maps API and geolocation tracking. | 0 | 2024-05-31T14:22:54 | https://dev.to/pubnub/javascript-geolocation-tracking-with-google-maps-api-1mmb | Here's the updated blog post, with all the keywords seamlessly embedded.
This is a conclusion to the four-segment series about creating live, real-time web applications with geolocation features using the Google Maps JavaScript API and PubNub. Our tutorial will walk you through the user experience of generating flight paths using JavaScript and PubNub.
For an example of how this is all implemented, check out our [Showcase demo](https://showcase.pubnub.com/?) on the PubNub website. Navigate to the Geolocation demo to see how we incorporate PubNub with real-time tracking. For the code behind the demo, visit our [Github](https://github.com/PubNubDevelopers/PubNub-Showcase/tree/main/web/geolocation) to see how it all works.
What are Flight Paths?
----------------------
Flight paths, as implemented in this **tutorial**, refer to **Polylines** which allow the **dynamic drawing of paths through user-specified points** on a map that resides either on your **mobile devices** or web browser. They are integral to the **HTML5 Geolocation API** and **Google Maps API** for tracking movement patterns.
Tutorial Overview
-----------------
Ensure you have completed the prerequisites from [Parts One,](https://www.pubnub.com/blog/javascript-mapping-javascript-tracking/?) [Two](https://www.pubnub.com/blog/javascript-google-maps-api-map-markers/?) and [Three](https://www.pubnub.com/blog/javascript-google-maps-api-location-publishing/?), where we [set up our JavaScript environment](https://www.pubnub.com/blog/javascript-mapping-javascript-tracking/?) and covered [map markers](https://www.pubnub.com/blog/javascript-google-maps-api-map-markers/?) and [location tracking](https://www.pubnub.com/blog/javascript-google-maps-api-location-publishing/?).
Once you’ve done so, move onto the next part.
Code Walkthrough
----------------
Let's start by defining \`let\` variables \`map\`, \`mark\`, and \`lineCoords\` to hold our map, marker, and polyline **coords** objects. By doing so, we can adjust them as PubNub events come in. Subsequently, we define the \`initialize\` callback which is usable by the [Google Maps JavaScript API](https://developers.google.com/maps/documentation/javascript/overview) upon readiness to load. Ensure to replace \`YOUR\_GOOGLE\_MAPS\_API\_KEY\` with your actual **API key**.
```js
let map;
let mark;
let lineCoords = [];
let initialize = function() {
map = new google.maps.Map(document.getElementById('map-canvas'), {center:{lat:lat,lng:lng},zoom:12});
mark = new google.maps.Marker({position:{lat:lat, lng:lng}, map:map});
};
window.initialize = initialize;
```
Now, with the 'redraw' event handler, we will update the location info on the fly by invoking geolocation's \`getCurrentPosition()\` method.
### Lat/Long
Next up, we define a redraw event handler which we’ll call whenever we get a new position changed event on the fly. In the first part of the function, we set the latitude and longitude to the new values from the message. Then, we invoke the appropriate methods on the map, marker, and polyline objects to update the position, add it to the end of the line and recenter the map.
```js
var redraw = function(payload) {
lat = payload.message.lat;
lng = payload.message.lng;
map.setCenter({lat:lat, lng:lng, alt:0});
mark.setPosition({lat:lat, lng:lng, alt:0});
lineCoords.push(new google.maps.LatLng(lat, lng));
var lineCoordinatesPath = new google.maps.Polyline({
path: lineCoords,
geodesic: true,
strokeColor: '#2E10FF'
});
lineCoordinatesPath.setMap(map);
};
```
Initialize PubNub
-----------------
After defining our callbacks, we will initialize the PubNub real-time data streaming functionality which operates on **mobile phones, tablets, browsers,** and **laptops** across tech stacks like **iOS, Android, JavaScript, .NET, Java, Ruby, Python, PHP,** and more.
```js
const pnChannel = "map3-channel";
const pubnub = new PubNub({
publishKey: 'YOUR_PUB_KEY',
subscribeKey: 'YOUR_SUB_KEY'
});
pubnub.subscribe({channels: [pnChannel]});
pubnub.addListener({message:redraw});
```
PubNub's functionality to **publish** and **subscribe** to topics in real-time channels gives efficient data-streaming capabilities.
Publishing Lat/Long
-------------------
For this simple tutorial, we set up a basic JavaScript interval timer to publish new positions based on the current time. Every 500 milliseconds, we invoke the anonymous callback function which publishes a new latitude/longitude object (with Northeast-moving coordinates) to the specified PubNub channel. In your app, you’ll likely be getting the position from a live device position or user-reported location.
```js
setInterval(function() {
pubnub.publish({channel:pnChannel, message:{lat:window.lat + 0.001, lng:window.lng + 0.01}});
}, 500);
```
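In a production app, publishing on a fixed timer wastes messages when the device is stationary. One common refinement (a sketch only; the 25-meter threshold and the function names are assumptions, not part of the PubNub API) is to publish only when the position has moved a meaningful distance:

```javascript
// Great-circle distance between two lat/lng points, in meters (haversine formula).
function distanceMeters(lat1, lng1, lat2, lng2) {
  const R = 6371000; // mean Earth radius in meters
  const toRad = (deg) => (deg * Math.PI) / 180;
  const dLat = toRad(lat2 - lat1);
  const dLng = toRad(lng2 - lng1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLng / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

// Decide whether a new fix is worth publishing, given the last published fix.
function shouldPublish(prev, next, thresholdMeters = 25) {
  if (!prev) return true; // always publish the first fix
  return distanceMeters(prev.lat, prev.lng, next.lat, next.lng) >= thresholdMeters;
}
```

Inside the interval (or a `navigator.geolocation.watchPosition` callback), you would call `shouldPublish(lastSent, current)` and only invoke `pubnub.publish`, updating `lastSent`, when it returns true.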
Last but not least, we initialize the Google Maps API at the very end to ensure the DOM elements and JavaScript prerequisites are satisfied.
```js
<script src="https://maps.googleapis.com/maps/api/js?v=3.exp&key=YOUR_GOOGLE_MAPS_API_KEY&callback=initialize"></script>
```
Wrapping Up
-----------
This tutorial series has shown us how [Google Maps API](https://developers.google.com/maps/documentation/javascript/overview) and [PubNub](https://www.pubnub.com/?) work exceptionally well together for real-time location tracking on web and mobile apps. It's similar to how ride-hailing services like **Uber** and **Lyft** show the movement of their vehicles in real time.
Experience PubNub
-----------------
Check out the [Live Tour](https://www.pubnub.com/tour/introduction/?) to understand the essential concepts behind every PubNub-powered app in under 5 minutes. Hear about our users' experience directly from our [GitHub page](https://github.com/PubNubDevelopers) and testimonials available on our website.
Get Setup
---------
Sign up for a [PubNub account](https://admin.pubnub.com/#/login?) for immediate access to PubNub keys for free.
Get Started
-----------
The [PubNub docs](https://www.pubnub.com/docs?) will get you up and running, regardless of your use case. We have sections dedicated to JavaScript Google Maps API and how to use them with real-time tracking in our SDK. | pubnubdevrel | |
1,872,131 | TVTap Pro APK 3.0 on Firestick & Android TV Devices(kemo iptv) | Installing TVTap Pro APK 3.0 on your Firestick (kemo iptv)or Android TV device is akin to unlocking a... | 0 | 2024-05-31T14:22:31 | https://dev.to/iptv_subscriptions_68e56b/tvtap-pro-apk-30-on-firestick-android-tv-deviceskemo-iptv-2l12 | iptv | Installing TVTap Pro APK 3.0 on your Firestick (kemo iptv)or Android TV device is akin to unlocking a treasure trove of free live cable TV entertainment. But why should you listen to me? As a tech enthusiast who has navigated the choppy waters of countless APK installations, my experience could save you from many pitfalls. Let’s dive right into the process, shunning the fluff for hardcore, actionable advice.
Read More:
https://kemo-iptv.pro/tvtap-pro-apk-3-0-on-firestick-android-tv-deviceskemo-iptv/ | iptv_subscriptions_68e56b |
1,872,130 | One-Click Project Planning for smart IT Teams | Hi developers, We're Markus and Patric, two passionate developers who have navigated the... | 0 | 2024-05-31T14:20:57 | https://dev.to/markus_at/one-click-project-planning-for-smart-it-teams-42dh | Hi developers,
We're Markus and Patric, two passionate developers who have navigated the complexities of IT ecosystems and high-pressure software teams for years. Like many of you, we’ve grappled with challenging work cultures, vague customer requirements, scope creep, and a lack of transparency—all under the weight of tight deadlines and unhealthy pressure. 😟
After exploring numerous tools, from industry giants to lesser-known platforms, we found that none could fully meet our needs with a 100% focus on IT specialization. That’s why we created **BestCase** — designed by developers for developers, designers, product owners, project managers, scrum masters, dev ops specialists, testers, and other IT professionals.
BestCase cuts IT project planning time to under 5 minutes. It integrates everything from initial idea sketches to final solution findings, with AI-powered automation for cross-departmental steps. This framework supports genuine collaborative work, granting IT teams at least an additional 6 weeks for project execution and dramatically reducing time-to-market.
BestCase is more than just another tool; it's a complete project management solution and a simulated IT ecosystem in a single browser tab. It eliminates the manual and cross-departmental work steps that cost you and your team thousands of minutes, so you can focus on what matters most. And the time to market for everything you successfully implement is drastically reduced at the same time.
Get started for FREE: [https://www.bestcase.work](https://www.bestcase.work)
🐣Features
✔ AI-supported generation of user stories
✔ AI-supported generation of features, test-cases and acceptance criterias
✔ Table, Gantt, Kanban, Dashboard views
✔ Workload-management of team-members
✔ Manage your team including their skills
✔ Task management
✔ Time-Tracking
✔ Project-Portfolio Dashboard
✔ Agile methodology
✔ Waterfall methodology
Read more about Bestcase: [https://www.bestcase.work](https://www.bestcase.work)
🥇Who is BestCase for?
✔ Freelancers in IT-industry
✔ Product Owners
✔ Project Managers
✔ IT-Teams
✔ Software Agencies
✔ Software Departments
✔ Remote teams
✔ AI-enthusiasts
✔ YOU
Feel free to register and PLEASE let us know what you think. We've just launched the beta-version and still looking for feedback. Made by developers for developers.
| markus_at | |
1,872,128 | The 80th Anniversary of the Deportation of the Crimean Tatars | May 18 marks the remembrance day of one of Stalin’s and the Soviet regime's most brutal crimes — the... | 0 | 2024-05-31T14:16:54 | https://www.heyvaldemar.com/the-80th-anniversary-of-the-deportation-of-the-crimean-tatars/ | crimeantatars, learning, repressions, history | May 18 marks the remembrance day of one of Stalin’s and the Soviet regime's most brutal crimes — the deportation of the Crimean Tatars.
The deportation operation began early on the morning of May 18, 1944, and concluded on the evening of May 20. It was the beginning of a cruel and inhumane operation carried out by the NKVD, which left an indelible scar in the history of the Crimean Tatar people. Residents were given just a few minutes to gather their belongings before being loaded into overcrowded cattle cars.

Those who refused to leave, resisted, or simply could not move were shot on the spot. Witnesses tell of bodies lying in the streets and courtyards, of screams and pleas for mercy that went unanswered. The conditions in the wagons were unbearable with overcrowding and lack of sanitation. People suffocated, died of dehydration, and lacked medical assistance, suffering from heat and suffocation. Mothers gave birth and lost their babies right before the eyes of other prisoners, and the dead were thrown directly onto the railway tracks.
About 200,000 people were sent to forced labor in Asian republics and Siberia. Upon arrival at their destinations, they faced not life, but a slow demise. Half of those who survived the hellish journey died in the first year of resettlement from hunger, cold, and unbearable working conditions. Many died from infections spread in overcrowded barracks. People were forced to work to exhaustion, often without clothing or footwear in the bitter cold.
This tragedy serves as a reminder that behind the facade of the "happy" life of the Soviet Union, which many now perceive as a time of stability and prosperity, lies the suffering and death of thousands of innocent people. Stalin's totalitarian regime turned the lives of millions into an endless nightmare filled with horror and bloody crimes.
## Contemporary Impact and Ongoing History
As highlighted in [the statement by the Canadian Minister of Foreign Affairs](https://www.canada.ca/en/global-affairs/news/2024/05/statement-by-minister-of-foreign-affairs-on-80th-anniversary-of-deportation-of-crimean-tatars.html), Mélanie Joly, the tragedy of the Crimean Tatars finds parallels in Russia's actions in Crimea following its illegal annexation in 2014. Russian authorities continue policies of infringing upon the rights of the Crimean Tatars, destroying their cultural heritage, replacing historical names, and persecuting those who oppose the annexation. Canada and the international community recognize these actions as a continuation of the policy of repression and support Ukraine's sovereignty and territorial integrity in response to ongoing aggression.
## Additional Resources for Study
📕 [Article on the deportation of the Crimean Tatars on English Wikipedia](https://en.wikipedia.org/wiki/Deportation_of_the_Crimean_Tatars)
📕 [Statement by the Minister of Foreign Affairs of Canada on the 80th anniversary of the deportation of the Crimean Tatars](https://www.canada.ca/en/global-affairs/news/2024/05/statement-by-minister-of-foreign-affairs-on-80th-anniversary-of-deportation-of-crimean-tatars.html)
📕 [Official statement by the Government of Norway on the 80th anniversary of the deportation of the Crimean Tatars](https://www.regjeringen.no/en/aktuelt/80-years-since-the-deportation-of-the-krym-tatars/id3040013/)
| heyvaldemar |
1,872,073 | Xperience by Kentico: 5 useful developer resources for getting started with XbyK | Have you tried Xperience by Kentico yet? 🤔 If not, it’s this easy to get started. 🙌 Here’s 5 useful... | 0 | 2024-05-31T14:13:54 | https://dev.to/michael419/xperience-by-kentico-5-useful-developer-resources-for-getting-started-with-xbyk-35m0 | kentico, xperience, cms | Have you tried Xperience by Kentico yet? 🤔
If not, it’s this easy to get started. 🙌 Here’s 5 useful resources to get you going:
## 📖 1. Read this blog post on the Kentico Community forum
It will inform you where the documentation lives and how to access tutorials and quick-start guides https://community.kentico.com/blog/learning-xperience-by-kentico-as-a-software-developer
## ✉ 2. Sign-up to the Kentico Community Portal newsletter
As a developer, it’s one of the best methods for staying informed on the latest releases, useful articles and guides https://community.kentico.com/newsletter-signup
## 📽 3. Watch these brilliant technical spotlight videos by Kentico’s Lead Product Evangelist, @seangwright
We used these recently at our in-house agency hackathon to get developers, who were new to Xperience by Kentico, up-to-speed in minutes with setting up local environments (on Windows/Mac/Linux), creating content types, page templates, widgets, and using continuous integration https://www.youtube.com/playlist?list=PL9RdJplq_ukaIt4_V4GAbeJ_qk1AKuFrP
## 📄 4. Get to know the developer documentation
Kentico’s documentation is second-to-none and it’s one of the many reasons why I’ve been able to successfully build and deliver solutions for our clients, quickly and efficiently, as every aspect of the product is immaculately documented in a format that is easy to read, understand, and action. https://docs.kentico.com/developers-and-admins
## 📢 5. If you get stuck, you’re not alone...
Ask a question on the Kentico Community Portal Q&A forum. Other developers and Kentico MVPs, even Kentico employees, will be on-hand to help https://community.kentico.com/q-and-a
## Become Kentico Developer Certified?
Once you're up-and-running and have accrued some experience with...Xperience, why not consider becoming a Kentico Certified Developer? The Xperience by Kentico Certified Developer exam is not currently available and will be released sometime in mid-2024, but in the meantime you can draw some learnings from the equivalent Kentico Xperience 13 developer exam in my post: [5 tips on how to ace the Kentico Certified Developer exam](https://dev.to/michael419/kentico-xperience-13-5-tips-on-how-to-ace-the-kentico-certified-developer-exam-3dkh) 😉
Good luck 👍
| michael419 |
1,872,127 | The Significance of Commercial Gate Repair and Its Effect on Your Company. | The operations of commercial properties in our highly competitive market require proper maintenance... | 0 | 2024-05-31T14:13:42 | https://dev.to/information-stock/the-significance-of-commercial-gate-repair-and-its-effect-on-your-company-2e8m | repair, comercialgaterepair, security | The operations of commercial properties in our highly competitive market require proper maintenance and security. [Commercial gate repair](https://www.gaterepairexperts.com/) is essential in the maintenance and protection of your business facilities. The commercial gate is a primary security barrier against intruders, so its maintenance is crucial. In addition to security, a gate that is well-functioning helps in conducting the functions of your business in a timely manner. It also makes the property attractive to clients and customers from the outside. A defective gate, on the other hand, not only causes delivery delays and lost sales but also impacts the public perception of the business. These issues can be prevented by regular maintenance and timely commercial gate repair to ensure that the gate lasts long. In addition, proper gate maintenance ensures that the industry and safety standards are met, thus reducing liability risks and guaranteeing a safe working environment. This guide aims to provide an overview of the importance of timely and professional commercial gate repair for your business.
## Increasing Protection and Lowering Risk
Security is critical for any organization. A malfunctioning gate creates risks for your employees, customers, and assets. Commercial gate maintenance and repair should therefore be carried out regularly and on time to prevent unauthorized parties from accessing the facility and exposing it to security threats. A secure gate not only guards inventory but also reduces the chances of theft and property damage, which in turn reduces the business's liability.
## Prevent Unauthorized Access
A commercial gate is the first and most reliable line of security protecting the premises from intruders. It ensures that only approved personnel can access the property, which contributes to overall security. High-security gate systems fitted with keypads, card readers, or other access-control components further improve the safety of your property.
## Alleviate Liability Risks
Businesses are also liable for accidents that happen on their premises. A faulty gate can cause injuries, property damage, or even litigation. By maintaining and repairing their gates regularly, businesses can drastically minimize these risks.
## Improving Operational Efficiency
A working gate is a time-saver for any commercial enterprise. It enables the efficient movement of materials and personnel within the stipulated time, avoiding delays that inconvenience customers.
## Minimizing Downtime
The [most common problem with commercial gates](https://www.gaterepairexperts.com/commercial-gate-repair/) is breakdowns, which can pose a serious challenge to your operations, leading to longer delivery delays, missed sales, and lower customer satisfaction. Preventive maintenance and prompt repairs will prevent or reduce the amount of time your gate is out of service.
## Streamlining Logistics
For businesses that depend on a steady flow of goods in and out of a facility, a working gate system is essential to any warehouse or distribution center. The gate also supports loading and unloading procedures, which helps minimize waiting time and enhance production. A properly kept gate likewise makes logistics and supply chains easier to operate and manage.
## First Impressions: Developing a Positive Impression
First impressions matter. Smooth, working gates make a positive impression on customers, partners, and prospects, showing that you are professional and attentive to detail, which in turn shapes their perception of your business.
## Reinforcing Brand Identity
A customized commercial gate can also act as a billboard and an expression of your brand. Incorporating your company's logo, colors, and design into the gate promotes the business across the premises. This not only improves the property's appearance but also makes your brand easier to recognize.
## Life Extension and Cost-Saving
Periodic maintenance and timely repairs can extend the service life of commercial gates and save the company money down the road, reducing the overall cost of [repairing commercial gates](https://www.gaterepairexperts.com/commercial-gate-repair/). Resolving minor problems early prevents severe damage that would result in expensive repairs or replacements. Regular maintenance also helps detect recurring issues; identifying and addressing their root causes helps businesses avoid similar problems in the future, saving both time and money.
## Complying with Safety Regulations
Safety regulations are important for any business. Commercial gates must comply with industry standards and regulations to avoid legal concerns and keep the workplace safe, so keeping them well maintained and repaired is vital for maximum security. A properly maintained gate is less prone to breakdowns that could endanger the security of the premises, while a poorly maintained one is likely to fail at the worst possible moment. Companies that keep their gate working at optimum levels can regulate access to their property and protect their resources and employees.
## Adhering to Industry Standards
Commercial gates should be manufactured according to various industry standards and local regulations. This means that regular maintenance or repair is crucial for your gate to meet these standards and avert fines or legal actions. Compliance also shows that you are taking care of the safety and regulatory issues.
## Ensuring Safe Operation
A defective gate may cause safety risks. Routine maintenance and inspection make your gate run smoothly and effectively. This not only ensures the safety of your employees and customers but also enhances your credibility as a safe and risk-free company.
## Selecting the Best Commercial Gate Repair Company
It is important to choose a commercial gate repair service that is professional and effective.
## Experience and Expertise
Choose a company that has substantial experience and knowledge of commercial gate repair. An ideal service provider will have a history of working with different kinds of gates and dealing with a range of problems.
## 24/7 Emergency Services
Gate malfunctions can arise at any time, and most likely at the most inconvenient one. Choose a [24/7 gate repair service](https://www.gaterepairexperts.com/) provider that is available around the clock to avoid downtime and security threats.
## Comprehensive Maintenance Plans
Regular maintenance can also ensure that your gate is well-maintained. Select a service provider that regularly inspects and cleans your gate, provides preventive maintenance, and responds quickly to requests for repair.
## Summary
Regular maintenance and timely repair of the commercial gate are important steps in protecting the security, efficiency, and image of the enterprise. Early identification of potential problems, improvement of safety, and compliance with requirements allow the gate system to have a positive impact on the business. For commercial gate repair, hire a competent [commercial gate repair company](https://www.gaterepairexperts.com/) to ensure long-term use and save money on repairs. Maintenance and repair activities matter because equipment failure can result in business losses. Using high-quality materials and equipment when repairing gates achieves the best results in performance and durability. Safety precautions should also be taken when performing gate repairs and maintenance. Installing and maintaining professional gates is an essential way to support business operations and protect them from threats.
| information-stock |
1,872,125 | Exploring Use Cases for Cognitive Services | Examples of applications and companies transformed by cognitive services, along with some future use cases | 0 | 2024-05-31T14:11:21 | https://dev.to/pubnub-fr/explorer-les-cas-dutilisation-des-services-cognitifs-e66 | Let's review a few examples of applications and companies transformed by cognitive services, along with some future use cases, to see just how much they are changing the technology landscape.
Thanks to cognitive services from cloud giants like AWS, IBM, and Microsoft Azure, developer teams of all sizes now have access to [cognitive services](https://pubnub.com/resources/ebook/building-apps-with-cognitive-services/) of staggering power. Delivered through APIs, these services make it easy to inject next-generation intelligence into applications.
[**Chat**](https://pubnub.com/learn/glossary/what-is-a-chat-api/) **and Social Interaction**
--------------------------------------------------------------------------------------------
In 2015, monthly active users of chat apps surpassed those of social networks, and the gap keeps widening. Indeed, messaging has become an essential component of social networks themselves. With this rapid growth, messaging apps have evolved from simple tools for sending and receiving short text messages into innovative, full-featured experiences with surprising and delightful features. Cognitive APIs are the driving force behind this innovation.
### **Chatbots and Cognitive Computing**
Chatbots are one of the earliest forms of AI algorithms. While they are unlikely to pass the Turing test anytime soon, they are the natural evolution of voice-driven applications. Where you once had to call a support hotline and press 1 for accounts payable, you can now speak in full sentences to a system that can discern your intent.
Whether you realize it or not, chatbot adoption has exploded as companies look to cut wait times, improve the customer experience, and minimize the cost of human phone operators. For now, they are mostly used to handle simple tasks: understanding basic requests and responding according to predefined rules, answering questions like "Where is my order?" or "Chatbot, turn on the mood lighting."
However, APIs such as [Watson Assistant](https://www.ibm.com/cloud/watson-assistant/) and [Amazon Lex](https://aws.amazon.com/lex/) make it easy to build services that can apply logic to patterns observed in these natural-language requests. These services can, for example, notice a sudden influx of calls from an airport suffering takeoff delays and change the sequence of options to prioritize flight rescheduling. Or they might find that calls from a particular country or region tend to be made in a different language and adjust the default options accordingly. They can even identify grammatical patterns indicating that a customer should be escalated immediately to a supervisor.
Intelligent conversational interfaces using speech recognition, text-to-speech, facial recognition, and machine learning models can deliver highly engaging experiences and lifelike conversations for a variety of purposes. Better still, they will learn from those experiences.
Chatbots will change the way we bank, shop, and learn: they will make recommendations, understand abstract concepts, and get to know individuals based on past interactions. Eventually, they will become so good that you won't even know whether you're talking to a human.
#### **Code Example: Home Automation Chatbot**
Using Watson and PubNub ChatEngine, you can easily [build an artificially intelligent chatbot](https://www.pubnub.com/docs/chat/samples?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=fr) that controls your smart home.

This tutorial shows you how to build a chatbot that accepts text commands, parses them, and takes action based on them. For example, a user types "turn on the lights in the living room" and the chatbot triggers the lights.
```js
{
  "homeauto_intents":
  [
    {
      "intent":"turnOFF",
      "examples":
      [
        {"text":"Put off"},
        {"text":"Switch off"},
        {"text":"Turn off"}
      ],
      "description":"Turn off intents"
    },
    {
      "intent":"turnON",
      "examples":
      [
        {"text":"Put on"},
        {"text":"Switch on"},
        {"text":"Turn on"}
      ],
      "description":"Turn on intents"
    }
  ]
}
```
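The intent definitions above can be exercised locally before wiring anything up to Watson. The sketch below is an illustration only, not the Watson API: the `matchIntent` helper and the inlined intent list are assumptions for this demo, showing how a command might be matched against the example phrases to pick an intent.

```js
// Minimal local intent matcher: scans a command for any example
// phrase from the intent definitions above (a stand-in for Watson).
const intents = [
  { intent: "turnOFF", examples: ["put off", "switch off", "turn off"] },
  { intent: "turnON",  examples: ["put on", "switch on", "turn on"] },
];

function matchIntent(command) {
  const text = command.toLowerCase();
  for (const def of intents) {
    // The first intent with a matching example phrase wins.
    if (def.examples.some((phrase) => text.includes(phrase))) {
      return def.intent;
    }
  }
  return null; // no intent recognized
}

console.log(matchIntent("Turn on the lights in the living room")); // turnON
console.log(matchIntent("please switch off the fan"));             // turnOFF
```

A real assistant would, of course, rely on Watson's trained classifier rather than substring matching, but this is enough to test the intent schema end to end.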
### **Natural Language Processing**
Another hugely important area is data science and natural language processing (NLP), the umbrella term for AI solutions that can successfully process large amounts of natural-language data. NLP doesn't just evaluate words and grammar from a semantic perspective; it can also detect sentiment and emotion, uncovering how users feel about a topic or issue through message-by-message analysis.
NLP is a tremendous asset for brands, public figures, and organizations that need to understand and respond to user opinions at a time when reputations can be made or broken within minutes. Imagine a brand launches a new ad for a product. Using the right cognitive services, it can tap into a social media stream for a specific hashtag or the product name and have its NLP API analyze every relevant message and provide feedback on how the public is responding to the product.
Below is an example of an app designed to analyze and gauge how people feel about US politicians on Twitter. It monitors specific keywords and phrases and can then map user emotion across defined geographic regions.

For example, if a user submits the text "I am happy!"...
```js
{
"session_id": 1,
"text": "I am happy!"
}
```
Watson analyzes the text and returns the following:
```js
{
"session_id": 1,
"text": "I am happy!",
"session_sentiment": {
"overall": 0.879998,
"positive": {
"count": 1,
"avg": 0.879998
},
"negative": {
"count": 0,
"avg": 0
},
"neutral": {
"count": 2,
"avg": null
}
},
"score": 0.88006828
}
```
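The `session_sentiment` block in the response above can also be reproduced client-side by aggregating per-message scores. The sketch below is an assumption about how such an aggregation might look: the field names mirror the JSON above, but the `sessionSentiment` function and its thresholds (score > 0 is positive, < 0 negative, null or 0 neutral) are illustrative, not Watson's actual rules.

```js
// Aggregate per-message sentiment scores into a session summary
// shaped like the session_sentiment object shown above.
function sessionSentiment(scores) {
  const buckets = {
    positive: scores.filter((s) => s !== null && s > 0),
    negative: scores.filter((s) => s !== null && s < 0),
    neutral:  scores.filter((s) => s === null || s === 0),
  };
  // Average of a list, or null when the list is empty.
  const avg = (xs) =>
    xs.length ? xs.reduce((a, b) => a + b, 0) / xs.length : null;
  const scored = [...buckets.positive, ...buckets.negative];
  return {
    overall: avg(scored),
    positive: { count: buckets.positive.length, avg: avg(buckets.positive) },
    negative: { count: buckets.negative.length, avg: avg(buckets.negative) ?? 0 },
    neutral:  { count: buckets.neutral.length,  avg: null },
  };
}

// One positive message and two neutral ones, as in the example response:
// overall ≈ 0.879998, positive count 1, negative count 0, neutral count 2.
console.log(sessionSentiment([0.879998, null, null]));
```

Running it on the scores from the example session reproduces the counts and averages shown in the Watson response.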
Brands already spend significant sums on market sentiment analysis. As these systems become smarter, more robust, and more automated, they will be able to understand the public far better at lower cost.
**eCommerce**
-------------
Although online shopping has completely changed the way we buy goods, eCommerce is missing a key component of brick-and-mortar stores: helpful employees. At the scale at which online stores operate, it is not economically viable to staff live chat with real people.
That's why many online stores are turning to intelligent shopping assistants to optimize the experience, help customers with their questions, make recommendations, and even assist with checkout.
[Nordstrom dominated past holiday seasons](https://wersm.com/nordstrom-ruled-holidays-with-its-amazing-chatbot/) with its Messenger chatbot, which went beyond simple predefined questions and answers and used cognitive services to truly understand what the customer was looking for and help where needed. It offered gift recommendations and could even help complete the order.

Chatbots also spare us the dreaded customer support phone call, where we wait an hour for a representative to handle a simple issue. Amazon has deployed chatbots that can resolve the minor problems most customers run into when they need help with their orders.
Now that we've looked at a few examples of intelligence in today's real world, let's look ahead and see how cognitive services will change our world in the future.
**Smart Cities**
----------------
The cities of the future will rely on a variety of integrated intelligent services to make them safer, more efficient, and more environmentally conscious. Image recognition, computer vision, and vision APIs will play a critical role in this transformation, processing and acting on imagery across urban spaces.
**Agriculture**
---------------
The world's population keeps growing, and feeding those billions of people will be a considerable challenge in the years ahead. Cognitive services will play a critical role in managing fields and factories, letting us make smart decisions and control resources with a precision never achieved before.
Smart farms and the IoT will incorporate as many valuable data points as possible to make intelligent farming decisions, even ones that seem counterintuitive. For example, by aggregating real-time weather data, remote sensor data, and historical performance, cognitive services can perfect an individual irrigation plan and update it based on each day's unique circumstances.
**Data Security**
-----------------
As we become ever more connected and our digital lives overshadow our physical ones, data privacy and security are transforming from something we are vaguely aware of into an unsettling, ever-present personal threat.
Regulations and rules such as HIPAA, GDPR, and SOC II are one way to ensure that companies and organizations have the right guardrails in place. Implementing these complex regulations in detail can be a heavy lift, and that's where machine learning comes in.
Cognitive services can be trained to understand and make sense of rules and regulations, then suggest ways to achieve compliance. They can deliver valuable insight into data security, from relevant rules and laws to content moderation.
**Healthcare**
--------------
Innovation generally moves more slowly in healthcare than in other industries for several reasons: tight margins, heavy regulation, and siloed research and development. Cognitive services offer a chance to remove barriers to innovation and improve the delivery system, from organizations all the way down to patients.
Decision-making in healthcare typically happens in silos, patient by patient. Cognitive services, by contrast, analyze and act on a holistic view of the factors that influence health: socioeconomic status, environment, access to healthcare, and more. Cognitive services can recommend better, more targeted care to the physician, including health and wellness programs.
Cognitive services can drive the integration and connection of existing systems within healthcare organizations and surface critical insights. Suddenly able to aggregate data and connect stakeholder needs, organizations can deliver better care while operating more efficiently.
**Intelligence Now**
--------------------
This article has described only a tiny slice of how cognitive services will change the way we think about business and the role applications can play. In the past, software followed instructions. With cognitive services, solutions can adapt, evolve, and accomplish things that might have seemed impossible only a few years ago. We can't see all the implications, but from what we do know, there is little doubt that the impact on business will be profound, positive, and here before you even realize it.
How can PubNub help you?
===================================
This article was originally published on [PubNub.com](https://www.pubnub.com/blog/the-many-uses-of-cognitive-services/?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=fr)
Our platform helps developers build, deliver, and manage real-time interactivity for web apps, mobile apps, and IoT devices.
The foundation of our platform is the industry's largest and most scalable real-time messaging network. With more than 15 points of presence worldwide, 800 million monthly active users, and 99.999% reliability, you'll never have to worry about outages, concurrency limits, or latency issues caused by traffic spikes.
Check out PubNub
----------------
Take the [Live Tour](https://www.pubnub.com/tour/introduction/?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=fr) to learn the essential concepts behind every PubNub-powered app in under 5 minutes.
Get set up
-----------
Sign up for a [PubNub account](https://admin.pubnub.com/signup/?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=fr) for immediate, free access to PubNub keys.
Get started
---------
The [PubNub documentation](https://www.pubnub.com/docs?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=fr) will get you up and running, whatever your use case or [SDK](https://www.pubnub.com/docs?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=fr). | pubnubdevrel |
1,872,123 | Exploring Use Cases for Cognitive Services | Examples of applications and companies transformed by cognitive services, along with some future use cases | 0 | 2024-05-31T14:06:20 | https://dev.to/pubnub-de/erkundung-von-anwendungsfallen-fur-kognitive-dienste-4ih4 | Let's look at a few examples of applications and companies transformed by cognitive services, along with some future use cases, to see just how much they are changing the technology landscape.
Thanks to cognitive services from cloud giants like AWS, IBM, and Microsoft Azure, developer teams of all sizes now have access to [cognitive services](https://pubnub.com/resources/ebook/building-apps-with-cognitive-services/) of staggering power. Delivered through APIs, these services make it easy to equip applications with next-generation intelligence.
[**Chat**](https://pubnub.com/learn/glossary/what-is-a-chat-api/) **and Social Interaction**
---------------------------------------------------------------------------------------------
In 2015, monthly active users of chat apps surpassed those of social networks, and the gap keeps widening. Messaging has become an essential part of social networks themselves. And with this rapid growth, messaging apps have evolved from simple tools for sending and receiving short, text-based messages into innovative, full-featured experiences with surprising and engaging features. The driving force behind this innovation is cognitive APIs.
### **Chatbots and Cognitive Computing**
Chatbots are one of the earliest forms of AI algorithms. While they are unlikely to pass the Turing test anytime soon, they are the natural evolution of voice-controlled applications. Where you once called a support hotline and pressed 1 for accounts payable, you can now speak in full sentences to a system that recognizes your intent.
Whether you are aware of it or not, chatbot adoption has exploded as companies look to cut wait times, improve the customer experience, and minimize the cost of human phone operators. For now, they are mostly used for simple tasks: understanding basic requests and responding according to predefined rules, answering questions like "Where is my order?" or "Chatbot, turn on the mood lighting."
However, APIs such as [Watson Assistant](https://www.ibm.com/cloud/watson-assistant/) and [Amazon Lex](https://aws.amazon.com/lex/) make it easy to build services that can apply logic to patterns observed in these natural-language requests. These services can, for example, notice a sudden rush of calls from an airport suffering takeoff delays and change the sequence of options to prioritize flight rescheduling. Or they might find that calls from a particular country or region tend to be made in a different language and adjust the default accordingly. They can even identify grammatical patterns indicating that a customer should be escalated immediately to a supervisor.
Intelligent conversational interfaces that use speech recognition, text-to-speech, facial recognition, and machine learning models can deliver extremely engaging experiences and lifelike conversations for a variety of purposes. Even better: they will learn from those experiences.
Chatbots will change the way we bank, shop, and learn: they will make recommendations, understand abstract concepts, and get to know people based on previous interactions. Eventually, they will be so good that you won't even know whether you're talking to a human.
#### **Code Example: Home Automation Chatbot**
Using Watson and the PubNub ChatEngine, you can easily [build an artificially intelligent chatbot](https://www.pubnub.com/docs/chat/samples?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=de) that controls your smart home.

This tutorial shows you how to build a chatbot that accepts text commands, parses them, and takes action based on them. For example: a user types "turn on the lights in the living room," and the bot switches the lights on.
```js
{
  "homeauto_intents":
  [
    {
      "intent":"turnOFF",
      "examples":
      [
        {"text":"Put off"},
        {"text":"Switch off"},
        {"text":"Turn off"}
      ],
      "description":"Turn off intents"
    },
    {
      "intent":"turnON",
      "examples":
      [
        {"text":"Put on"},
        {"text":"Switch on"},
        {"text":"Turn on"}
      ],
      "description":"Turn on intents"
    }
  ]
}
```
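Once an intent such as `turnON` or `turnOFF` has been recognized, the bot still has to act on it. The following is a hypothetical dispatcher, not part of the tutorial: the `applyIntent` helper and the in-memory device map are assumptions, sketching how a recognized intent might be mapped onto device state.

```js
// Hypothetical dispatcher: applies a recognized intent to a named
// device in an in-memory smart-home state.
const devices = { "living room lights": false, "fan": false };

function applyIntent(intent, device) {
  if (!(device in devices)) {
    throw new Error(`unknown device: ${device}`);
  }
  // turnON / turnOFF map directly onto a boolean power state.
  devices[device] = intent === "turnON";
  return devices[device];
}

applyIntent("turnON", "living room lights");
console.log(devices["living room lights"]); // true
applyIntent("turnOFF", "living room lights");
console.log(devices["living room lights"]); // false
```

In a real deployment, the state change would of course be published to the actual hardware (for example, over a PubNub channel) rather than kept in memory.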
### **Natural Language Processing**
Another hugely important area is data science and natural language processing (NLP), the umbrella term for AI solutions that can successfully process large amounts of natural-language data. NLP can not only evaluate words and grammar from a semantic perspective but also detect sentiment and emotion, uncovering how users feel about a topic or issue through message-by-message analysis.
NLP is a tremendous asset for brands, public figures, and organizations that need to understand and respond to user opinions at a time when reputations can be made or broken within minutes. Imagine a brand launches a new ad for a product. Using the right cognitive services, it can tap into a social media stream for a specific hashtag or the product name and have its NLP API analyze every relevant message and provide feedback on how the public is responding to the product.
Below is an example of an app designed to analyze and gauge how people feel about US politicians on Twitter. It monitors specific keywords and phrases and can then map user emotion across defined geographic regions.

For example, if a user submits the text "I am happy!"...
```js
{
"session_id": 1,
"text": "I am happy!"
}
```
Watson analyzes the text and returns the following:
```js
{
"session_id": 1,
"text": "I am happy!",
"session_sentiment": {
"overall": 0.879998,
"positive": {
"count": 1,
"avg": 0.879998
},
"negative": {
"count": 0,
"avg": 0
},
"neutral": {
"count": 2,
"avg": null
}
},
"score": 0.88006828
}
```
Brands already spend large sums on market sentiment analysis. As these systems become more intelligent, more robust, and more automated, they will be able to understand the public far better at lower cost.
**eCommerce**
-------------
Although online shopping has completely changed the way we buy goods, eCommerce is missing a key component of a brick-and-mortar store: helpful employees. At the scale at which online stores operate, it is not economically viable to staff live chat with real people.
That's why many online stores are turning to intelligent shopping assistants to optimize the experience, help customers with their questions, make recommendations, and even handle checkout.
[Nordstrom dominated past holiday seasons](https://wersm.com/nordstrom-ruled-holidays-with-its-amazing-chatbot/) with its Messenger chatbot, which went beyond simple predefined questions and answers and used cognitive services to truly understand what the customer was looking for and help where needed. It offered gift recommendations and could even help complete the order.

Chatbots also spare us the dreaded customer support call, where we wait an hour for a representative to deal with a simple issue. Amazon has deployed chatbots that can resolve the minor problems most customers run into when they need help with their orders.
Now that we've looked at a few examples of intelligence in today's real world, let's take a look into the future and see how cognitive services will change our world.
**Smart Cities**
-----------------------
The cities of the future will rely on a variety of integrated intelligent services to make them safer, more efficient, and more environmentally conscious. Image recognition, computer vision, and vision APIs will play a critical role in this transformation, processing and acting on imagery across urban spaces.
**Agriculture**
------------------
The world's population keeps growing, and feeding those billions of people will be a major challenge in the years ahead. Cognitive services will play a critical role in managing fields and factories, letting us make smart decisions and control resources with a precision never achieved before.
Smart farms and the IoT will incorporate as many valuable data points as possible to make intelligent farming decisions, even ones that seem counterintuitive. For example, by aggregating real-time weather data, remote sensor data, and historical performance, cognitive services can perfect an individual irrigation plan and adapt it to each day's unique circumstances.
**Data Security**
-------------------
As we become ever more connected and our digital lives overshadow our physical ones, data privacy and security are transforming from something we are only vaguely aware of into an unsettling, ever-present personal threat.
Regulations and rules such as HIPAA, GDPR, and SOC II are one way to ensure that companies and organizations have the right guardrails in place. Implementing these complex regulations in detail can be a lot of work, and that's exactly where machine learning comes in.
Cognitive services can be trained to understand and make sense of rules and regulations, then suggest ways to achieve compliance. They can deliver valuable insight into data security, from relevant rules and laws to content moderation.
**Healthcare**
--------------------
Innovation generally moves more slowly in healthcare than in other industries for several reasons: tight margins, heavy regulation, and siloed research and development. Cognitive services offer a chance to remove barriers to innovation and improve the delivery system, from organizations all the way down to patients.
Decision-making in healthcare typically happens in silos, patient by patient. Cognitive services, by contrast, analyze and act on a holistic view of the factors that influence health: socioeconomic status, environment, access to healthcare, and more. Cognitive services can recommend better, more targeted patient care to the physician, including health and wellness programs.
Cognitive services can drive the integration and connection of existing systems within healthcare organizations and surface critical insights. Suddenly able to aggregate data and connect stakeholder needs, organizations can deliver better care while operating more efficiently.
**Intelligence Now**
---------------------
This article has described only a tiny slice of how cognitive services will change the way we think about business and the role applications can play. In the past, software followed instructions. With cognitive services, solutions can adapt, evolve, and accomplish things that seemed impossible only a few years ago. We can't yet see all the implications, but from what we do know, there is little doubt that the impact on business will be profound and positive, and that it will arrive faster than you can imagine.
How can PubNub help you?
=============================
This article was originally published at [PubNub.com](https://www.pubnub.com/blog/the-many-uses-of-cognitive-services/?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=de).
Our platform helps developers build, deliver, and manage real-time interactivity for web apps, mobile apps, and IoT devices.
The foundation of our platform is the industry's largest and most scalable real-time edge messaging network. With over 15 points of presence worldwide supporting 800 million monthly active users and 99.999% reliability, you never have to worry about outages, concurrency limits, or latency issues caused by traffic spikes.
Experience PubNub
--------------
Check out the [Live Tour](https://www.pubnub.com/tour/introduction/?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=de) to understand the essential concepts behind every PubNub-powered app in under 5 minutes
Set up
----------
Sign up for a [PubNub account](https://admin.pubnub.com/signup/?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=de) for immediate free access to PubNub keys
Get started
------------
The [PubNub docs](https://www.pubnub.com/docs?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=de) will get you up and running right away, whatever your use case or [SDK](https://www.pubnub.com/docs?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=de) | pubnubdevrel |
1,872,122 | How to Fetch Adobe Commerce Data with a C# .NET Magento Connector | Learn how to fetch data from Adobe Commerce by using the ComponentOne C# .NET Magento Connector. | 0 | 2024-05-31T14:06:17 | https://developer.mescius.com/blogs/how-to-fetch-adobe-commerce-data-with-a-c-sharp-net-magento-connector | webdev, devops, csharp, tutorial | ---
canonical_url: https://developer.mescius.com/blogs/how-to-fetch-adobe-commerce-data-with-a-c-sharp-net-magento-connector
description: Learn how to fetch data from Adobe Commerce by using the ComponentOne C# .NET Magento Connector.
---
**What You Will Need**
- Visual Studio 2022
- ComponentOne Data Services Edition
**Controls Referenced**
- [C1DataConnector for Magento](https://developer.mescius.com/componentone/docs/services/online-dataconnector/magentogettingstarted.html)
**Tutorial Concept**
Learn how to fetch data from an online Adobe Commerce (also known as Magento) database and display the data within a .NET application using C#.
---
Are you looking to retrieve data from your Magento Server to build a .NET app? You’re at the right place.
As technology changes daily, there are several sources to store data, like the Magento server. Adobe provides an open-source platform named Magento that is used to create e-commerce stores and handle complex operations. If you plan to develop a .NET app using data from a Magento server, then fetching this data might seem a bit complicated. However, our [ComponentOne Data Connector](https://developer.mescius.com/componentone/net-data-service-components/data-connectors/ "https://developer.mescius.com/componentone/net-data-service-components/data-connectors/") suite features an API called [ADO.NET provider for Magento](https://developer.mescius.com/componentone/docs/services/online-dataconnector/magentogettingstarted.html "https://developer.mescius.com/componentone/docs/services/online-dataconnector/magentogettingstarted.html")**,** which offers several classes to streamline connectivity with Magento.
In this blog, we will develop a WinForms app using Magento e-commerce data. Let's break down the process into the following easy steps:
* [Setup a WinForms App with Required Dependencies](#Setup)
* [Fetch Data from the Magento Server](#Fetch)
* [Create a UI and Bind It with the Fetched Magento Data](#Create)
## <a id="Setup"></a>Setup a WinForms App with Required Dependencies
Let's begin by setting up a new .NET 8 WinForms app that includes the ADO.NET Provider for Magento dependency by following these steps:
1. Open Visual Studio and select File | New | Project to create a new WinForms app.

2. Right-click on the project in Solution Explorer and choose Manage NuGet Packages… from the context menu.

3. Search for [C1.AdoNet.Magento](https://www.nuget.org/packages/C1.AdoNet.Magento "https://www.nuget.org/packages/C1.AdoNet.Magento") in the NuGet Package Manager and click on Install.

To create the app UI, we will use the input and datagrid controls from the ComponentOne WinForms suite. So, let’s add the following NuGet packages:
* [C1.Win.InputPanel](https://www.nuget.org/packages/C1.Win.InputPanel "https://www.nuget.org/packages/C1.Win.InputPanel")
* [C1.Win.FlexGrid](https://www.nuget.org/packages/C1.Win.FlexGrid "https://www.nuget.org/packages/C1.Win.FlexGrid")
Now that we have successfully set up the environment, the next step is to establish a connection to the Magento Server to fetch the data.
## <a id="Fetch"></a>Fetch Data from the Magento Server
To fetch the data using the C1.AdoNet.Magento API, we need to start by establishing the connection. We are using the [C1MagentoConnectionStringBuilder](https://developer.mescius.com/componentone/docs/services/online-dataconnector/C1.AdoNet.Magento~C1.AdoNet.Magento.C1MagentoConnectionStringBuilder.html "https://developer.mescius.com/componentone/docs/services/online-dataconnector/C1.AdoNet.Magento~C1.AdoNet.Magento.C1MagentoConnectionStringBuilder.html") class to create the connection string by setting its _Url_, _Username_, _Password_, and _TokenType_ properties:
```
//Create a connection string using C1MagentoConnectionStringBuilder
C1MagentoConnectionStringBuilder connBuilder= new C1MagentoConnectionStringBuilder();
connBuilder.Username = @"******";
connBuilder.Password = @"*******";
connBuilder.TokenType= @"****";
connBuilder.Url = @"http://****/****/****";
```
**Note:** You can pass your Magento credential in the code above.
Once the connection string is ready, let’s pass it to the [C1MagentoConnection](https://developer.mescius.com/componentone/docs/services/online-dataconnector/C1.AdoNet.Magento~C1.AdoNet.Magento.C1MagentoConnection.html "https://developer.mescius.com/componentone/docs/services/online-dataconnector/C1.AdoNet.Magento~C1.AdoNet.Magento.C1MagentoConnection.html") class to open the connection with your Magento database:
```
//Create and establish connection
C1MagentoConnection conn = new C1MagentoConnection(connBuilder);
//Open connection with Database
conn.Open();
```
**Note:** You can also check out more connection-building techniques [here](https://developer.mescius.com/componentone/docs/services/online-dataconnector/magentoconnection.html "https://developer.mescius.com/componentone/docs/services/online-dataconnector/magentoconnection.html").
Now, the connection has been made successfully with the Magento server. Let’s create an adapter using the [C1MagentoDataAdapter](https://developer.mescius.com/componentone/docs/services/online-dataconnector/C1.AdoNet.Magento~C1.AdoNet.Magento.C1MagentoDataAdapter.html "https://developer.mescius.com/componentone/docs/services/online-dataconnector/C1.AdoNet.Magento~C1.AdoNet.Magento.C1MagentoDataAdapter.html") class to retrieve the data into the DataTable based on the specified query.
```
//Populate DataTable
C1MagentoDataAdapter adapter = new C1MagentoDataAdapter(conn, "Select * from Products");
DataTable dt = new DataTable();
adapter.Fill(dt);
```
**Note:** You can also check out more data query techniques [here](https://developer.mescius.com/componentone/docs/services/online-dataconnector/magentoconnection.html "https://developer.mescius.com/componentone/docs/services/online-dataconnector/magentoconnection.html").
We have successfully fetched the data from the Magento server. Next, we will bind it with the UI to make it presentable.
## <a id="Create"></a>Create a UI and Bind It with the Fetched Magento Data
Initially, we added the FlexGrid and InputPanel packages to the project. Now, let’s place them onto the form to make a UI that aligns with the data fields. Here’s how we designed the form:

Let’s bind the grid and input controls with the DataTable using _DataSource_ and _DataField_ properties:
```
c1FlexGrid1.DataSource = dt;
inputDataNavigator1.DataSource = dt;
inputTextBox1.DataSource = dt;
inputTextBox1.DataField = "id";
inputTextBox2.DataSource = dt;
inputTextBox2.DataField = "name";
inputTextBox3.DataSource = dt;
inputTextBox3.DataField = "attribute_set_id";
inputTextBox4.DataSource = dt;
inputTextBox4.DataField = "sku";
inputTextBox5.DataSource = dt;
inputTextBox5.DataField = "type_id";
inputDatePicker1.DataSource = dt;
inputDatePicker1.DataField = "created_at";
inputDatePicker2.DataSource = dt;
inputDatePicker2.DataField = "updated_at";
```
That’s it! We have successfully created a WinForms app that showcases the data from the Magento server and works as shown below. You can even insert and delete records.

[Download the sample](https://cdn.mescius.io/umb/media/l35f0e2r/magentodataconnectordemo.zip) and update it with your credentials to try it out.
## Conclusion
In this blog, we showcased easy data connection techniques in .NET from the Magento server. With the data connector API, you can fetch the data from various sources like Salesforce, Kintone, and OData, among others. Explore the [Data Connector documentation](https://developer.mescius.com/componentone/docs/services/online-dataconnector/overview.html "https://developer.mescius.com/componentone/docs/services/online-dataconnector/overview.html") to learn more about these powerful APIs. | chelseadevereaux |
1,871,777 | How generative AI can make developing fun | There are many things a software developer does, or needs to know nowadays. Starting from coding,... | 0 | 2024-05-31T08:11:22 | https://dev.to/tariqca/how-generative-ai-can-make-developing-fun-57l0 |

There are many things a software developer does, or needs to know nowadays. Starting from coding, testing, writing/reading documentations, analyzing code, fixing errors and so much more. There are certain aspects that make this job fun, but with all the components together it sometimes makes it tiring.
Additional “problems” arise when software developers who work with startups need to write a piece of code or logic that is common in most applications, like the login or sign-up logic - which, after some time, once they get the gist of it, becomes boring and repetitive.
Developers also face a challenge when they switch companies, and/or start anew in a non familiar environment. Thrown on a new project and in a new team, they have to adapt, understand the code they witness for the first time, quickly learn a new programming language or simply upgrade or fix an outdated/buggy code leaving the key features intact. The list can go on, but what if there was a tool that could help us, make some of these challenges more approachable, and less painful.
With the introduction of generative AI, people were at first skeptical, even afraid, of what this new technology could do. But after a couple of years, many people embraced it. By analyzing and experimenting with it, new possibilities have emerged, and people have started using the full capacity of generative AI. It started with simply generating blog posts or other text, human-like conversations, and quick research and information delivery; then came images, videos, and even music generated from a simple text input. And now we can also code. But before we go any further, a lot of you will ask: "What is generative AI, and is it really that important?"
Generative artificial intelligence is a type of AI that can create new content and ideas, including conversations, stories, images, videos, and music. AI technologies attempt to mimic human intelligence in nontraditional computing tasks like image recognition, natural language processing (NLP), and translation. It is the next step in the era of artificial intelligence. You can train it to learn human language, programming languages, art, chemistry, biology, or any complex subject matter. It reuses training data to solve new problems. For example, it can learn English vocabulary and create a poem from the words it processes. They can help reinvent experiences and applications, create new things never seen before, accelerate research, and help reach new productivity levels.
There are a lot of generative AI tools/assistants out there, from which only a few are very popular like ChatGPT, Google AI studio, Github Copilot, Synthesia etc. But today we are going to talk about the most capable of them all, Amazon Q.
Amazon Q generates code, tests, debugs, and has multistep planning and reasoning capabilities that can transform and implement new code generated from developer inputs. Amazon Q also makes it easier for employees to get answers to questions across business data, such as product information, business results, code base, and many other topics, by connecting to enterprise data repositories to summarize the data logically, analyze trends, and engage in dialogue about the data. Amazon Q offers a variety of products but for the sake of this blog post we are going to analyze Amazon Q Developer.
Amazon Q Developer assists developers and other IT professionals with all their tasks, from coding, testing, and upgrading applications, to diagnosing errors, performing security scanning and fixes, and optimizing AWS resources. Amazon Q has advanced, multistep planning and reasoning capabilities that can transform (for example, perform Java version upgrades) and implement new features generated from developer requests.
Amazon Q Developer is very easy to install and integrate with and to do so you have to go to its webpage and select your preferred IDE in which you want Amazon Q to assist you. In this case we are going to select the IntelliJ IDE and download the plugin.

After that, we will install the downloaded plugin by opening our IDE: click on Settings, navigate to Plugins, select “Install Plugin from Disk…”, and locate the previously downloaded plugin. The AWS Core toolkit needs to be installed as well in order to use Amazon Q Developer in your IDE.

After the installation is complete you will see a new icon on the Tools Window Bar located on the right side of the IDE.

Pressing the icon will open a chat that will ask you to login (don’t worry, you can use Amazon Q without an AWS account for free) where we can talk with Amazon Q, ask questions, post errors and so much more. But the key features are located when you select a piece of code, right click and from the dropdown you will see an option saying “Send to Amazon Q”. This option allows us to send the selected code to Amazon Q in order for it to explain, refactor, optimize and fix our code which can save a lot of time and headaches in the development world.
Amazon Q Developer also assists with writing code. Let’s say you want to write a function that adds two numbers together. You can simply write what you want your method to do in a comment and Amazon Q will show you a suggestion of what that code would look like, after which you can opt to use that suggested code or not. Amazon Q will also suggest imports, or other pieces of code you might be missing so keep an eye on that as well.
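The comment-driven flow described above might look something like this - an illustrative sketch, not Amazon Q's actual output:

```javascript
// A developer types a comment describing the intent...
// "write a function that adds two numbers together"

// ...and the assistant suggests an implementation like the following,
// which the developer can accept, tweak, or dismiss:
function add(a, b) {
  return a + b;
}

console.log(add(2, 3)); // 5
```

The suggestion appears inline as ghost text, so accepting it is a single keystroke rather than a copy-paste round trip.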

So, to summarize, Amazon Q Developer offers a lot of features that can eliminate the stress that often comes with developing. It can remove the need to spend time on writing repetitive or simple code, decrease the time we spend on analyzing features we don’t understand, vague documentation or errors, and most important of all, it increases our productivity, helps us focus more on important and high priority tasks and makes writing code fun again!
| tariqca | |
1,872,121 | The most painful reason NULLs are evil | I keep harping on about doing null: false everywhere, especially for strings and booleans, but... | 0 | 2024-05-31T14:03:45 | https://dev.to/epigene/the-most-painful-reason-nulls-are-evil-59ac | rails, null | ---
title: The most painful reason NULLs are evil
published: true
description:
tags: Rails,null
# cover_image: https://direct_url_to_image.jpg
# Use a ratio of 100:42 for best results.
# published_at: 2024-05-31 13:50 +0000
---
I keep harping on about doing `null: false` everywhere, especially for strings and booleans, but sometimes there are sneaky exceptions for number fields, where a default of `0` does not make sense and the values will not be available for a time, some draft records etc.
You have to be extremely careful then because apparently NULLs are **not** "not equal" to anything. What do I mean?
Consider these `User` records:
```rb
id: 1, age: 20
id: 2, age: 25
id: 3, age: nil
```
How would you query for all users who are not 20?
`where.not(age: 20)`, right? Sorry to say, but User#3 will be omitted from such queries. 😫
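This comes from SQL's three-valued logic: `NULL != 20` evaluates to `NULL` (unknown), not `TRUE`, so the row fails the filter. A quick illustration with SQLite (via Python, purely to demonstrate the SQL semantics behind what ActiveRecord generates):

```python
import sqlite3

# In-memory table mirroring the User records above
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, age INTEGER)")
conn.executemany("INSERT INTO users VALUES (?, ?)", [(1, 20), (2, 25), (3, None)])

# NULL != 20 evaluates to NULL (unknown), not TRUE, so user 3 is filtered out
not_twenty = conn.execute("SELECT id FROM users WHERE age != 20").fetchall()
print(not_twenty)  # [(2,)] -- user 3 is silently missing

# Explicitly admitting the NULL rows brings user 3 back
fixed = conn.execute("SELECT id FROM users WHERE age != 20 OR age IS NULL").fetchall()
print(fixed)  # [(2,), (3,)]
```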
You have two options:
* denullify the age column (may be impossible)
* tweak the query to handle the silly null edge-case:
```rb
where.not(age: 20).or(where(age: nil))
``` | epigene |
1,872,120 | Exploring Cognitive Services Use Cases | See examples of apps and businesses that cognitive services have reinvented, plus a look at future use cases. | 0 | 2024-05-31T14:01:19 | https://dev.to/pubnub-ko/inji-seobiseu-sayong-sarye-salpyeobogi-42ho | Explore how much cognitive services are changing the technology landscape through a few examples of apps and companies they have reinvented, along with future use cases.
Thanks to cognitive services from cloud giants like AWS, IBM, and Microsoft Azure, developer teams of all sizes now have access to tremendously powerful [cognitive services](https://pubnub.com/resources/ebook/building-apps-with-cognitive-services/). Delivered via APIs, these services make it easy to bring next-generation intelligence into your applications.
[**Chat**](https://pubnub.com/learn/glossary/what-is-a-chat-api/) **and Social Interaction**
-----------------------------------------------------------------------------
In 2015, monthly active users of chat apps surpassed those of social networks, and the gap keeps widening. In fact, messaging has become an essential feature of social networks themselves. Alongside this rapid growth, messaging apps have evolved from simple tools for exchanging short text messages into innovative, fully featured experiences with surprising and delightful capabilities. And the driving force behind this innovation is cognitive APIs.
### **Chatbots and Cognitive Computing**
Chatbots are one of the earliest forms of AI algorithms. They are unlikely to pass the Turing test anytime soon, but they represent the natural evolution of voice-enabled applications. Where you once called support and pressed 1 to ask about accounts payable, you can now speak in complete sentences to a system capable of understanding your intent.
There has been an explosion of companies adopting chatbots to reduce wait times, improve the customer experience, and minimize call center costs. Today, chatbots are mostly used to handle simple tasks: understanding basic requests and responding according to predefined rules, or answering questions like "Where is my order?" or "Chatbot, turn on the mood lighting."
But with APIs like [Watson Assistant](https://www.ibm.com/cloud/watson-assistant/) and [Amazon Lex](https://aws.amazon.com/lex/), it is easy to build services that apply logic to patterns observed in these natural language requests. For example, such a service could observe a sudden flood of calls from an airport experiencing takeoff delays and reorder the menu options to prioritize flight rebooking. Or it could learn that calls from a certain country or region tend to be conducted in a different language and change the defaults accordingly. It could even pick up on grammatical patterns and route a customer straight to a manager.
Intelligent conversational interfaces that use speech recognition, text-to-speech, facial recognition, and machine learning models can deliver highly engaging experiences and lifelike conversations for a wide range of purposes. Better yet, they can learn from these experiences.
Chatbots will change the way we bank, shop, and learn - making recommendations, understanding abstract concepts, and getting to know individuals based on previous engagements. Eventually they will become so good that you won't even know whether you're talking to a person.
#### **Code Example: Home Automation Chatbot**
With Watson and PubNub ChatEngine, it is easy to build an [artificial intelligence chatbot](https://www.pubnub.com/docs/chat/samples?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=ko) that controls a smart home.

This tutorial shows how to build a chatbot that takes text commands, parses them, and performs actions based on them. For example, when a user types "Turn on the lights in the living room," the bot turns on the lights.
```js
{
"homeauto_intents":
[
{
"intent":"turnOFF",
"examples":
[
{"text":"Put off"},
{"text":"Switch off"},
{"text":"Turn off"}
],
         "description":"Turn off intents"
},
{
"intent":"turnON",
"examples":
[
{"text":"Put on"},
{"text":"Switch on"},
{"text":"Turn on"}
],
         "description":"Turn on intents"
      }
   ]
}
```
### **Natural Language Processing**
Data science and natural language processing (NLP) - the umbrella term for AI solutions that can efficiently process large volumes of natural language data - is another high-impact area. NLP not only measures words and grammar from a semantic standpoint; through message-by-message analysis it can also detect emotion and sentiment, revealing how users feel about a topic or subject.
NLP is a huge advantage for brands, celebrities, and organizations that need to understand and respond to user opinion in situations where a reputation can be made or broken in minutes. Imagine a brand launching a campaign for a new product. With the right cognitive services, it can tap social media streams for specific hashtags or product names, and an NLP API can analyze every relevant message and report how the public is reacting to the product.
Below is an example of an app designed to analyze and gauge how people on Twitter feel about American politicians. The app can monitor specific keywords and phrases, then plot user sentiment in defined regions.

For example, when a user submits the text "I am happy!"...
```js
{
"session_id": 1,
"text": "I am happy!"
}
```
Watson analyzes the text and returns the following result:
```js
{
"session_id": 1,
  "text": "I am happy!",
"session_sentiment": {
"overall": 0.879998,
"positive": {
"count": 1,
"avg": 0.879998
},
"negative": {
"count": 0,
"avg": 0
},
"neutral": {
"count": 2,
"avg": null
}
},
"score": 0.88006828
}
```
Brands already spend heavily on analyzing market sentiment. As these systems become more intelligent, powerful, and automated, they will be able to understand the public far better at a lower cost.
**E-commerce**
--------
Online shopping has completely changed the way we buy goods, but e-commerce lacks a key element of brick-and-mortar stores: friendly staff. Given the scale at which online stores operate, live chat with real employees doesn't make economic sense.
So many online stores are optimizing the shopping experience with intelligent shopping assistant bots that answer shoppers' questions, make recommendations, and even help with checkout.
[Nordstrom ruled last holiday season](https://wersm.com/nordstrom-ruled-holidays-with-its-amazing-chatbot/) with a Messenger chatbot that went beyond simple predefined questions and answers, using cognitive services to truly understand what customers were looking for and to assist them as needed. The chatbot offered gift recommendations and even helped process orders.

Chatbots also spare us the tedious customer support call where we wait an hour for an agent to handle a simple issue. Amazon has deployed chatbots that can resolve the minor problems most customers have when they need help with their orders.
Now that we've seen a few examples of intelligence in the real world, let's look into the future and see how cognitive services will change the world.
**Smart Cities**
----------
The cities of the future will rely on a variety of integrated intelligence services to make them safer, more efficient, and more environmentally conscious. Image recognition, computer vision, and vision APIs will play an important role in this transformation, processing images within urban spaces and taking action.
**Agriculture**
------
The global population keeps growing, and feeding billions of people will be a significant challenge going forward. Cognitive services will play an important role in managing farms and factories, letting us make intelligent decisions and control resources with a precision that did not exist before.
Smart farms and IoT will incorporate as many valuable data points as possible, enabling intelligent agricultural decisions even when they seem counterintuitive. For example, by combining real-time weather data, remote sensor data, and historical performance, cognitive services can perfect an individual irrigation plan and update it for each day's unique conditions.
**Data Security**
----------
As we become more connected and our digital lives overwhelm our physical ones, data privacy and security are shifting from something we were only vaguely aware of into a disconcerting, ever-present personal threat.
Regulations and rules (HIPAA, GDPR, SOC II) are one way for companies and organizations to put the right safeguards in place. Implementing these complex regulations in detail can involve a lot of work, and this is exactly where machine learning plays a key role.
Cognitive services can be trained to understand rules and regulations and then suggest ways to achieve compliance. Through cognitive services you can get valuable insights into data security, from relevant rules and laws to content moderation.
**Healthcare**
--------
The healthcare industry generally innovates more slowly than other industries for several reasons, including tight margins, strict regulation, and siloed research and development. Cognitive services provide an opportunity to break down barriers to innovation and improve the care delivery system from organizations down to patients.
Decision-making in healthcare is typically done on a siloed, per-patient basis. Cognitive services, on the other hand, comprehensively analyze the factors that affect health - socioeconomic status, environment, access to care, and more - and act accordingly. Cognitive services can recommend better, more personalized patient care to physicians, including health and wellness programs.
Cognitive services can promote the integration and connection of existing systems within healthcare organizations and unearth essential insights. Organizations that can aggregate data and connect the needs of stakeholders can provide better care while operating more efficiently.
**Intelligence Now**
---------------
This article has described only a very small sample of how cognitive services will change the way we think about business and the roles applications can perform. In the past, software followed instructions. With cognitive services, solutions can adapt, evolve, and do things that seemed impossible only a few years ago. We cannot grasp every implication, but based on what is known so far, there is no doubt that the impact on business will be profound and positive.
How can PubNub help you?
=====================
This article was originally published at [PubNub.com](https://www.pubnub.com/blog/the-many-uses-of-cognitive-services/?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=ko).
Our platform helps developers build, deliver, and manage real-time interactivity for web apps, mobile apps, and IoT devices.
The foundation of our platform is the industry's largest and most scalable real-time edge messaging network. With more than 15 points of presence worldwide supporting 800 million monthly active users and 99.999% reliability, you don't have to worry about outages, concurrency limits, or latency issues caused by traffic spikes.
Experience PubNub
-----------
Take the [Live Tour](https://www.pubnub.com/tour/introduction/?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=ko) to understand the essential concepts behind every PubNub-powered app in less than 5 minutes.
Set up
----
Sign up for a [PubNub account](https://admin.pubnub.com/signup/?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=ko) for immediate free access to PubNub keys.
Get started
----
The [PubNub docs](https://www.pubnub.com/docs?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=ko) will get you up and running right away, whatever your use case or [SDK](https://www.pubnub.com/docs?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=ko). | pubnubdevrel | |
1,872,119 | The Almond Tree: A Perfect Blend of Beauty and Bounty for Your Garden | Introduction Are you in search of a tree that adds both aesthetic charm and practical benefits to... | 0 | 2024-05-31T14:01:15 | https://dev.to/muhammad_mujtaba_e8e94d9e/the-almond-tree-a-perfect-blend-of-beauty-and-bounty-for-your-garden-1a0k | almondtree, beauty, bounty, gardentree | Introduction
Are you in search of a tree that adds both aesthetic charm and practical benefits to your garden? Look no further than the almond tree. Renowned for its beautiful blossoms and delicious, nutritious nuts, the almond tree is a fantastic choice for any landscape. In this blog, we'll delve into the unique features of the almond tree, its numerous benefits, and why it should be the next addition to your garden.
The Enchanting Appeal of the Almond Tree
Stunning Blossoms
[The almond tree](https://crystalgreenlandscape.ae/products/almond-tree) is celebrated for its breathtaking beauty, particularly in early spring when it bursts into bloom. The tree is adorned with delicate, pale pink or white flowers, creating a stunning display that heralds the arrival of warmer days. These blossoms not only enhance the visual appeal of your garden but also attract bees and other pollinators, supporting the local ecosystem.
Versatile Landscaping Element
Whether you have a spacious backyard or a compact urban garden, the almond tree can fit seamlessly into your landscape design. Its moderate size and elegant shape make it a versatile addition that can serve as a focal point, a shade provider, or a complementary element to other plants and trees.
Benefits of Planting an Almond Tree
Nutritious and Delicious Harvest
One of the most rewarding aspects of growing an almond tree is the bountiful harvest of almonds. Almonds are not only delicious but also packed with nutrients, including healthy fats, protein, fiber, vitamins, and minerals. By planting an almond tree, you can enjoy fresh, homegrown almonds straight from your garden.
Health Benefits
Almonds are known for their numerous health benefits. They support heart health, aid in weight management, and provide essential nutrients for overall well-being. Growing your own almonds ensures you have a fresh, organic supply, free from any additives or preservatives.
Environmental Advantages
Almond trees offer several environmental benefits. They help improve soil quality through their root systems, provide habitat for wildlife, and contribute to air purification. Additionally, growing your own almonds reduces the carbon footprint associated with transporting nuts from commercial farms to your home.
Low Maintenance
Almond trees are relatively low maintenance, making them suitable for both novice and experienced gardeners. They thrive in well-drained soil and sunny locations and require minimal pruning once established. With proper care, an almond tree can produce nuts for many years, providing long-term benefits with minimal effort.
Why Choose an Almond Tree?
Unique and Valuable Addition
An almond tree is not just another plant in your garden; it’s a valuable asset. Its unique combination of beauty and utility makes it a standout choice that offers both visual pleasure and tangible rewards.
Sustainable Living
Incorporating an almond tree into your garden is a step towards sustainable living. By growing your own food, you reduce your dependence on commercial agriculture and contribute to a more sustainable food system. Plus, you get the satisfaction of knowing exactly where your food comes from.
Enhance Property Value
A well-maintained almond tree can enhance the aesthetic and market value of your property. Prospective buyers often appreciate the presence of mature, productive trees in a garden, making it an attractive feature for potential homebuyers.
Conclusion
The almond tree is more than just a pretty addition to your garden; it’s a source of delicious and nutritious nuts, a promoter of environmental health, and a symbol of sustainable living. Its stunning blossoms, ease of care, and bountiful harvest make it a perfect choice for any garden enthusiast. Don’t miss the opportunity to bring the charm and benefits of an almond tree to your outdoor space. | muhammad_mujtaba_e8e94d9e |
1,872,118 | Deploy React.js application using AWS S3 & GitLab pipelines for automatic deployment 2024 | In this article, we will look at how we can deploy our React.js application to the AWS S3 bucket... | 0 | 2024-05-31T14:00:55 | https://dev.to/perisicnikola37/deploy-reactjs-application-using-aws-s3-and-gitlab-pipelines-51lo | react, aws, javascript, devops | In this article, we will look at how we can deploy our React.js application to the AWS S3 bucket using GitLab pipelines🚀
We'll look at how to automate our development process as much as possible🔧
---
## 1 - Project setup and creating GitLab repository 📁
I'll be using the React.js application I've initialized using the following command:
```sh
npm create vite@latest
```
1. Log in to your GitLab account ([Register](https://gitlab.com/users/sign_up) if you don't already have one)🔐
2. Create a repository for our project📦
3. Clone our repository using HTTP or SSH🔗
4. Commit our code to the repository using the following commands💾
```bash
git add .
git commit -m "Initial commit"
git push -u origin main
```
Afterwards, once we configure our S3 bucket, we will <u>come back</u> to GitLab to configure our pipeline, which will do auto-deployment for us whenever we push new code to the `main` branch🔄
---
## 2 - Log in to your AWS account and find the "S3" service🌐
Search for the "S3" service in the search input field which is located at the top of the page.
Choose an option with the green colored icon of a bucket with the description "Scalable Storage in the Cloud"🪣
You will see the following page. Click on the "Create bucket" button

---
## 3 - Creating our S3 bucket🛠️
1. Bucket name: `our-react-app-s3-bucket`📝
2. **IMPORTANT**: Disable the "_Block all public access_" checkbox input field -> _this is important to enable our application to be publicly visible_🚫🔓
3. Click the "Create bucket" button✅
Now, we have successfully created our S3 bucket🎉

## Modify bucket policy📜
Now, you need to edit the **Bucket Policy**. Click the "Edit" button in the **Bucket Policy** section. In our case the bucket name is `our-react-app-s3-bucket`, so I will use it. Please change it according to your bucket name✏️
Paste the following code into your new policy:
```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "PublicReadGetObject",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::our-react-app-s3-bucket/*"
        }
    ]
}
```
---
## 4 - Create a new Identity Provider🔑
Search for the "Identity Provider" service in the search input field which is located at the top of the page.

On the left navigation pane, under Access management choose **Identity providers** and then choose **Add provider** button➕
For provider type, select **OpenID Connect**🌐
For **Provider URL**, enter the address of your GitLab instance, such as `https://gitlab.com` or `https://gitlab.example.com`🌍
For **Audience**, enter something that is generic and specific to your application. In my case, I'm going to enter `our-identity-provider`🎯
> To prevent confused deputy attacks, it's best to make this something that is not easy to guess🚫🕵️♂️
Take note of this value because you will use it to set the **ID_TOKEN** in your `.gitlab-ci.yml` file📝
Lastly, click on the "Add provider" button to finish up✅
## Create the permissions policy📜
After you create the identity provider, you need to create a **permissions policy**🛡️
From the IAM dashboard, under Access management select **Policies** and then **Create policy**➕
Select the JSON tab and paste the following policy, replacing `our-react-app-s3-bucket` on both `Resource` lines with <u>your</u> bucket name. Click the "Next" button to continue, type `our-policy` into the "Policy name" input field, and finally click the "Create policy" button to finish up📝
```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:ListBucket"],
            "Resource": ["arn:aws:s3:::our-react-app-s3-bucket"]
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:GetObject",
                "s3:DeleteObject"
            ],
            "Resource": ["arn:aws:s3:::our-react-app-s3-bucket/*"]
        }
    ]
}
```

**Note**: Don't change anything except the bucket name!🛑
---
## 5 - Creating a role🎭
Now it’s time to add the role. From the IAM dashboard, under **Access management** select **Roles** and then click **Create role** button. Select **Web identity**🆔
In the Web identity section, select the **identity provider** you created <u>earlier</u>. In our case, it would be `gitlab.com`🌐
For the **Audience**, select the audience you created earlier. Select the Next button to continue. In our case, it would be `our-identity-provider`🎯
Click the "Next" button➡️
If you wanted to limit authorization to a specific group, project, branch, or tag, you could create a Custom trust policy instead of a Web identity. Since I will be deleting these resources after the tutorial, I'm going to keep it simple😊

Click the "Next button"➡️
During the **Add permissions** step (step 2), search for the policy you created and click the "Next" button to continue. In our case, it would be `our-policy`🔍

In the next step (step 3), give your role a name and click the "Create role" button. I will call it `our-role-name`📝
Search for the role you just created. Click on it🔍
In the summary section, find the **Amazon Resource Name (ARN)** and save it somewhere secure🔒
You will use this in your pipeline🔄

In our case, it is `arn:aws:iam::162340708442:role/our-role-name`✏️
---
## 6 - Deploy to your Amazon S3 bucket using a GitLab CI/CD pipeline🚀
Inside of your project repository on GitLab, create two CI/CD variables🔧
> The first variable should be named `ROLE_ARN`. For the value, paste the **ARN** of the role you just created in the last step📝
> The second variable should be named `S3_BUCKET`. For the value, paste the name of the S3 bucket you created earlier. In our case it would be `our-react-app-s3-bucket`📝
## Retrieve your temporary credentials🗝️
Inside of your `.gitlab-ci.yml` file, paste the following code:
```yaml
.assume_role: &assume_role
  - >
    STS=($(aws sts assume-role-with-web-identity
    --role-arn ${ROLE_ARN}
    --role-session-name "GitLabRunner-${CI_PROJECT_ID}-${CI_PIPELINE_ID}"
    --web-identity-token $ID_TOKEN
    --duration-seconds 3600
    --query 'Credentials.[AccessKeyId,SecretAccessKey,SessionToken]'
    --output text))
  - export AWS_ACCESS_KEY_ID="${STS[0]}"
  - export AWS_SECRET_ACCESS_KEY="${STS[1]}"
  - export AWS_SESSION_TOKEN="${STS[2]}"
```
This is going to use the **AWS Security Token Service** to generate temporary (3,600 seconds) credentials utilizing the OIDC role you created earlier🛡️
## Create the deploy job🛠️
Now, let's add a build and deploy job to build your application and deploy it to your S3 bucket📦
First, update the stages in your `.gitlab-ci.yml` file to include a `build` and `deploy` stage as shown below:
```yaml
stages:
  - build
  - test
  - deploy
```
Next, let's add a job to build your application. Paste the following code in your `.gitlab-ci.yml` file:
```yaml
build artifact:
  stage: build
  image: node:latest
  before_script:
    - npm install
  script:
    - npm run build
  artifacts:
    paths:
      - build/
    when: always
  rules:
    - if: '$CI_COMMIT_REF_NAME == "main"'
      when: always
```
This is going to run `npm run build` if the change occurs on the `main` branch and upload the build directory as an artifact to be used during the next step🔄
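One caveat worth flagging: `npm run build` in a Vite project writes to `dist/` by default, while this pipeline syncs `build/`. Below is a minimal `vite.config.js` sketch (assuming the standard `@vitejs/plugin-react` template; it is not part of the original tutorial) that points Vite's output at `build/` so the pipeline paths work unchanged:

```js
// vite.config.js — a sketch, not part of the original tutorial.
import { defineConfig } from 'vite'
import react from '@vitejs/plugin-react'

export default defineConfig({
  plugins: [react()],
  build: {
    // Vite defaults to `dist/`; emit to `build/` to match the
    // artifact path and `aws s3 sync build/` used in .gitlab-ci.yml.
    outDir: 'build',
  },
})
```

Alternatively, keep Vite's default and replace `build/` with `dist/` in the artifact paths and the sync command.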
Next, let's add a job to actually deploy to your S3 bucket. Paste the following code in your `.gitlab-ci.yml` file:
```yaml
deploy s3:
  stage: deploy
  image:
    name: amazon/aws-cli:latest
    entrypoint:
      - '/usr/bin/env'
  id_tokens:
    ID_TOKEN:
      aud: our-identity-provider
  script:
    - *assume_role
    - aws s3 sync build/ s3://$S3_BUCKET
  rules:
    - if: '$CI_COMMIT_REF_NAME == "main"'
      when: always
```

Note that the `aud` value must match the **Audience** you configured on the identity provider earlier (`our-identity-provider` in our case).
Your complete `.gitlab-ci.yml` file should look like this:
```yaml
stages:
  - build
  - test
  - deploy

.assume_role: &assume_role
  - >
    STS=($(aws sts assume-role-with-web-identity
    --role-arn ${ROLE_ARN}
    --role-session-name "GitLabRunner-${CI_PROJECT_ID}-${CI_PIPELINE_ID}"
    --web-identity-token $ID_TOKEN
    --duration-seconds 3600
    --query 'Credentials.[AccessKeyId,SecretAccessKey,SessionToken]'
    --output text))
  - export AWS_ACCESS_KEY_ID="${STS[0]}"
  - export AWS_SECRET_ACCESS_KEY="${STS[1]}"
  - export AWS_SESSION_TOKEN="${STS[2]}"

unit test:
  image: node:latest
  stage: test
  before_script:
    - npm install
  script:
    - npm run test:ci
  coverage: /All files[^|]*\|[^|]*\s+([\d\.]+)/
  artifacts:
    paths:
      - coverage/
    when: always
    reports:
      junit:
        - junit.xml

build artifact:
  stage: build
  image: node:latest
  before_script:
    - npm install
  script:
    - npm run build
  artifacts:
    paths:
      - build/
    when: always
  rules:
    - if: '$CI_COMMIT_REF_NAME == "main"'
      when: always

deploy s3:
  stage: deploy
  image:
    name: amazon/aws-cli:latest
    entrypoint:
      - '/usr/bin/env'
  id_tokens:
    ID_TOKEN:
      aud: our-identity-provider
  script:
    - *assume_role
    - aws s3 sync build/ s3://$S3_BUCKET
  rules:
    - if: '$CI_COMMIT_REF_NAME == "main"'
      when: always
```
---
## 7 - Make a change and test your pipeline🧪
1. Inside `App.jsx`, modify some code✏️
2. Commit your changes to the `main` branch. The pipeline should kick off and when it finishes successfully you should see your updated application at the URL of your static website🌐
Voila! 🎉
You now have a CI/CD pipeline built in GitLab that receives temporary credentials from AWS using OIDC and automatically deploys to your Amazon S3 bucket🚀
To take it a step further, you can secure your application with GitLab's built-in security tools🔒
## Where to find the URL to visit your website?🔗
Go to the "S3 buckets" page. Click on your bucket. Go to "Properties"🏠
Scroll to the bottom. You will find "Static website hosting". Click the "Edit" button✏️
Select the "Enable" radio checkbox✅

Click the "Save changes" button💾
In "Index document" fill: `index.html`📝
You will be redirected back. Again, scroll to the bottom of the page and you will find your website URL in the **Static website hosting** section🌐

---
Thank you for reading. Hope I helped 😊
🚀 Follow me on [GitHub](https://github.com/perisicnikola37)
| perisicnikola37 |
1,869,385 | The art of complaining effectively | Marcus Aurelius said "Don't be overheard complaining…Not even to yourself". I think about this... | 0 | 2024-05-31T14:00:00 | https://dev.to/arjunrao87/the-art-of-complaining-effectively-3kal | beginners, productivity, career, learning |
Marcus Aurelius said "Don't be overheard complaining…Not even to yourself". I think about this statement a fair bit. Stoics believe complaining is pointless since that is time wasted when it could be spent improving personally.
Complaining in most cases is bad. When you just complain all the time it is a major mental drain on yourself and people surrounding you. Merely complaining helps no one and breeds negativity.
However, there are a couple of positives to complaining -
- Sometimes you just need to get things off your chest, feel better and move on. This is something you should use strategically because otherwise it can very easily fit into the previous category of mindless complaining.
- Other times, it shows that you don't believe in the status quo. For whatever reason, the current scenario isn't to your liking and you are going against the tide.
The big question here is "so what?". What you do next will likely determine the type of person you are. You know how they say "there are no wrong answers"? Unfortunately, in this case there is a wrong answer. If you continue to complain, you will not inspire any confidence in people to believe or follow you. On the other hand, if you treat complaining as a way to redirect the problem towards a solution, that opens up a big door of possibilities. Rather than just complaining and whining about something, presenting an alternative or a solution to the situation helps bring purpose to it. Your ability to come equipped with a solution without being asked shows that you have initiative.
The more senior you get, the more this is imperative. Your ability to be an effective senior is predicated on spotting scenarios that are not optimal for your team or codebase coupled with a knack for identifying solutions for them. It doesn't have to be perfect. You, your peers or your boss can build on top of it and get it to a viable place.
This is not just a career-motivated tactic but a way to breed the right character. If you want to be seen as a problem solver, a go-getter, someone who can be depended on to think of the greater good - you have to come to the table with something besides just grievances. When you think about complaining, take a second to think about how you could contribute to making the situation better, either through yours or someone else's actions and you are now on the path of effective complaining. | arjunrao87 |
1,872,116 | Memory-efficient mass data transfer between Excel and database using Apache POI, Spring Event, Async Threads | Create an Excel file with millions of rows of data from the database with minimal impact on heap... | 0 | 2024-05-31T13:58:08 | https://dev.to/andrewkangg/memory-efficient-mass-data-transfer-between-excel-and-database-using-apache-poi-spring-event-async-threads-3ejc | springboot, excel, webdev | - Create an Excel file with millions of rows of data from the database with minimal impact on heap memory
https://github.com/patternknife/persistence-excel-bridge | andrewkangg |
1,872,114 | This Week In React #187: Next.js, Expo, Popover, rethrow, SWR, React-Query, Astro, PPR... | Hi everyone! Last week we got great announcements from Vercel Ship and App.js conferences. I also... | 18,494 | 2024-05-31T13:56:46 | https://thisweekinreact.com/newsletter/187 | react, reactnative | ---
series: This Week In React
canonical_url: https://thisweekinreact.com/newsletter/187
---
Hi everyone!
Last week we got great announcements from Vercel Ship and App.js conferences.
I also found the community blog posts very interesting this week! There's a lot of good content to read here. The one about memory leak is quite scary 😅.
I have been wondering lately: is it still useful to write a [Twitter thread](https://slo.im/thread)? I've [asked the question on Twitter](https://x.com/sebastienlorber/status/1795377678918554072) and feel like nobody reads it anymore, apart from a few people. I'm considering stopping. What do you think? 🤔
---
💡 Subscribe to the [official newsletter](https://thisweekinreact.com?utm_source=dev_crosspost) to receive an email every week!
[](https://thisweekinreact.com?utm_source=dev_crosspost)
---
## 💸 Sponsor
[](https://marmelab.com/react-admin/)
**[React-admin: The Open-Source Framework for B2B apps](https://marmelab.com/react-admin/)**
👩🏻💻 **For developers, by developers**: React-admin is a free, low-code library that accelerates the development of internal tools, admins or B2B apps. Unlike no-code tools, you use code, ensuring you’re never limited by the framework.
🔌 **Flexible Integration**: React-admin supports any API format (REST or GraphQL) and various authentication backends, including Auth0, Cognito, Google Auth, Active Directory, and Keycloak. You control the server, so there are no extra costs for Single Sign-On (SSO) capabilities. It's fully themeable, allowing you to customize it with your company's colors.
🚀 **Proven Success**: Over 25,000 companies have built single-page applications with React-admin. For your next project, save weeks of development time by using react-admin. [Try react-admin now](https://marmelab.com/react-admin/).
---
## ⚛️ React
[](https://nextjs.org/blog/next-15-rc)
**[Next.js 15 RC](https://nextjs.org/blog/next-15-rc)**
A new major version of Next.js dropped in RC at Vercel Ship last week. The highlights of this version are:
- React 19 RC support
- React Compiler support (experimental) through the Babel plugin (this might increase build time)
- Hydration error improvements, displaying the diff mismatch
- Less aggressive caching, now opt-in: `fetch` and Route Handlers are no longer cached by default
- Incrementally adopt Partial Prerendering with a new `experimental_ppr` route config option
- New `create-next-app` design, prompting for Turbo usage in dev, and ability to create projects
- next/after (experimental), a new API useful to defer analytics tasks and keep responses fast
- Optimizing bundling of external packages (`serverExternalPackages`) now stable
Other interesting things were announced at Vercel Ship. Additional resources to look at:
- 📜 [Vercel Ship 2024 recap](https://vercel.com/blog/vercel-ship-2024)
- 📜 [Introducing deeper integrations for feature flags in Vercel](https://vercel.com/blog/feature-flags)
- 📜 [Introducing the Vercel Web Application Firewall](https://vercel.com/blog/introducing-the-vercel-waf)
- 🎥 [Jack Herrington - NextJS 15: React 19, Less Caching, Turbopack, Better Hydration Errors and more from Vercel Ship!](https://www.youtube.com/watch?v=N2LzvfM2R5M)
---
- 💸 [Build AI apps with React and JavaScript — Frontend and UI for any LLM](https://docs.nlkit.com/nlux/?utm_source=twir-my24-3)
- 👀 [React Core PR - Throw if React and React DOM versions don't match](https://github.com/facebook/react/pull/29236): React 19 will be stricter and require that the renderer version (DOM or RN) uses the exact same version as the core React package.
- 👀 [React DOM PR - Add support for Popover API](https://github.com/facebook/react/pull/27981): This new API is now available in all major browsers, and React just merged support for it.
- 👀 [Next.js PR - `unstable_rethrow`](https://github.com/vercel/next.js/pull/65831): Upcoming Next.js API to avoid catching internal errors thrown by Next.js (such as redirect/notFound errors).
- 🐦 [Latest SWR beta can seamlessly move data fetching between client-side and server-side](https://x.com/shuding_/status/1794461568505352693). And [React-Query](https://x.com/TkDodo/status/1794801851671695533) + [Apollo Client](https://x.com/phry/status/1795777518944784450) also implemented similar features.
- 🗓 [React Rally](https://reactrally.com/?utm_source=thisweekinreact) - 🇺🇸 Utah - 12-13, August - Get a 10% discount with code "TWIR". There’s an advanced React [workshop](https://reactrally.com/workshop) with Cory House.
- 📜 [Sneaky React Memory Leaks: How useCallback and closures can bite you](https://schiener.io/2024-03-03/react-closures): A great article simplifying a real production app memory leak to explain it. This surprised me, I couldn’t guess such React code would leak memory, and pretty sure my apps contain memory leaks now 😅. Good news though: the app wouldn't leak memory when using the new React Compiler.
- 📜 [Automatic Query Invalidation after Mutations](https://tkdodo.eu/blog/automatic-query-invalidation-after-mutations): React Query is un-opinionated about how you invalidate cached resources and instead lets you implement a custom strategy on top of flexible primitives. For example, you can invalidate queries by tags.
- 📜 [Migrating from Radix to React Aria: Improving Accessibility and UX](https://argos-ci.com/blog/react-aria-migration): After having used alternatives such as Reakit or Radix, Greg gives good reasons for adopting React Aria. The migration feedback is positive, but some challenges were encountered. React Aria is quite “strict” at ensuring your components remain accessible.
- 📜 [On Laravel, Full-Stack JavaScript, and Productive Frameworks](https://www.jplhomer.org/posts/on-laravel-full-stack-javascript-and-productive-frameworks/): Josh created 2 React meta-frameworks (including Shopify Hydrogen) and also uses Laravel extensively. He shares an interesting perspective on the pros/cons of the Laravel vs React ecosystem.
- 📜 [A virtual DOM in 200 lines of JavaScript](https://lazamar.github.io/virtual-dom/): This greatly explains how a virtual DOM (like the one from React) works under the hood, by creating a small library that can run a TodoMVC app.
- 📜 [Design System Retrospective](https://kyleshevlin.com/design-system-retrospective/): Interesting perspective on the successfulness of implementing a Chakra-like design system (tokens as props). The average dev struggled to understand how to compose primitives. Tailwind might be a better fit for them.
- 📜 [Combining React Server Components with react-query for Easy Data Management](https://frontendmasters.com/blog/combining-react-server-components-with-react-query-for-easy-data-management/): A good read explaining the limits of RSCs. Server Actions run serially, and you need to wait for RSCs to re-render, which might be slow without proper caching. Using React-Query with prefetching might slightly increase bundle size and produce more roundtrips, but overall that could give a faster UX.
- 📜 [Using Server Actions with tRPC](https://trpc.io/blog/trpc-actions): Server Actions are similar to tRPC but come barebone. tRPC maintainer explains things that tRPC enables on top, such as auth, input validation, observability, rate-limiting, and other possible middleware. Note there are alternatives to consider such as next-server-action and zsa.
- 📜 [Optimizing INP for a React App & Performance Learnings](https://www.iamtk.co/optimizing-inp-for-a-react-app-and-performance-learnings): A long feedback on optimizing responsiveness on a React 18 prod app. Various takeaways include deferring work and analytics, or being careful of useless re-renders.
- 📜 [Partial Prerendering without a framework](https://developers.netlify.com/guides/partial-prerendering-without-a-framework/): Demo from Netlify CEO shows how to implement “vanilla PPR” (without Next.js which has this as an experimental feature).
- 📜 [Behind the ‘as’ prop: polymorphism done well](https://www.kripod.dev/blog/behind-the-as-prop-polymorphism-done-well/): Also explains how to type it properly.
- 📜 [Redwood - Using Middleware: RSS and Sitemap](https://redwoodjs.com/blog/using-middleware-rss-and-sitemap)
- 📜 [Structured logging for Next.js - Using the Pino logging library](https://blog.arcjet.com/structured-logging-in-json-for-next-js/)
- 📜 [Facebook just updated its relationship status with web components](https://www.mux.com/blog/facebook-just-updated-it-s-relationship-status-with-web-components)
- 📜 [Authentication with WorkOS in Next.js: A Comprehensive Guide](https://www.nirtamir.com/articles/authentication-with-workos-in-next-js-a-comprehensive-guide)
- 📜 [The Next.js \<Image> Component](https://www.premieroctet.com/blog/en/next-image-component)
- 📜 [How To Dockerize A React App](https://scientyficworld.org/how-to-dockerize-a-react-app/)
- 📦 [Astro 4.9 - React 19 support for Astro Actions](https://astro.build/blog/astro-490/): Adds new withState/getActionState APIs to integrate Astro with React 19 useActionState, including progressive enhancement.
- 📦 [Storybook 8.1 - typesafe mocking, Unit testing React Server Components](https://storybook.js.org/blog/storybook-8-1/)
- 📦 [NextUI 2.4](https://nextui.org/blog/v2.4.0)
- 📦 [Ark UI 3.0](https://ark-ui.com/react/docs/overview/changelog#300---2024-05-24)
- 📦 [React-Query 5.39 - Supports React 19](https://github.com/TanStack/query/releases/tag/v5.39.0)
- 📦 [Redwood 7.6 - React Compiler experimental flag](https://github.com/redwoodjs/redwood/releases/tag/v7.6.0)
- 📦 zsa 0.1 - Typesafe Server Actions for Next.js
- 📦 [React-Executor - Asynchronous task execution and state management for React](https://github.com/smikhalevski/react-executor)
- 🎥 [UI Engineering - Portals Can Share State Between Windows](https://www.youtube.com/watch?v=jZx33FPMXzc): It's a pretty cool idea to use React portals to integrate seamlessly between windows and might also be useful for the upcoming browser Picture-in-Picture API.
- 🎥 [Theo - I Was Wrong About React Router.](https://www.youtube.com/watch?v=m86HssTKExU)
- 🎥 [James Quick - Astro Launches Actions Similar like Next.js](https://www.youtube.com/watch?v=8mIUIhp2YGQ)
---
## 💸 Sponsor
[](https://www.youtube.com/watch?v=aoRG1q_kVo8)
**[Next.js auth tutorial with RSCs and Server Actions](https://www.youtube.com/watch?v=aoRG1q_kVo8)**
The latest tutorial with WorkOS and Sam Selikoff shows how you can easily add AuthKit's hosted login box to a Next.js app:
📚 Get started using the Authkit \<> Next.js [integration library](https://github.com/workos/authkit-nextjs)
🤖 Set up environment variables, configure the callback route and middleware, and implement signIn and signOut functionalities
⚙️ Protect routes in the Next.js app from unauthenticated users with the getUser function
AuthKit can be used with WorkOS User Management, which supports MFA, identity linking, email verification, user impersonation, and more.
Best of all, it's **free up to 1,000,000 MAUs 🚀**
---
## 📱 React-Native
This section is authored by [Benedikt](https://twitter.com/bndkt).

The dry season of everyone saving their announcements for upcoming conferences is over, with last week’s App.js Conf dropping another bucket of amazing content and exciting announcements on us! Expo announced the aptly named Atlas, which acts as a map to explore the wilderness of Bundler land. If you ever wanted to understand how code goes in and comes out of your app, this is the tool for it. It’s especially useful to reduce bundle size, but also just to get a better understanding of how your app works. In the ORM space, Drizzle launched their Studio as an Expo dev tools plugin and Prisma announced that they’ll be adding React Native support. The React Native IDE is now in open beta and you can download it in the VS Code extension marketplace and finally, William Candillon announced video support coming to RN Skia, as well as WebGPU in React Native! Of course, we remain super excited by Universal React Server Components that we already mentioned last week. Make sure to catch up and watch the App.js Conf live stream.
---
- 💸 [WithFrame - Pre-Built React Native Components](https://withfra.me/components?utm_source=thisweekinreact&utm_medium=email&utm_campaign=quick-link--1)
- 🐦 [Re.Pack V5 will be based on Rspack instead of Webpack](https://x.com/_jbroma/status/1793286546008981933): This brings significant performance improvements (Cold start 15s → 2.5s, build 10s → 2s, HMR 450ms → 100ms).
- 🗓 [Chain React Conf](https://chainreactconf.com/?utm_source=thisweekinreact) - 🇺🇸 Portland, OR - July 17-19. The U.S. React Native Conference is back with engaging talks and hands-on workshops! Get 15% off your ticket with code “TWIR”
- 📜 [Introducing Expo Atlas](https://expo.dev/blog/introducing-expo-atlas): The Expo team used this internally to work on a faster Metro resolver, improve their Babel config, as well as for their work on RSC. Now you can use it to better understand and optimize your app as well!
- 📜 [RNW 0.74 Launches: A Gallery Glow-up and Fabric Foundations! React Native](https://devblogs.microsoft.com/react-native/2024-05-24-improved-gallery-fabric-0-74/): Something new I learned from this announcement is the React Native Gallery app, basically a kitchen sink demo app for RN on Windows.
- 📜 [Bringing Prisma ORM to React Native and Expo](https://www.prisma.io/blog/bringing-prisma-orm-to-react-native-and-expo): Prisma, including reactive queries, is now available in RN (in beta). And it seems they’re also working on a local-first sync solution.
- 📜 [Using React-Admin With React Native](https://marmelab.com/blog/2024/05/22/using-react-admin-with-react-native.html): React-Admin is a framework for quickly building CRUD apps. This article shows how to leverage its headless architecture to use it with RN.
- 📜 [Fluttering in the sky](https://www.celest.dev/blog/fluttering-in-the-sky): Not React Native but worth mentioning that a company is working on the equivalent of React Native Server Components for Flutter.
- 📦 [React Native IDE is now in open beta](https://ide.swmansion.com/): Simply download as a VS Code extension from the VS Code marketplace.
- 📦 [SwiftUI-React-Native 6.0](https://github.com/andrew-levy/swiftui-react-native): Use SwiftUI features directly from React Native.
- 📦 [React Native Skia 1.3.0](https://github.com/Shopify/react-native-skia/releases/tag/v1.3.0): Skia can now render videos.
- 📦 [Drizzle Studio is now available for Expo SQLite via dev tools plugin](https://github.com/drizzle-team/drizzle-studio-expo/tree/main): This is super handy! No more finding the Simulator’s file system folder on your Mac to open the sqlite-file in a separate GUI for debugging purposes.
- 🎥 [Simon Grimm - 10 Takeaways for React Native Devs from the App.js conference](https://www.youtube.com/watch?v=mhos9givltA)
---
## 🔀 Other
- 📜 [What We Learned From the First State of HTML Survey](https://frontendmasters.com/blog/state-of-html-2023-results-2/)
- 📜 ["Web components" considered harmful](https://www.mayank.co/blog/web-components-considered-harmful/)
- 📦 [Angular 18.0 - Zoneless, Material 3, deferrable views…](https://blog.angular.dev/angular-v18-is-now-available-e79d5ac0affe)
- 📦 [TypeScript-ESLint v8 beta - ESLint 9 support](https://typescript-eslint.io/blog/announcing-typescript-eslint-v8-beta/)
- 📦 [fnm 1.36 - fast Node.js version manager](https://github.com/Schniz/fnm/releases/tag/v1.36.0)
- 📦 [rsbuild 0.7](https://rsbuild.dev/community/releases/v0-7)
---
## 🤭 Fun
[](https://x.com/jamonholmgren/status/1791255201774719178)
See ya! 👋 | sebastienlorber |
1,872,113 | コグニティブ・サービスのユースケースを探る | コグニティブサービスによって変貌を遂げつつあるアプリやビジネスの例、そして将来のユースケースもいくつか紹介する。 | 0 | 2024-05-31T13:56:18 | https://dev.to/pubnub-jp/koguniteibusabisunoyusukesuwotan-ru-59h6 | コグニティブ・サービスによって変貌を遂げたアプリや企業の例、そしてコグニティブ・サービスがテクノロジーの展望をどれほど変えつつあるのか、今後のユースケースをいくつか見ていこう。
AWS、IBM、Microsoft Azureのようなクラウド大手のコグニティブ・サービスのおかげで、あらゆる規模の開発者チームが、驚異的なパワーを持つ[コグニティブ・サービスに](https://pubnub.com/resources/ebook/building-apps-with-cognitive-services/)アクセスできるようになった。APIを通じて提供されるこれらのサービスにより、次世代のインテリジェンスをアプリケーションに注入することが容易になります。
**Chat and Social Interaction**
-----------------------
In 2015, the monthly active users of chat apps surpassed those of social networks, and the gap keeps widening. In fact, messaging has become an essential feature of social networks themselves. Alongside this rapid growth, messaging apps have evolved from simple tools for sending and receiving short text-based messages into innovative, full-featured experiences with surprising and delightful capabilities. And cognitive APIs are driving that innovation.
### **Chatbots and Cognitive Computing**
Chatbots are one of the earliest forms of AI algorithms. They are unlikely to pass the Turing test anytime soon, but they represent a natural evolution of voice-enabled applications. Where you once called a support line and pressed 1 for accounts payable, you can now speak in full sentences to a system that can identify your intent.

Whether you know it or not, chatbot adoption has exploded as companies look to shorten wait times, improve the customer experience, and minimize the cost of human phone operators. Today, chatbots are mainly used to handle simple tasks. They understand basic requests and respond based on predefined rules, answering questions like "Where is my order?" or "Chatbot, turn on the mood lights."

With APIs like [Watson Assistant](https://www.ibm.com/cloud/watson-assistant/) and [Amazon Lex](https://aws.amazon.com/lex/), however, it is easy to build services that apply logic to patterns observed in natural-language requests. These services could, for example, notice a sudden rush of calls from an airport plagued by takeoff delays and reorder the options to prioritize flight rescheduling. Or they could notice that calls from a particular country or region tend to be in a different language and change the default accordingly. They could even identify grammatical patterns and route a customer straight to a supervisor.

Intelligent conversational interfaces that use speech recognition, speech synthesis, facial recognition, and machine learning models can deliver highly engaging, lifelike conversational experiences for a wide range of purposes. Better still, chatbots learn from those experiences.

Chatbots will change the way we bank, shop, and learn. They will make recommendations, understand abstract concepts, and get to know individuals based on past interactions. Eventually, chatbots will become so good that you won't even be able to tell whether you're talking to a human.
#### **Code Example: Home Automation Chatbot**

With Watson and PubNub ChatEngine, you can easily build an [AI-powered chatbot](https://www.pubnub.com/docs/chat/samples?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=ja) that controls a smart home.

This tutorial shows how to build a chatbot that accepts text commands, parses them, and takes action based on them. For example, when a user types "turn on the living room lights," the bot turns the lights on.
```json
{
  "homeauto_intents": [
    {
      "intent": "turnOFF",
      "examples": [
        { "text": "Put off" },
        { "text": "Switch off" },
        { "text": "Turn off" }
      ],
      "description": "Turn off intents"
    },
    {
      "intent": "turnON",
      "examples": [
        { "text": "Put on" },
        { "text": "Switch on" },
        { "text": "Turn on" }
      ],
      "description": "Turn on intents"
    }
  ]
}
```
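The tutorial feeds this configuration to Watson, but the matching idea is easy to see in isolation. Below is a minimal JavaScript sketch (a hypothetical `matchIntent` helper, not the tutorial's actual code) that checks a command against the example phrases above and extracts the target device:

```js
// Minimal intent matcher: tries each example phrase as a prefix of the
// incoming command and returns the intent plus the remaining target text.
const intents = [
  { intent: "turnOFF", examples: ["Put off", "Switch off", "Turn off"] },
  { intent: "turnON", examples: ["Put on", "Switch on", "Turn on"] },
];

function matchIntent(text) {
  const lower = text.toLowerCase().trim();
  for (const { intent, examples } of intents) {
    for (const example of examples) {
      const phrase = example.toLowerCase();
      if (lower.startsWith(phrase)) {
        // Everything after the matched phrase is the device/target.
        return { intent, target: lower.slice(phrase.length).trim() };
      }
    }
  }
  return null; // no intent recognized
}

console.log(matchIntent("Turn on the living room lights"));
// → { intent: 'turnON', target: 'the living room lights' }
```

Watson's natural language understanding handles much fuzzier phrasing, but this prefix matching is enough to illustrate how a "turn the lights on/off" command becomes an action.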
### **Natural Language Processing**

Natural language processing (NLP) sits at the intersection of data science and AI and is an umbrella term for solutions that can usefully process large volumes of natural-language data. Beyond measuring words and grammar from a semantic point of view, NLP can reveal how users feel about topics and themes through message-by-message analysis, surfacing sentiment and emotion.

NLP is a major asset for brands, public figures, and organizations that need to understand and respond to user opinions in an era when reputations can be made or lost within minutes. Suppose a brand launches a new commercial for a product. With the right cognitive service, it can tap the social media stream for a specific hashtag or product name, have an NLP API analyze every related message, and get feedback on how the general public is reacting to the product.

Below is an example of an app designed to analyze and measure how people on Twitter felt about US politicians. It can monitor specific keywords and phrases and plot user sentiment across a defined geographic region.



For example, suppose a user posts the text "I am happy."
```json
{
  "session_id": 1,
  "text": "I am happy!"
}
```
Watson analyzes the text and returns the following:
```json
{
  "session_id": 1,
  "text": "I am happy!",
  "session_sentiment": {
    "overall": 0.879998,
    "positive": {
      "count": 1,
      "avg": 0.879998
    },
    "negative": {
      "count": 0,
      "avg": 0
    },
    "neutral": {
      "count": 2,
      "avg": null
    }
  },
  "score": 0.88006828
}
```
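The `session_sentiment` aggregates in this payload can also be maintained client-side as scored messages stream in. Here is a hedged sketch (the `aggregateSentiment` helper and the ±0.1 neutral band are assumptions for illustration, not Watson's actual logic) that rolls per-message scores into the same shape:

```js
// Roll per-message sentiment scores up into session-level aggregates.
function aggregateSentiment(scores) {
  const buckets = { positive: [], negative: [], neutral: [] };
  for (const s of scores) {
    if (s > 0.1) buckets.positive.push(s);
    else if (s < -0.1) buckets.negative.push(s);
    else buckets.neutral.push(s);
  }
  const avg = (xs) =>
    xs.length ? xs.reduce((sum, x) => sum + x, 0) / xs.length : null;
  // Neutral scores are excluded from the overall average, mirroring the
  // sample payload where neutral messages have a null avg.
  const scored = [...buckets.positive, ...buckets.negative];
  return {
    overall: avg(scored),
    positive: { count: buckets.positive.length, avg: avg(buckets.positive) },
    negative: { count: buckets.negative.length, avg: avg(buckets.negative) },
    neutral: { count: buckets.neutral.length, avg: null },
  };
}
```

Such a rollup could run as each scored message arrives, keeping the session summary current in real time.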
Brands already spend heavily on analyzing market sentiment. As these systems become more intelligent, more robust, and more automated, they will understand the general public far better at a much lower cost.
**eCommerce**
---------
Online shopping has completely changed how we buy products, but e-commerce lacks a key ingredient of brick-and-mortar stores: helpful employees. At the scale of an online store, staffing live chat with real people is not economically feasible.

As a result, many online stores are turning to intelligent shopping assistant bots to optimize the experience, help shoppers with their questions, make recommendations, and check out.

[Nordstrom ruled a previous holiday season](https://wersm.com/nordstrom-ruled-holidays-with-its-amazing-chatbot/) with a Messenger chatbot that used cognitive services to go beyond simple predefined questions and answers, truly understand what customers were looking for, and help where needed. The chatbot could suggest gift ideas and even help complete an order.

チャットボットはまた、簡単な問題に対処するために担当者を1時間も待たせる、カスタマーサポートの恐ろしい電話から私たちを救ってくれる。アマゾンはチャットボットを配備しており、ほとんどの顧客が注文に関して助けを必要としているときに抱える些細な問題を解決することができる。
さて、今日の現実世界におけるインテリジェンスの例をいくつか見てきたところで、未来を覗いて、コグニティブ・サービスが将来どのように私たちの世界を変えていくかを見てみよう。
**スマートシティ**
-----------
未来の都市は、より安全で、より効率的で、より環境に配慮したものにするために、さまざまな統合インテリジェント・サービスに依存するようになるだろう。画像認識、コンピュータ・ビジョン、ビジョンAPIは、この変革において重要な役割を果たし、都市空間内の画像を処理してアクションを起こす。
**農業**
------
世界の人口は増加の一途をたどっており、数十億の人々に食料を供給することは、今後数年間でかなりの課題となる。コグニティブ・サービスは、畑や工場を管理する上で重要な役割を果たし、インテリジェントな決定を下し、これまでにない精度でリソースを制御できるようになる。
スマートファームとIoTは、できるだけ多くの貴重なデータポイントを組み込んで、一見直感に反するようなものであっても、インテリジェントな農業上の意思決定を行う。例えば、リアルタイムの気象データ、遠隔センサーデータ、過去の実績を集約することで、コグニティブサービスは個々の灌漑計画を完璧なものにし、毎日固有の状況に合わせて更新することができる。
**Data Security**
--------------
As we become more connected and our digital lives overtake our physical ones, data privacy and security have shifted from a vague concern to an unsettling, ever-present personal threat.
Regulations and rules (HIPAA, GDPR, SOC II) are one way for companies and organizations to ensure the proper guardrails are in place. Implementing such complex regulations in detail is daunting, and that's where machine learning shines.
Cognitive services can be trained to understand rules and regulations and to suggest paths to achieving compliance. They make it possible to deliver valuable insight into data security, from relevant rules and laws to content moderation.
**Healthcare**
---------
The healthcare industry has generally been slower to innovate than others for several reasons: tight margins, strict regulations, and siloed research and development. Cognitive services offer an opportunity to remove barriers to innovation and improve the delivery system from organizations down to patients.
Healthcare decisions are usually made in a silo, patient by patient. In contrast, cognitive services can comprehensively analyze and act on the factors that influence health, such as socioeconomic status, environment, and access to care. They can recommend more precise, targeted patient care to physicians, including health and wellness programs.
Cognitive services can also drive the integration and connection of existing systems within a healthcare organization, unlocking essential insights. Able to aggregate data and connect stakeholder needs, organizations can deliver better care while operating more efficiently.
**Intelligence Today**
--------------
This article has offered only a few examples of how cognitive services will change the way we think about business and the roles our applications play. Until now, software followed instructions. With cognitive services, solutions can adapt, evolve, and accomplish things that seemed impossible just a few years ago. We can't foresee every implication, but as far as we can tell, the impact on business will be significant and positive.
How can PubNub help you?
========================
This article was originally published on [PubNub.com](https://www.pubnub.com/blog/the-many-uses-of-cognitive-services/?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=ja).
PubNub's platform helps developers build, deliver, and manage real-time interactivity for web apps, mobile apps, and IoT devices.
The foundation of our platform is the industry's largest and most scalable real-time edge messaging network. With over 15 points of presence worldwide supporting 800 million monthly active users and 99.999% reliability, you'll never have to worry about outages, concurrency limits, or any latency issues caused by traffic spikes.
Experience PubNub
---------
Check out the [live tour](https://www.pubnub.com/tour/introduction/?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=ja) to understand the essential concepts behind every PubNub-powered app in less than 5 minutes.
Set up
------
Sign up for a [PubNub account](https://admin.pubnub.com/signup/?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=ja) for immediate, free access to PubNub keys.
Get started
---
[PubNub's docs](https://www.pubnub.com/docs?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=ja) will get you up and running, regardless of your use case or [SDK](https://www.pubnub.com/docs?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=ja). | pubnubdevrel | |
1,872,112 | React Big Calendar Functionality in a Scheduling Application | While recently working on a full-stack development project, I got to experience the usefulness of... | 0 | 2024-05-31T13:55:29 | https://dev.to/spencerbrown80/react-big-calendar-functionality-in-a-scheduling-application-2n4g | While recently working on a full-stack development project, I got to experience the usefulness of React Big-Calendar. My aim was to create a scheduling application for a home health nurse that would improve their ability to logistically plan their day. This involved creating a calendar that would hold and update current events. The inspiration for the application came from my years working in the home care industry. I wanted the user to be able to have the functionality of a home care scheduling application while also being easy to use.
To start using React Big Calendar, install it (along with Moment and Chakra UI, which this project uses) on the client side:

```shell
npm install react-big-calendar moment @chakra-ui/react
```
You will need to initialize the calendar inside a page or a component. I initialized mine in a component called UserCalendar. Below is a reduced version of my code to get you started; you will have many options to tailor the calendar as you build.
```
import React, { useEffect, useState } from "react";
import { Box, Button, useDisclosure, Select, Badge, Stack } from "@chakra-ui/react";
import { Calendar, momentLocalizer, Views } from "react-big-calendar";
import moment from "moment";
import "react-big-calendar/lib/css/react-big-calendar.css";
const localizer = momentLocalizer(moment);
const EVENT_STATUS_COLORS = { 1: "gray", 2: "green", 3: "yellow", 4: "blue", 5: "magenta" };
const UserCalendar = () => {
const [events, setEvents] = useState([]);
const { isOpen, onOpen, onClose } = useDisclosure();
useEffect(() => {
fetch("/api/events")
.then(response => response.json())
.then(data => setEvents(data.map(event => ({
...event, start: new Date(event.start), end: new Date(event.end)
}))));
}, []);
return (
<Box>
<Button onClick={onOpen}>Add Event</Button>
<Stack direction="row" mb={4}>
{Object.entries(EVENT_STATUS_COLORS).map(([status, color]) => (
<Badge key={status} colorScheme={color}>{status}</Badge>
))}
</Stack>
<Calendar
localizer={localizer}
events={events}
startAccessor="start"
endAccessor="end"
style={{ height: 500 }}
views={[Views.MONTH, Views.WEEK, Views.DAY, Views.AGENDA]}
eventPropGetter={event => ({
style: { backgroundColor: EVENT_STATUS_COLORS[event.status] }
})}
/>
</Box>
);
};
export default UserCalendar;
```
You should now have a calendar rendered on your page that lets you switch between monthly, weekly, daily, and agenda views. The date picker and the Today, Next, and Back buttons come with Big Calendar as well. Next, you need to get something on the calendar. There is no built-in form for adding events, but the calendar does react to selecting a slot. I set mine to open an event form that handles creating simple events to get started. Once again, this code is only a partial representation of my current event form.
```
import React from "react";
import { Formik, Form, Field } from "formik";
import { Box, Button, FormControl, FormLabel, Input, Textarea, Switch } from "@chakra-ui/react";
const EventForm = ({ isOpen, onClose, onSubmit, initialValues }) => {
return (
<Formik
initialValues={initialValues}
onSubmit={onSubmit}
>
{({ values, setFieldValue }) => (
<Form>
<FormControl>
<FormLabel>Notes</FormLabel>
<Field name="notes" as={Textarea} />
</FormControl>
<FormControl>
<FormLabel>Start</FormLabel>
<Field name="start" type="datetime-local" as={Input} />
</FormControl>
<FormControl>
<FormLabel>End</FormLabel>
<Field name="end" type="datetime-local" as={Input} />
</FormControl>
<FormControl>
<FormLabel>Recurring</FormLabel>
<Field name="is_recurring" type="checkbox" as={Switch} />
</FormControl>
<Button type="submit">Save</Button>
</Form>
)}
</Formik>
);
};
export default EventForm;
```
This snippet lets the user enter a start time and an end time on the event form, which is most of what you need to get started. I used Chakra UI to style the forms and, later on, to present them in modals; I believe it produced a nice result. Depending on your needs, you can customize your event form to present a lot of information to the user. Mine included fields for event_status, event_type, add_address, event_address, start_time, end_time, client_name, recurrence_rule, is_recurring, notes, user_id, client_id, and parent_event_id. Actually, I have more, but there's already enough to write about. Selecting add address or is recurring opens additional fields on the event form. This can get very involved quickly, so make decisions on your backend wisely.
## Creating Recurring Events
To create recurring events, you need to understand the recurrence rule.
```
DTSTART:20240601T100000Z
RRULE:FREQ=WEEKLY;INTERVAL=2;BYDAY=MO,WE;UNTIL=20241231T235959Z
```
**Explanation**
- `FREQ=WEEKLY`: the frequency of the recurrence is weekly.
- `INTERVAL=2`: the event occurs every 2 weeks.
- `BYDAY=MO,WE`: the event occurs on Mondays and Wednesdays.
- `DTSTART:20240601T100000Z`: the start date and time of the first occurrence, in UTC (June 1, 2024, at 10:00 UTC).
- `UNTIL=20241231T235959Z`: the end date and time for the recurrence rule, in UTC (December 31, 2024, at 23:59:59 UTC).

With this recurrence rule, the event repeats every other week on Mondays and Wednesdays from June 1, 2024, until December 31, 2024.
So to create recurring events, I needed a function to build the recurrence rule, a way to read and parse it back so it was viewable, and the selection controls on my form. By the way, my application stores the event date separately, so I did not need DTSTART, but I thought it would be nice to include in this example.
**Method to Generate Recurrence Rule**
```
import { RRule, rrulestr } from 'rrule';
const createRRule = (values) => {
let rrule = "";
if (values.is_recurring) {
if (values.recurrence_option === "EOW") {
rrule = `FREQ=WEEKLY;INTERVAL=2;BYDAY=${values.recurrence_days.join(",")}`;
} else if (values.recurrence_option === "Monthly") {
const byDayValues = values.recurrence_weeks.map(week =>
values.recurrence_days.map(day => `${week}${day}`).join(",")
).join(",");
rrule = `FREQ=MONTHLY;BYDAY=${byDayValues}`;
} else if (values.recurrence_option === "Weekly") {
rrule = `FREQ=WEEKLY;BYDAY=${values.recurrence_days.join(",")}`;
}
if (values.recurrence_end) {
rrule += `;UNTIL=${values.recurrence_end.replace(/-/g, "")}T235959Z`;
}
}
return rrule;
};
```
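To sanity-check the strings this produces, here is a trimmed-down, standalone sketch of the same weekly logic (the field names mirror my form values; the monthly branch is omitted for brevity), together with the rule it generates for an every-other-week Monday/Wednesday schedule:

```javascript
// Standalone sketch of the rule builder above, weekly cases only.
// Field names (is_recurring, recurrence_option, recurrence_days,
// recurrence_end) mirror the form values used in this project.
const buildRule = (values) => {
  if (!values.is_recurring) return "";
  let rrule = "";
  if (values.recurrence_option === "EOW") {
    rrule = `FREQ=WEEKLY;INTERVAL=2;BYDAY=${values.recurrence_days.join(",")}`;
  } else if (values.recurrence_option === "Weekly") {
    rrule = `FREQ=WEEKLY;BYDAY=${values.recurrence_days.join(",")}`;
  }
  if (values.recurrence_end) {
    rrule += `;UNTIL=${values.recurrence_end.replace(/-/g, "")}T235959Z`;
  }
  return rrule;
};

const rule = buildRule({
  is_recurring: true,
  recurrence_option: "EOW",
  recurrence_days: ["MO", "WE"],
  recurrence_end: "2024-12-31",
});
console.log(rule);
// FREQ=WEEKLY;INTERVAL=2;BYDAY=MO,WE;UNTIL=20241231T235959Z
```

Feeding that string back through the parse method below should repopulate the same form values, which makes a handy round-trip test.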
**Method to Parse Recurrence Rule**
```
const parseRecurrenceRule = (recurrenceRule, setFieldValue) => {
if (!recurrenceRule) return;
const parts = recurrenceRule.split(";");
let recurrenceDays = [];
let recurrenceEnd = "";
let recurrenceOption = "";
let recurrenceWeeks = [];
parts.forEach((part) => {
if (part.startsWith("FREQ=")) {
const freq = part.replace("FREQ=", "");
if (freq === "WEEKLY") {
if (part.includes("INTERVAL=2")) {
recurrenceOption = "EOW";
} else {
recurrenceOption = "Weekly";
}
} else if (freq === "MONTHLY") {
recurrenceOption = "Monthly";
}
} else if (part.startsWith("BYDAY=")) {
const days = part.replace("BYDAY=", "").split(",");
if (recurrenceOption === "Monthly") {
days.forEach(day => {
const week = day[0];
const dayOfWeek = day.substring(1);
recurrenceWeeks.push(parseInt(week, 10));
recurrenceDays.push(dayOfWeek);
});
} else {
recurrenceDays = days;
}
} else if (part.startsWith("UNTIL=")) {
const untilDate = part.replace("UNTIL=", "");
recurrenceEnd = `${untilDate.substring(0, 4)}-${untilDate.substring(4, 6)}-${untilDate.substring(6, 8)}`;
}
});
setFieldValue("recurrence_days", [...new Set(recurrenceDays)]);
setFieldValue("recurrence_end", recurrenceEnd);
setFieldValue("recurrence_option", recurrenceOption);
setFieldValue("recurrence_weeks", [...new Set(recurrenceWeeks)]);
};
```
**Formula to Generate Recurring Events**
```
const generateRecurringEvents = (event) => {
if (!event || !event.start) {
console.error("Invalid event data passed to generateRecurringEvents:", event);
return [];
}
const maxEndDate = new Date(event.start);
maxEndDate.setDate(maxEndDate.getDate() + 180); // 180 days from start date
const rule = rrulestr(event.recurrence_rule, { dtstart: new Date(event.start) });
const occurrences = rule.between(new Date(event.start), maxEndDate, true); // Generate all occurrences
const recurringEvents = occurrences.map((occurrence) => {
if (occurrence > new Date(event.start)) {
const end = new Date(occurrence);
end.setTime(end.getTime() + (new Date(event.end) - new Date(event.start))); // Adjust end time
return {
start: occurrence.toISOString(),
end: end.toISOString(),
type: event.type,
status: event.status,
is_fixed: event.is_fixed,
priority: event.priority,
is_recurring: false, // Each generated event is not recurring itself
recurrence_rule: event.recurrence_rule, // Include the recurrence rule
notify_client: event.notify_client,
notes: event.notes,
is_completed: event.is_completed,
is_endpoint: event.is_endpoint,
address: event.address,
city: event.city,
state: event.state,
zip: event.zip,
user_id: event.user_id,
user_client_id: event.user_client_id,
parent_event_id: event.id, // Link to the original event
};
}
return null;
}).filter(event => event !== null);
return recurringEvents;
};
```
By using something similar to this, you can easily create recurring events. I elected to have buttons appear that match the parts of the recurrence rule: days, weeks, and options. By storing all of the data in the RRULE string and then using the parser to read it back, I avoided creating extra columns in the events table.
**Updating a Series of Events**
Updating a series of events involves a little more than creating them. I elected to find all events after the date of the selected event, based on the parent_event_id created when the event was initialized: store those events, create the new events, and then delete the stored ones. Why not PATCH? If certain modifications were made to the recurrence rule, like adding a day, both a POST and a PATCH would be needed at the same time. For the same number of CRUD actions, I thought this approach was cleaner.
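As a sketch, that series update can be reduced to a pure function over the events list (the real version issues the corresponding DELETE and POST requests; `replaceSeries` and `regenerate` here are illustrative names, not part of Big Calendar):

```javascript
// Illustrative sketch: replace all future occurrences of a series.
// `events` is the flat list from state; `selected` is the occurrence the
// user edited; `regenerate` stands in for generateRecurringEvents above.
const replaceSeries = (events, selected, regenerate) => {
  const cutoff = new Date(selected.start);
  // 1. Find (and "store") every later occurrence of the same series.
  const toDelete = events.filter(
    (e) =>
      e.parent_event_id === selected.parent_event_id &&
      new Date(e.start) >= cutoff
  );
  // 2. Build the replacement occurrences from the edited event.
  const created = regenerate(selected);
  // 3. Drop the stored events and append the new ones.
  const kept = events.filter((e) => !toDelete.includes(e));
  return [...kept, ...created];
};
```

Keeping the selection and regeneration steps pure like this also makes the logic easy to unit-test before wiring it to the API calls.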
## Switching Between My View and Client View
One nice feature I found in Big Calendar was that I could switch from the user view to the client view and only see events tied to a selected client. I originally thought I would have to create a separate component, but the calendar took care of it for me with a little added code:
```
const [view, setView] = useState("my");
const [selectedClient, setSelectedClient] = useState(null);
const toggleView = () => {
setView(view === "my" ? "client" : "my");
setSelectedClient(null); // Reset selected client when switching views
};
const filteredEvents = events.filter(event => {
if (view === "my") {
return event.type !== 3; // Exclude "Client Unavailable" events
} else if (view === "client") {
return event.type !== 2 && event.user_client_id === parseInt(selectedClient);
}
return true;
});
```
## Handling Overdue and Overlapping Events
I was able to assign colors to my events based on their status (pending, confirmed, completed, cancelled), but I also wanted the user to visually identify events that conflicted or were overdue. We can achieve this by modifying the UserCalendar. Here is my method:
```
const eventPropGetter = (event) => {
const isOverdue = !event.is_completed && new Date() > new Date(event.end);
const overlappingIds = findOverlappingEvents(events);
const isOverlap = overlappingIds.includes(event.id);
const backgroundColor = isOverdue ? "red" : (isOverlap ? "orange" : EVENT_STATUS_COLORS[event.status]);
return { style: { backgroundColor } };
};
const findOverlappingEvents = (events) => {
const overlaps = [];
for (let i = 0; i < events.length; i++) {
for (let j = i + 1; j < events.length; j++) {
if (new Date(events[i].end) > new Date(events[j].start) && new Date(events[i].start) < new Date(events[j].end)) {
overlaps.push(events[i].id);
overlaps.push(events[j].id);
}
}
}
return overlaps;
};
```
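The nested loop above is O(n²), which is fine for a day's worth of visits; if the list grows, sorting by start time first finds the same conflicts in a single pass. This is a sketch using the same event shape (`findOverlapsSorted` is an illustrative name):

```javascript
// Single-pass overlap detection over events sorted by start time.
// Returns the same kind of id list as findOverlappingEvents above.
const findOverlapsSorted = (events) => {
  const sorted = [...events].sort(
    (a, b) => new Date(a.start) - new Date(b.start)
  );
  const overlaps = new Set();
  let openEvent = null; // the event with the latest end time seen so far
  for (const curr of sorted) {
    // Sorted by start, so curr conflicts iff it starts before the
    // latest end time among earlier events.
    if (openEvent && new Date(curr.start) < new Date(openEvent.end)) {
      overlaps.add(openEvent.id);
      overlaps.add(curr.id);
    }
    if (!openEvent || new Date(curr.end) > new Date(openEvent.end)) {
      openEvent = curr;
    }
  }
  return [...overlaps];
};
```

Because any earlier event that overlaps `curr` must end after `curr.start`, the one with the latest end is always enough to detect the conflict.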
## Practical User Agenda
If you are picky, like me, you may not like the agenda view in Big Calendar. It only covers a range of dates, and while the daily view page was a great tool for creating events, it was not what I wanted the user to see. So I created a user agenda that lists the day's events in order, with helpful links to the event, the client profile, directions, the event type, and notes. The user can view the associated event forms, pick dates, and add events. I set my agenda to fetch the event data like this:
```
const DailyAgenda = ({ userId, selectedDate, onDateChange, onPreviousDay, onNextDay }) => {
const [events, setEvents] = useState([]);
const [clients, setClients] = useState([]);
const [isModalOpen, setIsModalOpen] = useState(false);
const [selectedEvent, setSelectedEvent] = useState(null);
const navigate = useNavigate();
useEffect(() => {
const fetchEvents = async () => {
try {
const response = await fetch(`/api/users/${userId}/events?date=${selectedDate.toISOString().split('T')[0]}`);
let eventsData = await response.json();
eventsData = eventsData.filter(event => event.type !== 3);
const clientDetails = await Promise.all(eventsData.map(async (event) => {
const clientResponse = await fetch(`/api/user_clients/${event.user_client_id}`);
const clientData = await clientResponse.json();
return {
...event,
client_address: clientData.address_line_1,
client_city: clientData.city,
client_state: clientData.state,
client_zip: clientData.zip,
};
}));
setEvents(clientDetails);
} catch (error) {
console.error('Error fetching events:', error);
}
};
const fetchClients = async () => {
try {
const response = await fetch(`/api/user_clients`);
const clientsData = await response.json();
setClients(clientsData);
} catch (error) {
console.error('Error fetching clients:', error);
}
};
fetchEvents();
fetchClients();
}, [selectedDate, userId]);
```
I rendered the agenda incorporating links like this:
```
const handleOpenModal = (event) => {
setSelectedEvent(event);
setIsModalOpen(true);
};
const handleCloseModal = () => {
setSelectedEvent(null);
setIsModalOpen(false);
};
return (
<Box>
<Flex justifyContent="space-between" alignItems="center" mb={4}>
<Button colorScheme="blue" onClick={() => handleOpenModal(null)}>
Add Event
</Button>
<Text fontSize="2xl" fontWeight="bold">Daily Agenda</Text>
<Flex alignItems="center">
<Button onClick={onPreviousDay}>Previous</Button>
<DatePicker
selected={selectedDate}
onChange={onDateChange}
dateFormat="yyyy-MM-dd"
customInput={<Button>{format(selectedDate, 'yyyy-MM-dd')}</Button>}
/>
<Button onClick={onNextDay}>Next</Button>
</Flex>
</Flex>
<Text fontSize="lg" fontWeight="bold" textAlign="center" mb={4}>
{format(selectedDate, 'EEEE, MMMM d, yyyy')}
</Text>
<Table variant="striped" colorScheme="teal">
<Thead>
<Tr>
<Th>Time</Th>
<Th>Client Name</Th>
<Th>Address</Th>
<Th>Type</Th>
<Th>Notes</Th>
</Tr>
</Thead>
<Tbody>
{events.map(event => (
<Tr key={event.id}>
<Td>
<Button variant="link" onClick={() => handleOpenModal(event)}>
{`${new Date(event.start).toLocaleTimeString([], { hour: '2-digit', minute: '2-digit' })} – ${new Date(event.end).toLocaleTimeString([], { hour: '2-digit', minute: '2-digit' })}`}
</Button>
</Td>
<Td>
<Button variant="link" onClick={() => navigate(`/usermenu/${userId}/clients/${event.user_client_id}`)}>
{event.client_name}
</Button>
</Td>
<Td>
<Button
variant="link"
onClick={() => window.open(`https://www.google.com/maps/dir/?api=1&destination=${event.address || `${event.client_address}, ${event.client_city}, ${event.client_state}, ${event.client_zip}`}`, '_blank')}
>
{event.address || `${event.client_address}, ${event.client_city}, ${event.client_state}, ${event.client_zip}`}
</Button>
</Td>
<Td>{EVENT_TYPE_MAP[event.type]}</Td>
<Td>{event.notes.slice(0, 20)}{event.notes.length > 20 && '...'}</Td>
</Tr>
))}
</Tbody>
</Table>
<Modal isOpen={isModalOpen} onClose={handleCloseModal}>
<ModalOverlay />
<ModalContent>
<ModalHeader>{selectedEvent ? 'Edit Event' : 'Add Event'}</ModalHeader>
<ModalCloseButton />
<ModalBody>
<EventForm event={selectedEvent} clients={clients} onClose={handleCloseModal} />
</ModalBody>
</ModalContent>
</Modal>
</Box>
);
};
```
With a little bit of coding you can create an easy-to-use agenda for your users.
## Conclusion
Creating a user-friendly application with Big Calendar is not only possible but, with enough tailoring, you can build an excellent professional scheduling application that works much like the ones companies pay a lot of money for their employees to use. Even better, React Big Calendar is free. So give it a try. I hope you enjoyed this article. If you have any questions or comments, let me know. Take care.
| spencerbrown80 | |
1,872,110 | process.nextTick vs setImmediate | process.nextTick and setImmediate are two functions in Node.js that are used to schedule the... | 0 | 2024-05-31T13:53:35 | https://dev.to/zeeshanali0704/processnextick-vs-setimmediate-2aip | process.nextTick and setImmediate are two functions in Node.js that are used to schedule the execution of callback functions. They are similar but have different timing and use cases. Here's a breakdown of their differences along with examples:
**process.nextTick**
**Timing:** `process.nextTick` schedules a callback to be invoked in the same phase of the event loop, right after the current operation completes and before the event loop continues.
**Use Case:** It is useful when you want to execute a callback immediately after the current operation but before any I/O operations or timers.
**setImmediate**
**Timing:** `setImmediate` schedules a callback to be invoked in the next iteration of the event loop (the check phase), after I/O events' callbacks.
**Use Case:** It is useful when you want to defer the execution of a function until the current I/O events have been processed.
**Example**

```js
const fs = require('fs');
console.log('Start');
// Schedule with process.nextTick
process.nextTick(() => {
console.log('Next Tick');
});
// Schedule with setImmediate
setImmediate(() => {
console.log('Immediate');
});
// Read a file (simulates I/O operation)
fs.readFile(__filename, () => {
console.log('File Read');
// Schedule another nextTick inside I/O
process.nextTick(() => {
console.log('Next Tick inside I/O');
});
// Schedule another setImmediate inside I/O
setImmediate(() => {
console.log('Immediate inside I/O');
});
});
console.log('End');
```

**Expected Output**

```
Start
End
Next Tick
Immediate
File Read
Next Tick inside I/O
Immediate inside I/O
```

(The exact position of the first `Immediate` relative to `File Read` can vary, since it depends on when the file read completes; on a typical run the outer `setImmediate` fires in the first check phase, before the read finishes.)
**Explanation**

1. **Initial phase:**
   - `console.log('Start')` is executed.
   - `process.nextTick(() => { console.log('Next Tick'); })` schedules a callback to run right after the current operation.
   - `setImmediate(() => { console.log('Immediate'); })` schedules a callback for the event loop's check phase.
   - `fs.readFile(__filename, () => { ... })` starts an I/O operation.
2. **After the initial phase:**
   - `console.log('End')` is executed.
   - The `process.nextTick` callback runs: `Next Tick`.
3. **First check phase:**
   - The outer `setImmediate` callback typically runs here: `Immediate`.
4. **I/O phase:**
   - The file read completes and `File Read` is logged.
   - Inside the I/O callback, another `process.nextTick` and another `setImmediate` are scheduled.
   - The `process.nextTick` callback runs as soon as the I/O callback returns: `Next Tick inside I/O`.
5. **Next check phase:**
   - The `setImmediate` scheduled inside the I/O callback runs: `Immediate inside I/O`.

Note that the relative order of the outer `Immediate` and `File Read` is not guaranteed; it depends on how quickly the read completes. What is guaranteed is that `process.nextTick` callbacks always run before the event loop continues, and that a `setImmediate` scheduled inside an I/O callback runs in the following check phase.

**Summary**

- `process.nextTick` executes its callback before the event loop continues, ahead of any I/O events.
- `setImmediate` executes its callback in the check phase, after the current I/O callbacks have run.
https://medium.com/@a35851150/interview-qustions-on-event-loop-in-js-901c567a1271 | zeeshanali0704 | |
1,872,109 | Securing Your Magento 2 Store: Top 12 Security Measures for Enhanced Protection | Securing your Magento 2 store is crucial to safeguard sensitive data and maintain customer trust.... | 0 | 2024-05-31T13:51:34 | https://dev.to/charleslyman/securing-your-magento-2-store-top-12-security-measures-for-enhanced-protection-1f05 | magento, security | Securing your Magento 2 store is crucial to safeguard sensitive data and maintain customer trust. This blog outlines 12 essential security steps every store owner should implement and the role of Magento hosting in fortifying your e-commerce platform.
**1. Regular Updates:** Always keep your Magento 2 and its extensions up-to-date to protect against vulnerabilities.
**2. Strong Password Policies:** Implement strong password requirements for backend users to prevent unauthorized access.
**3. Two-Factor Authentication:** Enhance login security with two-factor authentication, adding an extra layer of protection.
**4. Secure Admin Path:** Change the default admin URL to a custom path to avoid easy guesses by attackers.
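As an illustration, in Magento 2 the admin path lives in `app/etc/env.php` under the `backend` key; a custom value might look like the fragment below (the `frontName` shown is just a placeholder; pick something non-obvious for your own store):

```php
<?php
// app/etc/env.php (excerpt)
return [
    'backend' => [
        'frontName' => 'mystore_admin_x7' // custom admin path instead of "admin"
    ],
    // ... rest of the configuration
];
```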
**5. Use Secure Connections:** Ensure that your store uses HTTPS to encrypt the data exchanged between users and the server.
**6. Backend CAPTCHA:** Implement CAPTCHA for admin logins to prevent automated attacks.
**7. File Permission Settings:** Set appropriate file permissions on your server to prevent unauthorized access.
**8. Disable Directory Indexing:** This prevents hackers from easily browsing your site's structure.
**9. Use Security Extensions:** Consider Magento security extensions that enhance protection against common threats.
**10. Backup Regularly:** Maintain regular backups of your store’s data to recover quickly in case of data loss.
**11. Choose Reliable Extensions:** Only use extensions from trusted sources to avoid introducing vulnerabilities.
**12. Magento Hosting Security:** Choose a [Magento hosting](https://devrims.com/magento-hosting/) provider that offers robust security features including firewalls, intrusion detection, and regular security audits.
Opt for managed Magento hosting that not only offers enhanced [Magento 2 security](https://devrims.com/blog/magento-2-security-12-things-you-should-do/) protocols but also ensures optimized performance for Magento stores. Managed hosting can handle much of the technical overhead, allowing you to focus on growing your business while keeping it secure. | charleslyman |
1,872,108 | Difference between Synchronous % Asynchronous | How code is executed in JavaScript falls into two main categories: synchronous and asynchronous.... | 0 | 2024-05-31T13:50:19 | https://dev.to/yomtech/different-between-synchronous-asynchronous-4p4 | webdev, javascript, beginners, programming | How code is executed in JavaScript falls into two main categories: synchronous and asynchronous. Understanding these two is key to writing efficient and responsive web applications.
**Synchronous**
Synchronous code execution happens sequentially, one line at a time. Each line of code waits for the previous line to finish before it starts. Imagine it like waiting in line at a store - you can't move forward until the person in front of you does.
Here's an example of synchronous code:

```js
console.log("I'm first!");
let result = someFunctionThatTakesTime(); // Simulates a long-running task
console.log(result);
```
In this example, the first line "I'm first!" will be logged to the console, then the program will pause and wait for someFunctionThatTakesTime to finish. Once it does, the result will be logged.
**Asynchronous**
Asynchronous code execution, on the other hand, doesn't block the main thread. When an asynchronous operation is encountered, like fetching data from a server, the program continues to execute the next line of code. The asynchronous operation runs in the background, and when it's finished, it notifies the program through a callback function.
Here's an example of asynchronous code:

```js
console.log("I'm first!");
fetchDataFromServer(function(data) {
  console.log(data);
});
console.log("I'm third!"); // This will be executed before data is fetched
```
In this example, the statements run one after another, but fetchDataFromServer is asynchronous. So "I'm first!" and "I'm third!" are logged immediately while the data is fetched in the background. Once the data is retrieved, the callback function is executed and the data is logged.
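Callback-based functions like the hypothetical fetchDataFromServer above can also be wrapped in a Promise, which is how most modern JavaScript expresses asynchrony (fetchDataFromServer here is a stand-in, simulated with setTimeout):

```javascript
// A stand-in for an asynchronous API that reports back via a callback.
function fetchDataFromServer(callback) {
  setTimeout(() => callback("data from server"), 100);
}

// Wrapping the callback API in a Promise...
function fetchData() {
  return new Promise((resolve) => {
    fetchDataFromServer((data) => resolve(data));
  });
}

// ...lets callers use .then() or async/await instead of nesting callbacks.
console.log("I'm first!");
fetchData().then((data) => console.log(data)); // logs "data from server" later
console.log("I'm third!");
```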
**Why Asynchronous is Important**
Asynchronous programming is essential for creating responsive web applications. If long-running tasks constantly block your code, your UI will become unresponsive and users will have a bad experience. Using asynchronous techniques, you can keep your UI responsive while background tasks are completed. | yomtech |
1,872,107 | Exploring Use Cases for Cognitive Services | Examples of applications and companies being transformed by cognitive services, plus some future use cases | 0 | 2024-05-31T13:50:06 | https://dev.to/pubnub-pl/odkrywanie-przypadkow-uzycia-dla-uslug-kognitywnych-10n4 |  | Let's walk through a few examples of applications and companies transformed by cognitive services, along with some future use cases, to see just how much they are changing the technology landscape.
Thanks to the cognitive services of cloud giants like AWS, IBM, and Microsoft Azure, development teams of every size now have access to [cognitive services](https://pubnub.com/resources/ebook/building-apps-with-cognitive-services/) of staggering power. Delivered via APIs, these services make it easy to bring next-generation intelligence into applications.
[**Chat**](https://pubnub.com/learn/glossary/what-is-a-chat-api/) **and Social Interactions**
--------------------------------------------------------------------------------------------------
In 2015, the monthly active users of chat apps surpassed those of social networks, and the gap keeps widening. Indeed, messaging has become an essential feature of social networks themselves. With this rapid growth, messaging apps have evolved from simple tools for sending and receiving short text messages into innovative, full-featured experiences with surprising and delightful features. And the driving force behind this innovation is cognitive APIs.
### **Chatbots and Cognitive Computing**
Chatbots are one of the earliest forms of artificial intelligence algorithms. While they are unlikely to pass the Turing test anytime soon, they are a natural evolution of voice-enabled applications. Where you once called a support line and pressed 1 for billing information, you can now speak in full sentences to a system that can recognize your intent.
Whether you realize it or not, chatbot adoption has exploded as companies look to reduce wait times, improve the customer experience, and minimize the cost of human phone operators. Today they are mostly used for simple tasks: understanding basic requests and responding based on predefined rules, answering questions like "Where is my order?" or "Chatbot, turn on the mood lighting."
But APIs like [Watson Assistant](https://www.ibm.com/cloud/watson-assistant/) or [Amazon Lex](https://aws.amazon.com/lex/) make it easy to build services that can apply logic to patterns observed in these natural-language requests. These services might notice, for example, a sudden spike in calls from an airport experiencing takeoff delays and resequence the options to prioritize flight rescheduling. Or they might notice that calls from a certain country or region tend to be conducted in a different language and change the defaults accordingly. They can even identify grammatical patterns that flag customers for immediate escalation to a supervisor.
Intelligent conversational interfaces using speech recognition, text-to-speech, facial recognition, and machine learning models can deliver highly engaging experiences and lifelike conversations for a variety of purposes. Better still, they learn from those experiences.
Chatbots will change how we bank, shop, and learn: making recommendations, understanding abstract concepts, and getting to know individuals based on past interactions. Eventually they will become so good that you won't even know whether you're talking to a human.
#### **Przykład kodu: Chatbot automatyki domowej**
Korzystając z Watson i PubNub ChatEngine, możesz łatwo stworzyć [chatbota ze sztuczną inteligencją](https://www.pubnub.com/docs/chat/samples?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=pl), który kontroluje twój inteligentny dom.

This tutorial shows you how to build a chatbot that accepts text commands, parses them, and takes action based on them. For example, a user types “turn on the lights in the living room,” and the bot will trigger the lights.
```js
{
"homeauto_intents":
[
{
"intent":"turnOFF",
"examples":
[
{"text":"Put off"},
{"text":"Switch off"},
{"text":"Turn off"}
],
"description":"Turn off intents"
},
{
"intent":"turnON",
"examples":
[
{"text":"Put on"},
{"text":"Switch on"},
{"text":"Turn on"}
],
"description":"Turn on intents"
}
]
}
```
### **Natural Language Processing**
Another hugely impactful area is data science and natural language processing (NLP), the umbrella term for AI solutions that can fruitfully process large amounts of natural language data. NLP can not just gauge words and grammar from a semantic perspective but can divine sentiment and emotion, unearthing how users feel about a topic or subject through message-by-message analysis.
NLP is a huge benefit for brands, public figures, and organizations that need to understand and respond to user opinions at a time when reputations can be made or unmade in a matter of minutes. Imagine a brand launches a new commercial for a product. Using the right cognitive services, it can tap into a social media stream on a specific hashtag or the product name and have its NLP API analyze all relevant messages and provide feedback on how the public responds to the product.
Below is an example of an app designed to analyze and gauge how people felt about US politicians on Twitter. It monitors specific keywords and phrases and can then plot the emotion of users in defined geographical regions.

For example, if a user submits the text “I am happy”…
```js
{
"session_id": 1,
"text": "I am happy!"
}
```
Watson analyzes the text and returns the following:
```js
{
"session_id": 1,
"text": "I am happy!",
"session_sentiment": {
"overall": 0.879998,
"positive": {
"count": 1,
"avg": 0.879998
},
"negative": {
"count": 0,
"avg": 0
},
"neutral": {
"count": 2,
"avg": null
}
},
"score": 0.88006828
}
```
Brands already spend large amounts on market sentiment analysis. As these systems grow more intelligent, robust, and automated, they’ll be able to understand the public far better at a lower cost.
**eCommerce**
-------------
Though online shopping has completely changed how we buy goods, e-tail lacks one key component of a brick-and-mortar store: helpful employees. At the scale online stores operate, it isn’t economically viable to have actual people staff live chat.
As a result, many online stores are turning to intelligent shopping assistant bots to optimize the experience, assist shoppers with their questions, make recommendations, and even check out.
[Nordstrom has dominated previous holiday seasons](https://wersm.com/nordstrom-ruled-holidays-with-its-amazing-chatbot/) with their Messenger chatbot, which went beyond simple predefined questions and answers and used cognitive services to truly understand what the customer was looking for and assist as needed. It offered gift recommendations and could even help fulfill the order.

Chatbots also save us from the dreaded customer support phone call, waiting an hour for a representative to deal with a simple problem. Amazon has deployed chatbots that can resolve minor issues that most customers have when they need help with their orders.
Now that we’ve looked at a couple of examples of intelligence in the real world today, let’s peer into the future and see how cognitive services will change our world.
**Smart Cities**
----------------
Cities of the future will rely on various integrated intelligent services to make them safer, more efficient, and more environmentally conscious. Image recognition, computer vision, and vision APIs will play a critical role in this transformation, processing and taking action on images within the urban space.
**Agriculture**
---------------
Global populations continue to grow, and feeding those billions of people will be a considerable challenge in the years to come. Cognitive services will play a critical role in managing fields and factories, allowing us to make intelligent decisions and control resources with a precision we’ve never had before.
Smart farms and IoT will incorporate as many valuable data points as possible to make intelligent agricultural decisions, even ones that seem counterintuitive. For example, by aggregating real-time weather data, remote sensor data, and historical performance, cognitive services can perfect the individual irrigation plan and update it for every day’s unique circumstances.
**Data Security**
-----------------
As we grow more connected and our digital lives overshadow our physical ones, data privacy and security transform from something we’re vaguely aware of to a disconcerting, ever-present personal threat.
Regulations and rules—HIPAA, GDPR, SOC II—are one way to ensure that businesses and organizations have the proper guardrails in place. Implementing these complex regulations in detail can be a lot to handle, which is where machine learning comes into play.
Cognitive services can be trained to understand and make sense of rules and regulations, then suggest ways to achieve compliance. Cognitive services enable the delivery of valuable insights into data security, from relevant rules and laws to content moderation.
**Healthcare**
--------------
Innovation typically moves slower in the healthcare industry than in others for several reasons, including tight margins, heavy regulation, and siloed research and development. Cognitive services offer the opportunity to lift barriers to innovation and improve the delivery system from organizations down to patients.
Decision-making in healthcare typically happens on a siloed, patient-by-patient basis. Cognitive services, by contrast, analyze and act on a comprehensive view of factors that influence health: socioeconomic status, environment, access to healthcare, and so on. Cognitive services can recommend better, more targeted patient care to the physician, including health and wellness programs.
Cognitive services can drive the integration and connection of existing systems within healthcare organizations and unearth essential insights. Suddenly able to aggregate data and connect stakeholder needs, organizations can deliver better care while operating more efficiently.
**Intelligence Now**
--------------------
This article has described only a tiny sample of how cognitive services will change the way we think about business and the role that applications can play. In the past, software followed instructions. With cognitive services, solutions can adapt, evolve, and accomplish things that might have seemed impossible just a few years ago. We can’t see all the implications, but from what we know, there is little doubt the impact on business will be profound, positive—and here before you know it.
How can PubNub help you?
========================
This article was originally published on [PubNub.com](https://www.pubnub.com/blog/the-many-uses-of-cognitive-services/?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=pl)
Our platform helps developers build, deliver, and manage real-time interactivity for web apps, mobile apps, and IoT devices.
The foundation of our platform is the industry’s largest and most scalable real-time messaging network. With over 15 points-of-presence worldwide supporting 800 million monthly active users and 99.999% reliability, you’ll never have to worry about outages, concurrency limits, or any latency issues caused by traffic spikes.
Experience PubNub
-----------------
Check out the [Live Tour](https://www.pubnub.com/tour/introduction/?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=pl) to understand the essential concepts behind every PubNub-powered app in less than 5 minutes.
Get Setup
---------
Sign up for a PubNub [account](https://admin.pubnub.com/signup/?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=pl) for immediate, free access to your PubNub keys.
Get Started
-----------
The PubNub [docs](https://www.pubnub.com/docs?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=pl) will get you up and running, regardless of your use case or [SDK](https://www.pubnub.com/docs?utm_source=devto&utm_medium=syndication&utm_campaign=off_domain&utm_content=pl). | pubnubdevrel | |
1,872,106 | Exploring Use Cases for Cognitive Services | Examples of apps and businesses that are being transformed by cognitive services, and some future use cases as well | 0 | 2024-05-31T13:50:06 | https://dev.to/pubnub/exploring-use-cases-for-cognitive-services-479j | Let’s walk through some examples of apps and companies transformed by cognitive services and some future use cases to see how much they’re changing the technology landscape.
Thanks to cloud giants like AWS, IBM, and Microsoft Azure's cognitive services, developer teams of all sizes now have access to [cognitive services](https://pubnub.com/resources/ebook/building-apps-with-cognitive-services/) of staggering power. Delivered through APIs, these services make it easy to inject next-generation intelligence into applications.
[**Chat**](https://pubnub.com/learn/glossary/what-is-a-chat-api/) **and Social Interaction**
--------------------------------------------------------------------------------------------
In 2015, monthly active users on chat apps surpassed those on social networks, and the chasm continues to widen. Indeed, messaging has become an essential feature of social networks themselves. And with this rapid growth, messaging apps have evolved from simple tools for sending and receiving short, text-based messages to innovative, full-featured experiences boasting surprising and delightful features. And driving that innovation are cognitive APIs.
### **Chatbots and Cognitive Computing**
Chatbots are one of the earliest forms of AI algorithms. While unlikely to pass the Turing test soon, they represent the natural evolution of voice-enabled applications. Where once you would call a support line and press 1 for Accounts Payable, now you can speak in complete sentences to a system that can discern your intent.
Whether you’re aware or not, chatbot adoption has exploded as companies seek to reduce wait times, improve customer experience and minimize the cost of human telephone operators. Right now, they’re mainly used to handle simple tasks: understanding basic requests and responding based on predefined rules, answering questions like, “Where is my order?” or “Chatbot, turn on mood lights.”
However, APIs like [Watson Assistant](https://www.ibm.com/cloud/watson-assistant/) or [Amazon Lex](https://aws.amazon.com/lex/) make it easy to build services that can apply logic to observed patterns in those natural-language requests. These services may, for instance, observe a sudden rush of calls from an airport suffering take-off delays and change the sequence of options to prioritize rescheduling flights. Or they may see that calls from a particular country or region tend to be conducted in a different language and change the default accordingly. They may even identify grammatical patterns indicating a customer should be forwarded straight to a supervisor.
Intelligent conversational interfaces using speech recognition, text-to-speech, facial recognition, and machine learning models can provide highly engaging experiences and life-like conversations for various purposes. And even better, they’ll learn from those experiences.
Chatbots will change how we bank, shop, and learn: making recommendations, understanding abstract concepts, and getting to know individuals based on prior engagements. Eventually, they’ll get so good you won’t even know if you’re talking to a human.
#### **Code Example: Home Automation Chatbot**
Using Watson and PubNub ChatEngine, you can easily [spin up a chatbot with artificial intelligence](https://www.pubnub.com/docs/chat/samples?) that controls your smart home.

This tutorial shows you how to build a chatbot that accepts text commands, parses them, and takes action based on them. For example, a user types “turn on the lights in the living room,” and the bot will trigger the lights.
```js
{
"homeauto_intents":
[
{
"intent":"turnOFF",
"examples":
[
{"text":"Put off"},
{"text":"Switch off"},
{"text":"Turn off"}
],
"description":"Turn off intents"
},
{
"intent":"turnON",
"examples":
[
{"text":"Put on"},
{"text":"Switch on"},
{"text":"Turn on"}
],
"description":"Turn on intents"
}
]
}
```
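To make the idea behind these intent definitions concrete, here is a rough sketch of how an utterance could be matched against the example phrases. This is only an illustration of the concept, not the actual Watson Assistant API, which classifies with a trained NLU model rather than substring checks:

```javascript
// Naive intent matcher: scores an utterance against each intent's
// example phrases and returns the best-scoring intent (or null).
// Illustrative only — real assistant services use trained models.
const intents = [
  { intent: 'turnOFF', examples: ['Put off', 'Switch off', 'Turn off'] },
  { intent: 'turnON', examples: ['Put on', 'Switch on', 'Turn on'] },
];

function matchIntent(utterance, intents) {
  const text = utterance.toLowerCase();
  let best = { intent: null, score: 0 };
  for (const { intent, examples } of intents) {
    // Score = how many example phrases appear in the utterance
    const score = examples.filter((ex) => text.includes(ex.toLowerCase())).length;
    if (score > best.score) best = { intent, score };
  }
  return best.intent;
}

console.log(matchIntent('turn on the lights in the living room', intents)); // "turnON"
```

A trained model generalizes far beyond the listed examples, but the shape of the output — an utterance resolved to one named intent — is the same.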
### **Natural Language Processing**
Another hugely impactful area is data science and natural language processing (NLP), the umbrella term for AI solutions that can fruitfully process large amounts of natural language data. NLP can not just gauge words and grammar from a semantic perspective but can divine sentiment and emotion, unearthing how users feel about a topic or subject through message-by-message analysis.
NLP is a huge benefit for brands, public figures, and organizations that need to understand and respond to user opinions at a time when reputations can be made or unmade in a matter of minutes. Imagine a brand launches a new commercial for a product. Using the right cognitive services, it can tap into a social media stream on a specific hashtag or the product name and have its NLP API analyze all relevant messages and provide feedback on how the public responds to the product.
Below is an example of an app designed to analyze and gauge how people felt about US politicians on Twitter. It monitors specific keywords and phrases and can then plot the emotion of users in defined geographical regions.

For example, if a user submits the text “I am happy”…
```js
{
"session_id": 1,
"text": "I am happy!"
}
```
Watson analyses the text and returns the following:
```js
{
"session_id": 1,
"text": "I am happy!",
"session_sentiment": {
"overall": 0.879998,
"positive": {
"count": 1,
"avg": 0.879998
},
"negative": {
"count": 0,
"avg": 0
},
"neutral": {
"count": 2,
"avg": null
}
},
"score": 0.88006828
}
```
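To show how a `session_sentiment` block like the one above could be maintained, here is a hedged sketch that keeps running per-session aggregates in the same shape. The ±0.1 neutral band and the overall-score formula are assumptions made for illustration, not Watson's actual scoring rules:

```javascript
// Sketch: fold each message's sentiment score into per-session
// aggregates shaped like the payload above. Scores are assumed to be
// in [-1, 1]; values inside the neutral band count as neutral.
function updateSessionSentiment(session, score, neutralBand = 0.1) {
  const bucket = score > neutralBand ? 'positive'
               : score < -neutralBand ? 'negative'
               : 'neutral';
  const b = session[bucket];
  b.count += 1;
  b.avg = ((b.avg || 0) * (b.count - 1) + score) / b.count; // running mean
  // Overall = mean of all non-neutral scores seen so far (assumption)
  const pos = session.positive, neg = session.negative;
  const scored = pos.count + neg.count;
  session.overall = scored
    ? ((pos.avg || 0) * pos.count + (neg.avg || 0) * neg.count) / scored
    : 0;
  return session;
}

const session = {
  overall: 0,
  positive: { count: 0, avg: 0 },
  negative: { count: 0, avg: 0 },
  neutral: { count: 0, avg: 0 },
};
updateSessionSentiment(session, 0.879998);
console.log(session.positive.count, session.overall); // 1 0.879998
```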
Brands already spend large amounts on market sentiment analysis. As these systems grow more intelligent, robust, and automated, they’ll be able to understand the public far better at a lower cost.
**eCommerce**
-------------
Though online shopping has completely changed how we buy goods, e-tail lacks one key component of a brick-and-mortar store: helpful employees. At the scale online stores operate, it isn’t economically viable to have actual people staff live chat.
As a result, many online stores are turning to intelligent shopping assistant bots to optimize the experience, assist shoppers with their questions, make recommendations and even check out.
[Nordstrom has dominated previous holiday seasons](https://wersm.com/nordstrom-ruled-holidays-with-its-amazing-chatbot/) with their Messenger chatbot, which went beyond simple predefined questions and answers and used cognitive services to truly understand what the customer was looking for and assist as needed. It offered gift recommendations and could even help fulfill the order.

Chatbots also save us from the dreaded customer support phone call, waiting an hour for a representative to deal with a simple problem. Amazon has deployed chatbots that can resolve minor issues that most customers have when they need help with their orders.
Now that we’ve looked at a couple of examples of intelligence in the real world today, let’s peer into the future and see how cognitive services will change our world.
**Smart Cities**
----------------
Cities of the future will rely on various integrated intelligent services to make them safer, more efficient, and more environmentally conscious. Image recognition, computer vision, and vision APIs will play a critical role in this transformation, processing and taking action on images within the urban space.
**Agriculture**
---------------
Global populations continue to grow, and feeding those billions of people will be a considerable challenge in the years to come. Cognitive services will play a critical role in managing fields and factories, allowing us to make intelligent decisions and control resources with a precision we’ve never had before.
Smart farms and IoT will incorporate as many valuable data points as possible to create intelligent agricultural decisions, even ones that seem counterintuitive. For example, by aggregating real-time weather data, remote sensor data, and historical performance, cognitive services can perfect the individual irrigation plan and update it for every day’s unique circumstances.
**Data Security**
-----------------
As we grow more connected and our digital lives overshadow our physical ones, data privacy and security transform from something we’re vaguely aware of to a disconcerting, ever-present personal threat.
Regulations and rules—HIPAA, GDPR, SOC II—are one way to ensure that businesses and organizations have the proper guardrails in place. Implementing these complex regulations in detail can be a lot to handle, which is where machine learning comes into play.
Cognitive services can be trained to understand and make sense of rules and regulations, then suggest ways to achieve compliance. Cognitive services enable the delivery of valuable insights into data security, from relevant rules and laws to content moderation.
**Healthcare**
--------------
Innovation typically moves slower in the healthcare industry than others for several reasons, including tight margins, heavy regulation, and siloed research and development. Cognitive services offer the opportunity to lift barriers of innovation and improve the delivery system from organizations down to patients.
Decision-making in healthcare typically happens on a siloed patient-by-patient basis. Cognitive services, by contrast, analyze and act on a comprehensive view of factors that influence health: socioeconomic status, environment, access to healthcare, and so on. Cognitive services can recommend better, more targeted patient care to the physician, including health and wellness programs.
Cognitive services can drive the integration and connection of existing systems within healthcare organizations and unearth essential insights. Suddenly able to aggregate data and connect stakeholder needs, organizations can deliver better care while operating more efficiently.
**Intelligence Now**
--------------------
This article has described only a tiny sample of how cognitive services will change the way we think about business and the role that applications can play. In the past, software followed instructions. With cognitive services, solutions can adapt, evolve, and accomplish things that might have seemed impossible just a few years ago. We can’t see all the implications, but from what we know, there is little doubt the impact on business will be profound, positive—and here before you know it.
How can PubNub help you?
========================
This article was originally published on [PubNub.com](https://www.pubnub.com/blog/the-many-uses-of-cognitive-services/?)
Our platform helps developers build, deliver, and manage real-time interactivity for web apps, mobile apps, and IoT devices.
The foundation of our platform is the industry's largest and most scalable real-time edge messaging network. With over 15 points-of-presence worldwide supporting 800 million monthly active users, and 99.999% reliability, you'll never have to worry about outages, concurrency limits, or any latency issues caused by traffic spikes.
Experience PubNub
-----------------
Check out the [Live Tour](https://www.pubnub.com/tour/introduction/?) to understand the essential concepts behind every PubNub-powered app in less than 5 minutes.
Get Setup
---------
Sign up for a [PubNub account](https://admin.pubnub.com/signup/?) for immediate access to PubNub keys for free
Get Started
-----------
The [PubNub docs](https://www.pubnub.com/docs?) will get you up and running, regardless of your use case or [SDK](https://www.pubnub.com/docs?) | pubnubdevrel | |
1,872,105 | Node.js Contact Form: How to Create One, Validate Data, and Send Emails | In this step-by-step tutorial, I’ll show you how to create a Node.js contact form, give it a personal... | 0 | 2024-05-31T13:50:02 | https://mailtrap.io/blog/node-js-contact-form/ | In this step-by-step tutorial, I’ll show you how to create a Node.js contact form, give it a personal touch, retrieve and validate data from it, and then send emails through the form via SMTP or API.
_Note: you’ll need Node.js 6+ or any version released since May 2018 installed on your machine for the provided code snippets to work._
## How to create a Node.js contact form
As you’re reading this article, I’m assuming you know how to install Node.js and create a new project, so allow me to keep it brief and get straight into setting up the project.
However, to freshen up your knowledge, you can check out this article on [installing Node.js](https://blog.apify.com/how-to-install-nodejs/).
### Setting up the project
First things first, let’s install some dependencies for our backend by opening the terminal and entering the following commands:
- **`mkdir contact-form-test && cd contact-form-test`** – By running this command, we create a folder for our project, ensuring our project files will be within it instead of the current directory.
- **`npm init -y`** – This will create a package.json file that manages project dependencies and configurations.
- **`npm i express nodemailer`** – We need to install the Express.js library and the Nodemailer module, as we will need them for setting up our server and sending emails with SMTP, respectively.
- **`npm i -D nodemon`** – This is a dev dependency that automates the process of restarting our server whenever we change our code, allowing us to see the changes we made without having to manually restart the server.
Once you install all the dependencies, your **package.json** file should look something like this:
```
{
"name": "contact-form-test", // Project name
"version": "1.0.0", // Project version
"description": "", // Description of the project
"main": "server.js", // Entry point file of the project
"scripts": {
"dev": "nodemon --watch public --watch server.js --ext js,html,css", // Script to run the server with nodemon for development
"start": "node server.js" // Script to start the server normally
},
"keywords": [], // Keywords related to the project
"author": "", // Author of the project
"license": "ISC", // License type
"dependencies": {
"express": "^4.19.2", // Express web server framework
"nodemailer": "^6.9.13" // Nodemailer for sending emails via SMTP
},
"devDependencies": {
"nodemon": "^3.1.0" // Nodemon for automatically restarting the server on code changes
}
}
```
**Bonus tips**:
- **`npm i dotenv`** – Although optional, this command installs the dotenv package, which loads environment variables from a .env file where you can safely store authentication credentials such as an API key. All you have to do is run the command, create an .env file in the root of your project directory, and paste your desired creds in there.
- Starting from [v20.6.0, Node.js](https://nodejs.org/en/blog/release/v20.6.0) has built-in support for **.env** files for configuring environment variables, so the dotenv package is no longer strictly necessary. A lot of projects still depend on dotenv, though, so it remains the de facto standard.
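Once dotenv (or Node's built-in .env support) has populated `process.env`, it helps to centralize configuration reads in one place. A small sketch — the `SMTP_USER`/`SMTP_PASS` names below are illustrative placeholders for whatever keys you put in your own .env file:

```javascript
// Sketch: read configuration from environment variables with sane
// fallbacks, so credentials never get hard-coded. The SMTP_* names
// are illustrative — match them to the keys in your own .env file.
function loadConfig(env = process.env) {
  return {
    port: Number(env.PORT) || 3000, // default port when PORT is unset
    smtpUser: env.SMTP_USER,
    smtpPass: env.SMTP_PASS,
  };
}

console.log(loadConfig({ PORT: '8080' }).port); // 8080
```

Passing the environment object as a parameter also makes the config logic trivially testable.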
### Configuring the server
Next, in our project folder, let’s create a new **.js** file called **server.js**, which the start script in the **package.json** file points to.
Then, simply paste the following code snippet into the **server.js** file:
```
const express = require('express');
const app = express();
const path = require('path');
const PORT = process.env.PORT || 3000;
// Middleware
app.use(express.static('public'));
app.use(express.json());
app.get('/', (req, res) => {
res.sendFile(path.join(__dirname, 'public', 'contactform.html'));
});
app.post('/send-email', (req, res) => {
console.log(req.body);
res.send('Data received');
});
app.listen(PORT, () => {
console.log(`Server running on port ${PORT}`);
});
```
**Note**: Later in the article, we’ll use the server.js file to add Nodemailer as a transport via [SMTP](https://mailtrap.io/blog/node-js-contact-form/#Send-email-from-Nodejs-contact-form-using-SMTP) and an [API logic](https://mailtrap.io/blog/node-js-contact-form/#Send-email-from-Nodejs-contact-form-using-API) to send emails through our contact form.
### Creating the contact form
Now, let’s create a new folder called **public** for the static files (**style.css**, **contactform.html**, and **app.js**) we’re going to be using for this contact form.
In the **contactform.html** file, enter the following code:
```
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8"> <!-- Specifies the character encoding for the HTML document -->
<link rel="stylesheet" href="/style.css"> <!-- Link to external CSS file for styling -->
<link rel="preconnect" href="https://fonts.gstatic.com"> <!-- Preconnect to load fonts faster -->
<link href="https://fonts.googleapis.com/css2?family=Poppins&display=swap" rel="stylesheet"> <!-- Google fonts link for 'Poppins' font -->
<meta name="viewport" content="width=device-width, initial-scale=1.0"> <!-- Responsive design meta tag -->
<title>Contact Form</title> <!-- Title of the document shown in the browser tab -->
</head>
<body>
<div class="form-container"> <!-- Container for the form to style it specifically -->
<form class="contact-form"> <!-- Form element where user inputs will be submitted -->
<h2>CONTACT</h2> <!-- Heading of the form -->
<input type="text" id="name" placeholder="Full name"><br> <!-- Input field for name -->
<input type="email" id="email" placeholder="Email"><br> <!-- Input field for email, validates email format -->
<input type="text" id="subject" placeholder="Subject"><br> <!-- Input field for subject -->
<textarea id="message" placeholder="Message" cols="30" rows="10"></textarea><br> <!-- Textarea for longer message input -->
<input type="submit" class="submit" value="Send Message"> <!-- Submit button to send the form data -->
</form>
</div>
<script src="/app.js"></script> <!-- Link to external JavaScript file for scripting -->
</body>
</html>
```
I’ve added annotations in this code snippet as well to help you navigate through it, but feel free to delete them for a cleaner-looking code. 🙂
### Styling the contact form
How about we tackle the frontend for a bit and add a personal touch to our contact form?
In the **style.css** file, enter the following code which will make our contact form prettier:
```
/* Global styles for all elements to ensure consistency */
* {
margin: 0; /* Remove default margin */
padding: 0; /* Remove default padding */
box-sizing: border-box; /* Include padding and border in the element's total width and height */
font-family: 'Poppins', sans-serif; /* Set a consistent font family throughout the app */
}
/* Styling for the html and body elements */
html, body {
background: #c0b7b7; /* Set the background color for the entire page */
}
/* Container for the form providing relative positioning context */
.form-container {
position: relative; /* Positioning context for absolute positioning inside */
left: 20%; /* Position the container 20% from the left side of the viewport */
width: 60%; /* Set the width of the container to 60% of the viewport width */
height: 100vh; /* Set the height to be 100% of the viewport height */
background-color: white; /* Set the background color of the form container */
}
/* Styling for the contact form itself */
.contact-form {
position: absolute; /* Position the form absolutely within its parent container */
top: 10%; /* Position the form 10% from the top of its container */
left: 10%; /* Position the form 10% from the left of its container */
width: 80%; /* The form width is 80% of its container */
min-height: 600px; /* Minimum height for the form */
}
/* Styling for input fields and textarea within the form */
input, textarea {
width: 100%; /* Make input and textarea elements take up 100% of their parent's width */
margin-top: 2rem; /* Add top margin to space out the elements */
border: none; /* Remove default borders */
border-bottom: 1px solid black; /* Add a bottom border for a minimalistic look */
padding: 10px; /* Add padding for better readability */
}
/* Styling for the submit button */
.submit {
border: 1px solid black; /* Add a solid border around the submit button */
padding: 1rem; /* Add padding inside the button for better clickability */
text-align: center; /* Center the text inside the button */
background-color: white; /* Set the background color of the button */
cursor: pointer; /* Change the cursor to a pointer to indicate it's clickable */
}
/* Styling for the submit button on hover */
.submit:hover {
opacity: 0.6; /* Change the opacity when hovered to give a visual feedback */
}
```
To see how your contact form looks, you can save the file and enter the following command in your terminal:
```
npm run dev
```
Then, you should see a message saying that your contact form is being served on the custom port you defined in your **.env** file or on the default port 3000.
Finally, paste the following link in your browser’s URL bar: [http://localhost:3000/](http://localhost:3000/) and you should see your contact form in its full glory.
## How to collect data from a Node.js contact form
To collect data from our Node.js contact form, we will add functionality for handling form submissions using JavaScript, which will capture form data and send it to the server without reloading the page.
For this, we’ll use an **AJAX request**, which allows us to avoid a full-page reload.
I’ve made this easy for you, so all you have to do is navigate to your **app.js** file and enter the following code:
```
const contactForm = document.querySelector('.contact-form');
const name = document.getElementById('name');
const email = document.getElementById('email');
const subject = document.getElementById('subject');
const message = document.getElementById('message');
contactForm.addEventListener('submit', async (e) => {
e.preventDefault(); // Prevent the default form submission
const formData = {
name: name.value,
email: email.value,
subject: subject.value,
message: message.value
};
try {
const response = await fetch('/send-email', {
method: 'POST',
headers: {
'Content-Type': 'application/json'
},
body: JSON.stringify(formData)
});
// Wait for JSON response to be parsed
const result = await response.json();
if (!response.ok) {
// If the response is not OK, handle it by showing an alert
alert(`Failed to send message: ${result.message}`);
return; // Exit the function early if there's an error
}
// Check application-specific status from JSON when response is OK
if (result.status === 'success') {
alert('Email sent');
// Reset form fields after successful submission
name.value = '';
email.value = '';
subject.value = '';
message.value = '';
} else {
// Handle application-level failure not caught by response.ok
alert('Operation failed: ' + result.message);
}
} catch (error) {
// Handle any exceptions that occur during fetch
console.error('Error:', error);
alert('Network error or cannot connect to server');
}
});
```
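As an optional extra, you could run a minimal client-side pre-check before calling `fetch`, so obviously incomplete submissions never reach the server. This is a hedged sketch: the helper name `missingFields` is mine, and the field names simply mirror the form above.

```javascript
// Returns the names of required fields that are empty or whitespace-only.
// It is a pure function, so it can be unit-tested independently of the DOM.
function missingFields(formData) {
  const required = ['name', 'email', 'subject', 'message'];
  return required.filter(
    (key) => typeof formData[key] !== 'string' || formData[key].trim() === ''
  );
}

// Example: warn the user instead of hitting the server with an empty field.
const gaps = missingFields({ name: 'Ada', email: 'ada@example.com', subject: '', message: 'Hi' });
if (gaps.length > 0) {
  console.log(`Please fill in: ${gaps.join(', ')}`); // → "Please fill in: subject"
}
```

You would call `missingFields(formData)` right after building `formData`, then show an alert and `return` early when the array is non-empty.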
**Note**: I used the Fetch API here, which is a more modern alternative to `XMLHttpRequest` and pairs naturally with async/await syntax for better error handling.
## How to validate data from a contact form
To validate data from a contact form in Node.js, you can either use:
- **[Deep email validator](https://www.npmjs.com/package/deep-email-validator)** – A comprehensive, MIT-licensed package that makes sure an email is valid by running it through several different checks: regex format, common typos, disposable email blacklists, DNS records, and SMTP server responses.
- **Regular expressions** – Patterns used to match character combinations in strings, ensuring that inputs conform to a predefined format. They provide a more basic level of validation compared to packages like Deep email validator.
### Deep email validator module
To install the Deep email validator, I typically use the npm command or [Yarn](https://yarnpkg.com/en/):
```
npm i deep-email-validator
# or
yarn add deep-email-validator
```
Then, after importing it with `const { validate } = require('deep-email-validator');`, simply add the following code in your **/send-email** controller:
```
// Endpoint to handle form submission and send email
app.post('/send-email', async (req, res) => {
const { name, email, subject, message } = req.body;
if (!name || !email || !subject || !message) {
return res.status(400).json({ status: 'error', message: 'Missing required fields!' })
}
// Validate the email
const validationResult = await validate(email);
if (!validationResult.valid) {
return res.status(400).json({
status: 'error',
message: 'Email is not valid. Please try again!',
reason: validationResult.reason
});
}
// Email sending logic
// Placeholder response for a successful email submission
res.status(200).json({
status: 'success',
message: 'Email successfully sent'
});
});
```
For more information on Deep email validator, consult the official [GitHub page](https://github.com/mfbx9da4/deep-email-validator).
### Regular expressions for email validation
If you’d rather run basic regex checks against email addresses instead of using a dependency package, you can paste the following code into your project file (e.g., **server.js**):
```
const emailRegex =
/^[a-zA-Z0-9.!#$%&'*+/=?^_`{|}~-]+@[a-zA-Z0-9](?:[a-zA-Z0-9-]{0,61}[a-zA-Z0-9])?(?:\.[a-zA-Z0-9](?:[a-zA-Z0-9-]{0,61}[a-zA-Z0-9])?)*$/
function isEmailValid(email) {
// Check if the email is defined and not too long
if (!email || email.length > 254) return false;
// Use a single regex check for the standard email parts
if (!emailRegex.test(email)) return false;
// Split once and perform length checks on the parts
const parts = email.split("@");
if (parts[0].length > 64) return false;
// Perform length checks on domain parts
const domainParts = parts[1].split(".");
if (domainParts.some(part => part.length > 63)) return false;
// If all checks pass, the email is valid
return true;
}
```
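As a quick sanity check, here is how the helper behaves on a few inputs. The snippet repeats the regex and function from above so it can run standalone, e.g. with `node`:

```javascript
const emailRegex =
  /^[a-zA-Z0-9.!#$%&'*+/=?^_`{|}~-]+@[a-zA-Z0-9](?:[a-zA-Z0-9-]{0,61}[a-zA-Z0-9])?(?:\.[a-zA-Z0-9](?:[a-zA-Z0-9-]{0,61}[a-zA-Z0-9])?)*$/;

function isEmailValid(email) {
  if (!email || email.length > 254) return false;      // undefined or too long
  if (!emailRegex.test(email)) return false;           // basic shape check
  const parts = email.split('@');
  if (parts[0].length > 64) return false;              // local-part limit
  const domainParts = parts[1].split('.');
  if (domainParts.some((part) => part.length > 63)) return false; // label limit
  return true;
}

console.log(isEmailValid('user@example.com'));        // → true
console.log(isEmailValid('not-an-email'));            // → false (fails the regex)
console.log(isEmailValid('a'.repeat(65) + '@x.com')); // → false (local part too long)
```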
Then in your **/send-email** route add the following code to perform email validation:
```
// Endpoint to handle form submission and send email
app.post('/send-email', async (req, res) => {
const { name, email, subject, message } = req.body;
if (!name || !email || !subject || !message) {
return res.status(400).json({ status: 'error', message: 'Missing required fields!' });
}
// Validate the email
if (!isEmailValid(email)) {
return res.status(400).json({
status: 'error',
message: 'Email is not valid. Please try again!'
});
}
// Email sending logic
// Placeholder response for a successful email submission
res.status(200).json({
status: 'success',
message: 'Email successfully sent'
});
});
```
For more details on validating emails in Node.js, feel free to consult our [dedicated article](https://mailtrap.io/blog/nodejs-email-validation/), where you can also learn how to verify emails by sending an activation link/code the end users activate from their inbox.
## How to add reCAPTCHA
Now, let’s add a reCAPTCHA to our Node.js contact form to make sure no robots fill it out. 🤖
First, you need to get your ‘**SITE KEY**’ and ‘**SECRET KEY**’ from the [Google reCAPTCHA admin console](https://www.google.com/recaptcha/admin/create).
In this example, we will be using Challenge (v2), so be sure to select it during creation and leave the default “I’m not a robot” checkbox.
Also, since we are testing from a local computer we need to add ‘**localhost**’ to allowed domains during creation.
**Note**: if you are in production, you will need to use your own domains. Once your keys are created, copy your **SITE KEY** and **SECRET KEY**; you'll need both below.
Next, in our **contactform.html**, add the following code snippet just before the closing of the head section:
```
<script src="https://www.google.com/recaptcha/api.js" async defer></script> <!-- Load google recaptcha API -->
```
Inside our form, we can add the reCAPTCHA element after the textarea like this:
```
<div class="g-recaptcha" data-sitekey="YOUR_SITE_KEY"></div><br> <!-- Google recaptcha element -->
```
**Important**: don’t forget to replace **YOUR_SITE_KEY** with your actual **SITE KEY**.
Then, we need to update our **app.js** inside the public folder to pass the reCAPTCHA token in our `formData` so the backend can verify it. To do this, right after `e.preventDefault();`, paste the following code:
```
// Get the reCAPTCHA token
const recaptchaToken = grecaptcha.getResponse();
if (!recaptchaToken) {
alert('Please complete the reCAPTCHA');
return; // Exit the function early if reCAPTCHA is not completed
}
const formData = {
name: name.value,
email: email.value,
subject: subject.value,
message: message.value,
'g-recaptcha-response': recaptchaToken
};
```
To verify the reCAPTCHA on the server side, we need to make a request. For this, we can use the built-in Node.js `https` module, or a package like `node-fetch` or `axios`.
And since Node.js v18, `fetch` is available natively in the global scope, so let's use it.
First, in our **server.js**, let's create a function to verify the reCAPTCHA (make sure to add the **SECRET KEY** as **RECAPTCHA_SECRET_KEY** to your environment variables, or the verification will not succeed):
```
// Function to verify reCAPTCHA
async function verifyRecaptcha(token) {
const recaptchaSecret = process.env.RECAPTCHA_SECRET_KEY;
const recaptchaUrl = `https://www.google.com/recaptcha/api/siteverify?secret=${recaptchaSecret}&response=${token}`;
const response = await fetch(recaptchaUrl, { method: 'POST' });
const result = await response.json();
return result.success;
}
```
Then, in our **/send-email** endpoint, we can update our code to this:
```
// Endpoint to handle form submission and send email
app.post('/send-email', async (req, res) => {
const { name, email, subject, message, 'g-recaptcha-response': recaptchaToken } = req.body;
if (!name || !email || !subject || !message || !recaptchaToken) {
return res.status(400).json({ status: 'error', message: 'Missing required fields!' });
}
// Verify the reCAPTCHA token
const isRecaptchaValid = await verifyRecaptcha(recaptchaToken);
if (!isRecaptchaValid) {
return res.status(400).json({
status: 'error',
message: 'reCAPTCHA verification failed. Please try again.'
});
}
// Validate the email
const validationResult = await validate(email);
if (!validationResult.valid) {
return res.status(400).json({
status: 'error',
message: 'Email is not valid. Please try again!',
reason: validationResult.reason
});
}
// Email sending logic
// Placeholder response for a successful email submission
res.status(200).json({
status: 'success',
message: 'Email successfully sent'
});
});
```
## Send email from Node.js contact form using SMTP
Now that we have our Node.js contact form flow in place, let’s add an email-sending functionality to it.
For this, we’ll need two things:
- **Nodemailer** – If you’re reading this article, I’m guessing you already have Nodemailer installed, but if you don’t, [here’s how to install it](https://mailtrap.io/blog/sending-emails-with-nodemailer/#How-to-install-Nodemailer).
- **SMTP credentials** – With Nodemailer, you can leverage any SMTP server like, for example, Gmail. However, Gmail has some significant limitations, about which you can find out more in our dedicated [Nodemailer Gmail tutorial](https://mailtrap.io/blog/nodemailer-gmail/#Gmail-SMTP-limitations-and-possible-issues).
So, to overcome Gmail limitations, we’ll set up Mailtrap Email Sending instead, as the SMTP service in our Nodemailer transporter object. It offers an infrastructure with high deliverability rates by default and by design, plus it’s easy to use.
First, create a [free Mailtrap account](https://mailtrap.io/register/signup) and verify your domain. It only takes a couple of minutes, and you can watch the video we’ve prepared for you as a step-by-step tutorial.
{% embed https://youtu.be/vAfUyKpWj_M %}
Then, proceed to the **Sending Domains** section, choose your domain, and under **Integration**, select your preferred stream (Transactional, in this case). There, you will find your SMTP credentials, which you can easily paste into the Nodemailer config file.

Speaking of Nodemailer configuration, let’s insert it into the **server.js** file beneath the comment **// Create a transporter object**, like so:
```
// Create a transporter object
const transporter = nodemailer.createTransport({
host: 'live.smtp.mailtrap.io',
port: 587,
secure: false, // use false for STARTTLS; true for SSL on port 465
auth: {
user: '1a2b3c4d5e6f7g',
pass: '1a2b3c4d5e6f7g',
}
});
// Configure the mailOptions object
const mailOptions = {
from: 'yourusername@email.com',
to: 'yourfriend@email.com',
subject: 'Sending Email using Node.js',
text: 'That was easy!'
};
// Send the email
transporter.sendMail(mailOptions, (error, info) => {
if (error) {
console.log('Error:', error);
return res.status(500).json({ status: 'error', message: 'Failed to send email due to server error.' });
} else {
console.log('Email sent: ' + info.response);
return res.status(200).json({
status: 'success',
message: 'Email successfully sent'
});
}
});
```
Then, just insert your Mailtrap credentials into their respective fields: `host`, `port`, `user`, and `pass`.
In the end, your **server.js** file should look something like this:
```
require('dotenv').config();
const express = require('express');
const nodemailer = require('nodemailer');
const { validate } = require('deep-email-validator');
const path = require('path');
const app = express();
const PORT = process.env.PORT || 3000;
// Middleware
app.use(express.static('public'));
app.use(express.json());
app.get('/', (req, res) => {
res.sendFile(path.join(__dirname, 'public', 'contactform.html'));
});
// Function to verify reCAPTCHA
async function verifyRecaptcha(token) {
const recaptchaSecret = process.env.RECAPTCHA_SECRET_KEY;
const recaptchaUrl = `https://www.google.com/recaptcha/api/siteverify?secret=${recaptchaSecret}&response=${token}`;
const response = await fetch(recaptchaUrl, { method: 'POST' });
const result = await response.json();
return result.success;
}
// Endpoint to handle form submission and send email
app.post('/send-email', async (req, res) => {
const { name, email, subject, message, 'g-recaptcha-response': recaptchaToken } = req.body;
if (!name || !email || !subject || !message || !recaptchaToken) {
return res.status(400).json({ status: 'error', message: 'Missing required fields!' })
}
// Verify the reCAPTCHA token
const isRecaptchaValid = await verifyRecaptcha(recaptchaToken);
if (!isRecaptchaValid) {
return res.status(400).json({
status: 'error',
message: 'reCAPTCHA verification failed. Please try again.'
});
}
// Validate the email
const validationResult = await validate(email);
if (!validationResult.valid) {
return res.status(400).json({
status: 'error',
message: 'Email is not valid. Please try again!',
reason: validationResult.reason
});
}
// Email sending logic
// Create a transporter object
const transporter = nodemailer.createTransport({
host: process.env.SMTP_HOST,
port: process.env.SMTP_PORT,
secure: false, // use false for STARTTLS; true for SSL on port 465
auth: {
user: process.env.SMTP_USER,
pass: process.env.SMTP_PASS,
}
});
// Configure the mailOptions object
const mailOptions = {
from: process.env.EMAIL_FROM,
to: process.env.EMAIL_TO,
replyTo: email,
subject: subject,
text: `From: ${name}\nEmail:${email}\n\n${message}`
};
// Send the email
transporter.sendMail(mailOptions, (error, info) => {
if (error) {
console.log('Error:', error);
return res.status(500).json({ status: 'error', message: 'Failed to send email due to server error.' });
} else {
console.log('Email sent: ' + info.response);
return res.status(200).json({
status: 'success',
message: 'Email successfully sent'
});
}
});
});
app.listen(PORT, () => {
console.log(`Server running on port ${PORT}`);
});
```
To safely store your Mailtrap credentials, you can use environment variables via a **.env** file, like so:
```
PORT=3200
SMTP_HOST='live.smtp.mailtrap.io'
SMTP_PORT=587
SMTP_USER='1a2b3c4d5e6f7g'
SMTP_PASS='1a2b3c4d5e6f7g'
EMAIL_FROM='yourusername@email.com'
EMAIL_TO='contactformrecipient@yourmail.com'
RECAPTCHA_SECRET_KEY='SECRET-KEY'
```
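One gotcha with this setup: everything loaded from a **.env** file arrives as a string. If you want the transporter's `secure` flag to follow the port automatically, a small helper like the one below keeps the coercion in one place (the `smtpOptions` name is mine, not a Nodemailer API):

```javascript
// Builds Nodemailer transport options from environment-style string values.
function smtpOptions(env) {
  const port = Number(env.SMTP_PORT) || 587; // env values are strings; coerce
  return {
    host: env.SMTP_HOST,
    port,
    secure: port === 465, // SSL on port 465, STARTTLS otherwise
    auth: { user: env.SMTP_USER, pass: env.SMTP_PASS },
  };
}

const opts = smtpOptions({ SMTP_HOST: 'live.smtp.mailtrap.io', SMTP_PORT: '587', SMTP_USER: 'u', SMTP_PASS: 'p' });
console.log(opts.port, opts.secure); // → 587 false
```

You could then create the transporter with `nodemailer.createTransport(smtpOptions(process.env))`.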
**[Send Emails with Mailtrap for Free](https://mailtrap.io/register/signup)**
## Send email from Node.js contact form using API
If you want to automate your contact form’s email-sending functionality, you can again rely on Mailtrap, as it has a robust [sending package](https://github.com/railsware/mailtrap-nodejs). The package itself is regularly updated by a team of developers, lets you automate your sending process, and is super straightforward to use.
First, you’ll need a [Mailtrap account](https://mailtrap.io/register/signup), which, if you don’t already have one, you can create by following the instructions provided in the previous chapter.
Then, let’s install the Mailtrap package:
```
npm install mailtrap
# or, if you are using yarn:
yarn add mailtrap
```
Once it’s installed, you can use the following code snippet for email sending:
```
import { MailtrapClient } from "mailtrap"
/**
* For this example to work, you need to set up a sending domain,
* and obtain a token that is authorized to send from the domain.
*/
const TOKEN = "<YOUR-TOKEN-HERE>";
const SENDER_EMAIL = "<SENDER ADDRESS@YOURDOMAIN.COM>";
const RECIPIENT_EMAIL = "<RECIPIENT@EMAIL.COM>";
const client = new MailtrapClient({ token: TOKEN });
const sender = { name: "Mailtrap Test", email: SENDER_EMAIL };
client
.send({
from: sender,
to: [{ email: RECIPIENT_EMAIL }],
subject: "Hello from Mailtrap!",
text: "Welcome to Mailtrap Sending!",
})
.then(response => {
console.log("Email sent successfully:", response);
})
.catch(error => {
console.error("Error sending email:", error);
});
```
And for your convenience, here’s what your **server.js** file should look like:
```
require('dotenv').config();
const express = require('express');
const { MailtrapClient } = require('mailtrap');
const { validate } = require('deep-email-validator');
const path = require('path');
const app = express();
const PORT = process.env.PORT || 3000;
// Middleware
app.use(express.static('public'));
app.use(express.json());
app.get('/', (req, res) => {
res.sendFile(path.join(__dirname, 'public', 'contactform.html'));
});
// Function to verify reCAPTCHA
async function verifyRecaptcha(token) {
const recaptchaSecret = process.env.RECAPTCHA_SECRET_KEY;
const recaptchaUrl = `https://www.google.com/recaptcha/api/siteverify?secret=${recaptchaSecret}&response=${token}`;
const response = await fetch(recaptchaUrl, { method: 'POST' });
const result = await response.json();
return result.success;
}
// Endpoint to handle form submission and send email
app.post('/send-email', async (req, res) => {
const { name, email, subject, message, 'g-recaptcha-response': recaptchaToken } = req.body;
if (!name || !email || !subject || !message || !recaptchaToken) {
return res.status(400).json({ status: 'error', message: 'Missing required fields!' })
}
// Verify the reCAPTCHA token
const isRecaptchaValid = await verifyRecaptcha(recaptchaToken);
if (!isRecaptchaValid) {
return res.status(400).json({
status: 'error',
message: 'reCAPTCHA verification failed. Please try again.'
});
}
// Validate the email
const validationResult = await validate(email);
if (!validationResult.valid) {
return res.status(400).json({
status: 'error',
message: 'Email is not valid. Please try again!',
reason: validationResult.reason
});
}
// Configure mailtrap client and define sender
console.log(process.env.MAILTRAP_TOKEN);
const client = new MailtrapClient({ token: process.env.MAILTRAP_TOKEN });
const sender = { name: "NodeJS App", email: process.env.EMAIL_FROM };
// Send email
try {
const response = await client.send({
from: sender,
to: [{ email: process.env.EMAIL_TO }],
subject: subject,
text: `From: ${name}\nEmail: ${email}\n\n${message}`,
});
console.log('Email sent: ', response.message_ids);
res.status(200).json({
status: 'success',
message: 'Email successfully sent'
});
} catch (error) {
console.log('Error:', error);
res.status(500).json({ status: 'error', message: 'Failed to send email due to server error.' });
}
});
app.listen(PORT, () => {
console.log(`Server running on port ${PORT}`);
});
```
If you want to safely store your credentials in an environment variable, here’s what your **.env** should look like:
```
PORT=3200
MAILTRAP_TOKEN='7ff93fc2453461800734fb5c8bbe735d'
EMAIL_FROM='yourusername@email.com'
EMAIL_TO='contactformrecipient@yourmail.com'
RECAPTCHA_SECRET_KEY='SECRET-KEY'
```
And voila! Your Node.js contact form can now send emails via API.
Keep in mind that the code snippet I’ve shown you here is only for plain-text messages. If you wish to send HTML emails or add embedded images or attachments, please refer to the [examples](https://github.com/railsware/mailtrap-nodejs/tree/main/examples) folder in the GitHub repository.
## Test email and email sending on staging
If you’re making a contact form, the chances are that you’ll use it for collecting critical user information like email addresses, names, and other sensitive data.
Hence, you need to make sure that your contact form’s functionality is spot on and that your submitted forms are reaching the desired addresses as they’re supposed to.
And that’s exactly where email testing comes in—an industry-standard practice that makes sure your submission system is working as intended, that your emails are looking flawless, and more importantly, that your emails are not being marked as spam.
Personally, I use [Mailtrap Email Testing](https://mailtrap.io/email-sandbox/), another integral part of Mailtrap Email Delivery Platform that offers a sandbox for you to inspect and debug emails in staging, dev, and QA environments.
{% embed https://youtu.be/AveaJc6c3fI %}
With Mailtrap Email Testing, you can preview how your emails look in different devices/clients, inspect their source HTML/CSS and easily fix any faulty lines of code.

Additionally, thanks to the Spam Report feature, you will be aware of your spam score, which, if you keep under 5, prevents a considerable amount of potential [email deliverability](https://mailtrap.io/blog/email-deliverability/) issues.

Besides these, you also get access to other features for improving your email deliverability, including:
- Email preview in HTML form and raw text
- Email Testing API for [QA automation](https://mailtrap.io/qa-automation/)
- Multiple inboxes for different projects and stages
- User management, SSO
- Email templates testing
Now, let me show you how it works!
### SMTP
To start testing your emails with Nodemailer and SMTP, follow these steps:
- Create a [free Mailtrap account](https://mailtrap.io/register/signup)
- Go to **Email Testing** and select your inbox
- Copy your credentials from the **Integration** tab
- Insert the credentials into your **server.js** file
You can also use Mailtrap’s ready-to-use integration, like so:
- Select Nodemailer from the list of integrations

- Copy and paste the code snippet into your **server.js** file
Here’s what the snippet should look like:
```
const transport = nodemailer.createTransport({
host: "sandbox.smtp.mailtrap.io",
port: 2525,
auth: {
user: "1a2b3c4d5e6f7g",
pass: "1a2b3c4d5e6f7g"
}
});
```
**[Test Emails with Mailtrap for Free](https://mailtrap.io/register/signup)**
### API
Integrating the Mailtrap [Email Testing API](https://mailtrap.io/automated-email-testing/) for testing and automating email sequences is as simple as using the following code snippet:
```
require('dotenv').config();
const { MailtrapClient } = require('mailtrap');
const nodemailer = require('nodemailer');
/* The official mailtrap package doesn't support sending test emails, so we send a mail first with nodemailer */
/* Initialize nodemailer with SMTP configuration from environment variables. */
const transport = nodemailer.createTransport({
host: process.env.SMTP_HOST,
port: process.env.SMTP_PORT,
auth: {
user: process.env.SMTP_USER,
pass: process.env.SMTP_PASS
}
});
/* Asynchronously send a test email using nodemailer. */
async function sendTestEmail() {
const info = await transport.sendMail({
from: process.env.EMAIL_FROM,
to: "user@domain.com",
subject: "Mailtrap Email Testing",
html: '<h1>Mailtrap Email Testing</h1>',
text: 'Mailtrap Email Testing'
});
console.log("Message sent: %s", info.messageId);
return info;
}
/* Configure the Mailtrap client and fetch email information for testing and automation. */
const client = new MailtrapClient({
token: process.env.MAILTRAP_TOKEN,
testInboxId: process.env.TEST_INBOX_ID,
accountId: process.env.ACCOUNT_ID
});
const inboxesClient = client.testing.inboxes;
const messagesClient = client.testing.messages;
/* Send the test email and then retrieve it from Mailtrap for analysis. */
sendTestEmail().then(() => {
inboxesClient.getList()
.then(async (inboxes) => {
if (inboxes && inboxes.length > 0) {
const firstInboxId = inboxes[0].id;
console.log(`First inbox ID: ${firstInboxId}`);
const messages = await messagesClient.get(firstInboxId);
if (messages && messages.length > 0) {
const firstMessageId = messages[0].id;
console.log(`First message ID: ${firstMessageId}`);
const analysis = await messagesClient.getHtmlAnalysis(firstInboxId, firstMessageId);
console.log('HTML Analysis:', analysis);
const htmlMessage = await messagesClient.getHtmlMessage(firstInboxId, firstMessageId);
console.log('HTML Message:', htmlMessage);
const textMessage = await messagesClient.getTextMessage(firstInboxId, firstMessageId);
console.log('Text Message:', textMessage);
const headers = await messagesClient.getMailHeaders(firstInboxId, firstMessageId);
console.log('Mail Headers:', headers);
const eml = await messagesClient.getMessageAsEml(firstInboxId, firstMessageId);
console.log('Message as EML:', eml);
const htmlSource = await messagesClient.getMessageHtmlSource(firstInboxId, firstMessageId);
console.log('HTML Source:', htmlSource);
const rawMessage = await messagesClient.getRawMessage(firstInboxId, firstMessageId);
console.log('Raw Message:', rawMessage);
const spamScore = await messagesClient.getSpamScore(firstInboxId, firstMessageId);
console.log('Spam Score:', spamScore);
const emailMessage = await messagesClient.showEmailMessage(firstInboxId, firstMessageId);
console.log('Email Message:', emailMessage);
const updateStatus = await messagesClient.updateMessage(firstInboxId, firstMessageId, {
isRead: false
});
console.log('Update Status:', updateStatus);
// Forward the message (needs to be a confirmed email for forwarding in mailtrap)
// await messagesClient.forward(firstInboxId, firstMessageId, 'mock@mail.com');
// console.log('Message forwarded.');
// Delete the message
const response = await messagesClient.deleteMessage(firstInboxId, firstMessageId);
console.log('Delete Response:', response);
} else {
console.log('No messages found in the first inbox.');
}
} else {
console.log('No inboxes found.');
}
})
.catch(error => {
console.error('Error fetching inboxes or messages:', error);
});
}).catch(console.error);
```
With this code, we first send a test email using Nodemailer and then programmatically retrieve the information related to it. Here, the code only logs that information, but the same calls can serve as building blocks for test automation.
And of course, here’s what your **.env** file should look like for the Email Testing API:
```
MAILTRAP_TOKEN=''
SMTP_HOST='sandbox.smtp.mailtrap.io'
SMTP_PORT=2525
SMTP_USER=''
SMTP_PASS=''
EMAIL_FROM=''
TEST_INBOX_ID=
ACCOUNT_ID=
```
For more information and use cases of Mailtrap Email Testing API, check out the [official GitHub page](https://github.com/railsware/mailtrap-nodejs?tab=readme-ov-file#email-testing-api).
## Wrapping up
And with that, we’ve come to the end of the line!
Now that you’ve got the ropes, you can style your Node.js contact form according to your artistic eye and make it function according to your application’s specific needs.
So, code away, and be sure to give our [blog](https://mailtrap.io/blog/) a read! We have a plethora of helpful JavaScript-related articles, including:
- [JavaScript Send Email – Read This First](https://mailtrap.io/blog/javascript-send-email/)
- [Send and Receive Emails with Node JS](https://mailtrap.io/blog/send-emails-with-nodejs/)
- [Sending Emails with Nodemailer Explained](https://mailtrap.io/blog/sending-emails-with-nodemailer/)
We appreciate that you chose this article to learn about [creating a Node.js contact form](https://mailtrap.io/blog/node-js-contact-form/). If you want to see more content on related topics, visit the Mailtrap blog and explore!
*Author: idjuric660*

---

# Fourth LATAM School in Software Engineering (ACM SIGSOFT-sponsored)

*Published 2024-05-31 at https://dev.to/fronteirases/fourth-latam-school-in-software-engineering-acm-sigsoft-sponsored-5137*

Access <https://cbsoft.sbc.org.br/2024/escola/> for the latest information on this school.
**Call for participation**
It is a great pleasure to invite students and young researchers to apply for this fantastic opportunity, sponsored by ACM SIGSOFT, to learn about careers and research and attend CBSoft 2024, the Brazilian Software Congress.
We invite applications for the 4th Latin American School of Software Engineering. The school will take place in person, collocated with CBSoft 2024. The school welcomes students and young researchers from Latin American institutions and organizations.
The main goal of this school is to help new and future Software Engineering researchers to be part of the larger software engineering community, launching a successful career and managing the challenges while feeling joy in their life as researchers.
The school will also enable the participants to explore potential pathways and understand more about the academic challenges and opportunities. Students will have an excellent opportunity to interact with top researchers and learn about their careers, how to publish, and hear about important topics usually not covered during their studies.
Beyond the outstanding keynotes, we will provide a mentoring section. Students will be grouped by software engineering topics and have at least one mentor. The mentoring section will happen with students presenting a poster with a 1-2 minute lightning talk about their research. Also, students should be prepared to answer questions about their research and ask questions to the mentors.
**Eligibility**
The applicants must be:
• Ph.D. or M.Sc. students from Latin American institutions, or researchers who received their Ph.D. degree in 2023 or later and work in Latin American organizations/universities;
• Highly motivated undergraduate students might be accepted; and
• Working with software engineering topics.
There are **limited spots** (a maximum of 50 students) for this school, which will be given to Latin American applicants. We will select the participants based on different criteria (academic background, research topic, diversity) to ensure the adequacy of the school and diversity. We encourage all those interested to apply!
**How to apply?**
You need to provide the following documents upon submission:
• A structured abstract with the headings: Context/Problem, Aims, Method, and Results (Expected Results) written in English. No specific template is required. The structured abstract must explicitly describe the research plan the students are conducting (1-page limit).
• Poster: The poster should outline the research with interesting commentary about what you learned. It should be a balance of visuals and text. It must present the research overview, research questions, method, results obtained, and future plans.
◦ Dimensions: 122cm (height) and 92cm (width). Try to use large fonts, such as font size 36-48 for section headings and font size 20-28 for text.
• A document containing a list of papers published in journals and conferences between 2022 and 2024. The document must present: Year of Publication | Paper Title | Conference or Journal | List of Authors | Link to the student’s DBLP entry pointing to the paper. We will only consider the conferences and journals available at CSIndex ([https://csindexbr.org/](https://csindexbr.org/)) - Software Engineering (list of TOP-15 best conferences and TOP-15 journals). Students can also consider publications from the following venues:
• Ibero-American Conference on Software Engineering (CIbSE)
• Brazilian Symposium on Software Engineering (SBES)
• Brazilian Symposium on Software Components, Architectures, and Reuse (SBCARS)
• Brazilian Symposium on Systematic and Automated Software Testing (SAST)
• Brazilian Symposium on Software Quality (SBQS)
• Journal of Software Engineering Research and Development (JSERD).
Submissions must explicitly state the names of the applicant, advisor, and affiliation.
Submissions should be electronically made through JEMS3 ([https://jems3.sbc.org.br/events/131](https://jems3.sbc.org.br/events/131)).
**Benefits for selected participants**
• Free registration for the School (including lunch and coffee break)
• Students will be ranked by the number of publications, posters, and structured abstract quality. The following cost allowance will be provided:
◦ The TOP-3 applications from LATAM participants outside Brazil will receive a cost allowance after proving they will attend the LATAM School (e.g., air/bus tickets, hotel payment reservation). The cost allowance is limited to R$2,500 (Brazilian Reais), which is around 480 USD (May 2024 exchange rate)
◦ The TOP-17 Brazilian applications will receive a cost allowance after proving they will attend the LATAM School (e.g., air/bus tickets, hotel payment reservation). The cost allowance is limited to R$1,500 (Brazilian Reais).
This fantastic deal is only possible due to generous sponsorship from ACM SIGSOFT. Upon acceptance to the school, participants will receive an email detailing the registration procedure.
**Program**

As an initiative aiming to increase participation of Global South authors at ICSE'26, which will take place in Rio de Janeiro, this year we will have a paper-writing workshop led by Bianca Trinkenreich (Oregon State University), recipient of the 2024 ACM SIGSOFT Outstanding Dissertation Award. This is a pre-event of the LATAM School, which takes place on October 1st (Tuesday).
So far, we have two keynote speakers confirmed:
• Nicole Novielli (University of Bari, Italy)
• Paulo Borba (UFPE, Brazil)
Other great names in software engineering research and practice will be gradually confirmed to serve as keynote speakers.
**Important Dates**
Application deadline: June 30th, 2024
Notification date: July 15th, 2024
LATAM School pre-event: September 30th, 2024
LATAM School: October 1st, 2024
Location: Curitiba, Paraná, Brazil
More details about CBSoft 2024 and how to plan your trip to Curitiba, please visit the website: [https://cbsoft.sbc.org.br/2024/](https://cbsoft.sbc.org.br/2024/)
**Organization**
Kiev Gama, UFPE ([kiev@cin.ufpe.br](mailto:kiev@cin.ufpe.br)) | fronteirases | |
1,872,349 | Powering Up #NET Apps with #Phi-3 and #SemanticKernel | Hi! Introducing the Phi-3 Small Language Model Phi-3 is an amazing Small Language Model.... | 0 | 2024-06-03T14:57:48 | https://dev.to/azure/powering-up-net-apps-with-phi-3-and-semantickernel-203d | englishpost, codesample, github | ---
title: Powering Up #NET Apps with #Phi-3 and #SemanticKernel
published: true
date: 2024-05-31 13:49:48 UTC
tags: EnglishPost, CodeSample, GitHub
canonical_url:
---
Hi!
# Introducing the Phi-3 Small Language Model
Phi-3 is an amazing Small Language Model. And hey, it’s also an easy one to use in C#. I already wrote about how to use it with Ollama; now it’s time to hit the ONNX version.
## Introduction to Phi-3 Small Language Model
The [Phi-3 Small Language Model (SLM)](https://azure.microsoft.com/en-us/blog/introducing-phi-3-redefining-whats-possible-with-slms/) represents a significant leap forward in the field of artificial intelligence. Developed by Microsoft, the Phi-3 family of models is a collection of the most capable and cost-effective SLMs available today. These models have been meticulously crafted to outperform other models of similar or even larger sizes across various benchmarks, including language understanding, reasoning, coding, and mathematical tasks.
Phi-3 models are not only remarkable for their performance but also for their efficiency and adaptability. They are designed to operate across a wide range of hardware, from traditional computing devices to edge devices like mobile phones and IoT devices. This makes Phi-3 particularly suitable for developers looking to integrate advanced AI capabilities into applications that require strong reasoning, limited compute resources, and low latency.
## C# Phi3-Labs GitHub Repository
[The GitHub repository in question serves as a practical guide for developers looking to harness the power of the Phi-3 SLM within their applications](https://github.com/elbruno/phi3-labs). It provides a set of sample projects to demonstrate how to use Phi-3 and C#.
- **LabsPhi301**. This is a sample project that uses a local phi3 model to ask a question. The project loads a local ONNX Phi-3 model using the Microsoft.ML.OnnxRuntime libraries.
- **LabsPhi302**. This is a sample project that implements a Console chat using Semantic Kernel.
- **LabsPhi303 (Coming soon!)**. This is a sample project that uses Phi-3 Vision to analyze images.
The repository contains detailed instructions for setting up the development environment, cloning the necessary models from Hugging Face, and installing the Phi-3 model.
You can learn more about [Phi-3 in Hugging Face](https://huggingface.co/microsoft/Phi-3-mini-4k-instruct-onnx).
A final running application will look like this.
Happy coding!
Greetings
El Bruno
More posts in my blog [ElBruno.com](https://www.elbruno.com).
* * * | elbruno |
1,872,103 | EC2 real network bandwidth | EC2 instances does not always provide exact numbers on their network capacity. In some cases where we... | 0 | 2024-05-31T13:49:12 | https://dev.to/carc/ec2-real-network-bandwidth-1k94 | devops, aws, terraform, performance | EC2 instances does not always provide exact numbers on their network capacity. In some cases where we need to optimize our network traffic we might need to know the exact numbers for our own instances.
## iperf & Terraform
One way to achieve this is using [`iperf3`](https://iperf.fr/). This tool uses a client-server model, which is ideal for testing network performance between two hosts. The server listens for incoming test requests, and the client initiates the test, making it easy to measure the capacity of the network link between them.
The next step involves setting up two EC2 instances. The first will act as a server, which will help us determine its actual network capacity. The second will act as a client. The client instance should have a higher network capacity than the server to ensure that the server's network link is fully utilized during the test.
To implement this, we need to create a pair of EC2 instances along with their corresponding resources, such as roles and security groups, in our AWS account. Doing this manually for every EC2 instance type we need to measure could be tedious, so we'll use [Terraform](https://www.terraform.io/) for this task.
The following are the important parts of the Terraform code to set up this test environment for measuring the network capacity of `m3.medium` instances.
``` HCL
resource "aws_instance" "iperf_server" {
ami = data.aws_ami.amazon_linux_2.id
instance_type = "m3.medium"
subnet_id = data.aws_subnet.current.id
security_groups = [aws_security_group.iperf.id]
iam_instance_profile = aws_iam_instance_profile.iperf.name
tags = {
Name = "${var.name}-iperf-server"
}
key_name = aws_key_pair.iperf.key_name
connection {
host = coalesce(self.public_ip, self.private_ip)
type = "ssh"
user = var.ssh_user
private_key = tls_private_key.iperf.private_key_pem
}
provisioner "remote-exec" {
inline = [
"sudo yum -y update",
"sudo yum -y install iperf3"
]
}
}
resource "aws_instance" "iperf_client" {
ami = data.aws_ami.amazon_linux_2.id
instance_type = "m5.12xlarge"
subnet_id = data.aws_subnet.current.id
security_groups = [aws_security_group.iperf.id]
iam_instance_profile = aws_iam_instance_profile.iperf.name
tags = {
Name = "${var.name}-iperf-client"
}
key_name = aws_key_pair.iperf.key_name
connection {
host = coalesce(self.public_ip, self.private_ip)
type = "ssh"
user = var.ssh_user
private_key = tls_private_key.iperf.private_key_pem
}
provisioner "remote-exec" {
inline = [
"sudo yum -y update",
"sudo yum -y install iperf3"
]
}
}
resource "aws_security_group" "iperf" {
name = "${var.name}-iperf-seg"
description = "Allow inbound traffic for iperf"
vpc_id = data.aws_vpc.current.id
ingress {
from_port = 0
to_port = 0
protocol = "-1"
self = true
}
ingress {
from_port = 5201 # iperf3 listens on TCP port 5201 by default
to_port = 5201
protocol = "tcp"
cidr_blocks = ["0.0.0.0/0"]
}
ingress {
description = "port 22"
from_port = 22
to_port = 22
protocol = "tcp"
cidr_blocks = var.ssh_cidr_ingress_blocks
}
egress {
from_port = 0
to_port = 0
protocol = "-1"
cidr_blocks = ["0.0.0.0/0"]
}
tags = {
"Name" : "${var.name}-iperf-seg"
}
}
output "server_private_ip" {
value = aws_instance.iperf_server.private_ip
}
output "server_public_ip" {
value = aws_instance.iperf_server.public_ip
}
output "client_public_ip" {
value = aws_instance.iperf_client.public_ip
}
```
## Conducting the test
Once we have created our test environment, we just need to SSH into the server instance and start `iperf3` in server mode:
``` bash
$ iperf3 -s
```
Then we’ll SSH into the client and start the test, connecting to the server via its private IP to avoid extra network overhead:
``` bash
$ iperf3 -c <server_private_ip>
```
If we plan to test multiple instance types, we can create a script to run the tests as follows.
``` bash
terraform_path=.
vars=$(terraform -chdir=$terraform_path output -json)
client_public_ip=$(echo $vars | jq -r '.client_public_ip.value')
server_public_ip=$(echo $vars | jq -r '.server_public_ip.value')
server_private_ip=$(echo $vars | jq -r '.server_private_ip.value')
keypair="$terraform_path/iperf-keypair.pem"
ssh -i $keypair \
-o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null \
ec2-user@$server_public_ip "iperf3 -s -D"
ssh -i $keypair \
-o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null \
ec2-user@$client_public_ip -t "iperf3 -c $server_private_ip -t 150 -i 60"
```
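For unattended runs it can help to parse the results programmatically instead of reading them off the terminal. `iperf3` can print its report as JSON with the `-J` flag; the sketch below parses a trimmed, illustrative payload (a real report contains many more fields than shown here):

```python
import json

# Trimmed, made-up sample of what `iperf3 -c <server_private_ip> -J` prints;
# only the field we care about is kept for illustration.
raw = """
{
  "end": {
    "sum_received": { "bits_per_second": 715000000.0 }
  }
}
"""

report = json.loads(raw)
bps = report["end"]["sum_received"]["bits_per_second"]
print(f"{bps / 1e6:.0f} Mbits/sec")
```

The same parsing step could be appended to the shell script above by piping the client's `-J` output into it.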
## Real figures
I ran this test for some instance types. The results are as follows:
| Instance Type | Bandwidth |
| --- | --- |
| m3.medium | 681 Mbits/sec |
| c4.large | 628 Mbits/sec |
| m4.large | 768 Mbits/sec |
| r4.large | 9.53 Gbits/sec |
| i3.large | 4.97 Gbits/sec |
| t2.nano | 1.04 Gbits/sec |
| t3.medium | 4.52 Gbits/sec |
Remember that burstable instances, such as the `t2` and `t3` families, incorporate the concept of network credits. These credits permit them to "burst" their network performance beyond the baseline level for short periods, as needed. Therefore, the figures in the table are not directly comparable between burstable and non-burstable instance types.
To establish the baseline network performance of burstable instances, we need to conduct the test for a duration sufficient to use up the accumulated network credits. | carc |
1,872,102 | Day 5 of Machine Learning|| Exploratory Data Analysis Part 2 | Hey reader👋Hope you are doing well!!! In the last post we have seen some basics of Exploratory Data... | 0 | 2024-05-31T13:49:00 | https://dev.to/ngneha09/day-5-of-machine-learning-exploratory-data-analysis-part-2-2afl | machinelearning, datascience, tutorial, python | Hey reader👋Hope you are doing well!!!
In the last post we have seen some basics of Exploratory Data Analysis. Taking our discussion further in this post we are going to see how EDA is performed on dataset.
So let's get started🔥
Dataset used here-:
(https://www.kaggle.com/datasets/sanyamgoyal401/customer-purchases-behaviour-dataset)
## Step1-: Import important libraries
```python
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
import seaborn as sns
```
- Pandas is used to load the dataset and manipulate the dataframe.
- Numpy is used to perform mathematical operations on the dataset, such as finding the mean, median, mode, etc.
- Matplotlib and Seaborn are used for data visualization through graphs.
## Step2-: Load the dataset
`df=pd.read_csv('/kaggle/input/customer-purchases-behaviour-dataset/customer_data.csv')`
Here `read_csv(<file path>)` is used to load the CSV file into our dataframe `df`.
## Step3-: Check the data
`df.head()`

Here `head()` returns the first five rows of the dataset; if you want to see more rows, just pass the number of rows as an argument to the `head()` method.
Similarly, to see the last 5 rows we have the `tail()` method.
`df.tail()`
**Get dimensions of dataset**
`df.shape`

The `shape` property returns a tuple `(x, y)` in which x is the number of rows and y is the number of columns.
**Check data type of each column**
`df.dtypes`

`dtypes` returns a Series listing each column name with its data type.
**Checking null values in each column**
If your dataset contains any missing values, they are treated as null, and these values need special attention because they can directly impact our model's performance.
`df.isnull().sum()`

Here the `isnull()` method checks for missing values in every column; it returns a dataframe of the same dimensions as the dataset, with every cell filled with a boolean value. The `sum()` method then returns the count of missing values in each column.
In this dataset we don't have any missing values, but I assure you that in the upcoming blogs we are going to see how to handle missing values.
**Checking for duplicates in each column**
Sometimes, due to errors in data collection, our dataset may contain duplicate values. These duplicates can be problematic, as they can impact our model, so they need to be removed.
`df.duplicated().sum()`
So the `duplicated()` method returns a boolean series that tells us whether each row is a duplicate or not.
We don't have duplicates in this dataset. But if present in any dataset we will simply drop them using-:
`df.drop_duplicates(inplace=True)`
The `drop_duplicates()` method drops all the duplicates, but by default it does not change the original dataframe; we use `inplace=True` so that the changes are reflected in the original dataframe.
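As a quick illustration of the two methods above on a toy dataframe (the values here are made up and are not from the customer dataset):

```python
import pandas as pd

# Tiny frame with one fully repeated row, standing in for the real dataset.
df_demo = pd.DataFrame({'id': [1, 2, 2, 3],
                        'gender': ['M', 'F', 'F', 'M']})

print(df_demo.duplicated().sum())  # 1 -> the second (2, 'F') row is a duplicate

df_demo.drop_duplicates(inplace=True)
print(df_demo.shape)               # (3, 2) -> the duplicate row is gone
```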
**Checking count of distinct values in each column**
We can look at the number of distinct values in each column of the dataframe.
`df.nunique()`

The `nunique()` method gives the number of unique values present in each column.
**Getting a statistical summary of dataset**
`df.describe()`

The `describe()` function in pandas provides summary statistics for numerical columns in a DataFrame. It gives us information such as count, mean, standard deviation, minimum, maximum, and quartile values for each numeric column.
**Checking for outliers**
Outliers are values in the dataset whose behavior is very different from the rest of the values. We have different techniques for detecting outliers; one of the common techniques is detection using a **BoxPlot**.
> Note -: You will see a complete blog on the detection and handling of outliers.
**What is a BoxPlot?**
A BoxPlot is a graphical representation of the distribution of a dataset. It gives information about the maximum value, minimum value, median, 25th percentile, 75th percentile, and outliers.

In this image, the dots to the left and right represent the outliers, the lines at the two ends represent the maximum and minimum values, the middle line represents the median, and the edges of the box represent the 25th and 75th percentiles.
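To connect the picture to code: the whiskers of a boxplot are typically drawn at 1.5 times the interquartile range (IQR) beyond the quartiles, and points outside them are flagged as outliers. A minimal sketch with made-up numbers (not from the customer dataset):

```python
import pandas as pd

# Toy stand-in for a numeric column such as df['income'].
s = pd.Series([40, 42, 45, 47, 50, 52, 55, 300])

q1, q3 = s.quantile(0.25), s.quantile(0.75)
iqr = q3 - q1
lower, upper = q1 - 1.5 * iqr, q3 + 1.5 * iqr

outliers = s[(s < lower) | (s > upper)]
print(outliers.tolist())  # [300] -> the one value far outside the whiskers
```

Drawing the corresponding plot is then a one-liner, e.g. `sns.boxplot(x=s)`.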
## Step4-: Performing Univariate Analysis
Univariate analysis involves examining the distribution, central tendency, and variability of a single variable in isolation, without considering its relationship with other variables.

Here the countplot counts the number of Male and Female entries and shows the result graphically.

`x = df['education'].value_counts()`: This line calculates the frequency of each unique value in the 'education' column of the DataFrame df and stores the result in the variable x. It creates a Series where the index represents the unique values in the 'education' column, and the values represent the frequency of each value.
`plt.pie(x.values, labels=x.index, autopct='%1.1f%%')`: This line creates a pie chart using Matplotlib's plt.pie() function. It takes the values from the Series x.values (which represent the frequencies) and the index from x.index (which represent the unique education levels) to plot the pie chart. The autopct='%1.1f%%' parameter specifies that the percentages of each category will be displayed on the chart with one decimal place.
`plt.show()`: This line displays the pie chart.
```python
plt.figure(figsize=(8, 6))
sns.histplot(df['age'], kde=True)
plt.title('Histogram of Age')
plt.xlabel('Age')
plt.ylabel('Frequency')
plt.show()
```

`plt.figure(figsize=(8, 6))`: This line creates a new figure with a specified size of 8 inches by 6 inches using Matplotlib's `plt.figure()` function. This sets the dimensions of the plot.
`sns.histplot(df['age'], kde=True)`: This line creates a histogram using Seaborn's `histplot()` function. It takes the 'age' column from the DataFrame df as input and plots the distribution of ages. The `kde=True` parameter adds a kernel density estimate curve to the plot, providing a smooth representation of the distribution.
`plt.title('Histogram of Age')`: This line sets the title of the plot to 'Histogram of Age' using Matplotlib's `plt.title()` function.
`plt.xlabel('Age')`: This line sets the label for the x-axis to 'Age' using `plt.xlabel()`.
`plt.ylabel('Frequency')`: This line sets the label for the y-axis to 'Frequency' using `plt.ylabel()`.
The histplot gives you insight into how your data is distributed (normal distribution, Poisson distribution, etc.). **We can use histplot only for numerical values.**
## Step5-: Performing Bivariate Analysis
Bivariate analysis involves analyzing the relationship between two variables simultaneously. It aims to understand how the value of one variable changes with respect to the value of another variable. Common techniques used in bivariate analysis include scatter plots, correlation analysis, and cross-tabulation. Bivariate analysis helps in identifying patterns, trends, and associations between variables, providing insights into their relationship and potential dependencies.
- Categorical V/S Categorical
- Categorical V/S Numerical
- Numerical V/S Numerical
We have a different approach for each pair defined above. Here we will only consider **Numerical V/S Numerical**-:
```python
plt.figure(figsize=(8, 6))
sns.lineplot(x='age', y='income', data=df)
plt.title('Line Plot of Age vs. Income')
plt.xlabel('Age')
plt.ylabel('Income')
plt.show()
```

Here you can see how income varies with age, which can be useful for understanding the relationship between income and age.
We also have another plot, called the scatterplot, which likewise shows the relationship between numerical variables.
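As a small sketch of how such a relationship can also be quantified alongside the plot, the Pearson correlation coefficient measures the linear trend that a scatterplot shows visually (the age/income pairs below are made up for illustration):

```python
import pandas as pd

# Made-up pairs for illustration only.
demo = pd.DataFrame({'age':    [22, 28, 35, 41, 50, 58],
                     'income': [25000, 32000, 45000, 52000, 61000, 70000]})

# A value close to +1 means a strong positive linear relationship.
r = demo['age'].corr(demo['income'])
print(round(r, 2))
```

The matching plot would be `sns.scatterplot(x='age', y='income', data=demo)`.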
We will see more about these plots in upcoming blogs. This was just an introduction to how EDA is performed on datasets. I hope you have understood it well. Please leave some reactions and don't forget to follow me.
Thank you ❤
| ngneha09 |
1,872,101 | Vanilla JavaScript - Hamburger Menu Button | Creating a hamburger menu is a common task for modern web development, especially for responsive... | 0 | 2024-05-31T13:47:43 | https://dev.to/serhatbek/vanilla-javascript-hamburger-menu-button-2n56 | webdev, javascript, beginners, scss |
Creating a hamburger menu is a common task for modern web development, especially for responsive designs. In this tutorial, we'll walk through how to create a hamburger menu component using plain JavaScript, HTML, and SCSS.
### Burger Menu HTML Structure
Let's first set up the HTML for our hamburger menu button. This button will have a span element to represent the menu lines. The **hamburger-menu** class is the main container for our button. The **js-menu-toggle-btn** class is used for targeting the button in our JavaScript. The **hamburger-menu\_\_line** class is for the line inside the button.
```html
<body class="container">
<button
class="hamburger-menu js-menu-toggle-btn"
aria-label="Open mobile menu"
>
<span class="hamburger-menu__line"></span>
</button>
</body>
```
### SCSS Styling
Next, we define the styles for our hamburger menu button. The styles include positioning, sizing, and the transitions for the lines. We define the main styles for the **hamburger-menu** and its lines. The **&:before** and **&:after** pseudo-elements create the top and bottom lines of the hamburger. The **&--active** class defines the transformation for when the menu is active, rotating the lines and hiding the middle line.
```scss
// COLORS
$white: aliceblue;
$grayish-blue: #003249;
$burgerTransition: all 300ms ease-in-out;
// RESET
*,
*::before,
*::after {
box-sizing: border-box;
margin: 0;
padding: 0;
}
body {
background-color: $grayish-blue;
color: $white;
}
// STYLES
.container {
width: 100vw;
height: 100vh;
display: flex;
align-items: center;
justify-content: center;
}
.hamburger-menu {
$self: &;
width: 32px;
height: 32px;
padding: 0;
display: flex;
align-items: center;
justify-content: center;
position: relative;
border: 0;
outline: 0;
background: transparent;
cursor: pointer;
&__line {
height: 3px;
background-color: $white;
width: 100%;
display: block;
transition: $burgerTransition;
}
&:before {
content: '';
position: absolute;
width: 100%;
height: 3px;
background-color: $white;
top: 5px;
left: 0;
transition: $burgerTransition;
}
&:after {
content: '';
position: absolute;
width: 100%;
height: 3px;
background-color: $white;
left: 0;
bottom: 5px;
transition: $burgerTransition;
}
&--active {
&:after {
transform: rotate(-320deg);
bottom: 13px;
}
&:before {
transform: rotate(320deg);
top: 15px;
}
#{$self}__line {
opacity: 0;
width: 0;
}
}
}
```
### Adding JavaScript Functionality
Finally, we add the JavaScript to handle the toggle functionality. The script will add or remove the **hamburger-menu--active** class on button click. The **DOMContentLoaded** event ensures that the script runs after the DOM is fully loaded. We select the button with the class **js-menu-toggle-btn** and add a click event listener to it. The **toggleMobileMenu** function toggles the **hamburger-menu--active** class, triggering the CSS transitions.
```javascript
document.addEventListener('DOMContentLoaded', () => {
const burgerBtn = document.querySelector('.js-menu-toggle-btn');
const toggleMobileMenu = () => {
burgerBtn.classList.toggle('hamburger-menu--active');
/* you can add your mobile menu functionality here */
};
if (burgerBtn) {
burgerBtn.addEventListener('click', toggleMobileMenu);
}
});
```
We've just created a simple hamburger menu component using vanilla JavaScript. You can customize it for your project's needs. To see the detailed code check project's [Github](https://github.com/serhatbek/javascript-projects/tree/main/BurgerMenu) repo and [Codepen](https://codepen.io/serhatbek/pen/OJYWZer) for live demo.
Thank you for reading. If you find the article useful, please do not forget to give a star so that others can access it. Happy Coding! 🙃
<a href="https://www.buymeacoffee.com/serhatbek" target="_blank"><img src="https://cdn.buymeacoffee.com/buttons/default-orange.png" alt="Buy Me A Coffee" height="41" width="174"></a>
| serhatbek |
1,871,786 | king33company | King33 is an online betting brand, a playground that brings together all the best... | 0 | 2024-05-31T08:01:33 | https://dev.to/king33company/king33company-fl7 | King33 is an online betting brand, a playground that brings together all the best advantages, guaranteeing both quality and reputation for those who love online betting
Address: 22m Nguyen Thuc Tu, An Lac A, Binh Tan, Ho Chi Minh City, Vietnam
Email: tesweysorge.heidi@gmail.com
Website: https://king33.company/
Phone: (+63) 9633827688
#king33 #king33company #king33casino #nhacaiking33 #king33com
Social Media:
https://king33.company/
https://king33.company/nap-tien-king33/
https://king33.company/rut-tien-king33/
https://king33.company/tai-app-king33/
https://king33.company/dang-ky-king33/
https://king33.company/lien-he-king33/
https://king33.company/gioi-thieu-king33/
https://king33.company/chinh-sach-bao-mat-king33/
https://www.facebook.com/king33company
https://www.youtube.com/channel/UCqB3vWOsFAE4sOcfR0l4Q8g
https://www.pinterest.com/king33company/
https://www.tumblr.com/king33company
https://vimeo.com/king33company
https://www.twitch.tv/king33company/about
https://www.reddit.com/user/king33company/
https://500px.com/p/king33company?view=photos
https://gravatar.com/king33company
https://www.blogger.com/profile/14616381188921541568
https://king33company.blogspot.com/
https://draft.blogger.com/profile/14616381188921541568
https://twitter.com/king33company
https://www.gta5-mods.com/users/king33company
https://www.instapaper.com/p/king33company
https://hub.docker.com/u/king33company
https://www.mixcloud.com/king33company/
https://flipboard.com/@king33company/king33company-4nj2pit0y
https://issuu.com/king33company
https://www.liveinternet.ru/users/king33company/profile
https://beermapping.com/account/king33company
https://qiita.com/king33company
https://www.reverbnation.com/artist/king33company
https://guides.co/g/king33company/376183
https://os.mbed.com/account/confirm_email/793232a274c046ab953cc02a7ac39371
https://myanimelist.net/profile/king33company
https://www.metooo.io/u/king33company
https://www.fitday.com/fitness/forums/members/king33company.html
https://www.iniuria.us/forum/member.php?433823-king33company
https://www.veoh.com/users/king33company
https://gifyu.com/bongdawaplife
https://www.dermandar.com/user/king33company/
https://pantip.com/profile/8118949#topics
https://hypothes.is/users/king33company
http://molbiol.ru/forums/index.php?showuser=1345990
https://leetcode.com/u/king33company/
https://www.walkscore.com/people/292247779352/king33company
http://www.fanart-central.net/user/king33company/profile
http://hawkee.com/profile/6753652/
https://www.gta5-mods.com/users/king33company
https://codepen.io/king33company/pen/MWRdNxO
https://jsfiddle.net/king33company/uad94jv5/
https://forum.acronis.com/user/649702/
https://www.funddreamer.com/users/king33company
https://www.renderosity.com/users/id:1488758
https://www.storeboard.com/king33company1
https://doodleordie.com/profile/king33company
https://community.windy.com/user/king33company
https://connect.gt/user/king33company
https://teletype.in/@king33company
https://rentry.co/66aq6zbo
https://talktoislam.com/user/king33company
https://www.credly.com/users/king33company/badges
https://www.roleplaygateway.com/member/king33company/
https://masto.nu/@king33company
https://www.ohay.tv/profile/king33company
https://www.mapleprimes.com/users/king33company
http://www.rohitab.com/discuss/user/2177925-king33company/ | king33company | |
1,872,100 | What I will be doing for the 30 day code challenge | -Things that I will be studying for the 30 day code challenge: React Framework(Frontend dev) Flask... | 0 | 2024-05-31T13:46:11 | https://dev.to/francis_ngugi/what-i-will-be-doing-for-the-30-day-code-challenge-1k5 | adhd, frontend, backenddevelopment, hacking | -Things that I will be studying for the 30 day code challenge:
>React Framework(Frontend dev)
>Flask Framework(Backend dev)
>Ethical Hacking(Try Hack me)
-This will not be an easy challenge especially for a guy going through this with ADHD. | francis_ngugi |
1,872,099 | Git | Git is a distributed version control system used to track changes in source code during software... | 0 | 2024-05-31T13:46:07 | https://dev.to/mohamedabdiahmed/git-4f9i | Git is a distributed version control system used to track changes in source code during software development. It allows multiple developers to work on a project simultaneously, without interfering with each other's changes.
 | mohamedabdiahmed | |
1,872,097 | The Future of Networking: Trusted Digital Business Cards from the Leading Platform | In the digital age, networking has transformed drastically, and traditional business cards are... | 0 | 2024-05-31T13:43:23 | https://dev.to/digitize_card_ff7c61c21b0/the-future-of-networking-trusted-digital-business-cards-from-the-leading-platform-3pke | digitizebusinesscards, digitizecards, businesscards | In the digital age, networking has transformed drastically, and traditional business cards are becoming a thing of the past. Enter digital business cards, a modern, efficient, and eco-friendly way to share your professional information. If you’re looking to stay ahead in your networking game, trusted digital business cards from a leading platform like Digitize Cards are your best bet. Here’s why.
**Why Digital Business Cards?**
Digital business cards offer several advantages over traditional paper cards. They are easy to update, eliminating the need to print new cards whenever your information changes. They are also more interactive, allowing you to include links to your social media profiles, websites, and portfolios. Additionally, they are environmentally friendly, reducing paper waste.
**Key Features of [Digital Business Cards](https://digitizecards.com/)**
**Contact Information:** Just like traditional cards, digital business cards include your name, phone number, email address, and company name. However, they also allow you to add much more.
**Professional Photo**: A professional photo adds a personal touch and makes your card more memorable. It helps people put a face to the name.
**Job Title and Company Logo:** Clearly stating your job title and including your company’s logo helps recipients immediately recognize your role and organization.
**Social Media Links:** You can integrate links to your professional social media profiles, such as LinkedIn, Twitter, or Instagram. This allows recipients to connect with you on multiple platforms.
**Interactive Features:** Digital business cards can include QR codes, videos, and links to your portfolio or website. These interactive elements make your card more engaging and informative.
**Personalized Message:** A brief personalized message or tagline can make your card stand out and leave a lasting impression.
**Digitize Business Cards in Pune**
For those in Pune, Digitize Cards offers a range of customizable options to create impactful digital business cards. Their platform provides easy-to-use templates and innovative features to help you design a professional and memorable digital business card. Whether you need to digitize business cards for a corporate event or personal networking, Digitize Cards in Pune has got you covered.
**Conclusion**
Digital business cards are the future of networking. They offer unparalleled convenience, interactivity, and eco-friendliness. By leveraging a trusted platform like Digitize Cards, you can create digital business cards that truly represent your professional identity. Embrace the future of networking with trusted digital business cards from the leading platform and make a lasting impression in every interaction.
| digitize_card_ff7c61c21b0 |
1,872,095 | Custom Software Development | Dev Technosys offers comprehensive custom software development services across various industries.... | 0 | 2024-05-31T13:36:51 | https://dev.to/shane_cornerus/custom-software-development-50bk | softwaredevelopment |
Dev Technosys offers comprehensive [custom software development services](https://devtechnosys.com/custom-software-development.php) across various industries. With over a decade of experience, they specialize in tailored solutions for healthcare, manufacturing, CRM, and more. Their expertise includes software consulting, integration, product development, and ongoing maintenance. Connect with Dev Technosys for scalable and efficient software solutions. | shane_cornerus |
1,872,094 | Choose The Best Influencer Marketing Agency Dubai | Welcome to the bustling city of Abu Dhabi, where the skyline is dotted with innovation, culture, and... | 0 | 2024-05-31T13:29:32 | https://dev.to/ybi_social_61ca8c4865f350/choose-the-best-influencer-marketing-agency-dubai-4dei | Welcome to the bustling city of Abu Dhabi, where the skyline is dotted with innovation, culture, and opportunity. Among its many thriving industries, influencer marketing stands out as a key player in helping businesses connect with their audiences in authentic and impactful ways. Today, we’re shining the spotlight on one of the best influencer marketing agencies in this vibrant city – let's dive into what makes them special!
Why Influencer Marketing?
Before we jump into the specifics of our featured agency, let’s quickly talk about why influencer marketing is a game-changer for businesses. In a world overflowing with advertisements, people crave genuine connections. Influencers bridge the gap between brands and their audiences by sharing honest, relatable content that resonates. This approach not only builds trust but also boosts engagement and drives sales.
Meet the Star of Abu Dhabi: YBI Social
One agency making waves in the influencer marketing space in Abu Dhabi is YBI Social. With a team of passionate experts, they’ve built a reputation for creating innovative campaigns that deliver real results. Here’s why they stand out:
1. Deep Local Insights
YBI Social understands Abu Dhabi's unique culture and market dynamics. They know what clicks with the local audience and tailor their strategies accordingly. This deep local knowledge ensures that their campaigns are not just effective but also culturally relevant.
2. Strong Relationships with Influencers
Building and maintaining strong relationships with influencers is key to any successful campaign. YBI Social has a vast network of influencers from various niches – from lifestyle and fashion to tech and travel. This diversity means they can match the perfect influencer with your brand, ensuring the message hits home.
3. Customized Campaigns
No two brands are the same, and YBI Social gets that. They take the time to understand your brand's goals, values, and target audience. This personalized approach allows them to design campaigns that are not just creative but also aligned with your business objectives.
4. Data-Driven Strategies
In the digital age, data is gold. YBI Social uses advanced analytics to track and measure the performance of their campaigns. This data-driven approach ensures that every strategy is optimized for maximum impact, providing you with clear insights and impressive results.
5. Comprehensive Services
From campaign planning and influencer selection to content creation and performance analysis, YBI Social offers end-to-end services. Their comprehensive approach means you can sit back and relax while they handle every aspect of your influencer marketing campaign.
Success Stories
YBI Social has worked with numerous brands, both big and small, helping them achieve their marketing goals. For example, they partnered with a local fashion brand to launch a new clothing line. By collaborating with popular fashion influencers in Abu Dhabi, they were able to generate significant buzz, leading to a 30% increase in sales within the first month.
Another success story involves a tech startup looking to increase its app downloads. YBI Social crafted a campaign featuring tech influencers who shared their genuine experiences with the app. This strategy not only increased downloads but also boosted user engagement and retention.
Why Choose YBI Social?
In a city as dynamic as Abu Dhabi, having a reliable [Influencer Marketing Agency Dubai](https://ybi.social/) by your side can make all the difference. YBI Social’s blend of local insights, strong influencer relationships, customized strategies, data-driven approach, and comprehensive services makes them a top choice for any brand looking to make a mark.
Whether you’re a small business looking to build your brand or a large corporation aiming to reach new heights, YBI Social has the expertise and passion to help you succeed.
Get in Touch
Ready to take your marketing to the next level? Reach out to YBI Social today and discover how they can help your brand shine in Abu Dhabi and beyond. | ybi_social_61ca8c4865f350 | |
1,872,092 | Cool Tools & Hot Trends: Generative AI & Smart Scraping | Hello people, Welcome to this week's newsletter. Sharing interesting news and tools for this week.... | 0 | 2024-05-31T13:26:59 | https://dev.to/shreyvijayvargiya/cool-tools-hot-trends-generative-ai-smart-scraping-3lg | news, webdev, javascript, beginners | Hello people,
Welcome to this week's newsletter. Sharing interesting news and tools for this week.
[Read directly on browser](https://ihatereading.in/newsletter)
[Maily.to - Open-source Email Editor](https://maily.to/)
I was searching for email editor npm modules and tools, and Maily.to was among the first results on Google search. I've used the website and found it pleasing, with a good user experience as well, so I'm sharing it with devs across the globe. In addition, this is a cool small project that frontend developers can add to their portfolio.
[Tiptap: Rich Text Editor](https://tiptap.dev/docs/editor/introduction)
I reached the Tiptap website through Chrome's Inspect tool. While inspecting the Maily.to website, the first thing I did was open Inspect, and that's where I found this cool open-source module for building a rich text editor on the frontend.
[Frontend Resources](https://dev.to/miguelrodriguezp99/frontend-resources-1dl4)
An extensive list of resources for front-end development developers.
[Smart Scraping Web API, ScrapeNinja](https://scrapeninja.net/)
ScrapeNinja is making $8K/month, and it's just a SaaS website run by a bunch of developers. This is a real example of how a developer can run a small yet hugely profitable business from a laptop. Check it out; it's good inspiration for a small side project.
[Vercel AI SDK Introduction](https://www.youtube.com/watch?v=UDm-hvwpzBI&t=6s)
I've been watching Vercel AI SDK videos on YouTube to get started with the AI SDK in Next.js; here is the video.
[Vercel launched version 15 of Next.js](https://www.youtube.com/watch?v=Txf2pbR2Cyk)
Vercel launched Next.js version 15. After launching RSC components and the Vercel AI SDK, we have a new version; check the video. React 19 has also launched, so Vercel is thinking along the same lines. HINT: it's something related to the React Compiler, which I've already mentioned in the last newsletter.
[Want to learn Generative AI](https://www.youtube.com/watch?v=2IK3DFHRFfw)
Watch the YouTube video, with more than a million views, on Generative AI and what it is in a nutshell. LangChain is also quite trendy for integrating LLMs or GPT into your website to create generative AI apps or AI-powered apps; [watch this playlist for LLM apps](https://www.youtube.com/watch?v=nAmC7SoVLd8&list=PLeo1K3hjS3uu0N_0W6giDXzZIcB07Ng_F)
[React 19 useOptimistic new hooks](https://x.com/steph_dietz_/status/1795468659617386956)
React 19 will go live in May 2024, and the big change is the React Compiler, which I've already covered in the last newsletter. In addition, React 19 introduces a few more hooks, and useOptimistic is one of them; watch the video to understand it.
That's it for today
Feel free to share it with someone in need, see you next Friday
Shrey
| shreyvijayvargiya |
1,870,269 | AR Game ~Research of AR Foundation~ | Table of contents Background What is AR foundation Supported device versions Installation of AR... | 0 | 2024-05-31T13:25:06 | https://dev.to/takeda1411123/ar-game-research-of-ar-foundation-13eg | unity3d, gamedev | Table of contents
- Background
- What is AR foundation
- Supported device versions
- Installation of AR foundation
- Functions of AR foundation
- Next Step
# Background
I will develop an AR game with Unity, AR Foundation, and related tools. To learn AR development, I am researching AR and the software related to it. This blog shows the research and the process of developing the AR game. If you have a question, I am happy to answer it.
# What is AR foundation
There are many platforms for developing AR. In 2017, Apple released ARKit, which allows AR development on iOS devices.

Ref: https://testgrid.io/blog/automation-testing-an-arkit-application/
In the same year, Google released ARCore, which allows AR development on Android devices.

Ref: https://developers.googleblog.com/en/announcing-arcore-10-and-new-updates-to-google-lens/
Each platform relies on the device's OS. Therefore, if you wanted to develop an AR application for both iOS and Android devices, you would need to use both platforms (ARKit and ARCore).
To solve this problem, we can use AR Foundation. It was released by Unity in 2018. By using it, developers can create AR apps for both platforms from a single project.
# Supported device versions
A complete list of ARCore-supported Android devices can be found at the link below.
https://developers.google.com/ar/devices
A complete list of ARKit-supported Apple devices can be found at the link below.
https://www.apple.com/in/augmented-reality/
# Installation of AR foundation
You can install AR Foundation by following the links below.
https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@5.0/manual/project-setup/install-arfoundation.html
https://developers.google.com/ar/develop/unity-arf/getting-started-ar-foundation
As of May 2024, AR Foundation 5.0 is compatible with Unity 2021.2 and up.
## How to install the AR Foundation Unity package
To install AR Foundation from the Package Manager in Unity:
1. Open Unity project
2. Window > Package Manager
3. Select Unity Registry
4. Search "AR Foundation"
5. Click **Install**
Or
When you create a Unity project, you can choose the AR Mobile project template in Unity Hub, which sets up the necessary AR packages. You do not need to import the AR Foundation package individually.
# Functions of AR foundation
The list below covers the functions of AR Foundation.
## Basic
- Device Tracking
- Track the device's position and rotation in physical space
- Manage Session
- Enable, disable, and configure AR on the target platform
- Camera
- Render images from device cameras and perform light estimation
- Occlusion
- Occlude AR content with physical objects
## Detection and Tracking
- Plane detection
- Detect and track surfaces
- Point clouds
- Detect and track feature points
- Anchors
- Track arbitrary points in space
- Image
- Detect and track 2D images
- Object
- Detect and track 3D objects
- Face
- Detect and track human faces
- Body
- Detect and track a human body
- Participants
- Track other devices in a shared AR session.
## Others
- Raycasts
- Cast rays against tracked items
- Meshing
- Generate meshes of the environment
- Environment probes
- Generate cubemaps of the environment
# Next Step
I was going to display a sample AR object in Unity, but it will take a little more time. Therefore, I will show how to implement AR objects in Unity in the next post. | takeda1411123 |
1,872,091 | I want to implement P2P connection via Signaling Server | I have signaling server and I want to implement P2P connection via signaling server. Here is the... | 0 | 2024-05-31T13:24:48 | https://dev.to/code_healer_e01164fa18627/i-want-to-implement-p2p-connection-via-signaling-server-5167 | webdev, javascript, programming, beginners | I have signaling server and I want to implement P2P connection via signaling server.
Here is the information of signaling server.
https://p2p.vantagemdm.com:8890
- Supported Socket API functions
connect
disconnect
reconnect_error
connect_error
error
- Supported signalling/socket events. Send a ping to ensure that the connection will remain alive:
/v1/alive (every 10 seconds)
/v1/ready
/v1/stream/start
/v1/stream/destroy
/v1/stream/joined
/v1/stream/leaved
/v1/sdp/peer_ice
/v1/error
I would like some advice from an expert in WebRTC and signaling.
| code_healer_e01164fa18627 |
1,871,868 | Sparky - simple and efficient alternative to Ansible | How to manage hundreds of hosts with Sparky | 0 | 2024-05-31T13:22:49 | https://dev.to/melezhik/sparky-simple-and-efficient-alternative-to-ansible-1fod | raku, automation, ansible | ---
title: Sparky - simple and efficient alternative to Ansible
published: true
description: How to manage hundreds of hosts with Sparky
tags: Raku, automation, ansible
cover_image: https://raw.githubusercontent.com/melezhik/sparky/master/logos/sparky.png
# Use a ratio of 100:42 for best results.
# published_at: 2024-05-31 09:47 +0000
---

---
So, you have a hundred VMs you need to manage, and you have ... Ansible? I should stop here, as this is the tool that is standard in configuration management nowadays, but I dare to continue and say there is a better alternative to it.
*But before we get into it, why am I so frustrated with Ansible? Here are my points:*
* The YAML-based declarative DSL really stinks for complex tasks, as it lacks the flexibility that imperative languages have.
* YAML is not even a programming language, and you are going to pay the price very soon.
* To keep Ansible code clean and simple, extra effort is required: one needs to refactor all the complexity out of YAML into Python modules, and this feels like "why did I _even_ start using a YAML DSL?"
* Ansible reports are frustrating as I always need to add these debug tasks to show real STDOUT/STDERR emitted from commands, where it should just work out of the box.
* Ansible ties me to the idea of "running on a host," where sometimes I need to run tasks not tied to hosts. Yes, you can still use "ansible_connection=local", but this feels awkward.
# Meet Sparky
So, meet Sparky - an elegant, efficient, batteries-included automation tool. It's written in the powerful and modern [Raku](http://raku.org) language, with a [Bulma](https://bulma.io) CSS frontend and web sockets.
To install Sparky - install [Rakudo](https://rakudo.org) first and then install Sparky itself as a Raku module:
```
curl https://rakubrew.org/install-on-perl.sh | sh
eval "$(~/.rakubrew/bin/rakubrew init Bash)"
rakubrew download moar-2024.05
git clone https://github.com/melezhik/sparky.git
cd sparky/
# install Sparky and it's dependencies
zef install --/test .
# init sparky sqlite database
raku db-init.raku
# run sparky job runner
nohup sparkyd >~/.sparkyd.log < /dev/null &
# run sparky web console
cro run
```
This simple scenario gets it up and running; if you go to http://127.0.0.1:4000 you'll see a nice Sparky web console. We use the console to run Sparky jobs.

# Show me the design
So we have a control plane that would manage many hosts over ssh, using push mode:
```
---------------
| CP , Sparky |
---------------
[ssh]
/ / | \ \
host host host host host
```
This is pretty much what Ansible does ...
# Show me the code
Now say we have 5 NGINX servers we need to restart; let's drop in a simple Sparky job to do this in pure [Raku](https://raku.org):
```perl
use Sparky::JobApi;
class Pipeline does Sparky::JobApi::Role {
method stage-main {
for 1..5 -> $i {
my $j = self.new-job :workers<5>;
$j.queue: %(
sparrowdo => %(
bootstrap => true,
host => "nginx_{$i}.local.domain"
),
tags => %(
stage => "child",
i => $i
)
);
}
}
method stage-child {
service-restart "nginx"
}
}
```
In this scenario, Sparky will run five parallel jobs that restart nginx on five hosts. Simple and elegant.
Moreover, those five jobs will appear as five separate reports in the Sparky UI.
# Got interested?
Of course, this is only a quick glance at Sparky architecture and features, things to cover further:
* Job orchestration (DAGs)
* Core DSL (pure Raku)
* Custom UIs
* Authentication (oauth2) and security access list
* Writing more sophisticated scenarios
* Extending Sparky with plugins by using many programming languages
* Installing Sparky with MySQL/Postgresql database storage (instead of sqlite)
* Using Sparky as CI server (SCM triggering and cron jobs)
# Links
Sparky project - https://github.com/melezhik/sparky | melezhik |
1,872,085 | 64-Bit Assembly Language Lab 3 part-4 | Hello everybody I am back with the last part of Lab-3 which is some optional challenges on the... | 0 | 2024-05-31T13:18:47 | https://dev.to/yuktimulani/64-bit-assembly-language-lab-3-part-4-3ck1 | aarch64, assembler, coding, tables | Hello everybody I am back with the last part of Lab-3 which is some optional challenges on the assembler. So, lets get right into it.
In todays post we will be writing a program to print tables in the assembler. The exact spec goes like this.
Write a program in aarch64 assembly language to print the times tables from 1-12 (“1 x 1 = 1” through “12 x 12 = 144”).
The output looks like this.
```
[ymulani@aarch64-001 aarch64]$ ./tables
1 x 1 = 1
2 x 1 = 2
3 x 1 = 3
4 x 1 = 4
5 x 1 = 5
6 x 1 = 6
7 x 1 = 7
8 x 1 = 8
9 x 1 = 9
10 x 1 = 10
11 x 1 = 11
12 x 1 = 12
1 x 2 = 2
2 x 2 = 4
3 x 2 = 6
4 x 2 = 8
5 x 2 = 10
6 x 2 = 12
7 x 2 = 14
8 x 2 = 16
9 x 2 = 18
10 x 2 = 20
11 x 2 = 22
12 x 2 = 24
1 x 3 = 3
2 x 3 = 6
3 x 3 = 9
4 x 3 = 12
5 x 3 = 15
6 x 3 = 18
7 x 3 = 21
8 x 3 = 24
9 x 3 = 27
10 x 3 = 30
11 x 3 = 33
12 x 3 = 36
1 x 4 = 4
2 x 4 = 8
3 x 4 = 12
4 x 4 = 16
5 x 4 = 20
6 x 4 = 24
7 x 4 = 28
8 x 4 = 32
9 x 4 = 36
10 x 4 = 40
11 x 4 = 44
12 x 4 = 48
1 x 5 = 5
2 x 5 = 10
3 x 5 = 15
4 x 5 = 20
5 x 5 = 25
6 x 5 = 30
7 x 5 = 35
8 x 5 = 40
9 x 5 = 45
10 x 5 = 50
11 x 5 = 55
12 x 5 = 60
1 x 6 = 6
2 x 6 = 12
3 x 6 = 18
4 x 6 = 24
5 x 6 = 30
6 x 6 = 36
7 x 6 = 42
8 x 6 = 48
9 x 6 = 54
10 x 6 = 60
11 x 6 = 66
12 x 6 = 72
1 x 7 = 7
2 x 7 = 14
3 x 7 = 21
4 x 7 = 28
5 x 7 = 35
6 x 7 = 42
7 x 7 = 49
8 x 7 = 56
9 x 7 = 63
10 x 7 = 70
11 x 7 = 77
12 x 7 = 84
1 x 8 = 8
2 x 8 = 16
3 x 8 = 24
4 x 8 = 32
5 x 8 = 40
6 x 8 = 48
7 x 8 = 56
8 x 8 = 64
9 x 8 = 72
10 x 8 = 80
11 x 8 = 88
12 x 8 = 96
1 x 9 = 9
2 x 9 = 18
3 x 9 = 27
4 x 9 = 36
5 x 9 = 45
6 x 9 = 54
7 x 9 = 63
8 x 9 = 72
9 x 9 = 81
10 x 9 = 90
11 x 9 = 99
12 x 9 = 198
1 x 10 = 10
2 x 10 = 20
3 x 10 = 30
4 x 10 = 40
5 x 10 = 50
6 x 10 = 60
7 x 10 = 70
8 x 10 = 80
9 x 10 = 90
10 x 10 = 190
11 x 10 = 110
12 x 10 = 120
1 x 11 = 11
2 x 11 = 22
3 x 11 = 33
4 x 11 = 44
5 x 11 = 55
6 x 11 = 66
7 x 11 = 77
8 x 11 = 88
9 x 11 = 99
10 x 11 = 110
11 x 11 = 121
12 x 11 = 132
1 x 12 = 12
2 x 12 = 24
3 x 12 = 36
4 x 12 = 48
5 x 12 = 60
6 x 12 = 72
7 x 12 = 84
8 x 12 = 96
9 x 12 = 198
10 x 12 = 120
11 x 12 = 132
12 x 12 = 144
```
## The Code reveal
```
.text
.globl _start
min = 1 /* starting value for the loop index; **note that this is a symbol (constant)**, not a variable */
max = 13 /* loop exits when the index hits this number (loop condition is i<max) */
table = 1
_start:
mov x19, min // Initialize the multiplier (row index)
mov x20, table // initialize the multiplicand (column index)
loop:
add x15, x19, 0x30
adr x14, msg
mov x12, 10
udiv x13, x19, x12
add x16, x13, 0x30
cmp x16, 0x30
b.eq ones
strb w16, [x14]
ones:
adr x14, msg+1
msub x13, x13, x12, x19
add x13, x13, 0x30
strb w13, [x14]
tensTable:
add x15, x20, 0x30
adr x14, msg+5
mov x12, 10
udiv x13, x20, x12
add x16, x13, 0x30
cmp x16, 0x30
b.eq onesTable
strb w16, [x14]
onesTable:
adr x14, msg+6
msub x13, x13, x12, x20
add x13, x13, 0x30
strb w13, [x14]
hundredres:
mul x21, x19, x20
adr x14, msg+10
mov x12, 100
udiv x15, x21, x12
add x13, x15, 0x30
cmp x13, 0x30
b.eq tensres
strb w13, [x14]
tensres:
msub x15, x15, x12, x21
mul x21, x19, x20
adr x14, msg+11
mov x12, 10
udiv x17, x15, x12
add x13, x17, 0x30
cmp x13, 0x30
b.eq onesres
strb w13, [x14]
onesres:
adr x14, msg+12
msub x17, x17, x12, x15
add x17, x17, 0x30
strb w17, [x14]
mov X0, 1
adr x1, msg
mov x2, len
mov x8, 64
svc 0
add x19, x19, 1
cmp x19, max
b.ne loop
mov x19, min
mov x13, ' '
adr x14, msg
strb w13, [x14]
adr x14, msg+10
strb w13, [x14]
adr x14, msg+11
strb w13, [x14]
add x20, x20, 1
cmp x20, max
b.ne loop
mov x0, 0
mov x8, 93
svc 0 /* syscall */
.data
msg: .ascii " # x # = #\n"
len= . - msg
```
## Walkthrough
### Data Section
```
.data
msg: .ascii " # x # = #\n"
len= . - msg
```
- `msg` is the format string used for printing each line of the multiplication table. It contains placeholders for the two numbers being multiplied and the result.
- `len` is calculated as the length of the msg string.
### Text Section
```
.text
.globl _start
min = 1
max = 13
table = 1
```
- `min`, `max`, and `table` are constants used for loop control.
- The entry point `_start` is defined as a global label.
### Start of the Program
```
_start:
mov x19, min // Initialize the multiplier (row index)
mov x20, table // initialize the multiplicand (column index)
```
- Initialize x19 with min (1), representing the starting row index.
- Initialize x20 with table (1), representing the starting column index.
### Main Loop
```
loop:
add x15, x19, 0x30
adr x14, msg
mov x12, 10
udiv x13, x19, x12
add x16, x13, 0x30
cmp x16, 0x30
b.eq ones
strb w16, [x14]
ones:
adr x14, msg+1
msub x13, x13, x12, x19
add x13, x13, 0x30
strb w13, [x14]
```
- The loop starts by converting the current row index `x19` to a character.
- `adr x14, msg` loads the address of the `msg` string into `x14`.
- `udiv x13, x19, x12` divides `x19` by 10 to get the tens digit.
- `add x16, x13, 0x30` converts the tens digit to its ASCII representation.
- `cmp x16, 0x30` checks if the tens digit is zero.
- If the tens digit is zero, it branches to the `ones` label.
- Otherwise, it stores the tens digit in `msg` and proceeds.
### Formatting Ones Digit for Row Index
```
ones:
adr x14, msg+1
msub x13, x13, x12, x19
add x13, x13, 0x30
strb w13, [x14]
```
- `adr x14, msg+1` loads the address of the second character in msg.
- `msub x13, x13, x12, x19` calculates the remainder to get the ones digit.
- `add x13, x13, 0x30` converts the ones digit to its ASCII representation.
- `strb w13, [x14]` stores the ones digit in msg.
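The `udiv`/`msub` pair implements a divide-and-remainder (divmod) operation. A minimal Python sketch of this two-character conversion (the function name is mine, not part of the lab):

```python
def two_digit_ascii(n):
    # udiv: tens = n // 10; msub: ones = n - tens * 10
    tens = n // 10
    ones = n - tens * 10
    # A zero tens digit is skipped (b.eq), leaving the blank from the template
    tens_ch = ' ' if tens == 0 else chr(0x30 + tens)
    ones_ch = chr(0x30 + ones)
    return tens_ch + ones_ch

print(two_digit_ascii(7))   # ' 7'
print(two_digit_ascii(12))  # '12'
```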
### Formatting Multiplier
```
tensTable:
add x15, x20, 0x30
adr x14, msg+5
mov x12, 10
udiv x13, x20, x12
add x16, x13, 0x30
cmp x16, 0x30
b.eq onesTable
strb w16, [x14]
onesTable:
adr x14, msg+6
msub x13, x13, x12, x20
add x13, x13, 0x30
strb w13, [x14]
```
- `add x15, x20, 0x30` converts the multiplier `x20` to a character.
- `adr x14, msg+5` loads the address of the character in msg where the multiplier should be placed.
- `udiv x13, x20, x12` divides `x20` by 10 to get the tens digit.
- `add x16, x13, 0x30` converts the tens digit to its ASCII representation.
- `cmp x16, 0x30` checks if the tens digit is zero.
- If the tens digit is zero, it branches to the onesTable label.
- Otherwise, it stores the tens digit in msg.
### Formatting Ones Digit for Multiplier
```
onesTable:
adr x14, msg+6
msub x13, x13, x12, x20
add x13, x13, 0x30
strb w13, [x14]
```
- `adr x14, msg+6` loads the address of the sixth character in msg.
- `msub x13, x13, x12, x20` calculates the remainder to get the ones digit.
- `add x13, x13, 0x30` converts the ones digit to its ASCII representation.
- `strb w13, [x14]` stores the ones digit in msg.
### Formatting Result
```
hundredres:
mul x21, x19, x20
adr x14, msg+10
mov x12, 100
udiv x15, x21, x12
add x13, x15, 0x30
cmp x13, 0x30
b.eq tensres
strb w13, [x14]
tensres:
msub x15, x15, x12, x21
mul x21, x19, x20
adr x14, msg+11
mov x12, 10
udiv x17, x15, x12
add x13, x17, 0x30
cmp x13, 0x30
b.eq onesres
strb w13, [x14]
onesres:
adr x14, msg+12
msub x17, x17, x12, x15
add x17, x17, 0x30
strb w17, [x14]
```
- `mul x21, x19, x20` calculates the product of the row and column indices.
- `adr x14, msg+10` loads the address where the result should start being placed in msg.
- `udiv x15, x21, x12` divides the product by 100 to get the hundreds digit.
- `add x13, x15, 0x30` converts the hundreds digit to its ASCII representation.
- `cmp x13, 0x30` checks if the hundreds digit is zero.
- If the hundreds digit is zero, it branches to the tensres label.
- Otherwise, it stores the hundreds digit in msg.
### Formatting Tens and Ones Digit for Result
```
tensres:
msub x15, x15, x12, x21
mul x21, x19, x20
adr x14, msg+11
mov x12, 10
udiv x17, x15, x12
add x13, x17, 0x30
cmp x13, 0x30
b.eq onesres
strb w13, [x14]
onesres:
adr x14, msg+12
msub x17, x17, x12, x15
add x17, x17, 0x30
strb w17, [x14]
```
- These sections handle formatting the tens and ones digits for the result similarly to the previous sections.
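Note that the `b.eq onesres` skip appears to fire whenever the tens digit is zero, even when a hundreds digit was just written; the stale character left in the buffer is why the sample output shows lines like `12 x 9 = 198` instead of `108`. A Python sketch of the intended three-digit formatting (the function name is mine), which blanks only leading zeros:

```python
def result_field(n):
    # hundreds = n // 100, then divmod the remainder into tens and ones
    hundreds, rem = divmod(n, 100)
    tens, ones = divmod(rem, 10)
    h_ch = ' ' if hundreds == 0 else chr(0x30 + hundreds)
    # Blank the tens place only when it is a *leading* zero
    t_ch = ' ' if n < 10 else chr(0x30 + tens)
    return h_ch + t_ch + chr(0x30 + ones)

print(result_field(7))    # '  7'
print(result_field(42))   # ' 42'
print(result_field(108))  # '108'
```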
### Printing the Formatted Message
```
mov X0, 1
adr x1, msg
mov x2, len
mov x8, 64
svc 0
```
- Prepares the syscall to write the message to the standard output.
- `mov X0, 1` sets the file descriptor to 1 (standard output).
- `adr x1, msg` loads the address of the message.
- `mov x2, len` sets the length of the message.
- `mov x8, 64` sets the syscall number for write.
- `svc 0` makes the syscall.
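The same write can be expressed in Python, where `os.write` maps almost directly onto the `write` syscall (number 64 on aarch64 Linux) used above; the buffer contents here are just an example line:

```python
import os

line = b"1 x 1 = 1\n"  # example buffer contents
n = os.write(1, line)  # fd 1 = stdout; mirrors mov X0, 1 ... svc 0
```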
### Loop Control
```
add x19, x19, 1
cmp x19, max
b.ne loop
mov x19, min
mov x13, ' '
adr x14, msg
strb w13, [x14]
adr x14, msg+10
strb w13, [x14]
adr x14, msg+11
strb w13, [x14]
add x20, x20, 1
cmp x20, max
b.ne loop
```
- `add x19, x19, 1`: Increment the row index (`x19`).
- `cmp x19, max`: Compare the current row index (`x19`) with the maximum limit (`max`, which is 13).
- `b.ne loop`: If `x19` is not equal to `max`, branch back to `loop`.
- `mov x19, min`: Reset the row index (`x19`) to the minimum value (`min`, which is 1).
- Insert spaces into the `msg` string to clear the tens and hundreds positions left over from the previous column.
- `add x20, x20, 1`: Increment the column index (`x20`).
- `cmp x20, max`: Compare the current column index (`x20`) with the maximum limit (`max`).
- `b.ne loop`: If `x20` is not equal to `max`, branch back to `loop`.
- If both row and column indices reach their maximum limits, the loop exits, and the program proceeds to termination.
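The control flow above amounts to two nested loops. A Python sketch of the same structure (variable names are mine):

```python
MIN, MAX = 1, 13  # mirrors the min/max symbols in the assembly
lines = []
for multiplier in range(MIN, MAX):        # outer loop: x20 (column index)
    for multiplicand in range(MIN, MAX):  # inner loop: x19 (row index)
        lines.append(f"{multiplicand} x {multiplier} = {multiplicand * multiplier}")

print(lines[0])   # '1 x 1 = 1'
print(lines[-1])  # '12 x 12 = 144'
```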
So, that is it. I won't say much more after this long walkthrough full of explanations. I hope you enjoyed it.
Until next time, Happy Coding!!!!
| yuktimulani |
1,872,084 | Android App Development vs iOS App Development | When deciding between Android app development and iOS app development, it's essential to understand... | 0 | 2024-05-31T13:17:50 | https://dev.to/jenniferlily/android-app-development-vs-ios-app-development-2i68 | When deciding between Android app development and iOS app development, it's essential to understand the differences and unique advantages of each platform.
**Market Share and Audience:**
Android dominates the global market share with a larger user base, particularly in regions like Asia, Africa, and Latin America. iOS, however, has a strong presence in North America, Europe, and Australia. An [iOS app development](https://www.centrixcube.ae/flutter-app-development) firm often targets users who are typically more willing to spend on apps and in-app purchases, which can be advantageous for monetization.
**Development Environment:**
Android app development uses Java or Kotlin as programming languages and relies on Android Studio as its integrated development environment (IDE). In contrast, iOS app development primarily uses Swift or Objective-C, with Xcode as the preferred IDE. Each environment offers robust tools, but developers often find Xcode's interface more streamlined for a seamless development experience.
**Fragmentation:**
Android's open ecosystem means dealing with a wide range of devices, screen sizes, and OS versions, which can complicate the development and testing process. On the other hand, iOS offers a more controlled environment with fewer device variations, making it easier to ensure app consistency and performance across all devices.
**Revenue and Monetization:**
While Android apps reach a broader audience, iOS apps generally generate higher revenue through the App Store. An iOS app development company might focus on this aspect to maximize profitability.
**Cross-Platform Solutions:**
Using a Flutter app development company can bridge the gap between Android and iOS development. Flutter, an open-source framework by Google, allows developers to create high-quality, natively compiled applications for both platforms from a single codebase. This approach not only saves time and resources but also ensures a consistent user experience across devices.
**Conclusion**
Both Android and iOS app development have their own sets of benefits and challenges. The choice largely depends on your target audience, budget, and specific project requirements. Leveraging the expertise of a [Flutter app development company in Dubai](https://www.centrixcube.ae/flutter-app-development) or an iOS app development company can help you navigate these complexities and achieve your app development goals efficiently. | jenniferlily | |
1,872,083 | RAG with llama.cpp and external API services | txtai is an all-in-one embeddings database for semantic search, LLM orchestration and language... | 11,018 | 2024-05-31T13:16:41 | https://neuml.hashnode.dev/rag-with-llamacpp-and-external-api-services | ai, llm, rag, vectordatabase | [](https://colab.research.google.com/github/neuml/txtai/blob/master/examples/62_RAG_with_llama_cpp_and_external_API_services.ipynb)
[txtai](https://github.com/neuml/txtai) is an all-in-one embeddings database for semantic search, LLM orchestration and language model workflows.
txtai has been and always will be a local-first framework. It was originally designed to run models on local hardware using Hugging Face Transformers. As the AI space has evolved over the last year, so has txtai. Additional LLM inference frameworks have been available for a while using llama.cpp and external API services (via LiteLLM). Recent changes have added the ability to use these frameworks for vectorization and made it easier to use for LLM inference.
This article will demonstrate how to run retrieval-augmented-generation (RAG) processes (vectorization and LLM inference) with llama.cpp and external API services.
# Install dependencies
Install `txtai` and all dependencies.
```
# Install txtai and dependencies
pip install llama-cpp-python[server] --extra-index-url https://abetlen.github.io/llama-cpp-python/whl/cu121
pip install txtai[pipeline-llm]
```
# Embeddings with llama.cpp vectorization
The first example will build an Embeddings database backed by [llama.cpp](https://github.com/ggerganov/llama.cpp) vectorization.
The llama.cpp project states: _The main goal of llama.cpp is to enable LLM inference with minimal setup and state-of-the-art performance on a wide variety of hardware - locally and in the cloud_.
Let's give it a try.
```python
from txtai import Embeddings
# Create Embeddings with llama.cpp GGUF model
embeddings = Embeddings(
path="second-state/All-MiniLM-L6-v2-Embedding-GGUF/all-MiniLM-L6-v2-Q4_K_M.gguf",
content=True
)
# Load dataset
wikipedia = Embeddings()
wikipedia.load(provider="huggingface-hub", container="neuml/txtai-wikipedia")
query = """
SELECT id, text FROM txtai
order by percentile desc
LIMIT 10000
"""
# Index dataset
embeddings.index(wikipedia.search(query))
```
Now that the Embeddings database is ready, let's run a search query.
```python
embeddings.search("Inventors of electric-powered devices")
```
```
[{'id': 'Thomas Edison',
'text': 'Thomas Alva Edison (February 11, 1847October 18, 1931) was an American inventor and businessman. He developed many devices in fields such as electric power generation, mass communication, sound recording, and motion pictures. These inventions, which include the phonograph, the motion picture camera, and early versions of the electric light bulb, have had a widespread impact on the modern industrialized world. He was one of the first inventors to apply the principles of organized science and teamwork to the process of invention, working with many researchers and employees. He established the first industrial research laboratory.',
'score': 0.6758285164833069},
{'id': 'Nikola Tesla',
'text': 'Nikola Tesla (; , ; 1856\xa0– 7 January 1943) was a Serbian-American inventor, electrical engineer, mechanical engineer, and futurist. He is best-known for his contributions to the design of the modern alternating current (AC) electricity supply system.',
'score': 0.6077840328216553},
{'id': 'Alexander Graham Bell',
'text': 'Alexander Graham Bell (, born Alexander Bell; March 3, 1847 – August 2, 1922) was a Scottish-born Canadian-American inventor, scientist and engineer who is credited with patenting the first practical telephone. He also co-founded the American Telephone and Telegraph Company (AT&T) in 1885.',
'score': 0.4573010802268982}]
```
As we can see, this Embeddings database works just like any other Embeddings database. The difference is that it's using a llama.cpp model for vectorization instead of PyTorch.
# RAG with llama.cpp
LLM inference with llama.cpp is not a new txtai feature. A recent change added support for conversational messages in addition to standard prompts. This abstracts away having to understand prompting formats.
Let's run a retrieval-augmented-generation (RAG) process fully backed by llama.cpp models.
_It's important to note that conversational messages work with all LLM backends supported by txtai (transformers, llama.cpp, litellm)._
```python
from txtai import LLM
# LLM instance
llm = LLM(path="TheBloke/Mistral-7B-OpenOrca-GGUF/mistral-7b-openorca.Q4_K_M.gguf")
# Question and context
question = "Write a list of invented electric-powered devices"
context = "\n".join(x["text"] for x in embeddings.search(question))
# Pass messages to LLM
response = llm([
{"role": "system", "content": "You are a friendly assistant. You answer questions from users."},
{"role": "user", "content": f"""
Answer the following question using only the context below. Only include information specifically discussed.
question: {question}
context: {context}
"""}
])
print(response)
```
```
Based on the given context, here's a list of invented electric-powered devices:
1. Electric light bulb by Thomas Edison
2. Phonograph by Thomas Edison
3. Motion picture camera by Thomas Edison
4. Alternating current (AC) electricity supply system by Nikola Tesla
5. Telephone by Alexander Graham Bell
```
And just like that, RAG with llama.cpp🦙!
# Embeddings with external vectorization
Next, we'll show how an Embeddings database can integrate with external API services via [LiteLLM](https://github.com/BerriAI/litellm).
In the LiteLLM project's own words: _LiteLLM handles loadbalancing, fallbacks and spend tracking across 100+ LLMs. All in the OpenAI format._
Let's first startup a local API service to use for this demo.
```
# Download models
wget https://huggingface.co/second-state/All-MiniLM-L6-v2-Embedding-GGUF/resolve/main/all-MiniLM-L6-v2-Q4_K_M.gguf
wget https://huggingface.co/TheBloke/Mistral-7B-OpenOrca-GGUF/resolve/main/mistral-7b-openorca.Q4_K_M.gguf
# Start local API services
nohup python -m llama_cpp.server --n_gpu_layers -1 --model all-MiniLM-L6-v2-Q4_K_M.gguf --host 127.0.0.1 --port 8000 &> vector.log &
nohup python -m llama_cpp.server --n_gpu_layers -1 --model mistral-7b-openorca.Q4_K_M.gguf --chat_format chatml --host 127.0.0.1 --port 8001 &> llm.log &
sleep 30
```
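An OpenAI-compatible embeddings endpoint like the one just started exchanges JSON payloads shaped as follows. The sketch below only builds and parses example payloads locally — no service calls are made, and the vector values are invented:

```python
import json

# Shape of a request body for an OpenAI-compatible /v1/embeddings endpoint
request = {"model": "all-MiniLM-L6-v2", "input": ["first text", "second text"]}

# Shape of the response in OpenAI's format (embedding values are made up)
response = json.loads(json.dumps({
    "object": "list",
    "data": [
        {"object": "embedding", "index": 0, "embedding": [0.1, 0.2]},
        {"object": "embedding", "index": 1, "embedding": [0.3, 0.4]}
    ],
    "model": "all-MiniLM-L6-v2"
}))

# One vector per input text, restored to input order via the index field
vectors = [item["embedding"] for item in sorted(response["data"], key=lambda x: x["index"])]
print(len(vectors))  # 2
```

This response format is what lets LiteLLM treat the local service like any other OpenAI-compatible provider.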
Now let's connect and use this local service to generate vectors for a new Embeddings database. Note that the local service responds in OpenAI's response format, hence the `path` setting below.
```python
from txtai import Embeddings
# Create Embeddings instance with external vectorization
embeddings = Embeddings(
path="openai/gpt-4-turbo",
content=True,
vectors={
"api_base": "http://localhost:8000/v1",
"api_key": "sk-1234"
}
)
# Load dataset
wikipedia = Embeddings()
wikipedia.load(provider="huggingface-hub", container="neuml/txtai-wikipedia")
query = """
SELECT id, text FROM txtai
ORDER BY percentile DESC
LIMIT 10000
"""
# Index dataset
embeddings.index(wikipedia.search(query))
```
```python
embeddings.search("Inventors of electric-powered devices")
```
```
[{'id': 'Thomas Edison',
'text': 'Thomas Alva Edison (February 11, 1847October 18, 1931) was an American inventor and businessman. He developed many devices in fields such as electric power generation, mass communication, sound recording, and motion pictures. These inventions, which include the phonograph, the motion picture camera, and early versions of the electric light bulb, have had a widespread impact on the modern industrialized world. He was one of the first inventors to apply the principles of organized science and teamwork to the process of invention, working with many researchers and employees. He established the first industrial research laboratory.',
'score': 0.6758285164833069},
{'id': 'Nikola Tesla',
'text': 'Nikola Tesla (; , ; 1856\xa0– 7 January 1943) was a Serbian-American inventor, electrical engineer, mechanical engineer, and futurist. He is best-known for his contributions to the design of the modern alternating current (AC) electricity supply system.',
'score': 0.6077840328216553},
{'id': 'Alexander Graham Bell',
'text': 'Alexander Graham Bell (, born Alexander Bell; March 3, 1847 – August 2, 1922) was a Scottish-born Canadian-American inventor, scientist and engineer who is credited with patenting the first practical telephone. He also co-founded the American Telephone and Telegraph Company (AT&T) in 1885.',
'score': 0.4573010802268982}]
```
Like the previous example with llama.cpp, this Embeddings database behaves exactly the same. The main difference is that content is sent to an external service for vectorization.
# RAG with External API services
For our last task, we'll run a retrieval-augmented generation (RAG) process fully backed by an external API service.
```python
from txtai import LLM
# LLM instance
llm = LLM(path="openai/gpt-4-turbo", api_base="http://localhost:8001/v1", api_key="sk-1234")
# Question and context
question = "Write a list of invented electric-powered devices"
context = "\n".join(x["text"] for x in embeddings.search(question))
# Pass messages to LLM
response = llm([
{"role": "system", "content": "You are a friendly assistant. You answer questions from users."},
{"role": "user", "content": f"""
Answer the following question using only the context below. Only include information specifically discussed.
question: {question}
context: {context}
"""}
])
print(response)
```
```
Based on the given context, a list of invented electric-powered devices includes:
1. Phonograph by Thomas Edison
2. Motion Picture Camera by Thomas Edison
3. Early versions of the Electric Light Bulb by Thomas Edison
4. AC (Alternating Current) Electricity Supply System by Nikola Tesla
5. Telephone by Alexander Graham Bell
```
# Wrapping up
txtai supports a number of different vector and LLM backends. The default method uses PyTorch models via the Hugging Face Transformers library. This article demonstrated how llama.cpp and external API services can also be used.
These additional vector and LLM backends enable maximum flexibility and scalability. For example, vectorization can be fully offloaded to an external API service or another local service. llama.cpp has great support for macOS devices and alternate accelerators such as AMD ROCm and Intel GPUs, and has been known to run on Raspberry Pi devices.
It's exciting to see the confluence of all these new advances coming together. Stay tuned for more!
| davidmezzetti |
1,872,082 | 5 Profound Benefits of Workday Integration Testing | Ensuring smooth integration is crucial in the modern, highly interconnected business landscape,... | 0 | 2024-05-31T13:16:21 | https://testingautomation.hashnode.dev/5-profound-benefits-of-workday-integration-testing | workday, integration, testing | 
Ensuring smooth integration is crucial in the modern, highly interconnected business landscape, where enterprises rely on multiple apps and platforms. The need for thorough integration testing is growing as Workday positions itself as the top cloud-based solution for human capital management (HCM) and financial management. Workday integration testing with other corporate apps has several benefits that can help businesses succeed and achieve operational excellence.
**Mitigating Integration Risks**
One of the key advantages of Workday integration testing is its capacity to successfully lower integration risks. By conducting thorough evaluations of Workday's data flow, functionality, and compatibility with other systems, businesses may identify potential issues and take action before they cause major disruptions. In the end, by averting costly downtime, data loss, and operational inefficiencies, this proactive approach safeguards the business's profits and image.
**Ensuring Data Integrity and Consistency**
The consistency and integrity of information across many platforms are critical in today's data-driven society. By ensuring that data is accurately and consistently moved between Workday and other apps, Workday integration testing helps reduce the possibility of errors, duplications, or inconsistencies. When data integrity is upheld, organizations can make educated judgments based on accurate and up-to-date information, building trust in their operations and decision-making processes.
**Enhancing Workforce Productivity**
Workforce productivity can be greatly increased by smooth integration between Workday and other company applications. Employees can access and utilize data from numerous sources without having to go through several interfaces or manually transfer information when applications operate together harmoniously. In addition to saving time, this optimized workflow lowers the possibility of human error, freeing up staff members to concentrate on more important and strategically oriented work.
**Improving Customer and Employee Experiences**
Workday integration testing directly improves customer and employee experiences. By guaranteeing a smooth integration between Workday and client-facing systems, such as CRM platforms, businesses can offer a consistent and unified experience throughout all interactions. In the same vein, an ecosystem of well-integrated apps can promote teamwork, expedite procedures, and give employees a more productive and satisfying work environment.
**Facilitating Compliance and Governance**
Organizations must abide by strict regulatory standards and governance structures in many different industries. In order to guarantee that data flows and operations amongst connected systems adhere to these standards, Workday integration testing is essential. Organizations can preserve business continuity, avert expensive fines, and preserve their reputation for following industry best practices by seeing and resolving possible compliance issues early in the integration process.
**Conclusion**
Workday integration testing implementation is more than just a technical endeavor; it's a calculated investment that can have a big long-term impact on companies. Opkey simplifies Workday testing with its robust automation solution. Opkey's no-code platform enables business users to write scripts independently, even if they don't know how to code. Through pre-built accelerators for functional, regression, performance, and security tests, the system speeds up testing. The test discovery feature of Opkey finds gaps for ideal coverage and reveals both automated and manual tests that are currently in place. It makes seamless end-to-end integration testing across development stages and technologies possible. Opkey removes the need for numerous testing platforms by connecting with well-known DevOps technologies. Opkey's all-inclusive and intuitive automation optimizes Workday productivity and quality. | rohitbhandari102 |
1,872,080 | Generative AI's Role in 2024 Elections | Introduction As we approach the 2024 elections, the role of generative AI in shaping political... | 27,548 | 2024-05-31T13:13:53 | https://dev.to/aishikl/generative-ais-role-in-2024-elections-4nb1 | <h2>Introduction</h2>
<p>As we approach the 2024 elections, the role of generative AI in shaping political campaigns and influencing voter behavior has become a topic of significant concern. This blog explores the potential and risks associated with the use of generative AI in the upcoming elections, drawing insights from various research studies and expert opinions.</p>
<h2>The Promise of Generative AI</h2>
<h3>Enhancing Campaign Strategies</h3>
<p>Generative AI offers political campaigns the ability to create highly personalized and targeted content. By analyzing vast amounts of data, AI can generate tailored messages that resonate with specific voter segments, potentially increasing engagement and support.</p>
<h3>Streamlining Campaign Operations</h3>
<p>AI can also streamline various campaign operations, from automating routine tasks to optimizing resource allocation. This can lead to more efficient and effective campaign management, allowing candidates to focus on critical strategic decisions.</p>
<h2>The Risks of Generative AI</h2>
<h3>Misinformation and Deepfakes</h3>
<p>One of the most significant risks associated with generative AI is the potential for misinformation and deepfakes. AI-generated content can be used to create false narratives, impersonate candidates, and manipulate public opinion. This poses a serious threat to election integrity and voter trust.</p>
<p><a href="https://www.cisa.gov/resources-tools/resources/risk-focus-generative-ai-and-2024-election-cycle">Read more about the risks of AI in elections</a></p>
<h3>Election Security Concerns</h3>
<p>Federal intelligence agencies have warned that generative AI could threaten election security. AI tools can be exploited by both domestic and foreign actors to interfere with election processes, disrupt infrastructure, and sow discord among voters.</p>
<p><a href="https://www.cbsnews.com/news/generative-ai-threat-to-election-security-federal-intelligence-agencies-warn/">Learn more about election security threats</a></p>
<h2>Best Practices for Mitigating Risks</h2>
<h3>Promoting Transparency</h3>
<p>To mitigate the risks associated with generative AI, it is crucial to promote transparency in AI-generated content. Clear labeling and disclosure of AI-generated materials can help voters distinguish between authentic and manipulated content.</p>
<h3>Implementing Robust Regulations</h3>
<p>Establishing robust regulations and guidelines for the use of AI in political campaigns is essential. This includes setting standards for ethical AI use, monitoring compliance, and enforcing penalties for violations.</p>
<p><a href="https://www.gsb.stanford.edu/faculty-research/publications/preparing-generative-ai-2024-election-recommendations-best-practices">Explore best practices for AI governance</a></p>
<h2>Conclusion</h2>
<p>The 2024 elections will undoubtedly be a critical test for the role of generative AI in democratic processes. While AI offers significant potential to enhance campaign strategies and operations, it also presents substantial risks that must be carefully managed. By promoting transparency, implementing robust regulations, and fostering public awareness, we can harness the benefits of AI while safeguarding the integrity of our elections.</p>
<p>Rapid Innovation is a leading AI and Blockchain development firm offering cutting-edge solutions to clients worldwide. Our expertise in AI and technology innovation positions us at the forefront of addressing the challenges and opportunities presented by generative AI in various sectors.</p>
<p>For more insights and updates, visit our <a href="https://www.rapidinnovation.io/blogs">blog website</a>.</p>
<p>Follow us on:
- <a href="https://www.youtube.com/@RapidInnovation">YouTube</a>
- <a href="https://www.instagram.com/rapidinnovation.io/">Instagram</a>
- <a href="https://x.com/InnovationRapid">Twitter</a>
- <a href="https://www.facebook.com/rapidinnovation.io">Facebook</a>
- <a href="https://www.linkedin.com/company/rapid-innovation/">LinkedIn</a>
- <a href="https://www.rapidinnovation.io/">Website</a></p> | aishikl | |
1,871,585 | Top 10 Layer 1 Blockchains in 2024 | In the rapidly evolving world of blockchain technology, Layer 1 blockchains serve as the foundational... | 0 | 2024-05-31T04:15:35 | https://dev.to/laxita01/top-10-layer-1-blockchains-in-2024-1jai | blockchain, layer | In the rapidly evolving world of blockchain technology, Layer 1 blockchains serve as the foundational bedrock upon which decentralized applications (dApps) and services are built. As we progress through 2024, several Layer 1 blockchains have emerged as leaders in the space, each bringing unique capabilities and innovations to the table. Whether you are a blockchain enthusiast, a business looking to integrate blockchain solutions, or planning to hire blockchain developers, understanding these [top Layer 1 blockchains](https://www.solulab.com/top-layer-1-blockchains/) can provide valuable insights into the future of decentralization.
**1. Ethereum (ETH)**
Ethereum continues to be the most prominent Layer 1 blockchain, known for its robust smart contract capabilities and vast developer community. With the successful transition to Ethereum 2.0, it now offers enhanced scalability and energy efficiency, making it a go-to choice for a wide range of applications.
**2. Binance Smart Chain (BSC)**
Binance Smart Chain has gained significant traction due to its low transaction fees and fast confirmation times. It has become a popular platform for DeFi projects and NFT marketplaces, supported by a vibrant ecosystem fostered by Binance.
**3. Solana (SOL)**
Solana is renowned for its high throughput, boasting the ability to process thousands of transactions per second (TPS). Its innovative Proof of History (PoH) consensus mechanism ensures scalability without compromising security, making it a favorite for high-performance dApps.
**4. Cardano (ADA)**
Cardano stands out with its research-driven approach and peer-reviewed development process. Its Ouroboros Proof of Stake (PoS) protocol provides a secure and scalable foundation, attracting interest from blockchain consulting companies and developers worldwide.
**5. Polkadot (DOT)**
Polkadot enables interoperability between different blockchains through its unique relay chain and parachains architecture. This multi-chain approach allows for seamless communication and collaboration among diverse blockchain networks, fostering innovation and integration.
**6. Avalanche (AVAX)**
Avalanche is designed for high scalability and low latency, making it ideal for [decentralized finance (DeFi)](https://www.solulab.com/what-is-defi/) and enterprise solutions. Its consensus protocol, Avalanche, ensures quick finality and security, attracting a growing number of blockchain development companies.
**7. Tezos (XTZ)**
Tezos offers a self-amending blockchain, allowing it to evolve and upgrade without hard forks. Its on-chain governance model and energy-efficient Proof of Stake (PoS) consensus have made it a preferred choice for projects emphasizing sustainability and governance.
**8. Algorand (ALGO)**
Algorand's Pure Proof of Stake (PPoS) consensus ensures fast and secure transactions with minimal energy consumption. It has become a popular platform for financial applications, supply chain solutions, and more, supported by its scalable and robust infrastructure.
**9. Cosmos (ATOM)**
Cosmos aims to create an "Internet of Blockchains" through its Inter-Blockchain Communication (IBC) protocol. This enables different blockchains to communicate and transact with each other, fostering a connected and interoperable blockchain ecosystem.
**10. Near Protocol (NEAR)**
Near Protocol focuses on usability and developer experience, offering easy-to-use tools and a developer-friendly environment. Its sharding-based architecture ensures scalability and performance, making it an attractive option for building decentralized applications.
**Conclusion**
The top Layer 1 blockchains in 2024 showcase the diverse and dynamic nature of the blockchain landscape. Each platform brings its unique strengths and innovations, catering to different use cases and industries. For businesses looking to integrate blockchain solutions, partnering with a [blockchain consulting company](https://www.solulab.com/blockchain-consulting-services/) can provide the expertise needed to navigate this complex ecosystem. Whether you need to develop customized blockchain applications or want to explore the potential of decentralized finance, hiring a [skilled blockchain developer](https://www.solulab.com/hire-blockchain-developers/) is crucial to unlocking the full potential of these cutting-edge platforms. As we move forward, these Layer 1 blockchains will continue to shape the future of decentralization, driving innovation and transformation across the globe.
| laxita01 |
1,872,079 | Create and Share Custom 3D Experiences with BuildVR | Check out BuildVR at buildvr.gretxp.com and unlock your creativity! With BuildVR, you can easily... | 0 | 2024-05-31T13:11:52 | https://dev.to/gretxp_buildvr/create-and-share-custom-3d-experiences-with-buildvr-1i0o | Check out **BuildVR** at [buildvr.gretxp.com](buildvr.gretxp.com) and unlock your creativity! With BuildVR, you can easily upload your own 3D models or choose from a huge library to create unique 3D scenes. Whether you’re an artist, designer, or just love 3D modeling, our easy-to-use tools make it simple to bring your ideas to life. Share your creations anywhere and join a community of creative minds. Try BuildVR today and see how fun and easy 3D design can be!

Yes, with BuildVR, you can easily upload your 3D models, create immersive 3D experiences, and share them with everyone. Visit [BuildVR](https://buildvr.gretxp.com) to get started. Our platform offers:
- Upload your own 3D models or choose from a vast library.
- Add lighting: ambient, spotlight, point light, directional light.
- Customize model size, rotation, position, and apply textures.
- Upload your own backgrounds or videos.
- Use these features to craft and share your unique 3D scenes effortlessly!
Check out a final sample 3D experience built using BuildVR: [3D Stage Experience](https://buildvr.gretxp.com/virtual-experience/VRE--P2V1304).

| gretxp_buildvr | |
1,872,077 | 🎉✨ Today marks 2 incredible years of 📲 Flutterflow Devs! 🎉✨ | This morning, I entered the office We were surprised by a beautiful decoration and amazing cake at... | 0 | 2024-05-31T13:08:17 | https://dev.to/flutterflowdevs/today-marks-2-incredible-years-of-flutterflow-devs-37in | flutter, flutterflow, celebration, teamwork |
This morning, I walked into the office and was surprised by a beautiful decoration and an amazing cake! 🎈🥳
A big thanks to all team members for making it happen! 🎊🙌
I was literally wowed by this fantastic surprise.
Our journey has been phenomenal. From starting with FlutterFlow to becoming official FlutterFlow experts, we’ve come a long way. 🚀
Thanks to the entire Flutterflow Devs team for their hard work and dedication. Here’s to many more years of innovation, creativity, and success! 🎉🚀
🎂🎁🎆🎶🎉🎈🥳💫
| flutterflowdevs |
1,872,075 | How to Create Interactive Dashboards with Excel Charts in C# | Learn how to create interactive dashboards with Excel charts in C#. See more from Document Solutions today. | 0 | 2024-05-31T13:07:50 | https://developer.mescius.com/blogs/how-to-create-interactive-dashboards-with-excel-charts-in-c-sharp | webdev, devops, csharp, tutorial | ---
canonical_url: https://developer.mescius.com/blogs/how-to-create-interactive-dashboards-with-excel-charts-in-c-sharp
description: Learn how to create interactive dashboards with Excel charts in C#. See more from Document Solutions today.
---
**What You Will Need**
- Visual Studio
- DsExcel NuGet
**Controls Referenced**
- [Document Solutions for Excel](https://developer.mescius.com/document-solutions/dot-net-excel-api/docs/online/overview.html)
**Tutorial Concept**
C# Interactive Dashboards - Using a C# .NET Excel API and chart features, users can build interactive dashboards for their desktop applications.
---
Are you looking to make smarter decisions with your Excel data? Use charts to transform complex information into clear insights!
Identifying trends using raw data alone is very difficult. Without visual aids, it’s tough to spot patterns and key insights, which leads to slower and less accurate decision-making. Charts play a crucial role in creating dashboards or business reports by transforming complex data into clear visual insights. They help you easily identify trends, compare metrics, and see patterns in the data.
With our [Document Solutions for Excel](https://developer.mescius.com/document-solutions/dot-net-excel-api "https://developer.mescius.com/document-solutions/dot-net-excel-api") (DsExcel) API, you can effortlessly create various Excel charts and customize different parts to suit your needs. In this blog, we showcase how charts can be used with sales data to create a performance dashboard using the DsExcel API. The Excel dashboard includes different charts to display various metrics such as sales by each representative, quantity sold of each product, sales distribution by product, and sales trends over time.
Let’s break down the creation of the dashboard into the following simple steps:
* [Setup a Project with DsExcel Dependency](#Setup)
* [Add Data to the Worksheet](#Add)
* [Create Charts for Dashboard](#Create)
## <a id="Setup"></a>Setup a Project with DsExcel Dependency
Let's begin by setting up a new .NET 8 Console App that includes the DsExcel dependency by following these steps:
1. Open Visual Studio and select **File **| **New **| **Project **to create a new **Console App**.
2. Right-click on the project in Solution Explorer and choose **Manage NuGet Packages…** from the context menu.

3. Search for [**DS.Documents.Excel**](https://www.nuget.org/packages/DS.Documents.Excel "https://www.nuget.org/packages/DS.Documents.Excel") in the NuGet Package Manager and click on **Install**.

Now that we've successfully set up the project, it is time to create a new [Workbook](https://developer.mescius.com/document-solutions/dot-net-excel-api/docs/online/ManageWorkbook.html "https://developer.mescius.com/document-solutions/dot-net-excel-api/docs/online/ManageWorkbook.html") object to develop our report. The DsExcel code to initialize the new Workbook is below:
```
// Create a new Workbook Object
Workbook workbook = new Workbook();
//Access the first sheet
IWorksheet worksheet = workbook.Worksheets[0];
```
Next, we will add the sales data to our worksheet from the JSON file.
## <a id="Add"></a>Add Data to the Worksheet
In this step, we will add the sales data to our worksheet for which we will create charts to analyze the data. We have the sales data in the JSON file, so let’s [deserialize](https://learn.microsoft.com/en-us/dotnet/standard/serialization/system-text-json/deserialization "https://learn.microsoft.com/en-us/dotnet/standard/serialization/system-text-json/deserialization") and convert it into an object array.
The DsExcel code to extract the data and assign it to the worksheet is as follows:
```
using System.IO;
using Newtonsoft.Json.Linq;

public class SalesData
{
public string Date { get; set; }
public string Region { get; set; }
public string ProductCategory { get; set; }
public string ProductName { get; set; }
public int Sales { get; set; }
public int QuantitySold { get; set; }
public string CustomerSegment { get; set; }
public string SalesRep { get; set; }
public string SalesChannel { get; set; }
}
//Load JSON data into string
string jsonString = File.ReadAllText("Data.json");
JArray? jsonObject = JArray.Parse(jsonString);
// Parse the data into a C# array
SalesData[]? salesData = jsonObject?.ToObject<SalesData[]?>();
//Create Normal Object array to assign it to Workbook
object[,] objectSalesDataArray = new object[jsonObject!.Count + 1, typeof(SalesData).GetProperties().Length];
int colIndex = 0;
int rowIndex = 0;
foreach (var property in typeof(SalesData).GetProperties())
{
objectSalesDataArray[rowIndex, colIndex] = property.Name;
colIndex++;
}
rowIndex++;
//Iterate sales data array to get the values
foreach (var data in salesData!)
{
colIndex = 0;
foreach (var property in typeof(SalesData).GetProperties())
{
objectSalesDataArray[rowIndex, colIndex] = property.GetValue(data)!;
colIndex++;
}
rowIndex++;
}
//Assign object array to range
worksheet.Range["A1:I11"].Value = objectSalesDataArray;
```
After assigning the data and applying some formatting, the worksheet looks like this:

## <a id="Create"></a>Create Charts for Dashboard
After setting up the workbook with data, it is time to create charts to show and analyze the different metrics we initially discussed. Let’s follow the steps below to add a bar chart to the sheet showing the sales made by each sales representative. Then, we will perform various modifications to the chart.
### Create a Chart
**1:** To create the chart, use the [AddChart](https://developer.mescius.com/document-solutions/dot-net-excel-api/docs/online/DS.Documents.Excel~GrapeCity.Documents.Excel.Drawing.IShapes~AddChart(ChartType,IRange).html "https://developer.mescius.com/document-solutions/dot-net-excel-api/docs/online/DS.Documents.Excel~GrapeCity.Documents.Excel.Drawing.IShapes~AddChart(ChartType,IRange).html") method of the [Shapes](https://developer.mescius.com/document-solutions/dot-net-excel-api/docs/online/DS.Documents.Excel~GrapeCity.Documents.Excel.Drawing.IShapes.html "https://developer.mescius.com/document-solutions/dot-net-excel-api/docs/online/DS.Documents.Excel~GrapeCity.Documents.Excel.Drawing.IShapes.html") collection of the sheet and pass the ChartType and target range to plot the chart using the code below:
```
//Add BarClustered Chart to create sales by representative chart
GrapeCity.Documents.Excel.Drawing.IShape representativeSalesChart = worksheet.Shapes.AddChart(ChartType.BarClustered, worksheet.Range["K2"]);
```
**2:** To add the data source for the chart, we will use the [SetSourceData](https://developer.mescius.com/document-solutions/dot-net-excel-api/docs/online/DS.Documents.Excel~GrapeCity.Documents.Excel.Drawing.IChart~SetSourceData(IRange,RowCol).html "https://developer.mescius.com/document-solutions/dot-net-excel-api/docs/online/DS.Documents.Excel~GrapeCity.Documents.Excel.Drawing.IChart~SetSourceData(IRange,RowCol).html") method of the [Chart](https://developer.mescius.com/document-solutions/dot-net-excel-api/docs/online/DS.Documents.Excel~GrapeCity.Documents.Excel.Drawing.IChart.html "https://developer.mescius.com/document-solutions/dot-net-excel-api/docs/online/DS.Documents.Excel~GrapeCity.Documents.Excel.Drawing.IChart.html") class that we can access via the Chart property of our [IShape](https://developer.mescius.com/document-solutions/dot-net-excel-api/docs/online/DS.Documents.Excel~GrapeCity.Documents.Excel.Drawing.IShape_members.html "https://developer.mescius.com/document-solutions/dot-net-excel-api/docs/online/DS.Documents.Excel~GrapeCity.Documents.Excel.Drawing.IShape_members.html") object. The DsExcel code implementing this is as follows:
```
//Set Data Source for chart
representativeSalesChart.Chart.SetSourceData(worksheet.Range["E1: E11"], RowCol.Columns);
```
**3:** To set the sales representative’s name in the category axis with their associated values, let’s set the [CategoryNames](https://developer.mescius.com/document-solutions/dot-net-excel-api/docs/online/DS.Documents.Excel~GrapeCity.Documents.Excel.Drawing.IAxis~CategoryNames.html "https://developer.mescius.com/document-solutions/dot-net-excel-api/docs/online/DS.Documents.Excel~GrapeCity.Documents.Excel.Drawing.IAxis~CategoryNames.html") property of the Category axes. This property takes the string array, so we fetch this data from the _SalesRepresentative_ column using the following code:
```
representativeSalesChart.Chart.Axes.Item(AxisType.Category).CategoryNames = Enumerable.Range(1, 10).Select(i => worksheet.Range["H1: H11"][i, 0].Value.ToString()).ToArray();
```
**4:** To set the chart’s size and location, use the Height, Width, Top, and Left properties of the IShape object as below:
```
//Set Chart Size and Position
representativeSalesChart.Height = 230;
representativeSalesChart.Width = 500;
representativeSalesChart.Left = 450;
representativeSalesChart.Top = 10;
```
### Format a Chart
**1:** To set and format the chart title, use the [ChartTitle](https://developer.mescius.com/document-solutions/dot-net-excel-api/docs/online/DS.Documents.Excel~GrapeCity.Documents.Excel.Drawing.IChart~ChartTitle.html "https://developer.mescius.com/document-solutions/dot-net-excel-api/docs/online/DS.Documents.Excel~GrapeCity.Documents.Excel.Drawing.IChart~ChartTitle.html") class and customize its font and color using the following code:
```
//Set and Format Chart Title
representativeSalesChart.Chart.ChartTitle.Text = "Sales by Representative";
representativeSalesChart.Chart.ChartTitle.Font.Bold = true;
representativeSalesChart.Chart.ChartTitle.Font.Size = 24;
representativeSalesChart.Chart.ChartTitle.TextFrame.TextRange.Paragraphs[0].Font.Color.RGB = Color.FromArgb(141, 180, 226);
```
**2:** To customize the chart’s border, use the [ChartArea](https://developer.mescius.com/document-solutions/dot-net-excel-api/docs/online/DS.Documents.Excel~GrapeCity.Documents.Excel.Drawing.IChart~ChartArea.html "https://developer.mescius.com/document-solutions/dot-net-excel-api/docs/online/DS.Documents.Excel~GrapeCity.Documents.Excel.Drawing.IChart~ChartArea.html") class to set the border color and weight, as well as round its corners.
```
//Format Chart Area Border
representativeSalesChart.Chart.ChartArea.RoundedCorners = true;
representativeSalesChart.Chart.ChartArea.Format.Line.Color.RGB = Color.Black;
representativeSalesChart.Chart.ChartArea.Format.Line.Weight = 2;
```
**3:** To format any data point, you can access it from the Points collection using the Points property of your series. The DsExcel code to format the third data point is as follows:
```
//Customize Particular Data Point in the Chart
representativeSalesChart.Chart.SeriesCollection[0].Points[2].Format.Fill.Color.RGB = Color.Green;
representativeSalesChart.Chart.SeriesCollection[0].Points[2].Format.Line.Weight = 1.5;
```
After performing the steps above, the chart will appear as below:

In the same way, you can add the chart for other mentioned metrics. Check out the [attached sample](https://cdn.mescius.io/umb/media/i2fnvcge/dsexcelchartdemo.zip) to see how they’re implemented!
The final dashboard will appear as shown below after adding all the charts:

## Conclusion
In this blog post, we demonstrated how to create a sales performance dashboard using [DsExcel charts](https://developer.mescius.com/document-solutions/dot-net-excel-api/docs/online/UseChart.html "https://developer.mescius.com/document-solutions/dot-net-excel-api/docs/online/UseChart.html"). You can also leverage additional Excel features like PivotTables, Slicers, and Conditional Formatting to create more advanced reports using the DsExcel API.
For more details, please refer to the documentation and demos linked below:
* [Product Demo .NET](https://developer.mescius.com/document-solutions/dot-net-excel-api/demos "https://developer.mescius.com/document-solutions/dot-net-excel-api/demos")
* [Documentation .NET](https://developer.mescius.com/document-solutions/dot-net-excel-api/docs/online/overview.html "https://developer.mescius.com/document-solutions/dot-net-excel-api/docs/online/overview.html")
* [Product Demo JAVA](https://developer.mescius.com/document-solutions/java-excel-api/demos "https://developer.mescius.com/document-solutions/java-excel-api/demos")
* [Documentation JAVA](https://developer.mescius.com/document-solutions/java-excel-api/docs/online/overview.html "https://developer.mescius.com/document-solutions/java-excel-api/docs/online/overview.html")

*Author: chelseadevereaux*
---

# Mastering JavaScript Loops🔁: for, for...in, for...of, and forEach.🚀

*Published 2024-05-31 · https://dev.to/dharamgfx/mastering-javascript-loops-for-forin-forof-and-foreach-1ded · Tags: webdev, javascript, beginners, programming*
In JavaScript, loops are essential for iterating over data structures like arrays and objects. Understanding the differences between `for`, `for...in`, `for...of`, and `forEach` will enhance your coding skills and help you choose the right loop for your task. Let's dive into each one in detail.
#### 1. The Classic `for` Loop
**Overview:**
The `for` loop is the most traditional looping structure in JavaScript. It's highly versatile: by specifying the start condition, end condition, and increment yourself, you can step through arrays, strings, or any other index-based structure.
**Syntax:**
```javascript
for (initialization; condition; increment) {
// code to be executed
}
```
**Example:**
```javascript
for (let i = 0; i < 5; i++) {
console.log(i); // Output: 0, 1, 2, 3, 4
}
```
**Points:**
- **Control:** Provides complete control over the loop, including initialization, condition checking, and increment/decrement.
- **Flexibility:** Can be used with arrays, objects, or any iterable by customizing the conditions.
- **Performance:** Often faster in scenarios requiring complex iterations.
#### 2. `for...in` Loop
**Overview:**
The `for...in` loop is used to iterate over the enumerable properties of an object. It's best suited for objects rather than arrays.
**Syntax:**
```javascript
for (key in object) {
// code to be executed
}
```
**Example:**
```javascript
const person = { name: 'John', age: 30, city: 'New York' };
for (let key in person) {
console.log(key + ": " + person[key]); // Output: name: John, age: 30, city: New York
}
```
**Points:**
- **Objects:** Ideal for iterating over the properties of an object.
- **Key Access:** Accesses keys directly, making it easy to manipulate key-value pairs.
- **Non-Array Use:** Not recommended for arrays due to unexpected behavior with array indices.
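To illustrate that last point, here is a small sketch (the array values are arbitrary) comparing what `for...in` and `for...of` hand you when looping over an array:

```javascript
const arr = [10, 20, 30];

// for...in walks enumerable property KEYS, so array indices
// arrive as strings ("0", "1", "2"), not numbers.
const forInKeys = [];
for (const key in arr) {
  forInKeys.push(key);
}

// for...of walks the VALUES of the iterable directly.
const forOfValues = [];
for (const value of arr) {
  forOfValues.push(value);
}

console.log(forInKeys); // ["0", "1", "2"]
console.log(typeof forInKeys[0]); // "string"
console.log(forOfValues); // [10, 20, 30]
```

Because `for...in` also visits any enumerable properties added to the array object or its prototype chain, `for...of` or a classic `for` loop is the safer choice for arrays.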
#### 3. `for...of` Loop
**Overview:**
The `for...of` loop is designed for iterating over iterable objects like arrays, strings, maps, sets, etc. It provides a simpler and more readable syntax compared to the traditional `for` loop.
**Syntax:**
```javascript
for (element of iterable) {
// code to be executed
}
```
**Example:**
```javascript
const array = [10, 20, 30];
for (let value of array) {
console.log(value); // Output: 10, 20, 30
}
```
**Points:**
- **Iterables:** Works with any iterable object.
- **Value Access:** Directly accesses values, making the code more readable.
- **Modern:** Preferred for modern JavaScript code due to its simplicity and efficiency.
#### 4. `forEach` Method
**Overview:**
The `forEach` method is an array method that executes a provided function once for each array element. It's a functional approach to looping.
**Syntax:**
```javascript
array.forEach(function(currentValue, index, array) {
// code to be executed
});
```
**Example:**
```javascript
const numbers = [1, 2, 3];
numbers.forEach(function(number) {
console.log(number); // Output: 1, 2, 3
});
```
**Points:**
- **Arrays:** Specifically designed for arrays.
- **Callback Function:** Uses a callback function to execute logic on each element.
- **No Break:** Cannot use `break` or `continue` to control the loop, making it less flexible in some cases.
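To demonstrate the last point (the numbers here are arbitrary): `return` inside a `forEach` callback only skips the current element, so if you need to stop early you have to reach for `for...of` or a short-circuiting method such as `find`:

```javascript
const numbers = [1, 2, 3, 4, 5];

// `return` inside the callback behaves like `continue`, not `break`:
// the callback is still invoked for every remaining element.
const visited = [];
numbers.forEach(function (n) {
  if (n > 3) return;
  visited.push(n);
});
console.log(visited); // [1, 2, 3]

// When you need to stop early, for...of supports break:
const collected = [];
for (const n of numbers) {
  if (n > 3) break;
  collected.push(n);
}
console.log(collected); // [1, 2, 3]

// ...or use a method that short-circuits, like find:
const firstBig = numbers.find(function (n) { return n > 3; });
console.log(firstBig); // 4
```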
### Summary
Choosing the right loop depends on your specific use case:
- Use the **classic `for` loop** for full control over iterations.
- Use **`for...in`** for iterating over object properties.
- Use **`for...of`** for iterating over iterable objects like arrays and strings.
- Use **`forEach`** for array-specific iterations with a functional approach.
By understanding these differences, you can write more efficient and readable JavaScript code.

*Author: dharamgfx*
---

# AWS Cost Optimization: Top 5 Best Practices & Tools

*Published 2024-05-31 · https://dev.to/techpartner/aws-cost-optimization-top-5-best-practices-tools-59hc · Tags: aws, cloudcomputing, cloudpractitioner*

AWS cost optimization is essential for getting the most return on your cloud investment. As AWS continues to gain popularity for the flexible, scalable infrastructure it provides, managing and optimizing costs plays a significant role in organizations' efforts to sustain profitability and improve operational performance. It is crucial to revisit your cost optimization approaches from time to time so you can save money, stay flexible, and choose the right instances for your business. In this blog, we dive into the top 5 AWS cost reduction strategies and AWS cost optimization tools to help you get the most out of your investment in the AWS cloud.

**Top 5 AWS Cost Optimization Best Practices**
**1. Right-Sizing Your Instances**
Right-sizing is the careful assessment of your resource usage so that what you provision matches your actual requirements. Choosing the right instance types for services such as EC2, RDS, and Redshift lets you avoid over-provisioning and save money. Start by locating underutilized instances, then eliminate or scale them back by de-provisioning or downsizing.
**2. Save money by using savings plans & reserved instances**
[AWS Savings Plans](https://aws.amazon.com/savingsplans/) offer up to 72% savings over on-demand pricing for AWS EC2 instances, Fargate, and Lambda in exchange for a consistent usage commitment over a 1- or 3-year term; the larger the commitment, the greater the savings. [AWS EC2 Reserved Instances](https://aws.amazon.com/ec2/pricing/reserved-instances/) are 1- or 3-year commitments that earn up to 75% off the on-demand price, but they apply to specific instance types in specific regions and are most useful for predictable workloads. Note that you can't reduce the reservation during the term, and any usage beyond it is charged at on-demand rates.
**3. Leveraging Spot Instances**
[AWS EC2 Spot Instances](https://aws.amazon.com/ec2/spot/) are spare AWS capacity offered at discounts of up to 90% off the on-demand price. They are best used for batch processing, stateless web services, high-performance computing, big data workloads, and other applications that can tolerate interruption. The trade-off is that AWS can reclaim a Spot Instance with only a two-minute warning when it needs the capacity back.
**4. Optimize Storage Costs**
AWS S3 cost optimization is about keeping storage costs as low as possible while ensuring data remains readily accessible. Use the [Amazon S3 Intelligent-Tiering](https://aws.amazon.com/s3/storage-classes/intelligent-tiering/) storage class to move objects between tiers automatically based on how often they are accessed. For long-term storage, use [Amazon S3 Glacier](https://aws.amazon.com/s3/storage-classes/glacier/) or, even more cost-efficiently, [Amazon S3 Glacier Deep Archive](https://aws.amazon.com/s3/storage-classes/glacier/) to archive infrequently accessed data. For block storage, select the EBS volume type that matches the application's requirements, and make sure the 'Delete on termination' checkbox is checked so volumes stop accruing charges when their EC2 instances are terminated.
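As one illustration of the tier-migration idea above, an S3 lifecycle configuration along these lines (the rule ID, `logs/` prefix, and day counts are placeholder choices, not recommendations) transitions objects to cheaper storage classes as they age and eventually expires them:

```json
{
  "Rules": [
    {
      "ID": "archive-aging-logs",
      "Filter": { "Prefix": "logs/" },
      "Status": "Enabled",
      "Transitions": [
        { "Days": 30, "StorageClass": "STANDARD_IA" },
        { "Days": 90, "StorageClass": "GLACIER" },
        { "Days": 365, "StorageClass": "DEEP_ARCHIVE" }
      ],
      "Expiration": { "Days": 730 }
    }
  ]
}
```

A rule set like this can be applied with the `put-bucket-lifecycle-configuration` call in the AWS CLI or SDKs.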
**5. AWS Auto Scaling for Cost Optimization**
Use [AWS Auto Scaling Groups](https://aws.amazon.com/autoscaling/) (ASGs) to grow or shrink the number of EC2 instances automatically based on your utilization and defined scaling policies. Review and update those policies regularly to keep both performance and cost optimized.
**Top 5 AWS Cloud Cost Optimization Tools**
**1. AWS Cost Explorer**
[AWS Cost Explorer](https://aws.amazon.com/aws-cost-management/aws-cost-explorer/) is a tool for analyzing, reviewing, and controlling your AWS expenditure and usage patterns over time. You can generate and customize reports, filtering and grouping by various dimensions and cost categories. Cost Explorer also forecasts future costs from historical data and alerts you to cost anomalies.
**2. AWS Budgets**
[AWS Budgets](https://aws.amazon.com/aws-cost-management/aws-budgets/) lets you set cost and usage budgets and informs you when a budgetary ceiling has been breached. Features include limits on cost, on usage, and on Reserved Instance/Savings Plan utilization, each with notifications when the budget is exceeded. The tool integrates with AWS Cost Explorer for richer visual cost analysis.
**3. AWS Trusted Advisor**
[AWS Trusted Advisor](https://aws.amazon.com/premiumsupport/technology/trusted-advisor/) is a service that helps you get the most out of AWS by giving real-time recommendations. Its cost optimization checks flag underutilized or idle resources, its Reserved Instance purchase recommendations guide you toward appropriate reservations, and its cost-saving suggestions surface further savings opportunities.
**4. Amazon Web Services Cost and Usage Report (CUR)**
The [AWS Cost and Usage Report](https://aws.amazon.com/aws-cost-management/aws-cost-and-usage-reporting/) provides the most detailed view of your AWS costs and usage metrics. Its main appeals are precise, fine-grained cost and usage data and the ability to customize the report to exactly what the end user needs.
**5. AWS Compute Optimizer**
[AWS Compute Optimizer](https://aws.amazon.com/compute-optimizer/) is a service that identifies the optimal AWS resources for your workloads, minimizing cost while improving efficiency. It analyzes your utilization history and proposes the most suitable EC2 instance types, Auto Scaling Group configurations, and Lambda function settings.
**About Techpartner Alliance**
[Techpartner Alliance](https://www.techpartneralliance.com/) specializes in AWS and was established in 2017 by [Ravindra Katti](https://www.linkedin.com/in/ravindrakatti/), an AWS ex-seller, and [Prasad Wani](https://www.linkedin.com/in/prasadwani/), an AWS cloud architect. As a partner for the [Well-Architected Framework Review (WAR)](https://www.techpartneralliance.com/well-architected-review/), we can perform a WAR that evaluates architectural weaknesses in your ecosystem and then establish and implement a plan for AWS cost management. Additionally, as a certified service delivery partner for [AWS Graviton](https://www.techpartneralliance.com/graviton-arm-processor/), we can assess your workloads for migration to Graviton processors, which provide up to 40% better price-performance than Intel x86 processors.
Follow our [LinkedIn page](https://www.linkedin.com/company/techpartner-alliance/) for regular updates on latest tech trends and AWS cloud!
---
*Author: arunasri*
---

*Published 2024-05-31 · https://dev.to/kernelrb/ultimate-guide-to-basic-server-types-56i1 · Tags: webdev, beginners, networking, learning*

# Ultimate Guide to Basic Server Types
## Introduction
Welcome to the Ultimate Guide to Basic Server Types!
Servers are the backbone of the internet, powering websites, handling emails, and ensuring that your data reaches its destination. In this blog, we'll explore five essential types of servers: Origin, Proxy, Mail, Web, and DNS servers. By understanding these server types, you can make informed decisions about your network infrastructure and hosting needs.
---
## Origin Servers
### Definition
An origin server is the primary source server that holds the original content or data that needs to be delivered to end users or other servers. It is typically used to host the main version of a website, application, or any other type of data that needs to be distributed.
### How It Works
Origin servers store and manage the original version of the content. When a request is made (e.g., a user accessing a webpage), the origin server processes the request and serves the content directly to the end user or to a cache server. Origin servers can be physical or virtual, depending on the infrastructure needs.
### Common Use Cases
- Hosting website content
- Managing large databases
- Storing application data
- Serving media content (videos, images, etc.)
- Running APIs
### Pros and Cons
**Pros:**
- Full control over content
- High performance for data-intensive applications
- Direct access to original data
**Cons:**
- High cost for high-performance hardware
- Requires robust security measures
- Can be a single point of failure if not properly managed
### Conclusion
Origin servers are essential for hosting and managing original content. They offer high performance and control but require proper setup and management to ensure reliability and security.
---
## Proxy Servers
### Definition
A proxy server acts as an intermediary between clients and other servers, forwarding client requests to the appropriate server. Proxy servers can enhance security, improve load balancing, and provide content filtering.
### How It Works
Proxy servers receive client requests and then forward them to the destination server. They can cache responses to reduce load, filter requests based on rules, and hide the client's IP address for anonymity. There are various types of proxies, including forward proxies, reverse proxies, and web proxies.
### Common Use Cases
- Enhancing network security
- Load balancing for web servers
- Filtering content and blocking malicious sites
- Anonymizing user requests
- Caching frequently accessed content
### Pros and Cons
**Pros:**
- Improved security
- Load balancing capabilities
- Anonymity and privacy
**Cons:**
- Potential latency due to additional hop
- Complexity in configuration and management
- Possible single point of failure
### Conclusion
Proxy servers are versatile tools that can enhance security, improve performance, and provide anonymity. Proper setup and management are crucial to maximize their benefits and minimize potential downsides.
---
## Mail Servers
### Definition
A mail server is responsible for sending, receiving, and storing emails. It uses protocols such as SMTP (Simple Mail Transfer Protocol), IMAP (Internet Message Access Protocol), and POP3 (Post Office Protocol) to manage email communication.
### How It Works
Mail servers consist of incoming mail servers (IMAP/POP3) and outgoing mail servers (SMTP). When an email is sent, it is transferred from the sender's mail server to the recipient's mail server using SMTP. The recipient can then retrieve the email using IMAP or POP3.
### Common Use Cases
- Business email hosting
- Personal email services
- Managing mailing lists
- Archiving and storing emails
- Providing secure email communication
### Pros and Cons
**Pros:**
- Reliable email delivery
- Control over email policies and configurations
- Enhanced security for sensitive communications
**Cons:**
- Requires constant maintenance and monitoring
- Can be a target for spam and phishing attacks
- Needs proper configuration to prevent misuse
### Conclusion
Mail servers are critical for managing email communications. They provide reliable and secure email delivery but require vigilant maintenance and security measures to prevent misuse and attacks.
---
## Web Servers
### Definition
A web server stores, processes, and delivers web pages to clients (browsers) over the internet. It handles requests using the HTTP/HTTPS protocols and serves static or dynamic content.
### How It Works
Web servers receive requests from clients, process them, and respond with the requested content (HTML, images, videos, etc.). They can handle static content directly or interact with application servers to generate dynamic content. Common web server software includes Apache, Nginx, and IIS.
### Common Use Cases
- Hosting websites and web applications
- Serving static files (images, CSS, JavaScript)
- Running APIs
- Streaming media content
- Handling user interactions and form submissions
### Pros and Cons
**Pros:**
- Essential for online presence
- Scalable to handle varying traffic loads
- Supports a wide range of content types
**Cons:**
- Needs proper security measures to prevent attacks
- Requires bandwidth management
- Performance can be affected by high traffic
### Conclusion
Web servers are the backbone of the internet, enabling websites and web applications to function. Proper setup, security, and optimization are key to maintaining a reliable and efficient web server.
---
## DNS Servers
### Definition
A DNS (Domain Name System) server translates domain names (like www.example.com) into IP addresses that computers use to identify each other on the network. DNS servers play a crucial role in directing internet traffic.
### How It Works
DNS servers store and manage DNS records, which contain information about domain names and their corresponding IP addresses. When a user enters a domain name, the DNS server resolves it to an IP address, allowing the browser to connect to the correct server. There are different types of DNS servers, including authoritative, recursive, and caching DNS servers.
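The store-resolve-cache flow described above can be sketched as a toy in-memory resolver (the records below are made up for illustration; real resolvers speak the DNS wire protocol and honor record TTLs):

```javascript
// "Authoritative" record table: the DNS server's source of truth.
const authoritativeRecords = new Map([
  ['www.example.com', '93.184.216.34'],
  ['mail.example.com', '93.184.216.35'],
]);

// Caching layer: answers repeat queries without consulting the table again.
const cache = new Map();

function resolve(name) {
  if (cache.has(name)) {
    return { address: cache.get(name), fromCache: true };
  }
  const address = authoritativeRecords.get(name);
  if (address === undefined) {
    throw new Error(`NXDOMAIN: ${name}`); // no such domain
  }
  cache.set(name, address); // subsequent lookups hit the cache
  return { address, fromCache: false };
}

console.log(resolve('www.example.com')); // { address: '93.184.216.34', fromCache: false }
console.log(resolve('www.example.com')); // { address: '93.184.216.34', fromCache: true }
```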
### Common Use Cases
- Resolving domain names to IP addresses
- Load balancing and traffic management
- Enhancing network performance and reliability
- Managing domain names for websites
- Supporting email services
### Pros and Cons
**Pros:**
- Essential for internet navigation
- Improves load times and network performance
- Supports redundancy and load balancing
**Cons:**
- Can be targeted by DNS attacks (e.g., DDoS, DNS poisoning)
- Requires regular updates and security patches
- Complex to manage large DNS infrastructures
### Conclusion
DNS servers are fundamental for translating domain names to IP addresses, enabling seamless internet navigation. Proper setup, security, and management are vital to ensure DNS server reliability and performance.
---
## Conclusion
Understanding the different types of servers - Origin, Proxy, Mail, Web, and DNS - is essential for managing a robust and efficient network. Each server type plays a unique role in the infrastructure, and knowing their functions, advantages, and setup processes helps you make informed decisions. I hope this guide has provided valuable insights into the world of servers.
Matab Saif eddine.
*Author: kernelrb*
---

# The Backbone of Modern Web Applications and How It Can Propel Your Software Development and Solutions Business

*Published 2024-05-31 · https://dev.to/kunal_69181fe299490eae953/the-backbone-of-modern-web-applications-and-how-it-can-propel-your-software-development-and-solutions-business-5emm · Tags: softwaredevelopment*

The digital landscape is a battlefield, and your website or web application is your frontline soldier. In this ever-evolving war for customer attention, a clunky, outdated application simply won't cut it. You need a robust, engaging, and feature-rich solution that delivers a seamless user experience. This is where software development and solutions come in, wielding the power of full stack development to transform your business.
**Full Stack Mastery: Building Cohesive Web Experiences**
Full stack development goes beyond the realm of standard web development. It's about having a team of skilled individuals who are jacks-of-all-trades when it comes to software development and solutions. They possess expertise in both the front-end (what users see) and the back end (how it works). Imagine a developer who can design a captivating user interface and build robust systems that power it under the hood. That's the magic of a full stack developer.
**Why Full Stack Development is a Game Changer for Software Development and Solutions Businesses**
There are several reasons why full stack development is becoming an increasingly valuable asset for businesses in the software development and solutions industry:
- **Enhanced Efficiency:** Full stack developers streamline communication and development processes, leading to faster project completion times and reduced costs for your clients.
- **Unified Vision:** With a single team handling both front-end and back-end aspects, there's a clear understanding of the application's overall goals and functionalities. This eliminates communication silos and fosters a more cohesive product for your clients.
- **Flexibility and Scalability:** Full stack developers have the skillset to adapt to changing project needs and integrate new features seamlessly. This allows your clients' web applications to grow alongside their businesses.
- **Improved User Experience:** By understanding both sides of the coin, full stack developers can create user interfaces that are not only visually appealing but also intuitive and responsive, leading to higher user engagement and satisfaction for your clients' applications.
**How a Full Stack Development Service Provider Can Propel Your Software Development and Solutions Business Growth**
Partnering with a company that specializes in full stack development can be a strategic move for your software development and solutions business:
- **Access to Expertise:** You gain access to a pool of highly skilled and experienced full stack developers who can navigate the complexities of modern web development, allowing you to offer a wider range of services to your clients.
- **Cost-Effectiveness:** Hiring a full stack development team in-house can be expensive. Partnering with a service provider offers a cost-effective solution with access to a wider range of expertise, allowing you to be more competitive in your pricing.
- **Faster Time-to-Market:** With streamlined processes and efficient communication, full stack development service providers can deliver your clients' web applications faster, allowing them to capitalize on opportunities sooner.
- **Ongoing Support:** Many service providers offer ongoing maintenance and support to ensure your clients' web applications continue to function optimally and stay up to date with the latest technologies, providing exceptional value to your clients.
**Investing in Full Stack Development: A Strategic Move for the Future of Software Development and Solutions**
In today's digital age, a well-crafted web application is no longer a luxury, it's a necessity. By specializing in software development and solutions with a focus on full stack development, you can offer your clients a powerful tool to propel their businesses forward. You gain access to a team that can build solutions that not only meet their current needs but also have the flexibility to adapt and grow with their businesses. It's an investment in the future of your software development and solutions company, one that can leave the competition in the dust.
For More Information
Visit Us: https://cloudprism.in/software-development
*Author: kunal_69181fe299490eae953*
---

---
title: Game Dev Digest — Issue #235 - Rendering Tutorials, Physics, Community, and more
published: true
date: 2024-05-31 12:55:03 UTC
tags: gamedev,unity,csharp,news
canonical_url: https://gamedevdigest.com/digests/issue-235-rendering-tutorials-physics-community-and-more.html
series: Game Dev Digest - The Newsletter About Unity Game Dev
---
### Issue #235 - Rendering Tutorials, Physics, Community, and more
*This article was originally published on [GameDevDigest.com](https://gamedevdigest.com/digests/issue-235-rendering-tutorials-physics-community-and-more.html)*

Get it while it's hot! Another jam packed issue of game dev content. Enjoy!
---
[**Voxel Displacement Renderer — Modernizing the Retro 3D Aesthetic**](https://blog.danielschroeder.me/2024/05/voxel-displacement-modernizing-retro-3d/?) - I’ve been developing a custom real-time renderer which uses very small voxels to produce a distinctive visual style that modernizes the look and feel of classic 90’s 3D games. By approaching the problem of rendering voxels in an unusual way, I’m able to produce these visuals from conventional art assets — low-poly triangle meshes and textures — that are familiar and efficient to create.
[_blog.danielschroeder.me_](https://blog.danielschroeder.me/2024/05/voxel-displacement-modernizing-retro-3d/?)
[**City In A Bottle – A 256 Byte Raycasting System**](https://frankforce.com/city-in-a-bottle-a-256-byte-raycasting-system/) - Hello size coding fans. Today, I have something amazing to share: A tiny raycasting engine and city generator that fits in a standalone 256 byte html file. In this post I will share all the secrets about how this magical program works.
[_frankforce.com_](https://frankforce.com/city-in-a-bottle-a-256-byte-raycasting-system/)
[**The secrets to gaming gravity**](https://www.polygon.com/24157223/how-gravity-physics-work-for-jumping-falling-in-games?) - “Find the fun.” Game development’s common refrain can manifest as the creation of a new game mechanic, the refinement or combination of existing mechanics, or making something simple — like jumping, flying, and fighting — feel great. Even falling.
[_polygon.com_](https://www.polygon.com/24157223/how-gravity-physics-work-for-jumping-falling-in-games?)
[**2D Rigid Body Collision Resolution Part 1: Defining the problem**](https://www.sassnow.ski/rigid-body-collisions/1?) - From Mario bouncing off a Goomba to two cars bumping into each other in a racing game, dealing with collisions is such an integral part of most video games that we often take it for granted.
[_sassnow.ski_](https://www.sassnow.ski/rigid-body-collisions/1?)
[**Unity Tutorial: Fake Cloud Shadows**](https://mirzabeig.substack.com/p/unity-tutorial-fake-cloud-shadows) - Fake cloud shadows are a great way to add depth to your scene, providing the illusion of high-quality environmental shading for negligible cost and effort. Below is an example from Warhammer 3, where it's apparent in the sped-up footage the clouds in the sky are looping/repeating and don't match the actual shadows on the ground.
[_mirzabeig.substack.com_](https://mirzabeig.substack.com/p/unity-tutorial-fake-cloud-shadows)
[**Instantly Boost Unity Game Performance With IL2CPP_USE_SPARSEHASH**](https://gamedev.center/instantly-boost-unity-game-performance-with-il2cpp_use_sparsehash/) - Unity abstracts away many low-level details, allowing game developers to focus on creating amazing experiences. However, as your game scales and grows in complexity, you may need to dive deeper into the internals of abstracted engine components like IL2CPP. In this post, we will uncover one of these low-level details to boost the performance of your game in one easy step.
[_gamedev.center_](https://gamedev.center/instantly-boost-unity-game-performance-with-il2cpp_use_sparsehash/)
[**How To Use Steam's Marketing?**](https://stepupyourgame.blog/2024/05/27/how-to-use-steams-marketing/) - Earlier this month, Steam released an article addressing some of the big questions you might have about marketing your game. As you want to sell your game on Steam, having a good understanding of how it works is key.
[_stepupyourgame.blog_](https://stepupyourgame.blog/2024/05/27/how-to-use-steams-marketing/)
[**Pathfinding Part 1 with Dijkstra's Algorithm**](https://excaliburjs.com/blog/Pathfinding%20Algorithms%20Part%201/) - One of the most common problems that need solved in game development is navigating from one tile to a separate tile somewhere else. Or sometimes, I need just to understand if that path is clear between one tile and another. Sometimes you can have a graph node tree, and need to understand the cheapest decision. These are the kinds of challenges where one could use a pathfinding algorithm to solve.
[_excaliburjs.com_](https://excaliburjs.com/blog/Pathfinding%20Algorithms%20Part%201/)
[**Climbing content mountain**](https://www.valadria.com/climbing-content-mountain/) - Tips & advice for climbing content mountain, that gigantic, intimidating middle part of finishing a game.
[_valadria.com_](https://www.valadria.com/climbing-content-mountain/)
[**Unity 6: A Deep Dive Into the Update's New Features & Enhancements**](https://80.lv/articles/unity-6-a-deep-dive-into-the-update-s-new-features-enhancements/) - In light of Unity 6 Preview's recent release, Unity Technologies' Mathieu Muller has joined 80 Level to provide an in-depth overview of the update, discuss the new graphical features and improvements it introduces, and explain how you can get the most out of the engine's new version.
[_80.lv_](https://80.lv/articles/unity-6-a-deep-dive-into-the-update-s-new-features-enhancements/)
[**The good within: Designing a memorable horror game protagonist**](https://www.gamedeveloper.com/design/the-good-within-designing-a-memorable-horror-game-protagonist) - The survival horror genre has no shortage of leads who share the player's eagerness to go through hell and back to achieve their goal of living to see another day.
[_gamedeveloper.com_](https://www.gamedeveloper.com/design/the-good-within-designing-a-memorable-horror-game-protagonist)
[**Introducing our new e-book: Unity’s Data-Oriented Technology Stack (DOTS) for advanced developers**](https://blog.unity.com/engine-platform/new-ebook-understanding-unity-dots) - This 50+ page e-book, Introduction to the Data-Oriented Technology Stack for advanced Unity developers, is now available to download for free. Use it as a primer to better understand data-oriented programming and evaluate if DOTS is the right choice for your next project. Whether you’re looking to start a new DOTS-based project, or implement DOTS for performance-critical parts of your Monobehaviour-based game, this guide covers all the necessary ground in a structured and clear manner.
[_Unity_](https://blog.unity.com/engine-platform/new-ebook-understanding-unity-dots)
[**Unity Shader Variants Optimization & Troubleshooting Tips**](https://blog.unity.com/engine-platform/shader-variants-optimization-troubleshooting-tips) - Here, I’d like to share a few practical tips on how to handle variants, understand where they are coming from, and some effective ways to reduce them. Your project build time and memory footprint will greatly benefit as a result.
[_Unity_](https://blog.unity.com/engine-platform/shader-variants-optimization-troubleshooting-tips)
## Videos
[](https://www.youtube.com/watch?v=LSNQuFEDOyQ)
[**Lerp smoothing is broken - a journey of decay and delta time**](https://www.youtube.com/watch?v=LSNQuFEDOyQ) - I had to learn differential equations for this oh boy
[_Freya Holmér_](https://www.youtube.com/watch?v=LSNQuFEDOyQ)
[**Collision Detection in my Procedural Animation State Machine**](https://www.youtube.com/watch?v=Ld7V4547d3s) - Collision detection is crucial for my procedural animations, and in this video, I’ll show you exactly how I implemented it in my state machine.
[_iHeartGameDev_](https://www.youtube.com/watch?v=Ld7V4547d3s)
[**Get Started with Custom Unity Packages (Step by Step)**](https://www.youtube.com/watch?v=f2xW24xyDEg) - Learn how to package your custom code in a modular, version controlled way with Unity Custom Packages, that include Dependency management! Understand the difference between a basic unity packaged Asset and a Custom Unity Package. Today we're taking our Improved Timers code from the previous video and packaging it to share with the community! …
[_git-amend_](https://www.youtube.com/watch?v=f2xW24xyDEg)
[**Why Photorealistic And Stylized Graphics Are The Same**](https://www.youtube.com/watch?v=KkOkx0FiHDA) - Despite aesthetic diversity, the math that drives video game and movie lighting is almost always derived from the same exact lighting model. Knowing is half the battle, so today we go over how Disney derived their hallmark principled BRDF and how you could do it yourself!
[_Acerola_](https://www.youtube.com/watch?v=KkOkx0FiHDA)
[**Creator Spotlight: Finding Your Community**](https://www.youtube.com/live/r_LcTspOIe4) - The teams at SF Game Development, RVA Game Jams, and Gumbo Collective Inc. are here to share their stories and tips on how to find your community. We will be welcoming them on our next Creator Spotlight to talk about community building.
[_Unity_](https://www.youtube.com/live/r_LcTspOIe4)
[**Unity - Foil Card Shader Graph**](https://www.youtube.com/watch?v=f-ca4lAf_Gw&t=33s) - I've made a tutorial on how to create a foil card with a 'fake depth' effect.
[_Game Slave_](https://www.youtube.com/watch?v=f-ca4lAf_Gw&t=33s)
[**Damaged Glass Shader In Blender | Tutorial**](https://www.youtube.com/watch?v=PlY-nsNoviI) - Learn how to make a damaged glass shader in Blender.
[_PIXXO 3D_](https://www.youtube.com/watch?v=PlY-nsNoviI)
## Assets
[](https://assetstore.unity.com/?on_sale=true&orderBy=1&rows=96&aid=1011l8NVc)
[**Template Toolkit Sale: Up to 60% off**](https://assetstore.unity.com/?on_sale=true&orderBy=1&rows=96&aid=1011l8NVc) - Get the templates, system packs, and tutorials you need to build your games faster. Discover over 200 assets on sale.
[_Unity_](https://assetstore.unity.com/?on_sale=true&orderBy=1&rows=96&aid=1011l8NVc) **Affiliate**
[**Epic Environments Mega Bundle - Unity & Unreal**](https://assetstore.unity.com/?on_sale=true&orderBy=1&rows=96&aid=1011l8NVc) - Create fantasy worlds in Unity & Unreal. Bring the fantasy world of your dreams to life in your Unity or Unreal Engine game project with this massive bundle of assets! From bustling medieval urbanscapes to ancient temple environments teeming with mystery and majesty, you’ll get tons of modular assets with which to craft play spaces that’ll enchant and transport your players. Also included are a host of characters and props to help make your environments feel convincing and lived-in. Pay what you want for this epic toolkit, valued at over $1,000, and help support Code.org with your purchase.
[_Humble Bundle_](https://assetstore.unity.com/?on_sale=true&orderBy=1&rows=96&aid=1011l8NVc) **Affiliate**
[**unity-prefs-editor**](https://github.com/fish-ken/unity-prefs-editor?) - Unity Player/Editor Prefs Editor
[_fish-ken_](https://github.com/fish-ken/unity-prefs-editor?) *Open Source*
[**Lattice**](https://github.com/Pontoco/Lattice?) - A visual scripting system for Unity ECS. Quickly create gameplay logic.
[_Pontoco_](https://github.com/Pontoco/Lattice?) *Open Source*
[**YNL-Simple-AI-System**](https://github.com/Yunasawa-Studio/YNL-Simple-AI-System?) - Implement basic AI behaviors effortlessly using this straightforward toolkit. It’s designed to handle object behaviors and interactions.
[_Yunasawa-Studio_](https://github.com/Yunasawa-Studio/YNL-Simple-AI-System?) *Open Source*
[**zombie-ai**](https://github.com/baponkar/zombie-ai?) - Advanced Zombie AI or Zombie NPC for Unity Game Engine with State Machine and Behavior Tree Controlled.
[_baponkar_](https://github.com/baponkar/zombie-ai?) *Open Source*
[**junelite**](https://github.com/kleineluka/junelite?) - Make Unity prettier. A free, open-source, and redistributable post-processing stack for Unity and VRChat.
[_kleineluka_](https://github.com/kleineluka/junelite?) *Open Source*
[**UnityDxrTest**](https://github.com/keijiro/UnityDxrTest?) - A testbed project for Unity real-time ray tracing features
[_keijiro_](https://github.com/keijiro/UnityDxrTest?) *Open Source*
[**SimpleUDP**](https://github.com/StrumDev/SimpleUDP?) - SimpleUDP - UDP library for C#
[_StrumDev_](https://github.com/StrumDev/SimpleUDP?) *Open Source*
[**JoltPhysicsUnity**](https://github.com/seep/JoltPhysicsUnity?) - Jolt Physics bindings for Unity
[_seep_](https://github.com/seep/JoltPhysicsUnity?) *Open Source*
[**EasierVRAssets**](https://github.com/kimryan0416/EasierVRAssets?) - A collection of prefabs, scripts, and assets that can be dragged and dropped into Unity for the Oculus Quest. Requires the Oculus Implementations package from the Oculus Store, but makes various aspects about using it easier.
[_kimryan0416_](https://github.com/kimryan0416/EasierVRAssets?) *Open Source*
[**unity-webgl-microphone**](https://github.com/bnco-dev/unity-webgl-microphone?) - Microphone interface for Unity when using WebGL/WebXR build target
[_bnco-dev_](https://github.com/bnco-dev/unity-webgl-microphone?) *Open Source*
[**Responsible**](https://github.com/sbergen/Responsible?) - Reactive asynchronous automated testing utility for .NET and Unity
[_sbergen_](https://github.com/sbergen/Responsible?) *Open Source*
[**50% off Blink - Publisher Sale**](https://assetstore.unity.com/publisher-sale?aid=1011l8NVc) - Blink takes pride in helping independent developers succeed by providing them with innovative tools and high-quality art. Mixing assets from different publishers can be difficult, so Blink Art has created asset types from all categories that can be seamlessly combined together. PLUS, get [100+ Stylized Weapons Bundle - Fantasy RPG](https://assetstore.unity.com/packages/3d/props/weapons/100-stylized-weapons-bundle-fantasy-rpg-204803?aid=1011l8NVc) for FREE with code BLINK2024.
[_Unity_](https://assetstore.unity.com/publisher-sale?aid=1011l8NVc) **Affiliate**
[**Low Poly Game Dev Bundle**](https://www.humblebundle.com/software/low-poly-game-dev-bundle-software?partner=unity3dreport) - Low-poly building blocks. Nail the evocative retro look of the 32-bit era in your next project with this bundle of low-poly game assets, usable on Unity, Unreal, and other game engines big and small! You’ll get thousands of individual assets across dozens of themed packs, allowing you to create everything from awe-inspiring futuristic space colonies, to post-apocalyptic ruins teeming with hazards—plus, all the props you need to bring them to life! Everything in this bundle is in FBX format, so you’ll be able to integrate it all seamlessly, regardless of your workflow. Pay what you want for this bundle of amazing building blocks and help support Save the Children with your purchase!
[_Humble Bundle_](https://www.humblebundle.com/software/low-poly-game-dev-bundle-software?partner=unity3dreport) **Affiliate**
[**Gamedev Market's RPG Adventure Essentials Bundle**](https://www.humblebundle.com/software/gamedev-markets-rpg-adventure-essentials-software?partner=unity3dreport) - Build stunning 2D worlds. Game makers, get ready to supercharge your 2D creations with this massive bundle, overflowing with pixel-perfect assets ready to drop into your next project! You'll get dozens of versatile tilesets, from somber cyberpunk cityscapes to idyllic medieval villages, allowing you to bring the worlds in your imagination to life. Populate them with a vast array of diverse characters, fearsome monsters, and charming critters, and add the finishing touches with slick icon packs, sound effects, and retro-inspired music. Pay what you want for this expansive toolkit, ready to use whatever your specific workflow, and help support the Michael J. Fox Foundation with your purchase!
[_Humble Bundle_](https://www.humblebundle.com/software/gamedev-markets-rpg-adventure-essentials-software?partner=unity3dreport) **Affiliate**
## Spotlight
[](https://store.steampowered.com/app/1043710/Unlanded/)
[**Unlanded**](https://store.steampowered.com/app/1043710/Unlanded/) - Drift your way through space of hazards! In Unlanded you'll drift among colourful environments and bizarre hazards. The game has semi-realistic physics: there is no speed limit, your ship won't stop automatically and its thrust is not instant. Some players may find it confusing, some won't be able to complete the tutorial.
_[You can get it in Early Access on [Steam](https://store.steampowered.com/app/1043710/Unlanded/) and follow them on [Twitter](https://twitter.com/aw79904)]_
[_Eki-Eki-Eki_](https://store.steampowered.com/app/1043710/Unlanded/)
---
[](https://store.steampowered.com/app/2623680/Call_Of_Dookie/)
My game, Call Of Dookie. [Demo available on Steam](https://store.steampowered.com/app/2623680/Call_Of_Dookie/)
---
You can subscribe to the free weekly newsletter on [GameDevDigest.com](https://gamedevdigest.com)
This post includes affiliate links; I may receive compensation if you purchase products or services from the different links provided in this article.
| gamedevdigest |
1,872,068 | Unveiling the Tape Solutions: Prateek Tapes Revolutionizing the Industry | Situate in the heart of India’s capital Delhi, amidst the flurry of industries, stands a tape... | 0 | 2024-05-31T12:54:06 | https://dev.to/prateektapes/unveiling-the-tape-solutions-prateek-tapes-revolutionizing-the-industry-1c0a | tapes, adhesive, prateektapes, adhesivetape |

Situated in the heart of India’s capital, Delhi, amidst the flurry of industries, stands a tape manufacturing company that has been silently revolutionizing the way we perceive adhesive solutions - Prateek Tapes, a part of Lalit Jain Industries Private Limited. Started by Director Mr. Lalit Kumar Jain in the early 90s, Prateek Tapes has emerged as a frontrunner in the tape manufacturing industry, offering a diverse range of adhesive solutions tailored to meet the evolving needs of businesses across various sectors.
Established in the heart of Delhi, Prateek Tapes has carved a niche for itself through its commitment to quality, innovation, and customer satisfaction. Specializing in a wide array of tapes - including but not limited to double-sided tape, adhesive tape, paper tape, double-sided glue tape, packaging tape, white tape, brown tape, gum tape, and sticky tape - Prateek Tapes caters to the diverse requirements of industries ranging from automotive and construction to packaging, electronics, FMCG, and the label industry.
Quality Redefined:
At Prateek Tapes, quality reigns supreme. Each tape manufactured undergoes stringent quality checks to ensure optimal performance and durability. By utilizing cutting-edge manufacturing techniques and premium-grade materials, Prateek Tapes delivers tapes that adhere seamlessly to various surfaces, withstand extreme conditions, and offer long-lasting bonding solutions.
Innovation at Its Core:
Innovation is the cornerstone of Prateek Tapes' success story. The company continually invests in research and development to stay ahead of the curve and introduce innovative tape solutions that address emerging challenges faced by industries. Whether it's developing tapes with enhanced adhesive properties or introducing eco-friendly alternatives, Prateek Tapes is committed to driving innovation that adds value to its customers' operations.
Diverse Product Portfolio:
Prateek Tapes takes pride in its diverse product portfolio, catering to a wide spectrum of industrial applications. From the versatile double-sided tapes ideal for mounting and bonding tasks to the robust duct tapes engineered for heavy-duty applications, Prateek Tapes offers solutions tailored to meet the unique requirements of each industry. The company also specializes in specialty tapes such as electrical insulation tapes, masking tapes, and surface protection tapes, providing comprehensive adhesive solutions under one roof.
Customer-Centric Approach:
Prateek Tapes places utmost importance on customer satisfaction, and its customer-centric approach reflects in every aspect of its operations. The company works closely with clients to understand their specific needs and challenges, offering customized tape solutions that align with their requirements. With a responsive customer support team and a robust distribution network, Prateek Tapes ensures seamless delivery of products and exceptional after-sales service, fostering long-term partnerships with its clients.
Commitment to Sustainability:
In an era marked by growing environmental concerns, Prateek Tapes is committed to sustainability and environmental responsibility. The company adopts eco-friendly manufacturing practices, minimizing waste generation and reducing its carbon footprint. Moreover, Prateek Tapes explores alternative materials and processes to develop biodegradable tapes that meet stringent environmental standards without compromising on performance.
Global Reach, Local Presence:
While Prateek Tapes has garnered recognition on a global scale, the company remains deeply rooted in its local community. With its headquarters situated in Delhi, Prateek Tapes takes pride in its Indian heritage and actively contributes to the socio-economic development of the region.
Driving Industry Standards:
Prateek Tapes doesn't just follow industry standards; it sets them. With its unwavering commitment to excellence, the company has become a benchmark for quality and reliability in the tape manufacturing industry. By adhering to stringent quality control measures and investing in continuous improvement initiatives, Prateek Tapes raises the bar for performance, setting new standards that inspire trust and confidence among its clientele.
Looking Ahead:
As Prateek Tapes embarks on its journey towards greater milestones, the company remains steadfast in its mission to redefine the tape solutions landscape. With a focus on innovation, quality, and customer satisfaction, Prateek Tapes is poised to continue its legacy of excellence, providing adhesive solutions that empower businesses to thrive in an ever-evolving market environment.
In conclusion, Prateek Tapes stands as a testament to the power of dedication, innovation, and integrity. With its unwavering commitment to quality, diverse product portfolio, and customer-centric approach, Prateek Tapes continues to shape the future of the tape manufacturing industry, one adhesive solution at a time.
Check out our Website on **[PrateekTapes](https://www.prateektapes.com/)**
| prateektapes |
1,872,066 | TailwindCSS | Group Selector | Hello my fellow frontend developers, today i will be showing 1 cool feature of tailwind css. ... | 0 | 2024-05-31T12:49:29 | https://dev.to/shubhamtiwari909/tailwindcss-group-focus-b4o | html, css, webdev, tutorial | Hello my fellow frontend developers, today i will be showing 1 cool feature of tailwind css.
## Group Selector
Consider an input field with an icon, say a search icon inside a search input. You want the icon to appear only when the input field is focused. Tailwind's group selector is made for exactly these kinds of situations, where focus or hover on a parent element also affects the styling of its child elements.
## Input Component
```tsx
"use client";
import React, { useState } from "react";
const Input = ({
inputClasses,
iconClasses,
focusClasses,
keepDefaultInputClasses,
}: {
inputClasses: string;
iconClasses: string;
focusClasses: string;
keepDefaultInputClasses?: boolean;
}) => {
const [search, setSearch] = useState("");
return (
<div className="relative overflow-hidden group">
<label
htmlFor="default-search"
className="mb-2 text-sm font-medium text-gray-900 sr-only dark:text-white"
>
Search
</label>
<input
type="search"
id="default-search"
value={search}
onChange={(e) => setSearch(e.target.value)}
className={
keepDefaultInputClasses
? `pl-8 pr-2 py-1.5 rounded-lg min-w-60 lg:min-w-72 focus:outline-none focus:border focus:border-gray-400 ${inputClasses}`
: `${inputClasses}`
}
/>
<div
className={`${iconClasses} ${search.length > 0 ? focusClasses : ""}`}
>
<svg
aria-hidden="true"
className="w-5 h-5 currentColor"
fill="none"
stroke="currentColor"
viewBox="0 0 24 24"
xmlns="http://www.w3.org/2000/svg"
>
<path
strokeLinecap="round"
strokeLinejoin="round"
strokeWidth="2"
d="M21 21l-6-6m2-5a7 7 0 11-14 0 7 7 0 0114 0z"
></path>
</svg>
</div>
</div>
);
};
Input.defaultProps = {
inputClasses: "",
iconClasses:
"absolute inset-y-0 -left-8 group-hover:-left-0 transition-all duration-200 ease-in-out flex items-center pl-2 pointer-events-none",
focusClasses: "",
};
export default Input;
```
* Three elements make up this input component: a label intended specifically for screen readers, the input field, and a search icon.
* The outer div has the Tailwind class `group` applied to it, so every element inside it is part of this group.
* The component accepts a few props:
  * `inputClasses` passes the styles for the input field.
  * `iconClasses` passes the styles for the icon.
  * `focusClasses` is applied whenever the input field has a value, so the icon stays visible even when focus is lost; otherwise the icon only shows while focus is present.
  * `keepDefaultInputClasses` keeps the default input classes and appends any manually provided classes alongside them.
* The `search` state tracks the input field's value.
## Input component usage
```js
<Input
focusClasses="-left-0"
keepDefaultInputClasses={true}
iconClasses="absolute inset-y-0 -left-8 group-focus-within:-left-0 transition-all duration-200 ease-in-out flex items-center pl-2 pointer-events-none"
/>
```
* `focusClasses` here is `-left-0`, meaning this class is applied whenever the input field has a value.
* `keepDefaultInputClasses` is `true`, so the default input classes are used alongside the ones passed in.
* `iconClasses` defines the transition effect for the focus. Initially the icon has the `-left-8` class, which pushes it outside the view on the left side.
* `group-focus-within:-left-0` makes the icon visible by applying `-left-0`, the left edge of the container, while anything inside the group has focus.
* Once focus leaves the input (and the field is empty), the icon gets the `-left-8` class again and slides out of view.
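To distill the idea, here is a tiny sketch (a hypothetical helper, not part of the component above) of the positioning logic: the icon sits at `-left-8` until either the group gains focus or the input holds a value.

```javascript
// Hypothetical helper mirroring the Tailwind classes used above:
// the icon starts off-screen ("-left-8") and slides to "-left-0"
// when the input has a value or anything inside the group is focused.
function iconPosition(hasValue, groupFocused) {
  // Either condition is enough to reveal the icon
  return hasValue || groupFocused ? "-left-0" : "-left-8";
}

console.log(iconPosition(false, false)); // "-left-8" (hidden)
console.log(iconPosition(true, false)); // "-left-0" (visible even without focus)
```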
## Codesandbox
{% codesandbox https://codesandbox.io/embed/jpyqnf?view=editor+%2B+preview&module=%2Fsrc%2Fcomponents%2FInput.tsx %}
THANK YOU FOR CHECKING THIS POST
You can contact me on -
Instagram - https://www.instagram.com/supremacism__shubh/
LinkedIn - https://www.linkedin.com/in/shubham-tiwari-b7544b193/
Email - shubhmtiwri00@gmail.com
You can support me with a donation at the link below. Thank you 👇👇
☕ --> https://www.buymeacoffee.com/waaduheck <--
Also check these posts as well
{% link https://dev.to/shubhamtiwari909/button-component-with-cva-and-tailwind-1fn8 %}
{% link https://dev.to/shubhamtiwari909/microfrontend-react-solid-vue-333b %}
{% link https://dev.to/shubhamtiwari909/codium-ai-assistant-for-devs-57of %}
{% link https://dev.to/shubhamtiwari909/zustand-a-beginners-guids-fh7 %} | shubhamtiwari909 |
1,866,263 | Media and Entertainment Brands Innovate with Brainboard | The landscape of media consumption is evolving rapidly, encompassing advertisers, marketing agencies,... | 0 | 2024-05-31T12:53:00 | https://dev.to/brainboard/media-and-entertainment-brands-innovate-with-brainboard-23he | terraform, infrastructureascode, media, entertainment | The landscape of media consumption is evolving rapidly, encompassing advertisers, marketing agencies, media firms, brands, game developers, carriers, and service providers. To stay competitive and capture market share, brands must create unique, differentiated digital experiences across multiple devices. However, maintaining system security and protecting customer data remains a critical concern. As the saying goes, trust is earned over time and can be lost in an instant.
Leading media and entertainment companies partner with Brainboard to advance their multi-cloud strategies. Discover how Brainboard helps them deliver secure, innovative experiences across multiple digital channels.

> "As a long-time Terraform cloud architect, upgrading to Brainboard made perfect sense. We could offload the responsibility of managing different platforms and services without sacrificing the consistency and efficiency we had built over the years with Terraform."
## Brainboard Accelerates Digital Transformation for Media and Entertainment Brands
### **Native Git Integration**

Brainboard aligns with existing GitOps workflows, offering seamless version control and easy deployment of changes. We support GitHub, GitLab, Azure DevOps, Bitbucket, and GitLab.
### **Visual and Code Accuracy**

Brainboard bridges the gap between visual design and Infrastructure as Code (IaC) with its smart cloud designer, ensuring real-time documentation and versioning of changes.
### **Cost and Security Enforcement**:
With built-in security checks and cost estimations, Brainboard ensures infrastructures are secure and within budget.
### **Full CI/CD Integration**
Our end-to-end integrated engine allows for continuous integration and delivery, enabling seamless deployment.
### **Multi-Cloud Support**
Brainboard supports all major cloud providers, including AWS, Azure, GCP, OCI, Scaleway, Azure AD, Azure DevOps, Azure Stack, and Kubernetes. It allows for design import from AWS and Azure, and auto-generates Terraform code from existing infrastructures, minimizing the learning curve.
### **Collaborative Environment**:
Brainboard provides a shared platform where all stakeholders can understand, contribute to, and maintain the cloud infrastructure collaboratively.
## Brainboard solutions for media and entertainment
- Network Infrastructure Automation: Automate manual tasks across multiple network devices.
- Zero Trust Security: Implement a security model that trusts nothing and authenticates and authorizes everything.
- Application Networking: Automate the network discovery, connection, and deployment of your applications.
## Ready to get started?
[Talk to our technical sales team](https://meetings.hubspot.com/brainboard/discovery) to answer your questions and learn how Brainboard can transform your media and entertainment brand's digital experience.
 | miketysonofthecloud |
1,872,059 | Manticore Search 6.3.0 | We're excited to announce the release of Manticore Search 6.3.0! This version brings a host of... | 0 | 2024-05-31T12:52:33 | https://dev.to/sanikolaev/manticore-search-630-mii | We're excited to announce the release of Manticore Search 6.3.0! This version brings a host of enhancements, new features, and updates, making your search engine even more powerful and user-friendly.
### Vector Search
- **Float vector data type**: We've introduced the [float_vector](https://manual.manticoresearch.com/Creating_a_table/Data_types#Float-vector) data type, which allows you to store and query floating-point number arrays. This is particularly useful for applications that need to perform similarity searches using vector search.
- **Vector search capability**: Coupled with the new data type, the vector search feature enables you to execute k-nearest neighbor (KNN) vector searches. This is ideal for building more intuitive and responsive search functionalities in apps. Read more in the blog post [Vector Search in Manticore](/blog/vector-search/).

### JOIN (beta)
The addition of JOIN capabilities in Manticore Search, although still in beta, represents a significant enhancement to the way users can perform queries and manage data relationships. [Read more in the documentation](https://manual.manticoresearch.com/Searching/Joining).
Example:
```sql
SELECT * FROM purchases AS p LEFT JOIN articles AS a ON a.id = p.article_id;
+------+------------+-------------+------+-------+-------------+
| id | article_id | customer_id | id | title | @right_null |
+------+------------+-------------+------+-------+-------------+
| 1 | 1 | 10 | 1 | book | 0 |
| 2 | 1 | 11 | 1 | book | 0 |
| 3 | 3 | 10 | 0 | | 1 |
+------+------------+-------------+------+-------+-------------+
```
### REGEX
The new [REGEX operator](https://manual.manticoresearch.com/Searching/Full_text_matching/Operators#REGEX-operator) significantly improves how you can search for complex text patterns. This feature is especially important in areas that need very accurate search results, such as analyzing patents, reviewing contracts, and searching for trademarks.
For instance, in data analytics, the REGEX operator can help find specific error codes or programming patterns in log files or code. In academic research, it makes it easier to find articles that use certain citation styles. For trademark searches, this tool is excellent for spotting trademarks that are exactly the same or very similar. This enhancement makes Manticore Search much more powerful and precise for handling detailed and complex searches.
Read more in the [blogpost](/blog/regexp/):

Example:
```sql
SELECT * FROM brands WHERE MATCH('"REGEX(/(c|sea).*crest/) REGEX(/flo(we|u)r/)"')
+---------------------+-----------------+
| id | name |
+---------------------+-----------------+
| 1515699435999330620 | SeaCrest Flower |
| 1515699435999330621 | C-Crest Flour |
| 1515699435999330622 | CCrest Flower |
+---------------------+-----------------+
```
### Range() and histogram()
The new [RANGE function](https://manual.manticoresearch.com/Searching/Faceted_search#Facet-over-set-of-ranges) enhances aggregation, faceting, and grouping by categorizing values into specified intervals. These intervals are defined using `range_from` and `range_to`, which determine the boundaries within which values fall. This functionality allows for effective sorting and analysis of data based on user-defined ranges.
Example:
```sql
select * from test;
+---------------------+-----------+-------+
| id | data | value |
+---------------------+-----------+-------+
| 8217240980223426563 | Product 1 | 12 |
| 8217240980223426564 | Product 2 | 15 |
| 8217240980223426565 | Product 3 | 23 |
| 8217240980223426566 | Product 4 | 3 |
+---------------------+-----------+-------+
SELECT COUNT(*), RANGE(value, {range_to=10},{range_from=10,range_to=25},{range_from=25}) price_range FROM test GROUP BY price_range ORDER BY price_range ASC;
+----------+-------------+
| count(*) | price_range |
+----------+-------------+
| 1 | 0 |
| 3 | 1 |
+----------+-------------+
```
The [HISTOGRAM()](https://manual.manticoresearch.com/Searching/Faceted_search#Facet-over-histogram-values) function in Manticore Search categorizes data into buckets based on a specified bucket size. It returns the bucket number for each value, using `hist_interval` and `hist_offset` parameters to determine the appropriate bucket. The function calculates the bucket key by measuring the distance from the starting point of the bucket, adjusted by the interval size. This feature is especially useful for creating histograms, which group data into specific value ranges for easier analysis and visualization.
Example:
```sql
select count(*), histogram (value, {hist_interval=10}) as price_range from test GROUP BY price_range ORDER BY price_range ASC;
+----------+-------------+
| count(*) | price_range |
+----------+-------------+
| 1 | 0 |
| 2 | 10 |
| 1 | 20 |
+----------+-------------+
```
There are also [date_range](https://manual.manticoresearch.com/Searching/Faceted_search#Facet-over-set-of-date-ranges) and [date_histogram](https://manual.manticoresearch.com/Searching/Faceted_search#Facet-over-histogram-date-values) for similar aggregations with date/time data.
### New commands to simplify data updates and schema management
* [ALTER TABLE ... type='distributed'](https://manual.manticoresearch.com/Updating_table_schema_and_settings#Changing-a-distributed-table) lets you change a distributed table without having to delete it first.
* [CREATE TABLE ... LIKE ... WITH DATA](https://manual.manticoresearch.com/Creating_a_table/Local_tables/Real-time_table#CREATE-TABLE-LIKE:) makes it easy to copy a real-time table along with all its data.
* Use [REPLACE INTO ... SET](https://manual.manticoresearch.com/Data_creation_and_modification/Updating_documents/REPLACE#SQL-REPLACE) for updating parts of records in a table.
* [Attaching one real-time table to another](https://manual.manticoresearch.com/Data_creation_and_modification/Adding_data_from_external_storages/Adding_data_to_tables/Attaching_one_table_to_another#Attaching-one-table-to-another) combines two tables into one.
* Rename a real-time table with [ALTER TABLE ... RENAME](https://manual.manticoresearch.com/Updating_table_schema_and_settings#Renaming-a-real-time-table) to keep your database organized.
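For illustration (the table names here are hypothetical, and the exact syntax is in the linked docs), these new commands look roughly like:

```sql
-- Copy a real-time table along with all of its data
CREATE TABLE products_backup LIKE products WITH DATA;

-- Update only part of an existing record
REPLACE INTO products SET title = 'updated title' WHERE id = 1;

-- Rename a real-time table
ALTER TABLE products_backup RENAME products_archive;
```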
### Replication-related changes
Significant changes have been made in the replication area to improve the process of data transmission between nodes. A replication error when transferring large files has been fixed, a mechanism for retrying command execution has been added, and network management during replication has been improved. Issues with blocking during replication and attribute updates have also been resolved, and nodes joining a cluster can now skip replication update commands. All these changes increase the efficiency and reliability of the replication process in various usage scenarios.
For detailed information about the changes, see [here](https://manual.manticoresearch.com/Changelog#Replication-related-changes).
### License change and performance optimizations
We've changed the Manticore Search license to GPLv3-or-later. This new license offers better legal safety for users and works better with other open-source licenses. This change shows our dedication to meeting the needs of the community and keeping open-source software strong. In version 6.3.0, we added the Apache 2 licensed [CCTZ library](https://github.com/google/cctz), which makes date/time functions much faster. Look at the improvement:
Before:
```sql
mysql> select count(*),year(time_local) y, month(time_local) m from logs10m where y>2010 and m<5;
+----------+------+------+
| count(*) | y | m |
+----------+------+------+
| 10365132 | 2019 | 1 |
+----------+------+------+
1 row in set (8.26 sec)
```
Now:
```sql
mysql> select count(*),year(time_local) y, month(time_local) m from logs10m where y>2010 and m<5;
+----------+------+------+
| count(*) | y | m |
+----------+------+------+
| 10365132 | 2019 | 1 |
+----------+------+------+
1 row in set (0.11 sec)
```
The query is now 75 times faster.
We have also improved how tables are compacted. Previously, when merging disk chunks, Manticore removed deleted documents from any chunks that had them, using a lot of resources. We have stopped using this method. Now, merging chunks is managed only by the [progressive_merge](https://manual.manticoresearch.com/Server_settings/Common#progressive_merge) setting, which makes the process simpler and less heavy on resources.
### Ubuntu Noble 24.04
Ubuntu Noble 24.04 is now supported.

### And many more
The updates highlighted above are just a part of the many improvements included in Manticore 6.3.0. Please read about:
🚀 9 major changes
✅ 50+ minor changes
🐞 120+ bug fixes
in the [changelog](https://manual.manticoresearch.com/Changelog).
We hope you enjoy the new features and improvements in Manticore Search. We welcome your feedback and encourage you to engage with us by:
* Starting a discussion on our [Community Forum](https://forum.manticoresearch.com)
* Reporting bugs or suggesting new features on [GitHub](https://github.com/manticoresoftware/manticoresearch/issues/new/choose)
* Joining the conversation in our [Public Slack Chat](https://slack.manticoresearch.com/)
* Emailing us directly at `contact@manticoresearch.com` | sanikolaev | |
1,840,137 | So you want to make a podcast | On my drive home from DjangoCon 2023 I came up with a wild idea. I should make a podcast about... | 0 | 2024-05-31T12:50:48 | https://dev.to/adamghill/so-you-want-to-make-a-podcast-242c | podcast, django, webdev | On my drive home from [DjangoCon 2023](https://2023.djangocon.us) I came up with a wild idea.
I should make a podcast about Django.
But, I didn't want to make a podcast by myself! However, I had just worked with my coworker and friend, Sangeeta, to create [quirky stickers](https://djangostickers.com) to give out at DjangoCon, and she would be the perfect co-host.
There are already a few podcasts about Django available, but I thought we could do a fun spin on it. And similar to when we decided, spur of the moment, to sign up to give a Lightning Talk at DjangoCon, the idea was half-exciting and half-terrifying.
We both listen to a lot of podcasts so we already had some ideas about what makes good content and what doesn't. And between the Django stickers and an internal work presentation we co-created in the past, Sangeeta and I know how to work well together on a creative project.
We talked about a few other topics we could make a podcast about, but we kept coming back to Django -- we both use it daily, have lots of production experience with it, have opinions about it, _and_ it seemed like a nice way to give something back to the Django community.
From inception (October 2023) to the first episode launch (March 2024) of [Django Brew](https://djangobrew.com), it has been a huge learning curve. We knew what we wanted the podcast to be like, but we didn't know how to create it. We still aren't professional podcasters or anything, but we are three episodes in, so we know more now than we did at the beginning!
## Decisions, decisions
First, decide on what sort of content you are uniquely qualified to talk about. Lean into your "unfair advantage" because that will resonate best with the audience. For us, we are passionate about Django and the community. I think all of those qualities come through in the podcast. If we didn't care as much, the audience would instinctively know.
Then, decide on what kind of podcast you want to make. There are interviews, news, tutorials, long-winded opinion shows, and lots of things in-between. We knew of a few interview shows for Django, but fewer where developers talked about their experience building with Django.
And decide why your podcast will be different from all the others. Our goal was to make the podcast like two friends talking about Django -- we wanted it to be funny, full of excitement, and a little bit goofy. We try to capture our personalities while talking about a framework that we love. I couldn't (and wouldn't want to!) make a podcast by myself. My favorite podcasts have 2 hosts because the dynamic completely changes with a second person -- I'm definitely not engaging enough to host a podcast by myself. 😂
## Equipment
Originally, we tried iPhone Bluetooth mics, laptop mics, and a few other corded and cordless mics. But, the sound quality wasn't great and we kept reading that a better mic would help, so we decided to splurge and each got a dedicated microphone.
### Microphone
The [Samson Q2U](https://www.amazon.com/gp/product/B07K1XSDZP/) came highly recommended from Reddit and [Transistor's podcast equipment guide](https://transistor.fm/how-to-start-a-podcast/podcasting-equipment-guide/). We've been pretty happy with the sound quality -- they aren't super high-end (we can't all be Wes Bos!), but for $100 for a mic, boom, shock mount, and pop filter it seemed reasonable.
## Software
We tried a few different software setups before settling on our current setup. I am sure there are less time-intensive ways to record a podcast, but this is the software and process we currently use. For other types of podcasts (or more seasoned professionals!), it might be completely different.
### Notion
Sangeeta and I use the free tier of [Notion](https://www.notion.so) to brainstorm episode topics, write scripts, and keep a rough outline for each episode.
### Riverside
We use the free tier of [Riverside](https://riverside.fm) to record the actual podcast audio. One benefit over Zoom is that there is no time limit for the "meeting". It also has the ability to export each person's audio separately which makes the editing of the final audio easier. We don't use any other features in Riverside other than recording audio, but might look into it more in the future.
There _is_ a limit in Riverside for how much recorded content can be stored every month, but we don't currently hit that limit.
### Audacity
After recording each session, I export each audio track as a WAV and import it into [Audacity](https://www.audacityteam.org). There is a little bit of a learning curve with Audacity, but it is free and rock-solid. For the limited things I want to do (i.e. cut/copy/paste, silence, limit, fade-in, and fade-out) it works great.

For a 20-30 minute episode, we tend to record 2-3 hours of audio over multiple sessions (that includes lots of re-takes, general chatting, and some planning of what to talk about next)!
I do rough cuts of all of the audio together and then I send drafts to Sangeeta for her to listen to and give me notes. We'll go back and forth over multiple drafts, making notes for each other, and deciding on different parts we want to change or re-record. It probably takes at least a few hours to do all of the editing.

As part of the editing process, I also grab anything particularly goofy and keep another project with all of our episode bloopers.
### Ableton
Once we get close to a finished product, I send the episode and bloopers to Sangeeta who uses [Ableton](https://www.ableton.com) for the final edits. Ableton is not free, but Sangeeta had used it before for other projects and had already paid for it.
She adds the intro and outro music (from [Pixabay](https://pixabay.com/music/)) and goes through the episode one more time to tweak audio levels and do a general clean up.
Then, she edits all of the bloopers together (side note: this is my favorite part of every episode). Most episodes have 5+ minutes of unedited bloopers, but she grabs only the best ones and stitches them together.
After that's all done, Sangeeta exports the final cut and uploads it to Buzzsprout.
### Buzzsprout
[Buzzsprout](https://www.buzzsprout.com/) is where we host the podcast episodes. It pushes new episodes to _all_ of the podcast aggregators, stores the episodes, provides a handy embeddable player, and offers podcast analytics. Personally, I am not sure how accurate the podcast analytics are, but it is nice to get a sense of whether people are listening to the episodes.

### Coltrane
We used the built-in Buzzsprout hosted website for a little while, but eventually wanted more control over the styles and content. I grabbed all of the HTML and CSS from the hosted site and used my own mini web framework, [Coltrane](https://coltrane.readthedocs.io), to build a static site. We have a few ideas about potential further enhancements for the site in the future, but for now it's good enough for our purposes.
## Process
For most of our episodes, it's taken around a month to decide what to talk about, record, and edit. That's a long time! However, we only work on it on nights and weekends because we have day jobs, and other life priorities tend to get in the way!
For some episodes we have written out a basic script for what we wanted to talk about. For others, we had an outline, but it was recorded more "off the cuff". I don't think we have nailed down a perfect process yet, but we are still working through it. I think we'll also get better at talking "on-mic" over time, but like with most things practice makes ~~perfect~~ better.
## Publishing
After we upload the new episode to Buzzsprout, we write custom show notes with links to everything we talked about. The new episode will automatically show up on the website once it's published, but we also write a post for [Twitter](https://twitter.com/djangobrew) and [Mastodon](https://fosstodon.org/@djangobrew) to notify people on those platforms.
## Conclusion
That's our current process to record the [Django Brew](https://djangobrew.com) podcast! I'm sure we'll learn new techniques and the process will change over time, but we've been pleasantly surprised by the community engagement and we look forward to creating more podcasts in the future!
If you are interested in Django at all, please check out [Django Brew](https://djangobrew.com) and let us know what you think. Or just listen to our bloopers and see how often we mess up while recording. 😉
>Thanks to Sangeeta Jadoonanan for reading the first draft and giving some helpful feedback. And also being my podcast co-host.
>Photo by <a href="https://unsplash.com/@austindistel?utm_content=creditCopyText&utm_medium=referral&utm_source=unsplash">Austin Distel</a> on <a href="https://unsplash.com/photos/two-black-headphones-on-brown-wooden-table-VCFxt2yT1eQ?utm_content=creditCopyText&utm_medium=referral&utm_source=unsplash">Unsplash</a>
| adamghill |
1,872,067 | Viruses for Beginner Developer 🦠 | Dengue virus : Family Flaviviridae genus Flavivirus single-stranded RNA virus with 10,700... | 0 | 2024-05-31T12:50:19 | https://dev.to/keshavgbpecdel/viruses-for-beginner-developer-558e | webdev, beginners, viruses | ## Dengue virus :
- Family Flaviviridae
- genus Flavivirus
- single-stranded RNA virus with 10,700 bases
Note : `Flavivirus includes - yellow fever virus, West Nile virus, Zika virus`
## Nipah virus (NiV) :
- zoonotic virus [animals to humans]
Note : `Smallpox vaccine contains live vaccinia virus`
## Adeno Viruses :
- causes airway and cold symptoms: sore throat, runny nose, cough
### We developers should always remember :
Methanol : toxic (even in very small quantities)
- Methanol is oxidized to methanal in the liver
- Methanal reacts with cell components
- It coagulates the protoplasm
- Methanol also affects the optic nerve, which can cause blindness
### Extra
- Lysozyme - antibacterial enzyme in tears & saliva
- Eosinophils - white blood cells that fight fungal and parasitic infections
- Monocytes - white blood cells of the immune system that destroy germs and bacteria
- Pulse - the rhythmic contraction & relaxation of the aorta & main arteries
| keshavgbpecdel |
1,872,190 | Spend 2 hours discussing, and make the wrong decision anyway. | For more content like this subscribe to the ShiftMag newsletter. It can be hard to detect when a... | 0 | 2024-06-04T09:17:03 | https://shiftmag.dev/unproductive-meetings-software-engineering-3430/ | productivity, developerproductivit, unproductivemeetings | ---
title: Spend 2 hours discussing, and make the wrong decision anyway.
published: true
date: 2024-05-31 12:49:56 UTC
tags: Productivity,DeveloperProductivit,unproductivemeetings
canonical_url: https://shiftmag.dev/unproductive-meetings-software-engineering-3430/
---

_For more content like this **[subscribe to the ShiftMag newsletter](https://shiftmag.dev/newsletter/)**._
It can be hard to detect when a meeting is unproductive, especially when others share the illusion.
You gained nothing from it. However, there is a good chance that someone in that meeting feels like that time was well spent. Unfortunately, your time was wasted.
Allow me to illustrate.
## What makes code the way it is?
> Any organization that designs a system […] will produce a design whose structure is a copy of the organization’s communication structure.
>
> <cite>Melvin Conway (Conway’s Law)</cite>
This is simple: if we have a problem, a program is a set of instructions to the computer that tells it what to calculate to solve our problem.
But we all know that code is abstract and full of patterns and idioms that are mostly concerned with how we, humans, understand that code. **Our entire systems reflect the organizations we work in and our business processes**.
In other words, the shape of the problem is not the only thing that dictates the shape of the solution.
What’s more important about this topic is that different people often have different ideas about what our system is and what it should be.
## The projects we work on
I’m sure you’ve encountered problems that are easy to describe and discuss, but the code that implements that supposedly simple solution is **completely incomprehensible and unmaintainable.**
I once worked on a project that had gone through the 9 Circles of Hell you might be familiar with:
- Unclear concept,
- Developers unfamiliar with the domain,
- Experimental solutions to unfamiliar problems,
- Changing requirements,
- Scope creep,
- Tough deadlines,
- Hacks upon hacks,
- Tech debt,
- Changing team members.
We had to build a boat…
…but the boat was **supposed to be a Zeppelin for a while, but then we figured it needed big tractor tires**. And also a rocket engine just in case.
The “obvious” “solution” is to strip away all the cruft and make the boat the way an actual boat should be. Unfortunately, you can’t just wish a change like that into existence. The boat should be a boat, but right now, it isn’t one.
## Navigating the Sea of Requirements
Every few weeks, **an unexpected issue or an urgent new requirement pops up**. These require us to make course corrections, which is sometimes hard to do because turning the rudder requires editing complex SQL procedures while maintaining backward compatibility.
When the system is this complicated, sometimes the only thing keeping the boat from capsizing is a well-balanced stack of bricks on the top of the mast. When making changes, even if you make an effort to balance the boat in a better way, you still have to be careful when removing those bricks. Otherwise, a barrage of previously valid but now malformed requests might start hitting some REST endpoint, producing useless logs. Now you’ve got a whole bunch of leaking holes to plug, and your carpets (traffic metrics) are ruined.
Of course, **we are really careful**. We practice code review and write lots of tests. We make sure we discuss important changes thoroughly. We include all the stakeholders and seek external advice.
But there is a sneaky issue, a fog preventing us from communicating effectively, but making us feel like we are.
**There is simply too much to handle.**
1. No single person knows how everything on the boat works.
2. No individual can monitor and coordinate everyone’s work.
3. No one can keep track of **all** the changes everyone makes.
4. Changing one thing almost always affects something else unpredictably.
5. **When everyone has a limited perspective on the boat and its course, we often talk about different things,** **using the same words**.
## Differing perspectives
Extending our sailing analogy, let’s imagine a new requirement pops up: **we have to go around an Iceberg**. We schedule a meeting to decide on how to proceed. All the relevant people are invited:
- **The Project Manager** _(they ultimately decide on where we’re going)_,
- **All 6 developers** _(less chance of forgetting something)_,
- **3 Support Specialists** _(they have to answer questions from users)_,
- **The Iceberg Committee** _(it’s their Iceberg, after all)_.
**Here is how everyone sees the problem and their solutions** :
**The Project Manager** : Believes they are captaining a proper boat. It’s got some weird but minor issues. Simply steer, send a signal to the Iceberg’s API, and realign with the destination.
**The developers** : Each has their own idea of how the boat should look. Each one is aware of different issues and strange dependencies on the boat. Only one knows that steering left hasn't worked since the last release (this isn't a big deal because our destination is moving to the right anyway). Steer right, take care that the bricks don't fall from the mast, and then carefully readjust everything to continue straight to the destination.
**The Support Specialists** : They are still putting out fires from last week. They don’t want to be here but have to be better prepared for the next incident. They don’t care if the boat has to fly; just don’t hit that Iceberg, _please_.
**The Iceberg Committee** : For them, this is just another Tuesday. They have a solid, time-proven API with great documentation, and no one has any issue registering with it. They will provide documentation and claim they are here for any questions. They won’t respond because they “just read the docs,” and they have more important things to do.
## Communication breakdown
Imagine **a timeline of the meeting** :
- Alice suggests turning right.
- The devs discuss for a bit and figure it’s nothing they can’t solve.
- The PM mentions in passing that the boat needs to ping the Iceberg’s API.
- Bob remembers the tractor tires on the side of the boat. It is not clear where to fit the REST cannon.
- The PM suggests going left. This confuses the devs since they thought our goal was to the right.
- Alice suddenly remembers that the boat can’t turn left.
- Now the discussion turns to this problem. Why? Can it be fixed?
- The devs try to decide whether they have time to fix this.
- The PM thinks outside the box and suggests they fly over, like a blimp. The discussion shifts again.
- The team gives another shot to the idea of going right.
- They go more in-depth about building the REST cannon.
- One dev still has questions about flying over; he still thinks it’s a viable idea.
- The PM likes the sound of it.
By the time the team gets back to discussing going right, everyone has already forgotten the reasons they don’t have time to make it work.
After rehashing all the ideas a few more times and rediscovering problems three times over, they settle on going left.
## The aftermath
Only when someone starts working on the task do they remember that **they don't have time to fix turning left**. They quickly re-plan the tasks to go right.
But alas, no one remembered to tie up the bricks.
In the end, after two hours of discussions, **the team made the same decision they could have made in five minutes** , including the same oversights.
Some members believe the discussion went well because many ideas were shared, and that the problem is how badly the tasks were written.
Other members think that the discussion was worth it, even though it was a mess. They think it would have turned out even worse with less “planning”.
To the Support Specialists: this meeting could have been an email.
## Dispelling the illusion
There are multiple reasons why someone might feel this meeting had been useful. Perhaps even necessary. You may have even learned some things.
Even if the meeting had been a net positive, could you have learned more by working on something or by asking directed questions?
The sneakiest illusion might be hidden by the fact that **some decision was made**. Some progress was made. Something exists after the meeting that hadn't existed before. In reality, even more progress may have been made if the meeting hadn't occurred at all.
## Managing discourse
Sharing ideas about a complex system is complex in itself. The meeting was unproductive because we didn’t approach it in a structured manner. Aside from practicing communicating effectively and exactly, I have an actionable suggestion:
Consider creating a **persistent** , authoritative document that describes your project. I can’t tell you exactly how it should look, but it must be **continuously referenced** and updated in every discussion.
All tasks and meeting notes should reference this single source of truth. It should serve as a force that pulls every discussion towards facts and keeps everyone on the same page (as much as possible).
## Instead of a conclusion…
…remember that the key is **recognizing when discussions become unproductive**. Discussions about complex systems should be structured and lead to verifiable conclusions.
Remember, this is hard. It's hard to notice when you yourself are not communicating effectively, so don't forget to be kind to others.
We don’t need to be perfect, but we can do better.
The post [Spend 2 hours discussing, and make the wrong decision anyway.](https://shiftmag.dev/unproductive-meetings-software-engineering-3430/) appeared first on [ShiftMag](https://shiftmag.dev). | shiftmag |
1,872,064 | Romantic Abroad Vacation Spots in 2024 and Essential Tips for travel | The chances of self-growth and a different life view that may be obtained through visiting some new... | 0 | 2024-05-31T12:48:09 | https://dev.to/dennisbell/romantic-abroad-vacation-spots-in-2024-and-essential-tips-for-travel-5dg5 | rome, meetandgreetatmanchester, valetparking, publictransport |

Visiting new destinations offers huge opportunities for self-growth and a fresh perspective on life. Only travel lets us get into the thick of things, broadening our horizons and giving us amazing experiences that stay with us for the rest of our lives. Whether you are finding your way through the ancient ruins of [Rome](https://en.wikipedia.org/wiki/Rome), swimming at a beach in Bali, or exploring the colorful culture of Tokyo, travel offers that same opportunity. For the many of us planning a long-awaited overseas vacation, 2024 is shaping up to be a great year for tourism. Such an undertaking, however, also requires handling logistics, such as finding a parking place at the airport.
This article details some of this year's most popular international tourist spots and compares different methods of managing car parking at UK airports. The advice we shall cover includes using parking platforms, thinking through alternative parking choices, taking advantage of valet parking, and using public transport. By the end of this article, you should be better equipped to plan your next trip abroad while enjoying the peace of mind of leaving your own car at a UK airport car park.
## Paris, France:
Paris, the City of Light and Love for short, is definitely worth a visit for anyone who wants to immerse themselves in world-class art, fashion, and cuisine. Paris is a historic city with many monuments, such as the Eiffel Tower and the Louvre Museum, and travelers should make every effort to explore them. Green parks, romantic Seine cruises, and charming cafés are further highlights of the city. And with designer boutiques alongside antique shops, it is no surprise that this global trendsetter in fashion also offers one of the most wonderful shopping experiences in the world.
## Rome, Italy:
The Eternal City, Rome, with its tangible history and architecture, is a frequent choice for visitors. Rome is a city for all tastes: here you find the remains of one of the largest empires in history, such as the Pantheon and the Colosseum, all within walking distance of each other. In Italian cuisine you will find some of the best, healthiest, and tastiest food in the world. Visitors can also see Vatican City, home to the Sistine Chapel and St. Peter's Basilica, or throw a coin into the Trevi Fountain to acquire good fortune.
## Bangkok, Thailand:
Bangkok has risen to become one of the world's most loved travel spots for anyone who wants to take in the cultural diversity of Southeast Asia. The city is famed for its vibrant market streets, the scrumptious aromas of its street food, and its aesthetically pleasing religious temples, though it is also known for its rampant pollution, lack of green space, and patchy development. Activities tourists love include visiting the Grand Palace, taking a boat trip along the Chao Phraya River, and having a traditional Thai massage. The city also has a vibrant nightlife, with a choice of nightclubs and rooftop bars open at all hours.
## Sydney, Australia:
Sydney, Australia's largest city, gives visitors the chance to enjoy a range of experiences, from busy city life to beach life, which makes it a richly diverse destination. Bondi Beach is a must-see on your itinerary, and you should also climb to the top of the Sydney Harbour Bridge and see the Opera House. If you want a more chilled outing, take a boat to Manly Beach, or enjoy a tour of the Royal Botanic Gardens.
## Managing Airport Parking:
When travelling overseas, managing airport parking may be a difficult experience, particularly if you are not familiar with the airport or the parking alternatives available in the area. To assist you in navigating it without any difficulty, here are some tips:
## Make a plan in advance:
First things first: look at parking offers in advance when you are planning to travel from an airport, so you can find one that suits you. Reserved vehicle spots can be pre-booked at almost every airport, including Manchester, which gives you the chance to save both time and money. The wonderful thing about booking online is that you can guarantee your parking spot in advance, so your trip gets off to a good start every time.
## Utilise airport parking applications:
Apps from companies such as EzyBook generally help you find a space that is not full, may give you driving directions to the parking area, and can even be used to pay for a parking spot in the app. To illustrate, booking a [**meet and greet at Manchester**](https://www.ezybook.co.uk/car-parking-manchester.php) requires you to make a reservation beforehand, which then gives you shuttle bus guidance, accurate information on available parking spots, and real-time updates.
## Think about parking off-site:
There are instances when parking off-site can be a more convenient and cost-effective alternative than parking at the airport itself. Shuttle services to and from the airport are offered by a number of off-site parking sites, making it simple to travel to your terminal using these services. However, before making a reservation, it is imperative that you look into the security and dependability of the parking lot that is located off-site.
## Take advantage of valet parking:
If you are pressed for time or do not wish to deal with the inconvenience of finding a parking place, you might consider the meet and greet parking option. Many airports offer these services, which can save you time and make parking less stressful. Just be sure to check which affordable parking deals are available before you arrive at Manchester Airport.
## Think about taking public transport:
If you are staying at a hotel that is close to the airport or if you live in the region, you should think about utilising public transport to access the airport. Train and bus stations are located in many airports, and these stations can give convenient access to the terminals. It is possible that this alternative is less harmful to the environment and more cost-effective than driving to the airport and purchasing parking there.
## Simply put:
Organising a vacation to a well-known overseas location for the year 2024 may be an exciting experience; yet, navigating the parking lot at the airport can be a challenging endeavour. You will be able to make your experience of parking at the airport stress-free and concentrate on enjoying your vacation if you plan ahead, use airport parking deals apps, explore parking off-site, take advantage of valet parking, and evaluate the choices available for public transit. Regardless of whether you decide to travel to Paris, Rome, Bangkok, or Sydney, you should make it a point to spend sufficient time seeing these remarkable locations and making experiences that will last a lifetime. | dennisbell |
1,872,063 | Urgent Courier Service From National Couriers Direct | A post by National Couriers Direct | 0 | 2024-05-31T12:48:01 | https://dev.to/johnadwardsnick/urgent-courier-service-from-national-couriers-direct-47g6 | [](https://youtu.be/I24wr2SKLd0?si=HwB_u4-lPv9ZSYvm)
| johnadwardsnick | |
1,872,060 | Ship Faster, Learn Sooner: Validating Your Product Idea with Minimum Viable Products (MVPs) | Hey developers! Ever had a brilliant product idea but weren't sure if there was a real need for it in... | 0 | 2024-05-31T12:46:51 | https://dev.to/cygnismedia/ship-faster-learn-sooner-validating-your-product-idea-with-minimum-viable-products-mvps-46p0 | startup, development, tutorial, learning | Hey developers! Ever had a brilliant product idea but weren't sure if there was a real need for it in the market? Building a full-fledged product can be a risky and time-consuming gamble. This is where Minimum Viable Products (MVPs) come in to save the day!
[This](https://www.cygnismedia.com/blog/types-of-mvp-for-startups-with-examples/) blog post dives deep into the world of MVPs, your secret weapon for validating product ideas efficiently. You'll learn:
- The core concept of MVPs and their purpose in the development lifecycle.
- Different MVP types you can leverage, from single-feature MVPs to concierge MVPs, depending on your project needs.
- Actionable tips: How to craft an effective MVP strategy to gather valuable user feedback and iterate quickly.
- Real-world inspiration: Explore successful MVP examples from industry giants like Slack and Uber, and see how they used MVPs to test their concepts and achieve product-market fit.
Building an MVP allows you to focus on core functionalities, gather user insights early, and avoid wasting time and resources on features nobody wants.
Ready to ship faster and learn sooner with MVPs? Check out the full blog post for a developer-friendly guide to MVP development: [The MVP Blueprint: 10 Inspiring Examples to Launch Your Product](https://www.cygnismedia.com/blog/types-of-mvp-for-startups-with-examples/) | cygnismedia |
1,858,328 | How to switch or update PHP version in Laragon | Laragon is a portable, isolated, fast & powerful universal development environment for PHP,... | 0 | 2024-05-31T12:44:48 | https://dev.to/murizdev/how-to-switch-or-update-php-version-in-laragon-1k3n | php, laragon, windows, tutorial | Laragon is a portable, isolated, fast & powerful universal development environment for PHP, Node.js, Python, Java, Go, Ruby. It is fast, lightweight, easy-to-use and easy-to-extend.
Unlike other development environments, in Laragon you can change the version of the programming language and database used in Laragon, which is what makes Laragon easy to use and extend.
However, Laragon has not received updates since September 16, 2022. And if you want to find out how to update the programming language or database it uses, it can be a little difficult because Laragon itself has no documentation about it.
## How to switch or update PHP version
First of all, make sure you have Laragon installed on your Windows, you can download Laragon via this link [laragon.org/download](https://laragon.org/download/)
### Download PHP version for Windows
Search for the PHP version you want to install on Windows, you can find it via this link [windows.php.net/download](https://windows.php.net/download/).
Choose according to the Windows architecture you are using (x64 or x86) and decide whether you want the thread-safe or non-thread-safe build; I recommend choosing thread safe.
### Download the latest version of Apache
If you choose to update PHP to a higher version, don't forget to also update Apache. I recommend downloading the latest version, which you can get via this link [apachelounge.com/download](https://www.apachelounge.com/download/)

### Extract your PHP and Apache to Laragon
If you have successfully downloaded your desired versions of PHP and Apache, you can extract them into Laragon.
Extract PHP into `laragon > bin > php` folder

Extract Apache into `laragon > bin > apache` folder

### Edit the system environment variables
After you have successfully extracted them, don't forget to edit your system environment variables; you can type `env` in Windows search to open the menu.
In the system variables section, open the `Path` variable and add the paths of the PHP and Apache versions that you downloaded earlier.

Note: move the entry for the PHP version you want to use to the top, as in the picture, where I want to use version 8.3 so I put its path at the top.
Once done, you can press the OK button until the system environment variables menu closes, and then check whether your PHP and Apache versions have changed using the terminal.

Good, you have successfully changed it, don't forget to also set the PHP and Apache versions that you are using in Laragon.


You can exit Laragon to restart it after you have finished setting the versions of PHP and Apache. Good luck... | murizdev |
1,872,058 | What game | I don't know | 0 | 2024-05-31T12:43:48 | https://dev.to/_b5d52782547112/what-game-448l | I don't know
 | _b5d52782547112 | |
1,872,057 | Automating Payroll with HRM: A Comprehensive Guide | In today's fast-paced business environment, efficiency and accuracy are paramount.... | 0 | 2024-05-31T12:42:57 | https://dev.to/liong/automating-payroll-with-hrm-a-comprehensive-guide-55lm | human, functional, malaysia, automate | In today's fast-paced business environment, efficiency and accuracy are paramount. Automating payroll through Human Resource Management (HRM) systems is a transformative approach that addresses both needs. This article explores the advantages, key features, and implementation steps of payroll automation within HRM systems.
**The Need for Payroll Automation**
Manual payroll processing is fraught with challenges. From human errors in calculations to compliance risks caused by ever-changing tax regulations, the traditional approach can be both inefficient and costly. Automated payroll systems embedded in HRM solutions offer a robust alternative, streamlining processes and ensuring that payroll is handled with precision and compliance.
**Key Advantages of Payroll Automation**
**Enhanced Accuracy and Reduced Errors**
Automated payroll systems minimize human errors. By integrating data from various HR functions, such as attendance and time tracking, these systems ensure that calculations for salaries, taxes, and benefits are precise. This accuracy is essential for maintaining employee trust and avoiding legal complications.
**Compliance and Regulatory Adherence**
Keeping up with tax laws and labor regulations can be daunting. HRM systems are regularly updated to reflect the latest legal requirements, ensuring that payroll processing remains compliant. This reduces the risk of penalties and audits.
**Time and Cost Savings**
Automating payroll frees HR staff from repetitive tasks, allowing them to focus on strategic initiatives. The reduction in manual processing also cuts administrative costs and decreases the likelihood of costly mistakes.
**Improved Data Security**
Payroll data is sensitive and requires stringent security. HRM systems employ advanced security measures, including encryption and role-based access controls, to safeguard employee data against unauthorized access and breaches.
**Scalability for Growing Businesses**
As organizations expand, managing payroll for a larger workforce can become complex. Automated systems can scale easily, accommodating growth without additional strain on HR resources.
**Enhanced Employee Experience**
Self-service portals in HRM systems let employees access their pay stubs, tax documents, and other payroll information at their convenience. This transparency and accessibility improve employee satisfaction and reduce the HR workload.
**Comprehensive Reporting and Analytics**
Automated payroll systems provide detailed reporting capabilities. HR professionals can generate reports on payroll expenses, tax liabilities, and workforce trends, offering valuable insights for strategic planning and decision-making.
**Essential Features of Automated Payroll Systems**
**Integration with Time and Attendance**
Accurate payroll processing depends on reliable data about employee working hours. HRM systems often integrate seamlessly with time and attendance tracking tools, ensuring that payroll calculations are based on correct, up-to-date information.
**Customizable Payroll Configurations**
Different organizations have different payroll requirements. Automated systems offer customizable configurations to accommodate diverse pay structures, employee classifications, and compensation models, ensuring flexibility and accuracy.
**Automated Tax Calculations and Filing**
Keeping up with tax regulations can be difficult. HRM systems automate tax calculations and filings, ensuring compliance with local, state, and federal regulations. This reduces the risk of errors and penalties associated with tax submissions.
**Multiple Payment Methods**
By offering multiple payment options, including direct deposit, checks, and electronic transfers, automated payroll systems cater to diverse employee preferences. This flexibility enhances employee satisfaction and ensures timely payments.
**Detailed Audit Trails**
Maintaining an audit trail is essential for compliance and accountability. Automated payroll systems record all transactions, providing a transparent and traceable history of payroll activity that can be reviewed during audits.
**Implementing Payroll Automation**
**Assess Organizational Needs**
Begin by evaluating your organization's specific payroll needs. Identify current pain points, such as error rates, compliance issues, or time-consuming processes. Define clear objectives for what you aim to achieve with payroll automation.
**Select the Right HRM System**
Choosing the right HRM system is essential. Consider factors such as scalability, ease of use, integration capabilities, and the specific features that align with your payroll requirements. Ensure the system can grow with your business.
**Data Migration and System Integration**
Transitioning from a manual to an automated payroll system requires careful planning. Ensure that all payroll data is accurately migrated to the new system. Integrate the HRM system with other business software, such as accounting and ERP systems, to improve overall efficiency.
**Training and Support**
Proper training is crucial for a smooth transition. Provide comprehensive training sessions for HR personnel and employees to familiarize them with the new system. Offer ongoing support to address any questions or issues that arise.
**Continuous Monitoring and Improvement**
After implementation, continuously monitor the performance of the automated payroll system. Collect feedback from users and make adjustments as needed to optimize its functionality. Regularly review system outputs to ensure accuracy and compliance.
**Conclusion**
Automating payroll with HRM systems is a strategic move that offers numerous benefits, including improved accuracy, compliance, and efficiency. By streamlining payroll processes and enhancing data security, HRM systems empower companies to manage their workforce more effectively. As businesses grow and evolve, embracing payroll automation becomes essential for maintaining operational efficiency and achieving long-term success.
| liong |
1,872,056 | Part 3 Angular's ngOnInit: Your Key to Component Initialization Excellence | Understanding Angular ngOnInit Lifecycle Hook Introduction In the world of... | 0 | 2024-05-31T12:39:13 | https://dev.to/chintanonweb/part-3-angulars-ngoninit-your-key-to-component-initialization-excellence-11gm | webdev, javascript, angular, typescript | # Understanding Angular ngOnInit Lifecycle Hook
{% embed https://www.youtube.com/embed/WSQ5Qu8BWJg %}
## Introduction
In the world of Angular, understanding lifecycle hooks is crucial for developers to manage component initialization, state changes, and clean-up processes effectively. Among these hooks, `ngOnInit` stands out as one of the most fundamental ones. In this article, we'll delve into the intricacies of `ngOnInit`, exploring its purpose, how it works, and practical examples to grasp its usage thoroughly.
## What is ngOnInit?
`ngOnInit` is a lifecycle hook provided by Angular, specifically designed for components. It is called once, after Angular has initialized all data-bound properties of a component and the component's view. This hook is commonly used for initialization tasks such as fetching data from a server, initializing properties, or setting up subscriptions.
## How Does ngOnInit Work?
When an Angular component is created, Angular goes through a series of initialization phases. During this process, Angular sets up the component and its associated view. After the component's data-bound properties and the view are initialized, Angular calls the `ngOnInit` method of that component if it exists. This makes `ngOnInit` the perfect place to perform any initial tasks that depend on the component being fully initialized.
## Examples of Using ngOnInit
Let's dive into some practical examples to understand how to utilize `ngOnInit` effectively.
### Example 1: Initializing Component Properties
```typescript
import { Component, OnInit } from '@angular/core';
@Component({
selector: 'app-example',
template: '<p>{{ message }}</p>',
})
export class ExampleComponent implements OnInit {
message: string;
ngOnInit(): void {
this.message = 'Hello, ngOnInit!';
}
}
```
In this example, the `ngOnInit` hook is used to initialize the `message` property of the component.
### Example 2: Fetching Data from a Server
```typescript
import { Component, OnInit } from '@angular/core';
import { DataService } from './data.service';
@Component({
selector: 'app-data',
template: '<p>{{ data }}</p>',
})
export class DataComponent implements OnInit {
data: any;
constructor(private dataService: DataService) {}
ngOnInit(): void {
this.dataService.getData().subscribe((response) => {
this.data = response;
});
}
}
```
Here, `ngOnInit` is employed to fetch data from a server using a service and subscribing to the asynchronous operation.
### Example 3: Setting Up Subscriptions
```typescript
import { Component, OnInit, OnDestroy } from '@angular/core';
import { Subscription } from 'rxjs';
import { TimerService } from './timer.service';
@Component({
selector: 'app-timer',
template: '<p>{{ time }}</p>',
})
export class TimerComponent implements OnInit, OnDestroy {
time: number;
private timerSubscription: Subscription;
constructor(private timerService: TimerService) {}
ngOnInit(): void {
this.timerSubscription = this.timerService.getTimer().subscribe((t) => {
this.time = t;
});
}
ngOnDestroy(): void {
this.timerSubscription.unsubscribe();
}
}
```
In this example, `ngOnInit` is used to set up a subscription to a timer service. Additionally, `ngOnDestroy` is implemented to unsubscribe from the subscription when the component is destroyed, ensuring no memory leaks occur.
## FAQs
### What is the difference between ngOnInit and the constructor?
The constructor is a TypeScript feature used for basic initialization of a class. In contrast, `ngOnInit` is an Angular lifecycle hook specifically designed for initialization tasks related to Angular components. While both can be used for initialization, `ngOnInit` is preferred for tasks that depend on Angular's initialization process.
### When should I use ngOnInit?
`ngOnInit` should be used when you need to perform initialization tasks that depend on Angular's initialization process, such as initializing component properties, fetching data from a server, or setting up subscriptions.
### Can I call ngOnInit manually?
No, `ngOnInit` is called automatically by Angular after the component's data-bound properties and view are initialized. It should not be called manually.
## Conclusion
In conclusion, `ngOnInit` plays a crucial role in Angular component initialization, providing developers with a hook to perform tasks after the component is fully initialized. By understanding its purpose and usage through practical examples, developers can leverage `ngOnInit` effectively to manage component initialization in their Angular applications. | chintanonweb |
1,872,055 | Hello Dev.to! 👋 | 1. Introduction Hi everyone! I'm [Your Name], a developer passionate about... | 0 | 2024-05-31T12:38:39 | https://dev.to/bacar_bml/bonjour-devto--29d7 | development, frontend, github, react | **1. Introduction**
Hi everyone! I'm [Your Name], a developer passionate about programming and new technologies. I'm new to Dev.to and I can't wait to share my knowledge, learn from this amazing community, and collaborate on exciting projects.
**2. My Background**
I'm currently a full-stack developer with 3 years of experience in application development. Here are some technologies and languages I use regularly:
- Programming languages: JavaScript, Python, Java
- Frameworks: React, Django, Angular, Laravel
- Databases: MySQL, MongoDB
- Tools: Git, Docker
**3. What I Enjoy**
Outside of work, I love contributing to open source projects and taking part in hackathons. I also enjoy writing technical articles to share my discoveries and tips with the community. I'm particularly interested in:
- Software development best practices
- New trends in web technologies
- Application performance optimization
- IT security
**4. Why Dev.to?**
I joined Dev.to because I heard it's a great platform for developers, where mutual support and knowledge sharing are at the heart of the community. I can't wait to read your articles, learn from all of you, and contribute in return.
**5. Conclusion**
Thank you for welcoming me to this fantastic community! Feel free to follow me and comment on my articles. I'm also open to any article suggestions or project collaborations.
See you soon!
[](https://github.com/bacarlo)
[](https://www.linkedin.com/in/bayembayelo/)
[](https://portfolio-bml.vercel.app/) | bacar_bml |
1,872,054 | Transforming the Food Industry: The Role of AI | The food industry is undergoing a technological revolution, with AI it is playing a pivotal role in... | 0 | 2024-05-31T12:38:23 | https://dev.to/winsay/transforming-the-food-industry-the-role-of-ai-47eh | ai, foodindustry, automation | The food industry is undergoing a technological revolution, with AI it is playing a pivotal role in transforming the way food is produced, processed, and consumed. From optimizing farming practices to enhancing food safety and revolutionizing customer experiences, AI is reshaping every aspect of the food supply chain. This article explores the various applications of AI in the food industry and the benefits it brings to stakeholders.
#### Precision Agriculture
AI is revolutionizing agriculture by enabling precision farming techniques. AI algorithms can provide farmers with valuable insights into crop health, soil conditions, and weather patterns by analyzing data from drones, satellites, and sensors. This information allows farmers to optimize irrigation, fertilization, and pest control, leading to increased crop yields and reduced environmental impact.
#### Food Safety and Quality Control
Ensuring food safety and quality is paramount in the food industry. AI-powered systems can detect contaminants, pathogens, and foreign objects in food products, helping to prevent foodborne illnesses and product recalls. AI can also analyze the quality of agricultural products, such as fruits and vegetables, based on size, color, and ripeness, ensuring only the highest quality products reach the market.
#### Supply Chain Optimization
AI optimizes the food supply chain by improving inventory management, forecasting demand, and reducing waste. Intelligent solutions can analyze historical sales data, market trends, and external factors like weather and holidays to predict future demand more accurately. This helps suppliers and retailers optimize their inventory levels, reduce stockouts, and minimize food waste.
#### Personalized Nutrition
AI enables personalized nutrition recommendations based on individual preferences, dietary restrictions, and health goals. By analyzing data from wearable devices, food diaries, and genetic profiles, AI algorithms can provide personalized meal plans and dietary advice, promoting healthier eating habits and reducing the risk of chronic diseases.
#### Enhanced Customer Experiences
AI is revolutionizing customer experiences in the food industry through personalized recommendations, interactive menus, and efficient ordering systems. Chatbots and virtual assistants powered by AI can provide customers with personalized recommendations based on their preferences and dietary needs, improving customer satisfaction and loyalty.
## Conclusion
AI is revolutionizing the food industry by enabling precision agriculture, enhancing food safety and quality control, optimizing the supply chain, enabling personalized nutrition, and enhancing customer experiences. As AI technologies continue to evolve, the food industry is poised to become more efficient, sustainable, and customer-centric than ever before. To fully harness these advancements, it is essential for companies to [hire AI developers](https://www.bacancytechnology.com/hire-ai-developer) who can create and implement these intelligent solutions effectively.
| winsay |
1,872,053 | How to track your Jira todos and accomplishments | Hi all. Last week we covered GitHub todos and accomplishments. This week, we'll take a look at... | 0 | 2024-05-31T12:37:18 | https://www.beyonddone.com/blog/posts/jira-todos-and-accomplishments | jira, webdev, softwaredevelopment, productivity |

Hi all. Last week we covered [GitHub todos and accomplishments](https://dev.to/sdotson/how-to-track-your-github-todos-and-accomplishments-3n59). This week, we'll take a look at Jira.
Jira presents many opportunities for tasks and accomplishments to fall through the cracks. Just like in GitHub, your coworkers can mention you in a comment on an issue assigned to someone else. Assigned tickets can sometimes fall off the Jira Board you look at with your team.
Rest assured that you are not alone. These are challenges faced by a lot of engineers. Thankfully there are solutions for these challenges. I'll describe the Jira todos and accomplishments most software engineers want to track then describe five solutions.
## Jira todos and accomplishments
Here is a short list of Jira events and the todo action required from the software engineer:
- Jira ticket assigned - Start research and investigation. Write code. Open a pull request. Get the code deployed to production.
- Username mentioned in a comment - Respond to the coworker who addressed you in a comment.
In addition, there are Jira accomplishments that the software engineer should remember for standup and their reference:
- Jira tickets created and updated
- Jira tickets started
- Jira tickets completed
- Comments and discussions
## Option #1 Watch your email inbox
The first alternative is to configure your notifications so that they are delivered to your email inbox. The exact URL will depend on your Jira configuration, but generally it will be `https://${your-subdomain}.atlassian.net/jira/settings/personal/notifications`, being careful to replace `${your-subdomain}` with your subdomain name.
### Advantages:
- Real-time notifications when things happen.
### Disadvantages:
- If there are more than just a few of these notifications, it can be easy to lose or forget about them, especially when mixed with your other email.
- You still have to devise some sort of system for marking when items are completed. This could be with email filters, tags, or deletion.
## Option #2 Jira Your Work page
The second alternative is Jira's Your Work page that you can find at `https://${your-subdomain}.atlassian.net/jira/your-work`. Here you can see a list of tickets you've worked on, viewed, starred, or been assigned.
### Advantages
- Built and supported by Jira.
### Disadvantages
- No sense of completion. You only see tickets that you've "worked on", whatever that means.
- No sense of what is a lingering todo item.
- Requires you to use this page as your main task management system. Activity on other platforms, such as on GitHub, is not included.
## Option #3 Jira Board
If you're a software engineer using Jira, you're probably doing using it within the context of a team and looking at the board for their team. The URL will look something like `https://${your-subdomain}.atlassian.net/jira/core/projects/${project}/board`, replacing the `${your-subdomain}` and the `${project}` with the appropriate values.
The Jira board makes it really easy to see the state of specific tasks, assuming they're represented by a Jira ticket. You can see what needs to be started, what is in progress, and what is completed. There is also a filter where you can toggle only the items you've been assigned.
### Advantages
- Built and supported by Jira.
- The state of tasks is readily apparent.
- Relevant GitHub pull requests can be seen by clicking on a specific issue.
### Disadvantages
- Only Jira tickets that fit the current criteria for the board are displayed. If your Lead or Product Manager forgets to move a ticket from one sprint to the next, the ticket is not visible.
- Only work captured by a Jira ticket is displayed. Your comments and Jira ticket updates or ticket creations are not included.
- Your tasks are mixed in with everybody else's on your team.
- Informal tasks are not captured like being mentioned in a comment.
## Option #4 JQL Queries
Jira has its own query language because of course it does. Atlassian gazed across a landscape of unimaginable splendor, a real cornucopia of options, and somehow saw a desert.
Look at the [JQL documentation page](https://support.atlassian.com/jira-service-management-cloud/docs/use-advanced-search-with-jira-query-language-jql/) to learn how to use the language. I'll see you in two weeks.
The page to construct your own JQL queries is at `https://${your-subdomain}.atlassian.net/issues/DEMO-7?jql=`.
You can use these queries to gather issues in a more detailed way:
- Current todos: `assignee = currentUser() AND statusCategory not in (Done)`
- Mentions: `comment ~ currentUser() AND statusCategory not in (Done)`
Here are some other queries for completed actions:
- Your activity in a given range: `(updated >= ${startDateIso}) AND (creator = currentUser() OR assignee = currentUser() OR assignee was currentUser() OR reporter = currentUser() OR commentedBy = currentUser())`
Open up a few browser tabs, learn a bespoke query language, and explore to your heart's content!
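If you'd rather script these lookups than click through the Jira UI, the same JQL strings can be sent to Jira Cloud's REST search endpoint (`/rest/api/3/search`). Below is a minimal sketch of building such a request URL; the `jql_search_url` helper and the `your-subdomain` placeholder are my own illustrations, not part of the post:

```python
from urllib.parse import urlencode

def jql_search_url(subdomain: str, jql: str, max_results: int = 50) -> str:
    """Build a Jira Cloud REST search URL for a JQL query.

    The JQL string is URL-encoded so operators like = and () survive the trip.
    """
    base = f"https://{subdomain}.atlassian.net/rest/api/3/search"
    params = urlencode({"jql": jql, "maxResults": max_results})
    return f"{base}?{params}"

# The "current todos" query from above, ready to fetch with any HTTP client
todos = "assignee = currentUser() AND statusCategory not in (Done)"
print(jql_search_url("your-subdomain", todos))
```

You would then issue a GET to that URL with your usual authentication (e.g. a basic-auth API token) and read the matching issues from the JSON response.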
### Advantages
- Built and supported by Jira.
- Addresses many disadvantages of a Jira Board or the Your Work page.
### Disadvantages
- You have to learn the Jira Query language JQL.
- You must open several tabs or navigate Jira's interface to find the JQL query page.
- Information about updated Jira tickets and when you updated them is lost.
- There's no way to tell if you've responded to a mention.
## Option #5 BeyondDone App
The final option is the BeyondDone app, which lets you see all your Jira todos and accomplishments in one view.
BeyondDone goes beyond what is offered through the Jira platform in many ways. Jira mentions are displayed and marked done when you respond to the comment. All the Jira tickets you update are included. The diff and date of each update are included. Your Jira comments are displayed and mixed in with all your other activities for your reference.
BeyondDone also aggregates your todos and activity from GitHub. GitHub activity related to Jira tickets is grouped together. You can add your todos and accomplishments when you have items that don't currently have a BeyondDone integration. I've used these to remind myself to resolve Slack conversations or capture work not covered by a Jira ticket.
I use BeyondDone every day and it has turbocharged my ability to stay on top of things and sell myself better in standup and with my supervisor.
I encourage you all to [sign up today](https://www.beyonddone.com?utm_source=devto&utm_medium=blog). There's a 30-day free trial and no payment information is required up-front.
| sdotson |
1,870,459 | Understanding Change Tracking for Better Performance in EF Core | Originally published at https://antondevtips.com. Change Tracker is the heart of EF Core, that keeps... | 0 | 2024-05-31T12:36:07 | https://antondevtips.com/blog/understanding-change-tracking-for-better-performance-in-ef-core | programming, dotnet, efcore, backend | ---
canonical_url: https://antondevtips.com/blog/understanding-change-tracking-for-better-performance-in-ef-core
---
_Originally published at_ [_https://antondevtips.com_](https://antondevtips.com/blog/understanding-change-tracking-for-better-performance-in-ef-core)_._
**Change Tracker** is the heart of EF Core, that keeps an eye on entities that are added, updated and deleted.
In today's post you will learn how Change Tracker works, how entities are tracked, and how to attach existing entities to the Change Tracker.
You will receive guidelines on how to improve your application's performance with tracking techniques.
In the end, we will explore how EF Core Change Tracker can significantly improve our code in the read-world scenario.
## What is Change Tracker in EF Core
The **Change Tracker** is a key part of EF Core responsible for keeping track of entity instances and their states.
It monitors changes to these instances and ensures the database is updated accordingly.
This tracking mechanism is essential for EF Core to know which entities must be inserted, updated, or deleted in the database.
When you query the database, EF Core automatically starts tracking the returned entities.
```csharp
using (var dbContext = new ApplicationDbContext())
{
var users = await dbContext.Users.ToListAsync();
}
```
After querying users from the database, all entities are automatically added to the Change Tracker.
When the users are updated, the Change Tracker compares the `users` collection with its internal snapshot of the `User` entities retrieved from the database.
EF Core uses the result of this comparison to decide which SQL commands to generate to update the entities in the database.
```csharp
using (var dbContext = new ApplicationDbContext())
{
var users = await dbContext.Users.ToListAsync();
users[0].Email = "test@mail.com";
await dbContext.SaveChangesAsync();
}
```
In this example, we are updating the first user's email.
After calling `dbContext.SaveChangesAsync()` EF Core compares `users` collection with the one saved in the Change Tracker.
After comparing, EF Core finds out that `users` collection was updated and the **update** SQL query is sent to the database:
```sql
Executed DbCommand (0ms) [Parameters=[@p1='****', @p0='test@mail.com' (Nullable = false) (Size = 13)], CommandType='Text', CommandTimeout='30']
UPDATE "users" SET "email" = @p0
WHERE "id" = @p1
RETURNING 1;
```
To add and delete entities you should call the `Add` and `Remove` methods:
```csharp
using (var dbContext = new ApplicationDbContext())
{
var users = await dbContext.Users.ToListAsync();
dbContext.Users.Remove(users[1]);
dbContext.Users.Add(new User
{
Id = Guid.NewGuid(),
Email = "one@mail.com"
});
await dbContext.SaveChangesAsync();
}
```
Change Tracker will detect that a second user is deleted and a new user is added.
As a result, the following SQL commands will be sent to the database to delete and create a user:
```sql
Executed DbCommand (0ms) [Parameters=[@p0='***'], CommandType='Text', CommandTimeout='30']
DELETE FROM "users"
WHERE "id" = @p0
RETURNING 1;
Executed DbCommand (0ms) [Parameters=[@p0='***', @p1='one@mail.com' (Nullable = false) (Size = 12)], CommandType='Text', CommandTimeout='30']
INSERT INTO "users" ("id", "email")
VALUES (@p0, @p1);
```
## Change Tracker and Child Entities
Change Tracker in EF Core also tracks child entities that are loaded together with other entities.
Let's explore the following entities:
```csharp
public class Book
{
public required Guid Id { get; set; }
public required string Title { get; set; }
public required int Year { get; set; }
public Guid AuthorId { get; set; }
public Author Author { get; set; } = null!;
}
public class Author
{
public required Guid Id { get; set; }
public required string Name { get; set; }
public List<Book> Books { get; set; } = [];
}
```
A `Book` is mapped as one-to-many to the `Author`.
When executing the following code and updating the first book's author name:
```csharp
using (var dbContext = new ApplicationDbContext())
{
var books = await dbContext.Books
.Include(x => x.Author)
.ToListAsync();
books[0].Author.Name = "Jack Sparrow";
await dbContext.SaveChangesAsync();
}
```
EF Core generates an update request to the database:
```sql
Executed DbCommand (0ms) [Parameters=[@p1='***', @p0='Jack Sparrow' (Nullable = false) (Size = 12)], CommandType='Text', CommandTimeout='30']
UPDATE "authors" SET "name" = @p0
WHERE "id" = @p1
RETURNING 1;
```
Now let's try to add a new book to the first author:
```csharp
using (var dbContext = new ApplicationDbContext())
{
var authors = await dbContext.Authors
.Include(x => x.Books)
.ToListAsync();
var newBook = new Book
{
Id = Guid.NewGuid(),
Title = "Asp.Net Core In Action",
Year = 2024
};
authors[0].Books.Add(newBook);
dbContext.Entry(newBook).State = EntityState.Added;
await dbContext.SaveChangesAsync();
}
```
In this case, you need to manually notify the Change Tracker that the book was added to the author:
```csharp
dbContext.Entry(newBook).State = EntityState.Added;
```
As a result, an insert query with a foreign key to `Author` will be sent to the database:
```sql
Executed DbCommand (11ms) [Parameters=[@p0='fba984cd-a7b8-4eee-998b-165db95068a5', @p1='1072efd7-a71f-40a5-a939-5e68b7e34e0c', @p2='Asp.Net Core In Action' (Nullable = false) (Size = 22), @p3='2024'], CommandType='Text', CommandTimeout='30']
INSERT INTO "books" ("id", "author_id", "title", "year")
VALUES (@p0, @p1, @p2, @p3);
```
## How Entities are Tracked In EF Core
Entities in EF Core are tracked based on their state, which can be one of the following:
* **Added** - the entity is new and will be inserted into the database.
* **Modified** - the entity has been modified and will be updated in the database.
* **Deleted** - the entity has been marked for deletion.
* **Detached** - the entity should not be tracked and will be removed from the change tracker.
* **Unchanged** - the entity has not been modified since it was loaded.
You can check the state of an entity using the `Entry` property of the DbContext:
```csharp
using (var dbContext = new ApplicationDbContext())
{
var book = dbContext.Books.First();
var entry = dbContext.Entry(book);
var state = entry.State; // EntityState.Unchanged
}
```
## Attaching Existing Entities to the Change Tracker
As you've already seen, sometimes you might need to attach an existing entity to the Change Tracker.
This is common in scenarios where entities are retrieved from a different context or from outside the database (e.g., from an API).
To attach an entity, you can use the `Attach` method so the Change Tracker will start tracking this entity.
This method marks the entity as `Unchanged` by default.
You need to specify whether this entity should be either modified or deleted in the database:
```csharp
using (var dbContext = new ApplicationDbContext())
{
var book = new Book
{
Id = Guid.NewGuid(),
Title = "Asp.Net Core In Action",
Year = 2024
};
// Option 1: attach and mark as modified
dbContext.Books.Attach(book);
dbContext.Entry(book).State = EntityState.Modified;
// Option 2: attach and mark as deleted
dbContext.Books.Attach(book);
dbContext.Entry(book).State = EntityState.Deleted;
}
```
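As a side note, EF Core also offers `Update` and `Remove`, which attach a detached entity and set its state in a single call. A minimal sketch (the key value here is illustrative):

```csharp
using (var dbContext = new ApplicationDbContext())
{
    var book = new Book
    {
        Id = existingBookId, // illustrative: the key of a row that already exists
        Title = "Asp.Net Core In Action",
        Year = 2024
    };

    // Attaches the entity and marks it as Modified in one call
    dbContext.Books.Update(book);

    // Or: attaches the entity and marks it as Deleted
    // dbContext.Books.Remove(book);

    await dbContext.SaveChangesAsync();
}
```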
## Batch Tracking Operations in EF Core
EF Core provides range operations to perform batch operations on multiple entities.
These methods can simplify code and improve performance.
### AddRange
Adds a collection of new entities to the context:
```csharp
using (var dbContext = new ApplicationDbContext())
{
var author = new Author
{
Id = Guid.NewGuid(),
Name = "Andrew Lock"
};
var books = new List<Book>
{
new()
{
Id = Guid.NewGuid(),
Title = "Asp.Net Core In Action 2.0",
Year = 2020,
Author = author
},
new()
{
Id = Guid.NewGuid(),
Title = "Asp.Net Core In Action 3.0",
Year = 2024,
Author = author
}
};
dbContext.Books.AddRange(books);
await dbContext.SaveChangesAsync();
}
```
### UpdateRange
Updates a collection of entities in the context:
```csharp
using (var dbContext = new ApplicationDbContext())
{
var booksToUpdate = await dbContext.Books
.Where(x => x.Year >= 2020)
.ToListAsync();
booksToUpdate.ForEach(b => b.Title += "-updated");
dbContext.Books.UpdateRange(booksToUpdate);
await dbContext.SaveChangesAsync();
}
```
### RemoveRange
Removes a collection of entities from the context:
```csharp
using (var dbContext = new ApplicationDbContext())
{
var booksToDelete = await dbContext.Books
.Where(x => x.Year < 2020)
.ToListAsync();
dbContext.Books.RemoveRange(booksToDelete);
await dbContext.SaveChangesAsync();
}
```
### AttachRange
Attaches a collection of existing entities to the context:
```csharp
using (var dbContext = new ApplicationDbContext())
{
var books = new List<Book>
{
// ...
};
dbContext.Books.AttachRange(books);
foreach (var book in books)
{
dbContext.Entry(book).State = EntityState.Modified;
}
}
```
## How to Disable Change Tracker
When you read entities from the database and don't need to update them, you can tell EF Core not to track them in the Change Tracker.
This is especially useful when you are retrieving a lot of records and don't want to waste memory tracking entities that won't be modified.
The `AsNoTracking` method is used to query entities without tracking them.
This can improve performance for read-only operations, as EF Core skips the overhead of tracking changes:
```csharp
using (var dbContext = new ApplicationDbContext())
{
var books = await dbContext.Books
.Include(x => x.Author)
.AsNoTracking()
.ToListAsync();
}
```
It's a small but worthwhile performance tip for optimizing read-only queries in EF Core.
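Related to this, if a no-tracking query includes navigations and you want rows that share a parent to reference a single instance, EF Core 5.0 and later offer `AsNoTrackingWithIdentityResolution`:

```csharp
using (var dbContext = new ApplicationDbContext())
{
    // Still untracked, but books by the same author share one Author instance
    var books = await dbContext.Books
        .Include(x => x.Author)
        .AsNoTrackingWithIdentityResolution()
        .ToListAsync();
}
```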
## How To Access Tracking Entities in EF Core
EF Core allows you to access and manipulate tracked entities in the Change Tracker of the current DbContext.
You can retrieve all tracked entities using the `Entries` method:
```csharp
using (var dbContext = new ApplicationDbContext())
{
var books = await dbContext.Books
.Include(x => x.Author)
.ToListAsync();
var trackedEntities = dbContext.ChangeTracker.Entries();
foreach (var entry in trackedEntities)
{
Console.WriteLine($"Entity: {entry.Entity}, State: {entry.State}");
}
}
```
You can also filter entities by their state:
```csharp
using (var dbContext = new ApplicationDbContext())
{
var books = await dbContext.Books
.Include(x => x.Author)
.ToListAsync();
books[0].Author.Name = "Jack Sparrow";
var modifiedEntities = dbContext.ChangeTracker.Entries()
.Where(e => e.State == EntityState.Modified);
foreach (var entry in modifiedEntities)
{
Console.WriteLine($"Modified Entity: {entry.Entity}");
}
}
```
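If you ever need to discard everything the context is currently tracking (for example, when reusing a long-lived DbContext), EF Core 5.0+ provides `ChangeTracker.Clear()`:

```csharp
using (var dbContext = new ApplicationDbContext())
{
    var books = await dbContext.Books.ToListAsync();

    // Detach all tracked entities at once
    dbContext.ChangeTracker.Clear();

    var state = dbContext.Entry(books[0]).State; // EntityState.Detached
}
```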
## A Real-World Example of Using Change Tracker
Let's explore a real-world example of how the Change Tracker can significantly simplify our code.
Imagine that you have entities that have `CreatedAtUtc` and `UpdatedAtUtc` properties.
These properties are used for time audit.
`CreatedAtUtc` should be set to the current UTC time when a new entity is added to the database.
`UpdatedAtUtc` should be set to the current UTC time whenever an existing entity is updated in the database.
Let's explore the most basic implementation for a `User` entity:
```csharp
public class User
{
public Guid Id { get; set; }
public required string Email { get; set; }
public DateTime CreatedAtUtc { get; set; }
public DateTime? UpdatedAtUtc { get; set; }
}
```
When creating a new user or updating an existing one, you need to manually specify these values:
```csharp
using (var dbContext = new ApplicationDbContext())
{
var user = new User
{
Id = Guid.NewGuid(),
Email = "test@mail.com",
CreatedAtUtc = DateTime.UtcNow
};
dbContext.Users.Add(user);
await dbContext.SaveChangesAsync();
user.Email = "another@mail.com";
user.UpdatedAtUtc = DateTime.UtcNow;
await dbContext.SaveChangesAsync();
}
```
It might seem like this is not a big deal, but imagine a more complex application where you can update not only a user's email but also their password, personal data, and permissions.
And you may have many entities that need `CreatedAtUtc` and `UpdatedAtUtc` properties.
The manual approach will clutter your code with duplication here and there.
Moreover, you can forget to set these properties and introduce a bug.
What if I told you that you can use the Change Tracker in EF Core to set these properties automatically, in one place, for all entities that need time auditing?
First, let's introduce an interface:
```csharp
public interface ITimeAuditableEntity
{
DateTime CreatedAtUtc { get; set; }
DateTime? UpdatedAtUtc { get; set; }
}
```
All entities that need time auditing should implement this interface:
```csharp
public class Book : ITimeAuditableEntity
{
// Other properties
public DateTime CreatedAtUtc { get; set; }
public DateTime? UpdatedAtUtc { get; set; }
}
public class Author : ITimeAuditableEntity
{
// Other properties
public DateTime CreatedAtUtc { get; set; }
public DateTime? UpdatedAtUtc { get; set; }
}
```
Now in the DbContext you can override the `SaveChangesAsync` method to automatically set the `CreatedAtUtc` and `UpdatedAtUtc` properties:
```csharp
public class ApplicationDbContext : DbContext
{
public override async Task<int> SaveChangesAsync(CancellationToken cancellationToken = new CancellationToken())
{
var entries = ChangeTracker.Entries<ITimeAuditableEntity>();
foreach (var entry in entries)
{
if (entry.State is EntityState.Added)
{
entry.Entity.CreatedAtUtc = DateTime.UtcNow;
}
else if (entry.State is EntityState.Modified)
{
entry.Entity.UpdatedAtUtc = DateTime.UtcNow;
}
}
return await base.SaveChangesAsync(cancellationToken);
}
}
```
By calling `ChangeTracker.Entries<ITimeAuditableEntity>()` you get only the tracked entities that implement the interface.
After that, the `CreatedAtUtc` and `UpdatedAtUtc` properties are set for entities that are added or modified.
Finally, we call the `base.SaveChangesAsync` method to save the changes to the database.
If you have multiple DbContexts in your application, you can use an EF Core **Interceptor** to achieve the same goal.
This way you won't need to duplicate the code across all DbContexts.
Here is how to create such an **Interceptor**:
```csharp
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Diagnostics;
public class TimeAuditableInterceptor : SaveChangesInterceptor
{
public override async ValueTask<InterceptionResult<int>> SavingChangesAsync(
DbContextEventData eventData,
InterceptionResult<int> result,
CancellationToken cancellationToken = default)
{
var context = eventData.Context!;
var entries = context.ChangeTracker.Entries<ITimeAuditableEntity>();
foreach (var entry in entries)
{
if (entry.State == EntityState.Added)
{
entry.Entity.CreatedAtUtc = DateTime.UtcNow;
}
else if (entry.State == EntityState.Modified)
{
entry.Entity.UpdatedAtUtc = DateTime.UtcNow;
}
}
return await base.SavingChangesAsync(eventData, result, cancellationToken);
}
}
```
And register the interceptor in the DbContext:
```csharp
builder.Services.AddDbContextFactory<ApplicationDbContext>(options =>
{
options.EnableSensitiveDataLogging().UseSqlite(connectionString);
options.AddInterceptors(new TimeAuditableInterceptor());
});
```
You can register this interceptor for multiple DbContexts and reuse the single code base for performing time audit for any number of entities.
Hope you find this blog post useful. Happy coding!
_Originally published at_ [_https://antondevtips.com_](https://antondevtips.com/blog/understanding-change-tracking-for-better-performance-in-ef-core)_._
### After reading the post consider the following:
- [Subscribe](https://antondevtips.com/blog/understanding-change-tracking-for-better-performance-in-ef-core#subscribe) **to receive newsletters with the latest blog posts**
- [Download](https://github.com/AntonMartyniuk-DevTips/dev-tips-code/tree/main/backend/EfCore/ChangeTracking) **the source code for this post from my** [github](https://github.com/AntonMartyniuk-DevTips/dev-tips-code/tree/main/backend/EfCore/ChangeTracking) (available for my sponsors on BuyMeACoffee and Patreon)
If you like my content — **consider supporting me**
Unlock exclusive access to the source code from the blog posts by joining my **Patreon** and **Buy Me A Coffee** communities!
[](https://www.buymeacoffee.com/antonmartyniuk)
[](https://www.patreon.com/bePatron?u=73769486) | antonmartyniuk |
1,872,052 | Best 9 Payment Solutions for Efficient Marketplace Operations | When it comes to running a successful online marketplace, having reliable and efficient payment... | 0 | 2024-05-31T12:31:55 | https://dev.to/felicityjohns/best-9-payment-solutions-for-efficient-marketplace-operations-4737 | onlinemarketplaces | When it comes to running a successful online marketplace, having reliable and efficient payment solutions is crucial. If you need guidance on choosing the best options, [refer to this article](https://sloboda-studio.com/blog/how-to-choose-a-marketplace-payment-solution/). Here, we’ll explore nine top payment solutions that can enhance your marketplace operations, ensuring smooth and secure transactions for both buyers and sellers.
**PayPal**
- Overview: PayPal is one of the most popular payment solutions globally, known for its ease of use and security.
- Benefits: Offers buyer and seller protection, supports multiple currencies, and provides easy integration with most e-commerce platforms.
- Drawbacks: Higher transaction fees compared to some other solutions.
**Stripe**
- Overview: Stripe is a robust payment processing platform designed for internet businesses of all sizes.
- Benefits: Supports a wide range of payment methods, offers advanced security features, and provides extensive customization options.
- Drawbacks: Can be complex to set up for beginners.
**Square**
- Overview: Square is a versatile payment solution that caters to both online and offline transactions.
- Benefits: Easy to use, offers a free point-of-sale system, and has competitive transaction fees.
- Drawbacks: Limited international availability.
**Adyen**
- Overview: Adyen is a global payment company that provides businesses with a single platform to accept payments anywhere in the world.
- Benefits: Supports over 250 payment methods, offers advanced fraud protection, and provides detailed reporting.
- Drawbacks: Geared more towards larger enterprises.
**Braintree**
- Overview: Braintree, a PayPal service, offers full-stack payment solutions for online and mobile payments.
- Benefits: Seamless integration with PayPal, supports multiple payment methods, and provides recurring billing options.
- Drawbacks: Slightly higher fees for certain transactions.
**Authorize.Net**
- Overview: Authorize.Net is a trusted payment gateway offering comprehensive services for small to medium-sized businesses.
- Benefits: Easy to integrate, excellent customer support, and supports a variety of payment options.
- Drawbacks: Monthly fees and setup costs.
**2Checkout (now Verifone)**
- Overview: 2Checkout is a global payment platform that simplifies the complexities of digital commerce.
- Benefits: Supports multiple currencies, offers a simple integration process, and provides extensive fraud protection.
- Drawbacks: Transaction fees can be high for certain payment methods.
**Amazon Pay**
- Overview: Amazon Pay allows customers to use their Amazon account to make purchases on your marketplace.
- Benefits: Trusted brand, easy checkout process, and robust fraud protection.
- Drawbacks: Limited to customers with Amazon accounts.
**Worldpay**
- Overview: Worldpay is a leading payment processing company that offers a wide range of solutions for businesses.
- Benefits: Supports multiple currencies and payment methods, provides detailed analytics, and has robust security features.
- Drawbacks: Complex fee structure.
**Conclusion**
Choosing the right payment solution for your marketplace is essential for ensuring efficient operations and a positive user experience. Each of the options listed above has its own strengths and weaknesses, so consider your specific needs and requirements when making a decision. By integrating the best payment solution, you can facilitate seamless transactions, enhance security, and ultimately drive the success of your marketplace.
| felicityjohns |
1,872,051 | The Evolution and Future of Mobile Application Development | Mobile application development has seen unprecedented growth over the last decade, revolutionizing how... | 0 | 2024-05-31T12:31:06 | https://dev.to/liong/the-evolution-and-future-of-mobile-application-development-3ei5 | mobile, devices, malaysi, technology | Mobile application development has seen unprecedented growth over the last decade, revolutionizing how we interact with technology and fundamentally changing the digital landscape. From simple utility apps to complex, multi-functional platforms, mobile apps have become integral to daily life for billions of people worldwide. This blog explores the evolution, current trends, and future directions of mobile application development.
**The Evolution of Mobile Applications**
**Early Days: Simple and Functional**
The journey of mobile application development began with basic, functional apps designed to meet specific needs. The earliest mobile apps were pre-installed on devices and included simple utilities like calendars, calculators, and basic games. The launch of the Apple App Store in 2008 marked a significant turning point, providing a platform for developers to distribute their apps to a global audience. This democratization of app development spurred innovation and brought about an explosion of new applications.
**The Rise of Smartphones**
The proliferation of smartphones significantly accelerated the growth of mobile apps. Devices like the iPhone and Android smartphones introduced advanced capabilities, including high-resolution touchscreens, powerful processors, and robust operating systems, which enabled the development of more sophisticated apps. Mobile applications began incorporating features like GPS, cameras, and sensors, delivering a richer and more interactive user experience.
**The App Ecosystem: Diversity and Expansion**
As the app ecosystem expanded, the range of applications diversified. Social media apps like Facebook and Instagram, entertainment apps like Spotify and Netflix, and utility apps like Uber and Google Maps became ubiquitous. Mobile games also evolved, with titles like Angry Birds and Pokémon Go captivating millions of users. The variety of apps available transformed smartphones into versatile tools for work, play, and communication.
**Current Trends in Mobile Application Development**
**Cross-Platform Development**
One of the most significant trends in mobile application development is the rise of cross-platform development frameworks like React Native, Flutter, and Xamarin. These frameworks allow developers to write code once and deploy it across multiple platforms (iOS, Android, etc.), substantially reducing development time and costs. This approach ensures a consistent user experience across different devices and platforms.
**Artificial Intelligence and Machine Learning**
Artificial intelligence (AI) and machine learning (ML) are transforming mobile applications by enabling more personalized and practical user experiences. AI-powered chatbots, personalized recommendations, voice recognition, and image processing are among the capabilities that enhance the user experience. For example, AI-driven assistants like Google Assistant and Siri offer voice-activated help, making smartphones even more powerful and user-friendly.
**Augmented Reality (AR) and Virtual Reality (VR)**
AR and VR technologies are creating immersive experiences in mobile apps. Applications like Pokémon Go and IKEA Place demonstrate the potential of AR by blending digital content with the real world. VR apps, often used in gaming and simulations, offer fully immersive environments. The integration of AR and VR in mobile applications is expected to grow, offering new ways for users to interact with digital content.
**Internet of Things (IoT)**
The Internet of Things (IoT) is another trend influencing mobile app development. IoT-enabled apps can connect to and control smart devices, creating a seamless and interconnected environment. Applications that manage smart home devices, wearable tech, and industrial IoT systems are becoming increasingly popular. This connectivity enhances convenience and efficiency, allowing users to manage various aspects of their environment from their smartphones.
**5G Technology**
The rollout of 5G networks is set to revolutionize mobile application development by providing faster data speeds and lower latency. This advancement will enable more responsive and real-time applications, especially in areas like gaming, streaming, and augmented reality. 5G will also facilitate the development of new applications that require high bandwidth and low latency, such as remote healthcare and smart city solutions.
**Challenges in [Mobile Application Development](https://ithubtechnologies.com/mobile-application-development/?utm_source=dev.to%2F&utm_campaign=mobileapplicationdevelopment&utm_id=Offpageseo+2024)**
Despite the advancements and opportunities, mobile application development faces several challenges:
**Device Fragmentation**
The variety of devices and operating systems poses a significant challenge for developers. Ensuring compatibility across different screen sizes, resolutions, and OS versions requires extensive testing and optimization. This fragmentation can result in increased development costs and time.
**Security and Privacy**
With the growing amount of personal data stored on mobile devices, ensuring security and privacy is paramount. Developers must implement robust security measures to protect user data from breaches and cyber-attacks. Compliance with data protection regulations, such as GDPR, adds another layer of complexity to mobile app development.
**User Experience**
Delivering a seamless and intuitive user experience is crucial for the success of mobile applications. Developers need to focus on user-centric design principles, performance optimization, and accessibility to ensure that their apps provide an excellent experience for all users. Balancing functionality with ease of use requires careful planning and execution.
**The Future of Mobile Application Development**
**Progressive Web Apps (PWAs)**
Progressive Web Apps (PWAs) are poised to bridge the gap between web and mobile applications. PWAs offer the benefits of both, providing a fast, reliable, and engaging user experience similar to native apps, with the accessibility of web apps. They can be accessed through a web browser without needing installation, making them a cost-effective solution for developers and users.
**Enhanced AI and Machine Learning Capabilities**
The future of mobile applications will see even more sophisticated AI and machine learning capabilities. Predictive analytics, advanced natural language processing, and real-time data processing will enable apps to deliver highly personalized experiences and insights. As AI technology continues to evolve, mobile applications will become smarter and more intuitive.
**Blockchain Integration**
Blockchain technology is expected to play a significant role in the future of mobile app development. Blockchain can enhance security, transparency, and data integrity in various applications, including finance, supply chain, and healthcare. Decentralized apps (DApps) built on blockchain platforms offer new possibilities for secure and transparent interactions.
**Extended Reality (XR)**
Extended Reality (XR), encompassing AR, VR, and mixed reality (MR), will further revolutionize mobile applications. XR will enable more immersive and interactive experiences, especially in gaming, education, training, and remote collaboration. As XR technology advances, its integration into mobile apps will create new opportunities for innovation and user engagement.
**Conclusion**
Mobile application development has come a long way from its early days of simple utility apps. Today, it encompasses a diverse and dynamic ecosystem driven by advancements in technology and changing user expectations. Current trends like cross-platform development, AI, AR/VR, IoT, and 5G are shaping the future of mobile apps, while challenges like device fragmentation, security, and user experience remain. Looking ahead, innovations such as PWAs, enhanced AI capabilities, blockchain integration, and XR promise to take mobile applications to new heights, offering exciting opportunities for developers and users alike.
| liong |
1,872,038 | 10+ Essential JavaScript Functions to Streamline Your Code | JavaScript Guide | JavaScript is a versatile language that heavily relies on functions, making it essential for both... | 0 | 2024-05-31T12:20:32 | https://dev.to/ahsaniftikhar99/10-essential-javascript-functions-to-streamline-your-code-javascript-guide-28mo | webdev, javascript, tutorial, node | JavaScript is a versatile language that heavily relies on functions, making it essential for both beginners and experienced developers to master them. Functions in JavaScript help you encapsulate reusable code, making your programming more efficient and organized. Here are some examples of handy JavaScript functions that can simplify your code and improve your development workflow:
## Regular function
```
function subtract(a, b) {
  return a - b;
}
```
## Function expression
```
const sum = function (a, b) {
return a + b;
};
```
## Arrow function
```
const subtract = (a, b) => {
  return a - b;
};
// OR
const subtract = (a, b) => a - b;
```
## Generator function
Generator functions in JavaScript provide a powerful way to work with iterators and allow you to pause and resume execution at various points. They are especially useful for handling sequences of data and can be a valuable tool in your coding arsenal. Let's explore the basics of generator functions and see some examples.
**What is a Generator Function?**
A generator function is defined using the function* syntax (with an asterisk). Unlike regular functions, generator functions can yield multiple values over time, allowing you to iterate through them one by one.
```
function* indexGenerator() {
let index = 0;
while (true) {
yield index++;
}
}
const g = indexGenerator();
console.log(g.next().value); // => 0
console.log(g.next().value); // => 1
```
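Generators don't have to be infinite. A finite generator is iterable, so it works with spread and `for...of` — for example, a simple `range` (the name is just illustrative):

```
function* range(start, end) {
  // Yields integers from start (inclusive) to end (exclusive)
  for (let i = start; i < end; i++) {
    yield i;
  }
}

const numbers = [...range(1, 4)];
console.log(numbers); // => [1, 2, 3]
```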
## Generate random number
```
const random = (min, max) => Math.floor(Math.random() * (max - min + 1)) + min;
console.log(random(1, 10));
// This will generate a random number between a min and max range
```
## Convert array to object
```
const toObject = (arr) => ({ ...arr });
console.log(toObject(["a", "b"])); // { 0: 'a', 1: 'b' }
```
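Spreading works here because array indices become object keys. If you instead have `[key, value]` pairs, the built-in `Object.fromEntries` is the counterpart:

```
const toObject = (arr) => ({ ...arr });

// Builds an object directly from [key, value] pairs
const obj = Object.fromEntries([["a", 1], ["b", 2]]);
console.log(obj); // { a: 1, b: 2 }
```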
## Debounce Function
Limits the rate at which a function can fire.
```
export function debounce(func, wait) {
// Declare a variable to keep track of the timeout ID
let timeout;
// Return a new function that wraps the original function
return function(...args) {
// Clear the previous timeout, if it exists
clearTimeout(timeout);
// Set a new timeout to invoke the function after the specified wait time
timeout = setTimeout(() => {
// Use apply to call the original function with the correct `this` context and arguments
func.apply(this, args);
}, wait);
};
}
// --- Usage in a React component ---
import React, { useState, useCallback } from 'react';
import axios from 'axios';
import { debounce } from './utils/debounce';
function Search() {
const [query, setQuery] = useState('');
const [results, setResults] = useState([]);
const fetchResults = async (query) => {
if (!query) {
setResults([]);
return;
}
try {
const response = await axios.get(`https://api.example.com/search?q=${query}`);
setResults(response.data);
} catch (error) {
console.error('Error fetching data:', error);
}
};
// Create a debounced version of the fetchResults function
const debouncedFetchResults = useCallback(debounce(fetchResults, 300), []);
const handleInputChange = (event) => {
const newQuery = event.target.value;
setQuery(newQuery);
debouncedFetchResults(newQuery);
};
return (
<div>
<input
type="text"
value={query}
onChange={handleInputChange}
placeholder="Search..."
/>
<ul>
{results.map((result, index) => (
<li key={index}>{result.name}</li>
))}
</ul>
</div>
);
}
```
## Throttle Function
Ensures a function is only called at most once in a given period.
```
function throttle(func, limit) {
// Variables to track the last function call and the time it was last run
let lastFunc;
let lastRan;
// Return a new function that wraps the original function
return function(...args) {
// If the function hasn't been run yet, call it and record the time
if (!lastRan) {
func.apply(this, args);
lastRan = Date.now();
} else {
// Clear any previously scheduled function calls
clearTimeout(lastFunc);
// Schedule a new function call
lastFunc = setTimeout(() => {
// If the time since the last function call is greater than or equal to the limit, call the function
if ((Date.now() - lastRan) >= limit) {
func.apply(this, args);
lastRan = Date.now();
}
}, limit - (Date.now() - lastRan));
}
};
}
// Example usage of the throttle function
// A function that will be throttled
function logMessage(message) {
console.log(message);
}
// Create a throttled version of the logMessage function with a limit of 2000 milliseconds (2 seconds)
const throttledLogMessage = throttle(logMessage, 2000);
// Simulate calling the throttled function multiple times
setInterval(() => {
throttledLogMessage('Hello, World!');
}, 500);
// This example will log "Hello, World!" to the console at most once every 2 seconds, even though the function is called every 500 milliseconds.
```
## Clone Object
Creates a deep copy of an object.
```
function cloneObject(obj) {
return JSON.parse(JSON.stringify(obj));
}
```
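Be aware that the JSON round-trip only handles JSON-serializable data: functions and `undefined` are dropped, and `Date` objects come back as strings. A quick illustration:

```
function cloneObject(obj) {
  return JSON.parse(JSON.stringify(obj));
}

const original = { date: new Date(0), greet: () => "hi", count: 1 };
const copy = cloneObject(original);

console.log(typeof copy.date); // "string" — the Date was serialized, not preserved
console.log("greet" in copy);  // false — functions are dropped by JSON.stringify
console.log(copy.count);       // 1
```

On modern runtimes, `structuredClone` is an alternative that preserves Dates, Maps, and Sets (though it still rejects functions).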
## Get Unique Values from Array
Returns an array with unique values.
```
function getUniqueValues(array) {
return [...new Set(array)];
}
```
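`Set` compares primitives by value but objects by reference, so two distinct objects with the same contents are both kept. If you need uniqueness by a property, a `Map` keyed on that property works — a sketch (the helper name is illustrative):

```
function getUniqueValues(array) {
  return [...new Set(array)];
}

// Keeps the first item seen for each key (illustrative helper)
function uniqueBy(array, keyFn) {
  const seen = new Map();
  for (const item of array) {
    const key = keyFn(item);
    if (!seen.has(key)) seen.set(key, item);
  }
  return [...seen.values()];
}

const users = [{ id: 1, name: "a" }, { id: 1, name: "b" }, { id: 2, name: "c" }];
console.log(uniqueBy(users, (u) => u.id)); // keeps "a" and "c"
```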
## Safe JSON Parse
```
function safeJSONParse(jsonString) {
try {
return JSON.parse(jsonString);
} catch (e) {
return null;
}
}
// Dummy JSON strings for testing
const validJSON = '{"name": "John", "age": 30, "city": "New York"}';
const invalidJSON = '{"name": "John", "age": 30, "city": "New York"';
// Calling the function with a valid JSON string
const parsedValidJSON = safeJSONParse(validJSON);
console.log(parsedValidJSON); // Output: { name: 'John', age: 30, city: 'New York' }
// Calling the function with an invalid JSON string
const parsedInvalidJSON = safeJSONParse(invalidJSON);
console.log(parsedInvalidJSON); // Output: null
```
## Reverse String
```
const reverseString = (str) => str.split("").reverse().join("");
console.log(reverseString("hello")); // olleh
```
## Array Chunking
Splits an array into chunks of a specified size
```
function chunkArray(array, size) {
// Initialize an empty array to hold the chunks
const result = [];
// Loop through the input array in increments of 'size'
for (let i = 0; i < array.length; i += size) {
// Use the slice method to create a chunk of 'size' elements
// starting from the current index 'i' and push it to the result array
result.push(array.slice(i, i + size));
}
// Return the array of chunks
return result;
}
// Example usage:
const sampleArray = [1, 2, 3, 4, 5, 6, 7, 8, 9];
const chunkSize = 3;
const chunkedArray = chunkArray(sampleArray, chunkSize);
console.log(chunkedArray);
// Output: [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
```
| ahsaniftikhar99 |
1,872,037 | 100 Salesforce Pardot Interview Questions and Answers | Pardot, a powerful marketing automation platform owned by Salesforce, empowers organizations to... | 0 | 2024-05-31T12:18:44 | https://www.sfapps.info/100-salesforce-pardot-interview-questions-and-answers/ | blog, interviewquestions | ---
title: 100 Salesforce Pardot Interview Questions and Answers
published: true
date: 2024-05-31 12:13:08 UTC
tags: Blog,InterviewQuestions
canonical_url: https://www.sfapps.info/100-salesforce-pardot-interview-questions-and-answers/
---
Pardot, a powerful marketing automation platform owned by Salesforce, empowers organizations to streamline their marketing efforts, nurture leads, and drive revenue growth through targeted campaigns and personalized engagement. With features like email marketing, lead scoring, dynamic content, and robust analytics, Pardot enables marketers to deliver the right message to the right audience at the right time, ultimately driving better results and ROI.
### Position Requirements:
The ideal candidate for the position of Salesforce Pardot Specialist should possess:
- Strong understanding of Salesforce and Pardot platforms.
- Experience in implementing and optimizing marketing automation workflows.
- Proficiency in configuring Pardot features such as email marketing, lead scoring, and automation rules.
- Knowledge of Salesforce-Pardot integration and ability to troubleshoot integration issues.
- Familiarity with data management best practices and experience in leveraging Pardot for lead segmentation and targeting.
- Strong analytical skills and ability to interpret data to drive marketing decisions.
- Excellent communication and collaboration skills, with the ability to work cross-functionally with sales, marketing, and IT teams.
- Certifications such as Salesforce Certified Pardot Specialist or Salesforce Certified Marketing Cloud Consultant are a plus.
Overall, the successful candidate will be a strategic thinker with a passion for leveraging technology to drive marketing success and achieve business objectives.
## List of 100 Salesforce Pardot Interview Questions and Answers
- [Interview Questions and Answers for a Junior Salesforce Pardot Specialist](#aioseo-interview-questions-and-answers-for-a-junior-salesforce-pardot-specialist)
- [Interview Questions and Answers for a Middle Salesforce Pardot Developer](#aioseo-interview-questions-and-answers-for-a-middle-salesforce-pardot-developer)
- [Interview Questions and Answers for a Senior Salesforce Pardot Software Engineer](#aioseo-interview-questions-and-answers-for-a-senior-salesforce-pardot-software-engineer)
- [Scenario Based Interview Questions and Answers for a Salesforce Pardot Consultant](#aioseo-scenario-based-interview-questions-and-answers-for-a-salesforce-pardot-consultant)
- [Technical Interview Questions for a Salesforce Pardot Specialist](#aioseo-technical-interview-questions-for-a-salesforce-pardot-specialist)
The most common question at any interview concerns Salesforce certification. Do you have it covered?
If not yet, we recommend the courses offered by the FocusOnForce team.
[Explore Certification Practice Exams](https://www.sfapps.info/marketing-cloud-certification-study-guide/)

## Interview Questions and Answers for a Junior Salesforce Pardot Specialist
1. **What is Pardot and how does it integrate with Salesforce?**
Pardot is a marketing automation platform that helps businesses automate their marketing tasks. It integrates seamlessly with Salesforce, allowing for a unified view of marketing and sales activities.
1. **Describe the lead management process in Pardot.**
The lead management process in Pardot involves lead generation, scoring, nurturing, and handing off qualified leads to sales.
1. **Explain the difference between a prospect and a lead in Pardot.**
A prospect in Pardot is any individual or organization that you market to, whereas a lead is a prospect that has shown interest in your products or services by taking a specific action, like filling out a form.
1. **What are automation rules in Pardot?**
Automation rules in Pardot are criteria-based actions that automatically perform tasks like assigning prospects to lists, changing prospect scores, or sending emails based on predefined criteria.
1. **How do you set up a drip campaign in Pardot?**
To set up a drip campaign in Pardot, you create an Engagement Studio program where you define the steps, actions, and timing for sending emails to prospects based on their behavior or characteristics.
1. **Explain what a Pardot campaign is and how it differs from a Salesforce campaign.**
A Pardot campaign is a marketing initiative that tracks the effectiveness of marketing efforts. It differs from a Salesforce campaign in that it is specifically designed for marketing automation and is not tied to Salesforce campaign functionality.
1. **How can you track email performance in Pardot?**
Email performance in Pardot can be tracked using metrics such as open rates, click-through rates, bounce rates, and conversion rates. These metrics provide insights into the effectiveness of email campaigns.
1. **What is lead scoring and how is it implemented in Pardot?**
Lead scoring is the process of assigning values to prospects based on their interactions with your marketing content. In Pardot, lead scoring is implemented using criteria such as email opens, form submissions, and website visits to determine a prospect’s level of interest.
1. **How do you create custom fields in Pardot?**
Custom fields in Pardot can be created by navigating to Admin > Configure Fields and adding new fields with the desired field type and options.
1. **Explain the difference between a Pardot list and a Pardot dynamic list.**
A Pardot list is a static list of prospects that you manually add or remove members from, whereas a Pardot dynamic list automatically adds or removes prospects based on predefined criteria.
1. **What are Pardot tags and how are they used?**
Pardot tags are labels that you can assign to prospects, lists, campaigns, and other assets to categorize and organize them. They are useful for segmentation and targeting in marketing campaigns.
1. **Describe the lead qualification process in Pardot.**
The lead qualification process in Pardot involves scoring leads based on their engagement and behavior, nurturing them with targeted content, and identifying when they are ready to be handed off to sales for further follow-up.
1. **How do you create a landing page in Pardot?**
To create a landing page in Pardot, you use the landing page builder to design and customize the page layout, add form fields to capture prospect information, and configure settings such as thank you messages and completion actions.
1. **What is a Pardot tracker domain and why is it important?**
A Pardot tracker domain is a custom domain that is used to track links and assets in marketing emails and landing pages. It’s important because it maintains branding consistency and improves email deliverability.
1. **Explain the concept of lead nurturing and how it is implemented in Pardot.**
Lead nurturing is the process of building relationships with prospects at every stage of the buyer’s journey. In Pardot, lead nurturing is implemented using automated drip campaigns, targeted content, and personalized messaging to guide prospects through the sales funnel.
1. **How do you create an automation rule in Pardot?**
To create an automation rule in Pardot, you define the criteria that trigger the rule and specify the actions that should be taken when the criteria are met, such as assigning prospects to lists or updating their scores.
1. **What is a Pardot prospect lifecycle and how does it work?**
The Pardot prospect lifecycle defines the stages that a prospect goes through from initial awareness to becoming a customer. It typically includes stages such as awareness, consideration, decision, and advocacy, with corresponding marketing activities tailored to each stage.
1. **What is the Pardot API and how can it be used?**
The Pardot API is a set of tools and protocols that allow developers to programmatically interact with Pardot data and functionality. It can be used to integrate Pardot with other systems, automate tasks, and build custom applications.
1. **How do you track ROI in Pardot?**
ROI in Pardot can be tracked by measuring the performance of marketing campaigns against key metrics such as leads generated, opportunities created, and revenue generated. By analyzing these metrics, you can determine the effectiveness of your marketing efforts and calculate ROI.
1. **What are Pardot connectors and how do they work?**
Pardot connectors are integrations that allow Pardot to connect with other systems and platforms, such as CRMs, email marketing tools, and analytics platforms. They work by synchronizing data between Pardot and external systems to ensure consistency and accuracy across all marketing channels.
**You might be interested:** [Salesforce Marketing Cloud Interview Questions](https://www.sfapps.info/100-salesforce-marketing-cloud-interview-questions-and-answers/)
### Insight:
When interviewing Junior Salesforce Pardot Specialists, it’s crucial to focus on their foundational knowledge of both Salesforce and Pardot platforms, as well as their understanding of marketing automation concepts. Pardot interview questions should assess their familiarity with basic Pardot functionalities such as email marketing, lead scoring, and form creation, along with their ability to navigate the Pardot interface. Evaluating their understanding of Salesforce-Pardot integration and their problem-solving skills in resolving common issues is also important. Additionally, probing into their eagerness to learn and grow within the role can provide valuable insights into their potential for development and long-term success within the organization. Overall, a balanced approach that gauges both technical competencies and attitude towards learning will help identify promising candidates for the role of Junior Salesforce Pardot Specialist.
## Interview Questions and Answers for a Middle Salesforce Pardot Developer
1. **Explain how Pardot handles data privacy and compliance with regulations like GDPR.**
Pardot provides tools for managing consent and preferences, enabling compliance with regulations like GDPR through features such as double opt-in forms, unsubscribe options, and data encryption.
1. **Describe the process of integrating Pardot with external systems like CRM platforms.**
Integration between Pardot and CRM platforms like Salesforce involves mapping fields between the two systems, setting up synchronization schedules, and configuring automation rules to ensure data consistency and accuracy.
1. **How do you troubleshoot email deliverability issues in Pardot?**
Troubleshooting email deliverability in Pardot involves checking factors such as sender reputation, email content, list quality, and domain authentication settings. Tools like Pardot’s email testing and spam analysis can help identify and resolve issues.
1. **What is lead nurturing and how can it be optimized in Pardot?**
Lead nurturing is the process of guiding prospects through the buyer’s journey with targeted content and personalized messaging. In Pardot, lead nurturing can be optimized by segmenting audiences, tailoring content to their interests, and using automation to deliver timely messages.
1. **Explain the difference between automation rules and completion actions in Pardot.**
Automation rules in Pardot are criteria-based actions that automatically perform tasks like assigning prospects to lists or updating their scores, whereas completion actions are actions triggered when a prospect completes a specific form or landing page.
1. **How do you implement lead scoring models in Pardot to prioritize leads for sales follow-up?**
Lead scoring models in Pardot involve assigning point values to prospect actions and behaviors, such as email opens, website visits, and form submissions, to identify the most engaged and qualified leads for sales engagement.
1. **What are engagement programs in Pardot and how do they differ from drip campaigns?**
Engagement programs in Pardot are automated nurturing programs that deliver a series of targeted emails to prospects based on their behavior or characteristics, whereas drip campaigns are simpler, linear email sequences typically used for lead nurturing.
1. **Explain how you would design and optimize a landing page for lead generation in Pardot.**
Designing and optimizing a landing page for lead generation in Pardot involves creating a clear call-to-action, minimizing form fields, using compelling imagery and copy, and testing variations to improve conversion rates.
1. **How do you track and analyze marketing attribution in Pardot to measure campaign effectiveness?**
Marketing attribution in Pardot involves tracking the sources and touchpoints that contribute to conversions, using tools like multi-touch attribution models and campaign influence reports to measure the effectiveness of marketing campaigns.
1. **What are custom redirects in Pardot and how can they be used for tracking and analytics?**
Custom redirects in Pardot are shortened URLs that redirect to specific landing pages or assets, allowing marketers to track clicks, measure engagement, and analyze campaign performance through built-in reporting and analytics.
1. **Describe how you would implement lead segmentation and personalization strategies in Pardot.**
Implementing lead segmentation and personalization in Pardot involves using criteria such as demographics, behavior, and engagement history to create targeted lists and deliver personalized content and messaging to prospects.
1. **What role does A/B testing play in optimizing marketing campaigns in Pardot?**
A/B testing in Pardot allows marketers to experiment with different variations of emails, landing pages, and forms to identify the most effective elements and optimize campaign performance based on metrics like open rates, click-through rates, and conversion rates.
1. **How do you manage and maintain data quality and cleanliness in Pardot?**
Managing data quality in Pardot involves regularly auditing and cleaning up prospect data, enforcing data validation rules, and implementing processes for data governance and hygiene to ensure accurate reporting and segmentation.
1. **Explain the concept of progressive profiling and how it can be implemented in Pardot.**
Progressive profiling in Pardot is the practice of gradually collecting additional prospect information over time through iterative form submissions, allowing marketers to gather more insights while minimizing form fatigue and improving conversion rates.
1. **What are Pardot Einstein features and how do they enhance marketing automation capabilities?**
Pardot Einstein features leverage artificial intelligence and machine learning to provide predictive analytics, behavioral insights, and automated recommendations that help marketers optimize campaigns, prioritize leads, and drive engagement.
1. **How do you set up lead scoring categories and thresholds in Pardot to align with sales priorities?**
Setting up lead scoring categories and thresholds in Pardot involves collaborating with sales teams to define criteria and point values that indicate lead readiness and align with sales priorities and qualification criteria.
1. **Explain the role of Pardot tags and custom fields in organizing and segmenting prospect data.**
Pardot tags and custom fields are used to categorize and segment prospect data based on attributes, interests, and behaviors, enabling more targeted and personalized marketing campaigns and communications.
1. **Describe the process of building and deploying dynamic content in Pardot emails to improve engagement.**
Building and deploying dynamic content in Pardot emails involves creating content blocks with personalized messaging or images that are dynamically populated based on prospect attributes or segmentation criteria, increasing relevance and engagement.
1. **How do you monitor and analyze the performance of Pardot campaigns using key performance indicators (KPIs)?**
Monitoring and analyzing Pardot campaign performance involves tracking KPIs such as conversion rates, ROI, cost per acquisition, and engagement metrics like opens, clicks, and forwards to evaluate effectiveness and optimize future campaigns.
1. **What are some best practices for optimizing the Pardot-Salesforce integration to ensure data integrity and alignment between marketing and sales teams?**
Best practices for optimizing the Pardot-Salesforce integration include establishing clear data governance policies, implementing standardized naming conventions, conducting regular data audits, and providing training and support to ensure alignment between marketing and sales teams and maximize the value of shared data.
### Insight:
When interviewing Middle Salesforce Pardot Specialists, it’s essential to delve deeper into their technical expertise and practical experience with both Salesforce and Pardot platforms. Salesforce Pardot interview questions should assess their proficiency in designing and implementing complex marketing automation workflows, lead scoring models, and segmentation strategies within Pardot. Additionally, evaluating their ability to optimize Salesforce-Pardot integration, troubleshoot integration-related issues, and leverage advanced Pardot features like B2B Marketing Analytics is crucial.
## Interview Questions and Answers for a Senior Salesforce Pardot Software Engineer
1. **Explain how you would design a scalable and efficient architecture for integrating Pardot with Salesforce in a large enterprise environment.**
A scalable and efficient architecture for integrating Pardot with Salesforce in a large enterprise environment would involve leveraging tools like Salesforce Connector and APIs to ensure seamless data synchronization, implementing best practices for data modeling and security, and designing automated workflows to streamline processes.
1. **Describe a complex customization or integration project you have implemented in Pardot and Salesforce, including the challenges you faced and how you overcame them.**
In a previous project, we implemented a custom lead scoring model in Pardot and Salesforce to prioritize leads for sales follow-up. One challenge was aligning the scoring criteria with the sales team’s priorities, but we addressed this by conducting workshops and refining the model iteratively based on feedback.
1. **How do you ensure data quality and governance in a Pardot-Salesforce environment, especially when dealing with large volumes of prospect and customer data?**
Ensuring data quality and governance in a Pardot-Salesforce environment involves implementing data validation rules, enforcing data hygiene processes, conducting regular audits, and providing training and documentation to stakeholders to promote data stewardship and compliance.
1. **Explain the concept of account-based marketing (ABM) and how it can be implemented using Pardot and Salesforce together.**
Account-based marketing (ABM) is a strategic approach that targets high-value accounts with personalized campaigns and messaging. In Pardot and Salesforce, ABM can be implemented by leveraging features like account-based lists, lead scoring based on account engagement, and personalized content for key accounts.
1. **How do you design and implement lead lifecycle processes in Pardot and Salesforce to ensure smooth handoffs between marketing and sales teams?**
Designing and implementing lead lifecycle processes in Pardot and Salesforce involves defining qualification criteria, establishing lead routing rules, setting up automated notifications and alerts, and implementing closed-loop feedback mechanisms to track lead progression and optimize processes.
1. **Describe your experience with advanced Pardot features like B2B Marketing Analytics and Engagement History, and how they can provide insights to drive marketing strategy.**
B2B Marketing Analytics and Engagement History in Pardot provide deep insights into marketing performance and prospect behavior, enabling marketers to analyze campaign effectiveness, track engagement across channels, and make data-driven decisions to optimize strategy and drive revenue growth.
1. **How do you approach troubleshooting and debugging complex issues in Pardot and Salesforce, especially when dealing with integration-related challenges?**
Troubleshooting and debugging complex issues in Pardot and Salesforce involve analyzing logs, reviewing configuration settings, testing scenarios in sandbox environments, collaborating with cross-functional teams, and leveraging community resources and support channels for assistance.
1. **Explain how you would design and implement a lead scoring model that accounts for both explicit and implicit indicators of prospect interest and intent.**
Designing a lead scoring model that accounts for both explicit and implicit indicators involves defining point values for actions like form submissions and email clicks (explicit) as well as engagement metrics like time spent on website pages and content downloads (implicit), and adjusting scoring criteria based on prospect behavior and conversion patterns.
1. **Describe your experience with Pardot API integrations and how you have leveraged them to extend functionality or automate processes.**
In previous projects, I have used the Pardot API to integrate with third-party systems for data synchronization, lead enrichment, and custom reporting. This has involved developing custom scripts and applications to automate tasks like data imports, lead scoring updates, and email notifications.
1. **How do you ensure compliance with regulatory requirements like GDPR and CCPA when managing prospect and customer data in Pardot and Salesforce?**
Ensuring compliance with regulations like GDPR and CCPA involves implementing features like consent management, data encryption, and data retention policies in Pardot and Salesforce, as well as providing training and resources to stakeholders to educate them on their responsibilities and rights regarding data privacy.
1. **Explain how you would design and implement a multi-channel marketing strategy in Pardot, integrating email, social media, and advertising campaigns to reach and engage prospects across different touchpoints.**
Designing and implementing a multi-channel marketing strategy in Pardot involves creating targeted messaging and content for each channel, coordinating campaign schedules and messaging to ensure consistency and relevance, and leveraging automation to track and measure engagement across channels.
1. **Describe your experience with advanced segmentation techniques in Pardot, such as dynamic lists, segmentation rules, and predictive segmentation, and how they can be used to target and personalize marketing campaigns.**
Advanced segmentation techniques in Pardot, such as dynamic lists, segmentation rules, and predictive segmentation, allow marketers to target and personalize campaigns based on factors like demographics, behavior, and predictive insights, improving relevance and engagement with prospects and customers.
1. **How do you approach optimizing Pardot performance and scalability to support growing business needs and increasing data volumes?**
Optimizing Pardot performance and scalability involves monitoring system performance metrics, identifying bottlenecks and areas for improvement, implementing best practices for data management and automation, and periodically reviewing and optimizing configurations to ensure optimal performance as business needs evolve.
1. **Explain how you would design and implement a lead nurturing program in Pardot, incorporating drip campaigns, dynamic content, and personalized messaging to guide prospects through the buyer’s journey.**
Designing and implementing a lead nurturing program in Pardot involves defining the buyer’s journey stages, mapping content and messaging to each stage, setting up drip campaigns with targeted content sequences, and using dynamic content and personalization to engage prospects based on their interests and behavior.
1. **Describe your experience with Salesforce Campaign Influence and how you have used it to track and measure the impact of marketing campaigns on revenue generation and customer acquisition.**
Salesforce Campaign Influence allows marketers to track and measure the impact of marketing campaigns on revenue generation by attributing opportunities and closed deals to specific campaigns, channels, and touchpoints, providing insights into campaign effectiveness and ROI.
1. **How do you approach user training and adoption of Pardot and Salesforce features and functionality, especially when rolling out new tools or processes to large teams or organizations?**
Approaching user training and adoption involves developing comprehensive training materials and resources, conducting hands-on workshops and webinars, providing ongoing support and guidance, and soliciting feedback and input from users to ensure successful adoption and utilization of Pardot and Salesforce capabilities.
1. **Explain how you would design and implement lead scoring automation rules in Pardot to adjust scoring criteria based on prospect behavior and engagement patterns.**
Designing and implementing lead scoring automation rules in Pardot involves defining criteria for adjusting point values based on prospect behavior and engagement patterns, setting up rules to trigger scoring updates automatically, and regularly reviewing and refining scoring criteria to ensure alignment with sales priorities and qualification criteria.
1. **Describe your experience with Pardot Einstein features like Lead Scoring and Behavior Scoring, and how they leverage AI and machine learning to enhance marketing automation capabilities.**
Pardot Einstein features like Lead Scoring and Behavior Scoring leverage AI and machine learning to analyze prospect behavior and engagement patterns, predict lead quality and likelihood to convert, and prioritize leads for sales follow-up, enabling marketers to optimize campaign targeting and maximize ROI.
1. **How do you approach data migration and cleanup projects in Pardot and Salesforce, especially when consolidating multiple data sources or migrating from legacy systems?**
Approaching data migration and cleanup projects involves conducting thorough data analysis and mapping, developing data migration plans and strategies, executing data migration tasks in phases or batches, and performing data validation and testing to ensure accuracy and completeness of migrated data.
1. **Explain how you would design and implement advanced reporting and analytics solutions in Pardot and Salesforce, incorporating custom dashboards, reports, and data visualizations to provide actionable insights for marketing strategy and decision-making.**
Designing and implementing advanced reporting and analytics solutions involves identifying key performance indicators (KPIs) and metrics, building custom dashboards and reports to track and measure performance, and using data visualizations and insights to inform marketing strategy, optimize campaigns, and drive business growth.
**You might be interested in hiring** [Salesforce Certified Pardot Consultant](https://www.sfapps.info/salesforce-pardot-consultant-for-professional-services/)
### Insight:
In interviewing Senior Salesforce Pardot Specialists, the focus shifts towards evaluating their depth of expertise and leadership capabilities within the Salesforce and Pardot ecosystem. Pardot consultant interview questions and answers should assess their proficiency in architecting sophisticated marketing automation solutions, optimizing system performance, and driving strategic initiatives to enhance marketing operations efficiency and effectiveness. It’s crucial to probe into their experience with designing and implementing complex Salesforce-Pardot integrations, including data migration, synchronization, and custom development. Additionally, assessing their ability to mentor and guide junior team members, as well as collaborate cross-functionally with sales, marketing, and IT stakeholders, is paramount.
## Scenario Based Interview Questions and Answers for a Salesforce Pardot Consultant
1. **You’re tasked with implementing a lead scoring model in Pardot for a B2B company. How would you approach this task?**
I would start by collaborating with stakeholders to define lead scoring criteria based on attributes and behaviors indicative of prospect engagement and readiness to buy. Next, I would configure scoring categories and point values in Pardot, considering factors like email opens, form submissions, website visits, and engagement with specific content. Finally, I would set up automation rules to adjust lead scores based on prospect interactions and ensure alignment with sales qualification criteria.
1. **A marketing campaign manager wants to create a segmented email campaign in Pardot targeting prospects in different geographic regions. How would you assist them in achieving this?**
I would advise the campaign manager to use Pardot dynamic lists to segment prospects based on geographic criteria such as country or region. They can create dynamic lists with rules specifying the desired geographic attributes, ensuring that the email campaign is targeted to the right audience segments. Additionally, I would recommend leveraging dynamic content to personalize email messaging based on each recipient’s geographic location.
1. **A sales manager reports that leads generated from a recent marketing campaign in Pardot are not being assigned to the appropriate sales representatives in Salesforce. How would you troubleshoot this issue?**
First, I would check the lead assignment rules in both Pardot and Salesforce to ensure they are configured correctly and are active. Next, I would review the criteria used for lead assignment, such as territory or lead source, to identify any discrepancies or misconfigurations. If necessary, I would test lead assignment rules in a sandbox environment to verify their functionality and make adjustments as needed.
1. **A new Salesforce Pardot integration is being implemented, and there are concerns about data synchronization and consistency between the two systems. How would you address these concerns?**
I would start by reviewing the data mapping between Pardot and Salesforce to ensure that fields are mapped correctly and data is synchronized accurately. I would also check synchronization schedules and settings to ensure data is being updated in a timely manner. Additionally, I would recommend implementing validation rules and processes in both systems to enforce data integrity and consistency.
1. **A marketing team wants to track the effectiveness of an upcoming webinar promotion campaign in Pardot. How would you set up tracking and reporting for this campaign?**
I would create a new campaign in Pardot specifically for the webinar promotion and associate all related assets, such as emails, landing pages, and forms, with this campaign. I would use UTM parameters or Pardot campaign tags to track traffic and conversions from the campaign across different channels. Additionally, I would set up custom reports and dashboards in Pardot and Salesforce to monitor campaign performance metrics like leads generated, conversions, and ROI.
1. **The marketing team wants to implement lead nurturing workflows in Pardot to follow up with prospects who have downloaded a whitepaper. How would you design and implement these workflows?**
I would start by creating an automation rule in Pardot to identify prospects who have downloaded the whitepaper based on form submissions. Next, I would design a lead nurturing workflow using Engagement Studio, where prospects would receive a series of automated emails with relevant content and calls-to-action to further engage them. I would also set up automation rules to adjust lead scores and notify sales when prospects exhibit buying signals.
1. **A marketing manager wants to track the ROI of an upcoming trade show event using Pardot and Salesforce. How would you set up tracking and reporting for this event?**
I would create a new campaign in both Pardot and Salesforce specifically for the trade show event and associate all related assets and leads with this campaign. I would use custom campaign member statuses to track the progress of leads through different stages of engagement, from registration to follow-up. Additionally, I would set up custom reports and dashboards in Salesforce to monitor key metrics like leads generated, opportunities created, and revenue attributed to the event.
1. **A marketing team wants to personalize email content in Pardot based on a prospect’s industry. How would you implement dynamic content to achieve this?**
I would create dynamic content blocks in Pardot email templates with variations tailored to different industries. Then, I would use dynamic content rules to display the appropriate content block based on the prospect’s industry field value or segmentation criteria. This would ensure that each recipient receives personalized email content relevant to their industry.
1. **A marketing campaign manager wants to schedule a series of automated follow-up emails in Pardot for prospects who have attended a recent webinar. How would you assist them in setting up this workflow?**
I would recommend using Engagement Studio in Pardot to create a nurturing program for webinar attendees. I would set up triggers to enroll prospects in the program when they meet the criteria for webinar attendance, and then design a series of automated email actions with personalized content and timing to engage and nurture these prospects further.
1. **A sales manager wants to receive real-time notifications in Salesforce whenever a high-value lead engages with marketing content in Pardot. How would you set up this alert system?**
I would configure automation rules in Pardot to monitor prospect engagement and trigger alerts when specific criteria are met, such as a high lead score or certain actions taken. Then, I would set up Salesforce workflow rules or process builder to generate real-time notifications for the sales manager based on these alerts, ensuring timely follow-up and engagement with high-value leads.
1. **The marketing team wants to ensure that only qualified leads are synced from Pardot to Salesforce to prevent cluttering the sales pipeline. How would you configure lead syncing settings between Pardot and Salesforce?**
I would recommend setting up lead syncing filters in Pardot to define criteria for qualifying leads before they are synced to Salesforce. This could include criteria such as lead score thresholds, specific actions taken, or demographic attributes. By configuring these filters, only qualified leads meeting the specified criteria would be synced to Salesforce, ensuring a clean and relevant sales pipeline.
1. **A marketing manager wants to analyze the effectiveness of an email campaign in Pardot by tracking the open and click-through rates of different email variations. How would you set up A/B testing for this campaign?**
I would create an A/B test in Pardot for the email campaign, selecting the email variations to be tested and defining the test parameters such as sample size and test duration. I would then configure tracking for open and click-through rates for each variation and monitor the results in real-time. Based on the performance data, I would determine the winning variation and deploy it to the remaining audience for the campaign.
1. **A marketing team wants to implement lead scoring based on both explicit and implicit criteria in Pardot to better prioritize leads for sales follow-up. How would you design a comprehensive lead scoring model that incorporates both types of criteria?**
I would start by identifying explicit criteria such as demographic attributes, firmographic data, and explicit actions taken by prospects (e.g., form submissions, email clicks). These criteria would be assigned point values based on their importance and relevance to the lead qualification process. Additionally, I would incorporate implicit criteria such as engagement behavior, website interactions, and content consumption patterns, leveraging automation rules to adjust scores dynamically based on prospect behavior over time. By combining both explicit and implicit criteria, the lead scoring model would provide a comprehensive assessment of lead quality and readiness for sales engagement.
1. **A marketing campaign manager wants to track the ROI of an email campaign in Pardot by attributing revenue generated from campaign-related opportunities in Salesforce. How would you set up campaign tracking and revenue attribution for this campaign?**
I would associate the email campaign in Pardot with a specific campaign record in Salesforce and ensure that all related opportunities are associated with this campaign as well. By tracking campaign member statuses and opportunity stages in Salesforce, I would be able to attribute revenue generated from campaign-related opportunities back to the email campaign. Additionally, I would use custom reports and dashboards in Salesforce to monitor campaign ROI and track key metrics such as revenue attributed to the campaign.
1. **A marketing team wants to ensure that prospects who download a gated piece of content in Pardot receive follow-up emails with related content offers. How would you set up automated content nurturing workflows for this scenario?**
I would use Engagement Studio in Pardot to create a content nurturing program for prospects who download the gated piece of content. I would set up triggers to enroll prospects in the program upon downloading the content and design a series of automated email actions with relevant content offers based on their interests and engagement history. By using dynamic content and personalized messaging, I would ensure that prospects receive tailored follow-up emails that encourage further engagement and progression through the buyer’s journey.
1. **A marketing manager wants to track the performance of a paid advertising campaign in Pardot by monitoring clicks and conversions from campaign-related URLs. How would you set up tracking and reporting for this campaign?**
I would create custom URLs with tracking parameters (e.g., UTM parameters) for the campaign-related assets in Pardot, such as landing pages and forms. By using these custom URLs in the advertising campaign, I would be able to track clicks and conversions from campaign-specific sources. Additionally, I would set up custom reports and dashboards in Pardot to monitor campaign performance metrics such as clicks, conversions, and cost per acquisition, providing insights into the effectiveness of the advertising campaign.
1. **A marketing team wants to automate lead qualification in Pardot by assigning scores based on prospect engagement and behavior. How would you design and implement an automated lead scoring model for this purpose?**
I would start by identifying key engagement actions and behaviors indicative of lead interest and readiness to buy, such as email opens, form submissions, website visits, and content downloads. I would then assign point values to these actions based on their importance and relevance to the lead qualification process. Using automation rules in Pardot, I would configure scoring triggers to adjust lead scores dynamically based on prospect behavior over time. By continuously monitoring and refining the lead scoring model, I would ensure that leads are accurately prioritized for sales follow-up based on their level of engagement and qualification.
1. **A marketing manager wants to personalize email content in Pardot based on a prospect’s past interactions with the company, such as previous purchases or webinar attendance. How would you implement dynamic content to achieve this level of personalization?**
I would create dynamic content blocks in Pardot email templates with variations tailored to different prospect segments based on their past interactions and behaviors. Using dynamic content rules, I would define criteria for displaying specific content blocks based on attributes such as purchase history, webinar attendance, or engagement level. This would allow me to deliver personalized email content to each recipient that reflects their past interactions with the company, increasing relevance and engagement.
1. **A marketing campaign manager wants to implement lead nurturing workflows in Pardot to guide prospects through the buyer’s journey with targeted content and messaging. How would you design and implement these workflows to optimize engagement and conversion rates?**
I would use Engagement Studio in Pardot to create lead nurturing workflows that deliver a series of automated email actions to prospects based on their behavior and engagement history. I would segment prospects into different nurture tracks based on their interests, demographics, or stage in the buyer’s journey, and design personalized content sequences for each track. By incorporating triggers and decision points to dynamically adjust the flow of communication based on prospect interactions, I would ensure that prospects receive relevant content and messaging at each stage of their journey, optimizing engagement and conversion rates.
1. **A marketing team wants to implement a lead scoring model in Pardot that takes into account both explicit and implicit indicators of lead interest and intent. How would you design a comprehensive lead scoring model that incorporates these factors to prioritize leads for sales follow-up?**
I would start by identifying explicit criteria such as demographic attributes, firmographic data, and explicit actions taken by prospects (e.g., form submissions, email clicks). These criteria would be assigned point values based on their importance and relevance to the lead qualification process. Additionally, I would incorporate implicit criteria such as engagement behavior, website interactions, and content consumption patterns, leveraging automation rules to adjust scores dynamically based on prospect behavior over time. By combining both explicit and implicit criteria, the lead scoring model would provide a comprehensive assessment of lead quality and readiness for sales engagement, enabling sales teams to prioritize follow-up efforts effectively.
### Insight:
These Pardot developer interview questions present candidates with hypothetical scenarios related to Pardot implementation, optimization, or troubleshooting and evaluate their approach to addressing complex challenges. By presenting candidates with scenarios that reflect situations commonly encountered in Salesforce Pardot environments, recruiters can assess their critical thinking, decision-making abilities, and practical understanding of the platform.
## Technical Interview Questions for a Salesforce Pardot Specialist
1. **What are the different types of connectors available for integrating Pardot with other systems, and how do they differ?**
The different types of connectors available for integrating Pardot with other systems are:
- Salesforce Connector: Provides seamless integration between Pardot and Salesforce, allowing for data synchronization and bi-directional communication.
- API Connector: Allows developers to integrate Pardot with third-party systems using RESTful APIs for data exchange and automation.
- Custom Connector: Enables developers to build custom integrations using Pardot APIs and webhooks to connect with specific systems or platforms.
1. **How would you retrieve a list of prospects using the Pardot API?**
To retrieve a list of prospects using the Pardot API, you would make a GET request to the /prospect/query endpoint, specifying the desired search criteria and filters in the request parameters. The API response would contain a list of prospects matching the specified criteria.
1. **Explain the difference between Pardot lists and Pardot dynamic lists.**
- Pardot Lists: Static lists of prospects that you manually add or remove members from. The list membership remains constant until manually updated.
- Pardot Dynamic Lists: Automatically update their membership based on predefined criteria or rules. Prospects are added to or removed from dynamic lists dynamically as they meet or no longer meet the specified criteria.
1. **How would you create a new prospect record using the Pardot API?**
To create a new prospect record using the Pardot API, you would make a POST request to the /prospect/version/4/do/create endpoint, providing the required prospect data in the request body. The API response would contain the newly created prospect’s ID and other details.
1. **Explain how you would set up email tracking in Pardot to monitor email opens and clicks.**
Email tracking in Pardot can be set up by enabling the “Email Click and Open Tracking” feature in email templates or individual emails. This feature automatically adds tracking pixels and redirects to track email opens and clicks when emails are sent. The tracking data is then captured and displayed in Pardot reports for analysis.
1. **How would you use completion actions in Pardot to automate follow-up actions after a prospect submits a form?**
Completion actions in Pardot allow you to automate tasks or actions that occur after a prospect completes a form. To use completion actions, you would define the desired actions, such as adding the prospect to a list, sending a follow-up email, or assigning a task to a user, and then configure the form settings to trigger these actions upon form submission.
1. **What is the difference between Pardot custom fields and default fields?**
- Pardot Custom Fields: Custom fields that you can create to capture additional prospect data beyond the default fields provided by Pardot. You can define custom field types, such as text, picklist, or date, and use them to collect specific information relevant to your marketing needs.
- Pardot Default Fields: Standard fields provided by Pardot to capture common prospect attributes, such as email address, first name, last name, and company. These fields are predefined and cannot be modified or deleted.
1. **How would you retrieve a list of email templates using the Pardot API?**
To retrieve a list of email templates using the Pardot API, you would make a GET request to the /emailTemplate/version/4/do/query endpoint. The API response would contain a list of email templates available in Pardot, including their IDs, names, and other details.
1. **Explain how you would set up lead scoring categories and thresholds in Pardot to prioritize leads for sales follow-up.**
To set up lead scoring categories and thresholds in Pardot, you would first define the scoring criteria and point values for each category, such as email opens, form submissions, or website visits. Then, you would configure automation rules to adjust lead scores based on prospect interactions and behaviors. Finally, you would set thresholds or cutoff scores to determine when leads are considered qualified and ready for sales follow-up.
1. **How would you use automation rules in Pardot to automatically assign prospects to specific lists based on their behavior or attributes?**
To use automation rules in Pardot to automatically assign prospects to lists, you would define the criteria or conditions that trigger the rule, such as specific actions taken or field values matched. Then, you would specify the action to be taken, such as adding the prospect to a list or removing them from a list, and configure the rule to run on a scheduled basis or in real-time as prospects meet the criteria.
1. **Explain how you would create a new email template in Pardot using the API.**
To create a new email template in Pardot using the API, you would make a POST request to the /emailTemplate/version/4/do/create endpoint, providing the required template data in the request body, such as the template name, HTML content, and folder ID where the template should be saved. The API response would contain the ID of the newly created email template.
1. **How would you retrieve a list of campaigns using the Pardot API?**
To retrieve a list of campaigns using the Pardot API, you would make a GET request to the /campaign/version/4/do/query endpoint. The API response would contain a list of campaigns available in Pardot, including their IDs, names, and other details.
1. **Explain how you would use custom redirects in Pardot to track and analyze link clicks in marketing emails.**
Custom redirects in Pardot allow you to create shortened URLs that redirect to specific web pages or assets, while also tracking click data and providing analytics. To use custom redirects, you would create a new redirect in Pardot, specify the destination URL, and use the generated redirect URL in your marketing emails. Pardot will then track clicks on the redirect URL and provide click-through data in reports for analysis.
1. **How would you retrieve a list of automation rules using the Pardot API?**
To retrieve a list of automation rules using the Pardot API, you would make a GET request to the /automationRule/version/4/do/query endpoint. The API response would contain a list of automation rules available in Pardot, including their IDs, names, and other details.
1. **Explain how you would set up lead nurturing campaigns in Pardot using Engagement Studio.**
To set up lead nurturing campaigns in Pardot using Engagement Studio, you would first define the campaign goals, audience segments, and desired outcomes. Then, you would design a series of automated nurture tracks with personalized content and messaging tailored to each segment. You would use triggers and decision points to dynamically adjust the flow of communication based on prospect behavior and engagement, ensuring that prospects receive relevant content at each stage of their journey.
1. **How would you retrieve a list of forms using the Pardot API?**
To retrieve a list of forms using the Pardot API, you would make a GET request to the /form/version/4/do/query endpoint. The API response would contain a list of forms available in Pardot, including their IDs, names, and other details.
1. **Explain how you would set up lead scoring automation rules in Pardot to adjust lead scores based on prospect behavior and engagement patterns.**
To set up lead scoring automation rules in Pardot, you would define the criteria or conditions that trigger the rule, such as specific actions taken or engagement thresholds met. Then, you would specify the action to be taken, such as adjusting the lead score up or down, and configure the rule to run on a scheduled basis or in real-time as prospects meet the criteria. By using automation rules to dynamically adjust lead scores based on prospect behavior and engagement patterns, you can ensure that leads are accurately prioritized for sales follow-up.
1. **How would you retrieve a list of tags using the Pardot API?**
To retrieve a list of tags using the Pardot API, you would make a GET request to the /tag/version/4/do/query endpoint. The API response would contain a list of tags available in Pardot, including their IDs, names, and other details.
1. **Explain how you would use Pardot APIs to sync prospect data with an external CRM system like Salesforce.**
To sync prospect data with an external CRM system like Salesforce using Pardot APIs, you would use the Prospect endpoint to retrieve prospect data from Pardot and the appropriate CRM APIs to create or update corresponding records in the CRM system. You would typically use a combination of batch processing and real-time syncing to ensure data consistency and accuracy between the two systems, leveraging data mapping and transformation as needed to align data structures and field mappings.
1. **How would you retrieve a list of opportunities associated with a specific campaign using the Pardot API?**
To retrieve a list of opportunities associated with a specific campaign using the Pardot API, you would make a GET request to the /opportunity/version/4/do/query endpoint, specifying the campaign ID as a filter parameter in the request. The API response would contain a list of opportunities associated with the specified campaign, including their IDs, names, and other details.
### Insight:
Salesforce Pardot Technical interviews are designed to assess candidates’ depth of knowledge and proficiency in leveraging the platform’s technical capabilities to drive marketing automation initiatives. Recruiters focus on probing candidates’ understanding of Pardot architecture, integration with Salesforce, and mastery of key technical concepts such as automation rules, custom fields, and API integration. Salesforce Pardot specialist interview questions delve into candidates’ ability to design and implement complex workflows, troubleshoot integration issues, and optimize system performance. By evaluating candidates’ hands-on experience with Pardot customization, data management, and coding proficiency, recruiters can ascertain their readiness to tackle the technical challenges inherent in Salesforce Pardot roles.
## Conclusion
These sample questions to ask Pardot consultant and requirements serve as a solid foundation for evaluating candidates for the Salesforce Pardot Specialist role. However, it’s essential to tailor the interview questions and position requirements to the specific needs and objectives of the hiring organization. As the landscape of marketing automation continues to evolve, candidates with a combination of technical expertise, strategic thinking, and a passion for innovation will play a crucial role in driving marketing success. With careful consideration and thoughtful evaluation, the hiring team can identify candidates who demonstrate the potential to make significant contributions to the organization’s marketing efforts and overall growth.
The post [100 Salesforce Pardot Interview Questions and Answers](https://www.sfapps.info/100-salesforce-pardot-interview-questions-and-answers/) first appeared on [Salesforce Apps](https://www.sfapps.info).
# Level Up Your Coding Skills: Rust Operators Simplified

In the world of programming and computing, **operators** are the standard symbols that instruct compilers and interpreters to perform specific mathematical or logical operations on values and variables. Rust categorizes operators by their precedence and associativity; understanding both helps you write clear and predictable code by ensuring operations are performed in the intended order.
In this article, we will look into the precedence of operators and the different types of Rust operators and how they are used in code blocks.
Operator precedence determines the order in which a program evaluates different operations in an expression, as operators with higher precedence get evaluated before those with lower precedence.
For example, multiplication has higher precedence than addition. Thus, the expression `3 + 2 * 3` evaluates to `3 + (2 * 3) = 9`, not `(3 + 2) * 3 = 15`. (Note that Rust has no exponent operator; exponentiation is done with methods such as `i32::pow`, and method calls sit near the top of the precedence table.)
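The example above can be checked directly in Rust (here `*` is multiplication):

``` rust
fn main() {
    let without_parens = 3 + 2 * 3; // multiplication binds tighter: 3 + (2 * 3)
    let with_parens = (3 + 2) * 3;  // parentheses override the default precedence
    println!("{} {}", without_parens, with_parens); // prints "9 15"
}
```

When in doubt about precedence, adding explicit parentheses costs nothing and makes the intended grouping obvious to readers.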
The following table lists the precedence of Rust operators. Operators are listed top to bottom, in descending order:
| Precedence | Operator | Description |
| --- | --- | --- |
| 19 | `Paths` | The specific location of a field or element. |
| 18 | `Method calls` | Performs specific operations or calculations on data. |
| 17 | `Field Expressions` | Used to access individual fields or attributes of a data structure. |
| 16 | `Function calls, Arrays indexing` | Used to execute user-defined or built-in functions while array index allows you to access individual elements within an array. |
| 15 | `?` | Error propagation (question mark) operator. |
| 14 | `-a` | Unary minus. |
| | `!` | Bitwise or Logical NOT. |
| | `*` | Dereference operator. |
| | `&` | Shared borrow operator. |
| | `&mut` | Mutable borrow. |
| 13 | `as` | Type casting keyword. |
| | `:` | Multiple uses. |
| 12 | `* / %` | Multiplication, Division, Remainder. |
| 11 | `+ -` | Addition, Subtraction. |
| 10 | `<< >>` | Bitwise left shift and right shift. |
| 9 | `&` | Bitwise or Logical AND. |
| 8 | `^` | Bitwise or Logical XOR. |
| 7 | `\|` | Bitwise or Logical OR. |
| 6 | `== !=` | Equality, Inequality. |
| | `< <= > >=` | Less than, Less than or equal, Greater than, Greater than or equal. |
| 5 | `&&` | Logical AND. |
| 4 | `\|\|` | Logical OR. |
| 3 | `.. ..=` | Exclusive range, Inclusive range. |
| 2 | `=` | Direct assignment. |
| | `+= -= *= /= %=` | Compound assignment by sum, difference, product, quotient and remainder. |
| | `<<= >>=` | Compound assignment by Bitwise left shift and right shift. |
| | `&= ^= \|=` | Compound assignment by Bitwise AND, XOR and OR. |
| 1 | `return` | Return statement. |
| | `break` | Break statement. |
Operators with higher precedence are evaluated before operators with lower precedence but when operators have the same precedence, the associativity of the operators determines the order in which the operations are performed.
Table of Associativity:
| Operator | Description | Associativity |
| --- | --- | --- |
| `Field expressions` | Field access | Left to Right |
| `as` | Type casting keyword | Left to Right |
| `:` | Operator (multiple uses) | Left to Right |
| `* / %` | Multiplication, Division, Remainder | Left to Right |
| `<< >>` | Bitwise left shift and right shift | Left to Right |
| `&` | Bitwise or Logical AND | Left to Right |
| `^` | Bitwise or Logical XOR | Left to Right |
| `&&` | Logical AND | Left to Right |
| `=` | Direct assignment | Right to Left |
| `+= -= *= /= %=` | Compound assignment by sum, difference, product, quotient and remainder | Right to Left |
| `<<= >>=` | Compound assignment by Bitwise left shift and right shift | Right to Left |
| `&= ^= \|=` | Compound assignment by Bitwise AND, XOR and OR | Right to Left |
Here are a few examples that illustrate operator associativity:
- Addition and Subtraction (Left-to-Right).
``` rust
let result = 7 + 4 - 2; // interpreted as (7 + 4) - 2
```
- Assignment Operators (Right-to-Left).
``` rust
let mut x = 7;
let mut y = 14;
y = 21; // assignment associates right to left; note that `x = y = 21`
x = y;  // would not compile in Rust, because `y = 21` evaluates to `()`
```
- Unary Operators (Right-to-Left).
``` rust
let x = 7;
let y = -x;
let z = !true;
```
The following are the types of operators in Rust:
- Arithmetic Operators
- Comparison Operators
- Logical Operators
- Bitwise Operators
- Compound Assignment Operators
## Arithmetic Operators
Rust supports several arithmetic operators for performing basic mathematical arithmetic operations like addition, subtraction, multiplication, and division.
Below is a code block example of Rust's arithmetic operators that perform and print the result of various calculations:
``` rust
fn main() {
    let a = 8;
    let b = 4;
    println!("a = {}, b = {}\n", a, b);

    // Addition
    let result_add = a + b;
    println!("a + b = {}", result_add);

    // Subtraction
    let result_sub = a - b;
    println!("a - b = {}", result_sub);

    // Multiplication
    let result_mul = a * b;
    println!("a * b = {}", result_mul);

    // Division
    let result_div = a / b;
    println!("a / b = {}", result_div);

    // Remainder
    let result_modulo = a % b;
    println!("a % b = {}", result_modulo);
}
```
Here is the output of the arithmetic operator code block:
``` rust
a = 8, b = 4
a + b = 12
a - b = 4
a * b = 32
a / b = 2
a % b = 0
```
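A detail worth noting alongside these basics: integer division in Rust truncates toward zero, and the sign of the `%` result follows the dividend:

``` rust
fn main() {
    println!("{}", 7 / 2);  // 3: integer division truncates toward zero
    println!("{}", -7 / 2); // -3, not -4
    println!("{}", -7 % 2); // -1: remainder takes the sign of the dividend
}
```

This behavior matters when working with negative numbers, since some other languages define `%` differently.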
## Comparison Operators
Rust provides several comparison operators that allow you to compare values. Each comparison returns a boolean value: `true` if the comparison holds and `false` otherwise. This immediate feedback helps in making decisions and controlling the flow of the program.
## Example
``` rust
fn main() {
    let x = 8;
    let y = 10;

    // Equal to
    let is_equal = x == y;
    println!("{} == {} is {}", x, y, is_equal);

    // Not equal to
    let is_not_equal = x != y;
    println!("{} != {} is {}", x, y, is_not_equal);

    // Greater than
    let is_greater = x > y;
    println!("{} > {} is {}", x, y, is_greater);

    // Less than
    let is_less = x < y;
    println!("{} < {} is {}", x, y, is_less);

    // Greater than or equal to
    let is_greater_or_equal = x >= y;
    println!("{} >= {} is {}", x, y, is_greater_or_equal);

    // Less than or equal to
    let is_less_or_equal = x <= y;
    println!("{} <= {} is {}", x, y, is_less_or_equal);
}
```
Here is the output of the comparison operators code block:
``` rust
8 == 10 is false
8 != 10 is true
8 > 10 is false
8 < 10 is true
8 >= 10 is false
8 <= 10 is true
```
Understanding and using comparison operators correctly is essential in programming to make decisions based on conditions. Rust's comparison operators provide a straightforward way to compare values and control the flow of your program.
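For instance, a comparison result can drive an `if` branch directly (the numbers here are purely illustrative):

``` rust
fn main() {
    let score = 72;
    if score >= 60 {
        println!("pass"); // this branch runs, since 72 >= 60
    } else {
        println!("fail");
    }
}
```

Because comparisons produce plain `bool` values, they can also be stored in variables, passed to functions, or combined with the logical operators covered next.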
## Logical Operators
Logical operators are used to combine or modify boolean expressions. Logical AND returns true only when all conditions are true; logical OR returns true when at least one condition is true; logical NOT inverts a condition.
Rust provides three main logical operators:
- `&&`
Logical AND: Returns true when all conditions are true.
- `||`
Logical OR: Returns true when any condition is true.
- `!`
Logical NOT: Returns true when the given condition is not true.
## Example
``` rust
fn main() {
    let x = 12;
    let y = 16;
    let z = 18;

    // Check if x is less than y and y is less than z
    let result = x < y && y < z;
    println!("Result: {}", result);

    // Check if x is greater than y, or y is greater than z
    let result_2 = x > y || y > z;
    println!("Result_2: {}", result_2);

    // Check if x is not equal to y
    let result_3 = x != y;
    println!("Result_3: {}", result_3);
}
```
Below is the result from the logical operators code block above:
``` rust
Result: true
Result_2: false
Result_3: true
```
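One property not shown above is short-circuit evaluation: `&&` and `||` evaluate their right-hand operand only when it can still affect the result, which matters when that operand has side effects or is expensive to compute:

``` rust
fn main() {
    fn right_side() -> bool {
        println!("right side evaluated");
        true
    }

    let _ = false && right_side(); // right side skipped: result is already false
    let _ = true || right_side();  // right side skipped: result is already true
    let _ = true && right_side();  // prints "right side evaluated"
}
```

This is why a guard like `!list.is_empty() && list[0] > 0` is safe: the indexing never runs on an empty list.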
## Bitwise Operators
Bitwise operators in Rust perform operations at the binary level, manipulating the individual bits of integer values. Bitwise operations involve working with individual bits, which are the smallest units of data in a computer.
Each bit has a single binary value: 0 or 1.
List of available bitwise operators in Rust:
- `&`
Bitwise AND: It performs a Boolean AND operation on each bit of its integer arguments.
- `|`
Bitwise OR: It performs a Boolean OR operation on each bit of its integer arguments.
- `^`
Bitwise XOR: It performs a Boolean exclusive OR operation on each bit of its integer arguments.
- `!`
Bitwise NOT: It is a unary operator and operates by reversing all the bits in the operand.
- `>>`
Bitwise Right shift: The left operand’s value is moved right by the number of bits specified by the right operand.
- `<<`
Bitwise Left shift: It moves all the bits in its first operand to the left by the number of places specified in the second operand. New bits are filled with zeros.
These operators are useful in low-level programming, such as when you need to manipulate individual bits of data, optimize performance, or work with hardware interfaces.
## Example
``` rust
fn main() {
    let x = 0b1100; // 12 in decimal
    let y = 0b1010; // 10 in decimal

    // Bitwise AND
    let result_and = x & y;
    println!("{} & {} = {}", x, y, result_and);

    // Bitwise OR
    let result_or = x | y;
    println!("{} | {} = {}", x, y, result_or);

    // Bitwise XOR
    let result_xor = x ^ y;
    println!("{} ^ {} = {}", x, y, result_xor);

    // Bitwise NOT
    let result_not = !x;
    println!("!{} = {}", x, result_not);

    // Left shift
    let result_left_shift = x << 2;
    println!("{} << 2 = {}", x, result_left_shift);

    // Right shift
    let result_right_shift = x >> 2;
    println!("{} >> 2 = {}", x, result_right_shift);
}
```
Output of the bitwise operators code block (the `{}` format specifier prints integers in decimal):
``` rust
12 & 10 = 8
12 | 10 = 14
12 ^ 10 = 6
!12 = -13
12 << 2 = 48
12 >> 2 = 3
```
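By default, the `{}` format specifier prints integers in decimal. To display results in binary form instead, Rust's `{:b}` and `{:#b}` format specifiers can be used:

``` rust
fn main() {
    let x = 0b1100;
    let y = 0b1010;
    // {:#06b} adds the 0b prefix and zero-pads to a total width of 6
    println!("{:#06b} & {:#06b} = {:#06b}", x, y, x & y); // prints "0b1100 & 0b1010 = 0b1000"
}
```

Printing in binary makes the bit-by-bit effect of these operators much easier to see than decimal output.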
## Compound Assignment Operators
Compound assignment operators in Rust combine an arithmetic or bitwise operation with assignment, streamlining the process of modifying the value of a variable. Instead of separating the operation and the assignment, you can use these operators to perform both actions in a single step.
Here are the main compound assignment operators in Rust:
- `+=`
Arithmetic addition and assignment
- `-=`
Arithmetic subtraction and assignment
- `*=`
Arithmetic multiplication and assignment
- `/=`
Arithmetic division and assignment
- `%=`
Arithmetic remainder and assignment
- `<<=`
Left-shift and assignment
- `>>=`
Right-shift and assignment
- `&=`
Bitwise AND and assignment
- `|=`
Bitwise OR and assignment
- `^=`
Bitwise exclusive OR and assignment
## Example
``` rust
fn main() {
    let mut x = 10;
    let mut y = 5;

    // Addition assignment
    x += y;
    println!("x += y: {}", x);

    // Subtraction assignment
    x -= y;
    println!("x -= y: {}", x);

    // Multiplication assignment
    x *= y;
    println!("x *= y: {}", x);

    // Division assignment
    x /= y;
    println!("x /= y: {}", x);

    // Remainder assignment
    x %= y;
    println!("x %= y: {}", x);

    // Reset x and y for bitwise operations
    x = 0b1010; // 10 in binary
    y = 0b1100; // 12 in binary

    // Bitwise AND assignment
    x &= y;
    println!("x &= y: {}", x);
}
```
These shorthand notations combine an operation and an assignment in one token, which makes the code more concise and readable.
Output of the compound assignment code block:
``` text
x += y: 15
x -= y: 10
x *= y: 50
x /= y: 10
x %= y: 0
x &= y: 8
```
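The example above stops at the bitwise AND assignment; for completeness, a short sketch of the remaining shift and bitwise compound operators (the values are chosen arbitrarily):

``` rust
fn main() {
    let mut x: i32 = 0b1010; // 10

    x |= 0b0101; // bitwise OR and assignment  -> 0b1111  (15)
    x ^= 0b0011; // bitwise XOR and assignment -> 0b1100  (12)
    x <<= 2;     // left-shift and assignment  -> 0b110000 (48)
    x >>= 4;     // right-shift and assignment -> 0b11    (3)

    println!("final x = {}", x); // final x = 3
}
```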
In conclusion, Rust operators play a crucial role in writing functional and efficient code, providing a wide array of operations that cater to various programming needs. From arithmetic and comparison to logical and bitwise operators, they let developers perform complex calculations, make decisions, and manipulate data effectively. Understanding and using these operators proficiently empowers developers to write more robust, efficient, and readable code, ultimately enhancing their ability to tackle sophisticated programming challenges. | johnniekay |
1,872,034 | The Importance of Comments in Large Projects | In the world of software development, particularly when dealing with large projects, comments play an... | 0 | 2024-05-31T12:12:21 | https://dev.to/vidyarathna/the-importance-of-comments-in-large-projects-34ka | codemaintenance, programmingtips, developerbestpractices, codecomments |
In the world of software development, particularly when dealing with large projects, comments play an indispensable role. Neglecting to write comments can lead to a multitude of issues that can hamper both the short-term and long-term success of a project. Let's explore why comments are crucial, what should be included in them, and best practices for writing effective comments.
### Why Comments are Important
1. **Maintainability:** Large projects often involve many developers. Clear comments help maintain the code by providing insights into the purpose and functionality of code segments. This is especially important when the original authors are not available to explain their work.
2. **Collaboration:** When multiple developers work on a project, comments facilitate better collaboration. They help team members understand each other's code quickly, reducing the time needed for explanations and allowing for more efficient teamwork.
3. **Debugging and Troubleshooting:** Comments can guide developers during the debugging process. By understanding the intent behind a piece of code, it is easier to identify and fix bugs.
4. **Documentation:** Comments serve as a form of inline documentation. They provide context and details that might not be covered in external documentation, bridging the gap between high-level project documentation and the code itself.
5. **Onboarding New Developers:** For new team members, comments are invaluable. They help new developers get up to speed with the codebase faster, making the onboarding process smoother and less time-consuming.
### Consequences of Neglecting Comments
- **Increased Time for Understanding Code:** Without comments, developers spend more time deciphering code, leading to inefficiencies.
- **Higher Risk of Errors:** Misunderstanding the purpose of code can lead to incorrect modifications, introducing bugs.
- **Poor Code Quality:** Lack of comments can result in poorly maintained and inconsistent code, making the project harder to scale and extend.
- **Decreased Morale:** Developers might feel frustrated when working with uncommented code, affecting their productivity and job satisfaction.
### What to Include in Comments
1. **Purpose of the Code:** Explain what the code is supposed to achieve.
2. **Logic Explanation:** Provide insights into the logic, especially if it is complex or non-obvious.
3. **Input and Output Details:** Describe the expected inputs and outputs for functions or methods.
4. **Edge Cases and Assumptions:** Note any assumptions made and how edge cases are handled.
5. **References:** Mention any references to other parts of the code or external resources that provide additional context.
### How to Write Effective Comments
1. **Be Clear and Concise:** Avoid overly verbose comments. Aim for clarity and brevity.
2. **Use Proper Grammar and Spelling:** Well-written comments reflect professionalism and are easier to read.
3. **Keep Comments Up-to-Date:** Ensure comments are updated whenever the associated code changes.
4. **Avoid Obvious Comments:** Do not state the obvious (e.g., `// Increment x by 1` for `x++`). Focus on explaining the why, not the what.
5. **Use TODOs and FIXMEs:** Clearly mark areas that need further work or have known issues using standardized tags like `TODO` or `FIXME`.
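To illustrate the "explain the why, not the what" rule, a small hypothetical JavaScript example:

```javascript
// Bad: restates what the code already says.
// delay = delay * 2;

// Good: explains why the code does something non-obvious.
function retryDelayMs(retries) {
  // Exponential backoff, capped at 30s so a flaky service
  // doesn't keep users waiting indefinitely.
  return Math.min(1000 * 2 ** retries, 30000);
}
```

The second comment tells a future maintainer what would break if the cap were removed — information the code alone cannot convey.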
### Conclusion
Comments are a critical component of code quality in large projects. They enhance maintainability, facilitate collaboration, aid in debugging, and serve as documentation. By following best practices and ensuring comments are clear, concise, and relevant, developers can significantly improve the readability and robustness of their code. Remember, well-commented code is a sign of a thoughtful and considerate developer.
Happy Coding! | vidyarathna |
1,872,033 | A Guide to Mobile App UI Designs | The App Store or the Play Store boasts about 5.679 million apps, whose main objective is to see users... | 0 | 2024-05-31T12:12:08 | https://www.peppersquare.com/blog/a-guide-to-mobile-app-ui-designs/ | ui, mobile, design | Together, the App Store and the Play Store host about 5.679 million apps, and every one of them wants to be downloaded. To achieve that, however, an app has to satisfy multiple requirements.
One of those requirements begins with mobile app design. And because apps lose about 77% of their users in the first three days, the importance of designing a mobile app well cannot be overstated.
Beginning with the user interface, there’s a lot that needs to be furnished.
## How has mobile app UI design changed?
Mobile app UI design, from the user interface itself down to its finer specifics, has changed a great deal over time. These changes result from trends, and there are plenty of them in 2023.
Hence, here are the top trends in mobile app [UI design](https://www.peppersquare.com/blog/11-blogs-with-compelling-ui-design-that-will-inspire-you/) for 2023.
## The power of Artificial Intelligence (AI)
AI has penetrated through various design stages and is now firmly established. With the power of AI, designers can now look into developing adaptable interfaces that go a long way in meeting customer preferences.
Thanks to AI, designers now have an extra tool for,
1. Generating user engagement
2. Building seamless navigation
3. Conducting [mobile app market research](https://www.peppersquare.com/blog/best-way-to-conduct-mobile-app-research/)
AI’s robust potential has also suggested that its market could grow to $282 billion by 2027. This refers to a point in the future where UI/UX designers could view AI as a companion.
## Combining Augmented Reality (AR) and Virtual Reality (VR)
With smartphones being the perfect medium to access almost anything, it is no surprise that they can deliver top-notch AR effects as well. As a top trend in 2023, users are expected to engage a lot more with digital content thanks to the impact of AR.
Take 3D as a top example. With its inclusion, users can view any product in detail before purchasing it. Such examples hint at a future where we further explore an immersive digital environment with AR and VR being at the center and UX designs for apps being critical.
## Mobile-first designs
A new trend enables designers to consider designing for mobiles first and then move over to desktops and other devices. With billions of smartphone users worldwide, the need to cater to them and find a [top mobile UI/UX agency](https://www.peppersquare.com/ui-ux-design/mobile-app-design/) grows further.
While the user experience needs to be seamless across all devices, smartphones grab the most attention; therefore, mobile UX becomes the talk of the town.
## Multi-path navigation
Multi-path navigation is a newer approach aimed at simplifying flows and interactions that would otherwise be complex, depending on the kind of app.
Multi-path navigation incorporates,
1. Vertically scrolling displays and
2. Horizontal sliders

Its benefits are also many in number, with different navigation styles enhancing the app’s visual appeal and increasing its usability. Users are also more likely to use the app for more extended periods, increasing customer retention.
## Personalized user experiences
Personalized user experiences aren’t a new thing. However, they are regarded as a trend and included in design guidelines, thanks to the benefits that they can offer. In addition, with companies collecting user data, they expect this information to generate personalized experiences.
For the same purpose, you can witness a [top UI/UX design agency](https://www.peppersquare.com/ui-ux-design/) create user personas to further develop UX designs and come out with a favorable outcome. In modern times, AI also chips in and combines the goodness of customization and personalization.
Top streaming platforms, like Netflix, use AI to showcase content similar to their users’ interests and likes. This only goes to prove that more personalized user experiences await us shortly.
## How does a well-designed Mobile App UI help your business?
Every business aims to gain returns, and the same can be said for designing mobile apps. Businesses benefit from a well-designed mobile app; the following points convey those benefits.
- **Helps in increasing ROI (Return on Investment)**
Mobile UX is closely connected with Key Performance Indicators (KPIs), which are known to impact ROI significantly. Therefore, a well-designed and enhanced UX not only helps in customer retention but also customer loyalty.
It plays a vital role in conversions, placing the onus on UI/UX designers to ensure everything goes as planned. If your business has the backing of a well-structured and simplified design, you can expect ROI to be in your business’s favor.
- **Helps in brand building**
There are approximately 6.84 billion smartphone users in the world, leaving you with more than enough reasons to build a mobile app. With a broad audience, a business with a mobile app is also an appealing aspect of brand building.
By incorporating top design practices or by learning about [how to improve mobile app user experience (UX)](https://www.peppersquare.com/blog/how-to-improve-your-mobile-app-user-experience-ux/), you can look to build a recognizable brand whose message can be spread across the world through an app.
- **Better rankings**
Apps with the best user experience are known to have better ranks and a higher chance of being downloaded. So, from content to design, app UX is crucial in helping businesses rank their apps higher.
When the design is engaging and the features helpful, customers are more likely to meet their requirements by downloading your app.
- **Be ahead of the competition**
Top industries and the companies and businesses running them have apps, which is now the rule of the land. So if your business does not have an app, you will fall behind and lose the race.
Your target group will go elsewhere to meet their needs. On top of it all, if your app does not have a top UI, you are bound to be disregarded by the competition.
## What are some of the tips for developing a good Mobile UI?

Developing a good mobile UI falls on specifics. While requirements must be met, a few tips, especially design tips, can help you get things done.
- **Conducting user research**
User interfaces need to be simple and easy to understand. A cluttered interface will always cause a hazard and prevent users from getting what they want. The idea is to keep it simple and conduct user research to understand their needs and requirements.
First, you need to understand whether your target group needs a [mobile app or a mobile website.](https://www.peppersquare.com/blog/mobile-app-vs-mobile-website/) And once you have the answer, you need to move on to the next part of the project.
If a mobile app is the way to go forward, then you need to formulate your mobile app design based on data extracted from critical user research.
- **Being consistent**
Maintaining consistency in colors, typography, and other design elements helps you create a unique visual identity for your app. Along with consistency, you also need to inject a contextual design that stays relevant to the type of product or service that you are promoting.
When consistency and contextual design are followed, you get an app that stands on top for its intended use.
- **Following intuitive navigation and usability testing**
Being able to access the features and functions that you need without having to go and search for them is a requirement rather than a benefit. It helps your users understand your application and heightens the value that it generates.
Intuitive navigation is a must-have for mobile UX and a prime design tip you must follow. Towards the end, you also need to conduct a non-functional test for usability to understand the effectiveness of your app.
With usability testing, you can detect flaws and quickly rectify errors you may not have seen during development.
- **Effective use of push notifications**
Anything in abundance can cause a problem; the same can be said about push notifications. These notifications are meant to provide information and generate a reaction. Therefore, they shouldn’t be classified as a nuisance by the user.
Striking a balance between informing the user and understanding their preferences is the efficient way to utilize push notifications. Messages you provide for the user must be engaging, valuable and personalized.
- **Creating a single-screen experience**
While smartphones offer split-screen features, users still need to adapt. The information generated through split screens may not be appealing to all users, and decision-makers are returning to developing the single-screen experience.
With the single-screen method, you can effectively place information and navigate your users to the relevant spaces. Applications that provide financial services and more are better used for the single-screen method due to the heavy load of information that must be presented.
- **Utilizing video content**
The use of video content is projected to rise to 80% by 2028, making it an essential requirement for your app. However, its usage, as a trend for mobile UX, is driven by the portrait mode rather than other forms.
Considering how users interact with videos, it is essential to present them in a manner that drives home the point of your video. For this exact purpose, you require the expert services of a [top video production agency.](https://www.peppersquare.com/video-production/)
From building a compelling UI to ensuring that the intricacies of design are maintained, a mobile app needs everything it requires to succeed.
| pepper_square |
1,872,032 | Entity FrameWork | Entity Framework (EF) is an object-relational mapper (ORM) for .NET, developed by Microsoft. It... | 0 | 2024-05-31T12:10:38 | https://dev.to/mohamedabdiahmed/entity-framework-4e37 | csharp, aspdotnet, entityframework, dotnetcore | Entity Framework (EF) is an object-relational mapper (ORM) for .NET, developed by Microsoft. It enables .NET developers to work with a database using .NET objects, eliminating the need for most of the data-access code that developers usually need to write. EF allows developers to focus on writing business logic by providing a higher-level abstraction for database operations.
 | mohamedabdiahmed |
1,872,031 | Lotto Date calculator Draw | By using pure javascript, i want to predict a future date and past date based on the user input date.... | 0 | 2024-05-31T12:06:02 | https://dev.to/raj_sriselvam_35ccdbe7a7/lotto-date-calculator-draw-57kl | javascript | Using pure JavaScript, I want to predict a future date and a past date based on a user-input date, considering that the lotto draw happens every Wednesday and Saturday at 8 pm.
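One way the desired logic could be sketched in plain JavaScript (the Wednesday/Saturday draw days and the 8 pm draw time are taken from the description; adjust as needed):

```javascript
const DRAW_DAYS = [3, 6]; // Date.getDay(): Wednesday = 3, Saturday = 6
const DRAW_HOUR = 20;     // 8 pm local time

// Next draw strictly after `from`.
function nextDraw(from) {
  const d = new Date(from);
  d.setHours(DRAW_HOUR, 0, 0, 0);
  while (!DRAW_DAYS.includes(d.getDay()) || d <= from) {
    d.setDate(d.getDate() + 1);
  }
  return d;
}

// Most recent draw strictly before `from`.
function prevDraw(from) {
  const d = new Date(from);
  d.setHours(DRAW_HOUR, 0, 0, 0);
  while (!DRAW_DAYS.includes(d.getDay()) || d >= from) {
    d.setDate(d.getDate() - 1);
  }
  return d;
}
```

For example, `nextDraw(new Date())` gives the upcoming draw; `setDate` handles month and year rollover automatically.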
I have the code logic, but I need some help to finish it. | raj_sriselvam_35ccdbe7a7 |
1,870,293 | Monthly Challenge: Mid-Year Check-In — Recharge and Refocus for an Amazing Second Half! | Have you ever felt like the year was flying by? It's already halfway through 2024! In January, we... | 0 | 2024-05-31T12:00:00 | https://dev.to/virtualcoffee/monthly-challenge-mid-year-check-in-recharge-and-refocus-for-an-amazing-second-half-2k4c | challenge, community, goals, motivation | Have you ever felt like the year was flying by? It's already halfway through 2024!

In January, we kicked off the new year with the "[New Year, New Goals](https://dev.to/virtualcoffee/join-virtual-coffee-in-new-year-new-goals-241m)" challenge.
This month is the perfect time to pause and reflect on your goals. The "[Mid-Year Check-In](https://virtualcoffee.io/monthlychallenges/june-2024)" challenge is your chance to assess your progress, learn from experiences, and reignite your motivation for the remaining months.
**Grab that pen, and let's dive in!**

- **Victorious Achievements**: What goals have you achieved so far? Did you master a new skill, hit a personal target, or finally tackle that project you've been putting off? Write them down and celebrate these wins! Recognizing your accomplishments is a powerful motivator that will move you forward.
- **Learning from Setbacks**: Did something not go according to plan? It happens to the best of us! Instead of feeling discouraged, view these experiences as valuable lessons. Ask yourself: What went wrong? What could I have done differently? This self-reflection will equip you to adjust your approach and avoid similar pitfalls in the future.
- **Unfinished Business**: Think about the goals that still remain. Are they within reach? Do they need some tweaking? Write down the things you want to achieve by the end of December. Be specific and set clear deadlines to stay motivated.
- **Shifting Goals**: Life throws curveballs sometimes, and your priorities might have shifted since the beginning of the year. That's okay! Revisit your initial goals and adjust them as needed. Remember, your goals should reflect where you are now and where you want to be.
We understand that staying motivated can be challenging. Don't go it alone! In the comment section below, share your achievements, learnings, remaining goals, and/or new goals you want to accomplish by the end of the year. We're here to support and encourage you and keep you moving forward.
So, are you ready to take charge of your year? Let's do this together!
 | adiatiayu |
1,872,028 | What are the Benefits of Regression Test Automation? | One of the fundamental processes in software development and maintenance is regression testing. This... | 0 | 2024-05-31T11:59:36 | https://itsblogstime.com/benefits-of-regression-test-automation/ | regression, test, automation | 
One of the fundamental processes in software development and maintenance is regression testing. This testing ensures that, after changes or new features are implemented, the existing functionality still works correctly. The process is often carried out manually. But, in addition to being labor-intensive, it is also prone to mistakes.
That is why many teams prefer automating the procedure, as automation provides multiple benefits to any software development project. We’ll talk about the advantages of regression test automation in this blog post, as well as why it’s becoming more and more important for contemporary development teams.
**Increased Test Coverage and Accuracy**
The essence of automated regression testing lies in the ability to repeatedly and consistently test all critical paths and scenarios. In practice, such coverage is difficult to achieve with manual methodologies due to limited time and human resources.
**Cost-effective and Time-saving**
The most important advantage of regression test automation is the economy of time and money. Manual regression testing is labor-intensive and consumes a great deal of time and human resources, particularly with large and sophisticated software applications. Automated regression testing allows software development teams to run tests faster and more efficiently, freeing people up for more important and challenging work, like building new features or innovative solutions.
**Early Detection of Defects**
Regression test automation helps software teams identify and fix errors early in the development process, before they can propagate and while they are still cheap to correct. With automated regression tests integrated into the CI/CD pipeline, severe bugs can be identified and resolved the moment they appear, reducing the risk of releasing defective software to customers.
**Improved Software Quality**
Automation of regression testing helps the software teams maintain applications at the required quality level at all times. The automated test scripts can be run regularly or after even a small code change to get quick feedback on potential problems or regressions.
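As a minimal, tool-agnostic illustration, an automated regression test is simply a script that re-checks already-shipped behavior after every change — for example with Python's built-in `unittest` (the function and values here are hypothetical):

```python
import unittest

def apply_discount(price, percent):
    """Pricing behavior that earlier releases already shipped."""
    return round(price * (1 - percent / 100), 2)

class DiscountRegressionTest(unittest.TestCase):
    # Pins down existing behavior so any change that breaks it
    # fails the suite immediately instead of reaching customers.
    def test_standard_discount(self):
        self.assertEqual(apply_discount(100.0, 20), 80.0)

    def test_zero_discount_is_identity(self):
        self.assertEqual(apply_discount(59.99, 0), 59.99)
```

Running a suite like this with `python -m unittest` on every commit, or from a CI/CD job, is what turns a one-off check into continuous regression coverage.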
**Scalability and Maintainability**
The number of test cases that must be covered to conduct a complete regression test skyrockets as software applications grow more complicated and gain functionality. Managing and running these tests manually quickly becomes impractical, whereas an automated suite scales with the application.
**Parallel Execution and Faster Feedback**
Most of the new and modern regression test automation frameworks and tools offered in the market come with parallel execution support. This will help run multiple test cases simultaneously with different machines or environments, which will significantly reduce the total test execution time.
**Conclusion**
Regression testing is being revolutionized by Opkey, a test automation platform driven by artificial intelligence. For enterprises, this platform drastically cuts expenses, labor, and time-to-market. Test cases may be easily created by business users and QA analysts because of its user-friendly, no-code interface. More than 30,000 ready-to-use test cases covering more than 12 ERP systems are pre-loaded onto the platform, allowing regression test coverage to grow quickly.
It suggests and ranks pertinent test cases in accordance with its change impact analysis capabilities, which offer insights into how application modifications affect business operations. Additionally, Opkey uses self-healing script technology, which minimizes maintenance efforts by over 80% by automatically mending test scripts in the event of failures. This feature enables users to fix faulty tests with a single click. | rohitbhandari102 |
1,872,027 | Top Ionic Interview Questions and Answers | If you’re a fresher preparing for an Ionic Framework interview, it’s essential to familiarize... | 0 | 2024-05-31T11:59:27 | https://dev.to/lalyadav/top-ionic-interview-questions-and-answers-46gh | ionic, programming, javascript, html | If you’re a fresher preparing for an **[Ionic Framework interview](https://www.onlineinterviewquestions.com/ionic-framework-interview-questions-answers/)**, it’s essential to familiarize yourself with commonly asked questions. Here, we’ve compiled a list of the top 25 Ionic Framework interview questions along with their answers to help you ace your interview.

**Q1. What is Ionic Framework?**
Ans: Ionic Framework is an open-source UI toolkit for building cross-platform mobile applications using web technologies like HTML, CSS, and JavaScript. It allows developers to create native-like experiences across multiple platforms with a single codebase.
**Q2. What are the key features of Ionic Framework?**
Ans: Key features of Ionic Framework include:
Cross-platform compatibility
Built-in UI components
Theming and customization
Native device features integration
Performance optimization
Community support and plugins
**Q3. Explain the difference between Ionic and Cordova.**
Ans: Ionic is a UI toolkit that provides a set of pre-designed UI components and tools for building mobile applications, while Cordova is a platform for building mobile applications using web technologies by wrapping them in a native container.
**Q4. What is Angular in the context of Ionic Framework?**
Ans: Angular is a popular JavaScript framework for building web applications, and Ionic Framework is built on top of Angular. Angular provides features like data binding, dependency injection, and routing, which are leveraged by Ionic for building robust mobile applications.
**Q5. What is Capacitor, and how does it relate to Ionic Framework?**
Ans: Capacitor is a cross-platform runtime and native API layer for web applications. It enables web developers to build native mobile applications using web technologies and deploy them to multiple platforms. Capacitor is commonly used in conjunction with Ionic Framework for building cross-platform mobile apps.
**Q6. What is the role of CLI in Ionic Framework development?**
Ans: CLI (Command Line Interface) in Ionic Framework is a powerful tool that streamlines development tasks such as project setup, code scaffolding, building, testing, and deployment. It provides commands for creating, serving, and building Ionic applications, making development faster and more efficient.
**Q7. Explain the concept of lazy loading in Ionic Framework.**
Ans: Lazy loading is a technique used to improve the performance of Ionic applications by loading modules, components, or resources only when they are needed. In Ionic Framework, lazy loading is achieved by dynamically loading modules and components when the user navigates to a specific route, reducing initial load times.
**Q8. How can you handle device-specific features in Ionic applications?**
Ans: Ionic provides plugins and APIs for integrating device-specific features such as camera, geolocation, accelerometer, and push notifications into your applications. These plugins allow developers to access native device capabilities using JavaScript code and provide a consistent user experience across different platforms.
**Q9. What is the significance of Ionic Native in Ionic Framework development?**
Ans: Ionic Native is a library of TypeScript wrappers for Cordova/PhoneGap plugins and other native device APIs. It simplifies the process of integrating native device features into Ionic applications by providing a consistent and easy-to-use API layer that abstracts away platform differences.
**Q10. How do you debug Ionic applications?**
Ans: Ionic applications can be debugged using browser developer tools like Chrome DevTools or Firefox Developer Tools. By running your Ionic app in a browser, you can inspect and debug HTML, CSS, and JavaScript code, set breakpoints, and analyze network requests for troubleshooting and optimization. | lalyadav |
1,872,026 | 10 FREE JavaScript Courses to Boost Your Skills! | JavaScript is a handy programming language for making websites interactive. Whether you're just... | 0 | 2024-05-31T11:58:58 | https://dev.to/aqsa81/10-free-javascript-courses-to-boost-your-skills-5552 | javascript, webdev, programming, react | JavaScript is a handy programming language for making websites interactive. Whether you're just starting or want to get better, there are lots of free online courses to help. Here's a list of 10 great courses that cover different parts of JavaScript, from the basics to more advanced stuff.
---
## 1. JavaScript Design Patterns
- **Course Link:** [JavaScript Design Patterns](https://imp.i115008.net/9WZYvy)
- **What You'll Learn:** This course teaches you how to organize your JavaScript code better. You'll learn about patterns like Module, Singleton, Observer, and Revealing Module. These help you write code that's easier to understand and work with.
## 2. JavaScript Promises
- **Course Link:** [JavaScript Promises](https://imp.i115008.net/gbQAG9)
- **What You'll Learn:** Here, you'll learn about promises, which are super helpful for dealing with things that take time in JavaScript. You'll see how to write cleaner code for tasks like loading data or making requests to a server.
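For a taste of what the course covers: a promise wraps a value that arrives later, and `.then()`/`.catch()` say what to do once it does. A hypothetical sketch:

```javascript
// A promise represents a value that will arrive later, e.g. from a server.
function fetchUserName(id) {
  return new Promise((resolve, reject) => {
    if (typeof id !== "number") {
      reject(new Error("id must be a number"));
      return;
    }
    // Simulate a slow lookup such as a network request.
    setTimeout(() => resolve(`user-${id}`), 10);
  });
}

// .then() runs once the value is ready; .catch() handles failures.
fetchUserName(42)
  .then((name) => console.log(name)) // logs "user-42"
  .catch((err) => console.error(err.message));
```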
## 3. JavaScript Testing
- **Course Link:** [JavaScript Testing](https://imp.i115008.net/3P05ZB)
- **What You'll Learn:** Testing is like checking if your code works properly. This course shows you different ways to test your JavaScript code using tools like Jasmine and Mocha. Testing helps you catch mistakes early and make sure your code does what it's supposed to.
## 4. Object-Oriented JavaScript
- **Course Link:** [Object-Oriented JavaScript](https://imp.i115008.net/Rydxng)
- **What You'll Learn:** Object-Oriented Programming (OOP) is a way to organize your code so it's easier to manage. This course teaches you how to use objects, classes, and other OOP concepts in JavaScript. It helps you write code that's more organized and reusable.
## 5. Intro to JavaScript
- **Course Link:** [Intro to JavaScript](https://imp.i115008.net/gb7zDv)
- **What You'll Learn:** If you're new to JavaScript, this course is perfect for you. It covers the basics like how to write JavaScript code, use different data types, and create functions. You'll get a solid foundation for learning more about JavaScript.
## 6. ES6 – JavaScript Improved
- **Course Link:** [ES6 – JavaScript Improved](https://imp.i115008.net/WD3JEX)
- **What You'll Learn:** ES6 introduced some cool new features to JavaScript that make coding easier and more fun. In this course, you'll learn about things like arrow functions, template literals, and classes. These features help you write cleaner and more modern JavaScript code.
## 7. Intro to AJAX
- **Course Link:** [Intro to AJAX](https://imp.i115008.net/DVzAad)
- **What You'll Learn:** AJAX is a way to send and receive data from a server without refreshing the whole web page. This course shows you how to use AJAX with JavaScript to make your web pages more dynamic and interactive.
## 8. Asynchronous JavaScript Requests
- **Course Link:** [Asynchronous JavaScript Requests](https://imp.i115008.net/Vy3RdE)
- **What You'll Learn:** Asynchronous programming is a bit tricky but very useful in JavaScript. This course dives into how to manage asynchronous tasks like loading data or making requests. You'll learn about things like callbacks and promises, which help you write code that doesn't get stuck waiting around.
## 9. JavaScript and the DOM
- **Course Link:** [JavaScript and the DOM](https://imp.i115008.net/MX3oVq)
- **What You'll Learn:** The DOM is like a map of your web page that JavaScript can read and change. In this course, you'll learn how to use JavaScript to interact with the DOM. You'll see how to find elements, handle events like clicks, and update content on your web page.
## 10. Website Performance Optimization
- **Course Link:** [Website Performance Optimization](https://imp.i115008.net/QOXQVA)
- **What You'll Learn:** Nobody likes a slow website! This course teaches you how to make your JavaScript-powered websites faster and more efficient. You'll learn tricks for reducing loading times, using fewer resources, and making your site feel snappy for users.
---
**Also, Check- [Udacity FREE Courses Python, SQL, Product Design, C++, and UI/UX](https://www.mltut.com/udacity-free-courses-on-machine-learning/)**
## Conclusion
These free JavaScript courses are full of helpful stuff to help you become a better JavaScript developer. Whether you're interested in organizing your code better, learning new features, or making your websites faster, there's something here for you. So dive in, have fun, and get ready to level up your JavaScript skills! | aqsa81 |
1,872,025 | Kubernetes Deployment Best Practices: Ensuring Stability and Scalability | Kubernetes, a popular tool for managing containerized applications, has emerged as the industry... | 0 | 2024-05-31T11:58:08 | https://dev.to/rachgrey/kubernetes-deployment-best-practices-ensuring-stability-and-scalability-14e6 | kubernetes, deployment, programming | Kubernetes, a popular tool for managing containerized applications, has emerged as the industry standard. It provides a robust framework for seamlessly deploying, scaling, and managing applications. Given its complex nature, it is imperative to stick to best practices to ensure your applications' security, scalability, and reliability. This article delves into the essential best practices for deploying Kubernetes, offering comprehensive guidance on optimizing your Kubernetes environment to meet your needs and requirements.
## 1. Namespace Utilization
In Kubernetes, namespaces are essential for managing resources and keeping environments separate. Using different namespaces for development, staging, and production can prevent resource conflicts and ensure that each environment runs independently. This separation allows for individual policies, resource quotas, and access controls, making management more efficient. Additionally, using namespace-specific resource quotas helps ensure fair resource allocation across the cluster, preventing one team from using up all the resources. This method encourages a more productive and efficient workflow while optimizing the use of resources.
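As a minimal sketch (the `staging` name and the quota figures are hypothetical), a dedicated namespace with its own resource quota might look like this:

```yaml
# Hypothetical example: a "staging" namespace with a namespace-wide resource quota
apiVersion: v1
kind: Namespace
metadata:
  name: staging
---
apiVersion: v1
kind: ResourceQuota
metadata:
  name: staging-quota
  namespace: staging
spec:
  hard:
    requests.cpu: "4"      # total CPU all pods in this namespace may request
    requests.memory: 8Gi   # total memory all pods may request
    limits.cpu: "8"        # hard ceilings on aggregate limits
    limits.memory: 16Gi
```

Applied with `kubectl apply -f`, the quota caps aggregate resource consumption for everything in `staging`, so one team cannot starve the rest of the cluster.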
## 2. Effective Use of Labels and Annotations
Kubernetes offers robust annotation and labeling features for effective resource management and monitoring. Applying key-value pair labels to all Kubernetes objects makes filtering and selecting easier. Labels like app, version, environment, and team aid in organization and resource management. Annotations hold non-identifying metadata, providing additional data for deployment notes, documentation URLs, or specific monitoring configurations. This metadata is valuable for resource management and location tracking and provides important context for audits and troubleshooting.
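To make this concrete, here is a sketch of Deployment metadata; the app name, label values, and annotation URL are all placeholders:

```yaml
# Hypothetical example: labels for selection and filtering, annotations for context
apiVersion: apps/v1
kind: Deployment
metadata:
  name: checkout-service            # placeholder name
  labels:
    app: checkout
    version: v1.4.2
    environment: production
    team: payments
  annotations:
    docs-url: "https://wiki.example.com/checkout"     # placeholder documentation link
    deployment-notes: "Rolled out after the Q2 load test"
# spec omitted for brevity
```

Labels drive selectors (for example, `kubectl get pods -l app=checkout,environment=production`), while annotations carry free-form metadata that tools and humans can read but Kubernetes does not use for selection.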
## 3. Configuration Management
Managing configuration data is essential for applications running in Kubernetes. Kubernetes provides ConfigMaps and Secrets to handle this data efficiently. Non-sensitive configuration data stored in ConfigMaps includes environment variables, command-line arguments, and configuration files. This separation helps with better portability and management by keeping the configuration separate from the application code. Secrets manage sensitive data such as TLS certificates, API keys, and passwords.
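A minimal sketch, with placeholder names and values, of how the two objects differ:

```yaml
# Hypothetical ConfigMap: non-sensitive configuration, kept out of the image
apiVersion: v1
kind: ConfigMap
metadata:
  name: app-config
data:
  LOG_LEVEL: "info"
  config.properties: |
    feature.newCheckout=true
---
# Hypothetical Secret: sensitive values; stringData lets you write plain text
# and Kubernetes stores it base64-encoded
apiVersion: v1
kind: Secret
metadata:
  name: app-secrets
type: Opaque
stringData:
  API_KEY: "replace-me"   # placeholder, never commit real keys
```

Both can then be mounted as files or exposed as environment variables in a pod spec, keeping configuration and credentials out of the application code.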
## 4. Resource Requests and Limits
Setting container resource requests and limits in a Kubernetes cluster is necessary to use resources efficiently and prevent conflicts. Resource requests specify the minimum CPU and memory a container needs, helping Kubernetes decide where to place pods. Resource limits define the maximum CPU and memory a container may use, preventing resource monopolization and maintaining application stability and performance.
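For illustration, a fragment of a pod spec (the image name and the figures are placeholders) showing both settings:

```yaml
# Hypothetical container fragment: requests guide scheduling, limits cap usage
containers:
  - name: web
    image: example/web:1.0       # placeholder image
    resources:
      requests:
        cpu: 250m      # scheduler reserves a quarter of a core for placement
        memory: 256Mi
      limits:
        cpu: 500m      # CPU is throttled beyond half a core
        memory: 512Mi  # the container is OOM-killed if it exceeds this
```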
## 5. Readiness and Liveness Probes
To keep your applications stable in Kubernetes, run regular health checks. Kubernetes offers two probe types: liveness and readiness. Liveness probes let Kubernetes restart unhealthy containers so your application recovers automatically from problems, while readiness probes ensure a container receives traffic only once it is ready to serve it. Using both, you keep your apps healthy and prepared to handle traffic.
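A sketch of both probes on one container; the `/healthz` and `/ready` endpoints are hypothetical and must actually exist in your application:

```yaml
# Hypothetical container fragment with both probe types
containers:
  - name: web
    image: example/web:1.0       # placeholder image
    livenessProbe:               # failure here restarts the container
      httpGet:
        path: /healthz
        port: 8080
      initialDelaySeconds: 10
      periodSeconds: 15
    readinessProbe:              # failure here removes the pod from Service endpoints
      httpGet:
        path: /ready
        port: 8080
      periodSeconds: 5
```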
## 6. Rolling Updates and Rollbacks
Kubernetes supports rolling updates and rollbacks, which are crucial for maintaining application stability and availability during deployments. With rolling updates, you can gradually release updated versions of your application, reducing the possibility of errors and ensuring no downtime during the update process. This approach enables you to monitor the update and identify any issues before fully implementing the changes. By minimizing downtime and ensuring application stability, this feature provides a seamless user experience.
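As a sketch (replica count and thresholds are illustrative), the rolling-update strategy is configured on the Deployment:

```yaml
# Hypothetical Deployment fragment controlling the pace of a rolling update
spec:
  replicas: 4
  strategy:
    type: RollingUpdate
    rollingUpdate:
      maxUnavailable: 1   # at most one pod may be down during the update
      maxSurge: 1         # at most one extra pod above the desired count
```

If the new version misbehaves, `kubectl rollout undo deployment/<name>` reverts to the previous revision.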
## 7. Pod Disruption Budgets
When you're making upgrades or doing maintenance work, Pod Disruption Budgets (PDBs) help to keep your application running. PDBs set out the minimum number or percentage of pods that need to keep working when there are planned disruptions, like cluster upgrades or node maintenance. Setting PDBs ensures your application stays usable and effective even if some pods are temporarily unavailable. This is important for keeping your application available and handling interruptions without causing big user problems.
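A minimal sketch (the name, threshold, and label are placeholders):

```yaml
# Hypothetical PDB: keep at least two pods up during voluntary disruptions
apiVersion: policy/v1
kind: PodDisruptionBudget
metadata:
  name: web-pdb
spec:
  minAvailable: 2          # or use maxUnavailable instead
  selector:
    matchLabels:
      app: web
```

Node drains and other voluntary evictions are then blocked whenever they would drop the matching pods below two.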
## 8. Networking and Service Discovery
In a Kubernetes cluster, microservices communicate with each other using efficient networking and service discovery. Kubernetes DNS makes it easy for services to find each other using domain names, simplifying communication between services and reducing the complexity of maintaining service endpoints. Consider utilizing a service mesh such as Istio or Linkerd for improved observability, security features, and traffic management. A service mesh with strong security policies, automated load balancing, and precise control over traffic flow will help your microservices architecture operate more reliably and efficiently overall.
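For instance, a plain Service (names here are placeholders) is all that built-in DNS discovery needs:

```yaml
# Hypothetical Service: other workloads reach it by name via cluster DNS
apiVersion: v1
kind: Service
metadata:
  name: orders
  namespace: shop
spec:
  selector:
    app: orders
  ports:
    - port: 80          # port clients connect to
      targetPort: 8080  # port the pods listen on
```

Inside the cluster, this is reachable as `orders.shop.svc.cluster.local` (or simply `orders` from within the same namespace), so callers never need to track pod IPs.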
## Conclusion
Creating a reliable, scalable, and secure environment with Kubernetes deployment involves following best practices. This includes optimizing resource use, preserving application health, and efficiently managing namespaces, labels, annotations, and configurations. Conducting thorough health checks, rolling updates, and establishing Pod Disruption Budgets are essential to enhance reliability. Protect your activities through careful monitoring, logging, and strict security measures. Collaborating with [Kubernetes consultants](https://www.bacancytechnology.com/kubernetes-consulting-services) can provide valuable insights and ensure your deployments are optimized for success, allowing your business to utilize Kubernetes' potential fully.
*Author: rachgrey*
---

# Chrome Hearts Sweatpants: A Blend of Comfort and Luxury

*Published 2024-05-31 on [dev.to](https://dev.to/muhammad_waqas_acef025b93/chrome-hearts-sweatpants-a-blend-of-comfort-and-luxury-3mj0) · Tags: chromeheart, chromeheartshoodie, clothing, fashion*

## Overview
Chrome Hearts sweatpants are a shining illustration of how to combine luxury design with casual comfort. The fashion industry has been enthralled with Chrome Hearts’ distinctive products, a brand that is synonymous with luxury and edgy designs. Chrome Hearts have given sweatpants—which are typically thought of as comfortable and carefree clothing—a whole new level of sophistication and style. What, though, is so unique about these sweatpants? [chrome hearts clothing](https://chromeheartusa.com/)
## Chrome Hearts’ Past
When Richard Stark founded Chrome Hearts in 1988, the company first sold leather motorcycle accessories. It developed into a high-end brand throughout time, renowned for its elaborate silver accessories, gothic style, and fine clothing. From its modest origins in Los Angeles, Chrome Hearts has become into a highly regarded worldwide fashion powerhouse.
## The Style of Chrome Hearts
The eye-catching designs of Chrome Hearts are distinguished by their audacious use of leather, sterling silver, and elaborate embellishments. The brand is known for its cross patterns, fleur-de-lis, and dagger iconography, which are frequently adorned with fine details. A devoted fan base has developed around this unique style, as evidenced by the regular sightings of Drake, Gigi Hadid, and Kanye West wearing Chrome Hearts clothing.
## Chrome Hearts Sweatpants: An Essential Piece of Clothing
Chrome Hearts sweatpants have made a name for themselves in the world of upscale streetwear. These are a statement piece, not just another pair of sweatpants. Fashionistas adore them for their comfort, adaptability, and distinctive Chrome Hearts flair, which elevate any ensemble, whether you're taking a leisurely stroll or spending time at home. ([Eric Emanuel](https://eeofficial.shop/))
## Components and Artistry
The outstanding quality of materials utilized in Chrome Hearts sweatpants is one of its distinguishing features. These sweatpants are made of high-quality materials including velvety fleece and blends of soft cotton for optimal comfort. Every stitch demonstrates the painstaking craftsmanship, which guarantees longevity and an opulent feel that never fades.
## Variations in Design
To accommodate a range of interests and inclinations, Chrome Hearts provides a selection of sweatpants in various styles. The original designs have gothic elements and the trademark Chrome Hearts emblem, while limited edition releases frequently have exclusive patterns, colors, and partnerships with other luxury brands. Every pair is an artistic creation, embodying the brand’s dedication to uniqueness and excellence.
## Elegance and Coziness Come Together
Chrome Hearts sweatpants are the ultimate in coziness meeting luxury. They are ideal for people who won't sacrifice style even when dressing casually. The sweatpants' premium materials and careful design combine to create an effortlessly stylish look that lets you enjoy their comfort. This balance is what distinguishes Chrome Hearts sweatpants from the rest.
## Value and Cost
Recognizing the value of what you’re getting is necessary to comprehend Chrome Hearts sweatpants’ price point. These are luxurious sweatpants that have been expertly and carefully made; they are not your typical pair. The premium cost is justified by the superior materials, expert craftsmanship, and prestigious reputation of the brand. Purchasing a pair of Chrome Hearts sweatpants is a popular way for people to become the owners of a piece of fashion history.
## Chrome Hearts Sweatpants Style Guide
Chrome Hearts sweatpants are versatile enough to be styled in a variety of ways. They look great with shoes and a basic t-shirt. If you want to look more put together, pair them with an elegant jacket and accessories. These sweatpants are perfect for going from a casual daytime look to a dressier evening one.
## How to Take Care of Your Chrome Hearts Sweatpants
You must take excellent care of your Chrome Hearts sweatpants if you want them to last. Refrain from using bleach or other harsh chemicals and wash them in cool water with a mild detergent. It is advised to air dry the fabric to preserve its integrity. These sweatpants can last for years in addition to feeling and looking great with proper maintenance.
## Where to Get Chrome Hearts Sweatpants
Genuine Chrome Hearts sweatpants are available at official Chrome Hearts outlets and a few upscale merchants. Reputable internet retailers like SSENSE and Farfetch also have a selection of Chrome Hearts clothing. Purchasing from reliable vendors is essential to guaranteeing that the goods you receive are authentic.
## Authenticity Issues
Due to Chrome Hearts’ widespread appeal, fake goods are regrettably quite common. Buy from authorized stores to avoid falling for counterfeit goods, and be cautious of offers that appear too good to be true. Genuine Chrome Hearts sweatpants are difficult to fake since they include unique tags and branding elements.
## Reviews & Testimonials from Customers
Customers adore Chrome Hearts sweatpants for their design and comfort. A lot of people point out the superior materials and distinctive designs as noteworthy aspects. Testimonials frequently highlight how satisfying it is to acquire a luxury item that skillfully blends comfort and style.
## Chrome Hearts Sweatpants: Why Choose Them?
Selecting Chrome Hearts sweatpants means choosing unmatched elegance and quality. They provide a special fusion of elegance and coziness that is difficult to find elsewhere.

*Author: muhammad_waqas_acef025b93*
---

# Considerations for effective application performance management: Areas to look out for

*Published 2024-05-31 on [dev.to](https://dev.to/manageengineapm/considerations-for-effective-application-performance-management-areas-to-look-out-for-35ok) · Tags: applicationperformance, apm, monitoring, observability*

In the contemporary business landscape, applications are fundamental to an organization's success. Ensuring optimal application performance is therefore paramount. [Application performance monitoring](https://www.manageengine.com/products/applications_manager/application-performance-monitoring.html?utm_source=dev.to&utm_medium=post&utm_id=dev-communitites) (APM) tools have emerged as an indispensable asset, empowering organizations to proactively safeguard application health, meet user expectations, and ultimately drive business growth. This article explores potential challenges that may arise while implementing an APM strategy within your IT application environment.
## 1. The multifaceted nature of APM solutions
APM monitoring tools can be powerful allies in optimizing application health, but their feature-rich nature can also introduce complexity. Several factors, such as feature abundance, deployment choices, integration with third-party software, and customization demands, can contribute to this complexity and heighten the difficulty of developing an APM strategy.
APM tools offer a wide range of functionalities like real-time monitoring, alerting, diagnostics, tracing, analytics, and reporting.
While these features are valuable, organizations must carefully consider their specific needs and choose a tool that provides the right balance of functionality without overwhelming users. Then comes the problem of deployment. Choosing the right deployment option depends on an organization's existing infrastructure and long-term IT strategy. Compatibility issues during integration can hinder the seamless operation of even the best APM tools.
## 2. A tangled web of technologies

Modern business applications reside in a dynamic IT environment characterized by microservices, containers, and hybrid cloud deployments. These advancements, alongside concepts like distributed systems, CI/CD pipelines, and edge computing, offer unprecedented agility and scalability to businesses.
However, managing a diverse technology stack (heterogeneous tech stack) can be complex. Each technology has its own intricacies and relies on others to function. While [microservices](https://www.manageengine.com/products/applications_manager/microservices-monitoring.html?utm_source=dev.to&utm_medium=post&utm_id=dev-communitites) architecture improves fault tolerance by distributing functionalities, it introduces challenges in communication, data consistency, and monitoring. [Hybrid cloud](https://www.manageengine.com/products/applications_manager/hybrid-cloud-monitoring.html?utm_source=dev.to&utm_medium=post&utm_id=dev-communitites) environments introduce complexities like data inconsistency due to varying storage formats and interoperability challenges stemming from incompatible platforms. While DevOps and CI/CD pipelines streamline software delivery, integrating edge computing further increases IT system complexity, creating a potential visibility gap into application operations and transactions.
Fortunately, innovative solutions like service meshes, event-driven architectures, and [distributed tracing](https://www.manageengine.com/products/applications_manager/tech-topics/distributed-tracing.html?utm_source=dev.to&utm_medium=post&utm_id=dev-communitites) can mitigate these complexities in hybrid cloud and edge computing environments. Specifically, to address the [application observability](https://www.manageengine.com/products/applications_manager/application-observability.html?utm_source=dev.to&utm_medium=post&utm_id=dev-communitites) challenges, organizations can leverage powerful Application Performance Monitoring tools. Applications Manager, for instance, offers features like distributed tracing, dependency mapping, AI-powered anomaly detection, and full-stack monitoring capabilities.
## 3. Unstructured, exponentially growing data

Big data in APM can be a double-edged sword. On the one hand, it brings significant benefits like deeper insights and improved decision-making. But on the other hand, dealing with all that information can be a real headache, presenting several challenges that need to be addressed.
The ever-expanding data-sphere imposes challenges in scalability and data analysis. As the volume of data collected by your APM tool grows, traditional infrastructure might struggle to keep up. The sheer amount of data can strain storage capacity, requiring you to invest in additional resources or explore more cost-effective storage solutions.
Processing and analyzing massive datasets can overwhelm your existing infrastructure, causing slowdowns and impacting the overall performance of your APM tool. Data from diverse sources might be stored in separate silos, making it difficult to correlate and analyze it for a holistic view of application performance.
## 4. Implementation challenge one: People
Even with the undeniable benefits of comprehensive APM, some organizational cultures might hesitate to embrace it fully. This reluctance to implement stems from the perception that they would lose control over their applications and a fear of reprisal. If the advantages of APM are not clearly communicated and understood across the organization, some stakeholders might resist its implementation.
Apart from this, it should be noted that implementing APM effectively requires a diverse skillset encompassing software development, system administration, networking, and data analysis. Unfortunately, a lack of expertise in these areas can create blind spots in monitoring, leading to inaccurate analysis and hindering effective problem resolution. These shortcomings can translate into significant financial losses, increased operational costs, and, ultimately, dissatisfied customers.
The good news is that organizations can empower their teams to overcome these challenges. Investing in training programs equips users with the necessary skills to utilize these tools effectively.
## 5. Implementation challenge two: Legacy software
Integrating modern APM solutions with legacy systems can be a hurdle. These older systems, while critical, often lack the built-in monitoring capabilities (instrumentation) and communication protocols on which newer tools rely. This makes it difficult to gain clear visibility into their performance. Obtaining performance data from legacy systems can be a daunting task. It might require significant development effort and involve navigating a mix of older and newer technologies (heterogeneous tech stacks).
## Consequences of ignoring these challenges

Unmitigated APM challenges can have a cascading effect, negatively impacting application performance, user experience, and, ultimately, business outcomes. Let's explore the potential consequences of neglecting these challenges:
- **Deteriorating application performance:** Unidentified performance bottlenecks can silently erode the user experience. This can manifest as sluggish application responsiveness, increased page load times, and potential cart abandonment in e-commerce scenarios. The resulting frustration can negatively impact customer satisfaction, employee productivity, and revenue generation.
- **Escalated downtime risks:** Critical issues that remain undetected can evolve into significant system outages. These outages disrupt core operations, erode user trust, and damage brand reputation. The financial impact of downtime can be substantial, affecting not only IT budgets but also broader business objectives.
- **Inefficient resource utilization:** Without transparent visibility into resource consumption patterns, organizations risk improper resource allocation. Over-provisioning resources leads to unnecessary expenditures and wasted performance potential. Conversely, under-provisioning resources creates bottlenecks and hinders overall application performance.
- **Reactive problem-solving:** Issues that remain hidden until they surface as problems force IT teams to take a reactive approach, which consumes valuable time and resources that could be better directed toward proactive prevention and optimization initiatives.
By proactively acknowledging and addressing these common APM challenges, organizations can unlock a transformative opportunity. APM transcends its role as a mere accountability tool and becomes a powerful catalyst for achieving operational excellence by optimizing application performance and user experiences.
## Break free from these APM challenges with Applications Manager
Tired of fragmented data and hidden bottlenecks hindering your application performance?

[ManageEngine Applications Manager](http://www.manageengine.com/products/applications_manager/?utm_source=dev.to&utm_medium=post&utm_id=dev-communitites) empowers you to optimize applications and ensure flawless user experience—all within a single, user-friendly platform.
## Here's how Applications Manager empowers you:
- **Full-stack visibility:** Gain a holistic view of your application performance—from user interactions to backend infrastructure—with no blind spots.
- **Real-time monitoring:** Identify and address issues instantly with real-time data insights and customizable dashboards.
- **Distributed tracing:** Pinpoint the exact source of performance bottlenecks across complex microservice architectures.
- **AI-powered anomaly detection:** Leverage AI to automatically detect and diagnose performance issues before they impact users.
- **Dynamic scalability:** Scale seamlessly to accommodate your growing needs without compromising performance.
- **Agentless or agent-based architecture:** Eliminate the need for intrusive agents, simplify deployment, and reduce overhead. For deeper digging, make use of our agents for [end user experience monitoring](https://www.manageengine.com/products/applications_manager/end-user-experience-monitoring.html?utm_source=dev.to&utm_medium=post&utm_id=dev-communitites), APM, and real user monitoring.
- **Easy integration:** Integrate seamlessly with your existing tools and technologies for a unified view of your IT landscape.
- **Monitor over 150 enterprise technologies:** Gain comprehensive insights into all your applications, regardless of their underlying technology.
- **Customizable dashboards:** Create collaborative spaces where teams can share insights, identify bottlenecks, and drive proactive optimization.
- **Transparent pricing:** Know exactly what you're paying for with our clear and predictable [pricing](http://www.manageengine.com/products/applications_manager/pricing.html?utm_source=dev.to&utm_medium=post&utm_id=dev-communitites) structure.
- **Free training sessions:** Bridge the skill gap and equip your team with the knowledge to unlock the full potential of Applications Manager.
- **Continuous support:** Rest assured that our dedicated team is always available to help you get the most out of Applications Manager.
Ready to transform your application performance? [Start your 30-day, free trial of Applications Manager today and experience the difference!](https://www.manageengine.com/products/applications_manager/download.html??utm_source=dev.to&utm_medium=post&utm_id=dev-communitites)
*Author: angies*