id | title | description | collection_id | published_timestamp | canonical_url | tag_list | body_markdown | user_username |
|---|---|---|---|---|---|---|---|---|
1,896,333 | Secret to Code better | So yes, today I have found something which is super interesting. It has worked for me today and... | 0 | 2024-06-21T18:25:01 | https://dev.to/arjungitdotcom/secret-to-code-better-175h | productivity, learning, coding | So yes, today I have found something which is super interesting. It has worked for me today and possibly will for you too. We all know that sitting in one stretch is a pain and, more importantly, unproductive, so we all take breaks. The next time you take a break, please be sure to drink water. Today I worked for four productive hours, and the reason is that I drank water at regular intervals. I found myself more refreshed and was able to work seamlessly. It may be a simple thing, but if you can implement this as part of your work routine, I guarantee positive results. Happy coding!!! | arjungitdotcom |
1,896,331 | How to learn programming without a computer? | _Note: This is not a professional nor informative article. I wrote it while sitting in the bathroom.... | 0 | 2024-06-21T18:21:06 | https://dev.to/thisishemmi/how-to-learn-to-program-without-a-computer-5eb3 | learning, mobile, github, codenewbie | **_Note: This is not a professional nor an informative article. I wrote it while sitting in the bathroom._**
Picture this: Morocco, a kid born into a low-income family... well, let's be real, a very poor one. That was me. We're talking "can't afford a computer because it costs a year's worth of work" level of poor. But hey, life's full of surprises, right?
Fast forward to 2019. I'm 18, fresh out of high school with a diploma specializing in Life and Earth Sciences (what we call SVT - Sciences de la Vie et de la Terre), and I've just scored my first-ever phone - a Samsung Galaxy Grand Prime. This brick of a device? It became my ticket to the coding world.
Now, I had my sights set on the fancy SMI program (that's Science Mathématiques et Informatique, the equivalent of a Computer Science program). But here's the kicker - my physics and math grades were about as impressive as a wet noodle. Dream crushed? Nah, just redirected.
So, I did what any reasonable person would do - I enrolled in biology. Why? Two words: government grant. Also, my original speciality. We're talking a whopping $190 every three months. Cha-ching! First year of college rolls around, and guess what I'm studying? Not biology, that's for sure. My brain had gone rogue, obsessing over HTML, CSS, and JavaScript. Biology textbooks? More like coasters for my endless cups of coffee.
After a year of this charade, I pulled a 180 and switched to English studies. Why? Because every scientific subject in Morocco is taught in French without having other options, and let's just say French and I have some... historical beef. That's a story for another day.
This English gig? It blew the doors wide open. Suddenly, I'm swimming in top-notch tutorials, discovering tech I'd never even heard of in the Arabic dev community. Before I knew it, I'm juggling React, PHP, C++, C, Java, Python, Kotlin - you name it, I was trying to learn everything. I watched the full 100s playlist of Fireship. One tiny problem though - I knew all these languages in theory, but I had no way to actually learn and run them. Frustrating? You bet your last dirham it was.
Enter Termux - the tiny terminal that could. This bad boy let me run Python on my phone. Then Ubuntu. My mind was blown. Six months and two grants later, I upgraded to a used Samsung S9 for a cool $150. Cue the victory dance! It was a big deal at the time.
With my "new" S9, I'm running Linux GUI through VNC, and even managed to get VSCode up and running. Sure, my phone was basically a handheld radiator at this point, but who cares about a little third-degree burn when you're coding? I dove into design, React, Svelte, more C/C++, and then struck gold - ThePrimeagen's algorithm course on Frontend Masters. This dude, along with Theo, became my coding heroes. Thanks to them, I leveled up to TypeScript and Rust.
So there I was, feeling pretty good about my TypeScript and Rust skills, when I stumbled upon the Tauri framework. GUI desktop apps on a phone? Challenge accepted! But let me tell you, Rust was being a real pain in the... well, you know. My poor phone would practically have a meltdown trying to compile Rust projects. It was like asking a toaster to launch a space shuttle.
This setback led to a temporary halt in my coding endeavors. However, an unexpected opportunity arose during a visit to a local coffee shop. Overhearing a conversation about student management challenges at a private school, I approached the speaker and proposed developing a desktop application to address these issues. I smooth-talked my way into a "paid internship" (spoiler alert: it wasn't) to develop a desktop app for managing students, payments, classes - the whole shebang. But how was I gonna pull this off without a computer that could handle Rust? Enter GitHub's student pack and Codespaces. Free access to a virtual machine I could control from my phone using VNC. Finally, I could compile that b***h (Rust, I mean). Take that, limited resources!
Three months of blood, sweat, and probably a few tears later, I proudly presented my partial work to Mr. Coffee Shop. I asked for a partial payment to keep the project going (internet ain't cheap in these parts, folks). His response? "There's no such thing as a paid internship." Ouch. Talk about a plot twist.
Just when I thought things couldn't get more challenging, life threw me a curveball. My father was diagnosed with cancer. Coding? It took a backseat. For two long years, my focus shifted entirely to family.
After my father's recovery, I decided to jump back into the coding world with a bang. My brilliant idea? A tutorial on creating a sidebar in Svelte. Why? Because apparently, not many people had titled a video quite like that. Niche markets, am I right?
Recording this beast of a tutorial in Codespace was... an adventure. VNC lag? Check. VS Code eating RAM for breakfast? Double-check. I switched to Nvim (shoutout to ThePrimeagen for this) and re-recorded that tutorial so many times, I probably could've recited it in my sleep. Ten takes or so later, I'd burned through all my free core-hours for the month. And it was only the beginning of the month! Talk about pressure.
With one last shot at glory, I managed to record a solid 2-hour video. Was it perfect? Of course not.
After I recorded the video, I found myself in a classic "expectation vs. reality" situation. Picture this: me, armed with my trusty phone and a dream of YouTube stardom, downloading not one, not two, but FIVE video editing apps. Why? Because apparently, I thought I was editing the next "Avengers" movie instead of a simple coding tutorial.
These apps had the audacity to demand anywhere from 9 to 30 hours to spit out the final video. I mean, come on! By that time, I could've learned a whole new programming language or watched an entire season of "Silicon Valley" - twice!
But then, I stumbled upon a feature in Alight Motion called "bookmarks". These little lifesavers are basically time markers you can plop down in your video. And here's where it gets good - the app let me export the project as XML.
Now, remember that C++ I learned way back when my phone was doubling as a space heater? Time for it to shine! I whipped up a script faster than you can say "sudo make me a sandwich". This bad boy took those XML bookmarks, converted them into an array of "hh:mm:ss.SS" formatted time values, and used them to slice and dice my video like a coding ninja. Each pair of values became the start and end of a clip, and voilà - all the good parts extracted and stitched together into one beautiful video file.
So there you have it, folks. Who needs fancy equipment when you've got a smartphone, a dream, and a healthy dose of coding ingenuity? Next stop: YouTube channel, viral video, and of course, becoming a millionaire. I mean, that's how it works, right? Just add some sick beats, maybe a lens flare or two, and watch the money roll in. Silicon Valley, here I come!
(Disclaimer: Results may vary. Millionaire status not guaranteed. But hey, at least I'll have a cool video to show for it!) | thisishemmi |
1,896,407 | Tutorial: Simple 3D Cube in Rust 🦀 | Hello everyone, welcome back to my blog or if you're new here, hi, I'm Eleftheria and I'm learning... | 0 | 2024-06-22T09:26:18 | https://eleftheriabatsou.hashnode.dev/tutorial-simple-3d-cube-in-rust | rust, ascii | ---
title: Tutorial: Simple 3D Cube in Rust 🦀
published: true
date: 2024-06-21 18:19:40 UTC
tags: Rust, ASCII
canonical_url: https://eleftheriabatsou.hashnode.dev/tutorial-simple-3d-cube-in-rust
cover_image: https://dev-to-uploads.s3.amazonaws.com/uploads/articles/c0ehztytenqey9f1roc9.jpeg
---
Hello everyone, welcome back to my blog or if you're new here, hi, I'm Eleftheria and I'm learning Rust. Today we are going to create a simple spinning 3D cube using Rust with as few dependencies as possible. This project is beginner-friendly, and at the end of this article you can find all the code on GitHub.
## Introduction
This is what the cube will look like:
```rust
//! 4 +------+ 6
//! /| /|
//! 5 +------+ | 7
//! | | | |
//! 0 | +----|-+ 2
//! |/ |/
//! 1 +------+ 3
```
Our cube is going to be a 3D object with eight vertices and six faces.
There are some decent algebra libraries for doing 3D but in our case, it's not necessary to use a library. **Let's start coding:**
```rust
#[derive(Debug, Clone, Copy)]
struct Matrix([[f32; 4]; 4]);
#[derive(Debug, Clone, Copy)]
struct Vector([f32; 4]);
```
The `Matrix` is an array of 4 arrays of 4 numbers: a 4x4 matrix. Each of the inner arrays `[f32; 4]` contains 4 floating-point numbers which represent a column of our matrix. Thinking in terms of columns rather than rows is a common convention in computer graphics.
The `Vector` is also relatively simple. It's an array of 4 numbers.
Let's also define the vertices:
```rust
const VERTICES : [Vector; 8] = [
Vector([-1.0, -1.0, -1.0, 1.0]),
Vector([-1.0, -1.0, 1.0, 1.0]),
Vector([ 1.0, -1.0, -1.0, 1.0]),
Vector([ 1.0, -1.0, 1.0, 1.0]),
Vector([-1.0, 1.0, -1.0, 1.0]),
Vector([-1.0, 1.0, 1.0, 1.0]),
Vector([ 1.0, 1.0, -1.0, 1.0]),
Vector([ 1.0, 1.0, 1.0, 1.0]),
];
```
If you take a closer look at the array above, you'll notice we have 8 vertices; the first three components basically form a binary truth table (every combination of `-1` and `1`). The last component is always `1`. A `1` there means we're representing a position, while a `0` would mean a direction (a distance rather than a position): positions have `1` and directions have `0`.
We have the positions (`VERTICES` ), now we also need to represent the `FACES` of the cube. Let's add an array of indices for the faces.
```rust
const FACES : [[u8; 4]; 6] = [
[1, 5, 7, 3],
[3, 7, 6, 2],
[0, 4, 5, 1],
[2, 6, 4, 0],
[0, 1, 3, 2],
[5, 4, 6, 7],
];
```
Each face is described by 4 indices into the vertex array; together they tell us which vertices make up each face of the cube. The order of these indices is quite important: we're going clockwise in this particular case. To understand it a little better, have a look again here (each row of `FACES` corresponds to one face):
```rust
//! 4 +------+ 6
//! /| /|
//! 5 +------+ | 7
//! | | | |
//! 0 | +----|-+ 2
//! |/ |/
//! 1 +------+ 3
```
**Now let's write** a function that computes: matrix x vector.
```rust
fn matrix_times_vector(m: &Matrix, v: &Vector) -> Vector {
let [mx, my, mz, mw] = &m.0;
let [x, y, z, w] = v.0;
// The product is the weighted sum of the columns.
Vector([
x * mx[0] + y * my[0] + z * mz[0] + w * mw[0],
x * mx[1] + y * my[1] + z * mz[1] + w * mw[1],
x * mx[2] + y * my[2] + z * mz[2] + w * mw[2],
x * mx[3] + y * my[3] + z * mz[3] + w * mw[3],
])
}
```
You'll notice we take a `&Matrix` and a `&Vector` and we get a `Vector` back: we're multiplying the matrix by the vector. If you're wondering how this works, we use destructuring to unpack the columns of the matrix: `let [mx, my, mz, mw] = &m.0;`
And we also unpack the components of the vector: `let [x, y, z, w] = v.0;`
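To convince ourselves the function behaves as expected, here is a small standalone sanity check. The struct definitions and `matrix_times_vector` are copied from above so the snippet compiles on its own; the identity and translation matrices are just illustrative test inputs, not part of the article's cube code:

```rust
#[derive(Debug, Clone, Copy)]
struct Matrix([[f32; 4]; 4]);
#[derive(Debug, Clone, Copy)]
struct Vector([f32; 4]);

fn matrix_times_vector(m: &Matrix, v: &Vector) -> Vector {
    let [mx, my, mz, mw] = &m.0;
    let [x, y, z, w] = v.0;
    // The product is the weighted sum of the columns.
    Vector([
        x * mx[0] + y * my[0] + z * mz[0] + w * mw[0],
        x * mx[1] + y * my[1] + z * mz[1] + w * mw[1],
        x * mx[2] + y * my[2] + z * mz[2] + w * mw[2],
        x * mx[3] + y * my[3] + z * mz[3] + w * mw[3],
    ])
}

fn main() {
    // The identity matrix should leave any vector unchanged.
    let identity = Matrix([
        [1.0, 0.0, 0.0, 0.0],
        [0.0, 1.0, 0.0, 0.0],
        [0.0, 0.0, 1.0, 0.0],
        [0.0, 0.0, 0.0, 1.0],
    ]);
    let v = Vector([2.0, -3.0, 0.5, 1.0]);
    let r = matrix_times_vector(&identity, &v);
    assert_eq!(r.0, v.0);

    // A translation matrix stores its offset in the last column,
    // which only affects vectors whose w component is 1 (positions).
    let translate = Matrix([
        [1.0, 0.0, 0.0, 0.0],
        [0.0, 1.0, 0.0, 0.0],
        [0.0, 0.0, 1.0, 0.0],
        [5.0, 0.0, 0.0, 1.0],
    ]);
    let t = matrix_times_vector(&translate, &v);
    assert_eq!(t.0, [7.0, -3.0, 0.5, 1.0]);
    println!("ok");
}
```

Note how the translation only moves the vector because its `w` is `1`; a direction (with `w = 0`) would pass through unmoved, which is exactly the position/direction distinction described earlier.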
Now we have our basic cube structure! This structure will also help us spin the cube!
## Render the Cube
We need a screen to render the cube. Let's start with our definition of the screen. We're going to render it in ASCII, so our screen will be quite low resolution (40 rows by 80 columns).
*You can experiment with higher resolution but it's out of the scope of this article.*
```rust
const SCREEN_WIDTH : usize = 80;
const SCREEN_HEIGHT : usize = 40;
```
I'm also going to define an offset and a scale. These will help me position the cube in the center of my screen.
```rust
const OFFSET_X : f32 = SCREEN_WIDTH as f32 * 0.5;
const OFFSET_Y : f32 = SCREEN_HEIGHT as f32 * 0.5;
const SCALE_X : f32 = SCREEN_WIDTH as f32 * 0.5;
const SCALE_Y : f32 = SCREEN_HEIGHT as f32 * 0.5;
```
## Write a Rendering Loop
Let's write a rendering loop for our animated cube:
```rust
for frame_number in 0.. {
let mut frame = [[b' ';SCREEN_WIDTH]; SCREEN_HEIGHT];
let t = frame_number as f32 * 0.01;
let (c, s) = (t.cos(), t.sin());
let cube_to_world = Matrix([
// Each row is a column of a matrix.
[ c, 0.0, s, 0.0],
[0.0, 1.0, 0.0, 0.0],
[ -s, 0.0, c, 0.0],
[0.0, 0.0,-2.5, 1.0],
]);
.
.
}
```
`frame_number` represents the time at which we're rendering; since we typically render at a fixed interval, the frame number stands in for our time. We also need something to render into, so let's make a frame buffer: `let mut frame = [[b' ';SCREEN_WIDTH]; SCREEN_HEIGHT];`. It's `SCREEN_HEIGHT` rows of `SCREEN_WIDTH` bytes, and these bytes are the ASCII characters of the screen we're going to render.
The next thing we need to do is transform our cube, because we're rotating it. Firstly, we need to create a time: `let t = frame_number as f32 * 0.01;`. Here we're taking the `frame_number` and multiplying it by a small number, which gives us a relatively slowly incrementing time.
The next thing is to define a `cos` and a `sin`; on a plot, the `cos` gives the `x` direction and the `sin` gives the `y`.
`let (c, s) = (t.cos(), t.sin());`
In the matrix, the first three columns are the `x`, `y` and `z` directions, and the last column holds the position that represents the origin of our matrix. So this matrix is going to spin our cube around a vertical axis.
```rust
let cube_to_world = Matrix([
// Each row is a column of a matrix.
[ c, 0.0, s, 0.0],
[0.0, 1.0, 0.0, 0.0],
[ -s, 0.0, c, 0.0],
[0.0, 0.0,-2.5, 1.0],
]);
```
Now, let's use this matrix to transform the coordinates into screen positions which have a few components...
```rust
.
let mut screen_pos = [[0.0, 0.0]; 8];
for (v, s) in VERTICES.iter().zip(screen_pos.iter_mut()) {
let world_pos = matrix_times_vector(&cube_to_world, v);
let recip_z = 1.0 / world_pos.0[2];
let screen_x = world_pos.0[0] * recip_z * SCALE_X + OFFSET_X;
let screen_y = world_pos.0[1] * recip_z * SCALE_Y + OFFSET_Y;
*s = [screen_x, screen_y];
}
.
.
```
As you see above, we need another loop! With this, we build an array of screen positions (`screen_pos`). These are two-dimensional coordinates, so there are 8 screen positions corresponding to our 8 `VERTICES`. We loop over all the vertices, and for each one we compute a `world_pos` from `v` using the `matrix_times_vector(&cube_to_world, v)` function, which transforms the cube's coordinates into world space. Then we divide by the `z` coordinate (via `recip_z`, a simple perspective projection) and apply the scale and offset to get the screen position (`screen_x`, `screen_y`); as we mentioned earlier, we want the cube near the center of our screen. In the end, we store the pair into the array: `*s = [screen_x, screen_y];`.
Right now, we're at a very good point with the cube, but let's keep going!
### Cube's Faces
Time to draw some lines to represent the cube's faces.
```rust
for face in FACES {
if !cull(screen_pos[face[0] as usize], screen_pos[face[1] as usize], screen_pos[face[2] as usize]) {
let mut end = face[3];
for start in face {
draw_line(&mut frame, screen_pos[start as usize], screen_pos[end as usize]);
end = start;
}
}
}
```
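Note that this loop calls a `cull` helper that the article doesn't show. As an assumption, here is a minimal sketch of what it might look like, using the sign of the 2D cross product of two face edges to detect faces wound the "wrong" way in screen space; the exact sign convention depends on the face winding and coordinate system, so treat it as illustrative:

```rust
// Hypothetical back-face test: returns true when the projected triangle
// a -> b -> c is wound the "wrong" way and the face should be skipped.
fn cull(a: [f32; 2], b: [f32; 2], c: [f32; 2]) -> bool {
    // z-component of the 2D cross product of (b - a) and (c - a);
    // its sign encodes the winding order of the projected triangle.
    let cross = (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0]);
    cross < 0.0 // assumed convention: negative cross => back-facing
}

fn main() {
    // One winding order is kept...
    assert!(!cull([0.0, 0.0], [1.0, 0.0], [0.0, 1.0]));
    // ...and the reversed order is culled.
    assert!(cull([0.0, 0.0], [0.0, 1.0], [1.0, 0.0]));
    println!("ok");
}
```

If the cube renders inside-out with one sign, flipping the comparison to `cross > 0.0` matches the opposite winding convention.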
We'll need another loop! I'll take each face from `FACES` (the vertex indices for that face) and draw lines for its edges. To do that, I'll use another function (`draw_line`, which we'll write just below); it draws a run of ASCII characters between two points. To describe a line, we pass it the screen positions of the start and the end of the line.
`draw_line(&mut frame, screen_pos[start as usize], screen_pos[end as usize]);`
We're going to draw 4 lines in total for the face. `end` starts at the last vertex (`let mut end = face[3];`), and after drawing each line we set `end = start;`, so each vertex connects to the previous one and the outline wraps around. The next thing we need to do is render our screen. It starts as a blank screen, but:
```rust
for l in 0..SCREEN_HEIGHT {
let row = std::str::from_utf8(&frame[l]).unwrap();
println!("{}", row);
}
```
with the code you see above, we can iterate over our screen and convert each of the lines of our `frame` into a string: `let row = std::str::from_utf8(&frame[l]).unwrap();`
Printing the row is how we're rendering it.
We're also going to add a `sleep` beneath our for loop so the animation runs slowly enough to watch. (There are more ways to achieve this, but I think this is one of the simplest tricks)
```rust
std::thread::sleep(std::time::Duration::from_millis(30));
```
Another thing we want to do is to clear the screen or at least reset the cursor. This is how we can achieve this:
`print!("\x1b[{}A;", SCREEN_HEIGHT);`
#### draw\_line Function
`draw_line` takes 3 parameters:
* `frame: &mut [[u8; SCREEN_WIDTH]; SCREEN_HEIGHT]`
* `start: [f32; 2]`
* `end: [f32; 2]`
```rust
fn draw_line(frame: &mut [[u8; SCREEN_WIDTH]; SCREEN_HEIGHT], start: [f32; 2], end: [f32; 2]) {
    if dy.abs() > dx.abs() {
        .
        .
        .
    } else {
        .
        .
        .
    }
}
```
Then we need to deconstruct our start and end into:
```rust
let [x0, y0] = start;
let [x1, y1] = end;
let [dx, dy] = [x1 - x0, y1 - y0];
```
`d` stands for delta, our direction: we compute the end minus the start (`[x1 - x0, y1 - y0]`). If we've got a vertical line, `x1 - x0` will be `0`, and if we've got a horizontal line, `y1 - y0` will be `0`.
We have 2 different types of lines to draw: near-**horizontal** and near-**vertical** lines. When we're drawing near-vertical lines we only want to draw one character per row, and when we're drawing near-horizontal lines we only want to draw one character per column. As you can see from the code, we've split the drawing up into 2 loops.
```rust
if dy.abs() > dx.abs() {
let ymin = y0.min(y1);
let ymax = y0.max(y1);
let iymin = ymin.ceil() as usize;
let iymax = ymax.ceil() as usize;
let dxdy = dx / dy;
for iy in iymin..iymax {
let ix = ((iy as f32 - y0) * dxdy + x0) as usize;
frame[iy][ix] = b'|';
}
```
If `dy.abs() > dx.abs()` then this is a largely vertical line, else if it's a largely horizontal line then run this piece of code:
```rust
else {
let xmin = x0.min(x1);
let xmax = x0.max(x1);
let ixmin = xmin.ceil() as usize;
let ixmax = xmax.ceil() as usize;
let dydx = dy / dx;
for ix in ixmin..ixmax {
let iy = ((ix as f32 - x0) * dydx + y0) as usize;
frame[iy][ix] = b'-';
}
}
```
We're going to draw from the minimum to the maximum, which is actually from the top to the bottom, because the `y` coordinate grows downwards in this particular coordinate system. We keep `ymin` and `ymax` in floating point and `iymin` and `iymax` as integers, because we need to draw into a specific number of integer cells.
The horizontal lines are similar. I'm not going to explain the code in detail but if you have any questions feel free to ask me in the comments.
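Assembling the pieces shown above, the complete function can be compiled and sanity-checked on its own (the screen constants are repeated here so the snippet is self-contained; the segment endpoints in `main` are just test values):

```rust
const SCREEN_WIDTH: usize = 80;
const SCREEN_HEIGHT: usize = 40;

fn draw_line(frame: &mut [[u8; SCREEN_WIDTH]; SCREEN_HEIGHT], start: [f32; 2], end: [f32; 2]) {
    let [x0, y0] = start;
    let [x1, y1] = end;
    let [dx, dy] = [x1 - x0, y1 - y0];
    if dy.abs() > dx.abs() {
        // Mostly vertical: one character per row.
        let ymin = y0.min(y1);
        let ymax = y0.max(y1);
        let iymin = ymin.ceil() as usize;
        let iymax = ymax.ceil() as usize;
        let dxdy = dx / dy;
        for iy in iymin..iymax {
            let ix = ((iy as f32 - y0) * dxdy + x0) as usize;
            frame[iy][ix] = b'|';
        }
    } else {
        // Mostly horizontal: one character per column.
        let xmin = x0.min(x1);
        let xmax = x0.max(x1);
        let ixmin = xmin.ceil() as usize;
        let ixmax = xmax.ceil() as usize;
        let dydx = dy / dx;
        for ix in ixmin..ixmax {
            let iy = ((ix as f32 - x0) * dydx + y0) as usize;
            frame[iy][ix] = b'-';
        }
    }
}

fn main() {
    let mut frame = [[b' '; SCREEN_WIDTH]; SCREEN_HEIGHT];
    // A vertical segment from (5, 3) to (5, 7) fills rows 3..7 with '|'.
    draw_line(&mut frame, [5.0, 3.0], [5.0, 7.0]);
    assert_eq!(frame[3][5], b'|');
    assert_eq!(frame[6][5], b'|');
    assert_eq!(frame[7][5], b' '); // half-open range: the end row stays blank
    // A horizontal segment from (10, 2) to (14, 2) fills columns 10..14 with '-'.
    draw_line(&mut frame, [10.0, 2.0], [14.0, 2.0]);
    assert_eq!(frame[2][10], b'-');
    assert_eq!(frame[2][13], b'-');
    assert_eq!(frame[2][14], b' ');
    println!("ok");
}
```

The half-open `ixmin..ixmax` / `iymin..iymax` ranges mean adjacent edges sharing an endpoint don't draw the shared cell twice.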
## Run It
Are you still here?! Congrats! 🥳
We're ready to run it. Just type on your terminal `cargo run` and you'll see it running. To stop, type `ctrl+c`.

## Conclusion
I hope you liked this project, I know it was different than what I usually do (and this one has more maths than my previous projects), but if you find it interesting, you can also have a look at my GitHub profile, where you will find all the [code](https://github.com/EleftheriaBatsou/3d-cube-rust).
{% embed https://github.com/EleftheriaBatsou/3d-cube-rust %}
Happy Rust Coding! 🤞🦀
---
👋 Hello, I'm Eleftheria, **Community Manager,** developer, public speaker, and content creator.
🥰 If you liked this article, consider sharing it.
🔗 [**All links**](https://limey.io/batsouelef) | [**X**](https://twitter.com/BatsouElef) | [**LinkedIn**](https://www.linkedin.com/in/eleftheriabatsou/) | eleftheriabatsou |
1,896,142 | Comprehensive Guide to Setting Up Django | Setting up Django involves a series of steps to get the framework installed, configured, and ready... | 0 | 2024-06-21T18:19:27 | https://dev.to/kihuni/comprehensive-guide-to-setting-up-django-3ff5 | webdev, beginners, python, django | Setting up Django involves a series of steps to get the framework installed, configured, and ready for development. Here’s a detailed guide:
## Prerequisites
- Python Installation
Ensure Python is installed on your system. Django is a Python-based framework, so you need Python 3.6 or higher.
- Check Installation:
Run `python --version` in your command line to check your Python version.
- Download and Install:
If Python is not installed, download it from [python.org](https://www.python.org/) and follow the installation instructions for your operating system.
## Installing Django
- Using pip
`pip` is the Python package installer, used to install Django and other packages.
- Install pip:
Most Python installations include pip. Verify by running:
```
pip --version
```
- Creating Virtual environment:
To create a virtual environment, decide upon a directory where you want to place it, and run the `venv` module as a script with the directory path:
```
python3 -m venv venv
```
Activating virtual environment:
On Windows, run:
```
venv\Scripts\activate
```
On Unix or MacOS, run:
```
source venv/bin/activate
```
- Install Django:
Run `pip install django` to install the latest version of Django.
```
pip install django
```
## Creating a Django Project
- Project Initialization
Django projects contain all settings and configurations for your website.
Use the `django-admin startproject projectname` command to create a new project, replacing `projectname` with your desired project name.
- Project Structure:
The command creates a directory structure that includes:
- `manage.py`:
A command-line utility for interacting with your project.
- A directory named `projectname` containing:
- `__init__.py`:
An empty file that indicates this directory is a Python package.
- `settings.py`:
Configuration settings for your project.
- `urls.py`:
URL declarations for your project.
- `wsgi.py`:
An entry-point for WSGI-compatible web servers.
- `asgi.py`:
An entry-point for ASGI-compatible web servers.
```
projectname/
manage.py
projectname/
__init__.py
settings.py
urls.py
asgi.py
wsgi.py
```
## Running the Development Server
- Starting the Server
Django includes a lightweight web server for development purposes.
- Command:
Navigate to your project directory and run:
```
python manage.py runserver
```
You’ll see the following output on the command line:
```
Performing system checks...
System check identified no issues (0 silenced).
You have unapplied migrations; your app may not work properly until they are applied.
Run 'python manage.py migrate' to apply them.
June 09, 2024 - 15:50:53
Django version 5.0, using settings 'mysite.settings'
Starting development server at http://127.0.0.1:8000/
Quit the server with CONTROL-C.
```
- Accessing the Server:
Open a web browser and go to `http://127.0.0.1:8000/`. You should see the Django welcome page.

## Create New App
To create your app, make sure you’re in the same directory as manage.py and type this command:
```
$ python manage.py startapp newApp
```
Replace `newApp` with your desired app name.
That'll create a directory `newApp/`, which is laid out like this:
```
newApp/
__init__.py
admin.py
apps.py
migrations/
__init__.py
models.py
tests.py
views.py
```
## Configuring the New App
- Adding to INSTALLED_APPS
Include your new app in the project settings to make Django aware of it.
- Edit settings.py:
Add `'newApp',` to the INSTALLED_APPS list in settings.py, keeping the default entries that are already there.
```
INSTALLED_APPS = [
    # ... the default Django apps ...
    'newApp',
]
```
## Database Setup
- Default Database
Django uses SQLite by default, which is suitable for development and testing.
- Settings:
The database settings are located in settings.py. No additional configuration is required for SQLite.
```
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.sqlite3',
'NAME': BASE_DIR / 'db.sqlite3',
}
}
```
- Using Other Databases
For production, you might use databases like PostgreSQL or MySQL.
- Install Database Adapter:
Use `pip install psycopg2` for PostgreSQL or `pip install mysqlclient` for MySQL.
- Update settings.py:
Modify the DATABASES setting with the appropriate configuration for your database.
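For instance, a PostgreSQL configuration might look like the following; the name, user, password, host, and port values below are placeholders you'd replace with your own:

```python
# settings.py fragment: connect Django to PostgreSQL instead of SQLite.
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'mydatabase',       # placeholder database name
        'USER': 'mydatabaseuser',   # placeholder credentials
        'PASSWORD': 'mypassword',
        'HOST': '127.0.0.1',        # or your database server's address
        'PORT': '5432',             # PostgreSQL's default port
    }
}
```

In production you'd typically load the credentials from environment variables rather than hard-coding them in settings.py.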
## Applying Migrations
- Database Schema
Migrations are Django’s way of propagating changes to your models (adding a field, deleting a model, etc.) into your database schema.
- Create Migrations:
Run `python manage.py makemigrations` to create new migrations based on the changes you made to your models.
```
python manage.py makemigrations
```
```
Migrations for 'newApp':
  newApp/migrations/0001_initial.py
- Create model newModel
```
- Apply Migrations:
Run `python manage.py migrate` to apply the migrations and synchronize the database state with your models.
```
python manage.py migrate
```
```
Apply all migrations: admin, auth, contenttypes, newApp, sessions
Running migrations:
Applying contenttypes.0001_initial... OK
Applying auth.0001_initial... OK
Applying admin.0001_initial... OK
Applying admin.0002_logentry_remove_auto_add... OK
Applying admin.0003_logentry_add_action_flag_choices... OK
Applying contenttypes.0002_remove_content_type_name... OK
Applying auth.0002_alter_permission_name_max_length... OK
Applying auth.0003_alter_user_email_max_length... OK
Applying auth.0004_alter_user_username_opts... OK
Applying auth.0005_alter_user_last_login_null... OK
Applying auth.0006_require_contenttypes_0002... OK
Applying auth.0007_alter_validators_add_error_messages... OK
Applying auth.0008_alter_user_username_max_length... OK
Applying auth.0009_alter_user_last_name_max_length... OK
Applying auth.0010_alter_group_name_max_length... OK
Applying auth.0011_update_proxy_permissions... OK
Applying auth.0012_alter_user_first_name_max_length... OK
  Applying newApp.0001_initial... OK
Applying sessions.0001_initial... OK
```
## Creating a Superuser
- Admin Interface Access
The Django admin interface allows for easy management of site content.
- Command:
Run `python manage.py createsuperuser` and follow the prompts to create an administrative user.
```
python manage.py createsuperuser
```
```
Username (leave blank to use 'virus'):
Email address:
Password:
Password (again):
This password is too short. It must contain at least 8 characters.
This password is too common.
This password is entirely numeric.
Bypass password validation and create user anyway? [y/N]: y
Superuser created successfully.
```
## Running the Development Server Again
- Start Server
With the initial setup complete, start the development server again to ensure everything is working.
- Command:
Run `python manage.py runserver` and verify that your project and apps are functioning correctly by visiting http://127.0.0.1:8000/.
```
python manage.py runserver
```
Thank you for reading this far. Stay tuned for a glimpse into application Configuration!
## References
- Django Documentation: [docs.djangoproject.com](https://docs.djangoproject.com/en/5.0/)
- Django Project Website: [djangoproject.com](https://www.djangoproject.com/)
- Python Official Website: [python.org](https://www.python.org/)
| kihuni |
1,896,324 | Fluent Icons | I would like to introduce my project, Fluent Icons, developed with Vue 3. This project aims to... | 0 | 2024-06-21T18:19:23 | https://dev.to/veoper/fluent-icons-352i | vue, javascript, webdev, programming | I would like to introduce my project, [**Fluent Icons**](https://fluenticon.com/), developed with **Vue 3**. This project aims to streamline your design and development processes by offering functionalities to export, customize, and manage icons in various formats. Here are the features that Fluent Icons offers:
## **Multiple Export Options**
- PNG: Export your icons in high-resolution PNG format.
- SVG: Export as vector SVG format for scalable use.
- Webfont: Export your icons as a web font for easy use in web projects.
- Vue and React: Export as Vue and React components for quick integration into your projects.
- CSS: Export icons as CSS classes to include in your style sheets.
- Base64: Copy icons as Base64 encoded code to embed directly into HTML or CSS files.
## **Customizable Icons**
Each icon can be individually colored, allowing you to easily apply color palettes that suit your design needs.
## **Favorite and Bulk Download**
Add your favorite icons to a list and download them in bulk. This feature helps you keep your frequently used icons organized.
## **Advanced Search Functionality**
Separate your favorite selections in the search section for a faster and more efficient search experience. This feature helps you find the icon you need more quickly.
## **Automatic Updates**
Fluent Icons is designed to stay up-to-date continuously. The library automatically updates with the latest icons and features after each update. This ensures you always have access to the newest and most functional icons.
Fluent Icons is a comprehensive icon library filled with features like exporting in various formats, customization, and favoriting icons. For more information and to try the project, visit [fluenticon.com](https://fluenticon.com/)
I hope this post will increase your interest in Fluent Icons and that I'll receive your feedback. If you have any questions or suggestions, please feel free to contact me. Happy coding! | veoper |
1,896,322 | Enhance Your React Apps: Best Practices for Reusable Custom Hooks | Enhance Your React Apps: Best Practices for Reusable Custom Hooks React's custom hooks are essential... | 0 | 2024-06-21T18:12:44 | https://dev.to/msubhro/enhance-your-react-apps-best-practices-for-reusable-custom-hooks-2c3e | react, reactjsdevelopment, webdev | Enhance Your React Apps: Best Practices for Reusable Custom Hooks
React's custom hooks are essential for cleaner and more maintainable components. Follow these best practices to ensure your hooks are reusable and efficient:
1. **Single Responsibility Principle**
Ensure each custom hook addresses a single concern. This makes your hooks easier to understand, test, and maintain.
2. **Naming Conventions**
Name your hooks starting with `use`. This convention signals that the function adheres to React's rules of hooks.
3. **Parameterize for Reusability**
Design your hooks to accept parameters. This flexibility allows the same hook to be used in different contexts.
4. **Handle Side Effects Safely**
Manage side effects like API calls with proper cleanup to avoid memory leaks or unintended behavior.
5. **Documentation and Testing**
Document your hooks clearly, specifying expected parameters and return values. Write tests to cover various scenarios and edge cases to ensure reliability.
Implementing these best practices will help you create robust, reusable custom hooks, enhancing the overall quality of your React applications.
## Conclusion
Creating reusable custom hooks in React requires careful consideration of their focus, structure, and reusability. By following these best practices, you can develop hooks that are not only powerful and flexible but also easy to understand, maintain, and test. Custom hooks are a powerful feature in React, and with these guidelines, you can make the most of them in your applications.
| msubhro |
1,896,320 | Unlocking the Power of JavaScript Generators: Master Asynchronous Programming with Ease | JavaScript generators might seem like a mystical feature reserved for advanced developers, but... | 0 | 2024-06-21T18:11:39 | https://dev.to/delia_code/unlocking-the-power-of-javascript-generators-master-asynchronous-programming-with-ease-21ce | webdev, javascript, programming, beginners | JavaScript generators might seem like a mystical feature reserved for advanced developers, but they're actually an incredibly powerful tool that can simplify your asynchronous code. Whether you’re a beginner just dipping your toes into the world of JavaScript or a seasoned pro looking to deepen your understanding, this guide will walk you through the ins and outs of generators, providing examples, highlighting their pros and cons, and offering recommendations to get the most out of them.
#### What Are JavaScript Generators?
Generators are a special type of function that can pause and resume their execution. Unlike regular functions that run to completion once called, generators can yield control back to the calling code, maintaining their state between executions.
Here's a simple generator function:
```javascript
function* simpleGenerator() {
yield 'Hello';
yield 'World';
}
const gen = simpleGenerator();
console.log(gen.next().value); // Output: Hello
console.log(gen.next().value); // Output: World
console.log(gen.next().value); // Output: undefined
```
In this example, `simpleGenerator` is a generator function, indicated by the `*` after the `function` keyword. The `yield` keyword is used to pause the function and return a value. Each call to `gen.next()` resumes the function from where it left off.
#### How Do Generators Work?
A generator function returns a generator object. This object conforms to both the iterable protocol and the iterator protocol. When the generator's `next()` method is called, the function's execution resumes until it encounters the next `yield` statement, which returns an object with two properties:
- **value**: The value yielded by the generator.
- **done**: A boolean indicating whether the generator has completed its execution.
Here's a step-by-step breakdown of how generators work:
1. **Initialization**: When you call a generator function, it doesn't run its code immediately. Instead, it returns a generator object that can be used to control the execution of the function.
2. **Execution**: The first call to `next()` starts the execution of the generator function until the first `yield` statement is encountered.
3. **Yielding Values**: The `yield` statement pauses the function and returns the specified value.
4. **Resuming Execution**: Subsequent calls to `next()` resume the function from where it was paused, running until the next `yield` statement or the end of the function.
5. **Completion**: When the function runs to completion, `next()` returns an object with `done` set to `true`.
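One detail worth calling out in this exchange: a value passed to `next()` becomes the result of the `yield` expression where the generator was paused. A small sketch (the function name is just illustrative):

```javascript
function* conversation(start) {
  // The argument of the NEXT next() call becomes the value of this yield.
  const step = yield start;
  yield start + step;
}

const it = conversation(10);
console.log(it.next());   // { value: 10, done: false } (runs to the first yield)
console.log(it.next(5));  // { value: 15, done: false } (5 became `step`)
console.log(it.next());   // { value: undefined, done: true }
```

This two-way channel is what later lets a runner feed resolved Promise values back into a paused generator.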
#### Why Use Generators?
Generators are particularly useful for handling asynchronous operations in a more synchronous-looking manner. They shine in scenarios where you need to manage complex asynchronous workflows, such as:
- **Iterating over asynchronous data sources**
- **Managing state across asynchronous operations**
- **Implementing cooperative multitasking**
#### Example: Asynchronous Programming with Generators
Let’s dive into an example that shows how generators can be used for asynchronous programming:
```javascript
function* asyncGenerator() {
  const response1 = yield fetch('https://jsonplaceholder.typicode.com/posts/1');
  const data1 = yield response1.json();
  console.log(data1);
  const response2 = yield fetch('https://jsonplaceholder.typicode.com/posts/2');
  const data2 = yield response2.json();
  console.log(data2);
}
const gen = asyncGenerator();
function handle(gen, result) {
if (result.done) return;
result.value.then(data => handle(gen, gen.next(data)));
}
handle(gen, gen.next());
```
Here, the `asyncGenerator` fetches data from two different URLs. Each `yield` hands a Promise to the `handle` function, which waits for it to resolve and then resumes the generator with the result. Note that `await` cannot be used inside a plain `function*`, which is why the JSON parsing is also yielded as a Promise.
#### Combining Generators with Promises
One of the most powerful use cases for generators is their combination with Promises to handle asynchronous flows. By yielding a Promise, a small runner function can pause the generator until the Promise resolves, creating a more readable and maintainable code structure for asynchronous tasks.
```javascript
function* fetchData() {
try {
const response = yield fetch('https://api.example.com/data');
const data = yield response.json();
console.log(data);
} catch (error) {
console.error('Error fetching data:', error);
}
}
const iterator = fetchData();
function run(iterator) {
function step(result) {
if (result.done) return;
const promise = result.value;
promise.then(
res => step(iterator.next(res)),
err => step(iterator.throw(err))
);
}
step(iterator.next());
}
run(iterator);
```
In this example, `fetchData` is a generator function that fetches data from an API. The `run` function manages the flow, resuming the generator after each Promise resolves and handling errors gracefully.
#### Pros of JavaScript Generators
1. **Simplified Asynchronous Code**: Generators can make asynchronous code appear more synchronous and linear, which is easier to read and maintain.
2. **Control Flow Management**: They provide fine-grained control over the execution of a function, allowing for complex control flows that are difficult to achieve with regular functions.
3. **Memory Efficiency**: Generators can be more memory-efficient than other iterable constructs because they produce values on the fly.
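The memory-efficiency point deserves a quick illustration: a generator can describe an infinite sequence while computing values only on demand. The `take` helper below is our own, not a built-in:

```javascript
// An "infinite" sequence: nothing is computed until a consumer asks for it.
function* naturals() {
  let n = 0;
  while (true) yield n++;
}

// Pull only the first `count` values; the rest are never produced.
function take(iterable, count) {
  const out = [];
  for (const value of iterable) {
    if (out.length >= count) break;
    out.push(value);
  }
  return out;
}

console.log(take(naturals(), 5)); // [0, 1, 2, 3, 4]
```

An array-based version of `naturals` would be impossible to build up front; the generator holds only one value at a time.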
#### Cons of JavaScript Generators
1. **Complexity**: For beginners, the concept of generators can be difficult to grasp, particularly how `yield` and `next` interact.
2. **Debugging Difficulty**: Debugging generator functions can be more challenging due to their pausable nature and the non-linear execution flow.
3. **Performance Overhead**: In some cases, generators may introduce performance overhead compared to simpler constructs like loops or array methods.
#### Recommendations for Using Generators
1. **Start Simple**: Begin with basic generator functions to get comfortable with `yield` and `next`.
2. **Use with Promises**: Combine generators with Promises for handling asynchronous operations elegantly.
3. **Integrate with Async/Await**: Although async/await is more common for asynchronous code, generators can still play a role in complex scenarios where more control is needed.
4. **Leverage Libraries**: Libraries like [co](https://github.com/tj/co) can help manage generators and asynchronous code, providing a more streamlined experience.
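On the async/await point, modern JavaScript also has async generators (`async function*`), which integrate directly with `await` and `for await...of`. A small runnable sketch, with the timeout standing in for real async work:

```javascript
// An async generator: each yielded value may arrive after an awaited step.
async function* countdown(from) {
  for (let n = from; n > 0; n--) {
    await new Promise((resolve) => setTimeout(resolve, 10)); // simulated async work
    yield n;
  }
}

(async () => {
  const seen = [];
  for await (const n of countdown(3)) {
    seen.push(n);
  }
  console.log(seen); // [3, 2, 1]
})();
```

This is essentially the pattern the manual runners above implement by hand, built into the language.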
### What is `co`?
`co` is a small library that makes it easier to work with generators for asynchronous programming. It allows you to write asynchronous code that looks synchronous, by automatically handling the `next()` calls and Promise resolution.
Here’s an example of using `co` with a generator function:
```javascript
const co = require('co');
co(function* () {
const response1 = yield fetch('https://jsonplaceholder.typicode.com/posts/1');
console.log(yield response1.json());
const response2 = yield fetch('https://jsonplaceholder.typicode.com/posts/2');
console.log(yield response2.json());
}).catch(err => {
console.error('Error:', err);
});
```
In this example, `co` manages the generator function, handling the `yield` expressions and Promises behind the scenes, making the code more concise and easier to read.
JavaScript generators are a powerful feature that, once mastered, can significantly enhance your coding capabilities, particularly for asynchronous programming. By understanding how to pause and resume functions with `yield`, manage complex control flows, and integrate with Promises, you can write cleaner, more maintainable code.
Remember to practice with simple examples, gradually incorporating generators into more complex scenarios. As with any tool, the key to mastery is consistent and thoughtful practice. Happy coding! | delia_code |
1,896,319 | Add album art to music files via the command line | My digital music collection goes all the way back to 2002 when I bought my first laptop. Over the... | 0 | 2024-06-21T18:09:02 | https://benpatterson.io/ogg/abcde/mp3/music/programming/2024/06/21/album-art.html | ogg, music, mp3, albumart | My digital music collection goes all the way back to 2002, when I bought my first laptop. Over the years, I downloaded various tracks, ripped copies of my CDs using Windows Media Player, and copied family members' CDs to my computer. I bought music from a variety of sources, encoded new CDs in multiple formats, and transferred the collection from one operating system to another and from iPhone to Android and back again.
My collection isn't huge, but it's mine. It took years to build, and almost every file has some sort of sentimental value. There was always a time or place where I bought the album, or a song with a certain resonance to it. As I spend more time purchasing new music instead of just streaming it, I feel a greater affinity for the music I do have. I want to invest more time listening to an album in its entirety. It makes me appreciate the effort put into the album, and it makes me want to talk about it with others more.
It is a mess, though. I have been cleaning things up. There is something about having accurate track data and album art that enhances the listening experience. I also wanted better quality files, so I've been opting for higher quality even if, with my hearing, I personally can't tell the difference. Apologies to audiophiles.
## Implementation
I opted to use the command line tool `abcde` to re-rip the CDs. It allowed me to set a higher bitrate and took care of things like embedding album art.
What was left was a collection in `mp3`, `m4a` and `ogg` formats. Some had album art, some didn't. Some albums had been split into different folders due to issues with iTunes or other software that I had used over the years. Some albums lacked any track data at all!!
I also opted to use the command line to embed album art. Googling didn't return any UI options, and I wasn't about to pay for a program without being sure it would solve my problem. I added track data manually with VLC; luckily, there were only a couple of albums where I had to do that.
The below was all run on a Mac. Command line tools were installed using [brew](https://brew.sh/). I suggest making a backup of your files before running any scripts against them.
### MP3
For MP3s I used [ffmpeg](https://ffmpeg.org/) and followed these steps:
1. Find the album art online
2. Download it into the folder where the album is stored. I've named it cover.jpg; png files will work too
3. Run the below script inside the folder
4. Delete the original folder and rename the copy that is created
```bash
thisdir=$(pwd)
dircp="${thisdir}-copy"
mkdir -p "$dircp"
for filename in ./*.mp3; do
basefile=$(basename "$filename" ".mp3")
ffmpeg -i "${basefile}.mp3" -i cover.jpg -map_metadata 0 -map 0 -map 1 -acodec copy "${dircp}/${basefile}.mp3"
done
```
### M4A
For the `m4a` files, I used [`AtomicParsley`](https://formulae.brew.sh/formula/atomicparsley). I was pretty happy to be able to modify the files in place for this option.
1. Find the album art online
2. Download it into the folder where the album is stored. I've named it cover.jpg; png files will work too
3. Run the below script inside the folder
```bash
for filename in ./*.m4a; do
basefile=$(basename "$filename")
AtomicParsley "$basefile" --artwork "cover.jpg" --overWrite
done
for filename in ./cover-resized*; do
    rm "$filename"
done
```
### OGG
The `ogg` files were a bit tougher. I had to dive into [abcde's](https://abcde.einval.com/wiki/) [source code](https://github.com/johnlane/abcde/blob/master/abcde#L3433) to see how they did it. The below is slightly altered to work on my Mac. It uses [vorbiscomment](https://formulae.brew.sh/formula/vorbis-tools#default) to add metadata. All thanks to [abcde](https://abcde.einval.com/wiki/) for the below. You should definitely make a copy of your files before running this; it's easy to mess up the metadata and tricky to get it back.
1. Find the album art online
2. Download it into the folder where the album is stored. I've named it cover.jpg; png files will work too
3. Run the below script inside the folder
```bash
#!/bin/bash
thisdir=$(pwd)
ABCDETEMPDIR="${thisdir}-tmp"
mkdir -p "$ABCDETEMPDIR"
VORBISCOMMENT=vorbiscomment
ALBUMARTFILE=cover.jpg
MIMETYPECOVER=$(file -b --mime-type "$ALBUMARTFILE")
EXPORTTAGS="${ABCDETEMPDIR}/export_ogg_tags"
BUILDHEADER="${ABCDETEMPDIR}/build_header"
# Now build the header, gory details are here:
# https://xiph.org/flac/format.html#metadata_block_picture
# Picture Type:
printf "0: %.8x" 3 | xxd -r -g0 > "$BUILDHEADER"
# Mime type length
printf "0: %.8x" $(echo -n "$MIMETYPECOVER" | wc -c) | xxd -r -g0 >> "$BUILDHEADER"
# Mime type:
echo -n "$MIMETYPECOVER" >> "$BUILDHEADER"
printf "0: %.8x" $(echo -n "Cover Image" | wc -c) | xxd -r -g0 >> "$BUILDHEADER"
# Description:
echo -n "Cover Image" >> "$BUILDHEADER"
# Picture width:
printf "0: %.8x" 0 | xxd -r -g0 >> "$BUILDHEADER"
# Picture height:
printf "0: %.8x" 0 | xxd -r -g0 >> "$BUILDHEADER"
# Picture color depth:
printf "0: %.8x" 0 | xxd -r -g0 >> "$BUILDHEADER"
# Picture color count:
printf "0: %.8x" 0 | xxd -r -g0 >> "$BUILDHEADER"
# Image file size:
printf "0: %.8x" $(wc -c "$ALBUMARTFILE" | awk '{print $1}') | xxd -r -g0 >> "$BUILDHEADER"
# cat the image file:
cat "$ALBUMARTFILE" >> "$BUILDHEADER"
# Now process each ogg file by first exporting the original tags then
# appending the cover image and finally copying the whole thing back
# to the original image:
for i in *.ogg
do
# Make a backup of the existing tags:
"$VORBISCOMMENT" -d metadata_block_picture $i
"$VORBISCOMMENT" --list --raw "$i" > "$EXPORTTAGS-$i.txt"
cat "$EXPORTTAGS-$i.txt"
# base64 the file and then mix it all together with the exported tags:
echo "metadata_block_picture=$(base64 < "$BUILDHEADER")" >> "$EXPORTTAGS-$i.txt"
# # Update the original ogg file with exported tags and the appended base64'd image:
"$VORBISCOMMENT" --write --raw --commentfile "$EXPORTTAGS-$i.txt" "$i"
# Delete the EXPORTTAGS file ready to be recreated for the next file in the loop,
# note that the header file BUILDHEADER will be reused for each file in the loop:
done
rm -rf "$ABCDETEMPDIR"
```
And that's it! Let me know if this helps you out or if you run into bugs!
This was a fun diversion for me. It reminded me of all the things I love about programming, like diving into source code, learning about new tools, and implementing simple solutions. I didn't have to write any complex code to solve my problem or re-invent the wheel. Research and patience went much further than trying to write a script that would fix everything at once. Maybe if there is a market for aging millennials and gen-xers desperately hanging on to decades-old digital media, I can revisit this.
| benruns |
1,896,318 | WordPress Staging: A Comprehensive Guide | Dive into the realm of WordPress staging sites—a pivotal tool cherished by website wizards and... | 0 | 2024-06-21T18:08:51 | https://dev.to/shabbir_mw_03f56129cd25/wordpress-staging-a-comprehensive-guide-2c7m | webdev, wordpress, staging, migration | Dive into the realm of WordPress staging sites—a pivotal tool cherished by website wizards and developers alike. Imagine a twin universe of your live site, where bold experiments with updates, new features, and design revolutions unfold without a hint of disruption. It's the ultimate shield ensuring every tweak undergoes rigorous scrutiny before stepping into the limelight.
**Advantages of Your WordPress Staging Site**
**Sanctuary of Safe Trials:** Venture fearlessly into the unknown, shielding your live site from unforeseen turbulence.
**Precision Troubleshooting:** Seek out bugs and glitches in their natural habitat, leaving your live site impeccable.
**Elevated User Delight:** Refine and perfect changes to captivate your audience anew, all while keeping your live site pristine.
**What to Expect from Your WordPress Staging Site**
**Where Security Meets Exploration**
Your [WordPress staging site](https://instawp.com/wordpress-staging/) is a fortress for updates, plugins, and design metamorphoses. Here, you orchestrate scenarios, mimicking real-world interactions to ensure seamless functionality when the curtain rises.
**Transition, Seamless, and Serene**
What's proven here moves harmoniously to your live site, where downtime is a forgotten myth and users glide through unscathed.
**Compatibility**
Here, themes, plugins, and custom codes dance in sync, choreographed meticulously to prevent discord and ensure a flawless post-update spectacle.
**What to Remember: WordPress Staging Best Practices**
**Cherish Backups: The Scribes of Safety**
Before the revolution, safeguard your kingdom—a complete archive of all realms, from databases to multimedia, primed for swift restoration should chaos ensue.
**Mastering the Backup Rites**
With the flick of a wand (or rather, a plugin), automate the sacred ritual of backups to the cloud, where secrets remain safe.
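By way of illustration only, such a nightly rite might be scripted with WP-CLI. This cron line is a sketch, not gospel: it assumes WP-CLI is installed and that `/var/www/site` is your WordPress root, so adjust the paths to your own kingdom.

```shell
# Illustrative cron entry: every night at 2 AM, export the database and archive uploads.
# Assumes WP-CLI is installed and /var/www/site is the WordPress root.
# Note: % must be escaped as \% inside a crontab.
0 2 * * * cd /var/www/site && wp db export "backup-$(date +\%F).sql" && tar -czf "files-$(date +\%F).tar.gz" wp-content/uploads
```

Pair this with off-site storage of the resulting files, and your archives remain safe even if the server itself falls.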
**Rigorous Testing**
Here, updates don't just arrive—they perform. In the staging arena, test their mettle meticulously, sparing your live site any unwelcome surprises.
**Defining the Rhythm of Updates**
Core updates, theme adjustments, plugin metamorphoses—all take center stage, their compatibility scrutinized to prevent discord.
**Sentry of Security**
Where updates arrive, so does vigilance. Arm your plugins, fortify with two-factor locks, and keep watch over passwords—strong, mutable, and resilient.
**Monitoring the Post-Update Overture**
As updates settle, scrutinize the stage for signs of discord. Gauge speeds, measure metrics—let not performance falter in this kingdom.
**Bridging Staging to Live**
Once validated, changes march forth, synchronized from stage to the living stage. A ballet of precision, executed during the ebb of user footfall.
**Conclusion**
Harness the force of WordPress staging—a sanctuary where site integrity meets innovation. Embrace rituals of vigilance and innovation, ensuring each curtain rise is a triumph. In this realm, wield the power of staging to sculpt your digital domain, safeguarding both security and spectacle.
| shabbir_mw_03f56129cd25 |
1,896,317 | CSS styling structures - pt. 2 | Preface About the classes I had on CSS structures (WHICH HAVE ENDLESS... | 0 | 2024-06-21T18:08:42 | https://dev.to/marimnz/estruturas-de-estilizacao-css-pt2-le4 | css, beginners | ##Preface
About the classes I had on CSS structures (WHICH HAVE ENDLESS NOTES!!), all that was left was to comment on the structures for fonts and text, plus a few settings for shadows, but I won't go too deep.
PS: Is it too cliché to bring study content, and about web development no less, which has been so saturated lately?
---
##Fonts
What kinds of fonts do we use?
- `serif`: Serifed fonts
- `sans-serif`: Without serifs
- `monospace`: Letters with a fixed width
- `cursive`: Handwriting-style fonts, resembling human calligraphy
- `fantasy`: Decorative fonts
##Font settings
- `font-family`: Determines which fonts the element should use
- `font-size`: Determines the font size
- `font-weight`: Thickness of the font
- `font-style`: Style of the font (italic, oblique, normal)
- `font-variant`: Uppercase and lowercase variants (e.g. small caps)
- `font-stretch`: Narrowing or widening of the text
- `line-height`: Height of the line
##Imported / custom fonts
- `@font-face`: Custom font
- `@import`: Import external fonts
> Google Fonts has a vast gallery of custom fonts
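Putting these properties together, a setup with an imported Google Fonts family might look like this (the family name is just an example):

```css
/* Import a family from Google Fonts (illustrative URL) and apply it */
@import url('https://fonts.googleapis.com/css2?family=Roboto:wght@400;700&display=swap');

body {
  font-family: 'Roboto', Arial, sans-serif; /* fallbacks end with a generic family */
  font-size: 16px;
  font-weight: 400;
  font-style: normal;
  line-height: 1.5;
}
```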
---
##Text
`text-align`: Aligns the text inside an element
`text-indent`: Indents the first line
`letter-spacing`: Spacing between each character
`word-spacing`: Spacing between each word
**text-transform**: Defines which characters will be uppercase or lowercase:
1. `capitalize`: First letter uppercase
2. `uppercase`: The whole text in uppercase
3. `lowercase`: Everything lowercase
**text-decoration**: Adds or removes lines:
1. `none`, `underline`, `line-through`, `overline`
2. `line`, `style`, `color`, `thickness`
**white-space**: Handles white space:
1. `normal`: Sequences of spaces collapse into one and line breaks are ignored
2. `nowrap`: No line breaks, and spaces collapse
3. `pre`: Preserves white space
4. `pre-line`: Collapses spaces, allows line breaks
5. `pre-wrap`: Preserves spaces, with line breaks
6. `break-spaces`: Wraps the spaces onto the next line
**word-break**: Defines how the text should break across lines
1. `normal`
2. `break-all` / `keep-all`
**writing-mode**: Defines the orientation of the text
1. `horizontal-tb`
2. `vertical-rl`
3. `vertical-lr`
---
##Shadows
`box-shadow`: Shadow on a box
`drop-shadow` (filter): Shadow that follows PNG transparency
`text-shadow`: Shadow on the text
`opacity`: Transparency ranging from 0 to 1 | marimnz |
1,896,316 | Keyword Cannibalization: What It Is and How to Fix It | When doing SEO for a website, you may run into a situation where multiple posts are optimized for a similar... | 0 | 2024-06-21T18:07:03 | https://dev.to/khoahocseoimta1/keyword-cannibalization-khai-niem-va-giai-phap-xu-li-tinh-trang-an-thit-tu-khoa-58h3 | | When doing SEO for a website, you may run into a situation where multiple posts are optimized for a similar topic, leading to Keyword Cannibalization. This is when one primary SEO keyword is used to rank two or more posts on the same website.
**What is Keyword Cannibalization?**

Keyword Cannibalization is the phenomenon in which a single primary SEO keyword is used to rank multiple posts on the same website. This creates conflicts between pages, causing the posts to compete with one another in the same Google search results, and it can negatively affect the website's rankings.
**The impact of Keyword Cannibalization**
Keyword Cannibalization makes it difficult for search engines to determine which page should be prioritized for ranking. As a result, both pages can lose rankings. For users, it becomes hard to tell which post best matches their needs.
Detecting and handling Keyword Cannibalization is very important to keep your website's SEO effective.
**Read more:** [Keyword Cannibalization](https://imta.edu.vn/keyword-cannibalization-la-gi/)
**Check out the SEO expert course at IMTA**: [khoa-hoc-seo](https://imta.edu.vn/khoa-hoc-seo-website/)
#khoahocseoimta #daotaoseoimta #daotaoseotphcm
**Contact information:**
**Phone**: 028 22699899
**Email**: info@imta.edu.vn
**Address**: Charmington La Pointe Building, 181 Cao Thắng (extended), Ward 12, District 10, Ho Chi Minh City, Vietnam
**Google Maps**: [maps](https://www.google.com/maps?cid=1922248513636655971)
**Website**: [imta](https://imta.edu.vn/)
**Social: **
[instapaper](https://www.instapaper.com/read/1688948861)
[X](https://x.com/khoahocseoimta1/status/1804210895150813201)
[pin.it](https://pin.it/1f3XGp6ai)
[folkd](https://folkd.com/link/Keyword-Cannibalization--l---g----C--ch-x----l-----n-th---t-t----kh--a)
[wakelet](https://wakelet.com/wake/ka8r6PvVSq2CwL6o0IOAC)
[tr.ee](https://tr.ee/KuZe37iwdA)
[flic.kr](https://flic.kr/p/2pYBJb1)
[webtretho](https://www.webtretho.com/f/kinh-nghiem-hay-huu-ich/keyword-cannibalization-khai-niem-va-giai-phap-xu-li-tinh-trang-an-thit-tu-khoa)
[bravesites](https://khoahocseowebsite.bravesites.com/entries/general/Keyword-Cannibalization-Kh%C3%A1i-ni%E1%BB%87m-v%C3%A0-Gi%E1%BA%A3i-ph%C3%A1p-x%E1%BB%AD-l%C3%AD-t%C3%ACnh-tr%E1%BA%A1ng-%C4%83n-th%E1%BB%8Bt-t%E1%BB%AB-kh%C3%B3a)
| khoahocseoimta1 | |
1,896,312 | 40 Days Of Kubernetes (4/40) | Day 4/40 Why Kubernetes Is Used - Kubernetes Simply Explained Video... | 0 | 2024-06-21T18:03:01 | https://dev.to/sina14/40-days-of-kubernetes-440-13ne | kubernetes, 40daysofkubernetes | ## Day 4/40
# Why Kubernetes Is Used - Kubernetes Simply Explained
[Video Link](https://www.youtube.com/watch?v=lXs1VCWqIH4)
@piyushsachdeva
[Git Repository](https://github.com/piyushsachdeva/CKA-2024/)
[My Git Repo](https://github.com/sina14/40daysofkubernetes)
We're going to understand some fundamentals about Kubernetes.
Why, What and How!
Assume we have an application that contains several containers, such as a database, a backend, a frontend and so on, and everyone is happy with them :)
If one of those containers fails, you won't find a happy person on your team, or among your clients, because of the major incident and the downtime.
So we need to look after the containers at all times and keep spare instances ready in case the application needs help. I mean, if container A fails, there should be at least one other container, let's say container B, which can take over exactly the duty of container A.
Also, we need something else to manage our workloads such as:
1. Container Networking
2. Resource Management
3. Security
4. High Availability
5. Fault Tolerance
6. Service Discovery
7. Simplified operations
8. Resilience
9. Scalability
10. Load Balancing
11. Orchestration
and so on.
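As a small illustration of the "container B takes over" idea above, a Kubernetes Deployment declares a desired number of replicas and Kubernetes keeps that count true (the names and image below are placeholders):

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: my-app
spec:
  replicas: 2              # keep two copies running at all times
  selector:
    matchLabels:
      app: my-app
  template:
    metadata:
      labels:
        app: my-app
    spec:
      containers:
        - name: my-app
          image: nginx:1.25   # placeholder image
```

If one of the two pods dies, Kubernetes notices the count is wrong and starts a replacement automatically.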

But we have to remember: "**Kubernetes is not always the solution!**"
Specially if we have small application with a few containers, it is waste of resources, money and administration workload on our Operation team. | sina14 |
1,896,314 | Make a Billion in 6 Months in Tech (Only For Billionaires) | If you want to become a billionaire in less than a year and are wondering what the best path to... | 0 | 2024-06-21T18:01:22 | https://dev.to/scofieldidehen/make-a-billion-in-6-month-in-tech-only-for-billionaires-a14 | webdev, programming, beginners, productivity |
If you want to become a billionaire in less than a year and are wondering what the best path to follow is, this is your guide to getting started.
First, I must congratulate you as you embark on a journey filled with sleepless nights, caffeine-induced hallucinations, and the constant fear that your brilliant idea is just a fever dream. But fear not, aspiring mogul!
This fool-proof guide will help you navigate the treacherous waters of the tech world and emerge victorious (or at least with a decent LinkedIn profile).
I have not yet tried this process, but I know someone who knows someone who knows Elon Musk, so trust me, this guide will change your life.
And if you do not trust me, then kiss billions goodbye.
## Become a Coding Wizard (or Pretend to Be One)
First things first: you need to learn to code. Or do you?
The beauty of the tech world is that perception is often more important than reality. Sure, you could spend years mastering Python, Java, and C++, but why bother when you can throw around buzzwords like "blockchain," "AI," and "machine learning" with reckless abandon?
If that does not work, ChatGPT has got you; keep writing posts on LinkedIn and X, and if anyone asks you, tell them you work at a Fortune 500 company as CEO-in-waiting.
Everyone does it, so when asked about your latest project, say, "Oh, I'm working on a quantum-based, AI-driven, blockchain-enabled platform for optimizing cat videos." Watch as eyes glaze over and people nod, impressed. Congratulations! You're now a tech visionary.
Nothing says "I'm a serious tech entrepreneur," like having a nervous breakdown over a missing semicolon at 3 AM.
Wait a bit. Has Tedx contacted you yet? I guess you can say you are too modest for the Nobel Prize. Just create your category and paste it all over social media.
## Master the Art of Buzzword Bingo
In the tech world, it's not about what you know but how confusing you can make it sound. The key to success is creating a product that no one understands but everyone thinks they need.
Imagine you're pitching to investors. Your product does not matter; you can pitch your kidney and tell the world how you started from your backyard. Now, you have built a revolutionary app that uses "synergistic cloud-based algorithms to leverage big data for optimizing user engagement through gamified blockchain interactions for health." What does it do? Who cares! You've just secured $50 million in funding.
Why communicate clearly when you can baffle people into submission?
## Perfect Your Turtleneck Game
Every tech billionaire needs a signature look. While hoodies and flip-flops are acceptable for the startup phase, true billionaires know the power of a good turtleneck. It screams, "I'm too busy changing the world to worry about my neck being cold."
Why choose the turtleneck? Nothing says "I'm the next Steve Jobs" like a wardrobe malfunction waiting to happen.
## Develop a Healthy God Complex
To make it big in tech, you need to believe that your half-baked idea will revolutionize the world genuinely. It's not enough to create a slightly better photo filter – you need to convince everyone (including yourself) that your app will solve world hunger, bring about world peace, and maybe even make pineapple on pizza socially acceptable.
Modesty is for people who don't have a yacht with its own smaller yacht.
## Master the Art of Failing Upwards
In the tech world, failure is just success in disguise. The key is to fail spectacularly and then spin it as a learning experience that makes you even more qualified to run a billion-dollar company.
Your first startup burns through $10 million in venture capital and produces nothing but a fancy logo and a foosball table. Instead of admitting defeat, write a LinkedIn article about how this "growth opportunity" taught you valuable lessons about "iterative development" and "pivoting in dynamic markets." Watch as job offers and investment opportunities flood in.
Because in tech, the only thing more impressive than success is surviving catastrophic failure with your ego intact.
## Cultivate an Eccentric Persona
Every tech billionaire needs a quirk that makes journalists salivate. Whether it's an obsession with Mars colonization, a diet solely of green juice, or a habit of taking ice baths while reciting Shakespeare, find your weird and lean into it hard.
At your next board meeting, announce that you've decided to communicate only through interpretive dance for the next quarter to "boost creativity and disrupt traditional communication paradigms." Watch your employees scramble to learn the Macarena to ask for a raise.
Being normal is for people who don't have billions of dollars to insulate them from social consequences.
## Redefine Basic Concepts to Suit Your Needs
Privacy? That's so 20th century. Profit? Oh, you mean "community value creation." Work-life balance? Sorry, we only believe in "passion-driven productivity ecosystems" here.
When faced with criticism about your app's invasive data collection practices, explain that you're not violating privacy; you're "enhancing user experience through personalized data synergy." If anyone objects, accuse them of being against progress and innovation.
Why choose reality distortion? You can decide what words mean when you're rich enough.
## Conclusion
Your Path to (Probably Imaginary) Billions: There you have it, future tech titan! Follow this guide, and you'll be well on your way to joining the ranks of the Silicon Valley elite. Remember, the key to success in tech is confidence, buzzwords, and a complete disconnection from reality.
Will you make a billion dollars? Probably not. But with these skills, you'll certainly sound like you could, and in the end, isn't that what matters?
Now, go forth and disrupt, innovate, and synergize your way to glory! Just don't forget us little people when you're sipping champagne on your private island – we'll still try to figure out how to center a div.
This is my weekend softball to get us ready for the weekend. If you love what I write, I write exciting posts; you can check out my blog for exciting posts on [Learnhub Blog](https://blog.learnhub.africa/); we write everything tech from [Cloud computing](https://blog.learnhub.africa/category/cloud-computing/) to [Frontend Dev](https://blog.learnhub.africa/category/frontend/), [Cybersecurity](https://blog.learnhub.africa/category/security/), [AI](https://blog.learnhub.africa/category/data-science/), and [Blockchain](https://blog.learnhub.africa/category/blockchain/).
| scofieldidehen |
1,896,313 | Master Async/Await in JavaScript: Tips and Tricks for Pros | JavaScript has come a long way from its humble beginnings, evolving into a powerful and versatile... | 0 | 2024-06-21T18:00:16 | https://dev.to/msubhro/master-asyncawait-in-javascript-tips-and-tricks-for-pros-dcd | javascript, webdev, programming, reactjsdevelopment | JavaScript has come a long way from its humble beginnings, evolving into a powerful and versatile language. One of its most powerful features, introduced in ECMAScript 2017, is the async/await syntax. This modern approach to handling asynchronous code makes it more readable and easier to debug, which is a boon for developers. In this blog, we'll delve into the basics of async/await and provide a simple example to help you master it like a pro.
## Understanding Asynchronous JavaScript
Before diving into async/await, it's crucial to understand the problem it solves. JavaScript is single-threaded, meaning it can only do one thing at a time. Asynchronous operations, like fetching data from a server or reading a file, can take a while to complete. To avoid blocking the main thread and ensure a smooth user experience, JavaScript uses asynchronous programming.
Initially, callbacks were used to handle asynchronous code. However, callbacks can lead to "callback hell," where nested callbacks become hard to manage and read. Promises were introduced to alleviate this issue, providing a cleaner and more manageable way to handle asynchronous operations. async/await builds on promises, offering an even more intuitive syntax.
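To see the difference concretely, here is a deliberately simplified sketch: the callback version nests one level deeper per dependent step, while the promise version keeps the chain flat. (The helper invokes its callback immediately for demonstration; with real I/O each callback would fire later.)

```javascript
// For demonstration, the "async" step invokes its callback immediately;
// with real I/O each callback would fire later.
function loadStep(name, callback) {
  callback(null, `${name} loaded`);
}

// Callback style: each dependent step nests one level deeper ("callback hell")
loadStep("user", (err, user) => {
  loadStep("posts", (err, posts) => {
    loadStep("comments", (err, comments) => {
      console.log(user, posts, comments);
      // Output: user loaded posts loaded comments loaded
    });
  });
});

// The same chain with promises stays flat
const loadStepP = name => Promise.resolve(`${name} loaded`);
loadStepP("user")
  .then(() => loadStepP("posts"))
  .then(() => loadStepP("comments"))
  .then(comments => console.log(comments)); // Output: comments loaded
```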
#### The Basics of Async/Await
1 **Async Functions**: An async function is a function declared with the async keyword. It always returns a promise. If the function returns a value, the promise will resolve with that value. If the function throws an error, the promise will reject with that error.
```
async function fetchData() {
return "Data fetched";
}
fetchData().then(data => console.log(data)); // Output: Data fetched
```
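The flip side is that a thrown error surfaces as a rejected promise. A minimal illustration (the function name here is made up for the example):

```javascript
async function mightFail(shouldFail) {
  if (shouldFail) {
    // Throwing inside an async function rejects the returned promise
    throw new Error("Something went wrong");
  }
  // Returning a value resolves the returned promise with that value
  return "Success";
}

mightFail(false).then(result => console.log(result));        // Output: Success
mightFail(true).catch(error => console.log(error.message));  // Output: Something went wrong
```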
2 **Await Keyword**: The await keyword can only be used inside an async function. It pauses the execution of the async function and waits for the promise to resolve or reject. Once resolved, it returns the result. If the promise rejects, await throws the rejected value.
```
async function fetchData() {
let response = await fetch('https://api.example.com/data');
let data = await response.json();
return data;
}
fetchData().then(data => console.log(data));
```
#### A Simple Example
Let's walk through a basic example to see async/await in action. We'll create a function that fetches user data from an API and logs it to the console.
1 **Setting Up the Async Function**
First, we'll define our async function and use await to handle the asynchronous operations.
```
async function getUserData() {
  try {
    let response = await fetch('https://jsonplaceholder.typicode.com/users/1');
    let user = await response.json();
    console.log(user);
    return user; // return the data so that chained callers can use it
  } catch (error) {
    console.error('Error fetching user data:', error);
  }
}
getUserData();
```
2 **Handling Errors**
Notice the try...catch block. This is essential for handling errors in async functions. If any of the awaited promises reject, the error will be caught, and we can handle it appropriately.
3 **Chaining Async Functions**
You can also chain multiple async functions. For instance, let's create another function to fetch posts by the user and log them.
```
async function getUserPosts(userId) {
try {
let response = await fetch(`https://jsonplaceholder.typicode.com/posts?userId=${userId}`);
let posts = await response.json();
console.log(posts);
} catch (error) {
console.error('Error fetching user posts:', error);
}
}
async function getUserDataAndPosts() {
try {
let user = await getUserData();
await getUserPosts(user.id);
} catch (error) {
console.error('Error fetching data:', error);
}
}
getUserDataAndPosts();
```
In this example, getUserDataAndPosts calls getUserData to fetch user data, then calls getUserPosts to fetch the user's posts using the user's ID. This demonstrates how async/await simplifies chaining asynchronous operations, making the code more readable and maintainable.
## Best Practices for Using Async/Await
- **Always Use try...catch**: Handle errors gracefully by wrapping your await calls in try...catch blocks.
- **Avoid Blocking the Main Thread**: Be cautious with await inside loops. Use Promise.all for parallel execution where possible.
- **Use Descriptive Variable Names**: Make your code more readable by using meaningful names for your variables and functions.
- **Keep Functions Small and Focused**: Break down complex tasks into smaller, more manageable async functions.
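For example, the second tip above - running independent operations in parallel with `Promise.all` instead of awaiting them one at a time - can be sketched like this (the `fetchItem` helper stands in for any real asynchronous call):

```javascript
// Simulates an asynchronous operation (e.g., a network request)
function fetchItem(id, delayMs) {
  return new Promise(resolve => setTimeout(() => resolve(`item-${id}`), delayMs));
}

async function loadSequentially() {
  // Each await blocks the next call: total time is roughly the SUM of delays
  const a = await fetchItem(1, 100);
  const b = await fetchItem(2, 100);
  return [a, b];
}

async function loadInParallel() {
  // Both promises start immediately: total time is roughly the LONGEST delay
  const [a, b] = await Promise.all([fetchItem(1, 100), fetchItem(2, 100)]);
  return [a, b];
}

loadInParallel().then(items => console.log(items)); // Output: [ 'item-1', 'item-2' ]
```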
## Conclusion
Mastering async/await can significantly improve your ability to write clean, readable, and maintainable asynchronous code in JavaScript. By understanding the basics and following best practices, you'll be well on your way to becoming a pro in handling asynchronous operations.
| msubhro |
1,896,076 | HealthLingo | AI Agents Enabling Multilingual Doctor consultation via WhatsApp | This is a submission for Twilio Challenge v24.06.12 What I Built People can directly... | 0 | 2024-06-21T17:57:07 | https://dev.to/ashiqsultan/healthlingo-consult-doctors-from-whatsapp-in-your-native-language-twilio-with-gpt-544m | devchallenge, twiliochallenge, ai, twilio | *This is a submission for [Twilio Challenge v24.06.12](https://dev.to/challenges/twilio)*
## What I Built
<!-- Share an overview about your project. -->
People can directly contact doctors from WhatsApp in their native language.
Based on the patient's query, the AI selects a suitable doctor from the list of doctors.
The doctor replies in English, while the patient receives the reply in their preferred language.
## Demo

Video explanation of the project and the codebase
{% embed https://youtu.be/5pdg0Vm-4X8 %}
<!-- Share a link to your app and include some screenshots here. -->

## Working of the App
The app consists of two layers
1. Business layer
2. AI layer
The business layer is responsible for handling incoming messages and database interactions, while the AI layer consists of multiple small agents that interact with the OpenAI API. I have used MongoDB as the database to store information such as patient details, chat summaries, and doctor information.
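The linked repo contains the actual implementation; purely as an illustration of the doctor-selection step described above (all names and data here are hypothetical, and the real app delegates this decision to a GPT-4o agent rather than keyword matching), a simplified stand-in might look like:

```javascript
// Hypothetical doctor list - illustrative only, not taken from the codebase
const doctors = [
  { id: 1, name: "Dr. Rao", specialty: "dermatology", keywords: ["skin", "rash", "acne"] },
  { id: 2, name: "Dr. Chen", specialty: "cardiology", keywords: ["heart", "chest", "palpitations"] },
];

// Simplified stand-in for the AI agent: pick the doctor whose
// specialty keywords best match the patient's query
function selectDoctor(patientQuery, doctorList) {
  const query = patientQuery.toLowerCase();
  let best = null;
  let bestScore = 0;
  for (const doctor of doctorList) {
    const score = doctor.keywords.filter(k => query.includes(k)).length;
    if (score > bestScore) {
      best = doctor;
      bestScore = score;
    }
  }
  return best; // null when nothing matches
}

console.log(selectDoctor("I have a rash on my skin", doctors).name); // Output: Dr. Rao
```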

## Twilio and AI
<!-- Tell us how you leveraged Twilio’s capabilities with AI -->
### AI
- Multiple Agents for handling specific tasks using OpenAI's GPT-4o model
- Translate Agent
- Chat Summary Agent
- Two Agents for Details Collection
- Basic Details Agent
- Medical Details Agent
### Twilio
- Twilio WhatsApp service.
- Initially the app was deployed as Twilio Functions, but as I added more conditions I sometimes hit timeouts. So I just deployed it on AWS using Docker. And yes, only one Twilio service.
## Additional Prize Categories
<!-- Does your submission qualify for any additional prize categories (Twilio Times Two, Impactful Innovators, Entertaining Endeavors)? Please list all that apply. -->
**Impactful Innovators**: People visiting a country for the first time, or immigrants who still find it difficult to express their medical condition in a foreign language, can use this kind of app to communicate with doctors and describe their symptoms and conditions.
The AI will also pick the right doctor based on their condition.
The privacy of both doctor and patient is maintained as no phone numbers are shared.
> The vision of such an app in real life would be
> - Socially minded doctors signing up for such a service and giving some details about their specialty
> - Patients who are looking for answers for some rare or specific condition could potentially find the right doctor and explain their problem in the language they are comfortable with
If you've made it this far, thanks for reading the post! Leaving a like would mean a lot, as it supports my work and encourages me to keep creating more content. Have a great day!
### GitHub
{% embed https://github.com/ashiqsultan/twilio-whatsapp-ai-bot %}
<!-- Team Submissions: Please pick one member to publish the submission and credit teammates by listing their DEV usernames directly in the body of the post. -->
<!-- Don't forget to add a cover image (if you want). -->
| ashiqsultan |
1,896,311 | F1 app made in react | Hello! I am new in this world of programming and I have decided to join two of my passions:... | 0 | 2024-06-21T17:54:38 | https://dev.to/itzale/f1-app-made-in-react-4187 | react, showdev, junior, f1 | Hello! I am new to the world of programming and I have decided to combine two of my passions: programming and Formula 1. To create this app I used the [ergast](https://ergast.com/mrd/) API, and as a framework I used React. The project is still in progress; there is still a lot to do, but it is on the right track 😊
You can see the app [here](https://f1-app-sigma.vercel.app) and the GitHub repository [here](https://github.com/ItzAle/f1-app)
Any help and opinions are welcome (*^-^*) | itzale |
1,896,310 | Remote-first companies we should know about 🤔 | Hi everyone, I'm not sure how many of u are working remotely, but I'm sure there is quite a bit of... | 0 | 2024-06-21T17:49:39 | https://github.com/devs-on-remote/remote-first-software-companies | discuss, beginners, programming, career | Hi everyone, I'm not sure how many of you are working remotely, but I'm sure there are quite a few of you.
I want to create a <u>[directory of software companies with a remote-first culture](https://github.com/devs-on-remote/remote-first-software-companies)</u> that don't treat this just as a fancy word, but actually mean it.
I've created a [repo](https://github.com/devs-on-remote/remote-first-software-companies) where I'll be adding them one by one **with some meaningful information** like:
- language used within the organisation
- size
- careers page
- most used tech stack
**Do any of you have some suggestions? Who should I add? :D**
---
PS: btw I'm building a job board for remote developers and companies with remote-first culture. If you are interested here is a link: [devsonremote.com](https://devsonremote.com/) | devonremote |
1,896,307 | An Ultralearning Approach to the Technical Interview | Intro In 2023 I interviewed with Amazon and Google. Both were eye opening experiences and... | 0 | 2024-06-21T17:43:59 | https://dev.to/himynameisoleg/an-ultralearning-approach-to-the-technical-interview-njg | ultralearning, interview, leetcode |
# Intro
In 2023 I interviewed with Amazon and Google. Both were eye opening experiences and exposed me to the rigor and extreme caution that big tech companies take with their hiring process. Their philosophy, as I’ve come to understand, is that passing on a few good candidates is better than hiring one bad egg. This poses an especially big challenge for us humble interviewees because we need to be extremely prepared, knowledgeable and level-headed come interview time. How then can we make ourselves stand out? I set out on a mission to answer just that.
# How it all started
A few months ago I was researching learning strategies to more effectively pick up a new programming language. I had shown interest in Rust a few years ago but my learning quickly hit a wall and I closed the book on it early. Then one day in the not-so-distant future I came across an awesome book called ["Ultralearning" by Scott Young](https://amzn.to/3VvDYZi) . It outlines some strategies for learning difficult things in an intense, focused, and structured way. I highly encourage you to read the book too, but if I were to convey its ethos in two bullet points it would be:
* **Plan** more than you think you need to
* Learn by doing the thing you want to do **directly**
I had decided I wanted to "ultra learn" the Rust programming language. I made many mistakes in my first attempt. I got stuck in "tutorial hell" without a well-structured learning plan and no concrete motivation for learning outside of "I should probably know a low level language". So this time I started applying some of the ultralearning techniques, and project-ifying the learning. I spent the recommended 10% on meta-learning -- doing research on **how** I will learn the language, how **others** have learned it, and collecting the books, materials and GitHub repos to aid in the learning. I then laid out a weekly plan, drilled the katas, did the coding projects and supplemented learning with a few recommended books.
However, after 4 weeks I put this project on pause again.
"But, Why?"
Well, it wasn't because I hit the wall again.
In fact, I was seeing tremendous results! This time learning Rust and Systems Programming came much more rapidly. My only problem was that this strategy worked so well. I had this burning feeling that I could be applying this wonderful new technique to anything. Something different, better, “higher impact”. Something like, say, ... Technical Interviews.
So here's what I did.
# The Ultralearning Phase
## Meta-Learning
When I did interview prep last year I just blindly solved some recommended LeetCode questions, followed a few YouTube videos, and read the most important chapters in [Cracking the Coding Interview](https://amzn.to/3VvDYZi). I forgot them almost immediately. I had poor structure and no set schedule, and as a consequence nothing seemed to stick. The learning decay curve kicked in immediately and when the day of the interview came I found it hard to retrieve the knowledge that I had only **just practiced**.
This time around I dedicated way more time to planning. Ultralearning recommends 10% of total time to be spent on planning out a strategy. Having a well devised plan helps keep you on target, helps combat the learning decay curve, and acts as an accountability partner.
### How I planned and organized
I was not trying to re-invent the wheel with this project. I wanted a simple concise study plan that I could adapt to my own style. I found this amazing documentation-style website called the [Tech Interview Handbook](https://www.techinterviewhandbook.org/software-engineering-interview-guide/) and made heavy use of it throughout. I highly recommend it as part of your planning phase. Read through this front to back. It's not very long and has some gems of advice.
I went with its recommended approach of 3 month of learning, 11 hours per week. I factored in 2 extra weeks for ... well ... this is life and sh$t happens, which came out to a cool 154 hours. This meant that about 15 hours should be dedicated to meta-learning, planning, building a calendar, finding resources, paying for courses, setting up LeetCode, etc.
In the [study and practice plan](https://www.techinterviewhandbook.org/coding-interview-study-plan/) section there is a breakdown of each topic by week, recommended time, and priority level of each topic. I used this as a baseline to craft my weekly schedule, trying to incorporate time for **spaced repetition** and **direct learning** strategies I discovered from Ultralearning. The beginning is always a little awkward when starting these things, and I used the first few days to suss out what worked and what didn't. I eventually arrived at this sequence:
#### The Learning Sequences
0. Active Learning
- Read the recommended articles and watch the recommended videos
- Over-learn the topic - articles often link to other resources so jump down that rabbit hole
1. Direct Practice
  - Solve 3-4 of the essential LeetCode - be sure to time yourself
- Subsequently review each attempted question and understand your weaknesses
2. Weekly Review
- Comprehensive review at the end of the week
- Drill and Review the recommended LeetCode
- Re-try some of the failed questions from earlier in the week
Because I work full-time, this is the schedule that I landed on after tuning the first week.
Sundays were spent on "light reading".
My work week, Mon-Fri, I spent about 1-2 hours per day drilling problem sets and reviewing.
Saturdays were spent on a longer comprehensive review of the weeks topics. Similar to Mon-Fri just more diverse set of problems. Here is how the schedule looked:
| | Sun | Mon | Tue | Wed | Thu | Fri | Sat |
| -------- | --- | --- | --- | --- | --- | --- | --- |
| Sequence | 0 | 1 | 1 | 1 | 1 | 1 | 2 |

Learning days (0s) are just there to put you in the right headspace. They are not quite as important as the other days and serve to be primers for the content to come. The problems attempted in the Direct Practice days (1) are far more important, because most of the learning will happen during your struggle when trying to complete a problem in 15-20 minutes, inevitably failing, and then reviewing what went wrong. I have come to appreciate the fact that failure is a very powerful teacher. By intentionally setting myself up for failure during the drill sessions I was able to highlight my weaknesses and force myself to go back and grok the thing.
> "Failure is a very powerful teacher."
So that was the majority of the planning phase. Initially, I left some extra time for planning during the first week to make adjustments to the schedule based on how things were going. With my calendar printed out, LeetCode purchased, books coming in the mail, I was well equipped for the next 12-14 weeks of this marathon. Now let's dive a little deeper into the learning strategy.
## The Power of Directness
Wanting to pass the technical interview meant that I needed to **practice doing technical interviews**. All the YouTube, LeetCode, books, and courses are nothing but tools to aid in understanding the domain of knowledge, but they don't magically make you a good technical interviewee. This comes down to directly practicing in the style of the interview -- usually a blank document or a physical whiteboard. There are 3 things to consider when emulating this direct interview practice.
### 1. Time yourself
During each and every problem I attempted, I had a timer going. It didn’t matter if I finished solving the question or not. The simple practice of timing the problems exposed weaknesses in my problem solving strategies and highlighted the areas where I needed to focus more effort. I kept a pen and paper handy and quickly jotted down questions that came to mind while I was solving the problem, and came back to them during the subsequent review session. This ensured that I was practicing under time pressure and also had some topics to review later under more relaxed conditions.
### 2. Bring your own examples
LeetCode is a great platform for practicing, but it gives too much information at once. Part of the challenge in the real interview is asking your interviewer the right questions. Questions like "What does the input look like", "the output", "are duplicates allowed", etc. LeetCode lays all of these out from the start. I made sure to ignore the examples and come up with my own, and only reference the examples when I was genuinely stuck. This more closely emulates a real interview.
### 3. Don't run the code
First off, solve the problem before coding it out. I made the mistake last year of jumping to the intricacies of the code before actually solving the problem. In my naive initial practice I made heavy use of the "Run" button in LeetCode. Forget this button exists. Solve the problem to the best of your ability in the time given, optimize as much as you can, but don't focus on running the code. This is not Test Driven Development and you will not have this kind of feedback during the interview.
Following these 3 strategies when drilling interview problems helped more closely simulate the true format of the interview. This is exactly how it went in 2023 and this is exactly how it is now. But all the drilling in the world was not going help if I didn't have a good strategy for review and retention.
## Retention
Learning decay is real and it sucks. Our brain is great at archiving information that we don’t access on a regular basis. I have found two strategies that work best to combat learning decay, and helped me retain information for longer periods of time:
1. Over learning
2. Spaced Repetition
### Overlearning
Going beyond surface level on any topic is probably the most effective way to make it stick. Additionally, over-learning is less of a "trick" than other strategies. Let's compare it to mnemonics. As with mnemonic strategies you are creating associations in your brain to help you quickly recall the topic. But unlike mnemonics, over-learning fundamentally adds to your overall understanding of the subject. Instead of silly acronyms helping you remember something, you have a robust web of inter-related information to fall back on.
Let me use Hash Tables as an example for how I over-learned to help me understand the topic. Initially, I knew that you can use the Hash Table data structure to quickly access data stored as key-value pairs in near constant time, O(1). What I didn't know about were all of the hashing functions and the collision resolution mechanisms used when building out a Hash Table. These are usually abstracted away in higher level languages.
I read a bunch of articles on the subject (I love it when online articles links to more articles). Next thing I knew I had 12 tabs open. Even though each article covered essentially the same thing: "here's what a Hash Table is, what it does, here's how", they all approached it slightly differently. Seeing the various approaches and different implementation strategies really helped contextualize the essence of the topic for me, even if I never had to recall the implementation details.
What I have found is that my brain is able to create some common patterns after consuming about 3-5 different resources on one topic. So once I have reviewed 3 different materials on the subject and can solve 3 similar problems, I feel confident enough about moving on. If I still can't solve 3 problems, then there are likely some knowledge gaps I need to fill.
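To make the Hash Table internals concrete, here is a toy, deliberately naive implementation with separate chaining for collisions - the kind of detail those articles walk through. (Real implementations use far better hash functions and resize dynamically; this sketch is only illustrative.)

```javascript
class ToyHashTable {
  constructor(size = 8) {
    // Each bucket is a list of [key, value] pairs (separate chaining)
    this.buckets = Array.from({ length: size }, () => []);
  }

  // Naive hash: sum of character codes modulo the bucket count
  _hash(key) {
    let sum = 0;
    for (const ch of key) sum += ch.charCodeAt(0);
    return sum % this.buckets.length;
  }

  set(key, value) {
    const bucket = this.buckets[this._hash(key)];
    const entry = bucket.find(([k]) => k === key);
    if (entry) entry[1] = value;      // overwrite on duplicate key
    else bucket.push([key, value]);   // colliding keys share a bucket
  }

  get(key) {
    const bucket = this.buckets[this._hash(key)];
    const entry = bucket.find(([k]) => k === key);
    return entry ? entry[1] : undefined;
  }
}

const table = new ToyHashTable();
table.set("ab", 1);
table.set("ba", 2); // same character sum as "ab" -> same bucket, resolved by chaining
console.log(table.get("ab"), table.get("ba")); // Output: 1 2
```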
The challenge with the over-learning strategy is knowing when to stop. Knowing how deeply to go down the rabbit hole. This is entirely subjective but a great segue to the next strategy: Spaced Repetition.
### Spaced Repetition
We know that time is the enemy of our learning. In my first year of university I took a psychology class and one of the first things my professor pulled up on the board was the classic "forgetting curve". We learn really quickly at first, but we forget just as quickly. Thankfully we can combat this tendency to forget with repetition. The curve becomes far less steep with subsequent reviews, and this is why I wanted to incorporate a spaced repetition strategy into my schedule. Spacing the learning and drilling 1 day apart ensures that there is ample time between learning something quickly and then reviewing aggressively to reinforce the ideas, and challenges any misconceptions you may have mistakenly formed in the initial study. Having a comprehensive drill / review session at the end of the week further solidifies this.

# How its going
## Timing is key
After piloting these strategies for a week, I have learned quite a bit. Firstly, the 11 hour per week time commitment feels a bit too low to deeply cover all the content prescribed in the guide. When I allocate 15 minutes to solving each problem, the subsequent review can take up to 30 minutes if the problem is sufficiently complex or used some Pythonic trick that I am not familiar with. E.g. I learned that I cannot use a List as the key for a dictionary because Lists in Python are mutable. It took time for me to understand that casting a List to a Tuple makes it immutable, and this can be a useful hashing mechanism for anagram problems if you are optimizing for performance and trying to avoid sorting.
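The Python trick relies on tuples being hashable dictionary keys; the JavaScript analogue I keep in mind is to build a canonical string key, since JS objects and arrays compare by reference in a Map. A minimal sketch of the anagram-grouping idea (here using sorted characters as the key, which is the simpler, sorting-based variant):

```javascript
// Group words that are anagrams of each other.
// The sorted characters form a canonical, comparable key -
// the same role Python's tuple plays as a dict key.
function groupAnagrams(words) {
  const groups = new Map();
  for (const word of words) {
    const key = [...word].sort().join("");
    if (!groups.has(key)) groups.set(key, []);
    groups.get(key).push(word);
  }
  return [...groups.values()];
}

console.log(groupAnagrams(["eat", "tea", "tan", "ate", "nat", "bat"]));
// Output: [ [ 'eat', 'tea', 'ate' ], [ 'tan', 'nat' ], [ 'bat' ] ]
```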
Despite the early time investment, I am very hopeful that it will get easier. Early on, I tripped up on simple problems. Now I can solve these types of problems in half the time, even using the most efficient strategy! The learning curve seems to be steep initially, but once I've encountered each type of problem and start grokking it, my time to solve seems to drop significantly. I am hopeful this trend continues.
## Distraction
It was Easter Sunday weekend. I knew the weekend would be spent with my family, but I also wanted to keep up with my studies. I planned to pepper some light reading and review in between the holiday festivities. I loaded up some Medium articles onto my Kindle. There were only about 6-8 articles on Linked Lists, Queues, Stacks, Sorting and Searching - topics I was already pretty familiar with. What I learned, however, is that distraction is the enemy of progress. What should have taken me at most an hour or two wound up taking several hours due to distraction. This further highlighted the importance of having a quiet, focused study space, because this kind of technical literature, even the "light reading", cannot be consumed while I am distracted.
## Motivation
I have been following the [r/leetcode](https://reddit.com/r/leetcode) subreddit and have been seeing a wide spectrum of stories of both inspiring and demoralized experiences with leetcode interview prep. There is definitely a strong learning curve with the problems but given enough time and with the right strategy it becomes like second nature and quite fun actually. After going through the initial learning phase and getting over a week of the Easy level drilling, I can confidently say I can solve most easy problems in just a few minutes. Your brain starts recognizing patterns and there are only a dozen or so different problems. I am cautiously optimistic though, because I know once I start on the Mediums, there will definitely be a new hump to overcome. But having a solid foundation is very helpful as Mediums generally just add a layer of complexity or an additional data structure on top of the easy. My advice to those frustrated, demoralized souls on reddit is this - "stick with it, lay a good foundation, and don't burn yourself out".
# Applications and Offers
Throughout this process I have also been researching companies and applying to the ones I find to be a good fit. In total I applied to about 30 carefully researched and considered roles. I received many polite rejection letters and even a weird AI screening. Here are some of the highlights, interviews and decisions:
## 1. Fintech Research Company - Interview, No Offer 😵
This was my first invitation to interview. There was no Online Assessment; it went straight to the interview. It was your average technical grilling. As this was the first interview in a while I felt a little underprepared. It also felt like my interviewers were a bit underprepared or were not accustomed to giving interviews. It was a jumble of technical trivia and generic interview questions which I did my best to answer, but some of the questions were so trivial I felt as if I were on a game show -- less about problem solving and more about the intricacies of a specific language or database.
I learned a lot from this interview, namely that I needed a refresher on some of the tech I hadn't used in a while, especially if it shows up on the job description. Ultimately I think my lack of preparedness showed and I didn't move on.
## 2. Amazon - Screened Out Again 😵
I applied to Amazon in 2022 and never made it past the coding assessment. As I was already in their system I got an invitation to go through another interview. It's almost like they saw that I ordered "Cracking the Coding Interview" and magically knew I was looking 😅.
The process was pretty standard.
I emailed back the recruiter with my new resume, filled out the online application. She also asked me to answer a list of some basic screening questions. Things like openness to relocate, my current role, how much time I spend coding/designing, etc.
I took the Online Assessment about a week later and the questions were fairly simple. The first one passed most test cases. The second one stumped me and only passed a few edge cases. I had the right approach but missed a minor detail - I managed to solve it later, but my approach was probably never looked at and I was screened out.
> Big takeaway here is if you don't pass the Online Assessment questions you'll likely be screened out.
Given the current market for software engineers, I think the cold truth is that "companies can be super picky". The applicant pool is pretty massive now after all the layoffs, and they are more liberally screening out. So being able to pass the OA is a huge foot in the door and shouldn't be taken lightly.
## 3. Healthcare Company - Offer ✅
I applied early April 2024, did a short phone screen, and took another HackerRank assessment. This one was more technology specific and role oriented. They asked some very specific multiple choice questions about React, Auth flow, and some infrastructure YAML formatting. A little trivia-like, but I think they wanted to gauge whether I had used certain technologies or not. There was one coding problem in JavaScript related to generators, which I completely forgot were a thing in JS, but after a quick language reference lookup I solved it no problem. Overall a pretty straightforward exam. Not the best way to gauge the applicant's overall problem-solving abilities, but much easier to get a foot in the door than the "leetcode" style problems. Neither of these approaches is perfect in my opinion.
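(For anyone who also forgot: a JavaScript generator is a function that can pause and resume. The actual assessment question isn't reproduced here; this is just a minimal refresher.)

```javascript
// A generator function yields values lazily, one per .next() call
function* countUpTo(limit) {
  for (let i = 1; i <= limit; i++) {
    yield i; // execution pauses here until the next .next()
  }
}

const counter = countUpTo(3);
console.log(counter.next().value); // Output: 1
console.log(counter.next().value); // Output: 2
console.log([...countUpTo(3)]);    // Output: [ 1, 2, 3 ] - generators are iterable
```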
The following week I was invited for a panel style interview with two architects and a director. Having learned my lesson before, I spent the weekend diving into the technologies listed in the job description and going over the tech I wasn't super familiar with. I built a small app using Next.js, TypeScript and Auth0 as a refresher. This exercise helped a ton. I think any technology listed on the JD is always fair game for questions in the interview. The weekend prep paid off, and the interview went really well.
I sent a thank you note to the recruiter a few days later and asked for any feedback. We had a conversation the following day and I received news that I was their top candidate!
# Did I accept the offer?
YES! After a few more emails and some salary negotiation I was happy with the final offer and accepted it.
# Key Takeaways and Summary
Well, I didn't make it the whole 14 weeks. In fact I "only" made it to week 11 by the time I got an offer. But in the last 10 weeks I sure did learn a thing or two.
For Big Tech companies my bottleneck was the Online Assessment. I think if I had the full 14 weeks instead of just 4 of leetcode grind I would have been in much better shape to solve the Amazon questions perfectly.
For the companies that didn't do leetcode the big mistake I made was not being prepared to answer interview questions. I learned from my mistakes during the first interview and prepared a list of examples of **challenges**, **failures** and **successes** across all of my projects and tied them closely to the job requirements for the new position. Better yet I built a small demo app using some of the technologies from the job description to give myself a nice refresher.
Last of all, this process **takes time**. 14 weeks may seem like a long time, but it sure did go by quickly. Thankfully the intensity of this period really paid off. I learned a metric ton of "tricks" and strategies for solving problems efficiently. And the awesome reward from all of those long hours after work and on weekends: the closing of one chapter and opening of an exciting new one!
# So whats next?
### Building new habits
Changing jobs or starting new work projects always feels like a clean slate.
I always like to take this time as an opportunity to reflect on what worked in the past and what didn't. In the next few weeks as I transition from my old job to the new one, I plan to start setting some goals for myself in this next chapter.
### Forever LeetCode - I get it
Having gone through a sizable chunk of the learning plan, all of the pain of getting stuck and the manic joy of finally solving a problem on my own, I kinda understand the hype. I'm relieved to get 3 hours back each day, but also kind of bummed I didn't get to do more of those problems during the interview process. I definitely feel like these last 10 weeks have made me a much stronger software engineer. There is no doubt about that. Though I have a massive list of books I want to get through, I will definitely continue to solve leetcode on the regular just to keep current. Maybe even try using a language other than Python.
### Ultralearning ubiquity
If I learned one thing during this whole process, it's that the ultralearning framework works tremendously well! It's portable and can be applied to nearly any new learning. I am extremely happy to have successfully applied it in my personal life. I feel like I have discovered a sort of cheat code to life and feel motivated to continue applying it in other practical areas and in my next project!
## Outro
If you made it this far, thanks for taking the time to read. If you have any questions or constructive feedback, let's connect!
perchykoleg@gmail.com
@himynameisoleg
| himynameisoleg |
1,896,308 | Production Level Context with Next.js (Typescript) 🔥 | Connect 👋 Xam LinkedIn Setup First, let's create a file for defining the... | 0 | 2024-06-21T17:42:52 | https://dev.to/codexam/production-level-context-with-nextjs-typescript-15kk | webdev, nextjs, javascript, react | ## Connect 👋
- [Xam](https://github.com/Subham-Maity)
- [LinkedIn](https://www.linkedin.com/in/subham-xam/)
### Setup
First, let's create a file for defining the context (let's call it `app-context.ts`):
```tsx
import React from "react";
export interface ContextState {
name: string;
}
export interface ContextDispatch {
setName: React.Dispatch<React.SetStateAction<string>>;
}
type ContextProps = ContextState & ContextDispatch;
const defaultState: ContextState = {
name: "Hello",
};
const defaultDispatch: ContextDispatch = {
setName: () => {},
};
const defaultContext: ContextProps = {
...defaultState,
...defaultDispatch,
};
const AppContext = React.createContext<ContextProps>(defaultContext);
export default AppContext;
```
Now, let's create a file for the provider component (let's call it `app-context-provider.tsx`):
```tsx
"use client";
import React, { useState } from "react";
import AppContext, { ContextState, ContextDispatch } from "./app-context";
interface AppProviderProps {
children: React.ReactNode;
}
const AppProvider: React.FC<AppProviderProps> = ({ children }) => {
const [name, setName] = useState<string>("Hello");
const contextState: ContextState = {
name,
};
const contextDispatch: ContextDispatch = {
setName,
};
return (
<AppContext.Provider value={{ ...contextState, ...contextDispatch }}>
{children}
</AppContext.Provider>
);
};
export default AppProvider;
```
Finally, let's create a custom hook for using the context (you can put this in a separate file or in the `index.ts` file):
```tsx
import { useContext } from "react";
import AppContext from "./app-context";
export function useAppContext() {
return useContext(AppContext);
}
```
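For the `import { useAppContext } from "@/context"` lines below to resolve, a barrel `index.ts` in the context folder might look like this (the file names here are assumptions based on this article, not a prescribed layout):

```tsx
// context/index.ts (assumed file layout): one entry point for the context API.
import { useContext } from "react";
import AppContext from "./app-context";

export { default as AppContext } from "./app-context";
export { default as AppProvider } from "./app-context-provider";

// The custom hook from above, living in the barrel file:
export function useAppContext() {
  return useContext(AppContext);
}
```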
Now you can use it in your application like this:
Wrap your app or a part of it with the `AppProvider`:
`layout.tsx`
```tsx
// Imports assumed for a standard Next.js App Router setup:
import { Inter } from "next/font/google";
import AppProvider from "@/context/app-context-provider";

const inter = Inter({ subsets: ["latin"] });

export default function RootLayout({
children,
}: Readonly<{
children: React.ReactNode;
}>) {
return (
<html lang="en">
<body className={`${inter.className} bg`}>
<AppProvider>{children}</AppProvider>
</body>
</html>
);
}
```
### Use the context in your components:
`a-component.tsx`
```tsx
"use client";
import React from "react";
import { useAppContext } from "@/context";
const AComponent = () => {
const { setName } = useAppContext();
return (
<div>
<button
className="default-button"
onClick={() => {
setName("Subham Maity");
}}
>
Change The Name
</button>
</div>
);
};
export default AComponent;
```
`b-component.tsx`
```tsx
"use client";
import React from "react";
import { useAppContext } from "@/context";
const BComponent = () => {
const { name } = useAppContext();
return (
<div>
<p>{name}</p>
</div>
);
};
export default BComponent;
```
This structure provides better type safety, separates concerns, and is more scalable. It's easier to add new state variables and update functions as your app grows. | codexam |
1,896,299 | What is Adobe Experience Manager(AEM) & it's Features | Adobe experience manager(AEM) is a content management systems(CMS) for building website , mobile... | 0 | 2024-06-21T17:24:56 | https://dev.to/sagar7170/what-is-adobe-experience-manageraem-its-features-3091 | aem, webdev |

> Adobe Experience Manager (AEM) is a content management system (CMS)
> for building websites, mobile apps, and forms. It enables organizations to create, manage, and deliver digital experiences across various channels, ensuring consistency and relevance.
## Key features
**Content Management**: AEM provides powerful tools for creating, editing and managing web content. It allows content authors to build and update websites with a WYSIWYG editor, without needing extensive technical knowledge.
**Digital Asset Management (DAM)**: AEM includes a DAM system for managing rich media like images, videos, and documents.
**Multi-site Management**: AEM supports managing multiple websites from a single platform. It allows for content reuse, localization, and regionalization, making it easier to maintain consistency across global sites.
**Responsive Design**: AEM provides tools for creating responsive and adaptive designs, ensuring that websites look and function well on various devices and screen sizes.
**Forms and Documents**: AEM Forms enables the creation and management of forms and documents, supporting use cases like applications, surveys, and customer correspondence.
**Commerce Integration**: AEM integrates with e-commerce platforms to deliver personalized and consistent shopping experiences across all digital touchpoints.
**Extensibility**: AEM is highly extensible, allowing developers to create custom components, templates, and services. It supports integration with third-party systems through APIs and connectors.
## Use Cases for AEM
**Corporate Websites**: AEM is used to build and manage large corporate websites that require frequent updates, localization, and integration with marketing tools.
**E-commerce**: Online retailers use AEM to deliver personalized shopping experiences, integrate with back-end systems, and manage digital assets.
**Government Portals**: Government agencies use AEM for managing large volumes of content, ensuring accessibility, and delivering services to citizens.
**Media and Entertainment**: Media companies use AEM to manage digital assets, streamline content workflows, and deliver engaging multimedia experiences.
**Financial Services**: Banks and financial institutions use AEM to create secure, compliant, and personalized digital experiences for customers. | sagar7170 |
1,896,306 | 🗞 Rapyd Developer Newsletter: June 2024 💳 🔗 Rapyd Payment Gateways + Save Card Details Toolkit Integration | API Changelog | Product Changelog Building an Ecommerce Travel Agency Website with the... | 0 | 2024-06-21T17:39:16 | https://dev.to/rapyd/rapyd-developer-newsletter-may-2024-rapyd-payment-gateways-save-card-details-toolkit-integration-4ojd | rapydnews, fintech, payments, tutorial | ##### [**API Changelog**](https://docs.rapyd.net/en/api-changelog.html) | [**Product Changelog**](https://docs.rapyd.net/en/product-changelog.html)
---
[**Building an Ecommerce Travel Agency Website with the Rapyd Payment Gateway**](https://community.rapyd.net/t/building-a-travel-agency-website-with-the-rapyd-payment-gateway/59353) – Use the Rapyd Payment Gateway to support your online business. From travel agencies to e-commerce, the payment possibilities through Rapyd Collect are limitless.
[**High-Opportunity Industries Payments Report**](https://community.rapyd.net/t/73-of-businesses-struggle-with-payment-delays-according-to-rapyd-s-2024-state-of-payments-for-high-opportunity-industries/59349) – Essential insights from Rapyd's latest report every developer should know. Discover how high-opportunity industries are fast-growing, highly profitable, and have unmet payment needs.
[**Python Beta Testers Program **](https://community.rapyd.net/t/join-our-python-beta-testers-program/59354) – Demonstrate your skills as a Python developer by joining the Beta Testers Program. Access the latest features and tools while networking with other Python developers.
[**Integrate With the Hosted Save Card Details Toolkit**](https://docs.rapyd.net/en/hosted-save-card-details-page.html) – Check out our updated documentation for the Save Card Details Toolkit. Integrate seamlessly and embed the best solution for your business website.
[**Historical FX Rates**](https://community.rapyd.net/t/query-regarding-access-to-historical-foreign-exchange-rates-via-rapyd-api/59287) – See your questions get answered in the Rapyd Developer Community. Utilize the Rapyd API and our documentation to find out about historical FX rates.
[**KYB Application Updates**](https://docs.rapyd.net/en/activating-your-account--kyb-.html) – The user experience for the Know Your Business (KYB) Application has been updated. Learn how to quickly verify your business to begin transacting with Rapyd.
---
Do you have ideas for improving this newsletter, or do you wish to contribute an article or join a panel? Let us know by responding.
Thanks for reading,
Drew, McKay, and the entire Rapyd team | uxdrew |
1,896,305 | Redux-Toolkit vs React Context API: Mastering State Management in React | State management is a critical aspect of modern web development, especially when building complex... | 0 | 2024-06-21T17:37:38 | https://dev.to/msubhro/redux-toolkit-vs-react-context-api-mastering-state-management-in-react-2o40 | reactjsdevelopment, redux, javascriptlibraries, mern | State management is a critical aspect of modern web development, especially when building complex applications with frameworks like React. Two popular tools for managing state in React are Redux-Toolkit and the React Context API. Each has its own strengths and use cases, and understanding these can help you choose the right tool for your project. In this blog post, we'll take a deep dive into both Redux-Toolkit and React Context API, comparing their features, usage, and performance, with basic examples to illustrate their differences.
## Redux-Toolkit: Streamlining Redux
#### What is Redux-Toolkit?
Redux-Toolkit is the official, recommended way to write Redux logic. It provides a set of tools and best practices that simplify the process of writing Redux code, making it more efficient and less error-prone. Redux-Toolkit includes utilities for creating and managing slices of state, dispatching actions, and configuring the store.
#### Key Features of Redux-Toolkit
- **Simplified Configuration**: Redux-Toolkit reduces boilerplate code with functions like configureStore and createSlice.
- **Immutability**: Built-in support for immutable updates using Immer.
- **Enhanced DevTools**: Better integration with Redux DevTools for debugging.
- **Middleware**: Simplified middleware setup.
#### Basic Example with Redux-Toolkit
Let's create a simple counter application using Redux-Toolkit.
1 **Install Redux-Toolkit and React-Redux**:
```
npm install @reduxjs/toolkit react-redux
```
2 **Create a Redux slice**:
```
// features/counter/counterSlice.js
import { createSlice } from '@reduxjs/toolkit';
const counterSlice = createSlice({
name: 'counter',
initialState: { value: 0 },
reducers: {
increment: (state) => {
state.value += 1;
},
decrement: (state) => {
state.value -= 1;
},
},
});
export const { increment, decrement } = counterSlice.actions;
export default counterSlice.reducer;
```
3 **Configure the store**:
```
// app/store.js
import { configureStore } from '@reduxjs/toolkit';
import counterReducer from '../features/counter/counterSlice';
const store = configureStore({
reducer: {
counter: counterReducer,
},
});
export default store;
```
4 **Connect React components**:
```
// App.js
import React from 'react';
import { useSelector, useDispatch } from 'react-redux';
import { increment, decrement } from './features/counter/counterSlice';
import store from './app/store';
import { Provider } from 'react-redux';
const Counter = () => {
const count = useSelector((state) => state.counter.value);
const dispatch = useDispatch();
return (
<div>
<p>{count}</p>
<button onClick={() => dispatch(increment())}>Increment</button>
<button onClick={() => dispatch(decrement())}>Decrement</button>
</div>
);
};
const App = () => (
<Provider store={store}>
<Counter />
</Provider>
);
export default App;
```
## React Context API: Simplicity and Flexibility
#### What is React Context API?
The React Context API is a built-in feature of React that allows you to pass data through the component tree without having to pass props down manually at every level. It is often used for theming, user authentication, and managing simple state.
#### Key Features of React Context API
- **Simplicity**: Easy to set up and use for small to medium-sized applications.
- **Flexibility**: Suitable for a variety of use cases, from global themes to user settings.
- **Integration**: Works seamlessly with React’s built-in hooks.
#### Basic Example with React Context API
Let's create the same counter application using the React Context API.
1 **Create a Context and Provider**:
```
// CounterContext.js
import React, { createContext, useReducer, useContext } from 'react';
const CounterContext = createContext();
const counterReducer = (state, action) => {
switch (action.type) {
case 'increment':
return { value: state.value + 1 };
case 'decrement':
return { value: state.value - 1 };
default:
throw new Error(`Unknown action: ${action.type}`);
}
};
export const CounterProvider = ({ children }) => {
const [state, dispatch] = useReducer(counterReducer, { value: 0 });
return (
<CounterContext.Provider value={{ state, dispatch }}>
{children}
</CounterContext.Provider>
);
};
export const useCounter = () => {
const context = useContext(CounterContext);
if (!context) {
throw new Error('useCounter must be used within a CounterProvider');
}
return context;
};
```
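A nice property of this pattern is that `counterReducer` is a plain pure function, so its logic can be exercised without React at all. A standalone copy, just for illustration:

```javascript
// Standalone copy of the reducer above (no React required).
const counterReducer = (state, action) => {
  switch (action.type) {
    case 'increment':
      return { value: state.value + 1 };
    case 'decrement':
      return { value: state.value - 1 };
    default:
      throw new Error(`Unknown action: ${action.type}`);
  }
};

// Each call returns a new state object instead of mutating the old one.
let state = { value: 0 };
state = counterReducer(state, { type: 'increment' });
state = counterReducer(state, { type: 'increment' });
state = counterReducer(state, { type: 'decrement' });
console.log(state.value); // 1
```

Because the reducer never mutates its input, it is trivial to unit-test in isolation before wiring it into a provider.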
2 **Use Context in components**:
```
// App.js
import React from 'react';
import { CounterProvider, useCounter } from './CounterContext';
const Counter = () => {
const { state, dispatch } = useCounter();
return (
<div>
<p>{state.value}</p>
<button onClick={() => dispatch({ type: 'increment' })}>Increment</button>
<button onClick={() => dispatch({ type: 'decrement' })}>Decrement</button>
</div>
);
};
const App = () => (
<CounterProvider>
<Counter />
</CounterProvider>
);
export default App;
```
## Comparison: Redux-Toolkit vs React Context API
#### When to Use Redux-Toolkit
- **Complex State Logic**: Ideal for applications with complex state logic, multiple reducers, and middleware needs.
- **Large Applications**: Scales well for large applications where state management needs to be robust and maintainable.
- **Advanced Features**: Benefits from advanced Redux features like DevTools and middleware.
#### When to Use React Context API
- **Simplicity**: Perfect for smaller applications or components where you need a simple state management solution.
- **Component-Scoped State**: Useful for managing state that doesn’t need to be shared across many components.
- **Lightweight**: Avoids the overhead of adding a full-fledged state management library.
## Performance Considerations
Redux-Toolkit generally offers better performance in large applications due to its optimized updates and middleware capabilities. React Context API, while simpler, can suffer from performance issues if not used carefully, as it re-renders all consuming components whenever the context value changes.
## Conclusion
Both Redux-Toolkit and React Context API have their place in the React ecosystem. Redux-Toolkit is powerful and suitable for large, complex applications, while React Context API offers simplicity and ease of use for smaller projects. Understanding their strengths and limitations will help you make an informed decision based on the needs of your application.
By exploring the examples provided, you can get a hands-on feel for how each approach works and determine which one aligns best with your project requirements.
| msubhro |
1,896,304 | 💅🏻CSS-in-JS is Making a Comeback: What’s New? | So, it looks like we are changing the direction of how we write CSS. At first, I didn't like writing... | 0 | 2024-06-21T17:35:41 | https://dev.to/girordo/so-css-in-js-is-back-what-is-now-1fc6 | javascript, css, webdev, design | So, it looks like we are changing the direction of how we write CSS.
At first, I didn't like writing CSS-in-JS. It seemed unnecessary, but then I had a job where they only used styled-components, and I guess I adapted.
I didn't understand why we needed a library to write CSS when we could just use stylesheets directly or something like Tailwind CSS. But now that I’m a more experienced developer (I think), I kind of get why people love these libraries so much.
JavaScript frameworks are always changing direction from time to time, and now with React Server Components and React 19 supporting style hoisting, it feels like we’re heading into a new era. React 19 introduces new features that handle `<script>` and `<style>` tags inside components more intelligently, moving them to the document's head based on their precedence. This works seamlessly with concurrent rendering on the client and streaming rendering on the server, marking a significant shift in how we manage styles and scripts.
Some of these cool libraries are really embracing this change:
- [Panda CSS](https://panda-css.com/)
- [Stylex](https://stylexjs.com/blog/introducing-stylex)
- [Pigment CSS](https://mui.com/blog/introducing-pigment-css/)
- [ReStyle](https://www.restyle.dev/)
- [Stitches](https://stitches.dev/)
---
Inspiration
- [Bytes](https://bytes.dev/archives/298)
<p align="center"><em>This article was crafted and tailored with ChatGPT help.</em> 🤖💡</p> | girordo |
1,896,302 | Angular + Auth0 + .NET | Intro As some of you might know, I am on my road to become a senior software engineer. To... | 27,811 | 2024-06-21T17:33:12 | https://dev.to/suneeh/angular-auth0-net-10i3 | webdev, angular, dotnet, fullstack | ## Intro
As some of you might know, I am on my road to becoming a senior software engineer. To do so, I follow a backend roadmap that will teach me everything I need to know about .NET, databases and some deployment as well. Since I currently work as a Fullstack Software Engineer, I wanted to start a frontend project alongside it.
## 🖥️ Backend
The backend will consist of the new [minimal APIs of .NET](https://learn.microsoft.com/en-us/aspnet/core/fundamentals/minimal-apis?view=aspnetcore-8.0). You can see my introduction [here](https://dev.to/suneeh/net-fundamentals-minimal-api-1h9). A [PostgeSQL Database](https://www.postgresql.org/), EfCore and some [CRUD](https://en.wikipedia.org/wiki/Create,_read,_update_and_delete) endpoints were created super quickly.
## 🌍 Frontend
I created a new Angular App using all the newest features: [Signals](https://angular.dev/guide/signals), the [new control flow syntax](https://angular.dev/guide/templates/control-flow), [standalone components](https://blog.angular-university.io/angular-standalone-components/) and all that. I started out with a simple "shell" component, that will be the main "frame" of the website - it holds items like the side-nav, header, footer and a router-outlet for the content of the current page. Basically this is ALWAYS shown, to be consistent with my navigation and over all look and feel.
## 🔐 Auth0 and RBAC (Role based Access Control)
Today I started using [Auth0](https://auth0.com/) - a free Identity Provider by [okta](https://www.okta.com/). They have a nice free plan that is very much enough for my little project and can be upgraded afterwards, if this project will ever go live and commercial. See my progress [here](https://github.com/Suneeh/webshop/commit/b6a0d3b83d886e50366ea8c68361583d4dfbb33c)
Auth0 has a nice [Angular npm package](https://www.npmjs.com/package/@auth0/auth0-angular) `ng add @auth0/auth0-angular` that makes it super simple to handle login/logout/session/userinfo. After a 5-10 minute setup of your Auth0 App, API and Roles/Permissions you can start seeing the first results already! Registering your first user through your own Angular app took nothing more than a login button that calls the `AuthService` provided by the package. After logging in, you get an AccessToken and IdentityToken holding all the information about the user and their permissions.
Afterwards I built an API-Service that can call my .NET backend while appending the AccessToken as an Authorization Header to the request.
On the Backend I then built an Authorization Handler that checked the AccessToken claims for the permissions and added the `.RequireAuthorization(string permission)` to the endpoints that I wanted to protect. It took me some time to figure out some [CORS](https://developer.mozilla.org/en-US/docs/Web/HTTP/CORS) stuff on my local machine, but in the end everything worked out just fine, and a [POC](https://en.wikipedia.org/wiki/Proof_of_concept) was successfully created.
## 📋 Future Tasks
Future goals will be to copy this protection to all the endpoints and to make a plan of which endpoints need protection.
Maybe rethink the roles: currently there is an `Admin` role that will be able to create, edit and remove products and categories, as well as a `User` role that represents an authenticated user (since logged-in users have more features than non-logged-in users: Cart/Profile/History...).
Create more endpoints (currently there is a set of endpoints that allows CRUD for `Products`).
Create frontend pages and [guards](https://angular.dev/guide/routing/common-router-tasks#preventing-unauthorized-access) for the existing CRUD endpoints.
If you want to track my progress, or check out some [code](https://github.com/Suneeh/webshop), feel free.
## 🙏🏽 Thanks
Thank you so much if you read this article all the way! Leave a comment if you have any questions, I'll be more than happy to answer right away. If you are shy you can also message me directly on [GitHub](https://github.com/Suneeh), [Instagram](https://www.instagram.com/_suneeh/) or [TikTok](https://www.tiktok.com/@_suneeh). | suneeh |
1,896,300 | Chapter 1 Test | Appendix A Test answers (page 604) 1. What is bytecode and why is it important for the... | 0 | 2024-06-21T17:27:26 | https://dev.to/devsjavagirls/teste-do-capitulo-1-1p69 | java, javaprogramming, javaparainiciantes | Appendix A Test Answers (page 604)
**1. What is bytecode, and why is it important to Java's use in Internet programming?**
Bytecode is a highly optimized set of instructions that is executed by the Java Virtual Machine. It helps Java provide portability and security.

**2. What are the three basic principles of object-oriented programming?** Encapsulation, polymorphism, and inheritance

**3. Where do Java programs begin execution?**
Java programs begin execution at main( )

**4. What is a variable?**
A variable is a named location in memory. The contents of a variable can be changed during the execution of a program.

**5. Which of the following variable names is invalid?**
A. count
B. $count
C. count27
D. 67count

A: The invalid variable name is the one in option D. Variable names cannot begin with a digit.

**6. How do you create a single-line comment? And a multiline comment?** A single-line comment begins with // and ends at the end of the line. A multiline comment begins with /* and ends with */.

**7. Show the general form of the if statement. Also show the general form of the for loop.**
General form of if:
if(condition) statement
General form of for:
for(initialization; condition; iteration) statement;

**8. How do you create a block of code?**
A block of code begins with an opening brace and ends with a closing brace.

**9. The Moon's gravity is about 17% of Earth's. Write a program that computes your weight on the Moon.**

```
/*
   Computes your weight on the Moon.
   Call this file Moon.java.
*/
class Moon {
  public static void main(String args[]) {
    double earthweight; // weight on Earth
    double moonweight;  // weight on the Moon

    earthweight = 165;
    moonweight = earthweight * 0.17;

    System.out.println(earthweight + " earth-pounds is equivalent to " + moonweight + " moon-pounds.");
  }
}
```

**10. Adapt the code from the Try This 1-2 section so that it displays a table of conversions from inches to meters. Display 12 feet of conversions, inch by inch. Output a blank line every 12 inches.** (One meter equals approximately 39.37 inches.)

```
/*
   This program displays a table of
   inch-to-meter conversions.
   Call it InchToMeterTable.java.
*/
class InchToMeterTable {
  public static void main(String args[]) {
    double inches, meters;
    int counter;

    counter = 0;
    for(inches = 1; inches <= 144; inches++) {
      meters = inches / 39.37; // convert to meters
      System.out.println(inches + " inches is " + meters + " meters.");

      counter++;
      // every 12 lines, print a blank line
      if(counter == 12) {
        System.out.println();
        counter = 0; // reset the line counter
      }
    }
  }
}
```

**11. If you make a typing mistake when entering your program, what kind of error will result?** A syntax error.

**12. Does it matter where on a line you put a statement?**
No, Java is a free-form language.

| devsjavagirls |
1,896,298 | Day 20 of 30 JavaScript | Hey reader👋 Hope you are doing well😊 In the last post we have talked about hoisting and... | 0 | 2024-06-21T17:23:45 | https://dev.to/akshat0610/day-20-of-30-javascript-ilb | webdev, javascript, beginners, tutorial | Hey reader👋 Hope you are doing well😊
In the last post we talked about hoisting and interpolation in JavaScript. In this post we are going to learn about Callbacks in JavaScript, with an introduction to Asynchronous and Synchronous JavaScript.
So let's get started🔥
## Callbacks in JavaScript
A callback is a function passed as an argument to another function.

Here `forEach` is a method available on arrays in JavaScript.
It iterates over each element in the array and executes the provided callback function `(fun)` once for each element.
So here we have the method `forEach`, which takes the function `fun` as an argument.
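Putting that together, a small runnable sketch (the sample array and the collecting behavior are just for illustration):

```javascript
// `fun` is a callback: a function passed to forEach,
// which invokes it once for each array element.
const visited = [];

function fun(element) {
  visited.push(element); // called once per element
}

[1, 2, 3].forEach(fun);

console.log(visited); // visited is now [1, 2, 3]
```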
We know that JavaScript functions are executed in the order they are called, and sometimes we would like better control over when a function executes. Callbacks came into the picture to give us that control. You will learn more about this when we discuss Asynchronous JavaScript.
**Application of Callback functions ->**
1. Event handlers -> In event-driven programming, such as in web development, callback functions are used to handle events like clicks, form submissions, and mouse movements.

So here, when a button with the id `myButton` is clicked (an event occurs), a callback function is called and executed.
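In a browser this is typically written as `document.getElementById("myButton").addEventListener("click", callback)`. Below is a runnable stand-in for the same callback pattern; the tiny `createButton` helper is purely illustrative, so the snippet works outside a browser too:

```javascript
// Stand-in event target demonstrating the addEventListener callback pattern.
function createButton() {
  const handlers = [];
  return {
    addEventListener(type, callback) {
      if (type === "click") handlers.push(callback); // register the callback
    },
    click() {
      handlers.forEach((callback) => callback()); // fire the event
    },
  };
}

const myButton = createButton();
let clicks = 0;

myButton.addEventListener("click", () => {
  clicks += 1; // the callback runs each time the event occurs
});

myButton.click();
myButton.click();
console.log(clicks); // 2
```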
2. Higher-order Functions -> Callbacks enable higher-order functions, which are functions that take other functions as arguments or return them. This allows for more abstract and flexible code.

Here the `forEach` method takes the function `fun` as an argument and calls it for every array element.
3. Customizable Behavior -> Callbacks allow functions to be customized. Instead of writing multiple functions with slight variations, you can write one function that takes a callback to perform specific actions.

These are heavily used in Asynchronous JavaScript. But to understand the use of callbacks there, it is very important to first understand Synchronous and Asynchronous JavaScript.
## Synchronous JavaScript
Synchronous JavaScript executes code sequentially, one statement at a time. Each operation must complete before the next one starts. If a task takes a long time to complete, it blocks the execution of subsequent code.

So here you can see that first "hello" is printed, then "I am Akshat", and then "Bye". Every line is executed in sequence; this is what Synchronous JS is.
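That example can be written as runnable code (messages are collected into an array so the ordering is explicit):

```javascript
// Synchronous: each statement must complete before the next one runs.
const order = [];

order.push("hello");
order.push("I am Akshat");
order.push("Bye");

console.log(order.join(" -> ")); // hello -> I am Akshat -> Bye
```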
## Asynchronous JavaScript
Asynchronous JavaScript allows code to run without blocking the execution of other operations. When an asynchronous operation is initiated, it allows the code to continue executing while waiting for the operation to complete.

So here you can see that "hello" is printed first. Then there is a timeout with a callback function and a delay of 2000 milliseconds, i.e. 2 seconds. JS sees the timeout, lets it run in the background, moves forward and prints "Outside timeout" first, and when the timeout is ready its callback is executed. This is what Asynchronous JavaScript is.
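The same behavior as runnable code; the delay is shortened from 2000 ms to 0 ms here, and even then the callback only runs after the surrounding code has finished:

```javascript
// Asynchronous: setTimeout schedules the callback instead of running it now.
const order = [];

order.push("hello");

setTimeout(() => {
  order.push("Inside timeout"); // runs later, after the current code finishes
}, 0);

order.push("Outside timeout");

// The timeout callback has not run yet, even with a 0 ms delay:
console.log(order); // order is still ["hello", "Outside timeout"]
```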
Now you may be wondering why we need both Synchronous and Asynchronous JS. The answer is simple: Asynchronous JS doesn't block the execution of code, so it is important when fetching data from an API and in other places where we don't want to block execution.
Synchronous JS is necessary where we need some data from the user to proceed further, and in places where blocking is important.
So this was it for this blog. I hope you have understood it. In the next blog we will read more about Asynchronous JS. Till then, stay connected and don't forget to follow me.
Thank you 🩵 | akshat0610 |
1,896,297 | Keywords | Key Java Concepts: - Keywords: Java has 50 reserved keywords, such as int, for,... | 0 | 2024-06-21T17:21:01 | https://dev.to/devsjavagirls/palavras-chaves-1pb7 | java, javaprogramming, javaparainiciantes | **Key Java Concepts:**
**- Keywords:**
Java has 50 reserved keywords, such as int, for, if, etc.
**- Identifiers:**
Names of variables, methods, or classes; they should be meaningful and cannot begin with a digit or use reserved keywords.
**- Class Libraries:**
Java provides several standard libraries, such as System, which contains the println() and print() methods, fundamental for basic I/O operations.
In addition to the keywords, Java reserves the following words: true, false, and null. They are values defined by the language. You cannot use these words in the names of variables, classes, and so on. | devsjavagirls |
1,896,296 | Top 10 Futuristic Gadgets of 2024 That Are Redefining Technology | In 2024, technology is advancing at an unprecedented rate, bringing futuristic gadgets to life that... | 0 | 2024-06-21T17:20:06 | https://dev.to/futuristicgeeks/top-10-futuristic-gadgets-of-2024-that-are-redefining-technology-1abc | webdev, futuretech, technology, ai | In 2024, technology is advancing at an unprecedented rate, bringing futuristic gadgets to life that promise to revolutionize how we live, work, and play. Here are ten real-life products set to amaze and inspire in the near future:
1. Tesla Bot: Elon Musk’s Tesla Bot is a humanoid robot designed by Tesla, led by Elon Musk. It aims to automate tasks that are repetitive, dangerous, or tedious for humans.
2. Meta (formerly Facebook) Reality Labs Glasses: Meta’s Reality Labs is developing augmented reality (AR) glasses that integrate digital information with the physical world.
3. IBM Quantum System Two: IBM’s Quantum System Two represents a leap forward in quantum computing, offering enhanced capabilities for solving complex problems. It’s paving the way for breakthroughs in fields like cryptography, materials science, and drug discovery.
4. Microsoft HoloLens 3: The Microsoft HoloLens 3 is an advanced mixed reality headset that combines virtual and augmented reality experiences.
5. Amazon Astro: Amazon Astro is a home robot equipped with AI and sensors to assist with household tasks and entertainment.
6. Apple AR Glasses: Apple’s AR Glasses are rumored to revolutionize augmented reality experiences with seamless integration into Apple’s ecosystem.
7. Neuralink: Neuralink, founded by Elon Musk, aims to develop brain-computer interface technology to connect the human brain directly with machines. In 2024, advancements could enable individuals to control devices and interact with computers directly through their thoughts, with potential applications in healthcare and accessibility.
8. Boston Dynamics Stretch: Boston Dynamics’ Stretch is a robotic arm designed for warehouse automation and logistics applications.
9. Samsung Freestyle Projector: Samsung’s Freestyle Projector is a portable smart projector capable of projecting images onto any surface.
10. Sony Airpeak S1: Sony Airpeak S1 is a professional-grade drone designed for filmmakers and content creators.
[Read the complete article for more details.](https://futuristicgeeks.com/top-10-futuristic-gadgets-of-2024-that-are-redefining-technology/) | futuristicgeeks |
1,896,295 | Buy verified cash app account | https://dmhelpshop.com/product/buy-verified-cash-app-account/ Buy verified cash app account Cash... | 0 | 2024-06-21T17:19:05 | https://dev.to/pigohe7925/buy-verified-cash-app-account-3417 | webdev, javascript, beginners, programming | https://dmhelpshop.com/product/buy-verified-cash-app-account/


Buy verified cash app account
Cash App has emerged as a dominant force in the realm of mobile banking within the USA, offering unparalleled convenience for digital money transfers, deposits, and trading. As the foremost provider of fully verified Cash App accounts, we take pride in our ability to deliver accounts with substantial limits, Bitcoin enablement, and an unmatched level of security.

Our commitment to facilitating seamless transactions and enabling digital currency trades has garnered significant acclaim, as evidenced by the overwhelming response from our satisfied clientele. Those seeking a verified Cash App account with 100% legitimate documentation and unrestricted access need look no further. Get in touch with us promptly to acquire your verified Cash App account and take advantage of all the benefits it has to offer.

Why is dmhelpshop the best place to buy USA Cash App accounts?
It’s crucial to stay informed about any updates to the platform you’re using. If an update has been released, it’s important to explore alternative options. Contact the platform’s support team to inquire about the status of the Cash App service.

Clearly communicate your requirements and inquire whether they can meet your needs and provide the verified Cash App account promptly. 
If they assure you that they can fulfill your requirements within the specified timeframe, proceed with the verification process using the required documents.\n\nOur account verification process includes the submission of the following documents: [List of specific documents required for verification].\n\nGenuine and activated email verified\nRegistered phone number (USA)\nSelfie verified\nSSN (social security number) verified\nDriving license\nBTC enable or not enable (BTC enable best)\n100% replacement guaranteed\n100% customer satisfaction\nWhen it comes to staying on top of the latest platform updates, it’s crucial to act fast and ensure you’re positioned in the best possible place. If you’re considering a switch, reaching out to the right contacts and inquiring about the status of the buy verified cash app account service update is essential.\n\nClearly communicate your requirements and gauge their commitment to fulfilling them promptly. Once you’ve confirmed their capability, proceed with the verification process using genuine and activated email verification, a registered USA phone number, selfie verification, social security number (SSN) verification, and a valid driving license.\n\nAdditionally, assessing whether BTC enablement is available is advisable, buy verified cash app account, with a preference for this feature. It’s important to note that a 100% replacement guarantee and ensuring 100% customer satisfaction are essential benchmarks in this process.\n\nHow to use the Cash Card to make purchases?\nTo activate your Cash Card, open the Cash App on your compatible device, locate the Cash Card icon at the bottom of the screen, and tap on it. Then select “Activate Cash Card” and proceed to scan the QR code on your card. Alternatively, you can manually enter the CVV and expiration date. 
How To Buy Verified Cash App Accounts.

After submitting your information, including your registered number, expiration date, and CVV code, you can start making payments by conveniently tapping your card on a contactless-enabled payment terminal. Consider obtaining a verified Cash App account for seamless transactions, especially for business purposes. Buy verified cash app account.

Why do we suggest keeping the Cash App account username unchanged?
To activate your Cash Card, open the Cash App on your compatible device, locate the Cash Card icon at the bottom of the screen, and tap on it. Then select “Activate Cash Card” and proceed to scan the QR code on your card.

Alternatively, you can manually enter the CVV and expiration date. After submitting your information, including your registered number, expiration date, and CVV code, you can start making payments by conveniently tapping your card on a contactless-enabled payment terminal. Consider obtaining a verified Cash App account for seamless transactions, especially for business purposes. Buy verified cash app account. Purchase Verified Cash App Accounts.

Selecting a username in an app usually comes with the understanding that it cannot be easily changed within the app’s settings or options. This deliberate control is in place to uphold consistency and minimize potential user confusion, especially for those who have added you as a contact using your username. In addition, purchasing a Cash App account with verified genuine documents already linked to the account ensures a reliable and secure transaction experience.

 

Buy verified cash app accounts quickly and easily for all your financial needs.
As the user base of our platform continues to grow, the significance of verified accounts cannot be overstated for both businesses and individuals seeking to leverage its full range of features. 
How To Buy Verified Cash App Accounts.\n\nFor entrepreneurs, freelancers, and investors alike, a verified cash app account opens the door to sending, receiving, and withdrawing substantial amounts of money, offering unparalleled convenience and flexibility. Whether you’re conducting business or managing personal finances, the benefits of a verified account are clear, providing a secure and efficient means to transact and manage funds at scale.\n\nWhen it comes to the rising trend of purchasing buy verified cash app account, it’s crucial to tread carefully and opt for reputable providers to steer clear of potential scams and fraudulent activities. How To Buy Verified Cash App Accounts. With numerous providers offering this service at competitive prices, it is paramount to be diligent in selecting a trusted source.\n\nThis article serves as a comprehensive guide, equipping you with the essential knowledge to navigate the process of procuring buy verified cash app account, ensuring that you are well-informed before making any purchasing decisions. Understanding the fundamentals is key, and by following this guide, you’ll be empowered to make informed choices with confidence.\n\n \n\nIs it safe to buy Cash App Verified Accounts?\nCash App, being a prominent peer-to-peer mobile payment application, is widely utilized by numerous individuals for their transactions. However, concerns regarding its safety have arisen, particularly pertaining to the purchase of “verified” accounts through Cash App. This raises questions about the security of Cash App’s verification process.\n\nUnfortunately, the answer is negative, as buying such verified accounts entails risks and is deemed unsafe. Therefore, it is crucial for everyone to exercise caution and be aware of potential vulnerabilities when using Cash App. 
How To Buy Verified Cash App Accounts.\n\nCash App has emerged as a widely embraced platform for purchasing Instagram Followers using PayPal, catering to a diverse range of users. This convenient application permits individuals possessing a PayPal account to procure authenticated Instagram Followers.\n\nLeveraging the Cash App, users can either opt to procure followers for a predetermined quantity or exercise patience until their account accrues a substantial follower count, subsequently making a bulk purchase. Although the Cash App provides this service, it is crucial to discern between genuine and counterfeit items. If you find yourself in search of counterfeit products such as a Rolex, a Louis Vuitton item, or a Louis Vuitton bag, there are two viable approaches to consider.\n\n \n\nWhy you need to buy verified Cash App accounts personal or business?\nThe Cash App is a versatile digital wallet enabling seamless money transfers among its users. However, it presents a concern as it facilitates transfer to both verified and unverified individuals.\n\nTo address this, the Cash App offers the option to become a verified user, which unlocks a range of advantages. Verified users can enjoy perks such as express payment, immediate issue resolution, and a generous interest-free period of up to two weeks. With its user-friendly interface and enhanced capabilities, the Cash App caters to the needs of a wide audience, ensuring convenient and secure digital transactions for all.\n\nIf you’re a business person seeking additional funds to expand your business, we have a solution for you. Payroll management can often be a challenging task, regardless of whether you’re a small family-run business or a large corporation. How To Buy Verified Cash App Accounts.\n\nImproper payment practices can lead to potential issues with your employees, as they could report you to the government. 
However, worry not, as we offer a reliable and efficient way to ensure proper payroll management, avoiding any potential complications. Our services provide you with the funds you need without compromising your reputation or legal standing. With our assistance, you can focus on growing your business while maintaining a professional and compliant relationship with your employees. Purchase Verified Cash App Accounts.

Cash App has emerged as a leading peer-to-peer payment method, catering to a wide range of users. With its seamless functionality, individuals can effortlessly send and receive cash in a matter of seconds, bypassing the need for a traditional bank account or social security number. Buy verified cash app account.

This accessibility makes it particularly appealing to millennials, addressing a common challenge they face in accessing physical currency. As a result, Cash App has established itself as a preferred choice among diverse audiences, enabling swift and hassle-free transactions for everyone. Purchase Verified Cash App Accounts.

 

How to verify Cash App accounts
To ensure the verification of your Cash App account, it is essential to securely store all your required documents in your account. This process includes accurately supplying your date of birth and verifying the US or UK phone number linked to your Cash App account.

As part of the verification process, you will be asked to submit accurate personal details such as your date of birth, the last four digits of your SSN, and your email address. If additional information is requested by the Cash App community to validate your account, be prepared to provide it promptly. Upon successful verification, you will gain full access to managing your account balance, as well as sending and receiving funds seamlessly. 
Buy verified cash app account.\n\n \n\nHow cash used for international transaction?\nExperience the seamless convenience of this innovative platform that simplifies money transfers to the level of sending a text message. It effortlessly connects users within the familiar confines of their respective currency regions, primarily in the United States and the United Kingdom.\n\nNo matter if you’re a freelancer seeking to diversify your clientele or a small business eager to enhance market presence, this solution caters to your financial needs efficiently and securely. Embrace a world of unlimited possibilities while staying connected to your currency domain. Buy verified cash app account.\n\nUnderstanding the currency capabilities of your selected payment application is essential in today’s digital landscape, where versatile financial tools are increasingly sought after. In this era of rapid technological advancements, being well-informed about platforms such as Cash App is crucial.\n\nAs we progress into the digital age, the significance of keeping abreast of such services becomes more pronounced, emphasizing the necessity of staying updated with the evolving financial trends and options available. Buy verified cash app account.\n\nOffers and advantage to buy cash app accounts cheap?\nWith Cash App, the possibilities are endless, offering numerous advantages in online marketing, cryptocurrency trading, and mobile banking while ensuring high security. As a top creator of Cash App accounts, our team possesses unparalleled expertise in navigating the platform.\n\nWe deliver accounts with maximum security and unwavering loyalty at competitive prices unmatched by other agencies. Rest assured, you can trust our services without hesitation, as we prioritize your peace of mind and satisfaction above all else.\n\nEnhance your business operations effortlessly by utilizing the Cash App e-wallet for seamless payment processing, money transfers, and various other essential tasks. 
Amidst a myriad of transaction platforms in existence today, the Cash App e-wallet stands out as a premier choice, offering users a multitude of functions to streamline their financial activities effectively. Buy verified cash app account.\n\nTrustbizs.com stands by the Cash App’s superiority and recommends acquiring your Cash App accounts from this trusted source to optimize your business potential.\n\nHow Customizable are the Payment Options on Cash App for Businesses?\nDiscover the flexible payment options available to businesses on Cash App, enabling a range of customization features to streamline transactions. Business users have the ability to adjust transaction amounts, incorporate tipping options, and leverage robust reporting tools for enhanced financial management.\n\nExplore trustbizs.com to acquire verified Cash App accounts with LD backup at a competitive price, ensuring a secure and efficient payment solution for your business needs. Buy verified cash app account.\n\nDiscover Cash App, an innovative platform ideal for small business owners and entrepreneurs aiming to simplify their financial operations. With its intuitive interface, Cash App empowers businesses to seamlessly receive payments and effectively oversee their finances. Emphasizing customization, this app accommodates a variety of business requirements and preferences, making it a versatile tool for all.\n\nWhere To Buy Verified Cash App Accounts\nWhen considering purchasing a verified Cash App account, it is imperative to carefully scrutinize the seller’s pricing and payment methods. Look for pricing that aligns with the market value, ensuring transparency and legitimacy. Buy verified cash app account.\n\nEqually important is the need to opt for sellers who provide secure payment channels to safeguard your financial data. Trust your intuition; skepticism towards deals that appear overly advantageous or sellers who raise red flags is warranted. 
It is always wise to prioritize caution and explore alternative avenues if uncertainties arise.\n\nThe Importance Of Verified Cash App Accounts\nIn today’s digital age, the significance of verified Cash App accounts cannot be overstated, as they serve as a cornerstone for secure and trustworthy online transactions.\n\nBy acquiring verified Cash App accounts, users not only establish credibility but also instill the confidence required to participate in financial endeavors with peace of mind, thus solidifying its status as an indispensable asset for individuals navigating the digital marketplace.\n\nWhen considering purchasing a verified Cash App account, it is imperative to carefully scrutinize the seller’s pricing and payment methods. Look for pricing that aligns with the market value, ensuring transparency and legitimacy. Buy verified cash app account.\n\nEqually important is the need to opt for sellers who provide secure payment channels to safeguard your financial data. Trust your intuition; skepticism towards deals that appear overly advantageous or sellers who raise red flags is warranted. It is always wise to prioritize caution and explore alternative avenues if uncertainties arise.\n\nConclusion\nEnhance your online financial transactions with verified Cash App accounts, a secure and convenient option for all individuals. By purchasing these accounts, you can access exclusive features, benefit from higher transaction limits, and enjoy enhanced protection against fraudulent activities. Streamline your financial interactions and experience peace of mind knowing your transactions are secure and efficient with verified Cash App accounts.\n\nChoose a trusted provider when acquiring accounts to guarantee legitimacy and reliability. In an era where Cash App is increasingly favored for financial transactions, possessing a verified account offers users peace of mind and ease in managing their finances. 
Make informed decisions to safeguard your financial assets and streamline your personal transactions effectively.

Contact Us / 24 Hours Reply
Telegram: dmhelpshop
WhatsApp: +1 (980) 277-2786
Skype: dmhelpshop
Email: dmhelpshop@gmail.com | pigohe7925 |
1,896,293 | I want to deploy the project on RHEL using Docker, but npm install / pip install -r requirements.txt do not work inside the container | Unable to create image error npm install cmd [root@acer frontend]# docker run -it --rm... | 0 | 2024-06-21T17:17:52 | https://dev.to/shubham_mojad_8fcf948a05b/npm-install-pip-install-r-requirementstxt-not-work-inside-the-docker-4cd8 | help, fastapi, linux, docker | Unable to create the image; the npm install command fails with the errors below:
[root@acer frontend]# docker run -it --rm frontend-image /bin/sh
/frontend # node -v
v20.14.0
/frontend # npm -v
10.7.0
/frontend # npm install --verbose
npm verbose cli /usr/local/bin/node /usr/local/bin/npm
npm info using npm@10.7.0
npm info using node@v20.14.0
npm verbose title npm install
npm verbose argv "install" "--loglevel" "verbose"
npm verbose logfile logs-max:10 dir:/root/.npm/_logs/2024-06-21T16_54_17_775Z-
npm verbose logfile /root/.npm/_logs/2024-06-21T16_54_17_775Z-debug-0.log
npm verbose reify failed optional dependency /frontend/node_modules/fsevents
npm verbose reify failed optional dependency /frontend/node_modules/@esbuild/win32-x64
npm verbose reify failed optional dependency /frontend/node_modules/@esbuild/win32-ia32
npm verbose reify failed optional dependency /frontend/node_modules/@esbuild/win32-arm64
npm verbose reify failed optional dependency /frontend/node_modules/@esbuild/sunos-x64
npm verbose reify failed optional dependency /frontend/node_modules/@esbuild/openbsd-x64
npm verbose reify failed optional dependency /frontend/node_modules/@esbuild/netbsd-x64
npm verbose reify failed optional dependency /frontend/node_modules/@esbuild/linux-s390x
npm verbose reify failed optional dependency /frontend/node_modules/@esbuild/linux-riscv64
npm verbose reify failed optional dependency /frontend/node_modules/@esbuild/linux-ppc64
npm verbose reify failed optional dependency /frontend/node_modules/@esbuild/linux-mips64el
npm verbose reify failed optional dependency /frontend/node_modules/@esbuild/linux-loong64
npm verbose reify failed optional dependency /frontend/node_modules/@esbuild/linux-ia32
npm verbose reify failed optional dependency /frontend/node_modules/@esbuild/linux-arm64
npm verbose reify failed optional dependency /frontend/node_modules/@esbuild/linux-arm
npm verbose reify failed optional dependency /frontend/node_modules/@esbuild/freebsd-x64
npm verbose reify failed optional dependency /frontend/node_modules/@esbuild/freebsd-arm64
npm verbose reify failed optional dependency /frontend/node_modules/@esbuild/darwin-x64
npm verbose reify failed optional dependency /frontend/node_modules/@esbuild/darwin-arm64
npm verbose reify failed optional dependency /frontend/node_modules/@esbuild/android-x64
npm verbose reify failed optional dependency /frontend/node_modules/@esbuild/android-arm64
npm verbose reify failed optional dependency /frontend/node_modules/@esbuild/android-arm
npm http fetch GET https://registry.npmjs.org/npm attempt 1 failed with EAI_AGAIN
npm http fetch GET https://registry.npmjs.org/yocto-queue/-/yocto-queue-0.1.0.tgz attempt 1 failed with EAI_AGAIN
npm http fetch GET https://registry.npmjs.org/zip-stream/-/zip-stream-4.1.1.tgz attempt 1 failed with EAI_AGAIN
npm http fetch GET https://registry.npmjs.org/yaml/-/yaml-1.10.2.tgz attempt 1 failed with EAI_AGAIN
npm http fetch GET https://registry.npmjs.org/yallist/-/yallist-3.1.1.tgz attempt 1 failed with EAI_AGAIN
npm http fetch GET https://registry.npmjs.org/xmlchars/-/xmlchars-2.2.0.tgz attempt 1 failed with EAI_AGAIN
npm http fetch GET https://registry.npmjs.org/xlsx/-/xlsx-0.18.5.tgz attempt 1 failed with EAI_AGAIN
npm http fetch GET https://registry.npmjs.org/wrappy/-/wrappy-1.0.2.tgz attempt 1 failed with EAI_AGAIN
npm http fetch GET https://registry.npmjs.org/word-wrap/-/word-wrap-1.2.5.tgz attempt 1 failed with EAI_AGAIN
npm http fetch GET https://registry.npmjs.org/word/-/word-0.3.0.tgz attempt 1 failed with EAI_AGAIN
npm http fetch GET https://registry.npmjs.org/wmf/-/wmf-1.0.2.tgz attempt 1 failed with EAI_AGAIN
npm http fetch GET https://registry.npmjs.org/which-typed-array/-/which-typed-array-1.1.15.tgz attempt 1 failed with EAI_AGAIN
npm http fetch GET https://registry.npmjs.org/which-collection/-/which-collection-1.0.2.tgz attempt 1 failed with EAI_AGAIN
npm http fetch GET https://registry.npmjs.org/which-boxed-primitive/-/which-boxed-primitive-1.0.2.tgz attempt 1 failed with EAI_AGAIN
npm verbose audit error FetchError: request to https://registry.npmjs.org/-/npm/v1/security/audits/quick failed, reason: getaddrinfo EAI_AGAIN registry.npmjs.org
npm verbose audit error at ClientRequest.<anonymous> (/usr/local/lib/node_modules/npm/node_modules/minipass-fetch/lib/index.js:130:14)
npm verbose audit error at ClientRequest.emit (node:events:519:28)
npm verbose audit error at _destroy (node:_http_client:880:13)
npm verbose audit error at onSocketNT (node:_http_client:900:5)
npm verbose audit error at process.processTicksAndRejections (node:internal/process/task_queues:83:21) {
npm verbose audit error code: 'EAI_AGAIN',
npm verbose audit error errno: 'EAI_AGAIN',
npm verbose audit error syscall: 'getaddrinfo',
npm verbose audit error hostname: 'registry.npmjs.org',
npm verbose audit error type: 'system'
npm verbose audit error }
npm http fetch GET https://registry.npmjs.org/npm attempt 2 failed with EAI_AGAIN
npm http fetch GET https://registry.npmjs.org/yocto-queue/-/yocto-queue-0.1.0.tgz attempt 2 failed with EAI_AGAIN
npm http fetch GET https://registry.npmjs.org/zip-stream/-/zip-stream-4.1.1.tgz attempt 2 failed with EAI_AGAIN
npm http fetch GET https://registry.npmjs.org/yaml/-/yaml-1.10.2.tgz attempt 2 failed with EAI_AGAIN
/^Z[1]+ Stopped npm install --verbose
/frontend #
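A note on the failure itself: every fetch above dies with `EAI_AGAIN` from `getaddrinfo`, which means DNS resolution is failing inside the container, so npm (and pip) never reach the registry at all. A common fix on RHEL hosts is to give the Docker daemon explicit DNS servers in `/etc/docker/daemon.json` (a sketch; the resolver addresses below are examples, so substitute ones reachable from your network), then restart Docker with `sudo systemctl restart docker`:

```json
{
  "dns": ["8.8.8.8", "1.1.1.1"]
}
```

The per-container equivalent is `docker run --dns 8.8.8.8 …`. If the host sits behind a corporate proxy, npm and pip additionally need `HTTP_PROXY`/`HTTPS_PROXY` set inside the container.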
| shubham_mojad_8fcf948a05b |
1,896,292 | Try This 1-2: Improve the Converter | Summary of the article on building a gallons-to-liters converter in Java - Objective: Create an... | 0 | 2024-06-21T17:17:37 | https://dev.to/devsjavagirls/tente-isto-1-2-melhore-o-conversor-4d84 | java, javaprogramming, javaparainiciantes |
**Summary of the Article on Building a Gallons-to-Liters Converter in Java**

**- Objective:**
Create an improved version of the gallons-to-liters converter using a for loop, an if statement, and code blocks. The new version displays a conversion table from 1 to 100 gallons, inserting a blank line after every 10 gallons.

- Implementation steps:

**- Create the Java file:**
Name the file GalToLitTable.java.

**- Program code:**

**Compilation and Execution:**
- Compile the program with the command:
javac GalToLitTable.java
- Run the program with the command:
java GalToLitTable

- How the program works:
1. Counter initialization:
The line counter starts at zero.
- Conversion loop:
The for loop runs from 1 to 100 gallons, computing the conversion to liters.
- Counting and blank line:
After every 10 iterations (lines), a blank line is printed and the counter is reset to zero.

**- Example output:**

up to 100
| devsjavagirls |
1,896,291 | Top 10 Security Tips to Avoid Becoming a Victim of Cybercrime | In an increasingly digital world, the threat of cybercrime is ever-present, affecting individuals,... | 0 | 2024-06-21T17:15:21 | https://dev.to/futuristicgeeks/top-10-security-tips-to-avoid-becoming-a-victim-of-cybercrime-4e1o | cybersecurity, cyberawareness, onlinefraud, webdev | In an increasingly digital world, the threat of cybercrime is ever-present, affecting individuals, businesses, and governments globally. Here are ten essential security tips, bolstered by facts and figures, to help safeguard yourself against cyber threats.
1. Understanding the Scale of Cybercrime
2. Use Strong, Unique Passwords
3. Enable Two-Factor Authentication (2FA)
4. Keep Software Updated
5. Beware of Phishing Scams
6. Secure Your Devices
7. Limit Information Sharing
8. Backup Important Data
9. Educate Yourself and Others
10. Monitor Financial Accounts
[Click here](https://futuristicgeeks.com/top-10-security-tips-to-avoid-becoming-a-victim-of-cybercrime/) Explore the full article here. | futuristicgeeks |
1,896,290 | Made an Assignment Generator | I had just one day left to submit my semester-long assignment. The requirement was that it had to be... | 0 | 2024-06-21T17:11:07 | https://dev.to/adarshagupta/made-a-assignment-generator-33j6 | python, flask, webdev, javascript | I had just one day left to submit my semester-long assignment. The requirement was that it had to be a handwritten note that was typed out. Feeling completely lost and too lazy to do it myself, I remembered a project I saw years ago. It involved users inputting text, and a program generating a realistic image of a handwritten assignment. At the time, I dismissed it as unnecessary, but now it inspired me. So, I decided to create my own code to generate the assignment with minimal effort.
Website: https://homeworkai.adarshgupta.co/

It's very simple to use: just paste whatever you want, and it will generate a realistic image of your homework. I am planning to add an AI feature that completes all the work from a single prompt using the OpenAI API.
| adarshagupta |
1,896,288 | What was your win this week? | 👋👋👋👋 Looking back on your week -- what was something you're proud of? All wins count -- big or... | 0 | 2024-06-21T17:08:26 | https://dev.to/devteam/what-was-your-win-this-week-mj | weeklyretro | 👋👋👋👋
Looking back on your week -- what was something you're proud of?
All wins count -- big or small 🎉
Examples of 'wins' include:
- Getting a promotion!
- Starting a new project
- Fixing a tricky bug
- Finding a moment to decompress 😄

Happy Friday! | jess |
1,896,286 | Code Smell 255 - Parallel Hierarchies | Double Trouble: The Curse of Redundant Structures TL;DR: Parallel hierarchies lead to duplication... | 9,470 | 2024-06-21T17:02:03 | https://maximilianocontieri.com/code-smell-255-parallel-hierarchies | webdev, beginners, programming, tutorial | *Double Trouble: The Curse of Redundant Structures*
> TL;DR: Parallel hierarchies lead to duplication and tight coupling.
# Problems
- Increased complexity
- DRY / Code Duplication
- Maintenance Nightmare
- Coupling
- Ripple Effect
- Potential for inconsistencies across different hierarchies
# Solutions
1. Merge hierarchies
2. Use composition
3. Extract Common Functionality
# Refactorings
{% post https://dev.to/mcsee/refactoring-013-remove-repeated-code-4npi %}
{% post https://dev.to/mcsee/refactoring-007-extract-class-18ei %}
# Context
Parallel hierarchies occur when you must make a counterpart every time you create a domain class.
The counterpart might be persistence, UI, Controller, tests, Serialization, etc
This leads to duplicate structures and tight coupling.
Changes in the domain model require changes in the parallel classes, making the system more brittle and harder to manage.
# Sample Code
## Wrong
[Gist Url]: # (https://gist.github.com/mcsee/1b8a4c6bc7bd1fc9947f684e4e92b30c)
```java
// Domain classes
abstract class Transaction {
private String id;
private double amount;
}
class BankTransaction extends Transaction {
private String bankName;
}
class CreditCardTransaction extends Transaction {
private String cardNumber;
}
// Persistence classes
abstract class TransactionDAO {
private String id;
private double amount;
}
class BankTransactionDAO extends TransactionDAO {
private String bankName;
}
class CreditCardTransactionDAO extends TransactionDAO {
private String cardNumber;
}
```
## Right
[Gist Url]: # (https://gist.github.com/mcsee/30d21e449099f361010a767dcc66c571)
```java
public class TransactionService {
private EntityManager entityManager;
public TransactionService(EntityManager entityManager) {
this.entityManager = entityManager;
}
public void saveTransaction(Transaction transaction) {
entityManager.getTransaction().begin();
entityManager.persist(transaction);
entityManager.getTransaction().commit();
}
public Transaction loadTransaction(
Long id, Class<? extends Transaction> transactionClass) {
return entityManager.find(transactionClass, id);
}
}
```
# Detection
[X] Semi-Automatic
You can detect this smell by traversing the hierarchies
# Exceptions
- Some frameworks force you to extend your domain using this technique
# Tags
- Hierarchies
# Level
[X] Intermediate
# AI Generation
AI generators often create this smell by mirroring domain models in persistence layers without understanding the implications, leading to unnecessary duplication.
# AI Detection
AI Assistants can fix this smell with instructions to consolidate hierarchies and use composition, reducing duplication and improving maintainability.
ChatGPT offered a solution using **`instanceof`**, which is an even worse code smell.
# Conclusion
Parallel hierarchies create unnecessary complexity and make the codebase harder to maintain.
They bring [deep hierarchies](https://dev.to/mcsee/code-smell-137-inheritance-tree-too-deep-3j5p) which is a symptom of [subclassification for code reuse](https://dev.to/mcsee/code-smell-11-subclassification-for-code-reuse-1136)
You can merge the hierarchies and use composition to simplify the design and improve the system's robustness.
You can use Metaprogramming to manage the persistence or the unit tests.
[Metaprogramming](https://dev.to/mcsee/laziness-i-meta-programming-32a9) is also a code smell when you use it for domain problems, but persistence and testing are orthogonal domains.
# Relations
{% post https://dev.to/mcsee/code-smell-137-inheritance-tree-too-deep-3j5p %}
{% post https://dev.to/mcsee/code-smell-11-subclassification-for-code-reuse-1136 %}
{% post https://dev.to/mcsee/code-smell-58-yo-yo-problem-ej9 %}
# More Info
{% post https://dev.to/mcsee/laziness-i-meta-programming-32a9 %}
# Disclaimer
Code Smells are my [opinion](https://dev.to/mcsee/i-wrote-more-than-90-articles-on-2021-here-is-what-i-learned-1n3a).
# Credits
Photo by <a href="https://unsplash.com/@artisanalphoto">ArtisanalPhoto</a> on <a href="https://unsplash.com/fotos/barandillas-de-metal-gris-en-escalera-blanca-MJcb7ZhNeUA">Unsplash</a>
* * *
> Inheritance is surely a good answer but who knows the questions?
_Michel Gauthier_
{% post https://dev.to/mcsee/software-engineering-great-quotes-26ci %}
* * *
This article is part of the CodeSmell Series.
{% post https://dev.to/mcsee/how-to-find-the-stinky-parts-of-your-code-1dbc %} | mcsee |
1,896,462 | Multiple Regions, Single Pane of Glass | by Emmanuel Pot Multiple Regions, Single Pane of Glass A common problem when building... | 0 | 2024-06-24T20:23:11 | https://dev.to/warpstream/multiple-regions-single-pane-of-glass-1kjj | streaming, apachekafka, datastreaming, dataengineering | ---
title: Multiple Regions, Single Pane of Glass
published: true
date: 2024-06-21 17:01:44 UTC
tags: streaming,apachekafka,datastreaming,dataengineering
canonical_url:
---
by Emmanuel Pot
### Multiple Regions, Single Pane of Glass
A common problem when building infrastructure-as-a-service products is the need to provide highly available and isolated resources in **many different regions** while also having the overall product present as a “single pane of glass” to end-users. Unfortunately, these two requirements stand in direct opposition to each other. Ideally, regional infrastructure is, well, regional, with zero inter-regional dependencies. On the other hand, users really don’t want to have to sign into multiple accounts/websites to manage infrastructure spread across many different regions.
When we first designed how we would expand WarpStream’s cloud control planes from a single region to many, we searched around for good content on the topic and didn’t find much. Many different infrastructure companies have solved this problem, but very few have blogged about it, so we decided to write about our approach and, perhaps more importantly, some of the approaches we _didn’t take_.
Let’s start by briefly reviewing WarpStream’s architecture by tracing the flow of a single request through the system. An operation usually begins with a customer’s Kafka client issuing a Kafka protocol message to the Agents, say a Metadata request. Since Kafka Metadata requests don’t interact with raw topic data like Produce and Fetch do, they can be handled solely by the WarpStream control plane. So when the WarpStream Agents receive a Kafka Metadata request, they just proxy it directly to the control plane.

WarpStream Agents deployed in a customer cloud account, sending metadata requests to WarpStream’s Metadata Store.
The request will hit a load balancer and then one of WarpStream’s “Gateway” nodes. The Gateway node’s job is to perform light authentication and authorization (basically, verify the request’s API key and map it to the correct customer / virtual cluster), and then forward the request to the Metadata Store for this customer’s cluster.
Based on this, it’s already clear that WarpStream’s control plane has to deal with two very different types of data:
1. **Platform data** : everything that users can control from our [web console and APIs:](https://console.warpstream.com/) users, clusters, API keys, SASL credentials, etc. This data is persisted in a primary Aurora database that runs in us-east-1 and changes very infrequently.
2. **Cluster metadata** : all the metadata that enables WarpStream to present the abstraction of Kafka on top of a low-level primitive like commodity object storage. For example, the Metadata Store keeps track of all the topic-partitions (and offsets) that are contained within every file stored in the user’s object storage bucket.
These two different types of data have very different requirements. The cluster metadata is in the critical path of every Kafka operation (both writes and reads), and therefore must be strongly consistent, extremely durable, highly available, and have low latency. As a result, we run every instance of the Metadata Store in a single region, whichever region is closest to the user’s WarpStream Agents. We also run each instance of the Metadata Store quadruply replicated across three availability zones, and we never replicate this metadata across multiple regions (for now).
The requirements for the platform data, on the other hand, look completely different. This data changes infrequently, and the data being slightly stale is of no consequence (eventual consistency is ok). While platform data like API keys are _technically_ required in the critical path, since they’re trivially cacheable for arbitrarily long periods of time, they’re not _really_ in the critical path. Also, unlike the cluster metadata, some of the platform data needs to be available in _multiple regions_ for the service to function as a single pane of glass.
When we were evaluating how to add support for additional regions to WarpStream, there wasn’t much to think about for the virtual cluster Metadata Stores. We would just run dedicated instances of it in more regions, and users would connect their Agents to whichever region was closest to their Agents since most (but not all) WarpStream clusters run in a single region anyway.
The platform data (like API keys) is a different story. We could have used the same approach we did with the Metadata Store for the platform data by running a dedicated (and fully isolated) Aurora instance in every region, but that would have resulted in a poor user experience. Every region would have presented to users as a fully independent “website,” and users who wanted to run clusters in multiple regions would have had to maintain different WarpStream accounts, re-invite their teams, configure billing multiple times, etc, which is not what we wanted.
### Hub and Spoke
When we looked at these requirements, the architecture that seemed like the best candidate was a “hub and spoke” model. The us-east-1 region that hosts our Aurora cluster would be the primary “hub” region that hosts the WarpStream UI and all of our “infrastructure as code” APIs for creating/destroying virtual clusters. All the other regions would be “spokes” that run fully independent and isolated versions of WarpStream’s Metadata Store, but not the Aurora database that stores the “platform data”.

Three spoke regions running fully isolated Metadata Stores powered by platform data replicated from the hub region.
CRUD operations to create and destroy virtual clusters would _always_ be routed to the hub region, but actual customer WarpStream clusters and their Agents would only ever interact with a single “spoke” region and have no cross-regional dependencies.
This approach would give us the best of both worlds: a single pane of glass where WarpStream customers could manage clusters in any region while still keeping regions independent from each other such that a failure in one region (including the hub region) would never cause a failure in any other region. The one caveat with this approach is that any unavailability of Aurora in the primary hub region would prevent customers from _creating new clusters_ in _all regions_, but _existing_ _clusters_ would continue working just fine. We felt like this was an acceptable trade-off.
However, this architecture did present a conundrum for us. In order for our product to present as a single pane of glass, _some_ of the data in our primary region (like whether a virtual cluster exists, whether an API key was valid, etc) had to be made available in _all_ of our spoke regions.

The hub region can read the platform data from the primary Aurora database, but where do the spoke regions read it from?
But we also needed to avoid creating any critical path inter-regional dependencies. Whatever we ended up doing, we had to ensure that the failure of a single region could never impact clusters running in different regions.
Easier said than done!
### Option 1: Multi-Region Aurora
The first option we considered was to leverage AWS Aurora’s native multi-region functionality. Specifically, AWS Aurora has support for spawning read replicas in other regions. There are limits on how many additional regions can contain read replicas, and this approach would only work with AWS so we’d need a different solution for multi-cloud, but we thought this solution could be a good enough stop-gap in the short term to scale from a single region to a handful without much engineering work. We also really liked the idea of offloading the tricky problem of replicating a subset of our platform data to the AWS Aurora team.

Multi-region AWS Aurora cluster with the primaries in the hub region and read replicas in the spoke regions.
Unfortunately, when we investigated further, we discovered that any unavailability of the primary Aurora region could result in the unavailability of the secondary region read replicas. If that ever happened to us, we’d end up in a terrible situation: all of our spoke regions and their associated clusters / Metadata Stores would still be running (thanks to the in-memory caches), but restarting or deploying the control plane nodes would cause an incident due to the in-memory caches being dropped and unable to be refilled.
It turns out that multi-region functionality in Aurora is designed for a completely different use case: failing over regions fast when the primary region fails. Useful for that situation, but we wanted a solution that would never require manual intervention, so we ruled it out.
We briefly considered migrating to a different database with better multi-region availability support like CockroachDB or Spanner, but we had no previous experience with these technologies, and migrating all of our platform data to a brand-new database technology felt like overkill.
### Option 2: Smart (and durable) Caches
Luckily, the platform data (like which virtual clusters exist and which API keys are valid) changes infrequently. So, another approach we considered was to query the source of truth (our us-east-1 Aurora database) from all subregions and then cache that data aggressively. For example, the first time a gateway node encountered an API key that it had not seen before, it would query Aurora in us-east-1 to determine if it was valid and then cache the result in memory.

ap-southeast-1 spoke region querying the AWS Aurora cluster in us-east-1 to fill its in-memory caches.
This approach was appealing because it would require only minimal code changes, and it took advantage of a strategy we were already employing within the primary region to be resilient against Aurora failures: in-memory caching. The Gateway nodes were already using a custom “smart” cache (internally referred to as the “loading cache”) that would fit the bill perfectly. This cache employs a number of tricks to make it suitable for critical use cases like this:
1. It automatically deduplicates cache fills. This eliminates the thundering herd problem.
2. It incorporates _negative caching_ as a first-class concept, so if the Gateway nodes keep receiving requests for API keys that no longer exist, they don’t keep querying Aurora over and over again.
3. It limits the _concurrency_ of cache fills so that a flood of requests with new and unique API keys results in the cache being filled at a continuous (and manageable) rate instead of flooding Aurora with queries.
4. It implements _asynchronous background refreshes_ (again, with limited concurrency) so that changes in Aurora (like invalidating an API key) are eventually reflected back into the state of the in-memory caches. This ensures that in normal circumstances, when Aurora is available, invalidating an API key is reflected within seconds, but in rare circumstances where Aurora is unavailable, the API gateway nodes can keep running more or less indefinitely as long as they aren’t restarted.
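Mechanically, the first three of those tricks can be sketched like this (an illustrative reconstruction in Python, not WarpStream's actual code; the asynchronous background refresh is omitted for brevity):

```python
import threading

class LoadingCache:
    """Illustrative "loading cache": deduplicated fills, negative caching,
    and bounded fill concurrency."""

    def __init__(self, load_fn, max_concurrent_fills=4):
        self._load = load_fn                 # e.g. a query against the source of truth
        self._entries = {}                   # key -> value; None means "negative" entry
        self._inflight = {}                  # key -> Event for an in-progress fill
        self._lock = threading.Lock()
        self._fill_slots = threading.Semaphore(max_concurrent_fills)

    def get(self, key):
        with self._lock:
            if key in self._entries:
                return self._entries[key]    # negative entries return None without a refetch
            event = self._inflight.get(key)
            if event is None:
                event = threading.Event()
                self._inflight[key] = event  # this thread owns the fill
                owner = True
            else:
                owner = False
        if not owner:                        # dedupe: wait for the owner's fill instead
            event.wait()
            with self._lock:
                return self._entries.get(key)
        with self._fill_slots:               # cap concurrent queries to the source
            try:
                value = self._load(key)      # returning None caches "does not exist"
            except Exception:
                value = None                 # simplification: treat errors as negative
        with self._lock:
            self._entries[key] = value
            del self._inflight[key]
        event.set()
        return value
```

With this shape, a flood of requests for the same unknown key results in exactly one source query, and repeated lookups of invalid keys are absorbed by the negative entries.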
This smart caching strategy had served us well within our primary region, but ultimately we decided that it wasn’t an acceptable solution to our multi-region data replication problem. A failure of the primary Aurora database in us-east-1 wouldn’t immediately impair the other regions, but it _would_ leave us unable to deploy or restart any of the control planes in our other regions until the availability of the Aurora database was restored. In other words, this approach suffered from the same problem as the Aurora read replicas approach.
Briefly, we considered extending our existing loading cache implementation to be _durable_ so that we could restart control plane nodes, even when the primary Aurora database was down, without losing the data that had already been cached.

ap-southeast-1 spoke region querying the AWS Aurora cluster in us-east-1 to fill its in-memory caches, but then persisting the cached data to a local DynamoDB instance so that the Gateway nodes can still be restarted safely even if the primary Aurora cluster is unavailable.
However, we also decided against that approach because it didn’t feel very stable. The system would function completely differently when the primary Aurora database was available than when it was unavailable, and we didn’t like the idea of relying heavily on an infrequently exercised code path for such critical functionality.
Ultimately, we decided that while the loading cache was great for caching data _within_ a region, it was not an acceptable solution for replicating data _across_ regions.
### Option 3: Push-Based Replication (we chose this one)
Both of the models we considered previously were “pull-based” models. Instead, we decided to pursue a “push-based” approach using a technique we’d learned at previous jobs called “contexts”. A Context is a bundle of metadata with the following characteristics:
1. Its values change slowly (if at all).
2. The metadata is associated with specific clusters or tenants.
3. The metadata needs to be made available on a large number of machines in a highly available manner.
4. Availability is always favored over consistency, i.e., we’d rather use values that are several hours old than have the system fail entirely.
For example, one of the contexts we created is called the “cluster context” and it contains:
1. The cluster’s ID
2. The cluster’s name
3. The ID of the tenant (customer) the cluster belongs to
4. A few additional internal fields are required for the Metadata Store to begin processing requests
Building the contexts was straightforward. We wrote a job that scans the Aurora database every 10 minutes, builds the contexts, and then writes them as individual key-value pairs to a durable KV store in the relevant spoke regions.

Context publisher replicates context from the hub regions to the spoke regions.
Of course, we pride ourselves on the fact that a new WarpStream cluster can be created in under a second, so forcing users to wait 10 minutes before their clusters were usable after creation wasn’t acceptable. Solving for this was easy though: when a new cluster is created (or any operation is performed that could result in a context being created or an existing one mutated), we submit an asynchronous request to the same job service that will trigger an update for that specific context immediately.
This gives us the best of both worlds. Changes to the contexts (like a new cluster being created or an API key being revoked) are reflected in their associated subregions almost instantaneously, but in the worst-case scenario where we forget to issue the async update request in some code path (or it fails for some reason), the issue will automatically resolve itself within a few minutes. In other words, this approach is fast in the happy path, and self-healing in the non-happy path. Simple and easy to reason about.
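In outline, the publisher pairs the periodic full scan with the targeted fast path (a sketch of our own; the names and store shapes here are assumptions, not WarpStream's API):

```python
class ContextPublisher:
    """Illustrative push-based context replication: a periodic, self-healing
    full scan plus an immediate single-key publish after mutations."""

    def __init__(self, read_source_of_truth, regional_kv_stores):
        self._read = read_source_of_truth    # returns {context_key: context_value}
        self._stores = regional_kv_stores    # region name -> dict-like KV store

    def publish_all(self):
        """Slow path: full scan, run every few minutes as the self-healing sweep."""
        contexts = self._read()
        for key, value in contexts.items():
            for store in self._stores.values():
                store[key] = value

    def publish_one(self, key):
        """Fast path: triggered right after a mutation (e.g. a cluster is created)."""
        value = self._read().get(key)
        for store in self._stores.values():
            if value is None:
                store.pop(key, None)         # context was deleted upstream
            else:
                store[key] = value
```

Note that the spoke regions only ever see writes arriving at their local KV store; they never open a connection back toward the hub.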
The primary downside of this approach is that it was a lot more work to implement. But we think it was worth it for a few reasons:
1. We truly have zero inter-regional dependencies in the critical path. Instead, the primary region pushes updates to the sub-regions proactively, but the sub-regions _never_ query the primary region or create any external connections. In fact, the spoke regions aren’t even _aware_ of the hub region in any meaningful way. This makes reasoning about availability, reliability, and failure modes easy. We know the failure of one region will never impact other regions because no region takes dependencies on another region, so it can’t have any impact by definition.
2. The context framework we created is broadly useful. For example, in the future we’ll use it to build out support for our own feature flagging system without taking on any additional external dependencies.
With this setup, we have been able to deploy our control plane in three additional regions around the world, and we are ready to stand up more as customer needs dictate. | warpstream |
1,896,285 | 🧩 JavaScript Closures 🧩 | I've been exploring the power of JavaScript closures. Closures are a fundamental concept in... | 0 | 2024-06-21T16:59:44 | https://dev.to/chinwuba_okafor_fed1ed88f/javascript-closures-35e6 | techlife, javascript, 100daysofcode, codinglife | I've been exploring the power of JavaScript closures. Closures are a fundamental concept in JavaScript that allow functions to retain access to their lexical scope, even when executed outside of that scope. This can be incredibly useful for creating private variables and functions.
Here's a simple example to illustrate closures:
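A counter whose state stays private to the functions that close over it:

```javascript
function makeCounter() {
  let count = 0; // private — reachable only through the closures below
  return {
    increment: () => ++count,
    current: () => count,
  };
}

const counter = makeCounter();
counter.increment();
counter.increment();
console.log(counter.current()); // 2
console.log(counter.count);     // undefined — count is not a property, just closed-over state
```

Both `increment` and `current` keep access to the same `count` variable even after `makeCounter` has returned — that retained access is the closure.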

What's Next?
I'm currently enhancing my skills in JavaScript and React, and I'm eager to transition into the full-stack space. The journey has been challenging but incredibly rewarding, and I'm excited about the new opportunities and knowledge that lie ahead.
Thank you all for your support and encouragement! Feel free to ask me anything about my journey or the tech concepts I'm exploring. Let's keep pushing the boundaries of what's possible with code! 💻✨
#TechJourney #FullStackDeveloper #JavaScript #React #CodingLife #SelfTaughtDeveloper #100DaysOfCode
| chinwuba_okafor_fed1ed88f |
1,896,279 | JavaScript30 - 6 Ajax Type Ahead | Hello and welcome back to another awe-inspiring addition to my experiences with Wes Bos's... | 0 | 2024-06-21T16:59:12 | https://dev.to/virtualsobriety/javascript30-6-ajax-type-ahead-1ne6 | javascript, beginners, learning, javascript30 | Hello and welcome back to another awe-inspiring addition to my experiences with Wes Bos's [JavaScript30!](https://javascript30.com/) This time I attacked the Ajax Type Ahead challenge and let's just say, even after completing the challenge, I still have close to ZERO idea what "Ajax Type Ahead" means. I mean...type ahead makes sense but...anyway I digress.
I have all but decided that moving forward I will be coding along with Wes during these challenges and then rewatching the video as many times as it takes to understand what we actually did. There were more than a few parts of this challenge where I had basically no idea what we were even typing out or why. There was even one part where Wes made a joke and I wasn't sure if he was being serious or not because, as it turns out, my base knowledge is nowhere near where I thought it was.
## The Challenge
So the challenge itself was for us to take data in the form of an array from one specific location (in this case a .json file), then build a way to search through that data, showing every item in the array that matches the letters typed while highlighting those letters in the list in real time. Wow. Oh, did I mention the list consisted of cities, the states they are located in, and their populations? Because that's pretty important.

As you can see, the list works. You cannot see it working in real time here but, of course, it does. Wes being Wes, he had already set up the CSS and basic HTML elements, so the focus of the challenge was just the JavaScript. I do long for the days where I could flaunt my HTML and CSS skills just to give myself that little confidence boost before a lesson. Also, I still find myself struggling with reading someone else's code as opposed to writing my own from scratch.
```JS
const endpoint = 'https://gist.githubusercontent.com/Miserlou/c5cd8364bf9b2420bb29/raw/2bf258763cdddd704f8ffd3ea9a3e81d25e2c6f6/cities.json';
const cities = [];
fetch(endpoint)
.then(blob => blob.json())
.then(data => cities.push(...data));
```
There were quite a few pieces of JavaScript that I had never encountered before, such as `fetch`, which grabs data from an outside source and returns a `promise`. Which makes sense when you think about how promises work and how they relate to `.then()`. Now, what I just said I can barely explain at the moment, but I am starting to understand how and why they work together...or at least I think I am. As it turns out, getting the information was the easy part, and it was about time to get thrown into the deep end.
```js
function findMatches(wordToMatch, cities) {
return cities.filter(place => {
// here we need to figure out if the city or state matches what was searched
const regex = new RegExp(wordToMatch, 'gi');
return place.city.match(regex) || place.state.match(regex)
});
}
```
This introduced other concepts I haven't encountered before, such as `RegExp`, word matching with `.match`, and `'gi'` (global, insensitive). The `'gi'` was where he made his joke about things being insensitive...which I DID NOT KNOW was a joke at first, and I also didn't realize it was referring to case sensitivity, which looking back makes me feel pretty silly. Much like everything else introduced here, I can't quite explain each of these concepts yet, but I am starting to grasp what they do and why.
```js
function displayMatches() {
const matchArray = findMatches(this.value, cities);
const html = matchArray.map(place => {
const regex = new RegExp(this.value, 'gi');
const cityName = place.city.replace(regex, `<span class="hl">${this.value}</span>`);
const stateName = place.state.replace(regex, `<span class="hl">${this.value}</span>`);
return `
<li>
<span class="name">${cityName}, ${stateName}</span>
<span class="population">${numberWithCommas(place.population)}</span>
</li>
`;
}).join('');
suggestions.innerHTML = html;
}
```
Now that we could search through the data, it was time to display the results in the HTML. Although it seemed like a lot at first, looking over it now I do see the purpose of each line of code and what it did. One thing that I thought was crazy was the concept of replacing each character in the data with a highlighted version of the same character based on what was typed in the search bar. To me, highlighting a word or character is exactly that: basically taking a hypothetical highlighter and then in turn highlighting words in real time on the screen...OF COURSE this is not how it works, but that's just how I have processed it for my last 35 years of life. That being said, I did find an issue with the finished product of this challenge based on the concept of ignoring case sensitivity and replacing characters based on the search bar, see the image below.

As it turns out, replacing a character with the same exact character but "highlighted" while ignoring case sensitivity in the search can cause some issues. You can produce some pretty hysterical results by messing with upper and lowercase letters in the search bar and watching the results change in real time. Yeah, okay, so the search function doesn't break and still works exactly how we programmed it to work, but I feel like it is worth mentioning/revisiting this "bug".
## One More thing
I feel the need to mention one last part of this challenge. There was a blink and you miss it part that I had to rewatch about 5 times before even knowing what happened and I still do not really understand of the syntax used here.
```js
function numberWithCommas(x) {
  // insert a comma before each group of three digits, using a regex lookahead
  return x.toString().replace(/\B(?=(\d{3})+(?!\d))/g, ',');
}
```
This is nuts. All this did was add a comma to the population numbers while searching. But he added this to the code so nonchalantly, just saying "yeah, you can just find this and copy it". Yeah, I get it...I have done that many times before...but at least then I broke down the code piece by piece to ensure I understood what I was doing and calling...but that...that thing scares me.
## Conclusion
Well...I did it...or more so Wes did it and I followed along. But this is another challenge for the history books. Of course there are other parts of this that I left out but if you want to know more why not give it a try yourself! Much like the last few challenges I have done in this course I do not feel as accomplished as before, but I do know I am getting practice. Given the free time I have at the moment I know this is exactly what I should be doing. I just can't wait for when I can actually take more time and focus on coding the way I would like to.
Tune in next time for another jaw-dropping addition to my journey into Wes Bos's [JavaScript30!](https://javascript30.com/) where I will be tackling: Array Cardio Day 2!

| virtualsobriety |
1,895,948 | Understanding Generators, Coroutines, and Fibers Across Different Languages | Generators, coroutines, and fibers are programming constructs that enable more efficient and... | 0 | 2024-06-21T16:55:00 | https://dev.to/francescoagati/understanding-generators-coroutines-and-fibers-across-different-languages-4len | javascript, ruby, php, python | Generators, coroutines, and fibers are programming constructs that enable more efficient and manageable asynchronous code. Though they share some conceptual similarities, each has unique characteristics and implementations in different languages. This article explores these constructs and their implementations in JavaScript, Python, PHP, and Ruby.
## Generators
Generators are special functions that allow you to pause execution and resume it later. They provide a way to iterate through a sequence of values over time, rather than computing them all at once and sending them back in a list.
### JavaScript Implementation
In JavaScript, generators are created using the `function*` syntax. They use the `yield` keyword to return a value and pause execution. Here's an example:
```javascript
// JavaScript Generator Example
function* numberGenerator() {
let number = 1;
while (true) {
yield number++;
}
}
// Using the generator
const generator = numberGenerator();
console.log(generator.next().value); // 1
console.log(generator.next().value); // 2
console.log(generator.next().value); // 3
console.log(generator.next().value); // 4
// and so on...
```
In this example, `numberGenerator` is a generator function that yields an infinite sequence of numbers, starting from 1. Each call to `next()` resumes the generator from where it left off, returning the next number in the sequence.
### Python Implementation
In Python, generators are defined using the `def` keyword and `yield` expressions. Here's a similar example in Python:
```python
# Python Generator Example
def number_generator():
number = 1
while True:
yield number
number += 1
# Using the generator
generator = number_generator()
print(next(generator)) # 1
print(next(generator)) # 2
print(next(generator)) # 3
print(next(generator)) # 4
# and so on...
```
The `number_generator` function yields an infinite sequence of numbers, similar to the JavaScript example. The `next()` function is used to retrieve the next value from the generator.
### PHP Implementation
PHP introduced generators in version 5.5 using the `yield` keyword within a function. Here's how it looks:
```php
<?php
// PHP Generator Example
function numberGenerator() {
$number = 1;
while (true) {
yield $number;
$number++;
}
}
// Using the generator
$generator = numberGenerator();
echo $generator->current() . "\n"; // 1
$generator->next();
echo $generator->current() . "\n"; // 2
$generator->next();
echo $generator->current() . "\n"; // 3
$generator->next();
echo $generator->current() . "\n"; // 4
// and so on...
?>
```
In PHP, the `current()` method retrieves the current value yielded by the generator, and `next()` advances the generator to the next yield.
## Coroutines
Coroutines are generalizations of subroutines (functions) that can be paused and resumed. They are used for cooperative multitasking and can maintain their state between invocations.
### Python Coroutines
In Python, coroutines grew out of generators and are now defined with `async def`, using `await` to pause execution until an awaited task completes. Combining `async def` with `yield` produces an asynchronous generator, which a coroutine can consume. Here's a basic example:
```python
import asyncio
async def number_generator():
number = 1
while True:
yield number
number += 1
async def main():
gen = number_generator()
print(await gen.__anext__()) # 1
print(await gen.__anext__()) # 2
print(await gen.__anext__()) # 3
print(await gen.__anext__()) # 4
# and so on...
asyncio.run(main())
```
In this example, `number_generator` is an asynchronous generator, and `await` is used to fetch the next value.
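For contrast, a plain coroutine (no `yield`) simply awaits other awaitables rather than producing a stream of values; here is a minimal sketch:

```python
import asyncio

async def double_async(n):
    # A native coroutine: it awaits other awaitables instead of yielding values.
    await asyncio.sleep(0)  # stand-in for real asynchronous I/O
    return n * 2

async def main():
    # await suspends main() here until each double_async() call finishes.
    results = [await double_async(i) for i in range(1, 5)]
    print(results)  # [2, 4, 6, 8]

asyncio.run(main())
```

`asyncio.run()` starts the event loop that drives the coroutines to completion.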
## Fibers
Fibers are lightweight concurrency primitives that allow multiple execution contexts to coexist and be manually switched between. Unlike threads, fibers must yield control explicitly.
### Ruby Implementation
Ruby has built-in support for fibers, which can be used to implement generators. Here’s how it can be done:
```ruby
# Ruby Fiber Generator Example
require 'fiber'
def number_generator
Fiber.new do
number = 1
loop do
Fiber.yield number
number += 1
end
end
end
# Using the generator
generator = number_generator
puts generator.resume # 1
puts generator.resume # 2
puts generator.resume # 3
puts generator.resume # 4
# and so on...
```
In Ruby, a fiber is created using `Fiber.new`, and values are yielded using `Fiber.yield`. The `resume` method is used to fetch the next value from the fiber.
## Summary
Generators, coroutines, and fibers provide powerful tools for managing asynchronous and iterative computations across various programming languages. JavaScript, Python, PHP, and Ruby each have unique implementations:
- **JavaScript**: Uses `function*` and `yield` for generators.
- **Python**: Uses `def` and `yield` for generators and `async def` and `await` for coroutines.
- **PHP**: Uses `yield` within functions for generators.
- **Ruby**: Uses fibers, with `Fiber.new` and `Fiber.yield`, to implement generator-like functionality.
Understanding these constructs helps developers write more efficient and maintainable asynchronous code. | francescoagati |
1,895,825 | API Integration Best Practices: Ensuring Robust and Scalable Systems | Robust and scalable API integration is critical for success in modern business. Ensuring seamless... | 0 | 2024-06-21T16:52:31 | https://dev.to/apidna/api-integration-best-practices-ensuring-robust-and-scalable-systems-50hd | api, webdev, testing, design | Robust and scalable API integration is critical for success in modern business.
Ensuring seamless communication between different systems requires a strategic approach.
This article delves into essential API integration best practices such as choosing the right solutions, documentation, and testing.
Each of these practices is vital for creating resilient and efficient systems.
While each best practice warrants its own deep dive, we’ve referenced some of our previous articles throughout to provide you with comprehensive insights.
## Choose the Right API Integration Solution and Avoid Embedding Logic
An incompatible solution can lead to numerous problems such as integration failures, increased complexity, higher maintenance costs, and reduced system performance.
The right solution should align with your deployment model, support the necessary protocols and standards, and provide the flexibility to adapt to future changes:
- **Assess Your Deployment Model:** Identify the deployment environment by determining whether your applications are on-premises, cloud-based, or hybrid. For instance, if your applications are primarily cloud-based, opt for an iPaaS (Integration Platform as a Service) solution that is designed for cloud environments.
- **Evaluate Compatibility:** Ensure the integration solution supports the necessary communication protocols (e.g., REST, SOAP) and data formats (e.g., JSON, XML). Assess whether the solution can handle your current and projected API call volumes without compromising performance.
- **Consider Future Needs:** Choose a solution that can easily adapt to future changes, and with an active user community for troubleshooting.

Embedding integration logic directly within applications can lead to tightly coupled systems, making it difficult to update, maintain, or replace individual components without affecting others.
This approach can also cause redundancy, as similar logic may need to be replicated across multiple applications.
By keeping integration logic separate, you promote modularity, reusability, and easier management of integrations.
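As an illustrative sketch (the gateway and vendor names are hypothetical), integration logic can be isolated behind a small adapter so business logic never touches a vendor SDK directly:

```python
from abc import ABC, abstractmethod

class PaymentGateway(ABC):
    """Application code programs against this interface, never a vendor SDK."""

    @abstractmethod
    def charge(self, amount_cents: int, token: str) -> str: ...

class AcmePayAdapter(PaymentGateway):
    # All Acme-specific request shaping lives here; swapping providers
    # means writing one new adapter, not editing business logic.
    def charge(self, amount_cents: int, token: str) -> str:
        payload = {"amount": amount_cents, "source": token}  # vendor-shaped request
        # ...an HTTP call to the vendor API would go here...
        return f"acme-txn-{payload['amount']}"

def checkout(gateway: PaymentGateway, cart_total_cents: int) -> str:
    # Business logic stays integration-agnostic.
    return gateway.charge(cart_total_cents, token="tok_demo")

print(checkout(AcmePayAdapter(), 500))  # acme-txn-500
```

Replacing the provider then means writing one new adapter and changing nothing else.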
At APIDNA, our integration platform is designed to handle integration logic outside of your core applications.
It uses new autonomous agent technology to simplify the whole API integration experience, which you can learn more about from our [previous article here](https://apidna.ai/the-essential-roles-of-autonomous-agents-in-modern-api-integration/).
Alternatively, you can give it a try for yourself by [clicking here](https://apidna.ai/), and beginning your journey to simplify API integrations.
## API Integration Documentation and Understanding
Without a deep understanding, developers might misinterpret how an API works, leading to errors, security vulnerabilities, and inefficient use of resources.
Knowledge of API endpoints, request/response formats, authentication methods, rate limits, and error handling mechanisms is crucial for building robust integrations.
Here’s how to ensure that you thoroughly understand the APIs that you are integrating:
- **Read Official Documentation:** Study the official API documentation from the API provider. Pay attention to endpoint details, request and response formats, authentication methods, rate limits, and error codes. Utilise any developer guides, tutorials, or sample code from the API provider to gain practical insights into API usage.
- **Experiment and Test:** Use a testing environment to experiment with API calls. This helps in understanding how the API behaves without affecting production data.
- **Community and Support:** Engage with developer communities, forums, and support channels. These can provide valuable insights, best practices, and solutions to common issues.

Thoroughly understanding the APIs you are using will make it much easier to document your integrations.
Comprehensive documentation is critical for making integrations easier to maintain, update, and troubleshoot.
Future developers or team members can quickly understand how integrations are set up and how they function.
Documentation also ensures consistency and standardisation across different integration projects as their scale increases.
It aids in compliance with industry standards and ensures that security measures are correctly implemented and adhered to.
You can learn more about the key API security protocols from our [previous article here](https://apidna.ai/api-security-key-protocols/).
Here’s the key aspects to include in your integration documentation:
- Document the API endpoints, request/response formats, authentication methods, rate limits, and error handling mechanisms.
- Describe the integration workflows, including data flow diagrams, process steps, and dependencies.
- Utilise standard formats like OpenAPI since these formats are widely accepted and can be easily understood.
- Use Markdown or HTML to create readable and accessible documentation.
## Leverage Internal Endpoints and Real-Time Capabilities
Internal API endpoints are crucial because they standardise access to commonly-used data and services within an organisation.
They provide a consistent, reusable interface for accessing data, which simplifies the development process, enhances maintainability, and ensures data integrity.
They also reduce duplication of effort, as different teams do not need to write their own methods for accessing the same data.
This promotes code reuse, saving time and resources, and reducing the likelihood of errors.
If the underlying data source or business logic changes, you only need to update the internal API rather than modifying multiple services.

Internal endpoints also play a crucial role in enabling and optimising real-time capabilities.
Real-time capabilities are important because they provide immediate access to up-to-date information, enabling faster decision-making and more responsive applications.
Real-time data is crucial for applications requiring instant feedback, such as financial transactions, live monitoring systems, and customer interactions.
Access to real-time data allows businesses to make informed decisions quickly.
In industries like finance, healthcare, and logistics, real-time data is essential for timely and accurate decision-making.
Real-time capabilities can also streamline operations by automating and accelerating workflows.
For instance, real-time inventory management systems can automatically reorder stock when levels fall below a certain threshold.
Here’s how to implement real-time capabilities:
- **Choose the Right Technology:** Common options include WebSockets, server-sent events (SSE), and message queues (e.g. RabbitMQ, Apache Kafka).
- **Design for Scalability:** Design the architecture to scale horizontally by adding more servers or instances as needed.
- **Implement Efficient Data Handling:** This includes efficient data storage, quick retrieval mechanisms, and minimising network delays.
- **Ensure Data Consistency and Integrity:** This might include transactional guarantees, data validation, and error handling.
- **Monitor and Optimise Performance:** Use performance metrics to identify bottlenecks and optimise the system for better speed and reliability. You can learn more about different performance metrics from our [previous article here](https://apidna.ai/the-role-of-ai-in-optimising-api-performance/).
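The push model behind all of these options can be sketched without any framework at all. Here is a toy in-process broker (illustrative only; production systems replace the in-memory queues with WebSocket connections, SSE streams, or Kafka topics):

```python
import queue
import threading

class Broker:
    """Toy pub/sub core showing push-based real-time delivery."""

    def __init__(self):
        self._subscribers = []
        self._lock = threading.Lock()

    def subscribe(self):
        # Each subscriber gets its own queue to consume events from.
        q = queue.Queue()
        with self._lock:
            self._subscribers.append(q)
        return q

    def publish(self, event):
        # Push, don't poll: every subscriber sees the event immediately.
        with self._lock:
            for q in self._subscribers:
                q.put(event)

broker = Broker()
inbox = broker.subscribe()
broker.publish({"type": "stock_low", "sku": "A-42", "qty": 3})
print(inbox.get(timeout=1))  # {'type': 'stock_low', 'sku': 'A-42', 'qty': 3}
```

The key property is that subscribers never poll for changes; the publisher fans events out the moment they occur.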
## Set Up Automated Testing and Respect Rate Limiting
Automated Testing is essential for ensuring the reliability, functionality, and performance of API integrations.
It involves the use of automated tools to execute predefined test cases, which helps in identifying and fixing issues quickly and efficiently.
Automated tests can be run frequently and consistently without manual intervention, saving time and resources.
This allows for rapid feedback and faster iteration during the development cycle.
You can learn more about API testing by checking out our [previous article here](https://apidna.ai/api-integration-testing-ensuring-robustness-and-reliability/).
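As a minimal illustration (the endpoint and function names are hypothetical), an automated suite for an integration often pins down the response contract with plain `unittest` cases — one case per behavior the integration relies on:

```python
import unittest

def parse_user_response(payload: dict) -> dict:
    """Hypothetical mapper for a GET /users/{id} response."""
    if "id" not in payload:
        raise ValueError("missing required field: id")
    return {"id": payload["id"], "name": payload.get("name", "unknown")}

class UserEndpointContractTest(unittest.TestCase):
    def test_maps_a_complete_response(self):
        self.assertEqual(parse_user_response({"id": 7, "name": "Ada"}),
                         {"id": 7, "name": "Ada"})

    def test_defaults_missing_optional_fields(self):
        self.assertEqual(parse_user_response({"id": 7})["name"], "unknown")

    def test_rejects_malformed_responses(self):
        with self.assertRaises(ValueError):
            parse_user_response({})

# Run the suite programmatically so it can be wired into any CI pipeline.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(UserEndpointContractTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Because these tests exercise only the parsing logic, they run in milliseconds and can execute on every commit.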

One key component of this is automated rate limit testing.
Rate limiting prevents malicious users from overwhelming the API with excessive requests, which can lead to service disruption and degraded performance for legitimate users.
It acts as a protective barrier against Denial-of-Service (DoS) attacks.
By controlling the request rate, it ensures the backend can handle the load without crashing or experiencing significant slowdowns.
It ensures that all users have fair access to the API by restricting the number of requests a single user or client can make within a specific time period.
This prevents any single user from monopolising the API’s resources.
You can learn more about implementing rate limits from our [previous article here](https://apidna.ai/api-rate-limits-a-beginners-guide/).
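A common way to implement such a limit is a token bucket: each client's bucket refills at a fixed rate, and a request is served only while tokens remain. Below is a minimal in-memory sketch (production systems typically keep the buckets in a shared store such as Redis):

```python
import time

class TokenBucket:
    def __init__(self, capacity: int, refill_per_sec: float, now=time.monotonic):
        self.capacity = capacity            # maximum burst size
        self.refill_per_sec = refill_per_sec
        self.tokens = float(capacity)
        self.now = now                      # injectable clock, handy for testing
        self.last = now()

    def allow(self) -> bool:
        current = self.now()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (current - self.last) * self.refill_per_sec)
        self.last = current
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False  # caller should respond 429 Too Many Requests

# With capacity 3 and essentially no time for refill, only 3 requests pass:
bucket = TokenBucket(capacity=3, refill_per_sec=1.0)
print([bucket.allow() for _ in range(5)])  # [True, True, True, False, False]
```

The capacity controls how bursty a client may be, while the refill rate sets the sustained request rate.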
## Conclusion
While we’ve covered essential practices like choosing the right integration solution, understanding and documenting APIs, leveraging internal endpoints, respecting rate limiting, and setting up automated testing, there are many more valuable practices to explore.
For a deeper dive into additional strategies, check out the further reading resources listed below.
## Further Reading
[What is API Rate Limiting and How to Implement It – DATADOME](https://datadome.co/bot-management-protection/what-is-api-rate-limiting/)
[What does integration testing an API involve? – Ministry of Testing](https://club.ministryoftesting.com/t/what-does-integration-testing-an-api-involve/18588)
[Understanding API Endpoints: A Beginner’s Guide to Streamlining Development – Improvitz](https://impactum.mx/understanding-api-endpoints-a-beginners-guide-to-streamlining-development/)
[5 Things to Consider When Choosing An API – PKFSCS](https://www.pkfscs.co.uk/5-things-to-consider-when-choosing-an-api/) | itsrorymurphy |
1,896,284 | Dissecting Anti-TDD Statements | I dissect these 3 statements and hopefully give you insight as to where devs are coming from that say that, a different perspective on using tests to design, and what you can do if you hate updating tests. | 0 | 2024-06-21T16:50:54 | https://jessewarden.com/2024/06/dissecting-anti-tdd-statements.html | tdd, testdrivendevelopment, testing |
---
title: Dissecting Anti-TDD Statements
published: true
description: I dissect these 3 statements and hopefully give you insight as to where devs are coming from that say that, a different perspective on using tests to design, and what you can do if you hate updating tests.
tags: tdd,testdrivendevelopment,testing
canonical_url: https://jessewarden.com/2024/06/dissecting-anti-tdd-statements.html
cover_image:
---
I dissect these 3 statements and hopefully give you insight as to where devs are coming from that say that, a different perspective on using tests to design, and what you can do if you hate updating tests.
> You can’t use TDD unless you know the requirements
Requirements Are Bogus
----------------------
The [requirements are wrong](https://medium.com/defense-unicorns/5-minute-devops-the-three-wrongs-6c660f1287e7). Your understanding of them is wrong. The user's needs will change as you develop and deliver. Repeat.
Delivering Requirements is Not What We Do Here
----------------------------------------------
Your success as a dev is whether you deliver software that's valuable to the user, not whether you built to the requirements. Your job is also to help understand and redefine those requirements as time goes on.
> TDD is pointless because when things change, you have to change the tests, or even delete them + the code. Why do twice the amount of work for zero point?
Test What it Does, Not How it Does
----------------------------------
Testing behavior means you can refactor the code, and the tests don’t have to change. Testing just behavior and not implementation details is harder than most testing zealots make it out to be. You can, however, get better with practice. So practice.
Visualize the API, Then Test It Into Existence
----------------------------------------------
Anti-TDD devs will “explore the problem through writing code” and eventually arrive at something that feels right. Testing that after can be hard, unless they’re one of those rare devs that can write testable code without writing the tests first.
You can arrive at the same design, testing first, and you don’t have to risk having code that’s hard to test/low coverage. While the tests and types you write help guide the design, they too are not infallible or immutable. Doesn’t feel right? Don’t like your design in a particular part? Change it. The types and tests will tell you what needs to change and what doesn’t. As you iterate on this, they’ll continue to tell you if your design is good (e.g. easy to test), and what parts aren’t (e.g. hard to test, or lots of mock/stub setup).
Test Coverage & Coupling
------------------------
When you’re “done” for the day, week, month, you’ll have tests covering the parts you’re not working on, or didn’t realize were coupled, which is a nice side benefit.
> I hate updating tests.
Empathy on Factors at Play
--------------------------
Me too. There are a lot of factors here, specifically on:
- programming language & types
- framework
- skill level
Languages like Elm, Scala, or OCaml have such a good type system, they negate the need for many unit tests (does the code work to a dev’s approval) so you can focus more on acceptance tests (does the code work according to users/business/product people).
Languages like JavaScript or Python are so error prone, you have to write tests just to ensure you can successfully import modules, and this work can be quite tiresome. So you can see how TDD practitioners in something like Haskell are confused when someone in JavaScript is having so much irritation writing unit tests.
Frameworks can make it difficult to test. Angular, for example, requires an immense amount of setup just to test 1 class method. In addition, the way you test class methods is using return values or assertions on class properties, while HTTP calls require expectations with 2 manual steps, whereas testing the DOM requires yet another way. This can make testing not fun at all, or lead someone to just prefer Acceptance Tests only in Cypress or Playwright.
Finally, skill level can prevent many from making progress despite evidence from DORA (⚠️ be wary of the non-transparent research) that it is _the_ only known way for Juniors to not create a big ball of mud (e.g. a large, untestable, technical-debt-filled mess).
Solutions To Challenges
-----------------------
Regardless of language, focus on Acceptance Tests first. For Web UI's, that means things like Playwright/Cypress/Puppeteer where you stub all HTTP calls to ensure your tests work every time (aka Whitebox aka Component tests). For unit tests, only cover the places where your code makes decisions, such as if/thens, switch statements, or does raw data transformation (e.g. JSON.parse, Zod parsing, file reads, fetch response parsing).
Regardless of app or testing framework, try to follow [Pure Core, Imperative Shell](https://destroyallsoftware.com/screencasts/catalog/functional-core-imperative-shell/). It'll result in 80% of your code being pure, and much easier to test. [Scott Wlaschin has a good talk](https://www.youtube.com/watch?v=P1vES9AgfC4) outlining how you can do that.
As soon as you hit the 20% that has side-effects, where you start needing to use Mocks, Spies, or Expectations (as opposed to pure function Stubs), don’t. Just cover those cases in your Acceptance Tests to give yourself a break.
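A minimal sketch of that split (hypothetical payment example): the decision logic is a pure function you can test with plain values, while the shell receives its side-effects as parameters so tests can stub them:

```python
def classify_payment(amount_cents: int, balance_cents: int) -> str:
    # Pure core: every branch here is trivially unit-testable.
    if amount_cents <= 0:
        return "invalid"
    if amount_cents > balance_cents:
        return "declined"
    return "approved"

def process_payment(amount_cents: int, fetch_balance, notify) -> str:
    # Imperative shell: side-effects arrive as parameters, so tests pass
    # stubs while production wires in the real database and notifier.
    decision = classify_payment(amount_cents, fetch_balance())
    if decision == "approved":
        notify(f"charged {amount_cents}")
    return decision

# Test-style usage with stubs:
sent = []
result = process_payment(500, fetch_balance=lambda: 1000, notify=sent.append)
print(result, sent)  # approved ['charged 500']
```

All the branching lives in the pure function, so the shell needs almost no tests of its own.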
If you’re Junior, try following the Chicago method of testing, where you just start unit testing little functions/classes that you wire together into larger classes ([London vs Chicago](https://devlead.io/DevTips/LondonVsChicago)). The tradeoff is all the pieces might not fit together at the end, but you’ll have learned how to write testable code, and how using dependencies and side-effects in your code makes it harder to test. Try not to use mutation, as it’s a side-effect, and treat all data as immutable, as it’s easier to test. If your function/class method needs to do a side-effect, make it take in that class instance/function as a method parameter/function parameter so you can stub it in the test, and in the real code give it a concrete. Get nervous when you see class methods/functions not returning values, as they’re probably doing side-effects. | jesterxl |
1,896,282 | Configuring and Deploying VPCs | 🚀 Exciting News! 🚀 I'm thrilled to announce that I've achieved AWS certification! 🎉 After months of... | 0 | 2024-06-21T16:48:20 | https://dev.to/vidhey071/configuring-and-deploying-vpcs-3p6f | aws | 🚀 Exciting News! 🚀
I'm thrilled to announce that I've achieved AWS certification! 🎉
After months of dedicated learning and hard work, I am now officially AWS certified. This journey has been incredibly rewarding, and I'm looking forward to leveraging this knowledge to drive innovation and efficiency in cloud computing.
A huge thank you to everyone who supported me along the way. Your encouragement and guidance meant the world to me.
Let's continue to push boundaries and explore new possibilities with AWS! 💡 | vidhey071 |
1,896,281 | Docker Compose up --build not working | RHEL version - Red Hat Enterprise Linux release 8.9 (Ootpa) [root@acer abmcl]# ls assets backend ... | 0 | 2024-06-21T16:47:57 | https://dev.to/shubham_mojad_8fcf948a05b/docker-compose-up-build-not-working-17i3 | docker, linux, dns, webdev | RHEL version - Red Hat Enterprise Linux release 8.9 (Ootpa)
```
[root@acer abmcl]# ls
assets backend docker docker-compose.yml docs frontend nginx package.json package-lock.json README.md server
[root@acer-BILLATMTN abmcl]# docker compose up --build
WARN[0000] /home/BillingAutomation/abmcl/docker-compose.yml: `version` is obsolete
[+] Building 51.1s (7/9) docker:default
=> [backend internal] load build definition from Dockerfile 0.0s
=> => transferring dockerfile: 299B 0.0s
=> [backend internal] load metadata for docker.io/library/python:3.11 2.1s
=> [backend internal] load .dockerignore 0.0s
=> => transferring context: 2B 0.0s
=> [backend 1/5] FROM docker.io/library/python:3.11@sha256:01b1035a2912ade481cf6db2381dc10c97ee19a4f670b056138517e22d8ea1c5 0.0s
=> [backend internal] load build context 0.0s
=> => transferring context: 783.83kB 0.0s
=> CACHED [backend 2/5] WORKDIR /server 0.0s
=> CACHED [backend 3/5] COPY requirements.txt . 0.0s
=> [backend 4/5] RUN pip install -r requirements.txt 48.9s
=> => # WARNING: Retrying (Retry(total=4, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<pip._vendor.urllib3.connection.HTTPSConnec
=> => # tion object at 0x7f4036ad0110>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution')': /simple/fastapi/
=> => # WARNING: Retrying (Retry(total=3, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<pip._vendor.urllib3.connection.HTTPSConnec
=> => # tion object at 0x7f4036ad0910>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution')': /simple/fastapi/
=> => # WARNING: Retrying (Retry(total=2, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<pip._vendor.urllib3.connection.HTTPSConnec
=> => # tion object at 0x7f4036ab0810>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution')': /simple/fastapi/
```
| shubham_mojad_8fcf948a05b |
1,896,280 | Migration Evaluator Overview of Customers | 🚀 Exciting News! 🚀 I am thrilled to announce that I have achieved my AWS certification! 🎉 After... | 0 | 2024-06-21T16:47:36 | https://dev.to/vidhey071/migration-evaluator-overview-of-customers-1882 | aws | 🚀 Exciting News! 🚀
I am thrilled to announce that I have achieved my AWS certification! 🎉 After months of hard work and dedication, I am now certified with this certificate. This accomplishment signifies my commitment to mastering AWS services and best practices, enhancing my skills in cloud computing and infrastructure management.
I am grateful for the support of my colleagues, mentors, and the invaluable resources provided by AWS. This journey has been incredibly rewarding, and I look forward to applying my knowledge to deliver innovative solutions and contribute effectively to our projects.
Thank you all for your encouragement and belief in my abilities. Let's continue to strive for excellence together! | vidhey071 |
1,896,277 | VS Code - Code Helper process using more than 100% CPU on macOS | Possible solution is... | 0 | 2024-06-21T16:46:09 | https://dev.to/dingzhanjun/vs-code-code-helper-process-using-more-than-100-cpu-on-macos-4maa | Possible solution is here:
https://apple.stackexchange.com/questions/351761/vs-code-code-helper-process-using-more-than-100-cpu-on-macos | dingzhanjun | |
1,896,276 | Don't Be The Next Victim! This Library Will Keep You Safe! | Helmet.js is a powerful middleware for Express.js that helps secure your web applications by setting... | 0 | 2024-06-21T16:45:35 | https://dev.to/ashsajal/dont-be-the-next-victim-this-library-will-keep-you-safe-27jo | Helmet.js is a powerful middleware for Express.js that helps secure your web applications by setting various security-related HTTP response headers. This guide will provide a detailed overview of Helmet's capabilities, its default settings, and how to customize its behavior to meet your specific security requirements.
**Getting Started**
Here's a simple example of using Helmet in your Express.js application:
```javascript
import express from "express";
import helmet from "helmet";
const app = express();
// Use Helmet!
app.use(helmet());
app.get("/", (req, res) => {
res.send("Hello world!");
});
app.listen(8000);
```
**Default Headers**
By default, Helmet sets the following headers:
* **Content-Security-Policy:** Mitigates XSS attacks by defining a whitelist of allowed resources (scripts, images, stylesheets).
* **Cross-Origin-Opener-Policy:** Helps process-isolate your web page.
* **Cross-Origin-Resource-Policy:** Blocks others from loading your resources cross-origin.
* **Origin-Agent-Cluster:** Changes process isolation to be origin-based.
* **Referrer-Policy:** Controls the Referer header, which can be used to track user behavior.
* **Strict-Transport-Security:** Tells browsers to prefer HTTPS over HTTP.
* **X-Content-Type-Options:** Avoids MIME sniffing attacks.
* **X-DNS-Prefetch-Control:** Controls DNS prefetching.
* **X-Download-Options:** Forces downloads to be saved (Internet Explorer only).
* **X-Frame-Options:** Legacy header that mitigates clickjacking attacks.
* **X-Permitted-Cross-Domain-Policies:** Controls cross-domain behavior for Adobe products.
* **X-Powered-By:** Info about the web server. Removed by Helmet because it could be used in simple attacks.
* **X-XSS-Protection:** Legacy header that tries to mitigate XSS attacks; Helmet sets it to `0` by default because the browser feature it controls can itself cause security issues.
**Configuring Headers**
Each header can be customized. For example, here's how you configure the Content-Security-Policy header:
```javascript
// This sets custom options for the
// Content-Security-Policy header.
app.use(
helmet({
contentSecurityPolicy: {
directives: {
"script-src": ["'self'", "example.com"],
},
},
})
);
```
**Disabling Headers**
You can also disable specific headers. For example, here's how to disable the Content-Security-Policy and X-Download-Options headers:
```javascript
// This disables the Content-Security-Policy
// and X-Download-Options headers.
app.use(
helmet({
contentSecurityPolicy: false,
xDownloadOptions: false,
})
);
```
**Detailed Header Configuration**
Let's delve into the configuration options for each header:
**Content-Security-Policy**
* **Directives:** A nested object containing directives for the Content-Security-Policy header. Each key represents a directive name in camel case (e.g., `defaultSrc`) or kebab case (e.g., `default-src`). Each value is an array (or iterable) of strings or functions for that directive.
* **Use Defaults:** A boolean value (defaults to `true`) that determines whether to use the default directives.
* **Report Only:** A boolean value (defaults to `false`) that sets the `Content-Security-Policy-Report-Only` header instead of the standard `Content-Security-Policy` header. This allows you to test your CSP configuration without blocking resources.
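For example, report-only mode could be enabled like this (a sketch; the directive value is a placeholder):

```javascript
import express from "express";
import helmet from "helmet";

const app = express();

// Sketch: send Content-Security-Policy-Report-Only so violations are
// reported by the browser but resources are not actually blocked.
app.use(
  helmet({
    contentSecurityPolicy: {
      reportOnly: true,
      directives: {
        "script-src": ["'self'"],
      },
    },
  })
);
```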
**Cross-Origin-Embedder-Policy**
* **Policy:** A string representing the policy. Options include:
* `require-corp`: Requires the embedder to have the same origin as the embedded content.
* `credentialless`: Allows embedding only if the embedder is not sending credentials.
**Cross-Origin-Opener-Policy**
* **Policy:** A string representing the policy. Options include:
* `same-origin`: Allows embedding only if the embedder has the same origin as the embedded content.
* `same-origin-allow-popups`: Allows embedding only if the embedder has the same origin as the embedded content and allows popups.
**Cross-Origin-Resource-Policy**
* **Policy:** A string representing the policy. Options include:
* `same-origin`: Blocks cross-origin requests.
* `same-site`: Allows requests only from the same site.
**Origin-Agent-Cluster**
This header takes no options and is set by default.
**Referrer-Policy**
* **Policy:** A string or array of strings representing the policy. Options include:
* `no-referrer`: Sends no Referer header.
* `origin`: Sends the origin of the request.
* `unsafe-url`: Sends the full URL.
**Strict-Transport-Security**
* **Max Age:** The number of seconds browsers should remember to prefer HTTPS.
* **Include Subdomains:** A boolean value (defaults to `true`) that dictates whether to include the `includeSubDomains` directive, extending the policy to subdomains.
* **Preload:** A boolean value (defaults to `false`) that adds the `preload` directive, expressing intent to add your HSTS policy to browsers.
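As an illustration, these options might be combined like this (a sketch; the one-year `maxAge` is just an example value, and Helmet versions before v7 expose this option under the name `hsts`):

```javascript
import express from "express";
import helmet from "helmet";

const app = express();

// Sketch: ask browsers to remember HTTPS for one year, including subdomains.
app.use(
  helmet({
    strictTransportSecurity: {
      maxAge: 31536000, // one year, in seconds
      includeSubDomains: true,
      preload: false, // enable only after reviewing the hstspreload.org requirements
    },
  })
);
```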
**X-Content-Type-Options**
This header takes no options and is set by default.
**X-DNS-Prefetch-Control**
* **Allow:** A boolean value (defaults to `false`) that dictates whether to enable DNS prefetching.
**X-Download-Options**
This header takes no options and is set by default.
**X-Frame-Options**
* **Action:** A string that specifies which directive to use: `DENY` or `SAMEORIGIN`.
**X-Permitted-Cross-Domain-Policies**
* **Permitted Policies:** A string that must be "none", "master-only", "by-content-type", or "all".
**X-Powered-By**
This header is removed by default.
**X-XSS-Protection**
Helmet sets this header to `0` by default, explicitly disabling the legacy browser XSS auditor.
**Standalone Middleware**
Helmet provides standalone middleware for each header, allowing you to use them individually. For example:
```javascript
app.use(helmet.contentSecurityPolicy());
```
## Helmet.js Reference
**Official Documentation:**
* **Helmet.js GitHub Repository:** [https://helmetjs.github.io/](https://helmetjs.github.io/) - The primary source for documentation, examples, and API reference.
**Additional Resources:**
* **MDN Web Docs:** [https://developer.mozilla.org/en-US/](https://developer.mozilla.org/en-US/) - Provides detailed information on each supported HTTP header.
* **OWASP:** [https://owasp.org/](https://owasp.org/) - Offers comprehensive security guidance and best practices.
* **CSP Evaluator:** [https://csp-evaluator.withgoogle.com/](https://csp-evaluator.withgoogle.com/) - A tool for validating and testing Content-Security-Policy configurations.
**Conclusion**
Helmet.js is an essential tool for securing your Express.js applications. By setting security-related HTTP response headers, it helps protect your website from common attacks. Remember to carefully configure Helmet to meet your specific security needs and to keep your application up-to-date with the latest security patches.
**Follow me in [X/Twitter](https://twitter.com/ashsajal1)** | ashsajal | |
1,896,275 | Amazon EBS Primer | 🚀 Exciting News! 🚀 I'm thrilled to announce that I've achieved AWS certification! 🎉 After months of... | 0 | 2024-06-21T16:43:08 | https://dev.to/vidhey071/amazon-ebs-primer-5786 | aws | 🚀 Exciting News! 🚀
I'm thrilled to announce that I've achieved AWS certification! 🎉
After months of dedicated learning and hard work, I am now officially certified with this certificate. This journey has been incredibly rewarding, and I'm looking forward to leveraging this knowledge to drive innovation and efficiency in cloud computing.
A huge thank you to everyone who supported me along the way. Your encouragement and guidance meant the world to me.
Let's continue to push boundaries and explore new possibilities with AWS! 💡 | vidhey071 |
1,896,274 | AWS Network Connectivity Options | 🚀 Exciting News! 🚀 I am thrilled to announce that I have achieved my AWS certification! 🎉 After... | 0 | 2024-06-21T16:42:34 | https://dev.to/vidhey071/aws-network-connectivity-options-4b7j | aws | 🚀 Exciting News! 🚀
I am thrilled to announce that I have achieved my AWS certification! 🎉 After months of hard work and dedication, I am now certified with this certificate. This accomplishment signifies my commitment to mastering AWS services and best practices, enhancing my skills in cloud computing and infrastructure management.
I am grateful for the support of my colleagues, mentors, and the invaluable resources provided by AWS. This journey has been incredibly rewarding, and I look forward to applying my knowledge to deliver innovative solutions and contribute effectively to our projects.
Thank you all for your encouragement and belief in my abilities. Let's continue to strive for excellence together! | vidhey071 |
1,893,488 | Building a Custom Flutter Widget from Scratch | Walk through the process of creating a reusable widget with unique functionality, animations, and interactions. | 0 | 2024-06-21T16:38:57 | https://dev.to/harsh8088/building-a-custom-flutter-widget-from-scratch-335i | flutter, custom, widgets | ---
title: Building a Custom Flutter Widget from Scratch
published: true
description: Walk through the process of creating a reusable widget with unique functionality, animations, and interactions.
tags: #flutter, #custom, #widgets
cover_image: https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ov27p2j7nuo40r2d2fjr.png
# Use a ratio of 100:42 for best results.
# published_at: 2024-06-19 11:05 +0000
---
Flutter's magic lies in its extensive widget library. But what if you need a UI element that doesn't quite fit the mold? That's where custom widgets come in! Buckle up, Flutter developers, as we embark on a journey to build a custom widget from scratch.
**Why Custom Widgets?**
Custom widgets offer a treasure trove of benefits:
* **Reusability:** Write your widget once, use it everywhere! This saves code, promotes consistency, and streamlines development.
* **Encapsulation:** Package functionality and appearance into a neat unit, keeping your code clean and organized.
* **Customization:** Tailor your widget's behavior and appearance to precisely meet your needs.
**Let's Build a Star Rating Widget!**
Imagine a widget that displays a row of stars, allowing users to rate something. Here's how we'll break it down:
**1. Setting Up:**
Create a new Flutter project and a dedicated Dart file for your widget (e.g., custom_rating_bar.dart).
Import necessary packages like flutter and material.
**2. The CustomRatingBar Class:**
Define a class named `CustomRatingBar` that extends `StatelessWidget`.
**3. Star Count and Rating Properties:**
Add properties to the `CustomRatingBar` class:
`starCount`: An integer representing the total number of stars.
`rating`: A double representing the current user rating (optional).
`filledColor`: Color representing the filled color for the star.
`unfilledColor`: Color representing the unfilled color for the star.
**4. Building the Stars:**
Override the build method of the `CustomRatingBar` class.
Use a Row widget to display the stars horizontally.
Loop through a list based on `starCount`.
Inside the loop, use an `Icon` for each star.
Customize the Icon displayed based on the current rating (filled star for selected, unfilled star for unselected).
**5. Handling User Interaction:**
Set the `onTap` callback of the `GestureDetector` to update the `rating` property.
Consider emitting an event (using a `ValueNotifier` or similar) to notify parent widgets about rating changes.
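One way to wire this up is with a `ValueNotifier` — a sketch that assumes `onRatingChanged` is invoked with the newly selected value:

```dart
// Sketch: surfacing rating changes to the rest of the UI.
// Assumes CustomRatingBar invokes onRatingChanged with the new value.
final rating = ValueNotifier<double>(0);

// Rebuilds its subtree whenever the notifier's value changes.
ValueListenableBuilder<double>(
  valueListenable: rating,
  builder: (context, value, _) => Text('Rating: $value'),
);

CustomRatingBar(
  starCount: 5,
  rating: rating.value,
  filledColor: Colors.amber,
  unfilledColor: Colors.grey,
  onRatingChanged: (newRating) => rating.value = newRating,
);
```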
**6. Adding Flair (Optional):**
Style your stars using `Icon` properties like color and size.
Implement custom animations for star selection using AnimatedIcon.
**Putting it All Together:**
With all the pieces in place, use the `CustomRatingBar` widget in your app's layout:
```dart
CustomRatingBar(
    starCount: 5,
    rating: 1.0,
    filledColor: Colors.amber,
    unfilledColor: Colors.grey,
    onRatingChanged: (newRating) {})
```
`custom_rating_bar.dart`
```dart
import 'package:flutter/material.dart';
class CustomRatingBar extends StatefulWidget {
final double rating;
final int starCount;
final Function onRatingChanged;
final Color filledColor;
final Color unfilledColor;
const CustomRatingBar({
super.key,
required this.rating,
required this.starCount,
required this.onRatingChanged,
required this.unfilledColor,
required this.filledColor,
});
@override
State<CustomRatingBar> createState() => _CustomRatingBarState();
}
class _CustomRatingBarState extends State<CustomRatingBar> {
double _currentRating = 0.0;
@override
void initState() {
super.initState();
    _currentRating = widget.rating - 1; // convert the 1-based rating to a 0-based star index
}
@override
Widget build(BuildContext context) {
return Row(
children: List.generate(
widget.starCount,
(index) => GestureDetector(
onTap: () => _onStarTap(index.toDouble()),
child: Icon(
Icons.star,
size: 30.0,
color: _getColor(index),
),
),
),
);
}
Color _getColor(int index) {
if (index <= _currentRating) {
return widget.filledColor;
} else {
return widget.unfilledColor;
}
}
  void _onStarTap(double newRating) {
    if (_currentRating == newRating) {
      newRating--; // tapping the selected star again deselects it
    }
    setState(() {
      _currentRating = newRating;
    });
    // Notify the parent widget about the rating change.
    widget.onRatingChanged(newRating);
  }
}
```
**Conclusion**
This blog demonstrates a basic implementation of a custom rating bar in Flutter. It offers features like:
**Star Count:** You can easily adjust the number of stars displayed.
**Rating:** Users can tap on stars to provide their rating.
**Colors:** Define the desired colors and icons for filled, empty, and half-filled states (if applicable).
**Further Enhancement Scopes**
* **Animated Selection:** The selected stars smoothly animate with a scaling effect.
* **Half Rating Support:** Modify the function to include logic for displaying half-filled stars.
[GitHub Code](https://github.com/harsh8088/flutter_samples/blob/master/lib/custom/custom_rating_bar.dart)
Happy Coding!!! 🧑🏻💻 | harsh8088 |
1,896,273 | How to Write a Resume that Doesn't Suck | What’s up fam!! I survived my third year at Render, and it was another stellar year. Hats off to... | 0 | 2024-06-21T16:35:59 | https://dev.to/tdesseyn/how-to-write-a-resume-that-doesnt-suck-4kei | job, resume | What’s up fam!! I survived my third year at Render, and it was another stellar year. Hats off to Justin Samuels and the crew for putting on a BANGER of an event. They already have 2025 up and running for tickets and they are SUPER cheap so go ahead and buy them ASAP.
I do want to plug another amazing event I caught wind of from my dear friend Chris DeMars. You know I’m all about a virtual networking conference and this has the makings of a good one! It’s Digital Ocean’s yearly conference that is FREE and they are creating virtual spaces (hubs) where folks can come in an engage with the community, speak to solution architects about building on DO, partnerships hub etc. If you want to sign up head over to: [https://do.co/4cxzdUf](https://do.co/4cxzdUf)
Finally - I am on baby watch!! Me and the wife are 2 weeks out (July 4th) of having our second kid. CAN’T WAIT FOR ALL THE SLEEPLESS NIGHTS. But for real, we’re pumped. If you don’t hear from me next week you know why!
Been really into doing lives/podcast episodes by myself again to go over pretty tactical job search stuff. Let me know if you like it with just me or with guests! Here is an overview on how to write a resume that doesn’t suck! And if you’re a visual learner, I walk you through what a good resume looks like [here](https://www.youtube.com/live/GLSJxpzpVsE).
But my best advice is that if you’re looking for resume help try to stay within your industry. And if you think you’re going to take a short cut and pay someone to write it or send it through ChatGPT… I mean you do you, but it’s pretty easy to spot a generic template that has a little personal info dropped in it.
Here’s my quick advice:
**Leverage the Header space**
This is where you’re going to throw your name and contact info. Remember we’re trying to get as much on the first page as possible.
Make sure you can manage whatever contact info you include. So if you don’t want to be fielding surprise calls, maybe just include your email.
You can also link to your LinkedIn or GitHub here. But remember, you don’t want too many links so it doesn’t get tossed when going through certain ATS.
**Professional Summary**
Outline your career as it relates to the job you’re applying to (as best as you can). It’s not really feasible to adjust your resume for EVERYTHING you apply to, but if you have the time I’d recommend it.
This is your quick grab highlight reel. You should be writing things like “8+ Years of Python development” not full sentences.
**Professional Experience**
Use clear formatting
Bold should be used to break up the major text blocks, not every other line.
Company mm/year - mm/year
Title
Location (or Remote)
First Bullet Point:
Give me an overview of what the company does mainly around revenue and/or sort of idea of how big or important the company is. I’m not going out of my way to Google your company.
Second Bullet Point:
Give me an overview of why you were hired and what your main responsibility is.
Rest of the Bullet Points:
Start with action words (managed, ran, created, etc)
A little less tasks, a lot more action.
Talk about quantitative things YOU have done.
Get the actual stats.
I’m sure your team is nice and all, but this resume should be all about you.
Keep each line to one sentence/line. This is going to be a challenge for some of y’all, but it makes your resume so much easier to read.
Rinse and repeat until you’ve built our your full resume. Length wise, I say keep it three pages or under. I promise you can do it. Another big thing to note is your resume is probably going to be read on someone’s phone. So when it comes to formatting, no one is going to want to read your three column wide text block. Keep it short, keep it concise, keep it clean. | tdesseyn |
1,896,272 | Weekly Updates - June 21, 2024 | Hi everyone! 👋 We hope you had a great week. 👏 Exciting updates for Developers! - The... | 0 | 2024-06-21T16:34:44 | https://dev.to/couchbase/weekly-updates-june-21-2024-c2f | couchbase, community, learning, appdev | Hi everyone! :wave:
We hope you had a great week.
- 👏 **Exciting updates for Developers!** - The Couchbase VSCode Extension now supports GitHub Codespaces, Google Project IDX, and other remote development environments! [*Learn more here >>*](https://www.couchbase.com/blog/couchbase-vscode-remote-development-environments/)
<br>
- :book: **New Blog: Building Modern Apps with Couchbase in Q1** - We recently celebrated a successful quarter and some accomplishments we had. We want to share some customer highlights with you! [*Read about our first quarter and some of our customers here >>*](https://www.couchbase.com/blog/building-modern-apps-couchbase-q1-fy25/)
<br>
- 💻 **Level up your career with Couchbase!** We're searching for talent to join our global teams across engineering, sales, marketing, and more! [*Explore our various job openings here >>*](https://www.couchbase.com/careers/open-positions/)
<br>
- :mortar_board: **Free 90-minutes hands-on course for developers coming to a time zone near you!** Check out our FREE hands-on course for Developers. In just 90 minutes with a dedicated instructor, you'll learn everything you need to get started with Couchbase Capella and kick off your Couchbase certification journey. [*Find a course that suits your time zone here >>*](https://www.couchbase.com/couchbase-capella-test-drive/?utm_campaign=adaptive-apps&utm_medium=event&utm_source=meetup&utm_content=webinar&utm_term=developer)
Have a great weekend!
Team Couchbase
| brianking |
1,896,163 | Getting Started with RabbitMQ and Python: A Practical Guide | Introduction In this article, we will guide you through setting up and using RabbitMQ with... | 0 | 2024-06-21T16:34:41 | https://dev.to/felipepaz/getting-started-with-rabbitmq-and-python-a-practical-guide-57fi | rabbitmq, python | ## Introduction
In this article, we will guide you through setting up and using RabbitMQ with Python. RabbitMQ is a powerful message broker that allows applications to communicate with each other via messages. This practical guide will show you how to connect to RabbitMQ, publish messages to a queue, and consume messages from a queue using Python. Additionally, we will use Docker to manage RabbitMQ in a containerized environment, ensuring a smooth and isolated setup. Whether you are new to message brokers or looking to integrate RabbitMQ into your Python projects, this guide will provide you with a solid foundation to get started.
## Summary
In this article, we will cover the essentials of setting up and using RabbitMQ with Python. You will learn how to:
* Set up your development environment.
* Use Docker to run RabbitMQ in a container.
* Connect to RabbitMQ from a Python application.
* Publish messages to a queue.
* Consume messages from a queue.
* Access the RabbitMQ management console.
By the end of this guide, you will have a working example of a RabbitMQ setup with Python and Docker, and you will be ready to integrate these tools into your own projects.
## Prerequisites
Before we begin, make sure you have the following tools and technologies installed on your system:
* Python 3.8+: Ensure you have Python installed. You can download it from the official Python website.
* Docker: Docker is required to run RabbitMQ in a containerized environment. You can download and install Docker from the official Docker website.
* Docker Compose: Docker Compose is used to manage multi-container Docker applications. It is included with Docker Desktop.
Having these prerequisites installed will ensure a smooth setup and execution of the examples provided in this guide.
## Environment Setup
To get started, follow these steps to set up your development environment:
1. Create and activate the virtual environment
Creating a virtual environment helps to isolate your project’s dependencies. Here’s how you can create and activate a virtual environment:
- Create the virtual environment:
```sh
python -m venv .venv
```
- Activate the virtual environment:
* On Windows:
```sh
.venv\Scripts\activate
```
* On macOS/Linux:
```sh
source .venv/bin/activate
```
2. Install the dependencies
Next, install the required Python packages. For this project, we will use the pika library to interact with RabbitMQ.
- Install the required packages:
```sh
pip install pika
```
- Save the installed packages to requirements.txt:
```sh
pip freeze > requirements.txt
```
With the environment set up and dependencies installed, you’re ready to move on to the next steps in configuring RabbitMQ with Docker.
## Docker Setup
In this section, we will configure and start RabbitMQ using Docker. This allows us to easily manage and isolate RabbitMQ in a containerized environment.
1. Create a `docker-compose.yaml` file
Create a file named `docker-compose.yaml` in the root of your project directory. Add the following content to the file:
```yaml
version: '3.8'
services:
rabbitmq:
image: rabbitmq:3-management
container_name: rabbitmq
ports:
- '5672:5672'
- '15672:15672'
environment:
RABBITMQ_DEFAULT_USER: guest
RABBITMQ_DEFAULT_PASS: guest
```
This configuration sets up RabbitMQ with the management plugin enabled, allowing you to access the RabbitMQ management console on port 15672.
2. Start the RabbitMQ container
Run the following command to build and start the RabbitMQ container:
```sh
docker-compose up -d
```
This command will download the RabbitMQ Docker image (if not already available locally), start the container, and run it in the background. The RabbitMQ server will be accessible at `localhost:5672`, and the management console will be available at `http://localhost:15672`.
With RabbitMQ running in a Docker container, you are ready to move on to the next steps of developing the application and integrating RabbitMQ with your Python code.
## Application Development
In this section, we will develop the application that connects to RabbitMQ, publishes messages to a queue, and consumes messages from a queue. We’ll start by implementing a class to manage the connection and interactions with RabbitMQ.
1. Implement the RabbitMQ class
Create a file named `rabbitmq.py` in the root of your project directory. Add the following content to the file:
```python
import pika
import os
class RabbitMQ:
def __init__(self):
        # Defaults match the credentials in docker-compose.yaml.
        self.user = os.getenv('RABBITMQ_USER', 'guest')
        self.password = os.getenv('RABBITMQ_PASSWORD', 'guest')
self.host = os.getenv('RABBITMQ_HOST', 'localhost')
self.port = int(os.getenv('RABBITMQ_PORT', 5672))
self.connection = None
self.channel = None
self.connect()
def connect(self):
credentials = pika.PlainCredentials(self.user, self.password)
parameters = pika.ConnectionParameters(host=self.host, port=self.port, credentials=credentials)
self.connection = pika.BlockingConnection(parameters)
self.channel = self.connection.channel()
def close(self):
if self.connection and not self.connection.is_closed:
self.connection.close()
    def consume(self, queue_name, callback):
        if not self.channel:
            raise Exception("Connection is not established.")
        # Declare the queue so consuming works even if the publisher
        # has not created it yet (queue_declare is idempotent).
        self.channel.queue_declare(queue=queue_name, durable=True)
        self.channel.basic_consume(queue=queue_name, on_message_callback=callback, auto_ack=True)
        self.channel.start_consuming()
def publish(self, queue_name, message):
if not self.channel:
raise Exception("Connection is not established.")
self.channel.queue_declare(queue=queue_name, durable=True)
self.channel.basic_publish(exchange='',
routing_key=queue_name,
body=message,
properties=pika.BasicProperties(
delivery_mode=2, # make message persistent
))
print(f"Sent message to queue {queue_name}: {message}")
```
This class handles connecting to RabbitMQ, publishing messages to a queue, and consuming messages from a queue. The connection parameters are read from environment variables.
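Because the settings come from the environment, you can point the class at a different broker without touching the code. A quick stdlib-only illustration of how these lookups resolve (the host name here is made up):

```python
import os

# Hypothetical override, e.g. set in the shell or a .env file.
os.environ["RABBITMQ_HOST"] = "broker.internal"

# Mirrors the lookups in RabbitMQ.__init__:
host = os.getenv("RABBITMQ_HOST", "localhost")  # -> "broker.internal"
port = int(os.getenv("RABBITMQ_PORT", 5672))    # -> 5672 (the default, as an int)
print(host, port)
```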
2. Create the main.py script
Create a file named `main.py` in the root of your project directory. Add the following content to the file:
```python
from rabbitmq import RabbitMQ
import sys
def callback(ch, method, properties, body):
print(f"Received message: {body}")
def main():
rabbitmq = RabbitMQ()
try:
print("Connection to RabbitMQ established successfully.")
rabbitmq.consume(queue_name='test_queue', callback=callback)
except Exception as e:
print(f"Failed to establish connection to RabbitMQ: {e}")
sys.exit(1)
finally:
rabbitmq.close()
if __name__ == "__main__":
main()
```
This script connects to RabbitMQ and starts consuming messages from a queue named *test_queue*.
3. Create the publisher.py script
Create a file named `publisher.py` in the root of your project directory. Add the following content to the file:
```python
from rabbitmq import RabbitMQ
def publish_test_message():
rabbitmq = RabbitMQ()
try:
rabbitmq.publish(queue_name='test_queue', message='Test message')
print("Test message published successfully.")
except Exception as e:
print(f"Failed to publish test message: {e}")
finally:
rabbitmq.close()
if __name__ == "__main__":
publish_test_message()
```
This script publishes a test message to the *test_queue*.
With these scripts in place, you are ready to run and test your application.
## Running the Application
In this section, we will run the application to ensure everything is set up correctly and that we can successfully publish and consume messages using RabbitMQ.
1. Start the RabbitMQ server
Run the `main.py` script to connect to the **RabbitMQ** server and begin consuming messages from the *test_queue*:
```sh
python main.py
```
You should see a message indicating that the connection to RabbitMQ was established successfully. The script will continue running and wait for messages to consume from the test_queue.
2. Publish a test message
Open a new terminal window or tab, and run the *publisher.py* script to publish a test message to the *test_queue*:
```sh
python publisher.py
```
You should see a message indicating that the test message was published successfully. The main.py script should also display the received message, indicating that the message was successfully consumed from the test_queue.
### Example Output
When you run *main.py*, you should see something like this:
```sh
Connection to RabbitMQ established successfully.
Received message: b'Test message'
```
When you run *publisher.py*, you should see something like this:
```sh
Sent message to queue test_queue: Test message
Test message published successfully.
```
## Conclusion
In this guide, we covered the steps to set up and use RabbitMQ with Python. We demonstrated how to configure the development environment, run RabbitMQ in a Docker container, and create a simple application to publish and consume messages. This should give you a solid foundation to start integrating RabbitMQ into your own projects.
For more information and advanced usage of RabbitMQ, please refer to the official [RabbitMQ documentation](https://www.rabbitmq.com/docs).
## GitHub Repository
You can find the complete code for this project in the following GitHub repository: [python-rabbitmq](https://github.com/pazfelipe/python-rabbitmq).
This repository contains all the scripts and configurations discussed in this article, including the docker-compose.yaml, rabbitmq.py, main.py, publisher.py, and the README.md with detailed setup instructions. Feel free to clone the repository and experiment with the code to further your understanding of RabbitMQ and Python integration. | felipepaz |
1,896,271 | AWS Networking Gateways | 🚀 Exciting News! 🚀 I'm thrilled to announce that I've achieved AWS certification! 🎉 After months of... | 0 | 2024-06-21T16:33:14 | https://dev.to/vidhey071/aws-networking-gateways-39em | aws | 🚀 Exciting News! 🚀
I'm thrilled to announce that I've achieved AWS certification! 🎉
After months of dedicated learning and hard work, I am now officially certified with this certificate. This journey has been incredibly rewarding, and I'm looking forward to leveraging this knowledge to drive innovation and efficiency in cloud computing.
A huge thank you to everyone who supported me along the way. Your encouragement and guidance meant the world to me.
Let's continue to push boundaries and explore new possibilities with AWS! 💡 | vidhey071 |
1,896,270 | Global Weather APIs: Which One Is Right for Your Project? | In today’s technology-driven world, the demand for precise and real-time weather data has surged... | 0 | 2024-06-21T16:32:32 | https://dev.to/sameeranthony/global-weather-apis-which-one-is-right-for-your-project-3c3a | webdev, api, weather, react | In today’s technology-driven world, the demand for precise and real-time weather data has surged significantly. Whether you're developing a weather app, a free local weather forecast service for your community, or an intricate climate-monitoring system, choosing the right global weather API is crucial.
Understanding Weather APIs
Weather APIs are interfaces that provide access to weather data. They offer functionalities such as current weather conditions, forecasts, historical data, and more. With REST API weather services, developers can integrate these functionalities into their applications, ensuring that users have access to [real time weather data](https://weatherstack.com/product) seamlessly. The choice of a public weather API can determine the accuracy and reliability of the weather information your project delivers.
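Whichever provider you choose, the integration pattern is similar: build a query URL carrying your access key and location, then read a handful of fields out of the JSON response. The sketch below illustrates the idea in plain Node.js. Note that the endpoint, parameter names, and response fields are illustrative assumptions modeled on typical current-weather APIs, not any specific provider's exact contract:

```javascript
// Illustrative only: the endpoint, parameter names, and response fields are
// assumptions modeled on typical current-weather APIs. Always check your
// provider's documentation for the real contract.
const BASE_URL = "https://api.example-weather.com/current"; // hypothetical endpoint

// Build the request URL for a current-weather lookup.
function buildCurrentWeatherUrl(accessKey, query) {
  const params = new URLSearchParams({ access_key: accessKey, query });
  return `${BASE_URL}?${params.toString()}`;
}

// Extract the fields most apps need from an already-parsed JSON payload.
function summarize(payload) {
  const { location, current } = payload;
  return `${location.name}: ${current.temperature}°C, ${current.weather_descriptions[0]}`;
}

// Hard-coded sample payload so the sketch runs without a network call.
const sample = {
  location: { name: "London" },
  current: { temperature: 18, weather_descriptions: ["Partly cloudy"] },
};

console.log(buildCurrentWeatherUrl("YOUR_KEY", "London"));
console.log(summarize(sample)); // → "London: 18°C, Partly cloudy"
```

Swapping providers then mostly comes down to changing the base URL, the auth parameter, and the field paths in `summarize`, which is a good reason to keep this glue code in one small module.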
Weatherstack API
Among the prominent names in the field is the weatherstack API. Known for its reliability and ease of use, it provides real-time weather data for any location worldwide. Weatherstack's extensive data coverage makes it a go-to choice for applications requiring accurate and timely weather updates. Additionally, it offers a free API weather plan, which is beneficial for smaller projects or startups with limited budgets. Its response format in JSON makes it a simple weather API to work with, ensuring smooth integration into various systems.
OpenWeatherMap API
Another popular option is the OpenWeatherMap API. It is widely regarded as one of the best free weather APIs available. OpenWeatherMap offers a comprehensive set of data, including current conditions, forecasts, and historical data. One of its standout features is the provision of a weather API forecast, which is ideal for applications needing forward-looking weather information. The OpenWeather API free tier is particularly attractive for developers looking to test their projects without incurring initial costs.
Weatherbit API
For developers seeking highly detailed and accurate weather data, the Weatherbit API is an excellent choice. This free weather forecast API provides extensive meteorological data, including temperature, wind speed, humidity, and more. Weatherbit is known for its high precision, making it suitable for applications where accuracy is paramount. The API’s structure is straightforward, allowing for easy integration and efficient use in both web and mobile applications.
AccuWeather API
AccuWeather is synonymous with weather accuracy. The AccuWeather API offers a wide range of data, from current conditions and forecasts to severe weather alerts. It is often cited as the most accurate weather site, thanks to its sophisticated forecasting models and real-time updates. While it may not offer a completely free current weather API, its pricing plans are scalable, catering to both small and large projects. For developers looking for a weather report API with high reliability, AccuWeather is a solid choice.
Weather Company API
The Weather Company API, part of IBM's suite of services, provides robust weather data suitable for enterprise-level applications. It offers comprehensive weather information, including forecasts, current conditions, and historical data. This live weather API is particularly useful for industries such as aviation, logistics, and agriculture, where weather plays a critical role. Although it is not entirely a free API weather service, the value it offers justifies the investment for businesses requiring advanced weather analytics.
Comparing APIs: Which One is Right for You?
Choosing the right [global weather API](https://weatherstack.com/) depends on several factors, including the specific requirements of your project, budget, and the level of accuracy needed. Here are some key considerations:
- **Data Accuracy:** If your project demands the highest level of precision, consider the most accurate weather site like AccuWeather.
- **Cost:** For budget-conscious projects, look for a free weather API with JSON output, such as OpenWeatherMap or Weatherbit.
- **Ease of Integration:** A simple weather API with easy-to-use documentation can save development time and reduce complexity. Weatherstack and OpenWeatherMap are known for their developer-friendly interfaces.
- **Coverage:** Ensure the API covers all geographic areas relevant to your project. Global APIs like Weatherstack and Weatherbit provide extensive coverage.
- **Special Features:** Some projects may require specific data such as severe weather alerts or historical data. Choose an API that caters to these needs.
Conclusion
Selecting the right weather information API is a critical decision that can significantly impact the performance and reliability of your application. Whether you need a free weather forecast API for a small project or a comprehensive weather API forecast for a larger enterprise solution, there is a wide range of options available. By considering factors such as accuracy, cost, and ease of integration, you can choose the best weather API that aligns with your project's needs.
 | sameeranthony |
1,896,269 | Protecting your Instance with Security Groups | 🚀 Exciting News! 🚀 I am thrilled to announce that I have achieved my AWS certification! 🎉 After... | 0 | 2024-06-21T16:32:29 | https://dev.to/vidhey071/protecting-your-instance-with-security-groups-20l0 | aws | 🚀 Exciting News! 🚀
I am thrilled to announce that I have achieved my AWS certification! 🎉 After months of hard work and dedication, I am now certified with this certificate. This accomplishment signifies my commitment to mastering AWS services and best practices, enhancing my skills in cloud computing and infrastructure management.
I am grateful for the support of my colleagues, mentors, and the invaluable resources provided by AWS. This journey has been incredibly rewarding, and I look forward to applying my knowledge to deliver innovative solutions and contribute effectively to our projects.
Thank you all for your encouragement and belief in my abilities. Let's continue to strive for excellence together! | vidhey071 |
1,896,268 | I used react-router but not for routing | I was working on my React project, fetching some data using fetch, and looking for a good way to do... | 0 | 2024-06-21T16:31:57 | https://dev.to/nguyenhongphat0/i-used-react-router-but-not-for-routing-19k3 | react, javascript, storytelling, performance | I was working on my React project, fetching some data using `fetch`, and looking for a good way to do something like this Python code:
```python
"{a} {b}!".format(a = "Hello", b = "World") # == "Hello World!"
```
> 💡 Why not use JavaScript string interpolation, you may ask? I want to put the params elsewhere, not within the string.
So I ~~spent hours doing Regex~~ went code scavenging on Stack Overflow, and I found [this](https://stackoverflow.com/questions/610406/javascript-equivalent-to-printf-string-format). It’s interesting that there is a function named [`formatUnicorn`](https://stackoverflow.com/a/18234317) right on Stack Overflow’s website, which does exactly what I was looking for:

But I was hesitant to loot the code. I want to keep my code base as strongly typed as possible, and modifying the String prototype makes me feel insecure. What’s more, I would like at least some validation in case someone else forgets to pass in `a` and/or `b`. Implementing that myself seems beyond my scope of work, and installing another library would be overkill. I was staring at the import section (making sure we don’t pull in too much 3rd-party stuff, since keeping our SPA’s startup fast matters and FCP is crucial to our business), and that’s when I saw the `react-router` import…
React Router lets you register routes with params like this:
```js
path: "teams/:teamId",
```
So they must have implemented that “unicorn format” somewhere. I skimmed over their exports hoping that they would expose that unicorn function so I could use it, and thankfully they did:

And the function is unicornly strongly typed:

> 💡 Did you expect that TypeScript can infer substrings as types like above? I didn’t!
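For the curious, the core idea behind such a path-interpolation helper is small enough to sketch. The following is NOT React Router's actual implementation, just a hypothetical, simplified version of what a helper like this does, including the missing-param validation I wanted:

```javascript
// Hypothetical sketch, not React Router's real code: swap ":param"
// segments for values and throw when a param is missing.
function formatPath(path, params) {
  return path.replace(/:(\w+)/g, (_, name) => {
    if (!(name in params)) {
      throw new Error(`Missing ":${name}" param`);
    }
    return String(params[name]);
  });
}

console.log(formatPath("teams/:teamId", { teamId: "130" })); // → "teams/130"

try {
  formatPath("teams/:teamId", {}); // forgot the param…
} catch (e) {
  console.log(e.message); // → Missing ":teamId" param
}
```

In a real project I'd still reach for the library's own, battle-tested (and unicornly typed) version rather than maintain this myself.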
### The moral of the story
Sometimes, installing one more library may not be necessary because what you need is already (unexpectedly) available elsewhere within your project. Bundling too many libraries can greatly affect performance if it is crucial to your business.
I hope my story was inspiring! | nguyenhongphat0 |
1,896,267 | Tracking Order Status in Node.js: A Deep Dive | **_In the dynamic world of e-commerce, tracking the status of an order is crucial for both businesses... | 0 | 2024-06-21T16:30:18 | https://dev.to/muhammadranju/tracking-order-status-in-nodejs-a-deep-dive-4f21 | programming, node, webdev, javascript | **_In the dynamic world of e-commerce, tracking the status of an order is crucial for both businesses and customers. An effective tracking system ensures transparency and enhances the customer experience. In this article, we will explore a Node.js Express controller designed to handle order tracking. The controller fetches order details based on a tracking number and returns a status-specific response._**
### Introduction to the Code
This Node.js codebase is a part of an Express application. It is designed to fetch an order's status from a database and return a corresponding message. The code uses asynchronous operations to handle database queries and HTTP responses efficiently.
**Key Components**
1. Imports and Constants
```
const asyncHandler = require("../../../../../utils/asyncHandler");
const Orders = require("../../../../../models/Orders.model");
const ApiResponse = require("../../../../../utils/ApiResponse");
const OrderStatusEnum = require("../../../../../Constants");
```
- asyncHandler: A utility to manage asynchronous functions and error handling.
- Orders: The Mongoose model for accessing order data.
- ApiResponse: A utility to create standardized API responses.
- OrderStatusEnum: An object containing various order status constants.
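The Constants module itself is never shown in the article. Judging from the switch cases it feeds, it plausibly looks something like this sketch (the property names are inferred; the actual string values in the real codebase could differ):

```javascript
// Hypothetical sketch of the Constants module: inferred from the switch
// cases in the controller, not taken from the real codebase.
const OrderStatusEnum = Object.freeze({
  PENDING: "PENDING",
  CANCELLED: "CANCELLED",
  PLACED: "PLACED",
  SHIPPED: "SHIPPED",
  OUT_FOR_DELIVERY: "OUT_FOR_DELIVERY",
  DELIVERED: "DELIVERED",
});

module.exports = OrderStatusEnum;
```

Freezing the object guards the status constants against accidental mutation at runtime.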
2. Controller Definition
```
const orderTrackingController = asyncHandler(async (req, res) => {
```
The controller function, orderTrackingController, is wrapped in asyncHandler to handle asynchronous operations gracefully.
3. Extracting Tracking Number
```
const { trackingNumber } = req.body;
```
This line extracts the trackingNumber from the request body, which is needed to fetch the order details.
4. Fetching Order
```
const order = await Orders.findOne({ trackingNumber: trackingNumber }).populate("shippingAddressId");
```
This query fetches the order matching the provided tracking number and populates the shippingAddressId field with related data.
5. Handling Order Not Found
```
if (!order) {
return res.status(404).json(new ApiResponse(404, null, "Order not found"));
}
```
If no order is found, the controller returns a 404 response with an appropriate message.
6. Creating Response Function
```
const createResponse = (heading, body) => new ApiResponse(200, { heading, body });
```
This helper wraps a heading and body message in a standardized 200 ApiResponse, keeping each status case below concise.
7. Determining the Response Based on Order Status
```
switch (order.orderStatus) {
```
The switch statement checks the orderStatus and constructs a response based on its value.
- Order Status: Pending
```
case OrderStatusEnum.PENDING:
response = createResponse(
"Processed and Ready to Ship",
"Your package has been processed and will be with our delivery partner soon."
);
break;
```
- Order Status: Cancelled
```
case OrderStatusEnum.CANCELLED:
response = createResponse(
"Your Order has been cancelled",
"Your order has been cancelled due to some reasons."
);
break;
```
- Order Status: Placed
```
case OrderStatusEnum.PLACED:
response = createResponse(
"Reached our Logistics Facility",
"Your package has arrived at our logistics facility from where it will be sent to the last mile hub."
);
break;
```
- Order Status: Shipped
```
case OrderStatusEnum.SHIPPED:
response = createResponse(
"Shipped",
`Your package is on the way to our last hub with tracking number ${order.trackingNumber} from where it will be delivered to you.`
);
break;
```
- Order Status: Out for Delivery
```
case OrderStatusEnum.OUT_FOR_DELIVERY:
response = createResponse(
"Out for Delivery",
`Our delivery partner will attempt to deliver your package today to ${order.shippingAddressId?.city}.`
);
break;
```
- Order Status: Delivered
```
case OrderStatusEnum.DELIVERED:
response = createResponse(
"Delivered",
`Your package has been delivered to ${order.shippingAddressId?.city}.`
);
break;
```
- Default Case
```
default:
response = new ApiResponse(200, order, "Order status found successfully");
break;
```
8. Sending Response to Client
```
return res.status(200).json(response);
```
This line sends the constructed response back to the client.
9. Exporting the Controller
```
module.exports = orderTrackingController;
```
### Conclusion
This Node.js Express controller is a robust solution for tracking order statuses. By using asynchronous operations and clear, status-specific messages, it enhances the user experience and ensures efficient order tracking. Whether an order is pending, shipped, or delivered, this controller provides precise updates, keeping customers informed throughout the delivery process. This approach not only improves transparency but also builds trust with customers, making it an essential component of any e-commerce application. | muhammadranju |
1,896,266 | bootcamp | just joined Columbia/EdX bootcamp. Learning everything i can | 0 | 2024-06-21T16:28:40 | https://dev.to/sean_dolan/bootcamp-cp9 | productivity, career, computerscience | Just joined the Columbia/EdX bootcamp. Learning everything I can. | sean_dolan |
1,896,264 | AWS Network - Monitoring and Troubleshooting | 🚀 Exciting News! 🚀 I'm thrilled to announce that I've achieved AWS certification! 🎉 After months of... | 0 | 2024-06-21T16:22:17 | https://dev.to/vidhey071/aws-network-monitoring-and-troubleshooting-gn7 | aws | 🚀 Exciting News! 🚀
I'm thrilled to announce that I've achieved AWS certification! 🎉
After months of dedicated learning and hard work, I am now officially certified with this certificate. This journey has been incredibly rewarding, and I'm looking forward to leveraging this knowledge to drive innovation and efficiency in cloud computing.
A huge thank you to everyone who supported me along the way. Your encouragement and guidance meant the world to me.
Let's continue to push boundaries and explore new possibilities with AWS! 💡 | vidhey071 |
1,896,257 | Amazon RDS Service Primer | 🚀 Exciting News! 🚀 I am thrilled to announce that I have achieved my AWS certification! 🎉 After... | 0 | 2024-06-21T16:21:32 | https://dev.to/vidhey071/amazon-rds-service-primer-4742 | aws | 🚀 Exciting News! 🚀
I am thrilled to announce that I have achieved my AWS certification! 🎉 After months of hard work and dedication, I am now certified with this certificate. This accomplishment signifies my commitment to mastering AWS services and best practices, enhancing my skills in cloud computing and infrastructure management.
I am grateful for the support of my colleagues, mentors, and the invaluable resources provided by AWS. This journey has been incredibly rewarding, and I look forward to applying my knowledge to deliver innovative solutions and contribute effectively to our projects.
Thank you all for your encouragement and belief in my abilities. Let's continue to strive for excellence together! | vidhey071 |
1,896,170 | 1052. Grumpy Bookstore Owner | 1052. Grumpy Bookstore Owner Medium There is a bookstore owner that has a store open for n minutes.... | 27,523 | 2024-06-21T16:20:07 | https://dev.to/mdarifulhaque/1052-grumpy-bookstore-owner-2ffd | php, leetcode, algorithms, programming | 1052\. Grumpy Bookstore Owner
Medium
There is a bookstore owner that has a store open for `n` minutes. Every minute, some number of customers enter the store. You are given an integer array `customers` of length n where `customers[i]` is the number of the customer that enters the store at the start of the <code>i<sup>th</sup></code> minute and all those customers leave after the end of that minute.
On some minutes, the bookstore owner is grumpy. You are given a binary array grumpy where `grumpy[i]` is `1` if the bookstore owner is grumpy during the <code>i<sup>th</sup></code> minute, and is `0` otherwise.
When the bookstore owner is grumpy, the customers of that minute are not satisfied, otherwise, they are satisfied.
The bookstore owner knows a secret technique to keep themselves not grumpy for `minutes` consecutive minutes, but can only use it once.
Return _the maximum number of customers that can be satisfied throughout the day_.
**Example 1:**
- **Input:** customers = [1,0,1,2,1,1,7,5], grumpy = [0,1,0,1,0,1,0,1], minutes = 3
- **Output:** 16
- **Explanation:** The bookstore owner keeps themselves not grumpy for the last 3 minutes.
The maximum number of customers that can be satisfied = 1 + 1 + 1 + 1 + 7 + 5 = 16.
**Example 2:**
- **Input:** customers = [1], grumpy = [0], minutes = 1
- **Output:** 1
**Constraints:**
- <code>n == customers.length == grumpy.length</code>
- <code>1 <= minutes <= n <= 2 * 10<sup>4</sup></code>
- <code>0 <= customers[i] <= 1000</code>
- `grumpy[i]` is either `0` or `1`.
**Solution:**
```php
class Solution {
/**
* @param Integer[] $customers
* @param Integer[] $grumpy
* @param Integer $minutes
* @return Integer
*/
function maxSatisfied($customers, $grumpy, $minutes) {
$satisfied = 0;
$madeSatisfied = 0;
$windowSatisfied = 0;
for ($i = 0; $i < count($customers); ++$i) {
if ($grumpy[$i] == 0)
$satisfied += $customers[$i];
else
$windowSatisfied += $customers[$i];
if ($i >= $minutes && $grumpy[$i - $minutes] == 1)
$windowSatisfied -= $customers[$i - $minutes];
$madeSatisfied = max($madeSatisfied, $windowSatisfied);
}
return $satisfied + $madeSatisfied;
}
}
```
**Contact Links**
- **[LinkedIn](https://www.linkedin.com/in/arifulhaque/)**
- **[GitHub](https://github.com/mah-shamim)**
| mdarifulhaque |
1,896,166 | Buy Verified Paxful Account | https://dmhelpshop.com/product/buy-verified-paxful-account/ Buy Verified Paxful Account There are... | 0 | 2024-06-21T16:19:41 | https://dev.to/vapaxeh806/buy-verified-paxful-account-2pm6 | react, python, ai, devops | https://dmhelpshop.com/product/buy-verified-paxful-account/

Buy Verified Paxful Account
There are several compelling reasons to consider purchasing a verified Paxful account. Firstly, a verified account offers enhanced security, providing peace of mind to all users. Additionally, it opens up a wider range of trading opportunities, allowing individuals to partake in various transactions, ultimately expanding their financial horizons.

Moreover, buying a verified Paxful account ensures faster and more streamlined transactions, minimizing any potential delays or inconveniences. Furthermore, by opting for a verified account, users gain access to a trusted and reputable platform, fostering a sense of reliability and confidence.

Lastly, Paxful’s verification process is thorough and meticulous, ensuring that only genuine individuals are granted verified status, thereby creating a safer trading environment for all users. Overall, the decision to buy a verified Paxful account can greatly enhance one’s overall trading experience, offering increased security, access to more opportunities, and a reliable platform to engage with. Buy Verified Paxful Account.

Buy a US verified Paxful account from the best place: dmhelpshop
Why do we declare this website the best place to buy a US verified Paxful account? Because our company is established to provide all account services in the USA (our main target) and even in the whole world. With this in mind, we create Paxful accounts and customize them professionally with real documents. Buy Verified Paxful Account.

If you want to buy a US verified Paxful account, you should contact us quickly, because our accounts are:

Email verified
Phone number verified
Selfie and KYC verified
SSN (social security no.) verified
Tax ID and passport verified
Sometimes driving license verified
MasterCard attached and verified
Used only genuine and real documents
100% access to the account
All documents provided for customer security

What is a Verified Paxful Account?
In today’s expanding landscape of online transactions, ensuring security and reliability has become paramount. Given this context, Paxful has quickly risen as a prominent peer-to-peer Bitcoin marketplace, catering to individuals and businesses seeking trusted platforms for cryptocurrency trading.

In light of the prevalent digital scams and frauds, it is only natural for people to exercise caution when partaking in online transactions. As a result, the concept of a verified account has gained immense significance, serving as a critical feature for numerous online platforms. Paxful recognizes this need and provides a safe haven for users, streamlining their cryptocurrency buying and selling experience.

For individuals and businesses alike, a verified Paxful account emerges as an appealing choice, offering a secure and reliable environment in the ever-expanding world of digital transactions. Buy Verified Paxful Account.

Verified Paxful Accounts are essential for establishing credibility and trust among users who want to transact securely on the platform. They serve as evidence that a user is a reliable seller or buyer, verifying their legitimacy.

But what constitutes a verified account, and how can one obtain this status on Paxful? In this exploration of verified Paxful accounts, we will unravel the significance they hold, why they are crucial, and shed light on the process behind their activation, providing a comprehensive understanding of how they function. Buy verified Paxful account.

Why should you Buy a Verified Paxful Account?
There are several compelling reasons to consider purchasing a verified Paxful account. Firstly, a verified account offers enhanced security, providing peace of mind to all users. Additionally, it opens up a wider range of trading opportunities, allowing individuals to partake in various transactions, ultimately expanding their financial horizons.

Moreover, a verified Paxful account ensures faster and more streamlined transactions, minimizing any potential delays or inconveniences. Furthermore, by opting for a verified account, users gain access to a trusted and reputable platform, fostering a sense of reliability and confidence. Buy Verified Paxful Account.

Lastly, Paxful’s verification process is thorough and meticulous, ensuring that only genuine individuals are granted verified status, thereby creating a safer trading environment for all users. Overall, the decision to buy a verified Paxful account can greatly enhance one’s overall trading experience, offering increased security, access to more opportunities, and a reliable platform to engage with.

What is a Paxful Account?
Paxful and various other platforms consistently release updates that not only address security vulnerabilities but also enhance usability by introducing new features. Buy Verified Paxful Account.

In line with this, our old accounts have recently undergone upgrades, ensuring that if you purchase an old verified Paxful account from dmhelpshop.com, you will gain access to an account with an impressive history and advanced features. This ensures a seamless and enhanced experience for all users, making it a worthwhile option for everyone.

Is it safe to buy Paxful Verified Accounts?
Buying on Paxful is a secure choice for everyone. However, the level of trust amplifies when purchasing from Paxful verified accounts. These accounts belong to sellers who have undergone rigorous scrutiny by Paxful. When you buy a verified Paxful account, you are automatically designated as a verified account holder. Hence, purchasing from a Paxful verified account ensures a high level of credibility and utmost reliability. Buy Verified Paxful Account.

Paxful, a widely known peer-to-peer cryptocurrency trading platform, has gained significant popularity as a go-to website for purchasing Bitcoin and other cryptocurrencies. It is important to note, however, that while Paxful may not be the most secure option available, its reputation is considerably less problematic compared to many other marketplaces. Buy Verified Paxful Account.

This brings us to the question: is it safe to purchase Paxful Verified Accounts? Top Paxful reviews offer mixed opinions, suggesting that caution should be exercised. Therefore, users are advised to conduct thorough research and consider all aspects before proceeding with any transactions on Paxful.

How Do I Get a 100% Real Verified Paxful Account?
Paxful, a renowned peer-to-peer cryptocurrency marketplace, offers users the opportunity to conveniently buy and sell a wide range of cryptocurrencies. Given its growing popularity, both individuals and businesses are seeking to establish verified accounts on this platform.

However, the process of creating a verified Paxful account can be intimidating, particularly considering the escalating prevalence of online scams and fraudulent practices. This verification procedure necessitates users to furnish personal information and vital documents, posing potential risks if not conducted meticulously.

In this comprehensive guide, we will delve into the necessary steps to create a legitimate and verified Paxful account. Our discussion will revolve around the verification process and provide valuable tips to safely navigate through it.

Moreover, we will emphasize the utmost importance of maintaining the security of personal information when creating a verified account. Furthermore, we will shed light on common pitfalls to steer clear of, such as using counterfeit documents or attempting to bypass the verification process.

Whether you are new to Paxful or an experienced user, this guide aims to equip everyone with the knowledge they need to establish a secure and authentic presence on the platform.

Benefits Of Verified Paxful Accounts
Verified Paxful accounts offer numerous advantages compared to regular Paxful accounts. One notable advantage is that verified accounts contribute to building trust within the community.

Verification, although a rigorous process, is essential for peer-to-peer transactions. This is why all Paxful accounts undergo verification after registration. When customers within the community possess confidence and trust, they can conveniently and securely exchange cash for Bitcoin or Ethereum instantly. Buy Verified Paxful Account.

Paxful accounts, trusted and verified by sellers globally, serve as a testament to their unwavering commitment towards their business or passion, ensuring exceptional customer service at all times. Headquartered in Africa, Paxful holds the distinction of being the world’s pioneering peer-to-peer bitcoin marketplace. Spearheaded by its founder, Ray Youssef, Paxful continues to lead the way in revolutionizing the digital exchange landscape.

Paxful has emerged as a favored platform for digital currency trading, catering to a diverse audience. One of Paxful’s key features is its direct peer-to-peer trading system, eliminating the need for intermediaries or cryptocurrency exchanges. By leveraging Paxful’s escrow system, users can trade securely and confidently.

What sets Paxful apart is its commitment to identity verification, ensuring a trustworthy environment for buyers and sellers alike. With these user-centric qualities, Paxful has successfully established itself as a leading platform for hassle-free digital currency transactions, appealing to a wide range of individuals seeking a reliable and convenient trading experience. Buy Verified Paxful Account.

How does Paxful ensure risk-free transactions and trading?
Engage in safe online financial activities by prioritizing verified accounts to reduce the risk of fraud. Platforms like Paxful implement stringent identity and address verification measures to protect users from scammers and ensure credibility.

With verified accounts, users can trade with confidence, knowing they are interacting with legitimate individuals or entities. By fostering trust through verified accounts, Paxful strengthens the integrity of its ecosystem, making it a secure space for financial transactions for all users. Buy Verified Paxful Account.

Experience seamless transactions by obtaining a verified Paxful account. Verification signals a user’s dedication to the platform’s guidelines, leading to the prestigious badge of trust. This trust not only expedites trades but also reduces transaction scrutiny. Additionally, verified users unlock exclusive features enhancing efficiency on Paxful. Elevate your trading experience with Verified Paxful Accounts today.

In the ever-changing realm of online trading and transactions, selecting a platform with minimal fees is paramount for optimizing returns. This choice not only enhances your financial capabilities but also facilitates more frequent trading while safeguarding gains. Buy Verified Paxful Account.

Examining the details of fee configurations reveals Paxful as a frontrunner in cost-effectiveness. Acquire a verified level-3 USA Paxful account from usasmmonline.com for a secure transaction experience. Invest in verified Paxful accounts to take advantage of a leading platform in the online trading landscape.

How does an old Paxful account ensure many advantages?
Explore the boundless opportunities that Verified Paxful accounts present for businesses looking to venture into the digital currency realm, as companies globally witness heightened profits and expansion. These success stories underline the myriad advantages of Paxful’s user-friendly interface, minimal fees, and robust trading tools, demonstrating its relevance across various sectors.

Businesses benefit from efficient transaction processing and cost-effective solutions, making Paxful a significant player in facilitating financial operations. Acquire a USA Paxful account effortlessly at a competitive rate from usasmmonline.com and unlock access to a world of possibilities. Buy Verified Paxful Account.

Experience elevated convenience and accessibility through Paxful, where stories of transformation abound. Whether you are an individual seeking seamless transactions or a business eager to tap into a global market, buying old Paxful accounts unveils opportunities for growth.

Paxful’s verified accounts not only offer reliability within the trading community but also serve as a testament to the platform’s ability to empower economic activities worldwide. Join the journey towards expansive possibilities and enhanced financial empowerment with Paxful today. Buy Verified Paxful Account.

Why does Paxful keep security measures a top priority?
In today’s digital landscape, security stands as a paramount concern for all individuals engaging in online activities, particularly within marketplaces such as Paxful. It is essential for account holders to remain informed about the comprehensive security protocols that are in place to safeguard their information.

Safeguarding your Paxful account is imperative to guaranteeing the safety and security of your transactions. Two essential security components, Two-Factor Authentication and Routine Security Audits, serve as the pillars fortifying this shield of protection, ensuring a secure and trustworthy user experience for all. Buy Verified Paxful Account.

Conclusion
Investing in Bitcoin offers various avenues, and among those, utilizing a Paxful account has emerged as a favored option. Paxful, an esteemed online marketplace, enables users to engage in buying and selling Bitcoin. Buy Verified Paxful Account.

The initial step involves creating an account on Paxful and completing the verification process to ensure identity authentication. Subsequently, users gain access to a diverse range of offers from fellow users on the platform. Once a suitable proposal captures your interest, you can proceed to initiate a trade with the respective user, opening the doors to a seamless Bitcoin investing experience.

In conclusion, when considering the option of purchasing verified Paxful accounts, exercising caution and conducting thorough due diligence is of utmost importance. It is highly recommended to seek reputable sources and diligently research the seller’s history and reviews before making any transactions.

Moreover, it is crucial to familiarize oneself with the terms and conditions outlined by Paxful regarding account verification, bearing in mind the potential consequences of violating those terms. By adhering to these guidelines, individuals can ensure a secure and reliable experience when engaging in such transactions. Buy Verified Paxful Account.

Contact Us / 24 Hours Reply
Telegram: dmhelpshop
WhatsApp: +1 (980) 277-2786
Skype: dmhelpshop
Email: dmhelpshop@gmail.com | vapaxeh806 |
1,896,159 | How to use smart-doc to generate JMeter test scripts | smart-doc is a tool for automatically generating Java API documentation. It creates documentation by... | 0 | 2024-06-21T16:14:54 | https://dev.to/yu_sun_0a160dea497156d354/how-to-use-smart-doc-to-generate-jmeter-test-scripts-3eaa | java, testing, webdev | smart-doc is a tool for automatically generating Java API documentation. It creates documentation by analyzing interfaces and comments in the source code, and supports a variety of document output formats, including `Markdown`, `HTML5`, `OpenAPI 3.0`, and more. The design goal of smart-doc is to simplify the document writing process, improve development efficiency, and ensure the accuracy and timeliness of the documentation.
In the software development lifecycle, the automatic generation of API documentation and the performance testing of interfaces are key steps in improving development efficiency and ensuring product quality. With the addition of the ability to generate JMeter performance testing scripts in smart-doc version 3.0.1, developers can more conveniently accomplish these two tasks. This article will introduce how to use smart-doc and JMeter for effective performance testing.

## Generate JMeter scripts
Using smart-doc to generate JMeter performance test scripts can significantly reduce the time spent writing them, thereby improving testing efficiency. The scripts smart-doc generates automatically can be run directly in JMeter without complex configuration or debugging, making performance testing much simpler and faster.
First, ensure that the `smart-doc-maven-plugin` has been added to your project. Then, configure the relevant parameters of the smart-doc plugin in the `pom.xml` file of the project, for example:
```xml
<plugin>
<groupId>com.ly.smart-doc</groupId>
<artifactId>smart-doc-maven-plugin</artifactId>
<version>[latest version]</version>
<configuration>
<configFile>./src/main/resources/smart-doc.json</configFile>
<projectName>${project.description}</projectName>
</configuration>
</plugin>
```
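For reference, the `smart-doc.json` that `configFile` points at can start as a minimal sketch like the one below; the `serverUrl` and `outPath` values here are only illustrative, so check the official smart-doc configuration reference for the full field list:

```json
{
  "serverUrl": "http://localhost:8080",
  "outPath": "src/main/resources/static/doc"
}
```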
Run the command `mvn smart-doc:jmeter`. Smart-doc will scan the project source code, extract annotation information, and automatically generate the corresponding JMeter performance test scripts.
If you are not familiar with the usage, refer to the official [smart-doc documentation](https://smart-doc-group.github.io/#/integrated/jmeter).
## Import Into JMeter
Open JMeter, click "File" -> "Open", select the JMeter script file generated in the first step, and click the "Start" button. JMeter will then begin performing performance tests according to the script.

## Configure Prometheus
`Prometheus` is an open-source monitoring and alerting tool for handling time-series data. We can use it to perform real-time monitoring during the `JMeter` stress testing process, thereby enhancing the observability of performance testing.
**Step 1: Install the JMeter Prometheus Plugin**
First, you need to install the `Prometheus plugin` into the `lib/ext` directory of JMeter. You can download the plugin from the JMeter Plugins Manager, the [official website](https://jmeter-plugins.org/), or [GitHub](https://github.com/johrstrom/jmeter-prometheus-plugin/releases). For this example, download the latest version `0.7.1` from GitHub.

- Note: JMeter's default listening IP address is `127.0.0.1`, which by default will prevent Prometheus from connecting to the JMeter Prometheus listening port. Therefore, it is necessary to add `prometheus.ip=0.0.0.0` in `jmeter.properties`.
**Step 2: Add Prometheus Listener**
Open `JMeter` and add the `Prometheus Listener` to your test plan. This can be done by right-clicking on **Test Plan-> Add -> Listener -> Prometheus Listener**.

The listener can be configured following the official settings, for reference as below (`smart-doc` 3.0.4 supports generating the Prometheus Listener configuration as well):

**Step 3: Configure Prometheus Scrape**
In the configuration file of Prometheus (`prometheus.yml`), add a new `scrape_config` to fetch data from the JMeter Prometheus plugin. For example:
```yaml
scrape_configs:
- job_name: 'jmeter'
scrape_interval: 15s
static_configs:
- targets: ['<Your JMeter machine IP>:9270']
```
Here, `<Your JMeter machine IP>` is the IP address of the machine running the JMeter test, and `9270` is the default listening port for the JMeter Prometheus plugin.
**Step 4: Run the Test Plan**
For the purpose of easy verification in this article, the thread group is set to "`infinite loop`" during stress testing, which can be adjusted according to actual needs.

After starting successfully, JMeter Prometheus will, by default, create a service on the local port `9270`.
Access the URL `http://localhost:9270/metrics` and if you see the following content, it means it has been successful.

**Step 5: Start Prometheus**
After starting `Prometheus`, it will begin to fetch data from the `JMeter Prometheus plugin`. As shown in the following configuration, once successfully started, you can see the set targets in Prometheus.

**Step 6: Configure Grafana**
On the official Grafana website, find the prometheus-jmeter monitoring panel provided there. Here, we select the template with ID `14927` to import into Grafana.

After clicking **Load**, select the `Prometheus` data source.

- Note: During testing, it was found that the original template `14927` had some errors. These were fixed during the writing process of this article. Import the corrected template downloaded from [GitHub](https://github.com/smart-doc-group/smart-doc-demo/blob/master/jmeter/grafana-template/jmeter-prometheus-14972.json).
After the template is successfully imported, we will be able to see the entire performance testing monitoring data in `Grafana`.


To facilitate a rapid experience of the entire performance testing process, the `smart-doc` community has curated and provided a template that can be launched with a single command using `docker-compose`. For those who wish to experience it through `Kubernetes` deployment, AI tools can be utilized to directly convert the `docker-compose` template into a `Kubernetes` deployment template.

The project for the experience also includes usage instructions.

The example code for this article can be found [here](https://github.com/smart-doc-group/smart-doc-demo).
## The Assistance of Smart-Doc in JMeter Performance Testing
The combination of smart-doc and JMeter for performance stress testing offers several advantages:
- Automation: smart-doc can automatically extract API information from the source code and generate JMeter performance test scripts without the need for manual writing, greatly improving efficiency.
- Precision: The JMeter performance test scripts generated by smart-doc are completely consistent with the API definitions in the source code, avoiding errors that may occur when manually writing scripts.
- Flexibility: smart-doc supports a variety of configuration options, allowing the generated JMeter scripts to be customized according to testing requirements.
smart-doc will also continue to improve and optimize support for JMeter. Please stay tuned for the ongoing development of the [smart-doc open-source project](https://github.com/TongchengOpenSource/smart-doc).
## Conclusion
By combining `smart-doc` and `JMeter`, we can not only automate the generation of API documentation but also swiftly create performance test scripts and conduct stress testing. This automation tool significantly enhances development and testing efficiency while helping teams more easily maintain and optimize the performance of software systems. We hope this article has provided practical references for you to apply these tools more efficiently in your daily work.
We also welcome everyone to continue to follow and support the smart-doc open-source community. In the future, we are exploring support for additional languages to assist more developers.
| yu_sun_0a160dea497156d354 |
1,896,165 | AWS Foundations : Cost Management | 🚀 Exciting News! 🚀 I'm thrilled to announce that I've achieved AWS certification! 🎉 After months of... | 0 | 2024-06-21T16:12:50 | https://dev.to/vidhey071/aws-foundations-cost-management-30k6 | aws | 🚀 Exciting News! 🚀
I'm thrilled to announce that I've achieved AWS certification! 🎉
After months of dedicated learning and hard work, I am now officially certified with this certificate. This journey has been incredibly rewarding, and I'm looking forward to leveraging this knowledge to drive innovation and efficiency in cloud computing.
A huge thank you to everyone who supported me along the way. Your encouragement and guidance meant the world to me.
Let's continue to push boundaries and explore new possibilities with AWS! 💡 | vidhey071 |
1,896,164 | LLM operators, an approach that opens up new possibilities | I am sure that like many of us, we can't wait till the day AI finally replaces us so we can finish... | 0 | 2024-06-21T16:12:04 | https://dev.to/abdallah_meddah/llm-operators-144h | javascript, ai, discuss, computerscience | I am sure that like many of us, we can't wait till the day AI finally replaces us so we can finish all our side projects. However, and that's my opinion, I don't think we are there yet. My intuition tells me that the LLMs have been able to model a function that takes in many words (tokens) and outputs one word, I am not sure they can do abstractions and build upon ideas like humans do. Anyhow, I also think that LLMs bring something new to the table, something that we have not been exploiting correctly so far, again, in my opinion.
When processing data using LLMs, the community has been very inventive over the last two years, coming up with a bunch of prompting and RAG techniques to provide the necessary context and extract something valuable from these LLMs. However, from what I have seen at my workplace and after discussing with a few people, the results are never entirely satisfying, and all that LLMs are useful for today is acting like an enhanced search engine.
I believe we can use LLMs as more than just a search engine: we can write operators that leverage them to act on data whose exact structure we don't know. For example, say you're scraping a web page for a certain date. The actual HTML might change a lot, breaking the selectors you used, but the page always shows the date you're looking for in a way a human can understand. An algorithm that relies on the HTML structure or specific wording won't always work, but if you feed the content of the page to an LLM, it can reliably extract the correct value each time. That's why I wrote a JavaScript library containing a few operators that leverage LLMs to process data. Here is an example of extracting a piece of text from some HTML:
```typescript
console.log(
await Unsure("<html><body><div class=\"scrapable\">Target Content<div></body></html>")
.flatMapTo("contain of the div with the class scrapable")
); // this will yield "target content"
```
With this, we can process data that is not well-polished and doesn't necessarily have a consistent structure.
We can also evaluate data we don't know entirely. Say you want better feedback on your app than just a 5-star rating, so you ask your users to leave a comment, and you prioritize the bug backlog according to those comments (I know you have one, just like the rest of us). You can do something like this:
```typescript
console.log(
await Unsure(userComment)
.categorize([
"bug ticket 139",
"bug ticket 639",
"bug ticket 420",
"none"
])
); // this will yield "bug ticket 639" for example if the user comments about a bug that's described by ticket 639
```
I'll give one last example and then I'll give you a link to the library where there are other examples.
Suppose you have to filter a list of documents, keeping only the ones about a legal matter between two specific entities in a given year. You can simply do something like this:
```typescript
if(await Unsure(document.text)
.is("A legal document between entity1 and 2 that happened in 2024")
) {
wantedDocuments.push(document.id);
}
```
There are far more cases where we can leverage LLMs to process data we couldn't have processed correctly before. Here is the link to the JavaScript library, [Unsure](https://www.npmjs.com/package/unsure-js), and here's the Python [package](https://pypi.org/project/unsurepy/). It's in JavaScript and Python so far; I am currently rewriting it in Rust and Go.
Thank you for reading!
| abdallah_meddah |
1,896,162 | 📌 Linux ate my ram 📌 | Have you ever checked your Linux system's memory usage and panicked because it seemed like all your... | 0 | 2024-06-21T16:10:33 | https://dev.to/lakhera2015/linux-ate-my-ram-30dh | devops, linux, redhat, performance |
Have you ever checked your Linux system's memory usage and panicked because it seemed like all your RAM was being used? Don't worry: your RAM is fine! This video will explain how Linux manages memory, why it looks like your RAM is full, and why you shouldn't be concerned.
{% embed https://youtu.be/M1RqPePwkRw %}
🙋♂️ Now the question is, what's Going On?
When you see high memory usage on Linux, it's primarily because of disk caching. Linux uses available memory to cache disk operations, making the system faster and more efficient. This cached memory is borrowed when not needed elsewhere and can be instantly released for applications when required.
🙋♂️ The next question is, why does Linux use disk Cache?
Disk caching improves system performance by keeping frequently accessed data in memory. This reduces the time it takes to read data from the disk, making the system more responsive. So what is the downside? It can confuse users into thinking their memory is low, but this isn't true.
🙋♂️ Let's understand How Memory is Managed
Linux categorizes memory usage as follows:
✅ Used Memory is the memory actively used by applications.
✅ Free Memory is memory that is not used at all.
✅ Available Memory is memory currently used by the disk cache, but it can be repurposed for applications instantly.
When you need to understand your system's memory, focus on the "available" memory rather than "free" memory.
🙋♂️ Another confusing question is, do You Need More Swap?
Probably not. Disk caching uses idle RAM and returns it to applications as needed. Swap is used when physical RAM is fully utilized. If applications require more memory, the kernel reallocates it from the disk cache, ensuring minimal swap usage.
🙋♂️ How to Verify Memory Usage
To accurately check your memory usage, use the `free -m` command.
Look at the "available" column to see how much memory is actually available for your applications. This provides a true picture of your memory usage.
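To make the "available vs. free" distinction concrete, here is a small illustrative Python helper (not part of any tool mentioned above) that parses `/proc/meminfo`-style text, which is the same data `free -m` summarizes; the sample values below are made up for demonstration:

```python
def parse_meminfo(text):
    """Parse /proc/meminfo-style text into a dict of sizes in MiB."""
    info = {}
    for line in text.splitlines():
        if ":" not in line:
            continue
        key, rest = line.split(":", 1)
        parts = rest.split()
        if parts and parts[0].isdigit():
            info[key] = int(parts[0]) // 1024  # /proc/meminfo values are in kB
    return info

# Made-up sample data for demonstration only
sample = """MemTotal:       16384000 kB
MemFree:          512000 kB
MemAvailable:   12288000 kB"""

mem = parse_meminfo(sample)
print(mem["MemAvailable"])  # prints 12000 (MiB) -- the column that matters
```

On a real Linux machine you would pass `open("/proc/meminfo").read()` instead of the sample string; note how "free" can be tiny while "available" stays large.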
🙋♂️ The important question is When to Worry.
While disk caching is generally beneficial, there are signs of genuine low memory:
✅ Available memory is nearly zero.
✅ Increasing or fluctuating swap usage.
✅ The OOM (out-of-memory) killer is active, which can be checked with `dmesg`.
🏁 To sum up
Understanding how Linux manages memory can alleviate unnecessary worry about your system's performance. Disk caching makes your system faster and more responsive, and the memory used can be reclaimed for applications instantly. Focusing on "available" memory gives you a clearer picture of your system's health.
🖼 Image ref: https://www.linuxatemyram.com/atemyram.png
📚Book link:
https://pratimuniyal.gumroad.com/l/cracking-the-devops-interview | lakhera2015 |
1,896,161 | How to Find the Missing Number in an Array | In an array containing numbers from 1 to N, where one number is missing, the goal is to find the... | 27,580 | 2024-06-21T16:09:40 | https://blog.masum.dev/how-to-find-the-missing-number-in-an-array | algorithms, computerscience, cpp, tutorial | In an array containing numbers from 1 to N, where one number is missing, the goal is to find the missing number. Here, we'll discuss four different methods to achieve this, ranging from brute force to optimal approaches.
### Solution 1: Brute Force Approach (Using Nested Loop & Linear Search)
This approach involves checking for each number from 1 to N whether it exists in the array.
**Implementation**:
```cpp
// Solution-1: Brute Force Approach (using Nested Loop & Linear Search)
// Time Complexity: O(n*n)
// Space Complexity: O(1)
int findMissingNumber(vector<int> &arr, int n)
{
// Outer loop that runs from 1 to N
for (int i = 1; i <= n; i++)
{
// `flag` variable to check if an element exists
bool flag = false;
// Search the element using Linear Search
for (int j = 0; j < n - 1; j++)
{
if (arr[j] == i)
{
flag = true;
break;
}
}
// Check if the element is missing
if (!flag)
{
return i;
}
}
// If loop completes without finding a missing number (shouldn't happen)
// It is just to avoid warnings.
return -1;
}
```
**Logic**:
1. **Outer Loop**: Iterate through numbers from 1 to N.
2. **Linear Search**: For each number, check if it exists in the array using a nested loop.
3. **Flag Check**: Use a flag to indicate if the number was found. If not, return the number as it is the missing one.
**Time Complexity**: O(n²)
* **Explanation**: The outer loop runs from 1 to N and for each iteration of the outer loop, the inner loop runs up to N-1. This results in a time complexity of O(n \* (n-1)), which simplifies to O(n²).
**Space Complexity**: O(1)
* **Explanation**: Only a few additional variables (like `flag`) are used, which do not depend on the size of the input array.
**Example**:
* **Input**: `arr = [1, 2, 4, 6, 3, 7, 8]`, `n = 8`
* **Output**: `5`
* **Explanation**: The number `5` is missing from the array.
---
### Solution 2: Better Approach (Using Hashing)
This approach uses a hash table to track the presence of numbers in the array.
**Implementation**:
```cpp
// Solution-2: Better Approach (using Hashing)
// Time Complexity: O(n) + O(n) ~ O(2n)
// Space Complexity: O(n)
int findMissingNumber(vector<int> &arr, int n)
{
// Hash table to store the presence of numbers
vector<int> hash(n + 1, 0);
// Mark the presence of elements in the hash table
for (int i = 0; i < n - 1; i++)
{
hash[arr[i]] = 1;
}
// Find the missing number by iterating through the hash table
for (int i = 1; i <= n; i++)
{
if (hash[i] == 0)
{
return i;
}
}
// If loop completes without finding a missing number (shouldn't happen)
// It is just to avoid warnings.
return -1;
}
```
**Logic**:
1. **Initialize Hash Table**: Create a hash table to track numbers from 1 to N.
2. **Mark Presence**: Iterate through the array, marking the presence of each number in the hash table.
3. **Find Missing Number**: Iterate through the hash table to find the number that is not marked.
**Time Complexity**: O(n)
* **Explanation**: The first loop runs through the array of size N-1 to populate the hash table and the second loop runs through the hash table of size N to find the missing number. Thus, the overall time complexity is O(n) + O(n-1), which simplifies to O(n).
**Space Complexity**: O(n)
* **Explanation**: An additional hash table of size N is used to keep track of the presence of each number.
**Example**:
* **Input**: `arr = [1, 2, 4, 6, 3, 7, 8]`, `n = 8`
* **Output**: `5`
* **Explanation**: The number `5` is not present in the hash table.
---
### Solution 3: Optimal Approach 1 (Summation Approach)
This approach uses the sum formula for the first N natural numbers to find the missing number.
**Implementation**:
```cpp
// Solution-3: Optimal Approach 1 (Summation Approach)
// Time Complexity: O(n)
// Space Complexity: O(1)
int findMissingNumber(vector<int> &arr, int n)
{
// Sum of all numbers from 1 to N
int sumOfNNums = (n * (n + 1)) / 2;
int sumOfArr = 0;
// Calculate the actual sum of elements in the array
for (int i = 0; i < n - 1; i++)
{
sumOfArr += arr[i];
}
int missingNum = sumOfNNums - sumOfArr;
return missingNum;
}
```
**Logic**:
1. **Sum of N Numbers**: Calculate the sum of numbers from 1 to N using the formula sum = (n \* (n + 1)) / 2.
2. **Sum of Array**: Calculate the sum of elements in the array.
3. **Find Missing Number**: Subtract the sum of the array from the sum of N numbers to get the missing number.
**Time Complexity**: O(n)
* **Explanation**: The loop runs through the array once to calculate the sum of the elements, resulting in a time complexity of O(n).
**Space Complexity**: O(1)
* **Explanation**: Only a few additional variables (like `sumOfNNums` and `sumOfArr`) are used, which do not depend on the size of the input array.
**Example**:
* **Input**: `arr = [1, 2, 4, 6, 3, 7, 8]`, `n = 8`
* **Output**: `5`
* **Explanation**: The sum of numbers from `1` to `8` is `36` and the sum of the array is `31.` Thus, the missing number is `36 - 31 = 5`.
---
### Solution 4: Optimal Approach 2 (XOR Approach)
This approach uses the XOR operation to find the missing number efficiently.
**Implementation**:
```cpp
// Solution-4: Optimal Approach 2 (XOR Approach)
// Time Complexity: O(n)
// Space Complexity: O(1)
int findMissingNumber(vector<int> &arr, int n)
{
int xor1 = 0;
int xor2 = 0;
for (int i = 0; i < n - 1; i++)
{
// XOR of array elements
xor2 ^= arr[i];
// XOR up to [1...N-1]
xor1 ^= (i + 1);
}
// XOR up to [1...N]
xor1 ^= n;
// The missing number
return xor1 ^ xor2;
}
```
**Logic**:
1. **XOR Array Elements**: Compute the XOR of all elements in the array.
2. **XOR Up to N**: Compute the XOR of all numbers from 1 to N.
3. **Find Missing Number**: XOR the results from steps 1 and 2. The result will be the missing number.
**Time Complexity**: O(n)
* **Explanation**: The first loop runs through the array of size N-1 to calculate the XOR of the elements and the second loop runs up to N to calculate the XOR of the range from 1 to N. Thus, the overall time complexity is O(n-1) + O(n), which simplifies to O(n).
**Space Complexity**: O(1)
* **Explanation**: Only a few additional variables (like `xor1` and `xor2`) are used, which do not depend on the size of the input array.
**Example**:
* **Input**: `arr = [1, 2, 4, 6, 3, 7, 8]`, `n = 8`
* **Output**: `5`
* **Explanation**: XOR of all array elements and XOR of numbers from `1` to `8` will result in the missing number, which is `5`.
---
### Comparison
* **Brute Force Approach**:
* **Pros**: Simple and easy to understand.
* **Cons**: Inefficient with O(n²) time complexity.
* **Hashing Approach**:
* **Pros**: More efficient with O(n) time complexity.
* **Cons**: Uses extra space for the hash table.
* **Summation Approach**:
* **Pros**: Very efficient with O(n) time complexity and O(1) space complexity.
* **Cons**: Might face integer overflow for very large values of N.
* **XOR Approach**:
* **Pros**: Efficient with O(n) time complexity and O(1) space complexity.
* **Cons**: Slightly more complex to understand compared to summation approach.
### Edge Cases
* **Empty Array**: If the input array is empty, the missing number should be `1`.
* **Single Element Missing**: When the array contains only one element, either `1` (missing `2`) or `2` (missing `1`), handle by checking and returning the missing number.
* **All Elements Present**: If the array contains all elements from 1 to N without any missing number (should not happen if constraints guarantee a missing number), return a sentinel value like `-1`.
* **Duplicated Elements**: This scenario is not considered because the problem constraints specify that all elements are distinct.
### Additional Notes
* **Integer Overflow**: For the summation approach, consider the risk of integer overflow when calculating the sum of the first N natural numbers. In practical implementations, ensure the language or environment can handle large integer values.
* **Data Types**: Choose appropriate data types to handle the range of input values, especially for large N.
* **Performance Considerations**: While the brute force approach is easy to implement, it is not suitable for large arrays due to its O(n²) time complexity. The hashing and XOR approaches are preferable for large datasets.
* **In-place Modifications**: For space efficiency, the XOR approach modifies the input array in place. Ensure that this behavior is acceptable in the context of the problem or use case.
* **Understanding XOR**: The XOR approach leverages the property that `a ^ a = 0` and `a ^ 0 = a`. It effectively cancels out duplicate elements and isolates the missing number. This method is both efficient and elegant but requires understanding of bitwise operations.
### Conclusion
Finding the missing number in an array of size N-1 can be achieved using various methods, from simple brute force to optimal XOR-based solutions. Depending on the constraints and requirements, one can choose the most suitable approach.
--- | masum-dev |
1,891,490 | How to create and connect to a Linux VM using a Public Key. | Azure virtual machines (VMs) can be created through the Azure portal. The Azure portal is a... | 0 | 2024-06-21T16:08:57 | https://dev.to/adeola_adebari/how-to-create-and-connect-to-a-linux-vm-using-a-public-key-1jl1 |
Azure virtual machines (VMs) can be created through the Azure portal. The Azure portal is a browser-based user interface to create Azure resources. This quickstart shows you how to use the Azure portal to deploy a Linux virtual machine (VM) running Ubuntu Server 22.04 LTS. To see your VM in action, you also SSH to the VM and install the NGINX web server.
## Sign in to Azure
1 - Sign in to the Azure portal.

## Create virtual machine
1 - Enter virtual machines in the search.
2 - Under Services, select Virtual machines.

3 - In the Virtual machines page, select Create and then Virtual machine. The Create a virtual machine page opens.

4 - In the Basics tab, under Project details, make sure the correct subscription is selected and then choose to Create new resource group. Enter _AdeResourceGRP_ for the name.

5 - Under Instance details, enter AdeVM for the Virtual machine name, and choose Ubuntu Server 22.04 LTS - Gen2 for your Image. Leave the other defaults. The default size and pricing are only shown as an example. Size availability and pricing are dependent on your region and subscription.

6 - Under Administrator account, select SSH public key.
7 - In Username enter azureuser.
8 - For SSH public key source, leave the default of Generate new key pair, and then enter _AdeVM_Key_ for the Key pair name.

9 - Under Inbound port rules > Public inbound ports, choose Allow selected ports and then select SSH (22) and HTTP (80) from the drop-down.

10 - Leave the remaining defaults and then select the Review + create button at the bottom of the page.

11 - On the Create a virtual machine page, you can see the details about the VM you are about to create. When you are ready, select Create.

12 - When the Generate new key pair window opens, select Download private key and create resource. Your key file will be downloaded as AdeVM_Key.pem. Make sure you know where the .pem file was downloaded; you will need the path to it in the next step.

13 - When the deployment is finished, select Go to resource.

14 - On the page for your new VM, select the public IP address and copy it to your clipboard.
## Connect to virtual machine
Create an SSH connection with the VM.
1 - If you are on a Mac or Linux machine, open a Bash prompt and set read-only permission on the .pem file using _chmod 400 ~/Downloads/AdeVM_Key.pem_. If you are on a Windows machine, open a PowerShell prompt.

2 - At your prompt, open an SSH connection to your virtual machine. Replace the IP address with the one from your VM, and replace the path to the .pem with the path to where the key file was downloaded.

## Install web server
To see your VM in action, install the NGINX web server. From your SSH session, update your package sources and then install the latest NGINX package.
1 - Update package sources

2 - Install latest NGINX package

## View the web server in action
Use a web browser of your choice to view the default NGINX welcome page. Type the public IP address of the VM as the web address. The public IP address can be found on the VM overview page or as part of the SSH connection string you used earlier.

| adeola_adebari | |
1,896,160 | Buy verified cash app account | https://dmhelpshop.com/product/buy-verified-cash-app-account/ Buy verified cash app account Cash... | 0 | 2024-06-21T16:08:56 | https://dev.to/vapaxeh806/buy-verified-cash-app-account-4k9b | webdev, javascript, beginners, programming | ERROR: type should be string, got "https://dmhelpshop.com/product/buy-verified-cash-app-account/\n\n\n\n\nBuy verified cash app account\nCash app has emerged as a dominant force in the realm of mobile banking within the USA, offering unparalleled convenience for digital money transfers, deposits, and trading. As the foremost provider of fully verified cash app accounts, we take pride in our ability to deliver accounts with substantial limits. Bitcoin enablement, and an unmatched level of security.\n\nOur commitment to facilitating seamless transactions and enabling digital currency trades has garnered significant acclaim, as evidenced by the overwhelming response from our satisfied clientele. Those seeking buy verified cash app account with 100% legitimate documentation and unrestricted access need look no further. Get in touch with us promptly to acquire your verified cash app account and take advantage of all the benefits it has to offer.\n\nWhy dmhelpshop is the best place to buy USA cash app accounts?\nIt’s crucial to stay informed about any updates to the platform you’re using. If an update has been released, it’s important to explore alternative options. Contact the platform’s support team to inquire about the status of the cash app service.\n\nClearly communicate your requirements and inquire whether they can meet your needs and provide the buy verified cash app account promptly. 
If they assure you that they can fulfill your requirements within the specified timeframe, proceed with the verification process using the required documents.\n\nOur account verification process includes the submission of the following documents: [List of specific documents required for verification].\n\nGenuine and activated email verified\nRegistered phone number (USA)\nSelfie verified\nSSN (social security number) verified\nDriving license\nBTC enable or not enable (BTC enable best)\n100% replacement guaranteed\n100% customer satisfaction\nWhen it comes to staying on top of the latest platform updates, it’s crucial to act fast and ensure you’re positioned in the best possible place. If you’re considering a switch, reaching out to the right contacts and inquiring about the status of the buy verified cash app account service update is essential.\n\nClearly communicate your requirements and gauge their commitment to fulfilling them promptly. Once you’ve confirmed their capability, proceed with the verification process using genuine and activated email verification, a registered USA phone number, selfie verification, social security number (SSN) verification, and a valid driving license.\n\nAdditionally, assessing whether BTC enablement is available is advisable, buy verified cash app account, with a preference for this feature. It’s important to note that a 100% replacement guarantee and ensuring 100% customer satisfaction are essential benchmarks in this process.\n\nHow to use the Cash Card to make purchases?\nTo activate your Cash Card, open the Cash App on your compatible device, locate the Cash Card icon at the bottom of the screen, and tap on it. Then select “Activate Cash Card” and proceed to scan the QR code on your card. Alternatively, you can manually enter the CVV and expiration date. 
How To Buy Verified Cash App Accounts.\n\nAfter submitting your information, including your registered number, expiration date, and CVV code, you can start making payments by conveniently tapping your card on a contactless-enabled payment terminal. Consider obtaining a buy verified Cash App account for seamless transactions, especially for business purposes. Buy verified cash app account.\n\nWhy we suggest to unchanged the Cash App account username?\nTo activate your Cash Card, open the Cash App on your compatible device, locate the Cash Card icon at the bottom of the screen, and tap on it. Then select “Activate Cash Card” and proceed to scan the QR code on your card.\n\nAlternatively, you can manually enter the CVV and expiration date. After submitting your information, including your registered number, expiration date, and CVV code, you can start making payments by conveniently tapping your card on a contactless-enabled payment terminal. Consider obtaining a verified Cash App account for seamless transactions, especially for business purposes. Buy verified cash app account. Purchase Verified Cash App Accounts.\n\nSelecting a username in an app usually comes with the understanding that it cannot be easily changed within the app’s settings or options. This deliberate control is in place to uphold consistency and minimize potential user confusion, especially for those who have added you as a contact using your username. In addition, purchasing a Cash App account with verified genuine documents already linked to the account ensures a reliable and secure transaction experience.\n\n \n\nBuy verified cash app accounts quickly and easily for all your financial needs.\nAs the user base of our platform continues to grow, the significance of verified accounts cannot be overstated for both businesses and individuals seeking to leverage its full range of features. 
How To Buy Verified Cash App Accounts.\n\nFor entrepreneurs, freelancers, and investors alike, a verified cash app account opens the door to sending, receiving, and withdrawing substantial amounts of money, offering unparalleled convenience and flexibility. Whether you’re conducting business or managing personal finances, the benefits of a verified account are clear, providing a secure and efficient means to transact and manage funds at scale.\n\nWhen it comes to the rising trend of purchasing buy verified cash app account, it’s crucial to tread carefully and opt for reputable providers to steer clear of potential scams and fraudulent activities. How To Buy Verified Cash App Accounts. With numerous providers offering this service at competitive prices, it is paramount to be diligent in selecting a trusted source.\n\nThis article serves as a comprehensive guide, equipping you with the essential knowledge to navigate the process of procuring buy verified cash app account, ensuring that you are well-informed before making any purchasing decisions. Understanding the fundamentals is key, and by following this guide, you’ll be empowered to make informed choices with confidence.\n\n \n\nIs it safe to buy Cash App Verified Accounts?\nCash App, being a prominent peer-to-peer mobile payment application, is widely utilized by numerous individuals for their transactions. However, concerns regarding its safety have arisen, particularly pertaining to the purchase of “verified” accounts through Cash App. This raises questions about the security of Cash App’s verification process.\n\nUnfortunately, the answer is negative, as buying such verified accounts entails risks and is deemed unsafe. Therefore, it is crucial for everyone to exercise caution and be aware of potential vulnerabilities when using Cash App. 
How To Buy Verified Cash App Accounts.\n\nCash App has emerged as a widely embraced platform for purchasing Instagram Followers using PayPal, catering to a diverse range of users. This convenient application permits individuals possessing a PayPal account to procure authenticated Instagram Followers.\n\nLeveraging the Cash App, users can either opt to procure followers for a predetermined quantity or exercise patience until their account accrues a substantial follower count, subsequently making a bulk purchase. Although the Cash App provides this service, it is crucial to discern between genuine and counterfeit items. If you find yourself in search of counterfeit products such as a Rolex, a Louis Vuitton item, or a Louis Vuitton bag, there are two viable approaches to consider.\n\n \n\nWhy you need to buy verified Cash App accounts personal or business?\nThe Cash App is a versatile digital wallet enabling seamless money transfers among its users. However, it presents a concern as it facilitates transfer to both verified and unverified individuals.\n\nTo address this, the Cash App offers the option to become a verified user, which unlocks a range of advantages. Verified users can enjoy perks such as express payment, immediate issue resolution, and a generous interest-free period of up to two weeks. With its user-friendly interface and enhanced capabilities, the Cash App caters to the needs of a wide audience, ensuring convenient and secure digital transactions for all.\n\nIf you’re a business person seeking additional funds to expand your business, we have a solution for you. Payroll management can often be a challenging task, regardless of whether you’re a small family-run business or a large corporation. How To Buy Verified Cash App Accounts.\n\nImproper payment practices can lead to potential issues with your employees, as they could report you to the government. 
However, worry not, as we offer a reliable and efficient way to ensure proper payroll management, avoiding any potential complications. Our services provide you with the funds you need without compromising your reputation or legal standing. With our assistance, you can focus on growing your business while maintaining a professional and compliant relationship with your employees. Purchase Verified Cash App Accounts.\n\nA Cash App has emerged as a leading peer-to-peer payment method, catering to a wide range of users. With its seamless functionality, individuals can effortlessly send and receive cash in a matter of seconds, bypassing the need for a traditional bank account or social security number. Buy verified cash app account.\n\nThis accessibility makes it particularly appealing to millennials, addressing a common challenge they face in accessing physical currency. As a result, ACash App has established itself as a preferred choice among diverse audiences, enabling swift and hassle-free transactions for everyone. Purchase Verified Cash App Accounts.\n\n \n\nHow to verify Cash App accounts\nTo ensure the verification of your Cash App account, it is essential to securely store all your required documents in your account. This process includes accurately supplying your date of birth and verifying the US or UK phone number linked to your Cash App account.\n\nAs part of the verification process, you will be asked to submit accurate personal details such as your date of birth, the last four digits of your SSN, and your email address. If additional information is requested by the Cash App community to validate your account, be prepared to provide it promptly. Upon successful verification, you will gain full access to managing your account balance, as well as sending and receiving funds seamlessly. 
Buy verified cash app account.\n\n \n\nHow cash used for international transaction?\nExperience the seamless convenience of this innovative platform that simplifies money transfers to the level of sending a text message. It effortlessly connects users within the familiar confines of their respective currency regions, primarily in the United States and the United Kingdom.\n\nNo matter if you’re a freelancer seeking to diversify your clientele or a small business eager to enhance market presence, this solution caters to your financial needs efficiently and securely. Embrace a world of unlimited possibilities while staying connected to your currency domain. Buy verified cash app account.\n\nUnderstanding the currency capabilities of your selected payment application is essential in today’s digital landscape, where versatile financial tools are increasingly sought after. In this era of rapid technological advancements, being well-informed about platforms such as Cash App is crucial.\n\nAs we progress into the digital age, the significance of keeping abreast of such services becomes more pronounced, emphasizing the necessity of staying updated with the evolving financial trends and options available. Buy verified cash app account.\n\nOffers and advantage to buy cash app accounts cheap?\nWith Cash App, the possibilities are endless, offering numerous advantages in online marketing, cryptocurrency trading, and mobile banking while ensuring high security. As a top creator of Cash App accounts, our team possesses unparalleled expertise in navigating the platform.\n\nWe deliver accounts with maximum security and unwavering loyalty at competitive prices unmatched by other agencies. Rest assured, you can trust our services without hesitation, as we prioritize your peace of mind and satisfaction above all else.\n\nEnhance your business operations effortlessly by utilizing the Cash App e-wallet for seamless payment processing, money transfers, and various other essential tasks. 
Amidst a myriad of transaction platforms in existence today, the Cash App e-wallet stands out as a premier choice, offering users a multitude of functions to streamline their financial activities effectively. Buy verified cash app account.\n\nTrustbizs.com stands by the Cash App’s superiority and recommends acquiring your Cash App accounts from this trusted source to optimize your business potential.\n\nHow Customizable are the Payment Options on Cash App for Businesses?\nDiscover the flexible payment options available to businesses on Cash App, enabling a range of customization features to streamline transactions. Business users have the ability to adjust transaction amounts, incorporate tipping options, and leverage robust reporting tools for enhanced financial management.\n\nExplore trustbizs.com to acquire verified Cash App accounts with LD backup at a competitive price, ensuring a secure and efficient payment solution for your business needs. Buy verified cash app account.\n\nDiscover Cash App, an innovative platform ideal for small business owners and entrepreneurs aiming to simplify their financial operations. With its intuitive interface, Cash App empowers businesses to seamlessly receive payments and effectively oversee their finances. Emphasizing customization, this app accommodates a variety of business requirements and preferences, making it a versatile tool for all.\n\nWhere To Buy Verified Cash App Accounts\nWhen considering purchasing a verified Cash App account, it is imperative to carefully scrutinize the seller’s pricing and payment methods. Look for pricing that aligns with the market value, ensuring transparency and legitimacy. Buy verified cash app account.\n\nEqually important is the need to opt for sellers who provide secure payment channels to safeguard your financial data. Trust your intuition; skepticism towards deals that appear overly advantageous or sellers who raise red flags is warranted. 
It is always wise to prioritize caution and explore alternative avenues if uncertainties arise.\n\nThe Importance Of Verified Cash App Accounts\nIn today’s digital age, the significance of verified Cash App accounts cannot be overstated, as they serve as a cornerstone for secure and trustworthy online transactions.\n\nBy acquiring verified Cash App accounts, users not only establish credibility but also instill the confidence required to participate in financial endeavors with peace of mind, thus solidifying its status as an indispensable asset for individuals navigating the digital marketplace.\n\nWhen considering purchasing a verified Cash App account, it is imperative to carefully scrutinize the seller’s pricing and payment methods. Look for pricing that aligns with the market value, ensuring transparency and legitimacy. Buy verified cash app account.\n\nEqually important is the need to opt for sellers who provide secure payment channels to safeguard your financial data. Trust your intuition; skepticism towards deals that appear overly advantageous or sellers who raise red flags is warranted. It is always wise to prioritize caution and explore alternative avenues if uncertainties arise.\n\nConclusion\nEnhance your online financial transactions with verified Cash App accounts, a secure and convenient option for all individuals. By purchasing these accounts, you can access exclusive features, benefit from higher transaction limits, and enjoy enhanced protection against fraudulent activities. Streamline your financial interactions and experience peace of mind knowing your transactions are secure and efficient with verified Cash App accounts.\n\nChoose a trusted provider when acquiring accounts to guarantee legitimacy and reliability. In an era where Cash App is increasingly favored for financial transactions, possessing a verified account offers users peace of mind and ease in managing their finances. 
Make informed decisions to safeguard your financial assets and streamline your personal transactions effectively.\n\nContact Us / 24 Hours Reply\nTelegram:dmhelpshop\nWhatsApp: +1 (980) 277-2786\nSkype:dmhelpshop\nEmail:dmhelpshop@gmail.com\n\n\n\n\n\n" | vapaxeh806 |
1,896,158 | Learning from the NCS Group Incident: Sennovate Expert Opinion | In today’s dynamic cybersecurity landscape, threats evolve rapidly. A recent incident involving a... | 0 | 2024-06-21T16:08:05 | https://dev.to/sennovate/learning-from-the-ncs-group-incident-sennovate-expert-opinion-4cg4 | security, iam, cybersecurity, mssp | In today’s dynamic cybersecurity landscape, threats evolve rapidly. A recent incident involving a former employee of NCS Group, Kandula Nagaraju, highlights significant vulnerabilities in user access management. Nagaraju, terminated for poor performance and disgruntled post-offboarding, retaliated by deleting 180 critical virtual servers essential for NCS Group’s QA testing. This incident underscores the crucial importance of effective Identity and Access Management (IAM), particularly in the deprovisioning process. In this post, we will delve into how such incidents can be prevented with a robust IAM solution.
**Incident Analysis: A Costly Oversight**
In NCS Group incident, the oversight in user access management led to significant non-compliance and setbacks:
· Unauthorized Access: Despite leaving the company, the ex-employee's account remained active for four months, giving him ample opportunity to access servers he was no longer authorized to use.
· Recovery Cost: The immediate financial impact from the incident was substantial, amounting to approximately USD $678,000. This figure includes the costs associated with restoring the deleted servers, data recovery efforts, and additional security measures implemented post-incident to prevent future breaches.
· Critical Operational Disruption: Kandula Nagaraju's actions post-termination resulted in the deletion of 180 virtual servers, a major operational disruption for NCS Group.
This incident highlights glaring oversights in maintaining secure access controls and user lifecycle management.
**What could have prevented this incident?**
Effective identity and access management (IAM) is critical for securing an organization’s data and systems from unauthorized access. It involves managing the entire lifecycle of user identities, from onboarding to offboarding. However, merely deploying an IAM solution is not sufficient to prevent security incidents. Organizations must ensure they have the right combination of people, processes, and technology to effectively mitigate risks.
**Let’s discuss few key solutions that could have prevented this incident:**
Efficient User Lifecycle Management – Proper User Lifecycle Management ensures that only authorized users have access to the necessary resources. It reduces the risk of security breaches by ensuring that access is revoked immediately upon termination or role change.
Zero Trust Security Model – Zero Trust eliminates the concept of implicit trust, ensuring that every user, device, and application is continuously verified and validated, reducing the risk of insider threats. Zero Trust enforces the principle of least privilege, granting users and devices only the access they need to perform their tasks, thereby reducing the attack surface. Just-in-time access provisioning with an approval workflow can further tighten the access controls.
Periodic Access Reviews – By reviewing access regularly, organizations can identify and mitigate potential risks associated with unauthorized access, insider threats, or other security vulnerabilities. Regular reviews reinforce the importance of access control among employees and promote a culture of accountability regarding data protection and security.
Password Rotation – It is a fundamental component of a comprehensive cybersecurity strategy. Regularly changing passwords reduces the time window in which a compromised password can be used by malicious actors. This minimizes the potential damage that can be inflicted. Frequent password changes can help in identifying breaches. If a password is changed and subsequent unauthorized access attempts are detected, it can signal a security breach.
User Behavior Analytics (UBA) – By leveraging advanced data analytics, machine learning, and pattern recognition, UBA helps organizations identify abnormal or malicious activities that might indicate a security breach. By establishing a baseline of normal user behavior, UBA can flag activities that deviate from this baseline, such as accessing unusual resources or logging in from unexpected locations.
**Conclusion**
The NCS Group breach starkly illustrates the critical importance of having an effective IAM solution and managed service in place. Failing to deactivate access for ex-employees for a significant period of 4 months exposed the organization to entirely preventable risks, highlighting the need for stringent access management practices.
To avoid such devastating incidents, companies must adopt a proactive stance on cybersecurity. This entails implementing a robust IAM framework that encompasses the right mix of people, processes, and technology.
**Learn More About Sennovate IAM-as-a-Service**
Don’t leave your organization’s security to chance. Discover the unmatched protection Sennovate offers through our IAM-as-a-Service offering. We assist organizations like NCS in assessing their security posture, identifying risks, and implementing robust security solutions aligned with industry best practices to mitigate those risks effectively. We provide comprehensive end-to-end Identity and Access Management services, covering advisory, implementation, and 24×7 managed services.
To know more about our solutions and services, visit [www.sennovate.com](https://sennovate.com) or contact us at hello@sennovate.com | sennovate |
1,896,157 | Choosing Ruby: What made me choose Ruby as my primary programming language. | Choosing a programming language can be a daunting task for many developers. There are countless... | 27,731 | 2024-06-21T16:05:11 | https://dev.to/palak/choosing-ruby-what-made-me-choose-ruby-as-my-primary-programming-language-b9g | career, learning, beginners, ruby | Choosing a programming language can be a daunting task for many developers. There are countless options available, but Ruby continues to attract a loyal following. Why? Ruby is beloved for its readability, elegance, and simplicity. It’s a language that allows developers to write concise and intuitive code, leading to faster and more enjoyable application development.
## Why Do People Choose Ruby?
Developers choose Ruby for several compelling reasons:
**- Readable and Intuitive Code:** Ruby was designed with the goal of maximising code readability. The philosophy that "programming should be enjoyable" best describes this language. Ruby's syntax is close to natural language, making the code easy to understand even for those who aren't experienced programmers.
**- Ruby on Rails Framework:** One of the most popular tools for rapid web application development, Rails makes building applications simpler with ready-to-use libraries and structures. For many developers, it’s a game-changer that allows quick transition from concept to a working application.
**- Active Community:** Ruby boasts a large and active ecosystem, meaning plenty of available resources, tutorials, and pre-built solutions. The Ruby community is known for supporting newcomers and actively backing open-source projects.
**- Wide Range of Applications:** Ruby is used by both startups and large corporations alike. From web applications to DevOps tools, and even automation and testing—Ruby has a broad array of applications.
## Examples of Applications Using Ruby
Ruby and Ruby on Rails have been employed to develop many well-known applications, such as:
**- Basecamp:** A project management tool that became one of the first high-profile Ruby on Rails applications, gaining massive popularity for its simplicity and effectiveness.
**- GitHub:** A platform for hosting and reviewing code. GitHub has become the collaboration center for millions of developers worldwide.
**- Shopify:** A popular e-commerce platform enabling entrepreneurs to set up online stores quickly and easily.
**- Airbnb:** A short-term rental platform that started its journey with Ruby on Rails. This framework helped Airbnb swiftly develop features and scale globally.
**- SoundCloud:** A music sharing platform that uses Ruby on Rails to handle vast amounts of data and user interactions.
**- Hulu:** A popular streaming service offering a wide range of movies and TV shows.
**- Zendesk:** A customer service tool also built using Ruby.
**- Hey:** A modern email platform created by the makers of Basecamp.
**- Dribbble:** A social platform for graphic designers and creative professionals to showcase their work.
**- Kickstarter:** A crowdfunding platform that helps creators gather funds for their projects.
**- Heroku:** A platform-as-a-service (PaaS) offering cloud app hosting.
**- Coinbase:** A popular cryptocurrency exchange.
**- Cookpad:** A social recipe-sharing service.
## Ruby is NOT Dead!
Every so often, rumours surface in the developer community about the "downfall" of a particular language. However, if you visit [isrubydead.com](https://isrubydead.com), you'll quickly see that Ruby is very much alive! With robust community support and new tools like Hotwire, Ruby remains one of the top programming languages. In reality, Ruby is continuously evolving, gaining new technologies that enhance its functionality and performance.
## My Journey to Choosing Ruby
My journey with Ruby began quite by accident. While scrolling through Instagram, I came across the profile of @andrzejkrzywda. At that time, Andrzej was regularly sharing stories about programming, business, and, of course, Ruby. His insights and enthusiasm piqued my interest.
Andrzej encouraged learning Ruby and offered a course titled "From zero to first application." I decided to give it a shot and enrolled in the course (I'll discuss the course and my learning process in a separate post). This course turned out to be a perfect fit, providing me with solid foundations and guidance on building my first application.
I wanted to create web applications, and Ruby, along with Ruby on Rails, was the ideal tool for that. Thanks to its readable syntax and the powerful capabilities of the Rails framework, I quickly managed to build my first applications, reaffirming my choice. Ruby also allowed me to focus on real business problems, avoiding overly complex technical implementations.
## Conclusion
Choosing a programming language always depends on individual needs and goals. For me, Ruby turned out to be the best choice, especially for web application development. The buzz around Hotwire further solidified my belief that Ruby is a language of the future. If you're looking for simplicity, elegance, and powerful tools for building your projects, Ruby is worth a shot.
**For me, it was a bullseye, but I'm curious about your opinions!**
Do you use Ruby? What are your experiences with this language? Maybe you have other favourite programming languages? Share your thoughts in the comments! Your feedback is invaluable, and I'd love to hear about your experiences and perspectives.
Mati | palak |
1,896,155 | Differences Between Security Groups and NACLs | 🚀 Exciting News! 🚀 I am thrilled to announce that I have achieved my AWS certification! 🎉 After... | 0 | 2024-06-21T16:01:46 | https://dev.to/vidhey071/differences-between-security-groups-and-nacls-3kke | aws | 🚀 Exciting News! 🚀
I am thrilled to announce that I have achieved my AWS certification! 🎉 After months of hard work and dedication, I am now officially AWS certified. This accomplishment signifies my commitment to mastering AWS services and best practices, enhancing my skills in cloud computing and infrastructure management.
I am grateful for the support of my colleagues, mentors, and the invaluable resources provided by AWS. This journey has been incredibly rewarding, and I look forward to applying my knowledge to deliver innovative solutions and contribute effectively to our projects.
Thank you all for your encouragement and belief in my abilities. Let's continue to strive for excellence together! | vidhey071 |
1,890,237 | Exploring Destructuring in JavaScript | What is Destructuring? Destructuring is a special really cool syntax feature in... | 0 | 2024-06-21T16:01:45 | https://dev.to/ddebajyati/exploring-destructuring-in-javascript-5a24 | webdev, javascript, beginners, programming | ## What is Destructuring?
**Destructuring** is a special really cool syntax feature in JavaScript, which lets us to extract values from _arrays_, _objects_, or other iterable structures and assign them to variables.
It's a shorthand way to access properties or elements of a data structure without having to use dot notation or array indices.
## How is it beneficial for us (who write code in JavaScript)?
Destructuring has several benefits that make our code more concise, readable, and maintainable!
- **_Improved Readability_**: Destructuring simplifies code by reducing the need for complex variable assignments and dot notation.
- **_Less Boilerplate Code_**: You can extract values directly from data structures without needing to create intermediate variables.
- **_More Concise Code_**: Destructuring can reduce the number of lines of code needed to achieve the same result.
- **_Flexibility_**: You can destructure data structures of any type (objects, arrays, iterables), making it a versatile tool in your JavaScript toolkit.
Effective destructuring 🚀 enables us to write more _**expressive**_, **_maintainable_**, and _**efficient**_ code that's easier to understand and debug.
## Basic Example
```JavaScript
const person = { name: 'John', age: 30 };
const { name, age } = person;
console.log(name); // "John"
console.log(age); // 30
```
Here we have destructured an object `person` with two properties: `name` and `age`.
When destructuring a JavaScript object, the variable names we extract into must match keys that exist in the object. You can't place `userName` in place of `name` in the line
`const { name, age } = person;`. Which simply means - `const { userName, age } = person;` won't work (you'd just get `undefined`).
But yes! We can apply aliasing while destructuring an object.
E.G. -
```JavaScript
const person = { name: 'John', age: 30 };
const { name:userName, age:userAge } = person;
console.log(userName); // "John"
console.log(userAge); // 30
```
Most probably you first saw something that looks like object destructuring while importing a module. For example, when importing the `exec` function -
```JavaScript
import { exec } from "node:child_process"; // ES Module syntax
```
```JavaScript
const { exec } = require("child_process"); // commonJS syntax
```

(Strictly speaking, the ES module form is *named import* syntax that only resembles destructuring, while the CommonJS `require` line really is object destructuring of the module's exports.)
**Similarly we can destructure arrays** also -
```JavaScript
const numbers = [4, 5, 6];
const [x, y, z] = numbers;
console.log(x); // 4
console.log(y); // 5
console.log(z); // 6
```
Here, when destructuring arrays, you don't need aliasing to assign an element to a custom variable name, because array elements are simply positional values - they aren't bound to keys.
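Array destructuring also lets us skip elements by leaving a slot empty, and even swap two variables without a temporary variable - a small sketch (the variable names are just illustrative):

```JavaScript
const numbers = [10, 20, 30, 40];

// Skip the second element by leaving its slot empty
const [first, , third] = numbers;
console.log(first); // 10
console.log(third); // 30

// Swap two variables without a temporary variable
let a = 1;
let b = 2;
[a, b] = [b, a];
console.log(a); // 2
console.log(b); // 1
```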
## Default Values
Destructuring allows you to assign default values to variables if the property doesn't exist in the object.
```JavaScript
const person = { name: 'John' };
const { name = 'Anonymous', age } = person; // age will be undefined
console.log(name); // "John"
console.log(age); // undefined
```
Here the default value `'Anonymous'` was not used for `name`, because the `name` property already existed in the object - defaults only kick in when the extracted value is `undefined`.
Whereas -
```JavaScript
const person = { name: 'John' };
const { name, age = 30 } = person; // age defaults to 30 if not present
console.log(name); // "John"
console.log(age); // 30
```
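Defaults can also be combined with the aliasing we saw earlier - the alias receives the default when the key is missing. A small sketch (the variable names here are just illustrative):

```JavaScript
const person = { name: 'John' };

// Alias `name` to `userName`, and give the missing `age` a default via alias `userAge`
const { name: userName = 'Anonymous', age: userAge = 30 } = person;

console.log(userName); // "John" (present, so the default is ignored)
console.log(userAge); // 30 (missing, so the default kicks in)
```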
## Spread (Rest) Syntax
In a destructuring pattern, the `(...)` syntax - technically called the **rest** syntax here, though it looks identical to the spread operator - can be used to capture the remaining elements of an array or the remaining properties of an object into a new variable.
- spread syntax with Arrays -
```JavaScript
const numbers = [1, 2, 3, 4, 5];
const [first, second, ...rest] = numbers;
console.log(first); // 1
console.log(second); // 2
console.log(rest); // [3, 4, 5] (remaining elements)
```
- spread syntax with Objects -
```JavaScript
const person = { name: 'John', age: 30, city: 'New York' };
const { name, ...info } = person;
console.log(name); // "John"
console.log(info); // { age: 30, city: "New York"} (remaining properties)
```
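One handy use of the object rest pattern is producing a copy of an object *without* a given property, instead of mutating the original with `delete` - a small sketch:

```JavaScript
const person = { name: 'John', age: 30, city: 'New York' };

// Pull `age` out; `withoutAge` is a new object holding the remaining properties
const { age, ...withoutAge } = person;

console.log(withoutAge); // { name: "John", city: "New York" }
console.log(person.age); // 30 (the original object is untouched)
```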
## Nested Destructuring
Destructuring can be nested to extract values from deeply nested objects or arrays.
```JavaScript
const data = {
user: {
name: 'Alicia',
origin: 'Romania',
eyes: 'blue',
address: {
city: 'London',
}
}
};
const { user: { name, address: { city } } } = data;
console.log(name); // "Alicia"
console.log(city); // "London"
```
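One caution with nested patterns: if an intermediate object is missing, destructuring throws a `TypeError`. Supplying a default empty object at each level guards against this - a small sketch reusing a trimmed-down `data` object:

```JavaScript
const data = { user: { name: 'Alicia' } }; // no `address` this time

// Without the `= {}` defaults, destructuring `city` would throw:
// TypeError: Cannot destructure property 'city' of 'undefined'
const { user: { name, address: { city = 'Unknown' } = {} } = {} } = data;

console.log(name); // "Alicia"
console.log(city); // "Unknown"
```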
## Destructuring in function parameter list
Suppose we've an JavaScript object named `credentials` -
```JavaScript
const credentials = {
name: 'Debajyati',
age: 20,
address: {
city: 'Kolkata',
state: 'West Bengal',
country: 'India'
},
phone: '',
email: '',
hobbies: ['reading', 'listening to music', 'coding', 'watching Anime'],
skills: {
programming: true,
blogging: true,
singing: false
}
}
```
And a function named `showCredential` which takes only 1 argument - an object - and prints a string to standard output based on some of the object's properties.
Well, we could write the function definition in this way -
```JavaScript
function showCredential(obj) {
const hasSkill = (skill) => obj.skills[skill];
console.log(
`${obj.name} is ${obj.age} years old.\n Lives in ${obj.address.city}, ${obj.address.country}.\n`,
`He has the following hobbies: ${obj.hobbies.join(", ")}`,
);
if (hasSkill("programming")) {
console.log(`He is a programmer.`);
}
if (hasSkill("singing")) {
console.log(`He is a singer.`);
}
if (hasSkill("blogging")) {
console.log(`He is also a tech blogger.`);
}
}
```
Calling it with -
```JavaScript
showCredential(credentials);
```
Getting this output -
```
Debajyati is 20 years old.
Lives in Kolkata, India.
 He has the following hobbies: reading, listening to music, coding, watching Anime
He is a programmer.
He is also a tech blogger.
```
Instead we can destructure the object argument in the parameter list while defining the function. Like this -
```JavaScript
function showCredential({ name, age, address: { city, country}, hobbies, skills }) {
const hasSkill = (skill) => skills[skill];
console.log(
`${name} is ${age} years old.\n Lives in ${city}, ${country}.\n`,
`He has the following hobbies: ${hobbies.join(", ")}`,
);
if (hasSkill("programming")) {
console.log(`He is a programmer.`);
}
if (hasSkill("singing")) {
console.log(`He is a singer.`);
}
if (hasSkill("blogging")) {
console.log(`He is also a tech blogger.`);
}
}
```
which gives the same output.
>
| :information_source: NOTE |
|-----------------------------|
The function still takes only one argument. Destructuring didn't increase the number of arguments in the function parameter list.
Also, calling the function didn't change as well. It still is -
```JavaScript
showCredential(credentials);
```
### So, Why destructure objects in Function Parameter List?
While destructuring in the function parameter list may seem cumbersome or tedious at first, it has quite important benefits.
#### Important Points to Consider
- _**Safer Code:**_
Destructuring can help prevent errors by making it clear which properties the function expects. If a required nested property is missing from the passed object, the destructuring will throw a TypeError as soon as the function is called, aiding in early detection of potential issues.
- _**Reduced Verbosity:**_
By directly extracting properties into variables within the parameter list, you avoid repetitive object property access using dot notation. This leads to cleaner and more concise function definitions.
- _**Focus on Functionality:**_
By destructuring within the parameter list, you separate data access logic from the function's core functionality. This improves code organization and makes the function's purpose clearer.
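To make the "safer code" point concrete, here is a small illustrative sketch (my own, not taken from the `showCredential` example) showing how parameter-list destructuring fails fast when a required nested property is missing:

```JavaScript
// Fails fast: if `address` is missing, the destructuring itself throws,
// instead of `city` silently becoming undefined deeper in the function.
function locateUser({ address: { city } }) {
  return `User lives in ${city}`;
}

console.log(locateUser({ address: { city: "Kolkata" } })); // "User lives in Kolkata"

try {
  locateUser({}); // no `address` property
} catch (e) {
  console.log(e instanceof TypeError); // true
}
```

Without destructuring, the same bug would typically surface later, as a confusing `undefined` somewhere downstream.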
## Destructuring Strings
Just as we destructure arrays, we can also unpack strings into their characters, because strings are iterable.
```JavaScript
const fruit = 'grape';
const [first, second, ...rest] = fruit;
const animal = rest.join('');
console.log(animal); // ape
```
> | :warning: Remember !|
|-----------------------|
When you use the spread operator `(...)` to capture the remaining characters from a string, you don't get a string. You get an array of those characters.
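A quick illustrative check of that behaviour:

```JavaScript
const [first, ...rest] = "grape";

console.log(first);               // "g"
console.log(Array.isArray(rest)); // true: an array of characters, not a string
console.log(rest);                // [ "r", "a", "p", "e" ]
console.log(rest.join(""));       // "rape"
```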
## Some Handy Application Examples of Destructuring
- **_Destructuring for swapping without 3rd variable_**:
JavaScript traditionally required a temporary variable to swap the values of two variables. Destructuring offers a more concise and readable way to achieve this.
- Before Destructuring:
```JavaScript
let a = 10;
let b = 20;
let temp = a;
a = b;
b = temp;
console.log(a, b); // Output: 20 10
```
- After Destructuring:
```JavaScript
let a = 10;
let b = 20;
[a, b] = [b, a];
console.log(a, b); // Output: 20 10
```
So nifty & elegant✨! Isn't it?
- _**Destructuring Function Return Values**_: Functions can return multiple values as an array or object. Destructuring allows you to unpack these returned values into separate variables, improving code clarity.
Let's suppose you have a function that fetches data from an API and returns a response object:
```JavaScript
function getUserUpdates(id) {
  // Simulating an API call with a GET request (the response object is
  // mocked here with illustrative values so the example is runnable)
  const response = {
    group: {
      names: ["Aiko", "Ravi", "Dev"],
      power: ["9000", "4500", "7200"],
    },
    status: "200",
  };

  return {
    data: {
      player: response.group.names[id],
      brain: "rotting",
      powerLevel: Number(response.group.power[id]),
      useAsDecoy: true,
    },
    statusCode: Number(response.status),
  };
}
```
```
In the context of building APIs or handling server responses, it offers distinct advantages that enhance code quality and maintainability.
Accessing individual properties is going to be a breeze, because you can directly extract the properties you need from the function's return value into separate variables during the function call itself.
```JavaScript
const {
data: {player, useAsDecoy, powerLevel},
statusCode,
} = getUserUpdates(1);
```
Whenever a function returns an object and you are interested in specific property values, always apply destructuring straight away.
If you're still thinking destructuring in return values isn't a good idea, these 2 more advantages may convince you -
(A) _**Simplified Mental Model:**_ Destructuring simplifies the thought process required to understand data flow for the developer who will be using your function. Instead of memorizing intricate property access chains, developers can focus on the meaning conveyed by the variable names used in the destructuring pattern. This reduces cognitive load and promotes better code comprehension.
(B) _**Reduced Boilerplate Code for Complex Return Objects:**_
When functions return objects with numerous or nested properties, destructuring significantly reduces the boilerplate code needed to access them individually. This leads to a more concise and less cluttered codebase, improving overall code quality.
- **_Destructuring with Conditions_**: Destructuring can be combined with conditional statements to handle different scenarios based on the structure of an object. Suppose you have a function that receives an object with optional properties:
```JavaScript
function greetUser(user) {
const { name = "Anonymous" } = user || {}; // Destructuring with default value
console.log(`Hello, ${name}!`);
}
greetUser({ name: "Bob" }); // Output: "Hello, Bob!"
greetUser({}); // Output: "Hello, Anonymous!" (no name property)
greetUser(undefined); // Output: "Hello, Anonymous!" (function receives no argument)
```
## Conclusion
Throughout the whole article, we've learnt that **'Destructuring'** is a powerful and versatile feature in JavaScript that can significantly improve your code's readability, maintainability, and efficiency. By effectively using destructuring techniques, you can write cleaner, more concise, and less error-prone code. So, embrace destructuring and take your JavaScript skills to the next level!
If you found this post helpful, if this blog added some value to your time and energy, please show some love by liking the article and sharing it with your friends.
Feel free to connect with me at - [Twitter](https://twitter.com/ddebajyati), [LinkedIn](https://www.linkedin.com/in/debajyati-dey) or [GitHub](https://github.com/Debajyati) :)
Happy Coding 🧑🏽💻👩🏽💻! Have a nice day ahead! 🚀 | ddebajyati |
1,896,153 | AWS Storage | 🚀 Exciting News! 🚀 I'm thrilled to announce that I've achieved AWS certification! 🎉 After months of... | 0 | 2024-06-21T15:57:42 | https://dev.to/vidhey071/aws-storage-196c | aws | 🚀 Exciting News! 🚀
I'm thrilled to announce that I've achieved AWS certification! 🎉
After months of dedicated learning and hard work, I am now officially certified with this certificate. This journey has been incredibly rewarding, and I'm looking forward to leveraging this knowledge to drive innovation and efficiency in cloud computing.
A huge thank you to everyone who supported me along the way. Your encouragement and guidance meant the world to me.
Let's continue to push boundaries and explore new possibilities with AWS! 💡 | vidhey071 |
1,770,605 | Age of Spagetti Code | I wrote this game POC (2015) before I knew about Git and JS build systems, so the whole program... | 0 | 2024-06-21T15:56:42 | https://dev.to/pengeszikra/age-of-spagetti-code-5451 | webdev, javascript, beginners | I wrote this game POC (2015) before I knew about Git and JS build systems, so the whole program was written in a single JS file: 11,277 LOC in the final version.
I wrote not just the code but the "documentation" as well. I found a single 1,400-LOC section where I wrote down my thoughts, and that part is the longest continuous "markdown" stretch.
Technically this is not just one game, but a 2D pixi.js-powered CCG and a Three.js-powered 3D space station, where I can even go inside one of the spaceship interiors; there is a room where an alien player plays this card game around a table.
This project ended suddenly, when the startup (which had hired me as a graphic designer) split up and then shut down.
But my coding practice at that time was really simple: I just opened my editor, Notepad++, and opened the program locally in the browser.
## develop / build process:
> before the age of hot reload
- write a code
- ctrl + S : save
- ctrl + R : refresh the browser
## perfect editor setup:
I use the following 3 shortcut keys to navigate the code at lightning speed:
- alt + left arrow : mark/delete jump point to current line
- alt + up arrow : jump to previous jump point
- alt + down arrow : jump to next jump point
KISS | pengeszikra |
1,896,152 | Day 25 of my progress as a vue dev | About today Today was an interesting one. I ended up wasting half of it because I overslept and then... | 0 | 2024-06-21T15:54:12 | https://dev.to/zain725342/day-25-of-my-progress-as-a-vue-dev-2bg4 | webdev, vue, typescript, tailwindcss | **About today**
Today was an interesting one. I ended up wasting half of it because I overslept, but for the other half I was extremely productive. I also ended up removing a lot of distractions today, which I believe is a good thing for achieving a flow state. I worked on my landing page, and I have a few creative ideas to make it a little more complex to test out my development skills.
**What's next?**
I will be done with my second landing page tomorrow. I am enjoying this process so far, and I would like to continue it for a couple more reps so I have a few pages to showcase on my GitHub, as well as the portfolio website/landing page I will be working on after this.
**Improvements required**
Currently the pages look a little too generic which is not sitting right with me, so I really want to test out new tings to make them look advanced but not messy.
Wish me luck! | zain725342 |
1,896,151 | NetDisco | Presentation: A versatile app designed for both Windows and Linux platforms. This tool... | 0 | 2024-06-21T15:53:22 | https://dev.to/7axel/netdisco-1ipo | networking, windows, linux, python | ## Presentation:
- A versatile app designed for both Windows and Linux platforms. This tool efficiently scans the LAN network, displaying detailed information such as IP addresses, MAC addresses, and host names of devices connected to the same Wi-Fi network as the host PC. Its capability to execute seamlessly across different operating systems enhances user accessibility and provides valuable insights into network connectivity, making it an invaluable tool for both home and professional use.
## Use:
- When you run the app, it initiates a scan of your local LAN network, systematically searching for connected devices. The interface displays ongoing progress as it identifies each device, showcasing detailed information such as IP addresses, MAC addresses, and host names. This real-time visibility into your network environment provides a comprehensive view of all devices connected to the same Wi-Fi network as your PC, ensuring you stay informed about your network's status and security
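NetDisco itself ships as a closed binary, but as a rough illustration of one technique such scanners can use, here is a minimal Python sketch (my own code, not NetDisco's) that extracts IP/MAC pairs from `arp -a`-style output:

```python
import re

def parse_arp_table(arp_output):
    """Extract (ip, mac) pairs from `arp -a`-style text output."""
    pattern = re.compile(
        r"(\d{1,3}(?:\.\d{1,3}){3})\s+"               # IPv4 address
        r"([0-9a-fA-F]{2}(?:[-:][0-9a-fA-F]{2}){5})"  # MAC address
    )
    return pattern.findall(arp_output)

sample = """
  192.168.1.1           a4-2b-b0-9e-00-01     dynamic
  192.168.1.42          08-00-27-12-34-56     dynamic
"""
print(parse_arp_table(sample))
# [('192.168.1.1', 'a4-2b-b0-9e-00-01'), ('192.168.1.42', '08-00-27-12-34-56')]
```

A real scanner would also send ARP or ICMP probes and resolve host names, but parsing the system ARP cache like this is a common lightweight starting point.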
- **_Captures_**:

- If you want to initiate another scan, press the **Update** button:

## Download:
- You can download the app by pressing [Download](https://github.com/77AXEL/NetDisco/raw/main/NetDisco.exe?download=)
- On Windows, you can launch the executable directly. However, on Linux-based systems, you'll need a compatibility layer like Wine to run the application:
```
wine NetDisco.exe
```
This setup allows the app to function seamlessly across different operating environments, ensuring compatibility and accessibility for users on varying platforms
## URLs:
- You can also view the project on GitHub: [https://github.com/77AXEL/NetDisco/](https://github.com/77AXEL/NetDisco/)
| 7axel |
1,893,918 | Namaste Your Way to Wellness: AI-Powered Yoga Recommendations | This is a submission for the Twilio Challenge Today is International Yoga Day! Whether you're a... | 0 | 2024-06-21T15:52:37 | https://dev.to/engineeredsoul/namaste-your-way-to-wellness-ai-powered-yoga-recommendations-2n3p | devchallenge, twiliochallenge, ai, twilio | *This is a submission for the [Twilio Challenge](https://dev.to/challenges/twilio)*
Today is International Yoga Day! Whether you're a seasoned yogi or just dipping your toes into the world of yoga, it's the perfect day to embrace the benefits of this ancient practice. And there's no better way to celebrate than with YogaConnect, your new personalized yoga companion.
## What I Built
<!-- Share an overview about your project. -->
[YogaConnect](https://yogaconnect.streamlit.app/) is designed to make your yoga journey enjoyable and tailored to your needs. With personalized routines, and helpful wellness tips, YogaConnect ensures you stay motivated and on track. Here’s how YogaConnect can help you make the most out of your yoga practice:
### Key Features of YogaConnect
**Personalized Yoga Recommendations**
YogaConnect uses AI to analyze your experience level, goals, and any physical limitations to provide personalized yoga poses. Whether you're a beginner, intermediate, or advanced practitioner, YogaConnect tailors routines that fit your unique needs.
**Interactive User Interface**
Built on Streamlit, YogaConnect offers a clean and interactive interface. Easily input your details, view recommendations, and access video tutorials seamlessly, making your yoga journey smooth and enjoyable.
**Email Follow-Ups**
To keep you motivated, YogaConnect sends detailed follow-up emails containing your personalized yoga routines and additional wellness tips. Leveraging [Twilio's SendGrid](https://sendgrid.com/en-us), these emails are delivered consistently and reliably to keep you engaged and informed.
## **How YogaConnect Works**
### Step 1: Tell Us About Yourself
Upon visiting YogaConnect, you’ll be prompted to fill in your details:
- Yoga Experience Level: Choose from Beginner, Intermediate, or Advanced.
- Yoga Goals: Select your goals such as Flexibility, Strength, or Relaxation.
- Physical Limitations: Optionally, you can specify any physical limitations you might have.
- Email Address: Provide your email address to receive follow-up emails.
### Step 2: Get AI-Powered Recommendations
Once you submit your details, YogaConnect uses Gemini AI engine to generate a list of recommended yoga poses tailored to your profile. This ensures that the yoga routine fits your specific needs and goals.
### Step 3: Watch Video Tutorials
For each recommended pose, YogaConnect fetches a relevant instructional video from YouTube, ensuring you have a clear visual guide to follow. This makes it easier to practice each pose correctly and safely.
### Step 4: Receive Email Delivery
After generating your personalized yoga routine, YogaConnect sends an email to the provided address with all the details. This email includes:
- The name and Sanskrit name of each pose.
- The benefits of each pose.
- The best time to perform each pose.
- Suggestions and tips for each pose.
- Links to video tutorials.
## Demo
<!-- Share a link to your app and include some screenshots here. -->
Check out the [YogaConnect app](https://yogaconnect.streamlit.app/)!



## Twilio and AI
<!-- Tell us how you leveraged Twilio’s capabilities with AI -->
### Twilio's SendGrid and Dynamic Templates
Twilio's SendGrid is a cloud-based service that provides a reliable and scalable platform for sending transactional and marketing emails. It is widely used for sending notifications, newsletters, and other automated email communications. SendGrid offers robust features such as analytics, template management, and email deliverability tools.
Dynamic templates in SendGrid allow you to create email templates with placeholders that can be filled with dynamic data when the email is sent. This feature is particularly useful for sending personalized emails where the content can change based on the recipient's data.
Here’s how dynamic templates are typically used:
- Create a Dynamic Template: In the SendGrid dashboard, create a new email template and define placeholders for the dynamic content.
- Define Template Variables: Use placeholders in the template body, such as `{{name}}` or `{{yoga_pose}}`, which will be replaced with actual data when the email is sent.

- Send Email with Dynamic Data: When sending the email via the SendGrid API, pass the dynamic data in the request to populate the placeholders.
**Example of Sending Dynamic Emails**
In YogaConnect, we use a dynamic template to send personalized yoga routines. Here’s the code that handles this:
```python
def send_dynamic_email(to_emails, email_content):
    FROM_EMAIL = 'FROM_EMAIL'
    TEMPLATE_ID = 'SENDGRID_TEMPLATE_ID'

    message = Mail(
        from_email=FROM_EMAIL,
        to_emails=[to_emails]
    )
    message.dynamic_template_data = {
        'content': email_content
    }
    message.template_id = TEMPLATE_ID

    try:
        sg = SendGridAPIClient(SENDGRID_API_KEY)
        response = sg.send(message)
        print("Dynamic Messages Sent!")
        return "Dynamic Messages Sent!"
    except Exception as e:
        print(f"Error: {e}")
        return "Error sending email"
```
- **TEMPLATE_ID**: The ID of the dynamic template created in the SendGrid dashboard.
- **dynamic_template_data**: A dictionary containing the data to replace placeholders in the template.
- **SendGridAPIClient**: Used to send the email via the SendGrid API.
### Google Gemini AI
In YogaConnect, [Google Gemini AI](https://deepmind.google/technologies/gemini/) is used to generate personalized yoga recommendations based on user input. The model is configured to ensure relevant and accurate responses tailored to each user's needs.
Using Google AI Studio, I created a Structured Prompt that returns the required parameters in JSON format.

Here's how I configure and use Google Gemini AI in YogaConnect:
```python
import google.generativeai as genai

genai.configure(api_key=GEMINI_API_KEY)

generation_config = {
    "temperature": 1,
    "top_p": 0.95,
    "top_k": 64,
    "max_output_tokens": 8192,
    "response_mime_type": "application/json",
}

model = genai.GenerativeModel(
    model_name="gemini-1.5-flash",
    generation_config=generation_config,
)
```
- **GEMINI_API_KEY**: API key for accessing Google Gemini AI services.
- **generation_config**: Configuration settings for the model to control the output's creativity and coherence.
The `get_ai_recommended_poses` function uses the configured Gemini model to generate yoga pose recommendations based on user input.
## Additional Prize Categories
YogaConnect fits the "Impactful Innovators" category by promoting physical and mental well-being through personalized yoga recommendations. By leveraging AI, the app offers tailored yoga practices that cater to individual needs, helping users achieve specific wellness goals such as flexibility, strength, and stress reduction
## Links/References
- [YogaConnect App](https://yogaconnect.streamlit.app/)
- [Github Repo](https://github.com/dotAadarsh/YogaConnect)
- [Twilio | SendGrid](https://www.twilio.com/docs/sendgrid)
- [Google AI Studi | Gemini API](https://ai.google.dev/aistudio)
- [Streamlit API Reference](https://docs.streamlit.io/develop/api-reference)
---
I will keep on updating the project before the deadline.
Banner credit: <a href="https://storyset.com/people">People illustrations by Storyset</a>
Thanks for the read and I appreciate your feedbacks!
<!-- Thanks for participating! --> | engineeredsoul |
1,896,150 | Amazon S3 Cost Optimization | 🚀 Exciting News! 🚀 I am thrilled to announce that I have achieved my AWS certification! 🎉 After... | 0 | 2024-06-21T15:50:21 | https://dev.to/vidhey071/amazon-s3-cost-optimization-bje | aws | 🚀 Exciting News! 🚀
I am thrilled to announce that I have achieved my AWS certification! 🎉 After months of hard work and dedication, I am now certified with this certificate. This accomplishment signifies my commitment to mastering AWS services and best practices, enhancing my skills in cloud computing and infrastructure management.
I am grateful for the support of my colleagues, mentors, and the invaluable resources provided by AWS. This journey has been incredibly rewarding, and I look forward to applying my knowledge to deliver innovative solutions and contribute effectively to our projects.
Thank you all for your encouragement and belief in my abilities. Let's continue to strive for excellence together! | vidhey071 |
1,896,149 | 🚀 Enhance Your Cypress Tests with Custom Commands | Learn how to streamline your Cypress test suite by adding custom commands that enhance readability... | 0 | 2024-06-21T15:49:30 | https://dev.to/geraldhamiltonwicks/enhance-your-cypress-tests-with-custom-commands-5893 | cypress, e2e, typescript, testing | Learn how to streamline your Cypress test suite by adding custom commands that enhance readability and maintainability. Whether you're automating login flows or validating UI elements, these custom commands make your tests more robust.
## Enhance Your Cypress Tests with Custom Commands
Cypress is a robust framework for end-to-end testing of web applications, known for its simplicity and powerful capabilities. By adding custom commands with TypeScript, you can enhance your testing suite's readability and maintainability. This guide will walk you through the steps required to integrate custom commands into your Cypress setup.
### Prerequisites
Before proceeding, ensure you have set up Cypress in your project. Here’s a typical folder structure:
```
cypress
├── downloads
├── e2e
├── fixtures
└── support
├── commands.ts
└── index.d.ts
```
### Step-by-Step Guide
### Step 1: Install TypeScript
If you haven't already installed TypeScript, do so by running the following command:
```bash
npm install typescript --save-dev
```
### Step 2: Configure TypeScript
Create a `tsconfig.json` file at the root of your Cypress folder with the following configuration:
```json
{
"compilerOptions": {
"target": "es5",
"lib": ["es5", "dom"],
"types": ["cypress", "node"]
},
"include": ["**/*.ts"]
}
```
This configuration ensures TypeScript compiles your code to ES5, includes necessary typings for Cypress and Node.js, and processes all `.ts` files.
### Step 3: Define Custom Commands
Navigate to `/cypress/support/commands.ts` and define your custom commands. For example, let's create a command for user login:
```typescript
// /cypress/support/commands.ts
Cypress.Commands.add('loginByUi', (email: string, password: string) => {
// Visit login page
cy.visit('http://localhost:3000/');
// Enter email and password
cy.get('input[id="email"]').type(email);
cy.get('input[id="password"]').type(password);
// Click submit button
cy.get('button[type="submit"]').click();
});
```
### Step 4: Declare Custom Command Types
Create a declaration file `index.d.ts` inside `/cypress/support` to provide TypeScript with type definitions for your custom commands:
```typescript
// /cypress/support/index.d.ts
declare namespace Cypress {
  interface Chainable {
    /**
     * Custom command to login via UI.
     * @example cy.loginByUi('email', 'password')
     */
    loginByUi(email: string, password: string): void;
  }
}
```
### Step 5: Implement and Use Custom Commands
Once defined, you can use your custom command in Cypress test files (e.g., `app.cy.ts`):
```typescript
// /cypress/integration/app.cy.ts
const email = 'user@example.com';  // illustrative test credentials
const password = 'password123';

describe('App component', () => {
  it('Should navigate to app component', () => {
    // Perform login using custom command
    cy.loginByUi(email, password);

    // Assert post-login elements
    cy.get('img').should('be.visible'); // Check for image logo
    cy.contains('p', 'Edit').should('be.visible'); // Check for 'Edit' text
    cy.contains('a', 'Learn React').should('be.visible'); // Check for 'Learn React' link
  });
});
```
### Conclusion
By following these steps, you can efficiently extend Cypress with custom commands using TypeScript, making your tests more concise and maintainable. This approach enhances the clarity of your test scripts and promotes reusability across your testing scenarios. Start integrating custom commands today to optimize your Cypress testing workflow and ensure robust application testing. Happy testing! 🚀
| geraldhamiltonwicks |
1,896,148 | Amazon S3 File Gateway | 🚀 Exciting News! 🚀 I'm thrilled to announce that I've achieved AWS certification! 🎉 After months of... | 0 | 2024-06-21T15:49:22 | https://dev.to/vidhey071/amazon-s3-file-gateway-5bel | aws | 🚀 Exciting News! 🚀
I'm thrilled to announce that I've achieved AWS certification! 🎉
After months of dedicated learning and hard work, I am now officially certified with this certificate. This journey has been incredibly rewarding, and I'm looking forward to leveraging this knowledge to drive innovation and efficiency in cloud computing.
A huge thank you to everyone who supported me along the way. Your encouragement and guidance meant the world to me.
Let's continue to push boundaries and explore new possibilities with AWS! 💡 | vidhey071 |
1,896,147 | How Mobile App Developers can use Wearables? | I recently started exploring the intersection of mobile app development and wearable technology, and... | 0 | 2024-06-21T15:49:06 | https://dev.to/nicolet22222/how-mobile-app-developers-can-use-wearables-85o | wearablle, wearabledevices, development, developers | I recently started exploring the intersection of mobile app development and wearable technology, and I'm amazed at the potential these devices hold. Whether it's smartwatches, fitness trackers, or even smart glasses, wearables are becoming an integral part of our daily lives. For developers, especially those diving into [wearable application development](https://www.cubix.co/wearable-app-development), the opportunities are endless.
First off, wearables provide a unique way to enhance user experience. Think about fitness apps that can track your heart rate, steps, and even sleep patterns. This data can be seamlessly integrated into mobile apps, providing users with real-time feedback and personalized insights. For example, a health app could use data from a smartwatch to offer customized workout plans or alert users to irregular heart rates. Developers focusing on visionOS application development can leverage the immersive capabilities of wearables to create even more engaging and interactive experiences.
Moreover, wearables can significantly improve user safety. Features like fall detection and emergency alerts can be lifesaving. Imagine an app that can detect when a user has fallen and immediately alert emergency contacts or services. This not only enhances user safety but also adds a layer of trust and reliability to the app.
From a business perspective, incorporating wearable technology can set your app apart from the competition. It's an exciting way to offer new functionalities and keep users engaged. For instance, integrating wearable data into a visionOS application can create immersive experiences for users, like virtual fitness coaches or interactive training modules, which could be a game-changer in the fitness industry.
However, with all this data being collected, it's crucial to prioritize user privacy and data security. Ensuring that all data is encrypted and securely stored is essential to maintaining user trust. Also, providing clear information about how the data will be used and obtaining user consent can help in building a transparent relationship with your users.
I'm curious, how are you guys planning to integrate wearables into your apps? Have you already started, or are you still exploring the possibilities? Let's share our thoughts and experiences! | nicolet22222 |
1,896,146 | Unlocking the Power of a JavaScript Formatter: Cleaner Code, Better Projects | https://ovdss.com/apps/javascript-formatter JavaScript is one of the most widely used programming... | 0 | 2024-06-21T15:48:29 | https://dev.to/johnalbort12/unlocking-the-power-of-a-javascript-formatter-cleaner-code-better-projects-4jnc |

https://ovdss.com/apps/javascript-formatter
JavaScript is one of the most widely used programming languages in the world, powering everything from small websites to large-scale web applications. However, as any developer knows, managing and maintaining clean, readable code can be a daunting task. This is where a JavaScript formatter comes into play. In this blog post, we'll delve into what a JavaScript formatter is, why it’s essential for your projects, and how to use it effectively to enhance your coding workflow.
## What is a JavaScript Formatter?
A JavaScript formatter is a tool that automatically adjusts the layout of your code to follow a consistent style. It rearranges your code according to predefined rules or customizable configurations, ensuring that your codebase remains clean, readable, and maintainable. This can include tasks such as fixing indentation, adding or removing whitespace, aligning code, and formatting comments.
## Why You Need a JavaScript Formatter
### 1. Consistency
One of the most significant benefits of using a JavaScript formatter is achieving consistency across your codebase. When multiple developers work on the same project, differences in coding styles can lead to a disorganised and hard-to-read codebase. A formatter enforces a uniform style, making it easier for any developer to understand and navigate the code.
### 2. Readability
Clean and well-formatted code is easier to read and understand. This can significantly reduce the time needed to comprehend the code's functionality, especially for new team members or contributors. Improved readability also aids in identifying bugs and potential issues more quickly.
### 3. Productivity
By automating the task of code formatting, developers can save time and focus on writing actual code. This automation reduces the need for manual adjustments and code reviews focused solely on style issues, thus boosting overall productivity.
### 4. Error Reduction
Inconsistent formatting can sometimes obscure logical errors or typos. A formatter ensures that your code structure is clear, making it easier to spot mistakes that might otherwise go unnoticed.
### 5. Collaboration
When everyone on a team uses the same formatting rules, merging code changes becomes smoother. There are fewer merge conflicts related to code style, which streamlines the whole review process.
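A formatter's rules are typically committed alongside the project so they travel with the repository. For instance, a minimal Prettier configuration (`.prettierrc`; the values below are purely illustrative, not recommendations) might look like:

```json
{
  "printWidth": 100,
  "tabWidth": 2,
  "semi": true,
  "singleQuote": true,
  "trailingComma": "es5"
}
```

With a file like this checked in, every editor and CI run applies the same style automatically.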
## Conclusion
A JavaScript formatter is an indispensable tool for any developer looking to maintain clean, readable, and consistent code. By integrating a formatter into your workflow, you can enhance productivity, reduce errors, and facilitate better collaboration among your team. Whether you choose Prettier, ESLint, or any other formatter, the key is to ensure that your code remains a joy to read and easy to maintain. So, take the step towards cleaner code today and unlock the full potential of your projects with a JavaScript formatter.
| johnalbort12 | |
1,896,145 | Managing Large Files with Git LFS | Git LFS (large file system) hell Managing Large Files with Git LFS I recently faced a... | 0 | 2024-06-21T15:46:03 | https://dev.to/ebrahimramadan/managing-large-files-with-git-lfs-3327 | git, lfs, webdev, versioncontrolsystem |

## [Git LFS (large file system) hell](https://ebrahim-ramadan.vercel.app/blogs/GitLFS)
Managing Large Files with Git LFS
I recently faced a first-time-for-everything challenge while working on my portfolio (this site). I had some quite large .gif files that I decided to manage using Git LFS (Large File Storage). However, things didn't go as smoothly as I anticipated. Here's how I went through it and what I learned along the way.
> "--distributed-even-if-your-workflow-isnt"
Git is a powerful version control system with many benefits, including the ability to store and manage large files. However, it's important to note that storing large files directly in Git can significantly slow down operations like pulling, pushing, and cloning the repository. This can frustrate collaborators who rely on these operations to work efficiently.
When a large file is added to a Git repository, every collaborator on the repository must download the entire file, including all versions of it. This process can be time-consuming, especially for collaborators with slower internet connections. Additionally, storing large files on Git can result in a large repository size, making collaboration difficult.
That is where Git LFS comes into play. It is a Git extension that stores large files in a separate repository and keeps a single text pointer in the regular repository that points to the actual content on the remote server. This means that only the collaborators who need the file download it, reducing the size of the repository and improving collaboration.
# Installing
Refer to the Git LFS documentation; note that Git ≥ 1.8.2 is required.
#Windows
download https://git-lfs.github.com/
`>_ git lfs install`
#macOS
`>_ brew install git-lfs`
`>_ git lfs install`
#Linux
`>_ sudo apt-get install git-lfs`
`>_ git lfs install`
This will return output like this:
`Updated Git Hooks`
`Git LFS initialized`
# Tracking Files
To track the .gif files in my repo, I ran
`>_ git lfs track "*.gif"`
`>_ git add .gitattributes`
The track command tells Git LFS to handle all .gif files in the repo directory, and it creates the .gitattributes file in the root of the repo if it does not already exist, so it contains something like
*.gif filter=lfs diff=lfs merge=lfs -text
This is Git's mechanism for binding special behaviors to certain file patterns; Git LFS hooks into Git's filters via the file patterns tracked in .gitattributes. Then you can commit and push as usual:
`>_ git add .`
`>_ git commit -m "gif files to lfs"`
`>_ git push origin main`
Now the gif file content does not exist in my actual repo; it is just a pointer. So when someone clones or pulls, Git will try to resolve those pointers, and there are a few ways to ensure the LFS content is retrieved and available:
1. Before deploying, you can run `git lfs fetch --all` to download all LFS objects.
2. On-demand fetching: Some hosting platforms (like GitHub Pages) can fetch LFS content on-demand when requested.
3. Custom server logic: You could implement server-side logic to fetch LFS content when requested.
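For reference, what Git now stores in place of each gif is a tiny pointer file along these lines (the oid hash and size here are illustrative, not taken from my repo):

```
version https://git-lfs.github.com/spec/v1
oid sha256:4d7a214614ab2935c943f9e0ff69d22eadbb8f32b1258daaa5e2ca24d17e2393
size 12345
```

When LFS is set up correctly, checkout replaces this pointer with the real file content fetched from the LFS server.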
## Untracking Files
The content was not being served for me in dev or in production, so I decided to untrack the files and compress them to be less than 50MB (the GitHub repo limit). The first thing to start with:
`>_ git lfs untrack "*.gif"`
now you have to pull the files contents from the lfs remote server to your local machine by running:
`>_ git lfs pull`
This command downloads the actual file content for any LFS-tracked files referenced in your repository, so each file is present locally just as any other regular file in the repository would be.
# Problems I confronted
To ensure the file type is completely removed from LFS tracking, you should remove it from the LFS cache. Run the following command:
`>_ git rm --cached "*.gif"`
Ensure the file is untracked by Git LFS and that the actual file content is present in your local working directory. You can check whether the file is still tracked by listing all LFS-tracked files:
`>_ git lfs ls-files`
# Machine Learning reproducibility crisis
ML devs
"The so-called crisis is because of the difficulty in replicating the work of co-workers or fellow scientists, threatening their ability to build on each others work or to share it with clients or to deploy production services. Since machine learning, and other forms of artificial intelligence software, are so widely used across both academic and corporate research, replicability or reproducibility is a critical problem."
[David Herron](https://dev.to/robogeek/why-git-and-git-lfs-is-not-enough-to-solve-the-machine-learning-reproducibility-crisis-3cnm?ref=ebrahim-ramadan-portfolio-webdev-git-lfs-blog)
Posted on Jun 15, 2019
Read the [full article](https://dev.to/robogeek/why-git-and-git-lfs-is-not-enough-to-solve-the-machine-learning-reproducibility-crisis-3cnm?ref=ebrahim-ramadan-portfolio-webdev-git-lfs-blog) by David on [Why Git and Git-LFS is not enough to solve the Machine Learning Reproducibility crisis](https://dev.to/robogeek/why-git-and-git-lfs-is-not-enough-to-solve-the-machine-learning-reproducibility-crisis-3cnm?ref=ebrahim-ramadan-portfolio-webdev-git-lfs-blog) to see how machine learning projects use Git LFS for models, datasets, and more; it is really helpful for ML devs.
| ebrahimramadan |
1,896,144 | 10 Best Tools for Secure and Efficient File Sharing in 2024 | 10 Best Tools for Secure and Efficient File Sharing in 2024 In today’s digital age,... | 0 | 2024-06-21T15:41:18 | https://dev.to/sh20raj/10-best-tools-for-secure-and-efficient-file-sharing-in-2024-1hh2 | ## 10 Best Tools for Secure and Efficient File Sharing in 2024
In today’s digital age, sharing files quickly and securely is crucial for both personal and professional needs. Whether you are sharing a large video file, sensitive documents, or everyday images, having a reliable tool can make all the difference. Here, we present the ten best tools for file sharing that stand out for their security, ease of use, and efficiency.
{% youtube https://www.youtube.com/watch?v=oe-907slDIg %}
### 1. Magic Wormhole
**Magic Wormhole** is a command-line tool that lets you securely send files and directories to another computer. It uses a short and easy-to-remember code to establish a secure connection, making it perfect for those comfortable with terminal commands.
**Key Features**:
- Secure file transfer
- Easy-to-use code system
- Command-line interface
**Learn More**: [Magic Wormhole](https://github.com/magic-wormhole/magic-wormhole)
### 2. Send
Originally developed by Mozilla, **Send** is a simple and private file sharing service. It offers end-to-end encryption and links that automatically expire, ensuring your files remain secure and accessible only to the intended recipient.
**Key Features**:
- End-to-end encryption
- Link expiration
- User-friendly web interface
**Learn More**: [Send](https://github.com/timvisee/send)
### 3. Croc
**Croc** is a versatile tool that allows secure and easy file transfer between computers. It supports cross-platform functionality and relay servers, making it a robust choice for various use cases.
**Key Features**:
- Secure and fast file transfer
- Cross-platform support
- Relay server functionality
**Learn More**: [Croc](https://github.com/schollz/croc)
### 4. toffeeshare
**toffeeshare** lets you share files directly from your device to anywhere, sending files of any size without ever storing anything online.
**Key Features**:
- No file size limit
- Blazingly fast
- End-to-end encrypted
- Peer-to-peer file transfer
- No server storage
- Browser-based interface
**Learn More**: [toffeeshare](https://toffeeshare.com/en/)
### 5. Snapdrop
**Snapdrop** is an open-source alternative to AirDrop, allowing file sharing over local networks without an internet connection. It’s browser-based and works seamlessly across different devices.
**Key Features**:
- Local network file transfer
- Browser-based
- Cross-device compatibility
**Learn More**: [Snapdrop](https://github.com/RobinLinus/snapdrop)
### 6. OnionShare
**OnionShare** leverages the Tor network to enable anonymous and secure file sharing. It supports files of any size, making it ideal for highly sensitive or large documents.
**Key Features**:
- Anonymous sharing via Tor
- No size limit
- High security
**Learn More**: [OnionShare](https://github.com/onionshare/onionshare)
### 7. Wormhole
**Wormhole** is a web-based tool that ensures secure file sharing with end-to-end encryption. It offers instant sharing for small files and secure, fast transfers for larger ones.
**Key Features**:
- End-to-end encryption
- Instant sharing
- User-friendly interface
**Learn More**: [Wormhole](https://wormhole.app)
### 8. Firefox Send (Discontinued, but Forks Exist)
**Firefox Send** was a popular tool for private file sharing with strong encryption. Although it has been discontinued, several forks maintain its functionality and security features.
**Key Features**:
- End-to-end encryption
- Link expiration
- Simple web interface
**Learn More**: [Send](https://github.com/timvisee/send)
### 9. Resilio Sync
**Resilio Sync** uses peer-to-peer technology to sync and share files. It’s highly efficient for both personal and business use, ensuring secure and fast file transfers without relying on cloud servers.
**Key Features**:
- Peer-to-peer sync and share
- Secure and fast
- Ideal for business use
**Learn More**: [Resilio Sync](https://www.resilio.com/individuals/)
### 10. Syncthing
**Syncthing** is an open-source continuous file synchronization program. It synchronizes files between two or more computers in real-time, ensuring data remains secure and accessible.
**Key Features**:
- Continuous file synchronization
- Open-source
- Real-time sync
**Learn More**: [Syncthing](https://syncthing.net)
### Conclusion
Choosing the right file sharing tool depends on your specific needs, whether it’s high security, ease of use, or cross-platform functionality. The tools listed above offer a range of features to help you share files securely and efficiently in 2024.
Ensure your digital exchanges are safe and seamless by opting for one of these top-rated file sharing solutions. | sh20raj | |
1,896,143 | Buy verified cash app account | https://dmhelpshop.com/product/buy-verified-cash-app-account/ Buy verified cash app account Cash... | 0 | 2024-06-21T15:38:17 | https://dev.to/helapik680/buy-verified-cash-app-account-3j7b | webdev, javascript, beginners, programming | ERROR: type should be string, got "https://dmhelpshop.com/product/buy-verified-cash-app-account/\n\n\nBuy verified cash app account\nCash app has emerged as a dominant force in the realm of mobile banking within the USA, offering unparalleled convenience for digital money transfers, deposits, and trading. As the foremost provider of fully verified cash app accounts, we take pride in our ability to deliver accounts with substantial limits. Bitcoin enablement, and an unmatched level of security.\n\nOur commitment to facilitating seamless transactions and enabling digital currency trades has garnered significant acclaim, as evidenced by the overwhelming response from our satisfied clientele. Those seeking buy verified cash app account with 100% legitimate documentation and unrestricted access need look no further. Get in touch with us promptly to acquire your verified cash app account and take advantage of all the benefits it has to offer.\n\nWhy dmhelpshop is the best place to buy USA cash app accounts?\nIt’s crucial to stay informed about any updates to the platform you’re using. If an update has been released, it’s important to explore alternative options. Contact the platform’s support team to inquire about the status of the cash app service.\n\nClearly communicate your requirements and inquire whether they can meet your needs and provide the buy verified cash app account promptly. 
If they assure you that they can fulfill your requirements within the specified timeframe, proceed with the verification process using the required documents.\n\nOur account verification process includes the submission of the following documents: [List of specific documents required for verification].\n\nGenuine and activated email verified\nRegistered phone number (USA)\nSelfie verified\nSSN (social security number) verified\nDriving license\nBTC enable or not enable (BTC enable best)\n100% replacement guaranteed\n100% customer satisfaction\nWhen it comes to staying on top of the latest platform updates, it’s crucial to act fast and ensure you’re positioned in the best possible place. If you’re considering a switch, reaching out to the right contacts and inquiring about the status of the buy verified cash app account service update is essential.\n\nClearly communicate your requirements and gauge their commitment to fulfilling them promptly. Once you’ve confirmed their capability, proceed with the verification process using genuine and activated email verification, a registered USA phone number, selfie verification, social security number (SSN) verification, and a valid driving license.\n\nAdditionally, assessing whether BTC enablement is available is advisable, buy verified cash app account, with a preference for this feature. It’s important to note that a 100% replacement guarantee and ensuring 100% customer satisfaction are essential benchmarks in this process.\n\nHow to use the Cash Card to make purchases?\nTo activate your Cash Card, open the Cash App on your compatible device, locate the Cash Card icon at the bottom of the screen, and tap on it. Then select “Activate Cash Card” and proceed to scan the QR code on your card. Alternatively, you can manually enter the CVV and expiration date. 
How To Buy Verified Cash App Accounts.\n\nAfter submitting your information, including your registered number, expiration date, and CVV code, you can start making payments by conveniently tapping your card on a contactless-enabled payment terminal. Consider obtaining a buy verified Cash App account for seamless transactions, especially for business purposes. Buy verified cash app account.\n\nWhy we suggest to unchanged the Cash App account username?\nTo activate your Cash Card, open the Cash App on your compatible device, locate the Cash Card icon at the bottom of the screen, and tap on it. Then select “Activate Cash Card” and proceed to scan the QR code on your card.\n\nAlternatively, you can manually enter the CVV and expiration date. After submitting your information, including your registered number, expiration date, and CVV code, you can start making payments by conveniently tapping your card on a contactless-enabled payment terminal. Consider obtaining a verified Cash App account for seamless transactions, especially for business purposes. Buy verified cash app account. Purchase Verified Cash App Accounts.\n\nSelecting a username in an app usually comes with the understanding that it cannot be easily changed within the app’s settings or options. This deliberate control is in place to uphold consistency and minimize potential user confusion, especially for those who have added you as a contact using your username. In addition, purchasing a Cash App account with verified genuine documents already linked to the account ensures a reliable and secure transaction experience.\n\n \n\nBuy verified cash app accounts quickly and easily for all your financial needs.\nAs the user base of our platform continues to grow, the significance of verified accounts cannot be overstated for both businesses and individuals seeking to leverage its full range of features. 
How To Buy Verified Cash App Accounts.\n\nFor entrepreneurs, freelancers, and investors alike, a verified cash app account opens the door to sending, receiving, and withdrawing substantial amounts of money, offering unparalleled convenience and flexibility. Whether you’re conducting business or managing personal finances, the benefits of a verified account are clear, providing a secure and efficient means to transact and manage funds at scale.\n\nWhen it comes to the rising trend of purchasing buy verified cash app account, it’s crucial to tread carefully and opt for reputable providers to steer clear of potential scams and fraudulent activities. How To Buy Verified Cash App Accounts. With numerous providers offering this service at competitive prices, it is paramount to be diligent in selecting a trusted source.\n\nThis article serves as a comprehensive guide, equipping you with the essential knowledge to navigate the process of procuring buy verified cash app account, ensuring that you are well-informed before making any purchasing decisions. Understanding the fundamentals is key, and by following this guide, you’ll be empowered to make informed choices with confidence.\n\n \n\nIs it safe to buy Cash App Verified Accounts?\nCash App, being a prominent peer-to-peer mobile payment application, is widely utilized by numerous individuals for their transactions. However, concerns regarding its safety have arisen, particularly pertaining to the purchase of “verified” accounts through Cash App. This raises questions about the security of Cash App’s verification process.\n\nUnfortunately, the answer is negative, as buying such verified accounts entails risks and is deemed unsafe. Therefore, it is crucial for everyone to exercise caution and be aware of potential vulnerabilities when using Cash App. 
How To Buy Verified Cash App Accounts.\n\nCash App has emerged as a widely embraced platform for purchasing Instagram Followers using PayPal, catering to a diverse range of users. This convenient application permits individuals possessing a PayPal account to procure authenticated Instagram Followers.\n\nLeveraging the Cash App, users can either opt to procure followers for a predetermined quantity or exercise patience until their account accrues a substantial follower count, subsequently making a bulk purchase. Although the Cash App provides this service, it is crucial to discern between genuine and counterfeit items. If you find yourself in search of counterfeit products such as a Rolex, a Louis Vuitton item, or a Louis Vuitton bag, there are two viable approaches to consider.\n\n \n\nWhy you need to buy verified Cash App accounts personal or business?\nThe Cash App is a versatile digital wallet enabling seamless money transfers among its users. However, it presents a concern as it facilitates transfer to both verified and unverified individuals.\n\nTo address this, the Cash App offers the option to become a verified user, which unlocks a range of advantages. Verified users can enjoy perks such as express payment, immediate issue resolution, and a generous interest-free period of up to two weeks. With its user-friendly interface and enhanced capabilities, the Cash App caters to the needs of a wide audience, ensuring convenient and secure digital transactions for all.\n\nIf you’re a business person seeking additional funds to expand your business, we have a solution for you. Payroll management can often be a challenging task, regardless of whether you’re a small family-run business or a large corporation. How To Buy Verified Cash App Accounts.\n\nImproper payment practices can lead to potential issues with your employees, as they could report you to the government. 
However, worry not, as we offer a reliable and efficient way to ensure proper payroll management, avoiding any potential complications. Our services provide you with the funds you need without compromising your reputation or legal standing. With our assistance, you can focus on growing your business while maintaining a professional and compliant relationship with your employees. Purchase Verified Cash App Accounts.\n\nA Cash App has emerged as a leading peer-to-peer payment method, catering to a wide range of users. With its seamless functionality, individuals can effortlessly send and receive cash in a matter of seconds, bypassing the need for a traditional bank account or social security number. Buy verified cash app account.\n\nThis accessibility makes it particularly appealing to millennials, addressing a common challenge they face in accessing physical currency. As a result, ACash App has established itself as a preferred choice among diverse audiences, enabling swift and hassle-free transactions for everyone. Purchase Verified Cash App Accounts.\n\n \n\nHow to verify Cash App accounts\nTo ensure the verification of your Cash App account, it is essential to securely store all your required documents in your account. This process includes accurately supplying your date of birth and verifying the US or UK phone number linked to your Cash App account.\n\nAs part of the verification process, you will be asked to submit accurate personal details such as your date of birth, the last four digits of your SSN, and your email address. If additional information is requested by the Cash App community to validate your account, be prepared to provide it promptly. Upon successful verification, you will gain full access to managing your account balance, as well as sending and receiving funds seamlessly. 
Buy verified cash app account.\n\n \n\nHow cash used for international transaction?\nExperience the seamless convenience of this innovative platform that simplifies money transfers to the level of sending a text message. It effortlessly connects users within the familiar confines of their respective currency regions, primarily in the United States and the United Kingdom.\n\nNo matter if you’re a freelancer seeking to diversify your clientele or a small business eager to enhance market presence, this solution caters to your financial needs efficiently and securely. Embrace a world of unlimited possibilities while staying connected to your currency domain. Buy verified cash app account.\n\nUnderstanding the currency capabilities of your selected payment application is essential in today’s digital landscape, where versatile financial tools are increasingly sought after. In this era of rapid technological advancements, being well-informed about platforms such as Cash App is crucial.\n\nAs we progress into the digital age, the significance of keeping abreast of such services becomes more pronounced, emphasizing the necessity of staying updated with the evolving financial trends and options available. Buy verified cash app account.\n\nOffers and advantage to buy cash app accounts cheap?\nWith Cash App, the possibilities are endless, offering numerous advantages in online marketing, cryptocurrency trading, and mobile banking while ensuring high security. As a top creator of Cash App accounts, our team possesses unparalleled expertise in navigating the platform.\n\nWe deliver accounts with maximum security and unwavering loyalty at competitive prices unmatched by other agencies. Rest assured, you can trust our services without hesitation, as we prioritize your peace of mind and satisfaction above all else.\n\nEnhance your business operations effortlessly by utilizing the Cash App e-wallet for seamless payment processing, money transfers, and various other essential tasks. 
Amidst a myriad of transaction platforms in existence today, the Cash App e-wallet stands out as a premier choice, offering users a multitude of functions to streamline their financial activities effectively. Buy verified cash app account.\n\nTrustbizs.com stands by the Cash App’s superiority and recommends acquiring your Cash App accounts from this trusted source to optimize your business potential.\n\nHow Customizable are the Payment Options on Cash App for Businesses?\nDiscover the flexible payment options available to businesses on Cash App, enabling a range of customization features to streamline transactions. Business users have the ability to adjust transaction amounts, incorporate tipping options, and leverage robust reporting tools for enhanced financial management.\n\nExplore trustbizs.com to acquire verified Cash App accounts with LD backup at a competitive price, ensuring a secure and efficient payment solution for your business needs. Buy verified cash app account.\n\nDiscover Cash App, an innovative platform ideal for small business owners and entrepreneurs aiming to simplify their financial operations. With its intuitive interface, Cash App empowers businesses to seamlessly receive payments and effectively oversee their finances. Emphasizing customization, this app accommodates a variety of business requirements and preferences, making it a versatile tool for all.\n\nWhere To Buy Verified Cash App Accounts\nWhen considering purchasing a verified Cash App account, it is imperative to carefully scrutinize the seller’s pricing and payment methods. Look for pricing that aligns with the market value, ensuring transparency and legitimacy. Buy verified cash app account.\n\nEqually important is the need to opt for sellers who provide secure payment channels to safeguard your financial data. Trust your intuition; skepticism towards deals that appear overly advantageous or sellers who raise red flags is warranted. 
It is always wise to prioritize caution and explore alternative avenues if uncertainties arise.\n\nThe Importance Of Verified Cash App Accounts\nIn today’s digital age, the significance of verified Cash App accounts cannot be overstated, as they serve as a cornerstone for secure and trustworthy online transactions.\n\nBy acquiring verified Cash App accounts, users not only establish credibility but also instill the confidence required to participate in financial endeavors with peace of mind, thus solidifying its status as an indispensable asset for individuals navigating the digital marketplace.\n\nWhen considering purchasing a verified Cash App account, it is imperative to carefully scrutinize the seller’s pricing and payment methods. Look for pricing that aligns with the market value, ensuring transparency and legitimacy. Buy verified cash app account.\n\nEqually important is the need to opt for sellers who provide secure payment channels to safeguard your financial data. Trust your intuition; skepticism towards deals that appear overly advantageous or sellers who raise red flags is warranted. It is always wise to prioritize caution and explore alternative avenues if uncertainties arise.\n\nConclusion\nEnhance your online financial transactions with verified Cash App accounts, a secure and convenient option for all individuals. By purchasing these accounts, you can access exclusive features, benefit from higher transaction limits, and enjoy enhanced protection against fraudulent activities. Streamline your financial interactions and experience peace of mind knowing your transactions are secure and efficient with verified Cash App accounts.\n\nChoose a trusted provider when acquiring accounts to guarantee legitimacy and reliability. In an era where Cash App is increasingly favored for financial transactions, possessing a verified account offers users peace of mind and ease in managing their finances. 
Make informed decisions to safeguard your financial assets and streamline your personal transactions effectively.\n\nContact Us / 24 Hours Reply\nTelegram:dmhelpshop\nWhatsApp: +1 (980) 277-2786\nSkype:dmhelpshop\nEmail:dmhelpshop@gmail.com\n\n" | helapik680 |
1,896,141 | Join the biggest AI competition in Brazil! | Langflow is a visual, open-source platform for building AI applications, multi-agent systems,... | 0 | 2024-06-21T15:36:24 | https://dev.to/guiachcar/participe-da-maior-competicao-de-ia-do-brasil-47n4 | braziliandevs, langflow, langchain, ai | Langflow is a visual, open-source platform for building AI applications: multi-agent systems, RAG, automations, and more. It is open source, built in Python, fully customizable, and free. It is the fastest-growing open-source AI project in the world! And it is from Brazil🇧🇷!
To discover talent and foster the open-source community in Brazil, Langflow is running the country's largest AI competition. Build AI apps in minutes! Completely free!
Besides teaching you a new technology, the competition will award prizes to the best projects. Join now: spots are limited and there are thousands in prizes!
[Sign up](https://www.langflow.org/iadevs)
While you're at it, check out the project and give it a star:
[Repo Github - Langflow](https://github.com/langflow-ai/langflow) | guiachcar |
1,896,140 | Amazon S3 | 🚀 Exciting News! 🚀 I am thrilled to announce that I have achieved my AWS certification! 🎉 After... | 0 | 2024-06-21T15:35:13 | https://dev.to/vidhey071/amazon-s3-4o4k | aws | 🚀 Exciting News! 🚀
I am thrilled to announce that I have achieved my AWS certification! 🎉 After months of hard work and dedication, I am now certified with this certificate. This accomplishment signifies my commitment to mastering AWS services and best practices, enhancing my skills in cloud computing and infrastructure management.
I am grateful for the support of my colleagues, mentors, and the invaluable resources provided by AWS. This journey has been incredibly rewarding, and I look forward to applying my knowledge to deliver innovative solutions and contribute effectively to our projects.
Thank you all for your encouragement and belief in my abilities. Let's continue to strive for excellence together! | vidhey071 |
1,896,028 | Easy way to change Ruby version in Mac, M1, M2 and M3 | I had some problems updating and installing Ruby on my Mac, which made it difficult to configure... | 0 | 2024-06-21T15:26:20 | https://dev.to/luizgadao/easy-way-to-change-ruby-version-in-mac-m1-m2-and-m3-16hl | ios, ruby, kmp, cocoapods | > I had some problems updating and installing Ruby on my Mac, which made it difficult to configure essential tools like CocoaPods for my daily work. Here's a guide to help you change your Ruby version easily. **_I spent 90 minutes figuring this out, but with this guide, you should be able to set up your environment in just 5 minutes_**.
#### Step 1: Install Homebrew
If you don't have Homebrew installed on your Mac, paste this line into your terminal to install it:
```
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"
```
#### Step 2: Install rbenv
rbenv is a simple tool to manage Ruby versions, similar to SDKMAN for Java. To install rbenv, use the following command:
```
brew install rbenv ruby-build
```
#### Step 3: Initialize rbenv
To start rbenv, run:
```
rbenv init
```
You may need to restart your shell or open a new terminal window, but for me it wasn't necessary.
#### Step 4: Check Your Current Ruby Version
To check your current Ruby version, use:
```
ruby -v
```
Example output:
```
ruby 2.6.10p210 (2022-04-12 revision 67958) [universal.arm64e-darwin23]
```
#### Step 5: Install a Newer Ruby Version
To see the available Ruby versions, run:
```
rbenv install -l
```
Example output:
```
3.1.6
3.2.4
3.3.3
jruby-9.4.7.0
mruby-3.3.0
picoruby-3.0.0
```
To install and set up a newer version, for example, version 3.3.3, use:
```
rbenv install 3.3.3
```
After the Ruby version has been downloaded and installed, set it as the global default:
```
rbenv global 3.3.3
```
#### Step 6: Verify Your Ruby Version
Now, check your Ruby version again:
```
ruby -v
ruby 2.6.10p210 (2022-04-12 revision 67958) [universal.arm64e-darwin23]
```
If it still shows the old version, don't worry. Here's a tip:
#### Step 7: Update Your .zprofile
Open your **.zprofile** file and add the following lines:
```
export PATH="$HOME/.rbenv/bin:$PATH"
eval "$(rbenv init - zsh)"
```
Save the file. To apply the changes, either open a new terminal window or reload the profile with:
```
source ~/.zprofile
```
Check your Ruby version again: run the command below and you will see the current version of Ruby on your Mac.
```
ruby -v
ruby 3.3.3 (2024-06-12 revision f1c7b6f435) [arm64-darwin23]
```
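Once the switch has taken effect, you can also sanity-check it from inside Ruby itself. A small illustrative snippet (the 3.0.0 floor is just an example threshold, not a requirement of rbenv):

```ruby
# Ask the running interpreter which Ruby it is, and compare versions the
# way RubyGems does (plain string comparison would get "3.10" vs "3.9" wrong).
def ruby_at_least?(required)
  Gem::Version.new(RUBY_VERSION) >= Gem::Version.new(required)
end

puts "Running Ruby #{RUBY_VERSION}"
puts ruby_at_least?("3.0.0") ? "rbenv switch worked" : "still on an old Ruby"
```

Save it as a file and run it with `ruby` from any directory; rbenv's shims ensure the version you selected is the one that answers.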
Now your Mac is ready to install CocoaPods or any other tools you need for iOS development.
🇧🇷🧑🏻💻 | luizgadao |
1,896,138 | AWS Technical Essentials | 🚀 Exciting News! 🚀 I'm thrilled to announce that I've achieved AWS certification! 🎉 After months of... | 0 | 2024-06-21T15:34:27 | https://dev.to/vidhey071/aws-technical-essentials-49jd | aws | 🚀 Exciting News! 🚀
I'm thrilled to announce that I've achieved AWS certification! 🎉
After months of dedicated learning and hard work, I am now officially certified with this certificate. This journey has been incredibly rewarding, and I'm looking forward to leveraging this knowledge to drive innovation and efficiency in cloud computing.
A huge thank you to everyone who supported me along the way. Your encouragement and guidance meant the world to me.
Let's continue to push boundaries and explore new possibilities with AWS! 💡 | vidhey071 |
1,896,137 | Clustering vs Partitioning your Apache Iceberg Tables | Apache Iceberg 101 Get Hands-on With Apache Iceberg Free PDF Copy of Apache Iceberg: The Definitive... | 0 | 2024-06-21T15:33:17 | https://dev.to/alexmercedcoder/clustering-vs-partitioning-your-apache-iceberg-tables-1m34 | database, datascience, dataengineering, data | - [Apache Iceberg 101](https://www.dremio.com/blog/apache-iceberg-101-your-guide-to-learning-apache-iceberg-concepts-and-practices/)
- [Get Hands-on With Apache Iceberg](https://bit.ly/am-dremio-lakehouse-laptop)
- [Free PDF Copy of Apache Iceberg: The Definitive Guide](https://bit.ly/am-iceberg-book)
Maintaining your data lake tables efficiently is paramount. Techniques such as compaction, partitioning, and clustering are crucial for ensuring that your data remains organized, accessible, and performant. As data volumes grow, the need for less data movement to get the data into a consumable form drives the demand for turning [data lakes into data warehouses called data lakehouses](https://www.dremio.com/blog/why-lakehouse-why-now-what-is-a-data-lakehouse-and-how-to-get-started/).
The data lakehouse combines the best of data lakes and data warehouses, providing a [unified platform](https://www.dremio.com/blog/the-unified-apache-iceberg-lakehouse-unified-analytics/) that supports both large-scale data processing and high-performance analytics. Within this architecture, [Apache Iceberg](https://www.dremio.com/blog/apache-iceberg-101-your-guide-to-learning-apache-iceberg-concepts-and-practices/) stands out as a powerful table format that offers advanced features for managing big data. However, to leverage Iceberg's full potential, understanding the nuances of partitioning and clustering your tables is essential.
We will delve into the pros and cons of partitioning versus clustering in Apache Iceberg. We'll explore the scenarios where one technique might be more advantageous over the other, helping you make informed decisions to optimize your data storage and query performance.
## Understanding Partitioning and Clustering
### What is Partitioning?
[Partitioning is a technique](https://www.youtube.com/watch?v=20HLpaFi_TI&pp=ygUYQWxleCBNZXJjZWQgUGFydGl0aW9uaW5n) used to divide a large dataset into smaller, more manageable pieces based on specific columns. In Apache Iceberg, partitioning can significantly improve query performance by reducing the amount of data scanned during query execution. When a table is partitioned, Iceberg creates separate data files for each partition, enabling faster access to the relevant data. Common partitioning strategies include dividing data by date, region, or any other logical division that aligns with your query patterns.
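Partition pruning can be illustrated with a small sketch. This is plain Python, not Iceberg's actual implementation (Iceberg tracks partitions through file-level metadata), and the column names are invented for illustration: rows are grouped into per-partition "files", and a query filtering on the partition column only opens the matching groups.

```python
from collections import defaultdict

# Toy dataset: each row is (event_date, region, amount).
rows = [
    ("2024-01-03", "us", 10),
    ("2024-01-15", "eu", 20),
    ("2024-02-07", "us", 30),
    ("2024-02-21", "eu", 40),
]

# "Write" the table partitioned by month: one file (list) per partition value.
partitions = defaultdict(list)
for r in rows:
    month = r[0][:7]  # partition transform: date -> month
    partitions[month].append(r)

def query(month_filter):
    """Scan only the partition matching the filter; all others are pruned."""
    return list(partitions.get(month_filter, []))

print(query("2024-02"))  # touches 1 of 2 partition "files"
```

A query for `"2024-02"` never reads the January rows at all, which is the whole point: the cost of the query scales with the size of the matching partitions, not the table.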
### What is Clustering?
Clustering, on the other hand, involves organizing the data within a table based on one or more columns but without creating separate physical partitions. Instead, clustering arranges the data in a way that maximizes data locality, making it more efficient to retrieve related rows. Clustering can be particularly useful for improving the performance of range queries and sorting operations. Unlike partitioning, clustering does not create separate data files but optimizes the storage layout within the existing files.
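The data-locality benefit of clustering can be sketched the same way (again a toy model, not Iceberg internals): within a single file, rows sorted on the cluster key let a range query read one contiguous slice instead of the whole file.

```python
import bisect

# One data file whose rows are clustered (sorted) on user_id.
rows = sorted(
    [(17, "click"), (3, "view"), (42, "buy"), (8, "view"), (25, "click")]
)
keys = [uid for uid, _ in rows]  # sorted user_ids: [3, 8, 17, 25, 42]

def range_scan(lo, hi):
    """Read only the contiguous slice with lo <= user_id <= hi."""
    start = bisect.bisect_left(keys, lo)
    stop = bisect.bisect_right(keys, hi)
    return rows[start:stop]

print(range_scan(8, 25))  # -> [(8, 'view'), (17, 'click'), (25, 'click')]
```

In a real table format the same idea shows up as per-file min/max statistics on the sort columns: when related rows are adjacent, far more files can be skipped outright.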
### Similarities Between Partitioning and Clustering
Both partitioning and clustering aim to enhance query performance and data management efficiency. They achieve this by improving data locality and minimizing the amount of data scanned during queries. Both techniques require an understanding of your data and query patterns to be effective, as improper use can lead to suboptimal performance.
### Differences Between Partitioning and Clustering
- **Physical vs. Logical Organization**: Partitioning physically separates data into different files, while clustering logically organizes data within the same file.
- **Granularity**: Partitioning works at a coarser granularity, dividing the dataset into large chunks. Clustering operates at a finer granularity, arranging rows within those chunks.
- **Overhead**: Partitioning can lead to increased storage overhead due to the creation of multiple files, whereas clustering generally has lower overhead as it does not increase the number of files.
- **Flexibility**: Clustering is more flexible in terms of adjusting to changes in query patterns, as it does not require repartitioning the dataset.
Understanding these similarities and differences is crucial for selecting the appropriate technique for your specific use case. In the following sections, we'll explore the pros and cons of each approach and provide guidance on when to choose partitioning over clustering and vice versa.
## When to Use Partitioning and Clustering
### When to Use Partitioning
Partitioning is most effective when:
1. **Large Data Volumes**: If you have large datasets, partitioning can significantly reduce the amount of data scanned during queries, improving performance.
2. **Predictable Query Patterns**: When your queries consistently filter data based on specific columns, such as date or region, partitioning these columns can speed up data retrieval.
3. **Data Pruning**: Partitioning helps with data pruning, allowing the query engine to skip entire partitions that do not match the query criteria, leading to faster query execution.
4. **Maintenance Operations**: Partitioning simplifies maintenance tasks such as vacuuming, compaction, and deletion of old data, as these operations can be performed on individual partitions.
#### Problems to Avoid with Partitioning
- **Over-Partitioning**: Creating too many small partitions can lead to inefficient query performance due to excessive metadata management and increased file handling overhead.
- **Imbalanced Partitions**: Unevenly distributed data across partitions can result in some partitions being much larger than others, causing skewed query performance and resource utilization.
### When to Use Clustering
Clustering is advantageous when:
1. **Frequent Range Queries**: If your queries often involve range scans or sorting on specific columns, clustering can optimize data layout to improve retrieval times.
2. **Evolving Query Patterns**: Clustering is more adaptable to changes in query patterns since it doesn't require repartitioning the data.
3. **Reducing Data Skew**: By organizing data within files, clustering can help mitigate data skew and ensure more uniform query performance.
4. **Lower Storage Overhead**: Clustering does not create additional files, which can help manage storage costs compared to partitioning.
#### Problems to Avoid with Clustering
- **Poorly Chosen Clustering Columns**: Selecting the wrong columns for clustering can result in minimal performance improvements. It’s crucial to choose columns that align with your most common query patterns.
- **High Write Overhead**: Frequent updates and inserts can lead to higher write overhead, as clustering requires maintaining the data order within files.
- **Complexity in Maintenance**: While clustering is flexible, maintaining the clustered data layout can be complex and may require periodic re-clustering to optimize performance.
### Choosing Between Partitioning and Clustering
1. **Query Workload**: Analyze your query workload to determine if it benefits more from partitioning or clustering. If queries often filter by specific columns, partitioning might be better. If queries involve range scans or sorting, clustering could be more beneficial.
2. **Data Size and Growth**: Consider the size of your dataset and its growth rate. For large, growing datasets, partitioning can help manage and access data more efficiently.
3. **Storage Costs**: Assess the impact on storage costs. Partitioning can lead to increased storage due to multiple files, while clustering generally has lower storage overhead.
4. **Maintenance Efforts**: Evaluate the maintenance efforts required for each approach. Partitioning can simplify some maintenance tasks but may complicate others if over-partitioned. Clustering can be more adaptable but may require regular re-clustering to maintain performance.
By carefully considering these factors, you can make informed decisions on whether to partition or cluster your Apache Iceberg tables to achieve optimal performance and efficiency.
## Combining Partitioning and Clustering
Partitioning and clustering are not mutually exclusive; in fact, using them together can leverage the strengths of both techniques to optimize your data lakehouse performance further. Here’s how they can be combined effectively:
### Benefits of Combining Partitioning and Clustering
1. **Enhanced Query Performance**: By partitioning data on one set of columns and clustering on another, you can optimize for different types of queries, reducing the data scanned and improving retrieval times.
2. **Improved Data Locality**: Combining these techniques ensures that related data is stored together, both within partitions and within files, enhancing data locality and access speed.
3. **Balanced Workload Distribution**: Partitioning can help distribute data across different files or nodes, while clustering ensures efficient data retrieval within those partitions, leading to balanced workload distribution and better resource utilization.
4. **Scalable Data Management**: This combination allows for scalable data management, making it easier to handle large datasets by segmenting them into manageable chunks while maintaining efficient data layout within each chunk.
### Example Use Case
Consider a large e-commerce dataset with transactions spanning multiple years and regions. Here’s how you can combine partitioning and clustering:
1. **Partitioning by Date**: Partition the dataset by transaction date (e.g., year, month). This approach allows queries filtering by date range to scan only the relevant partitions, significantly reducing the data scanned.
2. **Clustering by Product Category and Region**: Within each date partition, cluster the data by product category and region. This layout optimizes queries that filter or sort by these columns, ensuring efficient data retrieval and improved performance.
### Implementation Steps
1. **Define Partition Strategy**: Identify the columns that align with your common filtering criteria and create partitions based on these columns. For instance, use date columns for time-based partitions.
2. **Define Clustering Strategy**: Within each partition, choose clustering columns that align with your sorting and range query patterns. For example, product category and region for clustering within date partitions.
3. **Apply Partitioning and Clustering**: Implement the partitioning and clustering strategies in Apache Iceberg. Ensure that your data ingestion and transformation processes respect these strategies to maintain the optimized data layout.
4. **Monitor and Adjust**: Regularly monitor query performance and data growth. Adjust partitioning and clustering strategies as needed to adapt to changing query patterns and data volumes.
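The e-commerce layout described above can be sketched end to end. This is a conceptual illustration in plain Python with invented column names, not Iceberg DDL: partition by month first, then sort each partition's rows on the clustering columns.

```python
from collections import defaultdict

# Toy transactions: (date, category, region, amount).
txns = [
    ("2024-01-05", "books", "eu", 12),
    ("2024-01-09", "toys",  "us", 30),
    ("2024-01-20", "books", "us", 18),
    ("2024-02-02", "toys",  "eu", 25),
]

# Step 1: partition by month. Step 2: cluster (sort) within each partition.
table = defaultdict(list)
for t in txns:
    table[t[0][:7]].append(t)
for part in table.values():
    part.sort(key=lambda t: (t[1], t[2]))  # clustering columns

# A month filter prunes whole partitions; inside the surviving partition,
# the sorted layout keeps each (category, region) group contiguous.
jan = table["2024-01"]
print([t[1:3] for t in jan])
```

The January partition comes back ordered `('books', 'eu'), ('books', 'us'), ('toys', 'us')`, so a follow-up filter on category or region scans a contiguous run of rows rather than the whole partition.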
### Potential Challenges
1. **Increased Complexity**: Combining partitioning and clustering increases the complexity of your data management strategy. Ensure that your team understands the implications and can maintain the data layout efficiently.
2. **Maintenance Overhead**: Both techniques require ongoing maintenance. Partitioning may need periodic reorganization, while clustering may require regular re-clustering to maintain performance. Plan for these maintenance tasks in your data operations workflow.
3. **Balancing Act**: Striking the right balance between partitioning and clustering is crucial. Over-partitioning can lead to too many small files, while excessive clustering can increase write overhead. Carefully analyze your data and queries to find the optimal balance.
By thoughtfully combining partitioning and clustering, you can achieve a highly efficient and performant data lakehouse architecture, tailored to meet the specific needs of your workload.
## Simplifying Optimization with Dremio Data Reflections
Optimizing your data lakehouse tables for various query patterns can be complex, especially when balancing the benefits of partitioning and clustering. [Dremio simplifies](https://www.dremio.com/reflections/) this process through its unique feature called [Data Reflections](https://www.dremio.com/blog/the-who-what-and-why-of-data-reflections-and-apache-iceberg-for-query-acceleration/), which allows you to create optimized representations of your datasets without the need to manually maintain multiple versions.
### What are Data Reflections?
Data Reflections in Dremio are pre-computed, Apache Iceberg-based materialized views that can be customized with specific partitioning, sorting, and aggregation rules. They are designed to accelerate query performance: the Dremio engine automatically substitutes an optimized reflection whenever it determines that doing so will speed up a query. This feature enables you to target multiple query types simultaneously without the overhead of maintaining several versions of your dataset.
### Benefits of Using Data Reflections
1. **Automatic Optimization**: Data Reflections allow Dremio to automatically choose the best representation of your data to optimize query performance, eliminating the need for manual tuning.
2. **Custom Partitioning and Sorting**: You can define custom partitioning and sorting rules for each Data Reflection, tailored to different query patterns. This flexibility ensures that your data is always optimally organized for fast retrieval.
3. **Multiple Query Patterns**: By creating different Data Reflections for various query types, you can support a wide range of queries efficiently. Dremio’s engine will select the most appropriate reflection for each query, providing consistent performance improvements.
4. **Simplified Maintenance**: Maintaining multiple versions of the same dataset manually can be cumbersome and error-prone. Data Reflections automate this process, reducing maintenance overhead and simplifying data management. Reflections can also reduce the storage footprint, since you choose which columns each reflection includes.
### How Dremio Data Reflections Work
1. **Create Data Reflections**: Define Data Reflections with specific partitioning, sorting, and aggregation rules based on your most common query patterns. For instance, you can create one reflection optimized for date-based queries and another for category-based queries.
2. **Query Execution**: When a query is executed, Dremio’s query optimizer evaluates the available Data Reflections and determines the best one to use. This substitution happens seamlessly, without any need for user intervention.
3. **Performance Gains**: By leveraging Data Reflections, you can achieve significant performance gains across a variety of queries. The reflections are pre-computed and stored, allowing for rapid query execution and reduced response times.
4. **Ongoing Management**: Dremio automatically manages the Data Reflections, updating them as the underlying data changes. This ensures that your reflections are always current and optimized for performance.
### Example Use Case
Consider a scenario where your dataset includes transaction data that is frequently queried by both date and product category. With Dremio, you can create two Data Reflections:
1. **Date-Partitioned Reflection**: Optimized for queries filtering by transaction date.
2. **Category-Sorted Reflection**: Optimized for queries sorting or filtering by product category.
When a user executes a date-based query, Dremio automatically uses the date-partitioned reflection. For category-based queries, it switches to the category-sorted reflection. This dynamic optimization ensures that all queries are executed efficiently without manual intervention.
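The substitution idea can be sketched as follows. To be clear, this is purely conceptual: Dremio's real optimizer is cost-based and far more sophisticated, and reflections are configured through its UI/SQL rather than anything resembling this hypothetical API. The sketch only shows the shape of the decision, matching a query's filter columns against each precomputed layout.

```python
# Hypothetical reflection catalog: name -> columns its layout is
# partitioned/sorted on. Names and columns are illustrative only.
reflections = {
    "date_partitioned": {"txn_date"},
    "category_sorted": {"category", "region"},
}

def pick_reflection(filter_columns):
    """Choose a reflection whose layout covers the query's filter columns."""
    for name, layout_cols in reflections.items():
        if filter_columns <= layout_cols:  # subset test
            return name
    return "base_table"  # fall back to scanning the raw dataset

print(pick_reflection({"txn_date"}))   # -> date_partitioned
print(pick_reflection({"category"}))   # -> category_sorted
print(pick_reflection({"amount"}))     # -> base_table
```

The user-facing property this models is the one described above: the query is written once against the base dataset, and the best available physical layout is chosen transparently.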
## Conclusion
Effectively managing and optimizing your data lakehouse tables is crucial for achieving high performance and efficient data retrieval. Both partitioning and clustering offer powerful techniques to enhance query performance, each with its own strengths and ideal use cases. By understanding when to use partitioning and clustering, and how they can be combined, you can make informed decisions to optimize your data layout.
Dremio's Data Reflections take this optimization a step further by automating the process and allowing for custom partitioning and sorting rules tailored to different query patterns. This capability ensures that your queries are always executed using the most efficient data representation, without the need for manual maintenance of multiple dataset versions.
By leveraging these techniques and tools, you can build a highly performant and scalable data lakehouse architecture that meets the demands of diverse and evolving workloads. Whether you are dealing with large-scale data processing, complex analytical queries, or dynamic data environments, a well-optimized data lakehouse can provide the foundation for faster insights and better decision-making.
##### GET HANDS-ON
Below is a list of exercises to help you get hands-on with Apache Iceberg and see all of this in action yourself!
- [Intro to Apache Iceberg, Nessie and Dremio on your Laptop](https://bit.ly/am-dremio-lakehouse-laptop)
- [JSON/CSV/Parquet to Apache Iceberg to BI Dashboard](https://bit.ly/am-json-csv-parquet-dremio)
- [From MongoDB to Apache Iceberg to BI Dashboard](https://bit.ly/am-mongodb-dashboard)
- [From SQLServer to Apache Iceberg to BI Dashboard](https://bit.ly/am-sqlserver-dashboard)
- [From Postgres to Apache Iceberg to BI Dashboard](https://bit.ly/am-postgres-to-dashboard)
- [Mongo/Postgres to Apache Iceberg to BI Dashboard using Git for Data and DBT](https://bit.ly/dremio-experience)
- [Elasticsearch to Apache Iceberg to BI Dashboard](https://bit.ly/am-dremio-elastic)
- [MySQL to Apache Iceberg to BI Dashboard](https://bit.ly/am-dremio-mysql-dashboard)
- [Apache Kafka to Apache Iceberg to Dremio](https://bit.ly/am-kafka-connect-dremio)
- [Apache Iceberg Lakehouse Engineering Video Playlist](https://bit.ly/am-iceberg-lakehouse-engineering)
- [Apache Druid to Apache Iceberg to BI Dashboard](https://bit.ly/am-druid-dremio)
- [Postgres to Apache Iceberg to Dashboard with Spark & Dremio](https://bit.ly/end-to-end-de-tutorial)
| alexmercedcoder |
1,896,136 | Amazon EBS | 🚀 Exciting News! 🚀 I am thrilled to announce that I have achieved my AWS certification! 🎉 After... | 0 | 2024-06-21T15:28:51 | https://dev.to/vidhey071/amazon-ebs-1kh2 | aws | 🚀 Exciting News! 🚀
I am thrilled to announce that I have achieved my AWS certification! 🎉 After months of hard work and dedication, I am now certified with this certificate. This accomplishment signifies my commitment to mastering AWS services and best practices, enhancing my skills in cloud computing and infrastructure management.
I am grateful for the support of my colleagues, mentors, and the invaluable resources provided by AWS. This journey has been incredibly rewarding, and I look forward to applying my knowledge to deliver innovative solutions and contribute effectively to our projects.
Thank you all for your encouragement and belief in my abilities. Let's continue to strive for excellence together! | vidhey071 |
1,896,135 | AWS Storage Gateway Deep Dive | 🚀 Exciting News! 🚀 I'm thrilled to announce that I've achieved AWS certification! 🎉 After months of... | 0 | 2024-06-21T15:28:06 | https://dev.to/vidhey071/aws-storage-gateway-deep-dive-59n2 | aws | 🚀 Exciting News! 🚀
I'm thrilled to announce that I've achieved AWS certification! 🎉
After months of dedicated learning and hard work, I am now officially certified with this certificate. This journey has been incredibly rewarding, and I'm looking forward to leveraging this knowledge to drive innovation and efficiency in cloud computing.
A huge thank you to everyone who supported me along the way. Your encouragement and guidance meant the world to me.
Let's continue to push boundaries and explore new possibilities with AWS! 💡 | vidhey071 |
1,896,134 | AWS Networking Basics | 🚀 Exciting News! 🚀 I am thrilled to announce that I have achieved my AWS certification! 🎉 After... | 0 | 2024-06-21T15:23:49 | https://dev.to/vidhey071/aws-networking-basics-4376 | aws | 🚀 Exciting News! 🚀
I am thrilled to announce that I have achieved my AWS certification! 🎉 After months of hard work and dedication, I am now certified with this certificate. This accomplishment signifies my commitment to mastering AWS services and best practices, enhancing my skills in cloud computing and infrastructure management.
I am grateful for the support of my colleagues, mentors, and the invaluable resources provided by AWS. This journey has been incredibly rewarding, and I look forward to applying my knowledge to deliver innovative solutions and contribute effectively to our projects.
Thank you all for your encouragement and belief in my abilities. Let's continue to strive for excellence together! | vidhey071 |
1,896,133 | Amazon QuickSight | 🚀 Exciting News! 🚀 I'm thrilled to announce that I've achieved AWS certification! 🎉 After months of... | 0 | 2024-06-21T15:22:41 | https://dev.to/vidhey071/amazon-quicksight-14m9 | aws | 🚀 Exciting News! 🚀
I'm thrilled to announce that I've achieved AWS certification! 🎉
After months of dedicated learning and hard work, I am now officially certified with this certificate. This journey has been incredibly rewarding, and I'm looking forward to leveraging this knowledge to drive innovation and efficiency in cloud computing.
A huge thank you to everyone who supported me along the way. Your encouragement and guidance meant the world to me.
Let's continue to push boundaries and explore new possibilities with AWS! 💡 | vidhey071 |
1,896,124 | TOP CRYPTOCURRENCY AND USDT RECOVER COMPANY // FOLKWIN EXPERT RECOVERY. | \ My decision to leave a stable job at Atlanta's airport and venture into day trading four years ago... | 0 | 2024-06-21T15:19:33 | https://dev.to/virginia_maehandy_c6512c/top-cryptocurrency-and-usdt-recover-company-folkwin-expert-recovery-4f9l |
My decision to leave a stable job at Atlanta's airport and venture into day trading four years ago seemed like a leap toward financial independence. Guided by a friend's successful mentorship in Bitcoin trading, I initially invested $17,000, witnessing my capital grow impressively to over $53,000 in just four months. Encouraged by this initial success, I reinvested substantially, totaling $190,000 over the next two years. However, the allure of better rates led me to switch brokers—a decision that would unravel my financial security. The new broker, promising even more lucrative returns, turned out to be a sophisticated scam. When I attempted to withdraw my profits, communication dwindled, excuses piled up, and eventually, the trading platform vanished altogether. I was left stranded, watching helplessly as my hard-earned investments disappeared into thin air. In a state of disbelief and desperation, I turned to fellow traders for advice. It was through this network that I discovered [Folkwin Expert Recovery], a firm specializing in recovering funds lost to online scams. Despite my skepticism, I reached out to them, driven by a dwindling hope for a resolution to my financial nightmare. From the outset,[Folkwin Expert Recovery] demonstrated a level of professionalism and empathy that immediately set them apart from previous encounters. They listened intently to my story, acknowledging the gravity of my situation while outlining a clear strategy for recovery. Their transparency and commitment to my case instilled a renewed sense of hope and confidence that had been shattered by deceit. [Folkwin Expert Recovery's] approach was meticulous and methodical. Utilizing advanced techniques and industry expertise, they embarked on a determined pursuit of justice against the fraudulent brokers. Throughout the process, they kept me informed with regular updates, patiently guiding me through each step and offering reassurance during moments of uncertainty. 
The turning point came when [Folkwin Expert Recovery] successfully recovered a substantial portion of my lost funds. The relief and gratitude I felt were overwhelming—it was not just about the money, but about reclaiming a sense of control and justice. Their swift and effective action transformed what seemed like an insurmountable setback into a story of resilience and triumph.What truly impressed me about [Folkwin Expert Recovery] was their unwavering dedication to their clients' well-being. They didn't just treat me as a case number; they understood the emotional toll of financial fraud and provided genuine support throughout the recovery process. Their commitment to transparency and ethical practices restored my faith in the possibility of reclaiming what was rightfully mine. with [Folkwin Expert Recovery] serves as a powerful reminder of the risks associated with online investments and the importance of due diligence. Trust must be earned cautiously, and safeguards must be in place to protect one's financial interests. If you find yourself entangled in a similar web of deception, I wholeheartedly recommend [Folkwin Expert Recovery]. They are not just experts in recovering stolen funds; they are compassionate advocates dedicated to helping individuals navigate the complexities of financial fraud with integrity and resilience. [Folkwin Expert Recovery] was more than a recovery of lost funds — it was a journey of empowerment and restoration. They turned a devastating setback into an opportunity for renewal, proving that with the right support and determination, justice can prevail. With [Folkwin Expert Recovery] by your side, reclaiming your financial future is not only possible but achievable with trust, perseverance, and expert guidance. To get in touch with them, contact them with the details here.......
/Website: WWW.FOLKWINEXPERTRECOVERY.COM
/Email: FOLKWINEXPERTRECOVERY @ TECH-CENTER (.) COM
/Whatsapp: +1 (740)705-0711
Best Regards,
Virginia Mae Handy. | virginia_maehandy_c6512c | |
1,896,123 | AWS Certification Quiz 6 | 🚀 Exciting News! 🚀 I am thrilled to announce that I have achieved my AWS certification! 🎉 After... | 0 | 2024-06-21T15:17:35 | https://dev.to/vidhey071/aws-certification-quiz-6-3a0m | aws | 🚀 Exciting News! 🚀
I am thrilled to announce that I have achieved my AWS certification! 🎉 After months of hard work and dedication, I am now certified with this certificate. This accomplishment signifies my commitment to mastering AWS services and best practices, enhancing my skills in cloud computing and infrastructure management.
I am grateful for the support of my colleagues, mentors, and the invaluable resources provided by AWS. This journey has been incredibly rewarding, and I look forward to applying my knowledge to deliver innovative solutions and contribute effectively to our projects.
Thank you all for your encouragement and belief in my abilities. Let's continue to strive for excellence together! | vidhey071 |
1,896,113 | BITCOIN RECOVERY IS VERY MUCH REAL | I was actually fooled and scammed over ( $753,000 ) by someone I trusted with my funds through a... | 0 | 2024-06-21T15:03:06 | https://dev.to/margaret_severson_218c369/bitcoin-recovery-is-very-much-real-4joo | I was actually fooled and scammed over ( $753,000 ) by someone I trusted with my funds through a transaction we did and I feel so disappointed and hurt knowing that someone can steal from you without remorse after trusting them, so I started searching for help legally to recover my stolen funds and came across a lot of Testimonials about LION CYBER Recovery Expert who helps in recovery lost funds, which I can tell has helped so many people who had contacted them regarding such issues and without a questionable doubt their funds was returned back to their wallet in a very short space of time, it took the expert 21 hours to help me recover my funds and the best part of it all was that the scammers was actually located and arrested by local authorities in his region which was very relieving. Hope this helps as many people who have lost their hard earn money to scammers out of trust, you can reach him through the link below for help to recover your scammed funds and thank me later.
Email Address: lioncyberrr@gmail.com
WhatsApp: +1 (929) 660-4485 | margaret_severson_218c369 | |
1,896,122 | AWS Certification Quiz 5 | 🚀 Exciting News! 🚀 I'm thrilled to announce that I've achieved AWS certification! 🎉 After months of... | 0 | 2024-06-21T15:16:29 | https://dev.to/vidhey071/aws-certification-quiz-5-35a0 | aws | 🚀 Exciting News! 🚀
I'm thrilled to announce that I've achieved AWS certification! 🎉
After months of dedicated learning and hard work, I am now officially certified with this certificate. This journey has been incredibly rewarding, and I'm looking forward to leveraging this knowledge to drive innovation and efficiency in cloud computing.
A huge thank you to everyone who supported me along the way. Your encouragement and guidance meant the world to me.
Let's continue to push boundaries and explore new possibilities with AWS! 💡 | vidhey071 |