id int64 5 1.93M | title stringlengths 0 128 | description stringlengths 0 25.5k | collection_id int64 0 28.1k | published_timestamp timestamp[s] | canonical_url stringlengths 14 581 | tag_list stringlengths 0 120 | body_markdown stringlengths 0 716k | user_username stringlengths 2 30 |
|---|---|---|---|---|---|---|---|---|
1,906,747 | How to create or write HTML code as example in Visual Studio Code(VS Code) | Method of writing HTML code in VS Code Writing HTML code in Visual Studio Code (VS Code)... | 0 | 2024-06-30T15:43:02 | https://dev.to/wasifali/how-to-create-or-write-html-code-as-example-in-visual-studio-codevs-code-2hpb | webdev, html, coding, learning | ## **Method of writing HTML code in VS Code**
Writing HTML code in Visual Studio Code (VS Code) is straightforward and follows these steps:
**Open VS Code**: Launch the Visual Studio Code editor on your computer.

## **Create a new folder**
After opening the VS code, set up the new folder.

**Create a New File**: Either open an existing HTML file or create a new file by selecting File > New File from the menu, or using the shortcut Ctrl + N (Windows/Linux) or Cmd + N (Mac).

**Set File Type**: If you created a new file, save it with an .html extension. This helps VS Code recognize it as an HTML file and provides syntax highlighting and other helpful features.

**Start Writing HTML**: Begin writing your HTML code. VS Code provides autocomplete suggestions and syntax highlighting to assist you as you type.

**Code Formatting**: VS Code can automatically format your HTML code for consistency. Use Shift + Alt + F (Windows/Linux) or Shift + Option + F (Mac) to format the entire document or a selected portion.

**Preview in Browser**: You can install extensions like "Live Server" or simply open the HTML file in a web browser to preview your webpage.

**Save Your Work**: Remember to save your changes periodically (Ctrl + S or Cmd + S) to keep your HTML file up to date.

## **Running of Code with Live Server**
After writing the coding, the written code is run with Live Server

| wasifali |
1,906,746 | Step-by-Step Guide to Using the Hashnode API for Developers | Welcome to this comprehensive guide on using the Hashnode API! Whether you're a seasoned developer or... | 0 | 2024-06-30T15:41:23 | https://raajaryan.tech/step-by-step-guide-to-using-the-hashnode-api-for-developers | api, beginners, tutorial, react |
Welcome to this comprehensive guide on using the Hashnode API! Whether you're a seasoned developer or just starting out, this post will equip you with everything you need to know about interacting with Hashnode's powerful API. By the end of this guide, you'll be able to fetch articles, publish content, and integrate these capabilities seamlessly into your MERN stack projects. Let's dive in!
## Table of Contents
1. [Introduction to Hashnode API](#introduction-to-hashnode-api)
2. [Getting Started](#getting-started)
- [Creating a Hashnode Account](#creating-a-hashnode-account)
- [Generating API Key](#generating-api-key)
3. [Understanding GraphQL](#understanding-graphql)
- [GraphQL Basics](#graphql-basics)
- [Query vs. Mutation](#query-vs-mutation)
4. [Fetching Articles](#fetching-articles)
- [GraphQL Query for Fetching Articles](#graphql-query-for-fetching-articles)
- [Making Requests with cURL](#making-requests-with-curl)
- [Using Axios in JavaScript](#using-axios-in-javascript)
5. [Publishing Articles](#publishing-articles)
- [GraphQL Mutation for Publishing Articles](#graphql-mutation-for-publishing-articles)
- [Making Requests with cURL](#making-requests-with-curl-publish)
- [Using Axios in JavaScript](#using-axios-in-javascript-publish)
6. [Handling Responses](#handling-responses)
- [Interpreting JSON Responses](#interpreting-json-responses)
- [Error Handling](#error-handling)
7. [Integrating with MERN Stack](#integrating-with-mern-stack)
- [Frontend Integration](#frontend-integration)
- [Backend Integration](#backend-integration)
8. [Practical Applications](#practical-applications)
- [Building a Blog Dashboard](#building-a-blog-dashboard)
- [Automating Content Publishing](#automating-content-publishing)
9. [Advanced Tips and Tricks](#advanced-tips-and-tricks)
- [Optimizing GraphQL Queries](#optimizing-graphql-queries)
- [Using Environment Variables for API Keys](#using-environment-variables-for-api-keys)
10. [Conclusion](#conclusion)
## Introduction to Hashnode API
Hashnode is a popular blogging platform that allows developers to share their knowledge and grow their personal brand. The Hashnode API provides a way to interact programmatically with the platform, enabling you to manage your content, fetch articles, and more. This guide will walk you through the process of using the Hashnode API effectively.
## Getting Started
### Creating a Hashnode Account
Before you can use the Hashnode API, you need to have a Hashnode account. If you don't already have one, follow these steps:
1. **Visit Hashnode:** Go to [Hashnode](https://hashnode.com/).
2. **Sign Up:** Click on the "Sign Up" button and follow the instructions to create your account.
### Generating API Key
Once you have a Hashnode account, you need to generate an API key to authenticate your requests:
1. **Log In:** Sign in to your Hashnode account.
2. **Profile Settings:** Navigate to your profile settings.
3. **Generate API Key:** Look for the API key section and generate a new API key. Make sure to store this key securely as it will be used to authenticate your API requests.
## Understanding GraphQL
The Hashnode API is a GraphQL API, which means it uses GraphQL queries and mutations to interact with the data. Let's start with the basics of GraphQL.
### GraphQL Basics
GraphQL is a query language for APIs that allows clients to request only the data they need. It provides a more efficient and flexible alternative to REST.
- **Query:** A read-only operation to fetch data.
- **Mutation:** A write operation to modify data.
### Query vs. Mutation
In GraphQL, queries are used to fetch data, while mutations are used to modify data. Understanding the difference between these two is crucial for interacting with the Hashnode API.
- **Query Example:** Fetching a list of articles.
- **Mutation Example:** Publishing a new article.
## Fetching Articles
Let's start by fetching articles from Hashnode using a GraphQL query.
### GraphQL Query for Fetching Articles
Here's a basic GraphQL query to fetch articles from a user's publication:
```graphql
{
user(username: "YOUR_HASHNODE_USERNAME") {
publication {
posts {
title
brief
coverImage
slug
dateAdded
author {
name
photo
}
}
}
}
}
```
### Making Requests with cURL
You can make a request to the Hashnode API using cURL. Here's an example:
```sh
curl -X POST https://api.hashnode.com \
-H "Content-Type: application/json" \
-H "Authorization: YOUR_API_KEY" \
-d '{
"query": "{ user(username: \"YOUR_HASHNODE_USERNAME\") { publication { posts { title brief coverImage slug dateAdded author { name photo } } } } }"
}'
```
### Using Axios in JavaScript
For JavaScript developers, Axios is a popular library for making HTTP requests. Here's how you can use it to fetch articles:
```javascript
import axios from 'axios';
const fetchArticles = async () => {
const query = `
{
user(username: "YOUR_HASHNODE_USERNAME") {
publication {
posts {
title
brief
coverImage
slug
dateAdded
author {
name
photo
}
}
}
}
}
`;
const response = await axios.post('https://api.hashnode.com', {
query: query,
}, {
headers: {
'Content-Type': 'application/json',
'Authorization': 'YOUR_API_KEY',
},
});
console.log(response.data);
};
fetchArticles();
```
## Publishing Articles
Now that we've covered fetching articles, let's move on to publishing articles using a GraphQL mutation.
### GraphQL Mutation for Publishing Articles
Here's an example mutation to publish a new article:
```graphql
mutation {
createStory(
input: {
title: "My New Blog Post"
contentMarkdown: "This is the content of my new blog post."
coverImageURL: "https://example.com/cover-image.jpg"
tags: [{ _id: "5674470281e3abac9b3e0043", name: "programming" }]
}
) {
code
success
message
}
}
```
### Making Requests with cURL
You can publish an article using cURL as well. Here's an example:
```sh
curl -X POST https://api.hashnode.com \
-H "Content-Type: application/json" \
-H "Authorization: YOUR_API_KEY" \
-d '{
"query": "mutation { createStory(input: { title: \"My New Blog Post\", contentMarkdown: \"This is the content of my new blog post.\", coverImageURL: \"https://example.com/cover-image.jpg\", tags: [{ _id: \"5674470281e3abac9b3e0043\", name: \"programming\" }] }) { code success message } }"
}'
```
### Using Axios in JavaScript
Here's how you can use Axios to publish an article:
```javascript
import axios from 'axios';
const publishArticle = async () => {
const mutation = `
mutation {
createStory(
input: {
title: "My New Blog Post"
contentMarkdown: "This is the content of my new blog post."
coverImageURL: "https://example.com/cover-image.jpg"
tags: [{ _id: "5674470281e3abac9b3e0043", name: "programming" }]
}
) {
code
success
message
}
}
`;
const response = await axios.post('https://api.hashnode.com', {
query: mutation,
}, {
headers: {
'Content-Type': 'application/json',
'Authorization': 'YOUR_API_KEY',
},
});
console.log(response.data);
};
publishArticle();
```
## Handling Responses
Understanding how to handle the responses from the Hashnode API is crucial for building robust applications.
### Interpreting JSON Responses
The Hashnode API returns responses in JSON format. Here's an example response for fetching articles:
```json
{
"data": {
"user": {
"publication": {
"posts": [
{
"title": "My First Blog Post",
"brief": "This is a brief description of my first blog post.",
"coverImage": "https://example.com/cover-image.jpg",
"slug": "my-first-blog-post",
"dateAdded": "2024-06-30",
"author": {
"name": "Deepak Kumar",
"photo": "https://example.com/photo.jpg"
}
}
]
}
}
}
}
```
### Error Handling
When making API requests, it's important to handle errors gracefully. Here's an example of how to handle errors
in Axios:
```javascript
import axios from 'axios';
const fetchArticles = async () => {
const query = `
{
user(username: "YOUR_HASHNODE_USERNAME") {
publication {
posts {
title
brief
coverImage
slug
dateAdded
author {
name
photo
}
}
}
}
}
`;
try {
const response = await axios.post('https://api.hashnode.com', {
query: query,
}, {
headers: {
'Content-Type': 'application/json',
'Authorization': 'YOUR_API_KEY',
},
});
console.log(response.data);
} catch (error) {
console.error('Error fetching articles:', error);
}
};
fetchArticles();
```
## Integrating with MERN Stack
Integrating the Hashnode API with your MERN stack projects allows you to build dynamic and interactive applications. Let's explore both frontend and backend integrations.
### Frontend Integration
For frontend integration, you can use React along with Axios to fetch and display articles. Here's a basic example:
**App.js**
```javascript
import React, { useEffect, useState } from 'react';
import axios from 'axios';
const App = () => {
const [articles, setArticles] = useState([]);
useEffect(() => {
const fetchArticles = async () => {
const query = `
{
user(username: "YOUR_HASHNODE_USERNAME") {
publication {
posts {
title
brief
coverImage
slug
dateAdded
author {
name
photo
}
}
}
}
}
`;
try {
const response = await axios.post('https://api.hashnode.com', {
query: query,
}, {
headers: {
'Content-Type': 'application/json',
'Authorization': 'YOUR_API_KEY',
},
});
setArticles(response.data.data.user.publication.posts);
} catch (error) {
console.error('Error fetching articles:', error);
}
};
fetchArticles();
}, []);
return (
<div>
<h1>My Hashnode Articles</h1>
<ul>
{articles.map((article) => (
<li key={article.slug}>
<h2>{article.title}</h2>
<p>{article.brief}</p>
<img src={article.coverImage} alt={article.title} />
<p>By: {article.author.name}</p>
<img src={article.author.photo} alt={article.author.name} />
</li>
))}
</ul>
</div>
);
};
export default App;
```
### Backend Integration
For backend integration, you can use Node.js and Express to create endpoints that interact with the Hashnode API. Here's a basic example:
**server.js**
```javascript
const express = require('express');
const axios = require('axios');
const app = express();
const port = 3000;
app.get('/fetch-articles', async (req, res) => {
const query = `
{
user(username: "YOUR_HASHNODE_USERNAME") {
publication {
posts {
title
brief
coverImage
slug
dateAdded
author {
name
photo
}
}
}
}
}
`;
try {
const response = await axios.post('https://api.hashnode.com', {
query: query,
}, {
headers: {
'Content-Type': 'application/json',
'Authorization': 'YOUR_API_KEY',
},
});
res.json(response.data.data.user.publication.posts);
} catch (error) {
console.error('Error fetching articles:', error);
res.status(500).send('Error fetching articles');
}
});
app.listen(port, () => {
console.log(`Server is running on http://localhost:${port}`);
});
```
## Practical Applications
### Building a Blog Dashboard
One practical application of the Hashnode API is building a blog dashboard where you can manage and view your articles in one place. You can use React for the frontend and Node.js for the backend to create a seamless user experience.
### Automating Content Publishing
Another practical application is automating content publishing. You can create scripts or applications that automatically publish articles to Hashnode based on predefined schedules or triggers.
## Advanced Tips and Tricks
### Optimizing GraphQL Queries
To optimize your GraphQL queries, request only the fields you need. This reduces the amount of data transferred and improves the performance of your application.
### Using Environment Variables for API Keys
For security reasons, never hard-code your API keys in your code. Instead, use environment variables to store your API keys. Here's an example using Node.js:
**.env**
```
HASHNODE_API_KEY=your_api_key
```
**server.js**
```javascript
require('dotenv').config();
const express = require('express');
const axios = require('axios');
const app = express();
const port = 3000;
app.get('/fetch-articles', async (req, res) => {
const query = `
{
user(username: "YOUR_HASHNODE_USERNAME") {
publication {
posts {
title
brief
coverImage
slug
dateAdded
author {
name
photo
}
}
}
}
}
`;
try {
const response = await axios.post('https://api.hashnode.com', {
query: query,
}, {
headers: {
'Content-Type': 'application/json',
'Authorization': process.env.HASHNODE_API_KEY,
},
});
res.json(response.data.data.user.publication.posts);
} catch (error) {
console.error('Error fetching articles:', error);
res.status(500).send('Error fetching articles');
}
});
app.listen(port, () => {
console.log(`Server is running on http://localhost:${port}`);
});
```
## Conclusion
The Hashnode API is a powerful tool that allows developers to interact with their Hashnode account programmatically. By mastering the API, you can fetch articles, publish content, and integrate these capabilities into your MERN stack projects. We hope this guide has provided you with the knowledge and tools to leverage the Hashnode API effectively. Happy coding!
---
## 💰 You can help me by Donating
[](https://buymeacoffee.com/dk119819)
| raajaryan |
1,906,744 | Python Development Environment on WSL2 | This Post guides you through how to setup a python development environment on Windows Sub-System for... | 0 | 2024-06-30T15:31:48 | https://dev.to/iamgauravpande/python-development-environment-on-wsl2-2o7m | python, development, microsoft, wsl2 | This Post guides you through how to setup a python development environment on Windows Sub-System for Linux(wsl2).
**What is WSL2?**
wsl2 lets a windows user use a virtual linux environment on Windows Machine itself. This is pretty handy feature for quick development needs on your local machine.
Below Steps would list what all are the pre-requisites for achieving a python development environment on Windows Machine:
- Installing WSL2: https://learn.microsoft.com/en-us/windows/wsl/install
- Install Microsoft Visual Studio code for Windows platform: https://code.visualstudio.com/
- Install the Remote WSL Extension on VSCode: https://marketplace.visualstudio.com/items?itemName=ms-vscode-remote.remote-wsl
- Once Wsl2 (at step 1) in installed open an Ubuntu terminal and verify if Python3 is installed using `python3 --version`.
- If Python3 is not installed then do the following:
```
sudo apt update
sudo apt install python3 python3-pip python3-venv
```
**NOTE** : Above command will install python3 , pip tool and python virtual environment package.
- To create a _**Virtual Environment**_ Open the WSL2 Terminal and create your python project directory using command `mkdir <YOUR_PROJECT_NAME>` and then inside it run
`source .venv/bin/activate` to activate the Virtual Environment where **_.venv_** is your virutal environment name.
**NOTE**: Recommendation is creating the virtual environment inside the directory in which you plan to have your project. Since each project should have its own separate directory, each will have its own virtual environment, so there is not a need for unique naming
- To exit from virtual environment run on wsl2 terminal: `deactivate`
**Reference:** https://learn.microsoft.com/en-us/windows/python/web-frameworks | iamgauravpande |
1,906,743 | AWS Cloud Cost Optimization - Identifying Stale Resources | In the fast-paced world of cloud computing, managing costs efficiently is crucial for any... | 0 | 2024-06-30T15:28:14 | https://dev.to/sukuru_naga_sai_srinivasu/aws-cloud-cost-optimization-identifying-stale-resources-52ao | cloud, costoptimization, aws, lambda |



In the fast-paced world of cloud computing, managing costs efficiently is crucial for any business. I'm thrilled to announce that I've recently completed a mini-project focused on cost optimisation in AWS, using AWS Lambda, IAM roles, and policies.
🔍 Project Highlights:
AWS Lambda Functions: Automated routine tasks, eliminating the need for manual intervention and reducing operational costs.
IAM Roles and Policies: Ensured secure and efficient access control, allowing only necessary permissions to optimize cost and enhance security.
This project has been a great opportunity to deepen my understanding of AWS and cloud cost management strategies.
GitHub Repo - https://github.com/SNS-Srinivasu
Linkedin - https://www.linkedin.com/in/sns-srinivasu | sukuru_naga_sai_srinivasu |
1,906,686 | Performance: mistakes to avoid | It might seem obvious, but performance is a concern, and it's easy to make mistakes in that... | 16,464 | 2024-06-30T15:20:51 | https://dev.to/spo0q/performance-mistakes-to-avoid-4pca | programming, performance, bookmarks, development | It might seem obvious, but performance is a concern, and it's easy to make mistakes in that particular matter.
This topic can be quite sensitive in dev teams because performance failures are usually impactful.
## 🦄 "Too unlimited"?
Indeed, resources like memory are limited. Even if you can enable horizontal scaling, especially in the cloud, it won't cover all cases.
While traffic peaks may happen, unoptimized queries (e.g., databases) or nonperformant loops can put your server under heavy stress.
It's a bit counterintuitive, as local installations (a.k.a dev environments) usually benefit from "unlimited" resources.
For example, it's easy to disable timeouts by allocating all power (e.g., memory) to your script.
Besides, it's a good practice not to replicate all data from the production database.
Not only that it would not be compliant with GDPR (unless everything is anonymized, which is usually not the case), but unrealistic for many apps (terabytes of data).
## 🤘🏻 Quality is systemic
I also love [that post](https://jacobian.org/2022/sep/9/quality-is-systemic/).
> Quality is systemic
Good tooling is a great start. If you can set metrics, add performance monitoring (e.g., profiler in your pipelines) and even consider load/scalability tests, you will reduce capacity bottlenecks (e.g., CPU, RAM).
Of course, it does not replace a good approach in the first place, but it will definitely help.
While such additional layers can save the day, Jacob Kaplan-Moss reminds us "there are both technical and human factors involved in systemic quality, and these factors intersect and interact."
Both failure and quality are systemic.
## 💥 Timeouts, DNS, and "blast radius"
Timeouts can be pretty hard to trace back, especially in big projects.
While it's not uncommon with micro-services that depend on other micro-services (a.k.a. _distributed monoliths_), it's also difficult with pseudo-monolithic codebases that actually rely on various external services and HTTP requests.
Besides, DNS failures happen.
Love that [presentation by Pascal MARTIN](https://www.youtube.com/watch?v=U7wuMyv8YzA) (French, but you might be able to get it translated), especially when he explains the concept of **blast radius**.
## 🙈 Nonperformant code
This is where your approach can save some time. It's best if you can avoid the following situations:
* complex calculations in the code while the database could do it (e.g., `SELECT COUNT` with SQL)
* heavy operations in loops
* APIs without cache and pagination
* infinite loops that can ultimately lead to a global crash of your app
* use of humongous arrays and collections (you may have to chunk it)
* anti-patterns: too much synchronous communication, bad/missing caching strategies (e.g., the app constantly calls the backend)
You can read [more about performance anti-patterns here](https://medium.com/@dmosyan/lets-cause-scalability-problem-with-performance-antipatterns-1d163d8a6065).
It can be hard to find critical optimizations while micro-optimization techniques are quite easy to implement.
## 😇 Bullshit!
While it's a bit off-topic here, I often go back to [that post](https://pxlnv.com/blog/bullshit-web/).
## 🎇 All links
* [Quality is systemic](https://jacobian.org/2022/sep/9/quality-is-systemic/)
* [Une application résiliente, dans un monde partiellement dégradé - Pascal MARTIN - Forum PHP 2019](https://www.youtube.com/watch?v=U7wuMyv8YzA)
* [Let’s cause scalability problem with performance antipatterns](https://medium.com/@dmosyan/lets-cause-scalability-problem-with-performance-antipatterns-1d163d8a6065)
* [the Bullshit web](https://pxlnv.com/blog/bullshit-web/)
| spo0q |
1,906,687 | Frontend Face-Off: React vs. Vue.js - An HNG Intern's Perspective | Ever wondered which tool to use to build awesome websites? Today, we compare two frontend titans:... | 0 | 2024-06-30T15:18:55 | https://dev.to/barshow/frontend-face-off-react-vs-vuejs-an-hng-interns-perspective-5466 | hng | Ever wondered which tool to use to build awesome websites? Today, we compare two frontend titans: ReactJS (the one we use at HNG [link to HNG Internship website, https://hng.tech/internship]) and Vue.js! Both are JavaScript libraries that help you create dynamic and interactive web experiences, but they have their own strengths.
React: The OG of Web Frameworks
React, developed by Facebook, is the undisputed king of frontend development. It boasts a massive community, tons of helpful tools, and a wide range of libraries. Building complex web applications with React is like having a whole army of developers at your back. Plus, its virtual DOM keeps things organized and efficient.
The downside to learning React is it can be a bit steeper due to its JSX syntax and the need for additional tools for state management and routing. Sometimes, websites built with React can be a bit on the bigger side.
Vue.js: The Simpler Choice
Vue.js is like the friendly neighborhood Spider-man of frontend frameworks. It's powerful, yet easy to learn, with a clean and concise syntax. Plus, it offers a good balance between features and simplicity, making it a great choice for both small and large projects.
The downside is while Vue.js has a strong community, it might not be as vast as React's. Additionally, the job market might favor React experience in some cases.
My HNG Journey: Gearing Up with React
As an HNG Intern https://hng.tech/internship, I'm excited to delve into the world of React development. The program's focus on React aligns perfectly with my interest in building dynamic web experiences. I'm eager to leverage React's robust framework and vast learning resources to hone my skills and contribute to real-world projects.
The Verdict: It Depends on Your Project
So, which one wins? It depends on your needs! React is the industry standard, perfect for complex projects and those seeking a well-trodden development path. Vue.js, on the other hand, offers a smoother learning curve and a good balance of features for projects of all sizes.
React empowers you to build a feature-rich application for an HNG Internship challenge [link to HNG Internship website, https://hng.tech/internship]. As HNG emphasizes continuous learning [link to HNG Internship website, https://hng.tech/premium], experimenting with various frontend technologies is a fantastic way to broaden your skillset and become a well-rounded developer. | barshow |
1,906,685 | Dive into the Captivating World of Modern 3D Graphics Programming! 🎮 | Comprehensive guide to modern 3D graphics programming, covering the latest techniques and technologies. Suitable for beginners and experienced developers. | 27,801 | 2024-06-30T15:14:29 | https://getvm.io/tutorials/learning-modern-3d-graphics-programming | getvm, programming, freetutorial, technicaltutorials |
As a passionate programmer and graphics enthusiast, I recently discovered an incredible resource that has completely transformed my understanding of modern 3D graphics programming. Allow me to introduce you to "Learning Modern 3D Graphics Programming" by Jason L. McKesson – a comprehensive guide that will take you on an exhilarating journey through the latest techniques and technologies in this dynamic field.
## What's Inside? 🔍
This course is a true gem for both beginners and experienced developers alike. It covers a wide range of topics, from the fundamentals of 3D graphics to the cutting-edge advancements that are shaping the future of the industry.
Some of the highlights include:
- **Comprehensive Coverage**: The course delves deep into the latest techniques and technologies in 3D graphics programming, ensuring you stay ahead of the curve.
- **Hands-on Approach**: Packed with practical examples and code snippets, this resource encourages active learning and helps you apply the concepts immediately.
- **Suitable for All Levels**: Whether you're a newcomer to 3D graphics or a seasoned pro, this course caters to your needs, providing a solid foundation for beginners and advanced techniques for experienced developers.
## Why You Should Check It Out 🤩
If you're a game developer, graphics engineer, or simply a computer graphics enthusiast, this course is an absolute must-have. It offers a solid foundation in the latest 3D graphics techniques and technologies, making it an invaluable resource for your professional growth and personal projects.
Don't miss out on this opportunity to expand your knowledge and unlock the full potential of modern 3D graphics programming. You can access the course at the following link: [https://web.archive.org/web/20150225192611/http://www.arcsynthesis.org/gltut/index.html](https://web.archive.org/web/20150225192611/http://www.arcsynthesis.org/gltut/index.html)
So, what are you waiting for? Dive in, explore the captivating world of 3D graphics, and let your creativity soar! 🚀
## Enhance Your Learning Experience with GetVM's Playground 🚀
To truly maximize your learning journey with "Learning Modern 3D Graphics Programming," I highly recommend utilizing the powerful Playground feature offered by the GetVM Chrome extension. GetVM's Playground provides an immersive online coding environment where you can seamlessly apply the concepts and techniques covered in the course.
With the Playground, you can dive right into hands-on practice, experimenting with code snippets and implementing the latest 3D graphics programming methods. The interactive nature of the Playground allows you to test your understanding, troubleshoot issues, and refine your skills in real-time, accelerating your learning process.
One of the standout features of the Playground is its ability to instantly provision virtual machines, giving you a dedicated, customizable environment to explore 3D graphics programming without the hassle of local setup. This means you can focus solely on learning and coding, without worrying about compatibility or configuration challenges.
Furthermore, the Playground's intuitive interface and seamless integration with the course materials make it an invaluable companion to "Learning Modern 3D Graphics Programming." You can easily access the course resources, reference code examples, and dive straight into practical application, creating a truly immersive and efficient learning experience.
Don't miss out on this opportunity to elevate your 3D graphics programming skills. Head over to [https://getvm.io/tutorials/learning-modern-3d-graphics-programming](https://getvm.io/tutorials/learning-modern-3d-graphics-programming) and start exploring the Playground today! 🎉
---
## Practice Now!
- 🔗 Visit [Learning Modern 3D Graphics Programming](https://web.archive.org/web/20150225192611/http://www.arcsynthesis.org/gltut/index.html) original website
- 🚀 Practice [Learning Modern 3D Graphics Programming](https://getvm.io/tutorials/learning-modern-3d-graphics-programming) on GetVM
- 📖 Explore More [Free Resources on GetVM](https://getvm.io/explore)
Join our [Discord](https://discord.gg/XxKAAFWVNu) or tweet us [@GetVM](https://x.com/getvmio) ! 😄 | getvm |
1,906,684 | .present? VS .any? VS .exists? | Something I have a difficult time wrapping my head around as a newbie is how actually super... | 0 | 2024-06-30T15:13:49 | https://dev.to/sakuramilktea/present-vs-any-vs-exists-34pp | rails, database, queries, sql |

Something I have a difficult time wrapping my head around as a newbie is how actually super important it is to care about query speeds.
Different methods will have _vastly_ different query speeds, and it's therefore to my advantage to understand how they actually work, in order to use the one that will do what I want while keep everything running smooth.
Here are three methods I've been using recently that kind of look the same, but run differently in the background and therefore have different uses!
- **.present?**
```
[ "", " ", false, nil, [], {} ].any?(&:present?)
# => false
```
This one tests for an object's general falsiness. As per the documentation, "An object is present if it’s not `blank?`. An object is blank if it’s false, empty, or a whitespace string."
Very important to point out that `.present?` initializes ActiveRecord for each record found(!). It effectively goes through the whole relevant database available.
- **.any?**
```
person.pets.count # => 0
person.pets.any? # => false
```
This one will use a SQLCount to see if it's > 0, and therefore, although faster than `.present?`, still takes time and queries the database multiple times.
- **.exists?**
```
Note.create(:title => 'Hello, world.', :body => 'Nothing more for now...')
Note.exists?(1) # => true
```
This one, according to the documentation, "asserts the existence of a resource, returning true if the resource is found."
The beauty of `.exists?` is that it uses SQL LIMIT 1 to just check if there's at least one record, without loading them all or counting them all!
In a case where I just want to know if an object is there or not, this is without a doubt the best option.
And now to remember these three...
Let's do our best! | sakuramilktea |
1,906,682 | The WebAssembly Magic 🪄 | Hello amazing people 👋, Have you ever thought about running some other programming language on the... | 0 | 2024-06-30T15:08:51 | https://dev.to/prathamjagga/the-wasm-magic-3p2n | webdev, programming, devops, performance | Hello amazing people 👋,
Have you ever thought about running some other programming language on the browser, other than Javascript 🤔. But then thought it is not possible since Browsers natively support only Javscript right?
Now what if I tell you that WebAssembly is the solution here, it enables us to run any compiled programming language on the web with near native execution times. But how ??? Okay, now in order for you to understand this, let us first discuss about what WebAssembly is right? But hey, in order to understand what WebAssembly is let’s first look at how compiled languages work on any machine.
So languages such as C++, Java, Rust have their compilers which first convert this code in a low level syntax (mostly some assembly like language) and then that low level code can be executed natively on the machines.
HLL (C++/Java/Rust) ———> LLL (Assembly) ———> Execution on machine.
Here can you a see a common thing ?? You guessed right, all the languages here gets compiled to some low level targets. And what if we define a low level target language which can be run on the browser ?? Well, WebAssembly is that only, it is a compilation target for high level languages which can be executed on the browsers.
But how is it possible to compile languages to WebAssembly, you might think. Well, we have different compilers in the WASM (WebAssembly) Ecosystem. One of the popular examples being Emscripten (EMCC) which is a compiler for C/C++ which compiles them into WebAssembly code.
But why do we even need to use WebAssembly in the first place, why can’t we simply compile these other languages in Javascript and then run them on browser. Well for many years, asm.js was a popular standard. But for many use cases we want speed and performance right, then what is the point of compiling faster languages to Javascript. That is why we want to achieve native speeds by directly running their low level targets on the Web.
Lets now understand the WebAssembly’s magic with a simple example:
Suppose we want to sort 1 million random numbers in the browser, but Javascript seems slow for this operation and we decided to implement this in C++ and then load its WebAssembly target code in the browser, then let’s see how much it can optimize the things.
So, lets have this code in Javascript and C++ which sorts the 1 million random numbers and then also prints the time taken in sorting them.
```js
const arr = [];
for (let i = 0; i < 1000000; i++) {
arr.push(Math.floor(Math.random() * 100000000));
if (i % 10000 === 0) {
console.log("I", i);
}
}
console.time("sort");
arr.sort((a, b) => a - b);
console.timeEnd("sort");
```
Now the C++ code:
```cpp
#include <iostream>
#include <vector>
#include <algorithm>
#include <chrono>
extern "C" {
void sortArray() {
std::vector<int> arr;
for (int i = 0; i < 1000000; ++i) {
arr.push_back(rand());
}
auto start = std::chrono::high_resolution_clock::now();
std::sort(arr.begin(), arr.end());
auto end = std::chrono::high_resolution_clock::now();
std::chrono::duration<double> diff = end - start;
std::cout << "Time to sort in C++: " << diff.count() << " seconds" << std::endl;
}
}
```
We can compile the above code to a WASM target using emscripten:
``emcc -o sort.js sort.cpp``
Then we get sort.wasm and sort.js files as the Webassembly target code and js code for loading the wasm code using the Browser’s WebAssembly API.
Now when we run sort.js files and the respective sorting code in a browser we can clearly see the time difference in the console:

See the numbers above are much smaller in Javascript and it takes around 0.3 seconds to sort them whereas numbers in CPP are way greater, still it is taked only 0.1 seconds.
Also, this is not the only thing where other languages are faster at, languages like Rust and C++ can also be used in graphics and video processing as well. One great example for this use case is “The Figma’s implementation of WebAssembly on their WebApplication” where they were able to achieve 10X faster loading times using C++ on their canvas editor.
> Also, WebAssembly is never going to replace Javascript, they both go hand in hand, WebAssembly can be leveraged for complex operations whereas for DOM operations Javascript will always be good at atleast in the upcoming years.
So, this was my learning for this week. See you the next week. Do like and share this post with friends and colleagues :)
Keep learning and have a great day 🙋♂️
| prathamjagga |
1,906,455 | Optimizing Your React App: A Guide to Production-Ready Setup with Webpack, TypeScript, ESLint, and Prettier - 2024 | In this blog post, we'll cover everything you need to know to set up a React app that's ready for... | 0 | 2024-06-30T15:08:11 | https://dev.to/shivampawar/optimizing-your-react-app-a-guide-to-production-ready-setup-with-webpack-typescript-eslint-and-prettier-2024-4lcl | react, learning, webpack, typescript | In this blog post, we'll cover everything you need to know to set up a React app that's ready for deployment.
GitHub Repo: https://github.com/shivam-pawar/sample-react-app
## Prerequisites
Before we begin, make sure you have Node.js and npm (or yarn) installed on your machine.
## Initialize a new project
Use your Command Line and navigate to the root folder of your project and enter
```bash
npm init
```
This will ask you some basic information like package name, author name, description, and license. With this info it will create a file called _package.json_
## Install React and TypeScript
- Install React and ReactDOM as dependencies:
```bash
npm install react react-dom
```
- Install TypeScript and its types as dev dependencies:
```bash
npm install --save-dev typescript @types/react @types/react-dom
```
## Set up Webpack
Install the necessary Webpack dependencies:
```bash
npm install --save-dev webpack webpack-cli webpack-dev-server html-webpack-plugin webpack-merge ts-loader terser-webpack-plugin uglify-js
```
Your _package.json_ will look like this:

- Create a **webpack** folder at root/project level and inside that add these 3 config files.
1. _webpack.common.js_
2. _webpack.config.js_
3. _webpack.dev.js_
4. _webpack.prod.js_
- Create a *src* folder at root/project level and inside that add these 2 files.
1. _index.tsx_
2. _index.html_
- Copy paste below code in _index.tsx_
```typescript
import React from "react";
import { createRoot } from "react-dom/client";
const App = () => {
return <div>Hello, React!</div>;
};
const rootElement = document.getElementById("root") as Element;
const root = createRoot(rootElement);
root.render(<App />);
```
- Copy paste below code in _index.html_
```html
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>My React App</title>
</head>
<body>
<div id="root"></div>
</body>
</html>
```
Now lets update the webpack config files.
- _webpack.common.js_
```js
const path = require("path");
const HtmlWebpackPlugin = require("html-webpack-plugin");
module.exports = {
entry: path.resolve(__dirname, "..", "./src/index.tsx"),
output: {
path: path.resolve(__dirname, "..", "dist"),
filename: "bundle.js",
},
resolve: {
extensions: [".ts", ".tsx", ".js"],
},
module: {
rules: [
{
test: /\.(ts|js)x?$/,
use: "ts-loader",
exclude: /node_modules/,
},
],
},
plugins: [
new HtmlWebpackPlugin({
template: path.resolve(__dirname, "..", "./src/index.html"),
}),
],
devServer: {
static: "./dist",
},
};
```
- _webpack.config.js_
```js
const { merge } = require("webpack-merge");
const commonConfig = require("./webpack.common");
module.exports = (envVars) => {
const { env } = envVars;
const envConfig = require(`./webpack.${env}.js`);
const config = merge(commonConfig, envConfig);
return config;
};
```
- _webpack.dev.js_
```js
const webpack = require("webpack");
module.exports = {
mode: "development",
devtool: "cheap-module-source-map",
devServer: {
hot: true,
open: true,
},
plugins: [
new webpack.DefinePlugin({
"process.env.name": JSON.stringify("development"),
}),
],
};
```
- _webpack.prod.js_
```js
const webpack = require("webpack");
const TerserPlugin = require("terser-webpack-plugin");
module.exports = {
mode: "production",
devtool: false,
plugins: [
new webpack.DefinePlugin({
"process.env.name": JSON.stringify("production"),
}),
],
optimization: {
minimize: true,
minimizer: [
new TerserPlugin({
minify: TerserPlugin.uglifyJsMinify,
extractComments: true,
parallel: true,
test: /\.(ts|js)x?$/,
terserOptions: {
compress: {
drop_console: true,
},
output: {
comments: false,
},
},
}),
],
},
};
```
- Update/replace the scripts section in your _package.json_ file:
```json
"scripts": {
"start": "webpack serve --config webpack/webpack.config.js --env env=dev",
"build": "webpack --config webpack/webpack.config.js --env env=prod"
}
```
## Setup TypeScript
At root/project level add _tsconfig.json_ file and paste below config in it.
```json
{
"compilerOptions": {
"target": "ES6",
"lib": [
"DOM",
"ESNext"
],
"jsx": "react-jsx",
"module": "ESNext",
"moduleResolution": "Node",
"types": ["react", "react-dom", "@types/react", "@types/react-dom"],
"resolveJsonModule": true,
"isolatedModules": true,
"esModuleInterop": true,
"forceConsistentCasingInFileNames": true,
"strict": true,
"skipLibCheck": true
}
}
```
Now your project folder and file structure will look like this:

## Run the development server
In terminal/command prompt run below command to run your development server:
```bash
npm start
```
Your React app should now be running at _http://localhost:8080_.
## Set up ESLint and Prettier
- Install ESLint, Prettier, and the necessary plugins:
```bash
npm install --save-dev eslint eslint-config-prettier eslint-plugin-prettier @typescript-eslint/eslint-plugin @typescript-eslint/parser eslint-plugin-react
```
- Create an _.eslintrc.json_ file in the root of your project with the following configuration:
```json
{
"env": {
"browser": true,
"es2021": true
},
"extends": [
"eslint:recommended",
"plugin:react/recommended",
"plugin:@typescript-eslint/recommended",
"prettier"
],
"parser": "@typescript-eslint/parser",
"parserOptions": {
"ecmaFeatures": {
"jsx": true
},
"ecmaVersion": 12,
"sourceType": "module"
},
"plugins": [
"react",
"@typescript-eslint",
"prettier"
],
"rules": {
"prettier/prettier": "error"
}
}
```
- Create a _.prettierrc_ file in the root of your project with the following configuration:
```json
{
"semi": true,
"trailingComma": "all",
"singleQuote": false,
"printWidth": 100,
"tabWidth": 2
}
```
- Update the _scripts_ section in your _package.json_ file:
```json
"scripts": {
"start": "webpack serve --config webpack/webpack.config.js --env env=dev",
"build": "webpack --config webpack/webpack.config.js --env env=prod",
"lint": "eslint . --ext .ts,.tsx --fix"
}
```
- Run ESLint to check for any linting issues:
```bash
npm run lint
```
Your final _package.json_ will look like this:

Your final folder structure will look like this:

## Conclusion
By following this guide, you now have a production-ready React application setup with Webpack, TypeScript, ESLint and Prettier. This setup provides a solid foundation for building scalable and maintainable React applications with best practices in place.
Remember to keep your dependencies up-to-date and continue learning about these tools to optimize your development workflow further.
Happy coding!❤️
> If you found this article useful, please share it with your friends and colleagues!
Read more articles on Dev.To ➡️ [Shivam Pawar](https://dev.to/shivampawar)
Follow me on ⤵️
🌐 [LinkedIn](https://www.linkedin.com/in/shivam-prakash-pawar/)
🌐 [Github](https://github.com/shivam-pawar)
| shivampawar |
1,906,681 | Keeping Your Data Close: Cross-Region Replication with AWS DMS | Keeping Your Data Close: Cross-Region Replication with AWS DMS In today's digital... | 0 | 2024-06-30T15:05:35 | https://dev.to/virajlakshitha/keeping-your-data-close-cross-region-replication-with-aws-dms-14bo | 
# Keeping Your Data Close: Cross-Region Replication with AWS DMS
In today's digital landscape, businesses require resilient and scalable solutions to ensure data availability and disaster recovery. Geographic redundancy, achieved by replicating data across multiple regions, is crucial for business continuity and low latency access for globally distributed applications. AWS offers various services for cross-region data replication, and one prominent solution is AWS Database Migration Service (DMS).
### Understanding AWS DMS
AWS DMS simplifies the process of migrating data to and from various database platforms, both within AWS and from on-premises environments. While often associated with database migration, its capabilities extend to continuous data replication, making it ideal for maintaining consistent data copies across regions.
### How DMS Works: A Quick Overview
1. **Source and Target Configuration:** You specify the source database (either within AWS or on-premises) and the target AWS region and database instance. DMS supports homogeneous migrations (e.g., MySQL to MySQL) and heterogeneous migrations (e.g., Oracle to Amazon Aurora).
2. **Replication Instance:** A managed environment within AWS, responsible for connecting to your source database, extracting changes, and applying them to the target. You can customize its size and network settings for optimal performance.
3. **Tasks and Replication Modes:** You define replication tasks that control the data migration process. DMS offers various replication modes, including:
* **Full Load:** Initial one-time data transfer.
* **Change Data Capture (CDC):** Captures and replicates only the data changes made at the source, ensuring minimal latency and resource consumption.
* **Continuous Replication:** For ongoing synchronization of data changes in real-time.
### Cross-Region Replication Use Cases:
Let's explore some compelling use cases where AWS DMS excels:
**1. Disaster Recovery and Business Continuity**
Imagine a scenario where your primary AWS region experiences an outage. With cross-region replication using DMS, you have a near real-time copy of your data in another region, ready to take over seamlessly.
**How it Works:**
* DMS continuously replicates changes from your production database (e.g., Amazon RDS for MySQL) in one region to a standby database in a different AWS region.
* In the event of a primary region failure, your application infrastructure can be redirected to the secondary region, minimizing downtime.
* Once the primary region is restored, DMS can backfill any missed changes, ensuring data consistency.
**2. Low-Latency Data Access for Global Applications**
Applications with users spread across the globe require low-latency access to data. Replicating data closer to your user base significantly reduces response times.
**How it Works:**
* Establish a multi-region architecture, deploying your application code in multiple regions.
* Use DMS to replicate data from your primary database to read replicas in each region.
* Route user requests to the closest region, enabling fast data retrieval and an improved user experience.
**3. Data Consolidation and Analytics**
Consolidate data from multiple sources and regions into a centralized data warehouse or data lake for analysis and reporting.
**How it Works:**
* Utilize DMS to replicate data from operational databases in different regions to a central Amazon S3 bucket.
* Use AWS Glue or other ETL tools to transform and prepare the data for analysis.
* Employ services like Amazon Redshift, Amazon Athena, or Amazon EMR to query and derive insights from your consolidated data.
**4. Blue/Green Deployments and Testing**
Reduce the risk of application deployments by replicating your production data to a separate environment for testing new code or configurations.
**How it Works:**
* Create a duplicate environment in a different region or availability zone using DMS to replicate your production data.
* Deploy and thoroughly test new application versions in the replica environment.
* Once validated, seamlessly switch traffic from the production environment to the tested replica.
**5. Database Migration with Minimal Downtime**
While not strictly cross-region replication, DMS simplifies migrating databases to AWS with minimal downtime.
**How it Works:**
* Set up continuous replication from your on-premises database to an AWS database instance.
* DMS synchronizes the data in the background, minimizing any interruption to your production environment.
* Once the data is fully replicated and validated, you can switch over to the AWS database with a short cutover window.
### Cross-Region Replication Alternatives:
While AWS DMS provides a comprehensive solution, AWS offers other services:
* **Amazon RDS Multi-AZ Deployments:** For RDS databases, Multi-AZ deployments offer synchronous replication to a standby instance in a different availability zone **within the same region**. This is ideal for high availability but not for disaster recovery across regions.
* **Amazon Aurora Global Database:** Specifically designed for Amazon Aurora, Global Database enables low-latency reads and disaster recovery across regions. It offers tighter integration with Aurora and better performance for Aurora workloads.
* **Application-Level Replication:** Some applications have built-in mechanisms for data replication (e.g., MySQL replication, PostgreSQL streaming replication). While powerful, this approach requires more configuration and management compared to a managed service like DMS.
### Conclusion
AWS DMS is a versatile and powerful service for implementing cross-region data replication, offering businesses the flexibility to meet a range of needs – from disaster recovery to global application deployment and data consolidation. By leveraging AWS DMS, organizations can enhance their data resilience, expand their global reach, and unlock the full potential of data-driven insights.
***
**Advanced Use Case: Building a Real-Time Analytics Pipeline with Cross-Region Replication and Serverless Components**
As an experienced AWS Solutions Architect, let me outline an advanced use case involving real-time analytics across regions using AWS DMS:
**Scenario:**
A global e-commerce platform needs to analyze user behavior in real-time to personalize recommendations, detect fraud, and optimize inventory. They have a multi-region architecture with the primary database in `us-east-1` and require near real-time analytics on data generated in all regions.
**Solution:**
1. **Cross-Region Data Replication:**
* Utilize AWS DMS to continuously replicate changes from the main transactional database (e.g., Amazon Aurora PostgreSQL) in `us-east-1` to a dedicated Amazon Aurora PostgreSQL replica in `us-west-2`. This secondary region is optimized for analytics.
2. **Real-time Data Streaming:**
* Configure AWS DMS to publish change data capture (CDC) records to an Amazon Kinesis Data Stream in `us-west-2`.
3. **Serverless Stream Processing:**
* Deploy an AWS Lambda function, triggered by the Kinesis Data Stream, to perform real-time data transformation and enrichment.
* Use Amazon Kinesis Data Analytics (using Apache Flink) for more complex stream processing, aggregations, and generating time-series insights.
4. **Data Lake Integration:**
* Store processed data in Amazon S3 in Parquet format, creating a data lake for historical analysis and machine learning model training.
5. **Interactive Analytics and Visualization:**
* Use Amazon Athena for ad-hoc querying and analysis of the data in the S3 data lake.
* Visualize insights using Amazon QuickSight dashboards connected to both the real-time stream data and the historical data in S3.
**Benefits:**
* **Real-time Insights:** Analyze user behavior, transactions, and other events with minimal latency, driving immediate business decisions.
* **Scalability and Cost-Efficiency:** The serverless architecture (Lambda, Kinesis) scales automatically based on data volume, optimizing costs.
* **Flexibility and Extensibility:** The data lake and analytics pipeline can easily integrate with other AWS services for machine learning, security analysis, and more.
By combining AWS DMS with other powerful services, this architecture empowers the e-commerce platform to derive actionable insights from its data in real time, enhancing customer experience, mitigating risks, and driving growth.
| virajlakshitha | |
1,906,680 | Networking-Architecture-using-Terraform | 🖥 Excited to Share My Latest Project with AWS and Terraform! I am thrilled to announce that I... | 0 | 2024-06-30T15:03:29 | https://dev.to/sukuru_naga_sai_srinivasu/networking-architecture-using-terraform-4flp | aws, terraform | 
🖥 Excited to Share My Latest Project with AWS and Terraform!
I am thrilled to announce that I have recently completed a small but impactful project leveraging AWS and Terraform. This hands-on experience has been incredibly rewarding and has significantly deepened my understanding of cloud infrastructure and automation.
Throughout this project, I utilized AWS's comprehensive documentation and Terraform's resources to guide me. This approach allowed me to dive deep into AWS, gaining practical insights and skills that are essential for modern cloud environments.
Key takeaways from this project:
1) Hands-on experience with AWS and Terraform configurations.
2) Enhanced understanding of cloud infrastructure as code.
3) Improved ability to navigate and utilize extensive technical documentation.
This project has not only broadened my technical skill set but also reinforced the importance of continuous learning and adaptation in the ever-evolving tech landscape.
GitHub Repo - https://github.com/SNS-Srinivasu
Linkedin - https://www.linkedin.com/in/sns-srinivasu | sukuru_naga_sai_srinivasu |
1,906,678 | Automating Email Notifications for S3 Object Uploads using SNS | Introduction: In today's cloud-centric world, automation is key to managing and scaling... | 0 | 2024-06-30T14:59:13 | https://dev.to/mohanapriya_s_1808/automating-email-notifications-for-s3-object-uploads-using-sns-5efb | **Introduction:**
In today's cloud-centric world, automation is key to managing and scaling infrastructure efficiently. Amazon Web Services (AWS) offers a robust suite of services that allow developers to build complex systems with relative ease. One such powerful combination is Amazon Simple Storage Service (S3) and Amazon Simple Notification Service (SNS). Together, they can automate notifications for various events, such as when a new object is uploaded to an S3 bucket. This blog post will guide you through the process of setting up an S3 bucket, creating an SNS topic, subscribing to the topic via email, and configuring the system to send notifications when objects are uploaded to the S3 bucket.
**Step-1:** Create an S3 Bucket
First, we need an S3 bucket where objects will be stored. Follow these steps to create an S3 bucket:
**1.Navigate to the S3 console:** Open the AWS Management Console and navigate to the S3 service.
**2.Create a new bucket:**Click on "Create bucket" and provide a unique name for your bucket. Choose the appropriate region and configure other settings as per your requirements. For this example, we’ll stick with the default settings.
**3.Create the bucket:** Click on "Create bucket" to finalize the creation.

**Step-2:** Create an SNS Topic
Next, we’ll create an SNS topic to which notifications will be sent.
**1. Navigate to the SNS console:** Open the AWS Management Console and navigate to the SNS service.
**2. Create a topic:** Click on "Create topic", select "Standard" for the topic type, and provide a name for your topic. Click on "Create topic" to finalize the creation.

**Step-3:** Subscribe to the SNS Topic via Email
To receive notifications, we need to subscribe to the SNS topic using an email address.
**1. Create a subscription:**In the SNS console, select your topic and click on "Create subscription".
**2. Set protocol and endpoint:** Choose "Email" as the protocol and enter your email address in the endpoint field.
**3. Confirm the subscription:** AWS will send a confirmation email to the provided address. Check your email and click on the confirmation link to complete the subscription.
**Step-4:** Update the SNS Topic Access Policy
To allow S3 to publish messages to the SNS topic, we need to modify the topic's access policy.
**1. Edit the policy:** In the SNS console, select your topic and click on "Edit" under the Access Policy section.
**2. Add the policy:** Add the following policy to allow S3 to publish notifications:
```
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Principal": {
"Service": "s3.amazonaws.com"
},
"Action": "SNS:Publish",
"Resource": "arn:aws:sns:your-region:your-account-id:your-topic-name"
}
]
}
```
**Step-5:** Configure S3 to Send Notifications to SNS
Finally, configure the S3 bucket to send notifications to the SNS topic when an object is uploaded.
**1. Navigate to the S3 bucket:** In the S3 console, select your bucket and click on the "Properties" tab.
**2. Add a notification:** Under the "Event notifications" section, click on "Create event notification".
**3. Configure the event:** Provide a name for the notification, select the "All object create events" event type, and choose SNS topic as the destination. Select your SNS topic from the dropdown and save the configuration.

**Step-6:** Upload the object
At last upload the object in the s3 bucket that has been created.

SNS will send notification to the subscribed email once the object is uploaded in the S3 bucket.
**Conclusion:**
By following the steps outlined in this blog post, you have successfully set up an automated notification system that sends an email whenever a new object is uploaded to your S3 bucket. This integration between S3 and SNS not only demonstrates the power of AWS services but also showcases how they can be used to build robust and automated workflows. Whether for operational monitoring, security alerts, or just keeping track of changes in your storage, this setup is a valuable tool in any cloud infrastructure arsenal. Happy automating!
| mohanapriya_s_1808 | |
1,906,657 | Mastering Caching in Distributed Systems: Strategies for Consistency and Scalability | Handling Caching in a Distributed System is difficult but not impossible. This is going to be long... | 0 | 2024-06-30T14:57:57 | https://dev.to/nayanraj-adhikary/deep-dive-caching-in-distributed-systems-at-scale-3h1g | webdev, javascript, development, beginners | Handling Caching in a Distributed System is difficult but not impossible.
This is going to be long but, informative
I would be referring to Distributed System --> DS
For a Basic Understanding of Caching refer to my previous blogs
1. [Deep Dive into Caching: Techniques for High-Performance Web Apps](https://dev.to/nayanraj-adhikary/deep-dive-into-caching-techniques-for-high-performance-web-apps-56kb)
2. [Implementing Caching Strategies: Techniques for High-Performance Web Apps](https://dev.to/nayanraj-adhikary/implementing-caching-strategies-techniques-for-high-performance-web-apps-3dm7)
Let's not waste any more time here and deep dive into it.
-------------------
## What are you going to learn here
1. Benefits of caching in DS(performance, latency reduction, load balancing)
2. Handling Consistency in DS
3. Ensuring Performance in DS
4. Ensuring Availability in DS
5. Implementing Caching at Scale
6. Real World Example (Netflix, Facebook, Twitter[X.com])
## Benefits of Caching
### Performance
Caching significantly enhances the performance of distributed systems by storing frequently accessed data in a faster, more accessible location. This reduces the need to fetch data from slower, more distant data sources, such as databases or external services. The performance benefits include:
1. Reduced Data Retrieval Time
2. Decreased Server Load
3. Improved Throughput
### Latency Reduction
Latency refers to the time it takes for a request to travel from the client to the server and back. Caching helps reduce latency in several ways:
1. Proximity of Data / CDN
2. Elimination of Redundant Processing
3. Quick Access to Data
### Load Balancing
Load balancing ensures that no single server or node becomes overwhelmed with requests, distributing the load evenly across the system. Caching contributes to effective load balancing by:
1. Spreading Data Requests
2. Reducing Hotspots
3. Distributing Cache Loads
## Handling Consistency
### Consistency Models
Consistency in distributed systems refers to the degree to which different nodes or clients see the same data at the same time. There are several consistency models to consider:
- **Strong Consistency** : Guarantees that all nodes see the same data simultaneously. This model is easiest to reason about but can be challenging to implement at scale due to performance trade-offs.
- **Eventual Consistency** : Ensures that all nodes will eventually see the same data, but not necessarily at the same time. This model is more performant but can lead to temporary inconsistencies.
- **Causal Consistency** : Ensures that causally related operations are seen by all nodes in the same order. This model strikes a balance between strong and eventual consistency.
### Techniques for Maintaining Consistency
To maintain consistency across distributed caches, several techniques can be employed:
- **Cache Invalidation Strategies** : Ensure that outdated or stale data is removed from the cache. Common strategies include time-to-live (TTL), manual invalidation, and automatic invalidation based on data changes.
- **Write-Through, Write-Behind, and Write-Around Caching** : These policies define how and when data is written to the cache and the backing store. Don't Know Policy [Check Out !](https://dev.to/nayanraj-adhikary/implementing-caching-strategies-techniques-for-high-performance-web-apps-3dm7)
- **Distributed Consensus Algorithms** : Algorithms like Paxos and Raft help maintain consistency by ensuring that all nodes agree on the order of operations.
- **Conflict Resolution Techniques** : Approaches like last-write-wins or vector clocks can help resolve conflicts when concurrent updates occur.
## Ensuring Performance
### Caching Strategies
- **Cache-Aside** : The application checks the cache before fetching data from the source. If the data is not in the cache, it retrieves and stores it there.
- **Read-Through** : The cache itself loads data from the backend store on a cache miss.
- **Write-Through** : Updates go to both the cache and the backend store simultaneously.
- **Write-Behind** : Updates go to the cache immediately, and the backend store is updated asynchronously
### Performance Optimization Techniques
- **Efficient Cache Eviction Policies** : Implementing policies like Least Recently Used (LRU) or Least Frequently Used (LFU) helps manage limited cache space effectively.
- **Use of In-Memory Caching** : In-memory caching solutions like Redis and Memcached offer high-speed data access.
- **Data Compression** : Compressing cached data can save space and reduce I/O times.
- **Load Balancing and Sharding** : Distributing cache data and requests evenly across multiple nodes enhances performance.
### Latency Reduction
- **Geographically Distributed Caches** : Using Content Delivery Networks (CDNs) to place caches closer to users reduces latency.
- **Multi-Tiered Caching** : Implementing caching at multiple levels (client-side, edge, server-side) optimizes performance.
- **Prefetching and Cache Warming** : Preloading data into the cache based on anticipated demand reduces cache miss rates.
## Ensuring Availability
### High Availability Techniques
- **Replication Strategies** : Implementing master-slave or multi-master replication ensures data availability during node failures.
- **Failover Mechanisms** : Automatic failover to backup nodes maintains service continuity during failures.
- **Data Redundancy** : Storing multiple copies of data across different nodes increases fault tolerance.
### Fault Tolerance
- **Handling Node Failures** : Using techniques like quorum-based approaches ensures system resilience.
- **Graceful Degradation Strategies** : Ensuring that the system continues to function, albeit with reduced performance, during partial failures.
### Monitoring and Alerts
- **Implementing Health Checks** : Regular health checks ensure the cache is functioning correctly.
- **Real-Time Monitoring Tools** : Using tools like Prometheus and Grafana for real-time monitoring.
- **Automated Alerting Systems** : Setting up automated alerts for issues like high latency or node failures.
## Implementing Caching at Scale
Scaling caching solutions in distributed systems presents several challenges:
- **Data Distribution and Partitioning** : Distributing data across multiple nodes to ensure even load distribution and high availability.
- **Load Balancing** : Ensuring that no single node becomes a bottleneck by evenly distributing requests across the system.
Several techniques and tools can help implement caching at scale:
- **Sharding** : Dividing the dataset into smaller, manageable pieces (shards) that can be distributed across multiple nodes.
- **Distributed Caching Solutions** : Tools like Memcached, Redis, and Apache Ignite provide robust distributed caching capabilities.
- **Multi-Tiered Caching** : Implementing caching at multiple levels (e.g., client-side, edge, server-side) to optimize performance and resource utilization.
## Real-World Examples
In case you have reached reading the blog till here. some of the awesome use-case
### Netflix
Netflix is a prime example of a company that leverages distributed caching to efficiently deliver content to millions of users worldwide. There are many more optimizations done here some of them
1. **Content Delivery Network (CDN)** : Netflix uses its own CDN, called Open Connect, to cache video content closer to the users. By deploying servers at ISPs (Internet Service Providers), Netflix reduces latency and bandwidth costs while ensuring high-quality video streaming.
2. **Multi-Tiered Caching** : Netflix employs a multi-tiered caching strategy, including client-side caches (on users’ devices), edge caches (within the ISP networks), and regional caches. This layered approach ensures that content is served quickly from the nearest cache, minimizing latency and buffering. This includes the Buffer data for Videos.
3. **Personalization and Recommendations** : Netflix caches personalized recommendations and metadata about shows and movies. This allows the recommendation engine to quickly provide relevant suggestions without repeatedly querying the backend systems.
### Facebook
Facebook uses distributed caching extensively to handle its massive user base and the high volume of interactions on its platform.
1. **Memcached Deployment** : Facebook is known for its large-scale deployment of Memcached to cache data retrieved from its databases. This caching layer helps reduce the load on databases, allowing them to scale horizontally and handle more queries efficiently.
2. **TAO (The Associations and Objects)** : Facebook developed TAO, a geographically distributed data store that caches and manages the social graph (relationships and interactions between users). TAO ensures that frequently accessed data, such as friend lists and likes, are served quickly, improving the overall user experience. [read more.](https://engineering.fb.com/2013/06/25/core-infra/tao-the-power-of-the-graph/)
3. **Edge Caching** : To further reduce latency, Facebook employs edge caches that store static content like images, videos, and JavaScript files closer to users. This helps in serving content rapidly, reducing the load on central servers, and improving the site’s responsiveness.
### X (formerly know as Twitter)
Twitter faces the challenge of delivering real-time updates to millions of users, which requires efficient caching strategies.
1. **Timeline Caching** : Twitter caches timelines (feeds of tweets) to ensure that users see updates quickly. By caching these timelines, Twitter reduces the need to query the database for every user request, significantly improving response times.
2. **Redis for In-Memory Caching** : Twitter uses Redis for various caching purposes, including caching user sessions, trending topics, and other frequently accessed data. Redis’s in-memory storage provides fast read and write operations, essential for real-time applications.
3. **CDN for Static Content** : Like many other large-scale web services, Twitter uses a CDN to cache static assets, such as images and stylesheets, closer to users. This reduces latency and ensures that content loads quickly.
### Lessons Learned and Best Practices
1. **Strategic Placement of Caches** : Placing caches at different levels (client-side, edge, server-side) and strategically within the network (e.g., using CDNs) can significantly reduce latency and improve performance.
2. **Efficient Cache Invalidation** : Implementing effective cache invalidation strategies is crucial to ensure data consistency. Techniques like TTL, manual invalidation, and automatic invalidation based on data changes are commonly used.
3. **Balancing Consistency and Performance** : Understanding the trade-offs between strong consistency and performance is essential. Companies often choose eventual consistency for high-performance use cases while using strong consistency for critical data.
4. **Monitoring and Metrics** : Continuous monitoring and metrics collection are vital for understanding cache performance and identifying issues. Tools like Prometheus, Grafana, and custom dashboards are commonly used.
5. **Scalability and Fault Tolerance** : Implementing sharding, replication, and failover mechanisms ensures that the caching layer can scale with the system and remain highly available even during failures.
## Conclusion
In summary, caching is a powerful tool in distributed systems, but it requires careful consideration of consistency, scalability, and writing policies. By understanding these aspects and implementing best practices, you can design caching solutions that significantly enhance system performance and reliability.
Hope you have learned something from this blog.
Follow me for such interesting content. I keep reading and implementing stuff. | nayanraj-adhikary |
1,906,676 | Issue 005 - The Bottleneck of Continuous Writing | I went to the sea again this weekend. Every time I connect with nature and feel the sea breeze, it... | 0 | 2024-06-30T14:56:27 | https://dev.to/justin3go/issue-005-the-bottleneck-of-continuous-writing-313b |

I went to the sea again this weekend. Every time I connect with nature and feel the sea breeze, it seems like all the fatigue of the week just vanishes.
> website: [fav0.com](https://fav0.com/en/)
## \>\> Topics to Discuss
**The Bottleneck of Continuous Writing**
As we all know, there are many benefits for programmers to write blogs, such as honing their technical skills, building personal influence, and enhancing the impression of knowledge points.
However, those who can write continuously are rare because there are always many reasons that prevent us from completing this task. Here are some reasons I can think of:
**1) Creative Exhaustion**
- Initially, I wrote technical articles, but after a while, I felt like there was nothing more to write about.
- Then I started writing practical blogs, but these types of articles require hands-on practice. Often, it takes months to complete a project and then write a summary, resulting in very few such posts.
- Now, I write weeklies. Since I usually like to follow tech news, I can record it in the form of a weekly digest, which is a good method.
**2) Self-Doubt**
- When writing blogs before, I gradually felt that writing just a few hundred words (like the ">> Topics to Discuss" section in this weekly) was too short and not worth publishing.
- Or, after seeing many excellent blog posts, I felt that others had already written about it and did it well, so there was no point in writing it myself.
- Writing some introductory articles also faced doubts from some "big shots."
**3) Time Management**
Previously, I had a lot of personal time every day, and with two days off each week, writing a weekly post was more than enough. But recently, I’ve been on a business trip for a month and was very busy during that time, leaving almost no time to browse the latest information or read books. Without input, it’s naturally hard to have output.
4) Technical Bottleneck: Technical writing relies on technical skills. Without sufficient skills, writing becomes difficult.
5) External Interference: Recently, there have been more sudden phone calls.
6) Lack of Feedback and Motivation
Now, I require myself to prioritize quantity over quality. **For me, recording is far more important than quality.**
## \>\>Must Read
### OpenAI to Block API Calls from Certain Regions Starting July 9
This week, many users received an email from OpenAI about this issue. I received one too. Did you? The email content is as follows:

It’s unclear what consequences this additional measure will bring. At least for now, I'm hesitant to add a lot of money at once and will only add small amounts. I’m also looking for alternatives.
### [Claude 3.5 Sonnet](https://www.anthropic.com/news/claude-3-5-sonnet)


### CSDN Forked the Entire GitHub
This event was spread by multiple influencers. I’m not sure how much traffic it brought to CSDN’s gitcode. Personally, I didn’t know about the gitcode platform before this. Although it’s negative publicity, the traffic is still high!
Interestingly, it also directly copied some sensitive content from GitHub to domestic servers, which caused problems...
### [Using GPT-4 to Find Errors in GPT-4](https://openai.com/index/finding-gpt4s-mistakes-with-gpt-4/)
OpenAI launched CriticGPT! This groundbreaking model helps detect errors in ChatGPT's code outputs. With CriticGPT, users’ performance improved by 60% compared to without its help. This marks an important step toward better AI alignment and more accurate outputs.

### [Firefox Integrates AI](https://blog.nightly.mozilla.org/2024/06/24/experimenting-with-ai-services-in-nightly/)
In the coming months, Firefox will experiment with providing convenient access to optional AI services in its nightly builds to enhance productivity while browsing. This work is part of Firefox’s efforts to improve multitasking and cross-referencing in the sidebar. Firefox is committed to following user choice, agency, and privacy principles when introducing AI enhancements. Initially, this experiment will only be available to nightly users, and the AI features are entirely optional. If helpful, they’re there, but they’re not built into any core functionality.
Notably, it uses third-party providers:
- ChatGPT
- Google Gemini
- HuggingChat
- Le Chat Mistral

## \>\> Useful Tools
### [Free Online Tool Website](https://10015.io/tools/tweet-to-image-converter)
A beautifully designed, simple, and practical free online tool website! It already has nearly 300 votes on Product Hunt~
Seven categories cover all your tool needs:
1. Text (e.g., convert to handwriting, font pairing)
2. Image (SVG generation, image description generation)
3. CSS (e.g., gradient, shadow, clip-path generation)
4. Coding (e.g., pretty code screenshots, slug generation, code compression)
5. Color (e.g., AI palette, color shadow generation, color mixer)
6. Social Media (download Instagram photos, generate Instagram & Twitter posts, OG metadata generation)
7. Others (QR code, barcode generation)
(Supports browser plugins)

### [Resources for Learning Software Architecture](https://github.com/mehdihadeli/awesome-software-architecture)
A repository of software architecture resources on GitHub trending!
A curated list of excellent articles, videos, and other resources for learning and practicing software architecture, patterns, and principles.
It also has a website for easier reading: [awesome-architecture.com](https://awesome-architecture.com)

### [Animated Icons](https://unicornicons.com/icons)

### [Real-Time Translation - RTranslator](https://github.com/niedev/RTranslator)
A recently popular repository, RTranslator is the world's first open-source real-time translation application. It’s a (nearly) open-source, free, and offline Android real-time translation app.
Connect with someone who has the app installed, connect Bluetooth headphones, put the phone in your pocket, and you can converse as if the other person speaks your language.

### [Parse PDFs with GPT - gptpdf](https://github.com/CosmosShadow/gptpdf)
Parse PDFs into markdown using visual large language models (e.g., GPT-4).
The method is very simple (only 293 lines of code) but can almost perfectly parse layouts, math formulas, tables, images, charts, etc.
Average cost per page: $0.013
Parsing effect:

## \>\>Interesting Finds
### [One Million Checkboxes](https://onemillioncheckboxes.com/)
A website with one million checkboxes supporting real-time collaborative clicking. You can click around and interact with other users, competing for checkboxes~

### [Large Language Model in a Font File](https://fuglede.github.io/llama.ttf/)
`llama.ttf` is a font file that is also a large language model and its inference engine.
You can run LLMs using just the font, generating text in any Wasm-enabled HarfBuzz application, such as your favorite text editor/email client/etc., without waiting for vendors to include features like "Copilot." And everything runs entirely locally.
### [TCP Never Drops Packets](https://x.com/shengxj1/status/1806022305677550013)

Seeing this reminds me of a meme I saw before:

### Viral Tweets
Recently, I’ve seen many viral tweets in this format. Below is the secret to going viral according to GPT👇

Text version:
```
Example Tweet
Here’s an example tweet combining the above suggestions:
🌟 Want to know how to boost your social media influence in 30 days? 📈 We’ve summarized 10 practical tips. Start changing now! 🔗 [Link] #SocialMedia #Marketing #SelfImprovement
[Image or Video]
This tweet includes an engaging opening, concise content, relevant hashtags, a call-to-action, and visual content.
Conclusion
Writing a viral tweet requires a combination of creativity, strategy, and understanding your audience. By continuously testing and optimizing, you can discover the tweet style and strategy that work best for you.
```
### Tutorial Repositories Dominate GitHub Trending
While browsing GitHub trending, I noticed many tutorial repositories. It seems tutorial repositories are more likely to gain stars:

## \>\> Worth Reading | justin3go | |
1,906,675 | About using Microsoft office 365 (offline version) for free | As a budding Data Analyst, I have encountered challenges with utilizing Microsoft 365 due to... | 0 | 2024-06-30T14:52:16 | https://dev.to/dipalee_gaware_b4630cc678/about-using-microsoft-office-365-offline-version-for-free-42im | As a budding Data Analyst, I have encountered challenges with utilizing Microsoft 365 due to financial constraints. I am seeking guidance on how to access or download Microsoft Office 365 for free in its offline version. This is important as certain functionalities I require are not available in the online version. Your help in this matter would be greatly appreciated. | dipalee_gaware_b4630cc678 | |
1,906,674 | What Is BC-404: A Comprehensive Guide to the Latest Deflationary NFT Standard | Introduction: Recently, a new NFT standard called BC-404 has emerged following the advent of the... | 0 | 2024-06-30T14:51:03 | https://dev.to/nft_research/what-is-bc-404-a-comprehensive-guide-to-the-latest-deflationary-nft-standard-99j | nft, web3 | Introduction:
Recently, a new NFT standard called BC-404 has emerged following the advent of the ERC-404 standard, bringing fresh possibilities to the NFT market.
BC-404, which stands for Bonding Curve 404, is the first deflationary NFT contract in the cryptocurrency space, building upon and improving the ERC-404 standard. This article will explore the innovative BC-404 standard, combining the strengths of ERC-404 with new advancements.
The ERC-404 Protocol Standard:
In the last six months, the ERC-404 has been a groundbreaking innovation in the asset issuance domain of the Ethereum ecosystem. Introduced in early February by the Pandora team, ERC-404 is an experimental open-source token standard for creators and developers. It features a hybrid ERC-20/ERC-721 implementation with native liquidity and fractionalization capabilities. This is made possible through a quantitative correspondence logic between FTs and NFTs, enabling the minting and burning mechanism for NFTs.
In simple terms, certain high-value NFT projects, particularly those with rare attributes, have become both expensive and illiquid. The introduction of ERC-404 signals a significant improvement for these previously illiquid NFT projects. These high-priced NFTs can now be fractionally traded on mainstream marketplaces, providing the same convenience and speed as purchasing assets like BTC or ETH.
BC-404 Token Standard:
The Color Protocol initially launched the ERC-404 conversion platform for Memecoin, where meme artists and communities could create their meme NFT collections in ERC-404. Holders could convert their Memecoin into NFTs.
Utilizing the Bonding Curve model, COLOR introduced the BC-404 standard based on ERC-404. The required token amount for generating NFTs continues to increase until reaching a preset limit. Therefore, early participants have a cost advantage.
In the ERC-404 contract, Tokens and NFTs have a fixed ratio, where the deployer can initially define how many ERC-404 Tokens an address needs to hold to generate an ERC-404 NFT. For example, 1 $Pandora Token can create 1 Pandora NFT, but if an address only holds 0.9 $Pandora, it cannot generate a Pandora NFT.
BC-404 integrates the Bonding Curve model, changing the required amount of BC-404 tokens for generating BC-404 NFTs from a fixed amount to continually increasing. With each new BC-404 NFT generated, this value will rise, meaning each new NFT will be harder to generate than the previous one.
When deploying a BC-404 contract, the deployer can set the initial difficulty for generating the first BC-404 NFT (i.e., how many BC-404 Tokens an address needs to hold) and the increment value for the difficulty of generating each subsequent BC-404 NFT.
1/ Bonding Curve Model of BC-404 Standard:
BC-404 can be designed using three different difficulty models. Color defines the NFT generation difficulty as “BC Count”: linear increase, Curved Increase with Accelerating Slope, and Curved Increase with Decelerating Slope.
2/ Token content of BC-404 NFTs:
Due to the varying amount of BC-404 tokens needed to generate each BC-404 NFT, the token content will also differ for different NFTs, corresponding to the number of tokens held and the number of NFTs generated. Conversely, each ERC-404 NFT will carry the same fixed amount of tokens.
3/ BC-404 NFT ID:
The value of BC-404 NFTs varies based on their token content, which is reflected in the NFT marketplace through the linking of NFT IDs with their token content (the BC Count at the time of generation).
For instance, when a user purchases a BC-404 NFT with ID #1868, they receive 1868 BC-404 Tokens alongside the NFT.
Holders and traders can easily identify each NFT’s token content by looking at its ID in wallets or NFT marketplaces.
The NFT IDs are not required to start from #1 and can be determined by the initial BC count value set by the BC-404 contract deployer.
Additionally, the NFT IDs do not have to follow a sequential order and may be influenced by the BC count increment pattern established by the deployer.
Every NFT ID is distinct, and generating a new NFT will always involve a higher BC Count than any previous one, offering each BC-404 NFT its unique history.
4/ NFT Burn Mechanism and Deflation:
BC-404 NFT Trading and Transfer: Just like with ERC-404, when BC-404 NFTs are transferred, they will remain the same, and an equivalent amount of tokens will be transferred along with the NFTs.
BC-404 Token Trading and Transfer: When BC-404 tokens are traded or transferred, it may lead to the burning of BC-404 NFTs if there are not enough tokens left to support them post-transfer, similar to ERC-404 V1.
Each BC-404 NFT ID is unique and can only appear once. Once an NFT is destroyed, its ID cannot be renewed, resulting in a deflation of the BC-404 NFT collection, assuming a fixed total token supply set by the contract.
This unique feature makes BC-404 the first deflationary NFT contract in the cryptocurrency world.
Colorpepe:
$colorpepe is the first BC-404 asset, and it generates the first pepe PFP series from $PEPE, deployed on Base. It combines the uniqueness and scarcity of NFTs with the popularity and liquidity of Memecoins. Through the conversion pool, users can convert $PEPE to the BC-404 standard (which can be bridged across chains if needed), giving native Memecoin assets NFT properties.
Conversion details:
Free conversion: Only pay gas fees, no additional costs.
Reversible conversion: $colorpepe can be converted back to $PEPE at any time, with the corresponding amount of $colorpepe being destroyed.
Limited supply: The total supply of $colorpepe tokens is limited, with a time window for conversion.
Whitelist priority: Conversion is divided into two stages, with the first stage limited to whitelisted addresses and the second stage open to the public.
Note: Make sure you have some $PEPE (ETH mainnet) in your wallet and enough $ETH to pay gas fees.)
Features of BC-404:
1/ Decentralized Value Consensus
BC-404 provides NFTs with unique inherent value by allowing the market and community members to determine the aesthetic value of each NFT. Rather than setting strict rules on rarity, BC-404 focuses on designing the numerical uniqueness of NFTs through their token contents, leaving it up to the community to decide on value. This approach aligns with the decentralized spirit of the crypto world.
2/ Enhanced NFT Scarcity
Through a new mint & burn mechanism, the scarcity of the BC-404 NFT Collection increases in tandem with token trading volume. This deflationary process addresses the imbalance between token and NFT liquidity in the market, ensuring ongoing trading vitality and profitability. Acquiring NFTs through token purchases becomes more challenging as the supply decreases.
3/ Improved Reset Mechanism
The BC-404 standard introduces a new reset mechanism that requires a higher BC count to generate a new NFT when transferring tokens. This adjustment helps maintain the vibrant trading dynamics of the NFT market.
4/ Engaging Trading Dynamics
Participants in the BC-404 ecosystem face intriguing decisions when acquiring more NFTs — whether to buy existing NFTs directly or purchase tokens to generate new ones. Factors like market premiums, token costs, and the randomness of new NFT generation add complexity to trading strategies, challenging participants to think strategically and make informed decisions.
Potential Challenges and Prospects
While BC-404 brings many innovations, its complex mechanism may require some time for market participants to fully understand and accept. Effectively explaining the advantages and operation of BC-404 to users will be a major challenge in the initial stages of project promotion. Furthermore, BC-404 balances the liquidity of tokens and NFTs through its unique mechanism, but how to maintain this balance in practical operation and avoid extreme situations will be an issue that needs continuous attention.
Conclusion
BC-404 introduces a new dimension of value to the NFT market — token content, which will be an important measure of the value of individual NFTs. Looking to the future, BC-404 not only brings new possibilities to the NFT market but also provides a new dimension for valuing digital assets. Its emergence marks that NFTs are evolving towards a more complex and diverse direction, integrating token economics, deflation mechanisms, and dynamic pricing models. With the maturation of technology and the acceptance of the market, we can expect to see more innovative applications based on BC-404, truly expanding the audience of the 404 ecosystem and leading it towards prosperity.
References:
https://docs.colorprotocol.com/?v=25
NFTScan is the world’s largest NFT data infrastructure, including a professional NFT explorer and NFT developer platform, supporting the complete amount of NFT data for 20+ blockchains including Ethereum, Solana, BNBChain, Arbitrum, Optimism, and other major networks, providing NFT API for developers on various blockchains.
Official Links:
NFTScan: https://nftscan.com
Developer: https://developer.nftscan.com
Twitter: https://twitter.com/nftscan_com
Discord: https://discord.gg/nftscan
Join the NFTScan Connect Program | nft_research |
1,906,672 | React vs. Vue: Frontend Duel. | Hey folks, I’ve been diving into two awesome frontend technologies: React and Vue. Both are super... | 0 | 2024-06-30T14:42:00 | https://dev.to/mashobtechie/react-vs-vue-frontend-duel-m97 | webdev, javascript, frontend, programming | Hey folks,
I’ve been diving into two awesome frontend technologies: React and Vue. Both are super powerful but have their own vibes. Since I use ReactJS a lot, I thought it’d be cool to explore Vue too, which i did but for a short period of time. So, here’s what I found:
Let's start with react;
React is a library for building user interfaces with components and a virtual DOM and as we all know it was made by Facebook.
Why I prefer React:
1. Component-Based: It is components based, and with that you can be able to use a particular UI in any other place on your work environment.
2. Flexible: Mix and match with other tools and frameworks.
But, it has some downsides;
1.Learning Curve: The filename JSX can be weird at first.
2. Also, many setups before starting to work with it.
Now, let's talk about VUE
Vue is a progressive framework built on top of standard HTML, CSS and JavaScript.
Why It Rocks:
1. Easy to Learn: Simple syntax, great for beginners.
2. Single-File Components: Keep HTML, JS, and CSS in one place.
3. Reactivity: State management is straightforward.
Downside:
Smaller Ecosystem: Not as many third-party tools as React.
THEIR DIFFERENCES
Learning Curve:
a. React: Steeper, especially with JSX.
b. Vue: Easier to pick up.
Ecosystem:
a. React: Huge.
b. Vue: Growing but smaller.
Performance:
Both are fast but handle updates differently.
Final Thoughts
React and Vue both kick butt. React's robustness is great for big projects, while Vue’s simplicity is perfect for getting started quickly. Exploring both is fun, but React remains my top choice for building scalable apps.
How I Plan to Use React in HNG
1.Building User Interfaces: I’m using React to create clean, reusable components that make our UIs look great and function smoothly.
2.State Management: With tools like Context API, I’ll manage complex state across our app, ensuring everything works seamlessly.
Check out more about the HNG Internship program via https://hng.tech/internship or https://hng.tech/hire.
| mashobtechie |
1,906,671 | 7 Books That Make You A Great Tech Lead | This article was originally posted in my blog:... | 0 | 2024-06-30T14:39:40 | https://dev.to/codebymedu/7-books-that-make-you-a-great-tech-lead-16gf | leadership, frontend, beginners | This article was originally posted in my blog: [https://www.codebymedu.com/blog/7-books-for-tech-lead](https://www.codebymedu.com/blog/7-books-for-tech-lead)
If you're already a tech lead or are planning to become a tech lead in the future, I've gathered 7 of the books I've read that helped me become a successful tech lead that you must read too.
I tried to gather practical books with practical steps in them, so instead of learning theory and forgetting everything afterwards, most of these books focus in practical advice.
If you don't know much about tech lead role, it differs a lot from an IC role. You are required to have a vision for the technical part of a product and lead the team there.
**"Talking to Tech Leads" by Patrick Kua**
This is my favorite book about tech lead stuff. This book involves interviews with more than 35 tech leads and brings learnings from all of them.
It helped me extremely when I first started as a tech lead to see the potential problems I might have later. And guess what, I ended up dealing with most of the issues that were mentioned and I already had ideas what to do.
**"The Manager's Path" by Camille Fournier**
This book is critical about learning soft skills such as communication, and learning how to mentor other people.
It's useful for both tech leads and engineering managers. That's why its a must read as it will give you more ideas how people are leaded and managed.
I'd suggest this book more if you're already a tech lead as it provides a lot of ideas for improvements on existing processes you might be doing.
**"Staff Engineer: Leadership beyond the Management Track" by Will Larson**
To be honest, I didn't know the term Staff Engineer existed and companies actually used it till I read this book.
It was suggested to me be a more senior engineer and I enjoyed every page of it.
This book explains ways you can grow in your career as engineer without having to go to management. It shows different paths including tech lead and what they mean.
Its a small book, so I strongly suggest you check it out as well.
**"Drive" by Daniel H. Pink**
This is a more advanced book that you can read after a while of becoming a tech lead. Otherwise you might not actually learn anything from it.
It talks about what motivates people and most importantly how to motivate people.
The reason I say you should read it only after becoming tech lead is that you probably don't notice the team motivation deeply as an IC, since you're focused in finishing your tasks.
Remember a not so skilled, but motivated team will go much further than a skilled but not motivated team.
**"The Staff Engineer’s Path" by Tanya Reilly**
This book does very well with the Staff Engineer book we mentioned above. It goes in more details about the path of Staff Engineers and the lessons apply very well in a tech lead role no matter what level.
You can read this book both before or after becoming a tech lead.
**"Engineering Management for the Rest of Us" by Sarah Drasner**
This book you only have to read if you're planning to transition to an engineering management role. It teaches a lot about leadership in a form that's easy for technical people to understand.
Though some of the concepts are already useful even if you're planning to stay as tech lead.
**"The Hard Thing About Hard Things" by Ben Horowitz**
The second non technical book in this list is a must read as well. Since tech leads must have a bigger picture of the company and how the company works in order to successfully lead the team, this book makes it easier to understand why some processes are ran the way they are.
In addition it also provides lessions about leadership that are critical in a tech lead role.
**Conclusion**
I shared 7 of the books I read as a tech lead. I'd suggest taking a look at each one of them and reading the most interesting one.
If you have more books you've read and would like to suggest for this list feel free to reach out to me at contact@codebymedu.com
| codebymedu |
1,906,669 | React vs Flutter | Getting into the field of frontend web development can be exciting and at the same confusing. Here... | 0 | 2024-06-30T14:36:59 | https://dev.to/cebuka/react-vs-flutter-1mac | webdev, flutter, react, beginners | Getting into the field of frontend web development can be exciting and at the same confusing. Here you'll discover that there are myriads of tools to get the job done. The thought of what tool to use can become a hindrance to productivity.In this article, we compare and contrast two well-known web technology tools; Reactjs and Flutter.
##What is React?
ReactJs or React is a javascript library that was built and is maintained by Facebook. It makes use of components(individual pieces of code) to build user interfaces. It is best used in building SPAs(Single Page Applications).
##What is Flutter?
Flutter is a framework developed by Google for building native applications from a single codebase. It uses the Dart programming language.Flutter is used to build mobile applications but is also used in building desktop applications and web apps.
##React Or Flutter :confused: ?
Well, suffice it to say it depends on the usecase. Each one of them is tailored towards solving a peculiar problem. For instance, React is best suited for web applications. This does not mean it cannot for mobile development, in fact, React-Native(used for mobile development) competes with Flutter in this field.
Flutter can best be used when a mobile and a web version of a product is both needed. With flutter, there's no need to rewrite the codebase.
##Conclusion
I know, I know :smiley:, I barely scratched this topic but you are free to read more on [react](https://react.dev/) and on [flutter](https://docs.flutter.dev/). Truth is, it depends on what tools are best for the particular task at hand. This can be influenced to a large extent by your team's decision. For instance, in a program that I'm a part of; [HNG](https://hng.tech/internship) internship, ReactJs is the preferred tool. The Program also [hires](https://hng.tech/hire) frontend developers.
The field of frontend development has myriads of tools that sometimes do the same thing or sometimes tailored to a particular task. It's up to you as a developer to choose from these tools what you need.
| cebuka |
1,906,668 | How I built a Billion Dollar Company - Alone! | 14 months ago I had a falling out with my VC guy. We went different ways. My VC guy got to keep my... | 0 | 2024-06-30T14:33:42 | https://ainiro.io/blog/how-i-built-a-billion-dollar-company-alone | startup | 14 months ago I had a falling out with my VC guy. We went different ways. My VC guy got to keep my previous company entirely, and I got to keep my IP and he released me from all non-competes and existing agreements we had.
Yesterday people started whispering that AINIRO might be worth one billion dollars, so I've therefore proven that I was in fact right all along, and that I could single handedly create a unicorn.
> Mission accomplished!
Of course, we will never get to prove our evaluation, because I have zero interest in becoming VC funded, and I have no plans to do an IPO - However, everybody with any knowledge about today's investment market, combined with knowledge about our technology, have already started whispering that the company might in fact be a unicorn - Implying I could probably get 200 million dollars for 20% of the company at this point in time - Which I've got **zero** interest in may I add.
In retrospect it's actually kind of sad, because had I continued working for my previous company, our financial projections back then was that we'd be worth a billion dollars at the end of 2023 due to our growth rate and revenue stream 14 months ago. Today my previous company is worth roughly 10 cents, and AINIRO is worth 1 billion US dollars. The irony ...
## Paper money
I'm still driving a 20 year old car, and if I tip a waiter it's 10% max. I don't own a mansion, and I don't have 10 zeros in my bank account. There's a huge difference between _"paper money"_ and real money, implying I am not rich-rich, at least not yet.
In fact, if you've got a job as a senior software developer in the US, you've probably got more salary than me - However, I'm building my own house, and I'm not building another man's house - There's a huge difference ...
I have a profitable AI company, delivering products and services that others have difficulties believing is even possible to create. Others are willing to pay me to gain access to our products and services, so we've definitely found our market fit. Revenue is also increasing every single week, albeit Google is trying their best to shadow ban us everywhere they can.
So I've accomplished what I set out to prove, which was that I could build a billion dollar company, alone! And that I was right, and my billionaire VC schmuck was wrong.
For the record, I could possibly have achieved a one billion dollar evaluation earlier if I had been willing to cut some corners, and accept money from others, but that would be no fun. Besides, I didn't do it for the money, I did it simply to prove him wrong. In addition I will take 100% of AINIRO's shares with me into my grave, for the simple reasons of teaching the world a lesson. I've already told my wife what I want my tombstone to say, and it is as follows ...
> He was right all along!
Anyways, today is a day for celebration. I built a Unicorn, and I did it alone! Now I can finally start building a _company_, with employees and the whole shebang. Tomorrow I might be hiring a VP of Strategic Business Development for North America, our by far largest market 😊
To my VC dude I've got only 4 words left ...
> You lose, I win!
| polterguy |
1,906,666 | Page navigation with react-router-dom | To navigate to pages in a React project, we use react-router-dom, which offers many features such as... | 0 | 2024-06-30T14:31:00 | https://dev.to/thinhkhang97/page-navigation-with-react-router-dom-a38 | webdev, react, navigation | To navigate to pages in a React project, we use react-router-dom, which offers many features such as routing and creating layouts with <Outlet/>, etc. Today, I'd like to show you how to use react-router-dom in a React project with Vite.
I recommend you read my article on setting up a React project [here](https://dev.to/thinhkhang97/react-vite-tailwind-project-57pf) before getting started.
## Installation
```bash
npm install react-router-dom
```
## Preparation
First, we need to imagine the project we're going to build. This project will list Pokémon data, as shown in the image below. It has several pages:
* /home: Landing page
* /pokemons: Pokémon list
* /pokemons/:id Pokémon detail
* /favorites: All your favorite Pokémon

Below is the project structure
```css
src/
├── apis/
│ ├── index.ts
│ └── pokemons.ts
├── assets/
│ └── logo.svg
├── components/
│ ├── index.ts
│ └── navbar.tsx
├── pages/
│ ├── favorites/
│ │ └── page.tsx
│ ├── home/
│ │ ├── components/
│ │ └── page.tsx
│ ├── not-found/
│ │ └── page.tsx
│ ├── pokemon/
│ │ └── page.tsx
│ └── pokemons/
│ ├── components/
│ ├── index.ts
│ └── layout.tsx
├── types/
│ ├── index.ts
│ ├── pokemon.ts
├── app.tsx
├── index.css
├── main.tsx
└── vite-env.d.ts
.env
```
## Configuring Routing
To enable client-side routing in this React application, we need to wrap entire application in `<BrowserRouter>` component. For each page, we need to create a `<Route/>` component, example for path `/favorites`, we get this
```tsx
<Route path="/favorites" element={<FavoritesPage />} />
```
As you can see, we have a `navbar` component that we want to keep at the top of these pages. So I created a layout named MainLayout which includes an `<Outlet/>` component to help us switch page content while keeping the `navbar` fixed.
```tsx
import { Outlet } from "react-router-dom";
import { Navbar } from "../components";
export default function MainLayout() {
return (
<div>
<Navbar />
<Outlet />
</div>
);
}
```
Then we need to make all the pages have the same layout (with the `navbar` on top) and only change the content nested inside the `MainLayout` route. From now on, each time you access to the path `/pokemons` or `/favorites`, the client will just change the content inside the layout and keep the `navbar` component.
```tsx
export default function App() {
return (
<BrowserRouter>
<Routes>
<Route path="/" element={<MainLayout />}>
<Route path="/" element={<HomePage />} />
<Route path="/favorites" element={<FavoritesPage />} />
<Route path="/pokemons" element={<PokemonsPage />} />
<Route path="/pokemons/:id" element={<PokemonPage />} />
</Route>
<Route path="*" element={<NotFound />} />
</Routes>
</BrowserRouter>
);
}
```
## Get params
When working with `react-router-dom`, you might have trouble getting params of the path. For example `/pokemons/:id`, you need to get the `id` value to fetch data of a Pokemon by its id in `PokemonPage`.
You can use `useParams` hook to get it like this
```tsx
const { id } = useParams();
```
The hook returns an object that contains all params that appear in the URL. Now you can take the id to get the Pokemon data.
That's it for the article, I hope it can help you to set up a project with `react-router-dom`. If you have any comments or suggestions, please let me know, I'd love to hear them. Thanks for reading
| thinhkhang97 |
1,906,665 | MyFirstApp - React Native with Expo (P10) - Create a Layout Profile | MyFirstApp - React Native with Expo (P10) - Create a Layout Profile | 27,894 | 2024-06-30T14:30:00 | https://dev.to/skipperhoa/myfirstapp-react-native-with-expo-p10-create-a-layout-profile-29cc | react, reactnative, webdev, tutorial | MyFirstApp - React Native with Expo (P10) - Create a Layout Profile
{% youtube NTiZwSiEq9M %} | skipperhoa |
1,906,663 | My Journey of Taming Transitive Dependencies in Spring Boot | The beauty of being a Software Developer lies in the endless journey of learning and overcoming... | 0 | 2024-06-30T14:29:27 | https://dev.to/the_zen/my-journey-of-taming-transitive-dependencies-in-spring-boot-3n2c | java, springboot, developers, softwareengineering | The beauty of being a Software Developer lies in the endless journey of learning and overcoming challenges. Among the myriads of obstacles I’ve faced, one stands out as particularly overwhelming: **_Managing Transitive Dependencies in my Spring Boot application using Maven_**.
This is the tale of my struggle, frustration, and ultimate triumph, a journey I hope will resonate with and inspire fellow developers.
My application was humming along nicely, bugs were getting fixed, and then....... everything came to a complete stop. Error messages about version conflicts littered my console, and my once cooperative dependencies seemed to be at war with each other.
I was confused. How could everything have gone so wrong so quickly? I dove into documentation, checked Stack Overflow, and watched video tutorials. Yet, the more I read, the more confused I became. I was completely lost.
My application still refused to cooperate. Dependencies that once played nicely together were now in conflict. It was as if my project had developed a mind of its own. I learned that the root of the problem lay in transitive dependencies, those hidden, indirect dependencies that came along for the ride when I included a library in my project.
Understanding transitive dependencies is one thing, managing them is another beast entirely. It wasn’t just about knowing which libraries depended on what, it was about ensuring that all these dependencies played nicely together.
After countless hours of trial and error, something clicked. I discovered that Maven offers a mechanism to control these dependencies through the `dependencyManagement` section in the `pom.xml` file.
Here’s an example of how I used `dependencyManagement` to resolve my conflicts:
```
<dependencyManagement>
<dependencies>
<dependency>
<groupId>com.example</groupId>
<artifactId>problematic-library</artifactId>
<version>1.2.3</version>
</dependency>
</dependencies>
</dependencyManagement>
```
The day my application finally built successfully was one of the most satisfying days of my career. The struggle had been real, the frustration intense, but the victory was sweet. I had not only solved the problem but also gained a deeper understanding of dependency management.
This experience taught me that in software development, challenges are not roadblocks but stepping stones. Each problem we solve makes us better, stronger developers.
As I embark on my journey with HNG internship. I am excited about the new challenges and opportunities to grow as a developer. To anyone considering a career in tech, I encourage you to embrace the difficulties. They are the moments that define us, the crucibles that forge our skills and resilience.
If you’re interested in pushing your boundaries and learning alongside brilliant minds, consider joining the HNG internship. You can find more information. [Here](https://hng.tech/internship)
Cheers!!!
| the_zen |
1,906,664 | Choosing Your Frontend Champion: React vs Vue Explained for Beginners | Earlier this year, I set out on a roadmap to master full-stack web development, starting with... | 0 | 2024-06-30T14:29:08 | https://dev.to/sochuks/choosing-your-frontend-champion-react-vs-vue-explained-for-beginners-2ojp | frontend, react, vue | Earlier this year, I set out on a roadmap to master full-stack web development, starting with frontend technologies. A quick internet search provided results to many exciting frameworks and their capabilities but soon enough I was able to narrow my decision to two of the most popular frameworks React and Vue. Both offer ease of use, strong community support, and extensive ecosystems, empowering developers to create scalable, interactive web interfaces.
While React and Vue can achieve similar results, choosing between them depends on your project's specifics—its complexity, scope, and type of application. This article aims to help you make an informed decision by highlighting key criteria.
## History and Common Ground
React, was created by Jordan Walke at Facebook, it was first released in 2013 and quickly gained traction. React has become one of the most widely used frontend libraries today. Vue.js, on the other hand, was created by Evan You in 2014. You, a former Google employee, aimed to extract the parts he liked about Angular and build something lightweight. It is known for its simplicity and ease of integration into existing projects. Some key similarities between include:-
- **JavaScript:** Both React and Vue are JavaScript-based and utilize Virtual DOM for efficient interaction with HTML.
- **Component-based Architecture:** Both frameworks promote code reuse and productivity by using components, which are building blocks of web interfaces.
- **Server-side Rendering (SSR):** React and Vue support SSR, enabling server rendering for faster initial page loads.
## React vs Vue: Key Differences
**Syntax & Learning Curve**
**Vue:** Uses HTML-based templates by default, making it straightforward for beginners to grasp. It separates concerns into HTML, CSS, and JavaScript, easing the learning curve.
**React:** Utilizes JSX, which blends HTML and JavaScript. While powerful, JSX might be more challenging for newcomers despite comprehensive documentation.
**Flexibility**
React is like a toolbox where you get to pick your tools. It doesn’t force you to use specific solutions for state management or routing. For example, when it comes to state management, you can choose libraries like Redux or MobX to handle your app’s data. For navigation between pages, React Router is a popular choice. The best part? You have the freedom to customize your stack by selecting tools that fit your needs. Vue takes a more structured approach. It provides official solutions for common tasks. If you’re using Vue, you’ll use Vue Router for handling routes.
**Performance**
Both React and Vue leverage Virtual DOM for efficient updates, but Vue may have a slight edge in memory allocation and startup times. Vue applications also tend to be smaller, enhancing performance.
**Popularity**
According to Stack Overflow’s 2023 Developer Survey, React remains more popular than Vue. Approximately 29,137 developers use React, while 11,761 developers use Vue1. React has consistently maintained its position as a widely adopted framework. Vue, on the other hand, has seen steady usage with 18.82% of developers using it. While it’s not as prevalent as React, Vue continues to be a strong choice for frontend development2. Its simplicity, flexibility, and lightweight nature appeal to many developers.
**Conclusion**
In conclusion, while we've explored the distinct differences between React and Vue, the right choice ultimately hinges on your project requirements and personal preferences. React shines with its superior performance and robust community support, whereas Vue offers a smoother learning curve and a delightful development experience. As someone who started with intermediate knowledge of traditional web technologies—HTML, CSS, and JavaScript—I found it crucial to embrace frameworks that enhance scalability and efficiency as technology advances. This article reflects my insights and experiences gained from using both frameworks extensively over the past six months. I'm thrilled to apply these newfound frontend skills in the dynamic environment of the HNG11 internship, eagerly anticipating the challenges and growth opportunities it will bring
If you are interested in applying for an internship in HNG, you can follow any of these links.
[HNG Internship](https://hng.tech/internship) | [HNG Premium](https://hng.tech/premium) | [HNG Hire](https://hng.tech/hire)
| sochuks |
1,906,661 | MyFirstApp - React Native with Expo (P9) - Custom Bottom Tab | MyFirstApp - React Native with Expo (P9) - Custom Bottom Tab | 27,894 | 2024-06-30T14:28:31 | https://dev.to/skipperhoa/myfirstapp-react-native-with-expo-p9-custom-bottom-tab-3eb6 | react, reactnative, webdev, tutorial | MyFirstApp - React Native with Expo (P9) - Custom Bottom Tab
{% youtube 0scLTwrfoZg %} | skipperhoa |
1,906,660 | MyFirstApp - React Native with Expo (P8) - Add Bottom Sheet | MyFirstApp - React Native with Expo (P8) - Add Bottom Sheet | 27,894 | 2024-06-30T14:27:20 | https://dev.to/skipperhoa/myfirstapp-react-native-with-expo-p8-add-bottom-sheet-5abh | react, reactnative, webdev, tutorial | MyFirstApp - React Native with Expo (P8) - Add Bottom Sheet
{% youtube 9w8hnLNB5DY %} | skipperhoa |
1,906,659 | Understanding Closures in JavaScript: A Powerful Mechanism for Variable Scope | Demystifying the Concept of Closures Closures are a fundamental concept in JavaScript that... | 0 | 2024-06-30T14:23:59 | https://dev.to/sahilatahar/understanding-closures-in-javascript-a-powerful-mechanism-for-variable-scope-2mfg | ## Demystifying the Concept of Closures
Closures are a fundamental concept in JavaScript that can be challenging to grasp at first, but once understood, they become a powerful tool in your programming arsenal. In this blog post, we'll dive deep into the world of closures, exploring what they are, how they work, and why they are so important in JavaScript development.
## What are Closures?
At its core, a closure is a function that has access to variables from an outer function, even after the outer function has finished executing. This may sound a bit abstract, but let's break it down with a simple example:
### A Basic Closure Example
Imagine you have a function called `outerFunction` that takes a single argument, `name`, and returns another function called `innerFunction`. The `innerFunction` doesn't take any arguments, but it has access to the `name` variable from the `outerFunction`:
```javascript
function outerFunction(name) {
return function innerFunction() {
console.log(`Hello, ${name}!`)
}
}
```
In this example, the `innerFunction` is a closure because it has access to the `name` variable from the `outerFunction`, even after the `outerFunction` has finished executing. This is the essence of a closure: a function that "closes over" the variables it needs from its outer scope.
## How Closures Work
To understand how closures work, we need to dive a bit deeper into the concept of scope in JavaScript. Scope refers to the accessibility of variables and functions within a specific part of your code. In JavaScript, there are two main types of scope: global scope and local scope.
### Global Scope vs. Local Scope
Variables and functions declared in the global scope are accessible from anywhere in your code, while variables and functions declared within a function (local scope) are only accessible within that function and any nested functions. This is where closures come into play.
When you create a function inside another function, the inner function has access to the variables in its own scope, as well as the variables in the scope of any outer functions. This is the key to how closures work: the inner function "remembers" the variables it needs from the outer function, even after the outer function has finished executing.
### Closure Execution
Let's go back to our previous example and see how the closure is executed:
```javascript
function outerFunction(name) {
return function innerFunction() {
console.log(`Hello, ${name}!`);
};
}
const myGreeting = outerFunction('Alice');
myGreeting(); // Output: "Hello, Alice!"
}
```
In this example, when we call `outerFunction('Alice')`, it returns the `innerFunction`. We then store this returned function in the `myGreeting` variable. When we later call `myGreeting()`, the `innerFunction` is executed, and it has access to the `name` variable from the `outerFunction`, even though the `outerFunction` has already finished executing.
## Why Closures are Important
Closures are an essential concept in JavaScript for several reasons:
### Data Encapsulation
Closures allow you to create private variables and methods, which is a fundamental principle of object-oriented programming. By using a closure, you can create a function that has access to private variables, but those variables are not accessible from outside the function.
### Persistent Data
Closures can be used to create functions that "remember" data from previous function calls. This can be useful for things like caching, memoization, and creating stateful functions.
### Callback Functions
Closures are often used in the implementation of callback functions, which are a fundamental part of asynchronous programming in JavaScript. Callback functions have access to the variables and context of the function that created them, thanks to closures.
### Module Pattern
The Module pattern, a common design pattern in JavaScript, relies heavily on closures to create private variables and methods, while still exposing a public API.
## Practical Use Cases for Closures
Now that we understand the basics of closures, let's explore some practical use cases where they can be incredibly useful:
### Memoization
Memoization is a technique used to cache the results of expensive function calls and return the cached result when the same inputs occur again. Closures are perfect for implementing memoization, as they allow you to maintain a cache of previous results within the function itself.
### Event Handlers
Closures are often used in event handlers, where you need to maintain a reference to some state or data that is relevant to the event. For example, you might use a closure to keep track of the number of times a button has been clicked.
### Currying
Currying is a technique where you transform a function that takes multiple arguments into a sequence of functions, each taking a single argument. Closures are essential for implementing currying, as they allow you to "remember" the arguments from previous function calls.
### Partial Application
Partial application is similar to currying, but instead of transforming a function into a sequence of single-argument functions, you create a new function with some of the arguments already "filled in." Closures are key to implementing partial application as well.
## Conclusion
Closures are a powerful and versatile concept in JavaScript that can be used to solve a wide range of problems. By understanding how closures work and the various use cases they support, you can write more efficient, maintainable, and expressive code. While they may seem complex at first, with practice, closures will become an essential tool in your JavaScript toolbox.
| sahilatahar | |
1,906,658 | AWS-infrastructure-using-Terraform | 🚀 Excited to share my latest project where I set up a highly available and scalable infrastructure... | 0 | 2024-06-30T14:22:41 | https://dev.to/sukuru_naga_sai_srinivasu/aws-infrastructure-using-terraform-3p6j | aws, terraform |

🚀 Excited to share my latest project where I set up a highly available and scalable infrastructure on AWS using Terraform! 🌐
🔧 Project Overview:
1) Provider Configuration: Utilized AWS as the cloud provider with the region set to us-east-1.
2) VPC Creation: Built a Virtual Private Cloud (VPC) with subnets in two different availability zones for enhanced fault tolerance.
3) Internet Gateway & Route Table: Established an Internet Gateway and configured route tables to manage internet access for our resources.
4) Security Group: Implemented a robust security group to control inbound and outbound traffic, ensuring only HTTP and SSH traffic are allowed.
5) EC2 Instances: Deployed two EC2 instances across the subnets with user data scripts for initialization, enabling automated setup.
6) Application Load Balancer: Configured an application load balancer to distribute traffic between the EC2 instances, ensuring high availability and reliability.
7) S3 Bucket: Created an S3 bucket for additional storage needs.
Outputs: Provided the DNS name of the load balancer for easy access to the deployed application.
This project underscores the power of Infrastructure as Code (IaC) with Terraform, enabling automated, repeatable, and efficient cloud infrastructure setup. | sukuru_naga_sai_srinivasu |
1,906,656 | Day20- 90DaysOfDevOps | Hey Learners! Welcome back. We learned about Docker, Docker-Compose, Docker-Volumes and Docker... | 0 | 2024-06-30T14:20:58 | https://dev.to/oncloud7/day20-90daysofdevops-156b | devops, docker, awschallenge, 90daysofdevop | Hey Learners! Welcome back. We learned about Docker, Docker-Compose, Docker-Volumes and Docker Networking hands-on. Now it's time to create a comprehensive cheat sheet of all the commands we've learned so far. Let's get started.
_**Docker Commands:-**_
**docker login**
**docker pull <image-name>**
**docker push <repo/image-name:version>
docker logout
docker build . -t <any-name:version>
docker images OR docker image ls
docker rmi <image-name> OR docker image rm <image-name>
docker tag <image-name> <new-tag>:version
docker inspect <image-name>
docker save <image-name> --output <file-name.tar>
docker load --input <file-name.tar>
docker create <image-name>
docker run <image-name>
docker run -d --name <cont-name> <image-name>
docker run -e <variable=value> <image-name>
docker run --network <any-network> <image-name>
docker run -p <host-port:cont-port> <image-name>
docker run --rm <image-name>
docker run -v <host/dir/path>:<cont/dir/path> <image-name>
docker start <cont-name>
docker stop <cont-name>
docker pause <cont-name>
docker unpause <cont-name>
docker restart <cont-name>
docker export <cont-name> > <file-name.tar> OR docker export --output="<file-name.tar> <cont-name>
docker ps
docker ps -a
docker inspect <cont-name>
docker logs <cont-name>
docker stats**
_**Docker-Compose Commands:-**_
**docker-compose up
docker-compose up -d <service-name>
docker-compose down
docker-compose down <service-name>
docker-compose ps
docker-compose rm
docker-compose rm <service-name>
docker-compose start
docker-compose stop
docker-compose restart
docker-compose pause
docker-compose unpause**
_**Docker Networking Commands:-**_
**docker network ls
docker network create <net-name>
docker network inspect <net-name>
docker network connect <net-name> <cont-name>
docker network disconnect <net-name> <cont-name>
docker network rm <net-name>**
Thank you so much for taking the time to read till the end! Hope you found this blog informative.
| oncloud7 |
1,906,633 | A Comprehensive Guide to Functional Testing with Selenium | Functional testing is a crucial aspect of software development, ensuring that applications perform as... | 0 | 2024-06-30T14:14:53 | https://dev.to/iaadidev/a-comprehensive-guide-to-functional-testing-with-selenium-1mek | selenium, testing, devops, linux |
Functional testing is a crucial aspect of software development, ensuring that applications perform as expected under various conditions. Among the numerous tools available for functional testing, Selenium stands out due to its flexibility and extensive support for web applications. In this blog, we will explore Selenium in depth, including its installation, basic usage, and professional tips to make the most out of this powerful tool.
## What is Selenium?
Selenium is an open-source framework for automating web browsers. It provides a suite of tools and libraries that enable the automation of web applications for testing purposes. Selenium supports multiple programming languages like Java, C#, Python, and JavaScript, allowing testers to write test scripts in the language they are most comfortable with.
## Selenium Components
Selenium is not just a single tool but a suite of tools, each serving a specific purpose:
1. **Selenium WebDriver**: A tool for writing automated tests of websites. It aims to provide a friendly API that's easy to explore and understand, which helps make your tests easier to read and maintain.
2. **Selenium IDE**: A Chrome and Firefox plugin that allows you to record and playback tests in the browser.
3. **Selenium Grid**: A tool to run tests on different machines against different browsers in parallel.
For this guide, we will focus on Selenium WebDriver, as it is the most widely used component for functional testing.
## Installing Selenium
### Prerequisites
Before installing Selenium, ensure you have the following:
- A programming language installed (Python is used in this guide).
- A web browser (Chrome or Firefox).
- The respective web driver for your browser (ChromeDriver for Chrome, GeckoDriver for Firefox).
### Step-by-Step Installation Guide
1. **Install Python**: If you haven't already installed Python, download it from [python.org](https://www.python.org/downloads/) and follow the installation instructions.
2. **Install Selenium**: Use pip, Python’s package manager, to install Selenium.
```bash
pip install selenium
```
3. **Download WebDriver**: Download the WebDriver for your browser.
- **ChromeDriver**: Download from [ChromeDriver](https://sites.google.com/a/chromium.org/chromedriver/downloads) and place it in a directory that is in your system's PATH.
- **GeckoDriver (Firefox)**: Download from [GeckoDriver](https://github.com/mozilla/geckodriver/releases) and place it in a directory that is in your system's PATH.
4. **Verify Installation**: Create a simple Python script to verify the installation.
```python
from selenium import webdriver
# Initialize the Chrome driver
driver = webdriver.Chrome()
# Open a website
driver.get("http://www.google.com")
# Close the browser
driver.quit()
```
Save the script as `test_selenium.py` and run it using the command:
```bash
python test_selenium.py
```
If everything is set up correctly, a Chrome browser will open, navigate to Google, and then close.
## Using Selenium: A Practical Guide
### Writing Your First Test
Let's write a basic test script to search for a term on Google.
1. **Initialize WebDriver**: Import the necessary modules and initialize the WebDriver.
```python
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.common.keys import Keys
from selenium.webdriver.chrome.service import Service as ChromeService
from webdriver_manager.chrome import ChromeDriverManager
driver = webdriver.Chrome(service=ChromeService(ChromeDriverManager().install()))
```
2. **Open Google**: Use the `get` method to navigate to Google's homepage.
```python
driver.get("http://www.google.com")
```
3. **Locate the Search Box**: Use Selenium's `find_element` method to locate the search box element.
```python
search_box = driver.find_element(By.NAME, "q")
```
4. **Perform a Search**: Send a search query and press Enter.
```python
search_box.send_keys("Selenium WebDriver")
search_box.send_keys(Keys.RETURN)
```
5. **Close the Browser**: Once the search results are displayed, close the browser.
```python
driver.quit()
```
Putting it all together, the complete script looks like this:
```python
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.common.keys import Keys
from selenium.webdriver.chrome.service import Service as ChromeService
from webdriver_manager.chrome import ChromeDriverManager
driver = webdriver.Chrome(service=ChromeService(ChromeDriverManager().install()))
driver.get("http://www.google.com")
search_box = driver.find_element(By.NAME, "q")
search_box.send_keys("Selenium WebDriver")
search_box.send_keys(Keys.RETURN)
driver.quit()
```
### Advanced Usage and Best Practices
To use Selenium professionally, consider the following advanced features and best practices:
1. **Implicit and Explicit Waits**: These are crucial for handling dynamic content and ensuring that elements are available before interacting with them.
- **Implicit Wait**: Sets a default wait time for the entire WebDriver instance.
```python
driver.implicitly_wait(10) # seconds
```
- **Explicit Wait**: Waits for a specific condition to be met before proceeding.
```python
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
wait = WebDriverWait(driver, 10)
element = wait.until(EC.presence_of_element_located((By.NAME, "q")))
```
2. **Page Object Model (POM)**: A design pattern that enhances test maintenance and reduces code duplication by creating an object repository for web elements.
```python
class GoogleSearchPage:
def __init__(self, driver):
self.driver = driver
self.search_box = driver.find_element(By.NAME, "q")
def search(self, text):
self.search_box.send_keys(text)
self.search_box.send_keys(Keys.RETURN)
# Usage
driver = webdriver.Chrome(service=ChromeService(ChromeDriverManager().install()))
driver.get("http://www.google.com")
search_page = GoogleSearchPage(driver)
search_page.search("Selenium WebDriver")
driver.quit()
```
3. **Exception Handling**: Properly handle exceptions to make your tests more robust and informative.
```python
from selenium.common.exceptions import NoSuchElementException
try:
element = driver.find_element(By.NAME, "non_existent_element")
except NoSuchElementException:
print("Element not found!")
```
4. **Logging and Reporting**: Implement logging and reporting to keep track of test execution and results.
```python
import logging
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger()
logger.info("Starting test")
# Your test code here
logger.info("Test finished")
```
## Conclusion
Selenium is an indispensable tool for automating web application testing. With its powerful features and support for multiple programming languages, it allows testers to create robust and scalable test suites. By following the installation steps, understanding the basics of WebDriver, and incorporating best practices, you can leverage Selenium to ensure the quality and reliability of your web applications.
Happy testing! | iaadidev |
1,906,632 | SvelteKit vs React: A Technical Comparison | Web development frameworks are constantly evolving, and two popular options for building modern web... | 0 | 2024-06-30T14:10:39 | https://dev.to/chiater_dev/sveltekit-vs-react-a-technical-comparison-297p | webdev, javascript, programming, beginners |
Web development frameworks are constantly evolving, and two popular options for building modern web applications are SvelteKit and React. This article will compare these technologies across several key dimensions to help developers make informed choices for their projects.
## Overview
### SvelteKit
SvelteKit is a framework for building web applications using Svelte, a compile-time JavaScript framework. It provides an opinionated structure for building full-stack web applications with built-in routing, server-side rendering, and code-splitting.
### React
React is a JavaScript library for building user interfaces, typically used with additional tools and libraries to create full web applications. Popular frameworks built on React include Next.js and Create React App.
## Component Model
### SvelteKit
Svelte uses a compile-time approach, converting your components into efficient JavaScript that updates the DOM. Components are written in .svelte files, which can contain HTML, CSS, and JavaScript in a single file.
```svelte
<script>
let count = 0;
function increment() {
count += 1;
}
</script>
<button on:click={increment}>
Clicks: {count}
</button>
<style>
button { background-color: #ff3e00; color: white; }
</style>
```
### React
React uses a runtime approach with a virtual DOM. Components are typically written in JSX, a syntax extension for JavaScript.
```jsx
import React, { useState } from 'react';
function Counter() {
const [count, setCount] = useState(0);
return (
<button onClick={() => setCount(count + 1)}>
Clicks: {count}
</button>
);
}
// CSS would typically be in a separate file or using CSS-in-JS
```
## Performance
SvelteKit generally offers better out-of-the-box performance due to its compile-time approach, resulting in smaller bundle sizes and faster initial load times. React's performance can be optimized but often requires more developer effort and additional libraries.
## Learning Curve
SvelteKit is often considered easier to learn, especially for developers new to frontend frameworks. Its syntax is closer to vanilla HTML, CSS, and JavaScript. React has a steeper learning curve, particularly around concepts like JSX and hooks.
## Ecosystem and Community
React has a larger ecosystem and community, with a vast array of third-party libraries and tools available. SvelteKit's ecosystem is growing but is currently smaller than React's.
## Server-Side Rendering (SSR)
Both SvelteKit and React (via frameworks like Next.js) support server-side rendering. SvelteKit provides this functionality out of the box, while React typically requires additional setup or the use of a meta-framework.
## Routing
SvelteKit includes a file-based routing system similar to Next.js. React itself doesn't include routing, but libraries like React Router are commonly used to add this functionality.
## State Management
SvelteKit uses a simple store system for global state management. React often relies on external libraries like Redux or MobX, although hooks have simplified state management in recent versions.
## Build Output
SvelteKit produces highly optimized, lightweight JavaScript. React applications tend to have larger bundle sizes due to the inclusion of the React library itself.
## Advantages of SvelteKit
1. **Performance**: Better out-of-the-box performance with smaller bundle sizes and faster initial load times.
2. **Simplicity**: Straightforward syntax and structure, easier to learn and use.
3. **Less Boilerplate**: Requires less code to achieve the same functionality compared to React.
4. **Built-in Features**: Routing, code-splitting, and server-side rendering included by default.
5. **Reactive by Design**: Intuitive reactivity system built into the language.
6. **Scoped CSS**: Styles in components are scoped by default, reducing style conflicts.
7. **Smaller Learning Curve**: Syntax closer to vanilla HTML, CSS, and JavaScript.
## Advantages of React
1. **Ecosystem**: Vast ecosystem of third-party libraries, tools, and resources.
2. **Community Support**: Large and active community for help and resources.
3. **Job Market**: More job opportunities due to React's popularity.
4. **Flexibility**: Can be used for various types of projects, including mobile apps via React Native.
5. **Maturity**: Well-established and battle-tested in large-scale production environments.
6. **Developer Tools**: Excellent developer tools, including React Developer Tools browser extension.
7. **Server-Side Rendering Options**: Multiple mature options like Next.js and Gatsby.
8. **Virtual DOM**: Can lead to efficient updates in complex applications when used correctly.
## Conclusion
Both SvelteKit and React offer unique advantages that cater to different project needs and developer preferences.
SvelteKit shines in scenarios where performance is critical, and rapid development with minimal setup is desired. Its simplicity and built-in features make it an excellent choice for smaller to medium-sized projects or for teams looking to adopt a modern, efficient framework.
React excels in complex, large-scale applications where its extensive ecosystem and community support can be leveraged fully. It's also a great choice for teams that require flexibility in their tech stack or are building cross-platform applications.
The choice between SvelteKit and React will ultimately depend on your specific project requirements, team expertise, and long-term development goals. Consider factors such as application complexity, performance needs, team size and experience, and the need for specific third-party integrations when making your decision.
A cool way to learn React would be to sign up to HNG's free internship platform where you could get trained in various technologies like React and NodeJS under various niches. Sign up at [HNG Hiring Page](https://hng.tech/hire) and [HNG Internship](https://hng.tech/internship) | chiater_dev |
1,906,631 | Our Approach to AWS Well-Architected Best Practices | What is Well-Architected? The Well-Architected framework, a comprehensive compilation of... | 0 | 2024-06-30T14:08:37 | https://dev.to/shu85t/our-approach-to-aws-well-architected-best-practices-3edg | aws, wellarchitectedframework | ## What is Well-Architected?
The Well-Architected framework, a comprehensive compilation of 'best practices' for cloud architecture, is created by AWS.
https://aws.amazon.com/architecture/well-architected
Based on this review, you can identify risks that are not in line with best practices.
## Impressions from Conducting Reviews
Meeting best practices involves not only workload-specific design, implementation, and operation but also organizational structure, rules, and training.
We have been working with a cross-departmental team called the "Well-Architected Working Group" for several years, and I have compiled a rough diagram of our activities.
## Diagram of AWS Well-Architected Utilization Status Across Our Company

I will pick a few examples from the diagram and introduce them.
### Well-Architected Working Group
The working group comprises one or more members selected from each target department.
Having members from various departments helps us understand the different circumstances and situations during our meetings.
### Reading and Discussing Best Practices
The descriptions of best practices become more transparent and easier to understand each year. Still, some items are difficult to interpret in concrete terms. Establish internal guidelines to determine the acceptable extent and interpretation of best practices.
These discussions also provide feedback on cheat sheets, internal standards and templates, internal training, and internal systems as necessary.
### Mini BootCamp
The Well-Architected BootCamp is a workshop where you can learn to apply best practices.
Teams will discuss and present their results on applying best practices to virtual systems and business situations.
The Mini BootCamp is our in-house version of this workshop. Below is a participation report from one of these sessions.
https://iret.media/tag/aws-well-architected-mini-bootcamp
## Motivation
If reviewing becomes the main objective, it can distract from the essence. Therefore, I would like to use the AWS best practices positively to ensure quality and improve efficiency!
| shu85t |
1,906,628 | Developing Custom Plugins for CoreDNS | Develop and Run Custom Plugins for CoreDNS | 0 | 2024-06-30T14:05:27 | https://dev.to/satrobit/developing-custom-plugins-for-coredns-4jnj | tutorial, go, linux, programming | ---
title: Developing Custom Plugins for CoreDNS
published: true
description: Develop and Run Custom Plugins for CoreDNS
tags: tutorial,go,linux,programming
---
## Introduction To CoreDNS
CoreDNS is a powerful, flexible DNS server written in Go. One of its key features is its plugin-based architecture, which allows users to extend its functionality easily. In this blog post, we'll explore how to write custom plugins for CoreDNS.

As previously mentioned, CoreDNS utilizes a plugin chain architecture, enabling you to stack multiple plugins that execute sequentially. Most of CoreDNS's functionality is provided by its built-in plugins. You can explore these bundled plugins by [Clicking here](https://github.com/coredns/coredns/tree/master/plugin).
## Architecture Overview
CoreDNS follows a similar approach to Caddy, as it is based on Caddy v1:
- **Load Configuration**: Configuration is loaded through the `Corefile` file.
- **Plugin Setup**: Plugins must implement a `setup` function to load, validate the configuration, and initialize the plugin.
- **Handler Implementation**: You need to implement the required functions from the `plugin.Handler` interface.
- **Integrate Your Plugin**: Add your plugin to CoreDNS by either including it in the `plugin.cfg` file or by wrapping everything in an external source code. Further details can be found below.
## Develop
### Configuration
As mentioned above, everything is done through the `Corefile`. If you're not familiar with the syntax, check this short explanation: https://coredns.io/2017/07/23/corefile-explained/
```
. {
foo
}
```
In the example above, `.` defines a server block, and `foo` is the name of your plugin. You can specify a port or add arguments to your plugin.
```
.:5353 {
foo bar
}
```
Now CoreDNS is running on port `5353` and my plugin named `foo` is given the argument `bar`.
> It's useful to enable the plugins `log` and `debug` during the development.
Checkout the list of bundled plugins to figure out which ones you need in your setup: https://coredns.io/plugins/
### Setup
The first thing you need to do is to register and set up your plugin. Registration is done through a function called `init` which you need to include in your go module.
```go
package foo
import (
"github.com/coredns/caddy"
"github.com/coredns/coredns/core/dnsserver"
"github.com/coredns/coredns/plugin"
)
func init() {
plugin.Register("foo", setup)
}
```
Now we need to implement `setup()` which parses the configuration and returns our initialized plugin.
```go
package foo
import (
"github.com/coredns/caddy"
"github.com/coredns/coredns/core/dnsserver"
"github.com/coredns/coredns/plugin"
)
func init() {
plugin.Register("foo", setup)
}
func setup(c *caddy.Controller) error {
c.Next() // #1
if !c.NextArg() { // #2
return c.ArgErr()
}
dnsserver.GetConfig(c).AddPlugin(func(next plugin.Handler) plugin.Handler {
return Foo{Next: next, Bar: c.val()} // #3
})
return nil // #4
}
```
1. Skip the first token which is `foo`, the name of our argument.
2. Return an error if our argument didn't have any value.
3. Put the value of our argument `bar` in the plugin struct and return it to be put in the plugin chain. Read more details on the plugin struct further down.
4. return `nil` as an error if everything is good to go.
### Handler
All plugins need to implement `plugin.Handler` which is the entry point to your plugin.
First, we need to write a struct containing the necessary arguments, runtime objects, and also the next plugin in the chain.
```go
type Foo struct {
Bar string
Next plugin.Handler
}
```
> This is the actual struct that we created in the previous step.
We also need a method to return the name of the plugin.
```go
func (h Foo) Name() string { return "foo" }
```
Now it's time for the most important method which is `ServeDNS()`. This is the method that is called for every DNS query routed to your plugin. You can also generate a response here making your plugin work as a data backend.
```go
func (h Foo) ServeDNS(ctx context.Context, w dns.ResponseWriter, r *dns.Msg) (int, error) {
return h.Next.ServeDNS(ctx, w, r)
}
```
What you see here does nothing but call the next plugin in the chain. But we don't have to do that :)
Use `r *dns.Msg` to get some info on the DNS query.
```go
state := request.Request{W: w, Req: r}
qname := state.Name()
```
List of variables you can get from `state`:
- `state.Name()` name of the query - includes the zone as well
- `state.Type()` type of the query - e.g. `A`, `AAAA`, etc
- `state.Ip()` IP address of the client making the request
- `state.Proto()` transport protocol - `tcp` or `udp`
- `state.Family()` IP version - `1` for IPv4 and `2` for IPv6
> Read the following file for the complete list: https://github.com/coredns/coredns/blob/master/request/request.go
You can also generate a response and return from the chain. For that, you need to use the amazing `github.com/miekg/dns` package and build a `dns.Msg` to return.
```go
func (h Foo) ServeDNS(ctx context.Context, w dns.ResponseWriter, r *dns.Msg) (int, error) {
dummy_ip := "1.1.1.1"
state := request.Request{W: w, Req: r}
qname := state.Name()
answers := make([]dns.RR, 0, 10)
resp := new(dns.A)
resp.Hdr = dns.RR_Header{Name: dns.Fqdn(qname), Rrtype: dns.TypeA,
Class: dns.ClassINET, Ttl: a.Ttl}
resp.A = net.ParseIP(dummy_ip)
answers = append(answers, resp)
m := new(dns.Msg)
m.SetReply(r)
m.Authoritative, m.RecursionAvailable, m.Compress = true, false, true
m.Answer = append(m.Answer, answers...)
state.SizeAndDo(m)
m = state.Scrub(m)
_ = w.WriteMsg(m)
return dns.RcodeSuccess, nil
}
```
In the example shown above, we create an `A` record response and return it to the client.
> Check out the DNS package we used for more details on how to create DNS objects: https://pkg.go.dev/github.com/miekg/dns
If successful, we return `dns.RcodeSuccess`. To see more return codes, check out here: https://pkg.go.dev/github.com/miekg/dns#pkg-constants
A few important return codes:
- `RcodeSuccess`: No error
- `RcodeServerFailure`: Server failure
- `RcodeNameError`: Domain doesn't exist
- `RcodeNotImplemented`: Record type not implemented
### Logging
You can use the logging package provided by the CoreDNS itself, `github.com/coredns/coredns/plugin/pkg/log`.
```go
package Foo
import (
clog "github.com/coredns/coredns/plugin/pkg/log"
)
var log = clog.NewWithPlugin("foo")
```
Now you can log anything you need with different levels:
```go
log.Info("info log")
log.Debug("debug log")
log.Warning("warning log")
log.Error("error log")
```
### Final Example
```go
package Foo
import (
"github.com/coredns/coredns/request"
"github.com/miekg/dns"
"golang.org/x/net/context"
clog "github.com/coredns/coredns/plugin/pkg/log"
)
var log = clog.NewWithPlugin("foo")
type Foo struct {
Bar string
Next plugin.Handler
}
func (h Foo) Name() string { return "foo" }
func (h Foo) ServeDNS(ctx context.Context, w dns.ResponseWriter, r *dns.Msg) (int, error) {
dummy_ip := "1.1.1.1"
state := request.Request{W: w, Req: r}
qname := state.Name()
answers := make([]dns.RR, 0, 10)
resp := new(dns.A)
resp.Hdr = dns.RR_Header{Name: dns.Fqdn(qname), Rrtype: dns.TypeA,
Class: dns.ClassINET, Ttl: a.Ttl}
resp.A = net.ParseIP(dummy_ip)
answers = append(answers, resp)
log.Debug("answers created")
m := new(dns.Msg)
m.SetReply(r)
m.Authoritative, m.RecursionAvailable, m.Compress = true, false, true
m.Answer = append(m.Answer, answers...)
state.SizeAndDo(m)
m = state.Scrub(m)
_ = w.WriteMsg(m)
return dns.RcodeSuccess, nil
}
```
## Compile
CoreDNS gives you two different ways to run your plugin, both are static builds.
### Compile-time Configuration
In this method, you need to clone the CoreDNS source code, add your plugin to the `plugin.cfg` file (plugins are ordered), and compile the code.
```
etcd:etcd
foo:github.com/you/foo
```
Then you need to do a `go get github.com/you/foo` and build the CoreDNS binary using `make`.
Run `./coredns -plugins` to ensure your plugin is included in the binary.
> If your plugin is on your local machine you can put something like `replace github.com/you/foo => ../foo` in your `go.mod` file.
### Wrapping in External Source Code
You also have the option to wrap the CoreDNS components and your plugin in an external source code and compile from there.
```go
package main
import (
_ "github.com/you/foo"
"github.com/coredns/coredns/coremain"
"github.com/coredns/coredns/core/dnsserver"
)
var directives = []string{
"foo",
...
...
"whoami",
"startup",
"shutdown",
}
func init() {
dnsserver.Directives = directives
}
func main() {
coremain.Run()
}
```
As with any other go app, do a `go build` and you should have the binary.
## Links
- https://coredns.io/2017/06/08/how-queries-are-processed-in-coredns/
- https://coredns.io/2016/12/19/writing-plugins-for-coredns/
- https://coredns.io/2017/07/25/compile-time-enabling-or-disabling-plugins/
- https://mrkaran.dev/posts/coredns-nomad/
| satrobit |
1,906,629 | Comparing Frontend Technologies: ReactJS vs VueJS | Introduction As a newbie to front-end development, navigating the landscape of... | 0 | 2024-06-30T14:04:04 | https://dev.to/xyzeez/comparing-frontend-technologies-reactjs-vs-vuejs-210j | react, veu, webdev, frontend | ##Introduction
As a newbie to front-end development, navigating the landscape of technologies can be overwhelming. Among the many frameworks and libraries available, ReactJS and VueJS have emerged as two of the most popular and widely adopted tools for building modern web applications. Both offer powerful features, a strong community, and extensive documentation, making them excellent choices for developers of all skill levels. Through my recent research into these technologies, I've discovered their unique characteristics and strengths. This article aims to share my findings and provide a comparative overview of ReactJS and VueJS, helping you understand what makes them stand out and how they might fit into your development projects. I'm particularly excited about how these tools can enhance my skills and efficiency as I delve deeper into front-end development.
## Background Information
[**ReactJS**](https://react.dev/), which was created by [Jordan Walke](https://github.com/jordwalke) but now maintained by a dedicated team working full-time at [Meta](https://about.meta.com/), is a JavaScript library for building web and native user interfaces, particularly single-page applications. It allows developers to create reusable UI components and manage the state of their applications efficiently. React follows a declarative programming style and leverages a virtual DOM for optimal performance.
[**VueJS**](https://vuejs.org/) is a progressive JavaScript framework created by [Evan You](https://evanyou.me/). It is designed to be incrementally adoptable, meaning you can use as much or as little of the framework as you need. Vue is known for its simplicity and ease of integration with other projects and libraries, making it a popular choice for small and large applications.
## Key Differences
When comparing ReactJS and VueJS, several key aspects stand out:
### 1. Learning Curve and Ease of Use
**ReactJS:**
- **Learning Curve:** React has a moderate learning curve. While its core concepts (components, props, state) are straightforward, mastering the ecosystem (e.g., Redux for state management, React Router for navigation) can be challenging.
- **JSX:** React uses JSX, a syntax extension for JavaScript that lets you write HTML-like markup inside a JavaScript file. This can be both a boon and a barrier, as it requires familiarity with both JavaScript and the nuances of JSX.
**VueJS:**
- **Learning Curve:** Vue has a gentle learning curve, making it accessible for beginners. Its documentation is comprehensive and easy to follow, which helps newcomers get up to speed quickly.
- **Templates:** Vue uses an HTML-based template syntax that allows you to declaratively bind the rendered DOM to the underlying component instance's data, separating the logic and the presentation, simplifying development and debugging. VueJS templates are said to be more intuitive for developers with an HTML/CSS background.
### 2. Performance
**ReactJS:**
- **Virtual DOM:** React's virtual DOM diffing algorithm updates the real DOM efficiently, minimizing its manipulation to ensure fast rendering and high performance even in complex applications.
- **Optimizations:** React offers various performance optimization techniques like [memoization](https://www.geeksforgeeks.org/what-is-memoization-in-react/), and [lazy loading](https://www.geeksforgeeks.org/lazy-loading-in-react-and-how-to-implement-it/), which can further enhance application speed.
**VueJS:**
- **Virtual DOM:** Vue also uses a virtual DOM, and its rendering performance is comparable to React.
- **Optimizations:** Vue's reactivity system is highly optimized, automatically tracking dependencies during rendering, reducing the need for manual optimizations.
### 3. Flexibility and Ecosystem
**ReactJS:**
- **Flexibility:** React is highly flexible and can be integrated with a wide range of libraries and tools. However, this flexibility can lead to decision fatigue as developers need to choose their tools for routing, state management, etc.
- **Ecosystem:** React has a vast ecosystem with a plethora of third-party libraries, extensions, and strong community support. Popular tools include [Redux](https://redux.js.org/) for state management and [React Router](https://reactrouter.com/en/main) for navigation.
**VueJS:**
- **Flexibility:** Vue is also flexible but provides more built-in solutions out of the box, such as [Vue Router](https://router.vuejs.org/) for navigation and [Vuex](https://vuex.vuejs.org/) for state management. This can reduce the need for additional libraries and make development more straightforward.
- **Ecosystem:** Vue's ecosystem is smaller than React's but rapidly growing. The [Vue CLI](https://cli.vuejs.org/) provides a robust tool for project scaffolding and build configuration.
### 4. Community and Support
**ReactJS:**
React has a large and active community, pointing to its abundant resources, tutorials, and third-party libraries. Currently maintained by Meta, React benefits from strong corporate support and regular updates.
**VueJS:**
Vue's community is smaller but very passionate and supportive. The official documentation is well-maintained and comprehensive. Vue is not backed by a tech giant, but it is widely adopted by companies like [Alibaba](https://madewithvuejs.com/alibaba) and others, which speaks to its reliability and effectiveness.
## Practical Examples
**ReactJS Example:**
```
import React, { useState } from 'react';
function App() {
const [count, setCount] = useState(0);
return (
<div>
<h1>Counter: {count}</h1>
<button onClick={() => setCount(count + 1)}>Increment</button>
</div>
);
}
export default App;
```
**VueJS Example:**
```html
<script setup>
import { ref } from 'vue'
const count = ref(0)
</script>
<template>
<button @click="count++">Count is: {{ count }}</button>
</template>
```
## Conclusion
Both ReactJS and VueJS are powerful tools for front-end development, each with its strengths and weaknesses. React’s flexibility and vast ecosystem make it ideal for large-scale applications requiring custom solutions. Vue’s simplicity and integrated approach make it perfect for developers seeking a gentle learning curve and a more opinionated framework.
Ultimately, the choice between React and Vue depends on your project requirements, team expertise, and personal preference. By understanding the key differences and advantages of each, you can select the framework that best aligns with your development goals.
## Join the HNG Internship Journey
I'll be experimenting with React at the ongoing HNG11 internship and I hope to gain deeper insights into its capabilities with my solutions to the various tasks provided by the internship. If you'd also like to embark on this journey with me, sign up using [this link](https://hng.tech/internship). You can also expand your horizons by connecting to top techies, allowing you to grow your career and collaborate with others using [this link](https://hng.tech/premium). Happy coding! | xyzeez |
1,906,627 | A Frontend Technology Comparison of Svelte vs Alpine.js | Introduction In this article, we will learn about Svelte & Alpine.js, along with discussing the... | 0 | 2024-06-30T14:00:57 | https://dev.to/audu_sunday_c2624dc16e5b3/a-frontend-technology-comparison-of-svelte-vs-alpinejs-1eka | beginners, webdev, hng | **Introduction**
In this article, we will learn about Svelte & Alpine.js, along with discussing the significant distinction that differentiates Svelte from Alpine.js.
In the dynamic world of frontend development, finding the right tool can significantly impact the efficiency and performance of your projects. While popular choices like ReactJS dominate the landscape, there are other, more niche technologies worth exploring. Today, we'll compare two such technologies: Svelte and Alpine.js. Both offer unique approaches to building web applications, and understanding their differences can help you decide which one might be best for your next project.
**Svelte: The Compiler-Based Framework**
Overview
Svelte, created by Rich Harris, is a relatively new player in the frontend framework scene. Unlike traditional frameworks such as React or Vue, Svelte shifts the heavy lifting to compile time. This means your code is compiled into highly efficient, imperative code that directly manipulates the DOM when you build your project.
**Key Features**
- No Virtual DOM: Svelte avoids the virtual DOM diffing process, resulting in faster updates and more efficient rendering.
- Reactive Assignments: Svelte's reactivity is built into the language itself, making it intuitive and straightforward to update the UI in response to state changes.
- Scoped Styles: Styles in Svelte are scoped to the component by default, reducing the chances of style conflicts.
- Small Bundle Size: The compiler's optimization leads to smaller bundle sizes, improving load times and performance.
**Advantages**
- Performance: The lack of a virtual DOM and the compiler's optimizations result in excellent runtime performance.
- Simplicity: Svelte's syntax and reactivity model are easy to understand and use, even for those new to frontend development.
- Less Boilerplate: Svelte reduces the need for boilerplate code, making your codebase cleaner and more maintainable.
**Drawbacks**
- Ecosystem: Svelte's ecosystem is smaller compared to more established frameworks, meaning fewer third-party libraries and tools.
Community: While growing, the community is still not as large as React's, which can make finding help or resources more challenging.
**Alpine.js:** The Minimalist JavaScript Framework
**Overview**
Alpine.js, created by Caleb Porzio, is a lightweight JavaScript framework designed to add interactivity to HTML. It aims to provide the reactive and declarative nature of frameworks like Vue or React, but with a much smaller footprint and without the need for a build step.
**Key Features**
- Directives: Alpine.js uses HTML attributes called directives to add interactivity to elements. These include x-data, x-bind, x-on, and more.
- Reactive Data: Similar to Vue, Alpine.js allows you to define reactive data directly within your HTML, making it easy to bind data to the DOM.
- Small and Fast: With a size of around 10kB, Alpine.js is incredibly lightweight, leading to faster load times.
- No Build Step: Alpine.js can be included directly in your HTML file, eliminating the need for a complex build process.
**Advantages**
- Ease of Use: Alpine.js is easy to learn and use, especially for those familiar with HTML and basic JavaScript.
- Integration: It integrates seamlessly with existing projects, making it ideal for adding interactivity to static HTML without overhauling the entire codebase.
- Lightweight: Its small size makes it perfect for performance-critical applications or sites with minimal interactivity needs.
**Drawbacks**
- Limited Features: Alpine.js is not a full-fledged framework and lacks some of the advanced features found in Svelte or React.
- Scalability: While great for small projects, Alpine.js may not be suitable for large-scale applications with complex state management needs.
- Community and Ecosystem: Like Svelte, Alpine.js has a smaller community and fewer third-party resources compared to React or Vue.
**Comparing Svelte and Alpine.js**
**Performance**
Svelte's compiler-based approach results in highly optimized code, making it a top performer in terms of runtime efficiency. Alpine.js, being minimalistic, performs well for its size but may not match Svelte's performance in more complex scenarios.
**Complexity and Learning Curve**
Alpine.js wins in terms of simplicity and ease of use. Its minimalistic approach and direct integration with HTML make it very approachable. Svelte, while also relatively simple, introduces a new syntax and requires a build step, which adds a bit more complexity.
**Use Cases**
Svelte: Ideal for building full-fledged applications where performance and maintainability are crucial. It’s great for developers looking for a modern, reactive framework with robust features.
Alpine.js: Best suited for enhancing static HTML with interactivity without the overhead of a full framework. Perfect for small projects or adding simple interactive features to existing sites.
**My Experience with ReactJS**
When I first started learning React, I was immediately struck by its declarative approach and component-based architecture. Coming from a background with basic HTML, CSS, and JavaScript, React's way of thinking about UI as a composition of components was a refreshing change. The concept of building reusable components made sense, and I was eager to dive deeper.
One of the first hurdles I encountered was JSX, React's syntax extension that allows you to write HTML-like code within JavaScript. Initially, it felt strange and a bit overwhelming, but as I practiced, I began to appreciate the power and flexibility it offered. Creating components and nesting them to build complex UIs became intuitive over time.
Understanding state and props was another significant challenge. The idea of passing data between components through props and managing dynamic data within a component's state took some time to grasp fully. However, once I got the hang of it, I realized how powerful these concepts were for creating interactive and responsive user interfaces.
Lifecycle methods introduced me to the inner workings of React components, allowing me to control their behavior at different stages. Later, learning about hooks, especially useState and useEffect, was a game-changer. Hooks simplified the way I managed state and side effects, making my code cleaner and more readable.
I relied heavily on online tutorials and courses to build my foundational knowledge. Platforms like Simplilearn and Udemy offered structured learning paths that guided me through the basics and into more advanced topics.
The field of frontend development is constantly evolving, and React is no exception. Staying up-to-date with the latest features, libraries, and best practices is an ongoing challenge. Subscribing to newsletters, following React experts on social media, and participating in community events have helped me stay informed and continue growing as a developer.
**Conclusion**
Both Svelte and Alpine.js offer unique benefits and cater to different needs in the frontend development landscape. Svelte’s performance and modern features make it a strong contender for full-scale applications, while Alpine.js’s simplicity and minimalism are perfect for smaller, interactive enhancements. Understanding the strengths and limitations of each can help you make an informed decision based on your project requirements.
As for me, I’m thrilled to continue my journey with ReactJS at HNG, pushing the boundaries of what we can achieve with this versatile and powerful library.
If you're interested in learning more about the HNG Internship, check out the links,
https://hng.tech/internship, https://hng.tech/hire, https://hng.tech/premium
Thank you for reading! Feel free to share your thoughts or experiences with these technologies in the comments. | audu_sunday_c2624dc16e5b3 |
1,906,625 | A Backend Story | This story is quite intriguing as I would not call myself a Backend Engineer or Programmer yet. I am... | 0 | 2024-06-30T13:57:00 | https://dev.to/veescript/a-backend-story-2cp5 | backenddevelopment, php, python, beginners | This story is quite intriguing as I would not call myself a Backend Engineer or Programmer yet.
I am a person with no background at all in backend programming except the basic CSC 201 class that taught us little to nothing about Python programming. All I knew was, there was something called languages, environments and a bit of their differences. We were forced to download them on our laptops back then to do projects and all.
But then, I was opportune to be employed by an accrediting company which uses SQL databases for their basic project operations. This is because they run most of their accrediting software locally due to network provider connection issues. That is where I got to know about PHP, CSS, HTML and other technologies.
I was fascinated, and I still am. I started getting the developer to put me through some basic things like how to connect to SQL and stuff like that. I guess he taught me so I would be able to handle accrediting projects on my own.
And yes, after a while, I was and I was promoted also. How happy I was but I wanted and still want more. So I started research and found a lot of websites willing to teach, but it wasn't challenging enough for me and I pushed them aside until I stopped entirely.
Well, that was until I found out about HNG internship program, it is a free Boot Camp apparently and it boasts of having a competitive workspace which I'm sure can hold my attention.
This is a journey I would like to begin with just a step in the right direction via HNG. If you would like to join me on my journey, click any of the links below
https://hng.tech/internship
https://hng.tech/hire
Thanks for reading.
See you in my next post | veescript |
1,906,626 | TypeScript vs React; which should I learn as a frontend developer. | React is a JavaScript library use to build reusable UI components for single page applications. It... | 0 | 2024-06-30T13:56:40 | https://dev.to/victor_88/typescript-vs-react-which-should-i-learn-as-a-frontend-developer-38gh | React is a JavaScript library use to build reusable UI components for single page applications. It uses a jsx(JavaScript XML) syntax that allows you to write html-like code. Although jsx is not compulsory, it makes building applications with react easier.
Typescript is a superset of JavaScript, which means that typescript is JavaScript but with types. It is designed for large scale application and is transpiled to JavaScript.
### Advantages of react
1. Easy to learn and use: React is very beginner friendly and can be learnt with prior knowledge of html, CSS, and JavaScript.
2. Mobile development: React is not only used for web development. With the help of a framework like react native you can build a mobile application with your knowledge of react which makes you more versatile as a developer.
3. Component based architecture: React uses a component-based architecture
that allows developers to build reusable UI components which makes development faster, less redundant and easier to maintain and scale.
### Advantages of Typescript
1. Compile-time error: Typescript enforces strict typing during compilation time which helps reveal potential errors and bugs, this happens before a typescript code is transpiled to JavaScript.
2. Type inference: Typescript can infer the types of a variable from the value assigned to the variable thereby reducing development time and improving developer experience.
3. Strong and static typing: Typescript allows you to specify the types of your variables, function parameters, and return values which help reduce errors and potential bugs.
### Conclusion
Well, you might wonder which is the best for me to learn as frontend developer. I would say learn both- yes you heard me right. Typescript gives you extra superpowers as a JavaScript developer and react makes building user interface easier, I would advise you reap the benefits of both worlds. And if you need an internship to build all the necessary experience you need for your react journey, I recommend the HNG internship program -https://hng.tech/internship, or you can opt for the premium -https://hng.tech/premium to access exclusive opportunities like certificates, reference letters, and the latest job openings. It is a remote program, so you don't have to worry about commuting. Wish you the best on your coding journey.
| victor_88 | |
1,906,623 | Moonlight Architecture - The Old-New | Moonlight Architecture is an architecture that has existed for a while but remained nameless.... | 0 | 2024-06-30T13:54:52 | https://dev.to/jet_ezra/moonlight-architecture-the-old-new-4ph5 | php, pionia, webdev, restapi | [Moonlight Architecture](https://pionia.netlify.app/moonlight/moonlight-architecture/) is an architecture that has existed for a while but remained nameless. Therefore, it is not right to say it is a new architecture yet most developers and companies have been using it. It borrows its features from commonly used architectures of MVC, gRPC, Graphql and Micro Services. Most of the features may highly sound, look more like MVC combined with Graphql.
## Features of Moonlight.
- REST only. Moonlight is meant for only REST applications. It does not support any form of templating. This is a design philosophy, whereby, it lets frontend frameworks handle frontend work while serving them APIs in the most elegant manner.
- Also Moonlight architecture is framework and language agnostic, implying that you can use it with your favourite framework in your favourite language. If you're coming from a PHP background, a framework - [Pionia](https://pionia.netlify.app), has already been put up to get you started, however, you can configure your favourite frameworks to work with this architecture. This has already been done with frameworks such as Spring Boot in Java(an excellent and pioneer framework for the architecture), Django Rest Framework in Python(worked excellently too) and Laravel in PHP(though some few features were not pulled off right).
- Moonlight encourages a single API endpoint for your entire API version, which means, all version APIs can be served at /api/v1/, then version two on /api/v2/. This is the only route you should have for your entire API. No query parameters, no relative paths but only one endpoint.
- Moonlight, encourages(enforces) all requests to the backend to be performed over POST. This has security benefits since the body of POST requests is usually logged while encrypted on a web server, thus keeping your requests secure to your application server.
- Single request format. The architecture encourages using either JSON or FormData if supported in your language or framework. Each request must define `SERVICE` and `ACTION` targeted. And the rest of the data as key-value pairs.
```JS
// POST https://exampleapi.com/api/v1/
{
SERVICE: 'auth',
ACTION: 'login',
password: '1234567',
username: 'moonlight'
}
```
- Single Response Format. The response must define the return code, return message, return data, and any extra data the server wants to communicate to the front end. In this case, the return code must be defined and by conventional, `0` is reserved for success, any other codes indicate failure but this can be customised to your business needs.
```js
// POST https://exampleapi.com/api/v1/
{
returnCode: 0, // 0 for success, any other for error or warning
returnObject: "token_here",// response data fro the server, can take any type according to the request
returnMessage: 'Welcome back John Doe', // message you want the frontend to show
extraData: null // any other data you want to pass.
}
```
- All requests that reach your application server must return a 200 OK status code whether they are resolved with an error or not. In cases of errors, a return code that is not `0` can be returned, with a clean, descriptive error message as the return message.
- Moonlight introduces the concept of switches, these represent a certain version of your API and register all services that are available in that version. Their main job is to read the incoming request, determine which service was defined and call it passing their entire request data. The switch also catches all uncaught exceptions in the request and returns a cleaner response similar to what we determined above.
- Services in Moonlight are central holders of your business logic, this is where you write all your actions specific to that service. An example of a service(class) can be `AuthService` that defines actions(methods) such as `login`, `resetPassword`, and `register`. All these must return the same response format. Actions are responsible for querying the database, validating the requested data and requesting any external third party that you might be integrating with.
## Known Issues With Architecture.
1. Strict(typed) languages such as Java, C# and Golang, may dictate you to stick to only JSON data. However, in non-strict languages like JS, Python, and PHP(which are commonly used for the web), you can mix both form data and JSON. This only affects uploading multipart files.
This is currently the only known issue so far affecting the architecture.
If you are interested in this architecture, you can [read more about it here](https://pionia.netlify.app/moonlight/moonlight-architecture/). Also, you can [try out Pionia for your next API application](https://pionia.netlify.app/documentation/introduction/#installation). Pionia provides full implementation of the architecture. You will be surprised how much you need to pull off an API!
Goodluck and happy coding!
| jet_ezra |
1,906,622 | Paper detailing BitPower Loop’s security | Security Research of BitPower Loop BitPower Loop is a decentralized lending platform based on... | 0 | 2024-06-30T13:53:40 | https://dev.to/wgac_0f8ada999859bdd2c0e5/paper-detailing-bitpower-loops-security-i6d | Security Research of BitPower Loop
BitPower Loop is a decentralized lending platform based on blockchain technology, dedicated to providing users with safe, transparent and efficient financial services. Its core security comes from multi-level technical measures and mechanism design, which ensures the robust operation of the system and the security of user funds. This article will introduce the security of BitPower Loop in detail from five aspects: smart contract security, decentralized management, data and transaction security, fund security and risk control mechanism.
1. Smart Contract Security
Smart contracts are the core components of BitPower Loop, and their codes must undergo strict security audits before deployment. These audits are usually conducted by third-party independent security companies to ensure that there are no vulnerabilities or malicious code in the contract. In addition, the immutability of smart contracts means that once deployed, no one (including the development team) can modify its rules and logic, which fundamentally eliminates the possibility of malicious operations. All operations are automatically executed by smart contracts, avoiding the risk of human intervention and ensuring the fairness and consistency of system operation.
2. Decentralized Management
BitPower Loop eliminates the risks brought by single point failures and central control through decentralized management. The system has no central management agency or owner, and all transactions and operations are jointly verified and recorded by blockchain nodes distributed around the world. This decentralized structure not only improves the system's anti-attack capabilities, but also enhances transparency. Users can publicly view all transaction records, which increases trust in the system.
3. Data and transaction security
BitPower Loop uses advanced encryption technology to protect users' data and transaction information. All data is encrypted during transmission and storage to prevent unauthorized access and data leakage. The consensus mechanism of the blockchain ensures the validity and immutability of each transaction, eliminating the possibility of double payment and forged transactions. In addition, the automated execution of smart contracts also avoids delays and errors caused by human operations, ensuring the real-time and accuracy of transactions.
4. Fund security
The secure storage of user funds is an important feature of BitPower Loop. Funds are stored on the blockchain through smart contracts and maintained by nodes across the entire network. Distributed storage avoids the risk of fund theft caused by centralized storage. In addition, the user's investment returns and shared commissions are automatically allocated to the user's wallet address by the smart contract after the conditions are met, ensuring the timely and accurate arrival of funds.
5. Risk Control Mechanism
BitPower Loop effectively manages lending risks by setting collateral factors and liquidation mechanisms. The collateral factors are independently set according to market liquidity and asset value fluctuations to ensure system stability and lending security. When the value of the borrower's assets falls below a certain threshold, the liquidation mechanism is automatically triggered, ensuring the repayment of the borrower's debt and protecting the interests of the fund provider. In addition, the immutability and automatic execution characteristics of smart contracts further enhance the security and reliability of the system.
Conclusion
BitPower Loop achieves high security and stability through multi-level security measures and mechanism design. Its smart contracts are strictly audited and immutable, decentralized management eliminates single point failure risks, advanced encryption technology protects data and transaction security, distributed storage ensures fund security, and risk control mechanisms manage lending risks. These security features together build a reliable decentralized financial platform that provides users with secure, transparent and efficient financial services. | wgac_0f8ada999859bdd2c0e5 | |
1,896,879 | functions and their inverses: 2 insightful examples. | Math is often considered hard. But, that is not true. Math is about logic. Following what you know to... | 27,818 | 2024-06-30T13:53:18 | https://dev.to/0xc0der/functions-and-their-inverses-2-insightful-examples-3iha | math, tutorial, learning, mathpills | Math is often considered hard. But, that is not true. Math is about logic. Following what you know to reach an understanding of what you don't. That we can do naturally. Our brains are wired to think.
The diagnosis **in my opinion** to most of the problems with math is.
- lack of **foundation**.
- unwillingness to give the required **mental effort**.
In this series of posts, I'll try to dive a little deeper in what looks simple, But, is very foundational.
In a easy to digest #mathpills, I'll discuss fundamentals in quick pieces of elementary mathematics.
Like a logic puzzle. starting from the fundamentals you can build your way up to a very advanced an complicated structures.
So lets start with one of the most fundamental building blocks of math. **functions**.
## basic definitions.
Let's define the function {% katex inline %} f: A \to B {% endkatex %}.
from these definitions we can find that:
1. any element {% katex inline %} a \in A {% endkatex %} its **image** {% katex inline %} f(a) {% endkatex %} must belong to {% katex inline %} B {% endkatex %}
2. {% katex inline %}
f(A) = \lbrace f(a) \mid a \in A \rbrace
{% endkatex %} is the set of all images of elments of {% katex inline %} A {% endkatex %}.
Let {% katex inline %} g: B \to A {% endkatex %} be a function that does the inverse of what {% katex inline %} f {% endkatex %} does then:
{% katex %}
g(B) = f^{-1}(B) = \lbrace a \in A \mid f(a) \in B \rbrace
{% endkatex %} where {% katex inline %} f^{-1}(x) {% endkatex %} is a special notation to write the inverse function.
## two examples.
**First Example**: suppose that {% katex inline %} X \subseteq A {% endkatex %}. does {% katex inline %} f^{-1}(f(X)) = X {% endkatex %}.
To prove that these two sets are equal we must do that in two steps.
- first, is {% katex inline %} X \subseteq f^{-1}(f(X)) {% endkatex %}?
Suppose that {% katex inline %} a \in X {% endkatex %}, then from the above definition {% katex inline %} f(a) \in f(X) {% endkatex %}, then {% katex %} a \in \lbrace x \in X \mid f(x) \in f(X) \rbrace = f^{-1}(f(X)) {% endkatex %}
proving that {% katex inline %} X \subseteq f^{-1}(f(X)) {% endkatex %}.
Easy, isn't it. Just following simple definitions we were able to prove the first part, but the second will need extra assumptions to work.
- is {% katex inline %} f^{-1}(f(x)) \subseteq X {% endkatex %}?
Suppose that {% katex inline %} a \in f^{-1}(f(X)) {% endkatex %} then {% katex inline %} a \in \lbrace x \in X \mid f(x) \in F(X) \rbrace {% endkatex %}, which means that {% katex inline %} f(a) \in F(X) = \lbrace f(x) \mid x \in X \rbrace {% endkatex %}, then there exists {% katex inline %} x \in X {% endkatex %} such that {% katex inline %} f(x) = f(a) {% endkatex %}.
If we can prove that {% katex inline %} a = x {% endkatex %}, then {% katex inline %} x \in X {% endkatex %}. which proves the second part.
This is easy if {% katex inline %} f {% endkatex %} is a **one-to-one** function.
A function is **one-to-one** if {% katex inline %} \forall a \in A \forall b \in A (f(a) = f(b) \rightarrow a = b) {% endkatex %}. which read, for any elements {% katex inline %} a \in A, b \in A {% endkatex %}, if have the same image, then they must be tha same.
assuming that {% katex inline %} f {% endkatex %} is a one-to-one function and applying the definition above, {% katex %} f(x) = f(a) \rightarrow a = x {% endkatex%} which means {% katex inline %} a \in X {% endkatex %}.
proving that {% katex inline %} f^{-1}(f(x)) \subseteq X {% endkatex %}.
**Second Example**: suppose that {% katex inline %} Y \subseteq B {% endkatex %}. does {% katex inline %} f(f^{-1}(Y)) = Y {% endkatex %}.
Following the same procedure as the previous example.
- first, is {% katex inline %} f(f^{-1}(Y)) \subseteq Y {% endkatex %}?
Suppose that {% katex inline %} a \in f(f^{-1}(Y)) {% endkatex %}, there exists {% katex inline %} x \in f^{-1}(Y) = \lbrace x \in A \mid f(x) \in Y \rbrace {% endkatex %} such that {% katex inline %} a = f(x) {% endkatex %}, then {% katex inline %} a \in Y {% endkatex %}.
provin that {% katex inline %} f(f^{-1}(Y)) \subseteq Y {% endkatex %}.
For the second part, we are going to follow the logic as always, from what we know (the definitions) to what we don't (the results).
- is {% katex inline %} Y \subseteq f(f^{-1}(Y)) {% endkatex %}?
First, suppose that {% katex inline %} a \in Y {% endkatex %}, then, can we choose an {% katex inline %} x \in A {% endkatex %} such that {% katex inline %} f(x) = a {% endkatex %}? yes, if and only if the function is onto.
a function is **onto** if {% katex inline %} f(A) = B {% endkatex %}.
In other words all the elemens in {% katex inline %} B {% endkatex %} are an image of some element in {% katex inline %} A {% endkatex %}.
With that property we can guarantee that our chosen element {% katex inline %} a {% endkatex %} is a reverse image of some element {% katex inline %} x {% endkatex %} such that {% katex inline %} x = f^{-1}(a) \in f^{-1}(Y){% endkatex %}.
After that, we find {% katex inline %} a = f(x) \in f(f^{-1}(Y)) {% endkatex %}. Finishing our proof.
That concludes this pill. I hope you enjoyed it. If you have any questions leave them in the comments, I'd be happy to answer them.
Thank you for reading :smile:. | 0xc0der |
1,906,621 | REACTJS | Hey!! I started my frontend journey with Html, css and JavaScript which took me about one year to... | 0 | 2024-06-30T13:52:34 | https://dev.to/anthonynelson/reactjs-5401 | Hey!! I started my frontend journey with Html, css and JavaScript which took me about one year to master almost enough to start building a static website, then moved to learning a framework like ReactJS, but I had a lot of frame work to choose from in the likes of Angular, Vue and React but I ended up with going for React as it is the most used framework, I would point out some of the key features of react only cause that's what i've been used to
1. Virtual DOM: React uses virtual dom to improve performance by updating important part of the actual DOM, when there are changes making the UI very efficient
2. React hooke: React allows components to have state and lifecycle features, they provide more concise way to write components.
3. JSX: React uses JSX, a syntax extension that allows you to write Html-like code within JavaScript, it makes it easier to create UI component
and many more ..
So far I've gotten some information about other framework like Vue and Angular they can be used in placement of React just that we use them based on their reasons like what the company prefer using and so on.
Finally, I'm going for an advanced Internship that uses Reactjs, good news in-between and I would be glad to use React for a real world project don't mind checking this out yourself https://hng.tech/internship or
https://hng.tech/premium to get started with your react/frontend journey...
Take care 😘 | anthonynelson | |
1,906,101 | Google I/O Extended 2024 Cape Town | Google I/O Extended events are organized by local Google Developer Groups worldwide as an extended... | 0 | 2024-06-30T13:51:59 | https://dev.to/muhammedsalie/google-io-extended-2024-cape-town-4bai | ioextended, ioextended2024, gdgcapetown, machinelearning | Google I/O Extended events are organized by local Google Developer Groups worldwide as an extended version of the main I/O conference held in California.

The Google Developer Group in Cape Town, in partnership with Le Wagon, hosted this year's event on 29th June.
The event provided attendees with a platform to connect, learn, and gain a deeper understanding of technologies announced at the 2024 Google I/O event, covering categories such as Mobile and Web Application Development, Artificial Intelligence (Reinforcement Learning), and Cloud Integrated Development Environment (Project IDX).

Ugo Mmirikwe, one of the organizers of GDG, kicked off the event and briefly talked about some of the exciting new developments, as well as introducing the guest speakers.

1. The keynote talk was presented by Atieno Allela. She started the event with **"What's New for Developers?"**, focusing on how Google is at the forefront of the revolution, leveraging their powerful AI model, Gemini, to introduce a new generation of developer tools.

2. **AI for Industrial Optimization using JAX and Google Cloud** by Ruan de Kock. He facilitated a practical session titled "AI for Industrial Optimization using JAX and Google Cloud," demonstrating how InstaDeep leverages JAX and TPUs for training multi-agent systems both online in simulators and offline on existing datasets. He then walked through a Google Colab notebook where participants could train their first multi-agent system both from offline data and in an online simulator.


3. **Infrastructure as Code: The Developer's Secret Weapon** by Hennie Francis. In this talk, we delved into the world of IaC and explored how it empowers developers to supercharge their productivity, streamline workflows, and elevate their impact on the development process.

4. **The Prompt Design Toolkit** by Gabriel Cassim walked us through a short course on prompt design techniques and best practices for professional use or for building agents. The course covered various prompting designs for daily use or for building GenAI agents with techniques like Chain of Thought and ReAct.

Lunch break..

Google has introduced several exciting features and updates to its products. This year, the focus was on significant advances in AI (Artificial Intelligence), particularly highlighting a series of new developments in the Gemini model family, Google's revolutionary AI model.
There was also an emphasis on integrating AI across its products, including developer tools like Google AI Studio and Android Studio, as well as Gemini within Google Workspace.
Overall, the Google I/O Extended 2024 event was a fantastic and insightful experience, and I look forward to more innovations from Google in the near future.
| muhammedsalie |
1,906,620 | TIL you can create keyboard shortcuts to switch between a specific desktop | Normally on a mac, you can use the trackpad to swipe left or right to switch between a desktop. But... | 0 | 2024-06-30T13:51:32 | https://dev.to/benji011/til-you-can-create-keyboard-shortcuts-to-switch-between-a-specific-desktop-1ic6 | todayilearned, mac, osx | Normally on a mac, you can use the trackpad to swipe left or right to switch between a desktop. But you can add a shortcut to switch between desktop x instead of swiping multiple times, or swiping up to select one.
## How to do this
> System preferences > Keyboard > Keyboard shortcuts > Mission Control > Tick the checkbox for mission control > Click the caret symbol on the left and check all the ones you want to add to a shortcut.

Reference: https://qiita.com/saboyutaka/items/d6cfd2a2b60f1a374d60#%E3%83%87%E3%82%B9%E3%82%AF%E3%83%88%E3%83%83%E3%83%97%E3%82%92%E5%A2%97%E3%82%84%E3%81%97%E3%81%A6%E3%82%B7%E3%83%A7%E3%83%BC%E3%83%88%E3%82%AB%E3%83%83%E3%83%88%E5%89%B2%E3%82%8A%E5%BD%93%E3%81%A6 | benji011 |
1,906,618 | Shaping the Future: A Deep Dive into Top Universities in Uttarakhand | Introduction Uttarakhand, nestled in the lap of the Himalayas, not only boasts breathtaking natural... | 0 | 2024-06-30T13:49:45 | https://dev.to/nisha_rawat_b538a76f5cc46/shaping-the-future-a-deep-dive-into-top-universities-in-uttarakhand-15d2 | **Introduction**
Uttarakhand, nestled in the lap of the Himalayas, not only boasts breathtaking natural beauty but also houses several esteemed universities and colleges that contribute significantly to the region's educational prowess. This article aims to provide an in-depth exploration of some of the top educational institutions in Uttarakhand, highlighting their academic excellence, programs, faculty, infrastructure, and contributions to higher education in the region. Special emphasis will be placed on Dev Bhoomi Uttarakhand University (DBUU), showcasing its pivotal role in shaping the educational landscape of the state.
**Universities and Colleges in Uttarakhand**
Uttarakhand offers a diverse range of universities and colleges, each with its own distinct offerings and contributions to education and research. Here are some of the notable institutions:
**1. University of Petroleum and Energy Studies (UPES)**
Located in Dehradun, UPES is renowned for its specialized programs in energy, petroleum, and allied sectors. The university has established strong ties with industry leaders and offers industry-oriented education that prepares students for careers in the energy sector and beyond.
**2. Graphic Era University**
Situated in Dehradun, Graphic Era University excels in engineering, management, and computer applications. The university boasts modern infrastructure, including state-of-the-art labs and research facilities, and is known for its academic rigor and industry-relevant curriculum.
**3. Kumaun University**
Kumaun University, based in Nainital, is one of the oldest universities in Uttarakhand. It offers a wide array of undergraduate, postgraduate, and doctoral programs across disciplines such as arts, sciences, commerce, and management. The university is particularly noted for its research contributions in environmental sciences and Himalayan studies.
**
4. Hemwati Nandan Bahuguna Garhwal University (HNBGU)**
Established in Srinagar (Garhwal), HNB Garhwal University is a central university catering to diverse disciplines including arts, sciences, technology, and management. With multiple campuses across the Garhwal region, the university fosters academic excellence and cultural diversity.
## **Dev Bhoomi Uttarakhand University (DBUU)**
Dev Bhoomi Uttarakhand University holds a prominent position among the educational institutions in Uttarakhand. Founded with a mission to provide quality education across various domains, DBUU offers a comprehensive range of undergraduate, postgraduate, and doctoral programs. The university is distinguished by its modern infrastructure, experienced faculty, and industry-aligned curriculum.
**History and Establishment**
Dev Bhoomi Uttarakhand University was established with the vision of becoming a center of excellence in higher education and research. Since its inception, the university has grown steadily, expanding its academic offerings and infrastructure to meet the evolving needs of students and industries. The founding principles of DBUU emphasize innovation, inclusivity, and the pursuit of academic excellence.
**Academic Programs Offered**
DBUU offers a wide spectrum of programs designed to cater to diverse interests and career aspirations. The university's key disciplines include:
**Engineering:** Civil Engineering, Mechanical Engineering, Computer Science & Engineering, Electronics & Communication Engineering, Electrical Engineering, etc.
Management: Bachelor of Business Administration (BBA), Master of Business Administration (MBA) with specializations in Marketing, Finance, Human Resource Management, etc.
**Pharmacy: **Bachelor of Pharmacy (B.Pharm), Master of Pharmacy (M.Pharm) in Pharmaceutical Chemistry, Pharmacology, Pharmaceutics, etc.
Agriculture and Applied Sciences: Bachelor of Science (B.Sc) and Master of Science (M.Sc) programs in Agriculture, Forestry, Environmental Science, Biotechnology, etc.
Each program at DBUU combines theoretical knowledge with practical applications, ensuring that graduates are well-prepared for the demands of the industry.
**Faculty Expertise and Research Focus**
The strength of DBUU lies in its dedicated faculty members who bring a wealth of academic knowledge and industry experience to the classroom. The faculty is actively involved in research projects and collaborations with industry partners, contributing to advancements in their respective fields. Their expertise spans various domains, fostering an environment where students can engage in innovative research and practical problem-solving.
**
Infrastructure and Campus Facilities**
DBUU boasts modern infrastructure that supports a conducive learning environment. The university campus features:
Laboratories: Well-equipped labs for practical training and research.
Libraries: Extensive collections of books, journals, and digital resources to support academic pursuits.
Sports Complex: Facilities for various sports and recreational activities.
Residential Accommodation: Comfortable housing options for students, fostering a vibrant campus community.
These facilities are designed to promote holistic development and cater to the diverse needs of students. The campus infrastructure is continuously upgraded to incorporate the latest technological advancements and provide a dynamic learning atmosphere.
**
Student Life and Support Services**
DBUU places a strong emphasis on student welfare and support services. The university offers:
Career Counseling: Guidance on career paths and opportunities.
Placement Support: Assistance with internships and placements in reputed companies.
Cultural and Extracurricular Activities: Events and clubs that enhance students' personal and professional skills.
These initiatives contribute to the overall development of students, preparing them to excel in their chosen fields. The vibrant campus life at DBUU ensures that students have ample opportunities to engage in extracurricular activities, fostering a well-rounded educational experience.
**Innovations and Achievements**
DBUU is recognized for its innovations in teaching methodologies and the integration of technology in education. The university encourages interdisciplinary research and promotes a culture of innovation among students and faculty members. Notable achievements include research publications, patents, and awards that highlight DBUU's commitment to academic excellence. The university's focus on practical learning and industry collaboration has led to significant advancements in various fields of study.
**Impact on Society and Future Prospects**
Beyond academic excellence, DBUU plays a crucial role in contributing to regional development and societal progress. By producing skilled professionals and ethical leaders, the university makes a positive impact on the socio-economic fabric of Uttarakhand. DBUU's strategic initiatives and future plans aim to further elevate educational standards and foster community engagement. The university's outreach programs and community service initiatives demonstrate its commitment to societal development.
**Universitychalo**
In recent years, platforms like Universitychalo have transformed the college admission process in Uttarakhand. Serving as an online portal, Universitychalo provides comprehensive information about various universities, courses, admission procedures, and scholarship opportunities. The platform simplifies the application process for students by offering guidance on entrance exams, counseling sessions, and application deadlines. Universitychalo has become an invaluable resource for students seeking higher education in Uttarakhand, ensuring they have access to accurate and up-to-date information.
**Conclusion**
[Dev Bhoomi Uttarakhand University](https://universitychalo.com/university/dev-bhoomi-uttarakhand-university-dbuu-dehradun
) remains at the forefront of educational innovation and excellence in Uttarakhand. With its commitment to quality education, research-driven approach, and holistic development of students, DBUU continues to shape the future leaders of the state and contribute to its overall development. As DBUU and other leading institutions in Uttarakhand evolve and innovate, they play a pivotal role in ensuring that the state remains a hub of educational opportunities and intellectual growth. Platforms like [Universitychalo](https://universitychalo.com) further enhance this landscape by providing essential support and information to aspiring students. | nisha_rawat_b538a76f5cc46 | |
1,906,617 | Introduction to Apache Hadoop & MapReduce | The History of Hadoop There are mainly two problems with the big data. Storage for a... | 0 | 2024-06-30T13:48:35 | https://dev.to/shvshydv/introduction-to-apache-hadoop-30ka | hadoop, dataengineering, bigdata, datascience | ## The History of Hadoop
There are mainly two problems with the big data.
- Storage for a huge amount of data.
- Processing of that stored data.
In 2003, Google published about Google's distributed file system, called **GFS (Google File System)** which can be used for storing large data sets.
Similarly in 2004, Google published a paper on **MapReduce**, that described the solution for processing large datasets.
**Doug Cutting** and **Mike Cafarella** (Founders of Hadoop), came across both of these papers that described GFS and MapReduce, while working on the Apache Nutch project.
The aim of the Apache Nutch project was to build a search engine system that can index 1 billion pages. Their conclusion to this project was that it would cost millions of dollars.
Both the papers by Google were not complete solution for the Nutch project.
Fast forward to 2006, Doug Cutting joined **Yahoo** and started the project **Hadoop**, implementing the papers from Google.
Finally in 2008, Yahoo released Hadoop as an open source project to **ASF(Apache Software Foundation)** and they successfully tested a **4000 node cluster with Hadoop**.
---
## Intro to Apache Hadoop
**Apache Hadoop** is software framework for **distributed storage** and **distributed processing** of big data using the MapReduce programming model.
Hadoop comes with the following 4 modules:
1. **HDFS (Hadoop Distributed File System):** A file system inspired by GFS which is used for distributed storage of Big data.
2. **YARN (Yet Another Resource Negotiator):** A resources manager that can be used for job scheduling and cluster resource management. It keeps track of which node does what work.
3. **MapReduce:** Programming Model used for distributed processing. It divides the data into partition that are mapped (transformed) and Reduced (aggregated).
4. **Hadoop Common:** It includes libraries and utilities used and shared by other Hadoop modules.
Here is a block diagram representation of how they all work together.

---
## MapReduce
As we know MapReduce is a programming model that can process big data in a distributed manner, let's see how MapReduce works internally.
There are majorly 3 tasks performed during a MapReduce job.
1. Mapper
2. Shuffle & Sort
3. Reducer
Below is a example of how a MapReduce job would look like:

This can vary and depends how we want the MapReduce to process our data.
Hadoop & MapReduce are written natively in Java, but streaming allows interfacing to other languages like Python.
Here is an example Python code for a MapReduce job.
```py
from mrjob.job import MRJob
from mrjob.step import MRStep
class RatingsBreakdown(MRJob):
def steps(self):
return [
MRStep(mapper=self.mapper_get_ratings,
reducer=self.reducer_count_ratings),
MRStep(reducer=self.reducer_sorted_count)
]
def mapper_get_ratings(self, _, line):
(user_id, movie_id, rating, timestamp) = line.split('\t')
yield movie_id, 1
def reducer_count_ratings(self, key, value):
yield str(sum(values)).zfill(5), key
def reducer_sorted_output(self, count, movies):
for movie in movies:
yield movie, count
if __name__ == '__main__':
RatingsBreakdown.run()
```
The Hadoop ecosystem has grown significantly and includes various tools and frameworks that build upon or complement the basic MapReduce model. Here’s a look at some of these technologies:

While newer technologies offer more straightforward ways to handle big data, understanding MapReduce is fundamental to grasping the field's broader concepts.
---
THE END | shvshydv |
1,906,616 | Revolutionizing Visual Content: The Power of Undress AI | Overview of Undress AI Discover how Undress AI reshapes photo editing with its advanced capabilities... | 0 | 2024-06-30T13:45:36 | https://dev.to/gogato2980/revolutionizing-visual-content-the-power-of-undress-ai-4ig6 | **Overview of Undress AI**
Discover how Undress AI reshapes photo editing with its advanced capabilities in realistic clothing alterations. This platform caters to professionals across industries, providing intuitive tools for enhancing visual storytelling.
**Key Features of Undress AI**
Advanced AI Technology
Undress AI utilizes state-of-the-art machine learning algorithms for precise and seamless clothing edits, ensuring natural integration with original images.
User-Friendly Interface
The intuitive interface simplifies the editing process, enabling users of all skill levels to upload images, apply transformations, and download edited photos with ease.
Versatility in Applications
From virtual fashion prototyping to enhancing product photography and refining social media aesthetics, Undress AI supports diverse creative needs effectively.
Privacy and Security
Prioritizing user privacy, [Undress AI](https://vip.undressaitool.com/) employs robust security measures for secure image processing and compliance with privacy regulations.
Benefits of Using Undress AI
Explore how Undress AI enhances creative exploration, streamlines workflow efficiency, and offers cost-effective solutions for achieving professional-quality edits.
Getting Started with Undress AI
To explore the capabilities of Undress AI, visit their official website, create an account, and seamlessly navigate image uploads, transformations, and downloads. | gogato2980 | |
1,906,615 | Comparing Frontend Technologies: ReactJS vs. Svelte | When choosing a frontend framework, it's important to know what each one offers. Let's compare... | 0 | 2024-06-30T13:42:56 | https://dev.to/badmuseniola/comparing-frontend-technologies-reactjs-vs-svelte-5d8m | When choosing a frontend framework, it's important to know what each one offers. Let's compare ReactJS and Svelte to see which might be best for your next project.
**ReactJS**
ReactJS, developed by Facebook, is known for building dynamic user interfaces with a component-based architecture.
**Key Features:**
- Component-Based:
Promotes reusable and maintainable code.
- Virtual DOM:
Enhances performance by efficiently updating the DOM.
- Strong Ecosystem:
Extensive libraries and tools.
Active Community: Abundant resources and support.
ReactJS is great for complex, large-scale applications.
**Svelte**
Svelte is a newer framework that compiles your code at build time for highly optimized results.
**Key Features:**
- No Virtual DOM:
Direct updates to the DOM for superior performance.
- Reactive:
Simple and intuitive state management.
- Smaller Bundles:
Produces fast, efficient code.
- Easy to Learn:
Simple syntax and reactivity model.
Svelte excels in performance-critical applications and is easy to learn for new developers.
**ReactJS vs. Svelte**
**Performance:**
ReactJS: Efficient with Virtual DOM, but has some overhead.
Svelte: Better performance with direct DOM updates.
**Learning Curve:**
ReactJS: Requires understanding JSX and state management.
Svelte: Easier syntax and built-in reactivity.
**Ecosystem:**
ReactJS: Extensive and mature.
Svelte: Growing, but less mature.
**Use Cases:**
ReactJS: Ideal for large, complex projects.
Svelte: Perfect for performance-sensitive applications.
**My Experience with HNG Internship programme with ReactJS:**
As part of the HNG Internship, I'm thrilled to expand my knowledge of ReactJS. This program provides hands-on experience and the opportunity to work on real-world projects. I'm excited to utilize React's component-based architecture to build scalable applications and further develop my skills.
Learn more about the [HNG premium](https://hng.tech/premium) at [HNG Internship]( https://hng.tech/internship) and [HNG Hire](https://hng.tech/hire).
Conclusion
Choosing between ReactJS and Svelte depends on your project's requirements. ReactJS is best for large, complex applications, while Svelte offers simplicity and high performance. Understanding these differences will help you make the right decision for your project. | badmuseniola | |
1,906,614 | Array search methods in JavaScript.! | Javascriptda arraylarning izlash metodlari.! Array indexOf() Array lastIndexOf() Array... | 0 | 2024-06-30T13:37:52 | https://dev.to/samandarhodiev/array-search-methods-in-javascript-406p | **Javascriptda arraylarning izlash metodlari.!**
`Array indexOf()
Array lastIndexOf()
Array includes()
Array find()
Array findIndex()
Array findLast()
Array findLastIndex()`
<u>1. `indexOf()`</u>
Ushbu metod massiv elementi joylashgan indeks raqamni qaytaradi.
<u>**sintaksis:** `arrayName.indexOf(item,start);`</u>
buyerda **item**-indeks raqami izlanayotgan element, **start**-indeks raqami izlanayotgan element qayerdan boshlab izlanishi va **startga qiymat berish ihtiyoriy**. Element bir necha marotaba takrorlangan bo'lsa birinchi topilgan elementni qaytaradi, agar element topilmasa -1 qaytaradi. Ushbu metod elementlarni chapdan o'ngga qarab izlaydi.!
```
let fruits = ['lemon','mango','nut','peach','strawberry','nut','apelsin'];
console.log(fruits);
//natija - ['lemon','mango','nut','apelsin','peach','strawberry']
let fruits_indexOf1 = fruits.indexOf('nut');
console.log(fruits_indexOf1);
//natija - 2
let fruits_indexOf2 = fruits.indexOf('nut')+1;
console.log(fruits_indexOf2);
//natija - 3
let fruits_indexOf3 = fruits.indexOf('nut')-4;
console.log(fruits_indexOf3);
//natija - -2
let fruits_indexOf4 = fruits.indexOf('nut',4);
console.log(fruits_indexOf4);
//natija - 5
let fruits_indexOf5 = fruits.indexOf('nut',6);
console.log(fruits_indexOf5);
//natija - -1
```
<u>2. `lastIndexOf()`</u>
Ushbu metod elementlarni o'ng tarafdan chap tarafga qarab izlaydi va 1-topilgan elementni qaytaradi, sintaksisti indexOf bilan birxil.!
```
let fruits = ['lemon','mango','nut','peach','strawberry','nut','apelsin'];
console.log(fruits);
//natija - ['lemon','mango','nut','apelsin','peach','strawberry']
let fruits_lastIndexOf1 = fruits.lastIndexOf('nut');
console.log(fruits_lastIndexOf1);
//natija - 5
let fruits_lastIndexOf2 = fruits.lastIndexOf('nut',3);
console.log(fruits_lastIndexOf2);
//natija - 2
```
<u>3. `includes()`</u>
Ushbu metod massiv ichidan biror-bir element bor yoki yo'qligini bilish uchun foydalaniladi, agar element bor bo'lsa **true** bo'lmasa **false** qiymat qaytaradi.!
<u>**sintsksis: `**arrayName.includes(item,start);`</u>
**item**-mos element, **start**-izlash qayerdan boshlanishi kerakligi. includes() ECMAScript 2016 da kiritilgan.!
```
let fruits = ['lemon','mango','nut','peach','strawberry','nut','apelsin'];
console.log(fruits);
//natija - ['lemon','mango','nut','apelsin','peach','strawberry']
let fruits_includes1 = fruits.includes('lemon');
console.log(fruits_includes1);
//natija - trtue
let fruits_includes2 = fruits.includes('lemon',3);
console.log(fruits_includes2);
//natija - false
let fruits_includes3 = fruits.includes();
console.log(fruits_includes3);
//natija - false
```
<u>4. `find()`</u>
<u>5. `findIndex()`</u>
<u>6. `findLast()`</u>
<u>7. `findLastIndex()`</u>
| samandarhodiev | |
1,906,613 | Paper detailing BitPower Loop’s security | Security Research of BitPower Loop BitPower Loop is a decentralized lending platform based on... | 0 | 2024-06-30T13:35:06 | https://dev.to/weq_24a494dd3a467ace6aca5/paper-detailing-bitpower-loops-security-31jj | Security Research of BitPower Loop
BitPower Loop is a decentralized lending platform based on blockchain technology, dedicated to providing users with safe, transparent and efficient financial services. Its core security comes from multi-level technical measures and mechanism design, which ensures the robust operation of the system and the security of user funds. This article will introduce the security of BitPower Loop in detail from five aspects: smart contract security, decentralized management, data and transaction security, fund security and risk control mechanism.
1. Smart Contract Security
Smart contracts are the core components of BitPower Loop, and their codes must undergo strict security audits before deployment. These audits are usually conducted by third-party independent security companies to ensure that there are no vulnerabilities or malicious code in the contract. In addition, the immutability of smart contracts means that once deployed, no one (including the development team) can modify its rules and logic, which fundamentally eliminates the possibility of malicious operations. All operations are automatically executed by smart contracts, avoiding the risk of human intervention and ensuring the fairness and consistency of system operation.
2. Decentralized Management
BitPower Loop eliminates the risks brought by single point failures and central control through decentralized management. The system has no central management agency or owner, and all transactions and operations are jointly verified and recorded by blockchain nodes distributed around the world. This decentralized structure not only improves the system's anti-attack capabilities, but also enhances transparency. Users can publicly view all transaction records, which increases trust in the system.
3. Data and transaction security
BitPower Loop uses advanced encryption technology to protect users' data and transaction information. All data is encrypted during transmission and storage to prevent unauthorized access and data leakage. The consensus mechanism of the blockchain ensures the validity and immutability of each transaction, eliminating the possibility of double payment and forged transactions. In addition, the automated execution of smart contracts also avoids delays and errors caused by human operations, ensuring the real-time and accuracy of transactions.
4. Fund security
The secure storage of user funds is an important feature of BitPower Loop. Funds are stored on the blockchain through smart contracts and maintained by nodes across the entire network. Distributed storage avoids the risk of fund theft caused by centralized storage. In addition, the user's investment returns and shared commissions are automatically allocated to the user's wallet address by the smart contract after the conditions are met, ensuring the timely and accurate arrival of funds.
5. Risk Control Mechanism
BitPower Loop effectively manages lending risks by setting collateral factors and liquidation mechanisms. The collateral factors are independently set according to market liquidity and asset value fluctuations to ensure system stability and lending security. When the value of the borrower's assets falls below a certain threshold, the liquidation mechanism is automatically triggered, ensuring the repayment of the borrower's debt and protecting the interests of the fund provider. In addition, the immutability and automatic execution characteristics of smart contracts further enhance the security and reliability of the system.
Conclusion
BitPower Loop achieves high security and stability through multi-level security measures and mechanism design. Its smart contracts are strictly audited and immutable, decentralized management eliminates single point failure risks, advanced encryption technology protects data and transaction security, distributed storage ensures fund security, and risk control mechanisms manage lending risks. These security features together build a reliable decentralized financial platform that provides users with secure, transparent and efficient financial services.
| weq_24a494dd3a467ace6aca5 | |
1,906,611 | Understanding Security Context in Kubernetes | Introduction Kubernetes, a leader in container orchestration, ensures that applications... | 0 | 2024-06-30T13:31:01 | https://dev.to/piyushbagani15/understanding-security-context-in-kubernetes-1gkn | kubernetes, security | ## Introduction
Kubernetes, a leader in container orchestration, ensures that applications run efficiently and securely across a cluster of machines. An essential component of Kubernetes' security mechanism is the security context, which configures permissions and access controls for Pods and containers. This blog delves into the specifics of security contexts, helping you understand how to deploy secure applications within your Kubernetes environment.
## Key Features of Kubernetes Security Contexts
- RunAsUser: Controls the UID with which the container executes. This prevents the container from running with root privileges, which could pose security risks.
- ReadOnlyRootFilesystem: Ensures the container's root filesystem is mounted as read-only, prohibiting modifications to the root filesystem and mitigating some forms of attack.
- Capabilities: Allows administrators to grant or remove specific Linux capabilities for a container, enabling a principle of least privilege to be enforced.
- SELinuxOptions: Specifies the SELinux context that the container should operate within. SELinux can enforce granular access control policies.
**NOTE:** There can be more settings as well, So please check out the official documentation.
## Best Practices for Configuring Security Contexts
- Non-root Containers: Always try to run containers as non-root users. Even if the container is compromised, this limits the potential for damage.
- Enforce Read-Only Filesystems: Where possible, set ReadOnlyRootFilesystem to true to prevent tampering with system files.
- Restrict Capabilities: Start with minimal necessary capabilities and add more only as needed. This limits the actions a container can perform, reducing the attack surface.
- Configure SELinux Properly: Use SELinux to enforce strict access controls tailored to your operational needs. Understand your applications' requirements to configure these settings accurately.
## Pod-Level Security Context Example
Here's an example of a Kubernetes manifest file that specifies security settings at the pod level. The security context applied here affects all containers within the pod.
```
apiVersion: v1
kind: Pod
metadata:
name: example-pod
spec:
securityContext:
runAsUser: 1000
containers:
- name: example-container
image: nginx
ports:
- containerPort: 80
```
## Explanation:
- runAsUser: This setting ensures that the container runs as a user with UID 1000, which is a non-root user.
## Container-Level Security Context Example
In this example, the security context is specified at the container level, meaning it only affects this particular container within the pod.
```
apiVersion: v1
kind: Pod
metadata:
name: example-pod
spec:
containers:
- name: secure-container
image: nginx
securityContext:
runAsUser: 1001
readOnlyRootFilesystem: true
capabilities:
drop:
- ALL
add:
- NET_BIND_SERVICE
ports:
- containerPort: 80
```
## Explanation:
- runAsUser: The container runs as a user with UID 1001.
- readOnlyRootFilesystem: This setting makes the root filesystem of the container read-only, preventing any write operations.
- capabilities: This setting customizes the capabilities the container has:
* drop: ALL removes all capabilities by default.
* add: NET_BIND_SERVICE adds the capability to bind a service to well-known ports (below 1024).
## Conclusion
Security contexts are one of those critical characteristics for managing security within Kubernetes. It assures enforcement of security policies and reduces the risk of unauthorized access or escalation occurring within the cluster. This can help to further improve the security of Kubernetes deployments if the concept of security contexts is properly understood and used.
| piyushbagani15 |
1,906,609 | Ego, Not Pride, Comes Before a Fall | Let me tell you a bit about myself and why I am qualified to have an opinion on pride vs ego in... | 0 | 2024-06-30T13:29:23 | https://dev.to/thesimpledev/ego-not-pride-comes-before-a-fall-3bcl | programming, career | Let me tell you a bit about myself and why I am qualified to have an opinion on pride vs ego in Software Engineering. I have been working as a software engineer for over 17 years, prior to that I spent 15 years programming as an amateur. At 41, I have over 30 years of experience programming for fun and profit. During those decades, I have worked with a variety of people on personal and professional projects. The best programmers I have worked with have a tremendous amount of pride in their work, but almost no ego.
## The difference between pride and ego
Taking pride in your work means taking ownership of it. Making the best decisions for the project drives you to work hard. This can include.
- Using best practices and craft something you are proud to show off to others.
- Enjoying the satisfaction and accomplishment of the tasks and projects you complete.
- Paying attention to small details and ensure that the work is of the highest quality.
When ego becomes involved, it can lead to a lack of objectivity. Instead of building the best product possible, you can overlook flaws and weaknesses in your approach. This can include.
- Difficulty seeing and acknowledging your own limitations and weaknesses.
- Problems collaborating with a team.
- Arrogance where you believe that you and the methods you favor are superior to others.
## Ego, warning signs and dangers
There are a few things I have learned to watch out for in myself and others when it comes to ego.
- Putting too much emphasis on their titles
- Needing to be at the center of attention constantly
- Excluding those they feel might be a threat to their position
- Disparaging the work others do and not offering constructive criticism
- Stealing the ideas of others and presenting them as their own
Now this list is not all-inclusive or even concrete. Even the best people can make these mistakes from time to time. What you need to watch out for is a pattern of behaviour.
## The healthy approach
Pride and ego are deeply connected. Without some ego, it is impossible to take pride in your work. It is when the pride becomes destructive instead of constructive that it becomes a matter of ego.
I do my best to remember a few key points to keep my pride from turning into ego.
- Others, even those with less experience than me, may have valuable ideas
- My work is not perfect and there is always room for improvement
- There may be better solutions, methods, and practices than I am employing.
- I always want to learn and grow as a developer. If I think I know it all, that cannot happen.
## Finally
It is important for leadership in companies to make sure they are encouraging pride and not ego in their workspaces. Software Engineers already have a tough job between high-stress environments, long hours, tight deadlines, and adapting to new technology.
Making sure that Software Engineers have a good balance between pride and ego can help ensure a friendlier workspace.
[Originally Published at The Simple Dev](https://thesimpledev.com/blog/ego-not-pride-comes-before-a-fall/) | thesimpledev |
1,906,608 | Kubernetes-project-using-an-EKS-cluster | 🚀Excited to share my recent Kubernetes project where I deployed the 2048 Game Application using an... | 0 | 2024-06-30T13:27:33 | https://dev.to/sukuru_naga_sai_srinivasu/kubernetes-project-using-an-eks-cluster-42kg | kubernetes, aws, cloudcomputing | 
🚀Excited to share my recent Kubernetes project where I deployed the 2048 Game Application using an EKS cluster! This project was a great learning experience, leveraging EKS, Fargate, and the Application Load Balancer Ingress Controller
Here's a summary of the steps I followed:
1) Set up an EKS cluster with Fargate.
2) Configured CLI access to cluster resources by updating the KubeConfig file.
3) Created a "game-2048" namespace linked to the Fargate profile "alb-sample-app."
4) Deployed the 2048 app as a Kubernetes Pod, created a service for it, and configured an Ingress resource to manage traffic routing.
5) Installed and configured the Ingress Controller to automatically create an Application Load Balancer (ALB) Controller.
6) Configured an IAM OIDC Provider to allow the ALB Controller access to the Application Load Balancer.
7) Deployed the ALB Controller using Helm Charts and linked it with the service account.
8) Verified the 2048 application running within the EKS environment. | sukuru_naga_sai_srinivasu |
1,906,607 | Effective Debugging Techniques for React JS: Debugging Doesn’t Have to Be a Drag! | Hey there, fellow React JS developer! 🧑💻 We’ve all been there: you’re cruising along, building your... | 0 | 2024-06-30T13:24:18 | https://dev.to/a_shokn/effective-debugging-techniques-for-react-js-debugging-doesnt-have-to-be-a-drag-2iea | webdev, javascript, beginners, tutorial | Hey there, fellow React JS developer! 🧑💻
We’ve all been there: you’re cruising along, building your latest and greatest app, when suddenly… BAM! An error message pops up out of nowhere. Panic sets in. But fear not! Debugging doesn’t have to be a drag. In fact, with the right techniques, it can be as smooth as butter on a hot pancake. So, grab your favorite beverage, sit back, and let’s dive into the wonderful world of advanced React JS debugging!
1. React Developer Tools to the Rescue
Imagine having X-ray vision for your React components. That’s exactly what React Developer Tools offer. This browser extension (available for Chrome and Firefox) lets you inspect the component hierarchy, check props and state, and even tweak them on the fly.
2.The Power of Breakpoints
Forget console.log() for a moment and embrace the power of breakpoints. Set breakpoints in your code using your browser’s developer tools, and watch your code pause at just the right moment.
3.Linting with ESLint
ESLint is your best friend when it comes to catching errors before they even happen. It’s like having a vigilant coding buddy who points out potential issues as you type.
```
npm install eslint --save-dev
```
4.The ComponentDidCatch Magic
Ever wished you could catch errors in your components and handle them gracefully? Enter componentDidCatch. This lifecycle method allows you to catch errors in any child component and display a custom error message.
```
class ErrorBoundary extends React.Component {
state = { hasError: false };
componentDidCatch(error, info) {
this.setState({ hasError: true });
// Log the error to an error reporting service
console.error(error, info);
}
render() {
if (this.state.hasError) {
return <h1>Something went wrong.</h1>;
}
return this.props.children;
}
}
```
5.React.lazy and Suspense for Code Splitting
React.lazy and Suspense enable you to load components lazily through code splitting. This can help with debugging by isolating component loading issues.
```
const OtherComponent = React.lazy(() => import('./OtherComponent'));
function MyComponent() {
return (
<div>
<Suspense fallback={<div>Loading...</div>}>
<OtherComponent />
</Suspense>
</div>
);
}
```
6.React's useDebugValue for Custom Hooks
When you create custom hooks, you can use the useDebugValue hook to display a label in React DevTools for easier debugging.
```
function useFriendStatus(friendID) {
const [isOnline, setIsOnline] = useState(null);
useDebugValue(isOnline ? 'Online' : 'Offline');
useEffect(() => {
function handleStatusChange(status) {
setIsOnline(status.isOnline);
}
ChatAPI.subscribeToFriendStatus(friendID, handleStatusChange);
return () => {
ChatAPI.unsubscribeFromFriendStatus(friendID, handleStatusChange);
};
}, [friendID]);
return isOnline;
}
```
And there you have it! Debugging doesn’t have to be a nightmare. With these effective techniques, you’ll be squashing bugs and fine-tuning your React JS apps like a pro. Happy coding! 🐞🚀
Feel free to share your own debugging tips in the comments below!
| a_shokn |
1,906,606 | Swapping in elasticsearch to the proto-OLIVER | In the last video of the llm-zoomcamp, which I didn't post about, I reformatted the code to make it... | 0 | 2024-06-30T13:24:03 | https://dev.to/cmcrawford2/swapping-in-elasticsearch-to-the-proto-oliver-2ml1 | llm, rag | In the last video of the [llm-zoomcamp](https://github.com/datatalksclub/llm-zoomcamp), which I didn't post about, I reformatted the code to make it modular, so I could swap in a different search engine or a different LLM. In this video, the last video of module 1, I learned how to exchange elasticsearch with the in-memory search engine. I had already installed elasticsearch at the beginning, when I installed everything else.
The first thing I did was to open a docker container with elasticsearch in it. This didn't work at first. I got the error "Elasticsearch exited unexpectedly". I went to the course FAQ and found the solution. I needed to add a line at the end: `-e "ES_JAVA_OPTS=-Xms512m -Xmx512m"` This limits the memory used by elasticsearch, so it can run in GitHub codespaces.
```
docker run -it \
--rm \
--name elasticsearch \
-p 9200:9200 \
-p 9300:9300 \
-e "discovery.type=single-node" \
-e "xpack.security.enabled=false" \
-e "ES_JAVA_OPTS=-Xms512m -Xmx512m" \
docker.elastic.co/elasticsearch/elasticsearch:8.4.3
```
Then I imported elasticsearch and created a client.
```
from elasticsearch import Elasticsearch
es_client = Elasticsearch("http://localhost:9200")
```
Indexing the documents is slightly more complicated than in our in-memory search engine. The instructor had the object set up for us. It has the same fields and keywords as before, but also includes some settings.
```
index_settings = {
"settings": {
"number_of_shards": 1,
"number_of_replicas": 0
},
"mappings": {
"properties": {
"text": {"type": "text"},
"section": {"type": "text"},
"question": {"type": "text"},
"course": {"type": "keyword"}
}
}
}
index_name = "course-questions"
es_client.indices.create(index=index_name, body=index_settings)
```
Next I fed the documents into the search engine. I set up a progress bar for this operation using tqdm, which I also installed earlier. Apparently I was missing a library, but it didn't matter. I still had a crude progress bar and could tell how long it was going to take.
```
from tqdm.auto import tqdm
/usr/local/python/3.10.13/lib/python3.10/site-packages/tqdm/auto.py:21: TqdmWarning: IProgress not found. Please update jupyter and ipywidgets. See https://ipywidgets.readthedocs.io/en/stable/user_install.html
from .autonotebook import tqdm as notebook_tqdm
for doc in tqdm(documents):
es_client.index(index=index_name, document=doc)
100%|███████| 948/948 [00:20<00:00, 45.47it/s]
documents[0]
{'text': "The purpose of this document is to capture frequently asked technical questions\nThe exact day and hour of the course will be 15th Jan 2024 at 17h00. The course will start with the first “Office Hours'' live.1\nSubscribe to course public Google Calendar (it works from Desktop only).\nRegister before the course starts using this link.\nJoin the course Telegram channel with announcements.\nDon’t forget to register in DataTalks.Club's Slack and join the channel.",
'section': 'General course-related questions',
'question': 'Course - When will the course start?',
'course': 'data-engineering-zoomcamp'}
```
Here we see the documents are the same as what I used before.
Next I called the search engine with the usual query.
```
query = "I just discovered the course. can I still join?"
def elastic_search(query):
search_query = {
"size": 5,
"query": {
"bool": {
"must": {
"multi_match": {
"query": query,
"fields": ["question^3", "text", "section"],
"type": "best_fields"
}
},
"filter": {
"term": {
"course": "data-engineering-zoomcamp"
}
}
}
}
}
response = es_client.search(index=index_name, body=search_query)
result_docs = []
for hit in response['hits']['hits']:
result_docs.append(hit['_source'])
return result_docs
```
Here, the search_query is also more complicated than it was in our in-memory search engine. Again, the instructor had set it up for us. I also had to do some work to get the output into the same format as I had before. Once I did that, I could call the LLM with the context from the search engine.
Here are the results of the module I defined, elastic_search, and then the call to the entire rag function, which gives the same result as before. You can see that the only change I made was to define the search engine differently. The rest is the same as before.
```
elastic_search(query)
[{'text': "Yes, even if you don't register, you're still eligible to submit the homeworks.\nBe aware, however, that there will be deadlines for turning in the final projects. So don't leave everything for the last minute.",
'section': 'General course-related questions',
'question': 'Course - Can I still join the course after the start date?',
'course': 'data-engineering-zoomcamp'},
{'text': 'You can start by installing and setting up all the dependencies and requirements:\nGoogle cloud account\nGoogle Cloud SDK\nPython 3 (installed with Anaconda)\nTerraform\nGit\nLook over the prerequisites and syllabus to see if you are comfortable with these subjects.',
'section': 'General course-related questions',
'question': 'Course - What can I do before the course starts?',
'course': 'data-engineering-zoomcamp'},
{'text': 'Yes, we will keep all the materials after the course finishes, so you can follow the course at your own pace after it finishes.\nYou can also continue looking at the homeworks and continue preparing for the next cohort. I guess you can also start working on your final capstone project.',
'section': 'General course-related questions',
'question': 'Course - Can I follow the course after it finishes?',
'course': 'data-engineering-zoomcamp'},
{'text': 'Yes, the slack channel remains open and you can ask questions there. But always sDocker containers exit code w search the channel first and second, check the FAQ (this document), most likely all your questions are already answered here.\nYou can also tag the bot @ZoomcampQABot to help you conduct the search, but don’t rely on its answers 100%, it is pretty good though.',
'section': 'General course-related questions',
'question': 'Course - Can I get support if I take the course in the self-paced mode?',
'course': 'data-engineering-zoomcamp'},
{'text': "You don't need it. You're accepted. You can also just start learning and submitting homework without registering. It is not checked against any registered list. Registration is just to gauge interest before the start date.",
'section': 'General course-related questions',
'question': 'Course - I have registered for the Data Engineering Bootcamp. When can I expect to receive the confirmation email?',
'course': 'data-engineering-zoomcamp'}]
def rag(query):
search_results = elastic_search(query)
prompt = build_prompt(query, search_results)
answer = llm(prompt)
return answer
rag(query)
"Yes, even if you don't register, you're still eligible to submit the homeworks. Be aware, however, that there will be deadlines for turning in the final projects. So don't leave everything for the last minute."
```
Previous post: [Generating a result with a context](https://dev.to/cmcrawford2/generating-a-result-with-a-context-2cc8) | cmcrawford2 |
1,906,605 | Use XDebug for PHP Project Debugging | XDebug is an indispensable debugging tool in PHP development, offering powerful features for... | 0 | 2024-06-30T13:22:46 | https://dev.to/servbay/use-xdebug-for-php-project-debugging-2i4o | php, webdev, beginners, programming | [XDebug](https://xdebug.org/) is an indispensable debugging tool in PHP development, offering powerful features for breakpoint debugging, performance analysis, and code coverage. With XDebug, developers can set breakpoints in the code, inspect variable values, trace function call stacks, analyze performance bottlenecks, and greatly enhance PHP development efficiency and code quality.

### XDebug Introduction
XDebug is a PHP extension designed to provide debugging and analysis capabilities. It allows developers to set breakpoints in the code, step through the code, inspect variable values and program states, helping them better understand and debug the code.
### Enable Xdebug and Configure Debugging Environment
ServBay comes with XDebug pre-installed for each PHP version.
> Note: Please refer to the article How to Enable ServBay's Built-in Xdebug Module for information on how to enable the Xdebug module and configure PHPStorm.
Download: [click here to download ServBay](https://www.servbay.com)
### Specific Debugging Example
#### Sample Project Structure
Assuming we have a simple PHP project with the following directory structure:
```php
servbay_xdebug_app/
├── src/
│ └── Calculator.php
└── index.php
```
The content of the Calculator.php file is as follows:
```php
<?php
namespace App;
class Calculator
{
public function add($a, $b)
{
return $a + $b;
}
public function subtract($a, $b)
{
return $a - $b;
}
}
```
The content of the index.php file is as follows:
```php
<?php
require 'vendor/autoload.php';
use App\Calculator;
$calculator = new Calculator();
$sum = $calculator->add(5, 3);
$difference = $calculator->subtract(5, 3);
echo "Sum: " . $sum . "\n";
echo "Difference: " . $difference .
"\n";
```
### Setting Breakpoints
We want to debug the add method in the Calculator class and see how it executes. Open the Calculator`.php` file in PHPStorm and set a breakpoint on the line return `$a + $b`;.
#### Starting the Debugging Session
In PHPStorm, click the Start Listening for PHP Debug Connections button (the little bug icon) on the top toolbar.
In the browser, access your PHP application, such as https://servbay-xdebug-app.test/index.php.
#### Debugging Process
When the browser accesses index`.php`, XDebug will automatically connect to PHPStorm and pause execution at the set breakpoint.
In PHPStorm, you'll see the code paused at the return `$a + $b`; line in the add method of the Calculator.php file.
#### Inspecting Variable Values
In PHPStorm's debug window, you can see the current executing code line, call stack, variable values, etc.
In the Variables panel, you can see the values of the variables $a and `$b` are 5 and 3, respectively.
#### Step Execution
Click the Step Over button (or press F8) to step through the code line by line.
Observe the changes in variable values to ensure the add method returns the correct result.
#### Resume Execution
Click the Resume Program button (or press F9) to continue executing the code.
The program will continue running until it hits the next breakpoint or finishes execution.
#### View Output
Check the output in the browser, which should display:
```PHP
Sum: 8
Difference: 2
```
### Conclusion
XDebug allows developers to easily set breakpoints, inspect variable values, and step through code in PHP, enabling them to better understand and debug code. In practical development, XDebug's breakpoint debugging feature can help developers quickly locate and resolve issues, improving development efficiency and code quality. Through the specific debugging example above, we can see the powerful features and convenience of XDebug in PHP project debugging.
---
Big thanks for sticking with ServBay. Your support means the world to us 💙. Got questions or need a hand? Our tech support team is just a shout away. Here's to making web development fun and fabulous! 🥳
If you want to get the latest information, follow [X(Twitter)](https://x.com/ServBayDev) and [Facebook](https://www.facebook.com/ServBay.Dev).
If you have any questions, our staff would be pleased to help, just join our [Discord](https://talk.servbay.com) community | servbay |
1,906,603 | A Technical Report on My Observations on https://www.kaggle.com/datasets/kyanyoga/sample-sales-data | A Technical Report on My Observations on https://www.kaggle.com/datasets/kyanyoga/sample-sales-data... | 27,915 | 2024-06-30T13:20:00 | https://dev.to/adewalebab/a-technical-report-on-my-observations-on-httpswwwkagglecomdatasetskyanyogasample-sales-data-3e17 | A Technical Report on My Observations on https://www.kaggle.com/datasets/kyanyoga/sample-sales-data
My First Glance Observation of Sample Sales Data
available on Kaggle provides a collection of records related to sales transactions. It consists of a single CSV file containing various attributes that describe each transaction.
File Name: sample-sales-data.csv
Dataset Link: https://www.kaggle.com/datasets/kyanyoga/sample-sales-data
Key Observations:
Data Structure and Dimensions:
The dataset is structured in a tabular format with rows representing individual sales transactions and columns representing attributes of each transaction (e.g., order number, quantity, price, customer details, etc).
Initial inspection suggests there are multiple columns providing different aspects of each transaction.
Attributes and Data Types:
The dataset includes a variety of attributes such as order number, quantity ordered, price each, customer name, address, and sales representative.
Data types likely include numerical (integer, float) for quantitative values (e.g., quantity, price), and categorical (strings) for descriptive attributes (e.g., customer name, sales representative).
Data Quality:
No missing data was immediately apparent noticed upon initial review.
Further investigation into data consistency, validity (e.g., checking for outliers, unusual values), and completeness (e.g., null values) would be necessary to assess overall data quality.
Potential Insights:
Initial analysis could involve exploring sales trends over time (if date information is available), identifying top-selling products or customers, and analyzing sales performance across different regions or sales representatives.
Calculation of aggregate metrics such as total sales revenue, average order value, and customer acquisition rates could provide deeper insights into business performance.
Preprocessing Needs:
Depending on the analysis goals, preprocessing steps might include data cleaning (handling missing values, outliers), feature engineering (creating new variables like total sales amount), and normalization or scaling of numerical data.
In Conclusion:
The "Sample Sales Data" dataset presents a promising opportunity for exploring sales analytics and deriving actionable insights. Initial observations indicate a well-structured dataset suitable for various types of exploratory data analysis and modeling tasks. Further detailed analysis and preprocessing steps will be necessary to unlock the full potential of the data for business intelligence purposes.
Free available internship for anyone into digital tech skill:
https://hng.tech/internship
https://hng.tech/premium | adewalebab | |
1,906,604 | #2 Open Close Principle ['O' in SOLID] | OCP - Open/Close Principle The Open/Close Principle is the second principle in the Solid Design... | 0 | 2024-06-30T13:19:55 | https://dev.to/vinaykumar0339/2-open-close-principle-o-in-solid-2jj6 | openclosedprinciple, solidprinciples, designprinciples | **OCP - Open/Close Principle**
The Open/Close Principle is the second principle in the Solid Design Principle.
1. Software entities should be open for extension but closed for modification.
**Today Will discuss about discount calculator system.**
**Violating OCP:**
```swift
enum DiscountType {
case seasonal
case loyalty
case nodiscount
}
class DiscountCalculator {
func calculateDiscount(discountType: DiscountType, amount: Double) -> Double {
switch discountType {
case .seasonal:
return amount * 0.1
case .loyalty:
return amount * 0.15
case .nodiscount:
return 0
// Including a default case can handle unexpected cases but won't prompt Xcode to show an error or warning if a new DiscountType is added later. This means new cases could be missed without any compile-time notifications.
default:
return 0
}
}
}
// usage
let discountCalcualtor = DiscountCalculator()
discountCalcualtor.calculateDiscount(discountType: .seasonal, amount: 100)
discountCalcualtor.calculateDiscount(discountType: .loyalty, amount: 100)
discountCalcualtor.calculateDiscount(discountType: .nodiscount, amount: 100)
```
**Issues with Violating OCP:**
1. Hard to Extend:
* Adding a new discount required modification of the function `calculateDiscount`.
2. Increase Complexity:
* `calculateDiscount` becomes complex with multiple discount types.
3. Difficult Maintenance:
* Changes in discount logic require updates to `calculateDiscount`.
**Adhering to OCP:**
To Adhere to OCP, Use Protocol to confirm the contract between different types of discounts in the application.
```swift
protocol Discount {
func apply(amount: Double) -> Double
}
class SeasonalDiscount: Discount {
func apply(amount: Double) -> Double {
return amount * 0.1
}
}
class LoyaltyDiscount: Discount {
func apply(amount: Double) -> Double {
return amount * 0.15
}
}
class DiscountCalculatorOCP {
private var discounts: [Discount]
init(discounts: [Discount]) {
self.discounts = discounts
}
func calculateTotalDiscount(amount: Double) -> Double {
var totalDiscount = 0.0
for discount in discounts {
totalDiscount += discount.apply(amount: amount)
}
return totalDiscount
}
}
// usage
let seasonalDiscount = SeasonalDiscount()
let loyaltyDiscount = LoyaltyDiscount()
let discountCalculatorOCP = DiscountCalculatorOCP(discounts: [seasonalDiscount, loyaltyDiscount])
discountCalculatorOCP.calculateTotalDiscount(amount: 100)
```
**Benefits of Adhering to OCP:**
1. Improved Maintainability:
* Adding New Discount types doesn't require modifying exising code.
2. Enhanced Flexibility:
* Easily extendable with new discount class.
3. Greater Reusability:
* Discount logic is reusable across different parts of the application.
**Drawbacks:**
1. More Classes:
* Can lead to many small classes.
2. Complex Dependency Management:
* More dependencies to manage.
3. Design and Refactoring Overhead:
* Requires more effort to design and refactor.
**Mitigating Drawbacks:**
1. Balanced Approach:
* Apply OCP judiciously, balancing simplicity and extensibility.
2. Effective Documentation:
* Clear documentation helps navigate the codebase and understand class responsibilities.
3. Use Patterns and Frameworks:
* Design patterns (like Strategy) and dependency management tools can help.
4. Team Alignment:
* Ensure the team has a shared understanding of OCP and consistent practices.
5. Performance Profiling:
* Profile and optimize performance to manage any overhead introduced by adhering to OCP.
**Conclusion:**
By understanding and applying the Open/Closed Principle thoughtfully, you can create more maintainable, understandable, and flexible software.
[Single Responsibility Principle](https://dev.to/vinaykumar0339/1-single-responsibility-principle-s-in-solid-5fn9)
[Liskov Substitution Principle](https://dev.to/vinaykumar0339/3-liskov-substitution-principle-l-in-solid-1jo2)
[Check My GitHub Swift Playground Repo.](https://github.com/vinaykumar0339/SolidDesignPrinciples) | vinaykumar0339 |
1,906,602 | A Technical Report on My Observations on https://www.kaggle.com/datasets/kyanyoga/sample-sales-data | A Technical Report on My Observations on https://www.kaggle.com/datasets/kyanyoga/sample-sales-data... | 0 | 2024-06-30T13:18:12 | https://dev.to/adewalebab/a-technical-report-on-my-observations-on-httpswwwkagglecomdatasetskyanyogasample-sales-data-4j0m | beginners | A Technical Report on My Observations on https://www.kaggle.com/datasets/kyanyoga/sample-sales-data
My First Glance Observation of Sample Sales Data
available on Kaggle provides a collection of records related to sales transactions. It consists of a single CSV file containing various attributes that describe each transaction.
File Name: sample-sales-data.csv
Dataset Link: https://www.kaggle.com/datasets/kyanyoga/sample-sales-data
Key Observations:
Data Structure and Dimensions:
The dataset is structured in a tabular format with rows representing individual sales transactions and columns representing attributes of each transaction (e.g., order number, quantity, price, customer details, etc).
Initial inspection suggests there are multiple columns providing different aspects of each transaction.
Attributes and Data Types:
The dataset includes a variety of attributes such as order number, quantity ordered, price each, customer name, address, and sales representative.
Data types likely include numerical (integer, float) for quantitative values (e.g., quantity, price), and categorical (strings) for descriptive attributes (e.g., customer name, sales representative).
Data Quality:
No missing data was immediately apparent noticed upon initial review.
Further investigation into data consistency, validity (e.g., checking for outliers, unusual values), and completeness (e.g., null values) would be necessary to assess overall data quality.
Potential Insights:
Initial analysis could involve exploring sales trends over time (if date information is available), identifying top-selling products or customers, and analyzing sales performance across different regions or sales representatives.
Calculation of aggregate metrics such as total sales revenue, average order value, and customer acquisition rates could provide deeper insights into business performance.
Preprocessing Needs:
Depending on the analysis goals, preprocessing steps might include data cleaning (handling missing values, outliers), feature engineering (creating new variables like total sales amount), and normalization or scaling of numerical data.
In Conclusion:
The "Sample Sales Data" dataset presents a promising opportunity for exploring sales analytics and deriving actionable insights. Initial observations indicate a well-structured dataset suitable for various types of exploratory data analysis and modeling tasks. Further detailed analysis and preprocessing steps will be necessary to unlock the full potential of the data for business intelligence purposes.
Free available internship for anyone into digital tech skill:
https://hng.tech/internship
https://hng.tech/premium | adewalebab |
1,906,601 | BitPower: Security Analysis of Decentralized Lending Platform | Introduction The rapid development of decentralized finance (DeFi) has made security an important... | 0 | 2024-06-30T13:18:08 | https://dev.to/aimm_y/bitpower-security-analysis-of-decentralized-lending-platform-2nf0 | Introduction
The rapid development of decentralized finance (DeFi) has made security an important consideration for users to choose a platform. As a leading decentralized lending platform, BitPower provides users with highly secure lending services through smart contracts and blockchain technology. This article briefly analyzes the security features of BitPower and explores how it ensures the security of user assets and transactions.
Core security features
Smart contract guarantee
All transactions on the BitPower platform are automatically executed by smart contracts, ensuring the security and transparency of transactions and avoiding human intervention and operational risks.
Open and transparent code
BitPower's smart contract code is completely open source, and anyone can view and audit the code. This transparency increases the credibility of the platform, allowing users to use the platform for lending transactions with confidence.
No need for third-party trust
BitPower implements unmediated lending services through smart contracts, and users interact directly with the platform, eliminating dependence on third-party institutions and reducing trust risks.
Data privacy protection
All transactions on the BitPower platform are conducted on the blockchain, and users' personal data is not recorded. Users' assets and transaction information are open and transparent on the blockchain, but personal privacy is fully protected.
Automatic liquidation mechanism
If the value of the borrower's collateral assets is lower than the liquidation threshold, the smart contract will automatically trigger liquidation to prevent the borrower from defaulting and protect the interests of the supplier.
Secure asset collateral
Borrowers can use crypto assets as collateral to reduce loan risks. The collateral assets are stored in the smart contract and can only be released after the borrower repays the loan, protecting the asset security of the supplier.
BitPower's security design
Untamperable smart contracts: Once BitPower's smart contracts are deployed on the blockchain, they cannot be tampered with, ensuring the stability and consistency of the platform rules.
Peer-to-peer transactions: All transactions of BitPower are executed peer-to-peer, and funds flow freely between user wallets without "exiting" the platform, ensuring the security of user funds.
Community-driven governance: BitPower is jointly governed by community members, and all participants participate equally, without privilege distinction, which increases the transparency and fairness of the platform.
Conclusion
BitPower has created a highly secure decentralized lending platform through smart contracts and blockchain technology. Its multiple security measures ensure the security of user assets and transactions. Join BitPower and experience the secure world of decentralized finance! | aimm_y | |
1,894,521 | ✨ Do this first if you are using an Auth Provider 🧙♂️ 🫵 | In this article, I will show you how you can safely backup your users data in your own database when... | 0 | 2024-06-30T13:16:24 | https://dev.to/shricodev/do-this-first-if-you-are-using-an-auth-provider-1ndo | webdev, javascript, opensource, programming | In this article, I will show you how you can safely backup your users data in your own database when using an Authentication Provider.

## Why keep a backup in the first place? 🤔
Did you ever consciously think that when using an auth provider, you are essentially storing your users' information with them and you do not have access to the user data outside of them (not even in your own database)? 😳
The main purpose of Auth Providers is to abstract away the user authentication logic, but in doing so, you are also completely giving your users' data to them without retaining any control yourself.
What if a new intern joins your auth provider company and mistakenly deletes the production database? This is extremely rare, but the chances are not zero. Not only could they shut down their company, but you would also lose all your users' data. They might have some backup set up to prevent such scenarios, but you never know how things are implemented under the hood in another company.
Many of you don’t even think of this and get started with using one of the providers just because they are **easier** 🤷♂️ to get up and running.
> If you are using one then I am pretty sure that you don’t even have a User table in your database. Did I guess right? 🤨

If this realization hits 🫠, then continue with the article as I show you how to safely keep a backup of the user’s data.
***
## Setting up the Project with Kinde 🚀
> ℹ️ If you already have a project that uses an Auth Provider, feel free to skip this section.
I will show you an example in a sample **Next.js** application with one of the popular auth provider called [Kinde](https://kinde.com).
The steps are going to be fairly the same when using any other providers as well.
Run the following command to bootstrap a new Next.js application with **Tailwind**, **Eslint** and **Typescript** support:
```bash
bunx create-next-app@latest --tailwind --eslint --typescript
```
> The above command uses [bun](https://bun.sh/) as the package manager. If you don’t have it installed, you can go ahead with npm, pnpm or yarn.
### Setting up Kinde Authentication
Make sure to have the necessary kinde package installed with the below command:
```bash
bun i @kinde-oss/kinde-auth-nextjs
```
Create a new project in Kinde and copy all the environment variables into `.env` file in your project.
```
KINDE_CLIENT_ID=<your_kinde_client_id>
KINDE_CLIENT_SECRET=<your_kinde_client_secret>
KINDE_ISSUER_URL=https://<your_kinde_subdomain>.kinde.com
KINDE_SITE_URL=http://localhost:3000
KINDE_POST_LOGOUT_REDIRECT_URL=http://localhost:3000
KINDE_POST_LOGIN_REDIRECT_URL=http://localhost:3000/dashboard
```
Note the `KINDE_POST_LOGIN_REDIRECT_URL` variable. This variable ensures that once the user is authenticated in Kinde, they are redirected to the `/dashboard` endpoint.
Make sure to change it according to your needs. Our code assumes that the user is redirected to `/dashboard` once they successfully log in.
Now, we need to set up the Kinde Auth Router Handlers. In the `app/app/api/auth/[kindeAuth]/route.ts`, add the following code:
```typescript
import {handleAuth} from "@kinde-oss/kinde-auth-nextjs/server";
export const GET = handleAuth();
```
This will setup the necessary route handler to add Kinde Authentication to our application.
***
## Setting up the Database Model 🛠️
> ℹ️ I will be using MongoDB as the database and Prisma as the ORM. If you prefer any other Prisma alternatives like Drizzle or Mongoose, feel free to proceed with them.
Run the following command to install Prisma as a development dependency:
```bash
bun i prisma @prisma/client --save-dev
```
Now, initialize Prisma with the following command:
```bash
bunx prisma init
```
After you run this command, a new `schema.prisma` file should be created in the `prisma` folder at the root of your project.
Modify the `schema.prisma` file to include a new User model. The fields in the model can vary based on the information your auth provider provides upon successful user creation.
```Prisma
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "mongodb"
url = env("DATABASE_URL")
}
model User {
id String @id @map("_id") @db.String
email String @unique
username String @unique
name String?
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
}
// If you are adding this change on top of your exising Prisma Schema,
// you will have the rest of your models here...
```
Now that we have our model ready, we need to push it to our database. For this, we need the connection URL.
If you already have a connection URL, that's great. If not, and you're following along, create a new cluster in **MongoDB Atlas** and obtain the database connection string. Then, add a new variable named `DATABASE_URL` with the connection string value to the `.env` file.
```env
DATABASE_URL=<db-connection-string>
// Rest of the environment variables...
```
Now, we need to setup the `PrismaClient` which we can use to query our database. Create a new file `index.ts` in the `/src/db` directory with the following lines of code:
```typescript
import { PrismaClient } from "@prisma/client";
declare global {
// eslint-disable-next-line no-var
var cachedPrisma: PrismaClient;
}
let prisma: PrismaClient;
if (process.env.NODE_ENV === "production") prisma = new PrismaClient();
else {
if (!global.cachedPrisma) global.cachedPrisma = new PrismaClient();
prisma = global.cachedPrisma;
}
export const db = prisma;
```
In development environment, the code initializes `PrismaClient` once and caches it globally to optimize resource usage. In production, it creates a new `PrismaClient` instance per request.
Run the following command to push your changes in your schema to the database.
```bash
bunx prisma db push
```
Now, to have the updated types work in the IDE, run the following command to generate new types based on our updated schema.
```bash
bunx prisma generate
```
Now, this is all that we need to setup our application database and the authentication part.
***
## Setting up the Backup 📥
> ℹ️ Everything we've done up to this step is about setting up the basic project structure. In this section, we will look into the main logic of how to store user information once the user signs up for the first time in our application.
This is how the user data backup to our database architecture is going to look like:

Every time a new user signs up, they are redirected to the `/dashboard` page. There, we check if the user exists in our database. If not, they are redirected to the `/auth/callback` endpoint, and the user is created in our database. If they exist, the application continues as usual.
In the root `page.tsx` file add these lines of code:
> ℹ️ I am using Kinde as an auth provider. The code to check user authentication will vary depending on the one you are using but the logic should be the same. If you are following along, copy and paste this code.
```typescript
"use client";
import {
LoginLink,
LogoutLink,
useKindeBrowserClient,
} from "@kinde-oss/kinde-auth-nextjs";
export default function Home() {
const { isAuthenticated } = useKindeBrowserClient();
return (
<main className="flex justify-center p-24">
{!isAuthenticated ? (
<LoginLink className="p-10 text-zinc-900 text-2xl font-semibold rounded-lg bg-zinc-100">
Log in
</LoginLink>
) : (
<LogoutLink className="p-10 text-zinc-900 text-2xl font-semibold rounded-lg bg-zinc-100">
Log out
</LogoutLink>
)}
</main>
);
}
```
This is the home page of our application, with only the **login** and **logout** buttons, based on whether the user is authenticated.
Once the user successfully logs in, they will be redirected to the `/dashboard` page.
Add the following lines of code in the `/app/dashboard/page.tsx` file:
```typescript
import { db } from "@/db";
import { getKindeServerSession } from "@kinde-oss/kinde-auth-nextjs/server";
import { redirect } from "next/navigation";
const Page = async () => {
const { getUser } = getKindeServerSession();
const user = await getUser();
if (!user?.id) redirect("/api/auth/login");
const userInDB = await db.user.findUnique({
where: {
id: user.id,
},
});
if (!userInDB) redirect("/auth/callback");
return (
<div className="flex flex-col justify-center items-center sm:mt-36 w-full mt-20">
<h1 className="font-semibold text-zinc-900 text-2xl">
You are authenticated
</h1>
<p className="font-medium text-xl text-zinc-700 text-center">
The user has also been created in the DB
</p>
</div>
);
};
export default Page;
```
We check if the user is authenticated or not. If not, we redirect the user to the Kinde login page.
If they are authenticated but the user does not exist in the database, we redirect them to the `/auth/callback` endpoint where a new user is created in our database with the current logged-in user details.
Add the following lines of code in the `/src/app/auth/callback/page.tsx`:
```typescript
import { getKindeServerSession } from "@kinde-oss/kinde-auth-nextjs/server";
import { redirect } from "next/navigation";
import { db } from "@/db";
const Page = async () => {
const { getUser } = getKindeServerSession();
const user = await getUser();
if (!user?.id || !user?.email) redirect("/");
const name =
user?.given_name && user?.family_name
? `${user.given_name} ${user.family_name}`
: user?.given_name || null;
const userInDB = await db.user.findFirst({
where: {
id: user.id,
},
});
if (!userInDB) {
await db.user.create({
data: {
id: user.id,
email: user.email,
...(name && { name }),
},
});
}
redirect("/dashboard");
};
export default Page;
```
Here, we check if the user is in our database. If they exist, we redirect them to the `/dashboard` page. If the user does not exist, we create a new user in our database with their details and then redirect to the `/dashboard` page.
That's it! 🎉 These steps ensure that the user details in the auth provider are synced with our database and are not only stored in the auth provider.
***
## **Wrap Up!** ✨
By now, you have a general idea of how you can backup your user information in your own database when you are using an Authentication Provider.
The documented source code for this article is available here:
https://github.com/shricodev/blogs/tree/main/auth-callback-auth-provider
Thank you so much for reading! 🎉 🫡
> Drop down your thoughts in the comment section below. 👇
{% cta https://linktr.ee/shricodev
%} Follow me on Socials ✌️ {% endcta %}
{% embed https://dev.to/shricodev %} | shricodev |
1,906,600 | A First Glance Review on Retail Sales Data | Introduction The dataset under review is a retail sales data sample, containing information on sales... | 0 | 2024-06-30T13:15:38 | https://dev.to/fadeelah/a-first-glance-review-on-retail-sales-data-2hi3 | data, analytics, datascience, powerplatform | **Introduction**
The dataset under review is a retail sales data sample, containing information on sales transactions, including variables such as product codes, customer information, order quantities, sales, and dates. As a data analysis intern, I'm always on the lookout for fresh insights and trends, and this dataset is packed with goodies. In this "First Glance" report, I'll be sharing my initial observations and findings from a quick spin through the data.let's get started!
**Observations**
I did a quick review of this dataset using power query editor in power BI and Power BI desktop for the quick summary using line visuals.
This dataset contained 2823 rows and 25 columns, which are grouped into whole numbers datatypes, decimal datatypes, and text datatypes.
The columns are:
Order number, quantity ordered, price each, orderline number, Sales, order dates, status, QTR_ID, Month_ID, Year_ID, product line, MSRP, product code, customer name, phone, addressline1, addressline 2, State, postal code, country, territory, contact last name, contactfirst name, dealsize.
**Anomalies**

Three columns contains rows with empty cells. AddressLine 2 contained 89% empty rows, State column contained 51% empty rows and postalcode postal code contained 4% empty rows.
**Trends/Insights**
1. Sales distribution by country shows the USA is making more revenue compared to other countries.

2. The highest revenue was generated in November. It was a flowing trend until November, when it attained its peak.

3. The sales trend by productLine shows classic cars as the highest revenue-generating product.

**Conclusion**
This review of the retail dataset reveals insights into top-selling products, country sales trends, and monthly sales trends. Further analysis could explore the top 10 and bottom 10 selling products, regional market analysis, and customer segmentation. To learn more about data analysis and internship opportunities, visit HNG Internships {https://hng.tech/internship} or Check out HNG Premium{https://hng.tech/premium}
| fadeelah |
1,906,599 | Frontend Technologies | A journey of a thousand miles begins with one step they say, I would say my journey into the tech... | 0 | 2024-06-30T13:12:46 | https://dev.to/veescript/frontend-technologies-3a85 | webdev, html, css, beginners | A journey of a thousand miles begins with one step they say, I would say my journey into the tech world begins with this post.
I'm a brilliant analytical individual who discovered I have an affinity for numbers and lines of code. I mean I became fascinated with how coding works and while I don't know most functions, I can interpret the basic lines of code and believe me, I do not have any background.
I mean, this post is supposed to be about frontend technologies like HTML, CSS, JAVASCRIPT, JQUERY and the likes but I know so little about them.
Yes, I have done my research on some of them, especially HTML and CSS and found their uniqueness to be quite confusing. Which is why I need guidance in this newly started journey.
This is where HNG comes in, a free BootCamp where all my worries can be solved. The niche they are going for is ReactJS and this has brought up my learning appetite.
I daresay I do not know anything about it and my research skills have been awakened.
I guess this post is really about getting mentored but if you would like to join me on my journey and watch me grow. You can use any of the links below, it's never too late to start.
https://hng.tech/internship
https://hng.tech/hire
I know there will be ups and downs but I'm counting on them, because how then will I grow if I don't face a bit of challenge?
PS: I'm kinda going in for the adrenaline rush also.
Well, this is it for this post.
Thanks for reading | veescript |
1,906,597 | Tech notes 02 - Most Important Command Line Notes | I finished the Command Line course from Elzero Web School on YouTube and the link to the notes... | 0 | 2024-06-30T13:10:20 | https://dev.to/omar99/most-important-command-line-notes-345g | cmd, tutorial | **I finished the Command Line course from Elzero Web School on YouTube and the link to the notes document.**
[cmd course - elzero - notes commands](https://docs.google.com/document/d/1V-uM4pyXYzr3J2jLjtPLjoE-Hckh57c7o67Ql6vbo5M/edit?usp=sharing)
**Note: The course is in Arabic but the notes are in English.** | omar99 |
1,906,595 | Paper detailing BitPower Loop’s security | Security Research of BitPower Loop BitPower Loop is a decentralized lending platform based on... | 0 | 2024-06-30T13:08:40 | https://dev.to/sang_ce3ded81da27406cb32c/paper-detailing-bitpower-loops-security-da3 | Security Research of BitPower Loop
BitPower Loop is a decentralized lending platform based on blockchain technology, dedicated to providing users with safe, transparent and efficient financial services. Its core security comes from multi-level technical measures and mechanism design, which ensures the robust operation of the system and the security of user funds. This article will introduce the security of BitPower Loop in detail from five aspects: smart contract security, decentralized management, data and transaction security, fund security and risk control mechanism.
1. Smart Contract Security
Smart contracts are the core components of BitPower Loop, and their codes must undergo strict security audits before deployment. These audits are usually conducted by third-party independent security companies to ensure that there are no vulnerabilities or malicious code in the contract. In addition, the immutability of smart contracts means that once deployed, no one (including the development team) can modify its rules and logic, which fundamentally eliminates the possibility of malicious operations. All operations are automatically executed by smart contracts, avoiding the risk of human intervention and ensuring the fairness and consistency of system operation.
2. Decentralized Management
BitPower Loop eliminates the risks brought by single point failures and central control through decentralized management. The system has no central management agency or owner, and all transactions and operations are jointly verified and recorded by blockchain nodes distributed around the world. This decentralized structure not only improves the system's anti-attack capabilities, but also enhances transparency. Users can publicly view all transaction records, which increases trust in the system.
3. Data and transaction security
BitPower Loop uses advanced encryption technology to protect users' data and transaction information. All data is encrypted during transmission and storage to prevent unauthorized access and data leakage. The consensus mechanism of the blockchain ensures the validity and immutability of each transaction, eliminating the possibility of double payment and forged transactions. In addition, the automated execution of smart contracts also avoids delays and errors caused by human operations, ensuring the real-time and accuracy of transactions.
4. Fund security
The secure storage of user funds is an important feature of BitPower Loop. Funds are stored on the blockchain through smart contracts and maintained by nodes across the entire network. Distributed storage avoids the risk of fund theft caused by centralized storage. In addition, the user's investment returns and shared commissions are automatically allocated to the user's wallet address by the smart contract after the conditions are met, ensuring the timely and accurate arrival of funds.
5. Risk Control Mechanism
BitPower Loop effectively manages lending risks by setting collateral factors and liquidation mechanisms. The collateral factors are independently set according to market liquidity and asset value fluctuations to ensure system stability and lending security. When the value of the borrower's assets falls below a certain threshold, the liquidation mechanism is automatically triggered, ensuring the repayment of the borrower's debt and protecting the interests of the fund provider. In addition, the immutability and automatic execution characteristics of smart contracts further enhance the security and reliability of the system.
Conclusion
BitPower Loop achieves high security and stability through multi-level security measures and mechanism design. Its smart contracts are strictly audited and immutable, decentralized management eliminates single point failure risks, advanced encryption technology protects data and transaction security, distributed storage ensures fund security, and risk control mechanisms manage lending risks. These security features together build a reliable decentralized financial platform that provides users with secure, transparent and efficient financial services. | sang_ce3ded81da27406cb32c | |
1,906,594 | How i solved a backend issue relating to Super apps(Miniapps) and prisma | When you journey into software development and decide to move into the backend aspect if the web,... | 0 | 2024-06-30T13:06:30 | https://dev.to/ejiroosiephri/how-i-solved-a-backend-issue-relating-to-super-appsminiapps-and-prisma-k2i | **When you journey into software development and decide to move into the backend aspect if the web, your debugging, understanding and problem solving skills must be off the charts.**
In this short article/blog i would be discussing how i solved one of the pressing issues i faced while developing a super app with miniapps and storefronts integrated into it.
I followed this exact steps below.
**1. Finding the root cause of the problem**: The first way to go about testing codes, if for example no test is written, for example with JEST or the likes. We make use of **Console.log** to get the log of the potential error and then we can proceed. After logging to the console, i found out that i had issues with request not handling concurrently and issues with prisma not synchronizing data correctly.
2. One of the first thought that came to mind is to check if my promises are well defined, on my check i found out that some functions needed to change from Synchronous to Asynchronous functions so that it can handle multiple requests without any drawback. i also added Error handling in the main root of my code, any branch of the main code has an error class assigned to it. Any error faced are handled gracefully thanks to that.
3. Read PRISMA documentation and read several articles on how to fix migration failures on POSTGRES. A very important aspect of finding out root cause of errors and trying to debug, is to check if an individual has been faced with this issue before. With that you can easily follow the steps shown in the article or documentations to fix the issue associated in your code.
4. To improve and fix issues with storefront not added properly i made use of a data caching feature called Redis, this reduces the load on database and made retrieving data from the database a lot faster.
5. Indexing: took a while before i could understand this, but i fixed this by adding appropriate indexes to the storefronts data and also making sure i added relations between the table i created and the one i want to assign to a new table.
After implementing this changes the performance and the ease of access improved greatly, the data retrieval was way faster, i could handle concurrent request and finally my code started working as it should.
This experience like all other experience trying to figure the root cause of an issue with my code assisted me to get better and become a much better problem solver and in general a better developer.
In pursuit of further ways to get better and to interact with people of all nations and tribes and people, i came across HNG. An intensive and competitive Software development program; in this program i intend to learn a lot by working on real life and comprehensive projects, showing the difference between a personal project and a real life project, also networking with individuals with a high skill level and industry professionals learning from their experiences and ways of solving and resolving issues relating to the codebase.
To join HNG and to move out of your comfort zone to become a better developer you can use this links below to join
https://hng.tech/internship
https://hng.tech/hire
Goodluck and success as you journey ahead to become the best.
| ejiroosiephri | |
1,906,591 | BitPower: Security Analysis of Decentralized Lending Platform | Introduction The rapid development of decentralized finance (DeFi) has made security an important... | 0 | 2024-06-30T13:00:11 | https://dev.to/aimm_x_54a3484700fbe0d3be/bitpower-security-analysis-of-decentralized-lending-platform-28pb | Introduction
The rapid development of decentralized finance (DeFi) has made security an important consideration for users to choose a platform. As a leading decentralized lending platform, BitPower provides users with highly secure lending services through smart contracts and blockchain technology. This article briefly analyzes the security features of BitPower and explores how it ensures the security of user assets and transactions.
Core security features
Smart contract guarantee
All transactions on the BitPower platform are automatically executed by smart contracts, ensuring the security and transparency of transactions and avoiding human intervention and operational risks.
Open and transparent code
BitPower's smart contract code is completely open source, and anyone can view and audit the code. This transparency increases the credibility of the platform, allowing users to use the platform for lending transactions with confidence.
No need for third-party trust
BitPower implements unmediated lending services through smart contracts, and users interact directly with the platform, eliminating dependence on third-party institutions and reducing trust risks.
Data privacy protection
All transactions on the BitPower platform are conducted on the blockchain, and users' personal data is not recorded. Users' assets and transaction information are open and transparent on the blockchain, but personal privacy is fully protected.
Automatic liquidation mechanism
If the value of the borrower's collateral assets is lower than the liquidation threshold, the smart contract will automatically trigger liquidation to prevent the borrower from defaulting and protect the interests of the supplier.
Secure asset collateral
Borrowers can use crypto assets as collateral to reduce loan risks. The collateral assets are stored in the smart contract and can only be released after the borrower repays the loan, protecting the asset security of the supplier.
BitPower's security design
Untamperable smart contracts: Once BitPower's smart contracts are deployed on the blockchain, they cannot be tampered with, ensuring the stability and consistency of the platform rules.
Peer-to-peer transactions: All transactions of BitPower are executed peer-to-peer, and funds flow freely between user wallets without "exiting" the platform, ensuring the security of user funds.
Community-driven governance: BitPower is jointly governed by community members, and all participants participate equally, without privilege distinction, which increases the transparency and fairness of the platform.
Conclusion
BitPower has created a highly secure decentralized lending platform through smart contracts and blockchain technology. Its multiple security measures ensure the security of user assets and transactions. Join BitPower and experience the secure world of decentralized finance! | aimm_x_54a3484700fbe0d3be | |
1,906,589 | CICD-PIPELINE (SIMPLE PET CLINIC WEB-APP) | 🚀I'm excited to share my First CI/CD Project,I have deployed a Java-based Pet Clinic application... | 0 | 2024-06-30T12:56:19 | https://dev.to/sukuru_naga_sai_srinivasu/cicd-pipeline-simple-pet-clinic-web-app-3p2i | jenkins, cicd, docker, aws | 

🚀I'm excited to share my First CI/CD Project,I have deployed a Java-based Pet Clinic application using Jenkins CI/CD tool and Docker for containerisation,this project involves setting up an AWS EC2 instance and installing Jenkins, Docker, and Trivy for security scanning.
PIPELINE-OVERVIEW:
Step 1 — Create an Ubuntu T2 Large Instance
Step 2 — Install Jenkins, Docker and Trivy. Create a Sonarqube Container using Docker.
Step 3 — Install Plugins like JDK, Sonarqube Scanner, Maven, OWASP Dependency Check,
Step 4 — Create a Pipeline Project in Jenkins using a Declarative Pipeline
Step 5 — Install OWASP Dependency Check Plugins
Step 6 — Docker Image Build and Push
Step 7 — Deploy the image using Docker
Step 8 — Access the Real World Application
GitHub repo -https://github.com/SNS-Srinivasu | sukuru_naga_sai_srinivasu |
1,906,588 | The operating mechanism of BitPower Loop | Introduction With the development of blockchain technology, decentralized finance (DeFi) has become a... | 0 | 2024-06-30T12:52:40 | https://dev.to/woy_ca2a85cabb11e9fa2bd0d/the-operating-mechanism-of-bitpower-loop-5c06 | btc | Introduction
With the development of blockchain technology, decentralized finance (DeFi) has become a hot topic in the field of financial technology. As a decentralized lending protocol based on smart contracts, BitPower Loop is committed to providing secure, transparent and efficient financial services. This article will introduce the operating mechanism of BitPower Loop in detail and analyze its core components and advantages.
Overview of BitPower Loop
BitPower Loop is a lending smart contract protocol running on the Ethereum Virtual Machine (EVM), supporting TRC20, ERC20 and Tron blockchain technologies. The platform is completely decentralized, with no central manager or owner, and all operations are automatically executed by smart contracts. This design ensures the transparency and security of the platform.
Core components
1. Smart contracts
Smart contracts are the core of BitPower Loop. They are self-executing codes running on the blockchain and cannot be changed once deployed. BitPower Loop's smart contracts are responsible for managing users' deposits and loans, calculating interest, processing transactions, and enforcing other platform rules. These contracts are open source, and anyone can review their code to ensure their security and transparency.
2. Lending mechanism
In BitPower Loop, users can participate in the platform through two ways: deposit and borrow:
Deposit: Users deposit assets into the platform and become liquidity providers (LPs). These deposits will be used for borrowers' loan needs, and LPs will receive interest income. The interest rate is determined by market supply and demand and automatically adjusted by smart contracts.
Borrow: Users can borrow by providing collateral (such as crypto assets). Borrowers need to pay interest, and the calculation of interest is also completed by smart contracts to ensure the transparency and fairness of the process.
3. Interest rate calculation
The interest rate is dynamically adjusted in BitPower Loop, depending on the supply and demand of the market. Smart contracts adjust the interest rate based on the total deposits and borrowings on the platform to ensure the liquidity and stability of the market. For example, when the demand for borrowing increases, the interest rate will rise, attracting more liquidity providers; vice versa.
4. Compound interest mechanism
BitPower Loop supports compound interest mechanism, that is, users' interest income can be reinvested to obtain more interest. Compound interest calculation is automatically executed by smart contracts, allowing users to easily achieve rolling growth of income.
5. Referral Rewards
BitPower Loop has a referral reward mechanism. By inviting new users to join the platform, users can get additional benefits. The specific referral reward structure is as follows:
First generation referral reward: 20%
Second generation referral reward: 10%
Third to seventh generation referral reward: 5%
Eighth to tenth generation referral reward: 3%
Eleventh to seventeenth generation referral reward: 1%
This multi-level reward mechanism not only encourages users to actively promote the platform, but also brings considerable additional benefits to users.
Operational advantages
1. Decentralization
BitPower Loop is completely decentralized, with no central manager or owner, and all operations are automatically executed by smart contracts. This decentralized design eliminates the risk of human intervention and ensures the fairness and stability of the platform.
2. Transparency and security
All transactions and operations are recorded on the blockchain, which is open and transparent and can be reviewed by anyone. The open source nature of smart contracts further enhances the transparency and security of the platform, and users can invest and trade with confidence.
3. Global Service
BitPower Loop is a global decentralized platform that allows users to participate in investment and trading at any time, regardless of where they are. The platform's fully decentralized nature ensures that there are no geographical or time restrictions, providing great convenience for users.
4. No additional fees
The platform does not charge any additional fees or commissions, and all participants can obtain benefits fairly. This fee-free structure makes more users willing to participate, improving the overall liquidity and activity of the platform.
Conclusion
BitPower Loop implements decentralized lending services through smart contracts, providing users with safe, transparent and efficient financial solutions. Its unique lending mechanism, dynamic interest rates, compound interest calculations and referral rewards make it highly competitive in the DeFi field. By utilizing these features, users can not only realize the appreciation of their assets, but also gain more opportunities and benefits in the wave of decentralized finance. The decentralization and transparency of BitPower Loop provide users with great trust guarantees, making it an important position in the future financial ecosystem.
| woy_ca2a85cabb11e9fa2bd0d |
1,906,586 | Git revert commit – Come rimuovere l'ultimo commit fatto | Fonte: Git Revert Commit – How to Undo the Last Commit Voglio riportare due metodi per poter... | 0 | 2024-06-30T12:51:31 | https://dev.to/mcale/git-revert-commit-come-rimuovere-lultimo-commit-fatto-1nb3 | git, revert, italian |
Fonte: [Git Revert Commit – How to Undo the Last Commit](https://www.freecodecamp.org/news/git-revert-commit-how-to-undo-the-last-commit/)
Voglio riportare due metodi per poter ripristinare il proprio codice quando per sbaglio del codice errato è stato commitatto o se si ha sbagliato branch di destinazione.
## Il comando `revert`
Il comando `revert`, dopo aver specificato il commit da dover rimuovere, creerà un altro commit che farà le operazioni opposte alle tue ultime modifiche, ripristinando il codice al suo stato iniziale.
Puoi usarlo in questo modo:
```bash
git revert <commit_SHA_to_revert>
```
Puoi trovare lo SHA del commit che vuoi rimuovere utilizzando `git log`. Il primo commit che verrà visualizzato è l'ultimo commit che è stato creato. Dopo aver individuato il commit, puoi copiare lo SHA e usarlo nel comando indicato precedentemente.
## Il comando `reset`
Il comando reset è più immediato perchè non c'è bisogno dello SHA del commit per eseguirlo, ma devi prestare molta più attenzione a usarlo, perchè verrà cambiata la history dei commit.
L'operazione fatta dal comando `reset` sposterà il puntamento alla HEAD del branch corrente e scarterà tutto il resto che è stato fatto dopo.
Puoi usarlo in questo modo:
```bash
git reset --soft HEAD~1
```
L'opzione `--soft` significa che non perderai i cambiamenti non aggiunti a un commit che potresti avere.
Se invece vuoi ripristinare l'ultimo commit rimuovendo anche tutte le modifiche non presenti in un commit puoi usare l'opzione `--hard`
```bash
git reset --hard HEAD~1
```
## Dovresti utilizzare `reset` o `revert`?
Dovresti utilizzare `reset` solo se il commit che devi rimuovere esiste solo localmente e non hai ancora eseguito un `push` per caricare il codice nel repository.
Questo comando cambia la history dei commit e potrebbe sovrascrivere delle modifiche importanti per altri membri del team che stanno lavorando sullo stesso progetto.
Viceversa `revert` crea un nuovo commit che rimuove le modifiche, così se il commit da togliere è già stato caricato nel repository, è meglio utilizzare un revert che non cambia o non rischia di sovrascrivere la history dei commit.
| mcale |
1,906,587 | The operating mechanism of BitPower Loop | Introduction With the development of blockchain technology, decentralized finance (DeFi) has become a... | 0 | 2024-06-30T12:51:28 | https://dev.to/woy_621fc0f3ac62fff68606e/the-operating-mechanism-of-bitpower-loop-2085 | btc | Introduction
With the development of blockchain technology, decentralized finance (DeFi) has become a hot topic in the field of financial technology. As a decentralized lending protocol based on smart contracts, BitPower Loop is committed to providing secure, transparent and efficient financial services. This article will introduce the operating mechanism of BitPower Loop in detail and analyze its core components and advantages.
Overview of BitPower Loop
BitPower Loop is a lending smart contract protocol running on the Ethereum Virtual Machine (EVM), supporting TRC20, ERC20 and Tron blockchain technologies. The platform is completely decentralized, with no central manager or owner, and all operations are automatically executed by smart contracts. This design ensures the transparency and security of the platform.
Core components
1. Smart contracts
Smart contracts are the core of BitPower Loop. They are self-executing codes running on the blockchain and cannot be changed once deployed. BitPower Loop's smart contracts are responsible for managing users' deposits and loans, calculating interest, processing transactions, and enforcing other platform rules. These contracts are open source, and anyone can review their code to ensure their security and transparency.
2. Lending mechanism
In BitPower Loop, users can participate in the platform through two ways: deposit and borrow:
Deposit: Users deposit assets into the platform and become liquidity providers (LPs). These deposits will be used for borrowers' loan needs, and LPs will receive interest income. The interest rate is determined by market supply and demand and automatically adjusted by smart contracts.
Borrow: Users can borrow by providing collateral (such as crypto assets). Borrowers need to pay interest, and the calculation of interest is also completed by smart contracts to ensure the transparency and fairness of the process.
3. Interest rate calculation
The interest rate is dynamically adjusted in BitPower Loop, depending on the supply and demand of the market. Smart contracts adjust the interest rate based on the total deposits and borrowings on the platform to ensure the liquidity and stability of the market. For example, when the demand for borrowing increases, the interest rate will rise, attracting more liquidity providers; vice versa.
4. Compound interest mechanism
BitPower Loop supports compound interest mechanism, that is, users' interest income can be reinvested to obtain more interest. Compound interest calculation is automatically executed by smart contracts, allowing users to easily achieve rolling growth of income.
5. Referral Rewards
BitPower Loop has a referral reward mechanism. By inviting new users to join the platform, users can get additional benefits. The specific referral reward structure is as follows:
First generation referral reward: 20%
Second generation referral reward: 10%
Third to seventh generation referral reward: 5%
Eighth to tenth generation referral reward: 3%
Eleventh to seventeenth generation referral reward: 1%
This multi-level reward mechanism not only encourages users to actively promote the platform, but also brings considerable additional benefits to users.
Operational advantages
1. Decentralization
BitPower Loop is completely decentralized, with no central manager or owner, and all operations are automatically executed by smart contracts. This decentralized design eliminates the risk of human intervention and ensures the fairness and stability of the platform.
2. Transparency and security
All transactions and operations are recorded on the blockchain, which is open and transparent and can be reviewed by anyone. The open source nature of smart contracts further enhances the transparency and security of the platform, and users can invest and trade with confidence.
3. Global Service
BitPower Loop is a global decentralized platform that allows users to participate in investment and trading at any time, regardless of where they are. The platform's fully decentralized nature ensures that there are no geographical or time restrictions, providing great convenience for users.
4. No additional fees
The platform does not charge any additional fees or commissions, and all participants can obtain benefits fairly. This fee-free structure makes more users willing to participate, improving the overall liquidity and activity of the platform.
Conclusion
BitPower Loop implements decentralized lending services through smart contracts, providing users with safe, transparent and efficient financial solutions. Its unique lending mechanism, dynamic interest rates, compound interest calculations and referral rewards make it highly competitive in the DeFi field. By utilizing these features, users can not only realize the appreciation of their assets, but also gain more opportunities and benefits in the wave of decentralized finance. The decentralization and transparency of BitPower Loop provide users with great trust guarantees, making it an important position in the future financial ecosystem.
| woy_621fc0f3ac62fff68606e |
1,906,585 | The operating mechanism of BitPower Loop | Introduction With the development of blockchain technology, decentralized finance (DeFi) has become a... | 0 | 2024-06-30T12:49:41 | https://dev.to/wot_ee4275f6aa8eafb35b941/the-operating-mechanism-of-bitpower-loop-3l0 | btc | Introduction
With the development of blockchain technology, decentralized finance (DeFi) has become a hot topic in the field of financial technology. As a decentralized lending protocol based on smart contracts, BitPower Loop is committed to providing secure, transparent and efficient financial services. This article will introduce the operating mechanism of BitPower Loop in detail and analyze its core components and advantages.
Overview of BitPower Loop
BitPower Loop is a lending smart contract protocol running on the Ethereum Virtual Machine (EVM), supporting TRC20, ERC20 and Tron blockchain technologies. The platform is completely decentralized, with no central manager or owner, and all operations are automatically executed by smart contracts. This design ensures the transparency and security of the platform.
Core components
1. Smart contracts
Smart contracts are the core of BitPower Loop. They are self-executing codes running on the blockchain and cannot be changed once deployed. BitPower Loop's smart contracts are responsible for managing users' deposits and loans, calculating interest, processing transactions, and enforcing other platform rules. These contracts are open source, and anyone can review their code to ensure their security and transparency.
2. Lending mechanism
In BitPower Loop, users can participate in the platform through two ways: deposit and borrow:
Deposit: Users deposit assets into the platform and become liquidity providers (LPs). These deposits will be used for borrowers' loan needs, and LPs will receive interest income. The interest rate is determined by market supply and demand and automatically adjusted by smart contracts.
Borrow: Users can borrow by providing collateral (such as crypto assets). Borrowers need to pay interest, and the calculation of interest is also completed by smart contracts to ensure the transparency and fairness of the process.
3. Interest rate calculation
The interest rate is dynamically adjusted in BitPower Loop, depending on the supply and demand of the market. Smart contracts adjust the interest rate based on the total deposits and borrowings on the platform to ensure the liquidity and stability of the market. For example, when the demand for borrowing increases, the interest rate will rise, attracting more liquidity providers; vice versa.
4. Compound interest mechanism
BitPower Loop supports compound interest mechanism, that is, users' interest income can be reinvested to obtain more interest. Compound interest calculation is automatically executed by smart contracts, allowing users to easily achieve rolling growth of income.
5. Referral Rewards
BitPower Loop has a referral reward mechanism. By inviting new users to join the platform, users can get additional benefits. The specific referral reward structure is as follows:
First generation referral reward: 20%
Second generation referral reward: 10%
Third to seventh generation referral reward: 5%
Eighth to tenth generation referral reward: 3%
Eleventh to seventeenth generation referral reward: 1%
This multi-level reward mechanism not only encourages users to actively promote the platform, but also brings considerable additional benefits to users.
Operational advantages
1. Decentralization
BitPower Loop is completely decentralized, with no central manager or owner, and all operations are automatically executed by smart contracts. This decentralized design eliminates the risk of human intervention and ensures the fairness and stability of the platform.
2. Transparency and security
All transactions and operations are recorded on the blockchain, which is open and transparent and can be reviewed by anyone. The open source nature of smart contracts further enhances the transparency and security of the platform, and users can invest and trade with confidence.
3. Global Service
BitPower Loop is a global decentralized platform that allows users to participate in investment and trading at any time, regardless of where they are. The platform's fully decentralized nature ensures that there are no geographical or time restrictions, providing great convenience for users.
4. No additional fees
The platform does not charge any additional fees or commissions, and all participants can obtain benefits fairly. This fee-free structure makes more users willing to participate, improving the overall liquidity and activity of the platform.
Conclusion
BitPower Loop implements decentralized lending services through smart contracts, providing users with safe, transparent and efficient financial solutions. Its unique lending mechanism, dynamic interest rates, compound interest calculations and referral rewards make it highly competitive in the DeFi field. By utilizing these features, users can not only realize the appreciation of their assets, but also gain more opportunities and benefits in the wave of decentralized finance. The decentralization and transparency of BitPower Loop provide users with great trust guarantees, making it an important position in the future financial ecosystem.
| wot_ee4275f6aa8eafb35b941 |
1,906,584 | How to Cancel Fetch Requests in JavaScript | Ever wanted to cancel a fetch request in JavaScript? Discover how the **AbortController **makes it... | 0 | 2024-06-30T12:47:49 | https://dev.to/adriangube/how-to-cancel-fetch-requests-in-javascript-4ggd | javascript, webdev, beginners, tutorial | Ever wanted to **cancel a fetch request in JavaScript?** Discover how the **AbortController **makes it easy to manage and stop async operations. Check out this quick guide to master it in just a few steps!
First, you might wonder when you would need to cancel a fetch request. Here are an examples:
When **filtering a list of elements** in **JavaScript**, the data often comes from an **API**. It’s common to add filtering, pagination, and other features to display the list correctly. However, **JavaScript** does **not guarantee** that fetch requests will be resolved in the **order** they were made.
For instance, if a user applies filters **quickly**, you might make requests A, B, and C in that order. Ideally, you want to display the results of the latest request (C) last. However, due to the fact that the browser and server work **asynchronously**, requests might resolve **out of order**. Request C could be resolved first, followed by A and B, leading to **incorrect data being displayed to the user**.
In such situations, **AbortController **helps by allowing you to **cancel requests A and B**, ensuring only the latest request (C) is active. This way, the user sees the **expected results**.
I’m going to show you **step by step** an example about how to use **AbortController** to handle this kind of situation.
**First**, create an **AbortController instance**. Retrieve the signal from the controller, which will be used to control the fetch request.

Then, use the **signal** in the fetch **request options**. Handle the response by logging the data. If an error occurs, check if it's an **AbortError **to know if the request was cancelled using the AbortController. If not, you can handle the error as usual.

Finally, when you need to **cancel the fetch request**, because for example, a user is selecting a new filter you can use **controller.abort** to cancel the previous request. I’m using a setTimeout in this example for simplicity.

This is the complete code example.

If you’ve found this short tutorial helpful, please like this post and share it with your colleagues and friends. Have you previously used the AbortController? Share your thoughts in the comments below!
| adriangube |
1,906,582 | The operating mechanism of BitPower Loop | Introduction With the development of blockchain technology, decentralized finance (DeFi) has become a... | 0 | 2024-06-30T12:44:46 | https://dev.to/wot_dcc94536fa18f2b101e3c/the-operating-mechanism-of-bitpower-loop-56ip | btc | Introduction
With the development of blockchain technology, decentralized finance (DeFi) has become a hot topic in the field of financial technology. As a decentralized lending protocol based on smart contracts, BitPower Loop is committed to providing secure, transparent and efficient financial services. This article will introduce the operating mechanism of BitPower Loop in detail and analyze its core components and advantages.
Overview of BitPower Loop
BitPower Loop is a lending smart contract protocol running on the Ethereum Virtual Machine (EVM), supporting TRC20, ERC20 and Tron blockchain technologies. The platform is completely decentralized, with no central manager or owner, and all operations are automatically executed by smart contracts. This design ensures the transparency and security of the platform.
Core components
1. Smart contracts
Smart contracts are the core of BitPower Loop. They are self-executing codes running on the blockchain and cannot be changed once deployed. BitPower Loop's smart contracts are responsible for managing users' deposits and loans, calculating interest, processing transactions, and enforcing other platform rules. These contracts are open source, and anyone can review their code to ensure their security and transparency.
2. Lending mechanism
In BitPower Loop, users can participate in the platform through two ways: deposit and borrow:
Deposit: Users deposit assets into the platform and become liquidity providers (LPs). These deposits will be used for borrowers' loan needs, and LPs will receive interest income. The interest rate is determined by market supply and demand and automatically adjusted by smart contracts.
Borrow: Users can borrow by providing collateral (such as crypto assets). Borrowers need to pay interest, and the calculation of interest is also completed by smart contracts to ensure the transparency and fairness of the process.
3. Interest rate calculation
The interest rate is dynamically adjusted in BitPower Loop, depending on the supply and demand of the market. Smart contracts adjust the interest rate based on the total deposits and borrowings on the platform to ensure the liquidity and stability of the market. For example, when the demand for borrowing increases, the interest rate will rise, attracting more liquidity providers; vice versa.
4. Compound interest mechanism
BitPower Loop supports compound interest mechanism, that is, users' interest income can be reinvested to obtain more interest. Compound interest calculation is automatically executed by smart contracts, allowing users to easily achieve rolling growth of income.
5. Referral Rewards
BitPower Loop has a referral reward mechanism. By inviting new users to join the platform, users can get additional benefits. The specific referral reward structure is as follows:
First generation referral reward: 20%
Second generation referral reward: 10%
Third to seventh generation referral reward: 5%
Eighth to tenth generation referral reward: 3%
Eleventh to seventeenth generation referral reward: 1%
This multi-level reward mechanism not only encourages users to actively promote the platform, but also brings considerable additional benefits to users.
Operational advantages
1. Decentralization
BitPower Loop is completely decentralized, with no central manager or owner, and all operations are automatically executed by smart contracts. This decentralized design eliminates the risk of human intervention and ensures the fairness and stability of the platform.
2. Transparency and security
All transactions and operations are recorded on the blockchain, which is open and transparent and can be reviewed by anyone. The open source nature of smart contracts further enhances the transparency and security of the platform, and users can invest and trade with confidence.
3. Global Service
BitPower Loop is a global decentralized platform that allows users to participate in investment and trading at any time, regardless of where they are. The platform's fully decentralized nature ensures that there are no geographical or time restrictions, providing great convenience for users.
4. No additional fees
The platform does not charge any additional fees or commissions, and all participants can obtain benefits fairly. This fee-free structure makes more users willing to participate, improving the overall liquidity and activity of the platform.
Conclusion
BitPower Loop implements decentralized lending services through smart contracts, providing users with safe, transparent and efficient financial solutions. Its unique lending mechanism, dynamic interest rates, compound interest calculations and referral rewards make it highly competitive in the DeFi field. By utilizing these features, users can not only realize the appreciation of their assets, but also gain more opportunities and benefits in the wave of decentralized finance. The decentralization and transparency of BitPower Loop provide users with great trust guarantees, making it an important position in the future financial ecosystem.
| wot_dcc94536fa18f2b101e3c |
1,906,581 | Security: BitPower's impeccable security | Security: BitPower's impeccable security BitPower, a decentralized platform built on the blockchain,... | 0 | 2024-06-30T12:43:58 | https://dev.to/pingz_iman_38e5b3b23e011f/security-bitpowers-impeccable-security-5a76 |

Security: BitPower's impeccable security
BitPower, a decentralized platform built on the blockchain, knows the importance of security. In this rapidly changing digital world, security is the top priority for every user. BitPower is not just a financial platform, it is more like a solid fortress, based on five major security features to ensure that users' funds and information are foolproof.
First, decentralization is the cornerstone of BitPower's security system. All transactions and operations are automatically executed through smart contracts, and no one can unilaterally change the rules or manipulate funds. After deploying smart contracts, founders and developers, like ordinary users, have no privileges to intervene in the system. This decentralized design eliminates the risk of human manipulation and makes every transaction transparent and irreversible.
Secondly, BitPower emphasizes the transparency of transactions. All transaction records are permanently stored on the blockchain and can be viewed by anyone at any time. Users can clearly see the flow of funds and operation records, ensuring that everything is open and transparent. The immutability of the blockchain ensures the authenticity of the records, and users do not need to worry about information being tampered with or deleted.
Third, smart contracts are the core of BitPower's security system. Smart contracts are pre-written codes that automatically execute as long as the triggering conditions are met. This automated feature not only improves efficiency, but also greatly reduces the possibility of human error and fraud. Every operation of the user is completed by the smart contract, ensuring the security of funds and the accuracy of operations.
Fourth, BitPower adopts a strict risk management mechanism. The management and auditing of the platform are the responsibility of Comptroller, and each underlying asset has a specific collateral factor. Comptroller monitors and manages risks in real time through smart contract calls to ensure that every lending operation is carried out within a safe range. When the value of the asset drops to a certain level, the liquidation mechanism is automatically triggered to protect the interests of both borrowers and lenders.
Finally, BitPower operates through a global distributed network, further improving security. The platform has no single server or control center, and all data is distributed in nodes around the world. This distributed structure not only prevents single point failures, but also resists hacker attacks and other network threats. Users' funds and data are best protected in this network.
In summary, BitPower has created an impeccable security system through decentralization, transparency, smart contracts, risk management, and distributed networks. Users can safely operate on this platform and enjoy the convenience and benefits brought by decentralized finance. BitPower is not only a financial tool, but also a solid guarantee for user funds and information. Safe and worry-free, all in BitPower.
BTC #ETH #Crypto #SC #DeFi #BitPower | pingz_iman_38e5b3b23e011f | |
1,906,579 | An Exposition into Vue and React | Introduction As part of the tasks on HNG Internship 11, I'm supposed to talk about two... | 0 | 2024-06-30T12:41:49 | https://dev.to/ikuewumi/an-exposition-into-vue-and-react-m22 | webdev, javascript, vue, react | ## Introduction
As part of the tasks on HNG Internship 11, I'm supposed to talk about two frontend technologies, So let's get into it! ReactJS and Vue are the most popular JavaScript frameworks used to create front-end applications. In this article, we'll be diving into the similarities, differences, and various pros and cons of using either of them.
## What is a JavaScript Framework?
A JavaScript framework, also called a component framework is a set of JavaScript libraries optimized to build reusable chunks, or components of markup (HTML), styles (CSS), and logic (JS) to build performant web applications.
## React
React is the first framework to popularize compartmentalization of the user interface. Found in 2013 by Facebook, React revolutionized the web with functional components, where a component is as simple as a JavaScript function. React can be written in `.jsx`, or `.js` files. This makes it extremely flexible, and you can build pretty much anything you want in your way.
React's Virtual DOM (Document Object Model) is a lightweight, in-memory representation of the real DOM. It allows React to efficiently update the UI by comparing the current virtual DOM with a previous snapshot, identifying the differences (or "diffs"), and applying only the necessary changes to the real DOM. React also brought JSX, or JavaScript Expression, an "HTML-looking" syntax that brings the powers of JavaScript functions to HTML. Here's an example of some React Code:
```jsx
function HelloWorld(props) {
return <h1>Hello {props.name}</h1>
}
```
This component takes in a `name` prop (or argument) and interpolates it into the template, if the `name` prop ever changes the framework reacts and changes it in the template as well.
## Vue
This is a framework created by Evan You in 2014. It also helps in building components but is a little more opinionated. Vue Components are placed in `.vue` files, and each file is split into three parts. The script, the styles, and the markup. It, like React also has a virtual DOM for efficient updates. Here's an example of a Vue component (Vue 3):
```vue
<script setup>
import {ref} from "vue"
const name = ref("John")
</script>
<template>Hello {{name}}</template>
<style></style>
```
## Differences
Although Vue and React do the same thing (they help build components), they have different styles and rules. Here are some of the pros and cons of each of them
### React - Pros
- Huge Community: You can find a lot of people to help you along the way
- Popularity: There are more job opportunities with React than with any other framework out there
- Libraries: There are a lot of libraries for most tasks and use cases
- Flexibility: Because the framework is very minimal, you can create your applications with whatever tools you wish; you're not locked into just one pattern
### React - Cons
- Lack Of Standards: Since the framework is very minimal and doesn't care about routing, state management, and other things it's easy to get into bad coding practices, or get overwhelmed by the choices available
- Rapid Changes: The React ecosystem is ever-changing, and it's straightforward to get left behind
- Steep Learning Curve: It can be challenging to learn and understand some concepts like lifecycle methods and hooks in React
### Vue - Pros
- Standards: Vue has well-defined standards. They have official, well-thought-out libraries for routing, state management, etc. which can speed up the development process
- Stability: Since Vue 3's inception in 2020, the ecosystem has been stable meaning consistent performance and support for long-term projects
- Performance: Even though Vue and React use the Virtual DOM, Vue has some extra optimizations that allow it to be extremely performant
### Vue - Cons
- Less Popularity: Since Vue is not as popular as React, there are fewer positions for Vue developers
- Migration Issues: The migration from Vue 2 to Vue 3 is one of the roughest transitions a developer could make. The syntax and design patterns were very different from each other
## HNG....What?!
So, some of you are probably wondering....what is this HNG internship anyway?
HNG internship is a two-month long internship meant to hone your skills to world-class levels by having you work on real-life projects. I just started, and I'm already seeing some benefits...like writing this article in the first place. I never would have done this on my own. Continuing with HNG, I would like to work on exciting front-end projects, improve my skills, meet people, and make friends. You can:
1. Check out the internship on https://hng.tech/internship or,
2. Hire top-tier experts at https://hng.tech/hire
## Conclusion
React and Vue are the most popular JavaScript frameworks for building robust, performant applications. However, they have different styles for different aspects of their functionality. If you want maximum freedom on how to build your components, you might want to consider React, but if you want a more opinionated tool, where there are standardized ways of doing things, Vue might be the way to go.
If you found this article useful, like and follow. And, this was my first article, how'd I do? Tell me in the comments.
Happy Coding,
Ayobami.
| ikuewumi |
1,906,580 | BitPower: Security Analysis of Decentralized Lending Platform | Introduction The rapid development of decentralized finance (DeFi) has made security an important... | 0 | 2024-06-30T12:41:47 | https://dev.to/aimm_l_6b8a62242513520c18/bitpower-security-analysis-of-decentralized-lending-platform-40bj | Introduction
The rapid development of decentralized finance (DeFi) has made security an important consideration for users to choose a platform. As a leading decentralized lending platform, BitPower provides users with highly secure lending services through smart contracts and blockchain technology. This article briefly analyzes the security features of BitPower and explores how it ensures the security of user assets and transactions.
Core security features
Smart contract guarantee
All transactions on the BitPower platform are automatically executed by smart contracts, ensuring the security and transparency of transactions and avoiding human intervention and operational risks.
Open and transparent code
BitPower's smart contract code is completely open source, and anyone can view and audit the code. This transparency increases the credibility of the platform, allowing users to use the platform for lending transactions with confidence.
No need for third-party trust
BitPower implements unmediated lending services through smart contracts, and users interact directly with the platform, eliminating dependence on third-party institutions and reducing trust risks.
Data privacy protection
All transactions on the BitPower platform are conducted on the blockchain, and users' personal data is not recorded. Users' assets and transaction information are open and transparent on the blockchain, but personal privacy is fully protected.
Automatic liquidation mechanism
If the value of the borrower's collateral assets is lower than the liquidation threshold, the smart contract will automatically trigger liquidation to prevent the borrower from defaulting and protect the interests of the supplier.
Secure asset collateral
Borrowers can use crypto assets as collateral to reduce loan risks. The collateral assets are stored in the smart contract and can only be released after the borrower repays the loan, protecting the asset security of the supplier.
BitPower's security design
Untamperable smart contracts: Once BitPower's smart contracts are deployed on the blockchain, they cannot be tampered with, ensuring the stability and consistency of the platform rules.
Peer-to-peer transactions: All transactions of BitPower are executed peer-to-peer, and funds flow freely between user wallets without "exiting" the platform, ensuring the security of user funds.
Community-driven governance: BitPower is jointly governed by community members, and all participants participate equally, without privilege distinction, which increases the transparency and fairness of the platform.
Conclusion
BitPower has created a highly secure decentralized lending platform through smart contracts and blockchain technology. Its multiple security measures ensure the security of user assets and transactions. Join BitPower and experience the secure world of decentralized finance! | aimm_l_6b8a62242513520c18 | |
1,906,578 | Premier Badminton Streaming Platform | In today's digital era, access to live sports content has become increasingly important for fans... | 0 | 2024-06-30T12:40:31 | https://dev.to/rakettv/premier-badminton-streaming-platform-375p | In today's digital era, access to live sports content has become increasingly important for fans around the globe. Rakettv.net stands out as a premier channel for badminton enthusiasts, offering live streaming, highlights, and a variety of related content from world tours. As a free platform, Rakettv.net garners significant attention due to its user-friendly approach and dedication to promoting badminton. However, its regional access restrictions highlight the challenges of geo-blocking in the sports streaming industry and the importance of VPN services for an unrestricted viewing experience.
**Comprehensive Coverage of Badminton**
Rakettv.net has successfully carved out a niche in the sports streaming market with its exclusive focus on badminton. This specialization allows the channel to offer comprehensive coverage of the sport, including live matches from prestigious tournaments, player interviews, expert analysis, and highlight clips. The platform caters to both hardcore badminton fans and casual viewers, ensuring everyone can find something of interest.
**Live Match Streaming**
One of Rakettv.net's standout features is live match streaming. Fans can watch their favorite players compete in real-time, experiencing the thrill and excitement of top-tier badminton. The quality of the live streaming is consistently high, with minimal buffering and clear visuals, which is crucial for appreciating the fast-paced action of the sport. Additionally, [Rakettv.net](https://www.rakettv.net) provides detailed match schedules and previews, helping viewers stay informed about upcoming matches and key events.
**Highlights and Expert Analysis**
For those unable to watch live matches, Rakettv.net offers an extensive package of highlights. These highlights capture the most exciting moments from each match, allowing fans to relive the best plays and crucial turning points. The highlights are well-edited and concise, making them perfect for quick viewing during breaks or on the go.
Moreover, expert analysis from Rakettv.net adds depth to its coverage. The platform features insights from former players, coaches, and analysts who break down matches, discuss strategies, and provide valuable commentary. This expert analysis enriches the viewing experience, offering fans a deeper understanding of the game and its nuances. Whether it's dissecting a player's technique or exploring tactical decisions, the expert insights on Rakettv.net enhance the overall appreciation of badminton.
**Accessibility and Geo-Blocking**
Despite its numerous advantages, Rakettv.net faces challenges related to geo-blocking. While the platform is free to watch, some live matches are restricted based on the viewer's location. This means fans in certain regions may not be able to access specific content, which can be frustrating for those eager to watch their favorite players in action.
Geo-blocking is a common practice in the sports streaming industry, often due to licensing agreements and broadcasting rights. However, it can limit the reach and accessibility of platforms like Rakettv.net. For fans facing these restrictions, the solution often lies in the use of Virtual Private Networks (VPNs).
**The Role of VPNs in Unrestricted Viewing**
VPNs have become an essential tool for bypassing geo-blocking and ensuring unrestricted access to online content. By routing internet traffic through servers in different locations, VPNs allow users to obtain an IP address from another region. This enables access to content that may be restricted in their home country.
For Rakettv.net viewers, using a VPN can provide access to live matches and other content that might be unavailable due to geo-blocking. By connecting to a server in a different region, fans can watch live streams and enjoy the full range of content offered by Rakettv.net. VPNs also enhance privacy and security, protecting users' online activities from potential threats.
**Enhancing the Viewing Experience**
The ability to access Rakettv.net content through a VPN significantly enhances the viewing experience. Fans can follow their favorite players and tournaments without worrying about regional restrictions. This not only improves the platform's accessibility but also helps build a global community of badminton fans.
Additionally, the use of VPNs underscores the importance of making sports content universally available. In an era where digital connectivity transcends national borders, the demand for unrestricted access to live sports is higher than ever. Platforms like Rakettv.net can play a crucial role in meeting this demand, provided they can navigate the complexities of broadcasting rights and licensing agreements.
**The Future of Sports Streaming**
The challenges faced by Rakettv.net highlight broader trends in the sports streaming industry. As more fans turn to online platforms for their sports content, the need for high-quality, easily accessible streaming will continue to grow. This shift presents both opportunities and challenges for content providers.
On one hand, platforms like Rakettv.net have the potential to reach a wide audience and build a dedicated fan base. By offering specialized content and leveraging digital technology, they can provide an immersive and engaging viewing experience. On the other hand, issues like geo-blocking and licensing agreements must be addressed to ensure fans can access the content they want, regardless of their location.
Rakettv.net exemplifies the evolution of the sports streaming landscape, offering badminton fans a dedicated platform for live matches, highlights, and expert analysis. Its commitment to providing high-quality content for free sets it apart in the crowded sports streaming market. However, the challenge of geo-blocking underscores the importance of VPNs in delivering unrestricted access to online content.
As the sports streaming industry continues to evolve, platforms like Rakettv.net must navigate the complexities of broadcasting rights while striving to meet the demands of a global audience. By leveraging digital technology and prioritizing accessibility, they can play a crucial role in bringing sports closer to fans around the world. For badminton enthusiasts, Rakettv.net remains an invaluable resource, offering a front-row seat to the excitement and drama of the sport. | rakettv | |
1,906,462 | Buy Negative Google Reviews | https://dmhelpshop.com/product/buy-negative-google-reviews/ Buy Negative Google Reviews Negative... | 0 | 2024-06-30T10:01:42 | https://dev.to/carika4293/buy-negative-google-reviews-53do | node, aws, learning, career | ERROR: type should be string, got "https://dmhelpshop.com/product/buy-negative-google-reviews/\n\n\n\n\nBuy Negative Google Reviews\nNegative reviews on Google are detrimental critiques that expose customers’ unfavorable experiences with a business. These reviews can significantly damage a company’s reputation, presenting challenges in both attracting new customers and retaining current ones. If you are considering purchasing negative Google reviews from dmhelpshop.com, we encourage you to reconsider and instead focus on providing exceptional products and services to ensure positive feedback and sustainable success.\n\nWhy Buy Negative Google Reviews from dmhelpshop\nWe take pride in our fully qualified, hardworking, and experienced team, who are committed to providing quality and safe services that meet all your needs. Our professional team ensures that you can trust us completely, knowing that your satisfaction is our top priority. With us, you can rest assured that you’re in good hands.\n\nIs Buy Negative Google Reviews safe?\nAt dmhelpshop, we understand the concern many business persons have about the safety of purchasing Buy negative Google reviews. We are here to guide you through a process that sheds light on the importance of these reviews and how we ensure they appear realistic and safe for your business. Our team of qualified and experienced computer experts has successfully handled similar cases before, and we are committed to providing a solution tailored to your specific needs. Contact us today to learn more about how we can help your business thrive.\n\nBuy Google 5 Star Reviews\nReviews represent the opinions of experienced customers who have utilized services or purchased products from various online or offline markets. These reviews convey customer demands and opinions, and ratings are assigned based on the quality of the products or services and the overall user experience. Google serves as an excellent platform for customers to leave reviews since the majority of users engage with it organically. When you purchase Buy Google 5 Star Reviews, you have the potential to influence a large number of people either positively or negatively. Positive reviews can attract customers to purchase your products, while negative reviews can deter potential customers.\n\nIf you choose to Buy Google 5 Star Reviews, people will be more inclined to consider your products. However, it is important to recognize that reviews can have both positive and negative impacts on your business. Therefore, take the time to determine which type of reviews you wish to acquire. Our experience indicates that purchasing Buy Google 5 Star Reviews can engage and connect you with a wide audience. By purchasing positive reviews, you can enhance your business profile and attract online traffic. Additionally, it is advisable to seek reviews from reputable platforms, including social media, to maintain a positive flow. We are an experienced and reliable service provider, highly knowledgeable about the impacts of reviews. Hence, we recommend purchasing verified Google reviews and ensuring their stability and non-gropability.\n\nLet us now briefly examine the direct and indirect benefits of reviews:\nReviews have the power to enhance your business profile, influencing users at an affordable cost.\nTo attract customers, consider purchasing only positive reviews, while negative reviews can be acquired to undermine your competitors. Collect negative reports on your opponents and present them as evidence.\nIf you receive negative reviews, view them as an opportunity to understand user reactions, make improvements to your products and services, and keep up with current trends.\nBy earning the trust and loyalty of customers, you can control the market value of your products. Therefore, it is essential to buy online reviews, including Buy Google 5 Star Reviews.\nReviews serve as the captivating fragrance that entices previous customers to return repeatedly.\nPositive customer opinions expressed through reviews can help you expand your business globally and achieve profitability and credibility.\nWhen you purchase positive Buy Google 5 Star Reviews, they effectively communicate the history of your company or the quality of your individual products.\nReviews act as a collective voice representing potential customers, boosting your business to amazing heights.\nNow, let’s delve into a comprehensive understanding of reviews and how they function:\nGoogle, with its significant organic user base, stands out as the premier platform for customers to leave reviews. When you purchase Buy Google 5 Star Reviews , you have the power to positively influence a vast number of individuals. Reviews are essentially written submissions by users that provide detailed insights into a company, its products, services, and other relevant aspects based on their personal experiences. In today’s business landscape, it is crucial for every business owner to consider buying verified Buy Google 5 Star Reviews, both positive and negative, in order to reap various benefits.\n\nWhy are Google reviews considered the best tool to attract customers?\nGoogle, being the leading search engine and the largest source of potential and organic customers, is highly valued by business owners. Many business owners choose to purchase Google reviews to enhance their business profiles and also sell them to third parties. Without reviews, it is challenging to reach a large customer base globally or locally. Therefore, it is crucial to consider buying positive Buy Google 5 Star Reviews from reliable sources. When you invest in Buy Google 5 Star Reviews for your business, you can expect a significant influx of potential customers, as these reviews act as a pheromone, attracting audiences towards your products and services. Every business owner aims to maximize sales and attract a substantial customer base, and purchasing Buy Google 5 Star Reviews is a strategic move.\n\nAccording to online business analysts and economists, trust and affection are the essential factors that determine whether people will work with you or do business with you. However, there are additional crucial factors to consider, such as establishing effective communication systems, providing 24/7 customer support, and maintaining product quality to engage online audiences. If any of these rules are broken, it can lead to a negative impact on your business. Therefore, obtaining positive reviews is vital for the success of an online business\n\nWhat are the benefits of purchasing reviews online?\nIn today’s fast-paced world, the impact of new technologies and IT sectors is remarkable. Compared to the past, conducting business has become significantly easier, but it is also highly competitive. To reach a global customer base, businesses must increase their presence on social media platforms as they provide the easiest way to generate organic traffic. Numerous surveys have shown that the majority of online buyers carefully read customer opinions and reviews before making purchase decisions. In fact, the percentage of customers who rely on these reviews is close to 97%. Considering these statistics, it becomes evident why we recommend buying reviews online. In an increasingly rule-based world, it is essential to take effective steps to ensure a smooth online business journey.\n\nBuy Google 5 Star Reviews\nMany people purchase reviews online from various sources and witness unique progress. Reviews serve as powerful tools to instill customer trust, influence their decision-making, and bring positive vibes to your business. Making a single mistake in this regard can lead to a significant collapse of your business. Therefore, it is crucial to focus on improving product quality, quantity, communication networks, facilities, and providing the utmost support to your customers.\n\nReviews reflect customer demands, opinions, and ratings based on their experiences with your products or services. If you purchase Buy Google 5-star reviews, it will undoubtedly attract more people to consider your offerings. Google is the ideal platform for customers to leave reviews due to its extensive organic user involvement. Therefore, investing in Buy Google 5 Star Reviews can significantly influence a large number of people in a positive way.\n\nHow to generate google reviews on my business profile?\nFocus on delivering high-quality customer service in every interaction with your customers. By creating positive experiences for them, you increase the likelihood of receiving reviews. These reviews will not only help to build loyalty among your customers but also encourage them to spread the word about your exceptional service. It is crucial to strive to meet customer needs and exceed their expectations in order to elicit positive feedback. If you are interested in purchasing affordable Google reviews, we offer that service.\n\n\n\n\n\nContact Us / 24 Hours Reply\nTelegram:dmhelpshop\nWhatsApp: +1 (980) 277-2786\nSkype:dmhelpshop\nEmail:dmhelpshop@gmail.com" | carika4293 |
1,906,577 | Paper detailing BitPower Loop’s security | Security Research of BitPower Loop BitPower Loop is a decentralized lending platform based on... | 0 | 2024-06-30T12:36:59 | https://dev.to/kjask_jklshd_cecbd37d6d57/paper-detailing-bitpower-loops-security-5502 | Security Research of BitPower Loop
BitPower Loop is a decentralized lending platform based on blockchain technology, dedicated to providing users with safe, transparent and efficient financial services. Its core security comes from multi-level technical measures and mechanism design, which ensures the robust operation of the system and the security of user funds. This article will introduce the security of BitPower Loop in detail from five aspects: smart contract security, decentralized management, data and transaction security, fund security and risk control mechanism.
1. Smart Contract Security
Smart contracts are the core components of BitPower Loop, and their codes must undergo strict security audits before deployment. These audits are usually conducted by third-party independent security companies to ensure that there are no vulnerabilities or malicious code in the contract. In addition, the immutability of smart contracts means that once deployed, no one (including the development team) can modify its rules and logic, which fundamentally eliminates the possibility of malicious operations. All operations are automatically executed by smart contracts, avoiding the risk of human intervention and ensuring the fairness and consistency of system operation.
2. Decentralized Management
BitPower Loop eliminates the risks brought by single point failures and central control through decentralized management. The system has no central management agency or owner, and all transactions and operations are jointly verified and recorded by blockchain nodes distributed around the world. This decentralized structure not only improves the system's anti-attack capabilities, but also enhances transparency. Users can publicly view all transaction records, which increases trust in the system.
3. Data and transaction security
BitPower Loop uses advanced encryption technology to protect users' data and transaction information. All data is encrypted during transmission and storage to prevent unauthorized access and data leakage. The consensus mechanism of the blockchain ensures the validity and immutability of each transaction, eliminating the possibility of double payment and forged transactions. In addition, the automated execution of smart contracts also avoids delays and errors caused by human operations, ensuring the real-time and accuracy of transactions.
4. Fund security
The secure storage of user funds is an important feature of BitPower Loop. Funds are stored on the blockchain through smart contracts and maintained by nodes across the entire network. Distributed storage avoids the risk of fund theft caused by centralized storage. In addition, the user's investment returns and shared commissions are automatically allocated to the user's wallet address by the smart contract after the conditions are met, ensuring the timely and accurate arrival of funds.
5. Risk Control Mechanism
BitPower Loop effectively manages lending risks by setting collateral factors and liquidation mechanisms. The collateral factors are independently set according to market liquidity and asset value fluctuations to ensure system stability and lending security. When the value of the borrower's assets falls below a certain threshold, the liquidation mechanism is automatically triggered, ensuring the repayment of the borrower's debt and protecting the interests of the fund provider. In addition, the immutability and automatic execution characteristics of smart contracts further enhance the security and reliability of the system.
Conclusion
BitPower Loop achieves high security and stability through multi-level security measures and mechanism design. Its smart contracts are strictly audited and immutable, decentralized management eliminates single point failure risks, advanced encryption technology protects data and transaction security, distributed storage ensures fund security, and risk control mechanisms manage lending risks. These security features together build a reliable decentralized financial platform that provides users with secure, transparent and efficient financial services.
| kjask_jklshd_cecbd37d6d57 | |
1,906,575 | Security: BitPower's impeccable security | Security: BitPower's impeccable security BitPower, a decentralized platform built on the blockchain,... | 0 | 2024-06-30T12:32:30 | https://dev.to/pings_iman_934c7bc4590ba4/security-bitpowers-impeccable-security-10mk |

Security: BitPower's impeccable security
BitPower, a decentralized platform built on the blockchain, knows the importance of security. In this rapidly changing digital world, security is the top priority for every user. BitPower is not just a financial platform, it is more like a solid fortress, based on five major security features to ensure that users' funds and information are foolproof.
First, decentralization is the cornerstone of BitPower's security system. All transactions and operations are automatically executed through smart contracts, and no one can unilaterally change the rules or manipulate funds. After deploying smart contracts, founders and developers, like ordinary users, have no privileges to intervene in the system. This decentralized design eliminates the risk of human manipulation and makes every transaction transparent and irreversible.
Secondly, BitPower emphasizes the transparency of transactions. All transaction records are permanently stored on the blockchain and can be viewed by anyone at any time. Users can clearly see the flow of funds and operation records, ensuring that everything is open and transparent. The immutability of the blockchain ensures the authenticity of the records, and users do not need to worry about information being tampered with or deleted.
Third, smart contracts are the core of BitPower's security system. Smart contracts are pre-written codes that automatically execute as long as the triggering conditions are met. This automated feature not only improves efficiency, but also greatly reduces the possibility of human error and fraud. Every operation of the user is completed by the smart contract, ensuring the security of funds and the accuracy of operations.
Fourth, BitPower adopts a strict risk management mechanism. The management and auditing of the platform are the responsibility of Comptroller, and each underlying asset has a specific collateral factor. Comptroller monitors and manages risks in real time through smart contract calls to ensure that every lending operation is carried out within a safe range. When the value of the asset drops to a certain level, the liquidation mechanism is automatically triggered to protect the interests of both borrowers and lenders.
Finally, BitPower operates through a global distributed network, further improving security. The platform has no single server or control center, and all data is distributed in nodes around the world. This distributed structure not only prevents single point failures, but also resists hacker attacks and other network threats. Users' funds and data are best protected in this network.
In summary, BitPower has created an impeccable security system through decentralization, transparency, smart contracts, risk management, and distributed networks. Users can safely operate on this platform and enjoy the convenience and benefits brought by decentralized finance. BitPower is not only a financial tool, but also a solid guarantee for user funds and information. Safe and worry-free, all in BitPower.
BTC #ETH #Crypto #SC #DeFi #BitPower | pings_iman_934c7bc4590ba4 | |
1,906,574 | BotPower Introduction: | BotPower is a revolutionary AI tool designed to improve work efficiency and simplify daily tasks. Our... | 0 | 2024-06-30T12:30:13 | https://dev.to/xin_l_9aced9191ff93f0bf12/botpower-introduction-27b0 |
BotPower is a revolutionary AI tool designed to improve work efficiency and simplify daily tasks. Our powerful AI engine can automatically handle tedious tasks, from data analysis to customer service, and easily meet various challenges. Whether it is a small or medium-sized business or a large enterprise, BotPower can provide you with customized solutions to help you focus on the most important business goals.
🌐 Intelligent integration: seamlessly connect various applications and platforms.
💡 Self-learning: continuous optimization to provide more accurate services.
🔒 Safe and reliable: attach great importance to data privacy and security.
Join the BotPower community and start a new era of efficient work!
| xin_l_9aced9191ff93f0bf12 | |
1,906,573 | BitPower: Security Analysis of Decentralized Lending Platform | Introduction The rapid development of decentralized finance (DeFi) has made security an important... | 0 | 2024-06-30T12:29:54 | https://dev.to/aimm/bitpower-security-analysis-of-decentralized-lending-platform-1fkd | Introduction
The rapid development of decentralized finance (DeFi) has made security an important consideration for users to choose a platform. As a leading decentralized lending platform, BitPower provides users with highly secure lending services through smart contracts and blockchain technology. This article briefly analyzes the security features of BitPower and explores how it ensures the security of user assets and transactions.
Core security features
Smart contract guarantee
All transactions on the BitPower platform are automatically executed by smart contracts, ensuring the security and transparency of transactions and avoiding human intervention and operational risks.
Open and transparent code
BitPower's smart contract code is completely open source, and anyone can view and audit the code. This transparency increases the credibility of the platform, allowing users to use the platform for lending transactions with confidence.
No need for third-party trust
BitPower implements unmediated lending services through smart contracts, and users interact directly with the platform, eliminating dependence on third-party institutions and reducing trust risks.
Data privacy protection
All transactions on the BitPower platform are conducted on the blockchain, and users' personal data is not recorded. Users' assets and transaction information are open and transparent on the blockchain, but personal privacy is fully protected.
Automatic liquidation mechanism
If the value of the borrower's collateral assets is lower than the liquidation threshold, the smart contract will automatically trigger liquidation to prevent the borrower from defaulting and protect the interests of the supplier.
Secure asset collateral
Borrowers can use crypto assets as collateral to reduce loan risks. The collateral assets are stored in the smart contract and can only be released after the borrower repays the loan, protecting the asset security of the supplier.
BitPower's security design
Untamperable smart contracts: Once BitPower's smart contracts are deployed on the blockchain, they cannot be tampered with, ensuring the stability and consistency of the platform rules.
Peer-to-peer transactions: All transactions of BitPower are executed peer-to-peer, and funds flow freely between user wallets without "exiting" the platform, ensuring the security of user funds.
Community-driven governance: BitPower is jointly governed by community members, and all participants participate equally, without privilege distinction, which increases the transparency and fairness of the platform.
Conclusion
BitPower has created a highly secure decentralized lending platform through smart contracts and blockchain technology. Its multiple security measures ensure the security of user assets and transactions. Join BitPower and experience the secure world of decentralized finance! | aimm | |
1,906,572 | Security: BitPower's impeccable security | Security: BitPower's impeccable security BitPower, a decentralized platform built on the blockchain,... | 0 | 2024-06-30T12:23:29 | https://dev.to/pingd_iman_9228b54c026437/security-bitpowers-impeccable-security-hk4 |

Security: BitPower's impeccable security
BitPower, a decentralized platform built on the blockchain, knows the importance of security. In this rapidly changing digital world, security is the top priority for every user. BitPower is not just a financial platform, it is more like a solid fortress, based on five major security features to ensure that users' funds and information are foolproof.
First, decentralization is the cornerstone of BitPower's security system. All transactions and operations are automatically executed through smart contracts, and no one can unilaterally change the rules or manipulate funds. After deploying smart contracts, founders and developers, like ordinary users, have no privileges to intervene in the system. This decentralized design eliminates the risk of human manipulation and makes every transaction transparent and irreversible.
Secondly, BitPower emphasizes the transparency of transactions. All transaction records are permanently stored on the blockchain and can be viewed by anyone at any time. Users can clearly see the flow of funds and operation records, ensuring that everything is open and transparent. The immutability of the blockchain ensures the authenticity of the records, and users do not need to worry about information being tampered with or deleted.
Third, smart contracts are the core of BitPower's security system. Smart contracts are pre-written codes that automatically execute as long as the triggering conditions are met. This automated feature not only improves efficiency, but also greatly reduces the possibility of human error and fraud. Every operation of the user is completed by the smart contract, ensuring the security of funds and the accuracy of operations.
Fourth, BitPower adopts a strict risk management mechanism. The management and auditing of the platform are the responsibility of Comptroller, and each underlying asset has a specific collateral factor. Comptroller monitors and manages risks in real time through smart contract calls to ensure that every lending operation is carried out within a safe range. When the value of the asset drops to a certain level, the liquidation mechanism is automatically triggered to protect the interests of both borrowers and lenders.
Finally, BitPower operates through a global distributed network, further improving security. The platform has no single server or control center, and all data is distributed in nodes around the world. This distributed structure not only prevents single point failures, but also resists hacker attacks and other network threats. Users' funds and data are best protected in this network.
In summary, BitPower has created an impeccable security system through decentralization, transparency, smart contracts, risk management, and distributed networks. Users can safely operate on this platform and enjoy the convenience and benefits brought by decentralized finance. BitPower is not only a financial tool, but also a solid guarantee for user funds and information. Safe and worry-free, all in BitPower. | pingd_iman_9228b54c026437 | |
1,906,571 | BotPower Introduction: | BotPower is a revolutionary AI tool designed to improve work efficiency and simplify daily tasks. Our... | 0 | 2024-06-30T12:20:44 | https://dev.to/xin_lin_fc39c6250ef2ab451/botpower-introduction-53e7 |
BotPower is a revolutionary AI tool designed to improve work efficiency and simplify daily tasks. Our powerful AI engine can automatically handle tedious tasks, from data analysis to customer service, and easily meet various challenges. Whether it is a small or medium-sized business or a large enterprise, BotPower can provide you with customized solutions to help you focus on the most important business goals.
🌐 Intelligent integration: seamlessly connect various applications and platforms.
💡 Self-learning: continuous optimization to provide more accurate services.
🔒 Safe and reliable: attach great importance to data privacy and security.
Join the BotPower community and start a new era of efficient work!
| xin_lin_fc39c6250ef2ab451 | |
1,906,558 | Paper detailing BitPower Loop’s security | Security Research of BitPower Loop BitPower Loop is a decentralized lending platform based on... | 0 | 2024-06-30T12:19:20 | https://dev.to/asfg_f674197abb5d7428062d/paper-detailing-bitpower-loops-security-83f | Security Research of BitPower Loop
BitPower Loop is a decentralized lending platform based on blockchain technology, dedicated to providing users with safe, transparent and efficient financial services. Its core security comes from multi-level technical measures and mechanism design, which ensures the robust operation of the system and the security of user funds. This article will introduce the security of BitPower Loop in detail from five aspects: smart contract security, decentralized management, data and transaction security, fund security and risk control mechanism.
1. Smart Contract Security
Smart contracts are the core components of BitPower Loop, and their codes must undergo strict security audits before deployment. These audits are usually conducted by third-party independent security companies to ensure that there are no vulnerabilities or malicious code in the contract. In addition, the immutability of smart contracts means that once deployed, no one (including the development team) can modify its rules and logic, which fundamentally eliminates the possibility of malicious operations. All operations are automatically executed by smart contracts, avoiding the risk of human intervention and ensuring the fairness and consistency of system operation.
2. Decentralized Management
BitPower Loop eliminates the risks brought by single point failures and central control through decentralized management. The system has no central management agency or owner, and all transactions and operations are jointly verified and recorded by blockchain nodes distributed around the world. This decentralized structure not only improves the system's anti-attack capabilities, but also enhances transparency. Users can publicly view all transaction records, which increases trust in the system.
3. Data and transaction security
BitPower Loop uses advanced encryption technology to protect users' data and transaction information. All data is encrypted during transmission and storage to prevent unauthorized access and data leakage. The consensus mechanism of the blockchain ensures the validity and immutability of each transaction, eliminating the possibility of double payment and forged transactions. In addition, the automated execution of smart contracts also avoids delays and errors caused by human operations, ensuring the real-time and accuracy of transactions.
4. Fund security
The secure storage of user funds is an important feature of BitPower Loop. Funds are stored on the blockchain through smart contracts and maintained by nodes across the entire network. Distributed storage avoids the risk of fund theft caused by centralized storage. In addition, the user's investment returns and shared commissions are automatically allocated to the user's wallet address by the smart contract after the conditions are met, ensuring the timely and accurate arrival of funds.
5. Risk Control Mechanism
BitPower Loop effectively manages lending risks by setting collateral factors and liquidation mechanisms. The collateral factors are independently set according to market liquidity and asset value fluctuations to ensure system stability and lending security. When the value of the borrower's assets falls below a certain threshold, the liquidation mechanism is automatically triggered, ensuring the repayment of the borrower's debt and protecting the interests of the fund provider. In addition, the immutability and automatic execution characteristics of smart contracts further enhance the security and reliability of the system.
Conclusion
BitPower Loop achieves high security and stability through multi-level security measures and mechanism design. Its smart contracts are strictly audited and immutable, decentralized management eliminates single point failure risks, advanced encryption technology protects data and transaction security, distributed storage ensures fund security, and risk control mechanisms manage lending risks. These security features together build a reliable decentralized financial platform that provides users with secure, transparent and efficient financial services. | asfg_f674197abb5d7428062d | |
1,906,557 | Transforming Cloud Infrastructure with Terraform: Build, Change, Deploy | In today's cloud-centric landscape, managing infrastructure efficiently is key to scalability and... | 0 | 2024-06-30T12:17:32 | https://dev.to/rashmitha_v_d0cfc20ba7152/transforming-cloud-infrastructure-with-terraform-build-change-deploy-4o4g | In today's cloud-centric landscape, managing infrastructure efficiently is key to scalability and reliability. Enter Terraform, an Infrastructure as Code (IaaC) tool that revolutionizes how we provision, manage, and evolve cloud resources across platforms like AWS, Azure, and Google Cloud. With Terraform, you define your infrastructure in declarative configuration files, ensuring consistency and repeatability. This process allows for swift deployment and seamless updates through its plan-apply cycle, enabling rapid iteration and ensuring infrastructure changes are predictable and reliable. Whether you're automating the setup of a new environment or orchestrating complex multi-cloud architectures, Terraform power lies in its ability to codify infrastructure as easily as software, unlocking agility and scalability while reducing operational overhead.
**Building the infrastructure**
_1. Terraform Configuration_
Each Terraform configuration must be in its own working directory. Create a directory for your configuration.
```
mkdir terraform-learning
```
Change into the directory
```
cd terraform-learning
```
Create a file to define your infrastructure.
```
code main.tf
```
Open main.tf in your text editor, give the configuration below, and save the file.
```
terraform {
required_providers {
aws = {
source = "hashicorp/aws"
version = "~> 4.16"
}
}
required_version = ">= 1.2.0"
}
provider "aws" {
region = "us-west-2"
}
resource "aws_instance" "My_app_server" {
ami = "ami-830c94e3"
instance_type = "t2.micro"
tags = {
Name = "ExampleInstance"
}
}
```
2._Initialize the Terraform_
```
terraform init
```
_3. Create the infrastructure_
Apply the configuration now with the terraform apply command.
```
terraform apply
```
**Changing the infrastructure**
_1: Configuration of new AMI_
```
resource "aws_instance" "My_app_server" {
- ami = "ami-830c94e3"
+ ami = "ami-08d70e59c07c61a3a"
instance_type = "t2.micro"
}
```
_2: Apply the changes_
After changing the configuration, run terraform apply again to see how Terraform will apply this change to the existing resources.
```
terraform apply
```
_Destroy the infrastructure_
Once you no longer need infrastructure, you may want to destroy it to reduce your security exposure and costs.
```
terraform destroy
```
The terraform destroy command terminates resources managed by your Terraform project.
**Defining the Input Variable**
_1: Set the instance name with variable_
Create a new file called variables.tf with a block defining a new instance_name variable.
```
variable "instance_name" {
description = "Value of the Name tag for the EC2 instance"
type = string
default = "ExampleInstance"
}
```
_2: Update main.tf_
In main.tf, update the aws_instance resource block to use the new variable. The instance_name variable block will default to its default value ("ExampleInstance") unless you declare a different value.
```
resource "aws_instance" "My_app_server" {
ami = "ami-08d70e59c07c61a3a"
instance_type = "t2.micro"
tags = {
- Name = "ExampleInstance"
+ Name = var.instance_name
}
}
```
_3: Apply Configuration__
Apply the configuration. Enter yes to confirm the cofiguration.
```
terraform apply
```
_4: Passing the variable_
Now apply the configuration again, this time overriding the default instance name by passing in a variable using the -var flag.
```
terraform apply -var "instance_name=SecondNameForInstance"
```
**Query the Data**
_1: Output EC2 instance configuration_
Create a file called outputs.tf in your learn-terraform-aws-instance directory.
```
code output.tf
```
_2: Inspect output values_
Apply the configuration and enter yes to confirm it.
```
terraform apply
```
_3: Query Output value_
Query the outputs with the terraform output command.
```
terraform output
```
In conclusion, Terraform stands as a transformative tool in the realm of cloud infrastructure management. By leveraging Infrastructure as Code principles, Terraform enables organizations to streamline the deployment and management of cloud resources across various providers. Its declarative configuration approach ensures consistency and reliability, facilitating rapid iteration and scalability.
| rashmitha_v_d0cfc20ba7152 | |
1,906,550 | Step-by-Step Guide to Setting Up Push Notifications in Node.js: Backend Configuration | Introduction to Push Notifications Push notifications are a powerful way to keep users... | 27,914 | 2024-06-30T12:15:52 | https://dev.to/sanjampreetsingh/step-by-step-guide-to-setting-up-push-notifications-in-nodejs-backend-configuration-53gn | webdev, javascript, programming, tutorial | ## Introduction to Push Notifications

Push notifications are a powerful way to keep users engaged by delivering timely and relevant information directly to their devices. Unlike traditional pull mechanisms where the client requests information from the server, push notifications allow the server to send updates to the client without the client explicitly requesting it.
In this three-part series, we'll guide you through setting up push notifications from scratch using Node.js, without relying on third-party services. In this first part, we'll focus on setting up the backend to handle push notifications.
## Understanding the Architecture
Before diving into the code, let's understand the architecture and key components involved in implementing push notifications.

### What is VAPID ?
Voluntary Application Server Identification(VAPID) is a method for identifying your application server to push services(e.g., Google, Mozilla) without requiring a third - party authentication service.VAPID provides a way to include your server's identity in the push message, allowing push services to validate and associate the message with your server.

### How Does the HTTP Push Protocol Work ?
The HTTP Push Protocol is an extension of HTTP / 2 that allows servers to send unsolicited responses(push messages) to clients.Here’s a simplified flow of how it works:
1. **Subscription**: The client subscribes to push notifications through the Push API and receives an endpoint URL, along with cryptographic keys.
2. **Send Push Message**: The server uses the endpoint URL and keys to send a push message to the push service.
3. **Delivery**: The push service delivers the message to the client’s browser, which then displays the notification using the Service Worker API.
## Setting Up the Backend
Let's set up a Node.js backend to handle push notifications. We'll use Express for our server, Sequelize for interacting with an SQLite database, and web - push for sending notifications.
### Step - by - Step Guide to Implementing Push Notifications
#### Step 1: Initialize the Project
First, create a new Node.js project and install the necessary dependencies.
```bash
npm init -y
npm install express sequelize sqlite3 web-push dotenv
npm install --save-dev typescript @types/node @types/express
```
#### Step 2: Set Up Environment Variables
Create a `.env` file to store your VAPID keys:
```
VAPID_PUBLIC_KEY=your_public_key
VAPID_PRIVATE_KEY=your_private_key
```
#### Step 3: Initialize Sequelize
Create a `src/models/index.ts` file to initialize Sequelize and connect to the SQLite database.
```typescript
import { Sequelize } from 'sequelize';
const sequelize = new Sequelize({
dialect: 'sqlite',
storage: './data.db'
});
export default sequelize;
```
#### Step 4: Define the Subscription Model
Create a `src/models/subscription.ts` file to define the subscription model.
```typescript
import { DataTypes, Model } from 'sequelize';
import sequelize from './index';
class Subscription extends Model {
public id!: number;
public endpoint!: string;
public p256dh!: string;
public auth!: string;
}
Subscription.init({
id: {
type: DataTypes.INTEGER,
autoIncrement: true,
primaryKey: true
},
endpoint: {
type: DataTypes.STRING,
allowNull: false
},
p256dh: {
type: DataTypes.STRING,
allowNull: false
},
auth: {
type: DataTypes.STRING,
allowNull: false
}
}, {
sequelize,
tableName: 'subscriptions'
});
export default Subscription;
```
#### Step 5: Set Up the Service Layer
Create a `src/services/subscriptionService.ts` file to handle subscription and notification logic.
```typescript
import Subscription from '../models/subscription';
import webPush from 'web-push';
// Configure VAPID keys
const vapidKeys = {
publicKey: process.env.VAPID_PUBLIC_KEY!,
privateKey: process.env.VAPID_PRIVATE_KEY!
};
webPush.setVapidDetails(
'mailto:your-email@example.com',
vapidKeys.publicKey,
vapidKeys.privateKey
);
export const saveSubscription = async (subscription: any): Promise<void> => {
await Subscription.create({
endpoint: subscription.endpoint,
p256dh: subscription.keys.p256dh,
auth: subscription.keys.auth
});
};
export const sendNotification = async (title: string, body: string, image: string): Promise<void> => {
const subscriptions = await Subscription.findAll();
subscriptions.forEach((subscription) => {
const sub = {
endpoint: subscription.endpoint,
keys: {
p256dh: subscription.p256dh,
auth: subscription.auth
}
};
const payload = JSON.stringify({
notification: {
title,
body,
image,
},
});
webPush.sendNotification(sub, payload)
.catch(error => console.error('Error sending notification:', error));
});
};
```
#### Step 6: Create API Routes and Controllers
Create `src/api/controllers/subscriptionController.ts` for handling API requests.
```typescript
import { Request, Response } from 'express';
import { saveSubscription, sendNotification } from '../../services/subscriptionService';
export const subscribe = async (req: Request, res: Response) => {
try {
const subscription = req.body;
await saveSubscription(subscription);
res.status(201).json({ message: 'Subscription added successfully.' });
} catch (error) {
res.status(500).json({ message: 'Failed to subscribe.' });
}
};
export const pushNotification = async (req: Request, res: Response) => {
try {
const { title, body, image } = req.body;
await sendNotification(title, body, image);
res.status(200).json({ message: 'Notification sent successfully.' });
} catch (error) {
res.status(500).json({ message: 'Failed to send notification.' });
}
};
```
Create `src/api/routes/subscriptionRoutes.ts` to define the API routes.
```typescript
import { Router } from 'express';
import { subscribe, pushNotification } from '../controllers/subscriptionController';
const router = Router();
router.post('/subscribe', subscribe);
router.post('/push', pushNotification);
export default router;
```
#### Step 7: Initialize the Server
Create `src/index.ts` to set up the Express server and initialize the database.
```typescript
import express from 'express';
import cors from 'cors';
import helmet from 'helmet';
import subscriptionRoutes from './api/routes/subscriptionRoutes';
import sequelize from './models/index';
import dotenv from 'dotenv';
dotenv.config(); // Load environment variables from .env
const app = express();
const PORT = process.env.PORT || 3000;
app.use(cors());
app.use(helmet());
app.use(express.json());
app.use('/api', subscriptionRoutes);
sequelize.sync().then(() => {
app.listen(PORT, () => {
console.log(`Server is running on http://localhost:${PORT}`);
});
}).catch ((err) => {
console.error('Unable to connect to the database:', err);
});
```
## Summary
In this first part of our series, we've set up the backend for push notifications using Node.js. We've covered the basics of push notifications, the architecture, and provided a step-by-step guide to implementing the backend. In the next part, we'll dive into setting up the frontend to handle push notifications and subscribe users.
---
If you like what you read, consider connecting with me on [LinkedIn](https://www.linkedin.com/in/sanjampreetsingh/)
---
Stay tuned for Part 2 - Client Side!
| sanjampreetsingh |
1,906,556 | BitPower: Security Analysis of Decentralized Lending Platform | Introduction The rapid development of decentralized finance (DeFi) has made security an important... | 0 | 2024-06-30T12:13:58 | https://dev.to/aimm_w_1761d19cef7fa886fd/bitpower-security-analysis-of-decentralized-lending-platform-cce | Introduction
The rapid development of decentralized finance (DeFi) has made security an important consideration for users to choose a platform. As a leading decentralized lending platform, BitPower provides users with highly secure lending services through smart contracts and blockchain technology. This article briefly analyzes the security features of BitPower and explores how it ensures the security of user assets and transactions.
Core security features
Smart contract guarantee
All transactions on the BitPower platform are automatically executed by smart contracts, ensuring the security and transparency of transactions and avoiding human intervention and operational risks.
Open and transparent code
BitPower's smart contract code is completely open source, and anyone can view and audit the code. This transparency increases the credibility of the platform, allowing users to use the platform for lending transactions with confidence.
No need for third-party trust
BitPower implements unmediated lending services through smart contracts, and users interact directly with the platform, eliminating dependence on third-party institutions and reducing trust risks.
Data privacy protection
All transactions on the BitPower platform are conducted on the blockchain, and users' personal data is not recorded. Users' assets and transaction information are open and transparent on the blockchain, but personal privacy is fully protected.
Automatic liquidation mechanism
If the value of the borrower's collateral assets is lower than the liquidation threshold, the smart contract will automatically trigger liquidation to prevent the borrower from defaulting and protect the interests of the supplier.
Secure asset collateral
Borrowers can use crypto assets as collateral to reduce loan risks. The collateral assets are stored in the smart contract and can only be released after the borrower repays the loan, protecting the asset security of the supplier.
BitPower's security design
Untamperable smart contracts: Once BitPower's smart contracts are deployed on the blockchain, they cannot be tampered with, ensuring the stability and consistency of the platform rules.
Peer-to-peer transactions: All transactions of BitPower are executed peer-to-peer, and funds flow freely between user wallets without "exiting" the platform, ensuring the security of user funds.
Community-driven governance: BitPower is jointly governed by community members, and all participants participate equally, without privilege distinction, which increases the transparency and fairness of the platform.
Conclusion
BitPower has created a highly secure decentralized lending platform through smart contracts and blockchain technology. Its multiple security measures ensure the security of user assets and transactions. Join BitPower and experience the secure world of decentralized finance! | aimm_w_1761d19cef7fa886fd | |
1,906,554 | Mastering Time: Using Fake Timers with Vitest | Level Up Your Timers Tests With Speed and Isolation | 0 | 2024-06-30T12:08:59 | https://brunosabot.dev/posts/2024/mastering-time-using-fake-timers-with-vitest/ | javascript, testing, vitest, codequality | ---
title: "Mastering Time: Using Fake Timers with Vitest"
published: true
date: 2024-06-30 12:08:59 UTC
description: Level Up Your Timers Tests With Speed and Isolation
tags: javascript, testing, vitest, quality
canonical_url: https://brunosabot.dev/posts/2024/mastering-time-using-fake-timers-with-vitest/
cover_image: https://dev-to-uploads.s3.amazonaws.com/uploads/articles/p4x34z8h43bbt0qc9v9h.jpg
---
Photo by [Aron Visuals](https://unsplash.com/fr/@aronvisuals?utm_content=creditCopyText&utm_medium=referral&utm_source=unsplash) on [Unsplash](https://unsplash.com/fr/photos/un-gros-plan-dun-pissenlit-avec-un-coucher-de-soleil-en-arriere-plan-2NWBmlBTSIE?utm_content=creditCopyText&utm_medium=referral&utm_source=unsplash)
In the world of testing, controlling time can be a real challenge. Real-world timers, like `setTimeout()` and `setInterval()`, can be cumbersome when writing unit tests: without the right technique, you will introduce external dependencies on actual time passing that will make your test either slower, wrong or difficult to understand.
This is where Vitest’s fake timers come in, giving you the power to manipulate time within your tests for a smoother and more efficient testing experience.
*At* [*PlayPlay*](https://playplay.com/)*, we create high-quality software while prioritizing efficient development practices. We leverage innovative tools like Vitest’s fake timers to write faster and more reliable tests, ensuring exceptional software from the ground up.*
### Why Fake Timers?
Imagine testing a function that debounces another one after a 2-second delay. Using real timers, your test would have to wait for the full 2 seconds to pass, making the whole test scenario longer by these 2 seconds. This is slow and inefficient, and even more when you’re dealing with multiple timers or complex timing interactions. Hopefully, fake timers are here to allow you to:
* **Speed up tests:** Advance the virtual clock by any amount, making tests run significantly faster. This is particularly beneficial for tests that involve waiting for timers to expire or simulating longer time intervals. Vitest prioritizes test speed. Fake timers become even more crucial when dealing with functions that rely on timers. You can avoid waiting for long intervals, keeping your tests lightning fast.
* **Isolate functionality:** By removing reliance on external timers, you ensure your tests focus solely on the code you’re testing. This eliminates external factors that could potentially cause flaky tests and makes it easier to pinpoint the source of any issues.
* **Simulate specific timeouts:** Test how your code behaves under different time constraints. Fake timers allow you to create scenarios with specific delays or timeouts, helping you ensure your code functions as expected in various situations. You can check something’s state just before the timer is executed and right after.
### Getting Started with Fake Timers
Vitest provides the `vi.useFakeTimers()` function to enable fake timers: this mocks out the behavior of `setTimeout()`, `setInterval()`, `clearTimeout()`, and `clearInterval()`.
You can call this method globally, before each test or on-demand. Here is an example:
```typescript
import { vi, beforeEach } from 'vitest';
beforeEach(() => {
vi.useFakeTimers();
});
```
If you chose to do it on demand, you will need to restore the real behavior to ensure subsequent tests don’t inherit the fake timer behavior. Here is an example:
```typescript
import { vi, afterEach } from 'vitest';
afterEach(() => {
vi.useRealTimers();
});
```
### Basic Usage
Let’s start with a simple example. Imagine you have a function that uses `setTimeout()` to execute a callback after a delay:
```typescript
function getDelayedGreeting(callback) {
setTimeout(() => {
callback('Hello, World!');
}, 1000);
}
```
To test this function with fake timers, you can write:
```typescript
import { describe, expect, test, vi, afterEach } from 'vitest';
afterEach(() => {
vi.useRealTimers();
});
describe('Given the getDelayedGreeting function', () => {
describe('When we wait 1s', () => {
test('Then it calls the callback with the right message', () => {
// Arrange
vi.useFakeTimers();
const callback = vi.fn();
getDelayedGreeting(callback);
// Act
vi.advanceTimersByTime(1000);
// Assert
expect(callback).toHaveBeenCalledWith('Hello, World!');
});
});
});
```
In this test, `vi.advanceTimersByTime(1000)` fast-forwards the timer by 1000 milliseconds, causing the `setTimeout()` to fire immediately. The `expect()` assertion then checks if the callback was called with the correct argument.
The true power of fake timers lies in their ability to ensure that time advances exactly as expected in your tests.. We can write the non-passing test as easy as the previous one:
```typescript
import { describe, expect, test, vi, afterEach } from 'vitest';
afterEach(() => {
vi.useRealTimers();
});
describe('Given the getDelayedGreeting function', () => {
describe('When we wait 0.999s', () => {
test("Then the callback hasn't been called yet", () => {
// Arrange
vi.useFakeTimers();
const callback = vi.fn();
getDelayedGreeting(callback);
// Act
vi.advanceTimersByTime(999);
// Assert
expect(callback).not.toHaveBeenCalled();
});
});
});
```
### Advanced Usage
When writing tests, we will put a great importance to having the right situation at the right time. If the previous example is working, occasional delay could occurs and make the test flaky. So to ensure nothing more will happen with time passing, we would ideally cancel every running timer.
That’s where `vi.clearAllTimers()` can help us: by cancelling everything that is running, we make sure that nothing will prevent our test to work as expected. Here is the previous example improved:
```typescript
import { describe, expect, test, vi, afterEach } from 'vitest';
afterEach(() => {
vi.useRealTimers();
});
describe('Given the getDelayedGreeting function', () => {
describe('When we wait 0.999s', () => {
test("Then the callback hasn't been called yet", () => {
// Arrange
vi.useFakeTimers();
const callback = vi.fn();
getDelayedGreeting(callback);
// Act
vi.advanceTimersByTime(999);
vi.clearAllTimers();
// Assert
expect(callback).not.toHaveBeenCalled();
});
});
});
```
This ensures that any timers scheduled after the specified time advancement with `vi.advanceTimersByTime()` won't interfere with the assertion, making our tests safe.
Using `vi.clearAllTimers()` after `vi.advanceTimersByTime()`ensures that no timers scheduled after the specified time advancement will interfere with the assertion. This makes your test more robust and less susceptible to unexpected timeouts caused by lingering timers.
But sometimes, what we want is more than just advancing time: we want to make the timer to be executed no matter how long it lasts.
Let’s imagine a game that tries to improves your reflexes: the callback will wait a random timeout before calling the callback where a mystery number will be display for you to enter. The method would look like:
```typescript
function getMysteryNumber(callback) {
const number = Math.floor(Math.random() * 10);
// Get a delay between 1 and 6 seconds
const delay = 1 + Math.random() * 5000;
setTimeout(() => {
callback(number);
}, delay);
}
```
To be sure that the method is working, we can use `vi.runPendingTimers()` to execute the timeout no matter the delay:
```typescript
import { describe, expect, test, vi, afterEach } from 'vitest';
afterEach(() => {
vi.useRealTimers();
});
describe('Given the getMysteryNumber function', () => {
describe('When we wait the delay to be completed', () => {
test('Then the callback is called with the mystery number', () => {
// Arrange
vi.useFakeTimers();
vi.spyOn(Math, 'random').mockImplementationOnce(() => 0.5);
const callback = vi.fn();
getMysteryNumber(callback);
// Act
vi.runOnlyPendingTimers();
// Assert
expect(callback).toHaveBeenNthCalledWith(1, 5);
});
});
});
```
Here, we use `vi.spyOn()` to stub the `Math.random()` function and mock its behavior to return a specific value (0.5) for the delay. This allows us to control the randomness and ensure a consistent delay of 5 seconds in the test.
Now, no matter how long the delay is supposed to be, the test will be executed quickly.
But what if you have an interval timer that stop after a specific amount of time? For example, you have a method that counts up to five seconds:
```typescript
function startCounter(callback) {
let count = 0;
const intervalId = setInterval(() => {
++count;
if (count === 5) {
callback('Done!');
clearInterval(intervalId);
}
}, 1000);
}
```
For interval timers that complete after a specific duration, you can use `vi.advanceTimersByTime()` to advance the virtual clock by the expected interval duration. This approach often provides more control over the test execution. However, `vi.runAllTimers()` is a viable alternative when you need to ensure all timers, including intervals, are run to completion. Here is an example:
```typescript
import { describe, expect, test, vi, afterEach } from 'vitest';
afterEach(() => {
vi.useRealTimers();
});
describe('Given the startCounter function', () => {
describe('When we wait 0.999s', () => {
test("Then the callback hasn't been called yet", () => {
// Arrange
vi.useFakeTimers();
const callback = vi.fn();
startCounter(callback);
// Act
vi.runAllTimers();
// Assert
expect(callback).toHaveBeenNthCalledWith(1, 'Done!');
});
});
});
```
### Conclusion
Fake timers in Vitest provide a robust way to test time-dependent code : they allow you to write faster, more reliable, and more focused unit tests.
By simulating the passage of time, you can write tests that are both fast and deterministic. Whether you are dealing with simple timeouts or complex interval logic, Vitest’s fake timers have you covered.
With this guide, you should now have a solid understanding of how to use fake timers in your Jest tests. Experiment with these techniques in your own projects to achieve more effective and maintainable tests. Happy testing! For further details and advanced usage, refer to the Vitest [documentation](https://vitest.dev/advanced/api.html).
| brunosabot |
1,906,553 | Security: BitPower's impeccable security | Security: BitPower's impeccable security BitPower, a decentralized platform built on the blockchain,... | 0 | 2024-06-30T12:07:13 | https://dev.to/pingc_iman_034e9f20936ef4/security-bitpowers-impeccable-security-5gcl |

Security: BitPower's impeccable security
BitPower, a decentralized platform built on the blockchain, knows the importance of security. In this rapidly changing digital world, security is the top priority for every user. BitPower is not just a financial platform, it is more like a solid fortress, based on five major security features to ensure that users' funds and information are foolproof.
First, decentralization is the cornerstone of BitPower's security system. All transactions and operations are automatically executed through smart contracts, and no one can unilaterally change the rules or manipulate funds. After deploying smart contracts, founders and developers, like ordinary users, have no privileges to intervene in the system. This decentralized design eliminates the risk of human manipulation and makes every transaction transparent and irreversible.
Secondly, BitPower emphasizes the transparency of transactions. All transaction records are permanently stored on the blockchain and can be viewed by anyone at any time. Users can clearly see the flow of funds and operation records, ensuring that everything is open and transparent. The immutability of the blockchain ensures the authenticity of the records, and users do not need to worry about information being tampered with or deleted.
Third, smart contracts are the core of BitPower's security system. Smart contracts are pre-written codes that automatically execute as long as the triggering conditions are met. This automated feature not only improves efficiency, but also greatly reduces the possibility of human error and fraud. Every operation of the user is completed by the smart contract, ensuring the security of funds and the accuracy of operations.
Fourth, BitPower adopts a strict risk management mechanism. The management and auditing of the platform are the responsibility of Comptroller, and each underlying asset has a specific collateral factor. Comptroller monitors and manages risks in real time through smart contract calls to ensure that every lending operation is carried out within a safe range. When the value of the asset drops to a certain level, the liquidation mechanism is automatically triggered to protect the interests of both borrowers and lenders.
Finally, BitPower operates through a global distributed network, further improving security. The platform has no single server or control center, and all data is distributed in nodes around the world. This distributed structure not only prevents single point failures, but also resists hacker attacks and other network threats. Users' funds and data are best protected in this network.
In summary, BitPower has created an impeccable security system through decentralization, transparency, smart contracts, risk management, and distributed networks. Users can safely operate on this platform and enjoy the convenience and benefits brought by decentralized finance. BitPower is not only a financial tool, but also a solid guarantee for user funds and information. Safe and worry-free, all in BitPower. | pingc_iman_034e9f20936ef4 | |
1,906,552 | Ad-Free AI Chat | Normal Article... | 0 | 2024-06-30T12:00:12 | https://dev.to/haroonafgpt/ad-free-ai-chat-44bf | chatbot, ai, mobile | Normal Article Links:
https://ad-free-gpt-android-app-44571291.hubspotpagebuilder.com/
https://adfreeaichat.wordpress.com/2023/12/17/discover-the-future-of-fun-innovative-ad-free-ai-chat-android-app-without-interruptions/
https://haroonafgpt.wixsite.com/ad-free-ai-chat/post/experience-innovation-fun-discover-the-ad-free-gpt-android-app
https://ad-free-gpt.gitbook.io/ad-free-ai-chat-android-app/
https://canvas.instructure.com/eportfolios/2561297/Home
https://ad-free-ai-chat.mystrikingly.com
https://medium.com/@haroonafgpt/enjoy-innovation-and-fun-try-the-ad-free-ai-chat-app-for-android-aeb3f41574e7
https://dev.to/haroonafgpt/experience-new-fun-try-the-ad-free-gpt-ai-app-for-android-1ao
https://adfreegpt.hashnode.dev/ad-free-ai-chat-change-your-android-use-with-new-ai-and-fun
https://www.merchantcircle.com/blogs/easy-ai-checker1-portland-or/2023/12/Ad-Free-AI-Chat-Best-Android-AI-Chatbot-App-without-Ads/2618340
https://www.sooperarticles.com/technology-articles/artificial-intelligence-articles/5-best-ai-android-apps-chatgpt-your-mobile-phone-1869380.html
https://app.socie.com.br/read-blog/118039
https://ad-free-ai-chat-44713141.hubspotpagebuilder.com/ad-free-ai-chat/5-best-android-ai-chatbots-free-to-install-in-2024
https://adfreeaichat.wordpress.com/2023/12/19/best-talking-apps-5-awesome-android-chatbots-for-2024/
https://haroonafgpt.wixsite.com/ad-free-ai-chat/post/chat-like-a-pro-meet-the-5-coolest-android-chat-bots-of-2024
https://ad-free-gpt.gitbook.io/talk-of-the-town-5-super-android-chatbots-in-2024/
https://canvas.instructure.com/eportfolios/2606535/Home/ChitChat_Champions_5_Cool_Android_Chatbots_for_2024
https://medium.com/@haroonafgpt/5-best-android-ai-chatbots-are-your-new-chat-buddies-in-2024-0db472f4e6f9
https://adfreeaichat.mystrikingly.com/blog/5-best-ai-android-chatbot-apps-2024
https://dev.to/haroonafgpt/future-chat-heroes-check-out-5-best-android-ai-chatbots-in-2024-20jd
https://adfreegpt.hashnode.dev/easy-chatter-5-great-android-ai-chatbots-to-try-in-2024
https://www.merchantcircle.com/blogs/ad-free-ai-chat/2023/12/Smart-Chat-Friends-Get-to-Know-5-Top-Android-AI-Chatbots-in-2024/2620494
https://uniquethis.com/profile/adfreeaichat
https://www.whizolosophy.com/category/today-s-world-projecting-tomorrow/article-column/ad-free-ai-chat-is-the-ultimate-app-for-ai-learning-creative-gaming-and-more
https://www.as7abe.com/wall/blogs/post/32569
https://adfreeaichat.substack.com/p/ad-free-ai-chatbot-play-games-and?r=36y26p&utm_campaign=post&utm_medium=web
https://chartreuse-joke-9e9.notion.site/Ad-Free-AI-Chat-is-Your-Best-Mobile-App-for-AI-Teaching-Writing-and-Playing-Games-928976b6a6ca4c7996b033c37d6da6d3
https://adfreeaichat.blogspot.com/2023/12/ad-free-ai-chat-your-free-android-ai.html
https://www.quora.com/profile/Haroon-Akram-59/Ad-Free-AI-Chat-Android-Chatbot-with-ChatGPT-and-Google-Gemini-Pro
https://medium.com/@haroonafgpt/top-10-ai-chatbot-apps-for-android-in-2024-test-for-free-44339f5259ea
https://www.sooperarticles.com/communications-articles/mobile-applications-articles/10-best-chatbot-apps-android-phones-2024-1871548.html
https://adfreeaichat.livejournal.com/313.html
https://www.liveinternet.ru/users/easyaifixer/post503189504/
https://devpost.com/software/ad-free-ai-chat
https://adfreeaichat.edublogs.org/2024/01/13/ad-free-ai-chat-talk-learn-and-play-games-with-the-free-android-chatbot/
https://www.thetoptens.com/m/adfreeaichat/p/80175/
https://www.patreon.com/posts/ad-free-ai-chat-97149135
https://ad-freeaichat.godaddysites.com/f/ad-free-ai-chat-start-your-android-chatbot-adventure-without-ads
https://tempaste.com/Ad-Free%20AI%20Chat:%20Best%20Android%20Chatbot
https://notes.hive.com/?workspaceId=LeJCsXXBRq6k64zQP¬ebookId=gKPKv22c3WN7m5AZb&shareToken=752a2bd9db646fda0f40659208d1bb419038bc7fe70e3110e5f20c494fd8c2da
https://justpaste.me/TUqM1
https://www.buymeacoffee.com/adfreeaichat/ad-free-ai-chat-your-affordable-flexible-free-android-chatbot
https://baskadia.com/post/2twbc
https://www.deviantart.com/adfreeaichat/status-update/Ad-Free-AI-Chat-The-Best-1015466946
https://www.deviantart.com/adfreeaichat/art/Adfreeaichat1200x800-1015463694
https://docs.google.com/forms/d/e/1FAIpQLScYM5aXe_HHcy8zy1QUNv5ycp5tjL_UUKFYfUNj8x4XrX70zQ/viewform?usp=sf_link
https://gamma.app/public/Ad-Free-AI-Chat-Elevate-Your-Conversations-cgjetcidlumkddh
https://www.storeboard.com/blogs/internet/ad-free-ai-chat-is-your-android-chat-friend/5717117
https://adfreeaichat.beehiiv.com/p/ad-free-ai-chat-your-personal-voice-assistant
https://seolistinghub.com/post/the-benefits,-types,-and-industries-using-chatbots/
https://files.fm/f/2hds8x933r
https://adfreeaichat.carrd.co/
https://medium.com/@haroonafgpt/10-great-things-you-can-do-with-ai-voice-chat-on-android-125ea428d6c9
https://www.developpez.net/forums/d2143323-32/club-professionnels-informatique/actualites/fonctionnalite-beta-chatgpt-permet-d-interagir-plusieurs-modeles-gpt-meme-chat/#post12001904
https://drive.proton.me/urls/0H78ZZA1WR#BgiRKE7nISna
https://www.evernote.com/shard/s476/sh/aeb0aec1-08ff-f361-cded-6e8d3586d3b7/PLhJDqYNOoDa16kB32CPSky_QIehmLIpOXyhjvbllmIbpw93amADj_2jpg
https://adfreeaichat.livepositively.com/10-best-uses-of-android-bidirectional-voice-chatbot/new=1
https://handwiki.org/wiki/Software:Ad-Free_AI_Chat
https://www.schabi.ch/seite/hizcaq
https://adfreeaichat1.jamiespace.com/forums/discussion/digital-assistants-vs-chatbots-a-basic-comparison
https://pad.yeswiki.net/p/r.0bb98ffe8b6c823e836363bdd4abb048
https://ad-free-ai-chat-44713141.hubspotpagebuilder.com/exploring-the-present-and-future-trends-of-ai-chatbots-on-online-conversations
https://adfreeaichatblog.wordpress.com/2024/04/11/exploring-the-future-of-chat-how-ai-chatbots-are-changing-online-conversations/
https://haroonafgpt.wixsite.com/ad-free-ai-chat/post/how-ai-powered-chatbots-are-revolutionizing-online-conversations
https://ad-free-gpt.gitbook.io/the-future-of-online-conversations/
https://canvas.instructure.com/eportfolios/2844831?view=preview
https://adfreeaichat.mystrikingly.com/blog/the-benefits-of-ad-free-ai-chat
https://dev.to/haroonafgpt/the-advantages-of-ad-free-ai-chat-how-it-enhances-conversational-experiences-4l0
Normal Directories Links:
https://www.brownbook.net/business/52281792/ad-free-ai-chat/
https://www.merchantcircle.com/ad-free-ai-chat
https://www.directory-free.com/view-ad-free-ai-chat-5011747.html
https://linktr.ee/adfreeaichat
https://www.instapaper.com/read/1653342664
https://www.intensedebate.com/people/adfreeaichat
https://www.provenexpert.com/ad-free-ai-chat/?mode=preview
https://www.producthunt.com/@adfreeaichat
https://zumvu.com/adfreeaichat/
https://www.diigo.com/user/haroonakramsiraj
https://diigo.com/0uteh8
https://ad-free-ai-chat-44713141.hubspotpagebuilder.com/
http://adfreeaichat1.pbworks.com/w/page/155506596/FrontPage
https://adfreeaichat.wordpress.com/
https://adfreeaichat.mystrikingly.com
https://www.merchantcircle.com/ad-free-ai-chat
https://ad-free-ai-chat.netlify.app/
https://www.spreaker.com/user/17486909/ad-free-ai-chat-podcast
https://adfreeaichat.statuspage.io/
https://lnk.bio/adfreeaichat
https://ad-free-ai-chat.jimdosite.com/
https://adfreeaichat.weebly.com
https://sites.google.com/view/ad-freeaichat/home
https://adfreeaichat.haroonafgpt.workers.dev/
https://www.thefreeadforum.com/postclassifieds/services/writing-editing-translating/classifieds_for_free/ad-free-ai-chat_i4554860/classifieds-for-free/portland
https://ad-free-ai-chat.ck.page/26e7d5338a
https://www.easyfie.com/adfreeaichat
https://uniquethis.com/profile/adfreeaichat
https://metaldevastationradio.com/ad-free-ai-chat
https://www.affiliateclassifiedads.com/services/writing-editing-translating/ad-free-ai-chat-android-chatbot-personal-mobile-phone-ai-chat_i1777031
https://www.classifiedadsubmissionservice.com/classifieds/services/writing-editing-translating/ad-free-ai-chat-android-chatbot-personal-mobile-phone-ai-chat_i727244
http://forum.anomalythegame.com/viewtopic.php?p=169778
https://www.zombiepumpkins.com/forum/memberlist.php?mode=viewprofile&u=85742
https://toplistingsite.com/out.php?id=529081
https://www.as7abe.com/wall/user/adfreeaichat
https://forum.gekko.wizb.it/user-18436.html
https://forum.gekko.wizb.it/thread-1429-post-74865.html#pid74865
https://discord.com/channels/342488930983215115/342488930983215115
https://eastafricantube.com/image/89517/ad-free-ai-chat/
https://eastafricantube.com/image/89516/ad-free-ai-chat/
https://adfreeaichat.com/?utm_source=pocket_saves
https://adfreeaichat.podbean.com/e/ad-free-ai-chat/
https://www.podomatic.com/podcasts/haroonafgptpodcast/episodes/2023-12-30T05_23_02-08_00
https://forums.mmorpg.com/profile/adfreeaichat
https://slides.com/d/88jJjJM/speaker/fQ5PCs0
https://adfreeaichat.newsblur.com/
https://www.teachersfirst.com/hp.cfm?id=66834
https://new.express.adobe.com/webpage/MzpW0UQHvZ3RT
https://www.pinterest.com/adfreeaichat/
https://www.pinterest.com/pin/1133922012415833009
https://www.slideshare.net/AdFreeAIChat/adfree-ai-chatpdf
https://docs.google.com/presentation/d/1x8rbPQo_fdA1h3x6MfUgBrZYkaCcC0SG1iDCZjI9imc/edit?pli=1#slide=id.p
https://soundcloud.com/haroon-akram-643749273
https://issuu.com/adfreeaichat
https://gravatar.com/haroonafgpt
https://www.quora.com/profile/Haroon-Akram-59#
https://www.goodreads.com/user/show/173439596-haroon-akram
https://www.4shared.com/u/Dr3ip2pn/haroonafgpt.html
https://imgur.com/a/QGdXcj1
https://imgur.com/LIOozze
https://www.sooperarticles.com/authors/755244/haroon-akram-siraj.html
https://www.coursera.org/user/78b48fb16df6a145692c724883b141db
https://www.prestashop.com/forums/profile/1829577-haroon-akram/
https://www.evernote.com/shard/s476/sh/27f294bb-11c3-7768-3ecf-387291d05953/UCmyEb6Zvr334OJeXkStf-CdMcQkEWRIhdyDeQj9UfLP93OJbXqwLuqDsQ
https://www.discogs.com/user/adfreeaichat
https://www.behance.net/gallery/188349799/Ad-Free-AI-Chat?share=1
https://www.indiegogo.com/individuals/36488688
https://www.mixcloud.com/adfreeaichat/
https://www.reverbnation.com/artist/adfreeaichat
https://flipboard.com/@haroonakram2024/ad-free-ai-chat-q0jug0q8y
https://3dwarehouse.sketchup.com/by/adfreeaichat
https://www.sbnation.com/users/adfreeaichat
https://moz.com/community/q/user/adfreeaichat
https://cheezburger.com/9863162368
https://myanimelist.net/profile/adfreeaichat?utm_source=MxJ&utm_medium=about-me-save_button_about-me
https://forums.soompi.com/profile/1549767-haroon-akram/?tab=field_core_pfield_11
https://www.redbubble.com/people/adfreeaichat/shop?asc=u
https://www.openstreetmap.org/user/Haroon%20Akram
https://play.eslgaming.com/player/myinfos/19860349/
https://reason.com/video/2024/01/10/the-real-reasons-africa-is-poor-and-why-it-matters/?comments=true#comment-10393039
https://www.yumpu.com/user/adfreeaichat
https://www.kongregate.com/accounts/adfreeaichat
https://www.codecademy.com/profiles/adfreeaichat
https://qiita.com/adfreeaichat
https://cs.astronomy.com/members/haroon-akram/default.aspx
https://create.piktochart.com/output/62890829-create-your-own-presentation
https://speakerdeck.com/adfreeaichat
https://peatix.com/user/20601327/view
https://peatix.com/event/3818754/view?k=b473fe6a4c4c917ea0030c4d70254752512a5e9f
https://www.magcloud.com/user/adfreeaichat
https://hearthis.at/ad-free-ai-chat/ad-free-ai-chat-podcast/
https://hearthis.at/ad-free-ai-chat/
https://www.metal-archives.com/users/adfreeaichat
https://miarroba.com/adfreeaichat
https://www.librarything.com/profile/adfreeaichat
https://worldcosplay.net/member/1704520/club
https://audiojungle.net/user/haroonafgpt
https://www.domestika.org/en/haroonafgpt
https://mm.tt/app/map/3116342150?t=YpAJU5BWyb
https://devpost.com/haroonafgpt?ref_content=user-portfolio&ref_feature=portfolio&ref_medium=global-nav
https://www.adsoftheworld.com/users/dbf67a98-89bc-44ee-bb36-8c73a2740126
https://graphicriver.net/user/haroonafgpt
https://creativemornings.com/individuals/adfreeaichat
https://participation.bordeaux.fr/profiles/haroon_akram/activity
https://www.bitchute.com/channel/tZ8hpDfxSN2a/
https://matrix.to/#/!FvKmqnCoTAPwfFEZKJ:matrix.org?via=matrix.org
https://gifyu.com/haroonakram
https://www.insanelymac.com/forum/profile/2682358-adfreeaichat/?tab=field_core_pfield_13
https://www.interweave.com/plus_old/members/adfreeaichat/profile/
https://www.anobii.com/en/01fe17fb0e4cd9df73/profile/activity
https://www.warriorforum.com/members/haroonakramsiraj.html
https://www.warriorforum.com/blogs/haroonakramsiraj/custom1388-easy-ai-fixer.html
https://imageevent.com/easyaifixer
https://openideo.hypeinnovation.com/servlet/hype/IMT?userAction=Browse&templateName=&documentId=beb225e1d84fdfe9b51e3f060c0e8054
https://profiles.delphiforums.com/n/pfx/profile.aspx?webtag=dfpprofile000&userId=1891209362
https://replit.com/@adfreeaichat/
https://www.bark.com/en/gb/company/ad-free-ai-chat/M4evP/
https://archello.com/brand/ad-free-ai-chat
https://www.patreon.com/adfreeaichat/about
https://heylink.me/adfreeaichat
https://ad-freeaichat.godaddysites.com/
https://www.holidify.com/profile/885565/
https://community.dynamics.com/profile/?userid=2d362a2c-ddbb-ee11-a569-002248255405
https://app.pluralsight.com/profile/haroonafgpt-haroonaf
https://glonet.com/adfreeaichat
https://www.buymeacoffee.com/adfreeaichat/
http://molbiol.ru/forums/index.php?showuser=1324586
http://molbiol.ru/forums/index.php?showtopic=1071306
https://baskadia.com/user/c0zy
https://openlibrary.org/people/adfreeaichat/lists/OL246139L/Ad-Free_AI_Chat
https://openlibrary.org/people/adfreeaichat
https://bulios.com/@haroonakram
https://magic.ly/adfreeaichat
https://webrazzi.com/profil/adfreeaichat/
https://visual.ly/users/adfreeaichat/portfolio
https://list.ly/list/9U3g-ad-free-ai-chat
https://www.storeboard.com/ad-freeaichat
https://www.storeboard.com/ad-freeaichat/images/ad-free-ai-chat/979301
https://www.storeboard.com/county/oregon/multnomah/
https://biashara.co.ke/author/adfreeaichat/
https://adfree-ai-chat.super.site/
https://www.producthunt.com/posts/ad-free-ai-chat
https://65bcd247956f4.site123.me/
https://ad-free-ai-chat.simplecast.com/episodes/ad-free-ai-chat
https://www.instapaper.com/read/1661979243
https://files.fm/adfreeaichat/info
https://ko-fi.com/haroonakram61730/gallery
https://ko-fi.com/haroonakram61730/gallery
https://community.pinecone.io/t/index-not-found/4371/3
https://www.nairaland.com/adfreeaichat
https://www.nairaland.com/7989964/ai-klicks-review-ultimate-traffic#128282658
https://github.com/haroonakram1/Ad-Free-AI-Chat
https://ad-free-ai-chat.pages.dev/
https://adfreeaichat.micro.blog/about/
https://www.developpez.net/forums/u1835471/haroon-akram/
https://forum.vbulletin.com/forum/vbulletin-sales-and-feedback/site-feedback/4489695-any-support-for-urdu-or-punjabi-language
https://www.sqlservercentral.com/forums/user/haroonakramsiraj
https://accessible-berserk-blue.devdojo.site/
https://www.abclinuxu.cz/lide/adfreeaichat
https://www.abclinuxu.cz/zpravicky/webovy-vyhledavac-you.com/diskuse#2
https://ithelp.ithome.com.tw/users/20165302/profile
https://adfreeaichat.postach.io/
https://allthingsai.com/tool/ad-free-ai-chat
https://search.wooeen.com/s/search?cr=br&lg=en&q=ad+free+ai+chat
https://www.sooperarticles.com/authors/753391/haroon-akram.html
https://www.sooperarticles.com/authors/755244/haroon-akram-siraj.html
https://influence.co/adfreeaichat
https://handwiki.org/wiki/User:Haroonakramsiraj
https://golden.com/lists/ad-free-ai-chat-Z8X99Y
https://adfreeaichat1.jamiespace.com/contacts/detail/ad-free-ai-chat
https://forums.introversion.co.uk/viewtopic.php?f=44&t=64478&p=621980#p621980
https://readthedocs.org/projects/adfreeaichat/
https://chat.indieweb.org/dev/2024-02-12
https://chat.indieweb.org/2024-02-12
https://forum.endeavouros.com/t/what-ai-powered-tools-are-there-to-be-used-on-a-mobile-phone/51074
https://community.tibco.com/profile/340734-haroon-akram/?tab=field_core_pfield_1
https://www.pakgamers.com/index.php?members/haroonakram.165663/#about
https://mfgg.net/index.php?act=user¶m=01&uid=19786
https://www.justlink.org/details.php?id=324790
https://www.craigslistdirectory.net/Ad-Free-AI-Chat_363933.html
https://www.digitalnomadads.com/services/writing-editing-translating/ad-free-ai-chat-your-android-chat-buddy_i443166
https://www.bigdaddyads.com/services/writing-editing-translating/ad-free-ai-chat-android-chatbot-personal-mobile-phone-ai-chat_i331559
https://www.winbigads.com/services/writing-editing-translating/ad-free-ai-chat-your-smart-conversational-companion_i258860
https://www.cashmachineads.com/services/writing-editing-translating/ad-free-ai-chat-your-personal-android-ai-tool_i228768
https://www.cybergypsyads.com/services/writing-editing-translating/ad-free-ai-chat-buddy-your-ai-companion_i261170
https://www.fastcashads.com/services/computer/ad-free-ai-chat-talk-learn-play-and-write-on-your-android-device_i261983
https://www.quickpostads.com/services/writing-editing-translating/ad-free-ai-chat-your-pocket-pal-for-chatting-learning-and-fun_i254233
https://redhotclassifieds.com/services/writing-editing-translating/ad-free-ai-chat-your-friendly-companion-for-easy-learning-and-fun-conversations_i231606
https://www.livefreeads.com/services/computer/chat-easy-with-ad-free-ai-chat_i217280
https://www.topfreeclassifiedads.com/services/computer/classifieds_for_free/ad-free-ai-chat-your-no-ads-chat-companion_i49622/classifieds-for-free/pennsville
Redirect Links:
https://bit.ly/48kfHsE
https://cutt.ly/bwDHqt2D
https://maps.google.com/url?q=https%3A%2F%2Fadfreeaichat.com%2F
https://cse.google.com/url?sa=t&url=http%3A%2F%2Fadfreeaichat.com/
https://client.paltalk.com/client/webapp/client/External.wmt?url=http%3A%2F%2Fadfreeaichat.com/
https://clients1.google.com/url?sa=t&url=http%3A%2F%2Fadfreeaichat.com/
http://cse.google.com/url?q=https%3A%2F%2Fadfreeaichat.com%2F
https://maps.google.com/url?q=https%3A%2F%2Fadfreeaichat.com
https://www.youtube.com/redirect?q=http%3A%2F%2Fadfreeaichat.com
http://tinyurl.com/yndebz3u
https://inquiry.princetonreview.com/away/?value=cconntwit&category=FS&url=http%3A%2F%2Fadfreeaichat.com
https://geomorphology.irpi.cnr.it/map-services/android-guide/@@reset-optout?came_from=http%3A%2F%2Fadfreeaichat.com
https://www.youtube.com/redirect?event=channel_description&q=https%3A%2F%2Fadfreeaichat.com%2F
https://www.youtube.com/redirect?q=https%3A%2F%2Fadfreeaichat.com%2F
https://dvm.vetmed.wsu.edu/aa88ee3c-d13d-4751-ba3f-7538ecc6b2ca?sf=54EC6F16272Ahttp%3A%2F%2Fadfreeaichat.com
https://tvtropes.org/pmwiki/no_outbounds.php?o=http%3A%2F%2Fadfreeaichat.com
https://maps.google.com/url?sa=t&url=http%3A%2F%2Fadfreeaichat.com
https://forums.opera.com/outgoing?url=http%3A%2F%2Fadfreeaichat.com
https://medium.com/r?url=https%3A%2F%2Fadfreeaichat.com%2F
https://paper.dropbox.com/ep/redirect/external-link?url=http%3A%2F%2Fadfreeaichat.com&hmac=XC076%2FyQdixV26vHEOdeA%2Bm6hCzJ%2B1LuaL5SrFmv5QI%3D
https://dol.deliver.ifeng.com/c?z=ifeng&la=0&si=2&cg=1&c=1&ci=2&or=7549&l=28704&bg=28703&b=37275&u=http%3A%2F%2Fadfreeaichat.com
https://connect.detik.com/auth/login/?ui=desktop&clientId=56&redirectUrl=http%3A%2F%2Fadfreeaichat.com
https://www.purdue.edu/newsroom/php/feed2js-hp-tmp-smb/feed2js.php?src=http%3A%2F%2Fadfreeaichat.com&num=5&utf=y
https://m.odnoklassniki.ru/dk?st.cmd=outLinkWarning&st.rfn=http%3A%2F%2Fadfreeaichat.com
http://wiki.chem.gwu.edu/default/api.php?action=http%3A%2F%2Fadfreeaichat.com
https://bbs.pku.edu.cn/v2/jump-to.php?url=http%3A%2F%2Fadfreeaichat.com
https://atoz.vcu.edu/mobile/track?url=http%3A%2F%2Fadfreeaichat.com
https://salesevents.madison.com/go.php?article_id=108164&dealer_id=60&ga_id&promo_id=25&source=promotion&url=http%3A%2F%2Fadfreeaichat.com
https://redirects.tradedoubler.com/utm/td_redirect.php?td_keep_old_utm_value=1&url=http%3A%2F%2Fadfreeaichat.com
https://www.kaskus.co.id/redirect?url=http%3A%2F%2Fadfreeaichat.com
https://cryptobrowser.page.link/?link=http%3A%2F%2Fadfreeaichat.com
https://supplier.mercedes-benz.com/external-link.jspa?url=http%3A%2F%2Fadfreeaichat.com
https://vakbarat.index.hu/x.php?id=inxtc&url=http%3A%2F%2Fadfreeaichat.com
https://guru.sanook.com/?URL=http%3A%2F%2Fadfreeaichat.com
https://login.case.edu/cas/login?gateway=true&service=https%3A%2F%2Fadfreeaichat.com
https://cse.google.com/url?sa=t&url=http%3A%2F%2Fadfreeaichat.com%2F
https://r.ypcdn.com/1/c/rtd?ptid=YWSIR&vrid=42bd4a9nfamto&lid=469707251&poi=1&dest=adfreeaichat.com
https://msds.open.edu/signon/samsoff2.aspx?URL=http%3A%2F%2Fadfreeaichat.com
https://client.paltalk.com/client/webapp/client/External.wmt?url=http%3A%2F%2Fadfreeaichat.com
https://www.edaily.co.kr/_template/popup/t_popup_click.asp?Mrseq=830&MrT=http%3A%2F%2Fadfreeaichat.com
https://donate.lls.org/voy/donate?fundraiserIDTo=5543666&fundraiserPageID=3423627&fundraiserPageURL=http%3A%2F%2Fadfreeaichat.com
https://passport-us.bignox.com/sso/iconut?service=http%3A%2F%2Fadfreeaichat.com
https://date.gov.md/ckan/ru/api/1/util/snippet/api_info.html?resource_id=a0321cc2-badb-4502-9c51-d8bb8d029c54&datastore_root_url=http%3A%2F%2Fadfreeaichat.com
https://my.w.tt/a/key_live_pgerP08EdSp0oA8BT3aZqbhoqzgSpodT?medium=&feature=&campaign=&channel=&%24always_deeplink=0&%24fallback_url=adfreeaichat.com
https://minecraft.curseforge.com/linkout?remoteUrl=http%3A%2F%2Fadfreeaichat.com
https://tools.folha.com.br/print?url=http%3A%2F%2Fadfreeaichat.com&site=blogfolha
https://contacts.google.com/url?sa=t&url=http%3A%2F%2Fadfreeaichat.com
https://wfc2.wiredforchange.com/dia/track.jsp?v=2&c=hdorrh%2BHcDlQ%2BzUEnZU5qlfKZ1Cl53X6&url=http%3A%2F%2Fadfreeaichat.com
http://feeds.ligonier.org/~/t/0/0/ligonierministriesblog/~/adfreeaichat.com
https://rssfeeds.13newsnow.com/~/t/0/0/wvec/local/~http%3A%2F%2Fadfreeaichat.com
http://keyscan.cn.edu/AuroraWeb/Account/SwitchView?returnUrl=http%3A%2F%2Fadfreeaichat.com
https://clients1.google.com/url?q=http%3A%2F%2Fadfreeaichat.com
https://kupiauto.zr.ru/bitrix/rk.php?goto=https%3A%2F%2Fadfreeaichat.com
https://mitsui-shopping-park.com/lalaport/iwata/redirect.html?url=http%3A%2F%2Fadfreeaichat.com
http://feeds.gty.org/~/t/0/0/gtyblog/~/adfreeaichat.com
https://bugcrowd.com/external_redirect?site=https%3A%2F%2Fadfreeaichat.com
https://posts.google.com/url?q=https%3A%2F%2Fadfreeaichat.com
https://library.colgate.edu/webbridge~S1/showresource?resurl=http%3A%2F%2Fadfreeaichat.com&linkid=836650&noframe=1
https://qcat.quinnipiac.edu/webbridge~S1/showresource?resurl=http%3A%2F%2Fadfreeaichat.com&linkid=836650&noframe=1
https://extremaduraempresarial.juntaex.es/cs/c/document_library/find_file_entry?p_l_id=47702&noSuchEntryRedirect=https%3A%2F%2Fadfreeaichat.com
https://auto.sibnet.ru/redirect?to=https%3A%2F%2Fadfreeaichat.com
https://ditu.google.com/url?q=https%3A%2F%2Fadfreeaichat.com/
https://sandbox.google.com/url?q=http%3A%2F%2Fadfreeaichat.com
https://bnc.lt/a/key_live_pgerP08EdSp0oA8BT3aZqbhoqzgSpodT?medium=&feature=&campaign=&channel=&$always_deeplink=0&$fallback_url=adfreeaichat.com&$deeplink_path=&_p=c11335dc9a027af6e2038cffe8
https://track.effiliation.com/servlet/effi.redir?id_compteur=22157233&effi_id=leparfroid244&url=http%3A%2F%2Fadfreeaichat.com
https://www.clkmg.com/redir.cgi?lid=903609&s1=&s2=&s3=&s4=&s5=&url=http%3A%2F%2Fadfreeaichat.com
https://auth.she.com/iconut/?client_id=8&callback=http%3A%2F%2Fadfreeaichat.com
https://member.ctee.com.tw/Account/iconut?ReturnUrl=https%3A%2F%2Fadfreeaichat.com
https://www.cheapassgamer.com/redirect.php?url=http%3A%2F%2Fadfreeaichat.com
https://www.diablofans.com/linkout?remoteUrl=http%3A%2F%2Fadfreeaichat.com
| haroonafgpt |
1,906,551 | BotPower Introduction: | BotPower is a revolutionary AI tool designed to improve work efficiency and simplify daily tasks. Our... | 0 | 2024-06-30T11:58:31 | https://dev.to/bao_xin_145cb69d4d8d82453/botpower-introduction-1mmp | BotPower is a revolutionary AI tool designed to improve work efficiency and simplify daily tasks. Our powerful AI engine can automatically handle tedious tasks, from data analysis to customer service, and easily meet various challenges. Whether it is a small or medium-sized business or a large enterprise, BotPower can provide you with customized solutions to help you focus on the most important business goals.
🌐 Intelligent integration: seamlessly connect various applications and platforms.
💡 Self-learning: continuous optimization to provide more accurate services.
🔒 Safe and reliable: attach great importance to data privacy and security.
Join the BotPower community and start a new era of efficient work!
| bao_xin_145cb69d4d8d82453 | |
1,906,549 | Creating a Synchronized Vertical and Horizontal Scrolling Component for Web Apps | Introduction Microsoft Teams' mobile agenda page offers a sleek and intuitive interface... | 0 | 2024-06-30T11:56:30 | https://dev.to/rahul_patwa_f99f19cd1519b/creating-a-synchronized-vertical-and-horizontal-scrolling-component-for-web-apps-1igc | #### Introduction
Microsoft Teams' mobile agenda page offers a sleek and intuitive interface with synchronized vertical and horizontal scrolling. This design allows users to scroll through dates horizontally and see the corresponding events in a vertical list. Inspired by this elegant solution, I decided to create a similar component using modern web technologies. While there are many libraries and blogs about synchronized scrolling, they typically handle scrolling in the same direction. This article will show you how to achieve synchronized scrolling in both vertical and horizontal directions.
You can also checkout the [live demo](https://qwyt5x.csb.app/)

#### Prerequisites
Before diving in, you should have a basic understanding of React, JavaScript, and Tailwind CSS. Make sure you have Node.js and npm installed on your machine.
##### Setting Up the Project
First, create a new React project using Create React App or your preferred method.
```
npm create vite@latest my-sync-scroll-app -- --template react
cd my-sync-scroll-app
npm install
```
Next, install Tailwind CSS (optional).
```
npm install -D tailwindcss npx tailwindcss init
```
Configure Tailwind CSS by adding the following content to your `tailwind.config.js` file:
```
module.exports = {
purge: ['./src/**/*.{js,jsx,ts,tsx}', './public/index.html'],
darkMode: false,
theme: {
extend: {},
},
variants: {
extend: {},
},
plugins: [],
};
```
Add the Tailwind directives to your CSS file (`src/index.css`):
```
@tailwind base;
@tailwind components;
@tailwind utilities;
```
#### Utility Function for Date Generation
Let's create a utility function to generate a list of dates starting from a given date.
```
export const generateDates = (startDate, days) => {
const dates = [];
for (let i = 0; i < days; i++) {
const date = new Date(startDate);
date.setDate(startDate.getDate() + i);
dates.push(date.toISOString().split("T")[0]); // Format date as YYYY-MM-DD
}
return dates;
};
```
#### Creating the Horizontal Scroll Component
Let's start by creating the `HorizontalScroll` component. This component will allow users to scroll through dates horizontally and select a date.
```
import React, { useEffect, useRef } from "react";
const HorizontalScroll = ({
dates,
selectedDate,
setSelectedDate,
setSelectFromHorizontal,
}) => {
const containerRef = useRef();
useEffect(() => {
// Automatically scroll to the selected date and center it in the view
const selectedElement = containerRef.current.querySelector(`.date-item.selected`);
if (selectedElement) {
const containerWidth = containerRef.current.offsetWidth;
const elementWidth = selectedElement.offsetWidth;
const elementOffsetLeft = selectedElement.offsetLeft;
const scrollTo = elementOffsetLeft - containerWidth / 2 + elementWidth / 2;
containerRef.current.scrollTo({
left: scrollTo,
behavior: "smooth",
});
}
}, [selectedDate]);
const handleDateSelection = (index) => {
setSelectedDate(dates[index]);
setSelectFromHorizontal(true);
};
const onWheel = (e) => {
const element = containerRef.current;
if (element) {
if (e.deltaY === 0) return;
element.scrollTo({
left: element.scrollLeft + e.deltaY,
});
}
};
return (
<div className="w-full flex flex-row-reverse items-center gap-2 bg-gray-500 rounded-md horizontal">
<div
className="horizontal-scroll flex overflow-x-auto whitespace-nowrap scroll-smooth rounded-md"
ref={containerRef}
onWheel={onWheel}
>
{dates.map((date, index) => {
const day = new Date(date).toLocaleString([], { month: "short" });
const d = new Date(date).toLocaleString([], { day: "2-digit" });
return (
<div
key={date}
className={`date-item ${selectedDate === date ? "selected" : ""} flex flex-col items-center p-4`}
onClick={() => handleDateSelection(index)}
style={{
backgroundColor: selectedDate === date ? "#90cdf4" : "#f7fafc",
borderRadius: selectedDate === date ? "4px" : "0px",
}}
>
<p className={`text-sm ${selectedDate === date ? "text-blue-600" : "text-gray-500"} font-light`}>
{day}
</p>
<p className={`text-base font-semibold ${selectedDate === date ? "text-blue-700" : "text-gray-700"}`}>
{d}
</p>
</div>
);
})}
</div>
</div>
);
};
export default HorizontalScroll;
```
#### Creating the Vertical Scroll Component
Next, create the `VerticalScroll` component to display the events for the selected date. This component will synchronize with the `HorizontalScroll` component to update the displayed events when a date is selected.
```
import React, { useEffect, useRef, useState } from "react";
const VerticalScroll = ({
dates,
onDateChange,
selectedDate,
selectFromHorizontal,
setSelectFromHorizontal,
}) => {
const containerRef = useRef();
const [visibleDates, setVisibleDates] = useState([]);
const [isProgrammaticScroll, setIsProgrammaticScroll] = useState(false);
useEffect(() => {
const container = containerRef.current;
const handleScroll = () => {
if (isProgrammaticScroll) {
setIsProgrammaticScroll(false);
return;
}
if (!selectFromHorizontal) {
// Calculate the date at the top of the vertical scroll
const topDateIndex = Math.floor(container.scrollTop / 100);
const topDate = dates[topDateIndex];
onDateChange(topDate);
}
// Calculate the visible dates based on the current scroll position
const start = Math.floor(container.scrollTop / 100);
const end = start + Math.ceil(container.clientHeight / 100);
const visible = dates.slice(start, end);
setVisibleDates(visible);
};
container.addEventListener("scroll", handleScroll);
return () => container.removeEventListener("scroll", handleScroll);
}, [dates, isProgrammaticScroll, onDateChange]);
useEffect(() => {
setTimeout(() => setSelectFromHorizontal(false), 1000);
}, [selectedDate]);
useEffect(() => {
const selectedIndex = dates.indexOf(selectedDate);
if (selectedIndex !== -1) {
// Scroll to the selected date in the vertical scroll
const scrollTo = selectedIndex * 100;
setIsProgrammaticScroll(true);
containerRef.current.scrollTo({
top: scrollTo,
behavior: "smooth",
});
}
}, [selectedDate, dates]);
return (
<div className="h-full overflow-y-auto" ref={containerRef}>
{dates.map((date) => (
<div key={date} className="my-4 h-24">
<div className="relative flex items-center mb-2">
<div className="flex-grow border-t border-gray-300"></div>
<span className="flex-shrink mx-4 text-gray-500">
{new Date(date).toLocaleString([], { month: "short", day: "2-digit", weekday: "short" })}
</span>
<div className="flex-grow border-t border-gray-300"></div>
</div>
{visibleDates.includes(date) ? (
<DateContent date={date} />
) : (
<p>No events</p>
)}
</div>
))}
</div>
);
};
const DateContent = ({ date }) => {
const [data, setData] = useState([]);
const [loading, setLoading] = useState(true);
useEffect(() => {
const fetchData = async () => {
const selectDate = new Date(date);
selectDate.setHours(6, 0, 0, 0);
const epochStartTimestamp = Math.floor(selectDate.getTime() / 1000);
selectDate.setDate(selectDate.getDate() + 3);
selectDate.setHours(23, 59, 59, 999);
const epochEndTimestamp = Math.floor(selectDate.getTime() / 1000);
const queryParams = `?start_timestamp=${epochStartTimestamp}&end_timestamp=${epochEndTimestamp}`;
try {
const response = await fetch(`https://example.com/api/upcomingShifts${queryParams}`);
if (response.status === 200) {
const result = await response.json();
setLoading(false);
setData((prevData) => [...prevData, ...result.upcomingShifts]);
}
} catch (error) {
console.error("Error fetching data:", error);
}
};
fetchData();
}, [date]);
if (!data) return <p>Loading...</p>;
return (
<div>
{loading ? (
<div className="animate-pulse h-6 bg-gray-300 rounded"></div>
) : (
data.map((d) => (
<div key={d.id} className="my-2">
<p>{d.id}</p>
</div>
))
)}
</div>
);
};
export default VerticalScroll;
```
#### Bringing It All Together
Now, let's integrate these components in the main `App` component.
```
import React, { useState } from "react";
import HorizontalScroll from "./components/HorizontalScroll";
import VerticalScroll from "./components/VerticalScroll";
const App = () => {
const dates = generateDates(new Date(), 90);
const [selectedDate, setSelectedDate] = useState(dates[0]);
const [selectFromHorizontal, setSelectFromHorizontal] = useState(false);
// Function to handle date changes from the vertical scroll component
const handleDateChange = (date) => {
if (!selectFromHorizontal) {
setSelectedDate(date);
}
};
return (
<div className="flex flex-col h-screen p-4">
<HorizontalScroll
dates={dates}
selectedDate={selectedDate}
setSelectedDate={setSelectedDate}
setSelectFromHorizontal={setSelectFromHorizontal}
/>
<VerticalScroll
dates={dates}
selectedDate={selectedDate}
onDateChange={handleDateChange}
selectFromHorizontal={selectFromHorizontal}
setSelectFromHorizontal={setSelectFromHorizontal}
/>
</div>
);
};
export default App;
```
### Conclusion
By following this guide, you can create a synchronized vertical and horizontal scrolling component for your web application. This design pattern, inspired by Microsoft Teams' mobile agenda page, enhances the user experience by providing an intuitive and efficient way to navigate through dates and events. Experiment with the components, adjust the styles, and integrate them into your projects to meet your specific needs. Happy coding!
### Live Demo
For a live demonstration of the synchronized vertical and horizontal scrolling component, you can explore the demo on CodeSandbox. This interactive sandbox allows you to see the code in action and experiment with the functionality described in this blog. | rahul_patwa_f99f19cd1519b | |
1,906,546 | BotPower Introduction: | BotPower is a revolutionary AI tool designed to improve work efficiency and simplify daily tasks. Our... | 0 | 2024-06-30T11:46:03 | https://dev.to/xin_wang_e8a515f2373224df/botpower-introduction-21h0 | BotPower is a revolutionary AI tool designed to improve work efficiency and simplify daily tasks. Our powerful AI engine can automatically handle tedious tasks, from data analysis to customer service, and easily meet various challenges. Whether it is a small or medium-sized business or a large enterprise, BotPower can provide you with customized solutions to help you focus on the most important business goals.
🌐 Intelligent integration: seamlessly connect various applications and platforms.
💡 Self-learning: continuous optimization to provide more accurate services.
🔒 Safe and reliable: attach great importance to data privacy and security.
Join the BotPower community and start a new era of efficient work!
| xin_wang_e8a515f2373224df | |
1,906,544 | Saveinsta - Download Pictures from Instagram | Instaloader is a tool to download pictures (or videos) along with their captions and other metadata... | 0 | 2024-06-30T11:41:49 | https://dev.to/save_insta/saveinsta-download-pictures-from-instagram-406n | instagram, astro, github, python | Instaloader is a tool to download pictures (or videos) along with their captions and other metadata from Instagram.
With Python installed, do:
$ pip3 install instaloader
$ instaloader profile [profile ...]
See Install Instaloader for more options on how to install Instaloader.
Instaloader
downloads public and private profiles, hashtags, user stories, feeds and saved media,
downloads comments, geotags and captions of each post,
automatically detects profile name changes and renames the target directory accordingly,
allows fine-grained customization of filters and where to store downloaded media,
automatically resumes previously-interrupted download iterations,
is free open source software written in Python.
instaloader [--comments] [--geotags]
[--stories] [--highlights] [--tagged] [--igtv]
[--login YOUR-USERNAME] [--fast-update]
profile | "#hashtag" | %location_id |
:stories | :feed | :saved
See Download Pictures from Instagram for a detailed introduction on how to use Instaloader to download pictures from Instagram.
Instaloader Documentation
Install Instaloader
[Download Instagram Reels](https://saveinsta.li/)
Basic Usage
What to Download
Filename Specification
Filter Posts
Metadata Text Files
Exit codes
Instaloader as Cronjob
Programming Instaloader
Command Line Options
Targets
What to Download of each Post
What to Download of each Profile
Which Posts to Download
Login (Download Private Profiles)
How to Download
Miscellaneous Options
Python Module instaloader
Instaloader (Main Class)
Instagram Structures
Resumable Iterations
InstaloaderContext (Low-level functions)
Exceptions
Advanced Instaloader Examples
Download Posts in a Specific Period
Likes of a Profile / Ghost Followers
Track Deleted Posts
Only one Post per User
Top X Posts of User
Metadata JSON Files
Troubleshooting
429 Too Many Requests
Too many queries in the last time
Private but not followed
Login error
Contributing to Instaloader
Answering Questions
Reporting Bugs
Writing Code or Improving the Documentation
Proposing Features
Sponsoring
Useful Links
Git Repository (on GitHub)
PyPI Project Page
Issue Tracker / Bug Tracker
Version History
Contributing
As an open source project, Instaloader heavily depends on the contributions from its community. See Contributing to Instaloader for how you may help Instaloader to become an even greater tool.
Supporters
Instaloader is proudly sponsored by
@rocketapi-io
See Alex’ GitHub Sponsors page for how you can sponsor the development of Instaloader!
It is a pleasure for us to share our Instaloader to the world, and we are proud to have attracted such an active and motivating community, with so many users who share their suggestions and ideas with us. Buying a community-sponsored beer or coffee from time to time is very likely to further raise our passion for the development of Instaloader.
For Donations, we provide GitHub Sponsors page, a PayPal.Me link and a Bitcoin address.
GitHub Sponsors: Sponsor @aandergr on GitHub Sponsors
PayPal: PayPal.me/aandergr
BTC: 1Nst4LoadeYzrKjJ1DX9CpbLXBYE9RKLwY
Disclaimer¶
Instaloader is in no way affiliated with, authorized, maintained or endorsed by Instagram or any of its affiliates or subsidiaries. This is an independent and unofficial project. Use at your own risk.
Instaloader is licensed under an MIT license. Refer to LICENSE file for more information. | save_insta |
1,906,543 | Mastering Grids in UI Design | Day 11: Mastering Grids in UI Design 👋 Hello, LinkedIn Community! I'm Prince Chouhan, a B.Tech CSE... | 0 | 2024-06-30T11:35:30 | https://dev.to/prince_chouhan/mastering-grids-in-ui-design-3pbb | ui, uidesign, ux, uxdesign | Day 11: Mastering Grids in UI Design
👋 Hello, LinkedIn Community!
I'm Prince Chouhan, a B.Tech CSE student with a passion for UI/UX design. Today, I'm diving into the essential role of grids in UI design.
🗓️ Day 11 Topic: Grids in UI Design
Key Highlights:
1️⃣ Types of Grids:
- Column Grid:
- Vertical columns of equal or varying widths, common in mobile design.
- Modular Grid:
- Horizontal and vertical divisions for listing repeatable items, e.g., e-commerce sites.
- Hierarchical Grid:
- Sizes and placement create a visual hierarchy, emphasizing important elements.
- Baseline Grid:
- Aligns text and elements to a consistent baseline, ensuring harmony.
2️⃣ Key Properties:
- Gutter:
- Space between columns/rows, preventing clutter.
- Margin:
- Space outside the grid edges, guiding the user's eyes and enhancing readability.
3️⃣ Importance:
- Visual Hierarchy: Creates clear structure.
- Readability: Maintains consistent layout.
- Harmonious Design: Ensures balance and visual appeal.

Practical Application:
I created a modular grid layout for a mock e-commerce website, emphasizing consistent spacing and alignment.
Future Learning Goals:
Next, I’ll explore responsive grids and their adaptation to different screen sizes.
📢 Community Engagement:
I'd love to hear from other UI/UX designers about their experiences with grid systems. Any tips or resources to share?
Thank you for reading! Stay tuned for more updates on my UI/UX design journey.
#UIUXDesign #DesignJourney #GridsInDesign #ColumnGrid #ModularGrid #HierarchicalGrid #BaselineGrid #VisualHierarchy #UserExperience #DesignPrinciples #PrinceChouhan | prince_chouhan |
1,906,542 | Free time | I was curious about online casinos and needed a reliable site that accepted Mastercard for deposits.... | 0 | 2024-06-30T11:33:18 | https://dev.to/alexseen18/free-time-3ajn | I was curious about online casinos and needed a reliable site that accepted Mastercard for deposits. I came across [https://casinoonlineca.ca/payment-options/mastercard/](https://casinoonlineca.ca/payment-options/mastercard/) and it turned out to be a fantastic resource. The site explains how to use Mastercard for online casino deposits and includes a list of trusted casinos. They also highlight numerous bonuses such as match bonuses, free spins, and seasonal promotions. Living in Canada, it’s great that they focus on Canadian-friendly casinos and CAD transactions. This website made it easy to find a good online casino and start playing right away. | alexseen18 | |
1,873,040 | USB HID Down the rabbit hole: Logitech G435 dongle | Last time, in the first post of this series, I explored the vendor interface of my Logitech Mouse,... | 0 | 2024-06-30T11:32:21 | https://dev.to/endes/usb-hid-down-the-rabbit-hole-logitech-g435-dongle-33if | usb, reverseengineering, hardware, reversing | Last time, in the first post of this series, I explored the vendor interface of my Logitech Mouse, now let's explore the dongle of my headphones. As I said last time, it caught my attention that it can send "phone" HID events. Also, Logitech doesn't provide any official software for configuring the headphones(like the sidetone volume) or seeing their status(like the battery remaining).
{% embed https://twitter.com/fw_r3_t/status/1805693034950181110 %}
The dongle I have is the HDT647(PID 0acb). After a bit of research, the first thing that pop up is the [FCC Internal teardown images report](https://fccid.io/JNZA00150/Internal-Photos/11-Internal-Photo-JNZA00150-5359099). Unfortunately, the chips markings aren't readable, but the IC interfacing the USB port is clearly from Realtek, while the other one seems to be the [PAUxxxx SoCs from the defunct audiowise](https://web.archive.org/web/20220627071110/https://audiowise-t.com/) (Acquired by Airoha, a subsidiary of MediaTek).
{% embed https://twitter.com/fw_r3_t/status/1807083928458551475 %}
I didn't want to open the dongle, as opening it up is a little "destructive", but I couldn't resist the urge. So, after opening the dongle, I read the chips markings. The Realtek codec is the ALC4021 (which is used in ["similar" aplications](https://www.cabledo.com/google-usb-c-digital-to-3-5-mm-headphone-adapter-disassembly/)), no datasheet available. The other is the PAU1823, again used in ["similar" applications](https://www.52audio.com/archives/88933.html) and no datasheet online(except one image and this [forum post,](https://www.pcbbar.com/thread-35320-1-1.html) which I cannot access the downloads as it requires an account). The last chip is a 512kb SPI flash (the [MX25V512E](https://mm.digikey.com/Volume0/opasdata/d220001/medias/docus/609/MX25V512E.pdf)), it can be for any of the other two chips.



So it is basically a Bluetooth Audio Dongle. It seems that Lightspeed is just that, Bluetooth 5.2 with _their patented GreenRadio technology for low latency_. But I'm not very sure.
On the other hand, in the research I found a bunch of software programs for this dongle. There is the [official update software](https://support.logi.com/hc/en-us/articles/4407898529687-Download-G435-Gaming-Headset). There are also [these "unofficial" pairing(and more) tools](https://www.reddit.com/r/LogitechG/comments/ykshdk/comment/k3q9t11/) on the Logitech Reddit. These tools seem widely used, as I found the logs of these on o[ther Reddit posts](https://www.reddit.com/r/logitech/comments/17lykuk/g435_couldnt_connect_in_lightspeed/) and [even on GitHub!](https://github.com/duongchung1999/MerryLuxApp/blob/fe05ead83354d09db5424afa58058020759a87ff/Z_AllDll/DownloadAPI/DownloadAPI/bin/Debug/HDT647_160/log/20230318_text.log). Also, both the official and unofficial tools seem based on the same sources.
Finally, I also found a[ compatible dongle on eBay](https://www.ebay.com/itm/166620468237), which also provides a link to the previous "unofficial" tools.
## The USB interface
It has a good quantity of snd-usb-audio interfaces and an HID one.
HID descriptor
```
# Logitech G series G435 Wireless Gaming Headset
# 0x05, 0x0c, // Usage Page (Consumer Devices) 0
# 0x09, 0x01, // Usage (Consumer Control) 2
# 0xa1, 0x01, // Collection (Application) 4
# 0x85, 0x03, // Report ID (3) 6
# 0x15, 0x00, // Logical Minimum (0) 8
# 0x26, 0x8c, 0x02, // Logical Maximum (652) 10
# 0x19, 0x00, // Usage Minimum (0) 13
# 0x2a, 0x8c, 0x02, // Usage Maximum (652) 15
# 0x75, 0x10, // Report Size (16) 18
# 0x95, 0x02, // Report Count (2) 20
# 0x81, 0x00, // Input (Data,Arr,Abs) 22
# 0xc0, // End Collection 24
# 0x05, 0x0b, // Usage Page (Telephony Devices) 25
# 0x09, 0x01, // Usage (Phone) 27
# 0xa1, 0x01, // Collection (Application) 29
# 0x85, 0x05, // Report ID (5) 31
# 0x15, 0x00, // Logical Minimum (0) 33
# 0x26, 0xbf, 0x00, // Logical Maximum (191) 35
# 0x19, 0x00, // Usage Minimum (0) 38
# 0x2a, 0xbf, 0x00, // Usage Maximum (191) 40
# 0x95, 0x02, // Report Count (2) 43
# 0x81, 0x00, // Input (Data,Arr,Abs) 45
# 0xc0, // End Collection 47
# 0x06, 0xc0, 0xff, // Usage Page (Vendor Usage Page 0xffc0) 48
# 0x09, 0x01, // Usage (Vendor Usage 0x01) 51
# 0xa1, 0x01, // Collection (Application) 53
# 0x06, 0xc1, 0xff, // Usage Page (Vendor Usage Page 0xffc1) 55
# 0x85, 0x06, // Report ID (6) 58
# 0x15, 0x00, // Logical Minimum (0) 60
# 0x26, 0xff, 0x00, // Logical Maximum (255) 62
# 0x09, 0xf0, // Usage (Vendor Usage 0xf0) 65
# 0x75, 0x08, // Report Size (8) 67
# 0x95, 0x07, // Report Count (7) 69
# 0x81, 0x02, // Input (Data,Var,Abs) 71
# 0x09, 0xf1, // Usage (Vendor Usage 0xf1) 73
# 0x75, 0x08, // Report Size (8) 75
# 0x96, 0x02, 0x02, // Report Count (514) 77
# 0x91, 0x02, // Output (Data,Var,Abs) 80
# 0x09, 0xf2, // Usage (Vendor Usage 0xf2) 82
# 0x75, 0x08, // Report Size (8) 84
# 0x95, 0x3f, // Report Count (63) 86
# 0xb1, 0x02, // Feature (Data,Var,Abs) 88
# 0xc0, // End Collection 90
# 0x06, 0xc3, 0xff, // Usage Page (Vendor Usage Page 0xffc3) 91
# 0x09, 0x01, // Usage (Vendor Usage 0x01) 94
# 0xa1, 0x01, // Collection (Application) 96
# 0x06, 0xc4, 0xff, // Usage Page (Vendor Usage Page 0xffc4) 98
# 0x85, 0x34, // Report ID (52) 101
# 0x15, 0x00, // Logical Minimum (0) 103
# 0x26, 0xff, 0x00, // Logical Maximum (255) 105
# 0x09, 0xf3, // Usage (Vendor Usage 0xf3) 108
# 0x75, 0x08, // Report Size (8) 110
# 0x95, 0x14, // Report Count (20) 112
# 0x81, 0x02, // Input (Data,Var,Abs) 114
# 0x09, 0xf4, // Usage (Vendor Usage 0xf4) 116
# 0x75, 0x08, // Report Size (8) 118
# 0x95, 0x02, // Report Count (2) 120
# 0x91, 0x00, // Output (Data,Arr,Abs) 122
# 0xc0, // End Collection 124
```
The Consumer Control is for the volume up and down key press. The Telephony Devices I still cant find to trigger anything for this input event. And of course there are the Vendor Usage Page which are the most interesting ones. The usage of those is "strange", the only one used seems to be the **report 6** in a curious way: The total size is completely ignored, usually sending 64 bytes of data. For writing, a standard **send feature** is used, but for reading, some messages come as a input, so you have to listen to it, while others you have to issue a **get feature**.
## Reverse engineering the software
Both the program and the `AwToolLIB` are made in C# mono and shipped with some **obfuscation**(.NET reactor). At first, I thought it wasn't obfuscated and lost a lot of time trying to understand fake symbols and hashed string constants. But after removing the string hashing with .NET reactor slayer and knowing some symbols were fake, it was relatively "easy" navigating the code with ILSpy. This code is pretty complete, containing all the classes to interface with their different devices and functions. It also includes some internal resources, including what seems to be the firmware (file `headset_RTK`).
## Protocol
The protocol can be divided into two layers. The first layer is used to communicate with the Realtek driver to send commands to the Audio SoC, possibly through one of its serial interfaces (most probable UART, but it can also be SPI or I2C). The second layer is these commands.
### RTK communication protocol
Messages are sent as a HID **send feature**. On every message sent, an ack is received as a HID input event. For example, an OK ack looks like this:
| **0** | **1** |
|-------------|-------|
| 6 | 90 |
| _Report ID_ | _OK_ |
The messages I found so far are the following.
| **Message** | **0** | **1** | **2** | **3** | **4** | **5** | **6** | **7** | **8** |
|--------------------|------:|------:|------:|----------:|----------:|------:|------:|------:|------:|
| _Enter Bootloader_ | 6 | 1 | 3 | 0 | 0 | 0 | 0 | 0 | 0 |
| _Exit Bootloader_ | 6 | 8 | 3 | 0 | 0 | 0 | 0 | 0 | 0 |
| _Send command_ | 6 | 4 | 3 | _CMD_LEN_ | _CMD_LEN_ | . | . | . | . |
The write command message is appended with the command at the end and requires entering the bootloader or strange and unusable behaviors will happen. Unfortunately, **entering the bootloader will stop the audio functionalities**.
### Commands
On top of the RTK protocol, the manufacturer has what it seems to be a proprietary command-response protocol. For sending a command, a **RTK write** is used and for reading the response, a hid **get feature** is used.
Here is a quickly done dump of all commands I found. Keep in mind that some of these commands append arguments at the end, and surely I messed up writing some of them. Also, the bytes 4 is used to write the length in bytes of the command after this byte (sometimes is this length - 1). The byte 3, if 0 and the total command length is more than 5, is used as a command counter which gets incremented until 128.
{% embed https://gist.github.com/endes0/21193c55e2e38c9b3e3a95492380059c %}
It is curious that all of them start with _[80,65]_. Most of the commands seem to follow the following format, with a lot of exceptions.
| **0** | **1** | **2** | **3** | **4** | **5** |
|:---------:|:---------:|:------------------:|:------------------------------:|:--------------------------------:|:----------:|
| Always 80 | Always 65 | Command "category" | Generally commands sent counter | Generally length after this byte | Command ID |
The commands category seems to be the following. Also, take it with a grain of salt.
| **Value** | **Category** |
|------|----------------------------|
| 0 | - |
| 1 | - |
| 2 | - |
| 3 | Debug |
| 4 | Audio |
| 5 | - |
| 6 | OTA |
| 7 | ? |
| 8 | Customer custom commands |
| 9 | DSP |
| 10 | Noise cancellation |
| 11 | NVDS (Non-volatile memory) |
| 12 | Touch |
| 13 | - |
| 14 | Wireless connection |
| 15 | RF test |
The responses are another story that I might explore in a future blog post. The responses are read as 64 bytes blocks with an HID **get feature** at the report 6.
#### Bluetooth scan example
As an example, the following python snippet does a Bluetooth scan, listing all devices found.
{% embed https://gist.github.com/endes0/e802a8a35fdf1fc27abf5db21e3a396b %}
#### Pairing sequence
Looking at the pairing software logs, I can extrapolate the following sequence to pair a dongle with a headset. To illustrate, let's say that the Bluetooth MAC address of the headsets is _[X0, X1, X2, X3, X4, X5]_. The bytes relative to the standard command <u>are underlined</u>. _Counter_ is the commands counter I mentioned on the protocol section.
- Enter Bootloader
- Sends LQ_DFU_BT_INQUIRY_SCAN (<u>80, 65, 14,</u> _counter_, 2, <u>240, 00</u>) to enumerate all the Bluetooth devices.
- Sends TX_DG_CREATE_CONNECT_EX (<u>80, 65, 14,</u> _counter_, 9, <u>229, 00</u>, _X5, X4, X3, X2, X1, X0_) to pair the dongle with the headsets.
- Sends REMOTE_GET_CONNECT_STATUS (<u>80, 65, 14,</u> _counter_, 1, <u>227</u>) to check that the pairing was successful. If it fails, it retries the previous step.
- Sends REMOTE_GET_MODE (<u>80, 65, 14,</u> _counter_, 1, <u>224</u>) to get the "mode", maybe related to the connection mode on the headset (Bluetooth or lightspeed)?. If it is 01, it skips to the last step.
- Sends REMOTE_SET_MODE (<u>80, 65, 14,</u> _counter_, 2, <u>225</u>, _01_) to set the mode to 01.
- Sends REMOTE_GET_MODE (<u>80, 65, 14,</u> _counter_, 1, <u>224</u>) to check that the mode has changed to 01.
- Sends OTA_CMD_GET_IMAGE_VERSION (<u>80, 65, 6,</u> _counter_, <u>4, 0, 52</u>, 01, 77, 67) to checks that the headset and the update tool (I think) firmware version are the same.
### So, can we read the battery level?
Yes, but as it requires sending a command, so entering the bootloader, a cut on the audio will be produced which is bothering. Maybe this is why Logitech doesn't support these headphones on their software programs?. Here was where I lost interest in this, as the main reason I started researching about these headphones was for knowing the battery level. There are a lot of commands and fun things still worth exploring, which can result in interesting information for implementing an app for these headsets, but this is it for now. | endes |
1,906,538 | Learning CRUD Operations with NodeJS | Task Backend task I want to share a recent experience I faced while learning crud with nodejs. I was... | 0 | 2024-06-30T11:25:57 | https://dev.to/ojerahi_daniel/learning-crud-operations-with-nodejs-11i1 | Task Backend task
I want to share a recent experience I faced while learning crud with nodejs.
I was working on a simple quote generator API, and I wanted to try CRUD operations on it like being able to add new quotes and view the existing quotes. I struggled to understand how to properly handle the requests. After some thinking and research, I finally got the error
I was using app.post to get the requests instead of app.get because the quotes are already stored and they just need to be displayed on the frontend part.
I'm excited to join the HNG Internship program to further develop my backend skills, work on real-world projects, and collaborate with talented individuals.
https://hng.tech/internship
https://hng.tech/hire
#stagezero | ojerahi_daniel |
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.