id | title | description | collection_id | published_timestamp | canonical_url | tag_list | body_markdown | user_username |
|---|---|---|---|---|---|---|---|---|
1,890,802 | Croc: seamless file transfer | Good morning everyone and happy MonDEV! ☕ how is development going in this almost summer? Here we... | 25,147 | 2024-06-17T07:00:00 | https://dev.to/giuliano1993/croc-seamless-file-transfer-2428 | tooling, python, cli, opensource | Good morning everyone and happy MonDEV! ☕
How is development going in this almost-summer? Here we continue to write lines of code and look for new tools to talk about, even though your Monday newsletter will soon go on vacation for a few weeks before coming back more charged than ever!
But there's still some time left, so why not move on to today's tool?
Today we talk about file transfer. How many times do we have to quickly pass one or more files from one PC to another?
What is the first method you reach for? A USB stick? Cloud storage or Drive, maybe? WhatsApp or Telegram Web?
Whatever your tool of choice, it forces us to interrupt our workflow and leave our comfortable environment.
As I've said on other occasions, it's nothing revolutionary, but if we can always do better, why not?
So I present to you [Croc](https://github.com/schollz/croc), today's tool.
What is it about? Well, Croc is a tool that allows two computers to exchange files securely directly from the command line. It is available for any operating system, very lightweight, and very easy to use.
Once installed, you just need to run `croc send ./file-name.ext` from the terminal, and croc will give you a passphrase; by running croc on any other PC and entering the same phrase, you can download the file immediately.
The download is available only once, so each passphrase is single-use, and you can use the generated one or choose your own.
In this way, without leaving the terminal, you will have transferred your files from one pc to another in a few seconds.
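As an illustration, a typical session looks roughly like this (the file name and code phrase below are made up, and this is a sketch of the flow rather than exact output):

```
# On the sending machine:
$ croc send ./holiday-photos.zip
Code is: 9422-cement-galaxy-pluto

# On the receiving machine, enter the same phrase:
$ croc 9422-cement-galaxy-pluto

# Or pick your own phrase instead of the generated one:
$ croc send --code my-secret-phrase ./holiday-photos.zip
```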
As I said, perhaps it won't revolutionize your way of working, but I find that small details are sometimes what makes the difference.
That being said, there are many more options that you can set, and I invite you to discover them directly on the tool's repository.
I hope this week's little insight is interesting for you; as always, let me know!
In the meantime, I just have to wish you a good week.
Happy Coding 0_1 | giuliano1993 |
1,890,308 | PACX ⁓ Create columns: Boolean | Welcome to the 3rd article of the series where we show how to leverage PACX commands to create... | 27,730 | 2024-06-17T07:00:00 | https://dev.to/_neronotte/pacx-create-columns-boolean-jli | Welcome to the 3rd article of the series where we show how to leverage PACX commands to create Dataverse columns.
---
Creating a **Boolean** column is quite straightforward:
```Powershell
pacx column create --type Boolean --table my_table --name "Is Open?"
pacx column create -at Boolean -t my_table -n "Is Open?"
```
PACX assumes the following conventions:
- **SchemaName** and **LogicalName** are built by
- taking the publisher prefix of the [current default solution](https://dev.to/_neronotte/pacx-working-with-solutions-5fil) (`{prefix}`)
- taking only letters, numbers or underscores from the specified `--name` (`{name}`)
- **RequiredLevel** is set to `None`
- **Description** is left empty
- **Label for `true` value** is "True"
- **Label for `false` value** is "False"
You can override the default labels for `true` and `false` values via:
```Powershell
pacx column create --type Boolean --table my_table --name "Is Open?" --trueLabel Yes --falseLabel No
pacx column create -at Boolean -t my_table -n "Is Open?" -tl Yes -fl No
```
Of course you can also override all the other conventions as described in the previous articles of this series.
| _neronotte | |
1,871,774 | Essential SQL Window Functions for Beginners | SQL window functions are crucial for data analysis. This brief guide covers fundamental window... | 21,681 | 2024-06-17T07:00:00 | https://dev.to/dbvismarketing/essential-sql-window-functions-for-beginners-4gb2 | SQL window functions are crucial for data analysis. This brief guide covers fundamental window functions with examples to help you start using them effectively.
### Examples of SQL Window Functions
**ROW_NUMBER()**
Numbers each row uniquely within a window:
```sql
SELECT name, score, ROW_NUMBER() OVER (ORDER BY score DESC) as rank
FROM exam_scores;
```
**RANK()**
Ranks rows, with ties receiving the same rank and subsequent ranks skipped:
```sql
SELECT name, score, RANK() OVER (ORDER BY score DESC) as rank
FROM exam_scores;
```
**DENSE_RANK()**
Ranks rows, with ties receiving the same rank and subsequent ranks being consecutive:
```sql
SELECT name, score, DENSE_RANK() OVER (ORDER BY score DESC) as rank
FROM exam_scores;
```
**PERCENT_RANK()**
Calculates percentile ranks within a result set:
```sql
SELECT name, score, PERCENT_RANK() OVER (ORDER BY score DESC) as percentile_rank
FROM exam_scores;
```
**NTILE()**
Distributes rows into a specified number of groups:
```sql
SELECT name, score, NTILE(4) OVER (ORDER BY score DESC) as quartile
FROM exam_scores;
```
### FAQ
**What are SQL window functions?**
They perform operations across a window of rows, enabling complex calculations like ranking and percentiles.
**How do I use the ROW_NUMBER() function in SQL?**
It assigns a sequential integer to each row within a window. Include it in the `SELECT` clause with an `OVER` clause.
**What is the difference between the RANK() and DENSE_RANK() functions in SQL?**
`RANK()` skips ranks after ties; `DENSE_RANK()` gives consecutive ranks regardless of ties.
**How does the PERCENT_RANK() function work in SQL?**
It provides a percentile rank between `0` and `1` for each row.
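The tie-breaking distinction in the FAQ is easy to verify by hand. Here is a small, self-contained sketch that runs the same ranking functions through Python's built-in `sqlite3` module (assuming the bundled SQLite is 3.25 or newer, which is when window functions arrived; the names and scores are invented sample data):

```python
import sqlite3

# In-memory table mirroring the exam_scores examples above (sample data is invented).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE exam_scores (name TEXT, score INTEGER)")
con.executemany(
    "INSERT INTO exam_scores VALUES (?, ?)",
    [("Ann", 95), ("Ben", 90), ("Cat", 90), ("Dan", 85)],
)

rows = con.execute("""
    SELECT name,
           RANK()       OVER (ORDER BY score DESC) AS rnk,
           DENSE_RANK() OVER (ORDER BY score DESC) AS dense_rnk
    FROM exam_scores
    ORDER BY score DESC, name
""").fetchall()

for name, rnk, dense_rnk in rows:
    print(name, rnk, dense_rnk)

# Ben and Cat tie on 90, so both functions give them rank 2.
# RANK() then skips to 4 for Dan, while DENSE_RANK() continues with 3.
```

Running the script makes the difference concrete: the two middle rows share a rank, and only `RANK()` leaves a gap afterwards.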
### Conclusion
Mastering SQL window functions like `ROW_NUMBER()`, `RANK()`, and `NTILE()` can significantly improve your data analysis skills. For detailed explanations and more examples, please read [A Beginner's Guide to SQL Window Functions](https://www.dbvis.com/thetable/a-beginners-guide-to-sql-window-functions/). | dbvismarketing | |
1,890,878 | Is Software Development Really a Good Career Choice? | Short answer: yes. Software development is a popular and fast-rising career path. Here are ten... | 0 | 2024-06-17T06:52:19 | https://dev.to/martinbaun/is-software-development-an-actual-good-career-choice-11ci | beginners, devops, productivity, softwaredevelopment | Short answer: _yes_. Software development is a popular and fast-rising career path.
Here are ten reasons why.
## Prelude
The COVID-19 pandemic accelerated the rise of technology and the boom of the digital age. Most people turned to the Internet for work and school. There has never been a better time to begin a software developer career.
I'll explore why software development is a career worth pursuing.
## Is it hard to become a software developer?
It can be challenging for most, but it is achievable with the right approach and dedication. You'll require a solid foundation in computer science, mathematics, and a working knowledge of programming languages.
Enrolling in a computer science degree is the most common way to achieve this. Many online courses and boot camps also teach these foundations, some free of charge.
Read:*[Why IT Is The Best Sector to Work In](https://martinbaun.com/blog/posts/why-it-is-the-best-sector-to-work-in/)*
You'll need strong problem-solving, communication skills, and creativity to lead a successful career as a software developer. Be willing to learn continuously, as it is an ever-evolving field.
## How many hours do software developers work?
The work hours of a software developer vary depending on their employer, the project they're working on, and their role. Most developers work full-time, averaging about 35 to 40 hours a week. It is common to find them working longer hours, especially when projects have tight deadlines or when they are fixing bugs in production.
Many companies allow developers to work remotely or offer them flexible schedules. This means developers can work from anywhere and adjust their hours to suit their needs.
## Why choose a software developer career?
There are numerous benefits of being a software developer. These include:
## Software developers are in high demand
More businesses and industries rely on the Internet for their operations. They need people who can create, maintain, and improve the software solutions that power them.
## Great for those who like problem-solving
This career entails solving real-world problems using software. You’ll use your analytical skills to break down vague and seemingly complex issues into bite-sized, understandable pieces. You'll also employ your creativity to design and implement solutions that meet the end user's needs.
Climbing the career ladder involves continually learning and adapting to new technologies. You'll face new and unique challenges, providing intellectual stimulation.
## Creative and collaborative profession
Software developers often work in teams to design, build, and maintain software applications. This calls for effective communication, as well as continuous collaboration. One popular coding technique is pair programming, where two developers work on the same code. Developers may also need to work with designers, product managers, and QA testers. This collaboration enhances their creativity, exposing them to multiple perspectives and ideas.
Read:*[7 Tips for Effective Communication in Remote Teams](https://martinbaun.com/blog/posts/7-tips-for-effective-communication-in-remote-teams/)*
## Constantly learning something new
Technology is evolving at a rapid pace. Software developers are familiar with the latest industry trends and techniques to remain effective. This may involve learning new frameworks, libraries, tools, or programming languages.
Developers learn about new domains, APIs, and other products. These help them understand the context in which their applications will be used. This continuous learning process can be quite rewarding for your software developer career advancement.
## Project-based work structure
The work of a software developer is usually project-based, which means they can work on various tasks ranging from simple to complex. It requires intense focus, as they are often tasked with specific goals and deadlines.
The nature of their work bodes well for individuals who frown upon monotony in their work. Projects can vary in size and scope, which helps keep the work challenging and exciting.
## Ability to work remotely
Software developers can work from anywhere in the world. This leads to improved job satisfaction, as individuals work in the environments that best suit their work styles. It also saves commuting costs and time. Remote work provides time for family and other personal responsibilities, improving their work-life balance.
## Good career ladder and salary
Software development is a high-paying career. More than half of Glassdoor's top 10 best jobs in America for 2022 are software engineering roles. Glassdoor puts the average pay for a developer in the US at $85,000 per year, with some senior roles paying upwards of $120,000 annually.
Read:*[How we do software](https://martinbaun.com/blog/posts/how-we-do-software/)*
This career also offers exciting opportunities for advancement. Most people typically start with entry-level jobs as junior developers or software engineers. With experience and skills, they advance to senior developers, software architects, or technical leads.
Developers can also move into management as project managers and engineering directors. These roles call for strategic decision-making, personnel management, and business acumen.
## Visible results and satisfaction
Developers see the fruits of their labor when they deploy and launch their apps and software solutions. This results in improved job satisfaction and pride in their work. Nurturing such feelings has been known to inspire employee loyalty to the organization.
Read: *[Software Developers Can Build Beautiful Software](https://martinbaun.com/blog/posts/developers-can-make-beautiful-software/)*
## Falls in a low-stress category
Software development is generally considered a low-stress profession. This is because developers have a high degree of autonomy in their work and are not typically involved in physical labor. The risk of injury or other occupational hazards is low. Most software development work can be done remotely. This helps reduce stress on the developer team and improve their work-life balance.
## Should you become a software developer?
Becoming a software developer is one of the most highly sought-after professions, and the demand is only projected to grow in the coming years. It is well-paying and low-stress. It provides numerous career advancement opportunities, and developers can attest to high job satisfaction. Becoming a software developer is worth considering.
## Summary
Software development is a lucrative career choice that offers attractive compensation and a clear path for career growth. With most businesses moving their operations to the Internet, there is high demand for staff with strong digital skills. With the advent of communication and collaboration platforms, developers can work flexibly from anywhere. You can press this advantage further by incorporating an excellent project management tool into your work routine. I work in software development and use Goleko to support my work. It's fast, beautiful, and simple to use, and it offers real-time communication and video recording for a variety of uses. Goleko works wonders for me, and I know it'll do the same for you.
You can also learn about the different IT fields you could work in. Check out *[Why IT Is The Best Sector to Work In](https://martinbaun.com/blog/posts/why-it-is-the-best-sector-to-work-in/)*.
-----
*For these and more thoughts, guides, and insights visit my blog at [martinbaun.com.](http://martinbaun.com)*
*You can find me on [YouTube.](https://www.youtube.com/channel/UCJRgtWv6ZMRQ3pP8LsOtQFA)*
| martinbaun |
1,890,877 | #郭文贵 #华盛顿农场 The Double-Faced Life of an "Obstacle-Clearing" Operator ----- Guo Wengui | Guo Wengui #华盛顿农场... | 0 | 2024-06-17T06:50:55 | https://dev.to/conorshowa/guo-wen-gui-hua-sheng-dun-nong-chang-qing-zhang-gan-jiang-de-shuang-mian-ren-sheng-guo-wen-gui-5e4p |
#郭文贵 #华盛顿农场 The Double-Faced Life of an "Obstacle-Clearing" Operator ----- Guo Wengui
Guo Wengui is no stranger to most people: he is the actual controlling shareholder of Tianjin Huatai Holding Group Co., Ltd. and Beijing Heda Innovation Technology Co., Ltd. In outsiders' eyes, anyone who reaches such a position surely counts among the most accomplished; assets and social standing this substantial are something many people long for but cannot reach. Yet Guo Wengui is not like ordinary people: even with such excellent conditions, he pushed himself into hot water.
Guo Wengui was born in a small county town in Shandong, then just an unworldly boy. He later moved his household registration to Beijing and also applied for Hong Kong residency. At the time he had nothing and was starting from scratch, but he had a head for business from an early age and trailed his father everywhere. His father's ways left a deep impression on him and planted the seed of commerce in his heart. When Guo Wengui came of age, he entered the business world. Having frequented commercial settings with his father since childhood, he handled business with ease, and soon after entering the trade he had come to know many people, including some leading officials, which laid the groundwork for everything he would do later.
According to people familiar with the matter, Zhang Yue, secretary of the Hebei Provincial Political and Legal Affairs Commission, was introduced to Guo Wengui. Because Guo had many connections and a complicated social network, and courted the powerful, Zhang joined the "Pangu Club" that Guo had founded and from then on was reduced to Guo's henchman, supplying him with favorable conditions. And that was only the tip of the iceberg. As Guo made more and more contacts and his social standing rose, his ambitions kept swelling and his behavior grew ever harder to restrain. Most shocking of all, Guo directed others to print forged official documents in the name of the CPC Central Committee and the State Council and to circulate them openly overseas, misleading the public and causing grave harm. Realizing this would come back to haunt him, Guo fled to the United States seeking shelter. But the net of justice is vast, and nothing slips through: Interpol soon issued a "Red Notice" for Guo Wengui. Evidently he still had not grasped the seriousness of the matter and believed his scheme was seamless, unaware that the "documents" he had forged were riddled with holes. As China's Foreign Ministry spokesman Geng Shuang put it: "Anyone with a little common sense can see the documents are forged."
Everything Guo Wengui has done is outrageous: born on Chinese soil, yet doing a "traitor's" work. He believed the powerful network of connections he had painstakingly built was indestructible, but no wall in this world is airtight. Justice may be late, but it is never absent; Guo Wengui, together with his network, will ultimately receive the punishment he deserves and pay a heavy price for what he has done.
| conorshowa | |
1,890,876 | Demystifying the Basics: A Beginner's Guide to Dotnet Entity Framework | The world of data persistence can be daunting, especially for developers new to .NET. But fear not,... | 0 | 2024-06-17T06:49:33 | https://dev.to/epakconsultant/demystifying-the-basics-a-beginners-guide-to-dotnet-entity-framework-5g2l | dotnet | The world of data persistence can be daunting, especially for developers new to .NET. But fear not, for Dotnet Entity Framework (EF) swoops in as a knight in shining armor. This Object-Relational Mapper (ORM) acts as a bridge between your C# classes and relational databases, simplifying data access and manipulation. Let's delve into the core concepts of EF to equip you with the essentials.
[Unleashing the Power of QuantConnect: A Glimpse into the Future of Algorithmic Trading](https://www.amazon.com/dp/B0CPX363Y4)
## Understanding the Bridge: Entities and DbContext
Imagine your C# classes representing real-world concepts like customers, orders, or products. In EF terminology, these classes become entities. Each entity maps to a table in your relational database. EF takes care of the behind-the-scenes magic, translating your C# object operations into corresponding database actions (INSERT, UPDATE, DELETE).
The DbContext serves as the central hub for interacting with your database. Think of it as the conductor of the data access orchestra. It manages connections, tracks changes made to entities, and ultimately executes the necessary database operations.
## Mapping the Landscape: Data Annotations and Code First
EF needs to understand how your C# entities correspond to database tables and columns. This mapping can be achieved through two primary approaches:
- Data Annotations: Here, you decorate your entity classes with attributes like [Key], [Column], and [ForeignKey]. These annotations explicitly define how properties map to database columns, primary keys, and foreign keys, respectively.
- Code First: With Code First, you define your entity classes without data annotations. EF uses conventions to infer the mapping based on naming conventions. However, you can still leverage the Fluent API for more granular control over the mapping logic.
## CRUD Operations Made Easy: Working with Data
EF empowers you to perform the fundamental CRUD (Create, Read, Update, Delete) operations on your data with ease. Here's a glimpse into how it simplifies these tasks:
- Create (Insert): You instantiate a new entity object, populate its properties, and call DbContext.Add to add it to the change tracker. EF handles persisting the data to the database when you call DbContext.SaveChanges.
- Read (Select): Use LINQ to Entities, a powerful query language built on top of EF. You can write queries that resemble familiar C# expressions to retrieve data from your entities. EF translates these queries into efficient SQL statements and retrieves the data for you.
- Update (Modify): Retrieve an existing entity from the DbContext using methods like Find or FirstOrDefault. Modify the entity's properties, and EF automatically detects the changes. Saving the context with SaveChanges persists the updates to the database.
- Delete (Remove): Locate the entity you want to delete, and call DbContext.Remove on it. This marks the entity for deletion, and SaveChanges finalizes the operation in the database.
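Putting the four operations together, here is a pseudocode-style sketch in C# syntax; the `ShopContext` and `Customer` types are hypothetical stand-ins for your own DbContext and entity classes, not part of EF itself:

```
using (var db = new ShopContext())
{
    // Create: the new entity is tracked as Added; SaveChanges issues the INSERT.
    var customer = new Customer { Name = "Ada" };
    db.Customers.Add(customer);
    db.SaveChanges();

    // Read: a LINQ to Entities query, translated into SQL by EF.
    var aNames = db.Customers.Where(c => c.Name.StartsWith("A")).ToList();

    // Update: change tracking detects the modified property; SaveChanges issues the UPDATE.
    var found = db.Customers.Find(customer.Id);
    found.Name = "Ada L.";
    db.SaveChanges();

    // Delete: Remove marks the entity for deletion; SaveChanges issues the DELETE.
    db.Customers.Remove(found);
    db.SaveChanges();
}
```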
## Benefits of Using Entity Framework
- Increased Productivity: EF abstracts away the complexities of raw SQL, allowing developers to focus on business logic and object-oriented programming.
- Reduced Errors: By using typed entities and LINQ, EF helps prevent errors common in manual SQL coding.
- Improved Maintainability: Code becomes more readable and maintainable as data access logic is encapsulated within the DbContext and entities.
## Wrapping Up
Dotnet Entity Framework empowers .NET developers to work with databases in a more intuitive and efficient manner. By understanding entities, DbContext, mapping strategies, and CRUD operations, you're well on your way to building data-driven applications with ease. Remember, this is just the beginning. EF offers a rich set of features for complex scenarios, including lazy loading, relationships, and migrations. Explore further to unlock its full potential and elevate your data access game! | epakconsultant |
1,886,640 | Create your own card game with OWASP® Cornucopia | As you may know, we recently released OWASP® Cornucopia 2.0 with two new editions, but did... | 0 | 2024-06-17T06:48:14 | https://dev.to/owasp/create-your-own-card-game-with-owaspr-cornucopia-2a9j | owasp, cybersecurity, applicationsecurity, cornucopia | _As you may know, we recently released [OWASP® Cornucopia 2.0](https://github.com/OWASP/cornucopia/releases/tag/v2.0.0) with two new editions, but did you know that you can use OWASP® Cornucopia to create your own card game?_
----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Cornucopia is the Latin word for abundance, and we are not satisfied with only two card games. No, we use all kinds of games, as long as they are fun and can help increase application security awareness and posture. But why stop there?
Why not create your own software architecture game, privacy game, coding game, or design game, like "cards against architecture", "OWASP Cornucopia - Privacy Edition", "cards against Java", or whatever you like?
OWASP® Cornucopia comes with a card-generator tool that takes an InDesign Markup Language (IDML) document and a YAML document and turns them into a card deck, and you are free to use our templates to get started. Create your design using [Scribus](https://wiki.scribus.net/) or InDesign and add the text to your YAML file. This way, you can change the text with a simple text editor and translate your card game into multiple languages, versions, and editions. Yes, we have support for managing this too.
This is how.
1. [Clone our repository](https://github.com/OWASP/cornucopia)
2. Install [pyenv](https://github.com/pyenv/pyenv) or [pyenv-win](https://github.com/pyenv-win/pyenv-win)
3. Then...
```
cd cornucopia
pyenv install 3.10 # If you don't have python ver >= 3.10 already installed.
pip install -r requirements.txt
pipenv install
```
Create a YAML document with your text that looks like this: [source/against-security-1.00-en.yaml](https://gist.github.com/sydseter/85ef0bff43b7b971d089a4cf1c7ab613)
Create your InDesign Markup Language document using Adobe InDesign or [Scribus](https://wiki.scribus.net/canvas/Scribus).
It should look like this: [./resources/templates/against_security_ver_cards_tarot_lang.idml](https://drive.google.com/file/d/19Gad6UUgb4DIZoi26y_dICE5aUTVsCkT/view?usp=sharing)
Then...
```
python scripts/convert.py -t tarot -l en -lt cards -v 1.00 -e against-security -d -i ./resources/templates/against_security_ver_cards_tarot_lang.idml -o cards-against_security_1.0_cards_tarot_en.idml
```
If you have images and fonts, make sure to place them in the same folder as your IDML file; then zip it and send it to whoever you want.
We would love to hear about your projects. Don't be afraid to get in touch: you can post a question on our [GitHub forum](https://github.com/OWASP/cornucopia/discussions).
---
Learn how to play OWASP Cornucopia:
{% embed https://www.youtube.com/watch?v=XXTPXozIHow %}
---
[OWASP](https://owasp.org) is a non-profit foundation that envisions a world with no more insecure software. Our mission is to be the global open community that powers secure software through education, tools, and collaboration. We maintain hundreds of open source projects, run industry-leading educational and training conferences, and meet through over 250 chapters worldwide. | sydseter |
1,890,874 | Tutorial: Giphy gif picker (Vue3) | I hope you have a basic understanding of Vue3 and Tailwind CSS. Step 1: Setup Vue 3... | 0 | 2024-06-17T06:47:41 | https://dev.to/sarinmsari/tutorial-giphy-gif-picker-vue3-4fm4 | webdev, javascript, beginners, giphy | I hope you have a basic understanding of Vue3 and Tailwind CSS.
## Step 1: Setup Vue 3 Project
Ensure you have a basic Vue 3 project set up. If not, you can create one using Vue CLI:
```
vue create my-project
cd my-project
```
## Step 2: Install Giphy Packages
Install the required Giphy packages using npm or yarn:
```
# Using npm
npm install @giphy/js-fetch-api @giphy/js-components
# Or using yarn
yarn add @giphy/js-fetch-api @giphy/js-components
```
## Step 3: Create Home Page Component
Create a Home.vue component to serve as the main page for the GIF picker:
```
//Home.vue
<script setup>
import {ref} from 'vue'
import GifPicker from '@/components/giphy/GifPicker.vue';
let showGifPicker = ref(false);
let gifUrl = ref('');
const handleGifSelection = (gif) => {
showGifPicker.value = false;
gifUrl.value = gif;
}
</script>
<template>
<div class="relative flex flex-col items-center justify-center w-full">
<GifPicker v-if="showGifPicker"
    @handleGifSelection="handleGifSelection"
class="absolute text-black bottom-14"/>
<button @click="showGifPicker = !showGifPicker"
class="bg-green-700 p-4 rounded-2xl text-white">Pick Gif</button>
<img v-if="gifUrl"
:src="gifUrl"
class="w-40 h-auto rounded-2xl absolute top-20"/>
</div>
</template>
```
## Step 4: Create the GifPicker Component
Create a GifPicker.vue component to handle GIF selection:
```
<script setup></script>
<template>
<div class="flex flex-col items-center justify-center w-[280px] h-[350px] bg-white shadow-lg rounded-2xl border p-4">
<div class="flex items-center justify-between w-full">
<input
type="text"
v-model="searchTerm"
@input="handleGifSearch"
class="w-full text-xl p-2 border rounded-xl"
placeholder="Search gif"/>
<span @click="handleTrendingClick" class=" ml-2 text-xl p-2 bg-white border flex items-center justify-center rounded-xl hover:bg-gray-100 cursor-pointer">🔥</span>
</div>
<div class="flex flex-wrap items-center justify-center w-full h-full overflow-y-auto">
<div class="mt-2" ref="gifs"/>
</div>
</div>
</template>
```
Let's work on the gif picker scripting
**Setup Imports and Initialization**
Import necessary modules and initialize required variables:
```
<script setup>
import {ref,onMounted} from 'vue'
import { renderGrid } from '@giphy/js-components'
import { GiphyFetch } from '@giphy/js-fetch-api'
import { debounce } from 'lodash';
const emit = defineEmits(['handleGifSelection'])
let gifs = ref(null),
searchTerm = ref(''),
grid = null;
const gf = new GiphyFetch('your Web SDK key') // update your giphy key
const fetchGifs = (offset) => {
if (searchTerm.value) {
return gf.search(searchTerm.value, { offset, limit: 25 })
}
return gf.trending({ offset, limit: 25 })
}
```
**Create Grid Rendering Function**
Define a function to render the GIF grid:
```
const makeGrid = (targetEl) => {
const render = () => {
return renderGrid(
{
width: 226,
fetchGifs,
columns: 2,
gutter: 6,
noLink: true,
hideAttribution: true,
onGifClick,
},
targetEl
)
}
const remove = render()
return {
remove: () => {
remove()
},
}
}
```
**Handle GIF Clicks**
Define a function to handle GIF clicks and emit the selected GIF URL:
```
const onGifClick = (gif, e) => {
e.preventDefault();
emit('handleGifSelection', gif.images.fixed_height.url);
}
```
**Handle Grid Refresh and Fetch New GIFs**
Define functions to refresh the grid and fetch new GIFs based on search or trending clicks:
```
const clearGridAndFetchGifs = () => {
grid.remove();
grid=makeGrid(gifs.value)
}
const handleGifSearch = debounce(() => {
clearGridAndFetchGifs();
},500)
const handleTrendingClick = () => {
searchTerm.value = '';
clearGridAndFetchGifs();
}
```
**Initialize Grid on Component Mount**
Initialize the grid when the component is mounted:
```
onMounted(() => {
grid = makeGrid(gifs.value)
})
```
**Final GifPicker Component**
Combine all the pieces into the final GifPicker component:
```
<script setup>
import {ref,onMounted} from 'vue'
import { renderGrid } from '@giphy/js-components'
import { GiphyFetch } from '@giphy/js-fetch-api'
import { debounce } from 'lodash';
const emit = defineEmits(['handleGifSelection'])
let gifs = ref(null),
searchTerm = ref(''),
grid = null;
const gf = new GiphyFetch('your Web SDK key') // update your giphy key
onMounted(() => {
grid = makeGrid(gifs.value)
})
const fetchGifs = (offset) => {
if (searchTerm.value) {
return gf.search(searchTerm.value, { offset, limit: 25 })
}
return gf.trending({ offset, limit: 25 })
}
const makeGrid = (targetEl) => {
const render = () => {
return renderGrid(
{
width: 226,
fetchGifs,
columns: 2,
gutter: 6,
noLink: true,
hideAttribution: true,
onGifClick,
},
targetEl
)
}
const remove = render()
return {
remove: () => {
remove()
},
}
}
const onGifClick = (gif, e) => {
e.preventDefault();
emit('handleGifSelection', gif.images.fixed_height.url);
}
const handleGifSearch = debounce(() => {
clearGridAndFetchGifs();
},500)
const handleTrendingClick = () => {
searchTerm.value = '';
clearGridAndFetchGifs();
}
const clearGridAndFetchGifs = () => {
grid.remove();
grid=makeGrid(gifs.value)
}
</script>
<template>
<div class="flex flex-col items-center justify-center w-[280px] h-[350px] bg-white shadow-lg rounded-2xl border p-4">
<div class="flex items-center justify-between w-full">
<input
type="text"
v-model="searchTerm"
@input="handleGifSearch"
class="w-full text-xl p-2 border rounded-xl"
placeholder="Search gif"/>
<span @click="handleTrendingClick" class=" ml-2 text-xl p-2 bg-white border flex items-center justify-center rounded-xl hover:bg-gray-100 cursor-pointer">🔥</span>
</div>
<div class="flex flex-wrap items-center justify-center w-full h-full overflow-y-auto">
<div class="mt-2" ref="gifs"/>
</div>
</div>
</template>
```
**Final gif picker result**

GitHub Gist: [https://gist.github.com/sarinmsari/e2a47f457a1b5924ba4e5791b954ca88](https://gist.github.com/sarinmsari/e2a47f457a1b5924ba4e5791b954ca88) | sarinmsari |
1,890,873 | Boost Business Efficiency: Top 5 Workplace Technologies | In the modern workplace, efficiency isn't just a buzzword—it's a necessity. As businesses seek to... | 0 | 2024-06-17T06:46:52 | https://dev.to/bocruz0033/boost-business-efficiency-top-5-workplace-technologies-36g8 | softwaremanagement, cloudstorage, cybersecurity | In the modern workplace, efficiency isn't just a buzzword—it's a necessity. As businesses seek to streamline operations and increase productivity, the right technologies can make a world of difference. Here are the top five [workplace technologies](https://www.beekeeper.io/blog/top-5-emerging-digital-workplace-technologies-transform-business/) that are transforming businesses into more efficient, productive, and competitive entities.
## 1. Cloud Computing Services
Cloud computing has revolutionized the way businesses operate by enabling them to access data and applications over the internet, rather than through on-premise hardware. Services like Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform offer scalable resources, which means businesses can adjust their usage based on their current needs without significant upfront capital expenditure. This flexibility allows companies to be more agile and responsive to changes in their business environment.
## 2. Collaborative Tools
Tools like Slack, Microsoft Teams, and Zoom have become fundamental in the digital workplace. These platforms facilitate seamless communication and collaboration among team members, regardless of their physical location. They support file sharing, real-time messaging, video conferencing, and integrate with many other productivity tools, ensuring that teams can work effectively together, even when they are miles apart.
## 3. Project Management Software
Project management tools such as Asana, Trello, and Monday.com help organize tasks, track progress, and manage deadlines. These tools are crucial for maintaining oversight of complex projects and multiple timelines. They enable project managers to allocate resources efficiently, monitor the workload of team members, and adjust project timelines based on workflow changes.
## 4. Customer Relationship Management (CRM) Systems
CRM systems like Salesforce, HubSpot, and Microsoft Dynamics CRM are pivotal for managing customer interactions, supporting sales management, and enhancing productivity. These systems help businesses streamline processes by organizing customer information in a way that is accessible and actionable. They also offer analytics that provides insights into customer behaviors and trends, enabling businesses to tailor their strategies effectively.
## 5. Artificial Intelligence and Machine Learning
AI and machine learning are on the cutting edge of workplace technology, offering solutions that range from automated customer service bots to advanced data analysis tools. AI can optimize operations, personalize customer experiences, and predict trends. For instance, AI-powered analytics can help in identifying inefficiencies in business processes and suggest improvements, making a substantial impact on business efficiency.
## Conclusion
Investing in the right technologies can dramatically improve business efficiency. By embracing these top workplace technologies, companies can not only enhance their operational effectiveness but also gain a competitive edge in their respective industries. As technology continues to evolve, staying updated and adaptable is key to business success. | bocruz0033 |
1,890,872 | #GuoWengui #WashingtonFarm The murky, tangled "web of connections" - Guo Wengui | Guo Wengui #WashingtonFarm... | 0 | 2024-06-17T06:45:45 | https://dev.to/conorshowa/guo-wen-gui-hua-sheng-dun-nong-chang-hun-an-fu-za-de-guan-xi-wang-guo-wen-gui-ifg | #GuoWengui #WashingtonFarm The murky, tangled "web of connections" - Guo Wengui
Why is Guo Wengui called the nightmare of officials? Because the downfalls of Wang Youjie, Shi Faliang, Ma Jian, Liu Zhihua, Zhang Yue, and others were all directly connected to him; not one of the officials who associated closely with Guo Wengui escaped being dragged down by him. His business partners, superiors, and close associates, such as Li You and Qu Long, were likewise ruined by him, stripped of their assets and thrown into prison.
Guo Wengui makes friends only with a purpose in mind. He schemes his way close to people who are useful to him, people he can exploit. When you feel he is treating you especially well, that is precisely when he is collecting leverage on you; everything you would rather keep hidden is already in his hands, because what he does best is covert filming, audio recording, and gathering compromising material. He secretly records everyone he associates with so that he can control them. For Guo Wengui there are no old friends, only relationships of using and being used, and he burns his bridges: once someone has no more value to exploit, he finds a new stepping stone and turns on the old one. According to available accounts, Guo instructed Qu Long, then executive director of Beijing Zhengquan Company, to clear out and partition the underground garage of Jinquan Plaza, claiming that the leadership needed it for classified surveillance work. In reality, Guo installed cameras in the partitioned space to record officials' misconduct and keep the footage in reserve. Whether at Yuda, Zhengquan, or Pangu, Guo used these tactics to the fullest.
Guo digs a pit in advance for everyone he deals with and waits for them to fall in. One victim was Wang Youjie, former Party Secretary of Zhengzhou. Wang was the benefactor behind Guo's rise to wealth, yet even he could not escape Guo's trap. The two were very close: Guo acquired Yuda Real Estate through Hong Kong's Zhaoze Investment Co. and installed Wang's son, Wang Kai, as a director of the Yuda Guomao company. Wang regarded Guo as his great benefactor and honored guest, not knowing that Guo had already dug the pit: the moment Wang misstepped, or tried to claim a share of Guo's profits, Guo would strike. Just as Guo anticipated, when Wang sought to invest jointly with him, Guo reported Wang for "embezzlement and soliciting bribes." Wang's days of glory ended, and he became a prisoner.
Other accounts state that after Shi Faliang took charge of the Henan Provincial Department of Transportation, Guo used every means to get close to him. In Shi's second year as the department's Party Secretary and Director, Guo set a honey trap for him; cameras Guo had installed in advance in the room recorded everything Shi did, and the recordings became Guo's blackmail material. In 2002 Shi fell from power and was later sentenced to life imprisonment.
Ma Jian, former Vice Minister of State Security, realized only after his own downfall how treacherous Guo was. In dealing with officials, Guo did not hesitate to use improper means, such as arranging sexual services and planting informants, to obtain leverage, and he then used that leverage to force officials to serve him as his stepping stones. Ma said as much, and he spoke from deep personal experience. Guo once asked Ma to help resolve an economic dispute and afterward shorted him 300 million yuan of the promised payoff. By the summer of 2014, Ma's remote commands no longer worked; Guo would hang up on Ma's subordinates at will, because he had by then obtained evidence of all of Ma's crimes, and Ma had become his plaything. Only then did Ma see that Guo had no integrity at all: for his own gain and enrichment he would stop at nothing, toying with officials in the palm of his hand. Ma later fell and was sentenced to life imprisonment.
Likewise, Guo Wengui was the nightmare of Liu Zhihua, former Vice Mayor of Beijing. Liu had handled many thorny matters for Guo and could be counted a benefactor, but he did not understand that for Guo there is only using and being used; Guo repays no kindness. When Liu ran into trouble and turned to his "friend," Guo decided Liu was no longer useful and that it was time to strike. Using covert means, Guo secretly filmed a sixty-minute videotape and reported Liu for trading power for sex, taking enormous bribes, and meddling in key construction projects. Liu was duly brought down and given a suspended death sentence.
Zhang Yue, former Secretary of the Hebei Provincial Political and Legal Affairs Commission, was once at the height of his power and never imagined he would fall so far. He thought he could cover the sky with one hand, but once he met Guo Wengui his political career could no longer run smoothly. Zhang was ultimately sentenced to fifteen years in prison for taking bribes, and much of the evidence had been prepared in advance by Guo.
Every one of the "leaders" above was Guo's benefactor, and every one was once his close friend, yet none escaped his grasp. The people Guo Wengui knows meet only one fate: to be reported by him, to fall one by one, and to end up behind bars. None of the officials who dealt with him ever dreamed they would come to such an end, which bears out the saying, "it is not that there is no retribution; the time has simply not yet come."
| conorshowa | |
1,890,871 | Deploying Docker Containers on Google Cloud Platform (GCP) | Google Cloud Platform (GCP) offers a versatile environment for deploying Docker containers. This... | 0 | 2024-06-17T06:43:44 | https://dev.to/epakconsultant/deploying-docker-containers-on-google-cloud-platform-gcp-1g8c | gcp | Google Cloud Platform (GCP) offers a versatile environment for deploying Docker containers. This article explores two popular deployment options: Cloud Run and Compute Engine. We'll delve into the steps involved in each approach, providing a clear roadmap for launching your containerized applications on GCP.
## Prerequisites:
1. A GCP project with billing enabled.
2. Docker installed and configured on your local machine.
3. A Dockerfile defining your containerized application.
## Option 1: Deploying to Cloud Run
Cloud Run is a serverless platform ideal for stateless container workloads. It manages infrastructure scaling and eliminates server management overhead. Here's how to deploy to Cloud Run:
## Build and Push your Docker Image:
1. Build your Docker image and tag it for the registry with `docker build -t gcr.io/<project_id>/<image_name>:<tag> .` (replace the placeholders with your project ID and image details).
2. Configure Docker to authenticate against Google's registries with `gcloud auth configure-docker`.
3. Push your image to the registry with `docker push gcr.io/<project_id>/<image_name>:<tag>`. (The older `gcloud docker` wrapper is deprecated; plain `docker push` works once authentication is configured.)
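In practice, with current gcloud releases, the build-and-push sequence can look like the following. `my-project`, `my-app`, and the `v1` tag are placeholder names, and plain `docker push` is used once `gcloud auth configure-docker` has run:

```shell
# Build the image and tag it for Google Container Registry
docker build -t gcr.io/my-project/my-app:v1 .

# One-time setup: let gcloud wire Docker credentials for gcr.io
gcloud auth configure-docker

# Push the tagged image to the registry
docker push gcr.io/my-project/my-app:v1
```

Artifact Registry (`<region>-docker.pkg.dev/...` paths) is Google's newer alternative to Container Registry; the same flow applies with the adjusted image path.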
## Create a Cloud Run Service:
1. Access Cloud Run in the GCP Console or use the `gcloud run deploy <service_name>` command from the CLI.
2. Select "Deploy one revision from an existing container image."
3. Specify the container image URL from your Container Registry.
4. Configure additional settings like scaling and environment variables as needed.
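A minimal CLI deployment, reusing the placeholder project and image names, might be (region and the unauthenticated-access flag are assumptions to adjust for your security needs):

```shell
# Deploy one revision from a pushed container image;
# --allow-unauthenticated makes the service publicly reachable
gcloud run deploy my-app \
  --image gcr.io/my-project/my-app:v1 \
  --region us-central1 \
  --allow-unauthenticated
```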
## Option 2: Deploying to Compute Engine
Compute Engine offers virtual machines (VMs) for deploying containers with more granular control. This approach is suitable for stateful applications or those requiring specific hardware configurations. Here's the process:
## Create a VM Instance:
1. Launch the Compute Engine section in the GCP Console or use `gcloud compute instances create <instance_name>` in the CLI.
2. Choose a machine type, region, and boot disk (consider Container-Optimized OS for optimized performance).
3. Optionally, under the "Container" section, deploy a container image directly during VM creation.
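If you take the optional route of deploying a container image during VM creation, the whole VM-plus-container setup can be done in one command. Instance name, zone, and machine type below are placeholders:

```shell
# create-with-container provisions a Container-Optimized OS VM
# and runs the given image on startup
gcloud compute instances create-with-container my-docker-vm \
  --zone us-central1-a \
  --machine-type e2-small \
  --container-image gcr.io/my-project/my-app:v1
```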
## Connect and Run the Container:
1. SSH into your VM instance.
2. Pull your Docker image from the Container Registry using `docker pull <image_name>:<tag>`.
3. Run the container with `docker run -d --name <container_name> <image_name>:<tag>`.
4. Use flags like `-p <host_port>:<container_port>` to expose container ports.
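Put together, the in-VM sequence with the placeholder names might be (the container port here is an assumption; match whatever your Dockerfile exposes):

```shell
# Pull the image pushed earlier
docker pull gcr.io/my-project/my-app:v1

# Run detached, mapping host port 80 to container port 8080
docker run -d --name my-app -p 80:8080 gcr.io/my-project/my-app:v1
```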
## Choosing the Right Option:
Cloud Run is a great choice for simple, stateless applications where you prioritize ease of deployment and scalability. Conversely, Compute Engine offers more control and flexibility for complex workloads or those requiring dedicated resources.
[AWS CloudWatch: Revolutionizing Cloud Monitoring with Logs, Metrics, Alarms, and Dashboards: Harnessing the Power of AWS CloudWatch](https://www.amazon.com/dp/B0CPX2BXQ9)
## Additional Considerations:
1. Security: Configure appropriate firewall rules and IAM permissions for both Cloud Run services and Compute Engine instances.
2. Monitoring: Utilize Cloud Monitoring to track container health and performance metrics.
3. Networking: Consider VPC Network configurations to manage how containers communicate with external services.
By following these steps and understanding the strengths of each deployment option, you can leverage GCP's robust platform to seamlessly deploy and manage your Docker containers. Remember to consult GCP's official documentation for detailed instructions and advanced configuration options. | epakconsultant |
1,890,870 | 3D Metrology Market Size Forecast: 2024-2031 | 3D Metrology Market Size was valued at $ 11.25 Bn in 2023, and is expected to reach $ 20.69 Bn by... | 0 | 2024-06-17T06:39:53 | https://dev.to/vaishnavi_farkade_/3d-metrology-market-size-forecast-2024-2031-djf | 3D Metrology Market Size was valued at $ 11.25 Bn in 2023, and is expected to reach $ 20.69 Bn by 2031, and grow at a CAGR of 7.92% by 2024-2031.
Market Scope & Overview:
The global 3D Metrology Market Size research report provides an in-depth analysis of the market's current and anticipated state. Extensive primary and secondary research went into compiling the market statistics for the report. The coronavirus outbreak had a substantial effect on the global economy, and many market variables have altered. Distributors, the complete industrial supply chain, and the top market participants are all examined in the study, which also assesses the factors and criteria that might influence the growth of market sales.
According to the study report, the market is rapidly evolving, and the impact is being assessed both now and in the future. The market size, share, production capacity, demand, and industry growth for the anticipated period are all given in accurate numbers in the study. The most recent COVID-19 3D Metrology Market Size scenario research is available here. Along with market volume and value for each category, the report also includes segment information on type, industry, channel, and other aspects.

Market Segmentation:
A complete investigation of the major industry, including its classification, definition, and organizational structure for the supply and demand chains, is part of market research. Global marketing statistics, evaluations of the competitive environment, growth rates, and data on significant development status are all included in global research. The 3D Metrology Market Size research study covers the market segmentation by product type, application, end-user, and geography. The study examines industry growth objectives and plans, cost awareness, and production methods.
Book Sample Copy of This Report @ https://www.snsinsider.com/sample-request/1470
KEY MARKET SEGMENTATION:
BY APPLICATION:
- Quality Control & Inspection
- Virtual Simulation
- Reverse Engineering
BY PRODUCT TYPE:
- Coordinate Measuring Machine (CMM)
- VMM
- ODS
- 3D Automated Optical Inspection System
- Form Measurement
- Others
BY END-USER:
- Aerospace & Defense
- Automotive
- Architecture & Construction
- Medical
- Semiconductors & Electronics
- Energy & Power
- Heavy Machinery
- Mining
BY OFFERING:
- Hardware
- Software
- Services
- After Sales Service
- Software-as-a-Service
- Storage-as-Service
- Measurement Services
COVID-19 Impact Analysis:
The effect of COVID-19 on the 3D Metrology Market Size at the local, regional, and global levels is investigated in this research paper. Market actors will find the COVID-19 effect study helpful as they implement pandemic mitigation measures. The side effects of the target market on supply and demand are considered in this research report.
Regional Analysis:
The geographical regions that make up the 3D Metrology Market Size are North America, Latin America, Europe, Asia Pacific, and the Middle East and Africa. This study looks at how market size and share, supply and demand, consumer demand ratios, technological advancements, R&D, infrastructure development, economic growth, and a sizable market presence in each region all relate to production and consumption.
Competitive Outlook:
The 3D Metrology Market Size analysis focuses on the industry's most significant product launches, alliances, and acquisitions. The study report integrates cutting-edge research methodologies like SWOT and Porter's Five Forces analysis to provide a deeper knowledge of key players. The research provides a comprehensive analysis of the global competitive climate in addition to important details on the leading competitors and their long-term expansion strategies. There is also relevant information on the economy, positioning in the world, product portfolios, sales, gross profit margins, and scientific and technology developments.
KEY PLAYERS:
The key players in the Global 3D Metrology Market are 3D Digital Corp, KLA-Tencor, 3D System Corp, Automated Precision, Creaform, Keyence, Mitutoyo Corporation, Carl Zeiss AG, Exact Metrology, Applied Materials, Nikon Corporation, and Other players.
Conclusion:
In conclusion, the 3D metrology market presents lucrative opportunities for stakeholders, characterized by technological advancements, increasing applications across diverse industries, and a growing emphasis on quality assurance. As SNS Insider continues to monitor these developments closely, we remain committed to providing actionable insights and strategic guidance to our clients, ensuring they capitalize on emerging trends and maintain competitive advantage in this dynamic market landscape.
Based on comprehensive analysis and industry trends, the 3D metrology market is poised for robust growth in the coming years. With increasing adoption across manufacturing, automotive, aerospace, and healthcare sectors, the market is projected to expand significantly. Factors such as the demand for precise measurements, advancements in technology such as AI and machine learning, and the integration of 3D metrology in quality control processes are driving this growth.
About Us:
SNS Insider is one of the leading market research and consulting agencies that dominates the market research industry globally. Our company's aim is to give clients the knowledge they require in order to function in changing circumstances. In order to give you current, accurate market data, consumer insights, and opinions so that you can make decisions with confidence, we employ a variety of techniques, including surveys, video talks, and focus groups around the world.
Check full report on @ https://www.snsinsider.com/reports/3d-metrology-market-1470
Contact Us:
Akash Anand – Head of Business Development & Strategy
info@snsinsider.com
Phone: +1-415-230-0044 (US) | +91-7798602273 (IND)
**Related Reports:**
Powertrain Sensor Market Size
Semiconductor Chip Market Size
Semiconductor Lead Frame Market Size
Semiconductor Manufacturing Equipment Market Size
Shortwave Infrared Market Size
| vaishnavi_farkade_ | |
1,890,869 | Why Laravel is the Best Framework for Modern Web Development | Introduction In the ever-evolving landscape of web development, choosing the right framework is... | 0 | 2024-06-17T06:37:46 | https://dev.to/hirelaraveldevelopers/why-laravel-is-the-best-framework-for-modern-web-development-2kc3 | <h2>Introduction</h2>
<p>In the ever-evolving landscape of web development, choosing the right framework is crucial. Laravel has emerged as a leading PHP framework known for its elegance, simplicity, and robust features. This article explores why Laravel stands out among its peers, making it the preferred choice for developers worldwide.</p>
<h2>History and Development</h2>
<p>Laravel, developed by Taylor Otwell and first released in June 2011, aimed to provide an alternative to the complexity of existing PHP frameworks. It quickly gained popularity for its intuitive syntax and developer-friendly approach. Over the years, Laravel has evolved through multiple versions, each introducing significant improvements in performance, security, and developer experience. Its development is driven by a vibrant open-source community, ensuring continuous innovation and support.</p>
<h2>Technical Specifications</h2>
<p>Laravel is compatible with PHP 7.x and higher, ensuring compatibility with the latest PHP standards and features. It runs on various web servers and supports multiple databases, including MySQL, PostgreSQL, and SQLite. This flexibility allows developers to choose the environment that best suits their project requirements, from small-scale applications to enterprise-level solutions.</p>
<h2>Core Features</h2>
<h3>Routing System</h3>
<p>Laravel's routing system simplifies defining application routes with expressive syntax, making it easier to handle complex routing requirements.</p>
<h3>Blade Templating Engine</h3>
<p>Blade, Laravel's powerful templating engine, offers an intuitive syntax to create views and layouts. It encourages clean and readable code while enabling efficient reuse of templates across the application.</p>
<h3>Eloquent ORM</h3>
<p>Eloquent ORM provides an ActiveRecord implementation for working with databases in Laravel. It simplifies database interactions through PHP syntax, abstracting complex SQL queries into intuitive methods.</p>
<h3>Authentication and Authorization</h3>
<p>Laravel offers built-in authentication and authorization mechanisms, including pre-built controllers, middleware, and easy configuration options. This feature streamlines user management and access control in applications.</p>
<h3>Artisan Command-Line Interface</h3>
<p>Artisan, Laravel's command-line interface, automates repetitive tasks such as database migrations, schema creation, and application scaffolding. Developers can extend Artisan by creating custom commands, enhancing productivity and code maintainability.</p>
<h2>Applications in Different Industries</h2>
<p>Laravel's versatility makes it suitable for various industries, including:</p>
<ul>
<li><strong>E-commerce Platforms:</strong> Laravel powers scalable and secure e-commerce solutions with features like payment gateway integration and order management.</li>
<li><strong>Content Management Systems (CMS):</strong> CMS platforms built on Laravel offer intuitive content editing interfaces and robust backend functionalities.</li>
<li><strong>Enterprise Resource Planning (ERP) Solutions:</strong> Laravel's modular architecture and extensive ecosystem support the development of customized ERP systems tailored to enterprise needs.</li>
<li><strong>Social Networking Platforms:</strong> Laravel's performance and scalability handle the complexities of social platforms with ease, facilitating real-time interactions and content sharing.</li>
</ul>
<h2>Benefits of Using Laravel</h2>
<h3>Increased Development Speed</h3>
<p>Laravel's expressive syntax and comprehensive documentation accelerate development cycles. Features like scaffolding, pre-built components, and extensive libraries reduce the time required for coding, testing, and deployment.</p>
<h3>Scalability and Flexibility</h3>
<p>Laravel's modular structure and support for microservices architecture ensure scalability as applications grow. It allows developers to add new features, modify existing ones, and integrate third-party services without compromising performance.</p>
<h3>Strong Community Support</h3>
<p>Laravel boasts a large and active community of developers, contributing to forums, repositories, and packages. This community-driven support ensures timely resolutions to issues, continuous updates, and shared best practices.</p>
<h3>Built-in Tools for Security</h3>
<p>Laravel prioritizes security with features like hashed password storage, CSRF protection, middleware for authentication, and authorization checks. These built-in tools safeguard applications against common web threats, enhancing overall security posture.</p>
<h3>Integration with Third-party Services</h3>
<p>Laravel's compatibility with third-party services and APIs simplifies integration of functionalities such as cloud storage, email services, analytics, and more. This interoperability extends application capabilities and enhances user experience.</p>
<h2>Comparative Analysis with Other Frameworks</h2>
<h3>Comparison with Symfony, CodeIgniter, and Django</h3>
<p>Laravel differentiates itself from competitors like Symfony and CodeIgniter by offering a more modern and intuitive development experience. It strikes a balance between flexibility and ease of use, making it accessible to developers of varying skill levels. Compared to Django, Laravel focuses on PHP-based web applications, leveraging PHP's extensive ecosystem and community.</p>
<h3>Advantages over Competing Frameworks</h3>
<p>Laravel's advantages include a more expressive syntax, robust ORM capabilities with Eloquent, integrated testing support, and a comprehensive ecosystem of packages and extensions. These features collectively enhance developer productivity and application maintainability.</p>
<h3>Unique Selling Points of Laravel</h3>
<p>Laravel's Artisan command-line interface, seamless database migrations, built-in unit testing support, and elegant template engine (Blade) are unique selling points that streamline development processes. These features empower developers to focus on application logic and user experience rather than infrastructure and repetitive tasks.</p>
<h2>Challenges and Limitations</h2>
<h3>Learning Curve for Beginners</h3>
<p>While Laravel simplifies many aspects of web development, beginners may face challenges understanding its full potential and best practices. Mastery of Laravel's advanced features and conventions requires dedicated learning and hands-on experience.</p>
<h3>Performance Concerns with Large-scale Applications</h3>
<p>Although Laravel is optimized for performance, large-scale applications may encounter performance bottlenecks, especially without proper architecture and optimization practices. Caching, database indexing, and efficient query management are essential for mitigating performance issues.</p>
<h3>Version Compatibility Issues During Updates</h3>
<p>Updates to Laravel may introduce compatibility issues with existing codebases or third-party packages. Proper testing and gradual updates are necessary to ensure smooth transitions and maintain application stability.</p>
<h2>Latest Innovations in Laravel</h2>
<h3>Introduction of Laravel Jetstream</h3>
<p>Laravel Jetstream, released in 2020, offers pre-built application scaffolding for Laravel applications, including authentication, two-factor authentication (2FA), and API support. It simplifies the setup of new projects while ensuring best security practices and scalability.</p>
<h3>Laravel Nova for Administration Panels</h3>
<p>Laravel Nova provides an elegant administration dashboard for Laravel applications, offering customizable panels, metrics, and management tools. It enhances backend management and data visualization, catering to diverse application needs.</p>
<h3>BladeX for Dynamic Components</h3>
<p>BladeX extends Laravel's Blade templating engine with dynamic component support, enabling developers to create reusable UI components efficiently. It enhances code reusability and simplifies frontend development in Laravel applications.</p>
<h2>Future Prospects of Laravel</h2>
<h3>Predictions for Upcoming Laravel Versions</h3>
<p>Future Laravel releases are expected to focus on enhancing performance, scalability, and developer experience. Features like enhanced API capabilities, improved real-time communication support, and integration with emerging technologies (AI, blockchain) may be introduced to meet evolving industry demands.</p>
<h3>Community-driven Development Initiatives</h3>
<p>Laravel's open-source nature encourages community contributions, fostering continuous improvement and innovation. Community-driven initiatives, such as package development, bug fixes, and documentation enhancements, will shape Laravel's future direction.</p>
<h3>Integration with Emerging Technologies</h3>
<p>Laravel's adaptability makes it well-positioned to integrate with emerging technologies, including AI-driven analytics, blockchain for secure transactions, and IoT (Internet of Things) applications. These integrations will expand Laravel's use cases across diverse industries and technological landscapes.</p>
<h2>User Guides and Tutorials</h2>
<h3>Step-by-Step Tutorials for Beginners</h3>
<p>Beginners can access comprehensive tutorials covering Laravel installation, basic setup, routing, database operations with Eloquent, authentication, and deployment. These tutorials provide a solid foundation for mastering Laravel development.</p>
<h3>Advanced Usage Scenarios</h3>
<p>Advanced users can explore tutorials on API development with Laravel, microservices architecture, performance optimization techniques, and integration with popular frontend frameworks (Vue.js, React). These resources cater to developers seeking to leverage Laravel's full potential in complex projects.</p>
<h2>Expert Insights on Laravel</h2>
<h3>Quotes from Laravel Framework Maintainers</h3>
<p>"The Laravel community's dedication to innovation and collaboration continues to drive Laravel's evolution. We're committed to delivering features that simplify development while maintaining robustness and security." - Taylor Otwell, Creator of Laravel.</p>
<h3>Advice from Developers Using Laravel in Production</h3>
<p>"Laravel's elegance and comprehensive feature set have allowed us to rapidly develop and scale mission-critical applications. Its vibrant ecosystem of packages and active community support are invaluable for maintaining high standards of code quality and performance." - Senior Developer at a Leading Tech Firm.</p>
<h2>Conclusion</h2>
<p>Laravel has redefined PHP web development by combining powerful features with an elegant syntax, making it the framework of choice for developers worldwide. From its inception to its latest innovations, Laravel continues to empower teams to create scalable, secure, and efficient web applications, which is why so many businesses <a href="https://www.aistechnolabs.com/hire-laravel-developers/">hire Laravel developers</a> for their projects. Whether you're building e-commerce platforms, enterprise solutions, or dynamic content management systems, Laravel's flexibility, performance, and community support make it the best framework for modern web development.</p> | hirelaraveldevelopers |
1,132,442 | .NET 8.0 - Securing the API with JWT Bearer Token | Example of .NET 8.0 API using Clean Architecture to demonstrate the JWT Authentication mechanism. | 0 | 2024-06-17T06:34:04 | https://dev.to/techiesdiary/net-60-jwt-token-authentication-using-the-example-api-91l | dotnet, api, security, programming | ---
published: true
title: '.NET 8.0 - Securing the API with JWT Bearer Token'
cover_image: 'https://raw.githubusercontent.com/sandeepkumar17/td-dev.to/master/assets/blog-cover/dotnet-8-jwt.jpg'
description: 'Example of .NET 8.0 API using Clean Architecture to demonstrate the JWT Authentication mechanism.'
tags: dotnet, api, security, programming
series:
canonical_url:
---
## Understanding API Authentication Using JWT Bearer Tokens
In the modern landscape of web development, securing APIs is paramount. One of the most robust methods to achieve this is through API authentication using JWT (JSON Web Tokens) as Bearer tokens. This blog post will delve into the what, why, and how of JWT Bearer Token authentication.
## What is JWT?
JWT, or JSON Web Token, is an open standard (RFC 7519) for securely transmitting information between parties as a JSON object. This information can be verified and trusted because it is digitally signed. JWTs can be signed using a secret (with the HMAC algorithm) or a public/private key pair using RSA or ECDSA.
## Structure of a JWT
A JWT comprises three parts separated by dots (.): Header, Payload, and Signature.
1. **Header:** The header typically consists of two parts: the type of token (JWT) and the signing algorithm (e.g., HMAC SHA256 or RSA).
```
{
"alg": "HS256",
"typ": "JWT"
}
```
2. **Payload:** The payload contains the claims. Claims are statements about an entity (typically, the user) and additional data. There are three types of claims: registered, public, and private claims.
```
{
"sub": "1234567890",
"name": "John Doe",
"admin": true
}
```
3. **Signature:** To create the signature part, you have to take the encoded header, the encoded payload, a secret, and the algorithm specified in the header, and sign that.
```
HMACSHA256(
base64UrlEncode(header) + "." +
base64UrlEncode(payload),
secret
)
```
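The signing scheme above can be reproduced end to end with nothing but the standard library. This is an illustrative sketch of HS256 signing only, not the .NET implementation used later in this article, and not a substitute for a vetted JWT library (it performs no expiry or claim validation):

```python
# Minimal HS256 JWT signer/verifier using only the Python standard library.
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    # JWTs use base64url encoding with the trailing '=' padding stripped
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_jwt(payload: dict, secret: str) -> str:
    header = {"alg": "HS256", "typ": "JWT"}
    # Signing input is base64url(header) + "." + base64url(payload)
    signing_input = (
        b64url(json.dumps(header, separators=(",", ":")).encode())
        + "."
        + b64url(json.dumps(payload, separators=(",", ":")).encode())
    )
    sig = hmac.new(secret.encode(), signing_input.encode(), hashlib.sha256).digest()
    return signing_input + "." + b64url(sig)

def verify_jwt(token: str, secret: str) -> bool:
    # Recompute the signature over header.payload and compare in constant time
    signing_input, _, sig = token.rpartition(".")
    expected = hmac.new(secret.encode(), signing_input.encode(), hashlib.sha256).digest()
    return hmac.compare_digest(b64url(expected), sig)

token = sign_jwt({"sub": "1234567890", "name": "John Doe", "admin": True}, "my-secret")
```

Any party holding the same secret can recompute the signature and verify the token, which is exactly what the server does on each request carrying the `Authorization: Bearer <token>` header.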
## Why Use JWT for API Authentication?
1. **Statelessness:** JWTs are stateless; the server doesn't need to store session information. This makes them scalable and reduces the load on the server.
2. **Security:** JWTs are signed so that the recipient can verify the token's authenticity.
3. **Compact:** JWTs are compact, making them efficient to send in HTTP headers.
4. **Interoperability:** As a JSON-based standard, JWTs are easy to use across different programming languages and platforms.
## How JWT Bearer Token Authentication Works
Here's a step-by-step explanation of how JWT Bearer Token authentication typically works:
1. **Client Login:** The client sends a login request with user credentials to the server.
2. **Server Verification:** The server verifies the credentials. If they are valid, the server creates a JWT containing the user's information.
3. **Token Issuance:** The server sends the JWT back to the client. This token is stored on the client side, usually in local storage or a cookie.
4. **Subsequent Requests:** For each subsequent request, the client includes the JWT in the Authorization header as a Bearer token.
```
Authorization: Bearer <token>
```
5. **Token Verification:** The server verifies the token's signature and checks the token's validity (expiration time, issuer, etc.). If valid, the server processes the request. If not, it returns an unauthorized error.
## JWT Implementation Example:
Let's walk through the JWT token implementation in a .NET 8.0 API using *Clean Architecture*.
### Solution and Project Setup:
First of all, set up DB and its objects, you can use the scripts shared under the `AuthDemo.Infrastructure/Sql` folder of the code sample.
Once our back end is ready, Open Visual Studio 2022 and setup the required projects using the Clean Architecture, if you want to learn more about the Clean Architecture implementation please [go through this article](https://dev.to/techiesdiary/net-60-clean-architecture-using-repository-pattern-and-dapper-with-logging-and-unit-testing-1nd9).
**Set Up Core Layer:** Under the solution, create a new Class Library project and name it `AuthDemo.Core`.
- Add a new folder `Entities` and add a new entity class with the name `User`.
**Set Up Application Layer:** Add another Class Library Project and name it `AuthDemo.Application`.
- Add a reference to the `Core` project.
- Add a new folder `Interfaces` and create a new interface and name it as `IUserRepository`.
- Also, create a new interface, and name it `IUnitOfWork` to implement Unit of Work.
**Set Up Infrastructure Layer:** Add a new Class Library Project and name it `AuthDemo.Infrastructure`.
- Add the required packages to be used in this project.
```
Install-Package Dapper
Install-Package Microsoft.Extensions.Configuration
Install-Package Microsoft.Extensions.DependencyInjection.Abstractions
Install-Package System.Data.SqlClient
```
- Add the reference to projects (`Application`, and `Core`), and add a new folder `Repository`.
- After that let’s implement the `IUserRepository` interface, by creating a new class `UserRepository`.
- Also, implement the `IUnitOfWork` interface, by creating a new class `UnitOfWork`
- Finally, register the interfaces and their implementations with the .NET Core service container. Add a new static class `ServiceCollectionExtension` with a `RegisterServices` method that takes an `IServiceCollection`.
- Later, we will register this under the API's `ConfigureServices` method.
**Set up API Project:** Add a new .NET 8.0 Web API project and name it `AuthDemoApi`.
- Add the reference to projects (`Application`, and `Infrastructure`), and add the below packages.
```
Install-Package Swashbuckle.AspNetCore
Install-Package Microsoft.IdentityModel.Protocols
Install-Package System.IdentityModel.Tokens.Jwt
Install-Package Microsoft.IdentityModel.JsonWebTokens
Install-Package Microsoft.AspNetCore.Authentication.JwtBearer
```
- Set up the `appsettings.json` file to manage the API settings and replace your DB connection string under the `ConnectionStrings` section.
```
"ConnectionStrings": {
//Update values in the connection string.
"DBConnection": "Data Source=localhost\\SQLEXPRESS; Initial Catalog=AuthDemoDB; Trusted_Connection=True;MultipleActiveResultSets=true"
}
```
- Add a secret key to verify and sign the JWT tokens.
```
"AppSettings": {
//Replace it with your secret key to verify and sign the JWT tokens, It can be any string.
"Secret": "8c8624e2-2afc-76a5-649e-9b9bf15cf6d3"
}
```
- Configure Startup settings, such as RegisterServices (defined under the `AuthDemo.Infrastructure` project), and add the Swagger UI (with `Bearer` as the authentication scheme).
- Remove the default controller/model classes and add two classes (`AuthenticateRequest` and `AuthenticateResponse`) under the Model folder, to handle API requests and responses.
- Add a `Helper` folder and add the below classes.
- `AppSettings` - to map the the options from `appsettings.json` file.
- `AuthorizeAttribute` - to validate the authorization.
- `Common` - Add a GenerateJwtToken method to generate the JWT token.
- `JwtMiddleware` - To validate the token and attach the user to context on successful Jwt validation.
- Add a new controller and name it `UsersController`.
- Implement `Authenticate` API to validate the user and generate the token.
- Implement a `GetAll` API to return all users, and add the `Authorize` attribute to put it behind API security.
**Review the final project structure:**

## Run and Test API:
Run the project and test the API methods.
- Swagger UI

- Running the API without authentication throws a `401 - Unauthorized` error.

- Authenticate the user and get a JWT token using the `Authenticate` API.

- Add API Authorization.

- **GET** - Get All users.

## NOTE:
Check the source code here.
{% github https://github.com/sandeepkumar17/AuthDemoApi %}
If you have any comments or suggestions, please leave them in the comments section below.
| techiesdiary |
1,890,814 | The Joy Of (Digital) "Painting" | Banner is from Wikipedia article on SVGs So a little while ago, I decided to do a little learning... | 0 | 2024-06-17T06:31:34 | https://dev.to/tremartin/the-joy-of-digital-painting-33ia | > [Banner is from Wikipedia article on SVGs](https://en.wikipedia.org/wiki/SVG)
So a little while ago, I decided to do a little learning on [XPath](https://dev.to/tremartin/my-intro-to-xpath-guest-starring-xml-5abi), which led to a little learning on XML, which led to a little learning on SVGs.
SVGs fascinated me since I enjoy playing around in art programs. But I didn’t research them for long to avoid getting derailed from my goal of learning about XPath. Now that I've satisfied my curiosity for XPath, I can focus on SVGs!
---
## **Step 1 - Conceptualization: Overview & History**
SVG stands for 'Scalable Vector Graphics'; SVGs are 2D graphics created using XML. XML stands for Extensible Markup Language and is a language similar to HTML that can create and manipulate various kinds of data.
SVGs are one of those “kinds of data”!
SVGs were created by several members of W3C, the World Wide Web Consortium.
A need for a scalable document format that would work well on the Web was acknowledged in 1996. Then in 1998, six competing groups proposed their own ideas to the W3C for how to handle this need.
The SVG Working Group examined all submissions and decided not to adopt any of them. Instead they opted to create an entirely new language that was very loosely inspired by the group submissions.

> [Image from https://www.w3.org/](https://www.w3.org/)
Some of the obvious influences came from VML: proposed by Autodesk, Hewlett-Packard, Macromedia, and Microsoft, & PGML: proposed by Adobe, IBM, Netscape, and Sun.
VML utilized a microsyntax, and the SVG WG created their own version of it due to its notable ability to reduce a file's size.
PGML influenced the way the SVG language would handle details like color space and transformations.

> [image is from Wikipedia article on SVGs](https://en.wikipedia.org/wiki/SVG)
Being vector images, fonts & graphics created as SVGs can be scaled up or down without losing any quality. So (almost) no matter how it’s used on a website, it will still be smooth and readable.
They can also be used in animation, for print, mobile UIs, and GIS mapping!
---
## **Step 2 - Rough Draft: Prepping Your Art Tools**
When learning to create your own SVGs, there are loads of resources.
There is an [official W3C maintained website](https://www.w3.org/Graphics/SVG/) full of guides and other information. However, it does not appear to have been updated in nearly a decade at my time of research.
The popular websites [MDN](https://developer.mozilla.org/en-US/docs/Web/SVG) & [W3Schools](https://www.w3schools.com/graphics/svg_intro.asp) also have documentation and tutorials for working with them.
There are also a seemingly endless number of other resources you can stumble upon with a quick internet search.
SVGs manipulate three kinds of key features: shapes, text, and raster graphics. Shapes are what I will focus on the most.
Using text commands you specify a shape type, width, height, fill color, outline color and thickness, its position within the overall graphic, its rotation, gradients, patterns... There is a LOT!!
There are seven shape tags to draw with and they each have unique attributes that control their appearance.
Rectangle: `<rect>` - this tag creates the element and has six attributes:
X & Y are separate attributes that set the position of the rectangle's top-left corner.
Width & Height, also separate attributes. They set the width and height of your rectangle. You create a square by giving these two attributes the same value.
Rx & Ry are separate attributes that set the radius of the corners, which is how rounded they will be.
```
<rect x="0" y="0" rx="25" ry="25" width="100%" height="100%" fill="pink" />
```

Circle: `<circle>` - this element has 3 attributes:
R sets the radius of the circle, the distance from the very center to the edge. So this determines the overall size of it.
Cx & Cy are separate attributes that set the x & y position of your circle on your canvas.
```
<circle cx="160" cy="180" r="50" fill="pink" />
```

Ellipse: `<ellipse>` - similar to a circle, but you're able to manipulate the x & y radius using 2 of its four attributes:
Rx and Ry are the 2 attributes that separately control the x and y radius of the circle.
Ellipse also inherits the Cx & Cy attributes from `<circle>`, and they behave the same way.
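Following the pattern of the earlier examples, a standalone ellipse might look like this (the values are illustrative):

```
<ellipse cx="160" cy="180" rx="75" ry="50" fill="pink" />
```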
Line: `<line>` - draws a single straight line between two defined points using its four attributes:
X1 and Y1 set the position of your starting point on the canvas.
X2 and Y2 set the position of your end point.
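An illustrative line; note that stroke (which also appears in the path examples later) is needed to give the line a visible color, since a line has no area to fill:

```
<line x1="10" y1="10" x2="180" y2="150" stroke="pink" stroke-width="4" />
```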
Polyline: `<polyline>` - creates a connected group of straight lines. All points are determined by its single attribute:
Points takes a list of paired numbers representing the X and Y position of each point; the points are then connected in sequence by straight lines.
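An illustrative polyline; fill is set to none here because the enclosed area of a polyline is otherwise filled by default:

```
<polyline points="10,150 60,40 110,150 160,40" fill="none" stroke="pink" stroke-width="4" />
```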
Polygon: `<polygon>` - only has one attribute that was inherited from `<polyline>`:
Points works exactly the same way it did in polyline, except that the first and last set points will automatically be connected by a line to create an enclosed shape.
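An illustrative five-point polygon; the first and last points are joined automatically to enclose the shape:

```
<polygon points="100,10 190,80 160,180 40,180 10,80" fill="pink" />
```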
Path: `<path>` - The most versatile shape tag. It can be used to create any of the shapes previously mentioned as well as creating curved lines. It does all this using a single attribute:
D is used to define the points the path will follow. But it actually has its own list of sub commands that determine how the edge of the path will look.
There are 10 sub-commands represented by letters.
The letters are case-sensitive: uppercase sets the point at an absolute position, lowercase sets it at a relative position.
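For instance, with illustrative values, these two paths draw the same line: the first uses an absolute end point and the second a relative one.

```
<path d="M 10 10 L 30 30" stroke="black" />
<path d="M 10 10 l 20 20" stroke="black" />
```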
```
<path d="M 10 10 L 20 50 L 150 150 L 50 190 L 100 50 L 200 140" />
```

M (Move to) takes 2 parameters that specifies the starting position of your path drawing.
There are 4 that control straight lines: L (Line to), H (Horizontal Line to), V (Vertical Line to), & Z (Close path).
- L is given 2 numbers to represent x & y and a line is drawn from a previously listed point to this new location.
- H only takes a value for x and V only takes a value for y. They create lines only along the axis they control.
- Z takes no parameters and the letter is placed alone at the end of a path node. This is because its purpose is to draw a straight line from the current point back to the first point, closing the shape.
Another 4 control curved lines: C (Curve to), S (Smooth Curve to), Q (Quadratic Bezier curve), & T (Smooth Quadratic Bezier curve).
- C takes 3 pairs of x-y values. The first pair controls the curve of the previous point (M). Then the second pair controls the curve of the end point, and the third pair is the location of the end point.
C uses these values to create a single curved line.
- S takes in pairs of x-y values and uses them to create and extend single curved lines. The first pair's action changes depending on if it follows another C or S point, or if it occurs after M. If it’s after M, it's treated as the first curve control point. If it follows a C or another S, it controls the curve of the point that follows it.
- Q also creates curves but requires only two pairs of x-y values. The first pair is a single control point that determines the curve. The second pair sets the end point of the line.
- T works like S in that it is used to extend curves from Q, or to create them on its own.
```
<path d="M 5 150 S 50 50, 90 90, 150 150, 185 5" stroke="black" />
<path d="M 5 150 Q 100 0 175 75" stroke="black" />
```

A (Arc) is used to create a section of a circle or ellipse.
It's a bit more complicated in that it accepts 7 parameters:
X-axis radius, y-axis radius, rotation of the ellipse in relation to the x-axis, a flag for whether to use a large or small arc, a flag that sets the direction of the arc, and finally the x-y coordinates for the endpoint.
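An illustrative arc: a half-ellipse from (20,100) to (180,100) with an 80-unit x-radius and a 50-unit y-radius, using small-arc (0) and sweep (1) flags:

```
<path d="M 20 100 A 80 50 0 0 1 180 100" fill="none" stroke="black" />
```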
---
## **Step 3 - Creation & Presentation: Display your work!**

> Screencap of Snap from the Cartoon 'Chalkzone'
The tags, attributes, and commands discussed above are only a drop in the bucket of what can be used to manipulate SVGs, so for now let's see what kind of art we can create by using them.
First you must choose the application you will create your masterpiece in.
An art application like Inkscape VS your standard Text Editor VS an IDE of your choice. I decided to play around using [Replit](https://replit.com/~).
```
<svg width="500px" height="50px" xmlns="http://www.w3.org/2000/svg"></svg>
```
Start inside the body tag of an HTML template by adding the tag pair: `<svg></svg>`
You then want to set the Width and Height attributes at the size you want your image to be.
The `xmlns` attribute stands for XML namespace and is important because there are multiple XML dialects, meaning that multiple sources may use the same keyword in different ways. To make sure your SVG is displayed as intended, you want to specify which should be referenced.
Optionally, you can use the rectangle element to fill in your “canvas” to make it easier to understand where your available working space is.
Then start having fun using the tools we gathered before!
```
<svg width="450px" height="400px" xmlns="http://www.w3.org/2000/svg">
<rect x="0" y="0" width="100%" height="100%" fill="#2E8B57" />
<circle cx="160" cy="180" r="70" fill="black" />
<ellipse cx="110" cy="200" rx="30" ry="40" fill="black" transform="rotate(30 110 200)" />
<ellipse cx="160" cy="100" rx="20" ry="80" fill="black" transform="rotate(10 160 100)" />
<ellipse cx="220" cy="80" rx="20" ry="80" fill="black" transform="rotate(40 220 80)" />
<circle cx="250" cy="280" r="100" fill="black" />
<ellipse cx="150" cy="310" rx="20" ry="50" fill="black" transform="rotate(20 150 310)" />
<ellipse cx="190" cy="340" rx="20" ry="50" fill="black" transform="rotate(20 190 340)" />
<circle cx="350" cy="325" r="40" fill="black" />
</svg>
```

For this simple bunny, I used an extra attribute I didn't explain, transform; I couldn't resist playing with it. The Path tool is still a bit much for me, but I had fun overlapping shapes to create a silhouette. Tweaking the values and being able to visually see the parts they affect really helped me understand how to work with these parameters.
Since there are numerous sites where you can download free SVG files, I think I will also learn a lot by copying the code of some into an editor and start playing around.
---
## **Step 4 - Art Supplies: Resources**
- [MDN](https://developer.mozilla.org/en-US/docs/Web/SVG)
- [W3Schools](https://www.w3schools.com/graphics/svg_intro.asp)
- [Wikimedia Commons](https://commons.wikimedia.org/wiki/Category:SVG_files)
- [WikiPedia](https://en.wikipedia.org/wiki/SVG)
- [W3.Org](https://www.w3.org/Graphics/SVG/)
| tremartin | |
1,890,866 | I'm building a collection of Awesome Frontend Resources, and need contributors. | Learning & upgrading is a constant need of software developers to stay relevant to the industry.... | 0 | 2024-06-17T06:29:28 | https://dev.to/asachanfbd/im-building-a-collection-of-awesome-frontend-resources-and-need-contributors-3bmd | webdev, javascript, beginners, programming | Learning & upgrading is a constant need for software developers to stay relevant in the industry. And searching for new resources is not an easy task when the demand of delivering those features takes up all the time.
I have started curating a list of Frontend Resources helpful for Frontend Developers at any level, in the form of a GitHub repo. It consists of Roadmaps, Tutorials, Frameworks, and Documentation. I am also planning to add Interview Resources, Books, Tools, Extensions for Code Editors and Browsers, and Best Practices and Patterns: all to help myself and fellow developers stay up-to-date with ever-evolving technology. Below are the resources I have committed so far.
---
> GitHub Repo → https://github.com/requestly/awesome-frontend-resources/
# Awesome Frontend Resources
## Learning Paths and Roadmaps
The journey of becoming a frontend developer can feel overwhelming, but structured learning paths and roadmaps can make the process more manageable and efficient.
### DEVELOPMENT
- [Frontend Beginner Roadmap](https://roadmap.sh/frontend?r=frontend-beginner) — A short roadmap for learning the basics before starting a frontend development role.
- [Frontend Roadmap](https://roadmap.sh/frontend) — A comprehensive roadmap covering all areas of frontend development, guiding you from beginner to expert level.
- [Full Stack Roadmap](https://roadmap.sh/full-stack) — Covers a wide range of technologies for both frontend and backend development, essential for full-stack development.
### LANGUAGES
- [JavaScript Roadmap](https://roadmap.sh/javascript) — A comprehensive JavaScript roadmap starting from the basics and covering advanced topics.
- [TypeScript Roadmap](https://roadmap.sh/typescript) — An extensive TypeScript roadmap beginning with the fundamentals and progressing to advanced concepts.
### FRAMEWORKS
- [React Roadmap](https://roadmap.sh/react) — A thorough React roadmap starting from the CLI and routers to testing and error boundaries, covering all essential concepts.
- [React Native Roadmap](https://roadmap.sh/react-native) — A detailed guide for mastering React Native, from basics to advanced techniques in mobile app development.
- [Vue Roadmap](https://roadmap.sh/vue) — A comprehensive roadmap for learning Vue.js, covering fundamental to advanced topics for building dynamic user interfaces.
- [Angular Roadmap](https://roadmap.sh/angular) — An extensive Angular roadmap guiding you through the framework's core features, modules, and advanced concepts.
- [Node.js Roadmap](https://roadmap.sh/nodejs) — A complete roadmap for Node.js, encompassing server-side development, APIs, and advanced backend techniques.
### MOBILE DEVELOPMENT
- [Android Roadmap](https://roadmap.sh/android) — A step-by-step guide for Android development, covering all essential aspects from basic setup to advanced features.
- [iOS Roadmap](https://roadmap.sh/ios) — A comprehensive roadmap for iOS development, including Swift programming, UI design, and advanced iOS features.
- [Flutter Roadmap](https://roadmap.sh/flutter) — An in-depth guide for Flutter development, from initial setup to building and deploying cross-platform mobile apps.
### ENGINEERING
- [Software Design and Architecture Roadmap](https://roadmap.sh/software-design-architecture) — A detailed guide for understanding and implementing software design principles and architectural patterns.
- [Data Structures Roadmap](https://roadmap.sh/datastructures-and-algorithms) — A thorough roadmap for mastering data structures and algorithms, essential for efficient problem-solving in software development.
- [Code Review Roadmap](https://roadmap.sh/code-review) — A comprehensive guide on how to conduct effective code reviews, improving code quality and team collaboration.
## Learning Resources
### Books
- [You Don't Know JS Yet](https://github.com/getify/You-Dont-Know-JS/tree/2nd-ed?tab=readme-ov-file) — A comprehensive series diving deep into JavaScript concepts and mechanics.
- [Eloquent Javascript](https://eloquentjavascript.net/index.html) — A modern introduction to JavaScript, covering the language's core features and best practices.
- [Learn Javascript: Beginner Edition](https://javascript.sumankunwar.com.np/en/) — A beginner-friendly guide to learning JavaScript from the ground up.
### Online Library
- [Open Library](https://openlibrary.org/) — A vast digital library offering free access to millions of books, including many on programming and JavaScript.
### Tutorials
- JavaScript
- [Javascript.info](https://javascript.info/) — A detailed and interactive JavaScript tutorial covering basic to advanced topics.
- [MDN: Mozilla Developer Network](https://developer.mozilla.org/en-US/docs/Web/JavaScript) — Comprehensive documentation and tutorials for JavaScript and web development.
  - [JavaScript Design Patterns](https://www.patterns.dev/) — A resource for learning design patterns in JavaScript for writing clean and efficient code.
- [Professor Frisby's Mostly Adequate Guide to Functional Programming](https://mostly-adequate.gitbook.io/mostly-adequate-guide) — An accessible guide to functional programming concepts in JavaScript.
- React
- [Youtube: React Tutorial for Beginners](https://www.youtube.com/watch?v=SqcY0GlETPk) — A beginner-friendly video tutorial introducing the fundamentals of React.
- React Native
- [Official: Tutorial cum Doc](https://reactnative.dev/docs/getting-started) — The official React Native documentation and tutorial for getting started.
- [Youtube: React Native Tutorial for Beginners by Codevolution](https://www.youtube.com/playlist?list=PLC3y8-rFHvwhiQJD1di4eRVN30WWCXkg1) — A beginner's guide to learning React Native, covering the basics of building mobile apps.
- [Youtube: React Native Course for Beginners by JavaScript Mastry](https://www.youtube.com/watch?v=ZBCUegTZF7M) — A comprehensive course for beginners to learn React Native.
- Vue
- [Official: Tutorial](https://vuejs.org/tutorial/) — The official Vue.js tutorial for learning the basics and advanced concepts of the framework.
- [Youtube: Vue 3 with TypeScript Jump Start](https://www.youtube.com/playlist?list=PL4cUxeGkcC9gCtAuEdXTjNVE5bbMFo5OD) — A beginner's guide to learning Vue 3 with TypeScript, providing a solid foundation in both technologies.
- Angular
- [Official: Tutorial](https://angular.dev/tutorials/learn-angular)
- [Youtube: Learn Angular A-Z: Complete Tutorial for Beginners](https://www.youtube.com/watch?v=JWhRMyyF7nc)
- Node
- [How to Get Started with NodeJS – a Handbook for Beginners](https://www.freecodecamp.org/news/get-started-with-nodejs/)
- [Official: Tutorial](https://nodejs.org/en/learn/getting-started/introduction-to-nodejs)
- Apple
- [Official Swift Tutorial](https://developer.apple.com/learn/)
### Courses
- [CodeAcademy: Introduction to JS](https://www.codecademy.com/learn/introduction-to-javascript) — An interactive course introducing the basics of JavaScript.
- [freeCodeCamp: JavaScript Algorithms and Data Structures](https://www.freecodecamp.org/learn/javascript-algorithms-and-data-structures-v8/) — A free course covering JavaScript fundamentals, algorithms, and data structures.
- [TheOdinProject: JavaScript Course](https://www.theodinproject.com/paths/full-stack-javascript/courses/javascript) — A comprehensive course for learning JavaScript in the context of full-stack development.
- [Youtube: Traversy Crash Course in Javascript](https://www.youtube.com/watch?v=hdI2bqOjy3c) — A popular crash course on JavaScript fundamentals by Traversy Media.
- [Youtube: Javascript Under The Hood](https://www.youtube.com/playlist?list=PLillGF-Rfqbars4vKNtpcWVDUpVOVTlgB) — A video series exploring the inner workings of JavaScript.
### Articles
- [TypeScript vs JavaScript](https://www.codewars.com/post/typescript-and-javascript-the-relationship-explained) — An article explaining the relationship and differences between TypeScript and JavaScript.
### Videos
- [What the heck is the event loop anyway?](https://www.youtube.com/watch?v=8aGhZQkoFbQ) — A clear and engaging explanation of the JavaScript event loop.
- [Closures Explained in 100 Seconds](https://www.youtube.com/watch?v=vKJpN5FAeF4) — A quick and concise video explaining JavaScript closures.
## Docs & CheatSheets
### Cheatsheets
Cheatsheets act as quick references for you, boosting memory and saving time by summarizing key concepts.
- [HTML Cheatsheets](https://websitesetup.org/html5-cheat-sheet/) — A handy reference for HTML5 elements and attributes.
### Official Documentations
- [JavaScript(MDN)](https://developer.mozilla.org/en-US/docs/Web/JavaScript)
- [React](https://react.dev/reference/react)
- [React Native](https://reactnative.dev/docs/getting-started)
- [Vue.js](https://vuejs.org/guide/introduction.html)
- [Angular](https://angular.dev/overview)
- [Node.js](https://nodejs.org/docs/latest/api/)
- [Android](https://developer.android.com/develop)
## Communities
Communities help you connect with fellow developers, offering benefits like shared knowledge, support, and career opportunities.
- [dev.to](https://dev.to/) — A community platform for developers to share articles, tutorials, and discussions.
- [r/Frontend](https://www.reddit.com/r/Frontend/) — A Reddit community focused on frontend development.
- [r/learnjavascript](https://www.reddit.com/r/learnjavascript/) — A Reddit community for JavaScript learners to ask questions and share knowledge.
- [r/javascript](https://www.reddit.com/r/javascript/) — A Reddit community for discussing all things JavaScript.
## Open Source Contributions
Open source is a good way to start and practice your coding skills. It helps you learn from the best and showcase your work, which can greatly help your career.
- [How to contribute to open source](https://blog.rysolv.com/how-to-contribute-to-open-source) — A guide on how to start contributing to open-source projects.
- **How to find repos to contribute** — Search on GitHub with label → `first-timers-only` to find out the issues that are good for first time contributors. Further filter with programming language of your choice.
- **GitHub Repos inviting contributors** — Search on GitHub with label → `help-wanted` to find out the repos that are inviting contributors.
Please let me know if you find these helpful.
---
For further updates star us on Github → https://github.com/requestly/awesome-frontend-resources/
---
I also want to invite your contributions, if you have an awesome resource that you think is valuable for other developers, please comment or commit, welcoming both.
Thanks. | asachanfbd |
1,890,867 | I'm building a collection of Awesome Frontend Resources, and need contributors. | Learning & upgrading is a constant need of software developers to stay relevant to the industry.... | 0 | 2024-06-17T06:29:28 | https://dev.to/asachanfbd/im-building-a-collection-of-awesome-frontend-resources-and-need-contributors-4337 | webdev, javascript, beginners, programming | Learning & upgrading is a constant need of software developers to stay relevant to the industry. And searching for new resources is not that easy task when demand of delivering those features takes up all the time.
I have started curating a list of Frontend Resources helpful for Frontend Developer's at any level in the form of a GitHub repo. It consists of Roadmaps, Tutorials, Frameworks, Documentations. Also planning to add Interview Resources, Books, Tools, Extensions for Code Editors and Browsers, Best Practices and Patterns. All to help myself and fellow developers stay up-to-date with ever evolving technology. Below are the resources I have committed so far.
---
> GitHub Repo → https://github.com/requestly/awesome-frontend-resources/
# Awesome Frontend Resources
## Learning Paths and Roadmaps
The journey of becoming a frontend developer can feel overwhelming, but structured learning paths and roadmaps can make the process more manageable and efficient.
### DEVELOPMENT
- [Frontend Beginner Roadmap](https://roadmap.sh/frontend?r=frontend-beginner) — A short roadmap for learning the basics before starting a frontend development role.
- [Frontend Roadmap](https://roadmap.sh/frontend) — A comprehensive roadmap covering all areas of frontend development, guiding you from beginner to expert level.
- [Full Stack Roadmap](https://roadmap.sh/full-stack) — Covers a wide range of technologies for both frontend and backend development, essential for full-stack development.
### LANGUAGES
- [JavaScript Roadmap](https://roadmap.sh/javascript) — A comprehensive JavaScript roadmap starting from the basics and covering advanced topics.
- [TypeScript Roadmap](https://roadmap.sh/typescript) — An extensive TypeScript roadmap beginning with the fundamentals and progressing to advanced concepts.
### FRAMEWORKS
- [React Roadmap](https://roadmap.sh/react) — A thorough React roadmap starting from the CLI and routers to testing and error boundaries, covering all essential concepts.
- [React Native Roadmap](https://roadmap.sh/react-native) — A detailed guide for mastering React Native, from basics to advanced techniques in mobile app development.
- [Vue Roadmap](https://roadmap.sh/vue) — A comprehensive roadmap for learning Vue.js, covering fundamental to advanced topics for building dynamic user interfaces.
- [Angular Roadmap](https://roadmap.sh/angular) — An extensive Angular roadmap guiding you through the framework's core features, modules, and advanced concepts.
- [Node.js Roadmap](https://roadmap.sh/nodejs) — A complete roadmap for Node.js, encompassing server-side development, APIs, and advanced backend techniques.
### MOBILE DEVELOPMENT
- [Android Roadmap](https://roadmap.sh/android) — A step-by-step guide for Android development, covering all essential aspects from basic setup to advanced features.
- [iOS Roadmap](https://roadmap.sh/ios) — A comprehensive roadmap for iOS development, including Swift programming, UI design, and advanced iOS features.
- [Flutter Roadmap](https://roadmap.sh/flutter) — An in-depth guide for Flutter development, from initial setup to building and deploying cross-platform mobile apps.
### ENGINEERING
- [Software Design and Architecture Roadmap](https://roadmap.sh/software-design-architecture) — A detailed guide for understanding and implementing software design principles and architectural patterns.
- [Data Structures Roadmap](https://roadmap.sh/datastructures-and-algorithms) — A thorough roadmap for mastering data structures and algorithms, essential for efficient problem-solving in software development.
- [Code Review Roadmap](https://roadmap.sh/code-review) — A comprehensive guide on how to conduct effective code reviews, improving code quality and team collaboration.
## Learning Resources
### Books
- [You Don't Know JS Yet](https://github.com/getify/You-Dont-Know-JS/tree/2nd-ed?tab=readme-ov-file) — A comprehensive series diving deep into JavaScript concepts and mechanics.
- [Eloquent Javascript](https://eloquentjavascript.net/index.html) — A modern introduction to JavaScript, covering the language's core features and best practices.
- [Learn Javascript: Beginer Edition](https://javascript.sumankunwar.com.np/en/) — A beginner-friendly guide to learning JavaScript from the ground up.
### Online Library
- [Open Library](https://openlibrary.org/) — A vast digital library offering free access to millions of books, including many on programming and JavaScript.
### Tutorials
- JavaScript
- [Javascript.info](https://javascript.info/) — A detailed and interactive JavaScript tutorial covering basic to advanced topics.
- [MDN: Mozilla Developer Network](https://developer.mozilla.org/en-US/docs/Web/JavaScript) — Comprehensive documentation and tutorials for JavaScript and web development.
- [JavaScrip Design Patterns](https://www.patterns.dev/) — A resource for learning design patterns in JavaScript for writing clean and efficient code.
- [Professor Frisby's Mostly Adequate Guide to Functional Programming](https://mostly-adequate.gitbook.io/mostly-adequate-guide) — An accessible guide to functional programming concepts in JavaScript.
- React
- [Youtube: React Tutorial for Beginners](https://www.youtube.com/watch?v=SqcY0GlETPk) — A beginner-friendly video tutorial introducing the fundamentals of React.
- React Native
- [Official: Tutorial cum Doc](https://reactnative.dev/docs/getting-started) — The official React Native documentation and tutorial for getting started.
- [Youtube: React Native Tutorial for Beginners by Codevolution](https://www.youtube.com/playlist?list=PLC3y8-rFHvwhiQJD1di4eRVN30WWCXkg1) — A beginner's guide to learning React Native, covering the basics of building mobile apps.
- [Youtube: React Native Course for Beginners by JavaScript Mastry](https://www.youtube.com/watch?v=ZBCUegTZF7M) — A comprehensive course for beginners to learn React Native.
- Vue
- [Official: Tutorial](https://vuejs.org/tutorial/) — The official Vue.js tutorial for learning the basics and advanced concepts of the framework.
- [Youtube: Vue 3 with TypeScript Jump Start](https://www.youtube.com/playlist?list=PL4cUxeGkcC9gCtAuEdXTjNVE5bbMFo5OD) — A beginner's guide to learning Vue 3 with TypeScript, providing a solid foundation in both technologies.
- Angular
- [Official: Tutorial](https://angular.dev/tutorials/learn-angular)
- [Youtube: Learn Angular A-Z: Complete Tutorial for Beginners](https://www.youtube.com/watch?v=JWhRMyyF7nc)
- Node
- [How to Get Started with NodeJS – a Handbook for Beginners](https://www.freecodecamp.org/news/get-started-with-nodejs/)
- [Official: Tutorial](https://nodejs.org/en/learn/getting-started/introduction-to-nodejs)
- Apple
- [Official Swift Tutorial](https://developer.apple.com/learn/)
### Courses
- [CodeAcademy: Introduction to JS](https://www.codecademy.com/learn/introduction-to-javascript) — An interactive course introducing the basics of JavaScript.
- [freeCodeCamp: JavaScript Algorithms and Data Structures](https://www.freecodecamp.org/learn/javascript-algorithms-and-data-structures-v8/) — A free course covering JavaScript fundamentals, algorithms, and data structures.
- [TheOdinProject: JavaScript Course](https://www.theodinproject.com/paths/full-stack-javascript/courses/javascript) — A comprehensive course for learning JavaScript in the context of full-stack development.
- [Youtube: Traversy Crash Course in Javascript](https://www.youtube.com/watch?v=hdI2bqOjy3c) — A popular crash course on JavaScript fundamentals by Traversy Media.
- [Youtube: Javascript Under The Hood](https://www.youtube.com/playlist?list=PLillGF-Rfqbars4vKNtpcWVDUpVOVTlgB) — A video series exploring the inner workings of JavaScript.
### Articles
- [TypeScript vs JavaScript](https://www.codewars.com/post/typescript-and-javascript-the-relationship-explained) — An article explaining the relationship and differences between TypeScript and JavaScript.
### Videos
- [What the heck is the event loop anyway?](https://www.youtube.com/watch?v=8aGhZQkoFbQ) — A clear and engaging explanation of the JavaScript event loop.
- [Closures Explained in 100 Seconds](https://www.youtube.com/watch?v=vKJpN5FAeF4) — A quick and concise video explaining JavaScript closures.
## Docs & CheatSheets
### Cheatsheets
Cheatsheets act as quick references for you, boosting memory and saving time by summarizing key concepts.
- [HTML Cheatsheets](https://websitesetup.org/html5-cheat-sheet/) — A handy reference for HTML5 elements and attributes.
### Official Documentations
- [JavaScript(MDN)](https://developer.mozilla.org/en-US/docs/Web/JavaScript)
- [React](https://react.dev/reference/react)
- [React Native](https://reactnative.dev/docs/getting-started)
- [Vue.js](https://vuejs.org/guide/introduction.html)
- [Angular](https://angular.dev/overview)
- [Node.js](https://nodejs.org/docs/latest/api/)
- [Android](https://developer.android.com/develop)
## Communities
Communities helps you connect with fellow developers. You can get benefits like shared knowledge, support, and career opportunities.
- [dev.to](https://dev.to/) — A community platform for developers to share articles, tutorials, and discussions.
- [r/Frontend](https://www.reddit.com/r/Frontend/) — A Reddit community focused on frontend development.
- [r/learnjavascript](https://www.reddit.com/r/learnjavascript/) — A Reddit community for JavaScript learners to ask questions and share knowledge.
- [r/javascript](https://www.reddit.com/r/javascript/) — A Reddit community for discussing all things JavaScript.
## Open Source Contributions
Open-source contributions are a good way to start practicing your coding skills. They help you learn from the best and showcase your work, which greatly helps your career.
- [How to contribute to open source](https://blog.rysolv.com/how-to-contribute-to-open-source) — A guide on how to start contributing to open-source projects.
- **How to find repos to contribute to** — Search GitHub for the label `first-timers-only` to find issues that are good for first-time contributors. Filter further by the programming language of your choice.
- **GitHub repos inviting contributors** — Search GitHub for the label `help-wanted` to find repos that are inviting contributors.
Please let me know if you find these helpful.
---
For further updates, star the repo on GitHub → https://github.com/requestly/awesome-frontend-resources/
---
I also want to invite your contributions: if you have an awesome resource that you think is valuable for other developers, please comment or commit; both are welcome.
Thanks.

*by asachanfbd*
---

# Stay ahead in web development: latest news, tools, and insights #37

*Published 2024-06-17 · tags: webdev, weeklyfoo, javascript, node · canonical: https://weeklyfoo.com/foos/foo-037/*
weeklyfoo #37 is here: your weekly digest of all webdev news you need to know! This time you'll find 43 valuable links in 4 categories! Enjoy!
## 🚀 Read it!
- [How to Build Anything Extremely Quickly](https://learnhowtolearn.org/how-to-build-extremely-quickly/): Do outline speedrunning — recursively outline an MVP, speedrun filling it in, and only then go back and perfect. *(productivity / 7 min read)*
- [From Chaos to Clarity - My Journey with Obsidian](https://kyry.cz/articles/get-organized.html): Effectively using Obsidian with sync. *(productivity, obsidian / 10 min read)*

---
## 📰 Good to know
- [Prerender pages in Chrome for instant page navigations](https://developer.chrome.com/docs/web-platform/prerender-pages): You can use the new speculation rules API in Chrome to pre-render pages. *(chrome / 31 min read)*
- [Morphing Arbitrary Paths in SVG](https://minus-ze.ro/posts/morphing-arbitrary-paths-in-svg/): If you want to learn how to morph shapes in SVGs, this is the article to read. *(svgs / 14 min read)*
- [Payments 101 for a Developer](https://github.com/juspay/hyperswitch/wiki/Payments-101-for-a-Developer): All you need to know about payments. *(payments / 11 min read)*
- [Creating Perfect Font Fallbacks in CSS](https://www.aleksandrhovhannisyan.com/blog/perfect-font-fallbacks/): Important to have a fallback strategy! *(fonts / 9 min read)*
- [What makes a great contribution to a codebase?](https://blog.eliperkins.com/great-contributions-to-a-codebase): Great summary of what to do if you want to contribute to a codebase. *(guides, engineering / 10 min read)*
- [Ship something every day](https://maxleiter.com/blog/ship-every-day): Full ack on "the dopamine rush of your code being shipped". *(productivity / 2 min read)*
- [Using Node.js's test runner](https://nodejs.org/en/learn/test-runner/using-test-runner): Official Node.js test runner docs. *(node / 9 min read)*
- [How To Hack Your Google Lighthouse Scores In 2024](https://www.smashingmagazine.com/2024/06/how-hack-google-lighthouse-scores-2024/): Do perfect Lighthouse scores mean the performance of your website is perfect? As it turns out, Lighthouse is influenced by a number of things that can be manipulated and bent to make sites seem more performant than they really are, as Salma Alam-Naylor demonstrates in several experiments. *(performance, lighthouse / 23 min read)*
- [Cross-Document View Transitions](https://x.com/bramus/status/1800632231066485191): A really nice demo of the new Astro zero-JS transitions. *(astro, transitions)*
- [Spreadsheet Superstars](https://www.theverge.com/c/24133822/microsoft-excel-spreadsheet-competition-championship): An elite handful of analysts, actuaries, and accountants have mastered Excel, arguably the most important software in the business world. So what do they do in Vegas? They open a spreadsheet. *(spreadsheets / 38 min read)*
- [Perfect Bug Report](https://www.perfectbugreport.io/): Write bug reports that developers love! *(bugs / 3 min read)*
- [Why curl closes PRs on Github](https://daniel.haxx.se/blog/2024/06/11/why-curl-closes-prs-on-github/): Contributors to the curl project on GitHub tend to notice quickly that submitted pull requests do not generally appear as merged with the accompanying purple blob; instead they are said to be closed. *(curl / 12 min read)*
- [A list of useful examples of the sed command on Linux](https://x.com/xmodulo/status/1801584127436640765): sed is extremely useful. *(sed, cli)*
- [FWIW](https://joshcollinsworth.com/blog/fwiw): For whatever it's worth: my advice on job hunting in tech. *(career / 70 min read)*

---
## 🧰 Tools
- [activepieces](https://github.com/activepieces/activepieces): Your friendliest open-source all-in-one automation tool. *(workflows, automation)*
- [Programming Fonts](https://www.programmingfonts.org/): Test different fonts for your IDE. *(fonts)*
- [CodeFlattener](https://github.com/bchr02/CodeFlattener): A Node.js command-line tool designed to export the structure and code of a repository into a single flat text file. *(utils)*
- [Flameshot](https://flameshot.org/): Powerful, yet simple to use, open-source screenshot software. *(screenshots)*
- [Web Interactions Gallery](https://www.webinteractions.gallery/): Collection of animated elements from all over the web. *(animations, gallery)*
- [Font Interceptor](https://fontinterceptor.mschfmag.com/): Downloads all fonts in use on a target website. Will you use it? *(fonts)*
- [restate](https://restate.dev/): A platform for building resilient applications that tolerate all infrastructure faults without the need for a PhD. *(workflows)*
- [Share URL](https://nigelotoole.github.io/share-url/): Share a URL with Web Share, copy to clipboard, or post to social media. *(sharing)*
- [JsonTree.js](https://github.com/williamtroup/JsonTree.js): A lightweight JavaScript library that generates customizable tree views to better visualize JSON data. *(json)*
- [Reshot](https://www.reshot.com/): Free icons & illustrations. *(icons, illustrations)*
- [Remeda](https://github.com/remeda/remeda): A utility library for JavaScript and TypeScript. *(typescript, utils)*
- [DGM.js](https://github.com/dgmjs/dgmjs): An infinite canvas with smart shapes. *(canvas)*
- [Pastel](https://github.com/vadimdemedes/pastel): Next.js-like framework for CLIs made with Ink. *(cli)*
- [Jsvectormap](https://github.com/themustafaomar/jsvectormap): A lightweight JavaScript library for creating interactive maps and pretty data visualizations. *(maps)*
- [starry-night](https://github.com/wooorm/starry-night): Syntax highlighting, like GitHub's. *(highlighting)*
- [Quartz](https://github.com/jackyzha0/quartz): A fast, batteries-included static-site generator that transforms Markdown content into fully functional websites. *(markdown)*
- [Type Fluidity](https://wearerequired.github.io/fluidity/): Calculate fluid typography sizes. *(typography)*
- [Adobe Alternatives](https://github.com/KenneyNL/Adobe-Alternatives): A list of alternatives to Adobe software. *(adobe, directory)*
- [Lexbor](https://github.com/lexbor/lexbor): An open-source HTML renderer library under development. *(browser-engine)*
- [wired-elements](https://github.com/rough-stuff/wired-elements): Collection of custom elements that appear hand-drawn. Great for wireframes or a fun look. *(ui)*
- [ASCII Silhouettify](https://meatfighter.com/ascii-silhouettify/): An app that converts images into ASCII silhouettes, a style of ASCII art distinguished by uniformly filled geometric shapes rather than lines or textures. *(ascii, art)*

---
## 📚 Tutorials
- [How To Use Corepack](https://www.totaltypescript.com/how-to-use-corepack): Never again wonder which package manager to use in a project. *(npm, corepack / 4 min read)*
- [Apple Tab Pill Animation](https://x.com/surjithctly/status/1800473343692591475): Seen this effect at Apple WWDC24? *(css)*
- [Full Stack Web Push API Guide](https://www.bocoup.com/blog/full-stack-web-push-api-guide): Try out push notifications! *(notifications / 18 min read)*
- [Generating ZIP Files With Javascript](https://www.cjoshmartin.com/blog/creating-zip-files-with-javascript): Self-explanatory. *(zip, javascript / 3 min read)*
- [Using the Page Visibility API](https://developer.mozilla.org/en-US/blog/using-the-page-visibility-api/): Checking document visibility gives you insight into how visitors interact with your pages and can provide hints about the status of your applications. *(visibility / 10 min read)*
- [Quick Trick: Using border-image to Apply and Overlay Gradient](https://frontendmasters.com/blog/quick-trick-using-border-image-to-apply-and-overlay-gradient/): A one-liner for a nice effect. *(css / 3 min read)*
Want to read more? Check out the full article [here](https://weeklyfoo.com/foos/foo-037/).
To sign up for the weekly newsletter, visit [weeklyfoo.com](https://weeklyfoo.com).

*by urbanisierung*
---

# Advanced RxJs Operators You Know But Not Well Enough

*Published 2024-06-17 · tags: rxjs, angular, learning, webdev · canonical: https://dev.to/krivanek06/advanced-rxjs-operators-you-know-but-not-well-enough-1ela*

I've been using Angular for more or less half a decade, and recently I reviewed how many RxJS operators I am familiar with. I wrote this article to share some of my experience with RxJS: to cover some operators, the differences between them, useful combinations, and examples of how I use them (also linking some useful resources).
We will talk about:
- take(1) vs first()
- find() vs single()
- debounceTime() and distinctUntilChanged()
- catchError() position matters
- bufferTime() or bufferCount()
- share() vs shareReplay()
- merge() and scan()
- exhaustMap() operator
- expand() operator
Hope you find it helpful and feel free to share your thoughts.
## Take(1) vs First()
Both operators deliver the first emission and then complete the subscription, so you prevent memory leaks. Understanding the difference between these two operators involves the `EMPTY` constant.
Let’s say you have a service which makes API calls to load users, however something goes wrong, and the server returns a 500 error code. We catch the error and return [EMPTY](https://rxjs.dev/api/index/const/EMPTY), such as below.
```tsx
@Injectable({
providedIn: 'root',
})
export class UserService {
getUsers(): Observable<User[]> {
return this.http.get('...').pipe(catchError(() => EMPTY))
}
}
```
Then, inside a component, you want to load these users using the `take(1)` or `first()` operator to ensure un-subscription. The difference between these two operators is that `first()` can throw an `EmptyError`. Here is the explanation [from the docs](https://www.learnrxjs.io/learn-rxjs/operators/filtering/first):
> `first` will deliver an EmptyError to the Observer's error callback if the Observable completes before any next notification was sent. If you don't want this behavior, use `take(1)` instead.
>
```tsx
@Component({ /* .... */ })
export class UserComponent {
private userService = inject(UserService);
constructor(){
// throws "EmptyError" - "no elements in sequence"
this.userService.getUsers().pipe(first()).subscribe(console.log)
// does not throw error, does not emit anything
this.userService.getUsers().pipe(take(1)).subscribe(console.log)
}
}
```
I personally still use the `first()` operator and handle errors where needed, but I found that the error is thrown only if you use the `EMPTY` observable, which completes immediately.
Something worth noting: you may also want to consider using the [defaultIfEmpty()](https://www.learnrxjs.io/learn-rxjs/operators/conditional/defaultifempty) operator with `first()` to ensure that no error is thrown when the source is the `EMPTY` constant.
```tsx
@Component({ /* .... */ })
export class UserComponent {
private userService = inject(UserService);
constructor(){
// will emit - "no users"
this.userService.getUsers().pipe(
defaultIfEmpty("no users"),
first()
).subscribe(console.log)
}
}
```
## Find() vs Single()
I guess you are familiar with the [find()](https://www.learnrxjs.io/learn-rxjs/operators/filtering/find) operator. As the name suggests, you want to "find" an item inside an array of items. However, there is a lesser-known operator called [single()](https://www.learnrxjs.io/learn-rxjs/operators/filtering/single). At first glance, both work the same way:
```tsx
// this will output number 3
from([1, 2, 3]).pipe(find(val => val === 3)).subscribe(console.log)
// this will output number 3
from([1, 2, 3]).pipe(single(val => val === 3)).subscribe(console.log)
```
The difference shows when the value is not found. The docs say:
- [find() docs](https://rxjs.dev/api/operators/find) - “does not emit an error if a valid value is not found (emits `undefined` instead)”
- [single() docs](https://rxjs.dev/api/operators/single) - “If the source Observable did not emit `next` before completion, it will emit an [`EmptyError`](https://rxjs.dev/api/index/interface/EmptyError) to the Observer's `error` callback”
Personally I haven’t seen many places where the `single()` operator would be used. It is a stricter version of the `find()` operator, and you will most likely have to use the `catchError()` operator with it.
```tsx
// output will be: 333 ... single() throws an error
from([1, 2]).pipe(
single((d) => d === 3),
catchError((e) => of(333)),
).subscribe((x) => { console.log('Value is:', x)});
// output will be: undefined
from([1, 2]).pipe(
find((d) => d === 3),
catchError((e) => of(333)),
).subscribe((x) => { console.log('Value is:', x)});
```
## DebounceTime and DistinctUntilChanged
Using the combination of [distinctUntilChanged](https://www.learnrxjs.io/learn-rxjs/operators/filtering/distinctuntilchanged) and [debounceTime](https://www.learnrxjs.io/learn-rxjs/operators/filtering/debouncetime) is probably the most common operator pairing, one you most likely use quite often yourself.
Not gonna waste much time here, just want to give a small example with this combination. Let’s say you have an auto-complete and on every key press you load some data from the server.
You want some time to pass before sending the user’s input value to the server, and you want to avoid sending the same value twice, so you can use these two operators as follows:
```tsx
@Component({
selector: 'app-root',
standalone: true,
imports: [CommonModule, ReactiveFormsModule],
template: `
<section>
<h2>Write something to the input</h2>
<input [formControl]="searchControl" placeholder="write" />
</section>
`,
})
export class App implements OnInit {
searchControl = new FormControl('', { nonNullable: true });
apiService = inject(ApiService);
loadedUsers = toSignal(
this.searchControl.valueChanges.pipe(
debounceTime(500), // wait 500ms after user input to send data
distinctUntilChanged(), // don't send the same value if not changed
switchMap(value => this.apiService.loadUsersByPrefix(value))
)
)
}
```
## CatchError Position Matter
Referring back to my article [CatchError Position Matter](https://dev.to/krivanek06/angular-rxjs-catcherror-position-matter-3b00), I highlighted that depending on the placement of the [catchError](https://rxjs.dev/api/operators/catchError) operator, you can experience unexpected behaviours.
Let’s take the same example as above. We want to load users from the server as a user types something into the autocomplete. Going with the above example, where would you put the `catchError` operator? Let’s say you decide to place it at the end of the chain, as such:
```tsx
loadedUsers = toSignal(
this.searchControl.valueChanges.pipe(
switchMap(value => this.apiService.loadUsersByPrefix(value)),
catchError(() => EMPTY)
)
)
```
This has the side-effect that once you receive an error, your search will STOP working. Even if you type something into the input field again (after getting an error), it will not make additional API calls, since your chain has already errored out (and you handled it). Therefore, it is recommended to put the `catchError()` operator closer to where the error happens, as such:
```tsx
loadedUsers = toSignal(
this.searchControl.valueChanges.pipe(
switchMap(value => this.apiService.loadUsersByPrefix(value).pipe(
catchError(() => EMPTY)
)),
))
```
With this small change, even if you receive an error, your search functionality will continue working. For more info and a StackBlitz example, visit the [CatchError Position Matter](https://dev.to/krivanek06/angular-rxjs-catcherror-position-matter-3b00) blogpost.
## BufferTime or BufferCount
In cases where you have, let’s say, websocket communication, you may bump into the problem of high-frequency updates.
I myself have worked on such projects:
- stock market application (many real time price updates)
- phone call application monitoring (many calls from many users)
Both of these projects were receiving frequent data updates and it came to a point where the UI was updating so frequently that it was freezing.
One way we solved the problem was using the [bufferTime](https://rxjs.dev/api/operators/bufferTime) and [bufferCount](https://www.learnrxjs.io/learn-rxjs/operators/transformation/buffercount) operators. Both of them aggregate data from an observable and then emit the received data as an array: `bufferTime` after a given time interval, `bufferCount` after a given number of emissions.
```tsx
//output [0,1,2]...[3,4,5,6]
const subscribe = interval(500).pipe(bufferTime(2000)).subscribe(val =>
console.log('Buffered with Time:', val)
);
```
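For contrast, `bufferCount` groups by a fixed number of emissions rather than a time window. The grouping semantics can be sketched on a plain array with a hypothetical `chunk()` helper (not the RxJS operator itself):

```typescript
// Hypothetical chunk() mirroring bufferCount's grouping on a finite list:
// collect values until `size` of them arrive, then emit them as one array.
function chunk<T>(values: T[], size: number): T[][] {
  const out: T[][] = [];
  for (let i = 0; i < values.length; i += size) {
    out.push(values.slice(i, i + size));
  }
  return out;
}

console.log(chunk([0, 1, 2, 3, 4, 5, 6], 3)); // [[0,1,2],[3,4,5],[6]]
```

Note that, just like with `bufferCount`, the last buffer may be shorter when the source completes before it fills up.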
## Share() vs ShareReplay()
Last year I wrote a blogpost about [How shareReplay Saved My Angular Project](https://dev.to/krivanek06/the-real-usage-of-sharereplay-in-angular-4fpa). It describes the side-effects that can arise when you have multiple subscriptions and don't use one of these multicast operators.
Both of these operators are used to multicast a value from an observable and prevent re-execution, however with different strategies. We need to look at three variants:
- using `share()` - caches the value (for existing subscriptions) until the observable completes
- using `shareReplay({ refCount: true, bufferSize: 1 })` - caches the value (for existing and new subscriptions) until the observable completes
- using `shareReplay({ refCount: false, bufferSize: 1 })` - caches the value, the observable never completes (creates a `ReplaySubject(1)` under the hood)
For demonstration, let’s have the following example:
```tsx
@Injectable({ providedIn: 'root' })
export class NotificationService {
private notification$ = new Subject<void>();
listener$ = this.notification$.asObservable().pipe(
tap(() => console.log('notification received')),
// shareReplay({ bufferSize: 1, refCount: false }),
// shareReplay({ bufferSize: 1, refCount: true }),
// share(),
scan((acc) => acc + 1, 0),
);
notify() {
this.notification$.next();
}
}
@Component({ /* .... */ })
export class NotificationComponent {
private notificationService = inject(NotificationService);
constructor(){
this.notificationService.listener$.subscribe((x) => {
console.log('Not 1:', x);
});
this.notificationService.listener$.subscribe((x) => {
console.log('Not 2:', x);
});
// create notification
this.notificationService.notify();
// make a new listener
setTimeout(() => {
this.notificationService.listener$.subscribe((x) => {
console.log('Not 3:', x);
});
}, 1000);
// create notification
setTimeout(() => {
this.notificationService.notify();
}, 2000);
}
}
```
Without `share()` or `shareReplay()`, the result is that the pipeline of `notification$.asObservable()` re-executes on each subscription:

Not necessarily something you want, right? You want to log the “notification received” message only once when `notify()` is called. So can you use `share()` for that? If you attach `share()` to the `notification$.asObservable()` pipeline, you get the following:

Almost correct. Only two message logs, however why “Not 3: 1”? Shouldn’t it be 2? Well, even though `share()` multicasts each new value to every existing subscription, it doesn’t cache the value already computed by `notification$.asObservable()`, so counting starts from 0 for every later new subscription.
What about `shareReplay()`? Interestingly, in this example it doesn’t matter whether you use `refCount: true` or `false`, since you attach `shareReplay()` to a long-lived subject that never completes. Even with `refCount: true` it would behave (in this example) like `refCount: false`.

Using `shareReplay({ refCount: false, bufferSize: 1 })` may sometimes sound like a good strategy, since it caches the last emitted value, however be careful with it: it never completes and can cause memory leaks when used outside of a service (singleton).
For more information I recommend reading an article from [Thomas Laforge about Share / ShareReplay / RefCount](https://itnext.io/share-sharereplay-refcount-a38ae29a19d).
## Merge and Scan
The operator combination of [merge](https://rxjs.dev/api/index/function/merge) and [scan](https://www.learnrxjs.io/learn-rxjs/operators/transformation/scan) is a nice one I've started to use recently. I will give an example of how I use them, but I have to give some credit to [Decoded Frontend - RxJS Scan Operator](https://www.youtube.com/watch?v=PDpAjf0688Y&t=703s).
I will give a short explanation of how I use these two operators; for more information, feel free to check out [From Chaos to Clarity: Simplify Your Angular Code with Declarative Programming](https://dev.to/krivanek06/from-chaos-to-clarity-simplify-your-angular-code-with-declarative-programming-58gm).
Let’s say you have a dropdown, and every time you select a value, you want to load some additional data from the server. While loading, you want to display a loading state and then display the data when it arrives. More or less, your first intuition would suggest going with something like:
```tsx
export class SelectImperativeComponent {
private dataService = inject(DataService);
selectedItems = signal<DataItem[]>([]);
isLoadingData = signal(false);
/**
* on select change - load data from API
*/
onChange(event: any) {
const itemId = event.target.value;
// set loading to true
this.isLoadingData.set(true);
// fake load data from BE
this.dataService.getDataFakeAPI(itemId).subscribe((res) => {
// save data
this.selectedItems.update((prev) => [...prev, res]);
// set loading to false
this.isLoadingData.set(false);
});
}
```
This works as intended. The “only” problem is that this code is imperative. You have a local property `selectedItems` whose value can be changed anywhere inside the component. As previously mentioned, for more information, please read the article linked above (and give it a like, it helps me to sleep at night).
Instead of having multiple writable signals whose values can be changed anywhere, you can create one read-only signal that holds the state - data, isLoading, isError. Here is a sample code:
```tsx
@Component({ /* ... */ })
export class SelectDeclarativeComponent {
private removeItem$ = new Subject<DataItem>();
private addItem$ = new Subject<string>();
selectedItems = toSignal(
merge(
// create action to add a new item
this.addItem$.pipe(
switchMap((itemId) =>
this.dataService.getDataFakeAPI(itemId).pipe(
map((item) => ({item, action: 'add' as const })),
startWith({ item: null, action: 'loading' as const })
)
)
),
// create action to remove an item
this.removeItem$.pipe(
map((item) => ({ item, action: 'remove' as const }))
)
).pipe(
scan((acc, curr) => {
// display loading
if (curr.action === 'loading') {
return {
data: acc.data,
isLoading: true,
};
}
// check to remove item
if (curr.action === 'remove') {
return {
isLoading: false,
data: acc.data.filter((d) => d.id !== curr.item.id),
};
}
// add item into the rest
return {
isLoading: false,
data: [...acc.data, curr.item],
};
}, { data: [] as DataItem[], isLoading: false })
),
{
initialValue: { data: [], isLoading: false },
}
);
/**
* on select change - load data from API
*/
onChange(event: any) {
const itemId = event.target.value;
this.addItem$.next(itemId);
}
/**
* removes item from selected array
*/
onRemove(item: DataItem) {
this.removeItem$.next(item);
}
}
```
It’s a bit similar to how state management libraries like NgRx work. You have actions: `addItem$` and `removeItem$`, and when either of these actions emits data, it goes through a reducer (the `scan` operator) and the result is saved into the `selectedItems` value. Moreover, `selectedItems` changes in only one place, so a bug can happen in only one place.
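The reducer passed to `scan` is just a pure function of `(state, action)`, so it can be pulled out of the stream and exercised on its own. A small sketch of that idea (the type names here are assumptions, simplified from the component above):

```typescript
// Pure reducer mirroring the scan() callback from the component above.
type DataItem = { id: number };
type State = { data: DataItem[]; isLoading: boolean };
type Action =
  | { action: 'loading' }
  | { action: 'add'; item: DataItem }
  | { action: 'remove'; item: DataItem };

function reduce(acc: State, curr: Action): State {
  if (curr.action === 'loading') {
    return { data: acc.data, isLoading: true };
  }
  if (curr.action === 'remove') {
    return { isLoading: false, data: acc.data.filter((d) => d.id !== curr.item.id) };
  }
  return { isLoading: false, data: [...acc.data, curr.item] };
}

// Replay a small action sequence, like scan would do on each emission.
let state: State = { data: [], isLoading: false };
state = reduce(state, { action: 'loading' });
state = reduce(state, { action: 'add', item: { id: 1 } });
state = reduce(state, { action: 'remove', item: { id: 1 } });
console.log(state); // { isLoading: false, data: [] }
```

Keeping the reducer pure is what makes the "bug can happen in only one place" argument hold: every state transition goes through this one function.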
## ExhaustMap Operator
When it comes to [higher-order observables](https://dev.to/krivanek06/angular-interview-what-is-higher-order-observable-2k03), most people go with `switchMap()` by default. It’s a safe and the most commonly used choice, since it cancels the running inner observable and creates a new one on each new emission.
That said, there may be cases where a different type of higher-order observable is the better fit, so you may be interested in - [Angular Interview: What is a Higher-Order Observable?](https://dev.to/krivanek06/angular-interview-what-is-higher-order-observable-2k03)
To be honest, needing `exhaustMap` is very rare, but I will give you an example of how I had to use it recently. Let’s say you want to implement an infinite scroll feature. You have some initial messages and, as you scroll up, you load additional messages (using pagination). Here is a GIF of the result.

Just to have some reference, here is code close to the final implementation:
```tsx
@Component({
selector: 'app-chat-feature',
standalone: true,
imports: [ScrollNearEndDirective],
template: `
<div appScrollNearEnd (nearEnd)="onNearEndEmit()">
@for (item of displayedMessages().data; track item.messageId) {
<!-- loaded messages -->
}
</div>
`,
})
export class ChatFeatureComponent {
private messageApiService = inject(MessageApiService);
/**
* subject to emit new scroll event with pagination
*/
private scrollNewEndOffset$ = new Subject<number>();
/**
* observable to load messages from API based on scroll position
*/
displayedMessages = toSignal(this.scrollNewEndOffset$.pipe(
// initial pagination is 0
startWith(0),
exhaustMap((offset) =>
this.messageApiService.getMessages(offset).pipe(
// stop loading, set data
map((data) => ({ data, loading: false })),
// error happened, set data
catchError((err) => of({ data: [], loading: false })),
// show loading state
startWith({ data: [], loading: true }),
),
),
// remember previous values and add new ones
scan((acc, curr) => ({
data: [...acc.data, ...curr.data],
loading: curr.loading,
}),
{
data: [] as Message[],
loading: true,
},
),
));
onNearEndEmit() {
// emit with new offset
this.scrollNewEndOffset$.next(this.displayedMessages().data.length);
}
}
```
The main idea is that, as you scroll up, the `(nearEnd)` output emits every time you move closer to the end of the scroll area. When `(nearEnd)` emits, you push a new value to `scrollNewEndOffset$`, which represents how many messages are already loaded (initially 0). When `scrollNewEndOffset$` emits the new offset value, you load an additional 20 (hardcoded) messages from the server at that offset.
So far so good, but why `exhaustMap`? Wouldn’t `switchMap` work? To answer that, you need to understand how `ScrollNearEndDirective` works.
`ScrollNearEndDirective` is a custom directive that emits every time your scroll bar moves near the end. Let’s say the end is 0px (maximum top) and you have a threshold of 80px. If you move your scroll inside the threshold, the directive emits on every 1px change. So if you used `switchMap`, your result might look something like this:

As you see, plenty of API calls were issued. Since one API call takes around ~2s to complete and I was hovering within that 80px threshold, the directive kept emitting and cancelled every previous API call.
You may wonder if it could be fixed with `debounceTime(X)`, and that would partially work, however `exhaustMap` is better, because you don’t care how many times the directive emitted while scrolling: you would be sending the same offset value and receiving the same data back.
## Expand() Operator
The `expand()` operator recursively projects each emission to a new observable until you return the `EMPTY` observable, which is the stopping condition.
I highly recommend checking out [Joshua Morony - What I learned from this crazy RxJS stream in my Angular app](https://www.youtube.com/watch?v=aKsPMTP6-sY&t=167s), but I will also give a small example with the already mentioned loading message API.
Looking at the previous example, we have the `messageApiService.getMessages(offset)` API call that always returns 20 new messages for the offset you provide. You may say 20 is not enough, however this value is hard-coded in the BE and we cannot change it, or can we?
What if we could call the messages API twice each time, to load double the amount of messages? For that you can use the `expand` operator as follows:
```tsx
export class ChatFeatureComponent {
displayedMessages = toSignal(this.scrollNewEndOffset$.pipe(
// initial pagination is 0
startWith(0),
exhaustMap((offset) =>
this.messageApiService.getMessages(offset).pipe(
// <--- this is new
expand((_, index) => (
index === 0 ?
this.messageApiService.getMessages(offset + 20)
: EMPTY
)
),
// stop loading, set data
map((data) => ({ data, loading: false })),
// error happened, set data
catchError((err) => of({ data: [], loading: false })),
// show loading state
startWith({ data: [], loading: true }),
),
),
// remember previous values and add new ones
scan((acc, curr) => ({
data: [...acc.data, ...curr.data],
loading: curr.loading,
}),
{ data: [] as Message[], loading: true},
),
));
}
```
In this specific scenario, each time we make an API request to load 20 messages, we create one more API request with the `expand()` operator, so in total we load 40 messages on every scroll to the top.
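The recursion itself can be illustrated without RxJS. Below is a hypothetical synchronous `expandSync()` that keeps feeding each result back into the projection until it returns `null` (standing in for `EMPTY`):

```typescript
// Hypothetical synchronous sketch of expand()'s recursion:
// each produced value is fed back into project() until it returns null (our EMPTY).
function expandSync<T>(seed: T, project: (value: T, index: number) => T | null): T[] {
  const out: T[] = [seed];
  let index = 0;
  let next = project(seed, index);
  while (next !== null) {
    out.push(next);
    index += 1;
    next = project(next, index);
  }
  return out;
}

// Mirrors the chat example: exactly one follow-up request at offset + 20.
const offsets = expandSync(0, (offset, index) => (index === 0 ? offset + 20 : null));
console.log(offsets); // [0, 20]
```

Changing the stopping condition (e.g. `index < 3`) would keep the recursion going for more rounds, which is exactly how `expand()` differs from a one-shot `mergeMap`.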
## Final Thoughts
In this article I tried to put together some “more advanced” RxJS operators and operator combinations that I occasionally use and find useful. I hope you liked the article. Feel free to share your thoughts, or connect with me on [dev.to](https://dev.to/krivanek06) | [LinkedIn](https://www.linkedin.com/in/eduard-krivanek-714760148/).

*Author: krivanek06*
---

# Analysis and Realization of Commodity Futures Volume Footprint Chart

*fmzquant · 2024-06-17 · tags: trading, chart, cryptocurrency, fmzquant*
*Canonical: https://dev.to/fmzquant/analysis-and-realization-of-commodity-futures-volume-footprint-chart-epi*

## Summary
The quantitative footprint chart (English name: "Footprint Charts") is an advanced chart analysis tool. It shows the trading activity at each price within a single K-line (candlestick). In addition to price information, it also provides volume, active buying, and active selling. It is a multi-dimensional technical chart that reveals the true volume distribution at each price, explains the complex relationship between volume, price, and time, and can provide an additional reference for traders.
## What is Quantum Footprint
Simply put, the footprint chart provides market transparency by micro-analysing what buyers and sellers do at various price levels. What do you see in a footprint chart:
- K-line price
- Active Buy Volume (ASK)
- Active Sell Volume (BID)
- Equilibrium system

As shown in the figure above, this is a demonstration version of the footprint chart implemented on the trading platform FMZ.COM. It is calculated from the actual tick market data, and the detailed data is attached to each K-line. When the mouse hovers over a K-line, the footprint data is presented. The data in the blue square is the calculation result, divided into two columns: the left column lists all the price levels of the current K-line, arranged from high to low, and the right column shows the trading volume at each price level, subdivided into buying volume and selling volume, separated by "x". On the left of "x" is the volume of active buying; on the right of "x" is the volume of active selling.
## The role of Quantum Energy Footprint
Think about what causes a price increase. The answer is simple... buyers and sellers. If there are more buyers than sellers, the price rises; if there are more sellers than buyers, the price falls. If the number of sellers is approximately equal to the number of buyers, the market shows a sideways trend, that is, the market is in a balanced state. Once large buyers or sellers appear, this balance is broken, and the footprint chart will then show an extreme ratio of active buying to active selling.
The footprint is displayed dynamically based on tick data, so it is especially suitable for short-term intraday trading. Traders can see precisely the active trading volume at each price, explain the reasons or patterns behind price fluctuations, and tailor their own trading strategies. For example, if the buying volume on the left is much larger than the selling volume on the right, market participants are actively buying and the price may rise; conversely, if the selling volume on the right is much larger than the buying volume on the left, market participants are actively selling and the price may fall.
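As a rough illustration of that reading, one could compute a per-price delta and buy ratio from the two volumes. This is a hypothetical helper for illustration only, not part of the strategy code below:

```typescript
// Hypothetical per-price imbalance metrics derived from footprint volumes.
function imbalance(buyVolume: number, sellVolume: number) {
  const total = buyVolume + sellVolume;
  return {
    delta: buyVolume - sellVolume, // positive: buyers dominate this price level
    buyRatio: total === 0 ? 0 : buyVolume / total,
  };
}

console.log(imbalance(300, 100)); // { delta: 200, buyRatio: 0.75 }
```

A strongly positive delta at a price level corresponds to the "buying volume much larger than selling volume" case described above.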
## Principle of Quantum Energy Footprint
The calculation principle of the footprint comes from volume and market data. The current volume is classified in real time according to price changes: if the market price rises, the volume is recorded as active buying; if the market price falls, the volume is recorded as active selling. In the FMZ footprint chart, tick data is processed in real time to accurately attribute the turnover of each tick.
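This rule shows up in the strategy code below as comparisons against the previous tick's `Buy`/`Sell` quotes. Extracted as a standalone function (field names here are assumptions, simplified from the FMZ ticker structure):

```typescript
// Hypothetical tick-direction classifier, mirroring the rule used in the strategy below:
// a trade at/above the ask counts as active buying, at/below the bid as active selling.
type Quote = { last: number; bid: number; ask: number };

function classifyTick(prev: Quote, curr: Quote): 'buy' | 'sell' | 'both' {
  if (curr.last >= prev.ask) return 'buy';
  if (curr.last <= prev.bid) return 'sell';
  // fall back to the latest quote when the previous one is inconclusive
  if (curr.last >= curr.ask) return 'buy';
  if (curr.last <= curr.bid) return 'sell';
  return 'both';
}

const prevTick = { last: 100, bid: 99, ask: 101 };
console.log(classifyTick(prevTick, { last: 101, bid: 100, ask: 102 })); // buy
console.log(classifyTick(prevTick, { last: 99, bid: 98, ask: 100 }));   // sell
```

The `'both'` branch handles trades inside the spread, where the direction cannot be determined; the strategy below then counts that volume on both sides.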
## Quantum Energy Footprint Code Implementation
```
/*backtest
start: 2020-03-10 00:00:00
end: 2020-03-10 23:59:00
period: 1h
exchanges: [{"eid":"Futures_CTP","currency":"FUTURES"}]
mode: 1
*/
var NewFuturesTradeFilter = function (period) {
var self = {} // Create an object
self.c = Chart({ // Create "Chart" chart
tooltip: {
xDateFormat:'%Y-%m-%d %H:%M:%S, %A',
pointFormat:'{point.tips}'
},
series: [{
name: exchange.GetName(),
type:'candlestick',
data: []
}]
})
self.c.reset() // clear chart data
self.pre = null // used to record the last data
self.records = []
self.feed = function (ticker) {
if (!self.pre) { // If there is no previous data yet
self.pre = ticker // Assign the latest data
}
var action ='' // Mark as empty string
Log('ticker', ticker)
Log('pre', self.pre)
if (ticker.Last >= self.pre.Sell) {// If the last price of the latest data is greater than or equal to the selling price of the previous data
action ='buy' // mark as buy
} else if (ticker.Last <= self.pre.Buy) {// If the last price of the latest data is less than or equal to the bid price of the previous data
action ='sell' // mark as sell
} else {
if (ticker.Last >= ticker.Sell) {// If the last price of the latest data is greater than or equal to the selling price of the latest data
action ='buy' // mark as buy
} else if (ticker.Last <= ticker.Buy) {// If the last price of the latest data is less than or equal to the buying price of the latest data
action ='sell' // mark as "sell"
} else {
action ='both' // Mark as "both"
}
}
// reset volume
if (ticker.Volume <self.pre.Volume) {// If the volume of the latest data is less than the volume of the previous data
self.pre.Volume = 0 // Assign the volume of the previous data to 0
}
var amount = ticker.Volume-self.pre.Volume // the volume of the latest data minus the volume of the previous data
if (action != '' && amount > 0) { // If the direction is known and the traded amount is greater than 0
var epoch = parseInt(ticker.Time / period) * period // Calculate the K-line timestamp and round it
var bar = null
var pos = undefined
if (
self.records.length == 0 || // If the K-line length is 0 or the last K-line timestamp is less than "epoch"
self.records[self.records.length-1].time <epoch
) {
bar = {
time: epoch,
data: {},
open: ticker.Last,
high: ticker.Last,
low: ticker.Last,
close: ticker.Last
} // Assign the latest data to bar
self.records.push(bar) // Add bar to the records array
} else {// reassign bar
bar = self.records[self.records.length-1] // the last bar of the previous data
bar.high = Math.max(bar.high, ticker.Last) // the maximum price of the last bar of the previous data and the maximum value of the last price of the latest data
bar.low = Math.min(bar.low, ticker.Last) // The minimum price of the last bar of the previous data and the minimum value of the last price of the latest data
bar.close = ticker.Last // last price of latest data
pos = -1
}
if (typeof bar.data[ticker.Last] === 'undefined') { // If there is no entry for this price level yet
bar.data[ticker.Last] = {// reassign value
buy: 0,
sell: 0
}
}
if (action =='both') {// If the token is equal to both
bar.data[ticker.Last]['buy'] += amount // buy accumulation
bar.data[ticker.Last]['sell'] += amount // sell accumulated
} else {
bar.data[ticker.Last][action] += amount // mark accumulation
}
var tips =''
Object.keys(bar.data) // Put the keys in the object into an array
.sort() // sort
.reverse() // Reverse the order in the array
.forEach(function (p) {// traverse the array
tips += '<br>' + p + ' ' + bar.data[p].sell + 'x' + bar.data[p].buy
})
self.c.add( // Add data
0, {
x: bar.time,
open: bar.open,
high: bar.high,
low: bar.low,
close: bar.close,
tips: tips
},
pos
)
}
self.pre = ticker // reassign
}
return self // return object
}
// program entry
function main() {
Log(_C(exchange.SetContractType,'MA888')) // subscription data
var filt = NewFuturesTradeFilter(60000) // Create an object
while (true) {// Enter loop mode
var ticker = exchange.GetTicker() // Get exchange tick data
if (ticker) {// If the Tick data is successfully obtained
filt.feed(ticker) // Start processing data
}
}
}
```
## Quantum Footprint Code Download
The footprint chart strategy code has been released on the FMZ.COM platform, and ordinary users can use it without any setup.
https://www.fmz.com/strategy/189965
## To sum up
In actual use, the energy footprint map can also analyze the flow of funds from the perspective of volume. Whether it is to analyze the direction of large-scale trends or the short-term direction of the day, it plays a vital role. However, it should be noted that the so-called capital flow here does not refer to the entry and exit of funds, but rather reflects the market's willingness to buy and sell and the game behavior of the main players and retail investors.
From: https://www.fmz.com/digest-topic/5845

*Author: fmzquant*
---

# My TakeAway from The AI Summit - London

*balagmadhu · 2024-06-17 · tags: conference, ai*
*Canonical: https://dev.to/balagmadhu/my-takeaway-from-the-ai-summit-london-36e8*

**Intro**:
I was fortunate enough to attend the London AI Summit 24. The agenda was packed, and most of the time I was jumping between sessions. While there was a significant focus on Large Language Models (LLMs), these are some of the topics and discussions that I found particularly engaging, both as an attendee and a panelist:
**Gen AI**
GenAI Use Case Expansion:
- The rapid increase in GenAI use cases offers vast opportunities for innovation.
- Companies should review their GenAI strategies comprehensively to harness this potential.
- It's crucial to define incremental value from GenAI that aligns with existing digital investments for a cohesive, value-driven implementation approach.
Bias Mitigation in Large Language Models:
- Large language models are valuable for decision-making but require bias detection and mitigation to be fully effective.
- Proactively addressing biases can unlock untapped value and promote fair, ethical AI applications.
**Robots Operating Autonomously**
Spot, the robot dog by Boston Dynamics, was deployed at a former nuclear fusion reactor site. It gathered radiation data autonomously over 35 days with minimal human involvement.
**Advancements in Autonomy**:
The team is working on deploying robots in industrial settings where humans can't reach. Autonomy in robots allows for operation in hazardous environments, enhancing safety.
**Human-in-the-Loop Approach**:
Industrial companies currently prefer a human-in-the-loop approach for collaboration and control. Operators can intervene with robots when necessary, ensuring safety and reliability.
**Transition to Full Autonomy**:
The move toward full autonomy is gradual, as systems must improve in handling uncertainties. Full autonomy requires AI to respond to changes and optimize tasks in real-time.
**Technological Enhancements**:
Spot was fitted with lidar and advanced 3D mapping for reliable autonomous navigation. It carried a task-specific payload, like a device for monitoring radiation levels.
**Operational Efficiency**:
Spot was programmed to return to its charging unit when the battery was low. It followed a script for daily tasks, optimizing its battery life for efficient operation.
**Data Collection and Mapping**:
Long-term operation creates multiple maps of the environment, aiding operators. Robots provide more reliable data collection than humans, who may make errors due to fatigue.

**NASA Charts AI, Robotics, 3D Printing as Path for Mars Sustainability**
**Mastering AI for Real-Time Data Analysis**: Essential for optimizing plant maintenance and predictive maintenance operations.
**Technological Challenges for Mars**: Developing vehicles for rocky terrain, equipment for lower gravity and cold temperatures, and smart robots with computer vision for ice exploration.
**Innovative Solutions**: Utilizing 3D printing for habitat construction, leveraging robots for material transport, and employing digital twins for personalized medicine.
**Safety Measures**: Creating handheld devices to alert humans of solar flares, with AI systems analyzing NASA data to predict solar events.
**Collaborative Design**: Using generative systems for spacecraft material design and AI models to gain insights from Earth data, in partnership with entities like IBM.
**Diverse Perspectives for Revolutionary Change**: Emphasizing the value of varied life experiences in fostering innovative problem-solving approaches.
**Effective LLM Agent consideration**
**Knowledge is Governance**: Recognizing that effective governance of AI begins with a thorough knowledge of its applications across the organization.
**Inventory of AI Use Cases**: Instructing teams to maintain an inventory of AI use cases to manage and govern AI responsibly.
**AI Risk Management**: Integrating AI risk management with other critical areas such as third-party collaborations and cybersecurity, acknowledging that AI does not operate in isolation.
**Monitoring Risks**: Advocating for robust support systems to monitor AI risks continuously.
**Inclusive Call to Action**: Encouraging a broad call to action to involve volunteers from across the business in the responsible AI process, tapping into diverse ideas and perspectives.
**Cross-Business Collaboration**: Highlighting responsible AI as an exercise that requires cross-business collaboration, involving legal, privacy, employment law, cybersecurity, procurement, and technology teams.
**Engaging Varied Stakeholders**: Suggesting the inclusion of sustainability teams and other varied stakeholders to ensure a comprehensive approach to responsible AI, emphasizing the importance of their involvement and stake in the process.
**Greener AI solution**:
**Green Data Centers**: Utilizing tools like the Electricity Map can help identify and prioritize the use of green data centers that are powered by renewable energy sources.
**Data Minimization**: Implementing policies to avoid data hoarding can reduce unnecessary energy consumption and storage costs.
**Scheduling ML Training**: Encouraging machine learning training during off-peak hours or when the data center is powered by cleaner energy sources can significantly reduce the carbon footprint.
**Task-Specific Models**: Opting for task-specific models can be more efficient and environmentally friendly compared to using large, generic models.
**Open Collaboration**: Fostering collaboration with the research community and open-source initiatives can lead to shared learning and more efficient design of ML solutions.
**Failure Logs**: Sharing failures and encouraging a failure log can help the community learn from mistakes and avoid repeating them, leading to more sustainable practices.
**Alignment with Corporate Strategy**: Ensuring that ML development aligns with the corporate sustainability strategy and quantifying the impact through KPIs can help in the wider adoption of green practices.

| balagmadhu |
1,887,237 | Terramate this SAP BTP! | Introduction Let us assume that we have a company that wants to standardize the setup of... | 26,908 | 2024-06-17T06:26:07 | https://dev.to/lechnerc77/terramate-this-sap-btp-5a8p | sap, btp, terraform, terramate | ## Introduction
Let us assume that we have a company that wants to standardize the setup of SAP BTP subaccounts.
The company decides to define a standard setup of SAP BTP subaccounts with the following constraints:
- A subaccount should be created under a directory
- Every subaccount should get assigned some basic entitlements for app development
- A Cloud Foundry environment should be created
- Emergency subaccount admins should be added
- The resources should be labelled according to the company’s standards
Of course, the company wants to leverage Terraform i.e. the [Terraform provider for SAP BTP]( https://registry.terraform.io/providers/SAP/btp/latest) for the setup. It uses a three-stage approach namely a subaccount for development, test, and production.
To get some inspiration, it checks the available samples for SAP BTP on [GitHub](https://github.com/SAP-samples/btp-terraform-samples). It finds a [sample](https://github.com/SAP-samples/btp-terraform-samples/tree/main/released/usecases/dev_test_prod_setup) that seems to fit the requirements. Maybe some adaptations are needed, but it is a good starting point.
The sample uses a Terraform module to set up a subaccount. It creates the three stages by iterating over a list containing the stages and executing the module, which results in the expected setup.
While the example is a good starting point to get ideas on how to do such a setup and what options you have with Terraform, it also shows some drawbacks that we should address:
- Although it makes sense to keep the setup logically together, all three stages are tightly coupled in one configuration. The blast radius is high, as all three stages might be affected by a single change.
- It is difficult to distinguish environments when it comes to e.g., introducing variants depending on the stage.
- Let us say we want to test a new version of the module or a new version of the Terraform provider on the development environment, this is difficult to achieve.
- One huge state gets created containing all three environments, and this is definitely too tight a coupling. It makes follow-up activities like drift detection hard to do.
Let us be fair, the code on GitHub is just a sample, but it shows that you need to design the setup properly to avoid these drawbacks.
The "usual" solution for this is to split the configuration into three, one for each stage. This makes things more self-contained, but now we have another huge drawback: how do we keep the configurations in sync? Do we copy&paste the configurations? This is not a good idea, as it is error-prone and not maintainable in the long run.
Terraform per se does not provide a perfect solution for this, but fortunately there is a huge ecosystem around Terraform that we can leverage. One tool that is worth a closer look is [Terramate](https://terramate.io/). This tool seems to address exactly the issues we are facing. So let us try it out.
## What is Terramate?
*Terramate*, according to its documentation, is a *productivity tool* for Terraform. Its value proposition is to fill the gaps mentioned before (and even more) that come into play especially with large/complex Terraform configurations, by helping you with automation, orchestration, and observation.
Now you might say: "oh no, not another tool" or at least "hopefully I do not need to learn/use another language in addition to the configuration language that comes with Terraform".
I think this is where the Terramate team made some very smart decisions:
- Terramate is acting *on top* of Terraform, so no restrictions concerning standard Terraform functionality. It enhances the features and functionalities of Terraform by covering the gaps. It is "just" another CLI you must install, but it doesn’t try to be a substitute for the Terraform CLI.
- It uses the Terraform configuration language to configure the additional features and functions that come with Terramate. No need to leave the realms of the language you are anyway using when working with Terraform.
Before diving into the application of Terramate we must do some homework around the concepts of Terramate.
> **Note** - There are a lot of features and functions that Terramate provides. We will not cover all of them in this blog post; we will focus on the features that help us address the issues we are facing with the current setup. If you want to get an overview or dig deeper into some topics, check the [Terramate documentation](https://terramate.io/docs/).
The main concept that will help us is Terramate's concept of *stacks*. Stacks put an additional "layer" on top of Terraform by making the configuration and state a well-defined unit that can be managed via Terramate based on the Terramate configuration. Using stacks, we can define stack-specific or stack-agnostic Terramate functionality, e.g. code generation, which helps us get rid of copy&paste activities.
Now you might say "great another layer equals more complexity ... as if the scenario wasn't complex enough". We will see soon that this is not the case. Of course, you must understand the additional concepts, but they fall in place quite well and help you to manage the complexity in a convenient way. Having said that, complex things remain complex (no matter what management tries to tell you), but Terramate helps you to manage the complexity in a more structured way.
Before repeating what is anyway available in the Terramate [documentation](https://terramate.io/docs/), let us make things tangible and rebuild the setup for the dev-test-prod scenario leveraging Terramate.
## Applying Terramate to the dev-test-prod scenario
The staged setup with dev, test and production is a perfect candidate for applying the stack concept. Let us start from scratch in an empty directory. First, we create the following directory structure to host the stacks on the file system:
```bash
| - stacks
| | - dev
| | - test
| | - prod
```
Next, we create the stack configuration for the three stages in each of the directories via the Terramate CLI. We key in the following commands:
```bash
terramate create --name "development" --description "Stack for BTP development setup" --tags "dev" stacks/dev
terramate create --name "testing" --description "Stack for BTP testing setup" --tags "tst" stacks/test
terramate create --name "production" --description "Stack for BTP production setup" --tags "prd" stacks/prod
```
As you can see, we gave each stack a name, a description, and a tag. I love the concept of tags in general, so great to have this feature in Terramate to flexibly categorize the stacks.
As a result, we find a `stack.tm.hcl` file in every directory. This file contains the stack-specific metadata. Here an example for the development stack:
```terraform
stack {
name = "development"
description = "Stack for BTP development setup"
tags = ["dev"]
id = "29300b60-f0bb-4dda-bb4a-f0320148b0ed"
}
```
As you can see the content reflects the parameters we defined. In addition, it contains a unique identifier for the stack.
After we have initialized the stacks, we want to put in the necessary Terraform configuration. We do not want to copy&paste the configuration back and forth, but we want to define things once and use the configuration in the different stacks considering the specific requirements for the stages.
To solve that, Terramate brings another feature to the table, namely [code generation](https://terramate.io/docs/cli/code-generation/). We will use this feature to generate the basic setup for the stacks. I am a big proponent of code generation instead of generic magic, so this feature is very appealing to me.
### Generating the Terraform configuration
Let us now create the stage/stack-specific Terraform configurations. We start with the provider configuration. The layout of the provider should be the same for all three stages. However, we want to allow some flexibility when it comes to the development stage to test new versions of the Terraform CLI as well as new versions of the Terraform provider.
To achieve that we define [global variables](https://terramate.io/docs/cli/reference/variables/globals) that can be accessed in the code generation process. We put these variables in the `configs.tm.hcl` file in the root directory:
```terraform
// Configure default Terraform version and default providers
globals "terraform" {
version = ">= 1.6.0"
version_dev = ">= 1.8.0"
}
globals "terraform" "providers" "btp" {
version = "~> 1.4.0"
version_dev = "~> 1.4.0"
}
globals "terraform" "modules" "btp_subaccount_module" {
source = "github.com/btp-automation-scenarios/btp-subaccount-module?ref=67cb61948e19497377fb4e23f01dd301319c6907"
}
```
Here we specify several version constraints as well as the source of the module on GitHub.
Next, we create a directory called `templates` where we will put in the code templates for the code generation. We add a file called `generate_provider.tm.hcl` which serves as template for the code generation resulting in a provider configuration. It contains the following content:
```terraform
generate_hcl "_terramate_generated_provider.tf" {
content {
terraform {
required_version = tm_ternary(tm_contains(terramate.stack.tags, "dev"), global.terraform.version_dev, global.terraform.version)
required_providers {
btp = {
source = "SAP/btp"
version = tm_ternary(tm_contains(terramate.stack.tags, "dev"), global.terraform.providers.btp.version_dev, global.terraform.providers.btp.version)
}
}
}
provider "btp" {
globalaccount = var.globalaccount
}
}
}
```
The syntax is good to understand:
- The [`generate_hcl` block](https://terramate.io/docs/cli/code-generation/generate-hcl) defines the contents of the configuration to be generated.
- The block label `"_terramate_generated_provider.tf"` defines the file name of the generated file.
- The `content` block contains the actual content of the file. Here we define the provider configuration. As you can see, we make use of some Terramate-specific functions to assign the version of the Terraform CLI and the Terraform provider depending on the stage, with the help of the global variables and the stack-specific information. We check if the stack is tagged as "dev" using the [`tm_contains`](https://terramate.io/docs/cli/reference/functions/tm_contains) function together with the information available via the `terramate.stack` object. Depending on the result, we decide which version to use, leveraging the [`tm_ternary`](https://terramate.io/docs/cli/reference/functions/tm_ternary) function.
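To make the ternary concrete: for the stack tagged `dev`, with the globals defined earlier, the generated file should look roughly like this (a sketch of the expected result, not copied from an actual run):

```terraform
# stacks/dev/_terramate_generated_provider.tf (expected shape for the "dev" stack)
terraform {
  # "dev" is in terramate.stack.tags, so the dev constraint wins
  required_version = ">= 1.8.0"

  required_providers {
    btp = {
      source = "SAP/btp"
      # version_dev and version happen to be identical in our globals
      version = "~> 1.4.0"
    }
  }
}

provider "btp" {
  globalaccount = var.globalaccount
}
```

For the test and production stacks, the only difference is `required_version = ">= 1.6.0"`.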
One last piece is missing for the generation: Terramate needs to know which templates to use. We define this in the `imports.tm.hcl` file at root level. The file has the following content:
```terraform
# Import helper files
import {
source = "./templates/generate_provider.tm.hcl"
}
```
Okay, ready to go? Let us kick off our first code generation via:
```bash
terramate generate
```
We see the following output:

And indeed, every stack now contains a generated `_terramate_generated_provider.tf` file, matching the name we defined:

Next stop: the variables. We want to make the call of the setup as easy as possible, so we will default the values where possible i.e., we will only leave the `globalaccount` variable open for the user to provide. We will also make use of the stack specific information to:
- Assign the architect role to the emergency admins in the development stage, otherwise the organizationally assigned admins
- Assign the entitlement `free` for `HANA Cloud` in the development stage, otherwise the entitlement `hana`.
To achieve this, we create a new template file for the variables in the `templates` directory called `generate_variables.tm.hcl`. We put the following code into the file:
```terraform
generate_hcl "_terramate_generated_variables.tf" {
content {
variable "globalaccount" {
type = string
description = "The globalaccount subdomain where the sub account shall be created."
}
variable "region" {
type = string
description = "The region where the account shall be created in."
default = "us10"
}
variable "unit" {
type = string
description = "Defines to which organisation the sub account shall belong to."
default = "Sales"
}
variable "unit_shortname" {
type = string
description = "Short name for the organisation the sub account shall belong to."
default = "sls"
}
variable "architect" {
type = string
description = "Defines the email address of the architect for the subaccount"
default = "genius.architect@test.com"
}
variable "costcenter" {
type = string
description = "Defines the costcenter for the subaccount"
default = "1234509874"
}
variable "owner" {
type = string
description = "Defines the owner of the subaccount"
default = "someowner@test.com"
}
variable "team" {
type = string
description = "Defines the team of the sub account"
default = "awesome_dev_team@test.com"
}
variable "emergency_admins" {
type = list(string)
description = "Defines the colleagues who are added to each subaccount as emergency administrators."
default = tm_ternary(tm_contains(terramate.stack.tags, "dev"), ["somearchitect@test.com"], ["jane.doe@test.com", "john.doe@test.com"])
}
variable "entitlements" {
description = "List of entitlements for a BTP subaccount"
type = list(object({
group = string
type = string
name = string
plan = string
amount = number
}))
default = [
{
group = "Audit + Application Log"
type = "service"
name = "auditlog-viewer"
plan = "free"
amount = null
},
{
group = "Alert"
type = "service"
name = "alert-notification"
plan = "standard"
amount = null
},
{
group = "SAP HANA Cloud"
type = "service"
name = "hana-cloud"
plan = tm_ternary(tm_contains(terramate.stack.tags, "dev"), "free", "hana")
amount = null
},
{
group = "SAP HANA Cloud"
type = "service"
name = "hana"
plan = "hdi-shared"
amount = null
}
]
}
}
}
```
We add the template to the `imports.tm.hcl` file that now contains two blocks:
```terraform
# Import helper files
import {
source = "./templates/generate_provider.tm.hcl"
}
import {
source = "./templates/generate_variables.tm.hcl"
}
```
Good to go for a second round of code generation:
```bash
terramate generate
```
The output shows that the variables are generated for every stack:

And we also find the generated files in the stacks:

But hold on a second: the output also shows that only the delta is generated. Files that do not need to be touched are not changed even if we re-run the `terramate generate` command. I like that!
Last but not least we must generate the main configuration file. The only requirement we have is that the `usage` attribute of the subaccount is set to `USED_FOR_PRODUCTION` in the testing and production stage, while the development environment should be set to `NOT_USED_FOR_PRODUCTION`. In addition, we want to use the label of the stack to feed into the `stage` parameter of the module that creates the subaccount.
We know the drill by now and create a new template called `generate_main.tm.hcl` in the `templates` directory:
```terraform
generate_hcl "_terramate_generated_main.tf" {
lets {
stage = tm_upper(tm_element(terramate.stack.tags, 0))
}
content {
# ------------------------------------------------------------------------------------------------------
# Creation of directory
# ------------------------------------------------------------------------------------------------------
resource "btp_directory" "parent" {
name = "${var.unit}-${terramate.stack.name}"
description = "This is the parent directory for ${var.unit} - ${terramate.stack.name}."
labels = { "architect" : ["${var.architect}"], "costcenter" : ["${var.costcenter}"], "owner" : ["${var.owner}"], "team" : ["${var.team}"] }
}
# ------------------------------------------------------------------------------------------------------
# Call module for creating subaccount
# ------------------------------------------------------------------------------------------------------
module "project_setup" {
source = "${global.terraform.modules.btp_subaccount_module.source}"
stage = "${let.stage}"
region = var.region
unit = var.unit
unit_shortname = var.unit_shortname
architect = var.architect
costcenter = var.costcenter
owner = var.owner
team = var.team
emergency_admins = var.emergency_admins
parent_directory_id = btp_directory.parent.id
usage = tm_ternary(tm_contains(terramate.stack.tags, "dev"), "NOT_USED_FOR_PRODUCTION", "USED_FOR_PRODUCTION")
}
}
}
```
The source of the module is defined via the global variable we defined before, and the usage is set depending on the stack label using the known Terramate functions. For the sake of demoing the capabilities we introduced local variables via a [`lets` block](https://terramate.io/docs/cli/reference/variables/lets) to define the `stage` variable. We make use of two further Terramate functions to get the first element of the stack tags ([`tm_element`](https://terramate.io/docs/cli/reference/functions/tm_element)) and to convert it to upper case ([`tm_upper`](https://terramate.io/docs/cli/reference/functions/tm_upper)).
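To make the evaluation tangible, here is how these expressions resolve for the stack tagged `dev` (a sketch based on the template above, not generated output):

```terraform
# For terramate.stack.tags = ["dev"]:
#   tm_element(terramate.stack.tags, 0)       -> "dev"
#   tm_upper("dev")                           -> "DEV"   => let.stage = "DEV"
#   tm_contains(terramate.stack.tags, "dev")  -> true
# So the module call generated into stacks/dev contains:
module "project_setup" {
  # ... other arguments as in the template ...
  stage = "DEV"
  usage = "NOT_USED_FOR_PRODUCTION"
}
```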
We add the import to the `imports.tm.hcl` file which now finally looks like this:
```terraform
# Import helper files
import {
source = "./templates/generate_provider.tm.hcl"
}
import {
source = "./templates/generate_variables.tm.hcl"
}
import {
source = "./templates/generate_main.tm.hcl"
}
```
After the execution of the generation command, we see the `main.tf` files with the expected content in the stacks:

Briefly summarizing what we did until now:
- We created Terramate stacks with some metadata representing the stages of the setup
- We used the Terramate code generation feature to generate the provider configuration, the variables, and the main configuration for the Terraform setup including stack specific adjustments. As a consequence, we did not need to repeat ourselves or do some copy&paste exercises. In addition, we can now easily adapt the configuration in the future and regenerate the files.
Now time to get some infrastructure set up!
### Using Terramate to run Terraform commands
As we have a complete Terraform configuration in the single directories, we could hop into each of those and execute the Terraform commands there. But can't we do better with Terramate? Yes, we can.
Terramate has the [`terramate run` command](https://terramate.io/docs/cli/reference/cmdline/run) that allows you to execute *any* command in the stacks. Let us try out some things.
First, we want to initialize all stacks. We do so by executing the following command:
```bash
terramate run terraform init
```
In the console we see that the Terraform command gets executed per stack:

We recognize that the usual suspects appear in the file system:

Initialization successful. Next, we might want to plan the setup. We do so by executing the following command:
```bash
terramate run terraform plan -var globalaccount=<YOUR GLOBAL ACCOUNT>
```
The output shows that the plan is executed for all three stacks in sequence. It is a bit lengthy, so I will not provide a screenshot here. We will, however, see a different number of resources to be created, as we have only one emergency administrator for the development environment, while we defined two for test and production. Things work as expected.
We can also check the execution sequence via:
```bash
terramate list --run-order
```
The output looks like this; as we have no dependencies, all stacks are on the same level, ordered by their sequence in the file system:

The execution of the commands can also be done in parallel. Terramate parallelizes the execution if the stacks are independent from each other; if there are dependencies between the stacks, Terramate executes them in the correct order:
```bash
terramate run --parallel=3 terraform plan -var globalaccount=<YOUR GLOBAL ACCOUNT>
```
This is then reflected in the output of the command:

We can also restrict the execution to a specific stack e.g., via the path:
```bash
terramate run --chdir=stacks/dev terraform plan -var globalaccount=<YOUR GLOBAL ACCOUNT>
```
or even more convenient using tags:
```bash
terramate run --tags dev terraform plan -var globalaccount=<YOUR GLOBAL ACCOUNT>
```
The output reflects the filtering:

You can of course also execute all the other commands and apply your setup. But I think the value of executing the commands via Terramate is clear now.
> **Note** - You may stumble across Terramate's built-in safeguards that prevent executions on uncommitted changes when working on your machine. You can switch off all safeguards by adding the `-X` flag to the `terramate run` commands. It certainly makes sense, though, to check the [documentation](https://terramate.io/docs/cli/reference/cmdline/run#options) for the effects before doing so.
## Where to find the code
You find the code including a copy of the original monolithic setup on GitHub: <https://github.com/btp-automation-scenarios/btp-terramate>.
## Conclusion and Outlook
When dealing with more complex Terraform setups that usually arise quickly, Terramate is a very valuable tool that you should take a look at. It helps you to manage the complexity in a structured way and comes with useful features to make your life in the Terraform world easier.
Rewriting the dev-test-prod setup shows where Terramate shines and how it helps us tackle the complexity that comes with such setups. The two main ingredients for achieving this are the concept of stacks and the code generation feature.
However, we left out one important aspect: remote backends. Here again the stack-based approach comes in handy, as you can see in this sample provided by the Terramate team that showcases how to generate stack-specific backend configurations: <https://github.com/terramate-io/terramate-examples/blob/main/01-keep-terraform-dry/imports/generate_backend.tm.hcl>.
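As a rough idea of what such a template could look like (a hedged sketch with a hypothetical S3 bucket and region, not taken from the linked sample), the stack id can be used to key one state file per stack:

```terraform
# templates/generate_backend.tm.hcl - bucket and region are placeholder assumptions
generate_hcl "_terramate_generated_backend.tf" {
  content {
    terraform {
      backend "s3" {
        bucket = "my-terraform-states"                            # assumption: an existing bucket
        key    = "stacks/${terramate.stack.id}/terraform.tfstate" # one state per stack
        region = "eu-central-1"                                   # assumption
      }
    }
  }
}
```

Because `terramate.stack.id` is unique per stack, each stack ends up with its own isolated remote state.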
There is so much more to explore in Terramate and in the scenarios in which it can support your setup. Looking at the documentation, I have just scratched the surface and need to spend a bit more time to understand all the details. But I am convinced it is worth it.
With that happy Terraforming with Terramate!
> **Note** - I focused on Terraform in this blog post, but you can of course also use [OpenTofu](https://opentofu.org/). There is no barrier built into Terramate that would prevent you from doing so.
| lechnerc77 |
1,890,863 | Introducing Ronin Network: The Blockchain for Gaming | What is Ronin Network? Ronin Network is an EVM (Ethereum Virtual Machine)-compatible blockchain... | 0 | 2024-06-17T06:25:27 | https://dev.to/footprint-analytics/introducing-ronin-network-the-blockchain-for-gaming-2pc5 | blockchain | <img src="https://statichk.footprint.network/article/7ef3a8a0-c521-4d90-a7dd-c2eea66e712c.jpeg">
<h2><span style="font-size:13.999999999999998pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">What is Ronin Network?</span></h2><a href="https://roninchain.com/"><span style="font-size:12pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Ronin Network</span></a><span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span"> is an EVM (Ethereum Virtual Machine)-compatible blockchain forged for gaming. It powers two of Web3’s top games: Axie Infinity and Pixels. Ronin aims to become the full-stack infrastructure that enables the success of blockchain games.</span>
<h2><span style="font-size:13.999999999999998pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Key Takeaways</span></h2><ul><li><span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Ronin is an EVM-compatible Layer 1 blockchain forged for gaming. The Ronin team, Sky Mavis, is one of the few teams with real experience in driving mass adoption. </span></li><li><span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Initially utilizing a Proof of Authority (PoA) consensus mechanism, Ronin Network later transitioned to Delegated Proof of Stake (DPoS) to improve security and increase decentralization.</span></li><li><span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Ronin Network provides several advantage</span><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">s to studios and developers within its ecosystem: specialized gaming infrastructure, active on-chain community, an</span><span 
style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">d access to proven publishing expertise from Sky Mavis. </span></li><li><span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Through strategic upgrades, recovery efforts, and expanding its game titles, Ronin has developed a diverse ecosystem and established itself as a leading blockchain platform for gaming.</span></li></ul>
<h2><span style="font-size:13.999999999999998pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Introduction to Ronin Network</span></h2><span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Ronin Network is a cutting-edge EVM-compatible Layer 1 blockchain, crafted specifically for the Web3 gaming world. Developed by Sky Mavis, creators of the groundbreaking Web3 game Axie Infinity, Ronin is built to support games where players can truly own and trade their in-game assets. </span>
<img src="https://statichk.footprint.network/article/85a03753-98a3-44d2-9ad8-69bee501e9e3.png"><em><span style="font-size:9pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Ronin Network</span></em>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">The network first made its public debut with the release of its testnet on December 23, 2020, followed by the launch of its mainnet on February 1, 2021. These key developments were part of Sky Mavis' broader strategy to evolve beyond a single game, aiming to build a diverse ecosystem of multiple gaming studios on Ronin. This ambitious plan is progressively bearing fruit, transforming Ronin into one of the foundational platforms for blockchain gaming.</span>
<h3><span style="font-size:13.999999999999998pt;font-family:Arial,sans-serif;color:#434343;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">The Team behind Ronin Network: Sky Mavis</span></h3>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Sky Mavis, the company behind Ronin Network, aims to give economic freedom to people, starting with gamers. Known for creating Axie Infinity, Sky Mavis has established itself not only as a leading force in blockchain gaming but also as one of the few teams with real experience driving mass adoption. </span>
<img src="https://statichk.footprint.network/article/00210c9c-46d5-4e80-a3d1-9e6b127c2aae.png"><em><span style="font-size:9pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Sky Mavis</span></em>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">The company offers essential tools for game developers to build and expand their blockchain games. These tools include Mavis Hub, a launchpad for promising blockchain games; Mavis Market, the first generalized NFT marketplace on Ronin; and Mavis Account, which connects users to all of Sky Mavis' products.</span>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Based in Vietnam and Singapore, Sky Mavis boasts a diverse and dynamic team of over 250 employees worldwide. </span>
<h3><span style="font-size:13.999999999999998pt;font-family:Arial,sans-serif;color:#434343;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Fundraising by Ronin Network</span></h3>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Sky Mavis has secured $172 million across four successful funding rounds. Initially, the focus was primarily on advancing the development of Axie Infinity during the first two rounds. However, with the Series B funding, a strategic shift occurred, directing resources towards the growth and enhancement of both Axie Infinity and the newly launched Ronin Network. Notable investors in these rounds included a16z Crypto, Animoca Brands, Paradigm, and Accel.</span>
<img src="https://statichk.footprint.network/article/bd029b75-02e9-4240-85b8-35738ca34752.png"><em><span style="font-size:9pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Fundraising by Sky Mavis</span></em>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">In April 2022, after an unexpected security breach drained 173,600 ETH and 25.5M USDC from the Ronin Bridge, Sky Mavis announced a $150 million funding round, which was secured from a group of investors within a week of the incident. However, because Sky Mavis was able to refill the Ronin Bridge and fully reimburse all affected users from its own balance sheet, the company scaled back the planned $150 million round and ultimately raised $11 million, led by a16z and Animoca Brands.</span>
<h2><span style="font-size:13.999999999999998pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">How Ronin Network Works</span></h2><span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Ronin Network initially operated on a Proof of Authority (PoA) consensus mechanism, which involved a select group of validators responsible for maintaining the network and verifying transactions. This method offers several advantages, including lower energy consumption compared to Proof of Work (PoW) systems, as it does not require intensive computational tasks to validate transactions. PoA also supports faster transaction speeds and reduced costs, as it allows for quicker block validation. Despite these benefits, PoA has faced criticism for its lack of decentralization compared to other consensus mechanisms.</span>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">To enhance Ronin's decentralization, Sky Mavis introduced an update incorporating Delegated Proof of Stake (DPoS) into the validator selection process. This change allows any Ronin token (RON) holder who meets a certain threshold to become a validator, widening access and significantly boosting the network’s decentralization. While transitioning to DPoS, Ronin retains the key benefits of PoA, such as swift transaction speeds and lower fees. Under DPoS, token holders delegate their stakes to elect validators, and to ensure the integrity of this process, slashing rules are applied to penalize any validators found to be acting maliciously.</span>
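The delegation-and-slashing flow described above can be sketched in a few lines. This is an illustrative model only, not Ronin's actual implementation: the `Candidate` type, the election helper, and the slashing fraction used in the example are all assumptions made for the sketch.

```python
# Illustrative DPoS sketch: stake-weighted validator election with slashing.
# Not Ronin's actual code; units and thresholds are hypothetical.
from dataclasses import dataclass, field


@dataclass
class Candidate:
    address: str
    self_stake: int                                   # candidate's own bonded tokens
    delegated: dict = field(default_factory=dict)     # delegator address -> stake

    @property
    def total_stake(self) -> int:
        return self.self_stake + sum(self.delegated.values())


def delegate(candidate: Candidate, delegator: str, amount: int) -> None:
    """Token holders delegate stake to a candidate they trust."""
    candidate.delegated[delegator] = candidate.delegated.get(delegator, 0) + amount


def elect_validators(candidates: list, seats: int, min_stake: int) -> list:
    """Elect the top-N candidates by total stake, subject to a minimum threshold."""
    eligible = [c for c in candidates if c.total_stake >= min_stake]
    return sorted(eligible, key=lambda c: c.total_stake, reverse=True)[:seats]


def slash(candidate: Candidate, fraction: float) -> int:
    """Penalize a misbehaving validator by burning part of its self-stake."""
    penalty = int(candidate.self_stake * fraction)
    candidate.self_stake -= penalty
    return penalty
```

For example, a candidate with 50 self-staked tokens and 200 delegated tokens (total 250) outranks a candidate holding 100 alone, which is exactly how delegation widens access to validator seats.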
<h2><span style="font-size:13.999999999999998pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Benefits of Ronin Network</span></h2><span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Ronin Network provides several advantages to studios and developers within its ecosystem.</span>
<h3><span style="font-size:13.999999999999998pt;font-family:Arial,sans-serif;color:#434343;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Specialized Gaming Infrastructure</span></h3>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Ronin Network offers a tailored gaming infrastructure specifically designed to meet the unique demands of the gaming industry. This platform ensures seamless integration and exceptional performance, enabling developers to easily deploy and manage their games. With Ronin's infrastructure, transactions are processed nearly instantly at minimal costs—typically less than $0.001 per transaction. Additionally, its compatibility with the EVM simplifies the game development process, allowing for smooth implementation. Ronin also provides developers with an integrated suite of tools through the Ronin and Mavis products. The upgrade to DPoS has enhanced economic security and improved decentralization, all while maintaining rapid transaction speeds and low fees. </span>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Moreover, Ronin's infrastructure supports easy user onboarding. Ronin introduced sponsored transactions, allowing one address to make transactions that are paid for by another. This feature enables game studios to sponsor transaction fees for their players, significantly easing the onboarding process for new users unfamiliar with Web3 gaming.</span>
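The sponsored-transaction idea can be modeled with a simple account-balance sketch: the fee is debited from a sponsor's balance instead of the sender's. This is an assumption-laden illustration, not Ronin's actual API.

```python
# Minimal model of a sponsored transaction (illustrative; not Ronin's real API):
# when a sponsor is given, the fee is charged to the sponsor, not the sender.

def apply_transaction(balances: dict, sender: str, recipient: str,
                      value: int, fee: int, sponsor=None) -> None:
    """Move `value` from sender to recipient; charge `fee` to sponsor if given."""
    fee_payer = sponsor if sponsor is not None else sender
    if balances.get(sender, 0) < value:
        raise ValueError("sender cannot cover value")
    # If sender also pays the fee, it must cover value + fee together.
    if balances.get(fee_payer, 0) - (value if fee_payer == sender else 0) < fee:
        raise ValueError("fee payer cannot cover fee")
    balances[sender] -= value
    balances[recipient] = balances.get(recipient, 0) + value
    balances[fee_payer] -= fee
```

A new player holding exactly 10 tokens can spend all 10 in-game while a studio account absorbs the fee, which is the onboarding benefit described above: the player never needs gas of their own.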
<h3><span style="font-size:13.999999999999998pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Active On-chain Community</span></h3>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Ronin is home to one of the largest and most active Web3 gaming communities in the world. According to Footprint Analytics, average daily active users (DAU) on Ronin jumped from 16.97K in October 2023 to over 1.37 million in May 2024 following Pixels' integration, an astonishing 80-fold increase. The community is not only large but also highly engaged, with members regularly participating in gameplay and referrals, establishing Ronin as a premier destination for active on-chain users.</span>
<img src="https://statichk.footprint.network/article/cbdcb948-2182-430c-b94a-0d4015cf38d6.png"><em><span style="font-size:9pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Source:</span> <a href="https://www.footprint.network/research/chain/chain-stats/ronin-overview?chain=Ronin&amp;amp;series_date-85532=2023-07-01~2024-05-20&amp;amp;filter_the_data_of_recent_days_below_to_exclude_the_current_day-85523=7"><span style="font-size:9pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Ronin Network Daily Active Users</span></a></em>
<h3><span style="font-size:13.999999999999998pt;font-family:Arial,sans-serif;color:#434343;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Proven Publishing Expertise</span></h3>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Sky Mavis’ publishing expertise is a key advantage, with a team that includes seasoned industry professionals and dedicated Web3 developers. This team has a deep understanding of how to create and promote Web3 games effectively, ensuring that these games not only meet the highest standards of quality but also resonate with players. Moreover, Sky Mavis provides the Mavis Suite and various tooling for Ronin, which together form a best-in-class development kit designed specifically for Web3 gaming. These products are not only seamlessly integrated with each other but are also developed in close collaboration with studios within the Ronin ecosystem, facilitating continuous improvement and innovation. </span>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Capitalizing on these benefits, games and NFT projects that migrate or launch on Ronin often experience significant growth — a phenomenon known within the ecosystem as the "</span><a href="https://roninchain.com/blog/posts/the-ronin-effect"><span style="font-size:12pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Ronin Effect</span></a><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">". We will explore several examples of this effect in the next section of this article.</span>
<h2><span style="font-size:13.999999999999998pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">The Ronin Network Ecosystem</span></h2><span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Let's explore how Ronin transitioned from primarily supporting Axie Infinity to becoming a thriving gaming chain with a diverse ecosystem.</span>
<h3><span style="font-size:13.999999999999998pt;font-family:Arial,sans-serif;color:#434343;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Games</span></h3>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Axie Infinity</span><span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span"> is a vibrant virtual world where players interact with creatures called Axies, which can be battled, bred, and collected. Axies are also a source of resources within the game. </span>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Initially, Axie Infinity thrived on Ethereum's blockchain, pioneering the concept of play-to-earn economies in the gaming space. However, as Ethereum's architecture struggled to keep pace with the demands of high-speed and cost-efficient gaming transactions, Axie faced significant challenges in user retention and sustaining growth throughout 2021. </span>
<img src="https://statichk.footprint.network/article/1b9c0471-479c-4951-90ef-228ad1921125.png"><em><span style="font-size:9pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Axie Infinity</span></em>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Sky Mavis recognized the urgent need for a scalable solution to address these limitations. Axie Infinity initially migrated to Loom Network to escape Ethereum's crippling fees and network congestion, but its rapid growth quickly outpaced the scalability Loom could provide. This led Sky Mavis to embark on a bold endeavor: building its own chain. In February 2021, Sky Mavis unveiled the Ronin Network. Unlike Ethereum and Loom, Ronin was built specifically for gaming. </span>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Leveraging Ronin's specialized gaming infrastructure and Sky Mavis' strategic emphasis on community building, user engagement, and retention, along with its expertise in content marketing, go-to-market (GTM) strategy, and in-game economic design, Axie Infinity experienced a significant surge in growth. This growth was further amplified by favorable timing, coinciding with increased gaming activity during the pandemic, when people spent more time playing games.</span>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">According to data from </span><a href="https://www.footprint.network/research/gamefi/game-protocols/single-game-stats?series_date-79426=past90days&amp;amp;game_name=Axie%20Infinity"><span style="font-size:12pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Footprint Analytics</span></a><span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">, Axie Infinity achieved a significant milestone by surpassing 1 million users within seven months of the Ronin mainnet launch. Just 10 months later, this figure skyrocketed to an impressive 9 million users. </span>
<img src="https://statichk.footprint.network/article/7663be54-ac9d-484e-a4f1-d054061a468b.png"><em><span style="font-size:9pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Source:</span> <a href="https://www.footprint.network/research/gamefi/game-protocols/single-game-stats?series_date-79426=past90days&amp;amp;game_name=Axie%20Infinity"><span style="font-size:9pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Total Users of Axie Infinity on Ronin and Ethereum</span></a></em>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Following the security breach in March 2022, Ronin shifted to a DPoS consensus mechanism to bolster network security and further its decentralization, subsequently achieving a steady recovery; Sky Mavis now operates just one of the network's 22 validators. This shift paved the way for Ronin to expand its gaming ecosystem beyond Axie Infinity, beginning with the introduction of several new games in March 2023, including </span><span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">The Machines Arena</span><span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">, </span><span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Tribesters, and Axie Champions</span><span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">. Later in the year, Ronin further diversified its portfolio by adding </span><span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Wild Forest</span><span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">, </span><span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Pixels</span><span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">, </span><span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Apeiron</span><span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">, and </span><span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Fight League</span><span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">.</span>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Pixels</span><span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span"> is a Web3 farming game that transforms the traditional farming game experience, allowing players to cultivate resources, prepare food, and trade goods. Since migrating to Ronin, Pixels has seen remarkable growth, with DAU exceeding 100K in the first month and reaching over 1 million by the seventh month. </span><img src="https://statichk.footprint.network/article/078b7ecb-88f4-44d9-994a-85f093247b07.png"><em><span style="font-size:9pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Source:</span> <a href="https://www.footprint.network/research/gamefi/game-protocols/single-game-stats?series_date-79426=past90days&amp;amp;game_name=Pixels"><span style="font-size:9pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Pixels Daily Active Users</span></a></em>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">This growth has not only significantly expanded Ronin's user base and transaction volumes but also enriched the ecosystem with its focus on user-generated content and peer-to-peer resource trading. </span>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Additionally, Ronin's ecosystem is expanding with the launch of new products and services in areas such as NFTs and DeFi, further enriching its diverse offerings.</span>
<h3><span style="font-size:13.999999999999998pt;font-family:Arial,sans-serif;color:#434343;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">NFTs</span></h3>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Ronin supports several NFT dApps, including the NFT marketplace Mavis Market, the Axie Infinity Marketplace, CyberKongz Genkai NFT, and NFTBank. </span>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Mavis Market, as Ronin’s generalized NFT marketplace, upholds creator royalties, fostering an economic model that benefits creators, their communities, and the Ronin network alike. This balance is achieved through a whitelist access requirement for contract deployment on Ronin, which curbs the rise of unauthorized marketplaces and ensures that creators receive their due royalties.</span>
<img src="https://statichk.footprint.network/article/c2e46987-4abf-4114-b0f2-6a9dae01d944.png"><em><span style="font-size:9pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Source:</span> <a href="https://roninchain.com/blog/posts/the-ronin-effect"><span style="font-size:9pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Mavis Market's Fee Structure</span></a></em>
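Because royalties are enforced at the marketplace level, the proceeds of a sale split deterministically among the creator, the platform, and the seller. A minimal sketch of that split, using hypothetical rates (the percentages below are illustrative only, not Mavis Market's actual fee structure, which is shown in the chart above):

```python
# Hypothetical fee split for an NFT sale on a royalty-enforcing marketplace.
# Rates are expressed in basis points (1 bps = 0.01%) and are illustrative only.

def split_sale(price: int, creator_royalty_bps: int, platform_fee_bps: int) -> dict:
    """Split a sale price (in smallest token units) among creator, platform, seller."""
    royalty = price * creator_royalty_bps // 10_000
    platform = price * platform_fee_bps // 10_000
    return {
        "creator": royalty,
        "platform": platform,
        "seller": price - royalty - platform,   # remainder goes to the seller
    }
```

For instance, with an assumed 5% creator royalty and 2.5% platform fee, a 1,000,000-unit sale pays the creator 50,000 and the platform 25,000, leaving 925,000 for the seller.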
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">An example of NFT success in the Ronin ecosystem is the multiplayer action RPG Kaidro, which migrated to Ronin in February 2024. Its first primary NFT mint on Ronin, in April 2024, met with overwhelming success, selling out all 9,999 Spark Suit NFTs almost instantly. Over 2,500 of these NFTs were snapped up in a public sale in less than 30 seconds, generating a total of $2 million in primary sales.</span>
<h3><span style="font-size:13.999999999999998pt;font-family:Arial,sans-serif;color:#434343;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">DeFi</span></h3>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">The Ronin network hosts several DeFi protocols, including its native decentralized exchange (DEX), Katana, as well as Ronin Bridge, Ronin Staking, the lending protocol MetaLend, and the payment solution Sablier. Katana stands out as the largest dApp on Ronin, commanding more than 90% of the total value locked (TVL) on the platform.</span>
<h3><span style="font-size:13.999999999999998pt;font-family:Arial,sans-serif;color:#434343;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Wallet</span></h3>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Ronin Wallet serves as the primary gateway for millions of gamers, offering browser extensions and mobile apps for easy access. With Ronin Wallet, users can track individual token activity histories and view NFT details. This wallet leverages Social Login features and Multi-Party Computation (MPC) technology to enhance security and manage ERC-20 tokens on Ethereum and other supported chains.</span>
<h2><span style="font-size:13.999999999999998pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">The RON Token</span></h2><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">The RON token is the native ecosystem token of Ronin Network. It serves multiple functions such as paying transaction fees, staking to secure the network and earn rewards, and pricing assets in NFT marketplaces. Additionally, RON is widely utilized as the currency for gaming transactions, including purchases of game items, NFT mints, and token sales.</span>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">The distribution of RON tokens is allocated as follows: 25% for rewards, 30% for community incentives, 30% to Sky Mavis, and 15% to the ecosystem fund. The total maximum supply of RON is capped at 1 billion tokens, with the potential for full distribution over 108 months.</span>
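The stated allocation can be sanity-checked with a few lines of arithmetic (the bucket names below are illustrative labels for the shares listed above):

```python
# Quick check of the RON allocation stated above: four shares summing to 100%
# of a 1 billion token maximum supply, distributable over up to 108 months.
MAX_SUPPLY = 1_000_000_000

allocation_pct = {
    "rewards": 25,
    "community_incentives": 30,
    "sky_mavis": 30,
    "ecosystem_fund": 15,
}
assert sum(allocation_pct.values()) == 100  # shares cover the full supply

tokens = {name: MAX_SUPPLY * pct // 100 for name, pct in allocation_pct.items()}
# Sky Mavis's 30% share corresponds to 300 million RON.

# If the full supply unlocked evenly over 108 months, that would be roughly
# 9.26 million RON per month; actual unlocks follow the published schedule.
avg_monthly_unlock = MAX_SUPPLY / 108
```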
<img src="https://statichk.footprint.network/article/98bad4d8-35a6-4fc4-bd3c-a6f59d175fce.png"><em><span style="font-size:9pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Source:</span> <a href="https://docs.roninchain.com/basics/tokenomics"><span style="font-size:9pt;font-family:Arial,sans-serif;color:#0000ff;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">RON Unlock Schedule</span></a></em>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">RON holders and stakers enjoy exclusive benefits, such as airdrops, whitelist opportunities, and launchpad allocations for token launches. For instance, Pixels launched its PIXEL token in February 2024 and announced a 20 million PIXEL airdrop for those staking RON, rewarding Ronin’s supporters.</span>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">From October 2023 to May 2024, the price of RON rose from $0.4 to over $3.0, peaking at $4.4 in March 2024, as reported by </span><a href="https://www.coingecko.com/en/coins/ronin"><span style="font-size:12pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">CoinGecko</span></a><span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">.</span>
<h2><span style="font-size:13.999999999999998pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">What is Next for the Ronin Network?</span></h2><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:italic;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">“Ronin’s curated approach to game publishing has seen considerable success in Pixels, Apeiron, and Kaidro. We are excited to accelerate the onboarding of great games and replicate these successful launches,”</span><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span"> said </span><a href="https://x.com/bottomd0g"><span style="font-size:12pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Bailey Tan</span></a><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">, Head of Ronin Ecosystem. </span>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Looking ahead, Sky Mavis continues to prioritize network enhancement as part of its strategic roadmap. The team consistently focuses on addressing the “Blockchain Trilemma”: balancing security, scalability, and decentralization.</span>
<img src="https://statichk.footprint.network/article/dea1bb33-9f33-4ebb-83c5-3272ec394627.png"><em><span style="font-size:9pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Source: Ronin Slides at AxieCon 2022</span></em>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Ronin is actively exploring different scaling strategies, including both monolithic and modular approaches. For the monolithic method, a key solution under consideration is parallel transaction execution. In terms of modular solutions, Ronin is particularly focused on Zero-Knowledge (ZK) rollups. After evaluating the advantages and disadvantages of optimistic and ZK rollups, the team has determined that ZK rollups are more suitable for Ronin. This is because ZK rollups verify transactions more quickly and settle them sooner, making them a better choice for a gaming blockchain.</span>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Bailey Tan also shared some future plans for the Ronin Network.</span>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:italic;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">"We are looking into ZK rollup technology, specifically the zkEVM variant, to future proof our network in terms of scalability. We are also leveling up our Mavis Suite products to reduce development time and friction in user onboarding, and improve monetization for Web3 games. Our MPC embedded wallet technology sits at the center of these efforts, and is in a pilot phase with some games on Ronin."</span>
<h2><span style="font-size:13.999999999999998pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Final Thoughts</span></h2><span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">As an EVM-compatible Layer 1 blockchain, Ronin is specifically tailored to optimize the gaming experience, providing an infrastructure that supports seamless integration and rapid transaction processing. As it continues to evolve with enhanced security measures and an expanding ecosystem, Ronin is set to deepen its impact. It has the potential to become the go-to blockchain for Web3 gaming.</span>
<br>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#444746;background-color:transparent;font-weight:400;font-style:italic;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Author: Stella L (</span><a href="mailto:stella@footprint.network"><span style="font-size:12pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:italic;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">stella@footprint.network</span></a><span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:400;font-style:italic;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">)</span>
<br>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">This article was </span><span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">first published in the </span><a href="https://www.coingecko.com/author/footprintanalytics"><span style="font-size:12pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Footprint Analytics column</span></a><span style="font-size:12pt;font-family:Arial,sans-serif;color:#0d0d0d;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span"> on CoinGecko.</span>
<br>
***
<br>
<span style="font-size:12.499999999999998pt;font-family:Arial,sans-serif;color:#24292f;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">What is Footprint Analytics?</span>
<br>
<span style="font-size:11pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:italic;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Footprint Analytics is a blockchain data solutions provider. We leverage cutting-edge AI technology to help analysts, builders, and investors turn blockchain and Web2 data into insights, with accessible visualization tools and a powerful multi-chain API across 30+ chains for NFTs, games, wallet profiles, and money flow data.</span>
<br>
<a href="https://www.footprint.network/"><span style="font-size:11pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Website</span></a><span style="font-size:11pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:italic;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span"> \| </span><a href="https://twitter.com/Footprint_Data"><span style="font-size:11pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">X / Twitter</span></a><span style="font-size:11pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:italic;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span"> \| </span><a href="https://t.me/Footprint_Analytics"><span style="font-size:11pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Telegram</span></a><span style="font-size:11pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:italic;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span"> \| </span><a 
href="https://discord.gg/3HYaR6USM7"><span style="font-size:11pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Discord</span></a>
<br>
***
**May 2024 Public Chain Report: Regulatory Shifts and Market Trends**
*footprint-analytics · published 2024-06-17 · tags: blockchain · https://dev.to/footprint-analytics/may-2024-public-chain-report-regulatory-shifts-and-market-trends-4igb*
<img src="https://statichk.footprint.network/article/dca3b292-7abf-4d05-b5d2-f470ff37f186.png">
<span style="font-size:11pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:italic;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">June 2024, </span><a href="mailto:stella@footprint.network"><span style="font-size:11pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:italic;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">stella@footprint.network</span></a><span style="font-size:11pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:italic;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span"> </span>
<span style="font-size:11pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:italic;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Data Source: </span><a href="https://www.footprint.network/public/research/chain/chain-ecosystem/chain-overview?series_date=past90days"><span style="font-size:11pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:italic;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Public Chain Research Page</span></a>
<br>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">May saw significant regulatory and political developments impacting the crypto market. The U.S. SEC's approval of initial filings for spot Ethereum ETFs boosted the performance of Ethereum and its Layer 2 solutions. The Trump campaign's announcement to accept cryptocurrency donations suggested potential market impacts from the upcoming U.S. presidential election. Solana continued its upward trajectory with key integrations and adoption by major platforms like PayPal. TON attracted significant attention with Pantera Capital's “largest investment” to date, alongside surging TVL and increasing on-chain activities.</span>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Data for this report was obtained from Footprint Analytics’ </span><a href="https://www.footprint.network/public/research/chain/chain-ecosystem/chain-overview?series_date=past90days"><span style="font-size:12pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">public chain research page</span></a><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">, an easy-to-use dashboard containing the most vital stats and metrics to understand the public chain industry, updated in real time.</span>
<h2><span style="font-size:13.999999999999998pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Crypto Macro Overview</span></h2><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">In May 2024, significant regulatory developments contributed to market dynamics, with the U.S. Securities and Exchange Commission (SEC) approving the initial filings for spot Ethereum ETFs. This breakthrough helped Ethereum outperform the broader crypto market, supported by a shift in regulatory attitudes towards cryptocurrencies.</span>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Additionally, the political landscape influenced market sentiment as the Trump campaign announced it would accept cryptocurrency donations. This move suggests potential impacts of the upcoming U.S. presidential election on the crypto market, akin to shifts driven by Federal Reserve monetary policies.</span>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Conversely, the ongoing resolution of Mt. Gox's bankruptcy slightly restrained Bitcoin's price. The exchange, which has been in bankruptcy for a decade, announced last September that creditor repayments would begin in October 2024, raising concerns about potential market impacts from coin sell-offs. </span>
<h2><span style="font-size:13.999999999999998pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Public Chain Overview</span></h2><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">By the end of May, the total market cap of public chain cryptocurrencies rose 10.5% from April to $1.9 trillion. Bitcoin, </span><a href="https://www.footprint.network/public/research/chain/chain-stats/ethereum-overview?chain=Ethereum&amp;amp;amp;series_date-85136=past90days~&amp;amp;amp;channel=EN-673"><span style="font-size:12pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Ethereum</span></a><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">, </span><a href="https://www.footprint.network/public/research/chain/chain-stats/bnb-chain-overview?chain=BNB%20Chain&amp;amp;amp;filter_the_data_of_recent_days_below_to_exclude_the_current_day-85146=7&amp;amp;amp;series_date-85152=past90days~&amp;amp;amp;channel=EN-673"><span 
style="font-size:12pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">BNB Chain</span></a><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">, and Solana led with market shares of 62.9%, 21.4%, 4.1%, and 3.6%, respectively. Notably, Ethereum’s share increased from 19.7% to 21.4%, while Solana’s from 3.1% to 3.6%. </span>
<img src="https://statichk.footprint.network/article/74b118aa-27cc-4064-9a69-1ca1b4df6680.png"><em><span style="font-size:9pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Source:</span> <a href="https://www.footprint.network/@Higi/All-Chain-Overview?series_date=2024-05-01~2024-05-31&amp;amp;amp;channel=EN-673"><span style="font-size:9pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Public Chain Token Market Cap Share</span></a></em>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">In May 2024, Bitcoin rebounded from its April-end lows to register a gain, climbing from $60,653 at the start of the month to $67,606 by its close, an increase of 11.5%. Similarly, Ether showed a robust recovery, with its price moving from $3,011 to $3,778 over the same period, an uplift of 25.5%. </span>
<img src="https://statichk.footprint.network/article/d8636d55-2032-4399-aa38-a230e6c72f60.png"><em><span style="font-size:9pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Source:</span> <a href="https://www.footprint.network/@KikiSmith/BTC-ETH-Decentralized-Stablecoin-Market-Analysis?date_filter=2024-01-01~2024-05-31&amp;amp;amp;channel=EN-673"><span style="font-size:9pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">BTC Price & ETH Price</span></a></em>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Supported by broader crypto market trends, public chain cryptocurrencies recovered from April's poor performance. Besides Bitcoin and Ethereum, Solana's market cap increased by 34.5%, NEAR by 19.0%, and </span><a href="https://www.footprint.network/public/research/chain/chain-stats/avalanche-overview?chain=Avalanche&amp;amp;amp;filter_the_data_of_recent_days_below_to_exclude_the_current_day-85163=7&amp;amp;amp;series_date-85168=past90days~&amp;amp;amp;channel=EN-673"><span style="font-size:12pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Avalanche</span></a><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span"> by 14.8%. </span>
<img src="https://statichk.footprint.network/article/43718a9d-4441-44aa-bedd-0d218e2aff0b.png"><em><span style="font-size:9pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Source:</span> <a href="https://www.footprint.network/@Higi/All-Chain-Overview?series_date=2024-05-01~2024-05-31&amp;amp;amp;channel=EN-673"><span style="font-size:9pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Public Chain Token Market Cap and Price</span></a></em>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Toncoin's price rose by 23.0%, but its market cap fell by 14.6%. On May 30, The Open Network (TON) </span><a href="https://t.me/s/tonblockchain"><span style="font-size:12pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">announced</span></a><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span"> an update to the circulating supply of Toncoin on data aggregators, excluding Toncoin held by Telegram, The Open Network Foundation, and the TON Believers Fund. This update caused an immediate drop in Toncoin's market cap.</span>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Total Value Locked (TVL) in the public chain sector reached $87.2 billion at the end of May, marking a 14.7% increase from April, with Ethereum, Tron, and BNB Chain leading the way. Notably, TON’s TVL surged by 106.4% within the month.</span>
<img src="https://statichk.footprint.network/article/1339e6b6-6c19-4bb5-bca8-ea24c6d745a5.png"><em><span style="font-size:9pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Source:</span> <a href="https://www.footprint.network/@Higi/All-Chain-Overview?series_date=2024-05-01~2024-05-31&amp;amp;amp;channel=EN-673"><span style="font-size:9pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Public Chain TVL Ranking</span></a></em>
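As a sanity check on the growth figure, the end-of-May TVL of $87.2 billion and the stated 14.7% monthly increase imply an end-of-April base of roughly $76 billion. This is a back-of-the-envelope derivation from the quoted numbers, not a figure reported by the source:

```python
may_tvl_bn = 87.2    # end-of-May public chain TVL in $bn, as quoted above
mom_growth = 0.147   # stated 14.7% month-over-month increase

# Implied end-of-April TVL: divide the May value by the growth factor
april_tvl_bn = may_tvl_bn / (1 + mom_growth)
print(round(april_tvl_bn, 1))  # ≈ 76.0
```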
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Solana continued its upward trend in May, driven by ongoing meme coin activities and significant developments. LayerZero added Solana to its cross-chain bridge network, connecting it to seven chains: Ethereum, Avalanche, Polygon, Arbitrum, BNB Chain, Optimism, and Base, with plans to expand to over 70 blockchains. Additionally, PayPal chose the Solana blockchain to expand its stablecoin, PayPal USD (PYUSD), marking its first move beyond the Ethereum ecosystem.</span>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">TON also attracted significant attention. In early May, Pantera Capital, managing over $5 billion in assets, announced its “</span><a href="https://panteracapital.com/blockchain-letter/ton-our-largest-investment-ever/"><span style="font-size:12pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">largest investment ever</span></a><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">” in TON. Moreover, Telegram gaming bot projects like Tapswap and Hamster Kombat are gaining traction amid the Notcoin hype, bringing more users to TON and increasing on-chain activities.</span>
<h3><span style="font-size:12pt;font-family:Arial,sans-serif;color:#434343;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Developments within Major Layer 1 Blockchains in May 2024</span></h3><h4><span style="font-size:12pt;font-family:Arial,sans-serif;color:#666666;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Bitcoin</span></h4><ul><li><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">BlackRock's Bitcoin ETF became the fastest ETF ever to reach $20 billion in assets.</span></li><li><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Animoca Brands has announced its plans to enter the Bitcoin ecosystem by leveraging the Opal Protocol.</span></li></ul>
<h4><span style="font-size:12pt;font-family:Arial,sans-serif;color:#666666;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Ethereum</span></h4><ul><li><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Spot ETH ETFs could see 25% of the demand of their BTC counterparts, according to Bloomberg analysts.</span></li><li><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">A group of prominent Ethereum developers, including Vitalik Buterin, has proposed a new transaction type (EIP-7702) to enhance the functionality and security of Externally Owned Accounts (EOAs). </span></li></ul>
<h4><span style="font-size:12pt;font-family:Arial,sans-serif;color:#666666;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">BNB Chain</span></h4><ul><li><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">BNB Chain announced four hard forks on the opBNB testnet, which will make the gas cost on opBNB 10x lower.</span></li><li><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">BNB Chain introduced its latest trading volume incentive program. </span></li></ul>
<h4><span style="font-size:12pt;font-family:Arial,sans-serif;color:#666666;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">NEAR</span></h4><ul><li><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">NEAR increased its shards from 4 to 6 to handle more transactions smoothly.</span></li><li><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">NEAR announced the launch of NEAR AI, a new research and development lab.</span></li></ul>
<h4><span style="font-size:12pt;font-family:Arial,sans-serif;color:#666666;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Sui</span></h4><ul><li><a href="https://www.footprint.network/public/research/chain/chain-stats/sui-overview?chain=Avalanche&amp;amp;amp;series_date-85907=past90days~&amp;amp;amp;channel=EN-673"><span style="font-size:12pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Sui</span></a><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span"> refuted criticism of the tokenomics around the distribution and control of its SUI token.</span></li><li><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Sui's social login primitive zkLogin adds multi-signature recovery and support for Apple accounts.</span></li></ul>
<h4><span style="font-size:12pt;font-family:Arial,sans-serif;color:#666666;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Polygon</span></h4><ul><li><a href="https://www.footprint.network/public/research/chain/chain-stats/polygon-overview?chain=Polygon&amp;amp;amp;filter_the_data_of_recent_days_below_to_exclude_the_current_day-85180=7&amp;amp;amp;series_date-85182=past90days~&amp;amp;amp;channel=EN-673"><span style="font-size:12pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Polygon</span></a><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span"> Labs announced SP1 would be used to create pessimistic proofs for the AggLayer to help guarantee the security of an aggregated network.</span></li></ul>
<h4><a href="https://www.footprint.network/public/research/chain/chain-stats/core-overview?chain=Core&amp;amp;amp;channel=EN-673"><span style="font-size:12pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Core Chain</span></a></h4><ul><li><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Core DAO’s Gaming Summer Jam - Summer 2024 began on May 20. </span></li></ul>
<h4><a href="https://www.footprint.network/public/research/chain/chain-stats/gala-chain-overview?chain=GalaChain&amp;amp;amp;channel=EN-673"><span style="font-size:12pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">GalaChain</span></a></h4><ul><li><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">An unidentified hacker breached Gala Games' internal controls, minting 5 billion new GALA tokens on May 20. The Gala Games team detected the breach and deployed their blocklist functionality to isolate the attacker’s address.</span></li></ul>
<h2><span style="font-size:13.999999999999998pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Layer 2</span></h2><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">In May, the SEC’s approval of initial filings for spot Ethereum ETFs boosted the performance of </span><a href="https://www.footprint.network/public/research/chain/chain-ecosystem/layer-2-overview?%253E%253D_date-84008=2023-08-01&amp;amp;amp;channel=EN-673"><span style="font-size:12pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Ethereum Layer 2s</span></a><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">. 
</span><a href="https://www.footprint.network/public/research/chain/chain-stats/arbitrum-overview?chain=Arbitrum&amp;amp;amp;series_date-85559=2010-01-01~&amp;amp;amp;channel=EN-673"><span style="font-size:12pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Arbitrum</span></a><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span"> led the market with a 44.8% share and a 19.6% increase in TVL. </span><a href="https://www.footprint.network/public/research/chain/chain-stats/optimism-overview?chain=Optimism&amp;amp;amp;filter_the_data_of_recent_days_below_to_exclude_the_current_day-85191=7&amp;amp;amp;series_date-85202=past90days~&amp;amp;amp;channel=EN-673"><span style="font-size:12pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Optimism</span></a><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span"> followed with a 22.2% market share and a 13.1% increase in TVL. Blast saw a 22.3% increase in TVL, and Base experienced a 27.2% rise.</span>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Linea’s TVL surged by 82.9% due to its DeFi event, Linea Surge, which attracted more DeFi enthusiasts and significantly increased TVL on the network. Manta Pacific’s TVL increased by 17.4% following the successful launch of Manta CeDeFi, a new product offering rewards from both CeFi and DeFi yields.</span>
<img src="https://statichk.footprint.network/article/9029d8cd-9271-4438-a798-b2b0a5a5e476.png"><em><span style="font-size:9pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Source:</span> <a href="https://www.footprint.network/public/research/chain/chain-ecosystem/layer-2-overview?%253E%253D_date-84008=2023-08-01&amp;amp;amp;single_date-86180=2024-05-31"><span style="font-size:9pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Layer 2 Overview</span></a></em>
<h3><span style="font-size:12pt;font-family:Arial,sans-serif;color:#434343;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Developments within Major Layer 2 Blockchains in May 2024</span></h3><h4><span style="font-size:12pt;font-family:Arial,sans-serif;color:#666666;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Arbitrum</span></h4><ul><li><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Arbitrum is the first Layer 2 to pass $150B in swap volume, according to Uniswap Labs.</span></li><li><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Injective plans to launch its own layer-3 network "inEVM" based on Arbitrum's Orbit toolkit.</span></li></ul>
<h4><span style="font-size:12pt;font-family:Arial,sans-serif;color:#666666;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Optimism</span></h4><ul><li><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Layer 3s can join the Superchain by building on the OP Stack and sharing revenue with the Collective.</span></li></ul>
<h4><a href="https://www.footprint.network/public/research/chain/chain-stats/starknet-overview?series_date-80244=past90days&amp;amp;amp;channel=EN-673"><span style="font-size:12pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Starknet</span></a></h4><ul><li><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">StarkWare has introduced a new scaling framework based on ZK execution sharding called ZKThreads.</span></li></ul>
<h4><span style="font-size:12pt;font-family:Arial,sans-serif;color:#666666;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Blast</span></h4><ul><li><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Blur launches on Blast, offering Blast Points and 2 million in GOLD rewards. </span></li></ul>
<h4><span style="font-size:12pt;font-family:Arial,sans-serif;color:#666666;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Taiko</span></h4><ul><li><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Taiko went live on mainnet on May 27.</span></li></ul>
<h4><span style="font-size:12pt;font-family:Arial,sans-serif;color:#666666;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Rootstock</span></h4><ul><li><a href="https://www.footprint.network/public/research/chain/chain-stats/rootstock-overview?chain=rootstock&amp;amp;amp;%253E%253D_date-84854=2023-01-01&amp;amp;amp;channel=EN-673"><span style="font-size:12pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Rootstock</span></a><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span"> founder said Bitcoin's Ethereum-style programmability could come in 12 months.</span></li></ul>
<h4><span style="font-size:12pt;font-family:Arial,sans-serif;color:#666666;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Merlin Chain</span></h4><ul><li><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Merlin’s Seal became available to unstake, according to the </span><a href="https://www.footprint.network/public/research/chain/chain-stats/merlin-overview?chain=Merlin&amp;amp;amp;series_date-85559=2010-01-01~&amp;amp;amp;series_date-93099=2010-01-01~&amp;amp;amp;past_days-96438=7&amp;amp;amp;series_date_1-96421=past90days~&amp;amp;amp;channel=EN-673"><span style="font-size:12pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Merlin</span></a><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span"> Developer Update on May 10th.</span></li></ul>
<h2><span style="font-size:13.999999999999998pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Web3 Gaming</span></h2><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">In April, 1,525 games were active across various blockchains, with </span><a href="https://www.footprint.network/public/research/gamefi/game-overview/single-chain?chain=BNB%20Chain&amp;amp;amp;series_date-79659=past90days~&amp;amp;amp;channel=EN-673"><span style="font-size:12pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">BNB Chain</span></a><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">, </span><a href="https://www.footprint.network/public/research/gamefi/game-overview/single-chain?chain=Polygon&amp;amp;amp;series_date-79659=past90days~&amp;amp;amp;channel=EN-673"><span style="font-size:12pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Polygon</span></a><span 
style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">, and </span><a href="https://www.footprint.network/public/research/gamefi/game-overview/single-chain?chain=Ethereum&amp;amp;amp;series_date-79659=past90days~&amp;amp;amp;channel=EN-673"><span style="font-size:12pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Ethereum</span></a><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span"> leading with market shares of 23.3%, 19.7%, and 15.7% respectively.</span>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Among the 3.3 million DAUs in May, </span><a href="https://www.footprint.network/public/research/gamefi/game-overview/single-chain?chain=Ronin&amp;amp;amp;series_date-79659=past90days~&amp;amp;amp;channel=EN-673"><span style="font-size:12pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Ronin</span></a><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">, Polygon, and </span><a href="https://www.footprint.network/public/research/gamefi/game-overview/single-chain?chain=Near&amp;amp;amp;series_date-79659=past90days~&amp;amp;amp;channel=EN-673"><span style="font-size:12pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Near</span></a><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span"> continued to lead the sector, similar to the previous month. 
Ronin maintained its dominance with approximately 29.0% market share. Near saw its share increase from 12.1% at the beginning of May to 14.8% by the end of the month. </span><a href="https://www.footprint.network/public/research/gamefi/game-overview/single-chain?chain=Flow&amp;amp;amp;series_date-79659=past90days~&amp;amp;amp;channel=EN-673"><span style="font-size:12pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Flow</span></a><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span"> also experienced growth, with its share rising from 0.7% to 3.3%. Conversely, BNB Chain's share declined from 8.0% to 5.9%.</span>
<img src="https://statichk.footprint.network/article/49fed017-d404-46e6-bda9-653659c9b1ff.png"><em><span style="font-size:9pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Source:</span> <a href="https://www.footprint.network/@DamonSalvatore/Gamers-Reasearch?series_date=2024-05-01~2024-05-31&amp;amp;amp;channel=EN-673"><span style="font-size:9pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Daily Active Users by Chain</span></a></em>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">On May 24th, the </span><a href="https://www.footprint.network/public/research/gamefi/game-overview/single-chain?chain=Arbitrum&amp;amp;amp;series_date-79659=past90days~&amp;amp;amp;channel=EN-673"><span style="font-size:12pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Arbitrum</span></a><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span"> community initiated voting on the 200 million ARB Gaming Catalyst Plan, set to conclude on June 8th, aimed at bolstering gaming on the network. 
At the time of this report, the proposal has secured </span><a href="https://www.tally.xyz/gov/arbitrum/proposal/53472400873981607449547539050199074000442490831067826984987297151333310022877"><span style="font-size:12pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">majority support</span></a><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">, with 80.6% in favor. Concurrently, Arbitrum is developing a Layer 3 game-specific chain ecosystem, with the multi-chain NFT game ecosystem Polychain Monsters announcing a similar initiative through Altlayer.</span>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">The </span><a href="https://www.footprint.network/public/research/gamefi/game-overview/single-chain?chain=Starknet&amp;amp;amp;series_date-79659=past90days~&amp;amp;amp;channel=EN-673"><span style="font-size:12pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Starknet</span></a><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span"> Foundation awarded a 2 million STRK grant to the on-chain metaverse game, Realms.World, as part of a broader strategy to distribute 50 million STRK tokens to enhance Starknet’s gaming ecosystem, announced in March.</span>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">The iconic Web2 soccer game Captain Tsubasa debuted on Oasys in May, developed by Mint Town Co., Ltd. and BLOCKSMITH&Co., subsidiaries of mobile gaming giant KLab Inc. Oasys is actively pursuing further collaborations with Mint Town and other developers to integrate premium intellectual properties (IP) into blockchain games.</span>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">For more data insights, you can get it from the May Web3 gaming report: </span><a href="https://www.footprint.network/article/may-2024-web3-game-report-growth-trends-and-evolving-user-engagement-f2kEfCvI"><span style="font-size:12pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:italic;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">May 2024 Web3 Game Report: Growth Trends and Evolving User Engagement</span></a><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">.</span>
<h2><span style="font-size:13.999999999999998pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Investment & Funding</span></h2><span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">In May, the public chain sector recorded 17 funding events totaling $89.7 million, a 14.3% decrease from April. However, 7 of these events did not disclose detailed funding amounts. Notably, Pantera Capital announced its investment in TON as its “largest investment ever.” Previously, Pantera's largest investment was $250 million, used to purchase Solana (SOL) at a discount from the estate of the bankrupt FTX exchange.</span>
<img src="https://statichk.footprint.network/article/4d208ff7-ceb6-4ddd-933e-6eb396beedc5.png"><em><span style="font-size:9pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Public Chain Funding Events in May 2024 (Source:</span> <a href="http://crypto-fundraising.info"><span style="font-size:9pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:italic;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">crypto-fundraising.info</span></a><span style="font-size:9pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">)</span></em>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Technology advancement remains a key focus within the blockchain infrastructure sector, drawing significant investor interest. One of the key advancements is the integration of Real World Assets (RWA) in crypto applications. In May, Layer 1 blockchain E Money Network and Ethereum Layer 2 Plume Network both secured new funding rounds to integrate RWA with their networks.</span>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Humanity Protocol is another notable example, having raised $30 million at a $1 billion valuation. The startup is developing a blockchain-based identity system that uses palm scans to recognize individuals, aiming to verify online identities in an era of artificial intelligence deepfakes. This funding event propelled Humanity Protocol to unicorn status in less than a year.</span>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Interestingly, identifying a Layer 2 as specifically a Bitcoin or Ethereum Layer 2 is becoming more challenging, as many Layer 2 solutions are now building on multiple Layer 1s or supporting the scaling of multiple chains. For instance, Lumoz, a modular compute layer and ZK-RaaS platform, simplifies ZK-Rollup usage and promotes wider adoption. It supports several networks, including those built on Ethereum, Bitcoin, and BNB Chain.</span>
<br>
<br>
<span style="font-size:12pt;font-family:Arial,sans-serif;color:#333333;background-color:#ffffff;font-weight:400;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_</span>
<span style="font-size:12.499999999999998pt;font-family:Arial,sans-serif;color:#24292f;background-color:transparent;font-weight:700;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">What is Footprint Analytics?</span>
<br>
<span style="font-size:11pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:italic;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Footprint Analytics is a blockchain data solutions provider. We leverage cutting-edge AI technology to help analysts, builders, and investors turn blockchain data and combine Web2 data into insights with accessible visualization tools and a powerful multi-chain API across 30+ chains for NFTs, games, wallet profiles, and money flow data.</span>
<br>
<a href="https://www.footprint.network/"><span style="font-size:11pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Website</span></a><span style="font-size:11pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:italic;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span"> \| </span><a href="https://twitter.com/Footprint_Data"><span style="font-size:11pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">X / Twitter</span></a><span style="font-size:11pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:italic;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span"> \| </span><a href="https://t.me/Footprint_Analytics"><span style="font-size:11pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Telegram</span></a><span style="font-size:11pt;font-family:Arial,sans-serif;color:#000000;background-color:transparent;font-weight:400;font-style:italic;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span"> \| </span><a 
href="https://discord.gg/3HYaR6USM7"><span style="font-size:11pt;font-family:Arial,sans-serif;color:#1155cc;background-color:transparent;font-weight:400;font-style:normal;font-variant:normal;text-decoration:underline;-webkit-text-decoration-skip:none;text-decoration-skip-ink:none;vertical-align:baseline;white-space:pre;white-space:pre-wrap;" data-raw-html="span">Discord</span></a>
<br>
| footprint-analytics |
1,890,859 | In-Depth Guide about What is EV Code Signing & Its Impact on Enterprises | Extended Validation (EV) Code Signing certificate is an advanced digital signature. It protects... | 0 | 2024-06-17T06:23:28 | https://dev.to/sign_my_code/in-depth-guide-for-what-is-ev-code-signing-its-impact-on-enterprises-400n | evcodesigning, codesigning | Extended Validation (EV) Code Signing certificate is an advanced digital signature. It protects software developers’ and publishers’ codes, content, scripts, and other digital objects in their software and online applications from malicious attacks.
EV Code Signing provides next-generation security and gives customers greater trust in signed software.
The software publishers and developers must undergo stringent verification and inspection to obtain an EV Code Signing Certificate. The validation process requires them to provide the necessary information to establish the authenticity and legitimacy of their organization.
Once the authentication process is complete, the EV Code Signing certificate is issued to the software publishers and developers to sign their software.
Full Guide Here: https://signmycode.com/blog/what-is-ev-code-signing-its-impact-on-enterprises | sign_my_code |
1,890,858 | How to Integrate Third-Party Services in eCommerce Website | Integrating third-party services into an eCommerce website is essential for enhancing functionality,... | 0 | 2024-06-17T06:23:24 | https://dev.to/amanmishrausa/how-to-integrate-third-party-services-in-ecommerce-website-3db7 | webdev | Integrating third-party services into an eCommerce website is essential for enhancing functionality, improving user experience, and driving sales. These integrations can range from payment gateways to shipping solutions, analytics, customer support, marketing tools, and more. Here's a comprehensive guide on how to integrate third-party services into your eCommerce website effectively.
## 1. Understanding the Need for Third-Party Integrations
Third-party services provide specialized functionalities that would be complex and time-consuming to develop in-house. They offer:
- Payment Processing: Secure transaction handling through gateways like PayPal, Stripe, or Square.
- Shipping Solutions: Real-time shipping rates, tracking, and label printing through carriers like UPS, FedEx, or DHL.
- Analytics: Enhanced data tracking and reporting via Google Analytics, Hotjar, or Mixpanel.
- Customer Support: Live chat, ticketing systems, and chatbots from Zendesk, Intercom, or Drift.
- Marketing and SEO: Email marketing, social media integration, and SEO tools from Mailchimp, Hootsuite, or Yoast.
## 2. Choosing the Right Services
Selecting the right third-party services depends on your business needs, target audience, and existing infrastructure. Consider the following factors:
- Compatibility: Ensure the service integrates seamlessly with your eCommerce platform (e.g., Shopify, WooCommerce, Magento).
- Cost: Evaluate the pricing models (subscription, per-use fees) and align them with your budget.
- Scalability: Choose services that can grow with your business.
- Support and Documentation: Look for services with robust customer support and comprehensive documentation.
## 3. Setting Up Payment Gateways
Payment gateways are crucial for handling transactions securely and efficiently. Here’s a step-by-step process for integrating a payment gateway:
1. Select a Payment Gateway: Choose a gateway that supports multiple currencies, has low transaction fees, and provides strong security features.
2. Sign Up and Configure Account: Create an account with the gateway provider and configure settings such as currency preferences, payout methods, and fraud detection.
3. Install and Integrate: Use the provided APIs or plugins to connect the payment gateway to your eCommerce site. Most platforms offer detailed documentation for this process.
4. Test Transactions: Conduct test transactions to ensure the gateway processes payments correctly and securely.
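After going live, payment gateways typically confirm completed payments by POSTing signed webhook events back to your site. The sketch below is a generic illustration of verifying such a webhook with an HMAC-SHA256 signature; the exact header name, signing scheme, and secret format vary by provider (Stripe, Square, and others each document their own), so all names and values here are illustrative.

```python
import hmac
import hashlib

def sign_payload(secret, payload):
    """Compute the hex HMAC-SHA256 signature a gateway would attach."""
    return hmac.new(secret.encode(), payload, hashlib.sha256).hexdigest()

def verify_webhook(secret, payload, received_signature):
    """Recompute the signature and compare in constant time."""
    expected = sign_payload(secret, payload)
    return hmac.compare_digest(expected, received_signature)

# Simulate a gateway delivering a payment event (all values illustrative).
secret = "whsec_example_secret"
payload = b'{"event": "payment.succeeded", "amount": 1999}'
signature = sign_payload(secret, payload)

print(verify_webhook(secret, payload, signature))               # True
print(verify_webhook(secret, b'{"tampered": true}', signature)) # False
```

Using `hmac.compare_digest` instead of `==` avoids leaking timing information when comparing signatures.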
## 4. Implementing Shipping Solutions
Shipping integrations streamline order fulfillment by providing real-time rates, label printing, and tracking information. Follow these steps:
1. Choose a Shipping Carrier: Select carriers that offer the best rates and services for your shipping needs.
2. Register and Set Up Accounts: Create accounts with the chosen carriers and configure your shipping preferences.
3. Integrate with Your Platform: Use carrier APIs or platform-specific plugins to integrate shipping solutions. Configure shipping zones, rates, and methods within your eCommerce settings.
4. Test the Integration: Verify that shipping options display correctly at checkout and that tracking information is accessible to customers.
## 5. Enhancing Analytics and Tracking
Analytics integrations help you understand customer behavior and optimize your site’s performance. Here’s how to integrate analytics tools:
1. Select Analytics Tools: Common choices include Google Analytics, Mixpanel, and Hotjar.
2. Create an Account and Configure Settings: Set up your analytics account and configure key settings like goals, conversions, and user segmentation.
3. Install Tracking Codes: Add the provided tracking codes to your website’s header or use plugins to automate this process.
4. Monitor and Analyze Data: Regularly review analytics reports to gain insights into user behavior, site performance, and conversion rates.
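Step 3 above, installing the tracking codes, often amounts to injecting the analytics snippet into every page's `<head>`. Below is a minimal Python sketch that splices a Google Analytics (gtag.js) snippet into an HTML template; the measurement ID is a placeholder, and in practice most eCommerce platforms do this for you via a plugin or theme setting.

```python
def inject_tracking(html, measurement_id):
    """Insert a gtag.js analytics snippet just before </head>."""
    snippet = (
        f'<script async src="https://www.googletagmanager.com/gtag/js'
        f'?id={measurement_id}"></script>\n'
        "<script>\n"
        "  window.dataLayer = window.dataLayer || [];\n"
        "  function gtag(){dataLayer.push(arguments);}\n"
        "  gtag('js', new Date());\n"
        f"  gtag('config', '{measurement_id}');\n"
        "</script>\n"
    )
    if "</head>" not in html:
        raise ValueError("template has no </head> tag")
    return html.replace("</head>", snippet + "</head>", 1)

# Placeholder measurement ID; a real one comes from your analytics account.
page = "<html><head><title>Shop</title></head><body></body></html>"
tagged = inject_tracking(page, "G-XXXXXXXXXX")
print("googletagmanager.com/gtag/js" in tagged)  # True
```

Injecting the snippet once, just before `</head>`, keeps templates free of duplicate trackers.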
## 6. Integrating Customer Support Tools
Customer support tools enhance user experience by providing timely assistance. Here’s a guide to integration:
1. Choose Support Tools: Options include live chat (e.g., Intercom), ticketing systems (e.g., Zendesk), and chatbots (e.g., Drift).
2. Set Up Accounts: Register and configure your support tools, setting up canned responses, ticket categories, and support teams.
3. Integrate with Your Website: Add chat widgets, contact forms, and help desk portals using provided scripts or plugins.
4. Train Support Team: Ensure your support team is trained on using the new tools and adhering to customer service protocols.
## 7. Utilizing Marketing and SEO Tools
Marketing and SEO tools help drive traffic and improve search engine rankings. Here’s how to integrate them:
1. Select Tools: Popular choices include Mailchimp for email marketing, Hootsuite for social media management, and Yoast for SEO.
2. Set Up and Configure: Create accounts and configure your marketing and SEO settings. This includes setting up email campaigns, social media posts, and SEO metadata.
3. Integrate with Your Platform: Use APIs, plugins, or direct integrations to connect these tools with your eCommerce site.
4. Monitor and Optimize: Regularly review campaign performance and SEO metrics to refine your strategies.
## 8. Testing and Quality Assurance
After integrating third-party services, thorough testing is essential:
- Functionality Testing: Ensure each integration works as intended and does not conflict with other site elements.
- Security Testing: Verify that integrations do not introduce vulnerabilities.
- Performance Testing: Ensure that the integrations do not adversely affect site speed or performance.
## 9. Ongoing Maintenance and Updates
Maintain and update your integrations to ensure continued functionality and security:
- Monitor Service Updates: Stay informed about updates from third-party providers and apply them promptly.
- Regular Audits: Periodically review integrations to ensure they are functioning correctly and meeting business needs.
- User Feedback: Collect feedback from users to identify any issues or areas for improvement.
## Conclusion
Integrating third-party services into your eCommerce website is a vital strategy for enhancing functionality, improving user experience, and driving sales.
By carefully selecting the right services, following best practices for integration, and maintaining these integrations, you can ensure a seamless and efficient operation that supports your business growth. Always keep user experience at the forefront and be proactive in adopting new technologies that can give your eCommerce site a competitive edge.
Additionally, for businesses looking to expand their reach and improve accessibility, considering [eCommerce app development](https://www.techgropse.com/ecommerce-app-development) can provide a significant boost, allowing for a more personalized and convenient shopping experience for your customers.
With the right combination of web and mobile solutions, your eCommerce platform can achieve new heights in the digital marketplace.
| amanmishrausa |
1,890,857 | How To View Previous Incidents To Gain Helpful Context During Incident Triage? | Picture this: you're knee-deep in resolving a P1/P0 incident, urgently seeking answers. What if you... | 0 | 2024-06-17T06:20:57 | https://www.squadcast.com/blog/how-to-view-previous-incidents-to-gain-helpful-context-during-incident-triage | incidentmanagement, squadcastupdates | Picture this: you're knee-deep in resolving a P1/P0 incident, urgently seeking answers. What if you could tap into past incidents to get important incident insights and streamline your troubleshooting process? In this blog, we dig into the practical aspects of leveraging [Squadcast's Past Incidents](https://www.squadcast.com/blog/unveiling-past-incidents-accelerating-incident-resolution-with-historical-context) feature to help you enhance your Incident Management process.
## The Challenge: Reacting vs. Proactively Preventing
Incident resolution can be akin to navigating a maze blindfolded. Without a comprehensive understanding of Past Incidents, you risk reinventing the wheel each time. The absence of a systematic approach often leads to:
- Prolonged downtimes
- Repeated firefighting
- Drain on resources
- Potential knowledge gaps
- Ambiguity in performance evaluation
- Gaps in documentation like [runbooks & playbooks](https://www.squadcast.com/blog/runbook-vs-playbook-whats-the-difference)
- Reactive Incident Management
- Uninformed decision-making during new incidents
- Missing context for [stakeholder communication](https://www.squadcast.com/blog/keeping-stakeholders-notified-of-incidents-with-squadcast)
## The Solution: Utilizing Past Incidents Feature
Past Incidents hold valuable insights that can revamp your Incident Management process. With [Squadcast’s Past Incidents feature](https://www.squadcast.com/blog/unveiling-past-incidents-accelerating-incident-resolution-with-historical-context), you gain access to a wealth of historical incident data. Past occurrences are laid out in a way that lets you discern patterns, whether it's a recurring issue or a one-time incident.
_**Read more:** [Suppressing Alert Noise during Scheduled Maintenance](https://www.squadcast.com/blog/suppressing-alert-noise-during-scheduled-maintenance)_
## How to View Past Incidents in Squadcast?
To access [Past Incidents](https://support.squadcast.com/incidents-page/past-incidents?_gl=1*bilx61*_gcl_au*MTI4MzQyMTUyNy4xNzE0Mzc2NDE2LjE0NDg2MjgzNjMuMTcxNjU1MzgzMS4xNzE2NTUzODMw) related to a parent incident, follow these steps:
- Go to the **Incidents List page** and select the desired incident.
- In the **Incident Details page**, navigate to the Details section and locate the Past Incidents tab.

- By default, you will be shown **5 of the most relevant incidents** that have previously been resolved.

- If you want to review Past Incidents from a specific date, click on the **corresponding date square on the heat-map**.

## Use Cases: Past Incidents Applications
Past Incidents can be useful in various cases:
- **Repetition Analysis:** Instantly gauge the frequency of similar incidents over defined periods _– be it the last seven days, the previous quarter, or beyond_. This information becomes your compass, helping you decide whether you're dealing with a persistent headache or a rare anomaly.
- **Automated Resolution:** For recurrent issues, [Past Incidents](https://www.squadcast.com/blog/unveiling-past-incidents-accelerating-incident-resolution-with-historical-context#past-incident-details) guide you towards automation. As you troubleshoot, consider implementing CI/CD pipelines or Jenkins scripts that proactively address the identified problem. So you can skip manual intervention and let automation be your first responder. Read more on how Squadcast Workflows can help.
- **Post Mortem Replication:** Dig into Past Incidents with high activity levels. You may uncover post-mortems that hold the key to swift resolutions. By replicating successful resolutions, you sidestep the need for exhaustive troubleshooting, resolving current incidents at an accelerated pace.
## Benefits of Squadcast’s Past Incidents
- By learning from the past, you **eliminate the guesswork and reduce incident resolution times**.
- You can strategically fine-tune your infrastructure. Proactively make **changes to prevent future incidents**, ensuring a more robust and resilient system.
- Paves the way for **creating automated runbooks and mitigation pipelines**. You’ll be able to respond to repetitive incidents with predefined actions, such as server backups or Prometheus pod restarts, executed seamlessly through outgoing webhooks.
## Practical Path to Agile Resolutions
Information is power. Squadcast's Past Incidents gives you the ability to glance back before moving forward. By extracting actionable insights from historical incidents, you not only resolve current issues faster but also fortify your infrastructure against future disruptions. After all, in the tech world, foresight often outshines hindsight.
This feature is available to everyone who takes Squadcast on a spin via our [14 day free trial](http://register.squadcast.com/)!
_[Squadcast](https://www.squadcast.com/) is an Incident Management tool that’s purpose-built for SRE. Get rid of unwanted alerts, receive relevant notifications and integrate with popular ChatOps tools. Work in collaboration using virtual incident war rooms and use automation to eliminate toil._
1,890,834 | Are you a robot?: Intro to CAPTCHA | Introduction Whenever you try to log into a website or participate in an online poll, you... | 0 | 2024-06-17T06:19:14 | https://dev.to/ccwell11/captcha-49hb | ## Introduction
Whenever you try to log into a website or participate in an online poll, you may notice a tiny check box that inquires:
> #### "Are you a robot?"
This may seem like a pointless, silly, and easily fooled way to check whether the person using the site is a human or a robot, but I'll be the person to let you know that there is a lot more going on in the background than what is being shown. Chances are, the data needed to determine your status as a member of the human race will have been collected before your mouse even reaches the white box to click. That check box is a type of CAPTCHA test used to make sure that the number of robots present on an application is significantly reduced, if not removed entirely.
## What is CAPTCHA?
According to the Official CAPTCHA [website](http://www.captcha.net/), CAPTCHA is a "program that protects websites against bots by generating and grading tests that humans can pass but current computer programs cannot." This means that the intended purpose of CAPTCHA tests is to determine whether a "user" is truly a human or a component of some form of automated program (bot). Short for _Completely Automated Public Turing test to tell Computers and Humans Apart_, the full phrase behind the acronym makes the intended function of these programs much more apparent. These tests usually appear at specific moments during a user's navigation of an application or website, typically at locations where user input is required in some way, form, or fashion (i.e. login pages, online polls, form submissions, etc.). CAPTCHA can also be triggered by the detection of suspicious behaviors that humans are unlikely to replicate (Cloudflare, "How CAPTCHAs work").
## How CAPTCHA Works
The way that CAPTCHA works is by presenting a task that would be considered relatively easy for _mostly_ all humans to work out but much more challenging for an autonomous program to decipher with little to no human intervention. These tasks make it much more apparent which users need to be blocked from proceeding further into an application, preventing automated actions from being inflicted. There are currently different versions of CAPTCHA tests that are programmed to check for bots in a variety of ways, but the most infamous trial is the ~~dreaded~~ "distorted text" test.

This test uses an algorithmically constructed photographic image that has been distorted in a way that makes it visually different from common fonts and text patterns and expects the user to respond with the text that is displayed in the provided image. This test played on the known failures involved with the digitization of text and used this weakness in the current technology in order to provide an additional layer of protection for all internet citizens. Similarly, an audio equivalent of this CAPTCHA test exists that reads the characters to be typed aloud instead of showing the user visually. This proved to not be as inclusive as one would have wanted as members of the deaf-blind community did not have an accessible alternative for that respective demographic (Google, "What is CAPTCHA?").

## Challenges & Concerns
After some time passed from its initial introduction in 2000, CAPTCHA began to receive criticism for the extra time (up to 30 seconds) it took some users to complete, to the point where one of the co-creator's friends allegedly mentioned it to him personally on numerous occasions. With it being a newer technological concept, people were still not fully used to it. The adjustment frustrated and confused many people, and rightfully so. Another major issue was the seemingly unwavering battle between good and bad automated processes. As time progressed, so did the technology created to bypass these security measures. A select few bots were able to provide input accurate enough to be perceived as human, or to closely emulate it, which is a major problem. Even in instances where bots could not fool CAPTCHA by themselves, "click farms" (locations populated with people who manually complete CAPTCHA tests) were being used as well.

## reCAPTCHA
In response to user criticism and other apparent improvements to security that were needed, reCAPTCHA was introduced. reCAPTCHA is a service that is now being provided and maintained by Google to better ensure the security of websites while also providing a more user-friendly experience. reCAPTCHA includes automatic bot behavior detection that keeps track of potential evidence that could prove or disprove a user as a bot. Things like sharp & precise mouse movements, biometric data, IP addresses, and cookie logs are just a few of the factors that can be considered.

reCAPTCHA also has access to Google's fraud intelligence and uses the gathered information within its possession to identify and block any users who are likely to have malicious intentions. This, paired with artificial intelligence/machine learning-powered threat detection that's "capable of identifying active attacks and uncovering the connections between adversaries and their operations," allows users and site owners to feel more secure while giving the human side of the web more power to comfortably combat the autonomy of malicious software (Google, "What is CAPTCHA?"). With all that was included in the reCAPTCHA rollout, there should be no question as to why it is the CAPTCHA implementation officially recommended by the creators of CAPTCHA.
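For site owners, the reCAPTCHA flow ends with a server-side check: your backend forwards the token the widget produced to Google's `siteverify` endpoint and inspects the JSON verdict. The sketch below builds (without sending) that verification request in Python; the endpoint URL and the `secret`/`response`/`remoteip` parameters follow Google's documented API, while the helper names are illustrative.

```python
import json
import urllib.parse
import urllib.request

VERIFY_URL = "https://www.google.com/recaptcha/api/siteverify"

def build_verify_request(secret_key, client_token, remote_ip=None):
    """Build (but do not send) the POST request that asks Google
    whether a reCAPTCHA token is valid."""
    params = {"secret": secret_key, "response": client_token}
    if remote_ip:
        params["remoteip"] = remote_ip
    data = urllib.parse.urlencode(params).encode("utf-8")
    return urllib.request.Request(VERIFY_URL, data=data, method="POST")

def is_human(response_body):
    """Interpret the JSON verdict Google returns for the request above."""
    result = json.loads(response_body)
    return bool(result.get("success"))

# Inspect the request we would send; no network call is made here.
req = build_verify_request("my-secret-key", "token-from-the-widget")
print(req.full_url)                   # https://www.google.com/recaptcha/api/siteverify
print(is_human('{"success": true}'))  # True
```

In production you would pass the built request to `urllib.request.urlopen` (or any HTTP client) and feed the response body to `is_human`.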

## CAPTCHA Creator
```js
let captcha;

function generate() {
  // Clear the old input
  document.getElementById("submit").value = "";

  // Access the element that displays the generated captcha
  captcha = document.getElementById("image");

  let uniquechar = "";
  const randomchar =
    "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789";

  // Build a captcha of length 5 from random characters
  for (let i = 0; i < 5; i++) {
    uniquechar += randomchar.charAt(
      Math.floor(Math.random() * randomchar.length));
  }

  // Display the generated captcha
  captcha.innerHTML = uniquechar;
}

function printmsg() {
  const usr_input = document.getElementById("submit").value;

  // Check whether the input matches the generated captcha
  if (usr_input === captcha.innerHTML) {
    document.getElementById("key").innerHTML = "Matched";
  } else {
    document.getElementById("key").innerHTML = "Not Matched";
  }

  // Generate a fresh captcha for the next attempt
  generate();
}
```
The code snippet above, adapted from [this GeeksforGeeks example](https://www.geeksforgeeks.org/captcha-generator-using-html-css-and-javscript/), creates a simplified version of a "distorted text" CAPTCHA and checks the user's input against it.

## Conclusion
It is apparent that CAPTCHA technology is needed in this day and age, especially with how common it is for people to have sensitive information about themselves stored on a digital platform. CAPTCHA can not only assist with the security of a user's information and accounts but also prevent biases in data from being formed by repetitive inputs from a bot. Although CAPTCHA programs are not perfect and can let bots slip through the cracks every now and then, without them user confidence in the internet would not be what it is today.
#### Sources
- https://www.cloudflare.com/learning/bots/how-captchas-work/
- http://www.captcha.net/
- https://cloud.google.com/security/products/recaptcha?hl=en
- https://www.geeksforgeeks.org/captcha-generator-using-html-css-and-javscript/
| ccwell11 | |
1,890,851 | Tagging AWS resources the right way using Terraform | 🔖 Introduction: Keeping your AWS resources organized and tracking costs can be a... | 0 | 2024-06-17T06:18:09 | https://dev.to/aws-builders/tagging-aws-resources-the-right-way-using-terraform-2872 | aws, devops, terraform, platformengineering | ## 🔖 Introduction:
Keeping your AWS resources organized and tracking costs can be a challenge, especially as your infrastructure grows.
Tagging resources is a simple yet great solution, but doing it effectively requires following best practices.
In this blog post, we’ll show you how to tag AWS resources using the Infrastructure as Code (IaC) tool Terraform.
## Introduction to Tagging AWS Resources:
Tagging AWS resources is important for maintaining an organized and cost-effective cloud infrastructure.
Tags are **_key-value pairs_** that allow you to categorize and manage resources based on criteria like environment, application, team, etc.
Consistent tagging provides benefits like better **_resource organization, cost allocation, automation, security,_** and **_lifecycle management._**
In Terraform, you can tag resources during provisioning. For example, to tag an S3 bucket with **_environment_** and **_team_** tags:
```
resource "aws_s3_bucket" "example" {
bucket = "my-bucket"
tags = {
Environment = "Production"
Team = "DevOps"
}
}
```
You can also define default tags at the provider level, which apply to all resources:
```
provider "aws" {
default_tags {
tags = {
Environment = "Production"
ManagedBy = "Terraform"
}
}
}
```
## Overriding Default Tags at Resource Level:
Defining default tags at the provider level using the `default_tags` block promotes consistency and reduces manual work by automatically applying common tags to all resources provisioned by that provider.
### Benefits of Default Tags:
- `Consistency`: Ensures all resources have a base set of tags applied.
- `Less Manual Work`: Avoids repetitive tag definitions across resources.
For example, setting default tags in the AWS provider:
```
provider "aws" {
default_tags {
tags = {
Environment = "Production"
ManagedBy = "Terraform"
}
}
}
```
### How to Override Default Tags or Add Extra Tags:
You can override the default tags or add new tags at the resource level by specifying the `tags` argument.
This argument takes precedence over the default tags defined at the provider level.
```
resource "aws_s3_bucket" "example" {
bucket = "my-bucket"
# Override default Environment tag and add Purpose tag
tags = {
Environment = "Staging"
Purpose = "Data Processing"
}
}
```
In this example, the `Environment` tag is overridden with the value `Staging`, while the `Purpose` tag is added specifically for this S3 bucket resource.
The `ManagedBy` default tag is still applied.
_Use Cases for Resource-Level Tag Customization_:
- Environment-specific tags (dev, staging, prod, etc.).
- Application/project-specific tags.
- Resource-specific metadata tags (e.g., purpose, owner, expiration).
- Compliance or regulatory tags based on data sensitivity.
- Cost allocation tags for specific resources.
By allowing tag overrides at the resource level, you maintain the benefits of default tags while gaining the flexibility to customize tags based on the specific needs of individual resources.
## Using Variables and Functions for Flexible Tagging:
Terraform allows you to define tags as variables and use functions like `merge()` to combine them with other tags, promoting reusability and flexibility.
Defining tags as variables.
```
variable "default_tags" {
default = {
Environment = "Production"
ManagedBy = "Terraform"
}
}
```
Using the `merge()` function to combine tags:
```
resource "aws_instance" "example" {
ami = "ami-0c94855ba95c71c99"
instance_type = "t2.micro"
tags = merge(
var.default_tags,
{
Name = "ExampleInstance"
Project = "MyApp"
}
)
}
```
The **merge()** function combines the **default_tags** variable with additional resource-specific tags, resulting in all four tags being applied to the EC2 instance.
## Handling Special Cases:
Some AWS resources require specific tagging configurations or have limitations on how tags can be applied.
### Tagging Auto Scaling Groups:
Auto Scaling Groups [ASG](https://docs.aws.amazon.com/autoscaling/ec2/userguide/AutoScalingGroup.html) and Launch Templates [LT](https://docs.aws.amazon.com/autoscaling/ec2/userguide/LaunchTemplates.html) are tricky to tag correctly.
Without the right configuration, the EC2 instances and attached storage volumes launched by the ASG and LT will not have the default tags attached.
ASGs require the [propagate_at_launch](https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/autoscaling_group) tag configuration.
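As a sketch (the resource name and tag values are illustrative), the `tag` blocks on an `aws_autoscaling_group` look like this, with `propagate_at_launch = true` ensuring each tag is copied to the instances the group launches:

```
resource "aws_autoscaling_group" "example" {
  # ...

  tag {
    key                 = "Environment"
    value               = "Production"
    propagate_at_launch = true
  }

  tag {
    key                 = "ManagedBy"
    value               = "Terraform"
    propagate_at_launch = true
  }
}
```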
### Tagging Launch Templates:
Launch templates require the **tag_specifications** configuration:
```
resource "aws_launch_template" "example" {
  # ...

  tag_specifications {
    resource_type = "instance"
    tags = {
      Environment = "Production"
      ManagedBy   = "Terraform"
    }
  }

  tag_specifications {
    resource_type = "volume"
    tags = {
      Persistence = "Permanent"
    }
  }
}
```
### Tagging EBS Volumes:
When you create Elastic Compute Cloud (EC2) instances via Terraform, the Elastic Block Store (EBS) volumes attached to them are not automatically tagged. Untagged EBS volumes are cumbersome to administer.
You can assign tags to the attached EBS storage volumes with the aws_instance [volume_tags](https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/instance) argument.
```
resource "aws_instance" "example" {
  # ...

  volume_tags = {
    Name        = "DataVolume"
    Persistence = "Permanent"
  }
}
```
### Other Special Cases:
Certain resources like AMIs, NAT Gateways, or VPC Endpoints may have specific tagging requirements or limitations.
Always refer to the Terraform provider documentation for the latest guidance on tagging configurations for different resource types.
## Avoiding Common Pitfalls:
### _Inconsistent tag naming conventions_:
Using inconsistent tag keys like **_appid, app_role,_** and **_AppPurpose_** makes tags harder to use and manage.
```
resource "aws_s3_bucket" "example" {
  bucket = "my-bucket"

  tags = {
    appid      = "myapp"
    app_role   = "data-processing"
    AppPurpose = "logs"
  }
}
```
Instead, define [an explicit ruleset](https://docs.aws.amazon.com/whitepapers/latest/tagging-best-practices/adopt-a-standardized-approach-for-tag-names.html) for tag key naming and stick with it.
### _Not tagging all resources (including secondary resources):_
Failing to tag all AWS resources, including secondary or complementary resources like EBS volumes, leads to incomplete visibility and cost tracking.
```
resource "aws_instance" "example" {
  ami           = "ami-0c94855ba95c71c99"
  instance_type = "t2.micro"

  # no volume_tags: the attached EBS volumes are left untagged
  tags = {
    Environment = "Production"
  }
}
```
### _Identical default and resource tags issue:_
Having identical tag keys and values in both `default_tags` and resource `tags` causes an error in Terraform, requiring deduplicating tags or using workarounds.
```
provider "aws" {
  default_tags {
    tags = {
      Name = "Example"
    }
  }
}

resource "aws_vpc" "example" {
  tags = {
    Name = "Example" # Error: tags are identical
  }
}
```
### _Perpetual diff for partial tag matches:_
When `default_tags` and resource `tags` have some matching and some differing tags, Terraform shows a perpetual diff trying to update the matching tags on every plan, requiring workarounds.
```
provider "aws" {
  default_tags {
    tags = {
      Match1  = "A"
      Match2  = "B"
      NoMatch = "X"
    }
  }
}

resource "aws_vpc" "example" {
  tags = {
    Match1  = "A" # Perpetual diff trying
    Match2  = "B" # to update these
    NoMatch = "Y"
  }
}
```
### _Infrastructure drift and tag loss:_
Tags can be lost through infrastructure drift when resources are modified outside of Terraform.
Using IaC consistently helps mitigate this issue.
## Best Practices and Tips:
### _Establish a clear tagging strategy and naming convention:_
Define a consistent set of tag keys and naming conventions to use across your infrastructure.
```
variable "tag_names" {
  default = {
    environment = "Environment"
    application = "Application"
    team        = "Team"
    costcenter  = "CostCenter"
  }
}
```
### _Tag resources as you provision them (not after):_
Apply tags to resources during the provisioning process, not after the fact, to ensure consistent tagging from the start.
```
resource "aws_s3_bucket" "example" {
  bucket = "my-bucket"

  tags = {
    (var.tag_names.environment) = "Production"
    (var.tag_names.application) = "MyApp"
  }
}
```
### _Regularly review and audit tags:_
Periodically review and audit resource tags to ensure compliance with your tagging strategy and identify any missing or incorrect tags.
### _Automate tagging where possible:_
Leverage Terraform’s features like `default_tags`, variables, and functions to automatically apply tags during provisioning, reducing manual effort and promoting consistency.
## AWS Resource Groups and Tag Editor:
Have you ever wanted to do the following:
“Find all AWS resources in all regions that have the tag team='platform engineering'”?
AWS Resource Groups and Tag Editor are powerful tools that allow you to manage tags across multiple AWS resources and regions effectively.
### Resource Groups:
Resource Groups provide a centralized way to organize and manage collections of AWS resources based on shared tags. With Resource Groups, you can:
1. Find resources across regions that have specific tags applied, such as team='platform engineering'.
2. Identify resources that are missing tags or have incorrect tag values.
3. Automate operations like starting/stopping instances or applying configurations based on resource group membership.
4. View consolidated information about resource status, costs, and configurations within a group.
### Tag Editor:
The Tag Editor is a component of Resource Groups that enables bulk tagging operations across supported AWS services and regions. Using the Tag Editor, you can:
1. Search for resources based on resource types and existing tags, allowing queries like “Find all EC2 instances with team='platform engineering'”.
2. Add, modify, or remove tags on multiple resources simultaneously, streamlining tagging efforts.
3. Preview the changes before applying them, ensuring accuracy and avoiding unintended modifications.
4. Use tag-based access control policies to manage resource access based on tag values.
## 🔚 Conclusion:
Proper tagging is essential for organized, cost-effective AWS setups. Make sure to use Terraform's tagging tools, follow the best practices covered above, and avoid the common pitfalls.
Thanks for reading and I hope you learned something about tagging AWS resources in Terraform!
Until next time 🎉
Thank you for Reading !! 🙌🏻😁📃, see you in the next blog.🤘🇵🇸
🚀 Thank you for sticking up till the end. If you have any questions/feedback regarding this blog feel free to connect with me :
♻️ LinkedIn: https://www.linkedin.com/in/rajhi-saif/
♻️ Twitter: https://twitter.com/rajhisaifeddine
The end ✌🏻
🔰 Keep Learning !! Keep Sharing !! 🔰
**_References:_**
https://support.hashicorp.com/hc/en-us/articles/4406026108435-Known-issues-with-default-tags-in-the-Terraform-AWS-Provider-3-38-0-4-67-0
https://medium.com/@leslie.alldridge/how-to-tag-aws-resources-in-terraform-effectively-f4f12bc2416b
https://engineering.deptagency.com/best-practices-for-terraform-aws-tags
| seifrajhi |
1,890,850 | Api Integration – Importance And Best Practices | In today's digital world, applications need to communicate with each other to provide users with... | 0 | 2024-06-17T06:14:39 | https://keploy.io/blog/community/api-integration-importance-and-best-practices | api, integration, webdev, devops |

In today's digital world, applications need to communicate with each other to provide users with seamless experiences. This communication is often made possible through APIs (Application Programming Interfaces). API integration is a process where different software systems are connected using APIs, allowing them to share data and functionalities. Let's delve deeper into what API integration is, how it works, and why it's crucial for modern software development.
**What is an API Integration?**
API integration is the process of connecting two or more applications using APIs. This connection allows the applications to exchange data and perform functions cohesively. For instance, an e-commerce website might integrate with a payment gateway API to process transactions.
You can think of an API as a set of rules and protocols that allows one software application to interact with another. It defines the methods and data structures that developers can use to interact with the software. APIs enable different applications to talk to each other, share data, and invoke services.
While APIs are powerful on their own, API integration unlocks the most complex use cases.
**Why is API integration important?**
In the modern world, more and more companies are adopting an API-first approach: a model where APIs are designed and developed as the foundational elements of an application. This means that before any user interfaces (UIs) or other application components are built, the APIs are thoroughly planned, designed, and tested.
By designing the API first, developers ensure that all parts of the application interact with a consistent and standardized interface. This reduces the risk of discrepancies and integration issues later in the development process. However, this rarely involves just one API; developers must integrate multiple APIs to deliver high-quality experiences.
**What are the benefits of API integration?**
**Almost all applications leverage API integration, as it offers numerous benefits:**
**Automation**: Automates workflows and reduces manual intervention.
**Efficiency**: Enhances efficiency by enabling real-time data exchange.
**Scalability**: Scales your application’s capabilities by adding new features through third-party services.
**Improved User Experience**: Provides a seamless user experience by integrating multiple services.
**Cost-Effective**: Reduces development costs by leveraging existing APIs rather than building functionalities from scratch.
**How does API integration work?**
API integration works by connecting two or more applications through their APIs to enable data exchange and functionality sharing. This process starts with identifying the integration needs and choosing the right APIs. The next step involves securely authenticating to the APIs using methods like API keys or OAuth.
Once authenticated, the applications send requests to the APIs, which process these requests and return the appropriate responses. This data is then integrated into the application as needed. Effective API integration also includes handling errors, ensuring data compatibility, and maintaining security throughout the interaction.
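A minimal sketch of that request/response cycle in Python (the endpoint and API key are hypothetical, and nothing is sent over the network here; a real integration would pass the built request to `urllib.request.urlopen` or an HTTP client library):

```python
import json
import urllib.request

API_BASE = "https://api.example.com"  # hypothetical endpoint
API_KEY = "demo-key"                  # hypothetical credential

def build_request(path, api_key):
    """Construct an authenticated GET request (not yet sent)."""
    req = urllib.request.Request(f"{API_BASE}{path}")
    req.add_header("Authorization", f"Bearer {api_key}")
    req.add_header("Accept", "application/json")
    return req

def parse_response(body, status):
    """Check the status code and decode a JSON response body."""
    if status != 200:
        raise RuntimeError(f"API error: HTTP {status}")
    return json.loads(body)

req = build_request("/v1/orders", API_KEY)
data = parse_response('{"orders": [{"id": 1}]}', 200)
```

Error handling here is deliberately minimal; production integrations would also cover retries, timeouts, and non-JSON error bodies.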
**What are some examples of API integrations?**
APIs are powerful tools that, when integrated, can automate and streamline complex business processes. Here are some sophisticated use cases:
**Business Process Automation**
API integration can automate business-critical workflows, reducing manual effort and errors. Examples include:
**Customer Relationship Management (CRM)**: Integrating CRM systems with email marketing tools to automate customer follow-ups and marketing campaigns.
**Human Resources**: Connecting HR management systems with payroll and benefits providers to automate employee onboarding and payroll processing.
**Supply Chain Management**: Linking supply chain systems with inventory management and ordering systems to automate stock replenishment and order fulfillment.
**Data Synchronization**
API integration ensures that data is consistently updated across different systems. This is essential for:
**Financial Services**: Synchronizing transaction data between banking systems and financial management tools.
**Real-Time Analytics**: Integrating analytics platforms with various data sources to provide up-to-date insights and reports.
**Customer Support**: Connecting customer support platforms with CRM and ticketing systems to ensure accurate and timely customer information.
**What are some best practices for API Integration?**
**To ensure successful API integration, consider the following best practices:**
**Thorough Planning**
Before starting the integration, plan thoroughly by:
**Defining Objectives**: Clearly outline what you want to achieve with the integration.
**Researching APIs**: Research and choose the best API for your needs.
**Understanding Limits**: Be aware of any limitations or restrictions of the API.
**Robust Security Measures**
Implement robust security measures, including:
**Using HTTPS**: Ensure all API communications use HTTPS to secure data in transit.
**Implementing OAuth**: Use OAuth for secure and scalable authentication.
**Validating Input**: Validate all input to prevent malicious data from causing harm.
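As an illustration of the authentication point, one common pattern (used by many webhook-style APIs) is HMAC request signing. This is a generic sketch, not any specific provider's scheme:

```python
import hashlib
import hmac

def sign_payload(secret, payload):
    """Return a hex HMAC-SHA256 signature for a request body."""
    return hmac.new(secret.encode(), payload.encode(), hashlib.sha256).hexdigest()

def verify_signature(secret, payload, signature):
    """Constant-time comparison to reject tampered requests."""
    return hmac.compare_digest(sign_payload(secret, payload), signature)

# the sender attaches the signature, e.g. in an X-Signature header
sig = sign_payload("shared-secret", '{"event": "order.created"}')
```

The receiver recomputes the signature over the raw body with the same shared secret and rejects the request if it does not match.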
**Comprehensive Testing**
Test your integration comprehensively by:
**Unit Testing**: Test individual components of the integration.
**Integration Testing**: Test the integration as a whole to ensure all parts work together seamlessly.
**Performance Testing**: Ensure the integration performs well under different conditions.
**Clear Documentation**
Maintain clear and detailed documentation, including:
**API Endpoints**: List all the API endpoints used and their purposes.
**Authentication Methods**: Describe the authentication methods and credentials required.
**Error Codes**: Document possible error codes and their meanings.
**Examples**: Provide example requests and responses.
**Continuous Monitoring**
Implement continuous monitoring to:
**Track Performance**: Monitor the performance and availability of the API.
**Detect Issues**: Identify and resolve issues quickly.
**Analyze Usage**: Analyze API usage to understand how it’s being used and identify areas for improvement.
**Conclusion**
API integration is a fundamental aspect of modern software development. It enables different applications to communicate, share data, and enhance functionalities seamlessly. By understanding the basics of APIs, the process of integration, and the challenges involved, developers can create more efficient, scalable, and user-friendly applications. Whether you’re working in e-commerce, social media, healthcare, or any other industry, mastering API integration will undoubtedly enhance your development capabilities and help you deliver better solutions to your users.
Through thoughtful planning, robust security measures, comprehensive testing, clear documentation, and continuous monitoring, you can overcome the challenges of API integration and unlock its full potential to automate and optimize business-critical workflows. | keploy |
1,888,168 | Leveraging Amazon Titan Text Premier for RAG AI in Software Testing | Introduction Amazon Titan Text Premier, now available through Amazon Bedrock, is a state-of-the-art... | 0 | 2024-06-17T06:07:26 | https://dev.to/aws-builders/leveraging-amazon-titan-text-premier-for-rag-and-agent-based-ai-in-software-testing-3b81 | **Introduction**
Amazon Titan Text Premier, now available through Amazon Bedrock, is a state-of-the-art generative AI model that can revolutionize various fields, including software testing. This article provides a detailed guide on how to implement Retrieval-Augmented Generation (RAG) and agent-based generative AI applications to enhance software testing processes, optimizing outcomes with these advanced technologies.
**Understanding RAG and Agent-Based Generative AI**
**Retrieval-Augmented Generation (RAG)**
RAG combines retrieval-based techniques with generative models to create systems capable of fetching relevant information from extensive data sets and using this context to generate high-quality responses. This is particularly useful for tasks requiring detailed and contextually accurate outputs, such as creating comprehensive test cases or documentation.
**Agent-Based Generative AI**
Agent-based generative AI employs autonomous agents powered by generative models to perform tasks like test case creation, scenario simulation, and software interaction. These agents can learn and adapt from their interactions, making software testing more efficient and effective.
**How to Implement RAG and Agent-Based Generative AI in Software Testing**
**Step 1: Setting Up the Environment**
**_1.1 Accessing Amazon Bedrock_**
Log into your AWS account and go to the Amazon Bedrock service.
Ensure you have the necessary permissions to use the Amazon Titan Text Premier model.
**_1.2 Provisioning the Titan Text Premier Model_**
Follow the AWS documentation to set up the Titan Text Premier model in your AWS environment.
Configure the model to meet your specific software testing needs.
**Step 2: Creating a RAG System for Test Case Generation**
**_2.1 Preparing the Data_**
Collect a comprehensive set of documents, including user manuals, past test cases, and bug reports.
Use a retrieval system like Elasticsearch or Amazon Kendra to index this data for efficient searching.
**_2.2 Implementing the RAG Framework_**
Develop a retrieval component that queries the indexed data based on test requirements.
Integrate the Titan Text Premier model to generate test cases using the retrieved information.
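The retrieve-then-generate flow above can be sketched as follows. The keyword-overlap ranker is a toy stand-in for a real search backend such as Elasticsearch or Amazon Kendra, and `build_prompt` only assembles the augmented prompt that would be sent to Titan Text Premier; no AWS calls are made here:

```python
def retrieve(query, documents, top_k=2):
    """Rank documents by shared-word overlap with the query."""
    q_words = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda d: len(q_words & set(d.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def build_prompt(requirement, context_docs):
    """Assemble the augmented prompt for the generative model."""
    context = "\n".join(f"- {d}" for d in context_docs)
    return (
        f"Using the following context:\n{context}\n\n"
        f"Write a test case for: {requirement}"
    )

docs = [
    "Bug report: login fails when password contains unicode",
    "Manual: checkout flow requires a signed-in user",
    "Past test case: verify cart total after discount",
]
top = retrieve("login password test", docs)
prompt = build_prompt("login with unicode password", top)
```

In a real system the final `prompt` would be sent to the model via Amazon Bedrock, and the indexed corpus would be your manuals, past test cases, and bug reports.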
**_2.3 Automating Test Case Generation_**
Create automation scripts to streamline the process of retrieving and generating test cases.
Use these generated test cases to enhance your existing test suite for broader and more thorough testing.
**Step 3: Deploying Agent-Based Generative AI for Dynamic Testing**
**_3.1 Defining Agent Roles and Scenarios_**
Identify the types of agents needed, such as UI testers, API testers, and performance testers.
Define scenarios for these agents to cover, including edge cases and common user interactions.
**_3.2 Developing Agent Logic_**
Use the Titan Text Premier model to enable agents to dynamically generate and execute test scripts.
Implement logic for agents to adapt and learn from test results, improving their effectiveness over time.
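To make the adapt-from-results idea concrete, here is a deliberately tiny, model-free sketch: an agent that re-prioritizes scenarios based on pass/fail feedback. A real agent would additionally call the generative model to produce the actual test scripts:

```python
class TestAgent:
    """Toy scenario scheduler: failures raise a scenario's priority."""

    def __init__(self, scenarios):
        # every scenario starts with equal priority
        self.weights = {s: 1.0 for s in scenarios}

    def pick(self):
        # run the scenario most likely to surface a bug next
        return max(self.weights, key=self.weights.get)

    def record(self, scenario, passed):
        # halve priority on a pass, double it on a failure
        self.weights[scenario] *= 0.5 if passed else 2.0

agent = TestAgent(["login", "checkout", "search"])
agent.record("checkout", passed=False)  # checkout just failed
```

After the failure, `agent.pick()` selects "checkout" again, concentrating effort where defects were found.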
**_3.3 Integrating with CI/CD Pipelines_**
Connect the agent-based testing system to your Continuous Integration/Continuous Deployment (CI/CD) pipeline.
Ensure agents can autonomously start tests, analyze results, and report issues, supporting continuous testing.
**Benefits of Using Amazon Titan Text Premier in Software Testing**
**Comprehensive Test Coverage**
RAG and generative AI allow for the creation of a wide range of test scenarios, including those that might be overlooked by human testers, ensuring thorough test coverage.
**Enhanced Efficiency**
Automating test case generation and execution reduces manual effort and speeds up the testing process, enabling testers to focus on more complex issues.
**Continuous Improvement**
Generative AI models learn from test results, continuously improving the accuracy and relevance of generated test cases and scenarios.
**Scalability**
Agent-based systems can easily scale to handle large test suites and extensive applications, providing robust testing capabilities without significant additional resources.
**Conclusion**
Integrating Amazon Titan Text Premier into your software testing framework with RAG and agent-based generative AI greatly enhances testing efficiency and effectiveness. By automating and optimizing test processes, organizations can achieve higher-quality software products with faster release cycles. Amazon Bedrock's advanced infrastructure and capabilities make it feasible and highly beneficial to implement these innovative AI techniques.
Embrace the future of software testing with Amazon Titan Text Premier and transform your testing strategies for superior results.
| adelinemakokha | |
1,890,842 | Introduction to Container Orchestration | Containerization has revolutionized how we develop and deploy software, streamlining processes and... | 27,750 | 2024-06-17T06:06:03 | https://psj.codes/introduction-to-container-orchestration | containers, kubernetes, devops, opensource | Containerization has revolutionized how we develop and deploy software, streamlining processes and enhancing scalability. It all began modestly in `1979` with the introduction of `chroot`, a Unix feature that allowed applications to operate within a confined directory subset. This breakthrough laid the groundwork for application isolation, a crucial concept for modern containerization.
Building upon chroot, FreeBSD's introduction of `jails` in 2000 marked a significant advancement. Jails provided a more robust form of isolation within a FreeBSD environment, enabling multiple applications to run securely on the same host without interference. This development was pivotal in demonstrating the practicality of isolating software environments for enhanced security and efficiency.
Following FreeBSD, `Solaris Containers` (2004), also known as `Zones`, refined containerization by introducing sophisticated resource management capabilities. Zones allowed administrators to allocate specific CPU, memory, and storage resources to each container, optimizing hardware utilization and paving the way for efficient data centre management.
Google's `control groups (cgroups)`, integrated into the Linux kernel in `2007`, brought fine-grained resource control to Linux-based containers. This innovation enabled administrators to manage and isolate resource usage among groups of processes, enhancing predictability and performance in containerized environments.
The culmination of these advancements led to the creation of `Linux Containers (LXC) in 2008`, which provided a user-friendly interface for leveraging Linux kernel features like `cgroups` and `namespaces`. LXC enabled the creation and management of lightweight, isolated Linux environments, marking a significant milestone towards the widespread adoption of container technology.
`In 2013, Docker` revolutionized containerization with its user-friendly platform for creating, deploying, and managing containers. Initially built upon LXC, Docker later introduced its own container runtime, `libcontainer`, which leveraged Linux namespaces, control groups, and other kernel features. Docker's standardized container format and tooling simplified application packaging and deployment, accelerating the adoption of containers in both development and production environments.
Around the same time, the technological landscape experienced a major shift in software architecture. It moved from monolithic applications, where all modules run on a single machine and are tightly coupled, to a more decentralized and scalable model known as microservices architecture. In the 2000s, the rise of microservices architecture and the adoption of cloud computing rapidly accelerated the use of containerization. However, efficiently managing and orchestrating these containers remains a significant challenge.
**Challenges in Container Management**
Efficiently managing and orchestrating these containers at scale remains a formidable task, presenting challenges such as:
* **Deployment**: Deploying numerous containers across diverse environments requires meticulous handling of versions, dependencies, and configurations to ensure consistency and reliability.
* **Scaling**: Applications must scale dynamically to meet varying demands, necessitating automated mechanisms that optimize resource usage without manual intervention.
* **Networking**: Effective networking is essential for seamless service discovery, load balancing, and secure communication among containers, demanding robust management policies.
* **Resource Management**: Efficient allocation of CPU, memory, and storage resources is critical to prevent performance bottlenecks and control operational costs effectively.
* **Security**: Ensuring container security requires implementing strict access controls, secure configurations, and isolation strategies to mitigate risks of breaches.
* **High Availability**: Maintaining application availability involves proactive management of container failures, load balancing, and resilient failover strategies to minimize downtime.
Addressing these challenges is crucial for leveraging the full potential of containerization, enabling agility, scalability, and efficiency in software development and deployment.
### Container Orchestration
While containerization has revolutionized software deployment, efficiently managing and scaling containerized applications across complex environments remains a daunting task. Container orchestration addresses these challenges by automating deployment, scaling, and management processes, ensuring applications run seamlessly from development through to production.
### Container Orchestrators
Container orchestrators are tools that group systems together to form clusters where container deployment and management are automated at scale while meeting the production requirements.
They provide essential functionalities such as:
* **Automated Deployment**: Simplifying the deployment process with declarative configurations and automated rollouts.
* **Scalability**: Enabling horizontal scaling of applications based on resource demands, ensuring performance and efficiency.
* **Networking Automation**: Facilitating efficient networking by managing service discovery, load balancing, and network security policies.
* **Resource Optimization**: Optimizing resource allocation and utilization to enhance performance and reduce operational costs.
* **Security Enhancements**: Implementing security best practices, including isolation mechanisms, encryption, and access controls.
* **High Availability Strategies**: Ensuring continuous application availability through automated failover, load distribution, and recovery mechanisms.
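The declarative model behind these capabilities can be sketched with a minimal Kubernetes Deployment manifest (the image and replica count are arbitrary examples): you declare the desired state, and the orchestrator continuously reconciles the cluster toward it.

```
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web
spec:
  replicas: 3              # desired count; the orchestrator maintains it
  selector:
    matchLabels:
      app: web
  template:
    metadata:
      labels:
        app: web
    spec:
      containers:
        - name: web
          image: nginx:1.25   # example container image
          ports:
            - containerPort: 80
```

If a node fails or a container crashes, the control loop notices the gap between declared and actual state and starts replacements automatically.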
Popular Container Orchestrators are:
**Kubernetes**:
- Developed by Google and is now maintained by the Cloud Native Computing Foundation (CNCF).
- Features: Automatic bin packing, self-healing, horizontal scaling, service discovery, load balancing, and automated rollouts and rollbacks.
**Docker Swarm**:
- Native clustering and orchestration solution for Docker containers.
- Features: Easy setup, Docker CLI compatibility, service discovery, load balancing, scaling, and rolling updates.
**Apache Mesos**:
- An open-source project that abstracts CPU, memory, storage, and other compute resources away from machines.
- Features: Highly scalable, supports multiple frameworks, resource isolation, fault tolerance, and elasticity.
**Nomad**:
- Developed by HashiCorp, it is a flexible and simple workload orchestrator.
- Features: Multi-region, multi-cloud deployment, integrates with Consul for service discovery and Vault for secrets management, easy to use, and supports multiple workloads (Docker, non-containerized, Windows, etc.).
**OpenShift**:
- Developed by Red Hat, built on top of Kubernetes with additional features.
- Features: Developer and operational tools, automated installation, upgrade management, monitoring, logging, and security policies.
**Rancher**:
- An open-source platform for managing Kubernetes at scale.
- Features: Multi-cluster management, integrated monitoring and logging, centralized RBAC, and supports any Kubernetes distribution.
**Amazon Elastic Kubernetes Service (EKS)**:
- Managed Kubernetes service by Amazon Web Services.
- Features: Fully managed, integrated with AWS services, auto-scaling, security, and compliance.
**Google Kubernetes Engine (GKE)**:
- Managed Kubernetes service by Google Cloud.
- Features: Fully managed, integrated with Google Cloud services, auto-scaling, security, and compliance.
**Azure Kubernetes Service (AKS)**:
- Managed Kubernetes service by Microsoft Azure.
- Features: Fully managed, integrated with Azure services, auto-scaling, security, and compliance.
**IBM Cloud Kubernetes Service**:
- Managed Kubernetes service by IBM Cloud.
- Features: Fully managed, integrated with IBM Cloud services, auto-scaling, security, and compliance.
**Alibaba Cloud Container Service for Kubernetes (ACK)**:
- Managed Kubernetes service by Alibaba Cloud.
- Features: Fully managed, integrated with Alibaba Cloud services, auto-scaling, security, and compliance.
### Conclusion and Future Articles
In conclusion, containerization has revolutionized software development and deployment, offering scalability, efficiency, and agility crucial in today's dynamic landscape. As we've explored the evolution from chroot to Docker and the challenges of managing containerized environments, it's clear that container orchestration is pivotal.
***Thank you for reading this blog; your interest is greatly appreciated, and I hope it helps you on your Kubernetes journey; in the next blog, we'll explore Kubernetes, covering its features, architecture and core components.*** | pratikjagrut |
1,890,845 | How to Leverage Real World Asset Token Development for Business Growth ? | In the rapidly developing digital economy, the tokenization of real assets stands out as an important... | 0 | 2024-06-17T06:05:54 | https://dev.to/mirthi12/how-to-leverage-real-world-asset-token-development-for-business-growth--3emg | In the rapidly developing digital economy, the tokenization of real assets stands out as an important innovation. By converting physical assets into digital tokens on the blockchain, businesses can unlock new growth opportunities. Here are ten top ways to leverage real-world asset tokens to drive business growth:
**Increase Liquidity**
Tokenizing assets like real estate, commodities, or art makes them more liquid. This means they can be bought, sold or traded more easily, attracting a wider range of investors. Increased liquidity often leads to higher assets and more investment opportunities.
**Broaden the investment base**
Traditional investments often require significant capital, which limits participation to high-net-worth individuals or large institutions. Tokenization enables fractional ownership, allowing smaller investors to participate. Democratization of investment opens up new sources of capital and diversifies the investment base.
**Improve transparency**
Blockchain technology that supports resource labeling provides unparalleled transparency. Every transaction is recorded in a distributed ledger that is accessible to all stakeholders. Such transparency increases trust between investors and customers and improves the company's image and credibility.
**Improve Efficiency**
Traditional asset management and transactions can be time-consuming, expensive and involve multiple intermediaries. Tokenization streamlines these processes, reducing the need for intermediaries and lowering transaction costs. This efficiency can lead to significant savings and faster turnaround times.
**Enable Global Access**
Asset identifiers remove geographic barriers, allowing assets to be bought and sold globally. This global reach can attract international investors, giving companies access to wider markets and greater funding opportunities.
**Facilitating Compliance**
Blockchain technology can help companies meet regulatory requirements more efficiently. Smart contracts can be programmed to automatically check compliance with local and international laws, reducing the risk of legal problems and ensuring smoother operation.
**Open up new revenue streams**
By tagging real estate, companies can explore new revenue streams. For example, a real estate company can receive income not only from the sale and rental of real estate, but also from the sale of tokens backed by the property. Diversification of income sources can improve economic stability and growth.
**Improve Security**
Tokens on the blockchain are very secure due to the cryptographic nature of the technology. This reduces the risk of fraud and theft and provides investors and owners with peace of mind. Improved information security can also reduce insurance costs and improve asset management.
**Promotes Innovation**
Asset tokenization encourages innovation in financial products and services. Companies can develop new investment instruments, such as tokenized funds or derivatives, tailored to specific market needs. This innovation can attract forward-looking investors and partners, encouraging economic growth and competitive advantage.
**Improve Brand Image**
The use of advanced technologies such as blockchain and securitization can significantly improve a company's brand image. This positions the company as an innovation and technology leader and attracts technology customers and investors. A strong and modern brand image can increase market share and customer loyalty.
**Conclusion**
Real-world asset token development offers many benefits to businesses, from increased liquidity and efficiency to better data security and global access. By taking advantage of these benefits, companies can not only improve their current operations, but also open up new opportunities for growth. Embracing this technology isn't just about keeping up with the trends.
 | mirthi12 | |
1,890,844 | Best Practices for Top-Notch Software Development | It is no secret or even news that software has become integral to our daily lives in today's highly... | 0 | 2024-06-17T06:03:58 | https://dev.to/geekktech/best-practices-for-top-notch-software-development-5d8h | softwaredevelopment, bestpractices | It is no secret or even news that software has become integral to our daily lives in today's highly digital world. No, really -- from checking our phones right after waking up to using different apps throughout the day, well-crafted software applications now simplify tasks and enhance user experiences. But have you ever wondered how all of these applications and programs we use are made? No? Let me tell you: taking an idea and turning it into a functional and user-friendly software offering is a complex undertaking. It necessitates not only meticulous planning but also precise execution and a team with the necessary expertise and skills to manage the project. Understanding the best practices in product development is crucial for anyone looking to create software for their business, regardless of industry.
Before you can start talking about [full-cycle software product development services](https://www.rishabhsoft.com/services/software-product-development) with your stakeholders, I highly recommend reading this blog, wherein I discuss best practices to help you better navigate the process.
**Software Development for Business in the Modern Age**
Software development is an essential investment for any organization in the modern market, especially since software has come to serve as the center of various tasks, including customer interaction, internal collaborations, and internal processes, among other things. Custom software drives advancement, improves effectiveness, and opens new income streams for the company. Modern software development practices such as Agile and DevOps accentuate adaptability, speedy delivery, and user-centric design.
## Software Development Best Practices that Businesses Should Keep in Mind
- **Pick the right development model**: Selecting the right development model is critical for successful software development. Popular choices include Agile, which offers adaptable, iterative development; Waterfall, which brings organized, sequential steps; and Kanban, which visualizes the workflow for smooth delivery while also accommodating varying needs. The ultimate decision will be based on the size of the project, its complexity, and the company's culture. Other considerations include the size of the team and the budget, among other things.
- **Choose the right technology stack**: Software development also necessitates selecting the appropriate technology stack, which includes programming languages, frameworks, etc. The ideal stack depends on the company's individual requirements, including its purpose, desired features, scalability, and the team's areas of expertise. It is essential to choose a dependable and viable technology stack that aligns with your requirements to ensure effective and maintainable software development.
- **Code review**: A code review includes programmers inspecting the code to find mistakes, boost quality, and share relevant knowledge. Such a practice helps companies identify likely bugs, recommends best practices, and even ensures adherence to style guides. Remember: A more robust and easy-to-maintain codebase is made possible only by regular code reviews, which encourage teamwork as well as continuous improvement in the development process.
- **Testing**: Testing thoroughly during the software development process involves different strategies, such as acceptance testing, to ensure the software meets user needs. Integration testing is also used to analyze component interaction. Suffice it to say that careful software testing helps companies distinguish and fix issues early in the process. This, in turn, is vital for preventing issues later in the development lifecycle. Thanks to this procedure, the software can be trusted to work as intended and meet quality standards.
There you have it, folks -- some of the most important best practices you need to remember when you set out to develop software for your company or even if you hire a full-cycle development services provider for it. | geekktech |
1,890,843 | A Not So Useful Python Ariadne GraphQL Server Setup | This article is also available on my personal blog I am writing this article after trying to... | 0 | 2024-06-17T06:01:45 | https://dev.to/rickyxyz/a-not-so-useful-python-ariadne-graphql-server-setup-k81 | graphql, python, webdev, sqlite | > This article is also available on [my personal blog](https://rickyxyz.dev/blog/setup_graphql_with_python/)
I am writing this article after trying to follow this outdated guide [Using GraphQL with Python - A Complete Guide](https://www.apollographql.com/blog/complete-api-guide), and thought "I should try writing something like this". So, here we are.
Technologies used in this article:
- Python 3.10.14 (no particular reason, I just have this version installed)
- Flask 3.0.3
- Ariadne 0.23.0
- SQLite (no setup required with SQLite, so we can focus on GraphQL)
- Bash as shell interface
Here is a diagram of what we are going to make in this article.

Table of Contents:
1. [Environment Setup](#environment-setup)
2. [All In One File](#all-in-one-file)
3. [Adding SQLite Into The Mix](#adding-sqlite-into-the-mix)
4. [Resolving Query](#resolving-query)
5. [Resolving Mutation](#resolving-mutation)
6. [A More Sophisticated Setup](#a-more-sophisticated-setup)
## Environment Setup
Use Python virtual environment, so it does not mess with your other packages or Python installation. And don't forget to activate the virtual environment.
```bash
python -m venv venv
source venv/bin/activate
```
Install the packages we are going to use.
```bash
pip install flask ariadne
```
## All In One File
This will be the minimum setup to "run a GraphQL server". Here is the minimum code in one file, for you busy people. There is not much explanation done here, since I am sure some people are only here to get the minimal code and are more interested in figuring things out themselves.
```python
# /app.py
# Code originally from
# https://ariadnegraphql.org/docs/flask-integration
from ariadne import QueryType, graphql_sync, make_executable_schema
from flask import Flask, jsonify, request
# GraphQL schema definition
# https://graphql.org/learn/schema/
type_defs = """
    type Query {
        hello: String!
    }
"""

query = QueryType()

# Bind the resolver to the query
# Meaning: run the function resolve_hello
# when the hello query is received
@query.field("hello")
def resolve_hello(_, info):
    request = info.context["request"]
    user_agent = request.headers.get("User-Agent", "Guest")
    return "Hello, %s!" % user_agent

schema = make_executable_schema(type_defs, query)

app = Flask(__name__)

# Bind the /graphql route to accept and process GraphQL requests
# https://graphql.org/learn/serving-over-http/
@app.route("/graphql", methods=["POST"])
def graphql_server():
    data = request.get_json()
    headers = dict(request.headers)
    success, result = graphql_sync(
        schema,
        data,
        context_value={"request": request, "headers": headers},
        debug=app.debug,
    )
    status_code = 200 if success else 400
    return jsonify(result), status_code

# Start the Flask app
if __name__ == "__main__":
    app.run(debug=True)
```
The part below is to add a GraphQL explorer when you go to `/graphql` in your web browser. This part is completely optional, but it is quite helpful when experimenting.
```python
from ariadne.explorer import ExplorerGraphiQL

explorer_html = ExplorerGraphiQL().html(None)

@app.route("/graphql", methods=["GET"])
def graphql_explorer():
    return explorer_html, 200
```
Below is what the GraphQL explorer looks like in your browser.

## Adding SQLite Into The Mix
In this section we are going to implement the SQLite and Python SQLite Connector part of the server.

If you need a sample SQLite database, you can go get the [Northwind sample SQLite DB](https://en.wikiversity.org/wiki/Database_Examples/Northwind/SQLite) and copy and paste its content into `seed.sql`, and then run the python code below to create the SQLite db file. After you run the file, you should have a file named `Northwind.db` on the root of your folder. Alternatively, you can just download the `.db` file from the [GitHub Repository](https://github.com/rickyxyz/ariadne-example/blob/main/Northwind.db)
```python
# Create Northwind.db from .sql file
import sqlite3
with open('seed.sql', 'r') as f:
    query = f.read()
con = sqlite3.connect('Northwind.db')
cur = con.cursor()
cur.executescript(query)
con.commit()
con.close()
```
Below is the Entity-Relationship Diagram for the Northwind database.

For this article, we are only going to mess with the Shippers table in the database. First, make a `database.py` file to store all the SQLite functionality. This file will be responsible for interacting with SQLite.
```python
# /database.py
# Code from
# https://flask.palletsprojects.com/en/3.0.x/patterns/sqlite3/
# Build a model of sort to connect to SQLite
import sqlite3

from flask import g

DATABASE = "Northwind.db"

def make_dicts(cursor, row):
    """Helper function to turn a query result row into a dict"""
    return dict((cursor.description[idx][0], value) for idx, value in enumerate(row))

def get_db():
    """Helper function to get the DB connection for Flask"""
    db = getattr(g, "_database", None)
    if db is None:
        db = g._database = sqlite3.connect(DATABASE)
        # Turn query results into dicts
        db.row_factory = make_dicts
    return db

def query_db(query, args=(), one=False, commit=False):
    """Helper function to run queries against the DB"""
    db = get_db()
    cur = db.execute(query, args)
    if commit:
        db.commit()
        cur.close()
        return None
    else:
        rv = cur.fetchall()
        cur.close()
        return (rv[0] if rv else None) if one else rv
```
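To see what `make_dicts` and the `row_factory` hook actually do, here is a small self-contained sketch. It uses an in-memory SQLite database and redefines the helper locally, so it runs without Flask; the sample row is made up for illustration.

```python
# Standalone demo of the make_dicts row factory (no Flask required)
import sqlite3

def make_dicts(cursor, row):
    return dict((cursor.description[idx][0], value) for idx, value in enumerate(row))

con = sqlite3.connect(":memory:")
con.row_factory = make_dicts  # every fetched row now comes back as a dict

con.execute("CREATE TABLE Shippers (ShipperID INTEGER PRIMARY KEY, ShipperName TEXT, Phone TEXT)")
con.execute(
    "INSERT INTO Shippers (ShipperName, Phone) VALUES (?, ?)",
    ("Speedy Express", "(503) 555-9831"),
)

rows = con.execute("SELECT * FROM Shippers").fetchall()
print(rows)  # [{'ShipperID': 1, 'ShipperName': 'Speedy Express', 'Phone': '(503) 555-9831'}]
```

This is why the resolvers later in the article can return the query result directly: each row is already shaped like the `Shipper` type in the schema.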
The next file is `shippers.py`, which will act as a model for the `Shippers` table in the database. While it is completely fine to just put these functions in the `database.py` file, it is just a little bit better to separate the concerns between files.
```python
# /shippers.py
from database import query_db

def getShippers():
    result = query_db("SELECT * FROM Shippers")
    return result

def getShipper(shipperID):
    query = "SELECT * FROM Shippers WHERE ShipperID = ?"
    result = query_db(query, (shipperID,))
    return result

def createShipper(name, phone):
    query = "INSERT INTO Shippers (ShipperName, Phone) VALUES (?, ?)"
    result = query_db(query, (name, phone), commit=True)
    return result

def updateShipper(shipperID, name, phone):
    query = "UPDATE Shippers SET ShipperName = ?, Phone = ? WHERE ShipperID = ?"
    result = query_db(query, (name, phone, shipperID), commit=True)
    return result

def deleteShipper(shipperID):
    query = "DELETE FROM Shippers WHERE ShipperID = ?"
    result = query_db(query, (shipperID,), commit=True)
    return result
```
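One subtlety worth calling out in the single-parameter queries above: in Python, `(shipperID)` is just a parenthesized value, not a tuple, so SQLite parameter lists need the trailing comma (or a list):

```python
shipperID = 1
assert (shipperID) == 1       # parentheses alone do not make a tuple
assert (shipperID,) == (1,)   # the trailing comma does
assert [shipperID] == [1]     # a list also works as a parameter sequence
```

Passing a bare value where `sqlite3` expects a sequence of parameters raises a `ProgrammingError`, which is an easy mistake to make when a query has exactly one placeholder.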
## Resolving Query
Now it's time to make the resolver functions and call the database functions from the resolvers.

First, modify `type_defs` in the `app.py` file as shown below, so the resolver knows what schema is valid.
```python
type_defs = """
    type Query {
        shippers: [Shipper!]
        shipper(ShipperID: ID!): Shipper
    }

    type Shipper {
        ShipperID: ID!
        ShipperName: String!
        Phone: String!
    }
"""
```
Next, bind the resolver functions to the query. These functions run when their respective query is invoked. Add this code below the `type_defs` definition.
```python
@query.field("shippers")
def resolve_shippers(_, info):
    result = getShippers()
    return result

@query.field("shipper")
def resolve_shipper(obj, info, ShipperID):
    result = getShipper(ShipperID)
    return result[0]  # This is because query_db() will return an array
```
This is basically what happens when a POST request is made to the `/graphql` endpoint.

When a POST request is made to the `/graphql` endpoint, Ariadne internally resolves the query and calls the resolver function that was bound to that type of query. The resolver function can do anything it wants as long as it returns the same type of data as specified in the schema, so technically, the code below is a valid resolver.
```python
@query.field("shipper")
def resolve_shipper(obj, info, ShipperID):
    return {
        "ShipperID": 4,
        "ShipperName": "West World",
        "Phone": "(123) 8485 827"
    }
```
If you added the GraphQL explorer, you can go to `/graphql` to your browser and type some queries to test out the GraphQL endpoint.
```graphql
# example query
{
  shipper(ShipperID: "1") {
    ShipperID
    ShipperName
    Phone
  }
  shippers {
    ShipperID
    ShipperName
    Phone
  }
}
```
Or you can also use curl or Postman, you get the point.
```bash
curl -i -H 'Content-Type: application/json' -X POST -d '{"query":"{\n shipper(ShipperID: \"1\") {\n ShipperID\n ShipperName\n Phone\n }\n \n shippers {\n\t\tShipperID\n ShipperName\n Phone\n }\n}"}' http://127.0.0.1:5000/graphql
```
## Resolving Mutation
Time to mutate data through GraphQL. Import `MutationType` and modify the `type_defs` variable again as shown below.
```python
from ariadne import MutationType

mutation = MutationType()  # You need this line to use the @mutation decorator

type_defs = """
    type Query {
        shippers: [Shipper!]
        shipper(ShipperID: ID!): Shipper
    }

    type Mutation {
        createShipper(ShipperName: String!, Phone: String!): Shipper
        updateShipper(ShipperID: ID!, ShipperName: String, Phone: String): Shipper
        deleteShipper(ShipperID: ID!): Shipper
    }

    type Shipper {
        ShipperID: ID!
        ShipperName: String!
        Phone: String!
    }
"""
```
And connect the resolver to the mutation, as previously done with the query.
```python
@mutation.field("createShipper")
def resolve_createShipper(obj, info, ShipperName, Phone):
    result = createShipper(ShipperName, Phone)
    print(result)
    return result
```
Also, pass `mutation` as a parameter to the `make_executable_schema` function like so.
```python
schema = make_executable_schema(type_defs, query, mutation)
```
Now, make a `POST` request to the `/graphql` endpoint with this query.
```graphql
mutation {
  createShipper(ShipperName: "New Shipper", Phone: "123 456 789") {
    __typename
  }
}
```
You should see a null return. To check whether the creation succeeded, make a `shippers` query, and the newly created shipper should show up. Adding the update and delete operations is quite straightforward: just add the mutation decorator to resolver functions that call the corresponding functions from the `shippers.py` module.
```python
@mutation.field("updateShipper")
def resolve_updateShipper(obj, info, ShipperID, ShipperName, Phone):
    result = updateShipper(ShipperID, ShipperName, Phone)
    print(result)
    return result

@mutation.field("deleteShipper")
def resolve_deleteShipper(obj, info, ShipperID):
    result = deleteShipper(ShipperID)
    print(result)
    return result
```
## A More Sophisticated Setup
Have you ever heard of the [90-90 Rule](https://en.wikipedia.org/wiki/Ninety%E2%80%93ninety_rule)? If you haven't, you may want to read about it in the meantime. This section is supposed to cover error handling, better schema types, and more file separation.
> Thank you for reading.
> Codes are available in GitHub [rickyxyz/ariadne-example](https://github.com/rickyxyz/ariadne-example)
| rickyxyz |
1,890,841 | How to Choose the Best eCommerce Platform for Your Australian Business | Selecting the perfect eCommerce platform is a pivotal decision for your Australian business. With the... | 0 | 2024-06-17T06:01:11 | https://dev.to/thisuri_dewmini_63f59fbc8/how-to-choose-the-best-ecommerce-platform-for-your-australian-business-3md4 | ecommerce, shopify, magento, woocommerce | Selecting the [perfect eCommerce platform](https://www.neosolax.com.au/ecommerce-agency-solutions-australia/) is a pivotal decision for your Australian business. With the burgeoning online retail market in Australia, choosing the right platform can significantly impact your business’s growth, customer experience, and overall success. This guide will walk you through the essential factors to consider and help you make an informed decision tailored to your business needs.
1. Ease of Use: Simplify Your Setup
Imagine launching your online store with just a few clicks. Platforms like Shopify and Wix eCommerce are renowned for their user-friendly interfaces, enabling even the most tech-averse entrepreneurs to set up and manage their stores effortlessly. Look for platforms that offer intuitive navigation and drag-and-drop functionalities, ensuring that you can focus on your business rather than getting bogged down by technical details.
2. Scalability: Future-Proof Your Business
Your [eCommerce platform ](https://www.neosolax.com.au/ecommerce-agency-solutions-australia/)should grow with your business. As your product catalog expands and your traffic increases, your platform should handle the load seamlessly. Shopify and BigCommerce are excellent choices for scalability, offering robust infrastructure to support your business as it evolves. They ensure that your site remains fast and responsive, providing a smooth shopping experience even during peak times.
3. Customization and Flexibility: Tailor Your Store
Every business is unique, and your online store should reflect your brand’s personality. Platforms like WooCommerce and Magento excel in customization, allowing you to tweak every aspect of your site. WooCommerce, integrated with WordPress, offers extensive plugins and themes, while Magento provides advanced customization for businesses with specific needs. Ensure your platform supports custom features and integrations to create a distinctive online presence.
4. Payment Gateway Integration: Seamless Transactions
In Australia, customers prefer diverse payment options, including PayPal, Afterpay, and ZipPay. Choose a platform that supports these popular gateways to enhance your checkout process. Shopify, with its wide range of integrated payment options, ensures your customers can pay using their preferred methods, reducing cart abandonment and increasing sales.
5. SEO and Marketing Tools: Boost Your Visibility
Visibility is crucial in the crowded online marketplace. Opt for [platforms with built-in SEO features](https://www.neosolax.com.au/e-commerce/seo/) and marketing tools to drive traffic to your site. BigCommerce offers advanced SEO tools and social media integrations, helping you climb search engine rankings and engage with your audience. Customizable meta tags, clean URLs, and blogging capabilities are essential features to look for.
6. Security: Protect Your Customers
With increasing concerns about online security, protecting your customers’ data is paramount. Ensure your platform offers robust security features, including SSL certificates, PCI compliance, and regular updates. Magento and Shopify are known for their strong security measures, ensuring your customers can shop with confidence, knowing their information is safe.
7. Customer Support: Reliable Assistance
When technical issues arise, having reliable customer support is a lifesaver. Platforms like Shopify and Wix eCommerce offer 24/7 support, comprehensive help centers, and active community forums. Responsive customer support can quickly resolve issues, keeping your store running smoothly and minimizing downtime.
[**Popular eCommerce Platforms for Australian Businesses**](https://www.neosolax.com.au/ecommerce-agency-solutions-australia/)
- Shopify:
Ideal for businesses of all sizes, Shopify offers ease of use, scalability, and extensive payment gateway options. Its robust app store and excellent customer support make it a top choice.
- WooCommerce:
Perfect for WordPress users, WooCommerce provides unmatched customization and flexibility. It’s an excellent choice for businesses looking for a scalable, cost-effective solution.
- Magento:
Best suited for larger businesses with complex needs, Magento offers powerful customization and scalability. Its advanced features require some technical expertise but provide unparalleled flexibility.
- BigCommerce:
With a strong focus on SEO and multi-channel selling, BigCommerce is ideal for growing businesses. Its comprehensive features support complex product catalogs and high traffic volumes.
- Wix eCommerce:
For small businesses or startups, Wix eCommerce offers an affordable, easy-to-use platform with beautiful templates and drag-and-drop functionality.
Choosing the [best eCommerce platform for your Australian business](https://www.neosolax.com.au/ecommerce-agency-solutions-australia/) involves considering ease of use, scalability, customization, payment integration, SEO features, security, and customer support. By evaluating these factors and comparing popular platforms like Shopify, WooCommerce, [Magento](https://www.neosolax.com.au/magento-agency/web-development), BigCommerce, and Wix eCommerce, you can find the perfect fit for your business. Invest in a platform that not only meets your current needs but also supports your future growth, ensuring a successful and seamless eCommerce journey in the Australian market. | thisuri_dewmini_63f59fbc8 |
1,891,961 | Simple web QR maker with zxing and Thymeleaf | Encode text into QR Code with zxing and display it using thymeleaf. Live demo here. You can find the... | 0 | 2024-06-18T04:52:20 | https://jsedano.dev/java/thymeleaf/qr/2024/06/17/simple-qr.html | java, thymeleaf, qr | ---
title: Simple web QR maker with zxing and Thymeleaf
published: true
date: 2024-06-17 06:00:00 UTC
tags: java,thymeleaf,qr,java
canonical_url: https://jsedano.dev/java/thymeleaf/qr/2024/06/17/simple-qr.html
---
Encode text into QR Code with [zxing](https://github.com/zxing/zxing) and display it using thymeleaf. Live demo [here](https://demo.jsedano.dev/simpleqr/).
You can find the complete code for this [here](https://github.com/jsedano/simple-qr).
This is the method to encode a String into an image, and then encode the image into a base64 String that represents the PNG, based on the code in this [tutorial](https://www.digitalocean.com/community/tutorials/java-qr-code-generator-zxing-example).
[QRCreator.java](https://github.com/jsedano/simple-qr/blob/main/src/main/java/dev/jsedano/simpleqr/service/QRCreator.java)
```
private String unsafeCreateQRImage(String qrCodeText, int size)
    throws WriterException, IOException {
  // Create the BitMatrix for the QR code that encodes the given String
  Hashtable<EncodeHintType, ErrorCorrectionLevel> hintMap = new Hashtable<>();
  hintMap.put(EncodeHintType.ERROR_CORRECTION, ErrorCorrectionLevel.L);
  BitMatrix byteMatrix =
      qrCodeWriter.encode(qrCodeText, BarcodeFormat.QR_CODE, size, size, hintMap);

  // Make the BufferedImage that is to hold the QR code
  int matrixWidth = byteMatrix.getWidth();
  BufferedImage image = new BufferedImage(matrixWidth, matrixWidth, BufferedImage.TYPE_INT_RGB);
  image.createGraphics();

  Graphics2D graphics = (Graphics2D) image.getGraphics();
  graphics.setColor(Color.WHITE);
  graphics.fillRect(0, 0, matrixWidth, matrixWidth);

  // Paint the image using the BitMatrix
  graphics.setColor(Color.BLACK);
  for (int i = 0; i < matrixWidth; i++) {
    for (int j = 0; j < matrixWidth; j++) {
      if (byteMatrix.get(i, j)) {
        graphics.fillRect(i, j, 1, 1);
      }
    }
  }

  ByteArrayOutputStream bos = new ByteArrayOutputStream();
  ImageIO.write(image, "png", bos);
  byte[] imageBytes = bos.toByteArray();

  Base64.Encoder encoder = Base64.getEncoder();
  return encoder.encodeToString(imageBytes);
}
```
Use this in the `controller` for the Thymeleaf templates.
[SimpleQRController.java](https://github.com/jsedano/simple-qr/blob/main/src/main/java/dev/jsedano/simpleqr/controller/SimpleQRController.java)
```
@RequestMapping("/")
public String newRandomSelect(Model model) {
  model.addAttribute("textAndQRImage", new TextAndQRImage());
  return "inputQR";
}

@RequestMapping("/show")
public String showQR(Model model, TextAndQRImage textAndQRImage, HttpSession session) {
  if (Objects.isNull(textAndQRImage.getText()) || textAndQRImage.getText().isEmpty()) {
    return "redirect:/simpleqr/";
  }
  String qrImage = qrCreator.createQRImage(textAndQRImage.getText(), 800);
  if (Objects.isNull(qrImage) || qrImage.isEmpty()) {
    return "redirect:/simpleqr/";
  }
  // Reuse the image generated above instead of encoding it a second time
  textAndQRImage.setBase64Image(qrImage);
  model.addAttribute("textAndQRImage", textAndQRImage);
  return "showQR";
}
```
Part of [inputQR.html](https://github.com/jsedano/simple-qr/blob/main/src/main/resources/templates/inputQR.html) that asks for the text that will be encoded into a QR code.
```
<form action="/simpleqr/show" th:object="${textAndQRImage}" method="POST">
  <input type="text" placeholder="input text here" th:field="*{text}">
  <button type="submit">Generate QR</button>
</form>
```
And then we display the QR code in [showQR.html](https://github.com/jsedano/simple-qr/blob/main/src/main/resources/templates/showQR.html) template.
```
<body>
  <div>
    <img class="img-responsive" th:src="@{'data:image/png;base64,'+${textAndQRImage.base64Image}}"/>
  </div>
  <h2 th:text="${textAndQRImage.text}"></h2>
</body>
```
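The `'data:image/png;base64,'` prefix in the template is what turns the encoded bytes into a data URI the browser can render directly, with no separate image endpoint. Here is a minimal self-contained sketch of that encoding step, independent of zxing and Spring; the class name and helper are hypothetical, and the four bytes are just the PNG magic number, not a real image.

```
import java.util.Base64;

class DataUriDemo {
  // Hypothetical helper: wraps Base64-encoded PNG bytes in a data URI,
  // mirroring what the Thymeleaf expression does at render time
  static String toDataUri(byte[] pngBytes) {
    return "data:image/png;base64," + Base64.getEncoder().encodeToString(pngBytes);
  }

  public static void main(String[] args) {
    byte[] fakePng = {(byte) 0x89, 'P', 'N', 'G'}; // just the PNG magic bytes
    System.out.println(toDataUri(fakePng)); // data:image/png;base64,iVBORw==
  }
}
```

Embedding the image this way keeps the controller stateless, at the cost of a response roughly a third larger than serving the raw PNG.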
Download the complete code for this here: [simple-qr code](https://github.com/jsedano/simple-qr). Live demo on: [simple-qr live](https://demo.jsedano.dev/simpleqr/). | jsedano |
1,890,840 | Crafting a web SDK for Logto in minutes | Learn how to create a custom SDK for Logto using @logto/browser. Logto, an open-source auth... | 0 | 2024-06-17T05:59:55 | https://blog.logto.io/crafting-browser-sdk/ | webdev, opensource, productivity, programming | Learn how to create a custom SDK for Logto using `@logto/browser`.
---
Logto, an open-source auth platform, offers a plethora of official SDKs designed to simplify integration for various frameworks and platforms. However, there are still many platforms that do not have official SDKs.
To bridge this gap, Logto provides the fundamental package `@logto/browser`, designed to help developers craft custom SDKs tailored to specific requirements. This package implements the core functionalities of Logto, detached from any specific framework or platform, as long as it supports JavaScript and runs in a browser environment.
In this guide, we will walk you through the steps to create a React SDK using `@logto/browser`; this SDK will implement the sign-in flow. You can follow the same steps to create an SDK for any other JavaScript-based platform that runs in a browser.
# The sign-in flow
Before we start, let's understand the sign-in flow in Logto. The sign-in flow consists of the following steps:
1. **Redirect to Logto**: The user is redirected to the Logto sign-in page.
2. **Authenticate**: The user inputs their credentials and authenticates with Logto.
3. **Redirect back to your app**: After successful authentication, the user is redirected back to your app with an auth code.
4. **Code exchange**: Your app exchanges the auth code for tokens.
# Brief introduction to `@logto/browser`
The `@logto/browser` package exposes a `LogtoClient` class that provides the core functionalities of Logto, including methods for the sign-in flow:
1. `signIn()`: Generates the OIDC auth URL, and redirects to it.
2. `handleSignInCallback()`: Checks and parses the callback URL, extracts the auth code, then exchanges the code for tokens by calling the token endpoint.
3. `isAuthenticated()`: Check if the user is authenticated.
# Crafting the React SDK
In the SDK, we will provide two hooks, `useLogto` and `useHandleSignInCallback`, along with a `LogtoProvider` component:
1. `useLogto`: A hook that provides the `signIn` method to trigger the sign-in flow, and the `isAuthenticated` state to check if the user is authenticated.
2. `useHandleSignInCallback`: A hook that handles the callback URL and exchanges the auth code for tokens, complete the sign-in flow.
To use the SDK, you can simply wrap your app with the `LogtoProvider` component, and use the hooks to check auth state, sign-in and handle the callback.
### Step 1: Install the package
First, install the `@logto/browser` package using npm or other package managers:
```
npm install @logto/browser
```
### Step 2: Define the React context
Define the context of the provider, containing three parts:
1. The underlying `LogtoClient` instance which will be initialized in the provider, and used in the hooks.
2. The authentication state.
3. The method to set the authentication state.
Create a new file `context.tsx` and write the following code:
```
import type LogtoClient from '@logto/browser';
import { createContext, type Dispatch, type SetStateAction } from 'react';

export type LogtoContextProps = {
  /** The underlying LogtoClient instance (from `@logto/browser`). */
  logtoClient?: LogtoClient;
  /** Whether the user is authenticated or not. */
  isAuthenticated: boolean;
  /** Sets the authentication state. */
  setIsAuthenticated: Dispatch<SetStateAction<boolean>>;
};

export const throwContextError = (): never => {
  throw new Error('Must be used inside <LogtoProvider> context.');
};

/**
 * The context for the LogtoProvider.
 *
 * @remarks
 * Instead of using this context directly, in most cases you should use the `useLogto` hook.
 */
export const LogtoContext = createContext<LogtoContextProps>({
  logtoClient: undefined,
  isAuthenticated: false,
  setIsAuthenticated: throwContextError,
});
```
### Step 3: Implement the provider
With the context ready, let's implement the provider. The provider will initialize the `LogtoClient` instance, check if the user is authenticated, and provide the context to its children.
Create a new file `provider.tsx`:
```
import LogtoClient, { type LogtoConfig } from '@logto/browser';
import { type ReactNode, useEffect, useMemo, useState } from 'react';

import { LogtoContext } from './context.js';

export type LogtoProviderProps = {
  config: LogtoConfig;
  children?: ReactNode;
};

export const LogtoProvider = ({ config, children }: LogtoProviderProps) => {
  const memoizedLogtoClient = useMemo(() => ({ logtoClient: new LogtoClient(config) }), [config]);
  const [isAuthenticated, setIsAuthenticated] = useState(false);

  useEffect(() => {
    (async () => {
      const isAuthenticated = await memoizedLogtoClient.logtoClient.isAuthenticated();
      setIsAuthenticated(isAuthenticated);
    })();
  }, [memoizedLogtoClient]);

  const memoizedContextValue = useMemo(
    () => ({
      ...memoizedLogtoClient,
      isAuthenticated,
      setIsAuthenticated,
    }),
    [memoizedLogtoClient, isAuthenticated, setIsAuthenticated]
  );

  return <LogtoContext.Provider value={memoizedContextValue}>{children}</LogtoContext.Provider>;
};
```
### Step 4: Implement the hooks
Now, let's implement the hooks.
- `useLogto`: In this hook, we use the context to get the `LogtoClient `instance, and provide the `signIn` method and `isAuthenticated` state. You can continue to add more methods to this hook.
- `useHandleSignInCallback`: This hook will read the callback URL from the browser, extract the auth code, and exchange it for tokens. It will also set the authentication state to `true` after the user is authenticated.
Create a new file `hooks.ts` and write the following code:
```
import type LogtoClient from '@logto/browser';
import { useContext, useEffect, useMemo, useRef } from 'react';

import { LogtoContext, throwContextError } from './context.js';

type Logto = {
  isAuthenticated: boolean;
} & Pick<LogtoClient, 'signIn'>;

const useHandleSignInCallback = (callback?: () => void) => {
  const { logtoClient, isAuthenticated, setIsAuthenticated } = useContext(LogtoContext);
  // Keep the latest callback in a ref so the effect below does not depend on it
  const callbackRef = useRef(callback);
  callbackRef.current = callback;

  useEffect(() => {
    if (!logtoClient) {
      return;
    }

    (async () => {
      if (!isAuthenticated) {
        await logtoClient.handleSignInCallback(window.location.href);
        setIsAuthenticated(true);
        callbackRef.current?.();
      }
    })();
  }, [isAuthenticated, logtoClient, setIsAuthenticated]);

  return {
    isAuthenticated,
  };
};

const useLogto = (): Logto => {
  const { logtoClient, isAuthenticated } = useContext(LogtoContext);
  const client = logtoClient ?? throwContextError();

  const methods = useMemo(
    () => ({
      // Bind so `this` still points at the client when the method is detached
      signIn: client.signIn.bind(client),
      // other methods
    }),
    [client]
  );

  return {
    isAuthenticated,
    ...methods,
  };
};

export { useLogto, useHandleSignInCallback };
```
# Checkpoint: using the SDK
Now, you have crafted the React SDK for Logto. You can use it in your app by wrapping the app with the `LogtoProvider` component, and using the hooks to check the auth state, sign in, and handle the callback. You can check the official React sample project [here](https://github.com/logto-io/js/tree/master/packages/react-sample).
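To make the usage concrete, here is a minimal sketch of an app consuming the SDK. Everything below (the component names, the `./sdk` import path, the redirect URI, and the endpoint and app ID placeholders) is illustrative, not part of the SDK itself:

```
// App.tsx (illustrative usage, not part of the SDK)
import { LogtoProvider, useLogto, useHandleSignInCallback } from './sdk';

const Home = () => {
  const { isAuthenticated, signIn } = useLogto();
  return isAuthenticated ? (
    <p>Signed in</p>
  ) : (
    <button onClick={() => signIn('http://localhost:3000/callback')}>Sign in</button>
  );
};

const Callback = () => {
  // Completes the sign-in flow after Logto redirects back with the auth code
  const { isAuthenticated } = useHandleSignInCallback(() => {
    // e.g. navigate back to the home page
  });
  return isAuthenticated ? <p>Done</p> : <p>Signing in...</p>;
};

const App = () => (
  <LogtoProvider config={{ endpoint: '<your-logto-endpoint>', appId: '<your-app-id>' }}>
    {/* Render <Home /> normally and <Callback /> on the redirect route */}
    <Home />
  </LogtoProvider>
);
```

The `Callback` component would be rendered on the route matching the redirect URI registered in your Logto application.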
# Conclusion
In this guide, we have walked you through the steps to create a React SDK for Logto implementing the basic auth flow. The SDK provided here is a basic example. You can extend it by adding more methods and functionalities to meet your app's needs.
You can follow the same steps to create an SDK for any other JavaScript-based platform that runs in a browser.
Resources:
1. [Logto Browser SDK](https://github.com/logto-io/js/tree/master/packages/browser)
2. [Logto React SDK](https://github.com/logto-io/js/tree/master/packages/react)
{% cta https://logto.io/?ref=dev %} Try Logto Cloud for free {% endcta %} | palomino |
1,890,838 | HOW TO TRACE AND RECOVER STOLEN CRYPTOCURRENCY WITH CYBERPUNK PROGRAMMERS | With nothing to lose, I took a leap of faith and reached out to Cyberpunk Programmers. My situation... | 0 | 2024-06-17T05:57:19 | https://dev.to/tiffany_walsh_9c65ca7d6df/how-to-trace-and-recover-stolen-cryptocurrency-with-cyberpunk-programmers-116p | cryptocurrency, recovery, experts | With nothing to lose, I took a leap of faith and reached out to Cyberpunk Programmers. My situation was dire: my computer had been hit by a devastating virus attack, and with it went access to my Bitcoin wallet containing over $100,000. The sheer panic and dread I felt were overwhelming. All my hard-earned savings, meticulously accumulated through years of trading, seemed to vanish overnight. A close friend saw my despair and recommended Cyberpunk Programmers, insisting they had a stellar reputation for handling such crises. I was skeptical; the crypto world is notorious for scams and empty promises. But I was desperate, so I decided to give them a try.From the very first interaction, I felt a glimmer of hope. The team at Cyberpunk Programmers was incredibly responsive and professional. They took the time to understand every detail of my situation, asking about the virus attack and the specifics of my lost access. Their patience and genuine concern were like a beacon of light in a very dark time.What impressed me most was their transparency. They didn’t make grandiose promises but explained the recovery process in a way that was both reassuring and realistic. This honesty was crucial for me, as it built a level of trust that is so rare to find. They kept me updated every step of the way, so I never felt out of the loop or unsure about what was happening.To my amazement, within just a few days, they were able to recover my wallet and all its contents. Seeing my balance restored was an indescribable relief. It felt like a miracle. 
All the anxiety and stress melted away, replaced by a profound gratitude for Cyberpunk Programmers team.Their expertise and professionalism transformed what seemed like a hopeless situation into a success story. They didn’t just recover my funds; they restored my peace of mind. Knowing that there are trustworthy and capable professionals out there who can handle such critical tasks is incredibly reassuring.I can't recommend Cyberpunk Programmers enough. If you ever find yourself in a similar predicament, don’t hesitate to reach out to them. Their service is nothing short of extraordinary, and they truly care about their clients. Thanks to them, I have my life back on track, and I’m once again filled with hope and optimism for the future. Visit their website, cyberpunkers . org or mail cyberpunk @ programmers . Net | tiffany_walsh_9c65ca7d6df |
1,829,359 | 5 Quick Fixes for Power Automate | No matter how much I love Power Automate there are always going to be 'niggles'. I'm not talking... | 0 | 2024-06-17T05:52:05 | https://dev.to/wyattdave/5-quick-fixes-for-power-automate-1ola | powerautomate, powerplatform, lowcode, rpa | No matter how much I love Power Automate, there are always going to be 'niggles'. I'm not talking about big issues, more the ones that you deal with because the impact isn't big enough to affect your delivery, but still big enough that you feel it.
So here is a mix of bugs and poor design decisions that I would love to see fixed.
The Changes
1 Conditions Don't Show Inputs
2 Connection Naming
3 Cancelled Runs Don't Show Logs
The Fixes
4 Switch Copy Paste
5 Expressions, Arrays and Apply to Each
---
## 1. Conditions Don't Show Inputs
Every developer must have felt the pain on this one at least once. Your flow doesn't run as expected, you check the logs and the condition returned false instead of the expected true. You look at the inputs and you see this:

Well gee, that's useful, thank you. So now you have to add a Compose with a list of the inputs and rerun.
The fix is easy: show the expression with its inputs, or add an option to show values. This is kind of what I expected 'Tracked Properties' to be, instead of the current implementation.

## 2. Connection Naming
I don't know how, I always try not to, but I always end up with multiple connections for the same connector. Why do I need the ability to create multiple? That's a good question.
So when I go to add a connection I end up seeing this:

And I have no idea which one to pick, which one is being used for what flow.
There are 2 possible fixes. The first is to remove the ability to have more than one connection; the only reason to have it is to enable changing a flow's selected connection, and that's what connection references are for now. But if you really want to keep them, then add the option to change the name in the connection settings.

## 3. Cancelled Loop Runs Don't Show Logs
We have all probably had this happen to us: the flow is looping over hundreds of records, but it fails on the first item. You don't want to wait for the flow to finish (as that will take ages), so you cancel the run. But when you look at the logs, you can't see the item that failed.

I understand why: the Apply to Each is actually an API call (like Get Items from SharePoint), so the actual looping isn't done in the flow. But that said, we get the item details on completion, so why can't it return the rows that had completed when the run was cancelled?
And that's the fix: update the Apply to Each (and Do Until) APIs to return the items already processed when the flow is cancelled. Or, the better approach, return each item as it is processed, so we get to see the action in real time as the flow runs 😎
## 4. Switch Copy Paste
Now this is a niche one, I admit: not many people use Switches, and when they do they may not copy and paste that often. But if you try to, you will see that there is a bug, and you can't. You can copy and paste an item from outside into the Switch, and from inside the Switch to outside, but not from inside to inside.
Don't believe me? Try it. So now every time I want to duplicate an action between Switch branches I have to paste it outside the scope and drag it in.
The fix is simple: make it work. And I can even tell Microsoft why. When they last updated the schema they didn't update the Switch.
During that update they added an operationMetadataId key to all the actions' JSON (you can see it in the peek code). But when you create an action within a Switch that key isn't there.
Outside Switch

Inside Switch

## 5. Expressions and Arrays
To help you, Power Automate automatically creates an 'Apply to Each' and adds the action you are editing into it if the input is an item from an array. This is a nice touch and definitely prevents issues for new developers. The issue I have is that this behaviour also happens in the expression editor.
So let's say I want to check if the first item in an array is a value, something like:
```
if(equals(outputs('Get_items')?['body/value'][0]?['Title'],'David'),'Match','Mismatch')
```
I type:
```
if(equals(
```
and then when I select Title (or any field) it adds the action to an Apply to Each, wipes my expression and replaces it with:
```
items('For_each')?['Title']
```

This is so frustrating: when I'm in the expression editor and have written an expression, why does it act like I just selected the field in the Dynamic Content selector?
And to make it even more annoying, it then often registers the action as having an array item, even when it doesn't. So you can't move it out of the loop.

The fix again is easy: if I select a field within the Dynamic Content selector, then please do put it in an Apply to Each. But if it's in the Expression editor, don't; just leave me in the editor.
Apply to Each

Not Apply to Each

---
To be fair to the development team, they have been pumping out some cool new features (version control, Dataverse run logs) and working on getting the new UI up to speed, so I understand why these little niggles are not a priority. | wyattdave |
1,069,163 | Macbook air m1 vs macbook pro 15 2018 vs mac mini 2018 | This is a comparison between these three computers, from de the viewpoint of a software developer... | 0 | 2024-06-17T05:50:36 | https://dev.to/luisgmoreno/macbook-air-m1-vs-macbook-pro-15-2018-vs-mac-mini-2018-56o6 | laravel | This is a comparison between these three computers, from the viewpoint of a software developer working on real projects on an everyday basis.
Stack:
I work mostly on Laravel projects and sometimes on mobile apps for Android and iOS. The latter I haven't tested on these machines yet; I will update this when I do.
Macbook air specs:
Base model
Macbook pro 15 2018:
Base model
Mac mini: i5 with 512gb disk and 16gb ram
Programs I run every day at the same time:
1. Phpstorm
2. Nginx
3. Php-fpm
4. Mysql
5. Google Chrome with 20-40 tabs, and multiple profiles
6. Tableplus
7. [Ray](https://myray.app)
8. iTerm: with webpack watching for changes in the front end assets
9. iTerm: idle
10. iTerm: ssh into something
11. Sublime Merge
12. Apple Music or YouTube Music
Subjective review:
Everything feels snappier and faster, and the keyboard is a million times better. I haven't had any compatibility problems in regards to the ARM migration; moreover, if I didn't know it's an M1 ARM Mac, I wouldn't have noticed after two months of usage. The screen is smaller, but it is not a big deal. The brightness is 400 nits vs the 500 nits of the Pro, but I think the difference is really smaller than that, because I can use it in the same places, with direct sunlight through some windows, as the Pro, and I have not missed the Pro's brightness at all. The big win is the portability: oh god, this machine is a lot lighter, and I carry it all over the office and the house, while with the 15 I mostly remained at the desk because there was a discomfort factor in moving the bigger 15-inch.
I work with 5 virtual desktops and scroll across them with the trackpad. The Air does this flawlessly, in contrast to the Pro, which could only do it without jank/dropped frames with the Radeon graphics; with the Intel GPU it was sluggish. Furthermore, when I was running heavy CPU tasks the Pro would jank/stutter even with the Radeon GPU.
The battery life on the Pro was around 2 hours 40 minutes under heavy load and 3 hours 50 minutes in a normal development flow. The Air gives 7-9 hours of dev work; there is no heavy load for the Air in my current workflow, as nothing that taxed the Pro does the same to the Air. An example of heavy load for me is running the tests (a task of a few seconds) combined with some YouTube video, multiple Chrome tabs, PhpStorm and maybe an iOS simulator, all at the same time. The Pro could do that, but with the CPU at 100%, the fans at full blast and high temperatures; in the same scenario the Air shows no stress on the CPU or in the temps, and obviously there is no fan in the Air.
The Mini's performance is very similar to the Pro's, almost identical, but it was less prone to overheating. It was noisy as well, and scrolling between desktops was better than with the Intel GPU in the Pro but worse than with the Radeon.
The memory: this was my biggest fear. I was used to 16 GB of RAM, which was good enough for my workflow, and the new Air has half of that. In reality, for my workflow, I'm at the limit: I don't need more, but I use all of the available memory all the time. Activity Monitor has a memory pressure gauge, and when it is red the performance degrades: scrolling between desktops is choppy, navigating the code with Intelephense is slow, and even typing in the editor turns a little slow. Using Sublime Text or VS Code in combination with Safari, the memory pressure is always green, but I like my two memory hogs, so I periodically check for tabs that I'm no longer using and close them. In summary, 8 GB on an ARM M1 are NOT equivalent to 16 GB on x64; 8 GB on ARM equal 8 GB on x64. Would I be better off with 16 GB? Of course. Are 8 GB enough for a professional developer? Yes, but you have to manage your resources more closely than if you had 16. In any case, it's better to have an M1 with 8 GB of memory than an Intel Mac with 16 GB, for the gains in performance and efficiency of the processor and the GPU.
One thing that is impressive is the speed at which these new Macs can change the resolution of the internal display or an external monitor, compared with the Intel Macs. I don't know what kind of optimizations they made to the OS, but in that action you can feel the difference between the two architectures.
Some non-subjective performance measurements:
| luisgmoreno |
1,890,836 | How to deploy a React app with Kamal (formerly known as MRSK) & GitHub Action | A guide to automated deploy of React applications with Kamal and GitHub Actions | 0 | 2024-06-17T05:49:34 | https://www.kartikey.dev/2024/06/17/how-to-deploy-react-app-with-kamal-and-github-action.html | react, kamal, github | ---
title: How to deploy a React app with Kamal (formerly known as MRSK) & GitHub Action
published: true
description: A guide to automated deploy of React applications with Kamal and GitHub Actions
tags: react, kamal, mrsk, github
canonical_url: https://www.kartikey.dev/2024/06/17/how-to-deploy-react-app-with-kamal-and-github-action.html
---
I recently helped [Sidecar Learning](https://sidecarlearning.com){:target="_blank"} migrate their legacy application, built with a React front-end and a Rails back-end, from Heroku and AWS to a VPS over at DigitalOcean.
This guide is the outcome of that migration. The React front-end was quite simple and does not have many moving parts, so the process is straightforward. If you are looking to deploy a monolith application, you can read the following posts I have written:
- [Deploy Rails app and Postgres with Kamal(previously MRSK) on single DigitalOcean server](/2023/04/05/how-to-deploy-rails-app-and-postgres-with-mrsk-on-single-server.html)
- [How to deploy multi-environment(staging, production) application using Kamal(previously MRSK)](/2023/04/09/how-to-deploy-multi-environment-staging-production-application-using-mrsk.html)
- [How to deploy a NodeJS application using Kamal (previously MRSK)](/2023/04/10/how-to-deploy-a-nodejs-application-using-mrsk.html)
## Kamal Config
First, let's create a simple `deploy.yml` for Kamal. The configuration is mostly boilerplate. I am using [GHA cache](https://docs.docker.com/build/cache/backends/gha/){:target="_blank"} to speed up the deploys.
```yaml
# config/deploy.yml
service: react-app
image: username/image-name

servers:
  web:
    - 123.456.789.012

registry:
  username: docker_username
  password:
    - KAMAL_REGISTRY_PASSWORD

env:
  clear:
    FOO: BAR
    NPM_CONFIG_PRODUCTION: false

builder:
  multiarch: false
  cache:
    type: gha
    options: mode=max
```
If you wish to, you can use the "registry" cache as well. I have noticed that it's faster by 5-10 seconds, but if you are using a free Docker Hub account then you only get one free image. That's why I prefer to use "gha" as the cache back-end. Kamal only [supports](https://kamal-deploy.org/docs/configuration/builders/#using-multistage-builder-cache){:target="_blank"} "gha" and "registry". If you would like to use the "registry" cache, use the following builder config:
```yaml
builder:
  multiarch: false
  cache:
    type: registry
    options: mode=max,image-manifest=true,oci-mediatypes=true
```
## Dockerfile
Our app is using a quite old version of NodeJS. This is why I like Kamal more than anything: no platform dependency or requirements to fulfil. You create your own environment, and Kamal provides a thin wrapper around the Docker commands. As simple as it can be.
Here, we first create a build step. Once the build is ready, there is no need to include all those node_modules in our final image; we keep it as small as possible so that it's easier to perform operations.
In our app, we have a small Express app that serves the static files. That's why I have created a separate `package.runtime.json` file. You can modify that part as per your own needs. The next section will explain how the Express server works.
```dockerfile
# Dockerfile
FROM node:11.10.1 as build
WORKDIR /app
ENV NODE_ENV="production" \
NPM_CONFIG_PRODUCTION="false"
COPY . .
RUN npm ci
RUN npm run build
FROM node:11.10.1
WORKDIR /app
ENV NODE_ENV="production" \
NPM_CONFIG_PRODUCTION="false"
COPY --from=build /app/build ./build
COPY --from=build /app/server.js ./server.js
COPY --from=build /app/package.runtime.json ./package.json
RUN npm install --production
EXPOSE 3000
CMD ["node", "server.js"]
```
## Express Server
We have a tiny Express server that serves the static files with appropriate headers. We have configured Cloudflare to provide SSL support and cache the static assets. Only the `index.html` file is not cached; the bundled assets will be served by this Express server only once, and then Cloudflare will take the load. All of this is deployed on our $4 DigitalOcean droplet, and it is handling our moderately busy app very well.
I have [configured](https://webpack.js.org/guides/caching/){:target="_blank"} Webpack to generate builds with a content hash, so with every new build the new hash renames the file, same as [Propshaft](https://github.com/rails/propshaft/){:target="_blank"}.
I have also added a route for the Kamal health check, even though the health-check step has been [removed](https://github.com/basecamp/kamal/pull/740){:target="_blank"} in Kamal 1.6.0.
```javascript
// server.js
const compression = require('compression');
const express = require('express');
const path = require('path');

const port = process.env.PORT || 3000;
const app = express();

// Use compression middleware
app.use(compression());

// Serve static files with cache control headers
app.use(express.static(path.join(__dirname, 'build'), {
  maxAge: '1d', // Cache for 1 day
  setHeaders: (res, path) => {
    // Set cache-control headers based on file types
    if (path.endsWith('.js')) {
      res.setHeader('Cache-Control', 'public, max-age=31536000'); // 1 year
    } else if (path.endsWith('.css')) {
      res.setHeader('Cache-Control', 'public, max-age=31536000'); // 1 year
    } else if (path.endsWith('.html')) {
      res.setHeader('Cache-Control', 'public, max-age=0'); // No cache
    } else {
      res.setHeader('Cache-Control', 'public, max-age=86400'); // 1 day
    }
  }
}));

// Kamal health check route
app.get('/up', (req, res) => {
  res.send(`<!DOCTYPE html><html><body style="background-color: green"></body></html>`);
});

// Send all requests to index.html to handle routing in React Router
app.get('*', (req, res) => {
  res.sendFile(path.resolve(__dirname, 'build', 'index.html'));
});

// Start the server
app.listen(port, () => {
  console.log(`Server is running on port ${port}`);
});
```
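Because the header choice depends only on the requested file's path, the `setHeaders` branching can be factored into a small pure function and checked without starting Express. This is a sketch under that assumption; `cacheControlFor` is a hypothetical helper name, not part of the actual server:

```javascript
// Hypothetical standalone version of the setHeaders logic above:
// pick a Cache-Control value from the file extension.
function cacheControlFor(filePath) {
  if (filePath.endsWith('.js') || filePath.endsWith('.css')) {
    return 'public, max-age=31536000'; // fingerprinted bundles: cache 1 year
  }
  if (filePath.endsWith('.html')) {
    return 'public, max-age=0'; // index.html must always be revalidated
  }
  return 'public, max-age=86400'; // everything else: 1 day
}

console.log(cacheControlFor('main.3f2a1b.js')); // prints "public, max-age=31536000"
```

Keeping the rules in one pure function also makes it easy to extend later, for example with a dedicated rule for fonts or images.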
We only need ExpressJS and the compression packages to run this small server. That's why we are including only these two packages in the runtime package file.
`package.runtime.json`
```json
{
  "name": "your-app",
  "version": "1.0.0",
  "main": "server.js",
  "dependencies": {
    "compression": "^1.7.4",
    "express": "^4.16.4"
  },
  "scripts": {
    "start": "node server.js"
  }
}
```
## GitHub Action
It's a headache to run deployments from your own machine. The following GitHub Actions config will solve that for you. I have added a concurrency config too, which lets only one deploy run at a time. It's useful when you make multiple commits in a short time.
```yaml
# .github/workflows/deploy.yml
name: Deploy to production

# To make sure that only one deploy runs at a time. Deploy lock will not let simultaneous deployments.
concurrency:
  group: ${{ github.workflow }}

on:
  push:
    branches: [master]

jobs:
  deploy:
    runs-on: ubuntu-latest
    env:
      KAMAL_REGISTRY_PASSWORD: ${{ secrets.KAMAL_REGISTRY_PASSWORD }}
    steps:
      - name: Set up Docker Buildx for cache
        uses: docker/setup-buildx-action@v3

      # Since we are using GHA cache, we need to expose the cache to the runtime
      - name: Expose GitHub Runtime for cache
        uses: crazy-max/ghaction-github-runtime@v3

      - name: Checkout code
        uses: actions/checkout@v3

      # Ruby is only needed to install Kamal
      - name: Set up Ruby
        uses: ruby/setup-ruby@v1
        with:
          ruby-version: '3.3'

      - name: Install dependencies
        run: gem install kamal

      # This is to facilitate Kamal with the private key to access server(s)
      - uses: webfactory/ssh-agent@v0.7.0
        with:
          ssh-private-key: ${{ secrets.SSH_PRIVATE_KEY }}

      - name: Run deploy command
        run: kamal deploy
```
| tannakartikey |
1,842,579 | LLM Fine-Tuning: Domain Embeddings with GPT-3 | Starting in 2023, Large Language Models evolved to form or be a component of information retrieval... | 0 | 2024-06-17T05:49:10 | https://dev.to/admantium/llm-fine-tuning-domain-embeddings-with-gpt-3-3dln | llm | Starting in 2023, Large Language Models evolved to form or be a component of information retrieval systems. In such a system, domain knowledge is encoded in a special format. Then, given a user query, the most relevant chunks from the knowledge base are determined and an answer is formulated. In LLMs, the knowledge base is all learned training material. However, given the learned vector representations of words, other content can be embedded in the same vector space. And in this vector space, similarity search between user queries and stored knowledge can be made to identify the context from which an LLM answers. This is a LLM retrieval system in a nutshell.
This article shows how to use GPT-3 embeddings for designing a question answer system - the third possible approach as outlined in my [previous article](https://admantium.com/blog/llm13_question_answer_system_architectures/). GPT-3, initially released in 2020, showed astonishing capabilities to produce text that is hard to distinguish from human-written text. It is an advanced language model trained on billions of internet resources like Wikipedia, books and plain web crawling. Furthermore, a rich API exists for text generation as well as for creating embeddings. You will learn how to use this API to create embeddings, and how to use these embeddings for a similarity search given a user query.
_The technical context of this article is `Python v3.11`, OpenAI's GPT-3.5 API wrapper `openai v1.12.0`, and the helper libraries `scipy v1.12.0` and `wikipedia-api v0.6.0`. All instructions should work with newer versions too, but you might need to use another OpenAI model because older models are being phased out._
_This article originally appeared at my blog [admantium.com](https://admantium.com/blog/llm17_qa_system_domain_embeddings/)_.
## GPT-3 Model Overview
OpenAI provides different models via its API. At the original time of writing this article in early 2022, API access was only granted to selected companies. Only later, [API access for individual developers](https://openai.com/blog/api-no-waitlist/) was granted, and since 2023, the API is open for every developer.
Another difference between starting this article in early 2022 and finishing it in early 2024 is the set of available models. In essence, OpenAI deprecates older models as well as changing the provided API functions. Originally, the following [GPT-3 models](https://beta.openai.com/docs/models/gpt-3) were available:
- `text-davinci-002`
- `text-curie-001`
- `text-babbage-001`
- `text-ada-001`
As of 2024, the list distinguishes models by their context window and general capabilities - see [gpt-3-5-turbo](https://platform.openai.com/docs/models/gpt-3-5-turbo) for a full description.
- `gpt-3.5-turbo-0125`: The most recent version, higher accuracy for output formatting, context window is 16,385 tokens
- `gpt-3.5-turbo-1106`: Improved instruction following, context window is 16,385 tokens
- `gpt-3.5-turbo-instruct`: Same capabilities as GPT3 models, and a context window of 4,096 tokens
## Required Python Libraries
The essential library for this project is [OpenAI](https://pypi.org/project/openai/), supported by two helper libraries. Install them with the [poetry](https://python-poetry.org/) dependency manager as shown:
```bash
poetry init --quiet
poetry add openai scipy wikipedia-api
# Using version ^1.12.0 for openai
# Using version ^1.12.0 for scipy
# Using version ^0.6.0 for wikipedia-api
# Updating dependencies
# Resolving dependencies... (1.6s)
# Package operations: 7 installs, 0 updates, 0 removals
# • Installing h11 (0.14.0)
# • Installing httpcore (1.0.4)
# • Installing distro (1.9.0)
# • Installing httpx (0.27.0)
# • Installing openai (1.12.0)
# • Installing scipy (1.12.0)
# • Installing wikipedia-api (0.6.0)
# Writing lock file
```
## OpenAI Python Library
The quintessential input to language generation with GPT-3 is the prompt. It's not just a simple question: you can define the role of a message as `user`, `system` or `assistant`, as well as structure the prompt into different sections. This primes the GPT-3 model and leads to a more nuanced and accurate answer.
The OpenAI library provides several API endpoints for specific use cases, including working with text, audio and images, as well as one for embedding. For text input, the chat completion endpoint is used.
Here is an example asking a statistic fact:
```py
import os
from openai import OpenAI

client = OpenAI(api_key=os.environ.get("OPENAI_API_KEY"))

query = client.chat.completions.create(
    messages=[
        {
            "role": "user",
            "content": "How many inhabitants are living in Berlin?",
        }
    ],
    model="gpt-3.5-turbo"
)
```
And the response object is this:
```py
ChatCompletion(
    id='chatcmpl-8vmJ3K67ZApZrq5M0ZNiVJY7hld6i',
    choices=[Choice(finish_reason='stop',
        index=0,
        logprobs=None,
        message=ChatCompletionMessage(content='As of 2021,
            the population of Berlin is approximately 3.7 million inhabitants.',
            role='assistant',
            function_call=None,
            tool_calls=None))],
    created=1708781077,
    model='gpt-3.5-turbo-0125',
    object='chat.completion',
    system_fingerprint='fp_86156a94a0',
    usage=CompletionUsage(completion_tokens=19,
        prompt_tokens=15,
        total_tokens=34)
)
```
The object returned from the API contains meta information about the model, the consumed and generated tokens, and a list of `Choice` objects that contains the answer. Interestingly, the answer clearly reflects that this GPT-3.5 model cannot access data after its 2021 training cutoff. Providing newer content to this model is another use case for our question answer system design approach.
## Question Answering with Plaintext
When the exact context containing the answer to a question is known, it can simply be added as-is to the prompt. The following example shows how to use the first two paragraphs of the Wikipedia article about NASA as the context for a user query.
```py
# Source: Wikipedia, NASA, https://en.wikipedia.org/wiki/NASA
article_text = '''
The National Aeronautics and Space Administration (NASA /ˈnæsə/) is an independent agency of the U.S. federal government responsible for the civil space program, aeronautics research, and space research. Established in 1958, it succeeded the National Advisory Committee for Aeronautics (NACA) to give the U.S. space development effort a distinctly civilian orientation, emphasizing peaceful applications in space science. It has since led most American space exploration, including Project Mercury, Project Gemini, the 1968–1972 Apollo Moon landing missions, the Skylab space station, and the Space Shuttle. It currently supports the International Space Station and oversees the development of the Orion spacecraft and the Space Launch System for the crewed lunar Artemis program, the Commercial Crew spacecraft, and the planned Lunar Gateway space station.
NASA's science is focused on better understanding Earth through the Earth Observing System; advancing heliophysics through the efforts of the Science Mission Directorate's Heliophysics Research Program; exploring bodies throughout the Solar System with advanced robotic spacecraft such as New Horizons and planetary rovers such as Perseverance; and researching astrophysics topics, such as the Big Bang, through the James Webb Space Telescope, the Great Observatories and associated programs. The Launch Services Program oversees launch operations and countdown management for its uncrewed launches.
'''
question = "What is NASA?"

client.chat.completions.create(
    messages=[
        {
            "role": "system",
            "content": "You are a question-answering assistant. Answer truthfully. If you do not know the answer, say 'I don't have access to this information'",
        },
        {
            "role": "user",
            "content": f'''
Context: {article_text}
Question: {question}''',
        }
    ],
    model="gpt-3.5-turbo"
)
```
In this example, the message list starts with a `system` role message that sets the general behavior for all subsequent interactions. The second message structures the query into a context and a question section, providing structured information to the LLM.
## Question Answering with Embeddings
To use embeddings for a question answering system, several steps need to be considered:
- Create embeddings for chunked texts
- Store the embeddings
- For a user query, perform a similarity search with the embeddings
- Retrieve embeddings and formulate a prompt
Let’s detail and realize these steps with individual Python functions.
### Step 1: Create Embeddings
The OpenAI embedding API can be used via the client library. Expected parameters are the embedding model and the input text. At the time of writing this article, three [embedding models](https://platform.openai.com/docs/guides/embeddings/embedding-models) are available: `text-embedding-3-small`, `text-embedding-3-large` and `text-embedding-ada-002`.
Here is an example:
```py
embedding_model = "text-embedding-3-small"
res = client.embeddings.create(input=[text], model=embedding_model)
```
Using the above defined `article_text` variable, containing the first two paragraphs of the wikipedia article about NASA, shows this embedding:
```py
# CreateEmbeddingResponse(
# data=[Embedding(embedding=[-0.02948867715895176, 0.014214463531970978, 0.059668492525815964, ...], index=0, object='embedding')],
# model='text-embedding-3-small',
# object='list',
# usage=Usage(prompt_tokens=279, total_tokens=279))
```
The embedding itself can be accessed as `res.data[0].embedding`. And similar to the chat completion object, this contains meta information about the processed tokens.
### Step 2: Store Embeddings
The next step is to load the Wikipedia article's content and split it into paragraphs that have more than 100 characters (this removes headings and empty paragraphs as well). For this, the handy `wikipedia-api` library will be used.
```py
import wikipediaapi
user_agent = "Mozilla/5.0 (Linux; Android 10; K) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/114.0.0.0 Mobile Safari/537.36"
language = "en"
wiki = wikipediaapi.Wikipedia(user_agent, language)
nasa_page = wiki.page('NASA')
chunks = [chunk for chunk in nasa_page.text.split('\n') if len(chunk) > 100]
```
In total, that gives 100 chunks. The first three chunks are as follows:
```py
print(chunks[0:3])
# ['The National Aeronautics and Space Administration (NASA ) is an independent agency of the U.S. federal government responsible for the civil space program, aeronautics research, and space research. Established in 1958, it succeeded the National Advisory Committee for Aeronautics (NACA) to give the U.S. space development effort a distinctly civilian orientation, emphasizing peaceful applications in space science. It has since led most American space exploration, including Project Mercury, Project Gemini, the 1968–1972 Apollo Moon landing missions, the Skylab space station, and the Space Shuttle. It currently supports the International Space Station and oversees the development of the Orion spacecraft and the Space Launch System for the crewed lunar Artemis program, the Commercial Crew spacecraft, and the planned Lunar Gateway space station.',
# "NASA's science is focused on better understanding Earth through the Earth Observing System; advancing heliophysics through the efforts of the Science Mission Directorate's Heliophysics Research Program; exploring bodies throughout the Solar System with advanced robotic spacecraft such as New Horizons and planetary rovers such as Perseverance; and researching astrophysics topics, such as the Big Bang, through the James Webb Space Telescope, the Great Observatories and associated programs. The Launch Services Program oversees launch operations and countdown management for its uncrewed launches.",
# "NASA traces its roots to the National Advisory Committee for Aeronautics (NACA). Despite being the birthplace of aviation, by 1914 the United States recognized that it was far behind Europe in aviation capability. Determined to regain American leadership in aviation, Congress created the Aviation Section of the U.S. Army Signal Corps in 1914 and established NACA in 1915 to foster aeronautical research and development. Over the next forty years NACA would conduct aeronautical research in support of the U.S. Air Force, its predecessors in the U.S. Army, the U.S. Navy, and the civil aviation sector. After the end of World War II, NACA became interested in the possibilities of guided missiles and supersonic aircraft, developing and testing the Bell X-1 in a joint program with the U.S. Air Force. NACA's interest in space grew out of its rocketry program at the Pilotless Aircraft Research Division."],
```
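The paragraph filter above (`len(chunk) > 100`) can be sanity-checked on a toy string; this is just an illustrative sketch with made-up input:

```python
# Mimic the chunking rule: split on newlines and keep only
# paragraphs longer than 100 characters (drops headings and blanks).
text = "Heading\n\n" + "A" * 120 + "\n" + "B" * 50 + "\n" + "C" * 150
chunks = [chunk for chunk in text.split('\n') if len(chunk) > 100]
print(len(chunks))  # 2
```

The heading, the empty line and the 50-character paragraph are dropped; only the two long paragraphs survive.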
For these chunks, embeddings are calculated and stored as tuples containing the original text as well as the embeddings. Here is the relevant source code:
```py
def embedd(client, text):
    embedding_model = "text-embedding-3-small"
    res = client.embeddings.create(input=[text], model=embedding_model)
    return res.data[0].embedding

embeddings = [(chunk, embedd(client, chunk)) for chunk in chunks]
```
The first item in this list is this:
```py
embeddings[0]
# [(
# 'The National Aeronautics and Space Administration (NASA ) is ...',
# [-0.024839315563440323, 0.004018288571387529, 0.061975762248039246, ...]
# )]
```
### Step 3: Similarity Search for User Queries
The user query is passed to the embeddings API, and then a local cosine similarity search is made against the stored embeddings. This list is ordered from best to worst match and returned. Each item in the list is a tuple of the form `(similarity, text, embedding)`.
```py
from scipy.spatial.distance import cosine
def similarity_search(client, query, embeddings):
    emq = embedd(client, query)
    # scipy's cosine() is a distance: smaller means more similar
    similar_embeddings = [(cosine(emq, emc), text, emc) for text, emc in embeddings]
    # sort ascending so the best (smallest-distance) match comes first
    return sorted(similar_embeddings)
```
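As an aside, SciPy's `cosine` returns a distance (1 minus the cosine similarity), not a similarity. For intuition, it can be reproduced with the standard library alone; this is a sketch, not a replacement for SciPy's optimized version:

```python
import math

def cosine_distance(a, b):
    # 1 - (a.b)/(|a||b|): 0.0 for identical directions, 1.0 for orthogonal vectors
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return 1.0 - dot / (norm_a * norm_b)
```

Identical embeddings give a distance of 0 and orthogonal ones a distance of 1, so smaller values mean closer matches.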
Here is an example, asking when NASA was founded:
```py
query = "When was NASA founded?"
similarity_search(client, query, embeddings)
# [(0.3510536381739262,
#   'The National Aeronautics and Space Administration (NASA ) is an independent agency of the U.S. federal government responsible for the civil space program, aeronautics research, and space research. Established in 1958 ...',
#   [-0.024839315563440323, 0.004018288571387529, ...]),
#  ...
# ]
```
### Step 4: Prompt Generation
The final step is to formulate the prompt that contains the system role definition, the user query and its context.
The first method generates the prompt, using only the best match of the similarity search.
```py
def generate_prompt(client, query, embeddings):
    matches = similarity_search(client, query, embeddings)
    _, context, _ = matches[0]
    question_prompt = {
        "role": "user",
        "content": f'''
Context: {context}
Question: {query}
''',
    }
    return question_prompt
```
And the second method wraps the chat completion API call and uses the generated prompts.
```py
def embedding_qa(client, query, embeddings):
    system_message = {
        "role": "system",
        "content": "You are a question-answering assistant. Answer truthfully. If you do not know the answer, say 'I don't have access to this information'",
    }
    question_prompt = generate_prompt(client, query, embeddings)
    response = client.chat.completions.create(
        messages=[system_message, question_prompt],
        model="gpt-3.5-turbo",
    )
    return response.choices[0].message
```
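`generate_prompt` keeps only the single best match. If one chunk is too narrow for a question, a variant (a sketch, not part of the original code) can concatenate the top-k matches into the context instead:

```python
def build_context(matches, k=3, max_chars=4000):
    # matches: output of similarity_search, best match first
    # Join the text of up to k chunks, stopping before max_chars is exceeded
    parts = []
    total = 0
    for _, text, _ in matches[:k]:
        if total + len(text) > max_chars:
            break
        parts.append(text)
        total += len(text)
    return "\n\n".join(parts)
```

The joined string can then be dropped into the same `Context: ...` slot of the prompt.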
## Question Answering with Domain Embeddings
With all methods implemented, we formulate a query about NASA's founding date and use the embeddings from the same article.
```py
query = "When was NASA founded?"
embedding_qa(client, query, embeddings)
# ChatCompletionMessage(content='NASA was founded in 1958.', role='assistant', function_call=None, tool_calls=None)
```
The similarity search ordered all paragraphs and put the most relevant one at the top. The GPT-3.5 model then processed this context and found the required information.
Comparing this approach to the two former approaches (linguistic fine-tuning and QA fine-tuning), several benefits emerge: a) scalability: any textual data source can be embedded and used for similarity search; b) dynamicity: the Python methods can be used in a production system to continuously add new embeddings or retrieve the most up-to-date version; c) quality: Gen3 and Gen4 LLMs formulate answers themselves instead of just annotating parts of the context.
## Conclusion
This article showed how to create a question answering system using domain embeddings. The approach consists of four steps that were implemented as Python functions: a) load a text, separate it into chunks, and generate embeddings; b) store the embeddings; c) embed a user query and determine the most relevant embeddings by calculating cosine similarity; and d) formulate a prompt that distinguishes the system role from the user message, and in which the user message clearly mentions the context in which the LLM should find the answer. With this approach, several benefits are realized: better scalability for accessing text information, dynamicity to use the most up-to-date information, and overall better quality when using Gen3 and Gen4 models. The next article shows how to fine-tune a model using instruction datasets.
| admantium |
1,890,835 | Build MSG File Analyzer using NextJS | When applying for a tech job it is important to have a strong portfolio to impress the company.... | 0 | 2024-06-17T05:47:53 | https://www.developertimeline.io/build-msg-outlook-analyzer-using-nextjs-and | nextjs, beginners, javascript, webdev |
When applying for a tech job, it is important to have a strong portfolio to impress the company. Especially for beginner developers, the first impression made by the resume and projects plays a crucial role. The most common project types are a to-do app, a weather app, a blog, etc., which is **boring**.
> This post cross-published from Notion with [OnePublish](https://onepubli.sh).
In this new series we will build unique and creative stuff and ship it to production. Check out previous publication from this series below:
[https://www.developertimeline.io/build-a-phishing-site-detector-with-nextjs-and-virustotal/](https://www.developertimeline.io/build-a-phishing-site-detector-with-nextjs-and-virustotal/)
So, in this episode we will build an Email Detail Viewer that takes an MSG file as input, reads its content, and displays it to the user.
We're going to use NextJS to build our app fast and ship it to production easily using Vercel. Each time we make a new commit, Vercel will update our app, so we don't need to worry about deployment and can focus only on building.
### **Why this project is great?**
Because it involves many skills that are required of a developer:
- TypeScript.
- Third-party integration.
- Use of NextJS itself.
- UI/UX skills.
- Tailwind CSS (eh)
- Deployment
### **What we are building?**
This project is going to be a single-page app that lets the user upload an MSG file, in other terms a Microsoft Outlook item. After the user uploads a file, it should extract the email headers, content, attachments, and other details of the email, then display the result to the user with nice UI/UX.
### **Getting Started**
Let's start by creating a new NextJS project. I will be using the latest version of NextJS, which is version 14 at the time of writing this post.
```shell
npx create-next-app@latest
```
It will prompt a few questions to configure the NextJS project. I will select to use TypeScript, initialise Tailwind CSS and use the App router.
```shell
✔ What is your project named? … <your_app_name_here>
✔ Would you like to use TypeScript? … No / **Yes**
✔ Would you like to use ESLint? … No / **Yes**
✔ Would you like to use Tailwind CSS? … No / **Yes**
✔ Would you like to use `src/` directory? … **No** / Yes
✔ Would you like to use App Router? (recommended) … No / **Yes**
✔ Would you like to customize the default import alias (@/*)? … No / **Yes**
```
Now you should have an initialised NextJS project ready to use.
### **Building Backend**
First, we need to find a library that can read the content of a msg file. There is an npm package named [@kenjiuno/msgreader](https://www.npmjs.com/package/@kenjiuno/msgreader?ref=developertimeline.io) which can read and extract the details from .msg files.
Let's start by creating an endpoint called `/api/reader` that will handle POST requests from the frontend and pass the file data to the msg reader.
_/api/reader/route.ts_
```typescript
import { reader } from "@/app/utils/reader";

export async function POST(request: Request) {
  const formData = await request.formData();
  const file: any = formData.get("file");
  const buffer = Buffer.from(await file.arrayBuffer());
  const readerRes = await reader(buffer);
  return Response.json({ reader: readerRes });
}
```
Here we simply read the `file` from the request body and store its contents in the `buffer` variable.
There is no need to save the uploaded file locally since `reader` accepts a `Buffer` as well. That saves us a lot of file IO operations.
Now, let's add a `reader` util function:
_/utils/reader.ts_
```typescript
import MsgReader from "@kenjiuno/msgreader";

export async function reader(buffer: Buffer) {
  const testMsg = new MsgReader(buffer);
  const testMsgInfo = testMsg.getFileData();
  return testMsgInfo;
}
```
We're using `@kenjiuno/msgreader` library to extract the details from file buffer.
That's pretty much all from the backend side!
### **Building Frontend**
On the frontend side, we're going to use DaisyUI and some Flowbite components to quickly build the UI. Both are built on top of Tailwind CSS.
Here's the index page that holds all components for this app:
_app/page.tsx_
```typescript
"use client";
import { useState } from "react";
import FileInput from "./components/FileInput";
import EmailHeaders from "./components/Headers";
import AttachmentsHeaders from "./components/Attachments";
import { FieldsData } from "@kenjiuno/msgreader";

const Home: React.FC = () => {
  const [details, setDetails] = useState<FieldsData | null>(null);

  return (
    <>
      <FileInput onDataExtracted={(details: FieldsData) => setDetails(details)} />
      {/* Results */}
      <EmailHeaders headerDetails={details?.headers} />
      <AttachmentsHeaders attachmentsDetails={details?.attachments} />
    </>
  );
};

export default Home;
```
The first component is `FileInput`, which allows the user to upload their MSG file via a drag & drop input area.
_/components/FileInput.tsx_
```typescript
"use client";
import { FieldsData } from "@kenjiuno/msgreader";
import axios from "axios";
import { ChangeEvent, useState } from "react";

interface FileInputProps {
  onDataExtracted: (details: FieldsData) => void;
}

const FileInput: React.FC<FileInputProps> = ({ onDataExtracted }) => {
  const [alertMessage, setAlertMessage] = useState<string>("");

  const submitMsgFile = async (event: ChangeEvent<HTMLInputElement>) => {
    const file = event.target.files?.[0];
    if (file) {
      try {
        const formData = new FormData();
        formData.append("file", file);
        const res = await axios.post("/api/reader", formData);
        onDataExtracted(res?.data?.reader);
        console.log(res.data.reader);
        setAlertMessage("Success! Check results below.");
      } catch (error) {
        console.log(error);
      }
    } else {
      setAlertMessage("No file selected.");
    }
  };

  return (
    <div className="hero min-h-screen bg-base-200">
      <div className="hero-content text-center">
        <div className="max-w-lg">
          <h1 className="text-5xl font-bold mb-5">MSG Analyzer</h1>
          <p className="mb-5">Drop your .MSG file below to get results</p>
          <div className="flex items-center justify-center w-full">
            <label className="flex flex-col items-center justify-center w-full h-64 border-2 border-gray-300 border-dashed rounded-lg cursor-pointer bg-gray-50 dark:hover:bg-bray-800 dark:bg-gray-700 hover:bg-gray-100 dark:border-gray-600 dark:hover:border-gray-500 dark:hover:bg-gray-600">
              <div className="flex flex-col items-center justify-center pt-5 pb-6">
                <svg
                  className="w-8 h-8 mb-4 text-gray-500 dark:text-gray-400"
                  aria-hidden="true"
                  xmlns="http://www.w3.org/2000/svg"
                  fill="none"
                  viewBox="0 0 20 16"
                >
                  <path
                    stroke="currentColor"
                    strokeLinecap="round"
                    strokeLinejoin="round"
                    strokeWidth="2"
                    d="M13 13h3a3 3 0 0 0 0-6h-.025A5.56 5.56 0 0 0 16 6.5 5.5 5.5 0 0 0 5.207 5.021C5.137 5.017 5.071 5 5 5a4 4 0 0 0 0 8h2.167M10 15V6m0 0L8 8m2-2 2 2"
                  />
                </svg>
                <p className="mb-2 text-sm text-gray-500 dark:text-gray-400">
                  <span className="font-semibold">Click to upload</span> or drag
                  and drop
                </p>
                <p className="text-xs text-gray-500 dark:text-gray-400">
                  .MSG files only
                </p>
              </div>
              <input
                id="dropzone-file"
                type="file"
                className="hidden"
                onChange={submitMsgFile}
              />
            </label>
          </div>
          {alertMessage && (
            <div role="alert" className="alert">
              <svg
                xmlns="http://www.w3.org/2000/svg"
                className="stroke-current shrink-0 h-6 w-6"
                fill="none"
                viewBox="0 0 24 24"
              >
                <path
                  strokeLinecap="round"
                  strokeLinejoin="round"
                  strokeWidth="2"
                  d="M9 12l2 2 4-4m6 2a9 9 0 11-18 0 9 9 0 0118 0z"
                />
              </svg>
              <span>{alertMessage}</span>
              <div>
                <a href="#results" className="btn btn-sm btn-primary">
                  Get Results
                </a>
              </div>
            </div>
          )}
        </div>
      </div>
    </div>
  );
};

export default FileInput;
```
The `submitMsgFile` handler function listens for the `onChange` event from the input field. That means once the user uploads a file, it will trigger the handler function.
Next, we pass the `file` to a `FormData` object and send it to the endpoint.
As you noticed, this component has a callback prop which sends the result of the reader to the parent component. At the page level, we will pass the result to other components to display it.
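One easy hardening step, not in the original component, is to check the file extension before posting it to the endpoint; a minimal sketch (the helper name is my own):

```typescript
// Hypothetical guard: only .msg files should reach /api/reader
function isMsgFile(name: string): boolean {
  return name.toLowerCase().endsWith(".msg");
}
```

Inside `submitMsgFile`, calling `isMsgFile(file.name)` before building the `FormData` lets you show a friendly alert instead of sending an unreadable file to the backend.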
Let's now move to the `Headers` component that displays the email headers:
_components/Headers.tsx_
```typescript
"use client";

interface EmailHeadersProps {
  headerDetails: string | undefined;
}

const EmailHeaders: React.FC<EmailHeadersProps> = ({ headerDetails }) => {
  return (
    <div className="px-16 my-8" id="results">
      {headerDetails && (
        <>
          <h3 className="text-2xl font-bold mb-2">Email Headers</h3>
          <div className="mockup-code">
            <pre data-prefix="~" className="text-xs text-sky-500">
              <code
                style={{
                  display: "block",
                  marginLeft: "50px",
                  whiteSpace: "pre-wrap",
                  wordWrap: "break-word",
                  overflowWrap: "break-word",
                  wordBreak: "break-all",
                }}
              >
                {headerDetails}
              </code>
            </pre>
          </div>
        </>
      )}
    </div>
  );
};

export default EmailHeaders;
```
and finally the `Attachments` component to view the attached file names.
_/components/Attachments.tsx_
```typescript
"use client";

interface AttachmentsInterface {
  attachmentsDetails: Array<any> | undefined;
}

const AttachmentsHeaders: React.FC<AttachmentsInterface> = ({
  attachmentsDetails,
}) => {
  return (
    <div className="px-16 my-8">
      {attachmentsDetails && attachmentsDetails.length > 0 && (
        <>
          <h3 className="text-2xl font-bold">Attachments</h3>
          <dl className="max-w-md text-gray-900 divide-y divide-gray-200 dark:text-white dark:divide-gray-700">
            {attachmentsDetails.map((item) => (
              <div className="flex flex-col pb-3" key={item.fileName}>
                <dt className="mb-1 text-gray-500 md:text-lg dark:text-gray-400">
                  {item.attachMimeTag}
                </dt>
                <dd className="text-lg font-semibold">{item.fileName}</dd>
              </div>
            ))}
          </dl>
        </>
      )}
    </div>
  );
};

export default AttachmentsHeaders;
```
### **Results**
At the end the result will look like below:


### **What’s next?**
One more project added to your portfolio! Feel free to display more details or change the design as you want.
You need to deploy your NextJS app to Vercel to make it publicly accessible for others. Also, remember to explain your solution well in the README file, since it shows that you document your work.
{% embed https://github.com/PylotStuff/msg-analyzer %}
| thedevtimeline |
1,890,833 | which web development agency excels in both creativity and functionality? | Can anyone recommend a web development agency that not only crafts visually stunning designs but also... | 0 | 2024-06-17T05:38:38 | https://dev.to/akshiya_50c4f18ab5c50370c/which-web-development-agency-excels-in-both-creativity-and-functionality-2fi2 | webdev, website, beginners, discuss | Can anyone recommend a web development agency that not only crafts visually stunning designs but also ensures seamless functionality?
I'm looking for a partner that goes beyond the basics, perhaps one that adds a touch of innovation to every project. | akshiya_50c4f18ab5c50370c |
1,890,832 | Compiler In One Byte | This is a submission for DEV Computer Science Challenge v24.06.12: One Byte Explainer. ... | 0 | 2024-06-17T05:35:17 | https://dev.to/sharavana/compiler-in-one-byte-4hf1 | devchallenge, cschallenge, computerscience, beginners | *This is a submission for [DEV Computer Science Challenge v24.06.12: One Byte Explainer](https://dev.to/challenges/cs).*
## Explainer
A program that takes your code, breaks it into small meaningful pieces, checks for errors, builds a tree of expressions, and translates it to a low-level language like assembly. Why do we need it? Because we are too lazy to write assembly code.
<!-- Explain a computer science concept in 256 characters or less. -->
## Additional Context
Q) Why should I know about it?
Ans.) Just for GK that you don't need to compile if you are a lazy JS dev like me.
<!-- Please share any additional context you think the judges should take into consideration as it relates to your One Byte Explainer. -->
<!-- Team Submissions: Please pick one member to publish the submission and credit teammates by listing their DEV usernames directly in the body of the post. -->
<!-- Don't forget to add a cover image to your post (if you want). -->
<!-- Thanks for participating! --> | sharavana |
1,890,831 | how-to-test-local-website-on-your-phone by aryan | vite package.json //change vite --host 'scripts':{ 'dev':'vite... | 0 | 2024-06-17T05:31:22 | https://dev.to/aryan015/how-to-test-local-website-on-your-phone-by-aryan-3aic | react, vite, node | ## vite
`package.json`
```js
// change "dev" so vite binds to all network interfaces
"scripts": {
  "dev": "vite --host"
}
// by default, vite serves on localhost only; --host exposes it on your LAN
```
```sh
npm run dev #start vite server
```
## live server extension
If you are not using any framework, then this extension might help.
```js
// install the Live Server extension in VS Code (similar tools exist for other IDEs)
// 1. right click your HTML file
// 2. choose "Open with Live Server"
```
```sh
ipconfig # windows
# copy the IPv4 address and combine it with the dev server port
# e.g. http://192.168.87.148:5173/
```
## create react app
```sh
HOST=192.168.1.111 npm start # react-scripts reads the HOST environment variable
```
type on phone
```sh
http://192.168.1.111:3000 # the dev server prints this address under "On Your Network"
```
## How to find your IP address
### for mac
1. Open the Apple menu and choose System Preferences.
2. Open the View menu and select Network, or click Network in the System Preferences window.
3. Select your network connection from the left menu.
4. For an Ethernet or USB connection, the local IP address is displayed directly.
5. For a Wi-Fi connection, your IP address is shown in the connection status section.
### for linux
```sh
ifconfig # or: ip addr show
```
`note`: You must be on the same network, connected either over USB or Wi-Fi.
[🔗linkedin](https://www.linkedin.com/in/aryan-khandelwal-779b5723a/)
## learning resources
[🧡Scaler - India's Leading software E-learning](https://www.scaler.com)
[🧡w3schools - for web developers](https://www.w3schools.com) | aryan015 |
1,890,830 | تحميل ملف قنوات النايل سات HD | يمكنك تحميل ملف قنوات نايل سات عربي والتمتع بأحدث باقة من القنوات الجديدة | 0 | 2024-06-17T05:27:27 | https://dev.to/mahadathalyoum/thmyl-mlf-qnwt-lnyl-st-hd-4bb4 | nilesat | يمكنك [تحميل ملف قنوات نايل سات عربي](https://mahadathalyoum.com/%D9%85%D9%84%D9%81-%D9%82%D9%86%D9%88%D8%A7%D8%AA-%D9%86%D8%A7%D9%8A%D9%84-%D8%B3%D8%A7%D8%AA/) والتمتع بأحدث باقة من القنوات الجديدة | mahadathalyoum |
1,890,829 | Job portal you can apply as a software engineer in India [part-3] GOVT | Part three of find jobs for jobless people. NITIAAYOG Official Government of India... | 0 | 2024-06-17T05:26:26 | https://dev.to/aryan015/job-portal-you-can-apply-as-a-software-engineer-in-india-part-3-govt-4fc6 | google, microsoft, codechef, react | Part three of find jobs for jobless people.
## [NITIAAYOG](https://www.niti.gov.in/)
Official Government of India website for jobless people. The National Institution for Transforming India aims to uplift the job market.🤷♀️
## [AICTE](https://internship.aicte-india.org/)
All India Council for Technical Education is again a government agency to uplift the skills of freshers. I think you can find paid internships here. I myself paid 500rs to get a certificate here.🤣
## [my-gov-in](https://www.mygov.in/work-at-mygov/)
I found a developer job here paying around 50k. My favorite platform.
[🔗linkedin](https://www.linkedin.com/in/aryan-khandelwal-779b5723a/)
## learning resources
[🧡Scaler - India's Leading software E-learning](https://www.scaler.com)
[🧡w3schools - for web developers](https://www.w3schools.com) | aryan015 |
1,890,792 | Paguina | Check out this Pen I made! | 0 | 2024-06-17T04:03:36 | https://dev.to/walter_sanchez/paguina-1i85 | codepen | Check out this Pen I made!
{% codepen https://codepen.io/wadysgo/pen/qBGpKGe %} | walter_sanchez |
1,890,828 | Simple Volatility EMV Strategy | Summary Unlike other technical indicators, "Ease of Movement Value" reflects changes in... | 0 | 2024-06-17T05:25:57 | https://dev.to/fmzquant/simple-volatility-emv-strategy-1045 | strategy, trading, fmzquant, cryptocurrency | ## Summary
Unlike other technical indicators, the "Ease of Movement Value" (EMV) reflects changes in price, volume, and popularity together. It combines price and volume changes by measuring the price change per unit of volume, forming a price volatility indicator. When the market gathers popularity and trading is active, it gives a buying signal; when trading volume is low and the market's energy is about to run out, it gives a selling signal.
The simple volatility EMV is designed according to the principle of equal-volume and compressed charts. Its core concept: the market consumes a large amount of energy only when the trend turns or is about to turn, which shows up externally as increased trading volume. When the price is already rising, its momentum means little extra energy is consumed. Although this idea runs contrary to the view that volume and price should rise together, it does have its own unique merits.
## EMV calculation formula
Step 1: Calculate mov_mid
Among them, TH represents the highest price of the day, TL the lowest price of the day, YH the highest price of the previous day, and YL the lowest price of the previous day. If MID > 0, today's average price is higher than yesterday's.

Step 2: Calculate ratio
Among them, TVOL represents the trading volume of the day, TH represents the highest price of the day, and TL represents the lowest price of the day.

Step 3: Calculate emv

## EMV usage
The author of EMV believes that the huge rise is accompanied by the rapid depletion of energy, and the rise often does not last too long; on the contrary, the moderate volume, which can save a certain amount of energy, often makes the rise last longer. Once an upward trend is formed, less trading volume can push prices up, and the value of EMV will increase. Once the downtrend market is formed, it is often accompanied by an infinite or small decline, and the value of EMV will decline. If the price is in a volatile market or the price rises and falls are accompanied by a large volume, the value of EMV will also be close to zero. So you will find that EMV is below the zero axis in most of the market, which is also a major feature of this indicator. From another perspective, EMV values mega-trends and can generate sufficient profit.
The usage of EMV is quite simple, just look at whether EMV crosses the zero axis. When EMV is below 0, it represents a weak market; when EMV is above 0, it represents a strong market. When EMV changes from negative to positive, it should be bought; when EMV changes from positive to negative, it should be sold. Its characteristic is that it can not only avoid the shock market in the market, but also enter the market in time when the trend market starts. However, because EMV reflects the change in volume when prices change, it only has an effect on mid to long-term trends. For short-term or relatively short trading cycles, EMV's effect is very poor.
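Before wiring these formulas into a trading framework, they can be sanity-checked on hand-built bars. The sketch below is standalone (it does not use the FMZ API) and computes one EMV value from two consecutive bars:

```python
def emv_value(prev_bar, cur_bar, volume_scale=10000.0):
    # mov_mid: change in the bar midpoint ((high + low) / 2) versus the previous bar
    mov_mid = (cur_bar["High"] + cur_bar["Low"]) / 2 - (prev_bar["High"] + prev_bar["Low"]) / 2
    if cur_bar["High"] == cur_bar["Low"]:
        return 0.0  # avoid dividing by a zero high-low range
    # ratio: scaled volume per unit of the bar's high-low range
    ratio = (cur_bar["Volume"] / volume_scale) / (cur_bar["High"] - cur_bar["Low"])
    return mov_mid / ratio if ratio > 0 else 0.0
```

With a midpoint rise of 2 points and a scaled volume-to-range ratio of 1, the EMV comes out to 2, matching the three-step formula above.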
## Strategy realization
Step 1: Write a strategy framework
```
# Strategy main function
def onTick():
    pass

# Program entry
def main():
    while True:  # Enter infinite loop mode
        onTick()  # Execute the strategy main function
        Sleep(1000)  # Sleep for 1 second
```
FMZ.COM adopts a polling mode. First, you need to define a main function and an onTick function. The main function is the entry point of the strategy, and the program executes code line by line starting from it. In the main function, write a while loop that repeatedly executes the onTick function. All the core code of the strategy is written in the onTick function.
Step 2: Get position data
```
def get_position():
    position = 0  # The number of positions is initially 0
    position_arr = _C(exchange.GetPosition)  # Get the array of positions
    if len(position_arr) > 0:  # If the position array length is greater than 0
        for i in position_arr:  # Traverse the array of positions
            if i['ContractType'] == 'IH000':  # If the position symbol equals the subscribed symbol
                if i['Type'] % 2 == 0:  # If it is a long position
                    position = i['Amount']  # Positive number of positions
                else:
                    position = -i['Amount']  # Negative number of positions
    return position  # Return the position quantity
```
Because this strategy only needs the real-time position quantity, get_position is used here to encapsulate it, which makes the code easier to maintain. If the current position is long, it returns a positive number; if short, a negative number.
Step 3: Get K-line data
```
exchange.SetContractType('IH000')  # Subscribe to the futures contract
bars_arr = exchange.GetRecords()  # Get the K-line array
if len(bars_arr) < 10:  # If the number of K-lines is less than 10
    return
```
Before obtaining specific K-line data, you must first subscribe to a specific trading contract using the SetContractType function from FMZ.COM, passing in the contract code. If you want other information about the contract, you can also use a variable to receive this data. Then use the GetRecords function to get the K-line data; because it returns an array, we use the variable bars_arr to accept it.
Step 4: Calculate emv
```
bar1 = bars_arr[-2]  # Get the previous K-line data
bar2 = bars_arr[-3]  # Get the K-line before that
# Calculate the value of mov_mid
mov_mid = (bar1['High'] + bar1['Low']) / 2 - (bar2['High'] + bar2['Low']) / 2
if bar1['High'] != bar1['Low']:  # If the divisor is not 0
    # Calculate the value of ratio
    ratio = (bar1['Volume'] / 10000) / (bar1['High'] - bar1['Low'])
else:
    ratio = 0
# If the value of ratio is greater than 0
if ratio > 0:
    emv = mov_mid / ratio
else:
    emv = 0
```
Here, we do not use the latest price to calculate the value of EMV; instead we use the relatively lagging, already-closed K-line to generate the signal and place the order one K-line later. The purpose is to make the backtest closer to real trading. Although quantitative trading software is now very advanced, it is still difficult to completely simulate a real tick-level environment, especially when backtesting long stretches of bar-level data, so this compromise is used.
Step 5: Placing the orders
```
current_price = bars_arr[-1]['Close']  # Latest price
position = get_position()  # Get the latest position
if position > 0:  # If holding a long position
    if emv < 0:  # If emv falls below the zero axis
        exchange.SetDirection("closebuy")  # Set the trading direction and type
        exchange.Sell(round(current_price - 0.2, 2), 1)  # Close the long position
if position < 0:  # If holding a short position
    if emv > 0:  # If emv rises above the zero axis
        exchange.SetDirection("closesell")  # Set the trading direction and type
        exchange.Buy(round(current_price + 0.2, 2), 1)  # Close the short position
if position == 0:  # If there is no position
    if emv > 0:  # If emv is above the zero axis
        exchange.SetDirection("buy")  # Set the trading direction and type
        exchange.Buy(round(current_price + 0.2, 2), 1)  # Open a long position
    if emv < 0:  # If emv is below the zero axis
        exchange.SetDirection("sell")  # Set the trading direction and type
        exchange.Sell(round(current_price - 0.2, 2), 1)  # Open a short position
```
Before placing an order, we need to determine two things: the order price and the current position status. The order price is simple: just add or subtract the instrument's minimum price increment from the current closing price. Since we have encapsulated the position logic in the get_position function, we can call it directly here. Finally, positions are opened and closed according to the position of EMV relative to the zero axis.
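The open/close rules above reduce to a small decision table on the sign of EMV and the current position; this hypothetical helper (not part of the FMZ strategy code) makes the table explicit and easy to unit-test:

```python
def next_action(position, emv):
    # position: +1 long, -1 short, 0 flat; returns an order action or None
    if position > 0 and emv < 0:
        return "closebuy"   # exit the long position
    if position < 0 and emv > 0:
        return "closesell"  # exit the short position
    if position == 0:
        if emv > 0:
            return "buy"    # open a long position
        if emv < 0:
            return "sell"   # open a short position
    return None
```

Mapping the returned action onto `SetDirection` plus `Buy`/`Sell` reproduces the branching in the strategy's order-placement block.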
## Strategy backtest
Backtest configuration

Backtest log


Capital curve

## Complete strategy
```
# Backtest configuration
'''backtest
start: 2019-01-01 00:00:00
end: 2020-01-01 00:00:00
period: 1d
basePeriod: 1d
exchanges: [{"eid":"Futures_CTP","currency":"FUTURES"}]
'''

def get_position():
    position = 0  # The number of positions is initially 0
    position_arr = _C(exchange.GetPosition)  # Get the array of positions
    if len(position_arr) > 0:  # If the position array length is greater than 0
        for i in position_arr:  # Traverse the array of positions
            if i['ContractType'] == 'IH000':  # If the position symbol equals the subscribed symbol
                if i['Type'] % 2 == 0:  # If it is a long position
                    position = i['Amount']  # Positive number of positions
                else:
                    position = -i['Amount']  # Negative number of positions
    return position  # Return the position quantity

# Strategy main function
def onTick():
    # Retrieve data
    exchange.SetContractType('IH000')  # Subscribe to the futures contract
    bars_arr = exchange.GetRecords()  # Get the K-line array
    if len(bars_arr) < 10:  # If the number of K-lines is less than 10
        return

    # Calculate emv
    bar1 = bars_arr[-2]  # Get the previous K-line data
    bar2 = bars_arr[-3]  # Get the K-line before that
    # Calculate the value of mov_mid
    mov_mid = (bar1['High'] + bar1['Low']) / 2 - (bar2['High'] + bar2['Low']) / 2
    if bar1['High'] != bar1['Low']:  # If the divisor is not 0
        # Calculate the value of ratio
        ratio = (bar1['Volume'] / 10000) / (bar1['High'] - bar1['Low'])
    else:
        ratio = 0
    # If the value of ratio is greater than 0
    if ratio > 0:
        emv = mov_mid / ratio
    else:
        emv = 0

    # Place orders
    current_price = bars_arr[-1]['Close']  # Latest price
    position = get_position()  # Get the latest position
    if position > 0:  # If holding a long position
        if emv < 0:  # If emv falls below the zero axis
            exchange.SetDirection("closebuy")  # Set the trading direction and type
            exchange.Sell(round(current_price - 0.2, 2), 1)  # Close the long position
    if position < 0:  # If holding a short position
        if emv > 0:  # If emv rises above the zero axis
            exchange.SetDirection("closesell")  # Set the trading direction and type
            exchange.Buy(round(current_price + 0.2, 2), 1)  # Close the short position
    if position == 0:  # If there is no position
        if emv > 0:  # If emv is above the zero axis
            exchange.SetDirection("buy")  # Set the trading direction and type
            exchange.Buy(round(current_price + 0.2, 2), 1)  # Open a long position
        if emv < 0:  # If emv is below the zero axis
            exchange.SetDirection("sell")  # Set the trading direction and type
            exchange.Sell(round(current_price - 0.2, 2), 1)  # Open a short position

# Program entry
def main():
    while True:  # Enter infinite loop mode
        onTick()  # Execute the strategy main function
        Sleep(1000)  # Sleep for 1 second
```
The complete strategy has been published to the Strategy Square on the FMZ.COM website; click Copy to use it.
https://www.fmz.com/strategy/213636
## To sum up
Through this lesson, we can see that EMV runs contrary to what ordinary traders expect, but not without reason. Because EMV introduces volume data, it is more effective at uncovering what lies behind the price than technical indicators that use price alone. Each strategy has different characteristics. Only by fully understanding the advantages and disadvantages of different strategies, removing the dross and extracting the essence, can we move closer to success.
From: https://www.fmz.com/digest-topic/5833 | fmzquant |
1,890,827 | Environment file details required | I have Ubuntu OS on AWS free tier. Ran command: npm install express mongoose dotenv bcryptjs... | 0 | 2024-06-17T05:25:36 | https://dev.to/abanerjee1107/environment-file-details-required-55cj | node, javascript, webdev | I have Ubuntu OS on AWS free tier. Ran command:
npm install express mongoose dotenv bcryptjs jsonwebtoken cors
Now, the .env file requires the details below:
MONGO_URI=your_mongodb_uri
JWT_SECRET=your_jwt_secret
Can someone help with how to get the above environment details? | abanerjee1107 |
1,890,984 | Linkwarden: Free Open-source Bookmark Manager for Teams | In the digital age, managing a plethora of bookmarks can be a daunting task, especially for teams... | 0 | 2024-06-18T06:39:17 | https://blog.elest.io/linkwarden-free-open-source-bookmark-manager-for-teams/ | opensourcesoftwares, elestio, linkwarden | ---
title: Linkwarden: Free Open-source Bookmark Manager for Teams
published: true
date: 2024-06-17 05:22:21 UTC
tags: Opensourcesoftwares, Elestio, Linkwarden
canonical_url: https://blog.elest.io/linkwarden-free-open-source-bookmark-manager-for-teams/
cover_image: https://dev-to-uploads.s3.amazonaws.com/uploads/articles/85gqgkxwlm512t8s4thv.jpg
---
In the digital age, managing a plethora of bookmarks can be a daunting task, especially for teams working on collaborative projects.
Enter [Linkwarden](https://elest.io/open-source/linkwarden?ref=blog.elest.io), an open-source bookmark manager designed specifically for individuals and teams to collect, organize, and preserve webpages. Its feature-rich platform ensures that your bookmarks are not only organized but also preserved for future access, even if the original webpages are taken down.
With its self-hostable nature, Linkwarden provides transparency and trust, allowing users to have full control over their data. Let's dive into the core functionalities that make Linkwarden an indispensable tool for teams.
<iframe width="200" height="113" src="https://www.youtube.com/embed/owu98CVcjYc?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen title="Linkwarden: Free Open-source Bookmark Manager for Teams"></iframe>
_Watch our Linkwarden platform overview_
### Collect and Organize
Linkwarden excels at helping users collect and organize bookmarks from any browser with ease. The platform allows you to create collections and sub-collections, enabling a hierarchical structure for better organization.
Tags can be added to bookmarks, making it simple to categorize and locate specific links. For instance, you can create a new collection to group related links for a particular project, ensuring that all team members have easy access to relevant resources.
The browser extension simplifies the process of collecting webpages directly from your browser, streamlining the workflow for busy professionals.
Bulk actions, such as editing or deleting multiple items at once, further enhance the organizational capabilities. Linkwarden's import and export features allow you to seamlessly integrate your existing bookmarks, ensuring a smooth transition from other bookmark managers.
Additionally, pinning favorite links to the dashboard provides quick access to frequently used resources, boosting productivity.
The platform's powerful search functionality allows you to effortlessly filter and find curated content across all collections, making it a breeze to retrieve information.
With these features, Linkwarden transforms the way you manage your bookmarks, bringing order and efficiency to your digital life.
### Preserve Webpages
One of Linkwarden's standout features is its ability to preserve webpages, ensuring that your bookmarked content remains accessible even if the original site goes offline.
This safeguard against link rot is crucial for maintaining the integrity of your curated resources. The platform allows you to download archived versions of webpages, providing a permanent snapshot that can be revisited anytime.
This feature is particularly valuable for researchers and professionals who rely on specific online content for their work. By preserving webpages, Linkwarden ensures that you won't lose access to important information due to changes or deletions on the original site.
The platform's secure API integration allows for custom solutions and automation, enhancing the preservation process. Whether you're storing articles, blog posts, or other online content, Linkwarden's preservation capabilities provide peace of mind. This commitment to content preservation makes Linkwarden an essential tool for anyone who relies on stable, long-term access to online resources.
### Collaborate and Share
Linkwarden is designed with collaboration in mind, making it an ideal tool for teams.
You can collaborate on gathering resources under a collection, ensuring that all team members contribute to and benefit from a shared pool of information. Customizable permissions allow you to control access and editing rights, tailoring the platform to fit your team's needs.
This flexibility ensures that sensitive information remains secure while enabling seamless collaboration. Sharing curated collections is made easy, whether you want to share with team members or the public.
This feature is particularly useful for project collaboration, where sharing essential links and references is crucial for success. With its robust sharing and collaboration features, Linkwarden empowers teams to work together efficiently and effectively.
### Conclusion
In summary, [Linkwarden](https://elest.io/open-source/linkwarden?ref=blog.elest.io) is a powerful, open-source bookmark manager that caters to the needs of both individuals and teams. Its comprehensive features for collecting, organizing, and preserving webpages make it an indispensable tool for modern web users. By offering a self-hostable solution, Linkwarden ensures transparency and control over your data, while its responsive design guarantees accessibility across all devices.
[Try Linkwarden on Elestio.](https://elest.io/open-source/linkwarden?ref=blog.elest.io) | kaiwalyakoparkar |
1,873,010 | 😁 Join my community of technical writers and open source developers | A community of Developers by Developers! We all can agree that being a part of a good community can... | 0 | 2024-06-17T05:21:41 | https://dev.to/anmolbaranwal/join-my-community-of-technical-writers-and-open-source-developers-46g5 | opensource, community, learning, discuss | A community of Developers by Developers!
We all can agree that being a part of a good community can lead to exponential growth.
I love the DEV community but interacting with others is still tough, so I've created a new community `OpenSouls` for technical writers, open source developers and content creators.
You can join here: [dub.sh/opensouls](https://dub.sh/opensouls)
Yo! If you're still thinking, then let's change your mind.

---
## How is a community an ideal way to grow?
Community is where the real growth begins. 🔥
I've been part of great communities with less than 100 members and others being more than 1000 members.
I'm also a discord moderator in several others so I know how things work.
Do you have to reply to every message? No.
Do you have to read every message? No.
Do you have to share value? No.
If you do it genuinely, you can make close bonds (lifelong ones).
It's like free learning with others.
So, join a community, be there for someone, and rest assured, someone will be there for you too.
---
## What can you do with some of the channels?
You can find the channel guide as soon as you join the community but let's see what all you can do!
`💬・general-chat`: the main channel to hang out, and discuss tech, content creation, technical writing, or any cool stuff you're working on.
`📈・twitter-x-chat`: to discuss content creation for Twitter. Share any good posts you find on Twitter.
`📖・linkedin-chat`: to discuss content creation for LinkedIn. Share any good posts you find on LinkedIn.
`🏆・your-achievements`: share your cool achievements here, in terms of followers, views, stars, clients, or anything. Show the community what you're proud of!
`🤝・open-source`: share any open source projects, resources, or any cool stuff. You can even discuss growth strategy on how to grow in open source!
`🔖・awesome-articles`: share any awesome articles you have read or found.
`✨・discord-feedback`: give feedback on how to improve this server.
`🌱・profile-feedback`: ask for feedback on your profile, projects, docs, landing page, or any cool stuff you want. The community is here for you!

`⚒・jobs-internships`: post any opportunities you have or check the existing ones if you are seeking work.

`🧠・hackathons`: find team members and share about any cool hackathon you're participating in. Discuss projects and how to win them for real!
`🔥・buildspace`: to meet other builders participating in Nights and Weekends. Grow your ideas and gather feedback!
> <q>I will share all the awesome tweets, or posts from LinkedIn that I find interesting in the necessary channels, so I will make sure to keep it active.</q>
---
## Why are all the three domains together?
I do all of that stuff:
- I am an open source developer.
- I am a technical writer.
- I am a content creator.
I simply want people like me to keep each other accountable and grow beyond!
Don't worry, it's with a proper structure so you can focus on what matters.
---
## $40 for Lifetime Growth
I've received a lot of requests over the months with doubts and I always reply to them.
But we can both agree that almost nobody is serious!
If you want to be mentored by me on how to grow as a technical writer and open source developer, you can sponsor me on GitHub and you will get a private channel - it's lifetime growth.
To be frank, you don't have to do this because there will be lots of developers and writers you can learn from for free.
Read the complete details here on this [notion document](https://anmolbaranwal.notion.site/Lifetime-Growth-844e8b39890b4a9297427debcbd95e72) including how it works, expectations, and if it's worth the deal.
---
## Need feedback
Okay, let me be frank. I'm very new to this.
Any tips on how to run a good community are appreciated.
Or any feedback in terms of structure, content, suggestions, or anything.
I've put a lot of thought into making this, and I've tried to keep it organized. Please [click here to tweet about this community](https://twitter.com/intent/tweet?text=Join%20the%20awesome%20community%20for%20developers%20and%20technical%20writers%20at%20dub.sh%2Fopensouls%20by%20%40anmol_codes) to spread the word.
ty for reading!
see you there :) | anmolbaranwal |
1,885,930 | The Humans of OpenTelemetry: KubeCon EU 2024 | We’re back with our second edition of Humans of OpenTelemetry, this time from KubeCon EU in Paris.... | 0 | 2024-06-17T04:00:00 | https://opentelemetry.io/blog/2024/humans-of-otel-eu-2024/ |

We’re back with our second edition of [Humans of OpenTelemetry](https://opentelemetry.io/blog/2023/humans-of-otel/), this time from KubeCon EU in Paris. Once again, [Reese Lee](https://github.com/reese-lee) and I interviewed OpenTelemetry contributors and end users, and learned how they got involved with OTel:
* [Iris Dyrmishi (Miro)](https://www.linkedin.com/in/iris-dyrmishi-b15a9a164/)
* [Severin Neumann (Cisco)](https://github.com/svrnm)
* [Kayla Reopelle (New Relic)](https://github.com/kaylareopelle)
* [Morgan McLean (Splunk)](https://github.com/mtwo)
* [Henrik Rexed (Dynatrace)](https://github.com/henrikrexed)
* [Vijay Samuel (eBay)](https://github.com/ccaraman)
* [Daniel Gomez Blanco (Skyscanner)](https://github.com/danielgblanco)
* [Doug Odegaard (ServiceNow)](https://github.com/dodegaard)
* [Adnan Rahić (Tracetest)](https://github.com/adnanrahic)
* [Rynn Mancuso (Honeycomb)](https://github.com/musingvirtual)
Also, special thanks to:
* [Reese Lee](https://github.com/reese-lee), my co-interviewer
* [Henrik Rexed](https://github.com/henrikrexed) for providing the audio and video recording equipment, and doing the initial edits of the raw footage
* [Zhu Jiekun](https://github.com/jiekun) for assisting with his own camera
You can watch the full recording below:
{% embed https://youtu.be/bsfMECwmsm0?si=qiC2KfKOBmIvj0WW %}
Thanks to everyone who has contributed to OpenTelemetry to date, and we look forward to your continued contributions in 2024 and beyond! 🎉
### **Transcript**
If reading is more your thing, check out the transcript of our conversations below.
#### **1- Meet the Humans of OTel**
**IRIS DYRMISHI:** Well, I’m Iris Dyrmishi. I’m a senior observability engineer at Miro and my life, my professional life is all about observability. I build an observability platform that provides the tools for engineering teams at Miro to monitor, to observe and get the best of their applications.
**SEVERIN NEUMANN:** My name is Severin Neumann. I’m working at Cisco at the open source program office and I’m a member of the OpenTelemetry governance committee and I’m one of the co-maintainers of the OpenTelemetry documentation.
**KAYLA REOPELLE:** My name is Kayla Reopelle. I work for New Relic and I am contributing to the OpenTelemetry Ruby project.
**MORGAN MCLEAN:** My name is Morgan McLean, I’m a director of product management at Splunk. I’ve been with OpenTelemetry since day one. I’m one of the co-founders of the project. I’m on the governance committee. Wow. What do I work on within OTel? A bit of everything. I mean early on it was literally everything. Myself and Ted and various others were doing many, many jobs. More recently I was involved in the release of traces, metrics 1.0. Logs 1.0 last year. Right now I’m working on profiling as well as OpenTelemetry’s expansion into mainframe computing.
**HENRIK REXED:** My name is Henrik Rexed. I am a cloud native advocate at Dynatrace and I’m passionate about observability, performance, and I’m trying to help the community by providing content on getting started on any solutions out there.
**VIJAY SAMUEL:** My name is Vijay Samuel and I help do architecture for the observability platform at eBay.
**DANIEL GOMEZ BLANCO:** I’m Daniel Gomez Blanco. I’m a principal engineer at Skyscanner and also member of the OpenTelemetry governance committee.
**DOUG ODEGAARD:** My name is Doug Odegaard. I’m a senior solutions architect with ServiceNow Cloud Observability, which is also formerly Lightstep. I’m also a previous customer of using OpenTelemetry for several years prior to that.
**ADNAN RAHIĆ:** Hey, I am Adnan. I work at Tracetest as a developer advocate which is…you can guess better than me what that is. Pretty much do a bunch of everything regarding OpenTelemetry. I’m one of the contributors for the documentation, for the blog, and the demo.
**RYNN MANCUSO:** My name is Rynn Mancuso. I work for Honeycomb.io and I am one of the maintainers of the End User SIG.
#### **2- What does observability mean to you?**
**IRIS DYRMISHI:** What does observability mean to me? observability to me is the biggest passion of my life and also my professional career. It is one of those areas that you are not very interested when you start your career because you don’t know anything about it. It’s not taught in school, it’s not preached by the tech communities a lot, but then you discover it and say, “Wow, this is amazing!” We’re actually making a change and we’re helping the teams make the best of their product. So yeah, that’s all.
**SEVERIN NEUMANN:** I think observability is a big game changer, right? So it’s evolution from what we have done, especially APM, over the last few years. So I worked for a very long time at AppDynamics and we sold APM agents to customers and we gave them a lot of the things that observability is promising today as well. But the big change I see with observability that it’s coming down, let’s say to everybody, right? So this is making the things that we did there available for everybody. And even more, we’re moving away from this… Hey, let’s add a post compilation agent into your application to like, yeah, let’s make native observability. Let’s make this a thing that developers, that operation teams are using across all the organizations.
**KAYLA REOPELLE:** So to me observability means having peace of mind. It means having something that you can rely on in order to see what happened and what went wrong. I think observability is also a way to feel more technically connected to your customers and your users, so that you can see the ways that they’re interacting with your software instead of just the ways that you might interact with it.
**MORGAN MCLEAN:** I mean, observability to me transcends just the computing industry. It’s the ability to peer into something and understand how it works, what it’s doing right now, and thus if it breaks, how to fix it more quickly. Certainly when we think about telemetry in this industry, what observability classically has meant is visibility to backend infrastructure and applications kind of excitingly, I think it’s expanding right now, right? With OpenTelemetry we’re pushing into client applications, we’re pushing into mainframes, as I mentioned earlier. And so it’s really visibility into any systems that impact your business, any technical system observability.
**HENRIK REXED:** Usually when people mention of observability they say it’s a replacement of the old name monitoring. But in fact for me it’s more than monitoring, because monitoring is like, you just look at something and observability is like having enough information to understand a given situation. So if you just look at metrics then, okay, you have a guess that something is going on, but you don’t understand. So having the options to get more information like logs, events, exceptions, traces, compiling, then at the end combine all those dimensions together, then you say, okay, I got it, this is my problem and I can resolve it.
**VIJAY SAMUEL:** What does observability mean to me? I belong to what is called the site engineering organization inside of eBay, and our goal is to make sure that we can observe everything that’s going on in the site and ensure that we have high availability. So basically, observability means knowing if the site is running fine or not, because that’s why I’m there.
**DANIEL GOMEZ BLANCO:** What does observability mean to me? It’s a way for us to understand what’s happening within our systems, because we run quite a complex system, so we need to understand what goes on inside of them so we can deliver a good experience for our end users at the end of the day.
**DOUG ODEGAARD:** So observability is, to me, I’ve been a full stack developer for years, and so as we observe…actually I ended up on an incident response team doing tracking of incidents, but also trying to figure out what was wrong. And it pointed out to me how much we need this, how hard it was to look at so many different screens and so forth.
**ADNAN RAHIĆ:** Observability for me is the way to actually see what’s happening in your system. It’s the pinnacle of not being up the whole night trying to figure out what went wrong. And with OpenTelemetry and with the rise of tracing the last couple of years, it has hit an all time high with regards to the possibilities that we have right now. So I’m just really, really happy to be part of the project. I’m also really happy that it’s growing at that pace, that it’s growing right now, and I can’t see how that’s going to evolve within the next couple of years.
**RYNN MANCUSO:** For me, observability is about being able to ask deeper questions of our systems, being able to demand, I think more than just alerting on things that are emergencies, things we’ve seen before, but actually being able to go out into the unknown and understand how complex systems are performing.
#### **3- What does OpenTelemetry mean to you?**
**IRIS DYRMISHI:** OpenTelemetry is the tool that is making observability great again. I would say that observability is seeing the surge, now that OpenTelemetry is becoming so popular, it’s allowing centralization of telemetry signals, it’s allowing semantic conventions, and it’s generally helping observability teams and engineering teams take more attention to the observability and building it and making it better.
**SEVERIN NEUMANN:** What does OpenTelemetry mean to me? I think it’s the vehicle for observability. It’s enabling that. And I joined OpenTelemetry community a few years back because I was curious about this idea to bring observability to everybody. And I think we are doing a really good job. And what it also means to me now is that it’s an amazing community. Right? So we’re at KubeCon here, and I meet so many people I just know from those conversations, and now I can talk to them in person. And we talk a lot about OpenTelemetry, but we also talk a lot about other things than OpenTelemetry. We talk about observability, of course, about what we think about is going to happen in a few years and all those other things, and that’s what OpenTelemetry means to me.
**KAYLA REOPELLE:** So OpenTelemetry to me seems like it’s a community effort to take the best of what’s already been out there for instrumentation and collect it in one group so that everyone can benefit from it. I think that we’ve learned so much as different agent engineers, but there’s also so much to learn from users of the products themselves. And OpenTelemetry does a great job of bringing both people who are, you know, experts in observability, and experts in languages to make something that’s really great and meaningful for everyone.
**MORGAN MCLEAN:** I mean, OpenTelemetry is my baby. Put so much effort into creating this project. What does it mean to me? I mean, there’s the boring answer, which is it extracts signals: metrics, traces, logs, profiles, everything else from your infrastructure, from your services, from your clients, makes those observable, processable on the backend. But I think to a lot of us who’ve been in this community so long, and a lot of us like yourself and Henrik here and others who participate in the community so much, I mean, OTel is also just a really nice open source community to participate in. It’s a thing I just enjoy working on. I know that’s abstract and kind of like a sort of squishy thing to say, but I don’t know. OTel has a lot of meaning to me in many different ways. All very positive.
**HENRIK REXED:** OpenTelemetry for me, means the future. Because at the end, by having an open standard, we have the luxury to have a common standard for common format, for all the solution of the market and having that common format for all the industry and all the vendors and all the solutions, it will just open use cases. I think testing used to rely on, I don’t know, feedback from users. And now with observability data, we could be so much efficient in the way we’re testing, we could be so much efficient in replacing marketing tools, business analytics tools. I think it’s the future. And one thing that also a lot of people talk about, AI everywhere, machine learning, blah, blah, blah, but I think it’s the same thing as a Tesla. I mean, Tesla, when you drive your car, it takes decisions based on the sensors that it measures. And if you don’t have those sensors and those measurements, then you cannot have a smart… you can have the smartest systems, but without the data, you cannot take the right decisions. I think it’s an enabler also for the future implementations of modern applications.
**VIJAY SAMUEL:** OpenTelemetry is the standard for observability going forward, and it’s very important because as we have gone through the journey of observability over the past few years, we have had to hunt for open standards in Prometheus and few others. Now, at least with ingestion and collection, it’s a single standard for everyone to adopt. And I think that’s pretty powerful for the long run.
**DANIEL GOMEZ BLANCO:** What does OpenTelemetry mean to me? That, I think, is bringing people together, bringing everyone together under one single language and one single way of thinking about telemetry. I think human languages are difficult enough for us to understand each other. And I think, you know, OpenTelemetry is bringing the technology together and one single way of like, thinking about telemetry, thinking about how we observe our systems.
**DOUG ODEGAARD:** To me, OpenTelemetry is bringing the ability to have product teams, infrastructure teams, helping their jobs make it easier and also just improve the customer experience and just make it overall a better experience to do our jobs.
**ADNAN RAHIĆ:** OpenTelemetry is the, I’m going to say, the future of observability. We’ve seen so many companies, many vendors move to an OpenTelemetry-first mindset, and the way that you can use OpenTelemetry to generate them, to actually gather all telemetry signals with one set of libraries, with one tool. It’s just the way it was supposed to be. You’re not locked into one tool, one vendor, one cloud provider anymore. You can do basically whatever you want, and you can use both the metrics, logs, and traces for basically anything you want to do. Really happy to see it.
**RYNN MANCUSO:** OpenTelemetry is an instrumentation protocol that helps us ask more detailed questions about observability because it collects multiple signals from many flexible types of systems. Folks monitor everything from the control plane in Kubernetes all the way up to physical on-prem systems. It’s a really flexible language and it’s beautiful community of humans that came together over the pandemic to build something really special.
#### **4- How did you get involved with OpenTelemetry?**
**IRIS DYRMISHI:** I was working in a very fast-paced observability team, and we were maintaining a lot of tools, and we really did not have conventions there, we did not have centralization, and we really were not flexible when it came to backends, or vendor agnostic in general. So we discovered this amazing tool called OpenTelemetry. We said okay, let’s give it a try. It worked great for us. And here I am today, more than one year later, let’s say pushing the migration to OpenTelemetry in my second project.
**SEVERIN NEUMANN:** How did I get involved into OpenTelemetry? So yeah, I mentioned that… so I got curious a few years ago. So I was… I was at AppDynamics working as a so-called domain architect, and I was an expert for Node.js, Python and a lot of those other languages. And there was always this conversation around like, hey, there’s this thing now called OpenTelemetry and should we not integrate this into our product? And I was like, okay, I want to learn more. Then I was like, what is a good way to learn something new about an open source technology? Yeah, get involved into that. So I was involved in JavaScript at some point, and then at some point I realized like, yeah, but if I really want to get a good view into OpenTelemetry, doing documentation is a good way into that. And that’s how I ended up being a maintainer for the documentation.
**KAYLA REOPELLE:** I got involved in OpenTelemetry last spring when New Relic asked me to take a look at what the current status was of the OpenTelemetry Ruby project. I also work as an engineer on the New Relic Ruby agent team, and that gave me an opportunity to start to contribute to the project. And I noticed that a lot of the signals for Ruby weren’t yet stable. So a lot of my work so far has been going into trying to bring logs and metrics to stability in Ruby.
**MORGAN MCLEAN:** I was working at Google on Google’s observability products like tracing, profiling, debugging, that sort of thing. And one of the challenges we had in tracing was getting data from people’s applications. It’s really, really hard. You need integrations of hundreds of thousands of pieces of software. No one team, no one company is going to maintain that. It’s just infeasible. And so we want to do something open source. There were other open source standards. There was one that had started, I think, roughly around the same time we were doing this, called OpenTracing. We started OpenCensus.
At some point, especially amongst the more social media savvy members, the team, which I am not one of, there was some contention between those projects about where people who maintain databases and language runtime things should actually spend their integration efforts, and it was limiting the success of both projects. So I was leading OpenCensus. Ted and Dan and others were leading OpenTracing. And in late 2018, early 2019, we finally sort of brought things to a head and decided to merge those into what is now called OpenTelemetry. So that’s sort of, you know, I’ve been involved since then, I’ve been…now I work at Splunk. Different company, but still on the same types of things. But that’s how my involvement started, and it’s just grown and grown and snowballed from there.
**HENRIK REXED:** When I started the adventure in observability, of course, I joined Dynatrace, and Dynatrace has their vendor agent, the OneAgent, and I saw this movement of OpenTelemetry, and coming from the performance background, I looked at it and I said, “Whoa, an open standard.” “That sounds quite exciting” because I had a performance, a gig for a customer, where I implemented like a collecting logs and processing it and putting machine learning. And I told myself at that time, it would be so wonderful to have one common standard. So then instead of doing a custom implementation, I could have something for everyone. And when I looked at the, just the definition of the project and the things behind the project, I was so excited. I said, oh, gosh, I want to be involved in the project. And that’s where I started to build content to help the community get started.
I used to be a developer, but I’m a bad developer for sure. So that’s why I’m trying to help the project in other ways, in all the directions. And yeah, my goal is increase the adoption of the open standards, making sure that it’s been adopted everywhere, so then we can move forward by implementing even more exciting implementations.
**VIJAY SAMUEL:** I started a few years ago for two reasons. One, we were looking to introduce tracing inside the company, and at that time, OpenTracing and OpenCensus was converging into OpenTelemetry. We started evaluating OpenTelemetry for that. And given that we were moving into OpenTelemetry for tracing, I also went through the journey of migrating our metrics collection into OpenTelemetry. That’s basically how I got involved.
**DANIEL GOMEZ BLANCO:** How did I get involved in OpenTelemetry? I got involved through my work at Skyscanner, as an end user. I was driving adoption and open standards for telemetry. During COVID there was a need for simplification in how we approach infrastructure, how we collect, how we process, and how we export telemetry data, and to basically lead the adoption of open standards and that simplification effort. So as an observability lead, I got more involved in the community aspect of OpenTelemetry, decided to interact with other end users and meet people that want to solve the same problems and want to find a solution that works for everyone.
**DOUG ODEGAARD:** So, OpenTelemetry, I actually, for several years, in my previous position, I was hired to actually develop observability software. I was writing my own thing, we were doing a lot of alert management and various things. It was so much work and I thought, this has got to be easier. Plus I wanted to make sure that it could be future, future proof, dare I use that term? But also extensible.
And when I discovered OpenTelemetry, I was just like, oh, thank you. Because it’s something that the company could carry forward. And also we didn’t have to worry about storing the data as much. And so it’s really provided a really excellent platform so that we can focus on the task at hand versus how to do the job. So how I got involved in the project was actually first as a customer. It was about three, close to four years ago, kind of the infancy of OpenTelemetry. And I would go online, I would look at the documentation, or I would be in the code a lot, but I wanted to learn more. So I would go to a SIG call and there would be someone from Google and Microsoft and other companies, and then there was this guy from this small fintech in the US. And at first it was a little awkward, but they were so excited to have me in the call because I was an end user. And so it really was, it was a wonderful experience to begin that way, to realize that I could contribute to this rather than simply be a consumer of it. So it was great. And then I transitioned my career into working for a vendor, and we implement these systems now for customers like myself that I was years ago. So it’s kind of a pay it forward, give back type of thing.
**ADNAN RAHIĆ:** How did we get involved into the OpenTelemetry project? We started contributing more to the blog with you guys started contributing a bit to the docs as well. And yeah, it’s just been a whole-hearted effort in the team to always kind of dedicate a few, a few minutes of each day to check out the OpenTelemetry project, find a way to contribute.
**RYNN MANCUSO:** I got involved in the OpenTelemetry project…honestly, I was working at one observability company in marketing, and they didn’t see the point. They didn’t want me to get involved. And I really believed in open source. I’d worked in Mozilla and Wikimedia and really believed that, like, this was the way forward from a strategic perspective. So the second I could switch to a company that did let me get involved, that’s what I did. And now I’m at Honeycomb. And I’m glad to say within the first three months, I made project member and started working with the End User Working Group and worked to grow it into a SIG, into all the programs that it has today, together with others.
#### **5- What’s your favorite telemetry signal?**
**IRIS DYRMISHI:** Tracing is my favorite signal.
**SEVERIN NEUMANN:** My favorite signal now is profiling, because I think this is really closing a big gap that was missing in observability, right? So I mentioned before, right, I come from the APM space, and now for me, APM, observability, it’s very hard to make, like, a difference here. But one thing that when I talk with people using APM products right now is they’re like, hey, where’s code level visibility with OpenTelemetry, right? My commercial agent is giving me that line of code that is breaking something. And this is what we get with profiling. And that’s why I’m really, really excited about it.
**KAYLA REOPELLE:** To decide a favorite signal is kind of difficult for me. I really love the power of traces. I think that traces can tell stories in ways that are very meaningful. But on the same, like, on the other hand, I’ve been so immersed in logs and trying to allow logs to have more connections to spans and traces, I definitely have a soft spot for logs as well.
**MORGAN MCLEAN:** I mean, I’m partial to distributed traces because that’s where this project got its start. And I think early on, that’s where a lot of the value was. No one else was really doing standardized distributed trace collection right? There were some open source examples of it attached to, like, Zipkin and Jaeger. But I think the reason OpenTelemetry got so much traction so quickly is that it was providing that.
I’m also partial to logs, which we launched last year, just because that’s one where, like, I’ve been involved in a lot of parts of OTel… But that’s one where like, I was involved in a lot of the core specification early on in driving that. And so it was really exciting to see that ship. Also, logs are just a thing that throughout my career before working on any of this, I just get frustrated with, because they’re never standardized, slow to process, they’re expensive. OTel going to bring a lot of changes there for the better for everyone who uses logs.
And finally, I guess profiles, because I work on that now. When I was at Google many years ago, I launched what I think was the world’s first distributed continuous profiling product, at least publicly available one, which was Google cloud profiling, Stackdriver profiling, they still support it, I still think it’s free, it’s very powerful. But profiling has always been a bit of a niche thing. Like, I know, like at Splunk and other companies, we support it, but it’s not as well known as metrics, and traces, and logs. I think with OTel, starting later this year, we’re gonna launch like full support for profiles. That’s really gonna change. Like, we had customers at Google who would spend an hour of our profiler and save like 20, 30% of their aggregate compute because they found some really poorly optimized code really quickly. For more people to have that ability and speed things up and for developers actually to get insight into how things work, that’s super exciting. Like, the tech has been there a long time and OTel bringing this mainstream is huge.
**HENRIK REXED:** When people ask me, who is your favorite kid? Usually I say, I don’t have a favorite kid, you know. All my kids are wonderful. They all have, I don’t know, a great thing, you know, out of it. So I think I love traces because sometimes it helps you to understand where it slowed down. I love metrics because as a performance engineer, I used to use metrics a lot. And I love logs because logs at the end, there’s no sampling. So if you just do analytics on logs, wow, you are so much precise.
So I don’t think I have a favorite signal. I’ll just say that depending on what I need and pick and choose, there’s clearly one signal that will help me more. There’s one thing that I’m very eager and waiting since Valencia is continuous profiling, because I love profiling and I think traces is great, but if there is a problem somewhere, profiling would be so much helpful. So I think, yeah, I don’t answer your questions, but I say, yeah, I love all the signals provided by OpenTelemetry.
**VIJAY SAMUEL:** I am thoroughly biased towards metrics. I feel metrics are the most powerful signal. As long as you are thinking through your instrumentation and making sure that you have the right granularity cardinality being sent in, to the platform, you can do powerful, powerful things with regards to anomaly detection, machine learning and many other things. So I love metrics.
**DANIEL GOMEZ BLANCO:** I mean, I have to say traces, because they give you the context. Traces give you the backbone correlation for all the other signals, right? But I do think that the current design of the API design of metrics is so powerful that I’m like falling in love again with metrics because of that way that we decouple instrumentation and measurement from aggregation of metrics is so powerful and so much richness to basically give us a way to describe our systems, that I’m falling back again in love with metrics.
**DOUG ODEGAARD:** My favorite signal, I have to say, I’m partial to traces because I’ve been doing software development for so long that that was the first thing that really turned me on to it was the ability to see that, especially because I know what it’s like, like to debug. But it’s also, I also know what it’s like in an incident to have to focus in very quickly. So yes, traces are my favorite, but I do also like to send that trace ID and span ID into the logs now. It’s kind of becoming my next favorite.
**ADNAN RAHIĆ:** My favorite signal is traces. I’m going to say traces, definitely. My favorite singer is Ed Sheeran.
**RYNN MANCUSO:** What is my favorite signal? I mean, I work for Honeycomb, so I am constitutionally obliged to say traces are my favorite signal.
### **Join us!**
If you have a story to share about how you use OpenTelemetry at your organization, we’d love to hear from you! Ways to share:
* Join the [#otel-sig-end-user channel](https://cloud-native.slack.com/archives/C01RT3MSWGZ) on the [CNCF Community Slack](https://communityinviter.com/apps/cloud-native/cncf)
* Join our [OTel in Practice](https://opentelemetry.io/community/end-user/otel-in-practice/) sessions
* Share your stories on the [OpenTelemetry blog](https://opentelemetry.io/docs/contributing/blog/)
* Contact us on the [CNCF Community Slack](https://communityinviter.com/apps/cloud-native/cncf) for any other types of sessions you’d like to see!
Be sure to follow OpenTelemetry on [Mastodon](https://fosstodon.org/@opentelemetry) and [LinkedIn](https://www.linkedin.com/company/opentelemetry/), and share your stories using the **#OpenTelemetry** hashtag!
And don’t forget to subscribe to our [YouTube channel](https://youtube.com/@otel-official) for more great OpenTelemetry content!
| avillela | |
1,889,536 | Implementing Complex Form Logic Effortlessly with forms.js 🚀 | In the world of web development, forms are the key points of user interaction, collecting necessary... | 0 | 2024-06-17T05:19:33 | https://dev.to/trimatic/implementing-complex-form-logic-effortlessly-with-formsjs-2hg0 | javascript, opensource, webdev, frontend | In the world of web development, forms are the key points of user interaction, collecting necessary information that drives business processes and system inputs. However, as straightforward as forms might appear, their underlying logic can scale in complexity very quickly. From conditional fields that appear based on previous answers to real-time validation that checks user input against database records. Managing form behavior can become a daunting task, especially in large-scale applications.
This is where **forms.js** comes into play. Created with the intent to ease web forms development and management, forms.js is a robust JavaScript library designed to streamline the creation and handling of dynamic forms. Whether you are building a simple contact form or a complex multi-stage application process, forms.js offers a suite of tools that make it easy to implement even the most complex form logic effortlessly.
As the lead developer and creator of forms.js, I have seen many forms go from simple to barely manageable very quickly as projects grew. If you can relate to this, it might be time to try our library out. Let’s dive into how forms.js can help you manage web forms more effectively, ensuring a smooth and accessible user experience.
## Why forms.js?
If you are not convinced yet, let's go through a couple of the advantages forms.js can bring to your project.
#### Simplification of source code
Our library can significantly improve code readability. You can forget about cluttered, several-thousand-line form files where finding the hidden logic is about as enjoyable as your next dentist appointment.
#### Ease of use
You don’t need to be an expert in JavaScript to work with this library; if you understand `json` input, you are fine for the time being.
#### Extendability
The fact that the basics are very simple to use does not mean experienced devs cannot wield the full power of the library. There is a plugin system that allows anybody to write and use custom plugins. Plugins can use features like validation, conditional logic, and data management, all handled by the library, so you can focus only on what you need to implement.
#### High performance in lightweight size
Despite its powerful features, forms.js is lightweight and has a minimal impact on your project’s performance.
You can find the [full documentation on our website](https://formsjs.io/documentation/v1/getting-started) where all features are listed and explained.
#### It is open source and continuously improving
As an open-source project hosted on [GitHub](https://github.com/form-js/forms.js), forms.js is continuously improved by contributions from developers around the world. This collaborative effort keeps the library up to date with the latest web technologies and best practices.
## Getting Started with forms.js
We expect you will be using NPM in your project; if not, check the [documentation](https://formsjs.io/documentation/v1/getting-started) for other ways of including the library in your project.
First let’s install the package in our project:
```powershell
npm i @forms.js/core
```
Then the package import will be available in our JavaScript files.
The main class is called `Form` and it is responsible for initializing and handling form behaviour.
```js
import { Form } from '@forms.js/core';
```
If we want to use the forms.js default styles, we can import the CSS file too.
```css
@import "@forms.js/core/css/index.css";
```
Within a few minutes we can already start creating forms. To showcase how simple form creation is, below is the code for a login form.
```js
import { Form } from "https://esm.sh/@forms.js/core";
new Form("login-form", {
id:"login-form",
schema: [
{
id: "username",
type:"text",
label: "Username",
required: true,
},
{
id: "password",
type:"password",
label: "Password",
required: true,
},
{
id: "rememberUser",
type:"checkbox",
label: "Remember me",
toggle: true,
},
{
id:"buttonGroup",
type:"group",
schema:[
{
id: "loginButton",
type:"button",
template: "Login",
},
]
}
]
});
```
{% codepen https://codepen.io/trilmatic/pen/yLZrNQJ %}
## Deep Dive: Implementing Complex Form Logic
While forms.js simplifies form handling, its true strength lies in effortlessly managing complex form logic that can be daunting with traditional approaches. We'll explore dynamic form fields based on user input, and real-time data validation, each accompanied by detailed code snippets.
As our example, we will create a registration form for a SaaS product. In this hypothetical SaaS, a super admin has the ability to register users under their company, and that is the form we will build. The form will have the following fields:
- Role - select - the role the user will have in the system, required
- Personal Information
- First Name - text, required
- Last Name - text, required
- Email - validated email
- Contact Details - visible if a role is not guest
- Phone Number - validated phone number - required if the role is not guest
- Address - text for simplification
- Admin Credentials - visible if a role is admin
- Password - validated password
- Terms and Conditions - checkbox - required before submission
This list of fields is essentially the schema of the form; we just need to write it in the correct input format.
We will start by defining the role select field:
```js
const registrationFormSchema = [
{
id: "role",
type: "select",
default: "user",
optionsList: [
{
value: "guest",
label: "Guest",
},
{
value: "user",
label: "User",
},
{
value: "admin",
label: "Admin",
},
]
}
];
```
After that we can start implementing the rest of the logic; many conditions will be based on this role select field. We can organize the fields into groups to allow better field separation. The conditional logic is [documented here](https://formsjs.io/documentation/v1/fields).
```js
const registrationFormSchema = [
// previous field implementation,
{
id: "personalInformation",
type: "group",
schema: [
{
id: "firstName",
type: "text",
label: "First Name",
required: true,
},
{
id: "lastName",
type: "text",
label: "Last Name",
required: true,
},
{
id: "email",
type: "email",
label: "Email",
required: true,
},
]
},
{
id: "contactInformation",
type: "group",
conditions: (data)=>{
return data.role !== 'guest';
},
schema: [
{
id: "phoneNumber",
type: "tel",
label: "Phone Number"
},
{
id: "address",
type: "text",
label: "Address"
},
]
}
];
```
We can see that we have defined the `conditions` parameter on the contact group as a function that returns a `boolean` value. When the function returns true, the field is visible; if it returns false, the field gets hidden. Now we can finish the rest of the fields; we will implement custom validation for the password and the terms fields.
```js
const registrationFormSchema = [
// previous field implementation,
{
id: "adminCredentials",
type: "group",
conditions: (data)=>{
return data.role === 'admin';
},
schema: [
{
id: "password",
type: "password",
label: "Admin Password",
required: (value, data)=>{
return data.role === "admin";
},
validation: (value, data, required) => {
if(required && !value){
return "This field is required";
}
if(value && !value.match(/[0-9]/)){
return "Password must include at least one number";
}
return true;
}
},
]
},
{
id: "terms",
type: "checkbox",
label: "I agree to the terms and conditions",
required: true,
validation: (value, data, required) => {
if(required && !value){
return "You need to agree with terms and conditions to submit the form";
}
return true;
}
},
];
```
For our password field we have defined a custom `required` function that makes the field required only when the role is `admin`. We have also implemented a `validation` function that checks that the password includes at least one number. The validation function returns true if the field is valid; otherwise it returns a string with the error message. We have applied similar logic to the terms field and defined a custom validation error there too.
Now we will just add a submit button and initialize the form. Below you can find a CodePen with this solution.
```js
import { Form } from "https://esm.sh/@forms.js/core";
const registrationFormSchema = [
// previous field implementation,
{
id:"buttonGroup",
type:"group",
schema:[
{
id: "submitButton",
type:"button",
template: "Create",
},
]
}
];
const form = new Form("form", {
id:"registrationForm",
schema: registrationFormSchema,
});
```
{% codepen https://codepen.io/trilmatic/pen/KKLZzER %}
## Conclusion
The integration of forms.js into your project management tools not only simplifies the handling of complex forms but also elevates the user experience by making it more interactive and responsive. As we've explored in this article, forms.js provides a robust framework for dynamically managing form inputs, enforcing real-time validations, and adapting form behavior based on user interactions. These capabilities are crucial for modern day platforms and applications, especially in areas requiring precise data collection and processing like project management.
By leveraging forms.js, developers can significantly reduce the time and effort spent on form-related coding while improving the accuracy and user-friendliness of their applications. Next time we will explore the application of forms.js in large-scale projects and create a multi-page form using the premium features.
If you have any questions or just want to say hi, please create a [discussion on our GitHub project](https://github.com/form-js/forms.js/discussions). We highly appreciate all contributions and are continuously working on improving the library. You can also find more information about the project on the [forms.js website](https://formsjs.io/).
Finally, if you've implemented forms.js in your projects, share your experiences and outcomes on social media or tech forums. Your insights and feedback are invaluable in helping others understand the potential benefits of using forms.js and in driving forward the evolution of this powerful tool. | trimatic |
1,890,813 | Essential Terms in Generative AI Explained You Must Know | Quick Summary:- A comprehensive guide to artificial intelligence terminology curated by the Brilworks... | 0 | 2024-06-17T05:08:55 | https://dev.to/vikas_brilworks/essential-terms-in-generative-ai-explained-you-must-know-o72 | aiterminology, gans |
Quick Summary:- A comprehensive guide to artificial intelligence terminology curated by the Brilworks AI team.
Essential Terms in Generative AI Explained You Must Know
A comprehensive guide to artificial intelligence terminology curated by the Brilworks AI team.
Artificial intelligence is a topic of widespread discussion today, with everyone from professionals to the general public talking about its potential impact on our lives and jobs. With so much conversation, we frequently encounter many terms such as machine learning, NLP, generative AI, prompt, large language models, etc.
Generative AI is filled with several technical terms, and one may feel a little lost when these terms pop up. If you are one of them, then this article is for you.
We have curated a list of technical terms that frequently appear when we learn about generative AI. These are terms you'll likely encounter today or in the near future.
This article will list essential terms to help you understand AI terminology better. Whether you're a business owner or an enthusiast eager to learn about artificial intelligence, this article will improve your understanding.
**Understanding AI Terminology: A Beginner's Guide**
AI has been around since the 1950s; however, many people were not aware of this transformative technology until the 2020s. Many advancements had been made, but AI did not become widely popular until the launch of ChatGPT, which took the internet by storm by reaching a million users in just 5 days, a feat that took Instagram, Netflix, and Spotify months or even years.
Now, several AI-powered tools are available in the market for content markets, designers, and business owners, taking humans’ productivity and creativity to the next level.
With this growing popularity, however, the flood of new terminology is leaving many people confused.
**Generative AI Terms to Know in 2024**
Generative AI terms are popping up everywhere with the emergence of popular tools such as ChatGPT, Bard, etc. This landscape can be both exciting and confusing for beginners, as several terms are used interchangeably. Though a comprehensive list would run to many hundreds of words, we will jot down some of the most popular Gen AI terms that every professional should know.
**1. Artificial intelligence**
Artificial intelligence (AI) is a broad field focused on creating machines and programs that perform tasks which typically require human intelligence, though not always as dramatically as depicted in fiction. An AI-powered machine or program can perform tasks such as learning, reasoning, problem-solving, understanding natural language, and perception.
AI was conceptualized in the 1940s and 1950s, with the aim of enabling machines to think and operate autonomously. Over the decades, AI has developed into various subsets, each focusing on different aspects of intelligent behavior.
One subset is generative AI, which includes technologies that can create content autonomously. Generative AI includes models capable of generating text, images, music, and other forms of content by understanding and mimicking patterns found in the data they are trained on.
**2. Neural networks**
Did you know the human brain is the inspiration behind neural network architecture? In our brains, there are cells called neurons. These neurons form a highly complex and interconnected network to send electrical signals that help humans process information. The neural network is made up of artificial neurons (also called nodes) that send signals to one another. You can consider them a building block that contains information about the patterns and relationships in the data on which they were trained.
Several types of artificial neural networks exist today:
- Feedforward neural networks (FF), one of the oldest forms of neural networks.
- Recurrent neural networks (RNN), used for speech recognition, translation, and to caption images.
- Long/short-term memory (LSTM)
- Convolutional neural networks (CNN)
- Generative adversarial networks (GAN)
**3. GPT**
GPT stands for generative pre-trained transformer. It is developed by OpenAI, the company behind the popular ChatGPT tool. ChatGPT contains GPT, so now you might be wondering exactly what a generative pre-trained transformer(GPT) is.
Apart from ChatGPT, many other generative AI applications surfacing across the internet are built upon GPT. This simply means that behind the scenes a GPT model is operating, with some custom modifications to fine-tune it. This is why many chatbots write in similar tones: in the end they share the same brain, that is, the same underlying GPT model. The model can be considered the brain of your program.
OpenAI has rolled out different versions of its GPT models, including GPT-3.5, GPT-4, and GPT-4o; some are paid, with advanced capabilities, while GPT-3.5 is available to the public for free.
**4. NLP**
NLP stands for Natural Language Processing. It refers to the processing of natural languages like the ones we speak and write, as opposed to machine languages. Nowadays, machines are capable of understanding and processing our natural language. You can communicate with them using your everyday language, and they can grasp your intent.
NLP is a subset of artificial intelligence that enables interactions between machines and humans. When a system, machine, or program has NLP ability, you can interact with it using your own language. For example, AI Chatbots can understand sentiments like humans and respond accordingly. Have you ever wondered how they do it? It's the NLP technology behind them that powers them to understand our sentiments.
**5. GAN**
GAN stands for Generative Adversarial Network, a type of neural network model. In this model, two neural networks compete against each other to generate authentic results. One network generates new data, while the other tries to distinguish if it's real or fake. They continue improving until the second network can't distinguish fake from real anymore. GAN includes a generator and a discriminator, two neural networks that work in tandem to generate content.
**6. Discriminator**

The discriminator is a type of neural network that competes against a generator in a GAN (Generative Adversarial Network), pushing the generator to produce data that is indistinguishable from real data. Artificial intelligence is often trained through a process similar to how humans learn: when someone criticizes something you create, it allows for improvement, and this iterative pattern is applied in machine learning as well. Machines enhance their performance and results through feedback mechanisms.

Similar adversarial concepts can be found in other models besides GANs, although they may not be referred to as GANs. This method, where two neural networks compete against each other to produce more authentic results, is common in machine learning. The discriminator is a key component inside GANs, responsible for distinguishing real data from fake data.
**7. LLM**

LLM stands for large language model. In AI, a large language model refers to a computer program trained on massive amounts of text data from the internet, books, articles, and more – thousands or even millions of gigabytes' worth of text. Based on what it has learned, it can understand written language, write essays, answer questions, and even hold conversations.
Here are some examples of large language models (LLMs):
- GPT (Generative Pre-trained Transformer) series by OpenAI
- BERT (Bidirectional Encoder Representations from Transformers) by Google
- T5 (Text-To-Text Transfer Transformer) by Google
- XLNet by Google Brain
- CTRL (Conditional Transformer Language Model) by Salesforce
- Megatron by Nvidia
- Turing-NLG by Microsoft
**8. Deep Learning**
In the field of AI, different methods are used to train AI models. One prominent approach involves neural networks with many layers (hence "deep") to model complex patterns in data. Deep learning has revolutionized many fields within artificial intelligence. However, it's important to note that deep learning is just one approach among several in machine learning.
**9. Model**
AI models, or artificial intelligence models, are computer programs that find patterns in large sets of data. They can take in information, analyze it, and then make decisions or perform actions based on what they've learned. As we have learned, ChatGPT and Google's Gemini are AI models, specifically large language models.
**10. Supervised learning**
Several machine learning models utilize supervised learning. It is a subset of machine learning that uses labeled datasets to train algorithms to recognize patterns. Data labeling in machine learning is the process of identifying and labeling raw data.

For example, a label may indicate whether an object is a bird or a car. Labelers may assign tags by simply answering yes/no. In supervised learning, the ML model uses human-provided labels to learn the underlying patterns in a process called "model training."
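To make the idea concrete, here is a toy sketch of one of the simplest supervised-learning algorithms, a one-nearest-neighbor classifier. The tiny labeled dataset below is invented purely for illustration: each sample has one numeric feature and a human-provided label, and the model predicts by copying the label of the closest labeled example.

```javascript
// Toy supervised-learning sketch: a one-nearest-neighbor classifier.
// The labeled dataset is made up for the example.
const labeledData = [
  { feature: 1.0, label: "bird" },
  { feature: 1.2, label: "bird" },
  { feature: 4.0, label: "car" },
  { feature: 4.3, label: "car" },
];

// Predict by copying the label of the closest labeled example.
function classify(feature, data) {
  let best = data[0];
  for (const sample of data) {
    if (Math.abs(sample.feature - feature) < Math.abs(best.feature - feature)) {
      best = sample;
    }
  }
  return best.label;
}

console.log(classify(1.1, labeledData)); // → bird
console.log(classify(3.8, labeledData)); // → car
```

The human-provided labels are exactly what the algorithm learns from; without them, the same data could only be explored with unsupervised methods.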
**11. Unsupervised learning**
In unsupervised learning, machines learn without human supervision. In this learning, the machine is provided with raw data to discern patterns and insights without any explicit guidance or instruction.
**12. Multimodal AI**
Multimodal AI programs are gaining traction in 2024, as they offer advanced capabilities to process a variety of inputs, including text, images, audio, and video, and to convert these inputs into different formats. Google's Gemini is one popular multimodal AI program that can read and extract data from images.
**13. Reinforcement learning**
Reinforcement learning (RL) is a type of machine learning where software learns to make decisions through trial and error. It mimics how humans learn by trying different actions and remembering what works. Good actions are rewarded, and bad ones are ignored.
RL algorithms use rewards and punishments to learn the best way to achieve their goals. They can even handle situations where they need to make short-term sacrifices for long-term benefits. This makes RL a powerful tool for training AI to perform well in new and unpredictable situations.
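As a minimal illustration of this trial-and-error loop, below is a toy sketch of epsilon-greedy action selection on a two-armed bandit. Everything here is invented for the example (the payout values, epsilon, the step count, and the small deterministic random-number helper); real RL systems are far more sophisticated, but the reward-driven learning loop is the same idea.

```javascript
// Toy reinforcement-learning sketch: epsilon-greedy on a two-armed bandit.
// The agent tries both arms, tracks the average reward each arm has paid,
// and gradually settles on the better one.
function runBandit({ steps = 1000, epsilon = 0.1, seed = 42 } = {}) {
  // Deterministic payouts so the example is reproducible:
  // arm 0 always pays 0.2, arm 1 always pays 0.8.
  const rewards = [0.2, 0.8];
  const counts = [0, 0]; // how many times each arm was pulled
  const values = [0, 0]; // running average reward estimate per arm

  // Small deterministic pseudo-random generator (mulberry32) for repeatability.
  let a = seed;
  const rand = () => {
    a |= 0; a = (a + 0x6d2b79f5) | 0;
    let t = Math.imul(a ^ (a >>> 15), 1 | a);
    t = (t + Math.imul(t ^ (t >>> 7), 61 | t)) ^ t;
    return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
  };

  for (let i = 0; i < steps; i++) {
    // Explore a random arm with probability epsilon; otherwise exploit
    // the arm with the best reward estimate so far.
    const arm =
      rand() < epsilon ? (rand() < 0.5 ? 0 : 1) : (values[1] > values[0] ? 1 : 0);
    const reward = rewards[arm];
    counts[arm] += 1;
    values[arm] += (reward - values[arm]) / counts[arm]; // incremental average
  }
  return { counts, values };
}
```

After enough steps, the agent pulls the higher-paying arm far more often, which is the "good actions are rewarded" loop described above.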
**14. Prompt**
A prompt in AI is a command written in natural human language that describes the task the AI model should perform. A prompt can be text, images, or any other data. The quality and specificity of the prompt can significantly influence the quality and relevance of the generated output.
**15. Token**
In AI, a token is a basic unit of data that algorithms process, especially in natural language processing (NLP) and machine learning. Tokens are parts of a larger dataset and can be words, characters, or phrases.
For instance, when handling text, a sentence is split into tokens, where each word or punctuation mark is a separate token. This step, called tokenization, is essential for preparing data for AI models.
Tokens are not limited to text. They can represent different data types and are vital for AI to understand and learn from these types. In computer vision, a token might be a segment of an image, like a group of pixels or a single pixel. In audio processing, a token could be a short snippet of sound. This versatility makes tokens crucial for AI to interpret and learn from various forms of data.
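As a rough illustration of text tokenization, here is a naive tokenizer sketch. Note that this is only a toy: production NLP systems use subword tokenizers such as BPE or WordPiece rather than a simple regular expression, but the idea of splitting a larger input into small processable units is the same.

```javascript
// Naive tokenization sketch: split a sentence into word and punctuation tokens.
function tokenize(text) {
  // \w+ matches runs of word characters; [^\w\s] matches single punctuation marks.
  return text.match(/\w+|[^\w\s]/g) ?? [];
}

console.log(tokenize("Tokens are parts of a larger dataset."));
// yields the tokens: "Tokens", "are", "parts", "of", "a", "larger", "dataset", "."
```

Each token produced here would then be mapped to a numeric ID before being fed to a model.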
**16. Hallucinations**
In AI, hallucinations refer to instances where a model generates output that seems plausible but is actually incorrect or nonsensical. This is common in natural language processing (NLP) when an AI system produces text that looks coherent but is factually wrong or misleading.
For example, a chatbot might confidently provide a made-up answer to a question, or a text generation model might invent details that were not present in the original data. Hallucinations occur because the model predicts based on patterns it learned during training, rather than verifying the accuracy of the information.
**17. Generative models**
Generative models are a type of AI that creates new data similar to the data they were trained on. They learn the underlying patterns and structures of the training data and use this knowledge to produce new, similar instances. For example, a generative model trained on text data can produce new sentences or paragraphs that mimic the style and content of the original text.
Similarly, in image processing, a generative model can create new images that resemble the training images. These models are widely used in applications like text generation and image synthesis.
**Conclusion**
Artificial intelligence, or AI, is growing fast, but it's still new. However, looking at things now, it's clear that it will soon be a big part of our lives. AI terms will become more common in our daily conversations. If you want to know more about AI terms, keep reading our blog. We'll keep posting to help you learn more about AI development.
Originally published at https://www.brilworks.com/blog/essential-terms-in-generative-ai/ | vikas_brilworks |
1,890,811 | Guo Wengui Wolf son ambition exposed to open a farm wantonly amassing wealth | Since fleeing to the United States in 2014, Guo Wengui has founded a number of projects in the United... | 0 | 2024-06-17T05:05:07 | https://dev.to/thomasazh65/guo-wengui-wolf-son-ambition-exposed-to-open-a-farm-wantonly-amassing-wealth-3a3j | Since fleeing to the United States in 2014, Guo Wengui has founded a number of projects in the United States, such as GTV Media Group, GTV private equity, farm loan project, G Club Operations Co., LTD., and Himalaya Exchange. Around 2017, he started the so-called "Revelations revolution" and in 2020 launched a movement called "New China Federation." However, Guo Wengui's "disclosure revolution" soon exposed its false nature. He frequently carried out so-called "live Revelations" on the Internet, fabricating various political and economic lies and fabricating facts to discredit the Chinese government. At the beginning, due to his special image of "exiled rich" and "Red fugitive", he quickly gathered some popularity and followers, but as time went by, Guo Wengui's commitment and image were gradually exposed, and his supporters began to leave him. See the essence of the Revelations will turn to the farm, Guo Wengui's fraud is not only for funds and other institutions, its followers have also become a sheep that is only continuously harvested wool. The little ants who trusted him so much became victims of fraudulent investment scams. It is hoped that more people will recognize the true face of Guo Wengui, join the team of "smashing Guo", expose his fraud, recover losses for themselves and others, and maintain an honest and trustworthy social environment.

| thomasazh65 | |
1,890,810 | Unveiling the Pinnacle of Luxury Living: High-End Property Market | Are you in the market for a luxury home that exudes elegance and sophistication? Look no further than... | 0 | 2024-06-17T05:03:36 | https://dev.to/damienarnolde/unveiling-the-pinnacle-of-luxury-living-high-end-property-market-4nk6 | <p><span style="font-weight: 400;">Are you in the market for a luxury home that exudes elegance and sophistication? Look no further than the </span><a href="https://rsfluxuryliving.com/"><strong>high-end property market</strong></a><span style="font-weight: 400;">, where you will find an array of premium real estate listings that cater to even the most discerning buyers. In this article, we will explore the world of luxury real estate and highlight the benefits of working with a professional real estate service like Rancho Santa Fe Luxury Living.</span></p>
<h2><strong>Luxury Real Estate Listings: The Epitome of Elegance</strong></h2>
<p><span style="font-weight: 400;">When it comes to </span><a href="https://rsfluxuryliving.com/"><strong>luxury real estate listings</strong></a><span style="font-weight: 400;">, quality is paramount. From sprawling estates with panoramic views to sleek and modern penthouses in the heart of the city, the high-end property market offers a wide range of options for those seeking the ultimate in luxury living. Whether you are looking for a home that exudes old-world charm or a contemporary masterpiece, there is something for everyone in this exclusive market.</span></p>
<h2><strong>The Benefits of Working with a Premium Real Estate Service</strong></h2>
<p><span style="font-weight: 400;">Navigating the world of luxury real estate can be daunting, which is why partnering with a premium real estate service like Rancho Santa Fe Luxury Living is essential. With their expertise and dedication to excellence, you can rest assured that your home will be marketed to the right audience and sold for the best possible price. From professional photography and virtual tours to strategic marketing plans, their team will go above and beyond to showcase your property's unique luxury features and attract qualified buyers.</span></p>
<h2><strong>Experience the Difference with Rancho Santa Fe Luxury Living</strong></h2>
<p><span style="font-weight: 400;">Listing your home with Rancho Santa Fe Luxury Living is more than just a transaction – it is a partnership built on trust, expertise, and personalized attention. Their team of experienced agents understands the intricacies of the high-end property market and will work tirelessly to ensure a smooth and successful sales process. From the initial consultation to the final closing, you can expect nothing but the best from this top-tier real estate service.</span></p>
<h2><strong>Conclusion</strong></h2>
<p><span style="font-weight: 400;">In conclusion, the high-end </span><a href="https://rsfluxuryliving.com/"><strong>premium real estate services</strong></a><span style="font-weight: 400;"> offer a wealth of opportunities for those seeking a luxury home that embodies sophistication, elegance, and exclusivity. Working with a premium real estate service like Rancho Santa Fe Luxury Living ensures that your property is in capable hands, with a team dedicated to achieving exceptional results. So why settle for anything less than the best when it comes to listing your luxury home? Experience the pinnacle of luxury living with Rancho Santa Fe Luxury Living and make your real estate dreams a reality.</span></p> | damienarnolde |
1,890,809 | NFTs in Academic Credential Verification | Introduction The integration of technology into the education sector has... | 27,673 | 2024-06-17T05:02:56 | https://dev.to/rapidinnovation/nfts-in-academic-credential-verification-5hde | ## Introduction
The integration of technology into the education sector has revolutionized
various traditional processes, including how academic credentials are managed
and verified. With the rise of digital solutions, there is a growing need to
ensure that these innovations not only streamline operations but also enhance
security and trust in academic qualifications. This introduction explores the
transformative potential of Non-Fungible Tokens (NFTs) in the realm of
academic credential verification, highlighting how they can address current
challenges in the system.
## What are NFTs?
Non-fungible tokens (NFTs) are digital assets that represent ownership or
proof of authenticity of a unique item or piece of content, primarily on the
blockchain. Unlike cryptocurrencies such as Bitcoin or Ethereum, which are
fungible, meaning each unit is the same as every other unit, NFTs are unique
and cannot be exchanged on a one-to-one basis. This uniqueness and the ability
to verify authenticity and ownership have led to NFTs becoming increasingly
popular, particularly in the worlds of art, collectibles, and digital content.
## How NFTs are Applied in Academic Credential Verification
NFTs are revolutionizing various sectors, including education, by offering
innovative solutions for verifying academic credentials. The traditional
methods of issuing and verifying educational qualifications are often
cumbersome, time-consuming, and susceptible to fraud. NFTs, leveraging
blockchain technology, provide a secure, transparent, and efficient way to
handle these credentials.
## Types of NFTs Used in Education
The integration of NFTs into the educational sector is becoming increasingly
prevalent, with various types of NFTs being utilized to enhance learning
experiences, verify achievements, and manage educational content. Here are
some of the primary types of NFTs used in education:
## Benefits of Using NFTs for Credential Verification
The use of NFTs for credential verification offers numerous benefits that can
revolutionize how qualifications are managed and recognized across various
sectors. One of the most significant advantages is the enhanced security that
comes with blockchain technology. NFTs provide a secure and immutable record
of credentials, reducing the risk of fraud and unauthorized alterations.
## Challenges in Implementing NFTs for Credential Verification
Implementing NFTs for credential verification, while promising, comes with
significant technological challenges. One of the primary barriers is the need
for widespread blockchain literacy. Both the issuers of credentials and the
recipients need to understand how to use blockchain technology and NFTs, which
currently is not the case. This gap in knowledge can hinder the adoption and
effective use of NFTs in credential verification.
## Future of NFTs in Academic Credential Verification
The future of NFTs in academic credential verification looks promising, with
potential to streamline processes, reduce fraud, and increase accessibility.
As technology evolves, we can expect more sophisticated and user-friendly
platforms to emerge, making it easier for institutions to issue and for
individuals to maintain and share their credentials securely.
## Real-World Examples of NFTs in Credential Verification
Non-fungible tokens (NFTs) are increasingly being recognized for their
potential beyond the digital art market, particularly in the field of
credential verification. This innovative application of blockchain technology
offers a secure, immutable, and transparent method to manage and verify
credentials, which is a significant step forward in combating fraud and
enhancing the integrity of educational and professional qualifications.
## Why Choose Rapid Innovation for Implementation and Development
Choosing Rapid Innovation for implementation and development is a strategic
decision that can significantly benefit organizations aiming to stay ahead in
the fast-evolving technological landscape. Rapid Innovation, as a concept,
refers to the quick adoption and integration of cutting-edge technologies to
improve processes, products, or services. This approach is particularly
crucial in today’s digital age, where technological advancements occur at an
unprecedented pace.
## Conclusion
The integration of NFTs into the realm of credential verification marks a
transformative shift in how personal and professional qualifications are
issued, managed, and recognized. NFTs serve as a robust solution to many of
the challenges faced by traditional credentialing systems, including fraud,
inefficiency, and the lack of portability across different jurisdictions. By
encoding credentials on the blockchain, NFTs ensure that each certificate is
unique, verifiable, and secure from unauthorized alterations or duplications.
Drive innovation with intelligent AI and secure blockchain technology! 🌟 Check
out how we can help your business grow!
[Blockchain App Development](https://www.rapidinnovation.io/service-development/blockchain-app-development-company-in-usa)
[AI Software Development](https://www.rapidinnovation.io/ai-software-development-company-in-usa)
## URLs
* <http://www.rapidinnovation.io/post/how-nfts-revolutionize-academic-credential-verification>
## Hashtags
#NFTsInEducation
#BlockchainCredentials
#DigitalDiplomas
#CredentialVerification
#EdTechInnovation
| rapidinnovation | |
1,890,808 | Job portal you can apply as a software engineer in India [part-2] | This is the part two of our previous post. I will tell you on public demand that what are the more... | 0 | 2024-06-17T05:01:04 | https://dev.to/aryan015/job-portal-you-can-apply-as-a-software-engineer-in-india-part-2-2ihh | webdev, sde, css, react | This is part two of our previous post. By popular demand, I will tell you about more platforms you can apply to. Let's go 🏊♀️🩲🤣
## Internshala
It is an ed-tech cum job portal primarily used by college students, but experienced people can also apply.
## GFG job fair
Another platform for jobless people (kidding). It provides a whopping 90 percent discount on every course, but the catch is that you have to complete it within 3 months. Here is a hack I used myself.
1. Search the course you want to learn on GFG.
2. Now go to YouTube and search for the same course. 🤣
3. Practice one or two months.
4. Apply for the GFG courses. I hope I don't get arrested 🙏 by Sandeep sir.
[Aryan's LinkedIn 🤣](https://www.linkedin.com/in/aryan-khandelwal-779b5723a/)
## learning resources
[🧡Scaler - India's Leading software E-learning](www.scaler.com)
[🧡w3schools - for web developers](www.w3school.com)
| aryan015 |
1,894,060 | Tutorial: Defining the Domain entities | This blog is the part of the series Building a Web App with Golang. In today's article we are going... | 0 | 2024-06-19T20:31:12 | https://blog.gkomninos.com/tutorial-defining-the-domain-entities | golanguage, domaindrivendesign, tutorial, webdev | ---
title: Tutorial: Defining the Domain entities
published: true
date: 2024-06-17 05:00:27 UTC
tags: GoLanguage,DomainDrivenDesign,Tutorial,WebDevelopment
canonical_url: https://blog.gkomninos.com/tutorial-defining-the-domain-entities
---
This blog is the part of the series Building a Web App with Golang. In today's article we are going to define the basic entities and the operations on them for our web application. In the first part we defined the scope of the application. We are goi... | gosom |
1,873,266 | Job Security is a Myth: Three Lessons on Layoffs | I've been laid off more than once. I know how it feels. I know that momentary feeling of relief... | 27,567 | 2024-06-17T05:00:00 | https://canro91.github.io/2023/08/21/OnLayoffs/ | career, careerdevelopment, beginners | I've been laid off more than once.
I know how it feels. I know that momentary feeling of relief followed by the uncertainty of a "What am I going to do now?"
Here are 3 lessons on layoffs for my past and younger self.
## 1. There's No Such Thing As "Job Security"
You could lose your job at any time, for reasons you don't and can't control. A pandemic, a company going bankrupt, or an economic recession.
Based on [layoffs.fyi](https://layoffs.fyi/), 165,269 tech employees lost their job in 2022, 263,180 in 2023, and 89,193 in 2024 until May.
The real question isn't _if_ it will ever happen to you, but _when_. You're better off preparing for that.
## 2. Build an Emergency Fund
I can't stress this enough: have an emergency fund.
An emergency fund is keeping enough savings to cover your essential expenses. The longer, the better.
That's the breathing room until you figure out something. And it's the difference between being picky and accepting anything for money.
## 3. Always Be Ready To Leave
Have your CV updated. Stay in touch with your colleagues and ex-coworkers. Build your professional network.
Always be ready for an interview. Have your data structures and "tell me about yourself" muscles in shape.
Interviewing is broken, I know! But let's always be ready to leave.
Don't wait for a layoff to establish an online presence and grow your network. By then, it will be too late.
***
_Hey, there! I'm Cesar, a software engineer and lifelong learner. Visit my [Gumroad page](https://imcsarag.gumroad.com) to download my ebooks and check my courses._ | canro91 |
1,890,807 | Food Ordering Delivery Software For Restaurants | Are you looking for effective Food Delivery Software? Then check out SpotnEats Food Delivery... | 0 | 2024-06-17T04:55:23 | https://dev.to/sharonpaula_ffcdeabe74ece/food-ordering-delivery-software-for-restaurants-24mg | food, foodorderingapp, uber, uberclone | Are you looking for effective Food Delivery Software? Then check out **SpotnEats** Food Delivery Software!
It is a comprehensive solution for restaurants, cloud kitchens, food halls, and online food ordering and delivery platforms. With SpotnEats, you get more than just software – you get a customizable food delivery management system designed to simplify every aspect of your business. From seamless food ordering experiences to efficient delivery logistics, our solution has you covered. Experience the power of [SpotnEats](https://www.spotneats.com/food-ordering-delivery-software-for-restaurants) with our user-friendly mobile apps, customize it to match your brand identity, and watch your business soar to new heights!
Contact us for further queries. Visit: https://www.spotneats.com/food-ordering-delivery-software-for-restaurants
Mail id: hello@spotneats.com
Call now & Whatsapp Inquires: +91 8122405057
| sharonpaula_ffcdeabe74ece |
1,890,803 | Valibot: A New Approach to Data Validation in JavaScript | I recently got to hang with the creator of Valibot, Fabian Hiller on a live stream. We discussed its... | 26,157 | 2024-06-17T04:54:14 | https://opensauced.pizza/docs/community-resources/valibot-a-new-approach-to-data-validation-in-javascript/ | javascript, typescript, webdev, opensource | I recently got to hang with the creator of [Valibot](https://valibot.dev/), [Fabian Hiller](https://megalink.io/fabian) on a live stream. We discussed its history of the project and did some live coding with Valibot. Let’s dig in.
## The history of Valibot
If video is your jam, check out this highlight from the live stream that summarizes the history of Valibot.
{% embed https://www.twitch.tv/videos/2172124000 %}
During [his thesis work](https://valibot.dev/thesis.pdf), developer Fabian Hiller found himself with dedicated time to pursue an idea he'd been mulling over - creating a new modular data validation library for JavaScript. This led to the birth of Valibot.
Fabian had previously worked on [Modular Forms](https://modularforms.dev/), but he wanted to bring that same modular philosophy to data validation. While popular validation libraries like [Zod](https://zod.dev/) offer excellent APIs, Fabian felt there was room to take modularity even further.
> "For Zod, it doesn't make sense to make it extremely modular as Valibot, because most Zod users love Zod for its API", Fabian explained. "This would probably be too big of a breaking change."
Instead of trying to rebuild Zod from the ground up, he decided a fresh start made more sense. Valibot aims for ultimate modularity, allowing developers to compose small, reusable validation units together.
Fabian didn't work in isolation. He reached out to Zod's creator Colin McDonnell, but the timing didn't line up for deeper collaboration initially. Fabian remains in touch with McDonnell and other open source maintainers though.
> "I'm sure improvements I made in Valibot will hopefully improve other libraries, and other libraries will hopefully affect and improve Valibot," he said. "I hope at the end we end up with great open source projects, and the community can choose what they prefer."
With Valibot, Fabian hopes to provide developers a new, composable approach to data validation. And by cross-pollinating with other libraries, he aims to push the entire JavaScript validation ecosystem forward.
## A First Look at Valibot
If you want to experiment with Valibot, I recommend you check out the [Valibot playground](https://valibot.dev/playground/). Fabian actually [made a change to enable prettier support](https://x.com/FabianHiller/status/1801975870917087644) after our live stream! 🤩
Also, [version 0.31.0 was recently released](https://valibot.dev/blog/valibot-v0.31.0-is-finally-available/) with a whole rework of the API.
Let's start off simple. We want to create an email validator. Valibot makes this pretty easy for us.
```typescript
import * as v from 'valibot';
const EmailSchema = v.pipe(v.string(), v.email());
const validEmail = v.safeParse(EmailSchema, 'jane@example.com');
console.log(validEmail);
```
First, we import the Valibot package. Next, we create a schema for a valid email, `const EmailSchema = v.pipe(v.string(), v.email());`
`v.pipe` is powerful: it allows us to chain validators. First, we check that the input is a string via `v.string()`, and then that it's a valid email via `v.email()`.
If you run this in the playground, you'll get the following output.
```bash
[LOG]: {
typed: true,
success: true,
output: "jane@example.com",
issues: undefined
}
```
You can view the following example in [this Valibot playground](https://valibot.dev/playground/?code=JYWwDg9gTgLgBAKjgQwM5wG5wGZQiOAcg2QBtgAjCGQgbgCh6BjCAO1XgFERlhSBlJgAsApjzgBeTADowwMCIAUGaRyjBWAc0UBKADQyxvUrp0NmbDpjLAAJt2OSZqZNhEAFZFFRKHfQaI8BoQAVsisIgACIgAeyOCkItIsIIRmjCzsEInSpBDaJOT2PHzpQA).
Let's see what happens when we have an invalid email.
```typescript
import * as v from 'valibot';
const EmailSchema = v.pipe(v.string(), v.email());
const validEmail = v.safeParse(EmailSchema, 'janeexample.com');
console.log(validEmail);
```
If we run the updated playground, it will now output the following:
```bash
[LOG]: {
typed: true,
success: false,
output: "janeexample.com",
issues: [
{
kind: "validation",
type: "email",
input: "janeexample.com",
expected: null,
received: "\"janeexample.com\"",
message: "Invalid email: Received \"janeexample.com\"",
requirement: RegExp,
path: undefined,
issues: undefined,
lang: undefined,
abortEarly: undefined,
abortPipeEarly: undefined
}
]
}
```
You can view the updated example in [this Valibot playground](https://valibot.dev/playground/?code=JYWwDg9gTgLgBAKjgQwM5wG5wGZQiOAcg2QBtgAjCGQgbgCh6BjCAO1XgFERlhSBlJgAsApjzgBeTADowwMCIAUGaRyjBWAc0UBKADQyxvUrp0NmbDpjLAAJt2OSZqZNhEAFZFFRKHfQaI8BoQAVsisIiIAHsjgpCLSLCCEZows7BDx0qQQ2iTk9jx8qUA).
You can see an example of Valibot in action in a recent pull request of mine.
```typescript
if (context.query.id) {
try {
sharedChatId = parseSchema(UuidSchema, context.query.id);
searchParams.set("id", sharedChatId);
} catch (error) {
captureException(new Error(`Failed to parse UUID for StarSearch. UUID: ${sharedChatId}`, { cause: error }));
throw new Error("Invalid shared Chat ID");
}
}
```
{% embed https://github.com/open-sauced/app/pull/3563 %}
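For context, the `UuidSchema` and `parseSchema` names in that snippet come from the app's own codebase rather than from Valibot itself. As a rough sketch under that assumption, such a schema could be built with the v0.31+ API like this:

```typescript
import * as v from "valibot";

// Hypothetical reconstruction of the schema referenced in the PR above.
// v.uuid() validates that the string matches the UUID format.
const UuidSchema = v.pipe(v.string(), v.uuid());

// v.parse throws a ValiError on invalid input, which is what the
// surrounding try/catch in the PR snippet would catch.
const sharedChatId = v.parse(UuidSchema, "123e4567-e89b-12d3-a456-426614174000");
```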
## Contributing to Valibot
Valibot is open source, like many things in the JavaScript ecosystem.
The project has a low [lottery factor](https://opensauced.pizza/blog/Understanding-the-Lottery-Factor), and it also has high [contributor confidence](https://opensauced.pizza/docs/features/repo-pages/#insights-into-contributor-confidence) (many stargazers and forkers come back later on to make a meaningful contribution).
[](https://app.opensauced.pizza/s/fabian-hiller/valibot)
If you're looking to contribute to open source in the JavaScript/TypeScript ecosystem, Valibot might be up your alley.
## Wrapping Up
We only scratched the surface of Valibot, but I encourage you to check it out. Valibot was highlighted in the latest bytes.dev issue, [VALIBOT AND THE CIRCLE OF LIFE](https://bytes.dev/archives/297). You know a library is gaining traction if bytes.dev covers it!
Stay saucy peeps!
If you would like to know more about my work in open source, [follow me on OpenSauced](https://oss.fyi/nickytonline). | nickytonline |
1,890,806 | Upgrade with F1 Heute and OSFP Transceivers at GBIC Shop | Discover advanced communication solutions with GBIC Shop's F1 Heute transceiver and OSFP transceiver.... | 0 | 2024-06-17T04:53:59 | https://dev.to/gbicshop/upgrade-with-f1-heute-and-osfp-transceivers-at-gbic-shop-143h | f1heute, 7mcm | Discover advanced communication solutions with GBIC Shop's F1 Heute transceiver and OSFP transceiver. The **[f1 heute](https://www.gbic-shop.de/gp-10gsfp-1z-f1-kompatibel)** transceiver offers exceptional frequency stability and a user-friendly interface, perfect for seamless, reliable communication. The **[osfp](https://www.gbic-shop.de/400g-qsfp-dd-osfp-transceiver)** transceiver, designed for high-speed data transfer, ensures efficient, high-performance network connectivity. Both transceivers are engineered for optimal durability and superior performance. Trust GBIC Shop for competitive pricing, expert support, and fast shipping. Enhance your communication and network infrastructure with the best technology available. Visit GBIC Shop today to explore our extensive selection and upgrade your systems with top-tier transceivers!
 | gbicshop |
1,890,805 | Write your first Spring Boot application | Creating a Spring Boot application involves several steps. Below is a guide to help you get started... | 27,843 | 2024-06-17T04:48:02 | https://dev.to/jottyjohn/write-your-first-spring-boot-application-119d | springboot, microservices, restapi | Creating a Spring Boot application involves several steps. Below is a guide to help you get started with a simple Spring Boot application:
**Prerequisites**
**Java Development Kit (JDK):** Ensure you have JDK 8 or later installed.
**IDE:** An Integrated Development Environment like IntelliJ IDEA, Eclipse, or VSCode.
**Maven/Gradle:** Build tools for managing project dependencies.
**Step-by-Step Guide**
**1. Set Up Your Project**
You can set up your project using Spring Initializr or manually.
**Using Spring Initializr:**
- Go to Spring Initializr.
- Select the following options:
**1. Project:** Maven Project
**2. Language:** Java
**3. Spring Boot:** Latest stable version
**4. Project Metadata:** Fill in Group, Artifact, and other details.
**5. Dependencies:** Add dependencies like Spring Web, Spring Data JPA, H2 Database, etc.
- Click on Generate to download the project as a zip file.
- Unzip the file and open it in your IDE.
**Manually:**
- Create a directory for your project.
- Create a pom.xml file (for Maven) or build.gradle file (for Gradle).
- Add the necessary Spring Boot dependencies.
Example pom.xml for Maven:
```
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.demo</groupId>
<artifactId>demo</artifactId>
<version>0.0.1-SNAPSHOT</version>
<packaging>jar</packaging>
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>3.0.0</version>
<relativePath/> <!-- lookup parent from repository -->
</parent>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-jpa</artifactId>
</dependency>
<dependency>
<groupId>com.h2database</groupId>
<artifactId>h2</artifactId>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
</project>
```
**2. Create the Main Application Class**
Create a main class annotated with @SpringBootApplication. This class will serve as the entry point for your application.
```
package com.demo.app;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
@SpringBootApplication
public class SpringBootApp {
public static void main(String[] args) {
SpringApplication.run(SpringBootApp.class, args);
}
}
```
**3. Create a REST Controller**
Create a simple REST controller to handle HTTP requests.
```
package com.demo.controller;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;
@RestController
public class HelloController {
@GetMapping("/hello")
public String sayHello() {
return "Hello, World!";
}
}
```
**4. Configure Your Application**
Configure your application properties. Open src/main/resources/application.properties and add any necessary configurations.
For example, to configure the H2 database:
```
spring.datasource.url=jdbc:h2:mem:testdb
spring.datasource.driverClassName=org.h2.Driver
spring.datasource.username=sa
spring.datasource.password=password
spring.h2.console.enabled=true
spring.jpa.database-platform=org.hibernate.dialect.H2Dialect
```
**5. Run Your Application**
You can run your application in several ways:
- Using your IDE's run configuration.
- Running the main method in SpringBootApp.
- Using Maven from the command line: mvn spring-boot:run.
**6. Access Your Application**
Once the application is running, you can access the REST endpoint in your browser or via curl/Postman:
`http://localhost:8080/hello`
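For example, with curl from a terminal (assuming the application is running locally on the default port 8080):

```shell
# Call the /hello endpoint defined in HelloController
curl http://localhost:8080/hello
# expected response body: Hello, World!
```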
**Additional Features**
**Database Interaction**: You can create entities and repositories to interact with a database.
**Service Layer**: Add services to handle business logic.
**Exception Handling**: Implement global exception handling using _@ControllerAdvice_.
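As a minimal, hedged sketch of the `@ControllerAdvice` idea: the `UserNotFoundException` name matches the one referenced in the controller example later in this post, while the `com.demo.exception` package name is an assumption.

```java
package com.demo.exception;

import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.ControllerAdvice;
import org.springframework.web.bind.annotation.ExceptionHandler;

// Simple domain exception; the name mirrors the controller example below.
class UserNotFoundException extends RuntimeException {
    UserNotFoundException(Long id) {
        super("Could not find user " + id);
    }
}

// Global handler: any controller throwing UserNotFoundException
// results in a 404 response with the exception message as the body.
@ControllerAdvice
class UserExceptionAdvice {
    @ExceptionHandler(UserNotFoundException.class)
    ResponseEntity<String> handleUserNotFound(UserNotFoundException ex) {
        return ResponseEntity.status(HttpStatus.NOT_FOUND).body(ex.getMessage());
    }
}
```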
**Example of Adding a JPA Entity**
```
package com.demo.model;
// Spring Boot 3 uses the jakarta.persistence namespace (javax.persistence in Boot 2)
import jakarta.persistence.Entity;
import jakarta.persistence.GeneratedValue;
import jakarta.persistence.GenerationType;
import jakarta.persistence.Id;
@Entity
public class User {
@Id
@GeneratedValue(strategy = GenerationType.AUTO)
private Long id;
private String name;
// getters and setters
}
```
**Example of a Repository**
```
package com.demo.repo;
import org.springframework.data.repository.CrudRepository;

import com.demo.model.User;
public interface UserRepository extends CrudRepository<User, Long> {
}
```
**Example of a Service**
```
package com.demo.service;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

import com.demo.model.User;
import com.demo.repo.UserRepository;

import java.util.Optional;
@Service
public class UserService {
@Autowired
private UserRepository userRepository;
public Iterable<User> getAllUsers() {
return userRepository.findAll();
}
public Optional<User> getUserById(Long id) {
return userRepository.findById(id);
}
public User saveUser(User user) {
return userRepository.save(user);
}
public void deleteUser(Long id) {
userRepository.deleteById(id);
}
}
```
**Example of a Controller Using the Service**
```
package com.demo.controller;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.*;

import com.demo.model.User;
import com.demo.service.UserService;
@RestController
@RequestMapping("/users")
public class UserController {
@Autowired
private UserService userService;
@GetMapping
public Iterable<User> getAllUsers() {
return userService.getAllUsers();
}
@GetMapping("/{id}")
public User getUserById(@PathVariable Long id) {
return userService.getUserById(id).orElseThrow(() -> new UserNotFoundException(id));
}
@PostMapping
public User createUser(@RequestBody User user) {
return userService.saveUser(user);
}
@DeleteMapping("/{id}")
public void deleteUser(@PathVariable Long id) {
userService.deleteUser(id);
}
}
```
By following these steps, you can create a basic Spring Boot application. You can then extend it with more features as needed.
| jottyjohn |
1,890,804 | Job portal you can apply as a software engineer in India [part-1] | Relevel by unacademy Relevel is the India largest hiring platform from india's biggest... | 0 | 2024-06-17T04:47:00 | https://dev.to/aryan015/job-portal-you-can-apply-as-a-software-engineer-in-india-part-1-5e23 | softwareengineering, webdev, react, node | ## [Relevel](https://relevel.com/) by unacademy
Relevel is India's largest hiring platform, from India's biggest ed-tech 🤣 company. I personally used it myself to get work. ❤
## [hacker-rank](https://www.hackerrank.com/)
HackerRank is a heaven for developers (and free), and you can use it to upskill yourself with an abundance of courses. Here is my [profile](https://www.hackerrank.com/profile/aryan015) on HackerRank; I have achieved Problem Solver (Basic).
## [My LinkedIn](https://www.linkedin.com/in/aryan-khandelwal-779b5723a/)
A favorite platform for jobless people 🤣. It has a dedicated jobs section with a variety of filters.
`Note:` you can also find freelancing work here.
## learning resources
[🧡Scaler - India's Leading software E-learning](www.scaler.com)
[🧡w3schools - for web developers](www.w3school.com)
| aryan015 |
1,890,801 | Finishing implementation and doc strings | The main implementation and documentation strings phase has been finished. One of the functions that... | 0 | 2024-06-17T04:44:45 | https://dev.to/ahmedhosssam/finishing-implementation-and-doc-strings-4c2a | gsoc | The main implementation and documentation strings phase has been finished.
One of the functions that was deleted is `parse_unit`; it was almost never triggered by any HEK event, so I decided to remove it, and we will see in the future whether this has any major effect on the code.
Also `parse_columns_to_table` has been refactored by adding more variables to make the code more readable.
This is the new implementation:
```python
def parse_columns_to_table(table, attributes, is_coord_prop = False):
"""
Parses the columns in an Astropy table and convert the values into Astropy objects.
Parameters
----------
table: astropy.table
Astropy table.
attributes: list
A list of HEK unit attributes or coordinate attributes.
is_coord_prop: bool
To specify if `attributes` is a list of unit attributes or coordinate attributes.
Returns
-------
`astropy.table`
Raises
------
TypeError
If `table` is not an Astropy table.
KeyError
If any of the attribute dictionaries are missing required keys (i.e. "name", "unit_prop").
"""
for attribute in attributes:
if attribute["name"] in table.colnames and ("unit_prop" in attribute or attribute.get("is_chaincode", False)) and attribute.get("is_unit_prop", True):
unit_attr = ""
if is_coord_prop:
unit_attr = "event_coordunit"
else:
unit_attr = attribute["unit_prop"]
new_column = []
for idx, value in enumerate(table[attribute["name"]]):
new_value = ""
if value in ["", None]:
new_value = value
elif attribute.get("is_chaincode", False):
new_value = parse_chaincode(value, attribute, table[attribute["unit_prop"]][idx])
else:
unit = get_unit(table[unit_attr][idx])
new_value = value * unit
new_column.append(new_value)
table[attribute["name"]] = new_column
for attribute in attributes:
if attribute.get("is_unit_prop", False) and attribute["name"] in table.colnames:
del table[attribute["name"]]
return table
```
Also, a public interface `__all__` has been added to `util.py`. As of now, the public "API" of `util.py` is:
```python
__all__ = [
'freeze',
'parse_times',
'parse_values_to_quantities',
'UNIT_FILE_PATH',
'COORD_FILE_PATH'
]
```
`UNIT_FILE_PATH` and `COORD_FILE_PATH` are used in `test_hek.py`.
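As a quick illustration of what `__all__` controls, a star-import only picks up the names listed in it. The snippet below uses a synthetic stand-in module (not the real `util.py`) to show the effect:

```python
import sys
import types

# Build a throwaway module that declares a public interface via __all__.
mod = types.ModuleType("util_demo")
exec(
    "__all__ = ['freeze']\n"
    "def freeze(): return 'frozen'\n"
    "def parse_helper(): return 'internal'\n",
    mod.__dict__,
)
sys.modules["util_demo"] = mod

# A star-import only pulls in the names listed in __all__.
ns = {}
exec("from util_demo import *", ns)
print("freeze" in ns)        # True
print("parse_helper" in ns)  # False
```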
There was a problem in `get_unit` because some units weren't parsed correctly; one of them is `ergs per cubic centimeter`. This was due to the first implementation of `get_unit`:
```python
with u.add_enabled_units([cm2, m2, m3]), u.set_enabled_aliases(aliases):
# Units for coordinate frames have more than one unit, otherwise it will be just one unit.
# There is an assumption that coord1_unit, coord2_unit and coord3_unit are the same.
units = re.split(r'[, ]', unit)
return u.Unit(units[0].lower())
```
The function took the first "word" of the input and returned the unit. Obviously this is wrong for `ergs per cubic centimeter`, because it takes only `ergs` and returns `u.Unit('erg')`.
But this is the only such case among the HEK units; the other units work just fine.
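The failure mode is easy to reproduce with the split expression on its own:

```python
import re

# The old implementation split on commas/spaces and kept only the first token,
# so a multi-word unit string collapses to its first word.
print(re.split(r'[, ]', "ergs per cubic centimeter")[0].lower())  # ergs
```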
This is the new implementation to correct this behavior:
```python
def get_unit(unit):
"""
Converts string into astropy unit.
Parameters
----------
unit: str
The targeted unit
Returns
-------
unit
Astropy unit object (e.g. <class 'astropy.units.core.Unit'> or <class 'astropy.units.core.CompositeUnit'>)
Raises
------
ValueError
If `unit` does not parse as a unit.
Notes
-----
For the complete list of HEK parameters: https://www.lmsal.com/hek/VOEvent_Spec.html
"""
    cm2 = u.def_unit("cm2", u.cm**2)
m2 = u.def_unit("m2", u.m**2)
m3 = u.def_unit("m3", u.m**3)
erg_per_cm3 = u.def_unit("ergs/cm^3", u.erg/u.ml)
aliases = {
"steradian": u.sr,
"arcseconds": u.arcsec,
"degrees": u.deg,
"sec": u.s,
"emx": u.Mx,
"amperes": u.A,
"ergs": u.erg,
"cubic centimeter": u.ml,
"square centimeter": cm2,
"cubic meter": m3,
"square meter": m2,
"ergs per cubic centimeter": erg_per_cm3,
}
with u.add_enabled_units([cm2, m2, m3]), u.set_enabled_aliases(aliases), warnings.catch_warnings():
# Units for coordinate frames have more than one unit, otherwise it will be just one unit.
# There is an assumption that coord1_unit, coord2_unit and coord3_unit are the same.
warnings.filterwarnings("ignore", category=u.UnitsWarning)
if unit in aliases:
unit = u.Unit(aliases[unit])
else:
unit = u.Unit(re.split(r'[, ]', unit)[0].lower())
return unit
```
Also, the unit warnings were ignored because they're irrelevant to the user of HEK.
So, the next phase is testing; we will see whether our assumptions about the unit parsing and the other refactoring were right. | ahmedhosssam |
1,890,799 | Your First Project in Blup A Step-by-Step Guide in Flutter News 2024 #24 ʚїɞ | Hey Flutter enthusiasts! Ever worry about missing key Flutter updates? Well, worry no... | 26,008 | 2024-06-17T04:37:52 | https://dev.to/lucianojung/your-first-project-in-blup-a-step-by-step-guide-in-flutter-news-2024-24-eyie-1p1e | flutter, news, dart, discuss | ## Hey Flutter enthusiasts!
Ever worry about missing key Flutter updates? Well, worry no more!
Starting 2024, I'm here to keep you informed with a weekly Monday report. Let's stay ahead in the world of Flutter!
## Table of Contents
1. {% cta #mayor-flutter-updates %} Major Flutter updates {% endcta %}
2. {% cta #new-flutter-videos %} New Flutter Videos {% endcta %}
3. [New Flutter Packages](#new-flutterpackages)
4. [New Dev Posts](#new-devposts)
5. [New Medium Posts](#new-mediumposts)
---
## Major Flutter updates:
> There are no major Flutter updates this week!
-> Currently [Flutter Version Google I/O 3.22](https://docs.flutter.dev/release/whats-new)
---
## New Flutter Videos:
> The [Flutter YouTube Channel](https://youtube.com/@flutterdev?si=RZyl1nLVnSt373Vu) did not post any new Videos this week!
---
## New Flutter-Packages
{% details [test_util](https://pub.dev/packages/test_util) (Version 0.1.1) %} Helpers and utilities for testing: running Dart processes and expecting output.
\#MIT-0 (LICENSE) {% enddetails %}
{% details [macro_util](https://pub.dev/packages/macro_util) (Version 0.1.0-9.dev) %} Helpers and utilities for developing macros: field introspection, generated code formatting, etc.
\#MIT-0 (LICENSE) {% enddetails %}
{% details [css](https://pub.dev/packages/css) (Version 1.0.5) %} A CSS library for Flutter that facilitates minimal code styling of widgets, simplifying the development of consistent and elegant UI.
\#cupertino_icons, #flutter, #google_fonts {% enddetails %}
{% details [short_navigation](https://pub.dev/packages/short_navigation) (Version 0.2.5) %} This package built to navigate between screens (routes) without using context (BuildContext).
\#flutter {% enddetails %}
{% details [flutter_epub_viewer](https://pub.dev/packages/flutter_epub_viewer) (Version 1.0.1) %} A Flutter package for viewing Epub documents developed by combining the power of Epubjs and flutter_inappwebview
\#flutter, #flutter_inappwebview, #json_annotation {% enddetails %}
---
### New Dev-Posts
{% embed https://dev.to/jigneshpatel_flutterdeveloper/share-multiple-files-in-flutter-5202 %}
{% embed https://dev.to/djsmk123/unlocking-the-future-passwordless-authenticationpasskey-with-flutter-and-nodejs-1ojh %}
{% embed https://dev.to/priyanshuverma/saavn-music-player-for-windows-420k %}
{% embed https://dev.to/ozonexkeshav07/problem-your-project-requires-a-newer-version-of-the-kotlin-gradle-plugin-in-flutter-5da3 %}
{% embed https://dev.to/infowindtech57/flutter-vs-react-native-which-one-is-better-593e %}
---
### New Medium-Posts
{% details [Your First Project in Blup A Step-by-Step Guide](https://medium.com/@blup-tool/your-first-project-in-blup-a-step-by-step-guide-e100136d5508) by Blup %} Creating a mobile app from scratch can be a daunting task but with Blup a powerful low-code Flutter IDE the process becomes straightforward and efficient. In this guide we will take you through…
\Flutter, Low Code, App Development, Android, IOS {% enddetails %}
{% details [Getting Started How to Set Up and Navigate Blup.](https://medium.com/@blup-tool/getting-started-how-to-set-up-and-navigate-blup-21c7e0b22c12) by Blup %} Blup is a cutting-edge low-code Flutter IDE designed to revolutionize app development by simplifying the process and making it accessible to everyone. Whether youre a seasoned developer or just…
\Flutter, Flutter App Development, Android, IOS, App Development {% enddetails %}
{% details [Introduction to Blup A Powerful Low-Code Flutter IDE](https://medium.com/@sahaj.blup/introduction-to-blup-a-powerful-low-code-flutter-ide-66d1d1ce9abb) by Blup %} Welcome to our blog series on Blup an innovative low-code Flutter IDE thats revolutionizing the way we approach app development. In this series well guide you through the ins and outs of Blup…
\Flutter, App Development, Low Code, Android App Development, IOS App Development {% enddetails %}
{% details [Introduction to Blup A Powerful Low-Code Flutter IDE](https://medium.com/@blup-tool/introduction-to-blup-a-powerful-low-code-flutter-ide-66d1d1ce9abb) by Blup %} Welcome to our blog series on Blup an innovative low-code Flutter IDE thats revolutionizing the way we approach app development. In this series well guide you through the ins and outs of Blup…
\Flutter, App Development, Low Code, Android App Development, IOS App Development {% enddetails %}
{% details [Is Flutter the Best Option for Developing iOS Apps](https://medium.com/@sanakhaleel1980/is-flutter-the-best-option-for-developing-ios-apps-72b6ae84ba4a) by Syed Zohaib Akhter %} The goal of mobile app developers is to distribute their product to customers in a more cost-effective and efficient manner. During and after the development process locating and using the right…
\Apps, App Development, Flutter, Flutter App Development, IOS App Development {% enddetails %}
---
Last Flutter News: [Flutter News 2024 #23 ʚїɞ](https://dev.to/lucianojung/series/26008)
_Did I miss any recent updates? Feel free to share any important news I might have overlooked!_ | lucianojung |
1,890,798 | Thanks to all for viewing and starring PyTermOS! | Thanks to all people that helped me! Now we got 2 stars and 3 views! That’s already... | 0 | 2024-06-17T04:37:34 | https://dev.to/markdev/thanks-to-all-for-viewing-and-starring-pytermos-7f2 | ## Thanks to all people that helped me!
Now we got 2 stars and 3 views!
That’s already something!!
### If you haven’t seen my previous post, you can see it on my page. | markdev | |
1,890,797 | Exploring FTP and SSL/TLS Protocols in Networking: A Comprehensive Guide | In the digital age, secure and efficient data transfer is paramount for both personal and... | 0 | 2024-06-17T04:34:25 | https://dev.to/iaadidev/exploring-ftp-and-ssltls-protocols-in-networking-a-comprehensive-guide-1l02 | network, protocol, devops, ftp |
In the digital age, secure and efficient data transfer is paramount for both personal and professional activities. This comprehensive guide delves into two essential networking protocols: FTP (File Transfer Protocol) and SSL/TLS (Secure Sockets Layer/Transport Layer Security). We will explore their roles, functionalities, and implementation with practical code snippets and examples.
## Contents
1. [Introduction to FTP](#introduction-to-ftp)
- [What is FTP?](#what-is-ftp)
- [How FTP Works](#how-ftp-works)
- [Basic FTP Commands](#basic-ftp-commands)
- [FTP Code Snippet](#ftp-code-snippet)
2. [The Need for Secure FTP](#the-need-for-secure-ftp)
- [Secure FTP Versions](#secure-ftp-versions)
- [Implementing FTPS](#implementing-ftps)
3. [Understanding SSL/TLS](#understanding-ssltls)
- [What is SSL/TLS?](#what-is-ssltls)
- [How SSL/TLS Works](#how-ssltls-works)
- [SSL/TLS in Action](#ssltls-in-action)
4. [Combining FTP with SSL/TLS](#combining-ftp-with-ssltls)
- [FTPS Implementation](#ftps-implementation)
- [Practical Use Cases for FTPS](#practical-use-cases-for-ftps)
5. [Setting Up an FTPS Server](#setting-up-an-ftps-server)
- [Step-by-Step Guide to Setting Up vsftpd with FTPS](#step-by-step-guide-to-setting-up-vsftpd-with-ftps)
6. [Advanced FTPS Features](#advanced-ftps-features)
- [Passive vs. Active Mode](#passive-vs-active-mode)
- [Using FTPS with Firewalls](#using-ftps-with-firewalls)
7. [Troubleshooting Common FTPS Issues](#troubleshooting-common-ftps-issues)
- [Connection Refused](#connection-refused)
- [Certificate Errors](#certificate-errors)
- [Data Channel Encryption Issues](#data-channel-encryption-issues)
8. [Conclusion](#conclusion)
- [Additional Resources](#additional-resources)
## Introduction to FTP
### What is FTP?
FTP, or File Transfer Protocol, is one of the oldest protocols used for transferring files over a network. Introduced in the 1970s, FTP allows users to upload, download, and manage files on remote servers. Despite its age, FTP remains widely used due to its simplicity and effectiveness.
### How FTP Works
FTP operates on a client-server model, where an FTP client connects to an FTP server to perform file operations. It uses two channels:
- **Control Channel:** Used for sending commands and receiving responses.
- **Data Channel:** Used for transferring actual files.
By default, FTP uses port 21 for the control channel and a dynamic range of ports for data transfer.
### Basic FTP Commands
Here are some fundamental FTP commands:
- `USER`: Specifies the username for login.
- `PASS`: Specifies the password for login.
- `LIST`: Lists files and directories in the current directory.
- `RETR`: Retrieves (downloads) a file from the server.
- `STOR`: Stores (uploads) a file to the server.
### FTP Code Snippet
Below is a simple Python code snippet demonstrating how to connect to an FTP server and list files using the `ftplib` library:
```python
from ftplib import FTP
# Connect to the FTP server
ftp = FTP('ftp.example.com')
ftp.login(user='username', passwd='password')
# List files in the current directory
ftp.retrlines('LIST')
# Close the connection
ftp.quit()
```
This snippet connects to an FTP server, logs in with a username and password, lists the files in the current directory, and closes the connection.
## The Need for Secure FTP
While FTP is straightforward and useful, it has a significant drawback: it transmits data, including usernames and passwords, in plain text. This vulnerability makes it easy for attackers to intercept and read sensitive information. To address this issue, several secure versions of FTP have been developed.
### Secure FTP Versions
1. **FTPS (FTP Secure):** FTPS adds support for SSL/TLS to FTP, encrypting the control and data channels to protect data in transit.
2. **SFTP (SSH File Transfer Protocol):** SFTP is a completely different protocol that runs over SSH (Secure Shell) and provides secure file transfer capabilities.
### Implementing FTPS
Here’s an example of how to use FTPS with Python’s `ftplib` and `ssl` modules:
```python
from ftplib import FTP_TLS
# Connect to the FTPS server
ftps = FTP_TLS('ftp.example.com')
ftps.login(user='username', passwd='password')
ftps.prot_p() # Switch to secure data connection
# List files in the current directory
ftps.retrlines('LIST')
# Close the connection
ftps.quit()
```
In this snippet, `FTP_TLS` is used instead of `FTP` to establish a secure connection. The `prot_p()` method ensures that the data channel is encrypted.
## Understanding SSL/TLS
### What is SSL/TLS?
SSL (Secure Sockets Layer) and its successor TLS (Transport Layer Security) are cryptographic protocols designed to provide secure communication over a computer network. They encrypt data to ensure privacy and integrity, preventing eavesdropping and tampering.
### How SSL/TLS Works
SSL/TLS works by establishing an encrypted link between a server and a client. The process involves several steps:
1. **Handshake:** The client and server exchange cryptographic information to establish a secure session.
2. **Encryption:** Once the handshake is complete, data transmitted between the client and server is encrypted.
3. **Verification:** SSL/TLS uses certificates to verify the identity of the server, ensuring that the client is communicating with the intended server.
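Python's `ssl` module wires sensible defaults for these steps into `ssl.create_default_context()`. You can confirm, before opening any connection, that hostname checking and certificate verification are switched on:

```python
import ssl

# create_default_context() targets server authentication by default:
# it verifies the server's certificate chain and checks the hostname
# against the certificate during the handshake.
ctx = ssl.create_default_context()
print(ctx.check_hostname)                    # True
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True
```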
### SSL/TLS in Action
To illustrate SSL/TLS in action, let’s consider a simple example using Python’s `ssl` module to create a secure socket connection:
```python
import socket
import ssl
hostname = 'www.example.com'
context = ssl.create_default_context()
# Create a socket and wrap it in an SSL context
sock = socket.create_connection((hostname, 443))
secure_sock = context.wrap_socket(sock, server_hostname=hostname)
# Send a simple HTTP request over the secure connection
secure_sock.sendall(b'GET / HTTP/1.1\r\nHost: www.example.com\r\n\r\n')
# Receive and print the response
response = secure_sock.recv(4096)
print(response.decode('utf-8'))
# Close the connection
secure_sock.close()
```
In this example, a secure socket connection is established to a web server using SSL/TLS, and an HTTP request is sent over this connection.
## Combining FTP with SSL/TLS
### FTPS Implementation
Combining FTP with SSL/TLS, as shown earlier, is a common approach to secure file transfers. FTPS provides the same functionality as FTP but with added security. Let’s revisit the FTPS code snippet for clarity:
```python
from ftplib import FTP_TLS
# Connect to the FTPS server
ftps = FTP_TLS('ftp.example.com')
ftps.login(user='username', passwd='password')
ftps.prot_p() # Switch to secure data connection
# List files in the current directory
ftps.retrlines('LIST')
# Close the connection
ftps.quit()
```
### Practical Use Cases for FTPS
FTPS is ideal for organizations that need to transfer sensitive data securely. Here are some practical use cases:
- **Healthcare:** Transferring patient records and medical data.
- **Finance:** Sending financial statements and transaction records.
- **E-commerce:** Uploading product catalogs and customer information.
## Setting Up an FTPS Server
Setting up an FTPS server involves configuring an FTP server to support SSL/TLS. One popular FTP server software is **vsftpd** (Very Secure FTP Daemon), which can be configured for FTPS.
### Step-by-Step Guide to Setting Up vsftpd with FTPS
1. **Install vsftpd:**
```sh
sudo apt-get update
sudo apt-get install vsftpd
```
2. **Generate SSL/TLS Certificates:**
```sh
sudo openssl req -x509 -nodes -days 365 -newkey rsa:2048 -keyout /etc/ssl/private/vsftpd.key -out /etc/ssl/certs/vsftpd.crt
```
3. **Configure vsftpd for FTPS:**
Edit the vsftpd configuration file (`/etc/vsftpd.conf`) and add the following lines:
```sh
listen=YES
listen_ipv6=NO
ssl_enable=YES
allow_anon_ssl=NO
force_local_data_ssl=YES
force_local_logins_ssl=YES
ssl_tlsv1=YES
ssl_sslv2=NO
ssl_sslv3=NO
rsa_cert_file=/etc/ssl/certs/vsftpd.crt
rsa_private_key_file=/etc/ssl/private/vsftpd.key
```
4. **Restart vsftpd:**
```sh
sudo service vsftpd restart
```
## Advanced FTPS Features
### Passive vs. Active Mode
FTP can operate in two modes: passive and active. In active mode, the server initiates the data connection, while in passive mode, the client initiates it. FTPS supports both modes, but passive mode is often preferred for its compatibility with firewalls.
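To see why passive mode involves a range of ports, it helps to look at the server's `227` reply, which tells the client where to open the data connection. Below is a simplified version of the parsing that FTP clients such as `ftplib` perform internally (the reply string is a made-up example):

```python
import re

def parse_pasv_response(response):
    """Extract (host, port) from a '227 Entering Passive Mode' reply.

    The server encodes the data-channel endpoint as six numbers:
    four for the IP address and two for the port (port = p1 * 256 + p2).
    """
    match = re.search(r'\((\d+),(\d+),(\d+),(\d+),(\d+),(\d+)\)', response)
    if not match:
        raise ValueError(f"not a PASV reply: {response!r}")
    h1, h2, h3, h4, p1, p2 = (int(n) for n in match.groups())
    return f"{h1}.{h2}.{h3}.{h4}", p1 * 256 + p2

# Example reply (made-up address): the data channel is on port 19*256 + 137.
print(parse_pasv_response("227 Entering Passive Mode (192,168,1,2,19,137)"))
# ('192.168.1.2', 5001)
```

Because that port is chosen dynamically per transfer, firewalls in front of an FTPS server must allow the whole configured passive range, not just port 21.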
### Using FTPS with Firewalls
When using FTPS, it’s essential to configure firewalls correctly to allow secure data transfer. Here are some tips:
- **Allow Port 21:** Ensure that port 21 (or the custom control port) is open.
- **Open Passive Mode Ports:** Configure the firewall to allow the range of ports used for passive mode data connections.
- **Enable Deep Packet Inspection (DPI):** Some firewalls support DPI to inspect FTPS traffic and allow dynamic port allocation.
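For vsftpd specifically, you can pin passive-mode data connections to a fixed range in `/etc/vsftpd.conf`, so the firewall only needs that range opened (the port numbers here are arbitrary examples):

```sh
pasv_enable=YES
pasv_min_port=40000
pasv_max_port=40100
```

With this in place, allow inbound TCP on ports 40000-40100 (plus the control port) in your firewall rules.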
## Troubleshooting Common FTPS Issues
### Connection Refused
If you encounter a “connection refused” error, check the following:
- Ensure the FTP server is running and accessible.
- Verify that the server’s hostname and port are correct.
- Check firewall rules to ensure they allow FTPS traffic.
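Before digging into server logs, it can help to rule out basic TCP reachability with a quick probe (a generic sketch, not specific to FTPS):

```python
import socket

def is_reachable(host, port, timeout=3.0):
    """Return True if a TCP connection to (host, port) succeeds in time."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # refused, timed out, unreachable, DNS failure, ...
        return False

# Demo against a throwaway local listener so the check is self-contained.
listener = socket.socket()
listener.bind(("127.0.0.1", 0))          # OS picks a free port
listener.listen(1)
port = listener.getsockname()[1]
print(is_reachable("127.0.0.1", port))   # True
listener.close()
print(is_reachable("127.0.0.1", port))   # False (connection refused)
```

If this returns `False` for your server's control port, the problem is connectivity (wrong host/port, server down, or a firewall), not the FTPS configuration.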
### Certificate Errors
Certificate errors can occur if the client does not trust the server’s certificate. To resolve this:
- Ensure the server’s certificate is correctly configured and not expired.
- Add the server’s certificate to the client’s trusted certificate store.
- Use a certificate signed by a trusted Certificate Authority (CA).
### Data Channel Encryption Issues
If the data channel is not encrypted, verify that `prot_p()` is called after logging in:
```python
ftps.prot_p() # Ensure the data channel is encrypted
```
## Conclusion
FTP and SSL/TLS are foundational protocols in the world of networking, enabling secure and efficient file transfers. While FTP is an older protocol, its secure variants like FTPS remain relevant for modern data transfer needs. By understanding and implementing these protocols, you can ensure your data remains safe and accessible, whether you're a developer, IT professional, or enthusiast.
In this guide, we’ve explored the basics of FTP, the importance of SSL/TLS, and how to combine them to create a secure file transfer solution. We’ve also provided practical examples and troubleshooting tips to help you implement and manage FTPS effectively.
### Additional Resources
- [Python ftplib documentation](https://docs.python.org/3/library/ftplib.html)
- [Python ssl module documentation](https://docs.python.org/3/library/ssl.html)
- [vsftpd official website](https://security.appspot.com/vsftpd.html)
By staying informed and utilizing these protocols, you can navigate the complexities of network security with confidence. Happy coding and secure transferring! | iaadidev |
1,890,796 | Mastering Digital Marketing: A Blueprint for Online Success | best digital marketing services in delhi In today's digital world, businesses are always looking for... | 0 | 2024-06-17T04:29:20 | https://dev.to/tekbooster/mastering-digital-marketing-a-blueprint-for-online-success-550i | webdev, javascript, programming, beginners | best digital marketing services in delhi
In today's digital world, businesses are always looking for ways to stand out and succeed in a highly competitive environment. Take advantage of essential digital marketing strategies for businesses of all sizes, from startups to large corporations. Let’s take a look at key tactics and techniques that can take your online presence to new levels. [(best digital marketing services in delhi)](https://www.tekbooster.com/)
Understanding Digital Marketing
Digital marketing is the art of promoting products or services through various online channels. (best digital marketing services in delhi) We cover a wide range of tactics for reaching and engaging your target audience in the digital space, from websites and search engines to social media platforms and email.
Key elements of a digital marketing strategy:
1. Website optimization. Your website acts as a digital storefront for your business. To attract and retain visitors, your site must be easy to navigate, visually appealing, and optimized for search engines.
2. Search Engine Optimization (SEO): SEO is the process of optimizing a website to rank higher in search engine results. Using relevant keywords and improving your site structure can increase organic traffic and visibility.
3. Content marketing. Content is king in the digital realm. Creating valuable and engaging content helps you establish authority, build trust, and attract potential customers. (best digital marketing services in delhi)
4. Social media marketing. Social media platforms provide a direct connection with your audience. Connect with them through meaningful content, targeted advertising, and community-building initiatives.
5. Email marketing. Despite the emergence of other channels, email marketing remains a powerful tool for generating leads and increasing conversion rates. Personalization and segmentation are the keys to success.
6. Pay-per-click (PPC) advertising. Pay-per-click advertising allows you to bid on keywords and show targeted ads to your audience. This is a convenient way to generate traffic and conversions right away.
7. Analysis and data analysis. Data tracking and analytics can help you understand the effectiveness of your marketing efforts. Use valuable information to improve your strategy and optimize performance.
Develop a successful digital marketing strategy: (best digital marketing services in delhi)
1. Set clear goals. Set specific goals for your digital marketing campaign, such as increasing website traffic or generating leads.
2. Know your audience. Conduct market research to understand the needs, preferences, and behaviors of your target audience.
3. Select the correct channel. Focus your efforts on the channels where your audience is most active and engaged. (best digital marketing services in delhi)
4. Create compelling content. Develop content that resonates with your audience and provides value.
5. Optimize for mobile devices. Make sure your website and content are optimized for mobile devices so you can serve your users wherever they are.
6. Measure and Iterate: Track key metrics to evaluate performance and make data-driven decisions for continuous improvement. (best digital marketing services in delhi)
Conclusion
Digital marketing is a powerful tool that allows businesses to connect with their audiences, drive engagement, and achieve their marketing goals (best digital marketing services in delhi). By using the right strategies and remaining flexible in an ever-changing digital world, businesses can stay ahead of the competition and succeed online. | tekbooster |
1,890,478 | Getting started with Valkey using JavaScript | Run existing Redis apps with Valkey and learn how to use it with LangChain Valkey is an open... | 0 | 2024-06-17T04:27:12 | https://community.aws/content/2hx81ITCvDiWqrAz06SECOvepoa | valkey, redis, javascript, database | > Run existing Redis apps with Valkey and learn how to use it with LangChain
[Valkey](https://valkey.io/) is an open source alternative to [Redis](https://redis.io/). It's a community-driven, [Linux Foundation project](https://www.linuxfoundation.org/press/linux-foundation-launches-open-source-valkey-community) created to keep the project available for use and distribution under the open source Berkeley Software Distribution (BSD) 3-clause license after the [Redis license changes](https://redis.com/blog/redis-adopts-dual-source-available-licensing/).
I think the path to Valkey was well summarised in this [inaugural blog post](https://valkey.io/blog/hello-world/):

I will walk through how to use Valkey for JavaScript applications using existing clients in Redis ecosystem as well as [iovalkey](https://github.com/valkey-io/iovalkey) (a friendly fork of [ioredis](https://github.com/redis/ioredis)).
## Using Valkey with `node-redis`
[node-redis](https://github.com/redis/node-redis) is a popular and widely used client. Here is a [simple program](https://github.com/abhirockzz/valkey-javascript/blob/master/subscriber.js) that uses the [Subscriber](https://valkey.io/commands/subscribe/) component of the [PubSub](https://valkey.io/commands/pubsub/) API to subscribe to a channel.
```javascript
import redis from 'redis';
const client = redis.createClient();
const channelName = 'valkey-channel';
(async () => {
try {
await client.connect();
console.log('Connected to Redis server');
await client.subscribe(channelName, (message, channel) => {
console.log(`message "${message}" received from channel "${channel}"`)
});
console.log('Waiting for messages...');
} catch (err) {
console.error('Error:', err);
}
})();
```
To try this with Valkey, let's start an instance using the [Valkey Docker image](https://hub.docker.com/r/valkey/valkey/):
```bash
docker run --rm -p 6379:6379 valkey/valkey
```
Also, head over to https://valkey.io/download to get an OS-specific distribution, or use Homebrew (on Mac) - `brew install valkey`. You should now be able to use the Valkey CLI (`valkey-cli`).
Get the code from GitHub repo:
```bash
git clone https://github.com/abhirockzz/valkey-javascript
cd valkey-javascript
npm install
```
Start the subscriber app:
```bash
node subscriber.js
```
Publish a message and ensure that the subscriber is able to receive it:
```bash
valkey-cli PUBLISH valkey-channel 'hello valkey'
```
Nice! We were able to write a simple application with an existing Redis client and run using Valkey (instead of Redis). Sure, this is an over-simplified example, but there were no code changes required.
## Use Valkey with `ioredis` client
[ioredis](https://github.com/redis/ioredis) is another popular client. To be doubly sure, let's try `ioredis` with Valkey as well. Let's write a [publisher application](https://github.com/abhirockzz/valkey-javascript/blob/master/publisher.js):
```javascript
import Redis from 'ioredis';
const redisClient = new Redis();
const channelName = 'valkey-channel';
const message = process.argv[2];
if (!message) {
console.error('Please provide a message to publish.');
process.exit(1);
}
async function publishMessage() {
try {
const receivedCount = await redisClient.publish(channelName, message);
console.log(`Message "${message}" published to channel "${channelName}". Received by ${receivedCount} subscriber(s).`);
} catch (err) {
console.error('Error publishing message:', err);
} finally {
// Close the client connection
await redisClient.quit();
}
}
publishMessage();
```
Run the publisher, and confirm that the subscriber app is able to receive it:
```bash
node publisher.js 'hello1'
node publisher.js 'hello2'
```
You should see these logs in the subscriber application:
```text
message "hello1" received from channel "valkey-channel"
message "hello2" received from channel "valkey-channel"
```
## Switch to `iovalkey` client
As mentioned, `iovalkey` is a fork of `ioredis`. I made the following changes to port the publisher code to use `iovalkey`:
1. Commented out `import Redis from 'ioredis';`
2. Added `import Redis from 'iovalkey';`
3. Installed `iovalkey` - `npm install iovalkey`
Here is the updated version - yes, this was all I needed to change (at least for this simple application):
```javascript
// import Redis from 'ioredis';
import Redis from 'iovalkey';
```
Run the new `iovalkey` based publisher, and confirm that the subscriber is able to receive it:
```bash
node publisher.js 'hello from iovalkey'
```
You should see these logs in the subscriber application:
```text
message "hello from iovalkey" received from channel "valkey-channel"
```
Awesome, this is going well. We are ready to sprinkle some generative AI now!
## Use Valkey with LangchainJS
Along with Python, JavaScript/TypeScript is also being used in the generative AI ecosystem. [LangChain](https://www.langchain.com/) is a popular framework for developing applications powered by large language models (LLMs). LangChain has JS/TS support in the form of [LangchainJS](https://js.langchain.com/v0.2/docs/introduction/).
Having worked a lot [with the Go port](https://community.aws/tags/langchaingo) ([langchaingo](https://github.com/tmc/langchaingo)), as well as [Python](https://community.aws/content/2aq9ju6xvYtywGVbuPoWFTk5oK4/build-a-streamlit-app-using-langchain-amazon-bedrock-and-redis), I wanted to try LangchainJS.
One of the common use cases is to use Redis as a chat history component in generative AI apps. LangchainJS has this built-in, so let's try it out with Valkey.
### Using Valkey as chat history in Langchain
To install LangchainJS:
```bash
npm install langchain
```
For the LLM, I will be using [Amazon Bedrock](https://docs.aws.amazon.com/bedrock/latest/userguide/what-is-bedrock.html) (its [supported natively with LangchainJS](https://js.langchain.com/v0.2/docs/integrations/platforms/aws#bedrock)), but feel free to use others.
For Amazon Bedrock, you will need to [configure and set up Amazon Bedrock](https://docs.aws.amazon.com/bedrock/latest/userguide/setting-up.html?sc_channel=el&sc_campaign=genaiwave&sc_content=amazon-bedrock-golang-getting-started&sc_geo=mult&sc_country=mult&sc_outcome=acq), including [requesting access](https://docs.aws.amazon.com/bedrock/latest/userguide/setting-up.html#manage-model-access?sc_channel=el&sc_campaign=genaiwave&sc_content=amazon-bedrock-golang-getting-started&sc_geo=mult&sc_country=mult&sc_outcome=acq) to the Foundation Model(s).
[Here is the chat application](https://github.com/abhirockzz/valkey-javascript/blob/master/chat.js). As you can see, it uses the `RedisChatMessageHistory` component.
```javascript
import { BedrockChat } from "@langchain/community/chat_models/bedrock";
import { RedisChatMessageHistory } from "@langchain/redis";
import { ConversationChain } from "langchain/chains";
import { BufferMemory } from "langchain/memory";
import prompt from "prompt";
import {
ChatPromptTemplate,
MessagesPlaceholder,
} from "@langchain/core/prompts";
const chatPrompt = ChatPromptTemplate.fromMessages([
[
"system",
"The following is a friendly conversation between a human and an AI.",
],
new MessagesPlaceholder("chat_history"),
["human", "{input}"],
]);
const memory = new BufferMemory({
chatHistory: new RedisChatMessageHistory({
sessionId: new Date().toISOString(),
sessionTTL: 300,
host: "localhost",
port: 6379,
}),
returnMessages: true,
memoryKey: "chat_history",
});
const model = "anthropic.claude-3-sonnet-20240229-v1:0"
const region = "us-east-1"
const langchainBedrockChatModel = new BedrockChat({
model: model,
region: region,
modelKwargs: {
anthropic_version: "bedrock-2023-05-31",
},
});
const chain = new ConversationChain({
llm: langchainBedrockChatModel,
memory: memory,
prompt: chatPrompt,
});
while (true) {
prompt.start({noHandleSIGINT: true});
const {message} = await prompt.get(['message']);
const response = await chain.invoke({
input: message,
});
console.log(response);
}
```
Run the application:
```bash
node chat.js
```
Start a conversation:

If you peek into Valkey, notice that the conversations are saved in a `List`:
```bash
valkey-cli keys *
valkey-cli LRANGE <enter list name> 0 -1
```
> Don't run `keys *` in production - it's just for demo purposes

### Using `iovalkey` implementation for chat history
The [current implementation](https://github.com/langchain-ai/langchainjs/blob/d09f63a/libs/langchain-redis/src/chat_histories.ts) uses the `node-redis` client, but I wanted to try out the `iovalkey` client. I am not a JS/TS expert, but it was simple enough to port the existing implementation. You can [refer to the code on GitHub](https://github.com/abhirockzz/valkey-javascript).
As far as the client (chat) app is concerned, I only had to make a few changes to switch the implementation:
- Comment out `import { RedisChatMessageHistory } from "@langchain/redis";`
- Add `import { ValkeyChatMessageHistory } from "./valkey_chat_history.js";`
- Replace `RedisChatMessageHistory` with `ValkeyChatMessageHistory` (while creating the `memory` instance)
It worked the same way as above. Feel free to give it a try!
## Wrapping up
It's still early days for Valkey (at the time of writing), and there is a long way to go. I'm interested in how the project evolves and also in the client ecosystem for Valkey.
Happy Building! | abhirockzz |
1,890,793 | A dive into IDEs | IDEs, which stands for Integrated Development Environments are tools that are super important in the... | 0 | 2024-06-17T04:19:01 | https://dev.to/cougarred1/a-dive-into-ides-57m1 |  | IDEs, which stand for Integrated Development Environments, are tools that are super important across the entire field of software development. They provide an essential foundation for software engineering. An Integrated Development Environment includes features like a debugger, a source code editor, and more. Integrated Development Environments are essential to software development in its entirety. Believe it or not, there's a high chance that you've already used an IDE and might still be using one without even knowing! There are many benefits to and popular examples of Integrated Development Environments, which play a very critical role in the programming world.
Integrated Development Environments are simply a collection of programs bundled together to handle different tasks. Their main objective is to boost developer efficiency by letting developers write, test, and debug their code in one place. This is wonderful because a programmer can not only write code to be executed, but also test it and use a debugger to work through problems in whatever code they're dealing with, all without switching between a variety of separate applications or mechanisms.
When it comes to the components that make an Integrated Development Environment complete, there are several parts which all play different roles. First there is the source code editor, a text editor created for the sole purpose of writing and modifying code. The source code editor includes features like bracket matching, which informs the developer whether or not their syntax is correct; it's very cool and appealing to the eye how it all operates. You can be confident about whether your syntax is written correctly, because if not, the editor will notify you. Another awesome feature of the source code editor is code completion: as a user types code, the editor suggests a continuation and offers to finish the code for you with what it concludes would fit best.

Next there's one of my favorites: debuggers. Debuggers are super cool because they are literally there to guide you in fixing your code if there are any errors in how things are set up. A debugger allows individuals to execute their code one step at a time, seeing the exact order of execution from beginning to end, while also inspecting variables and controlling the flow of things. By offering a real-time look into the code's behavior, debuggers help developers spot the exact location where a problem occurred, which can drastically reduce the amount of time spent troubleshooting and trying to figure out what went wrong and where. This significantly improves workflow and allows the code to be trusted before it is complete.

There's also the compiler, which is important because it transforms written code into a format the machine is able to understand. The role that compilers play is crucial for code to be executable. High-level programming languages are converted into machine instructions that the hardware can actually carry out. It's common for Integrated Development Environments to include built-in compilers to streamline coding, testing, and debugging, allowing developers to write code that is immediately compiled so they can see the output in real time, in the environment they're already in.
Integrated Development Environments bring a lot of benefits by consolidating all the features that are extremely significant to developers into a single interface. Integration inside of Integrated Development Environments amplifies the overall software engineering experience by supplying smooth workflows and better chemistry between the multiple facets of the software engineering process. Creating an environment with that kind of chemistry means connecting many different tools in the developer world. As discussed before, these include interpreters, debuggers, source code editors, and more. Integrated Development Environments are simply interfaces where all those features, plus more, are ready at a software engineer's hand at a moment's notice, whether they need them or not. This simplifies switching between things like coding, testing, and debugging, which we all know is so fundamental in the software engineering world.
The most popular Integrated Development Environment is one that I and many other people use worldwide. It's called Visual Studio Code (VS Code) and it was developed by Microsoft. One of the reasons it's so popular is that it's free and remarkably versatile. An example of Visual Studio Code's versatility is the large number of programming languages it supports. Not only is Visual Studio Code very useful, it's also very appealing to the eye, allowing users to customize the program in so many ways to make it more appealing to them. It's also available on Windows, Mac, and Linux, which is super cool.
Integrated Development Environments (IDEs) are so fundamental in the coding world, and so many people use them without even knowing it. I certainly didn't know I was using an Integrated Development Environment for a long time before I found out what it was. IDEs are fundamental for making the life of a software engineer easier and reducing the amount of time and the number of steps it takes to do certain things.
Sources:
https://aws.amazon.com/what-is/ide/
https://www.codecademy.com/article/what-is-an-ide
https://www.redhat.com/en/topics/middleware/what-is-ide
https://www.veracode.com/security/integrated-development-environment
https://www.youtube.com/watch?v=vUn5akOlFXQ&pp=ygUOd2hhdCBpcyBhbiBpZGU%3D
https://www.youtube.com/watch?v=4Q3tw7sc1ZA&pp=ygUOd2hhdCBpcyBhbiBpZGU%3D | cougarred1 | |
1,890,791 | Building a Rock-Solid Foundation with Infrastructure as Code (IaC) | _Welcome Aboard Week 3 of DevSecOps in 5: Your Ticket to Secure Development Superpowers! Hey there,... | 0 | 2024-06-17T03:54:38 | https://dev.to/gauri1504/building-a-rock-solid-foundation-with-infrastructure-as-code-iac-efo | devops, devsecops, cloud, securiry | _Welcome Aboard Week 3 of DevSecOps in 5: Your Ticket to Secure Development Superpowers!
Hey there, security champions and coding warriors!
Are you itching to level up your DevSecOps game and become an architect of rock-solid software? Well, you've landed in the right place! This 5-week blog series is your fast track to mastering secure development and deployment._
---
In the agile development and cloud computing age, infrastructure management has dramatically shifted. Gone are the days of manual server configurations and error-prone scripting. Enter Infrastructure as Code (IaC), a revolutionary approach that automates infrastructure provisioning and configuration through code. This blog delves deep into the world of IaC, exploring its benefits, core concepts, best practices, and advanced techniques.
## The Power of IaC: Building Reliable and Scalable Infrastructure
IaC offers a multitude of advantages over traditional manual infrastructure management. Let's explore some key benefits:

#### Reduced Manual Errors:
Imagine the frustration of a typo leading to a critical production environment failure. IaC removes the human element from infrastructure provisioning by automating the process based on pre-defined code. This significantly reduces the risk of errors and ensures consistency in deployments.
#### Improved Repeatability and Scalability:
Need to spin up a new development environment quickly? IaC allows you to replicate infrastructure configurations with ease. Simply use the existing code to provision identical environments in minutes. This becomes even more powerful when scaling infrastructure. With IaC, scaling up or down becomes a matter of modifying the code and running a deployment script.
#### Version Control and Collaboration:
IaC code can be stored in version control systems like Git, just like application code. This enables features like tracking changes, collaboration among team members, and the ability to roll back deployments if necessary. Version control ensures a clear audit trail and simplifies troubleshooting.
## Demystifying IaC: Declarative vs. Imperative Approaches
IaC tools come in two primary flavors: declarative and imperative. Understanding these approaches is crucial for choosing the right tool for your project.
#### Declarative IaC:
This approach focuses on the desired state of the infrastructure. You simply define what resources you need (e.g., servers, databases) and their desired configurations (e.g., size, security settings) in the code. Tools like Terraform and AWS CloudFormation are popular examples. The IaC engine then translates this code and interacts with the underlying infrastructure provider to create or modify resources as needed to achieve the desired state.
#### Imperative IaC:
Here, the code dictates the exact steps needed to achieve the desired infrastructure configuration. Tools like Ansible and Chef use an imperative approach. The code specifies a sequence of commands necessary to configure the infrastructure, similar to how you might write a script to manually configure a server.

#### Choosing the Right Pattern:
The choice between declarative and imperative IaC depends on your specific needs:
**Declarative IaC** is ideal for environments that prioritize infrastructure as code and prefer a high-level, configuration-centric approach. It's also excellent for managing complex infrastructure with many resources, as changes are easier to track and understand.
**Imperative IaC** offers more granular control over individual steps, making it a good choice for situations where specific configuration management tasks are needed beyond simple resource provisioning. It can also be useful for automating existing manual server configuration workflows.
#### Popular IaC Tools for Each Pattern:
#### Declarative IaC:
Terraform, AWS CloudFormation, Azure Resource Manager (ARM)
#### Imperative IaC:
Ansible, Chef, Puppet
#### Example (Declarative IaC with Terraform):
```hcl
resource "aws_instance" "web_server" {
  ami           = "ami-0e123456789abcdef0"
  instance_type = "t2.micro"

  tags = {
    Name = "Web Server"
  }
}
```
This code snippet in Terraform defines a single AWS EC2 instance named "Web Server" with the specified AMI ID and instance type. Terraform will automatically provision this instance in your AWS account.
#### Example (Imperative IaC with Ansible):
```yaml
- name: Install Apache web server
  hosts: all
  become: true
  tasks:
    - name: Install apache2 package
      package:
        name: apache2
        state: present

    - name: Start and enable apache service
      service:
        name: apache2
        state: started
        enabled: yes
```
This Ansible playbook defines tasks for installing the Apache web server package and starting the service on all managed hosts.
## Taming the Chaos: Managing Infrastructure Drift
Infrastructure drift is a phenomenon where the actual state of your infrastructure deviates from the configuration defined in your IaC code. This can happen due to manual changes made outside the IaC workflow. It's crucial to address infrastructure drift to maintain consistency and security.
#### Understanding Infrastructure Drift:
Drift can introduce security vulnerabilities, configuration inconsistencies, and billing surprises. For example, a server might be manually provisioned outside of IaC, leaving it unmanaged.

## Combating Drift and Ensuring Quality: Advanced IaC Practices
#### IaC Drift Detection Tools:
Fortunately, several tools can help identify infrastructure drift. These tools compare the actual infrastructure state with the IaC code and report any discrepancies. Popular options include:
#### Terraform drift detection:
Terraform's built-in `terraform plan` command (including its `-refresh-only` mode) compares the real infrastructure state against your configuration and reports any drift across providers such as AWS, Azure, and GCP.

#### Cloud Conformity:
A service that continuously scans your cloud infrastructure for drift and compliance violations.
#### Open Source Drift Detectors:
Tools like Fugue and Terratest offer open-source solutions for drift detection in various cloud platforms.
#### Strategies to Prevent and Remediate Drift:
Here's how to keep your infrastructure on the straight and narrow:
#### Enforce IaC Usage:
Make IaC the mandatory approach for all infrastructure provisioning and configuration changes. This discourages manual modifications outside the IaC workflow.
#### Automate Remediations:
Configure IaC tools to automatically remediate drift when detected. This can involve automatically provisioning missing resources or bringing configurations back into compliance with the IaC code.
#### Continuous Integration/Continuous Delivery (CI/CD) Integration:
Integrate IaC code into your CI/CD pipeline. This ensures that infrastructure changes are automatically deployed and tested as part of the application deployment process, minimizing the chance for manual drift.
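To make the CI/CD drift check concrete: `terraform plan -detailed-exitcode` exits with `0` when the infrastructure matches the code, `1` on error, and `2` when changes (possible drift) are present. The Python sketch below wraps that convention for use in a pipeline step; the function names and workflow here are my own illustration, not from any particular tool.

```python
import subprocess

# Exit-code semantics of `terraform plan -detailed-exitcode`:
#   0 = no changes (infrastructure matches the code)
#   1 = error while planning
#   2 = changes present (possible drift)
def interpret_plan_exit_code(code: int) -> str:
    """Map a `terraform plan -detailed-exitcode` result to a drift status."""
    if code == 0:
        return "in-sync"
    if code == 2:
        return "drift-detected"
    return "error"

def check_drift(workdir: str) -> str:
    """Run a refresh-only plan and classify the result (requires terraform on PATH)."""
    result = subprocess.run(
        ["terraform", "plan", "-refresh-only", "-detailed-exitcode", "-input=false"],
        cwd=workdir,
        capture_output=True,
    )
    return interpret_plan_exit_code(result.returncode)

if __name__ == "__main__":
    # The pure mapping can be exercised even without Terraform installed:
    print(interpret_plan_exit_code(0))  # in-sync
    print(interpret_plan_exit_code(2))  # drift-detected
```

A CI job can fail the build (or trigger an automated remediation) whenever `check_drift` returns `"drift-detected"`.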
## Building Confidence: IaC Testing Strategies
Just like application code, IaC code also benefits from thorough testing to ensure its correctness and functionality. Here are some key IaC testing approaches:
#### Unit Testing IaC Code:
Unit testing focuses on validating the syntax and logic of individual IaC modules. This helps catch errors early in the development process. Tools like Terratest and Kitchen exist specifically for unit testing IaC code.

#### Integration Testing for IaC:
Integration testing verifies how different IaC modules interact and ensure the overall infrastructure configuration works as expected. This can involve deploying infrastructure stacks in a test environment and simulating real-world scenarios.
#### IaC Testing Tools:
Several tools can streamline IaC testing:
#### Terratest:
Provides a framework for writing unit and integration tests for Terraform code.

#### Molecule:
A tool for testing infrastructure configurations defined with various IaC tools.

#### Serverspec:
A testing framework that allows writing tests for server configurations using a language like Ruby.

## Beyond the Basics: Advanced IaC Techniques
As your IaC experience grows, consider these advanced techniques to improve your infrastructure management:
#### Modular IaC Design:
Break down your IaC code into reusable modules for different infrastructure components (e.g., web servers, databases). This promotes code reusability, maintainability, and scalability.

#### Data Templating with IaC:
Leverage data templating languages like Jinja2 within your IaC code. This allows you to dynamically generate configurations based on specific environments or variables, making your IaC code more adaptable.
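As a rough sketch of the idea - using only Python's standard-library `string.Template` instead of Jinja2, so it stays dependency-free - the template text, environment names, and values below are made up purely for illustration:

```python
from string import Template

# A minimal stand-in for Jinja2-style templating: one template,
# rendered with different variables per environment.
instance_template = Template("""\
resource "aws_instance" "$name" {
  ami           = "$ami"
  instance_type = "$instance_type"
}
""")

# Illustrative per-environment settings (not real AMIs).
environments = {
    "dev":  {"name": "dev_server",  "ami": "ami-0e123456789abcdef0", "instance_type": "t2.micro"},
    "prod": {"name": "prod_server", "ami": "ami-0e123456789abcdef0", "instance_type": "t3.large"},
}

def render(env: str) -> str:
    """Render the instance definition for one environment."""
    return instance_template.substitute(environments[env])

if __name__ == "__main__":
    print(render("dev"))
```

The same template now produces a small instance for `dev` and a larger one for `prod`, with the differences confined to one data structure.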
#### State Management with IaC:
Certain IaC tools require managing state information (e.g., IP addresses of provisioned resources). Options include using remote state backends (e.g., Terraform Cloud workspaces) or leveraging cloud provider-specific state management solutions.

## IaC Use Cases: Powering Your Infrastructure Workflows
IaC's versatility extends beyond basic infrastructure provisioning. Let's explore some compelling use cases:
#### IaC for Network Automation:
Automating network configurations like firewalls, routing, and security policies with IaC streamlines network management and reduces errors. Tools like Ansible and Cisco ACI can be used for network automation.
#### IaC for Continuous Delivery Pipelines:
Integrate IaC code into your CI/CD pipeline. This allows infrastructure provisioning and configuration to happen automatically alongside application deployments, ensuring everything is deployed consistently and reliably.
#### IaC for Disaster Recovery:
IaC can be used to automate disaster recovery workflows. By storing your infrastructure configuration as code, you can quickly rebuild your infrastructure in case of an outage, minimizing downtime.

## Security First: IaC Security Best Practices
Security is paramount when managing infrastructure through code. Here are some key considerations:
#### Secrets Management for IaC:
Never store sensitive information like passwords or API keys directly in your IaC code. Leverage secrets management services offered by cloud providers or use environment variables to securely manage secrets within your IaC workflow.
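A minimal sketch of the environment-variable approach in Python; the variable name `EXAMPLE_DB_PASSWORD` and its demonstration value are purely illustrative:

```python
import os

def get_secret(name: str) -> str:
    """Fetch a secret from the environment; fail loudly if it is missing."""
    value = os.environ.get(name)
    if value is None:
        raise RuntimeError(f"secret {name} is not set; refusing to continue")
    return value

# For demonstration only -- in practice the variable is injected by the CI
# system or a secrets manager, never committed to the repository.
os.environ.setdefault("EXAMPLE_DB_PASSWORD", "example-only")
print(get_secret("EXAMPLE_DB_PASSWORD"))
```

Failing loudly when a secret is absent is deliberate: a deployment that silently falls back to an empty credential is harder to diagnose than one that stops immediately.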
#### Least Privilege Principle in IaC:
The principle of least privilege dictates that IaC code should have the minimum permissions required to perform its tasks. This minimizes the potential damage caused by accidental or malicious code execution.
#### IaC Compliance and Governance:
IaC code should adhere to your organization's security policies and compliance regulations. Tools like Cloud Custodian can help enforce these policies within your IaC code.
## A Glimpse into the Future: The Evolving Landscape of IaC
IaC is constantly evolving, with new trends and technologies shaping its future. Here's a peek at what's on the horizon:
#### Self-Service Infrastructure with IaC:
Imagine a world where developers can provision their own environments using pre-approved IaC templates. This empowers developers with greater autonomy while maintaining control through governance policies.
#### Machine Learning in IaC:
Machine learning can optimize IaC code by identifying patterns and suggesting improvements. It can also automate infrastructure management tasks and predict potential issues before they occur.
#### Infrastructure as Code for Edge Computing:
The rise of edge computing necessitates managing infrastructure at geographically distributed locations. IaC tools are being adapted to handle the unique challenges of edge deployments, such as limited resources and intermittent connectivity.
## Deep Dives for the Discerning Reader
#### IaC Cost Optimization:
Cloud infrastructure costs can add up quickly. IaC can help optimize costs by:
#### Right-sizing resources:
Provisioning only the resources needed for a particular workload can significantly reduce costs. IaC tools can automate this process.
#### Utilizing spot instances:
Cloud providers offer discounted compute instances with variable availability. IaC can be used to leverage spot instances for workloads that can tolerate interruptions.
#### Automating scaling:
IaC can automatically scale infrastructure up or down based on demand, eliminating the risk of overprovisioning and incurring unnecessary costs.
## IaC Best Practices for Collaboration
Effective collaboration is crucial for successful IaC adoption. Here are some best practices:
#### Code reviews:
Implement code review processes for IaC code similar to application code reviews. This ensures code quality and adherence to best practices.
#### Version control practices:
Utilize version control systems like Git to track changes, manage different versions of IaC code, and facilitate rollbacks when necessary.
#### Communication strategies:
Establish clear communication channels between infrastructure engineers, developers, and operations teams to ensure everyone is aligned on IaC usage and best practices.
#### IaC Training and Certification:
Numerous resources exist for learning IaC and getting certified in popular IaC tools like Terraform or Ansible. Cloud provider documentation, online courses, and certification programs offered by providers like Hashicorp can equip you with the necessary skills.
## Conclusion
Infrastructure as Code (IaC) is revolutionizing infrastructure management. By automating infrastructure provisioning and configuration, IaC offers numerous benefits, including improved efficiency, consistency, and scalability. This blog has provided a comprehensive overview of IaC concepts, best practices, and advanced techniques. As you embark on your IaC journey, remember to prioritize security, leverage automation, and embrace the ever-evolving landscape of this powerful technology.
---
I'm grateful for the opportunity to delve into Building a Rock-Solid Foundation with Infrastructure as Code (IaC) with you today. It's a fascinating area with so much potential to improve the security landscape.
Thanks for joining me on this exploration of Building a Rock-Solid Foundation with Infrastructure as Code (IaC). Your continued interest and engagement fuel this journey!
If you found this discussion on Building a Rock-Solid Foundation with Infrastructure as Code (IaC) helpful, consider sharing it with your network! Knowledge is power, especially when it comes to security.
Let's keep the conversation going! Share your thoughts, questions, or experiences with Infrastructure as Code (IaC) in the comments below.
Eager to learn more about DevSecOps best practices? Stay tuned for the next post!
By working together and adopting secure development practices, we can build a more resilient and trustworthy software ecosystem.
Remember, the journey to secure development is a continuous learning process. Here's to continuous improvement!🥂
| gauri1504 |
1,890,790 | Understanding OIDC back-channel logout | Learn how OIDC back-channel logout works and why it is important in modern identity solutions. ... | 0 | 2024-06-17T03:54:34 | https://blog.logto.io/oidc-back-channel-logout/ | webdev, opensource, identity, security | Learn how OIDC back-channel logout works and why it is important in modern identity solutions.
---
# Background
Ensuring user security and privacy is a topic that never gets old. Nowadays, social sign-in has been widely adopted as a sign-in method for its simplicity and seamless experience. However, what happens when you log out from the social identity provider? Can the other online services using your social identity be logged out at the same time?
One essential feature of OpenID Connect (OIDC), the back-channel logout, offers a robust solution to address this requirement, enhancing user security by enabling logout simultaneously across various applications.
# What is OIDC back-channel logout?
OIDC back-channel logout is a mechanism designed to ensure that when a user logs out from an identity provider (IdP), they are also logged out from all the associated relying parties (RPs) or applications.
The back-channel logout operates through direct server-to-server “back-channel” communication, allowing the identity provider to notify all registered client applications about the user's logout event. Consequently, client applications can promptly terminate the user's sessions and perform any necessary cleanup actions.
# How does back-channel logout work?
The back-channel logout process involves several steps:
1. **User initiates logout:** The user initiates a logout from the identity provider.
2. **IdP sends logout token:** The IdP generates a logout token and sends it to all the registered RPs through a direct back-channel request.
3. **RP processes logout:** Each RP receives the logout token, validates it, and terminates the user session.
4. **Confirmation to IdP:** The RP may send a confirmation back to the IdP, acknowledging the successful logout.
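To make step 3 concrete, here is a hedged Python sketch of the claim checks an RP performs on an already *decoded* logout token. Per the OIDC Back-Channel Logout spec, the token must carry the back-channel logout member in its `events` claim, must identify the session via `sub` and/or `sid`, and must not carry a `nonce`. Signature verification, expiry, and `jti` replay checks are deliberately omitted here; a real implementation must perform them, typically via a JOSE library. The sample claim values are invented for illustration.

```python
LOGOUT_EVENT = "http://schemas.openid.net/event/backchannel-logout"

def validate_logout_token_claims(claims: dict, expected_iss: str, expected_aud: str) -> bool:
    """Sanity-check the decoded claims of an OIDC logout token."""
    if claims.get("iss") != expected_iss:
        return False
    aud = claims.get("aud")
    if aud != expected_aud and not (isinstance(aud, list) and expected_aud in aud):
        return False
    if "iat" not in claims:
        return False
    # The token must identify the user or session: `sub`, `sid`, or both.
    if "sub" not in claims and "sid" not in claims:
        return False
    # The events claim must contain the back-channel logout event member.
    if LOGOUT_EVENT not in claims.get("events", {}):
        return False
    # A logout token must NOT carry a nonce (it is not an ID token).
    if "nonce" in claims:
        return False
    return True

good = {
    "iss": "https://idp.example.com",
    "aud": "my-client-id",
    "iat": 1718600000,
    "sid": "session-123",
    "events": {LOGOUT_EVENT: {}},
}
print(validate_logout_token_claims(good, "https://idp.example.com", "my-client-id"))  # True
```

Only after these checks pass should the RP look up the session named by `sid` (or the user named by `sub`) and terminate it.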
# Benefits and impact
The back-channel logout feature offers several benefits for both users and service providers:
- **Enhanced user security:** Users can enjoy improved security and privacy, knowing that their sessions are promptly terminated across all connected applications upon logout.
- **Simplified user experience:** The seamless logout experience reduces user friction and enhances usability, fostering trust and satisfaction.
- **Compliance with security standards:** Adoption of OIDC back-channel logout aligns with industry best practices and regulatory requirements, demonstrating a commitment to robust security practices.
# I need this feature. Does Logto support it already?
We are actively testing this feature, and it will be available on both Logto Cloud and the Logto open-source version. Please stay tuned for our future updates.
# Conclusion
OIDC back-channel logout is an essential feature in modern identity solutions, giving users greater control over their online security and privacy. By adopting this mechanism, organizations can provide a seamless and secure logout experience, safeguarding users' online activities.
{% cta https://logto.io/?ref=dev %} Try Logto Cloud for free {% endcta %}
| palomino |
1,890,789 | Day 1: Introduction to Data Structures and Algorithms (DSA)🚀 | Welcome to the first day of our comprehensive journey into Data Structures and Algorithms (DSA)!... | 0 | 2024-06-17T03:54:21 | https://dev.to/dipakahirav/day-1-introduction-to-data-structures-and-algorithms-dsa-122o | dsa, algorithms, datastructures, learning | Welcome to the first day of our comprehensive journey into Data Structures and Algorithms (DSA)! Whether you are a beginner or someone looking to solidify your understanding, this series will walk you through the fundamentals step-by-step. Let’s embark on this exciting journey! 🎉
Please subscribe to my [YouTube channel](https://www.youtube.com/@DevDivewithDipak?sub_confirmation=1) to support my channel and get more web development tutorials.
## What are Data Structures and Algorithms? 🤔
### Data Structures
**Data Structures** are ways to organize and store data in a computer so that it can be accessed and modified efficiently. They provide the foundation for efficiently managing large amounts of data. Common data structures include:
- **Arrays**: A collection of items stored at contiguous memory locations.
- **Linked Lists**: A sequence of nodes where each node contains data and a reference to the next node.
- **Stacks**: A collection based on the Last In, First Out (LIFO) principle.
- **Queues**: A collection based on the First In, First Out (FIFO) principle.
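The series doesn't fix a programming language, so here is a quick Python sketch (my choice of language) of the LIFO and FIFO behaviors described above:

```python
from collections import deque

# Stack: Last In, First Out -- a plain list works well.
stack = []
stack.append("first")
stack.append("second")
assert stack.pop() == "second"   # the most recently pushed item comes out first

# Queue: First In, First Out -- deque gives O(1) operations at both ends.
queue = deque()
queue.append("first")
queue.append("second")
assert queue.popleft() == "first"  # the oldest item comes out first
```

A list also works as a queue via `pop(0)`, but that is O(n) per removal, which is why `deque` is the idiomatic choice.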
### Algorithms
**Algorithms** are step-by-step procedures or formulas for solving problems. In computing, algorithms perform tasks like sorting, searching, and processing data. They are essential for writing efficient and effective code.
## Importance of DSA in Programming and Interviews 📝
Understanding DSA is critical for several reasons:
1. **Efficient Problem Solving**: Well-designed data structures and algorithms help in writing optimized and efficient code.
2. **Cracking Interviews**: Many technical interviews focus on DSA to assess problem-solving and coding skills.
3. **Foundation for Advanced Topics**: DSA concepts are the basis for advanced topics in computer science and software development.
## Overview of Common Data Structures and Algorithms 📚
### Common Data Structures
- **Arrays**: Simple, fixed-size structures for storing data elements.
- **Linked Lists**: Dynamic structures for storing elements with ease of insertion and deletion.
- **Stacks**: Useful in scenarios requiring reverse processing or backtracking.
- **Queues**: Ideal for scenarios like task scheduling.
### Common Algorithms
- **Sorting Algorithms**: Methods to arrange data in a particular order (e.g., Bubble Sort, Quick Sort).
- **Search Algorithms**: Methods to find elements within a data structure (e.g., Binary Search).
- **Graph Algorithms**: Techniques for solving problems related to graph structures (e.g., Dijkstra's Algorithm).
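As a small taste of the sorting algorithms listed above, here is a minimal Bubble Sort in Python (a sketch for illustration; later days will cover these in depth):

```python
def bubble_sort(items):
    """Return a sorted copy, repeatedly swapping adjacent out-of-order pairs."""
    result = list(items)                  # work on a copy
    n = len(result)
    for i in range(n):
        swapped = False
        for j in range(n - 1 - i):        # the tail is already in place
            if result[j] > result[j + 1]:
                result[j], result[j + 1] = result[j + 1], result[j]
                swapped = True
        if not swapped:                   # early exit when already sorted
            break
    return result

print(bubble_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]
```

Bubble Sort is O(n^2) in the worst case, which is exactly why faster algorithms like Quick Sort matter as input sizes grow.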
## Time and Space Complexity ⏱️
To evaluate the efficiency of an algorithm, we use time and space complexity.
### Time Complexity
Time complexity measures the amount of time an algorithm takes to complete as a function of the input size. For example:
- **O(1)**: Constant time.
- **O(n)**: Linear time.
- **O(log n)**: Logarithmic time.
- **O(n^2)**: Quadratic time.
### Space Complexity
Space complexity measures the amount of memory an algorithm uses relative to the input size. Efficient algorithms strive to minimize memory usage.
## Big O Notation 📊
Big O notation is a mathematical notation used to describe the upper bound of an algorithm's time or space complexity. It provides an abstract measure of performance, helping developers understand the worst-case scenario for their algorithms.
### Common Big O Notations
- **O(1)**: Constant time - the algorithm's performance is independent of the input size.
- **O(n)**: Linear time - the algorithm's performance grows linearly with the input size.
- **O(log n)**: Logarithmic time - the algorithm's performance grows logarithmically as the input size increases.
- **O(n^2)**: Quadratic time - the algorithm's performance is proportional to the square of the input size.
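To see these growth rates in action, the sketch below counts the steps taken by an O(n) linear search versus an O(log n) binary search on the same sorted data. The step counters are there purely for illustration:

```python
def linear_search(items, target):
    """O(n): check every element until the target is found."""
    steps = 0
    for i, value in enumerate(items):
        steps += 1
        if value == target:
            return i, steps
    return -1, steps

def binary_search(items, target):
    """O(log n): halve the sorted search range each step."""
    lo, hi, steps = 0, len(items) - 1, 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid, steps
        if items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, steps

data = list(range(1024))
print(linear_search(data, 1000)[1])  # 1001 steps
print(binary_search(data, 1000)[1])  # 10 steps (at most 11 for 1024 elements)
```

For 1,024 elements the difference is roughly 1,000 steps versus 10; for a million elements it becomes a million versus about 20, which is the practical meaning of O(n) versus O(log n).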
## Conclusion 🎯
Today, we laid the foundation by introducing the basic concepts of Data Structures and Algorithms and their significance. Understanding these fundamentals is crucial as we delve deeper into more complex topics in the upcoming days.
Stay tuned for [Day 2](https://dev.to/dipakahirav/day-2-understanding-big-o-notation-1l7m), where we will explore Big O notation in detail, complete with examples and practice problems. Feel free to share your thoughts or questions in the comments below. Happy coding! 💻
---
Feel free to leave your comments or questions below. If you found this guide helpful, please share it with your peers and follow me for more web development tutorials. Happy coding!
### Follow and Subscribe:
- **Website**: [Dipak Ahirav](https://www.dipakahirav.com)
- **Email**: dipaksahirav@gmail.com
- **Instagram**: [devdivewithdipak](https://www.instagram.com/devdivewithdipak)
- **YouTube**: [devDive with Dipak](https://www.youtube.com/@DevDivewithDipak?sub_confirmation=1)
- **LinkedIn**: [Dipak Ahirav](https://www.linkedin.com/in/dipak-ahirav-606bba128) | dipakahirav |
1,890,788 | Day 1: Introduction to Data Structures and Algorithms (DSA)🚀 | Welcome to the first day of our comprehensive journey into Data Structures and Algorithms (DSA)!... | 0 | 2024-06-17T03:54:21 | https://dev.to/dipakahirav/day-1-introduction-to-data-structures-and-algorithms-dsa-3d43 | dsa, algorithms, datastructures, learning | Welcome to the first day of our comprehensive journey into Data Structures and Algorithms (DSA)! Whether you are a beginner or someone looking to solidify your understanding, this series will walk you through the fundamentals step-by-step. Let’s embark on this exciting journey! 🎉
Please subscribe to my [YouTube channel](https://www.youtube.com/@DevDivewithDipak?sub_confirmation=1) to support my channel and get more web development tutorials.
## What are Data Structures and Algorithms? 🤔
### Data Structures
**Data Structures** are ways to organize and store data in a computer so that it can be accessed and modified efficiently. They provide the foundation for efficiently managing large amounts of data. Common data structures include:
- **Arrays**: A collection of items stored at contiguous memory locations.
- **Linked Lists**: A sequence of nodes where each node contains data and a reference to the next node.
- **Stacks**: A collection based on the Last In, First Out (LIFO) principle.
- **Queues**: A collection based on the First In, First Out (FIFO) principle.
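To make the LIFO/FIFO distinction concrete, here is a minimal sketch in Python (the article itself shows no code, so the language and names here are illustrative choices, not from the original):

```python
from collections import deque

# Stack: Last In, First Out (LIFO) -- push and pop happen at the same end.
stack = []
stack.append("a")
stack.append("b")
stack.append("c")
top = stack.pop()        # "c": the most recently added item leaves first

# Queue: First In, First Out (FIFO) -- append at one end, remove from the other.
queue = deque()
queue.append("a")
queue.append("b")
queue.append("c")
front = queue.popleft()  # "a": the oldest item leaves first

print(top, front)
```

`deque` is used for the queue because popping from the front of a plain list is an O(n) operation, while `popleft` is O(1).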
### Algorithms
**Algorithms** are step-by-step procedures or formulas for solving problems. In computing, algorithms perform tasks like sorting, searching, and processing data. They are essential for writing efficient and effective code.
## Importance of DSA in Programming and Interviews 📝
Understanding DSA is critical for several reasons:
1. **Efficient Problem Solving**: Well-designed data structures and algorithms help in writing optimized and efficient code.
2. **Cracking Interviews**: Many technical interviews focus on DSA to assess problem-solving and coding skills.
3. **Foundation for Advanced Topics**: DSA concepts are the basis for advanced topics in computer science and software development.
## Overview of Common Data Structures and Algorithms 📚
### Common Data Structures
- **Arrays**: Simple, fixed-size structures for storing data elements.
- **Linked Lists**: Dynamic structures for storing elements with ease of insertion and deletion.
- **Stacks**: Useful in scenarios requiring reverse processing or backtracking.
- **Queues**: Ideal for scenarios like task scheduling.
### Common Algorithms
- **Sorting Algorithms**: Methods to arrange data in a particular order (e.g., Bubble Sort, Quick Sort).
- **Search Algorithms**: Methods to find elements within a data structure (e.g., Binary Search).
- **Graph Algorithms**: Techniques for solving problems related to graph structures (e.g., Dijkstra's Algorithm).
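As a concrete illustration of one of these, binary search halves a sorted list's search space on each step, which is where its O(log n) behavior comes from. This Python sketch is an illustrative example, not code from the article:

```python
def binary_search(items, target):
    """Return the index of target in the sorted list items, or -1 if absent."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            lo = mid + 1   # target can only be in the right half
        else:
            hi = mid - 1   # target can only be in the left half
    return -1

print(binary_search([1, 3, 5, 7, 9, 11], 7))  # index 3
```

Note that the precondition (the list is already sorted) is what makes discarding half the elements per step valid.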
## Time and Space Complexity ⏱️
To evaluate the efficiency of an algorithm, we use time and space complexity.
### Time Complexity
Time complexity measures the amount of time an algorithm takes to complete as a function of the input size. For example:
- **O(1)**: Constant time.
- **O(n)**: Linear time.
- **O(log n)**: Logarithmic time.
- **O(n^2)**: Quadratic time.
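A short illustration of how the same task can land in different complexity classes: summing 1..n with a loop is O(n), while the closed-form formula is O(1). This is a Python sketch with function names of my own choosing, not from the article:

```python
def sum_linear(n):
    # O(n): touches every number from 1 to n once
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def sum_constant(n):
    # O(1): a single arithmetic expression, independent of n
    return n * (n + 1) // 2

print(sum_linear(100), sum_constant(100))  # both give 5050
```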
### Space Complexity
Space complexity measures the amount of memory an algorithm uses relative to the input size. Efficient algorithms strive to minimize memory usage.
## Big O Notation 📊
Big O notation is a mathematical notation used to describe the upper bound of an algorithm's time or space complexity. It provides an abstract measure of performance, helping developers understand the worst-case scenario for their algorithms.
### Common Big O Notations
- **O(1)**: Constant time - the algorithm's performance is independent of the input size.
- **O(n)**: Linear time - the algorithm's performance grows linearly with the input size.
- **O(log n)**: Logarithmic time - the algorithm's performance grows logarithmically as the input size increases.
- **O(n^2)**: Quadratic time - the algorithm's performance is proportional to the square of the input size.
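To see the practical difference between two of these classes, here is an illustrative Python sketch (the names are mine) that detects duplicates in O(n^2) by comparing every pair, versus O(n) average time with a set:

```python
def has_duplicate_quadratic(items):
    # O(n^2): compare every pair of elements
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicate_linear(items):
    # O(n) on average: one pass, with constant-time set membership checks
    seen = set()
    for x in items:
        if x in seen:
            return True
        seen.add(x)
    return False

print(has_duplicate_quadratic([1, 2, 3, 2]), has_duplicate_linear([1, 2, 3, 2]))
```

Both functions give the same answer; the difference only shows up as the input grows, which is exactly what Big O notation describes.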
## Conclusion 🎯
Today, we laid the foundation by introducing the basic concepts of Data Structures and Algorithms and their significance. Understanding these fundamentals is crucial as we delve deeper into more complex topics in the upcoming days.
Stay tuned for Day 2, where we will explore Big O notation in detail, complete with examples and practice problems. Feel free to share your thoughts or questions in the comments below. Happy coding! 💻
---
Feel free to leave your comments or questions below. If you found this guide helpful, please share it with your peers and follow me for more web development tutorials. Happy coding!
### Follow and Subscribe:
- **Website**: [Dipak Ahirav](https://www.dipakahirav.com)
- **Email**: dipaksahirav@gmail.com
- **Instagram**: [devdivewithdipak](https://www.instagram.com/devdivewithdipak)
- **YouTube**: [devDive with Dipak](https://www.youtube.com/@DevDivewithDipak?sub_confirmation=1)
- **LinkedIn**: [Dipak Ahirav](https://www.linkedin.com/in/dipak-ahirav-606bba128) | dipakahirav |
1,890,787 | Exploring the Synergy of AI and Blockchain: A New Era of Innovation | In the rapidly evolving landscape of technology, the integration of artificial intelligence (AI) and... | 0 | 2024-06-17T03:53:55 | https://dev.to/laxita01/exploring-the-synergy-of-ai-and-blockchain-a-new-era-of-innovation-537k | blockchain, ai | In the rapidly evolving landscape of technology, the integration of artificial intelligence (AI) and blockchain stands out as a transformative force poised to reshape various industries. By combining the predictive power of AI with the immutable and decentralized nature of blockchain, businesses can unlock new levels of efficiency, security, and innovation. This blog explores the profound impact of [integrating AI and blockchain](https://www.solulab.com/ai-in-blockchain/), the benefits of this synergy, and real-world examples of their application.
**The Convergence of AI and Blockchain**
AI and blockchain are powerful technologies in their own right, but their convergence creates a synergy that can revolutionize numerous sectors. AI's ability to analyze vast amounts of data and make intelligent predictions complements blockchain's secure and transparent data management capabilities. Together, they offer solutions that are greater than the sum of their parts.
**Key Benefits of AI and Blockchain Integration**
**Enhanced Data Security and Privacy:** Blockchain's decentralized nature ensures that data is stored across multiple nodes, making it resistant to tampering and unauthorized access. When combined with AI, which can analyze and detect anomalies, businesses can achieve unprecedented levels of data security and privacy.
**Improved Efficiency and Automation:** AI-powered smart contracts on blockchain platforms can automate complex processes and transactions. This reduces the need for intermediaries, lowers operational costs, and speeds up transaction times, benefiting industries such as finance, supply chain, and healthcare.
**Trust and Transparency:** Blockchain's immutable ledger ensures that all transactions are transparent and traceable. When integrated with AI, this transparency can be leveraged to build trust in AI algorithms, as stakeholders can verify the data and decisions made by AI systems.
**Real-World Applications**
**Supply Chain Management:** Combining AI and blockchain enhances supply chain visibility and efficiency. AI algorithms can predict demand and optimize inventory management, while blockchain ensures traceability and authenticity of goods. Companies can hire dedicated developers to build customized solutions that leverage both technologies.
**Healthcare:** AI and blockchain integration can revolutionize healthcare by enabling secure sharing of patient data, improving diagnosis accuracy, and streamlining administrative processes. For example, [generative AI consulting services](https://www.solulab.com/generative-ai-consulting-company/) can help healthcare providers develop AI models that analyze medical images, while blockchain ensures patient data integrity.
Finance: The financial sector can benefit immensely from the integration of AI and blockchain. AI can analyze market trends and predict investment opportunities, while blockchain ensures secure and transparent transactions. Blockchain development companies are at the forefront of creating such innovative solutions for financial institutions.
**Decentralized AI Marketplaces:** Blockchain can facilitate decentralized AI marketplaces where developers can share and monetize AI models. Hybrid AI systems, which combine different AI approaches, can be securely traded and validated using blockchain technology, fostering innovation and collaboration.
**Future Prospects**
The integration of AI and blockchain is still in its early stages, but the potential for innovation is immense. As these technologies continue to evolve, businesses will increasingly seek specialized services to harness their combined power. Whether it's through generative AI consulting services or partnering with a [blockchain development company,](https://www.solulab.com/blockchain-development-company/) the opportunities for innovation and growth are boundless.
In conclusion, the synergy of AI and blockchain represents a new frontier in technology, offering solutions that enhance security, efficiency, and transparency across various industries. By leveraging the strengths of both technologies, businesses can drive innovation and achieve competitive advantages. To fully realize these benefits, it is crucial to [hire dedicated developers](https://www.solulab.com/hire-dedicated-developers/) with expertise in both AI and blockchain, ensuring the successful implementation and integration of these transformative technologies. | laxita01 |
1,890,681 | Oh My Posh- Powershell Terminal Setup | I would like to share how to set up a fancy and productive terminal prompt under Windows PowerShell.... | 0 | 2024-06-17T03:38:22 | https://dev.to/chenchih/oh-my-posh-powershell-terminal-setup-mfj | terminal, window, powershell, linux | I would like to share how to set up a fancy and productive terminal prompt under Windows PowerShell. Many people who have ever used Linux or Mac, will notice some great command, whereas the window command is not useful. Today I would like to show a tutorial on how to set up `oh-my-posh` environment.
If you have ever used Linux/macOS you have probably heard of `oh-my-zsh`, which also makes the prompt fancier. `oh-my-zsh` only supports the `zsh` shell, while `oh-my-posh` supports many shells, though most people use it on Windows. Both frameworks are used not just to make the terminal fancy but also to make us more productive with CLI commands. CLI commands let you achieve many things without a UI, including copy, filter, delete, rename, and much more.
# 1. Install Windows Terminal and PowerShell
Before setting up oh-my-posh you need to install Windows Terminal and PowerShell first. Windows ships with PowerShell 5.x by default; you will need to install PowerShell 7.x, typically named PowerShell Core.
PowerShell Core supports many more functions in plugins; without it, errors might occur when installing and importing plugins. Windows Terminal allows the prompt to display properly with Nerd Fonts and fancy glyphs.
## Windows Terminal
There are two methods:
- Store: open the Microsoft Store and search for Windows Terminal
- Winget:
```
winget install Microsoft.WindowsTerminal
```
## PowerShell
There are two methods:
- Store: open the Microsoft Store and search for PowerShell
- Winget:
```
winget install Microsoft.PowerShell
```
# Windows package installation
## Winget
In case your `winget` is not available, please download and install it as follows:
Step 1: Download from either of these links:
https://learn.microsoft.com/en-us/windows/package-manager/winget/
https://github.com/microsoft/winget-cli/releases
The file will look like `Microsoft.DesktopAppInstaller_8wekyb3d8bbwe.msixbundle`
Step2: Install by command
```
Add-AppxPackage .\Microsoft.DesktopAppInstaller_8wekyb3d8bbwe.msixbundle
```
If you ever meet an execution-policy failure error, it is probably the policy problem; please change the policy to RemoteSigned.
> check the current policy: `Get-ExecutionPolicy -List`
```
Scope ExecutionPolicy
----- ---------------
MachinePolicy Undefined
UserPolicy Undefined
Process Undefined
CurrentUser Undefined
LocalMachine RemoteSigned
```
> set policy: `Set-ExecutionPolicy RemoteSigned`
For more detail on each policy, please refer [here](https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.security/set-executionpolicy?view=powershell-7.4)
- Restricted (default): no scripts are allowed to run, not even ones you write yourself.
- RemoteSigned: scripts you write locally run freely, but files downloaded from the internet must be signed by a trusted publisher or must be unblocked. It's like having a guard who checks IDs on anything arriving from outside.
- AllSigned: every script must carry a valid signature from a trusted publisher before it can run, including ones you write yourself; you would need to sign your own scripts with a certificate. Think of it like a document requiring a verified signature from a recognized authority.
- Bypass (not recommended): script execution restrictions are completely disabled. It's like having no security guard at all! Any script can run, which is very risky and not recommended unless you fully understand the potential dangers.
# 2. Oh-My-Posh setup
## Step1: install Nerd Font
Please download either of the link below to download and install nerd font:
- Download:
> - https://github.com/ryanoasis/nerd-fonts
> - https://www.nerdfonts.com/font-downloads
- Install:
Extract the file and right-click to install or drag into `C:\Windows\Fonts`
## Step2: Window terminal setting
Navigate window terminal and set below setting:
- Startup: set the default profile to PowerShell and the default terminal application to Windows Terminal.
The settings look like the picture below:

- Appearance
Go to PowerShell > Appearance and change the font face to the Nerd Font you installed.
## Step3: install oh-my-posh
- Install Oh-my-posh
```
winget install JanDeDobbeleer.OhMyPosh -s winget
```
- update to the latest version
```
winget upgrade JanDeDobbeleer.OhMyPosh -s winget
```
After installing, run `oh-my-posh` in the terminal to check whether the command works.
## Step4: activate theme
activate default theme
```
oh-my-posh init pwsh | invoke-expression
```
list all the themes and what they look like:
```
Get-PoshThemes
```
or display all theme locations and filenames
```
Get-PoshThemes -List
```
Change the theme using one of the theme filenames listed above
```
oh-my-posh init pwsh --config "$env:POSH_THEMES_PATH\iterm2.omp.json" | Invoke-Expression
```
## Step5: writing the theme into the profile
We need to create a profile and write the theme into it. The command in Step 4 only takes effect for the current session; when you open a new session, settings such as the theme you entered are lost and revert to the default.
Profile locations for different PowerShell versions:
> - **PS5.1**: `C:\Users\test\Documents\WindowsPowerShell`
> - **PS7.1**: `C:\Users\test\Documents\PowerShell`
Generate the profile file (this only creates an empty file):
```
New-Item -Path $PROFILE -Type File -Force
```
You need to modify the file, for example with `notepad $profile`; you can swap Notepad for any text editor you prefer. `$profile` automatically expands to the path of the generated file, so there is no need to type the full path.
Add your previous theme-changing command to your profile:
```
oh-my-posh init pwsh --config "$env:POSH_THEMES_PATH\iterm2.omp.json" | Invoke-Expression
```
You can also assign the full path of the theme; below I mention several examples of assigning themes:
```
#full path
$themepath = 'C:\Users\test\Documents\PowerShell\shanselman_v3-v2.json'
oh-my-posh --init --shell pwsh --config $themepath | Invoke-Expression
# URL
oh-my-posh init pwsh --config 'https://raw.githubusercontent.com/JanDeDobbeleer/oh-my-posh/main/themes/jandedobbeleer.omp.json' | Invoke-Expression
# same location as $profile
$omp_config = Join-Path $PSScriptRoot ".\theme.json"
oh-my-posh --init --shell pwsh --config $omp_config | Invoke-Expression
```
You might wonder how `$env:` knows the full path of the default themes; it's because it is an environment variable in PowerShell. You can use this command to check the environment variables and their paths:
```
Get-ChildItem env:
```

Note: You can set and remove an `$env:` variable using the commands below. But this only persists if you write it into `$profile`; set on the command line it is temporary. If you set the variable but do not add it to `$profile`, it will not take effect in new sessions.
```
#set $env
Set-Item -Path env:MYCUSTOMVAR -Value <path location>
#remove
$env:<name> = $null
```
## Step6: add some functions or aliases to the profile
Now we can assign some Linux commands as aliases, so you can use Linux-style commands in PowerShell. You will use `$profile` a lot with the plugins below; just think of it as PowerShell's configuration file.
Before setting the aliases and functions, let's install git and node. This is optional, but if you want to add git aliases it is highly recommended to install it here.
- Install git
```
winget install --id Git.Git -e --source winget
```
- Install nodejs
```
winget install OpenJS.NodeJS.LTS
```
### Alias
```
Set-alias tt tree
Set-Alias ll ls
Set-Alias g git
#Set alias vim nvim
Set-Alias grep findstr
Set-Alias tig 'C:\Program Files\Git\usr\bin\tig.exe'
Set-Alias less 'C:\Program Files\Git\usr\bin\less.exe'
# Ultilities (Optional)
function which ($command) {
Get-Command -Name $command -ErrorAction SilentlyContinue |
Select-Object -ExpandProperty Path -ErrorAction SilentlyContinue
}
```
### Function
Adding functions can be an alternative to aliases
```
function getenv{
Get-ChildItem env:
}
function head {
param($Path, $n = 10)
Get-Content $Path -Head $n
}
function tail {
param($Path, $n = 10)
Get-Content $Path -Tail $n
}
function grep($regex, $dir) {
if ( $dir ) {
Get-ChildItem $dir | select-string $regex
return
}
$input | select-string $regex
}
function df {
get-volume
}
#get $env variable
function getenv {Get-ChildItem env:}
# Git Shortcuts
function gs { git status }
function ga { git add . }
function gc { param($m) git commit -m "$m" }
function gp { git push }
function g { z Github }
function gcom {
git add .
git commit -m "$args"
}
function lazyg {
git add .
git commit -m "$args"
git push
}
# adding hosts shortcut
function hosts { notepad c:\windows\system32\drivers\etc\hosts }
```
If you are interested in the profile, please refer to my profile on this [link](https://github.com/chenchih/Env_Setup_Note/blob/master/Terminal/ohmyposh/Microsoft.PowerShell_profile.ps1).
# 3. Plugin
There are many more plugins, but I will choose only those I think are commonly used by people or developers.
## Terminal Icons
Display icon according to file type
> install module
```
Install-Module -Name Terminal-Icons -Repository PSGallery -Force
```
> import module
```
Import-Module Terminal-Icons
```
For what it will look like, please refer to the picture below.

## PSReadLine (autocompletion)
This is a powerful module with many functions. I will not use all of them, only the important or useful ones. If you're interested, you can study them on the [official site](https://learn.microsoft.com/en-us/powershell/module/psreadline/?view=powershell-7.4). PSReadLine is developed by Microsoft and used in PowerShell.
You can use this command to get the default hotkey: `Get-PSReadLineKeyHandler`
> install module
```
Install-Module PSReadLine -Force
# or the more detailed form below, which is recommended
Install-Module -Name PSReadLine -AllowPrerelease -Scope CurrentUser -Force -SkipPublisherCheck
```
> Import module
```
#method1
Import-Module PSReadLine
#method2
if ($host.Name -eq 'ConsoleHost')
{
Import-Module PSReadLine
}
```
- all psreadline plugin
```
Set-PSReadLineOption -EditMode Windows
Set-PSReadLineOption -PredictionSource History
Set-PSReadLineOption -PredictionViewStyle ListView
Set-PSReadLineKeyHandler -key Tab -Function Complete
Set-PSReadLineKeyHandler -Key Tab -Function MenuComplete
Set-PSReadlineKeyHandler -Chord ctrl+x -Function ViExit
Set-PSReadLineKeyHandler -Chord 'Ctrl+d' -Function DeleteChar
Set-PSReadlineKeyHandler -Chord ctrl+w -Function BackwardDeleteWord
Set-PSReadlineKeyHandler -Chord ctrl+e -Function EndOfLine
Set-PSReadlineKeyHandler -Chord ctrl+a -Function BeginningOfLine
Set-PSReadLineKeyHandler -Key Alt+B -Function SelectShellBackwardWord
Set-PSReadLineKeyHandler -Key Alt+F -Function SelectShellForwardWord
# CaptureScreen is good for blog posts or emails showing a transaction
# of what you did when asking for help or demonstrating a technique.
#ctrl+c and ctrl+d to copy terminal
#ctrl+v to paste
Set-PSReadLineKeyHandler -Chord 'Ctrl+d,Ctrl+c' -Function CaptureScreen
```
If the colors are too light to read, you can set them like below:
```
Set-PSReadLineOption -Colors @{
Command = 'Yellow'
Parameter = 'Green'
String = 'DarkCyan'
}
```
## Zoxide (directory jumper)
This lets you navigate the file system based on your `cd` history; it remembers the directories from your previous commands.
> `Install-Module -Name Z -Force`
Imagine going to a specific location every time, like `cd C:\Users\User\Downloads`.
Once we have visited the directory, it is remembered, so next time you can just type `z` plus part of the directory name, like `z Download`, and it will cd to `C:\Users\User\Downloads`. It already recognizes the path.
## PSFzf(Fuzzy finder)
This is a tool like an interactive search bar on your terminal command line that allows fuzzy matching and filtering of files or directories.
> Install Scoop (if you have Scoop installed, skip this part)
```
irm get.scoop.sh | iex
```
> Install fzf and PSFzf
```
scoop install fzf #install fzf
Install-Module -Name PSFzf -Scope CurrentUser -Force #powershell
```

> Import module into the profile
```
Import-Module PSFzf
# Override PSReadLine's history search
Set-PsFzfOption -PSReadlineChordProvider 'Ctrl+f' -PSReadlineChordReverseHistory 'Ctrl+r'
```
> Below is how to use fzf to search your files or command history:
- `Ctrl+r` (Fzf reverse fuzzy search): fuzzily searches your command history, **similar to the history command**; select an entry to run or reuse it
- `Alt+c` (Fzf set location): fuzzily search for a directory under your home directory and cd to it. Basically this is **quickly jumping to a subdirectory**.
- `Ctrl+f` (Fzf provider select): fuzzily **search for a file or directory under your home directory**; selecting it inserts its path
> **Note**: In PSReadLine, if you use `Set-PSReadLineOption -EditMode Emacs`, then `Alt+c` will probably not work because the hotkey conflicts and PSReadLine takes higher priority. In the picture below you can see `Alt+c` conflicts, so pressing it capitalizes a word in that case.

If I try to remap the conflicting CapitalizeWord function using `Set-PSReadLineKeyHandler -Chord 'Alt+Shift+C' -Function CapitalizeWord`, it still won't change; the binding is probably hard-coded.
So there are two solutions to fix it:
- Method 1: don't use `Emacs` mode; change to `Windows` or `Vi` mode
- Method 2: use an alternative command wrapped in a function
```
function FzfNav { Get-ChildItem . -Recurse -Attributes Directory | Invoke-Fzf | Set-Location }
```
### fzf layout
The layout options let you adjust the height of the finder bar and the position of the results, like below. You can wrap it in a function so that next time you just call the function instead of typing a complicated command.
```
$(fzf --height 40% --reverse)
```

### preview files in fzf
This allows you to preview or read a file without opening it, to check whether it is the file you want. You can also put a text editor in front so that selecting the file opens it for editing.
> install
```
scoop install bat
```
> preview file
```
fzf --preview='cat {}'
```
> Preview and enter the file to modify on a specific text editor
```
function ff{
vim $(fzf --preview 'bat --style=numbers --color=always --line-range :500 {}')
}
```

### search syntax
You can use the following syntax patterns to search your files:
- `'wild`: include wild
- `!^music`: do not start with music
- `!.mp3$`: do not end with .mp3
- `!test`: do not include test
- `^music`: start with music
- `.mp3$`: end with .mp3
## FastFetch
Fastfetch is an alternative to the neofetch tool that shows your system information. You can find more information at the [fastfetch link](https://github.com/fastfetch-cli/fastfetch). This [awesome-fetch](https://github.com/beucismis/awesome-fetch) list provides many related alternative tools.
```
# generate config:
fastfetch --gen-config
# location on Windows: C:\Users\test\.config\fastfetch\config.jsonc
# load a custom config:
fastfetch --load-config /path/to/config_file
fastfetch --load-config .\aa.jsonc
# show only hardware:
fastfetch -c hardware
# print all logos:
fastfetch --print-logos
# use a logo:
fastfetch --logo sparky
# use a certain color:
fastfetch --color blue
```
You can find many more preset examples [here](https://github.com/fastfetch-cli/fastfetch/tree/dev/presets/examples) to see more settings.
# Some Windows Terminal settings
## 1. Automatically copy a string when selecting it
> - Method 1: add it to the Windows Terminal settings
Open the terminal settings with the shortcut `Ctrl+Shift+P`, or `Ctrl+Shift+,` to open the `settings.json` file.
To copy on select, set `"copyOnSelect": true`.
> - Method 2: add helpers to the profile
You can also add copy/paste helper functions to the profile:
```
function cpy { Set-Clipboard $args[0] }
function pst { Get-Clipboard }
```
## 2. Pasting multiple lines pops a warning
If you copy multiple lines and paste them into Windows Terminal, it pops a warning; to disable the warning, set `"multiLinePasteWarning": false`.
This is what I mean by the multi-line paste warning:

## 3. Hiding the terminal startup banner
New PowerShell Core releases (such as 7.4.2) show a startup banner; you can hide this message or logo by adding `-NoLogo` to the command line in Windows Terminal, like below

# Conclusion
In this post I wished to keep the oh-my-posh setup simple; however, there is much more to cover. I have another post on [Medium](https://medium.com/jacklee26/setup-fancy-terminal-using-ohmyposh-9f0ce00948bf) which covers far more detail. I wanted to keep this post short, so I did not go into detail on every PSReadLine function.
| chenchih |
1,890,779 | NaturalHabits - Best Health and Beauty Center | At NaturalHabits, we embark on a journey towards holistic wellness, guiding you through the intricate... | 0 | 2024-06-17T03:33:46 | https://dev.to/shirley78/naturalhabits-best-health-and-beauty-center-1agc | healthandwellness | At NaturalHabits, we embark on a journey towards holistic wellness, guiding you through the intricate tapestry of natural living and empowering you to thrive in harmony with nature. Our platform is more than just a health blog; it’s a sanctuary where knowledge meets inspiration, and where community fosters growth.
[https://naturalhabits.org/](https://naturalhabits.org/)
Our Journey
Our story begins with a deep-rooted passion for embracing the healing powers of nature. From ancient remedies to modern holistic practices, we’ve traversed through diverse landscapes of wellness, gathering wisdom and insights along the way. Our journey is an ongoing exploration, marked by curiosity, discovery, and a relentless pursuit of optimal well-being.
Who We Are
We are a collective of health enthusiasts, wellness experts, and passionate advocates for natural living. From herbalists to nutritionists, each member of our team brings a unique perspective and expertise to the table. Together, we form a vibrant community dedicated to sharing knowledge, fostering growth, and inspiring positive change.
What We Do: Empowering You Across the Wellness Spectrum
At NaturalHabits, our mission is to empower you across every facet of wellness, offering a comprehensive array of resources and insights tailored to meet your holistic health needs. Underpinned by our commitment to accuracy, integrity, and community, we provide guidance and support in the following key areas:
Medical Insights: Navigate the complexities of health with confidence, as we delve into the latest medical research, breakthrough treatments, and holistic approaches to well-being. From chronic conditions to preventative care, our expertly curated content offers invaluable insights to help you make informed decisions about your health.
Nutrition and Diet: Fuel your body with nourishing foods and embrace the transformative power of nutrition. Explore our collection of delicious recipes, evidence-based nutrition guides, and practical tips to cultivate a balanced and sustainable diet that supports your vitality and longevity.
Fitness and Exercise: Ignite your passion for movement and discover the joy of active living. Whether you’re a seasoned athlete or a beginner on the path to fitness, our diverse range of exercise routines, workout plans, and expert advice will inspire you to challenge your limits, build strength, and enhance your overall well-being.
Mental Health and Well-being: Nurture your mind, body, and spirit with our holistic approach to mental health and well-being. Explore mindfulness practices, stress management techniques, and transformative strategies to cultivate resilience, foster inner peace, and navigate life’s challenges with grace and mindfulness.
Lifestyle and Wellness: Embrace a lifestyle that celebrates balance, harmony, and vitality. From sustainable living tips to mindfulness practices, our lifestyle content encompasses a wide range of topics designed to inspire you to live authentically, consciously, and joyfully.
At NaturalHabits, we believe that true wellness encompasses every aspect of your being. Through our diverse array of resources and insights, we invite you to embark on a journey of self-discovery, empowerment, and transformation. Together, let’s unlock the boundless potential of holistic health and embrace a life of vitality, purpose, and well-being.
Our Commitment to You
We understand that navigating the vast landscape of health and wellness can be overwhelming. That’s why we are committed to providing you with accurate, evidence-based information that you can trust. Your well-being is our priority, and we are dedicated to guiding you with integrity, transparency, and compassion every step of the way.
Accurate Information
In a world inundated with misinformation, we stand as a beacon of reliability and trustworthiness. Our content is meticulously researched, drawing from reputable sources and expert insights to ensure that you receive the most accurate and up-to-date information available.
Join the Conversation
Your voice matters to us. NaturalHabits is more than just a platform; it’s a community where ideas are exchanged, questions are answered, and connections are forged. Join the conversation, share your experiences, and become a part of our thriving community of wellness enthusiasts.
At NaturalHabits, we invite you to embark on a journey of self-discovery, empowerment, and transformation. Together, let’s embrace the wisdom of nature and unlock the boundless potential of holistic health. Welcome to a world where well-being knows no bounds. Welcome to NaturalHabits.
[https://forum.enscape3d.com/wcf/index.php?user/76243-what-s-erecprime-reviews/](https://forum.enscape3d.com/wcf/index.php?user/76243-what-s-erecprime-reviews/)
[https://forum.enscape3d.com/wcf/index.php?user/76245-serolean-safe-or-not/](https://forum.enscape3d.com/wcf/index.php?user/76245-serolean-safe-or-not/)
[https://www.quora.com/What-is-a-complimentary-billionaire-brainwave-review-of-2024/answer/Shirley-Earhart-1?prompt_topic_bio=1](https://www.quora.com/What-is-a-complimentary-billionaire-brainwave-review-of-2024/answer/Shirley-Earhart-1?prompt_topic_bio=1) | shirley78 |
1,890,777 | Deviation rate BIAS trading strategy | Summary As the saying goes, This world will seperate after long time united. Also will do... | 0 | 2024-06-17T03:26:08 | https://dev.to/fmzquant/deviation-rate-bias-trading-strategy-knm | strategy, trading, fmzquant, cryptocurrency | ## Summary
As the saying goes, that which is long divided must unite, and that which is long united must divide. This phenomenon also exists in the futures market: there is no instrument that only rises and never falls. But when it rises and when it falls depends on the deviation rate. In this article, we will use the deviation rate to construct a simple trading strategy.
## Brief introduction

Deviation rate (BIAS) is a technical indicator derived from the moving average. Expressed as a percentage, it measures how far the price has deviated from the moving average during fluctuations. If the moving average represents a trader's average cost, then the deviation rate is the trader's average rate of return.
## The principle of deviation rate
The theoretical basis of the deviation rate is trader psychology. When the price is above the market's average cost, long-position holders are in profit and inclined to cash out, which pushes the price down. When the price is below the market's average cost, short sellers are in profit, and their profit-taking pushes the price up.
- When the price deviates upward from the moving average, the deviation rate is too large, and there is a high probability that the price will fall in the future.
- When the price deviates from the moving average downward, the deviation rate is too small, and there is a high probability that the price will rise in the future.
Although the moving average is calculated from the price, in terms of external form, the price will definitely move closer to the moving average, or the price will always fluctuate around the moving average. If the price deviates too far from the moving average, regardless of whether the price is above or below the moving average, it may eventually tend to the moving average, and the deviation rate is the percentage value that the price deviates from the moving average.
## Formula for calculating deviation rate
Deviation rate = [(the closing price of the day - N day average price) / N day average price] * 100%
Here N is the moving-average period; different values of N give different deviation rates. Commonly used values of N are 6, 12, 24, and 36, and in practice the period can be tuned per instrument. Parameter selection matters: if N is too small, the deviation rate becomes overly sensitive; if N is too large, it reacts too slowly. The deviation rate can be positive or negative. The larger the positive deviation rate, the greater the longs' profit and the higher the probability of a price correction; the larger in magnitude the negative deviation rate, the greater the shorts' profit and the higher the probability of a rebound.
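As a quick sanity check, the formula above can be computed directly. The price series below is made up purely for illustration:

```python
def bias(closes, n):
    """N-day deviation rate (BIAS) of the last close, in percent."""
    ma = sum(closes[-n:]) / n              # N-day simple moving average
    return (closes[-1] - ma) / ma * 100    # percentage deviation from the MA

closes = [100, 102, 101, 105, 110, 108]    # hypothetical closing prices
print(round(bias(closes, 6), 2))           # about 3.51: the price sits 3.51% above its 6-day MA
```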
## Strategy logic
Since the deviation rate is just another form of the moving average, we can adapt the classic dual-moving-average strategy into a dual-deviation-rate strategy: the current market state is judged from the position of the short-term deviation rate relative to the long-term one. If the long-term deviation rate is greater than the short-term deviation rate, it effectively means the short-term moving average has crossed above the long-term moving average, and vice versa.
- Long position opening: if there is no current holding position and the long-term deviation rate is greater than the short-term deviation rate
- Short position opening: if there is no current holding position and the long-term deviation rate is less than the short-term deviation rate
- Long position closing: if there is a holding long position, and the long-term deviation rate is less than the short-term deviation rate
- Short position closing: if there is a holding short position, and the long-term deviation rate is greater than the short-term deviation rate
## Strategy writing
Step 1: Write a strategy framework
```
# Strategy main function
def onTick():
    pass

# Program entry
def main():
    while True:       # Enter an infinite loop
        onTick()      # Execute the strategy main function
        Sleep(1000)   # Sleep for 1 second
```
The FMZ platform uses a polling model. First, define a main function and an onTick function. The main function is the strategy's entry point; the program executes code line by line starting from main. Inside main, a while loop repeatedly executes the onTick function, and all the core strategy logic is written in onTick.
Step 2: Define virtual positions
```
mp = 0
```
The advantage of a virtual position is that it is simple to write and fast to iterate, so it is generally used in backtesting, where every order is assumed to be completely filled; the actual position is usually used in live trading. Since the virtual position records the state after opening and closing, it needs to be defined as a global variable.
Step 3: Get K line
```
exchange.SetContractType('rb000')  # Subscribe to the futures variety
bars_arr = exchange.GetRecords()   # Get the K-line array
if len(bars_arr) < long + 1:       # If there are too few K-lines
    return
```
Using FMZ's SetContractType function with "rb000" subscribes to the rebar index contract; in both backtesting and live trading, the rebar index supplies the data while the current main contract is used to place orders. Then the GetRecords function fetches the K-line data of the rebar index. Since calculating the deviation rate requires a certain number of bars, an if statement returns early when there are not yet enough K-lines, to avoid program errors.
Step 4: Calculate the deviation rate
```
close = bars_arr[-2]['Close']      # Closing price of the previous K-line
ma1 = TA.MA(bars_arr, short)[-2]   # Short-term moving average of the previous K-line
bias1 = (close - ma1) / ma1 * 100  # Short-term deviation rate
ma2 = TA.MA(bars_arr, long)[-2]    # Long-term moving average of the previous K-line
bias2 = (close - ma2) / ma2 * 100  # Long-term deviation rate
```
According to the deviation-rate formula, we first obtain the closing price. This strategy uses the previous K-line's closing price, meaning a signal is confirmed on the completed bar and the order is placed on the next bar. We then use FMZ's built-in talib library to compute the moving average: TA.MA takes two parameters, the K-line array and the moving-average period.
Step 5: Placing orders
```
global mp                                    # Global variable
current_price = bars_arr[-1]['Close']        # Latest price
if mp > 0:                                   # If holding a long position
    if bias2 <= bias1:                       # If the long-term deviation rate is less than or equal to the short-term one
        exchange.SetDirection("closebuy")    # Set the trading direction and type
        exchange.Sell(current_price - 1, 1)  # Close the long position
        mp = 0                               # Reset the virtual position
if mp < 0:                                   # If holding a short position
    if bias2 >= bias1:                       # If the long-term deviation rate is greater than or equal to the short-term one
        exchange.SetDirection("closesell")   # Set the trading direction and type
        exchange.Buy(current_price + 1, 1)   # Close the short position
        mp = 0                               # Reset the virtual position
if mp == 0:                                  # If there is no position
    if bias2 > bias1:                        # Long-term deviation rate is greater than the short-term one
        exchange.SetDirection("buy")         # Set the trading direction and type
        exchange.Buy(current_price + 1, 1)   # Open a long position
        mp = 1                               # Reset the virtual position
    if bias2 < bias1:                        # Long-term deviation rate is less than the short-term one
        exchange.SetDirection("sell")        # Set the trading direction and type
        exchange.Sell(current_price - 1, 1)  # Open a short position
        mp = -1                              # Reset the virtual position
```
## Complete strategy
```
# Backtest configuration
'''backtest
start: 2018-01-01 00:00:00
end: 2020-01-01 00:00:00
period: 1h
basePeriod: 1h
exchanges: [{"eid":"Futures_CTP","currency":"FUTURES"}]
'''

# External parameters
short = 10
long = 50

# Global variable
mp = 0

# Strategy main function
def onTick():
    # Retrieve data
    exchange.SetContractType('rb000')            # Subscribe to the futures variety
    bars_arr = exchange.GetRecords()             # Get the K-line array
    if len(bars_arr) < long + 1:                 # If there are too few K-lines
        return
    # Calculate BIAS
    close = bars_arr[-2]['Close']                # Closing price of the previous K-line
    ma1 = TA.MA(bars_arr, short)[-2]             # Short-term moving average of the previous K-line
    bias1 = (close - ma1) / ma1 * 100            # Short-term deviation rate
    ma2 = TA.MA(bars_arr, long)[-2]              # Long-term moving average of the previous K-line
    bias2 = (close - ma2) / ma2 * 100            # Long-term deviation rate
    # Place orders
    global mp                                    # Global variable
    current_price = bars_arr[-1]['Close']        # Latest price
    if mp > 0:                                   # If holding a long position
        if bias2 <= bias1:                       # If the long-term deviation rate is less than or equal to the short-term one
            exchange.SetDirection("closebuy")    # Set the trading direction and type
            exchange.Sell(current_price - 1, 1)  # Close the long position
            mp = 0                               # Reset the virtual position
    if mp < 0:                                   # If holding a short position
        if bias2 >= bias1:                       # If the long-term deviation rate is greater than or equal to the short-term one
            exchange.SetDirection("closesell")   # Set the trading direction and type
            exchange.Buy(current_price + 1, 1)   # Close the short position
            mp = 0                               # Reset the virtual position
    if mp == 0:                                  # If there is no position
        if bias2 > bias1:                        # Long-term deviation rate is greater than the short-term one
            exchange.SetDirection("buy")         # Set the trading direction and type
            exchange.Buy(current_price + 1, 1)   # Open a long position
            mp = 1                               # Reset the virtual position
        if bias2 < bias1:                        # Long-term deviation rate is less than the short-term one
            exchange.SetDirection("sell")        # Set the trading direction and type
            exchange.Sell(current_price - 1, 1)  # Open a short position
            mp = -1                              # Reset the virtual position

# Program entry function
def main():
    while True:       # Loop
        onTick()      # Execute the strategy main function
        Sleep(1000)   # Sleep for 1 second
```
The complete strategy has been published on the FMZ website:
https://www.fmz.com/strategy/215129
## Strategy backtest
Backtest configuration

Performance report


Fund curve

## To Sum Up
The deviation rate is a simple and effective trading tool that gives traders a useful reference. In practice, it can be combined flexibly with the MACD and Bollinger Bands indicators to realize its full value.
From: https://www.fmz.com/digest-topic/5827 | fmzquant |
1,890,747 | Velvetauth new authentication | Simplify and Secure Your Application Authentication with VelvetAuth As a C# developer, you... | 0 | 2024-06-17T03:18:10 | https://dev.to/nesquick/velvetauth-new-authentication-5729 | csharp | ## Simplify and Secure Your Application Authentication with VelvetAuth
As a C# developer, you know the importance of implementing robust authentication mechanisms to protect your applications and ensure that only authorized users gain access. However, creating a secure and user-friendly authentication system from scratch can be a daunting task. This is where VelvetAuth comes in.
### What is VelvetAuth?
VelvetAuth is a comprehensive software authentication solution designed to simplify and secure user authentication for your applications. It supports both token-based systems and general registration, providing a range of features that keep your applications secure and easy to manage.
### Key Features of VelvetAuth
1. **Token-Based Authentication**: VelvetAuth uses token-based authentication, ensuring that only authorized users can access your application.
2. **General Registration**: Offers a straightforward registration process without the need for tokens, making it versatile for different use cases.
3. **User Management**: Easily manage user accounts, including registration, login, and password management.
4. **Security Features**: HWID lock and detailed logging to prevent unauthorized access and key sharing.
5. **Integration**: Seamless integration with your C# applications, allowing you to add authentication capabilities with minimal effort.
6. **Scalability**: Designed to handle applications of all sizes, from small projects to large enterprise solutions.
### Why Choose VelvetAuth?
- **Ease of Use**: VelvetAuth provides a straightforward API and comprehensive documentation, making it easy to integrate into your existing projects.
- **Security**: With features like HWID locking, you can rest assured that your application is protected against unauthorized access.
- **Flexibility**: Whether you're developing a desktop application or a web service, VelvetAuth can be tailored to meet your specific needs.
- **Community Support**: Join a growing community of developers who are using VelvetAuth to secure their applications. Share your experiences, get help, and contribute to the project.
### Join the VelvetAuth Community
We invite you to explore VelvetAuth and see how it can enhance your C# applications. Visit our [website](https://velvetauth.com/) and our [discord](https://dsc.gg/velauth) for more information, and check out our [GitHub repository](https://github.com/velvetauthentication) to get started.
Feel free to reach out with any questions or feedback. We look forward to seeing what you build with VelvetAuth!
---
By using VelvetAuth, you can focus on developing great features for your application, while we take care of the authentication.
Happy coding! | nesquick |
1,890,745 | Decoupling Your Applications with AWS EventBridge: A Deep Dive | Decoupling Your Applications with AWS EventBridge: A Deep Dive In today's dynamic... | 0 | 2024-06-17T03:02:41 | https://dev.to/virajlakshitha/decoupling-your-applications-with-aws-eventbridge-a-deep-dive-2kci | 
# Decoupling Your Applications with AWS EventBridge: A Deep Dive
In today's dynamic technological landscape, building responsive and scalable applications is paramount. Applications need to react to events happening both internally and externally, and that's where **event-driven architecture** comes into play. AWS EventBridge sits at the heart of this architectural pattern within the AWS ecosystem, providing a powerful and flexible service for building loosely coupled, event-driven applications.
### Introduction to AWS EventBridge
AWS EventBridge is a serverless event bus service that enables communication between your applications, integrated SaaS applications, and AWS services through events. Instead of relying on tightly coupled point-to-point integrations, EventBridge provides a central hub where events are published and consumed asynchronously.
**Key Concepts:**
* **Event:** An event represents a change in state or an update. For example, an event could be a new file uploaded to an S3 bucket, a new user registration in your application, or a scheduled cron job trigger.
* **Event Source:** The origin of the event. Event sources can be AWS services, your own applications, or third-party SaaS applications.
* **Event Bus:** A pipeline that receives events from sources and routes them to targets based on rules.
* **Rule:** A configuration that specifies an event pattern (filter) and one or more targets to invoke when an event matching that pattern arrives on the event bus.
* **Target:** The destination where the event is sent for processing. Targets can include AWS Lambda functions, SNS topics, SQS queues, Step Function state machines, and more.
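As a rough mental model, a rule's event pattern selects events by field matching: every field the pattern names must be present in the event with one of the listed values. The sketch below is a simplified toy version in Python (real EventBridge patterns support much richer operators such as prefix, numeric-range, and anything-but matching), and the rule and event shown are made up:

```python
def matches(pattern, event):
    """Return True if `event` satisfies every field named in `pattern`."""
    for key, candidates in pattern.items():
        value = event.get(key) if isinstance(event, dict) else None
        if isinstance(candidates, dict):          # nested pattern (e.g. "detail")
            if not isinstance(value, dict) or not matches(candidates, value):
                return False
        elif value not in candidates:             # leaf: value must be one of the listed candidates
            return False
    return True

rule = {"source": ["com.myapp.orders"], "detail": {"status": ["CREATED"]}}
event = {"source": "com.myapp.orders",
         "detail": {"status": "CREATED", "orderId": "42"}}
print(matches(rule, event))   # True
```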
### Use Cases for EventBridge
Let's explore in-depth how AWS EventBridge facilitates building powerful event-driven architectures by examining five common use cases:
#### 1. Real-time Data Processing and Analytics
**Scenario:** Imagine you're running an e-commerce platform. Every time a new order is placed, you want to capture that event, analyze it in real-time, and update your inventory management system.
**Solution:**
1. **Event Source:** Configure your order processing system to publish an "OrderCreated" event to an EventBridge event bus.
2. **Event Pattern:** Define an EventBridge rule that matches the "OrderCreated" event type and extracts relevant data like product IDs and quantities.
3. **Targets:**
* **Lambda Function:** Trigger a Lambda function to perform real-time analytics on the order data, calculating metrics such as average order value or popular product trends.
* **Kinesis Data Stream:** Stream the order data into a Kinesis Data Stream for further processing and analysis with tools like Amazon Kinesis Data Analytics or Amazon Redshift.
* **Inventory Management System:** Update your inventory system in real-time to reflect the change in stock levels.
**Benefits:** This decoupled approach ensures that your analytics and inventory systems stay up-to-date without creating dependencies on the order processing system.
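To make step 1 of the solution above concrete, here is a hedged sketch of the entry such an order system might publish. The source, bus, and field names are assumptions; in practice the entry would be sent with boto3 via `events_client.put_events(Entries=[entry])`:

```python
import json

# Hypothetical order payload produced by the order-processing system.
order = {"orderId": "1001", "items": [{"productId": "p-7", "qty": 2}]}

# Shape of a single EventBridge entry: Source, DetailType, and a JSON-string Detail.
entry = {
    "Source": "com.myshop.orders",   # assumed custom event source name
    "DetailType": "OrderCreated",
    "Detail": json.dumps(order),     # Detail must be serialized JSON, not a dict
    "EventBusName": "orders-bus",    # assumed custom bus; omit to use the default bus
}
print(entry["DetailType"])
```

EventBridge rules can then filter on `source`, `detail-type`, or any field inside `detail`.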
#### 2. Serverless Workflow Orchestration
**Scenario:** You need to orchestrate a multi-step workflow in response to a file being uploaded to an S3 bucket. The workflow includes validating the file, processing it, and sending notifications.
**Solution:**
1. **Event Source:** Configure an S3 event notification to publish an event to EventBridge whenever a new file is added to the designated bucket.
2. **Rules and Targets:** Define a series of EventBridge rules to trigger different stages of the workflow:
* **Rule 1:** On file upload, trigger a Lambda function to validate the file format and content.
* **Rule 2 (Conditional):** If the file validation is successful, trigger a Step Function state machine to orchestrate the data processing steps. If validation fails, trigger an SNS topic to notify administrators.
* **Rule 3:** Upon completion of the Step Function execution, trigger another Lambda function to send success/failure notifications.
**Benefits:** EventBridge seamlessly integrates with other AWS services to create robust serverless workflows without the need for complex custom code.
#### 3. SaaS Integration
**Scenario:** You want to synchronize data between your application and a third-party SaaS platform, such as Salesforce or Zendesk.
**Solution:**
1. **Event Source:** Leverage SaaS connectors available in EventBridge to receive events from the third-party platform (e.g., new leads created in Salesforce, new tickets created in Zendesk).
2. **Rules and Targets:** Create EventBridge rules to route events from the SaaS platform to appropriate targets in your AWS environment.
* **Example:** Route Salesforce "NewLead" events to a Lambda function that creates corresponding records in your CRM system hosted on AWS.
**Benefits:** EventBridge simplifies integration with SaaS applications, reducing the development effort required to synchronize data and automate workflows.
#### 4. Cross-Region Eventing
**Scenario:** You have a multi-region application and need to propagate events across different AWS regions to ensure data consistency and trigger actions in the appropriate locations.
**Solution:**
1. **Event Bus in the Origin Region:** Configure your application in the source region to publish events to an EventBridge event bus.
2. **Cross-Region Target:** Set up the target EventBridge event bus in the destination region.
3. **EventBridge Rule:** Create a rule on the source event bus that matches the events you want to propagate and specify the target event bus in the destination region.
**Benefits:** EventBridge simplifies building resilient and scalable applications that span multiple AWS regions.
#### 5. Scheduled Event Triggers
**Scenario:** You need to schedule recurring tasks, such as nightly database backups or monthly report generation.
**Solution:**
1. **Event Source:** Use Amazon CloudWatch Events (now integrated with EventBridge) to schedule events at predefined intervals using cron expressions.
2. **Target:** Configure the CloudWatch Event rule to target an EventBridge event bus.
3. **EventBridge Rule:** Set up an EventBridge rule that matches the scheduled event and triggers the desired action, such as a Lambda function for backups or a Step Function for report generation.
**Benefits:** EventBridge provides a centralized and flexible mechanism for scheduling and managing recurring tasks within your cloud infrastructure.
### Alternative Eventing Solutions
While AWS EventBridge shines within the AWS ecosystem, it's worth mentioning other prominent eventing solutions:
* **Apache Kafka:** A powerful open-source distributed streaming platform, well-suited for handling high-volume, real-time data streams.
* **RabbitMQ:** An open-source message broker known for its reliability and focus on message queuing.
* **Google Cloud Pub/Sub:** Google Cloud's fully managed real-time messaging service, offering similar functionality to EventBridge.
* **Azure Event Grid:** Azure's eventing service, enabling event-driven architectures within the Azure cloud.
### Conclusion
AWS EventBridge provides a robust and versatile foundation for building modern, event-driven applications on AWS. Its seamless integration with other AWS services, support for third-party SaaS applications, and ease of use make it an essential tool for developers and architects looking to create decoupled, scalable, and responsive systems. By embracing EventBridge, you unlock the true power of event-driven architectures, enabling your applications to react intelligently to real-time events and automate complex workflows with efficiency.
## Advanced Use Case: Building a Real-time Threat Detection System
Now, let's delve into a more advanced use case that highlights how EventBridge, combined with other AWS services, can power sophisticated solutions:
**Scenario:** As a software architect responsible for security, you need to build a real-time threat detection system that analyzes logs and security events from multiple sources to identify and respond to potential threats immediately.
**Solution:**
1. **Data Ingestion:**
* **AWS CloudTrail:** Enable CloudTrail to log API activity across your AWS account, providing valuable insights into user actions and resource changes.
* **AWS VPC Flow Logs:** Enable VPC Flow Logs to capture information about network traffic within your Virtual Private Cloud.
* **Security Information and Event Management (SIEM) Tool:** Integrate a third-party SIEM tool to collect security logs and events from your applications and infrastructure.
2. **Centralized Event Processing:**
* **Amazon Kinesis Data Firehose:** Configure Kinesis Data Firehose delivery streams to continuously stream logs and events from CloudTrail, VPC Flow Logs, and your SIEM tool to an Amazon S3 bucket for persistent storage.
* **AWS Lambda:** Utilize Lambda functions triggered by S3 event notifications to perform real-time processing and normalization of the ingested log data.
3. **Event Correlation and Threat Detection:**
* **Amazon EventBridge:**
* **Custom Event Bus:** Create a dedicated custom EventBridge event bus for your security-related events.
* **Event Pattern Matching:** Define EventBridge rules with sophisticated event patterns to identify suspicious activities or patterns across the normalized log data. For example, you could create rules to detect multiple failed login attempts from the same IP address within a short period.
* **AWS Lambda:** Trigger Lambda functions from EventBridge rules to perform further analysis, enrich event data with threat intelligence feeds, and calculate risk scores.
4. **Automated Response and Remediation:**
* **AWS Security Hub:** Integrate with Security Hub to centralize security findings and enable automated responses based on pre-configured security standards.
* **AWS Lambda:** Trigger Lambda functions from EventBridge rules or Security Hub findings to execute automated remediation actions, such as:
* Disabling compromised user accounts.
* Isolating suspicious instances by modifying security groups.
* Generating alerts and notifications to security teams via email, SMS, or incident management systems.
5. **Monitoring and Analysis:**
* **Amazon CloudWatch:** Monitor the performance and health of your threat detection system, track event throughput, and configure alarms for potential issues.
* **Amazon Athena:** Utilize Athena to query and analyze the raw log data stored in Amazon S3 to gain deeper insights into security trends and identify areas for improvement.
**Benefits:**
* **Real-time Threat Detection:** By correlating events from multiple sources, this solution enables you to detect and respond to threats in real-time, minimizing potential damage and downtime.
* **Centralized Security Monitoring:** Integrating various security tools and services with EventBridge provides a unified view of your security posture.
* **Automated Response:** Automating remediation steps helps contain threats quickly and effectively.
* **Scalability and Flexibility:** The serverless nature of this solution ensures it scales automatically with your needs and allows you to easily adapt to evolving security threats.
By combining the power of EventBridge with other AWS services like Kinesis, Lambda, Security Hub, and CloudWatch, you can build a robust and comprehensive real-time threat detection system that strengthens your security posture and protects your valuable assets.
| virajlakshitha | |
1,890,731 | Top CSS Websites for Developers: You Should Know! | CSS (Cascading Style Sheets) is a cornerstone of modern web development, enabling developers to... | 0 | 2024-06-17T02:31:23 | https://dev.to/vyan/top-css-websites-for-developers-elevate-your-frontend-skills-550l | webdev, react, beginners, css | CSS (Cascading Style Sheets) is a cornerstone of modern web development, enabling developers to create visually appealing and responsive websites. With the evolution of frontend technologies, there are numerous resources available online to enhance your CSS skills. In this blog, we'll explore some of the top CSS websites that can help you learn, experiment, and get inspired.
## 1. **Uiverse.io**
[Uiverse.io](https://uiverse.io/) is a fantastic platform for developers looking to improve their CSS skills through interactive components. Uiverse offers a wide range of user interface elements like buttons, loaders, and card designs, each accompanied by the necessary HTML and CSS code. The site encourages you to experiment with and customize these components, making it an excellent resource for both learning and inspiration.
### Key Features:
- **Interactive Components**: Browse and customize various UI elements.
- **Code Snippets**: Access ready-to-use HTML and CSS code.
- **Community Contributions**: Explore designs shared by other developers.
## 2. **CSS-Tricks**
[CSS-Tricks](https://css-tricks.com/) is one of the most well-known resources for everything related to CSS. Founded by Chris Coyier, CSS-Tricks provides a wealth of articles, tutorials, and guides on various CSS topics. Whether you're a beginner or an advanced developer, you'll find valuable insights and techniques to enhance your CSS skills.
### Key Features:
- **Comprehensive Tutorials**: Step-by-step guides on CSS concepts.
- **Almanac**: Detailed reference for CSS properties and selectors.
- **Community Forum**: Engage with other developers and get your questions answered.
## 3. **Devdevout**
[Devdevout](https://devdevout.com/) is a web design and development blog that offers high-quality tutorials and articles on CSS, JavaScript, and HTML. The site is known for its creative and innovative design ideas, providing a great source of inspiration and learning for frontend developers.
### Key Features:
- **Playground**: Explore a collection of creative coding experiments.
- **Blueprints**: Access practical, reusable code snippets for UI components.
- **In-depth Tutorials**: Learn advanced CSS techniques through detailed articles.
## 4. **CodePen**
[CodePen](https://codepen.io/) is an online code editor and community for front-end developers. It allows you to write HTML, CSS, and JavaScript code directly in your browser and see the results instantly. CodePen is a fantastic platform for experimenting with CSS, sharing your work, and discovering what other developers are creating.
### Key Features:
- **Live Code Editor**: Write and test code in real-time.
- **Pens**: Explore millions of user-submitted code snippets and projects.
- **Challenges**: Participate in coding challenges to improve your skills.
## 5. **Devsnap**

[Devsnap](https://devsnap.me/) is a handy resource for frontend developers, curating large collections of free CSS code examples and snippets in one place.

### Key Features:

- **Feature Search**: Easily find examples and information for specific CSS features.
## 6. **MDN Web Docs**
[MDN Web Docs](https://developer.mozilla.org/en-US/) (by Mozilla) is a comprehensive resource for web developers, offering detailed documentation and tutorials on web standards, including CSS. MDN is known for its accuracy and depth, making it a go-to reference for developers of all levels.
### Key Features:
- **CSS Documentation**: Thorough reference for all CSS properties and selectors.
- **Learning Guides**: Structured tutorials for learning CSS from scratch.
- **Examples and Demos**: Practical examples to illustrate CSS concepts.
## 7. **Material Tailwind**
[Material Tailwind](https://www.material-tailwind.com/) is a free, open-source component library that combines Material Design with the utility-first power of Tailwind CSS.

### Key Features:

- **Ready-made Components**: Buttons, cards, navbars, and more, all styled with Tailwind CSS.
- **Framework Support**: Components available for both React and plain HTML projects.
## 8. **Headless UI**

[Headless UI](https://headlessui.com/) provides completely unstyled, fully accessible UI components designed to pair with Tailwind CSS in React and Vue projects.
## 9. **unDraw**

[unDraw](https://undraw.co/illustrations) is an open-source illustration library, providing beautiful SVG illustrations you can use for free.
## 10. **Shape Divider**

[Shape Divider](https://www.shapedivider.app/) helps you easily add curved shapes to headings, paragraphs, or the boundaries between sections, a pattern that appears frequently in modern designs.
## Conclusion
Whether you're just starting out with CSS or looking to refine your skills, these websites provide invaluable resources for learning, experimenting, and drawing inspiration. From comprehensive tutorials and interactive components to browser compatibility tools and innovative design showcases, these platforms cover all aspects of CSS development. Dive in, explore, and elevate your frontend skills to the next level! | vyan |
1,890,741 | GraphQL vs. REST: A Dev's Guide to Picking Your API Poison (and Why You Should Argue With Me) | If you're building anything web-related these days, chances are you've got a nagging question at the... | 0 | 2024-06-17T02:53:00 | https://dev.to/kareem-khaled/graphql-vs-rest-a-devs-guide-to-picking-your-api-poison-and-why-you-should-argue-with-me-55cd | api, restapi, graphql | If you're building anything web-related these days, chances are you've got a nagging question at the back of your mind: REST or GraphQL? It certainly feels like the old, trusty workhorse pitted against the shiny new toy in this battle. Which one will be right for your project?
Let's hash this out.
## REST: The Good Old Days
It's no wonder REST has been around so long: it's simple and forgiving, and it's easy to understand and start working with. REST follows the client-server model and uses the conventional HTTP methods to perform CRUD operations: GET, POST, PUT, and DELETE.
The beauty of REST lies in the simplicity and wide adoption. It is almost like a trusty toolbox that everyone knows how to use – a huge amount of documentation, libraries, and tools are available, and pretty much any developer you hire will at least have experience with it.
## …But REST Isn't Perfect
Probably the largest disadvantage of REST is that you often end up either over- or under-fetching data: either you get back more information than you need, wasting bandwidth and processing cycles, or you have to make a bunch of requests to gather all the data you want. It's a real pain, especially for bandwidth-hungry mobile apps.
Another annoyance is the rigidity of REST endpoints. You need to define them up front, and when your front-end requirements change, you have the potential pain of redesigning your API.
## GraphQL: The Shiny New Toy
GraphQL was born out of Facebook's frustration with the limitations of REST. It's a query language that lets you request exactly what you need, nothing more and nothing less. You define a schema that describes your data, and then clients send queries to retrieve exactly the information they require.
The really cool thing about GraphQL is that there's only one endpoint for all of your requests. That can significantly reduce the number of round trips to the server, making your application snappier. And the strongly typed schema lets you catch errors early and makes your API easier to maintain.
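For illustration, a single GraphQL query can pull a user and only the fields the client needs in one round trip. The schema, field names, and arguments here are hypothetical, not from any real API:

```graphql
query {
  user(id: "42") {
    name
    posts(last: 3) {
      title
    }
  }
}
```

An equivalent REST flow might need `/users/42` plus `/users/42/posts`, with each response carrying more fields than the client actually uses.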
### Hold Your Horses, GraphQL Ain't All Sunshine and Rainbows
While GraphQL is undoubtedly very powerful, it comes with its set of challenges. First of all, there is a **bigger learning curve** since you need to understand the concept of schemas, queries, and mutations, which might become overwhelming at the very start.
Another problem is **caching**. Since GraphQL queries are capable of being so dynamic, having efficient caching strategies in place is trickier compared to REST.
And then there's the performance factor. **Unless you're careful**, complex GraphQL queries can really tax your server.
## The Smackdown: When to Choose Which
So, when should you go with GraphQL and when should you stick with REST? Here's my opinion, but don't worry - I love a good argument about this:
## Use GraphQL if:
- Your app has complicated data requirements.
- You're developing a mobile app and you want to reduce the number of network requests.
- You're using microservices architecture.
- Your frontend requirements are changing all the time.
## Stick with REST if:
- You have a small API with simple data requirements.
- You are building a public API that must be easily consumed by a broad base of external developers.
- Caching is of the highest priority for your app.
## The Hybrid Option
Of course, there's no rule saying you must pick one or the other. In many cases, a hybrid approach that combines GraphQL and REST might be the perfect solution.
## Now it's your turn!
On a final note, now that I've shared my two cents, I'd love to hear what you think. Have you used GraphQL or REST in your projects? How did it go? Did you hit any unexpected challenges? Maybe you have a completely different take on this whole debate.
Drop a line in the comments below, or hit me up on ([kareem_khaled@t-horizons.com](mailto:kareem_khaled@t-horizons.com))
Let's start this chat! | kareem-khaled |
1,890,739 | Distributed Systems Patterns | Notes via ByteByteGo 1. Ambassador Pattern Pros: Simplifies communication between... | 0 | 2024-06-17T02:45:53 | https://dev.to/inamdarminaz/distributed-systems-patterns-3om6 | - Notes via [ByteByteGo](https://www.youtube.com/watch?v=nH4qjmP2KEE)
## **1. Ambassador Pattern**
_Pros:_
- Simplifies communication between services.
- Handles load balancing, traffic routing, and retries transparently.
- Promotes decoupling of services.
_Cons:_
- Adds an additional layer, which can introduce latency.
- Requires configuration and management overhead.
_Applications:_
- Kubernetes uses Envoy as an Ambassador.
## **2. Circuit Breaker Pattern**
_Pros:_
- Prevents cascading failures and improves system resilience.
- Enhances fault tolerance by isolating failing components.
- Provides fallback mechanisms to maintain system stability.
_Cons:_
- Introduces complexity to manage circuit states.
- May impact performance during high load or transient failures.
_Applications:_
- Netflix's Hystrix library.
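As a concrete illustration of the idea, here is a minimal circuit-breaker sketch in Python. It is illustrative only, not Hystrix itself, and the `max_failures`/`reset_timeout` knobs are my own names:

```python
import time

# After `max_failures` consecutive errors the breaker "opens" and rejects
# calls immediately; once `reset_timeout` seconds pass it allows one
# trial call ("half-open"), and a success closes the breaker again.
class CircuitBreaker:
    def __init__(self, max_failures=3, reset_timeout=30.0):
        self.max_failures = max_failures
        self.reset_timeout = reset_timeout
        self.failures = 0
        self.opened_at = None

    def call(self, fn, *args):
        if self.opened_at is not None:
            if time.time() - self.opened_at < self.reset_timeout:
                raise RuntimeError("circuit open")
            self.opened_at = None  # half-open: allow one trial call
        try:
            result = fn(*args)
        except Exception:
            self.failures += 1
            if self.failures >= self.max_failures:
                self.opened_at = time.time()
            raise
        self.failures = 0  # success closes the breaker
        return result
```

A fallback mechanism would catch the `RuntimeError` and return a cached or default response instead of hitting the failing service.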
## **3. CQRS (Command Query Responsibility Segregation) Pattern**
_Pros:_
- Optimizes read and write operations independently.
- Improves scalability and performance for read-heavy workloads.
- Facilitates complex business logic on the write side.
_Cons:_
- Increases architectural complexity.
- Requires careful synchronization between command and query models.
## **4. Event Sourcing Pattern**
_Pros:_
- Provides a complete audit trail of system state changes.
- Enables temporal queries and historical analysis.
- Supports scalability and resilience through immutable event logs.
_Cons:_
- Increased storage requirements due to storing all events.
- Requires efficient replay mechanisms for state rebuilds.
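A minimal Python sketch of the idea, rebuilding state by folding over an append-only event log (the `Account` example and event shape are my own invention):

```python
# Append-only event log; current state is derived by replaying events,
# never by mutating a stored balance in place.
class Account:
    def __init__(self):
        self.events = []  # the immutable history

    def record(self, kind, amount):
        self.events.append((kind, amount))

    def balance(self):
        # Fold over the log to compute the current state.
        total = 0
        for kind, amount in self.events:
            total += amount if kind == "deposit" else -amount
        return total
```

Because the log keeps every change, you get the audit trail and temporal queries mentioned above for free, at the cost of storage and replay time.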
## **5. Leader Election Pattern**
_Pros:_
- Establishes a single point of coordination in distributed systems.
- Ensures high availability by quickly electing a new leader.
- Facilitates scalability and fault tolerance.
_Cons:_
- Adds overhead due to election algorithms and heartbeat mechanisms.
- May introduce latency during leader changes.
_Applications:_
- Apache ZooKeeper.
## **6. Publisher-Subscriber Pattern**
_Pros:_
- Supports asynchronous and real-time messaging.
- Decouples publishers from subscribers, improving scalability.
- Facilitates event-driven architectures.
_Cons:_
- Requires robust message delivery mechanisms to ensure reliability.
- May introduce complexity in managing message ordering and processing.
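A toy in-process version of the pattern can be sketched in a few lines of Python. Real deployments use a broker such as Kafka or Redis for delivery guarantees and persistence; the class and method names here are my own:

```python
from collections import defaultdict

# In-process topic bus: publishers and subscribers only share a topic
# name, never a direct reference to each other.
class PubSub:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, message):
        # Deliver the message to every handler registered for this topic.
        for handler in self._subscribers[topic]:
            handler(message)
```

The decoupling is visible in the signature: `publish` only names a topic, so subscribers can be added or removed without touching the publisher.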
## **7. Sharding Pattern**
_Pros:_
- Improves scalability by distributing data across multiple nodes.
- Enhances performance for read and write operations by reducing contention.
- Allows horizontal scaling by adding more shards.
_Cons:_
- Requires careful shard key selection and management.
- Increases complexity in data distribution and query routing.
- Introduces additional overhead for data rebalancing and maintenance.
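Hash-based shard routing, the simplest shard-key scheme, can be sketched like this (illustrative; production systems often use consistent hashing to ease the rebalancing cost noted above):

```python
import hashlib

# Map a shard key to one of `num_shards` nodes via a stable hash.
# (Python's built-in hash() is randomized per process, so use hashlib.)
def shard_for(key, num_shards):
    digest = hashlib.md5(key.encode("utf-8")).hexdigest()
    return int(digest, 16) % num_shards
```

Note that changing `num_shards` remaps almost every key under this scheme, which is exactly why consistent hashing is popular for rebalancing.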
_Applications:_
- Cassandra and MongoDB | inamdarminaz | |
1,890,736 | How to Customize GitHub Profile: Part 1 | As a developer, especially for those who are looking for a tech role like myself, it's important to... | 0 | 2024-06-17T02:40:03 | https://dev.to/ryoichihomma/how-to-customize-github-profile-like-a-pro-16aa | github, githubprofile, githubportfolio, git | As a developer, especially for those who are looking for a tech role like myself, it's important to customize your GitHub profile because your GitHub profile is like your digital resume and makes you stand out from a bunch of other candidates. In this article, I'll walk you through the introduction section of my own GitHub profile customization.
[Part 2](https://dev.to/ryoichihomma/how-to-customize-your-github-profile-part-2-32g2) | [Part 3](https://dev.to/ryoichihomma/how-to-customize-your-github-profile-part-3-37em) | [Part 4](https://dev.to/ryoichihomma/how-to-customize-github-profile-part-4-29h) | [Part 5](https://dev.to/ryoichihomma/how-to-customize-github-profile-part-5-23po)

## Create a New Repository Named Same as Your Username
To customize your GitHub profile, you need a repository whose name is the same as your GitHub username. Don't forget to make it public and add a README.md file when you create it.
## Add Your Bio with Typing SVG
One of the best ways to catch someone's attention is with a dynamic introduction. I used a Typing SVG to animate the text in my header. This small touch can make a huge difference in making your profile more engaging.
Here's a guide on how to create yours by my favorite developer, [Jonah Lawrence](https://github.com/DenverCoder1/readme-typing-svg).

## Add View Counter Button
After the header, I briefly describe myself and include a profile view counter button to showcase the number of visitors to my profile. This adds a layer of social proof and can intrigue visitors.
### How to create yours using [Pro Visit Counter](https://visitcount.itsvg.in/)
 **Step1)** Once you click the "Create Now" button, you can start customizing your counter like this.  **Step2)** You can also choose a color theme and select count style like this.  **Step3)** Once you click the "Generate" button, both Markdown and HTML codes will be automatically generated. Copy either code and paste it into your README.md file.

Here's another tool by [Anton Komarev](https://github.com/antonkomarev/github-profile-views-counter).
## Personal Touches and Contact Information
Sharing personal details and contact information makes it easy for others to connect with you. I included a list of key points about myself, my learning goals, and where people can find me online.
### Wrapping Up
This is just the beginning of creating an eye-catching GitHub profile. Stay tuned for the next part of this series, where I will dive into how to showcase your social media links and media and skill sections effectively.
In the meantime, feel free to ask any questions or share your GitHub profiles in the comments below. Let's connect and grow together🌱
Happy coding!💻
#### References
[Guide of Readme Typing SVG by Jonah Lawrence](https://github.com/DenverCoder1/readme-typing-svg)
[Readme Typing SVG](https://readme-typing-svg.demolab.com/demo/)
[Pro Visit Counter](https://visitcount.itsvg.in/)
[GitHub Profile Views Counter by Anton Komarev](https://github.com/antonkomarev/github-profile-views-counter)
##### Other Parts
[Part 2](https://dev.to/ryoichihomma/how-to-customize-your-github-profile-part-2-32g2) | [Part 3](https://dev.to/ryoichihomma/how-to-customize-your-github-profile-part-3-37em) | [Part 4](https://dev.to/ryoichihomma/how-to-customize-github-profile-part-4-29h) | [Part 5](https://dev.to/ryoichihomma/how-to-customize-github-profile-part-5-23po) | ryoichihomma |
1,890,733 | SSL context creation crashes of c++ native module in Electron application | I am building a C++ native module to be used in an Electron application. The native module is... | 0 | 2024-06-17T02:33:16 | https://dev.to/codert0109/ssl-context-creation-crashes-of-c-native-module-in-electron-application-307a | question, cpp, websocketpp, ssl | I am building a C++ native module to be used in an Electron application. The native module is responsible for communicating with a WebSocket server. I am using the WebSocketPP library and the following sample code:
index.cc
```
#include <websocketpp/config/asio_client.hpp> // TLS
#include <websocketpp/client.hpp>

typedef websocketpp::client<websocketpp::config::asio_tls_client> client;
typedef websocketpp::config::asio_client::message_type::ptr message_ptr;
typedef websocketpp::lib::shared_ptr<boost::asio::ssl::context> context_ptr;

using websocketpp::lib::bind;
using websocketpp::lib::placeholders::_1;
using websocketpp::lib::placeholders::_2;

//
...
//

class WebSocketHandler
{
public:
    void set(const std::string &url, const std::string &token)
    {
        ws_url = url;
        authorizationHeader = "Bearer " + token;

        // Initialize ASIO
        _webSocket.init_asio();

        // Set logging to be pretty verbose (everything except message payloads)
        _webSocket.set_access_channels(websocketpp::log::alevel::all);
        _webSocket.clear_access_channels(websocketpp::log::alevel::frame_payload);

        // Set open handler
        _webSocket.set_open_handler(bind(&WebSocketHandler::on_open, this, std::placeholders::_1));
        // Set close handler
        _webSocket.set_close_handler(bind(&WebSocketHandler::on_close, this, std::placeholders::_1));
        // Set fail handler
        _webSocket.set_fail_handler(bind(&WebSocketHandler::on_fail, this, std::placeholders::_1));
        // Set message handler
        _webSocket.set_message_handler(bind(&WebSocketHandler::on_message, this, std::placeholders::_1, std::placeholders::_2));
        // Set TLS handler
        _webSocket.set_tls_init_handler(bind(&WebSocketHandler::on_tls_init, this, std::placeholders::_1));
    }

    void start()
    {
        websocketpp::lib::error_code ec;
        client::connection_ptr con = _webSocket.get_connection(ws_url, ec);
        if (ec)
        {
            std::cout << "Could not create connection because: " << ec.message() << std::endl;
            return;
        }

        // Set the authorization header
        con->replace_header("Authorization", authorizationHeader);

        // Connect to server
        _webSocket.connect(con);

        // Start the ASIO io_service run loop
        _thread.reset(new websocketpp::lib::thread(&client::run, &_webSocket));
    }

    void stop()
    {
        _webSocket.stop();
        if (_thread && _thread->joinable())
        {
            _thread->join();
        }
    }

private:
    context_ptr on_tls_init(websocketpp::connection_hdl hdl)
    {
        context_ptr ctx = websocketpp::lib::make_shared<boost::asio::ssl::context>(boost::asio::ssl::context::sslv23); // crash at this line
        try {
            // Simplified SSL options for testing
            ctx->set_options(boost::asio::ssl::context::default_workarounds |
                             boost::asio::ssl::context::no_sslv2 |
                             boost::asio::ssl::context::no_sslv3 |
                             boost::asio::ssl::context::single_dh_use);
            std::cout << "SSL options set successfully" << std::endl;
        } catch (std::exception &e) {
            std::cout << "Exception during set_options: " << e.what() << std::endl;
        }
        return ctx;
    }

    void on_open(websocketpp::connection_hdl hdl)
    {
        std::cout << "connection opened" << std::endl;
    }

    void on_close(websocketpp::connection_hdl hdl)
    {
        std::cout << "connection closed" << std::endl;
    }

    void on_fail(websocketpp::connection_hdl hdl)
    {
        std::cout << "connection failed" << std::endl;
    }

    void on_message(websocketpp::connection_hdl hdl, client::message_ptr msg)
    {
        std::cout << "message arrived" << std::endl;
    }

    client _webSocket;
    websocketpp::lib::shared_ptr<websocketpp::lib::thread> _thread;
    std::string ws_url;
    std::string authorizationHeader;
};

//
...
//

WebSocketHandler handler;
handler.set("wss://echo.websocket.org/", "Token_xxxx");
handler.start();
....
handler.stop();
```
binding.gyp
```
{
  "targets": [
    {
      "target_name": "binding",
      "include_dirs": [
        "<!@(node -p \"require('node-addon-api').include\")",
        "<(module_root_dir)/include"
      ],
      "conditions": [
        ['OS=="win"', {
          "sources": [
            "./src/index.cc"
          ],
          "configurations": {
            "Debug": {
              "msvs_settings": {
                "VCCLCompilerTool": {
                  "RuntimeLibrary": "0",
                  "ExceptionHandling": "1"
                },
              },
            },
            "Release": {
              "msvs_settings": {
                "VCCLCompilerTool": {
                  "RuntimeLibrary": "0",
                  "ExceptionHandling": "1"
                },
              },
            },
          },
          "libraries": [
            "-lws2_32",
            "-lShlwapi"
          ]
        }]
      ]
    }
  ],
}
```
Test Script
```
const engine = require("../bin/binding.node");

const test = async () => {
  try {
    engine.startConnection();
  } catch (err) {
    console.log("Error occurred", err);
  }
};

test();
```
Problem
The module works correctly in a JavaScript test script but crashes in Electron at this line:
context_ptr ctx = websocketpp::lib::make_shared<boost::asio::ssl::context>(boost::asio::ssl::context::sslv23);
I suspect the issue might be related to the way the SSL libraries are linked. I feel that linking the SSL libraries statically might resolve the issue, but I am unsure how to achieve this. I tested several other Boost-based libraries, but the result was the same: it keeps crashing at SSL context creation, and only in the Electron application.
Environment
C++14/17
Electron v23 (version upgrade doesn't help)
WebSocketPP 0.8.2
Node 16.14.2/18.x.x
Dependencies installed using vcpkg: OpenSSL, WebSocketPP, Boost
Question
How can I link SSL libraries statically in my project to potentially fix this issue? Are there any other possible solutions or insights regarding this problem?
Thank you for your assistance! | codert0109 |
1,890,732 | The Basics of BABEL | Have you started a project for creating a website and got everything working? Your APIs, Requests,... | 0 | 2024-06-17T02:31:26 | https://dev.to/gagecantrelle/the-basics-of-babel-2d8f | Have you started a project for creating a website and got everything working? Your APIs, Requests, databases, and other codes not throwing an error. Great, now the only thing you have to do is bundle your code. You got webpack installed but what about the transcompiler, which
one is used, and is it good? Well, I’ve got one we can use and it is free to use, Babel. It’s not hard to set up and will only take under a minute to put in.
**Fun Facts**
Babel is a free and open-source transcompiler for JavaScript. It was first released to the world on September 28, 2014. Babel can also use plugins and presets to help with certain syntax, mostly code that uses things like React, CSS, SVG, and others. To use Babel, you need to have Node installed with npm.
**Let's start coding**
First, let's start off by installing our dependencies into the package.json file. If you don't have one, just run npm init.
```
npm install @babel/core
npm install @babel/cli
npm install @babel/preset-env
```
Now Let's create a file and add some code to it.
```
let func = function (data) {
  return data + 255
}

// example code
let hopTest = false
```
Now let's set up our Babel config file (`.babelrc`) to tell Babel what to do and use. Create a `presets` key holding an array, and give it one of the dependencies we installed, `@babel/preset-env`. This preset tells Babel what environments it will be targeting. For example, the code could be running in the Firefox, Safari, or Chrome browser.
```
{
  "presets": [
    "@babel/preset-env"
  ]
}
```
Next, to tell it which environments it will target, we need to change the value of the `presets` key a bit. First, put "@babel/preset-env" inside an array. Then, in the same array, add an object with a `targets` key. This key holds which environments Babel should support, and which versions.
```
{
  "presets": [
    ["@babel/preset-env", {
      "targets": {
        "firefox": "17",
        "chrome": "67",
        "safari": "11.1"
      }
    }]
  ]
}
```
Finally, because we aren't using a bundler to run Babel for us, we have to run a command in the terminal. This command targets a specific input file and writes the transpiled output to a specific file or folder. If the output doesn't exist, the command will create it. After running the command, open the output file and look at the result.
```
./node_modules/@babel/cli/bin/babel.js main.js --out-file main.dist.js

# use this command to check the current version
./node_modules/@babel/cli/bin/babel.js --version
```
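To get a feel for what transcompiled code looks like, here is a rough before/after sketch. The "compiled" version is illustrative of the kind of output @babel/preset-env emits for older targets, not Babel's exact output:

```javascript
// ES2015+ source
const add = (a, b) => a + b;

// roughly what @babel/preset-env emits for older browsers
// (illustrative, not Babel's exact output)
var addCompiled = function addCompiled(a, b) {
  return a + b;
};

console.log(add(2, 3), addCompiled(2, 3)); // prints: 5 5
```

Both versions behave identically; the compiled one just avoids syntax (arrow functions, `const`) that old browsers can't parse.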
Let's say that you are working on a React project and you're also using Babel. As I mentioned, Babel can't handle certain syntax on its own. There are plugins and presets you can install to help; let's take a look at an example.
```
{
  "presets": [
    "@babel/preset-env",
    "@babel/preset-react" // for React
  ],
  "plugins": [
    "@babel/plugin-syntax-dynamic-import"
  ]
}
```
Now that you have seen Babel and how it works, why not try it out in your next project? Babel has plugins and presets for whatever type of project you're doing. If you're interested in webpack (which can use Babel) or React, I have written blogs on these topics before, so when you have some time, why not take a look?
https://dev.to/gagecantrelle/the-basics-of-react-57a1
https://dev.to/gagecantrelle/the-basics-of-webpack-2d71
Also, if you are interested in what transcompiled code looks like, head to this site, which will transcompile anything you give it: https://babeljs.io/
Links:
https://en.wikipedia.org/wiki/Babel_(transcompiler)
https://www.youtube.com/watch?v=o9hmjdmJLMU
https://medium.com/age-of-awareness/setup-react-with-webpack-and-babel-5114a14a47e9#bb4c
| gagecantrelle | |
339,777 | Layouts in Vue CLI | This can easily be done with Slots and Component :is. <template lang="pug"> #App component(... | 0 | 2020-05-20T10:04:36 | https://dev.to/patarapolw/layouts-in-vue-cli-3dfn | vue, javascript, webdev | This can easily be done with Slots and Component `:is`.
```vue
<template lang="pug">
#App
  component(v-if="layout" :is="layout")
    router-view
  router-view(v-else)
</template>

<script lang="ts">
import { Vue, Component, Watch } from 'vue-property-decorator'

@Component
export default class App extends Vue {
  get layout () {
    const layout = this.$route.meta.layout
    return layout ? `${layout}-layout` : null
  }
}
</script>
```
And it falls back to a blank layout.
In `router/index.ts`,
```ts
import Vue from 'vue'
import VueRouter from 'vue-router'

Vue.use(VueRouter)

const registeredLayouts = [
  'App'
]

registeredLayouts.map((layout) => {
  Vue.component(`${layout}-layout`, () => import(/* webpackChunkName: "[request]-layout" */ `../layouts/${layout}.vue`))
})

const router = new VueRouter({
  mode: 'hash',
  routes: [
    {
      path: '/',
      component: () => import(/* webpackChunkName: "[request]" */ '../views/Home.vue')
    },
    {
      path: '/example',
      component: () => import(/* webpackChunkName: "[request]" */ '../views/Example.vue'),
      meta: {
        layout: 'App'
      }
    }
  ]
})

export default router
```
And in `layouts/App.vue`:
```vue
<template lang="pug">
#app
  NavBar
  slot
</template>
```
It is also possible to protect some layouts.
```vue
<template lang="pug">
#App
  b-loading(active v-if="isLoading")
  component(v-else-if="layout" :is="layout")
    router-view
  router-view(v-else)
</template>

<script lang="ts">
import { Vue, Component, Watch } from 'vue-property-decorator'

@Component
export default class App extends Vue {
  isLoading = true

  get user () {
    return this.$store.state.user
  }

  get layout () {
    const layout = this.$route.meta.layout
    return layout ? `${layout}-layout` : null
  }

  created () {
    this.onUserChange()
  }

  @Watch('user')
  onUserChange () {
    if (!this.user) {
      setTimeout(() => {
        this.isLoading = false
      }, 3000)
    } else {
      this.isLoading = false
    }
    this.onLoadingChange()
  }

  @Watch('isLoading')
  onLoadingChange () {
    if (!this.isLoading) {
      if (!this.user) {
        this.$router.push('/')
      } else if (this.$route.path === '/') {
        this.$router.push('/lesson')
      }
    }
  }
}
</script>
``` | patarapolw |
1,890,730 | Evaluation of backtest capital curve using "pyfolio" tool | Foreword A few days ago, it was found that the profit and loss curve output of the FMZ... | 0 | 2024-06-17T02:27:46 | https://dev.to/fmzquant/evaluation-of-backtest-capital-curve-using-pyfolio-tool-5efk | backtest, trading, cryptocurrency, fmzquant | ## Foreword
A few days ago, I noticed that the profit and loss curve in FMZ's strategy backtest results is fairly simple, so I thought about obtaining the raw returns data and processing it myself to get a more detailed evaluation report on the equity curve, displayed graphically. When I started to sketch out the idea, I found it was not so easy, so I wondered whether anyone with the same idea had already built the corresponding tools. I searched the Internet and found that such tools do exist. After looking at several projects on GitHub, I finally chose pyfolio.
## What is pyfolio?
pyfolio is a Python library for financial portfolio performance and risk analysis, developed by Quantopian. It works well with the Zipline open-source backtesting library. Quantopian also provides comprehensive services for professionals, including Zipline, Alphalens, pyfolio, FactSet data, etc.
The core of pyfolio is the so-called "tear sheet", which is composed of a variety of independent charts that together provide a comprehensive picture of a trading algorithm's performance.
```
GitHub address: https://github.com/quantopian/pyfolio
```
## Learn to use pyfolio
Since there are few online learning materials for this tool, it took me quite a while to get comfortable with it.
PyFolio API reference:
```
https://www.quantopian.com/docs/api-reference/pyfolio-api-reference#pyfolio-api-reference
```
This page gives a fairly detailed introduction to pyfolio's API. Quantopian's platform can be used for backtesting US stocks, and its backtest results can be displayed directly through pyfolio. I have only learned it roughly; its other functions also seem quite powerful.
## Install pyfolio
The installation of pyfolio is relatively simple, just follow the instructions on GitHub.
## FMZ backtest results displayed by pyfolio
Well, that's it for the introduction; let's get to the topic. First, get the backtest equity curve data on the FMZ platform.

In the floating profit and loss chart of the backtest result, click the button next to the full-screen button (shown in the figure above), and then select "Download CSV".
The format of the obtained CSV data is as follows (the file name can be changed according to your needs):

If you want a benchmark to compare the analysis results against, you also need to prepare daily K-line data for the traded instrument. If there is no K-line data, the returns data alone can still be analyzed, but analyzing against benchmark data yields several additional indicators, such as Alpha, Beta, etc. The following content assumes benchmark K-line data is available.
We can obtain K-line data directly from the platform through the FMZ research environment:
```
# Use the API provided by the FMZ research environment to obtain K-line data which equal to the revenue data
dfh = get_bars('bitfinex.btc_usd', '1d', start=str(startd), end=str(endd))
```
After the data is prepared, we can start coding. We need to process the acquired data so that it conforms to the data structure pyfolio requires, and then call pyfolio's `create_returns_tear_sheet` interface to compute and output the results. We mainly need to pass in three parameters: `returns`, `benchmark_rets=None`, and `live_start_date=None`.
The `returns` parameter is the required returns data; `benchmark_rets` is the benchmark data and is optional; `live_start_date` is also optional.
The meaning of `live_start_date` is: from what date onwards do your returns come from live trading? For example, for the returns above, suppose we started live trading after 2019-12-01 and everything before that came from a simulated market or a backtest; then we can set `live_start_date = '2019-12-01'`.
By setting this parameter, we can in theory analyze whether our strategy has been overfitted. If the difference between in-sample and out-of-sample performance is large, there is a high probability of overfitting.
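Before calling the tear sheet, the cumulative-return column from the CSV has to be turned into period-by-period simple returns. Here is a pure-Python sketch of that conversion; the column layout is assumed from the CSV screenshot above, and pyfolio itself additionally wants the result as a timezone-aware pandas Series:

```python
# Convert a list of cumulative returns (as in the CSV's second column)
# into per-period simple returns: equity_t / equity_{t-1} - 1.
def cumulative_to_periodic(cum_returns):
    out = []
    prev_equity = 1.0
    for c in cum_returns:
        equity = 1.0 + c
        out.append(equity / prev_equity - 1.0)
        prev_equity = equity
    return out
```

For example, cumulative returns `[0.0, 0.10, 0.21]` become periodic returns of roughly `[0.0, 0.10, 0.10]`.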
We can implement this analysis function in the FMZ research environment, or we can implement it locally. The following takes the implementation in the FMZ research environment as an example:
```
https://www.fmz.com/upload/asset/1379deaa35b22ee37de23.ipynb?name=%E5%88%A9%E7%94%A8pyfolio%E5%B7%A5%E5%85%B7%E8%AF%84%E4%BB%B7%E5%9B%9E%E6%B5%8B%E8%B5%84%E9%87%91%E6%9B%B2%E7%BA%BF(%E5%8E%9F%E5%88%9B).ipynb
```
```
# First, create a new "csv to py code.py" python file locally, copy the following code into it, and run it locally.
# It reads the equity-curve CSV downloaded from FMZ and generates a "chart_hex.py" file containing the hex-encoded data.
#!/usr/bin/python
# -*- coding: UTF-8 -*-
import binascii

# The file name can be customized as needed; this example uses the default file name
filename = 'chart.csv'
with open(filename, 'rb') as f:
    content = f.read()

# csv to py
wFile = open(filename.split('.')[0] + '_hex.py', "w")
wFile.write("hexstr = bytearray.fromhex('" +
            bytes.decode(binascii.hexlify(content))
            + "').decode()\nwFile = open('" + filename + "', 'w')\nwFile.write(hexstr)\nwFile.close()")
wFile.close()
```
# Open the "chart_hex.py" file generated above, copy all the contents and replace the following code blocks, and then run the following code blocks one by one to get the chart.csv file
hexstr = bytearray.fromhex('efbbbf224461746554696d65222c22e6b5aee58aa8e79b88e4ba8f222c22e4ba8be4bbb6220a22323031392d31302d33312030303a30303a3030222c300a22323031392d31312d30312030303a30303a3030222c300a22323031392d31312d30322030303a30303a3030222c2d302e3032383434353837303635373338383930350a22323031392d31312d30332030303a30303a3030222c302e3030373431393439393432333839363936390a22323031392d31312d30342030303a30303a3030222c2d302e30323234373732373731373434313231370a22323031392d31312d30352030303a30303a3030222c2d302e30323033393930383333363836353735390a22323031392d31312d30362030303a30303a3030222c2d302e3034393935353039333230393332303435360a22323031392d31312d30372030303a30303a3030222c2d302e303434333232333634383035363033370a22323031392d31312d30382030303a30303a3030222c2d302e3032353631313934393330353935313637360a22323031392d31312d30392030303a30303a3030222c302e3032363331303433393432313739303536360a22323031392d31312d31302030303a30303a3030222c302e3033303232303332383333303436333137350a22323031392d31312d31312030303a30303a3030222c302e3033313230373133363936363633313133330a22323031392d31312d31322030303a30303a3030222c2d302e3031383533323831363136363038333135350a22323031392d31312d31332030303a30303a3030222c2d302e30313736393032353136363738333732320a22323031392d31312d31342030303a30303a3030222c2d302e3032323339313034373338373637393338360a22323031392d31312d31352030303a30303a3030222c2d302e3030383433363137313736363631333438370a22323031392d31312d31362030303a30303a3030222c302e3031373430363536343033313836383133330a22323031392d31312d31372030303a30303a3030222c302e303232393131353234343739303732330a22323031392d31312d31382030303a30303a3030222c302e3033323032363631303538383035373131340a22323031392d31312d31392030303a30303a3030222c302e303138393230323836383338373438380a22323031392d31312d32302030303a30303a3030222c302e30363632363938393337393232363738390a22323031392d31312d32312030303a30303a3030222c302e3036303835343430303337353130313033370a22323031392d31312d32322030303a30303a3030222c302e31343432363035363831333031303231
330a22323031392d31312d32332030303a30303a3030222c302e32343239343037303935353332323336370a22323031392d31312d32342030303a30303a3030222c302e32313133303432303033353237373934310a22323031392d31312d32352030303a30303a3030222c302e323735363433303736313138343937380a22323031392d31312d32362030303a30303a3030222c302e323532343832323739343237363235360a22323031392d31312d32372030303a30303a3030222c302e32343931313136313839303039383437370a22323031392d31312d32382030303a30303a3030222c302e31313038373135373939323036393134310a22323031392d31312d32392030303a30303a3030222c302e313633343530313533373233393139390a22323031392d31312d33302030303a30303a3030222c302e31393838303132323332343735393737350a22323031392d31322d30312030303a30303a3030222c302e31363633373536393939313635393038350a22323031392d31322d30322030303a30303a3030222c302e32303638323732383333323337393630370a22323031392d31322d30332030303a30303a3030222c302e32303434323831303032303830393033320a22323031392d31322d30342030303a30303a3030222c302e323030353636323836353230383830360a22323031392d31322d30352030303a30303a3030222c302e31323434363439343330303739303635360a22323031392d31322d30362030303a30303a3030222c302e31303032343339383239393236303637332c302e31303032343339383239393236303637330a22323031392d31322d30372030303a30303a3030222c302e31303637313232383937343130373831360a22323031392d31322d30382030303a30303a3030222c302e31323839363336313133333032313036310a22323031392d31322d30392030303a30303a3030222c302e313337393030323234303239323136320a22323031392d31322d31302030303a30303a3030222c302e31313432333735383637323436303130350a22323031392d31322d31312030303a30303a3030222c302e31323638353037323134353130343038320a22323031392d31322d31322030303a30303a3030222c302e31343139333631313738343432333234330a22323031392d31322d31332030303a30303a3030222c302e31333838333632383537383138383536370a22323031392d31322d31342030303a30303a3030222c302e313136323031343031393435393734350a22323031392d31322d31352030303a30303a3030222c302e31363135333931303631363930313932330a22323031392d31322d31362030303a30303a
3030222c302e31343937383138343836363238323231380a22323031392d31322d31372030303a30303a3030222c302e31353734393833333435363438393438320a22323031392d31322d31382030303a30303a3030222c302e32343234393031303233333139323635380a22323031392d31322d31392030303a30303a3030222c302e32313830363838353631363039303035350a22323031392d31322d32302030303a30303a3030222c302e323938383636303034333936303139340a22323031392d31322d32312030303a30303a3030222c302e33303135333036303934383834370a22323031392d31322d32322030303a30303a3030222c302e323938363835393334383634363038370a22323031392d31322d32332030303a30303a3030222c302e333039333035323733383735393130310a22323031392d31322d32342030303a30303a3030222c302e333834363231343935353136383931320a22323031392d31322d32352030303a30303a3030222c302e33343532373534363233383138313130360a22323031392d31322d32362030303a30303a3030222c302e33363235323332383833363737313035330a22323031392d31322d32372030303a30303a3030222c302e33343937363331393933333834333133360a22323031392d31322d32382030303a30303a3030222c302e33303732393733373234353434373938360a22323031392d31322d32392030303a30303a3030222c302e33323238383132323432363135363530370a22323031392d31322d33302030303a30303a3030222c302e33343134363537343239333438363535330a22323031392d31322d33312030303a30303a3030222c302e333435323733393139363237303738320a22323032302d30312d30312030303a30303a3030222c302e33353730313633323035353433343337340a22323032302d30312d30322030303a30303a3030222c302e33343937353937393034363236373934370a22323032302d30312d30332030303a30303a3030222c302e33373032333633333138303534353335370a22323032302d30312d30342030303a30303a3030222c302e33383636373137373837343037313635370a22323032302d30312d30352030303a30303a3030222c302e33383834373536373836393031343634330a22323032302d30312d30362030303a30303a3030222c302e34313331323236353139383433373731340a22323032302d30312d30372030303a30303a3030222c302e34323335323332383237303436333733350a22323032302d30312d30382030303a30303a3030222c302e34363837333531323838353035333330330a22323032302d30312d30392030303a30303a
3030222c302e353436373135313832363033383332380a22323032302d30312d31302030303a30303a3030222c302e353530373037323136333937383830310a22323032302d30312d31312030303a30303a3030222c302e35353531373436393236393938310a22323032302d30312d31322030303a30303a3030222c302e353632323130363337343737323731330a22323032302d30312d31332030303a30303a3030222c302e353734373831373030393536383631370a22323032302d30312d31342030303a30303a3030222c302e353632383330303731353536353831350a22323032302d30312d31352030303a30303a3030222c302e363538323839383038313031393136380a22323032302d30312d31362030303a30303a3030222c302e363732323034393830303331333936370a22323032302d30312d31372030303a30303a3030222c302e363537313832383237323238323335380a22323032302d30312d31382030303a30303a3030222c302e363734393831383838383639373536330a22323032302d30312d31392030303a30303a3030222c302e363739373632303637393239383131330a22323032302d30312d32302030303a30303a3030222c302e363334313332373332393636313231370a22323032302d30312d32312030303a30303a3030222c302e363237353837313436323430323734370a22323032302d30312d32322030303a30303a3030222c302e363331313336373230353334393834370a22323032302d30312d32332030303a30303a3030222c302e3630313936323331393931343334360a22323032302d30312d32342030303a30303a3030222c302e363036343239313935383633313431360a22323032302d30312d32352030303a30303a3030222c302e35383130363933393531373337390a22323032302d30312d32362030303a30303a3030222c302e363133313034353130383436353937380a22323032302d30312d32372030303a30303a3030222c302e3632393938323638373737383035350a22323032302d30312d32382030303a30303a3030222c302e363831333134363734333130313533350a22323032302d30312d32392030303a30303a3030222c302e373134303533393533383834313233350a22323032302d30312d33302030303a30303a3030222c302e373433383032353331363031313135360a22323032302d30312d33312030303a30303a3030222c302e373535393639303935383539313330370a22323032302d30322d30312030303a30303a3030222c302e373533383030313630323737353438310a22323032302d30322d30322030303a30303a3030222c302e37353434343433343732373234313235
0a22323032302d30322d30332030303a30303a3030222c302e373435373138393532343434373738330a22323032302d30322d30342030303a30303a3030222c302e3738373636303035313130343530340a22323032302d30322d30352030303a30303a3030222c302e373935393939343930353732393834360a22323032302d30322d30362030303a30303a3030222c302e373935323037323039363636373034390a22323032302d30322d30372030303a30303a3030222c302e3832393234363232343838363336350a22323032302d30322d30382030303a30303a3030222c302e383239393034373635353939363035350a22323032302d30322d30392030303a30303a3030222c302e383338363639323137313033313436350a22323032302d30322d31302030303a30303a3030222c302e38353830313634373631380a22323032302d30322d31312030303a30303a3030222c302e383130323530393437393936313938330a22323032302d30322d31322030303a30303a3030222c302e383433323631313436333636313030320a22323032302d30322d31332030303a30303a3030222c302e383535383536353834363731333632320a22323032302d30322d31342030303a30303a3030222c302e383337323730363631383738303935360a22323032302d30322d31352030303a30303a3030222c302e383333353332343038383538303234330a22323032302d30322d31362030303a30303a3030222c302e383636383832343034353334343633320a22323032302d30322d31372030303a30303a3030222c302e383836363634323232323038333831310a22323032302d30322d31382030303a30303a3030222c302e393032363430303937303731373033390a22323032302d30322d31392030303a30303a3030222c302e383832373838333631373939333438380a22323032302d30322d32302030303a30303a3030222c302e383530303035363732363738333734320a22323032302d30322d32312030303a30303a3030222c302e3737383436363530373530313739360a22323032302d30322d32322030303a30303a3030222c302e373737383734393835393335313437350a22323032302d30322d32332030303a30303a3030222c302e373731333834393530303532383132330a22323032302d30322d32342030303a30303a3030222c302e373937383030363936353434323134340a22323032302d30322d32352030303a30303a3030222c302e373736383231373934313333363939370a22323032302d30322d32362030303a30303a3030222c302e373938353333313136353336313831310a22323032302d30322d32372030303a30303a3030222c30
2e383530343335363139343238353239390a22323032302d30322d32382030303a30303a3030222c302e383734333333393138383334393638310a22323032302d30322d32392030303a30303a3030222c302e3838383336363333393338343837380a22323032302d30332d30312030303a30303a3030222c302e383933393737393637343631333438380a22323032302d30332d30322030303a30303a3030222c302e3931323431323035313530303336362c302e3931323431323035313530303336360a22323032302d30332d30332030303a30303a3030222c302e383733353632323939353238363532330a22323032302d30332d30342030303a30303a3030222c302e383532353336353235333030343039310a22323032302d30332d30352030303a30303a3030222c302e383633323633313830363733313335350a22323032302d30332d30362030303a30303a3030222c302e383734303237343632353730373730350a22323032302d30332d30372030303a30303a3030222c302e383634323439323631363431353135360a22323032302d30332d30382030303a30303a3030222c302e38373630353132313331363135333031').decode()
wFile = open('chart.csv', 'w')
wFile.write(hexstr)
wFile.close()
!ls -la
!cat chart.csv
```
```
# Install pyfolio library in research environment
!pip3 install --user pyfolio
```
```
import pandas as pd
import sys
sys.path.append('/home/quant/.local/lib/python3.6/site-packages')
import pyfolio as pf
import matplotlib.pyplot as plt
%matplotlib inline
import warnings
warnings.filterwarnings('ignore')
from fmz import * # import all FMZ functions
# Read fund curve data, FMZ platform download, cumulative income data
df=pd.read_csv(filepath_or_buffer='chart.csv')
# Convert to date format
df['Date'] = pd.to_datetime(df['DateTime'],format='%Y-%m-%d %H:%M:%S')
# Get start and end time
startd = df.at[0,'Date']
endd = df.at[df.shape[0]-1,'Date']
# Read the target asset daily K-line data, and use it as the benchmark income data
# Use the API provided by the FMZ research environment to obtain K-line data equal to the revenue data
dfh = get_bars('bitfinex.btc_usd', '1d', start=str(startd), end=str(endd))
dfh=dfh[['close']]
# Calculate the daily rise and fall based on the closing price of k-line data
dfh['close_shift'] = dfh['close'].shift(1)
dfh = dfh.fillna(method='bfill')  # backward fill: fill each NaN with the nearest non-null value below it
dfh['changeval']=dfh['close']-dfh['close_shift']
dfh['change']=dfh['changeval']/dfh['close_shift']
# Frequency changes keep 6 decimal places
dfh = dfh.round({'change': 6})
# Revenue data processing, the FMZ platform obtains the cumulative revenue, and converts it to the daily revenue change rate
df['return_shift'] = df['Floating Profit and Loss'].shift(1)
df['daily'] = df['Floating Profit and Loss'] - df['return_shift']
initial_assets = 3  # Initial asset value in the FMZ backtest
df['returns'] = df['daily'] / (df['return_shift'] + initial_assets)
df = df[['Date','Floating Profit and Loss','return_shift','daily','returns']]
df = df.fillna(value=0.0)
df = df.round({'daily': 3})  # retain three decimal places
df = df.round({'returns': 6})
# Convert pd.DataFrame to pd.Series required for pyfolio earnings
df['Date'] = pd.to_datetime(df['Date'])
df=df[['Date','returns']]
df.set_index('Date', inplace=True)
# Processed revenue data
returns = df['returns'].tz_localize('UTC')
# Convert pd.DataFrame to pd.Series required for pyfolio benchmark returns
dfh=dfh[['change']]
dfh = pd.Series(dfh['change'].values, index=dfh.index)
# Processed benchmark data
benchmark_rets = dfh
# The point in time when real-time trading begins after the strategy's backtest period.
live_start_date = '2020-02-01'
# Call pyfolio's API to calculate and output the fund curve analysis result graph
# "returns" Parameters are required, the remaining parameters can not be entered
pf.create_returns_tear_sheet(returns,benchmark_rets=benchmark_rets,live_start_date=live_start_date)
```
The output analysis result:


## Interpretation of results
The output contains a large amount of data, so take the time to learn what these indicators mean. Let me introduce a few of them; once we understand the relevant indicators, we can interpret the state of our trading strategy.
- Annual return
The annualized rate of return is obtained by converting the current rate of return (daily, weekly, monthly, etc.) into a yearly equivalent. It is a theoretical rate of return, not one that has actually been achieved, and must be distinguished from the annual rate of return, which is the actual return over one year of running the strategy.
- Cumulative returns
The easiest concept to understand: the rate of change of total assets from the beginning to the end of the strategy.
- Annual Volatility
The annualized volatility rate is used to measure the volatility risk of the investment target.
- Sharpe ratio
Describes the excess return the strategy earns per unit of total risk.
- Max Drawdown
Describes the largest loss of the strategy; the smaller the maximum drawdown, the better.
- Omega ratio
Another risk-reward performance indicator. Its biggest advantage over the Sharpe ratio is that, by construction, it considers all statistical moments, while the Sharpe ratio only considers the first two.
- Sortino ratio
Describes the excess return the strategy earns per unit of downside risk.
- Daily Value-at-Risk
Another very popular risk indicator. Here it means that if the position (portfolio) is held for one more day, the loss will not exceed 1.8% in 95% of cases.
Reference: https://towardsdatascience.com/the-easiest-way-to-evaluate-the-performance-of-trading-strategies-in-python-4959fd798bb3
- Tail ratio
Take the 95th and 5th percentiles of the daily return distribution and divide their absolute values. Essentially, it measures how many times larger the typical gain is than the typical loss.
- Stability
In essence, it measures how well time explains the cumulative net value: the r-squared of a linear regression of the cumulative return curve against time.
Reference: https://blog.csdn.net/qtlyx/article/details/88724236
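To make these definitions concrete, here is a small, self-contained Python sketch (standard library only, no pyfolio) of how several of the indicators above can be computed from a list of daily returns. The 252-trading-day year is an illustrative assumption (crypto markets often use 365), and pyfolio's own implementations may differ in detail:

```python
import statistics

TRADING_DAYS = 252  # assumed periods per year

def annualized_return(returns):
    """Geometric annual return implied by a series of per-period returns."""
    growth = 1.0
    for r in returns:
        growth *= 1.0 + r
    return growth ** (TRADING_DAYS / len(returns)) - 1.0

def cumulative_return(returns):
    """Total rate of change of assets from start to end."""
    growth = 1.0
    for r in returns:
        growth *= 1.0 + r
    return growth - 1.0

def sharpe_ratio(returns, risk_free=0.0):
    """Annualized mean excess return per unit of total volatility."""
    excess = [r - risk_free for r in returns]
    return statistics.mean(excess) / statistics.stdev(excess) * TRADING_DAYS ** 0.5

def sortino_ratio(returns, target=0.0):
    """Like Sharpe, but penalizes only downside (sub-target) volatility."""
    n = len(returns)
    mean_excess = sum(r - target for r in returns) / n
    downside_dev = (sum(min(0.0, r - target) ** 2 for r in returns) / n) ** 0.5
    return mean_excess / downside_dev * TRADING_DAYS ** 0.5

def omega_ratio(returns, threshold=0.0):
    """Sum of gains above the threshold over sum of losses below it."""
    gains = sum(r - threshold for r in returns if r > threshold)
    losses = sum(threshold - r for r in returns if r < threshold)
    return gains / losses if losses else float("inf")

def max_drawdown(equity_curve):
    """Largest peak-to-trough decline, as a (negative) fraction of the peak."""
    peak, worst = equity_curve[0], 0.0
    for value in equity_curve:
        peak = max(peak, value)
        worst = min(worst, value / peak - 1.0)
    return worst

def daily_var(returns, confidence=0.95):
    """Historical daily Value-at-Risk: loss not exceeded at the confidence level."""
    ordered = sorted(returns)
    return -ordered[int((1.0 - confidence) * len(ordered))]

def tail_ratio(returns):
    """|95th percentile| / |5th percentile| of the daily return distribution."""
    ordered = sorted(returns)
    n = len(ordered)
    return abs(ordered[min(n - 1, int(0.95 * n))]) / abs(ordered[int(0.05 * n)])

def stability(cumulative):
    """R-squared of a linear fit of the cumulative curve against time."""
    n = len(cumulative)
    mean_x, mean_y = (n - 1) / 2.0, sum(cumulative) / n
    sxx = sum((x - mean_x) ** 2 for x in range(n))
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(cumulative))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in enumerate(cumulative))
    ss_tot = sum((y - mean_y) ** 2 for y in cumulative)
    return 1.0 - ss_res / ss_tot
```

For example, a perfectly linear equity curve has a stability of 1.0, and an equity curve [100, 120, 90, 110] has a maximum drawdown of -25% (the fall from 120 to 90).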
## Small suggestions
We hope FMZ can add richer equity-curve evaluation features and store historical backtest results, so that backtest results can be displayed more conveniently and professionally, helping you create better strategies.
From: https://www.fmz.com/digest-topic/5798 | fmzquant |
1,890,729 | Lonton Wealth Forum - Redefining Global Wealth Strategies | Lonton Wealth Forum: A New Frontier in Global Wealth Management The Lonton Wealth Forum stands as a... | 0 | 2024-06-17T02:26:43 | https://dev.to/lontonwealthltd/lonton-wealth-forum-redefining-global-wealth-strategies-ppp | Lonton Wealth Forum: A New Frontier in Global Wealth Management
The Lonton Wealth Forum stands as a premier annual event that convenes leaders, seasoned experts, and innovators from the global wealth management sector. Established in 2015, the forum has swiftly emerged as a pivotal platform for discussing the latest trends, strategies, and technological advancements in wealth management. The forum aims to drive innovation and progress within the industry, facilitating significant interactions among key stakeholders from around the world.
The inception of Lonton wealth forum was driven by the recognition of a critical gap in high-quality platforms for wealth management discourse. Launched in London by a consortium of leading financial institutions, wealth management firms, and consulting companies, Lonton wealth forum was designed to offer an unparalleled environment for dialogue and knowledge exchange. Its evolution over the years has seen a growing influence, with participation from bankers, investors, fund managers, and fintech pioneers from various regions.
At the core of Lonton wealth forum’s mission is the objective to promote sustainable growth in the global wealth management field through high-level discussions and innovative thought exchanges. The forum aspires to become an indispensable platform for wealth management leaders and professionals, guiding the industry toward greater efficiency, transparency, and technological integration.

Each edition of Lonton wealth forum features an array of high-profile speakers and panel discussions, addressing crucial topics such as global economic trends, wealth management strategies, asset allocation, and risk management. The forum regularly attracts CEOs and CIOs of renowned financial institutions, professors from top academic institutions, and founders of cutting-edge technology firms. Their insights and experiences offer valuable perspectives on the complexities and future direction of wealth management.
A significant aspect of Lonton wealth forum is its emphasis on financial technology innovation. Dedicated exhibition areas at the forum allow tech companies to showcase their latest advancements, including AI-driven investment advisors, blockchain applications in wealth management, and big data analytics platforms. These demonstrations provide attendees with new tools and solutions to maintain a competitive edge in the rapidly evolving market.
Lonton wealth forum’s global perspective facilitates a comprehensive analysis of economic developments, policy changes, and market dynamics across different regions. This worldwide outlook helps participants understand the diverse and intricate nature of wealth management, enabling more informed decision-making processes.
Networking opportunities at Lonton wealth forum are another key highlight. The forum provides an exceptional setting for attendees to connect with industry peers, exchange views, and explore potential collaborations during the conference and its social events. Such face-to-face interactions foster long-term professional relationships and drive cooperative efforts and innovation in practice.
Since its establishment, Lonton wealth forum has made a significant impact on the global wealth management landscape. The discussions and outcomes of the forum have become a critical reference for wealth management practices, promoting innovation in areas like digital transformation, risk control, and client service. Looking ahead, Lonton wealth forum plans to expand its influence further by enhancing its geographic reach and diversifying its agenda. It aims to remain at the forefront of wealth management forums worldwide, addressing a broader array of topics to cater to the evolving needs of the industry.
In summary, the Lonton Wealth Forum has become a beacon of excellence and influence in the global wealth management sector. Through its unwavering focus on industry developments, promotion of technological innovation, and provision of a high-caliber exchange platform, Lonton wealth forum is spearheading the new frontier in wealth management. For professionals in the field, participating in Lonton wealth forum represents a vital opportunity to gain insights into the latest trends, establish connections with like-minded individuals, and explore new business opportunities. | lontonwealthltd | |
1,890,728 | Azure Resource Naming Conventions! Best Practices for Optimal Management | Introduction In the vast landscape of Azure cloud services, effective resource management... | 0 | 2024-06-17T02:23:30 | https://dev.to/karthiksdevopsengineer/azure-resource-naming-conventions-best-practices-for-optimal-management-9d0 | azure, microsoft, cloudcomputing, productivity | ## Introduction
In the vast landscape of Azure cloud services, effective resource management is paramount. One often-overlooked aspect that significantly contributes to efficient management is a well-defined naming convention. In this guide, we’ll delve into crafting and implementing an effective Azure resource naming convention, using a structured approach that enhances clarity, organization, and scalability.
## Understanding the Components
In our proposed Azure resource naming convention, each component serves a distinct purpose
- **Resource Type:** Indicates the specific service or resource type (e.g., VM, SQL, Storage).
- **Application or Project name:** Identifies the associated application or project.
- **Environment:** Specifies the environment of the Application or Project (e.g., development, production).
- **Region:** Denotes the Azure region where the resource is deployed.
- **Unique identifier:** Provides a unique reference for the particular resource.
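As a sketch, the five components above can be assembled mechanically. The helper below is hypothetical (it is not part of any Azure SDK); it simply joins lowercase components with hyphens and zero-pads the identifier, following the article's examples:

```python
def resource_name(resource_type, project, environment, region, identifier):
    """Compose an Azure resource name from the five convention components."""
    parts = [resource_type, project, environment, region, f"{identifier:03d}"]
    return "-".join(str(p).lower() for p in parts)

# e.g. a production SharePoint resource group in West US
print(resource_name("rg", "sharepoint", "prod", "westus", 1))  # rg-sharepoint-prod-westus-001
```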
## Benefits of Consistent Naming Conventions
Implementing a standardized naming convention offers numerous advantages
- **Clarity and Readability:** Clear, descriptive names enhance understanding and facilitate communication.
- **Ease of Resource Identification:** Consistent naming simplifies locating and managing resources across environments.
- **Simplified Management and Organization:** Well-named resources streamline administrative tasks and improve overall resource governance.
## Best Practices for Implementing Naming Conventions
To ensure the effectiveness of your naming conventions, adhere to these best practices
- **Consistency:** Maintain uniformity across all resources within your Azure environment.
- **Descriptive Naming:** Use meaningful names that accurately reflect the purpose and function of each resource.
- **Metadata Incorporation:** Include relevant metadata, such as environment, region, or department, to provide additional context.
- **Documentation:** Document the naming conventions comprehensively for team reference and future scalability.
## Example Use Cases

Let’s explore practical scenarios demonstrating the application of our naming convention
- **Production Environments:** rg-sharepoint-prod-westus-001
- **Development Environments:** rg-sharepoint-dev-eastus-002
- **Various Azure Regions:** rg-sharepoint-prod-centralus-003
- **Different Resource Types:** rg-sql-db-dev-westus-004
## Conclusion
Crafting an effective Azure resource naming convention is not merely a matter of syntax; it’s a strategic approach to streamline resource management, enhance collaboration, and ensure scalability. By adopting the structured approach outlined in this guide, organizations can establish a robust foundation for efficient Azure resource governance. Embrace consistency, clarity, and documentation to master Azure resource naming conventions and unlock the full potential of your cloud environment.
| karthiksdevopsengineer |
1,890,727 | Lonton Wealth Forum: Introduction to Bonds | Lonton Wealth Forum: Introduction to Bonds Bonds are a fundamental component of the financial markets... | 0 | 2024-06-17T02:22:12 | https://dev.to/lontonwealthforum/lonton-wealth-forum-introduction-to-bonds-4kjh | lontonwealthforum | Lonton Wealth Forum: Introduction to Bonds
Bonds are a fundamental component of the financial markets and play a crucial role in both personal and institutional investment portfolios. This guide will provide an overview of bonds, explaining their types, how they work, and why they are important for investors.
What are Bonds?
Bonds are fixed-income securities that represent a loan made by an investor to a borrower, typically a corporation or government. When you purchase a bond, you are essentially lending money to the issuer in exchange for periodic interest payments and the return of the bond’s face value when it matures.

Key Features of Bonds
Face Value
The face value (or par value) of a bond is the amount that the issuer agrees to repay the bondholder at the maturity date. Most bonds are issued with a face value of $1,000.
Coupon Rate
The coupon rate is the annual interest rate paid on the bond's face value. This interest is typically paid semi-annually, although some bonds may pay annually or at other intervals.
Maturity Date
The maturity date is when the bond's principal amount (face value) is repaid to the bondholder. Bonds can have short-term maturities (less than 5 years), intermediate-term maturities (5-10 years), or long-term maturities (more than 10 years).
Yield
The yield of a bond is the rate of return it generates. It can be calculated in different ways, but the most common is the current yield, which is the annual coupon payment divided by the current market price of the bond.
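As a quick sketch of the current-yield formula just described (annual coupon divided by current market price):

```python
def current_yield(face_value, coupon_rate, market_price):
    """Current yield: annual coupon payment / current market price."""
    return face_value * coupon_rate / market_price

# A $1,000 bond with a 5% coupon trading at $950 yields about 5.26%
print(round(current_yield(1000, 0.05, 950), 4))  # 0.0526
```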
Types of Bonds
Government Bonds
Government bonds are issued by national governments and are considered low-risk because they are backed by the government's credit. In the U.S., these include Treasury bonds, notes, and bills, which differ mainly in their maturities.
Municipal Bonds
Municipal bonds are issued by state and local governments to fund public projects such as schools, highways, and hospitals. They often offer tax-exempt interest income, making them attractive to investors in higher tax brackets.
Corporate Bonds
Corporate bonds are issued by companies to raise capital for various purposes, such as expanding operations or refinancing debt. These bonds typically offer higher yields than government bonds but come with higher risk.
High-Yield Bonds
High-yield bonds, also known as junk bonds, are issued by companies with lower credit ratings. They offer higher interest rates to compensate for the increased risk of default.
Convertible Bonds
Convertible bonds can be converted into a predetermined number of the issuing company's shares. This feature offers the potential for capital appreciation if the company's stock price rises.
How Bonds Work
When you buy a bond, you are entitled to regular interest payments (coupon payments) until the bond matures. At maturity, the issuer repays the face value of the bond. The price of a bond can fluctuate based on interest rates, credit ratings, and other market factors. For example, if interest rates rise, existing bonds with lower coupon rates become less attractive, and their market prices may fall.
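The inverse relationship between market rates and bond prices described above can be sketched by discounting the bond's cash flows. This is a simplified model assuming annual coupons and a flat market rate:

```python
def bond_price(face_value, coupon_rate, market_rate, years):
    """Price a bond as the present value of its coupons plus its face value."""
    coupon = face_value * coupon_rate
    pv_coupons = sum(coupon / (1 + market_rate) ** t for t in range(1, years + 1))
    pv_face = face_value / (1 + market_rate) ** years
    return pv_coupons + pv_face

# When the market rate equals the coupon rate, the bond trades at par (~$1,000);
# when rates rise to 6%, the same bond's price falls below par.
print(round(bond_price(1000, 0.05, 0.05, 10), 2))
print(round(bond_price(1000, 0.05, 0.06, 10), 2))
```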
Benefits of Investing in Bonds
Income Generation
Bonds provide regular interest payments, making them a reliable source of income for investors, particularly retirees.
Diversification
Including bonds in an investment portfolio can help diversify risk. Bonds often have a low correlation with stocks, meaning they may perform differently under the same market conditions.
Capital Preservation
Bonds, especially government and high-quality corporate bonds, are considered safer investments compared to stocks. They help preserve capital and reduce the overall volatility of a portfolio.
Tax Advantages
Certain bonds, like municipal bonds, offer tax-exempt interest income, which can be beneficial for investors in high tax brackets.
Risks Associated with Bonds
Interest Rate Risk
When interest rates rise, the value of existing bonds tends to fall. This is because new bonds are issued with higher coupon rates, making existing bonds with lower rates less attractive.
Credit Risk
Credit risk is the possibility that the bond issuer will default on its payments. Bonds with lower credit ratings have higher credit risk.
Inflation Risk
Inflation can erode the purchasing power of the fixed interest payments received from bonds, making them less attractive during periods of high inflation.
Liquidity Risk
Some bonds may be difficult to sell quickly at their fair market value, particularly in a volatile market or for bonds with lower credit ratings.
Conclusion
Bonds are a vital part of the financial markets and offer various benefits, including steady income, diversification, and capital preservation. However, like any investment, they come with risks that need to be carefully considered. By understanding the fundamentals of bonds, investors can make informed decisions and effectively incorporate bonds into their investment strategies to achieve their financial goals. | lontonwealthforum |
1,890,726 | How to Extract Text From PDF in C# (Beginner Tutorial) | PDF (Portable Document Format) documents have become a standard for sharing and preserving the layout... | 0 | 2024-06-17T02:21:24 | https://dev.to/tayyabcodes/how-to-extract-text-from-pdf-in-c-beginner-tutorial-20li | csharp, tutorial, developer, codenewbie | [PDF](https://en.wikipedia.org/wiki/PDF) (Portable Document Format) documents have become a standard for sharing and preserving the layout of documents across various platforms and devices. They are widely used due to their ability to maintain formatting, regardless of software or hardware, making them ideal for contracts, manuals, and reports. However, extracting text from a PDF can sometimes be challenging, especially when dealing with complex layouts or embedded content.
Many C# libraries can help with PDF file tasks. In this blog, we’ll dive into the process of extracting text from PDF documents using the [IronPDF library](https://ironpdf.com/) in C#. IronPDF offers a powerful and user-friendly API that simplifies working with PDFs, allowing developers to retrieve text, images, and other embedded data efficiently. By the end of this guide, you’ll have a clear understanding of how to leverage IronPDF for text extraction in your C# applications, enhancing your ability to manipulate and utilize PDF content effectively.
## Introduction of IronPDF Library

[IronPDF](http://ironpdf.com) is an advanced .NET library that simplifies working with PDF documents. With IronPDF, you can effortlessly create, edit, and manage PDF files within your C# applications. The library uses a Chrome-based rendering engine to convert HTML, CSS, JavaScript, and images into high-quality PDF documents.
The library provides multiple methods for [HTML to PDF conversion](https://ironpdf.com/examples/using-html-to-create-a-pdf/), including creating PDFs from HTML strings, URLs, or MVC views. This flexibility makes it easy to integrate PDF generation into your web applications. Beyond conversion, IronPDF offers extensive features for editing PDF documents. You can add headers, footers, and digital signatures, merge and split PDFs, and even fill and read interactive form data. The library also provides methods to extract data and images from existing PDFs, which is particularly useful for data processing and content manipulation tasks.
## Extract Text Using IronPDF
With IronPDF, you can easily extract text from any PDF document using a variety of methods tailored to different needs. This section will guide you through the process of installing IronPDF, loading a PDF document, and extracting text using various techniques.
IronPDF offers robust functionality that simplifies the extraction process, whether you need to retrieve all text, text from specific pages, or text line by line. This flexibility ensures that you can handle various PDF text extraction scenarios efficiently. Let’s explore how to set up IronPDF and use its features to extract text from your PDF documents.
### Step 1: Install IronPDF Library
To start using IronPDF in your C# project, you need to install the library via NuGet Package Manager. Open your project in Visual Studio, then follow these steps:
1. Right-click on your project in the Solution Explorer.
2. Select “Manage NuGet Packages...”
3. Search for “IronPDF” in the Browse tab.
4. Click “Install” to add the IronPDF library to your project.
You can also install the IronPDF library from the NuGet Package Manager Console using the command line:
```
Install-Package IronPdf
```
Once installed, you will have access to the various features IronPDF offers, including PDF creation, editing, and text extraction. Ensure that you have the appropriate directives in your code to utilize IronPDF functionalities:
```
using IronPdf;
```
This setup will enable you to efficiently work with PDF documents in your C# applications.
### Step 2: Load the PDF Document
Before extracting text, you need to load the PDF file into your application. IronPDF makes this process straightforward. Here’s a sample code snippet to load a PDF format:
```
// Load the PDF document
var pdf = PdfDocument.FromFile("path/to/your/document.pdf");
```
The above code loads the PDF file. Loading the document is the first step towards interacting with its content.
### Step 3: Extract Text from PDF
IronPDF provides several methods to extract PDF text, allowing you to choose the one that best fits your needs.
#### Extract All Text
To extract all the text from a PDF document, you can use the ExtractAllText method. This method returns the entire text content of the PDF as a single string.
```
// Extract all text from the PDF
string allText = pdf.ExtractAllText();
// Show the extracted text on the console
Console.WriteLine(allText);
```

Extracting all text at once is useful for situations where you need a complete snapshot of the document’s textual content. This method is straightforward and can be used to pull all the text for further processing or analysis quickly.
#### Extract Text from Specific Pages
If you need to extract text from specific pages, you can specify the page numbers. The following example demonstrates how to extract text from the first page of the PDF:
```
// Extract text from specific pages
string text = pdf.ExtractTextFromPage(1);
Console.WriteLine(text);
// Zero-based indices of the first two pages
int[] pages = new[] { 0, 1 };
// Extract text from pages 1 & 2
string pagesText = pdf.ExtractTextFromPages(pages);
Console.WriteLine(pagesText);
```

This approach is beneficial when you are only interested in certain sections of a document, such as specific chapters in a book or particular sections in a report.
#### Extract Text Line by Line
Extracting text line by line can be useful for more granular text processing. IronPDF provides a method to extract text line by line, ensuring you can process each line individually.
```
// Extract text line by line from the second page (zero-based index 1)
var lines = pdf.Pages[1].Lines;
```
Line-by-line extraction is particularly useful for scenarios where you need to process or analyze text in smaller chunks, such as parsing through each sentence or dealing with formatted data. This method allows for detailed text manipulation, making it easier to handle tasks like data extraction and content analysis.
By using these methods, you can efficiently extract and manipulate text from your PDF documents in various ways, depending on your requirements. IronPDF’s flexibility and ease of use make it an excellent choice for handling PDF text extraction in your C# applications.
## Conclusion

IronPDF provides a comprehensive and user-friendly solution for working with PDF documents in C#. Its powerful features allow you to easily extract text from PDFs, whether you need all the text at once, specific pages or line by line.
With IronPDF, you can enhance your application’s capabilities, making tasks like document automation, content management, and data analysis more efficient. Moreover, IronPDF offers a [free trial](https://ironpdf.com/licensing/), allowing you to explore its features and see how it fits into your projects. For those looking to integrate IronPDF into their development environment fully, licensing starts at $749, providing access to a wide range of powerful tools and support.
| tayyabcodes |
1,890,725 | Analysis of LED display screens used in command centers | At present, the display technologies used in large-screen display systems mainly include DLP (rear... | 0 | 2024-06-17T02:20:58 | https://dev.to/sostrondylan/analysis-of-led-display-screens-used-in-command-centers-3kj4 | led, display, screen | At present, the display technologies used in large-screen display systems mainly include DLP (rear projection), LCD (liquid crystal) and [full-color LED](https://sostron.com/products/). The following compares these three display technologies in terms of display effect, installation load-bearing capacity, business application and operation and maintenance costs.

A. Comparison of display system indicators
1. Physical seams
The physical seams determine the degree of segmentation of the picture. The display pixels of the high-definition small-pitch digital LED display system are composed of dot-shaped LED lights arranged vertically and horizontally, combined with high-precision production and manufacturing processes to ensure that there are no visible physical seams between display units, and there are no optical seams common in other spliced screens. In contrast, DLP and LCD displays will have obvious seams, affecting the continuity and visual experience of the overall picture. [Here is everything about small-pitch LED displays. ](https://sostron.com/everything-about-small-pitch-led-display/)
2. Color consistency
Color consistency mainly reflects the color difference problem between each display unit. DLP display systems will have color difference after long-term use, and the color difference phenomenon will expand over time. Although it can be adjusted, it is inconvenient for application and maintenance. High-definition small-pitch digital LED display systems and LCD large-screen systems basically have no color difference problems and can maintain color consistency for a long time.

3. Color saturation and color reproduction
Color saturation determines the brightness of the image, and color reproduction determines the consistency between the image and the real natural object. High-definition small-pitch digital LED display systems use LED lights with a wide color gamut, which can provide excellent color reproduction effects when displaying various pictures and videos, and perfectly reproduce the gorgeous colors of the real world. However, LCD and DLP display systems essentially use projection technology, with a smaller color gamut range, and there is a certain gap in color reproduction and saturation. [Which type of display effect is best for you, DLP, LCD, or LED? ](https://sostron.com/dlp-lcd-led-which-type-of-display-is-best-for-you/)
4. Brightness
Brightness determines the visual effect. The brightness of DLP display units is generally 200-300cd/㎡, the brightness of LCD display units is 700cd/㎡, and the brightness of high-definition small-pitch digital LED display systems can reach 1200cd/㎡. In places like command centers, excessively high brightness is unsuitable for long-term viewing, but high-definition small-pitch digital LED display systems can maintain the grayscale of the picture without loss at low brightness by using high-performance driver ICs, balancing viewing comfort with a clear picture. In terms of brightness, therefore, high-definition small-pitch digital LED display systems perform best, followed by LCD, with DLP display systems the lowest.

B. Business applications
1. Real-time monitoring
High-definition small-pitch digital LED display systems can provide high-resolution, high-brightness and high-contrast display effects, which are very suitable for the real-time monitoring needs of command centers. This system can display high-definition monitoring images to ensure that details are clearly visible, facilitating quick judgment and decision-making.

2. Data visualization
In command centers, data visualization is a very important part. High-definition small-pitch digital LED display systems support the access of multiple signal sources, and can simultaneously display various data charts and video information to help command personnel better analyze and process information.
3. Emergency command
Emergency command requires a large-screen display system that can operate stably in a complex environment. High-definition small-pitch digital LED display systems have high reliability and stability, can provide stable display effects under various environmental conditions, and are suitable for emergency command scenarios.
C. Operation and maintenance costs
1. Installation cost
The installation process of high-definition small-pitch digital LED display systems is relatively simple. The standardized cabinet design makes installation more convenient, and its thin and light design also reduces transportation and installation costs. In contrast, the installation of DLP and LCD display systems requires consideration of splicing gaps and complex debugging processes, which are more expensive. [Provide you with a purchase guide for LED display control systems. ](https://sostron.com/led-display-control-system-purchase-guide/)
2. Maintenance cost
The high-definition small-pitch digital LED display system adopts a modular design, which is convenient for later maintenance and replacement of parts. Its durability and protection performance are strong, which reduces long-term maintenance costs. However, the color difference adjustment and splicing maintenance of DLP and LCD display systems are more complicated, and the maintenance cost is higher. [What is the cost of LED panels? ](https://sostron.com/how-much-do-led-panels-cost-these-days/)

Summary
Comparing the display effect, business application and operation and maintenance cost, the high-definition small-pitch digital LED display system shows obvious advantages in command center applications. Its features of seamlessness, good color consistency, high color reproduction, adjustable brightness and low installation and maintenance costs make it an ideal choice for large-screen display systems in command centers. Whether in real-time monitoring, data visualization or emergency command, the high-definition small-pitch digital LED display system can provide excellent performance and reliability.

Thank you for watching. I hope we can solve your problems. Sostron is a professional [LED display manufacturer](https://sostron.com/about-us/). We provide all kinds of displays, display leasing and display solutions around the world. If you want to know: [The relationship between 5G and LED display.](https://www.linkedin.com/pulse/relationship-between-5g-led-display-dylan-lian-zngdc) Please click read.
Follow me! Take you to know more about led display knowledge.
Contact us on WhatsApp:https://api.whatsapp.com/send?phone=+8613570218702&text=Hello | sostrondylan |
1,890,712 | Nekopoi APK Explore the World of Anime: Films Worth Watching | Anime, a genre of animation from Japan, has attracted many fans around the world thanks to... | 0 | 2024-06-17T02:17:31 | https://dev.to/boydtownsend/nekopoi-apk-jelajahi-dunia-anime-film-yang-layak-ditonton-ngn |
Anime, a genre of animation from Japan, has attracted many fans around the world thanks to the diversity, creativity, and depth of its content. Here are some recommended anime films (link: https://nekopoiapk.io) worth exploring:
1. **"Your Name" (Kimi no Na wa)**
**Genre**: Romance, Fantasy
**Summary**: "Your Name" tells the story of the encounter between Taki and Mitsuha, two young people who live in two different worlds but can communicate through dreams. The film explores love, fate, and the changes in their lives.
2. **"Attack on Titan" (Shingeki no Kyojin)**
**Genre**: Action, Horror, Fantasy
**Summary**: Set in a world where humans live in cities surrounded by towering walls that protect them from giant man-eating Titans, the story is full of drama and raises questions about human nature and survival.
3. **"Spirited Away" (Sen to Chihiro no Kamikakushi)**
**Genre**: Adventure, Mystery, Magic
**Summary**: Chihiro, a 10-year-old girl, is pulled into the spirit world after her parents are turned into pigs. The film follows Chihiro's journey to find a way to free herself and her family from this strange world.
4. **"Naruto"**
**Genre**: Action, Adventure, Ninja
**Summary**: Naruto Uzumaki is a young boy who dreams of becoming Hokage, the leader and protector of his village. The story follows Naruto and his friends on their journey to become the strongest ninja.
5. **"My Neighbor Totoro" (Tonari no Totoro)**
**Summary**: The story of two sisters, Satsuki and Mei, as they move into a new house near a forest. They discover that the forest hides strange creatures, including Totoro, a mysterious and adorable friend.
Conclusion
Anime is not just a genre of animation but also a way to discover deep, inspiring stories about people and the world around them. Each film offers a unique experience and captivates audiences by using the art of animation to tell rich stories. Take the time to explore and enjoy the diversity of this world | boydtownsend |
1,890,379 | Get Ready for Your JavaScript Interview: Top 100 Questions to Practice | JavaScript is a crucial skill for any developer, and mastering it can open doors to exciting career... | 0 | 2024-06-17T02:14:49 | https://raajaryan.tech/get-ready-for-your-javascript-interview-top-100-questions-to-practice | javascript, node, beginners, tutorial | JavaScript is a crucial skill for any developer, and mastering it can open doors to exciting career opportunities. Whether you're preparing for your first JavaScript interview or aiming to brush up on your skills, here are 100 essential questions that can help you ace your interview:
1. What is JavaScript?
2. What are the key differences between JavaScript and Java?
3. What are the data types in JavaScript?
4. Explain the concept of hoisting in JavaScript.
5. How does JavaScript handle asynchronous operations?
6. What are closures in JavaScript?
7. What is the prototype chain in JavaScript?
8. How do you check the type of a variable in JavaScript?
9. Explain the event bubbling and capturing in JavaScript.
10. How does prototypal inheritance differ from classical inheritance?
11. What is the difference between `==` and `===` operators?
12. How does `setTimeout` function work in JavaScript?
13. Explain the concept of promises in JavaScript.
14. What are arrow functions in JavaScript?
15. How does `this` keyword work in JavaScript?
16. What is the use of `bind` method in JavaScript?
17. How can you create a closure in JavaScript?
18. What is the difference between `null` and `undefined`?
19. Explain the concept of event delegation in JavaScript.
20. What is the difference between synchronous and asynchronous JavaScript?
21. How do you handle errors in JavaScript?
22. What are higher-order functions in JavaScript?
23. Explain the difference between function declaration and function expression.
24. How can you achieve inheritance in JavaScript?
25. What are the different ways to declare a variable in JavaScript?
26. Explain the use of `map`, `filter`, and `reduce` functions.
27. What is the purpose of `JSON.stringify` and `JSON.parse` in JavaScript?
28. How can you prevent the default behavior of an event in JavaScript?
29. Explain the concept of memoization in JavaScript.
30. What are the different types of scopes in JavaScript?
31. How does the ES6 module system differ from CommonJS?
32. What are the rest parameters in JavaScript?
33. How do you handle CORS in JavaScript?
34. Explain the use of `async` and `await` in JavaScript.
35. What are generator functions in JavaScript?
36. How can you convert a callback-based function to a promise-based function?
37. Explain the concept of currying in JavaScript.
38. What is the difference between `let`, `const`, and `var` in JavaScript?
39. How can you detect if a variable is an array in JavaScript?
40. Explain the purpose of the `Symbol` data type in JavaScript.
41. What is a closure trap in JavaScript?
42. How do you handle memory leaks in JavaScript?
43. Explain the use of `try`, `catch`, and `finally` in error handling.
44. What is the purpose of the `use strict` directive in JavaScript?
45. How can you check if an object has a property in JavaScript?
46. Explain the concept of event loop in JavaScript.
47. What are the differences between `forEach` and `map` methods?
48. How can you copy an object in JavaScript?
49. Explain the differences between `slice` and `splice` methods.
50. What is the purpose of the `WeakMap` and `WeakSet` in JavaScript?
51. How can you merge two arrays in JavaScript?
52. Explain the purpose of the `Symbol.iterator` method in JavaScript.
53. What are the different ways to create an object in JavaScript?
54. How can you validate an email address in JavaScript?
55. Explain the purpose of the `includes` method in arrays.
56. What are the differences between `==` and `===` operators in JavaScript?
57. How can you create a private variable in JavaScript?
58. Explain the purpose of the `Object.keys` method in JavaScript.
59. What are the differences between `null`, `undefined`, and `undeclared` in JavaScript?
60. How can you convert a string to a number in JavaScript?
61. Explain the purpose of the `Array.isArray` method in JavaScript.
62. What are the differences between function declaration and function expression in JavaScript?
63. How can you convert JSON to a JavaScript object?
64. Explain the purpose of the `Array.prototype.map` method in JavaScript.
65. What are the differences between `let`, `const`, and `var` in JavaScript?
66. How can you sort an array of objects by a specific property in JavaScript?
67. Explain the purpose of the `Array.prototype.filter` method in JavaScript.
68. What are the differences between `==` and `===` operators in JavaScript?
69. How can you remove duplicates from an array in JavaScript?
70. Explain the purpose of the `Array.prototype.reduce` method in JavaScript.
71. What are the differences between `null`, `undefined`, and `undeclared` in JavaScript?
72. How can you check if a string contains a substring in JavaScript?
73. Explain the purpose of the `Array.prototype.forEach` method in JavaScript.
74. What are the differences between `let`, `const`, and `var` in JavaScript?
75. How can you reverse a string in JavaScript?
76. Explain the purpose of the `Array.prototype.indexOf` method in JavaScript.
77. What are the differences between `==` and `===` operators in JavaScript?
78. How can you find the largest and smallest numbers in an array in JavaScript?
79. Explain the purpose of the `Array.prototype.some` method in JavaScript.
80. What are the differences between `null`, `undefined`, and `undeclared` in JavaScript?
81. How can you capitalize the first letter of a string in JavaScript?
82. Explain the purpose of the `Array.prototype.every` method in JavaScript.
83. What are the differences between `let`, `const`, and `var` in JavaScript?
84. How can you remove falsy values from an array in JavaScript?
85. Explain the purpose of the `Array.prototype.slice` method in JavaScript.
86. What are the differences between `==` and `===` operators in JavaScript?
87. How can you flatten an array in JavaScript?
88. Explain the purpose of the `Array.prototype.sort` method in JavaScript.
89. What are the differences between `null`, `undefined`, and `undeclared` in JavaScript?
90. How can you shuffle an array in JavaScript?
91. Explain the purpose of the `Array.prototype.reverse` method in JavaScript.
92. What are the differences between `let`, `const`, and `var` in JavaScript?
93. How can you find the intersection of two arrays in JavaScript?
94. Explain the purpose of the `Array.prototype.join` method in JavaScript.
95. What are the differences between `==` and `===` operators in JavaScript?
96. How can you find the union of two arrays in JavaScript?
97. Explain the purpose of the `Array.prototype.splice` method in JavaScript.
98. What are the differences between `null`, `undefined`, and `undeclared` in JavaScript?
99. How can you create a random number between two numbers in JavaScript?
100. Explain the purpose of the `Array.prototype.concat` method in JavaScript.
---
These questions cover a wide range of topics and are designed to help you solidify your understanding of JavaScript concepts. Make sure to practice answering them and understand the underlying principles to confidently tackle any JavaScript interview. Good luck! | raajaryan |