Dataset columns: id (int64, 5–1.93M), title (string, 0–128 chars), description (string, 0–25.5k chars), collection_id (int64, 0–28.1k), published_timestamp (timestamp[s]), canonical_url (string, 14–581 chars), tag_list (string, 0–120 chars), body_markdown (string, 0–716k chars), user_username (string, 2–30 chars)
1,901,691
Global Color Measuring Instrument Market by Market growth, trend, opportunity and forecast 2023-2030
Global Color Measuring Instrument Market The Color Measuring Instrument Market is expected to grow...
0
2024-06-26T17:46:52
https://dev.to/reportprime23/global-color-measuring-instrument-market-by-market-growth-trend-opportunity-and-forecast-2023-2030-4in3
Global Color Measuring Instrument Market The Color Measuring Instrument Market is expected to grow from USD 1.10 Billion in 2022 to USD 2.50 Billion by 2030, at a CAGR of 11.20% during the forecast period. Get Sample Report: https://www.reportprime.com/enquiry/sample-report/1467 Global Color Measuring Instrument Market Size The color measuring instrument is a tool used to quantify and identify colors in different materials, ranging from textiles, paints, inks, plastics, and ceramics. The market for color measuring instruments is segmented based on type, application, and region. The key types are desktop and portable. The application areas include industry and laboratory sectors. Geographically, the market is segmented across North America, Asia Pacific, Middle East, Africa, Australia, and Europe. Some of the key players in the market are ALTANA, Konica Minolta, Testronix, PCE, Michigan, X-Rite, and Datacolor. The market is regulated by specific legal and regulatory factors that impact market conditions. The report provides a detailed analysis of these factors and highlights their impact on market dynamics. Global Color Measuring Instrument Market Top Companies ALTANA Konica Minolta Testronix PCE Michigan Enquire Now: https://www.reportprime.com/enquiry/pre-order/1467 Global Color Measuring Instrument Market Segment The latest trends followed by the Color Measuring Instrument market include the development of portable and handheld color measurement devices, which offer ease of operation and convenience to the end-users. The integration of AI and machine learning algorithms in color measurement systems is another trend, which enables accurate and efficient color analysis. The growing use of non-contact color measurement techniques such as spectrophotometers, colorimeters, and imaging colorimeters is also gaining momentum. 
The major challenges faced by the Color Measuring Instrument market include the high cost of advanced systems, which limits their adoption by small businesses, lack of awareness among end-users regarding the benefits of color measurement systems, and the absence of a standardized color measurement process. Buy the Report: https://www.reportprime.com/checkout?id=1467&price=3590 Global Color Measuring Instrument Market by Application Laboratory Industry Global Color Measuring Instrument Market by Types Desktop Portable This information is sourced from:- https://www.reportprime.com/ To read the full report:- https://www.reportprime.com/color-measuring-instrument-r1467
reportprime23
1,901,690
Global Coriolis Mass Flowmeters Market by Market growth, trend, opportunity and forecast 2023-2030
Global Coriolis Mass Flowmeters Market The Coriolis Mass Flowmeters Market is expected to grow from...
0
2024-06-26T17:46:23
https://dev.to/reportprime23/global-coriolis-mass-flowmeters-market-by-market-growth-trend-opportunity-and-forecast-2023-2030-4lnj
Global Coriolis Mass Flowmeters Market The Coriolis Mass Flowmeters Market is expected to grow from USD 2.50 Billion in 2022 to USD 3.70 Billion by 2030, at a CAGR of 5.10% during the forecast period. Get Sample Report: https://www.reportprime.com/enquiry/sample-report/7205 Global Coriolis Mass Flowmeters Market Size Coriolis Mass Flowmeters are devices used for measuring the mass flow rate of a fluid by employing the Coriolis effect. These flowmeters have two parallel tubes that are vibrated back and forth, and the resulting Coriolis forces cause a phase shift that is proportional to the mass flow rate of the fluid. The market research report on Coriolis Mass Flowmeters provides a detailed analysis of the market segment based on type, application, region, and market players. The types of Coriolis Mass Flowmeters include Liquid and Gas, whereas the applications are Chemical and Petrochemical, Food and Beverages, Oil and Gas, Pharmaceuticals, Water and Wastewater Treatment, Pulp and Paper, and Others. The report covers regions like North America, Asia Pacific, Middle East, Africa, Australia, and Europe. The market players discussed in the report are Endress+Hauser, Emerson, ABB, Yokogawa, Krohne, Rheonik, Honeywell, Siemens AG, Schneider, Azbil Corporation, Badger Meter, OMEGA Engineering, Tokyo Keiso Co. Ltd., OVAL Corporation, Keyence, Beijing Sincerity Automatic Equipment, Liquid Controls (IDEX), Brooks Instruments (ITW), TRICOR Coriolis Technology (TASI), Heinrichs Messtechnik (KOBOLD), Alicat Scientific, Qingdao Add Value Flow Metering, Shanghai Yinuo Instrument, Tianjin Sure Instrument, and Zhejiang Sealand Technology. Furthermore, the report provides information on regulatory and legal factors that are specific to market conditions. 
Global Coriolis Mass Flowmeters Market Top Companies Endress+Hauser Emerson ABB Yokogawa Krohne Enquire Now: https://www.reportprime.com/enquiry/pre-order/7205 Global Coriolis Mass Flowmeters Market Segment The latest trend in the Coriolis Mass Flowmeters market is the integration of advanced features such as wireless communication, real-time monitoring, and improved accuracy. The manufacturers in the market are investing in research and development to innovate and introduce technologically advanced products to meet the growing demand of end-users. However, the Coriolis Mass Flowmeters market faces challenges such as intense competition, lack of standardization, high initial cost, and lack of awareness among end-users about the benefits of Coriolis Mass Flowmeters over traditional flow measuring devices. Moreover, the COVID-19 pandemic has also impacted the market due to supply chain disruptions and a decline in demand from end-users. Buy the Report: https://www.reportprime.com/checkout?id=7205&price=3590 Global Coriolis Mass Flowmeters Market by Types Liquid Coriolis Mass Flowmeters Gas Coriolis Mass Flowmeters Global Coriolis Mass Flowmeters Market by Application Chemical and Petrochemical Food and Beverages Oil and Gas Pharmaceuticals Water and Wastewater Treatment Pulp and Paper Others This information is sourced from:- https://www.reportprime.com/ To read the full report:- https://www.reportprime.com/coriolis-mass-flowmeters-r7205
reportprime23
1,901,680
Securing the Kingdom: IAM Best Practices for AWS Cloud Castle
It was a dark and stormy night when the alarm bells started ringing across the cloud castle. Deep...
0
2024-06-26T17:45:36
https://dev.to/ikoh_sylva/securing-the-kingdom-iam-best-practices-for-aws-cloud-castle-2085
cloudcomputing, aws, cloudskills, cloudpractitioner
It was a dark and stormy night when the alarm bells started ringing across the cloud castle. Deep within the servers, the sentries detected a disturbance - an unauthorized attempt to access the crown jewels. "Sound the alarm!" barked Sir Lancelot, the stalwart chief of security. "We've got a breach!" As the castle guard sprang into action, Lancelot pulled up the security logs on his dashboard. There, he spotted the culprit - a shadowy figure trying to brute-force their way into the kingdom's most precious resources. Lancelot let out a heavy sigh. This was no ordinary attack. The intruder was exploiting vulnerabilities in the castle's identity and access management (IAM) protocols, the very systems designed to protect the realm. "Gather the royal advisors," he commanded. "It's time we shore up our IAM defences, once and for all." ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/kmnlsklexozeqmxfuvqe.jpg) In the cloud kingdom of AWS, the IAM system is the gatekeeper to the kingdom's most valuable assets - virtual servers, databases, APIs, and a vast trove of sensitive data. Just as a real castle needs sturdy walls and vigilant guards, the cloud requires ironclad IAM protocols to keep the black knights at bay. Yet all too often, cloud architects make rookie mistakes that leave the kingdom vulnerable. Credentials shared recklessly, over-permissioned policies, passwords scrawled on sticky notes - it's an open invitation for disaster. That's why following IAM best practices is crucial for any organization seeking to secure its AWS environment. Let's explore some of the key strategies that Sir Lancelot and his team of Royal Technologists have implemented to protect the cloud castle: - Principle of Least Privilege: The foundation of robust IAM is granting the bare minimum permissions required for each user, group, or role to perform their duties. No more, no less. 
This "least privilege" approach severely limits the blast radius if an account is ever compromised. - Multi-Factor Authentication (MFA): Relying on passwords alone is like leaving the castle gates unlocked. MFA adds an extra layer of security, requiring users to verify their identity using a one-time code or biometric. Even if credentials are stolen, the thieves can't get in without that second factor. - Centralized Policy Management: Rather than scattering permissions across dozens of individual users, the royal advisors consolidated IAM controls into reusable policy documents. This makes it easier to enforce consistent security standards and rapidly update protections in response to evolving threats. - Rotation of Credentials: Like changing the castle locks, regularly rotating access keys, passwords, and other credentials reduces the window of vulnerability if they are ever exposed. Automated tools can handle this tedious but critical task. - Granular Logging and Monitoring: To detect and respond to intrusion attempts, the castle's security team closely monitors all IAM activity. Services like AWS CloudTrail provide a detailed audit trail, while Amazon GuardDuty proactively hunts for suspicious behaviour. - Federated Access: Rather than managing user identities within the cloud, the kingdom leverages existing identity providers like Azure AD or Okta. This "federated" approach simplifies credential management and ensures access is immediately revoked when an employee departs. - Avoid Root Account Use: The all-powerful "root" user account is the master key to the castle. Lancelot has strictly limited its use, ensuring day-to-day operations are conducted through least-privileged IAM roles instead. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/glm0nm9buratcinfehbf.jpg) By implementing these core IAM best practices, the cloud castle's security team has erected formidable defences against even the most cunning cyber attackers. 
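To make the least-privilege principle from the list above concrete, here is a minimal sketch of an IAM policy document, assuming a hypothetical read-only S3 use case (the bucket name, statement ID, and the smell-test helper are invented for illustration, not taken from the story):

```python
import json

# A least-privilege IAM policy: read-only access to a single, hypothetical
# S3 bucket -- nothing more. Wildcards on Action or Resource would widen
# the blast radius if the credentials ever leaked.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "ReadOnlyCrownJewels",
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                "arn:aws:s3:::crown-jewels-bucket",
                "arn:aws:s3:::crown-jewels-bucket/*",
            ],
        }
    ],
}

def has_wildcard_actions(doc):
    """Flag statements that grant '*' actions -- a quick least-privilege smell test."""
    return any(
        "*" in (s["Action"] if isinstance(s["Action"], list) else [s["Action"]])
        for s in doc["Statement"]
    )

print(json.dumps(policy, indent=2))
print(has_wildcard_actions(policy))  # False: no wildcard actions granted
```

The same document could be attached to a role via the console, CloudFormation, or an SDK; the point is that every action and resource is enumerated explicitly.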
No longer can the black knights waltz in and pilfer the kingdom's most valuable assets. Of course, safeguarding the realm is an endless battle. As new threats emerge and cloud technologies evolve, the royal technologists must remain ever-vigilant, continuously optimizing their IAM strategies to keep the castle secure. But thanks to their diligence and the lessons learned from past intrusion attempts, the kingdom's crown jewels now rest safely behind lock and key. The moat has been fortified, the watchtowers reinforced - all to ensure the cloud castle's prosperity for generations to come. Most importantly, never underestimate the importance of automation and continuous learning. By staying vigilant, adaptive, and committed to mastering the intricacies of AWS security, you'll be well-equipped to navigate the ever-evolving challenges that lie ahead, ensuring that your organization's cloud journey is a secure and successful one. I hope my story wasn’t too vague; I just wanted to take a different approach to preaching the AWS Cloud Gospel of Identity and Access Management best practices. I am Ikoh Sylva, a Cloud Computing enthusiast with a few months of hands-on experience on AWS. I’m currently documenting my cloud journey here from a beginner’s perspective. If this sounds good to you, kindly like and follow, and consider recommending this article to others who you think might also be starting out on their cloud journeys. You can also consider following me on social media below: [LinkedIn](http://www.linkedin.com/in/ikoh-sylva-73a208185) [Facebook](https://www.facebook.com/Ikoh.Silver) [X](https://x.com/Ikoh_Sylva)
ikoh_sylva
1,901,687
The New Wave (Exploring the Future)
Exploring the Future: Web Development, Programming, AI, and Open Source Web Development Trends The...
0
2024-06-26T17:44:56
https://dev.to/dan52242644dan/the-new-wave-exploring-the-future-4291
webdev, programming, ai, opensource
Exploring the Future: Web Development, Programming, AI, and Open Source Web Development Trends The landscape of web development is constantly evolving, with new technologies and methodologies emerging to enhance user experience and developer efficiency. In 2024, several key trends are shaping the future of web development: AI Integration: AI is becoming a cornerstone in web development, from AI-powered chatbots to personalized user experiences. Developers are increasingly using AI tools to automate tasks and improve website functionality. Low-Code/No-Code Platforms: These platforms are democratizing web development, allowing non-developers to create sophisticated websites and applications with minimal coding knowledge. Static Site Generators: Tools like Gatsby and Next.js are gaining popularity for their ability to create fast, secure, and scalable websites. Fediverse Expansion: The fediverse, a collection of federated social networks, is growing rapidly. With major platforms like Meta’s Threads joining, it’s becoming a significant part of the web ecosystem. Programming Innovations Programming continues to evolve with new languages, frameworks, and tools designed to streamline development and improve performance: Declarative Programming: Languages and frameworks that emphasize declarative paradigms, such as React and Vue.js, are becoming more prevalent. Cloud-Native Development: Developers are increasingly adopting cloud-native technologies, including containers and Kubernetes, to build scalable and resilient applications. AI-Powered Development Tools: Tools like GitHub Copilot are revolutionizing coding by providing AI-driven code suggestions and automating repetitive tasks. The Rise of AI Artificial Intelligence is not just a buzzword; it’s transforming industries and redefining what’s possible in technology: Generative AI: AI models like GPT-4 are being used to create content, generate code, and even design websites. 
Foundation Models: These large-scale AI models are being integrated into various applications, providing advanced capabilities in natural language processing, image recognition, and more. AI in Open Source: Open source projects are increasingly incorporating AI, with many new contributors focusing on AI-driven innovations. Open Source: The Backbone of Innovation Open source software continues to be a driving force in the tech industry, fostering collaboration and accelerating innovation: Community-Driven Development: Open source projects thrive on community contributions, with platforms like GitHub facilitating collaboration among developers worldwide. Commercial Backing: Many successful open source projects receive support from commercial entities, ensuring sustainability and continuous improvement. AI and Open Source: The intersection of AI and open source is particularly exciting, with numerous AI projects emerging as top contributors on platforms like GitHub. Conclusion The convergence of web development, programming, AI, and open source is creating a dynamic and innovative tech landscape. As these fields continue to evolve, they offer endless possibilities for developers and organizations alike. Embracing these trends will not only keep you ahead of the curve but also empower you to contribute to the future of technology.
dan52242644dan
1,901,679
How to install Anaconda on Arch Linux
Links: Anaconda: Anaconda AUR Package Miniconda: Miniconda AUR Package Step 1: Clone...
0
2024-06-26T17:36:15
https://dev.to/kutt27/how-to-install-anaconda-on-arch-linux-5a1c
anaconda, python, archlinux, jupyter
**Links:** - Anaconda: [Anaconda AUR Package](https://aur.archlinux.org/packages/anaconda) - Miniconda: [Miniconda AUR Package](https://aur.archlinux.org/packages/miniconda3) ## Step 1: Clone the Package to Your System 1. Click on the `Git Clone URL` and copy the repository link. 2. Open your terminal and navigate to your desired directory. Here, I'm moving to the `~/Downloads/` directory. ```bash git clone https://aur.archlinux.org/anaconda.git ``` Once the cloning process completes successfully, proceed to the next step. ## Step 2: Install the Package into Your System 1. Open a terminal and change directory to where you cloned the repository. ```bash cd ~/Downloads/anaconda ``` 2. Build and install the package using `makepkg`. ```bash makepkg -si ``` This process will take some time. Once it finishes, proceed to the next steps. 3. Locate the installation script, typically named `anaconda_xxxxxx.sh`. Make it executable from the terminal. ```bash sudo chmod +x anaconda_xxxxxx.sh ``` 4. Execute the installation script. ```bash sudo ./anaconda_xxxxxx.sh ``` Follow the on-screen prompts, including reading the agreement and typing `yes` to proceed with the setup. ## Creating Environment and Example To activate the base environment: ```bash conda activate base ``` You'll see `(base)` in front of your terminal prompt indicating the base environment is active. For those who installed Anaconda, you can launch Anaconda Navigator using: ```bash anaconda-navigator ``` That's it for Anaconda installation. ### Miniconda Verification Verify the Miniconda installation by checking its version: ```bash conda --version ``` #### Checking Miniconda Path To ensure Miniconda's binaries are in your PATH: ```bash echo $PATH ``` Look for a directory like `/path/to/miniconda3/bin` in the output. If it's missing, you'll need to add it to your PATH. 
Assuming everything is correct, activate the base environment again: ```bash conda activate base ``` #### Opening Jupyter Notebook Install Jupyter Notebook: ```bash conda install jupyter notebook ``` Then, launch Jupyter Notebook: ```bash jupyter-notebook ``` You've successfully run Jupyter Notebook with Miniconda. ## Extra Perks ### 1. Creating a New Conda Environment Create a new environment specifying Python version and install packages: ```bash conda create -n <env_name> python=<version> conda activate <env_name> ``` For example: ```bash conda create -n myenv python=3.9 conda activate myenv conda install numpy ``` ### 2. Listing and Deleting Environments List all environments: ```bash conda env list ``` Delete an environment: ```bash conda env remove -n <env_name> ``` ### 3. Deactivating an Environment Deactivate the current environment: ```bash conda deactivate ``` For more information on managing environments, refer to [this link](https://askubuntu.com/questions/1026383/why-does-base-appear-in-front-of-my-terminal-prompt).
kutt27
1,901,686
Are PRs effective in the code review process? 👩‍💻
Well, love them or hate them, but you can't escape them! 😥 Dive deep into today's compelling...
0
2024-06-26T17:44:28
https://dev.to/grocto/are-prs-effective-in-the-code-review-process-29kk
webdev, devops, learning, career
Well, love them or hate them, you can't escape them! 😥 Dive deep into today's compelling discussion on enhancing DevEx, code reviews, leading Gen Zs, AI dev tools & more in #ep43 of the groCTO podcast featuring Jacob Singh, CTO in Residence at Alpha Wave Global 🎙 https://grocto.substack.com/p/ep-43-enhancing-devex-code-review Jacob's passionate views on PRs have sparked an interesting debate. What are your thoughts? We'd love to hear from you in the comments! ✍ Key Takeaways 👇 💻 What separates the tech cultures in India & the West? 💟 Creating good DevEx for Gen Z & younger generations ✅ Advice for leaders in misaligned organizations 📈 How Grofers improved their developer experience 🤖 Jacob on PRs, code reviews & utilizing AI tools for development Find links to the full episode in the comments ✨ Meanwhile, check out the sneak peek of our discussion below 👇 groCTO aims to bring the best tech leaders across the globe to empower you with their practical experiences and learnings in the realm of engineering management and leadership 🌍 Do let us know in the comments below what you’d like to see more of in our future podcast episodes 😁
grocto
1,901,685
Google Drive API and Google Cloud Billing
Will the Google Drive API cost me anything after my Google Cloud free trial ends? Sorry, I am confused.
0
2024-06-26T17:43:32
https://dev.to/phil_cajurao/google-drive-api-and-google-cloud-billing-3gea
Will the Google Drive API cost me anything after my Google Cloud free trial ends? Sorry, I am confused.
phil_cajurao
1,901,683
Aggregation query in Cosmos DB
Aggregation Queries in Cosmos DB with Ternary: A Workaround for Performance Concerns Due to potential...
0
2024-06-26T17:40:40
https://dev.to/seifolahghaderi/aggregation-query-in-cosmos-db-366b
Aggregation Queries in Cosmos DB with Ternary: A Workaround for Performance Concerns Due to potential performance issues, I understand it's not typical to write aggregation queries like SUM and AVG on NoSQL databases. However, sometimes it's necessary to find a workaround for temporary situations. With my extensive experience in database development and providing SQL reports in operational and data warehouse databases, I know how useful aggregation queries can be. In my particular case, I needed a combination of SUM and CASE, common in Oracle and SQL Server databases. For example: ``` SELECT SUM(CASE WHEN status = 'approved' THEN 1 ELSE 0 END) AS approved FROM orders; ``` However, Cosmos DB does not support the CASE operation. Instead, it introduces the `Ternary` operator. `Ternary` works like `iif`: ```<bool_expr> ? <expr_true> : <expr_false>``` Here's how I constructed my query to get the sum of orders for today and this month: ``` SELECT SUM((c.orderDate > '{yesterday:O}' AND c.orderDate <= '{today:O}') ? c.price : 0) AS todayPrice, SUM((c.orderDate >= '{thisMonthStart:O}' AND c.orderDate < '{thisMonthStart.AddMonths(1):O}') ? c.price : 0) AS thisMonthPrice FROM c ``` Explanation: - Ternary Operator: Used to replace the CASE statement in Cosmos DB. - todayPrice: Sums the prices of orders placed today. - thisMonthPrice: Sums the prices of orders placed this month. I omitted the real date values, but you can provide them if you need a direct query, or use placeholders if you run the code from an API.
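The `{…:O}` placeholders above are C#-style round-trip ISO-8601 timestamps. As a minimal sketch of how the same query could be assembled outside of C# (the reference date, field names, and helper name are illustrative, not from the original post):

```python
from datetime import datetime, timedelta, timezone

def build_order_sums_query(now: datetime) -> str:
    """Fill the article's date placeholders with concrete ISO-8601 timestamps."""
    today = now.isoformat()
    yesterday = (now - timedelta(days=1)).isoformat()
    month_start = now.replace(day=1, hour=0, minute=0, second=0, microsecond=0)
    # First day of the next month: jump past the end of this month, snap back to day 1.
    next_month_start = (month_start + timedelta(days=32)).replace(day=1)
    return (
        "SELECT "
        f"SUM((c.orderDate > '{yesterday}' AND c.orderDate <= '{today}') ? c.price : 0) AS todayPrice, "
        f"SUM((c.orderDate >= '{month_start.isoformat()}' AND c.orderDate < '{next_month_start.isoformat()}') ? c.price : 0) AS thisMonthPrice "
        "FROM c"
    )

query = build_order_sums_query(datetime(2024, 6, 26, 12, 0, tzinfo=timezone.utc))
print(query)
```

String interpolation is fine for trusted, server-computed dates like these; for user-supplied values, Cosmos DB's parameterized queries (`@param` placeholders) would be the safer route.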
seifolahghaderi
1,901,682
Process Scheduling
During my recent exploration of "Operating System Concepts with Java," I delved into Chapter 4,...
0
2024-06-26T17:39:04
https://dev.to/_hm/process-scheduling-52a
webdev, beginners, programming, tutorial
During my recent exploration of "Operating System Concepts with Java," I delved into Chapter 4, focusing on process scheduling—an area integral to our work as software developers. This chapter provides a comprehensive view of how operating systems manage this critical task. It's always enriching to deepen your understanding of familiar topics. I'll be sharing a summary and a diagram illustrating the process, along with a snapshot of the book. Join me in exploring these insights, fellow programmers! ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/5f9m2bnm0v13w0wjxqwq.png) **4.2 Process Scheduling** Process scheduling is a fundamental aspect of operating systems that aims to maximize CPU utilization and provide efficient execution of processes. Here’s an overview of the key concepts related to process scheduling: Objectives of Process Scheduling: 1. Multiprogramming: Ensure that at least one process is running at all times to maximize CPU utilization. 2. Time Sharing: Switch the CPU among processes frequently enough to allow users to interact with each program effectively. Scheduling Queues: As processes enter the system, they are managed through various queues: 1. Job Queue: Contains all processes in the system, including those waiting to be admitted to the system. 2. Ready Queue: Contains processes that are residing in main memory and are ready and waiting to execute. This queue is typically implemented as a linked list, where each PCB points to the next PCB in the queue. 3. Device Queues: Each I/O device has its own queue containing processes waiting for access to that device. For example, a disk device queue would hold processes waiting for disk I/O operations to complete. Process State Transitions: - Dispatching: When a process is selected from the ready queue to be executed on the CPU. - I/O Request: If a process issues an I/O request, it moves from the running state to the waiting state and enters the corresponding device queue. 
- Subprocess Creation: If a process spawns a new subprocess, it may wait for the new subprocess to terminate before returning to the ready queue. - Interrupt Handling: Processes can be forcibly removed from the CPU due to interrupts (e.g., timer interrupts, I/O interrupts), causing them to be put back into the ready queue after handling the interrupt. Process Lifecycle: - Execution: A process executes on the CPU until it either completes its task, waits for an event (e.g., I/O completion), or is interrupted. - Termination: When a process finishes its execution, it is removed from all queues, its PCB and resources are deallocated, and it transitions to the terminated state. **Queueing Diagram:** ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/as53n5us1vqi99n8p6v8.png) A queueing diagram illustrates the flow of processes between different queues (ready queue, device queues) and the resources that serve them (CPUs, I/O devices). This visual representation helps in understanding how processes move through the system during scheduling.
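The queue transitions described above can be sketched in a few lines of Python, assuming a single FIFO ready queue and one disk queue (the process names and FIFO dispatch policy are illustrative, not from the book):

```python
from collections import deque

# Minimal model of the queueing diagram: processes move between a ready
# queue and a device (disk I/O) queue as they run, block, and resume.
ready_queue = deque(["P1", "P2", "P3"])
disk_queue = deque()

def dispatch():
    """Select the next process from the ready queue (simple FIFO policy)."""
    return ready_queue.popleft()

def io_request(pid):
    """Running process issues an I/O request: running -> waiting."""
    disk_queue.append(pid)

def io_complete():
    """Disk I/O finishes: waiting -> ready."""
    ready_queue.append(disk_queue.popleft())

running = dispatch()      # P1 moves from the ready queue onto the CPU
io_request(running)       # P1 blocks on disk I/O and joins the disk queue
running = dispatch()      # P2 is dispatched while P1 waits
io_complete()             # P1's I/O finishes; P1 rejoins the ready queue
print(list(ready_queue))  # ['P3', 'P1']
```

A real scheduler would add priorities, time slices, and interrupt handling, but the state transitions (ready, running, waiting) follow the same pattern.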
_hm
1,901,629
Let’s Get Hands-On with WordPress FSE Theme and Custom Blocks — Part 1
FSE (Full Site Editing): Blessing and Curse The introduction of Full Site Editing (FSE)...
0
2024-06-26T17:37:56
https://dev.to/silviamalavasi/lets-get-hands-on-with-wordpress-fse-theme-and-custom-blocks-part-1-lmj
wordpress, development, fse, blockthemes
# FSE (Full Site Editing): Blessing and Curse The introduction of Full Site Editing (FSE) represents one of the most significant evolutions in WordPress history. The aim is to provide users with a visual builder-like experience, allowing them to see a direct preview in the dashboard of what will happen on the frontend. Since the beginning, the evolution of WordPress has pursued this need: to bridge the gap between the complexity of the CMS and the goal of creating a system where a moderately tech-savvy user can independently build a site suited to their needs, including the visual aspect. Like any evolutionary process, this journey has seen its share of rapid transformations. Sometimes, developers have found themselves facing an enormous change that took time to digest. With FSE and Block Themes in particular, the change has been radical: * No more PHP in the theme root. * Strange HTML filled with comments that aren’t comments. * A configuration file in JSON format. For an introduction to these topics, [WordPress Developer Resources](https://developer.wordpress.org/themes/) is a good starting point. [Here](https://developer.wordpress.org/themes/core-concepts/theme-structure/), the structure of the new themes is explained. A major innovation is the management of page parts. The header and footer can now be edited by users through a dedicated editor in the dashboard. Users can also visually manage templates in this editor, adding core blocks to each page template, including Query Loops. Templates and Parts are saved in the database but can be set via code using the strange HTML mentioned earlier. In terms of styling, an FSE theme maintains its own `style.css`, while global styles for typography, colors, padding, and other site-wide elements are declared in the `theme.json` file. These styles are accessible to users in the sidebar of the selected block. 
With these changes alone, it’s evident that the level of user control over the site is infinitely greater than in a traditional theme. Evolution has certainly taken place. However, as developers, we’re not just users; we want to delve deep into the possibilities that the Block Themes system offers. A notable innovation is an [integrated build process](https://developer.wordpress.org/themes/advanced-topics/build-process/) in FSE. Do we want to add JavaScript in ES6? Do we want to use SASS? WordPress provides the WordPress scripts package (built on top of [webpack](https://webpack.js.org/)) to compile and include our files. # Block Editor: It Gets Even Better Block Themes allow the integration of custom [Blocks](https://developer.wordpress.org/block-editor/) that can be used anywhere in the theme. I’m not talking about [Patterns](https://developer.wordpress.org/themes/patterns/), which are essentially simplified versions of a Block designed to create a reusable graphic template (somewhat like the old reusable blocks). We’re talking about entities that use React and require a build process. ## The anatomy of a block A block consists of several components. Some are for declaring the block, others for how the block behaves in the dashboard, and others for the block’s display on the frontend. * `index.js` is our entry point and initializes the block * `block.json` declares the block properties, including custom attributes or additional JavaScript to be run on the frontend * `edit.js` and `edit.scss` (or css) handle the dashboard-related part * `save.js` and `save.scss` (or css) handle the frontend-related part Additional JavaScript or PHP files can be incorporated for specific functionalities, such as animating elements using GSAP or adjusting server-side rendering for the block. To use these blocks, they need to be registered in our `functions.php` file. ## Example block: gallery with SwiperJS Now, finally, some code. 
This block is moderately complex but perfect for explaining the integration between PHP and React. We will create a gallery with a management interface on the dashboard side. We’ll use the [SwiperJS](https://swiperjs.com/) library for the gallery’s JavaScript and import it into our block. The build process will be explained in the [second part of the article](https://dev.to/silviamalavasi/lets-get-hands-on-with-wordpress-fse-theme-and-custom-blocks-part-2-13ea) because it differs somewhat from the standard process. Let’s start with `index.js`: ``` import { registerBlockType } from "@wordpress/blocks"; import "./edit.scss"; import "./save.scss"; import Edit from "./edit"; import save from "./save"; import metadata from "./block.json"; registerBlockType(metadata.name, { edit: Edit, save, }); ``` Nothing more than an entry point for the WordPress scripts package build system. In `block.json`, we start to see something more interesting: ``` { "apiVersion": 2, "name": "blocks/gallery", "title": "Gallery", "version": "1.0.0", "category": "custom-blocks", "icon": "format-gallery", "description": "A block with a gallery on the left, and text on the right.", "supports": { "html": false }, "editorScript": "file:./index.js", "editorStyle": "file:./index.css", "style": "file:./index.css", "viewScript": "file:./gallery.js", "attributes": { "slideCount": { "type": "number", "default": 1 } }, "example": { "innerBlocks": [ { "name": "core/image", "attributes": { "url": "http://localhost/site/wp-content/themes/my-theme/blocks/custom-blocks/gallery/gallery.png", "alt": "gallery preview" } } ] } } ``` In addition to the standard declarations, we have stated that we will use the `gallery.js` file on the frontend. ``` "viewScript": "file:./gallery.js", ``` And we have declared a new attribute called `slideCount` ``` "attributes": { "slideCount": { "type": "number", "default": 1 } }, ``` which we will use. We have also included a preview image to replace the standard preview system. 
This part is optional. ``` "example": { "innerBlocks": [ { "name": "core/image", "attributes": { "url": "http://localhost/site/wp-content/themes/my-theme/blocks/custom-blocks/gallery/gallery.png", "alt": "gallery preview" } } ] } ``` Alright, now React comes into play. Let’s look at `edit.js`. It’s a bit long, but if you prefer, you can skip directly to the explanations below the code. ``` import { Button } from "@wordpress/components"; import { useDispatch, useSelect } from "@wordpress/data"; import { InnerBlocks, useBlockProps } from "@wordpress/block-editor"; import { useEffect, useRef } from "@wordpress/element"; import Swiper from "swiper"; import { Navigation, Pagination } from "swiper/modules"; export default function Edit({ attributes, setAttributes, className, clientId }) { const ref = useRef(); const swiperRef = useRef(); const blockProps = useBlockProps({ ref }); const { blockOrder, rootClientId, areBlocksInserted } = useSelect( (select) => { const { getBlockOrder, getBlockHierarchyRootClientId } = select("core/block-editor"); const blockOrder = getBlockOrder(clientId); const rootClientId = getBlockHierarchyRootClientId(clientId); return { blockOrder, rootClientId, areBlocksInserted: blockOrder.length > 0, }; }, [clientId] ); const { insertBlock, removeBlock } = useDispatch("core/block-editor"); useEffect(() => { if ( areBlocksInserted && ref.current && !ref.current.querySelector(".block-editor-inner-blocks").classList.contains("swiper-container-initialized") ) { let swiperElement = ref.current.querySelector(".block-editor-inner-blocks"); let swiperWrapper = ref.current.querySelector(".block-editor-block-list__layout"); swiperElement.classList.add("swiper"); swiperWrapper.classList.add("swiper-wrapper"); let swiper_pagination = ref.current.querySelector(".swiper-pagination"); let swiper_prev = ref.current.querySelector(".swiper-prev"); let swiper_next = ref.current.querySelector(".swiper-next"); swiperRef.current = new 
Swiper(swiperElement, { modules: [Navigation, Pagination], observer: true, observeParents: true, pagination: { el: swiper_pagination, clickable: true, }, navigation: { nextEl: swiper_next, prevEl: swiper_prev, }, slidesPerView: 1, speed: 800, touchStartPreventDefault: false, }); } }, [areBlocksInserted, ref.current]); const TEMPLATE = [ [ "core/columns", { className: "gallery-cont swiper-slide" }, [ ["core/column", {}, [["core/image", {}]]], [ "core/column", {}, [ ["core/heading", { placeholder: "Insert title", level: 2 }], ["core/heading", { placeholder: "Insert small title", level: 3 }], ["core/paragraph", { placeholder: "Insert content" }], ], ], ], ], ]; const addSlide = () => { const newSlideCount = attributes.slideCount + 1; setAttributes({ slideCount: newSlideCount }); const slideBlock = wp.blocks.createBlock( "core/columns", { className: `gallery-cont swiper-slide slide-${attributes.slideCount}` }, [ wp.blocks.createBlock("core/column", {}, [wp.blocks.createBlock("core/image", {})]), wp.blocks.createBlock("core/column", {}, [ wp.blocks.createBlock("core/heading", { placeholder: "Insert title", level: 2 }), wp.blocks.createBlock("core/heading", { placeholder: "Insert small title", level: 3 }), wp.blocks.createBlock("core/paragraph", { placeholder: "Insert content" }), ]), ] ); insertBlock(slideBlock, blockOrder.length, clientId); swiperRef.current.update(); swiperRef.current.slideTo(attributes.slideCount); setTimeout(() => { swiperRef.current.slideTo(attributes.slideCount); }, 300); }; const removeSlide = () => { const newSlideCount = attributes.slideCount - 1; setAttributes({ slideCount: newSlideCount }); if (swiperRef.current) { const activeIndex = swiperRef.current.activeIndex; const blockToRemove = blockOrder[activeIndex]; removeBlock(blockToRemove, rootClientId); swiperRef.current.update(); } }; return ( <div {...blockProps} className={`${className || ""} my-block gallery edit`} > <div className="swiper"> <div className="swiper-wrapper"> <InnerBlocks 
template={TEMPLATE} templateInsertUpdatesSelection={false} templateLock={false} /> </div> </div> <div className="swiper-pagination"></div> <div className="swiper-navigation" style={{ height: attributes.slideCount === 1 ? "0" : "50px" }} > <div className="swiper-prev"></div> <div className="swiper-next"></div> </div> <div className="button-wrapper"> <Button isSecondary onClick={addSlide} > Add a Slide </Button> <Button isSecondary onClick={removeSlide} style={{ display: attributes.slideCount > 1 ? "inline-flex" : "none" }} > Remove Current Slide </Button> </div> </div> ); } ``` First, let’s import some built-in WordPress functions ``` import { Button } from "@wordpress/components"; import { useDispatch, useSelect } from "@wordpress/data"; import { InnerBlocks, useBlockProps } from "@wordpress/block-editor"; import { useEffect, useRef } from "@wordpress/element"; import Swiper from "swiper"; import { Navigation, Pagination } from "swiper/modules"; ``` Some of these are familiar to those who use React. Here we import a version of those functions that is usable in the editor. As attributes of `Edit` function, we use, among others, our custom attribute that we declared in `block.json` ``` export default function Edit({ attributes, setAttributes, className, clientId }) { ``` Then we set up `Refs` in the React way and use `useDispatch` and `useSelect` to manage the dynamic data flow from the server. Inside `useEffect`, we handle SwiperJS. This is intriguing because we are launching a gallery that will be displayed directly in the editor. If we optimize the block interface effectively, our user will see the gallery almost exactly as it will appear to site visitors (on the frontend). Of course, in the editor, we’ll also include buttons to add more slides, which won’t be visible to visitors. This marks a notable shift in perspective. Now, what does `TEMPLATE` refer to? 
``` const TEMPLATE = [ [ "core/columns", { className: "gallery-cont swiper-slide" }, [ ["core/column", {}, [["core/image", {}]]], [ "core/column", {}, [ ["core/heading", { placeholder: "Insert title", level: 2 }], ["core/heading", { placeholder: "Insert small title", level: 3 }], ["core/paragraph", { placeholder: "Insert content" }], ], ], ], ], ]; ``` It’s a part of [InnerBlocks](https://developer.wordpress.org/block-editor/how-to-guides/block-tutorial/nested-blocks-inner-blocks/), a very handy system for importing elements already present in WordPress into our custom block, such as images (including the media library handling) and text elements. Optionally, we can insert videos, quotes, and many other blocks, each with their integrated management system. This syntax closely resembles what we find in the strange HTML of FSE themes, akin to what we see when copying a block and pasting it into a text file. The template will be rendered with minimal code in `save.js`, while here it is invoked within `InnerBlocks` in our return statement (yes, exactly, we're in React logic). The functions `addSlide` and `removeSlide` manage the insertion of new slides. The slide count is managed through the attribute we added. Why isn't it an internal variable? Because we need the slide count to be stored in the database, and that's exactly what custom attributes do. We use `createBlock`, `insertBlock`, and `removeBlock` to insert and remove the slides, which are structured exactly like our initial `TEMPLATE`. 
Let’s take a look: ``` const slideBlock = wp.blocks.createBlock( "core/columns", { className: `gallery-cont swiper-slide slide-${attributes.slideCount}` }, [ wp.blocks.createBlock("core/column", {}, [wp.blocks.createBlock("core/image", {})]), wp.blocks.createBlock("core/column", {}, [ wp.blocks.createBlock("core/heading", { placeholder: "Insert title", level: 2 }), wp.blocks.createBlock("core/heading", { placeholder: "Insert small title", level: 3 }), wp.blocks.createBlock("core/paragraph", { placeholder: "Insert content" }), ]), ] ); ``` In the end, our render function mirrors the HTML structure needed by SwiperJS, with our `TEMPLATE` inserted as the first slide. ``` <InnerBlocks template={TEMPLATE} templateInsertUpdatesSelection={false} templateLock={false} /> ``` And we include buttons for inserting or removing slides: ``` <Button isSecondary onClick={addSlide} > Add a Slide </Button> <Button isSecondary onClick={removeSlide} style={{ display: attributes.slideCount > 1 ? "inline-flex" : "none" }} > Remove Current Slide </Button> ``` The frontend part is much simpler. The file responsible for rendering our block on the site is `save.js`: ``` import { InnerBlocks, useBlockProps } from "@wordpress/block-editor"; export default function Save({ className }) { const blockProps = useBlockProps.save(); return ( <div {...blockProps} className={`${className || ""} my-block gallery`} > <div className="swiper"> <div className="swiper-wrapper"> <InnerBlocks.Content /> </div> </div> <div className="swiper-pagination"></div> </div> ); } ``` We import `InnerBlocks` to display the content of our `InnerBlocks` as defined in `edit.js`. Finally, we add the JavaScript code to launch the gallery with SwiperJS on the frontend in `gallery.js`. 
``` import Swiper from "swiper"; import { Pagination } from "swiper/modules"; function gallery() { document.querySelectorAll(".my-block.gallery").forEach(function (el) { var swiper_pagination = el.querySelector(".swiper-pagination"); var swiper = new Swiper(el.querySelector(".swiper"), { modules: [Pagination], pagination: { el: swiper_pagination, clickable: true, }, slidesPerView: 1, speed: 800, }); }); } document.addEventListener("DOMContentLoaded", gallery); ``` # Create your own Blocks Creating your own blocks following this structure is straightforward at this point. Here, I’ve shown a block with a dynamic interface for managing slides, but you can apply the same method to create blocks that are simpler or much more complex. What remains to be defined is the folder structure of the project (whether it’s a theme or a plugin) and, of course, the build process. This will be the topic of the [second part of the article](https://dev.to/silviamalavasi/lets-get-hands-on-with-wordpress-fse-theme-and-custom-blocks-part-2-13ea). Another function we can employ when we want to fetch dynamic data is [render_callback](https://developer.wordpress.org/block-editor/how-to-guides/block-tutorial/creating-dynamic-blocks/), which we pass as an argument when registering the block in `functions.php`. `render_callback` allows us to retrieve a list of posts, an archive, or navigation menus directly from the server. For instance, in my [ZenFSE](https://github.com/SilviaMalavasi/zenfse_full) theme, I've developed a custom header that allows users to choose between two different menus—one for desktop and one for mobile. These menus are fetched using `render_callback`. # Even more: Modifying Core Blocks Now that we understand how a block works, keep in mind that WordPress Core Blocks operate in the same way (but with many more features). 
If you want to take a look at the source code of the blocks, you can go to the [WordPress Repository](https://github.com/WordPress/gutenberg/tree/trunk/packages/block-library/src) on GitHub. All blocks, including Core blocks, can be customized using [hooks](https://developer.wordpress.org/block-editor/reference-guides/packages/packages-hooks/). In particular, we’re interested in `addFilter`. With this hook, for example, we can enhance an existing block’s functionality without having to rewrite it from scratch. For instance, we could add an additional text field to an image block, or add a dropdown in the block sidebar for selecting an animation to execute when the block is rendered. In [ZenFSE](https://github.com/SilviaMalavasi/zenfse_full), I've used GSAP to add an entrance animation to all my blocks. # Conclusion — Part One This level of interactivity with components truly leaves ample room for creativity and imagination, both from the developer and the user perspectives. Another advantage is the modular nature of the code. Each block resides within its own folder with its files, and in terms of dependencies, each block has its own CSS and JavaScript. Blocks can be used anywhere on the site and can also be inserted into other blocks (such as groups or columns). Moreover, `InnerBlocks` inherit the properties of the blocks they are composed of, so it’s possible to apply colors, typography, and all globally declared properties from the `theme.json` file. WordPress continues to evolve with new solutions, the latest of which, at the moment I am writing, is the [Interactivity API](https://make.wordpress.org/core/2024/03/04/interactivity-api-dev-note/), which allows defining block interactions in an even more immediate manner. There is no rest for developers.
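As a minimal sketch of the `addFilter` technique mentioned above (the block name, filter namespace, and attribute are illustrative, not taken from ZenFSE), a filter on `blocks.registerBlockType` can extend a core block's attributes. Since the filter is a plain function, its effect can be demonstrated on a plain settings object:

```javascript
// Sketch: extend the core image block with an extra attribute via a filter.
// In a real theme you would register it like this:
//   import { addFilter } from "@wordpress/hooks";
//   addFilter("blocks.registerBlockType", "my-theme/with-animation", withAnimationAttribute);
function withAnimationAttribute(settings, name) {
  if (name !== "core/image") {
    return settings; // leave every other block untouched
  }
  return {
    ...settings,
    attributes: {
      ...settings.attributes,
      animation: { type: "string", default: "none" },
    },
  };
}

// Plain-object demonstration of what the filter does:
const imageSettings = { attributes: { url: { type: "string" } } };
const filtered = withAnimationAttribute(imageSettings, "core/image");
console.log(Object.keys(filtered.attributes)); // [ 'url', 'animation' ]
```

The filter returns a new settings object rather than mutating the original, which is the expected contract for `blocks.registerBlockType` filters.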
silviamalavasi
1,901,675
A More Straightforward Guide to Solving Two Sum.
When approaching the classic two sum problem you can answer in a few different ways. You can attempt...
0
2024-06-26T17:32:18
https://dev.to/shavonharrisdev/a-more-straightforward-guide-to-solving-two-sum-2gbi
When approaching the classic two sum problem, you can solve it in a few different ways. You can attempt a brute-force method in which you consider every possible pair. This would be done using a nested for loop, which is not an efficient runtime. Or, you can do the one-pass method using a hashmap, which is much more efficient. ## Brute Force Given an array, let's use ``` let array = [2,1, 5, 3] let target = 4 ``` We are looking for 2 numbers in this array that add up to the target 4. 2 + 1 = 3 2 + 5 = 7 2 + 3 = 5 After iterating through the array once using the 0th index, we can see that 2 is not going to help us reach the target. So we pass through again using the 1st index. 1 + 5 = 6 1 + 3 = 4 Finally, we find two numbers that add up to the target: 1 and 3. In the array, the number 1 is at index 1, and the number 3 is at index 3. Therefore, the answer is [1, 3], indicating that the numbers at indices 1 and 3 of the array sum up to the target value of 4! Now imagine the array looked like [2, 5, 1, 3] instead. We would have had to iterate through the array 3 times. That is far too much work, and there is a more efficient method. Enter hashmaps. ## What are Hashmaps? A hashmap is a data structure that stores data in pairs of keys and values. It allows for fast retrieval of values based on their associated keys. That's the textbook definition and a bit confusing, so let's try to visualize this. Imagine you're in a library looking for specific books. Each book has a unique identifier, like a number, known as a 'key.' When you need a particular book, you simply look up its key, and voila, you've found it. This key—whether it's the book title or a number—quickly directs you to the book you're searching for. This method is much faster than searching through every book one by one. ## Here's the Plan: Now that we know what a hashmap is, we can use one to solve our problem. 1. Traverse the list once. 2. For each number, calculate its complement (target - current number). 3. 
Check if this complement exists in our hashmap. 4. If it doesn’t, store the current number with its index. 5. If it does, return the current index and the stored index from the hashmap. ## Problem Statement Given an array of integers and a target integer, your task is to return the indices of two numbers in the array that add up to the target. ### Example: Input: nums = [2, 7, 11, 15], target = 9 Output: [0, 1] Explanation: nums[0] + nums[1] = 2 + 7 = 9 Note: you can return the indices in any order and there is only one answer. ## Dive into the code? Not quite. Let's draw this hashmap out! Hashmap value: index **Step 1**: We start with an empty hashmap, and for each number we check something very specific: - Does the difference between the target and the current number already exist as a key in our hashmap? **Step 2**: For the first number (2), the difference between the target (9) and 2 is 7. If we look in the hashmap: - Is 7 noted down already? Of course not, the hashmap is still empty. - So, write down 2 and where you found it (index 0). **Current Hashmap**: - 2: 0 **Step 3**: Move to the next number, 7. - The difference between the target (9) and the number (7) is 2. Is 2 in the hashmap? Yes, and it's noted as being at index 0! - This means you've found the two numbers that add up to the target: 2 (from index 0) and 7 (the number you're currently looking at, at index 1). **Updated Hashmap**: 2: 0 7: 1 Because you found both components of the sum, you can now return their indices: [0, 1]. These indices tell us where in the array each component of the target sum can be found. Yay! ## Let's dive into the code! Let's remind ourselves of our steps and problem. ### Problem Given an array of integers and a target integer, your task is to return the indices of two numbers in the array that add up to the target. ### Steps 1. Traverse the list once. 2. For each number, calculate its complement (target - current number). 3. 
Check if this complement exists in our hashmap. 4. If it doesn’t, store the current number with its index. 5. If it does, return the current index and the stored index from the hashmap. If we are going to traverse a list, we should definitely use some sort of loop. I will utilize a for loop to iterate through the array just like I did in the visualization. ![Image of the twosum walk through](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/fajj58u63suwitt3pzbg.png) The two approaches to solving the 2-sum problem—brute force and the one-pass hashmap method—demonstrate the power of choosing the right data structure for the job. While the brute force method is straightforward, it's not efficient, especially with larger arrays. The hashmap approach, on the other hand, optimizes the process by reducing the number of comparisons needed to find the correct indices. This makes it an ideal choice for real-world applications. **Key Takeaways:** - **Efficiency**: Using hashmaps can drastically improve the efficiency of your code, reducing the time complexity from O(n²) with brute force to O(n) with hashmaps. - **Simplicity**: Despite their power, hashmaps are straightforward to implement and use. - **Practicality**: Understanding when and how to use hashmaps is a valuable skill in computer science, applicable to a wide range of problems beyond just the 2-sum issue. By now, you should have a solid understanding of how the two-sum problem can be efficiently solved using hashmaps! Hope you enjoyed! Happy Coding!
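The walkthrough above (shown only as an image in the original post) can be sketched in JavaScript; the function names and the use of a `Map` are my choices for illustration:

```javascript
// Brute force: try every pair — O(n^2) time.
function twoSumBruteForce(nums, target) {
  for (let i = 0; i < nums.length; i++) {
    for (let j = i + 1; j < nums.length; j++) {
      if (nums[i] + nums[j] === target) return [i, j];
    }
  }
  return []; // no pair adds up to the target
}

// One pass with a hashmap: O(n) time, O(n) space.
function twoSum(nums, target) {
  const seen = new Map(); // number -> index where we saw it
  for (let i = 0; i < nums.length; i++) {
    const complement = target - nums[i];
    if (seen.has(complement)) {
      return [seen.get(complement), i]; // stored index first, current index second
    }
    seen.set(nums[i], i);
  }
  return []; // no pair adds up to the target
}

console.log(twoSum([2, 7, 11, 15], 9)); // [ 0, 1 ]
console.log(twoSum([2, 1, 5, 3], 4));   // [ 1, 3 ]
```

Both functions return the same answer on these inputs; the hashmap version simply avoids re-scanning the array for each element.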
shavonharrisdev
1,901,674
7 Worst States To Buy Property in the Next 5 Years: A Comprehensive Guide
Investing in real estate can be a lucrative venture, but not all investments are created equal. As we...
0
2024-06-26T17:29:07
https://dev.to/axon_group_e1f753b73b4102/7-worst-states-to-buy-property-in-the-next-5-years-a-comprehensive-guide-4bei
california, florida, connecticut, ohio
Investing in real estate can be a lucrative venture, but not all investments are created equal. As we look towards the next five years, certain U.S. states present more challenges than opportunities for potential property buyers. High costs, economic instability, and environmental risks contribute to making these states less desirable for investors and homeowners alike. Here, we explore the[ 7 worst states to buy property in the next 5 years](https://blogking.uk/7-worst-states-to-buy-property-in-the-next-5-years/), helping you to navigate away from potential pitfalls in your real estate journey. 1. California: The Golden State's Losing Shine Why Avoid California? High Property Prices: California remains one of the most expensive states in the U.S. to buy property, with prices driven high by demand and not enough supply. Risk of Natural Disasters: From wildfires to earthquakes, the natural disaster risk significantly increases potential costs and uncertainties. Economic Volatility: The tech-driven economy can lead to boom and bust cycles that affect property values unpredictably. 2. New Jersey: The Costly Corridor Why Avoid New Jersey? Exorbitant Taxes: New Jersey has some of the highest property taxes in the country, which can be a significant financial burden. Stagnant Market Growth: The real estate market in New Jersey has shown sluggish growth, limiting potential returns on investment. Economic Challenges: With an economy struggling to rebound fully from financial setbacks, the future looks uncertain. 3. Illinois: The Fiscal Quagmire Why Avoid Illinois? Economic Instability: Illinois faces significant fiscal challenges, including high state debt and unfunded pension liabilities. Property Tax Burden: High property taxes increase the overall cost of owning a home, diminishing the attractiveness for investors and residents. 
Population Decline: With residents moving to more tax-friendly states, the demand for property is decreasing, which may lead to lower home values. 4. Connecticut: The Diminishing Prospect Why Avoid Connecticut? Economic Slowdown: Connecticut’s economy has been slow to recover from past recessions, impacting job growth and real estate demand. High Living Costs: The cost of living in Connecticut is higher than the national average, making it less attractive to new residents. Aging Population: An aging demographic and a shrinking workforce pose challenges for long-term real estate demand. 5. Florida: The Sunshine Mirage Why Avoid Florida? Hurricane Risk: Frequent hurricanes lead to higher insurance costs and risk of property damage. Market Oversaturation: Rapid construction in recent years could lead to an oversupply of properties, potentially depressing prices. Tourism Dependency: Florida’s economy heavily relies on tourism, which can fluctuate greatly due to external economic factors. 6. Ohio: The Rust Belt’s Struggle Why Avoid Ohio? Industrial Decline: Parts of Ohio are still struggling with the decline of traditional manufacturing industries, affecting local economies. Uneven Development: While some areas are revitalizing, others are experiencing decline, leading to a patchy real estate market. Low Growth Potential: Economic and demographic factors suggest limited growth potential in many parts of the state. 7. Mississippi: The Bottom Line Why Avoid Mississippi? Economic Challenges: Mississippi remains one of the poorest states in the U.S., with persistent economic challenges. Low Property Values: Property values in Mississippi are low and show little sign of significant appreciation. Quality of Life Issues: Quality of life indicators such as health services, education, and employment opportunities are below average, which can deter potential residents. 
Navigating the Real Estate Landscape Investing in real estate requires careful consideration of many factors, including economic conditions, property taxes, potential growth, and environmental risks. The states listed here represent particular challenges that might make them less desirable for buying property over the next five years. However, each state can have pockets of opportunity; it's crucial to conduct thorough research or consult with real estate professionals before making any investment decisions. By understanding the broader economic and regional factors, you can make more informed choices and steer your investments towards more secure and prosperous territories.
axon_group_e1f753b73b4102
1,901,671
Finding the Best Leather Minimalist Wallet in the USA
Hi everyone, I'm on the hunt for a Leather Minimalist Wallet here in the USA, and despite the...
0
2024-06-26T17:27:14
https://dev.to/lucast/finding-the-best-leather-minimalist-wallet-in-the-usa-20ib
Hi everyone, I'm on the hunt for a Leather Minimalist Wallet here in the USA, and despite the multitude of options available, I still haven't been able to pinpoint the best one. I'm looking for a wallet that combines both style and functionality, with a sleek design that doesn't compromise on quality. Ideally, it should be made from high-quality leather, offer enough space for a few cards and some cash, and be durable enough to withstand daily use. I've come across a few brands like Debin Leather and a few others for this type of [leather minimalist wallet](https://debinleather.com/collections/leather-minimalist-wallets), but I'm not entirely sure if these are the best choices. I've read mixed reviews about each, and it's hard to know which one truly stands out in terms of craftsmanship and longevity. Has anyone here had any great experiences with a particular brand or model? Any recommendations would be greatly appreciated. Additionally, if you know of any lesser-known brands that offer excellent minimalist wallets, I'd love to hear about those as well.
lucast
1,901,391
40 Days Of Kubernetes (5/40)
Day 5/40 What is Kubernetes Video Link @piyushsachdeva Git Repository My Git...
0
2024-06-26T17:24:42
https://dev.to/sina14/40-days-of-kubernetes-540-376d
kubernetes, 40daysofkubernetes
## Day 5/40 # What is Kubernetes [Video Link](https://www.youtube.com/watch?v=SGGkUCctL4I) @piyushsachdeva [Git Repository](https://github.com/piyushsachdeva/CKA-2024/) [My Git Repo](https://github.com/sina14/40daysofkubernetes) We're going to look at `Kubernetes` architecture in depth. 1. `Control Plane` components. 2. Why it's needed. 3. How do they work? ![Kubernetes_arch](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/hrumznwa567ftsmdmlo6.png) --- - `Node` is nothing but a virtual machine. - `Control Plane` is a virtual machine that hosts many administrative components, like the board of directors in a company! - The actual work is done by the `Worker Node`s. - `POD` is the smallest deployable unit that can be managed by `Kubernetes`; it can include one or multiple containers. - `API Server` acts as the central management point of the entire `Kubernetes` cluster. - `Scheduler` automatically decides how to distribute workloads across a cluster of servers. - `Controller Manager` handles all interactions with the `node controller`, `namespace controller`, `deployment controller`, `replication controller` and so on, and makes sure everything is up & running and monitored. - `etcd` is a key-value datastore which stores the data required to manage clusters. - `kubelet` functions as an agent within nodes and is responsible for running the pod lifecycle within each node. - `kube-proxy` controls traffic routing and network connectivity for services within the cluster. - `kubectl` is a command-line tool that communicates with the `kube-apiserver` and sends orders to the `control plane` node. --- ### Work Flow 1. `kubectl` to `api-server`: Please create the `pod`. 2. `api-server` validates and authenticates the request and notifies the `controller-manager`. 3. `controller-manager` tells the `api-server` it's OK to have a `POD`. 4. `api-server` writes the request in the `etcd` database. 5. `etcd` says to `api-server`: yes, it's done. 6. 
`api-server` tells the `scheduler` to watch the new request. 7. `scheduler` monitors and finds out there's a `pod` creation request. 8. `scheduler` says to `api-server`: I found a very special node for deploying the new `pod`. 9. `api-server` interacts with the `kubelet` of the selected `node`. 10. `kubelet` asks `docker` (or another container runtime) to run the `container` with the requested `image`. 11. `docker` says to `kubelet`: hey, deploy what I built. 12. `kubelet` runs the `pod` and reports back to `api-server` that the `pod` is deployed. 13. `api-server` updates the related record in the `etcd` database. 14. `api-server` sends the details to `kubectl` and the `user`. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/53okd63k478f28j2h2eu.png)
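For reference, the `pod` requested in step 1 of the workflow above typically starts from a small manifest like this (the pod name and image are illustrative), submitted with `kubectl apply -f pod.yaml`:

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: nginx-pod
spec:
  containers:
    - name: nginx
      image: nginx:1.25
```

Once the workflow completes, `kubectl get pods` reads the stored state back from `etcd` through the `api-server`.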
sina14
1,901,666
Exploring the main areas of Programming
Programming is a multifaceted and essential skill in the modern digital world, encompassing various...
0
2024-06-26T17:23:41
https://dev.to/annalaura2/exploring-the-main-areas-of-programming-248e
code, backend, frontend, programming
Programming is a multifaceted and essential skill in the modern digital world, encompassing various fields that allow developers to create, innovate, and solve problems in unique ways. In this text, we'll explore the main areas of programming, highlighting their characteristics, applications, and career opportunities. #### 1. **Web Development** Web development is one of the most popular and dynamic areas of programming, divided into two main subareas: front-end and back-end. - **Front-End**: Involves creating the user interface and visual experience of the site. Common technologies include HTML, CSS, and JavaScript, as well as frameworks like React, Angular, and Vue.js. Front-end developers work to ensure that websites are attractive and functional across all devices. - **Back-End**: Focuses on server logic, databases, and API integration. Common languages include Python, Java, Ruby, PHP, and Node.js. Frameworks like Django, Ruby on Rails, and Express.js are widely used. Back-end developers ensure that websites operate correctly and securely. #### 2. **Mobile Development** Mobile development is dedicated to creating applications for mobile devices. There are two main approaches: - **Native**: Involves creating applications specifically for an operating system, such as Swift or Objective-C for iOS and Kotlin or Java for Android. Native apps generally offer better performance and a more integrated user experience. - **Cross-Platform**: Uses frameworks like React Native, Flutter, and Xamarin to create apps that work on multiple operating systems from a single codebase. This approach can reduce development time and cost. #### 3. **Game Development** Game programming is a fascinating field that combines creativity and technical skills. It involves using game engines like Unity, Unreal Engine, and Godot to create games for various platforms, including consoles, PC, and mobile devices. Common languages include C#, C++, and Python. #### 4. 
**Artificial Intelligence and Machine Learning** Artificial intelligence (AI) and machine learning (ML) are transforming various industries. Developers in this area create algorithms that allow computers to learn and make decisions based on data. Popular languages include Python, R, and Java, and libraries like TensorFlow, Keras, and Scikit-Learn are widely used. #### 5. **Software Development and Systems Engineering** This area encompasses the development of software for desktops, servers, and embedded systems. It involves creating operating systems, enterprise software, productivity tools, and more. Common languages include C, C++, Java, and .NET. #### 6. **Database Development** Database developers focus on designing, implementing, and maintaining databases that store and organize large volumes of data. SQL is the standard language, but NoSQL databases like MongoDB and Cassandra are also gaining popularity. #### 7. **DevOps and Automation Engineering** DevOps is an approach that combines software development and IT operations with the aim of shortening the development life cycle and delivering high-quality software. Common tools include Docker, Kubernetes, Jenkins, and Ansible. DevOps promotes automation and continuous integration, facilitating collaboration between development and operations teams. #### 8. **Cybersecurity** Cybersecurity is crucial for protecting systems, networks, and data from attacks and unauthorized access. It involves creating secure systems, security auditing, vulnerability analysis, and incident response. Knowledge of cryptography, networks, and operating systems is essential, as well as specific security tools and techniques. ### Conclusion Programming offers a vast and diverse range of areas to explore, each with its own technologies, challenges, and opportunities. 
Whether your passion is creating engaging interfaces, developing mobile solutions, building immersive games, or working with AI, there's a path in programming that can satisfy your aspirations. The continuous technological evolution ensures that the demand for programming skills will only increase, making this a field full of potential for innovation and professional growth.
annalaura2
1,892,585
Part 10 — Controller Input
In this part, we will make the watch hidden by default and show it in a few seconds by pressing a...
0
2024-06-18T15:01:34
https://dev.to/kurohuku/part-10-controller-input-1mdm
In this part, we will make the watch hidden by default and show it for a few seconds by pressing a controller button. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/yj6umot4u9eizsgfkdjw.gif) There are various ways to get controller input with Unity. This time, we will use the [OpenVR Input API (SteamVR Input)](https://github.com/ValveSoftware/openvr/wiki/SteamVR-Input). ## Prerequisite settings Open the SteamVR Settings. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/t32iq53hn1p5nsfw9xd1.png) Toggle **Advanced Settings**. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/549oreg9jk784t4773m2.png) Toggle **Developer > Enable debugging options in the input binding user interface** to **On**. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/rigizcffoij3wiw2xntj.png) ## Create action manifest SteamVR Input uses abstracted actions instead of direct access to the physical buttons. The developer predefines actions and may create default mappings for each controller. Users can reassign physical buttons to the actions in the SteamVR Input settings window. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ffx83rfwp4ccllvbjtm7.png) First, create the action manifest that defines actions in JSON format. We use the Unity SteamVR Plugin GUI tool to generate the action manifest easily. ### Generate action manifest Select the Unity menu **Window > SteamVR Input**. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/m2jqigt1n9ewfr3h6ojt.png) It asks whether you want to use a sample action manifest file. This time we will build one from scratch, so select No. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/pifrl2hh0aou8fgewio9.png) Set the action set name to **“Watch”**, and set the dropdown below it to **per hand**. An application can have multiple action sets.
![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/stzacsek8vkgr7kpaddb.png) **Per hand** means different actions can be mapped to the left and right controllers’ buttons separately. **Mirrored** means actions are mapped on one controller only, and the other controller’s buttons mirror that mapping. Click the **NewAction** name in the **In** box of the **Actions** section. The **“Action Details”** panel is shown on the right. Change the **Name** to **“WakeUp”**. Click the **“Save and generate”** button at the bottom left to generate an action manifest file. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/s51rgvet3nafw0461u62.png) It will generate a new action manifest file as **StreamingAssets/SteamVR/actions.json**. ```json { "actions": [ { "name": "/actions/Watch/in/WakeUp", "type": "boolean" } ], "action_sets": [ { "name": "/actions/Watch", "usage": "leftright" } ], "default_bindings": [], "localization": [] } ``` ## Create default binding Click the **Open binding UI** button at the bottom right. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/tn2re0jybmn5y4meaww9.png) If no VR controller is shown, check that your HMD is recognized by SteamVR. Click **“Create New Binding”**. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/t03057rs5el9ociegwnt.png) Set the Y button to show the watch for a few seconds. If your controller doesn’t have a Y button, use another button instead. Click the **+** button to the right of the **Y Button**. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/eh80pb6voosgjzxl61lt.png) Select **BUTTON**. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/2l31lthf3uhv8rmlx71j.png) Select **None** to the right of **Click**, then select the **wakeup** action.
![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/jz5krz7saktu7qsi2s3n.png) ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/3m058df3manpz6fztkdw.png) Click the check icon at the bottom left to save the changes. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ridp4mr8ad3584mf78rr.png) Click **Replace Default Binding** at the bottom right. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/w4xv1v4sobexu5hxp2ss.png) Click **Save**. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/swt8f3p74vd2v7k3y8tt.png) At this point, a new default binding entry is added to the action manifest. `actions.json` ```diff { "action_sets" : [ { "name" : "/actions/Watch", "usage" : "leftright" } ], "actions" : [ { "name" : "/actions/Watch/in/WakeUp", "type" : "boolean" } ], "default_bindings" : [ { + "binding_url" : "application_generated_unity_retryoverlay_exe_binding_oculus_touch.json", + "controller_type" : "oculus_touch" } ], "localization" : [] } ``` The default binding file is generated into the same directory as actions.json. These files will be included in the build for users. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ffrm5hluq2zw6gcz99qn.png) Click the **<- BACK** button at the upper left, and you will see that the new binding is active. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/julogqv193yaue21jetr.png) Now we’ve made the default binding. Close the SteamVR Input window. Also, close the Unity SteamVR Input window. At this time, it asks you to save the changes, so click **Close**. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/hwnfj46azm4q8cskgigt.png) In this tutorial environment, clicking the Save button overwrites the action manifest with an empty **default_bindings**. (There is no problem with clicking Save when we open the binding setting window next time.)
--- ### How about other controller bindings? You can create other controllers’ default bindings by clicking the controller name on the right side. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/txjz55r3v1x0suwza0yh.png) Other controllers’ default bindings are added to the **default_bindings** parameter of the action manifest. ```diff { "action_sets" : [ { "name" : "/actions/NewSet", "usage" : "leftright" } ], "actions" : [ { "name" : "/actions/NewSet/in/NewAction", "type" : "boolean" } ], "default_bindings" : [ { "binding_url" : "application_generated_unity_steamvr_inputbindingtest_exe_binding_oculus_touch.json", "controller_type" : "oculus_touch" }, + { + "binding_url" : "application_generated_unity_steamvr_inputbindingtest_exe_binding_vive_controller.json", + "controller_type" : "vive_controller" + } ], "localization" : [] } ``` You don’t necessarily have to create default bindings for other controllers, because SteamVR automatically remaps actions for other controllers if at least one default binding exists. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/9e7al8ucfkf8vctwfcow.png) --- ### Create script Create a new script `InputController.cs` in the `Scripts` folder. We will add the controller-input-related code to this file. In the hierarchy, **right click > Create Empty** to create an empty object, and change the object name to **InputController**. Drag `InputController.cs` from the project window onto the **InputController** object. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/9o5rzzptco92fysdyp4l.png) ### Initialize and cleanup OpenVR Copy the code below into **InputController.cs**.
```cs using System; using UnityEngine; using Valve.VR; public class InputController : MonoBehaviour { private void Start() { OpenVRUtil.System.InitOpenVR(); } private void OnDestroy() { OpenVRUtil.System.ShutdownOpenVR(); } } ``` ## Set action manifest path Set the action manifest path with [SetActionManifestPath()](https://valvesoftware.github.io/steamvr_unity_plugin/api/Valve.VR.CVRInput.html#Valve_VR_CVRInput_SetActionManifestPath_System_String_) at launch. (Read the [wiki](https://github.com/ValveSoftware/openvr/wiki/SteamVR-Input#api-documentation) for details.) Note that the cleanup method must be named `OnDestroy()` so Unity calls it automatically. The action manifest was generated as **StreamingAssets/SteamVR/actions.json**, so we pass that path. ```diff using System; using UnityEngine; using Valve.VR; public class InputController : MonoBehaviour { private void Start() { OpenVRUtil.System.InitOpenVR(); + var error = OpenVR.Input.SetActionManifestPath(Application.streamingAssetsPath + "/SteamVR/actions.json"); + if (error != EVRInputError.None) + { + throw new Exception("Failed to set action manifest path: " + error); + } } private void OnDestroy() { OpenVRUtil.System.ShutdownOpenVR(); } } ``` ## Get action set handle An application can have several action sets, each containing multiple actions. We use an action set handle to determine which action set is used. Get the action set handle with [GetActionSetHandle()](https://valvesoftware.github.io/steamvr_unity_plugin/api/Valve.VR.CVRInput.html#Valve_VR_CVRInput_GetActionSetHandle_System_String_System_UInt64__). The action set is specified with the string **/actions/[action_set_name]**. This time, it’s **/actions/Watch**.
```diff public class InputController : MonoBehaviour { + ulong actionSetHandle = 0; private void Start() { OpenVRUtil.System.InitOpenVR(); var error = OpenVR.Input.SetActionManifestPath(Application.streamingAssetsPath + "/SteamVR/actions.json"); if (error != EVRInputError.None) { throw new Exception("Failed to set action manifest path: " + error); } + error = OpenVR.Input.GetActionSetHandle("/actions/Watch", ref actionSetHandle); + if (error != EVRInputError.None) + { + throw new Exception("Failed to get action set /actions/Watch: " + error); + } } private void OnDestroy() { OpenVRUtil.System.ShutdownOpenVR(); } } ``` ## Get action handle Next, get the action handle with [GetActionHandle()](https://valvesoftware.github.io/steamvr_unity_plugin/api/Valve.VR.CVRInput.html#Valve_VR_CVRInput_GetActionHandle_System_String_System_UInt64__). An action is specified with **/actions/[action_set_name]/in/[action_name]**. This time, the action is **/actions/Watch/in/WakeUp**. ```diff public class InputController : MonoBehaviour { ulong actionSetHandle = 0; + ulong actionHandle = 0; private void Start() { OpenVRUtil.System.InitOpenVR(); var error = OpenVR.Input.SetActionManifestPath(Application.streamingAssetsPath + "/SteamVR/actions.json"); if (error != EVRInputError.None) { throw new Exception("Failed to set action manifest path: " + error); } error = OpenVR.Input.GetActionSetHandle("/actions/Watch", ref actionSetHandle); if (error != EVRInputError.None) { throw new Exception("Failed to get action set /actions/Watch: " + error); } + error = OpenVR.Input.GetActionHandle("/actions/Watch/in/WakeUp", ref actionHandle); + if (error != EVRInputError.None) + { + throw new Exception("Failed to get action /actions/Watch/in/WakeUp: " + error); + } } private void OnDestroy() { OpenVRUtil.System.ShutdownOpenVR(); } } ``` ## Update action state Update each action state at the start of each frame. ### Prepare action set to update We use an action set to specify which actions will be updated.
Pass the target action sets as an array of [VRActiveActionSet_t](https://valvesoftware.github.io/steamvr_unity_plugin/api/Valve.VR.VRActiveActionSet_t.html). This time, the only action set is **Watch**, so we make a single-element `VRActiveActionSet_t[]` array, then update the action state with `UpdateActionState()`. ```diff private void Start() { OpenVRUtil.System.InitOpenVR(); var error = OpenVR.Input.SetActionManifestPath(Application.streamingAssetsPath + "/SteamVR/actions.json"); if (error != EVRInputError.None) { throw new Exception("Failed to set action manifest path: " + error); } error = OpenVR.Input.GetActionSetHandle("/actions/Watch", ref actionSetHandle); if (error != EVRInputError.None) { throw new Exception("Failed to get action set /actions/Watch: " + error); } error = OpenVR.Input.GetActionHandle("/actions/Watch/in/WakeUp", ref actionHandle); if (error != EVRInputError.None) { throw new Exception("Failed to get action /actions/Watch/in/WakeUp: " + error); } } + private void Update() + { + // Action set list to update + var actionSetList = new VRActiveActionSet_t[] + { + // This time, the Watch action set only. + new VRActiveActionSet_t() + { + // Pass the Watch action set handle + ulActionSet = actionSetHandle, + ulRestrictedToDevice = OpenVR.k_ulInvalidInputValueHandle, + } + }; + var activeActionSize = (uint)System.Runtime.InteropServices.Marshal.SizeOf(typeof(VRActiveActionSet_t)); + var error = OpenVR.Input.UpdateActionState(actionSetList, activeActionSize); + if (error != EVRInputError.None) + { + throw new Exception("Failed to update action state: " + error); + } + } ... ``` The second argument `activeActionSize` is the byte size of the `VRActiveActionSet_t` struct. ## Get action value We have updated the action state. Next, we will get the action value. The function used to retrieve the action value varies depending on the action type. ### GetDigitalActionData() This gets an on/off value, such as whether a button is pushed or not. ### GetAnalogActionData() This gets analog values, such as the thumbstick direction or how far the trigger is pulled. ### GetPoseActionData() This gets a pose, such as the controller position and rotation. This time, we want to get the button on/off value, so we use `GetDigitalActionData()`. Get the action value with the WakeUp action handle we prepared.
```diff private void Update() { var actionSetList = new VRActiveActionSet_t[] { new VRActiveActionSet_t() { ulActionSet = actionSetHandle, ulRestrictedToDevice = OpenVR.k_ulInvalidInputValueHandle, } }; var activeActionSize = (uint)System.Runtime.InteropServices.Marshal.SizeOf(typeof(VRActiveActionSet_t)); var error = OpenVR.Input.UpdateActionState(actionSetList, activeActionSize); if (error != EVRInputError.None) { throw new Exception("Failed to update action state: " + error); } + var result = new InputDigitalActionData_t(); + var digitalActionSize = (uint)System.Runtime.InteropServices.Marshal.SizeOf(typeof(InputDigitalActionData_t)); + error = OpenVR.Input.GetDigitalActionData(actionHandle, ref result, digitalActionSize, OpenVR.k_ulInvalidInputValueHandle); + if (error != EVRInputError.None) + { + throw new Exception("Failed to get WakeUp action data: " + error); + } } ``` The third argument `digitalActionSize` is the byte size of the `InputDigitalActionData_t` structure. The fourth argument `ulRestrictToDevice` is generally not used, so we pass `OpenVR.k_ulInvalidInputValueHandle`. The returned value type is [InputDigitalActionData_t](https://valvesoftware.github.io/steamvr_unity_plugin/api/Valve.VR.InputDigitalActionData_t.html). ```cs struct InputDigitalActionData_t { bool bActive; VRInputValueHandle_t activeOrigin; bool bState; bool bChanged; float fUpdateTime; }; ``` The action on/off state is stored in `bState`. `bChanged` is `true` in the frame where the action state changed. Let’s detect the action.
```diff private void Update() { var actionSetList = new VRActiveActionSet_t[] { new VRActiveActionSet_t() { ulActionSet = actionSetHandle, ulRestrictedToDevice = OpenVR.k_ulInvalidInputValueHandle, } }; var activeActionSize = (uint)System.Runtime.InteropServices.Marshal.SizeOf(typeof(VRActiveActionSet_t)); var error = OpenVR.Input.UpdateActionState(actionSetList, activeActionSize); if (error != EVRInputError.None) { throw new Exception("Failed to update action state: " + error); } var result = new InputDigitalActionData_t(); var digitalActionSize = (uint)System.Runtime.InteropServices.Marshal.SizeOf(typeof(InputDigitalActionData_t)); error = OpenVR.Input.GetDigitalActionData(actionHandle, ref result, digitalActionSize, OpenVR.k_ulInvalidInputValueHandle); if (error != EVRInputError.None) { throw new Exception("Failed to get WakeUp action data: " + error); } + if (result.bState && result.bChanged) + { + Debug.Log("WakeUp is executed"); + } } ``` Run the program and push the Y button; the message is logged to the Unity console. ## Create event handler ### Notify via UnityEvent Add a `UnityEvent` to **InputController.cs** to notify listeners of the WakeUp action, then call it with `Invoke()`. ```diff using System; using UnityEngine; using Valve.VR; + using UnityEngine.Events; public class InputController : MonoBehaviour { + public UnityEvent OnWakeUp; private ulong actionSetHandle = 0; private ulong actionHandle = 0; // code omitted.
private void Update() { var actionSetList = new VRActiveActionSet_t[] { new VRActiveActionSet_t() { ulActionSet = actionSetHandle, ulRestrictedToDevice = OpenVR.k_ulInvalidInputValueHandle, } }; var activeActionSize = (uint)System.Runtime.InteropServices.Marshal.SizeOf(typeof(VRActiveActionSet_t)); var error = OpenVR.Input.UpdateActionState(actionSetList, activeActionSize); if (error != EVRInputError.None) { throw new Exception("Failed to update action state: " + error); } var result = new InputDigitalActionData_t(); var digitalActionSize = (uint)System.Runtime.InteropServices.Marshal.SizeOf(typeof(InputDigitalActionData_t)); error = OpenVR.Input.GetDigitalActionData(actionHandle, ref result, digitalActionSize, OpenVR.k_ulInvalidInputValueHandle); if (error != EVRInputError.None) { throw new Exception("Failed to get WakeUp action data: " + error); } if (result.bState && result.bChanged) { - Debug.Log("WakeUp is executed"); + OnWakeUp.Invoke(); } } ``` ### Attach event handler Add a method to `WatchOverlay.cs` that will be called when the `WakeUp` action is executed. ```diff public class WatchOverlay : MonoBehaviour { ... + public void OnWakeUp() + { + // Show watch. + } } ``` In the hierarchy, open the `InputController` inspector, then assign `WatchOverlay.OnWakeUp()` to the `OnWakeUp` event field. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/92wrvf65j3rp9p6raomq.png) ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/hg4yhsq0pa2offp7rat0.png) ## Add code to show/hide the watch Update **WatchOverlay.cs** to show the watch only when the `WakeUp` action is executed.
### Hide watch by default ```diff private void Start() { OpenVRUtil.System.InitOpenVR(); overlayHandle = Overlay.CreateOverlay("WatchOverlayKey", "WatchOverlay"); Overlay.FlipOverlayVertical(overlayHandle); Overlay.SetOverlaySize(overlayHandle, size); - Overlay.ShowOverlay(overlayHandle); } ``` ### Show watch by WakeUp action Show overlay when the WakeUp action is executed. ```diff public void OnWakeUp() { + Overlay.ShowOverlay(overlayHandle); } ``` ### Add hide method Add `HideOverlay()` to **OpenVRUtil.cs**. ```diff public static void ShowOverlay(ulong handle) { var error = OpenVR.Overlay.ShowOverlay(handle); if (error != EVROverlayError.None) { throw new Exception("Failed to show overlay: " + error); } } + public static void HideOverlay(ulong handle) + { + var error = OpenVR.Overlay.HideOverlay(handle); + if (error != EVROverlayError.None) + { + throw new Exception("Failed to hide overlay: " + error); + } + } public static void SetOverlayFromFile(ulong handle, string path) { var error = OpenVR.Overlay.SetOverlayFromFile(handle, path); if (error != EVROverlayError.None) { throw new Exception("Failed to draw image file: " + error); } } ``` ### Hide watch after three seconds This time, we use Unity Coroutine to wait three seconds. `WatchOverlay.cs` ```diff using System; + using System.Collections; using UnityEngine; using Valve.VR; using OpenVRUtil; public class WatchOverlay : MonoBehaviour { public Camera camera; public RenderTexture renderTexture; public ETrackedControllerRole targetHand = ETrackedControllerRole.RightHand; private ulong overlayHandle = OpenVR.k_ulOverlayHandleInvalid; + private Coroutine sleepCoroutine; ... 
public void OnWakeUp() { Overlay.ShowOverlay(overlayHandle); + if (sleepCoroutine != null) + { + StopCoroutine(sleepCoroutine); + } + sleepCoroutine = StartCoroutine(Sleep()); } + private IEnumerator Sleep() + { + yield return new WaitForSeconds(3); + Overlay.HideOverlay(overlayHandle); + } } ``` Run the program and check that the watch is shown for three seconds when the Y button is pushed. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/kr69g2zdqhm0pupvl5w3.gif) ## Complete 🎉 The watch application is complete. This tutorial ends here, so the final code organization is up to you. Let’s look back at what we covered in this tutorial. - Initialize OpenVR - Create overlay - Draw image file - Change overlay size and position - Follow devices - Draw camera output - Create dashboard overlay - Process events - Controller input We learned the basics of overlay application development. OpenVR has many more APIs, so why not try building your own tools? The next page contains additional information for more details.
kurohuku
1,901,638
AI in Cybersecurity: Improving Data Security Services
Artificial intelligence (AI) integration has significantly transformed numerous industries, with...
0
2024-06-26T17:22:08
https://dev.to/calsoftinc/ai-in-cybersecurity-improving-data-security-services-5028
ai, cybersecurity, database, security
Artificial intelligence (AI) integration has significantly transformed numerous industries, with cybersecurity being one of the most profoundly affected. AI-powered data security services can now protect sensitive data from sophisticated cyberattacks. By utilizing AI's improved threat detection, real-time monitoring, and automated responses, organizations can safeguard sensitive data against sophisticated cyber assaults. Because AI can examine massive amounts of data, identify trends, and anticipate future vulnerabilities, it plays a critical role in data security services by enabling a proactive approach to cybersecurity. With the use of AI, companies can stay ahead of possible threats and ensure the security of their data. This technological breakthrough not only boosts the effectiveness of individual security measures but also increases the overall resilience of data protection strategies. As cyber assaults become more complex, incorporating AI into data security products is crucial for preserving robust defences and protecting sensitive data's integrity and security. Read this blog to understand how AI integration in cybersecurity practices enhances data security services. ## The Role of AI in Cybersecurity Artificial intelligence plays an essential role in modern cybersecurity frameworks. It offers advanced capabilities that significantly enhance **[data security services](https://www.calsoftinc.com/technology/security/)**. AI systems can quickly examine large amounts of data, identify patterns, and detect anomalies that might indicate a cyber threat. These systems learn from every interaction, constantly improving their threat detection and response mechanisms. ## Benefits of AI in Data Security Services ### 1. Advanced Threat Detection: AI-driven data security services can discover threats faster than traditional methods. By analysing patterns and behaviours, AI can recognize unusual activities that could indicate a cyberattack.
This proactive technique helps mitigate risks before they escalate. ### 2. Real-Time Monitoring and Response: AI systems offer real-time monitoring, permitting an immediate response to potential threats. This capability is important in minimizing damage and ensuring that data breaches are contained swiftly. ### 3. Automated Incident Response: With AI, data security services can automate incident response processes. This automation reduces the time needed to deal with security breaches, ensuring a faster and more efficient resolution. ## Implementation of AI in Data Security Services Implementing AI in data security services involves integrating advanced algorithms for real-time threat detection, anomaly analysis, and automated incident response. AI systems process large volumes of data to identify patterns and predict vulnerabilities, improving the efficiency and accuracy of security measures. Organizations must ensure continuous monitoring and updating of AI models to adapt to new threats. Combining AI with human expertise can further enhance threat evaluation and response strategies, providing robust protection against sophisticated cyber threats. This integration is critical for maintaining the integrity and confidentiality of sensitive data in an increasingly complex cyber landscape. ## Key Applications of AI in Data Security Services ### 1. Behavioural Analytics: AI utilizes behavioural analytics to monitor user activities and detect deviations from normal behaviour. This technique helps identify insider threats and compromised accounts. ### 2. Threat Intelligence: AI-powered threat intelligence systems collect and analyse data from various sources to predict and prevent cyberattacks. These systems can offer actionable insights, helping businesses strengthen their security posture. ### 3. Network Security: AI enhances network security by monitoring traffic patterns and detecting anomalies.
It can identify suspicious activities, including unauthorized access attempts and data exfiltration, ensuring robust protection for networked systems. ## Challenges in AI-Driven Data Security Services While AI offers numerous benefits, its implementation in data security services also presents challenges. One significant challenge is the risk of false positives, where legitimate activities are mistakenly flagged as threats. Additionally, AI systems require massive datasets to train effectively, and ensuring the quality and relevance of this data can be challenging. ## Overcoming Challenges in AI Integration To address these challenges, companies must adopt a balanced approach. Combining AI with human expertise can enhance the accuracy of threat detection and response. Continuous monitoring and updating of AI systems are essential to adapt to new and emerging threats. ## The Future of AI in Data Security Services The future of AI in data security services is promising. AI is set to revolutionize data security by providing advanced threat detection, real-time monitoring, and automated response capabilities. AI-driven data security services will enhance the capacity to predict and prevent cyber threats, making data security more efficient and effective. As AI technology evolves, it will permit more sophisticated security measures, ensuring strong protection mechanisms against more complex cyber threats. Integrating AI into data security services is crucial for companies seeking to preserve a robust security posture in the digital age. By leveraging AI, data security services can provide unparalleled protection, ensuring the safety and integrity of sensitive data in an ever-evolving cyber-threat landscape. ## Conclusion Integrating AI into data security services not only improves protection but also ensures that enterprises can react quickly to the ever-changing landscape of cybersecurity threats.
Embracing AI in cybersecurity is critical for maintaining strong data security and protecting against advanced assaults. Calsoft provides **[AI services](https://www.calsoft.ai/)** that deliver more powerful and efficient data security services, safeguarding the security and integrity of clients' data. As AI technology progresses, its role in data protection services will expand, bringing new and inventive solutions to prevent cybercrime. Calsoft is at the forefront of this transformation, offering cutting-edge AI-powered data protection solutions whose benefits far outweigh any potential drawbacks. Calsoft is a valuable partner in modern cybersecurity strategy, helping to protect sensitive information.
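The behavioural analytics idea described above — build a baseline of normal user activity, then flag deviations — can be sketched with simple statistics. This is a minimal illustration, not a production detector; the threshold and login counts are hypothetical:

```python
# Flag activity that deviates strongly from a user's historical baseline.
# A z-score threshold stands in for the learned model, purely for illustration.
from statistics import mean, stdev

def is_anomalous(history, new_value, threshold=3.0):
    """True if new_value is more than `threshold` standard deviations from the mean."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return new_value != mu
    return abs(new_value - mu) / sigma > threshold

# Hypothetical daily login counts for one account.
daily_logins = [4, 5, 3, 6, 4, 5, 4, 6, 5, 4]

print(is_anomalous(daily_logins, 5))    # typical day -> False
print(is_anomalous(daily_logins, 40))   # sudden burst of activity -> True
```

Real behavioural-analytics systems track many signals (time of day, location, resources accessed) and learn per-user baselines, but the flag-the-outlier principle is the same.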
calsoftinc
1,901,627
The Ultimate Guide to Mastering EC2 Compute Instances and a Step-by-Step Deployment Project
Introduction Amazon Elastic Compute Cloud (EC2) is a cornerstone of Amazon Web Services...
0
2024-06-26T17:20:38
https://dev.to/hacker_haii/the-ultimate-guide-to-mastering-ec2-compute-instances-and-a-step-by-step-deployment-project-3o3e
aws, devops, virtualmachine, cloud
## **Introduction** Amazon Elastic Compute Cloud (EC2) is a cornerstone of Amazon Web Services (AWS), providing resizable compute capacity in the cloud. EC2 allows users to launch virtual servers, known as instances, which can be configured to meet specific needs. This flexibility makes EC2 a critical tool in cloud computing, enabling scalable and cost-effective solutions for a variety of applications. ## Types of EC2 Instances AWS offers a wide range of EC2 instance types, each optimized for different workloads. Understanding these options helps you choose the right instance for your specific needs. ## General Purpose General Purpose instances are versatile and provide a balanced mix of compute, memory, and networking resources. They are ideal for a wide range of applications. - **t3**: Burstable performance instances that provide a baseline level of CPU performance with the ability to burst to higher levels. Suitable for general-purpose workloads such as web servers, small databases, and development environments. - **m5**: Offer a balance of compute, memory, and networking resources. These instances are ideal for applications like web servers, app servers, and back-end servers for enterprise applications. ## Compute Optimized Compute Optimized instances are designed for compute-bound applications that benefit from high-performance processors. - **c5**: Provide a high ratio of compute to memory and are ideal for CPU-intensive tasks such as high-performance web servers, scientific modeling, and batch processing workloads. ## Memory Optimized Memory Optimized instances are designed to deliver fast performance for workloads that process large datasets in memory. - **r5**: These instances are optimized for memory-intensive applications such as high-performance databases, big data analytics, in-memory caching, and real-time processing of large datasets.
## Storage Optimized Storage Optimized instances are designed for workloads that require high, sequential read and write access to large datasets on local storage. - **i3**: Ideal for storage-heavy workloads, these instances offer high IOPS for databases, data warehousing, Elasticsearch, and other data-intensive applications. ## Accelerated Computing Accelerated Computing instances use hardware accelerators, or co-processors, to perform functions such as floating-point number calculations, graphics processing, or data pattern matching more efficiently than software running on general-purpose CPUs. - **p3**: These instances are optimized for machine learning, high-performance computing (HPC), computational fluid dynamics, computational finance, seismic analysis, speech recognition, autonomous vehicles, and video transcoding. ## Availability Zones and Regions - **Availability Zones (AZs)**: Physically separate data centers within an AWS region. Each AZ is isolated but interconnected with other AZs in the region through low-latency links. - **Regions**: Geographically separate areas that contain multiple availability zones. Each region operates independently and is designed to be isolated from failures in other regions. ## Importance Selecting the right region and availability zone is crucial for optimizing latency, ensuring compliance with local regulations, and achieving high availability and fault tolerance. ## Selection Criteria - **Latency**: Choose a region geographically close to your users for lower latency. - **Compliance**: Ensure the region meets any data residency requirements. - **Cost**: Pricing can vary by region, so consider cost-efficiency when selecting a region. - **Service Availability**: Not all AWS services are available in every region, so ensure the required services are supported in your chosen region.
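The selection criteria above can be folded into a simple scoring pass. A toy sketch — the region names are real AWS regions, but every latency and cost figure below is invented for illustration; in practice you would measure latency from your users and read prices from the AWS pricing pages:

```python
# Pick a region by weighing measured latency against relative cost.
# All numbers are hypothetical placeholders.

regions = {
    "us-east-1":      {"latency_ms": 120, "cost_index": 1.00},
    "eu-west-1":      {"latency_ms": 35,  "cost_index": 1.05},
    "ap-southeast-1": {"latency_ms": 210, "cost_index": 1.10},
}

def score(info, latency_weight=1.0, cost_weight=50.0):
    # Lower is better: latency in milliseconds plus a weighted cost penalty.
    return latency_weight * info["latency_ms"] + cost_weight * info["cost_index"]

best = min(regions, key=lambda name: score(regions[name]))
print(best)  # eu-west-1 wins here despite a slightly higher cost index
```

Compliance and service availability are hard constraints rather than weights: filter out non-compliant or incomplete regions first, then score what remains.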
## Step-by-Step Project: Deploying Jenkins on AWS ## Step 1: Creating an Ubuntu EC2 Instance - Launch EC2 Instance: - Sign in to the AWS Management Console. - Navigate to EC2 Dashboard. - Click on “Launch Instance.” - Choose the “Ubuntu Server 20.04 LTS” AMI. - Select an instance type (e.g., t3.medium). - Configure instance details, add storage, and tag the instance. - Configure security group: Allow SSH (port 22), HTTP (port 80), and Jenkins (port 8080). - Review and launch the instance, and download the key pair for SSH access. **2. Configure Security Groups:** Ensure your security group allows inbound traffic on the necessary ports (SSH, HTTP, and Jenkins on port 8080). ## Step 2: Connecting to the EC2 Instance - Open Terminal and SSH into Instance: ``` ssh -i /path/to/your-key-pair.pem ubuntu@your-ec2-public-dns ``` **2. Troubleshooting Tips:** Ensure the key pair file permissions are set to read-only: ``` chmod 400 /path/to/your-key-pair.pem ``` ## Step 3: Installing Jenkins - Update Package List and Install Java: ``` sudo apt update sudo apt install openjdk-11-jdk -y ``` - Install Jenkins: ``` wget -q -O - https://pkg.jenkins.io/debian/jenkins.io.key | sudo apt-key add - sudo sh -c 'echo deb http://pkg.jenkins.io/debian-stable binary/ > /etc/apt/sources.list.d/jenkins.list' sudo apt update sudo apt install jenkins -y ``` - Start Jenkins: ``` sudo systemctl start jenkins sudo systemctl enable jenkins ``` ## Step 4: Configuring Jenkins - Unlock Jenkins: _Open a web browser and navigate to http://your-ec2-public-dns:8080. Retrieve the initial admin password:_ ``` sudo cat /var/lib/jenkins/secrets/initialAdminPassword ``` - Complete Setup: _Follow the on-screen instructions to install suggested plugins. Create the first admin user._ ## Step 5: Securing the Jenkins Instance - Configure Firewall: _Update your security group to restrict access to Jenkins.
Set up rules to allow only specific IP addresses._ - Set Up HTTPS: _Install and configure a reverse proxy (e.g., Nginx) to handle HTTPS connections._ ** ## Conclusion ** Amazon EC2 offers a versatile and powerful platform for deploying a wide range of applications. Understanding the different instance types and how to effectively use availability zones and regions can help you optimize performance and cost. By following the detailed steps to deploy Jenkins on an EC2 instance, you can leverage the full potential of AWS for your development and operational needs. Experimenting with different configurations will allow you to tailor your environment to best meet your specific requirements.
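As a follow-up to Step 5 above, here is a minimal sketch of an Nginx reverse-proxy configuration for terminating HTTPS in front of Jenkins. The domain name, certificate paths, and file location are placeholder assumptions; in practice you would obtain a certificate first (e.g., via Let's Encrypt) and adapt the names to your setup.

```nginx
# Hypothetical /etc/nginx/sites-available/jenkins (adjust names and paths)
server {
    listen 80;
    server_name jenkins.example.com;
    # Redirect all plain-HTTP traffic to HTTPS
    return 301 https://$host$request_uri;
}

server {
    listen 443 ssl;
    server_name jenkins.example.com;

    ssl_certificate     /etc/letsencrypt/live/jenkins.example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/jenkins.example.com/privkey.pem;

    location / {
        # Jenkins listens on port 8080 by default
        proxy_pass         http://127.0.0.1:8080;
        proxy_set_header   Host              $host;
        proxy_set_header   X-Real-IP         $remote_addr;
        proxy_set_header   X-Forwarded-For   $proxy_add_x_forwarded_for;
        proxy_set_header   X-Forwarded-Proto $scheme;
    }
}
```

After placing the file, symlink it into `sites-enabled`, validate with `sudo nginx -t`, and reload Nginx; then restrict the security group so port 8080 is only reachable from localhost.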
hacker_haii
1,901,661
Explorando as principais áreas da programação
Explorando as Principais Áreas da Programação A programação é uma habilidade multifacetada...
0
2024-06-26T17:20:34
https://dev.to/annalaura2/explorando-as-principais-areas-da-programacao-f9l
programming, techtalks, backend, frontend
### Exploring the Main Areas of Programming

Programming is a multifaceted skill essential to the modern digital world, spanning several areas that allow developers to create, innovate, and solve problems in unique ways. In this article, we will explore the main areas of programming, highlighting their characteristics, applications, and career opportunities.

#### 1. **Web Development**

Web development is one of the most popular and dynamic areas of programming, divided into two main subareas: front-end and back-end.

- **Front-End**: Involves building the user interface and the visual experience of a site. Common technologies include HTML, CSS, and JavaScript, along with frameworks such as React, Angular, and Vue.js. Front-end developers work to ensure that sites are attractive and functional on every device.
- **Back-End**: Focuses on server logic, databases, and API integration. Common languages include Python, Java, Ruby, PHP, and Node.js. Frameworks such as Django, Ruby on Rails, and Express.js are widely used. Back-end developers ensure that sites work correctly and securely.

#### 2. **Mobile Development**

Mobile development is dedicated to building applications for mobile devices. There are two main approaches:

- **Native**: Involves building applications specifically for one operating system, such as Swift or Objective-C for iOS and Kotlin or Java for Android. Native apps generally offer better performance and a more integrated user experience.
- **Cross-Platform**: Uses frameworks such as React Native, Flutter, and Xamarin to build applications that run on multiple operating systems from a single codebase. This approach can reduce development time and cost.

#### 3. **Game Development**

Game programming is a fascinating area that combines creativity and technical skill. It involves using game engines such as Unity, Unreal Engine, and Godot to build games for several platforms, including consoles, PC, and mobile devices. Common languages include C#, C++, and Python.

#### 4. **Artificial Intelligence and Machine Learning**

Artificial intelligence (AI) and machine learning (ML) are transforming many industries. Developers in this area create algorithms that allow computers to learn and make decisions based on data. Popular languages include Python, R, and Java, and libraries such as TensorFlow, Keras, and Scikit-Learn are widely used.

#### 5. **Software Development and Systems Engineering**

This area covers software development for desktops, servers, and integrated systems. It involves building operating systems, enterprise software, productivity tools, and much more. Common languages include C, C++, Java, and .NET.

#### 6. **Database Development**

Database developers focus on the design, implementation, and maintenance of databases that store and organize large volumes of data. SQL is the standard language, but NoSQL databases such as MongoDB and Cassandra are also gaining popularity.

#### 7. **DevOps and Automation Engineering**

DevOps is an approach that combines software development and IT operations with the goal of shortening the development life cycle and delivering high-quality software. Common tools include Docker, Kubernetes, Jenkins, and Ansible. DevOps promotes automation and continuous integration, easing collaboration between development and operations teams.

#### 8. **Cybersecurity**

Cybersecurity is crucial for protecting systems, networks, and data against attacks and unauthorized access. It involves building secure systems, security auditing, vulnerability analysis, and incident response. Knowledge of cryptography, networking, and operating systems is essential, as are security-specific tools and techniques.

### Conclusion

Programming offers a vast and diverse range of areas to explore, each with its own technologies, challenges, and opportunities. Whatever your passion, whether building attractive interfaces, developing mobile solutions, creating engaging games, or working with AI, there is a path in programming that can satisfy your aspirations. Continuous technological evolution ensures that the demand for programming skills will only grow, making this a field full of potential for innovation and professional growth.
annalaura2
1,901,650
Dynamic DOM Element Creation : <template>
In HTML Create parent div &lt;!--------------- Parent div --------------&gt; ...
0
2024-06-26T17:19:35
https://dev.to/__khojiakbar__/dynamic-dom-element-creation--400k
javascript, dom, template
### In HTML

1. Create the parent div:

```
<!--------------- Parent div -------------->
<div id="parent" class="card m-5 mx-auto">
  <h1>Hello Template</h1>
</div>
```

2. Create the template:

```
<!----------------- Template tag ------------>
<template id="template">
  <div class="card d-flex">
    <img id="temp-img" src="https://picsum.photos/id/293/100/100" alt="smth">
    <h1>Template title</h1>
    <p>Lorem ipsum dolor sit amet, consectetur adipisicing elit. Animi, architecto dolor eius fugiat incidunt ipsa iure iusto officiis recusandae rerum!</p>
  </div>
</template>
```

### In CSS:

```
#parent {
  width: 43%;
}

.card {
  width: 500px;
  padding: 20px;
}

.card h1 {
  font-size: 50px;
}

#temp-img {
  margin: 0 auto;
  width: 400px;
  height: 300px;
}
```

### In Javascript:

- **Grab the elements from HTML**

```
const parent = document.querySelector('#parent');
const template = document.querySelector('#template');
```

- **Clone the template**

```
const clonedTemplate = template.content.cloneNode(true);
```

- **Append the cloned template to the parent**

```
parent.prepend(clonedTemplate);
```
__khojiakbar__
1,901,414
Who's the real boss? - Lead vs Manager
Lead vs Manager in the workplace: Who is who? Distinct Paths in Team Leadership The...
0
2024-06-26T14:44:28
https://dev.to/zoltan_fehervari_52b16d1d/lead-vs-manager-59ag
lead, manager, teamleadership, management
[Lead vs Manager](https://bluebirdinternational.com/lead-vs-manager/) in the workplace: Who is who?

## Distinct Paths in Team Leadership

The distinction between leads and managers is marked by their focus and scope of authority. **Leads** go deep into technical details, steering projects with their expertise. **Managers** cast a wider strategic net, aligning team efforts with overarching business goals.

## Technical Direction vs. Strategic Vision

Leads are the technical compasses of a project, often remaining hands-on with the work and mentoring others in their craft. Managers hold the reins of project direction and resource allocation.

## Comparing Leadership Roles

**Leads** concentrate on technical proficiency and day-to-day mentoring. **Managers** handle strategic planning and have the final say in decision-making.

## The Responsibilities Divide

Leads are in the trenches with their team, assigning tasks and ensuring technical quality. Managers have a bird’s-eye view, orchestrating multiple teams, managing budgets, and fostering stakeholder relations.

## Skill Sets for Success

Leads must possess deep technical knowledge and strong communication skills. Managers require a mix of strategic insight, project management, and interpersonal savvy.

## Who Guides the Team’s Course?

Leads may need managerial green lights for broader decisions, whereas managers are the decision-makers shaping the project’s journey.

## The Influence on Culture and Growth

Both leads and managers influence team dynamics, with managers typically setting the cultural tone and leads nurturing the technical skill growth of the team.
zoltan_fehervari_52b16d1d
1,899,083
Exploring Angular 18’s RedirectCommand class and @let block
Written by Lewis Cianci✏️ In a single day, 385,000 babies are born, the sun evaporates over a...
0
2024-06-26T17:19:20
https://blog.logrocket.com/exploring-angular-18-redirectcommand-class-let-block
angular, webdev
**Written by [Lewis Cianci](https://blog.logrocket.com/author/lewiscianci/)✏️**

In a single day, 385,000 babies are born, the sun evaporates over a trillion tons of water off the surface of the ocean, and fifteen new stable versions of Angular are released.

Okay, that’s an exaggeration (not the babies and evaporation bit!), but [the Angular release cycle](https://blog.logrocket.com/exploring-angular-evolution/) these days is certainly quick — and it’s not slowing down.

This latest update brings a couple of notable quality-of-life improvements. First of all, redirects within Angular can now occur by way of a `RedirectCommand`, which allows greater control over how a redirect occurs and easier implementation of best practices. Secondly, we can now declare variables within the views themselves, which can be helpful.

How do we use these features? Let’s take a look.

## Let me redirect you: The `RedirectCommand` class

Why is redirecting such a headline feature in Angular 18? We’re literally just shifting from one page to another. But navigation — something that we take for granted when we click on a link — is deceptively simple.

Consider a simple website with some trivial routes:

* `/` — The root page of the application
* `/admin` — Where administrators can see everyone's journal entries
* `/write` — Where users write journal entries

Everyone should be able to access the application root, including anonymous users. But should everyone be able to access the `admin` route? Of course not — that’s just a cybersecurity risk waiting to happen.

What about if users are on the `write` route, and they try to navigate to the root route? That’s fine from a security perspective, but they might lose what they’re writing! Ideally, we’d want to show a confirmation dialog before performing that routing operation.

Within the Angular nomenclature, these scenarios essentially boil down to the following actions:

* `canMatch`: Sometimes, more than one route can match a given path.
If a user is an admin, they should see the admin `write` route. Otherwise, they should see the normal `write` page
* `canActivate`: If a route is allowed to be activated or not. For example, the `write` route should only be accessible if the user is logged in
* `canDeactivate`: If a user can navigate away from a route. Typically used to show a dialog box when navigating away from a page with changed data

To help understand the power of `RedirectCommand`, let’s [create a sample application called Journalpedia](https://github.com/azimuthdeveloper/angulareighteen.git) that has some of this basic routing functionality. Journalpedia will look very plain, and that’s okay, because the focus of this article is on routing and not visual appeal.

For the purposes of this article, we’ll have our `app.component.html` contain a `router-outlet`, but it will also contain a **Login Simulator** that lets us change between user types on the fly. The idea is that we can route to anywhere on the test site and still be able to change between accounts:

```html
<router-outlet></router-outlet>
<div style="bottom: 0; right: 0; height: 200px; width: 200px; background-color: wheat; position: fixed">
  <app-login-simulator></app-login-simulator>
</div>
```

The front page of our app will look like this:

![Homepage For Demo Angular App Showing Welcome Message, Three Buttons, And Login Simulator](https://blog.logrocket.com/wp-content/uploads/2024/06/Demo-Angular-app-homepage.png)

Our app has three buttons. Typically, we wouldn’t display the **Admin** or **Write a journal entry** buttons because we don’t want users to try to click on them before logging in, but in this article we’ll show them anyway so we can see how route guards and activation work.

Normally, users could log in or change their logins at any time, but we don’t have a fully fledged authentication system in place. Instead, we can click on our **Login Simulator** to simulate a user logging in.
### Why do we need the `RedirectCommand` class?

To clarify, redirect functionality has long been part of Angular’s feature set. So why did Angular 18 introduce an entirely new class? Let’s see how redirection occurs in Angular apps today to see why `RedirectCommand` is an improvement.

Typically, routes within an application like this would look like the following:

```typescript
export const routes: Routes = [
  {
    path: '',
    component: HomeComponent,
  },
  {
    path: 'admin',
    component: AdminComponent,
    canActivate: [() => {
      const router = inject(Router);
      const auth = inject(AuthenticationService);
      if (auth.userLogin$.value == LoginType.AdminUser) {
        return true;
      } else {
        router.navigate(['/unauthorized']);
        return false;
      }
    }],
  },
  {
    path: 'write',
    component: WriteComponent,
    canActivate: [() => {
      const router = inject(Router);
      const auth = inject(AuthenticationService);
      if (auth.userLogin$.value != undefined) {
        return true;
      } else {
        router.navigate(['/unauthorized']);
        return false;
      }
    }]
  },
  {
    path: 'unauthorized',
    component: UnauthorizedComponent
  }
];
```

The crux of the routing operation here is to check that a user is of the correct type, and confirm that the navigation can happen. Otherwise, redirect to the unauthorized page.

For most applications, this is fine, but it introduces a dependency on returning either a `true` or `false` value from the guard. If we’re returning `true`, the navigation will complete, but if it returns `false`, then the navigation will just not occur. Sometimes, this can cause developers and consumers alike to become confused at the lack of routing occurring.
Fortunately, we can return a URL tree directly from within our `canActivate` function, specifying a direct path to another page, like so:

```typescript
{
  path: 'admin',
  component: AdminComponent,
  canActivate: [() => {
    const router = inject(Router);
    const auth = inject(AuthenticationService);
    if (auth.userLogin$.value == LoginType.AdminUser) {
      return true;
    }
    return router.parseUrl('/unauthorized')
  }],
}
```

This is a better pattern for two reasons: first, we’re not flat-out rejecting the navigation. Second, something will always occur — either the navigation will succeed, or it will return an alternative route.

The only wrinkle in this approach is that we can’t pass extra `NavigationExtras` properties in `parseUrl`. `NavigationExtras` properties pave the way for a bunch of [advanced routing scenarios](https://angular.dev/api/router/NavigationExtras) — for example, how query parameters should be handled, or whether a location change should be skipped. They also give us the opportunity to pass data to the navigated page via the `state` property.

Fortunately, that’s exactly where `RedirectCommand` comes in. `RedirectCommand` accepts a `UrlTree`, but also accepts `NavigationExtras` properties. We can redirect to another page and use `NavigationExtras` properties at the same time. Because it’s a new class, it doesn’t break any existing functionality, which is great!

So, our admin route becomes:

```typescript
{
  path: 'admin',
  component: AdminComponent,
  canActivate: [() => {
    const router = inject(Router);
    const auth = inject(AuthenticationService);
    if (auth.userLogin$.value == LoginType.AdminUser) {
      return true;
    }
    return new RedirectCommand(router.parseUrl('/unauthorized'), {
      skipLocationChange: true,
      state: {
        loginDuringAuth: auth.userLogin$.value
      }
    })
  }],
},
```

With this, we don’t return `false` from the `canActivate` function, and we can take full advantage of the `NavigationExtras` superpowers.
In this example, we observe something that happened during the routing operation and put that in the `state` variable. Within the constructor for the `unauthorized` page, we can retrieve this state variable, like so:

```typescript
constructor(private _location: Location, private _router: Router) {
  console.log(this._router.getCurrentNavigation()?.extras.state);
  this.loginDuringAuth.set(_router.getCurrentNavigation()?.extras?.state?.['loginDuringAuth']);
}
```

Then, we can make the changes to our `unauthorized` page:

```html
<h2>Unauthorized :(</h2>
@if (this.loginDuringAuth() == undefined){
  The router thinks you're not logged in.
}
@else {
  The router thinks you're logged in as {{LoginType[this.loginDuringAuth()!]}}
}
<br>
<button (click)="goBack()">Go back</button>
```

Now, we pass through the router state and display the result on the screen:

![Example Admin Route For Demo Angular App Blocked For Normal User With Message Explaining Why Route Is Blocked And Providing A Button To Go Back](https://blog.logrocket.com/wp-content/uploads/2024/06/Example-admin-route-blocked-normal-user.png)

Having `RedirectCommand` provides a favorable outcome because all existing routing operations will still work and more complex routing operations are possible.

## Let it be: The `@let` operator

The other notable addition to Angular 18 — specifically, the incremental v18.1 release — is the `@let` operator. As Angular developers, we’re likely already familiar with the `let` operator in TypeScript, along with `var`, `const`, etc. But now it’s in the template code, so what gives?

Essentially, `@let` lets us define and assign variables within the template itself. If that sounds scary to you, that’s totally understandable. It’s not so hard to imagine how people would completely misuse this and declare way-too-complex business logic in their views. In the long run, that’s going to hurt debugging and troubleshooting.
Still, `@let` is a good addition because it means that we can easily access asynchronous variables with the `async` pipe without worrying about being personally responsible for unsubscribing.

How would our unauthorized page change if we used `@let`? Instead of using `@if` statements, we can update our code like this:

```typescript
<h2>Unauthorized :(</h2>
@let authStatus = this.loginDuringAuth() ?? 'Not logged in';
The router thinks you're logged in as {{authStatus}}
<button (click)="goBack()">Go back</button>
```

That’s a pretty basic example, but if you have a single view and you just want to quickly declare a variable in the view, it becomes quite simple.

### When not to use the `@let` operator in Angular

There are [many ways to use the `@let` keyword](https://dev.to/oler/introduction-to-let-in-angular-18-cm6). But let’s pause for some real talk — as developers, we’ve all encountered code that is badly written or isn’t easy to follow. If you use `@let` to define all of your application logic in your views, and forego writing any more TypeScript, you’re likely to gain an unfavorable reputation.

Imagine the horror! You’ll notice that your co-workers stop talking when you enter the room. Nicknames like “Letty” will be assigned. Tales will be told about Letty, the developer who just used `@let` everywhere and created an un-debuggable mess.

Fabricated drama aside, if you find yourself starting to use `@let` to complete complicated business logic, or copy-and-pasting it over and over again in your code, **stop.** You’re on your way to creating an unintelligible mess and the people you work with will be upset with your life choices.

`@let` exists in the view (or the template) to provide view-related functionality. So, use it for that — and don’t overuse it.

Using `@let` isn’t a magic bullet for performance either, and completing computationally intensive operations within a `@let` block will cause browser lockups and other bad things to happen.
They should be completed within TypeScript — but you already knew that, right?

## Feels good to be an Angular developer

We’re definitely getting a lot of DX enhancements in Angular, and I for one am enjoying it immensely. These changes will continue to enhance Angular’s ease of use when developing your apps for the web. Feel free to [clone the project](https://github.com/azimuthdeveloper/angulareighteen.git) and see for yourself how these changes make a difference.

---

### Experience your Angular apps exactly how a user does

Debugging Angular applications can be difficult, especially when users experience issues that are difficult to reproduce. If you’re interested in monitoring and tracking Angular state and actions for all of your users in production, [try LogRocket](https://lp.logrocket.com/blg/angular-signup).

[![LogRocket Signup](https://files.readme.io/610d6a7-687474703a2f2f692e696d6775722e636f6d2f696147547837412e706e67.png)](https://lp.logrocket.com/blg/angular-signup)

[LogRocket](https://lp.logrocket.com/blg/angular-signup) is like a DVR for web and mobile apps, recording literally everything that happens on your site including network requests, JavaScript errors, and much more. Instead of guessing why problems happen, you can aggregate and report on what state your application was in when an issue occurred.

The LogRocket NgRx plugin logs Angular state and actions to the LogRocket console, giving you context around what led to an error, and what state the application was in when an issue occurred.

Modernize how you debug your Angular apps — [start monitoring for free](https://lp.logrocket.com/blg/angular-signup).
leemeganj
1,901,647
#amazon#nail
Best nail design. Click here.... https://t.ly/C9Mrt
0
2024-06-26T17:16:42
https://dev.to/shahin1310/amazonnail-2dki
Best nail design. Click here.... [![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/b8mh7hh4k1b2re65cev7.png)](url)https://t.ly/C9Mrt
shahin1310
1,901,645
Adding ESP to supercharge your Django Email 🚀
Whether you are trying to build your own SAAS or building a project for client a its very important...
0
2024-06-26T17:16:18
https://foxcraft.hashnode.dev/how-to-add-email-service-provider-to-django
django, webdev, djangoemail, email
Whether you are building your own SaaS or a project for a client, it's very important to have email integration for sending invoices, newsletters, updates, and more. Using Django's default SMTP email is not scalable, with emails hitting per-day limits or going to spam folders.

[Email Service Providers](https://templates.foxcraft.tech/blog/b/what-are-email-service-providers) (ESPs) offer scalable, highly deliverable, trackable, and compliant email solutions. Many ESPs provide API access, enabling you to send, schedule, and customize emails efficiently. Setting up an ESP for your transactional emails is quick and straightforward, often taking less than 15 minutes, making it a highly recommended addition to your email strategy.

There are many email service providers, such as [Sendgrid](https://sendgrid.com/en-us), Brevo, MailGun, Amazon SES, etc. In this tutorial we'll see how to add [Brevo ESP](https://www.brevo.com/) (formerly Sendinblue) to Django. Before we get started, make sure to sign up for Brevo and fill in the details to keep your email compliant.

## Django Setup for ESP

We'll be using the Anymail package, as it offers an API compatible with Django's existing email framework and can be configured for any ESP.

Start by installing Anymail:

```
pip install django-anymail[brevo]
```

Now go to your settings.py and add anymail to `INSTALLED_APPS`:

```py
INSTALLED_APPS = [
    .
    .
    'anymail',
]
```

Now add Brevo as your email backend:

```py
EMAIL_BACKEND = "anymail.backends.brevo.EmailBackend"

BREVO_API_URL = "https://api.brevo.com/v3/"  # optional

ANYMAIL = {
    "BREVO_API_KEY": 'BREVO_API_KEY',  # use brevo api key
    "IGNORE_RECIPIENT_STATUS": True,
}

DEFAULT_FROM_EMAIL = 'ben@example.com'  # default from email
```

Anymail works with Django's default email functions, so now import `send_mail`:

```py
from django.core.mail import send_mail

subject = "Test transactional email"
body = "Hi\nthis is a test email"
html_message = """
Hi,<br>
This is a brevo test mail.
"""

send_mail(subject, message=body, from_email=None,
          recipient_list=['james@example.com'], html_message=html_message)
```

The `html_message` is necessary when sending via Brevo, so ensure you add it. If `from_email` is set to `None`, it uses the default address you set up in the `settings.py` file.

If you need to add custom parameters for Brevo to fill in, just use `{{params.name}}`; read more about [Brevo templating](https://help.brevo.com/hc/en-us/articles/360000946299-Personalize-your-emails-using-Brevo-Template-Language).

That's all: once you have your Brevo API key, you'll be able to send messages via Brevo.

## Brevo API Key

Go to the top right corner dropdown -> click on SMTP & API

![Brevo API Key](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/jaf89ntvk111diu9icgo.png)

Now, from the API's tab, click on generate new API key

![generate brevo key](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/4jmop07bpa8oazzov3ms.png)

Copy the key, paste it in the `settings.py` file, and you are good to go.

When you send your first test email, you may get an email from Brevo asking you to add your server to the authorized IPs; once you set it, it'll start sending the emails. You can also disable this check by going to profile dropdown -> Security -> Authorized IPs

![Brevo authorized IP](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/nqa5m82i8dfaqcva84ji.png)

## Advanced usage

You can also create templates in Brevo and send customized bulk emails with just one API call. Start by creating a [template](https://app.brevo.com/templates/listing/email). Follow the steps in Brevo to create the template.
Now, to send via a template, let's make some changes to the code:

```py
from django.core.mail import EmailMultiAlternatives

recipient_list = ['ben@example.com', 'james@example.com']
from_email = None  # uses default email address

email = EmailMultiAlternatives(from_email=from_email, to=recipient_list)
email.template_id = 8  # your brevo template id
email.merge_data = {
    'ben@example.com': {'name': "Ben", 'order_no': "12345"},  # customize per customer
    'james@example.com': {'name': "James", 'order_no': "54321"},
}
email.merge_global_data = {
    'ship_date': "May 15",  # this is common global data for both ben and james
}
email.send()
```

That's it! Now you can also send emails using Brevo templates. If you have any questions, drop a comment.
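To make the per-recipient merging concrete, here is a plain-Python sketch of how global and per-recipient data combine into the parameters each recipient's template sees. The helper name is mine and this is independent of Anymail/Brevo; it just mirrors the "global values, overridden per recipient" idea from the snippet above.

```py
# Hypothetical helper mirroring the merge semantics used above:
# global data applies to everyone; per-recipient data overrides it.

def build_template_params(recipient, merge_data, merge_global_data):
    params = dict(merge_global_data)              # start from the shared values
    params.update(merge_data.get(recipient, {}))  # apply recipient-specific overrides
    return params

merge_data = {
    'ben@example.com': {'name': "Ben", 'order_no': "12345"},
    'james@example.com': {'name': "James", 'order_no': "54321"},
}
merge_global_data = {'ship_date': "May 15"}

print(build_template_params('ben@example.com', merge_data, merge_global_data))
# {'ship_date': 'May 15', 'name': 'Ben', 'order_no': '12345'}
```

An unknown recipient simply gets the global data, which is why `merge_global_data` is a convenient place for values shared by every email in the batch.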
paul_freeman
1,901,639
Even Server-Side TypeScript Needs the Option to Avoid Asynchronous Processing
I am developing a TypeScript ORM library called Accel Record. Unlike other TypeScript/JavaScript ORM...
27,598
2024-06-26T17:15:11
https://dev.to/koyopro/even-server-side-typescript-needs-the-option-to-avoid-asynchronous-processing-1opm
typescript, javascript, node
I am developing a TypeScript ORM library called [Accel Record](https://www.npmjs.com/package/accel-record). Unlike other TypeScript/JavaScript ORM libraries, Accel Record adopts a synchronous API instead of an asynchronous API.

In the process of examining the ORM interface, we compared the advantages and disadvantages of asynchronous and synchronous APIs. In this article, I would like to organize those thoughts and discuss the idea that we might need an option to develop with synchronous processing even in server-side TypeScript.

## What Is Asynchronous Processing in TypeScript/JavaScript?

When running JavaScript on the server side, Node.js is often used. Node.js uses a single-threaded asynchronous I/O model, and it is common to implement applications that use asynchronous processing.

The way to write asynchronous processing has changed over time, but currently, writing with `async/await` is mainstream.

```ts
// Functions that perform asynchronous processing are marked with `async` and return a Promise
const fetchUsers = async (): Promise<User[]> => {
  // Process to retrieve users from the DB
  // ...
};

// Use await to wait for the result of the asynchronous process
const users = await fetchUsers();
```

As shown above, functions that perform asynchronous processing are implemented by marking them with `async` and returning a Promise. On the caller side, the result of the asynchronous process is awaited using `await`.

In the past, JavaScript achieved asynchronous processing using callback functions, but recently, using `async/await` has made writing asynchronous processing more intuitive. While the development experience has greatly improved, the necessity to be aware of whether the called function performs asynchronous processing and write `await` accordingly remains unchanged. Calling an asynchronous function without `await` can lead to unintended behavior.
Therefore, it is necessary to consider whether the called function is synchronous or asynchronous and decide whether to write `await`, a task that needs to be done every time.

## Asynchronous Processing Used on the Server Side

If there are hardly any asynchronous calls, the above task may not be much of an issue. However, in server-side development for web applications, asynchronous processing is often used. This is because DB access in JavaScript libraries is basically implemented as asynchronous processing.

In server-side processing for web applications, the main flow is to receive an HTTP request, perform DB access such as data retrieval or writing, and return the final result as an HTTP response. The more complex the application, the more places where DB access is needed, and asynchronous processing is often used.

When there are many places to write asynchronous processing, the necessity to be aware of whether to write `await` increases. Compared to cases where asynchronous processing is not used at all, the cost of caring for such detailed parts increases, reducing development efficiency.

## Benefits of Using Asynchronous Processing

So, what are the benefits of asynchronous processing, and why is it used? The most significant advantage is the improvement of system performance.

In execution environments like Node.js, using asynchronous processing allows the time spent waiting for I/O to be utilized for other processes. As a result, it is expected that the overall system performance will be better than if asynchronous processing is not used. For example, when there is a process to access the DB upon receiving an HTTP request, it becomes possible to accept other requests while waiting for the response from the DB.

Node.js has the concept of an event loop, and it is considered important for performance to not block the event loop by properly using asynchronous processing.
## The Option to Avoid Asynchronous Processing

In the process of examining the ORM interface, I began to think that we might not necessarily need to use asynchronous processing just because we are in a JavaScript execution environment.

Implementing the application primarily with synchronous processing may lead to some performance degradation compared to using asynchronous processing[^1]. However, there is the benefit of reducing the burden of deciding whether to add `await` based on whether the called function performs asynchronous processing, thereby improving development efficiency.

Depending on the nature of the product being developed, prioritizing development efficiency over system performance might be desirable. I believe this is a rather common case.

In other languages used for server-side development, not using asynchronous processing is common. It is often adopted because there are benefits in other aspects even without pursuing performance through asynchronous processing. (On the contrary, I feel that choosing TypeScript/JavaScript for server-side development is not yet mainstream.)

## Advantages of Choosing TypeScript for Server-Side Development

So, if prioritizing development efficiency, why choose TypeScript for server-side development? I believe there are mainly two significant advantages.

### 1. Reducing Context Switching by Unifying Development Languages with the Frontend

Currently, TypeScript is widely used for frontend development in web applications. By adopting TypeScript for server-side development as well, the commonality with the frontend can be improved, and the burden on developers can be reduced by lowering the cost of switching languages.

### 2. Improved Development Experience Through Type Safety and Type Support

TypeScript is a statically typed language, which makes it easier to receive editor support such as autocompletion and allows for early bug detection through type checking.
This type safety is a very powerful asset, especially in the development of large-scale applications. Since the other languages commonly chosen for server-side development are not always highly type-safe, this is a significant advantage of choosing TypeScript. These two advantages make TypeScript worth choosing for server-side development even when development efficiency is the priority.

## Enhancing Development Efficiency in Server-Side TypeScript

For the two reasons above, TypeScript development can maintain high development efficiency. However, the current server-side TypeScript environment, which pushes implementations toward being asynchronous-first, works against that efficiency. The effect extends to libraries as well: because asynchronous processing is assumed, ideal interfaces cannot always be adopted, and usability is compromised as a result. This can halve the advantage of choosing TypeScript for the server side. For server-side TypeScript development to become more widespread, I believe developers need the option to build without asynchronous processing: a development experience in which every DB access does not demand attention to `await`. When developing a new TypeScript ORM library, I decided to adopt a synchronous API. With a synchronous API, the interface can be more abstract, and I believe the experience for library users improves beyond what an asynchronous API allows. Until now, server-side JavaScript may have been chosen for performance; going forward, I think server-side TypeScript may be chosen for development efficiency, and to support that, the option to avoid asynchronous processing is necessary.
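As a rough illustration of why a synchronous API allows a more abstract interface, here is a minimal sketch. All names are hypothetical; this is not Accel Record's actual API:

```typescript
// Hypothetical comparison; these interfaces are not from any real ORM.

// Async style: every call that may touch the DB forces `await` (and a
// Promise return type) onto the caller.
interface AsyncPostRepo {
  findTitles(userId: number): Promise<string[]>;
}

// Sync style: the same operation returns a plain value, so property access
// and method chaining need no `await` anywhere in the call chain.
class User {
  constructor(public readonly id: number, public readonly name: string) {}

  // With a synchronous driver underneath, a getter can lazy-load an
  // association on first access; the caller just reads a property.
  get postTitles(): string[] {
    return [`first post by ${this.name}`]; // stands in for a lazy DB query
  }
}

function findUserSync(id: number): User {
  return new User(id, "Alice"); // stands in for a synchronous DB lookup
}

// Ordinary object navigation, with no async plumbing in sight:
const titles = findUserSync(1).postTitles;
console.log(titles[0]); // "first post by Alice"
```

With the synchronous style, association access reads like plain object navigation, which is the kind of abstraction an asynchronous API cannot offer without `await` at every step.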
## Conclusion I discussed the idea that to improve development efficiency in server-side development using TypeScript, it might be necessary to have the option to develop without using asynchronous processing. Please check out '[Introduction to "Accel Record": A TypeScript ORM Using the Active Record Pattern](https://dev.to/koyopro/introduction-to-accel-record-a-typescript-orm-using-the-active-record-pattern-2oeh)' and the [README](https://github.com/koyopro/accella/blob/main/packages/accel-record/README.md) to see what kind of interface Accel Record can achieve by adopting a synchronous API. [^1]: Depending on the environment, there may be cases where performance is not adversely affected, or ways to avoid performance degradation through system configuration. Please refer to '[Why We Adopted a Synchronous API for the New TypeScript ORM](https://dev.to/koyopro/why-we-adopted-a-synchronous-api-for-the-new-typescript-orm-1jm)' which I wrote previously.
koyopro
1,901,644
Top 10 Reasons that Slow Down Your Woocommerce Site
A fast website is key for any online store's success. This is especially true for WooCommerce...
0
2024-06-26T17:14:31
https://dev.to/ayeshamehta/top-10-reasons-that-slow-down-your-woocommerce-site-34i2
technology, woocommerce, development
![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/vly6x5t6haf5juv2gaw6.jpg)

A fast website is key to any online store's success, and this is especially true for WooCommerce sites, where speed directly impacts user experience, sales, and, in the long run, your revenue. Unfortunately, several common issues, from hosting problems to unoptimized images, can significantly slow down your WooCommerce site. Today we will discuss the top issues that slow down a WooCommerce site and provide practical tips to address them.

**1. Poor Hosting**

Inadequate hosting can significantly hinder your website's performance. A slow site frustrates visitors: about 53% of mobile users leave a site if it takes more than three seconds to load. Slow performance harms user experience and can lower sales. Poor maintenance also leads to outdated software and broken links, which make your site vulnerable to hackers, risking sensitive information and eroding visitors' trust. Here are some signs of poor hosting:

- Frequent downtime: if your site frequently goes down, the hosting is inadequate.
- Laggy pages: slow-loading pages frustrate users and harm SEO rankings.
- Limited scalability: inflexible hosting cannot handle traffic spikes or business growth.

**2. Too Many Plugins**

Having too many plugins can lead to several issues. Each plugin adds extra HTTP requests, slowing down your site's loading time, and excessive plugins increase database queries, affecting performance. Poorly coded or conflicting plugins may cause errors or break functionality, and outdated or poorly maintained plugins can introduce security flaws: a compromised plugin may expose sensitive data or lead to cyberattacks. Consider these steps to manage your plugins:

1. Audit your plugins: regularly review installed plugins and remove unused or redundant ones.
Also, prioritize quality over quantity.

2. Test performance impact: use performance monitoring tools to assess each plugin's effect on loading times, and remove or replace any plugins that significantly slow down your site.
3. Stay updated: keep your plugins up to date to patch security holes, and disable or remove outdated or unsupported plugins.

If you want a custom plugin developed that offers unique functionality, get help from [**WooCommerce development services**](https://www.saffiretech.com/custom-woocommerce-development/).

**3. Large Images and Media Files**

High-resolution images and bulky media files increase page load times, and visitors get frustrated waiting for slow content to display. How you can address it:

- Resize images: before uploading, resize images to the appropriate dimensions for your site using tools like online editors.
- Compress images: reduce file size without compromising quality; plugins like Smush or ShortPixel can do this automatically.
- File type: use the JPEG format for photos and PNG for graphics; JPEG compresses photos well, while PNG preserves transparency.
- Lazy loading: load images as users scroll down to improve initial page load speed.

Best practices for managing media files:

- Organize folders: keep media files organized in folders for easy access.
- Regular cleanup: free up server space by deleting unused files.
- Content delivery networks: use CDNs to serve media files around the world, reducing server load.

**4. Lack of Caching**

Caching involves temporarily storing recently accessed data in faster memory or storage. It improves performance by reducing the time needed to fetch data from the main source: when data is cached, users can access it faster than from slower storage like disks or networks. Types of caching:

- Browser cache: stores web page resources like images and scripts locally in the user's browser, speeding up page loads on subsequent visits.
- Server-side cache: stores data on the server to reduce load times; tools like Redis can be used.
- Content delivery network cache: distributes content globally across servers, minimizing latency.

Here are some caching plugins I recommend:

- W3 Total Cache: a comprehensive plugin with browser, page, and object caching.
- WP Super Cache: an easy-to-use plugin for static file caching.
- LiteSpeed Cache: optimized for LiteSpeed servers, offering powerful caching features.

**5. Unoptimized Database**

Another reason your WooCommerce site might be slow is an unoptimized database. A slow database leads to delayed responses, and the delay may frustrate visitors into leaving your site. Scalability also matters: think of a cup, which holds water only up to its rim before overflowing; the same happens to a database that cannot scale. Optimized databases are also cost-efficient, requiring fewer resources and reducing hosting and maintenance expenses. In short, an unoptimized database degrades user experience. You can use these techniques to optimize your database:

- Monitor and analyze: regularly check and study performance metrics.
- Optimize queries: ensure efficient SQL queries.
- Create and manage indexes: properly designed indexes speed up data retrieval.
- Regular maintenance: clean up unnecessary data and streamline the database.
- Disaster recovery: plan for disaster recovery and for high availability.
- Scaling strategies: growth is a natural part of any business, so have proper measures in place to scale your database.

**6. External Scripts and Resources**

Scripts and resources are a crucial part of any successful online business. Third-party scripts come from external vendors and enhance your site's functionality; examples include analytics, ads, and social sharing buttons. However, they can slow down page load times and affect user experience.
This is because these scripts delay critical rendering and impact Core Web Vitals. To solve this, identify essential scripts and remove unnecessary ones, prioritizing critical features over non-essential ones. Load scripts asynchronously and implement lazy loading, so a script only loads when needed. You can even use content delivery networks: as mentioned above, CDNs quickly deliver content anywhere in the world. Also, regularly review and optimize third-party integrations.

**7. Heavy Themes**

A poorly coded or bloated theme can significantly slow down your website. Theme bloat usually comes from excessive code or unoptimized images, which also hurts your loading speed. When choosing a lightweight theme, prioritize one that is optimized for speed and efficiency, has clean code, is responsive, and works seamlessly with popular plugins. One of the best optimized themes for a beginner is Astra: it is lightweight, customizable, WooCommerce-friendly, and SEO-friendly. If you are looking for a custom theme developed specifically for your site, opt for custom WooCommerce development services; the resulting theme will be well optimized and match your brand.

**8. Inefficient Code**

Unoptimized code can slow down software execution and waste system resources, leading to longer response times and reduced reliability. Efficient code improves performance and user experience, and regular updates ensure security and maintainability. You can use these tools to check your code:

1. Linter: identifies coding issues like style violations and unused variables.
2. Profiling tools: analyze runtime behavior and bottlenecks.
3. Static analysis: detects potential bugs without executing the code.

**9. Excessive Redirects**

One key issue that slows down your WooCommerce site, and one that few people know about, is excessive redirects. Having redirects is a good thing.
But having too many slows down your site, hurts user experience, and damages SEO rankings. Configuration errors or conflicting rules can even lead to redirection loops. Tips to reduce redirects:

- Clear cookies: start by clearing cookies on the redirecting website.
- Browser cache: clear your browser cache to eliminate cached redirects.
- SSL certificate: ensure your SSL certificate is installed correctly.
- Evaluate third-party services: check whether external services contribute to redirects.
- Reset the .htaccess file: fix misconfigured rules.
- Contact your hosting provider: seek professional assistance if needed.

**10. Not Using a Content Delivery Network**

A CDN is a group of servers spread across the globe that caches content near end users, speeding up the transfer of the assets needed to load web content, including HTML pages, JavaScript files, stylesheets, and media. Some benefits of using a CDN:

- Improved load times: CDNs distribute content closer to visitors, resulting in faster page loading; faster sites reduce bounce rates and increase user engagement.
- Reduced bandwidth costs: CDNs decrease data transfer from origin servers, lowering hosting expenses.
- Enhanced availability and redundancy: CDNs keep content accessible even during traffic spikes or hardware failures.

**Some of the Popular CDN Providers:**

- Cloudflare: offers robust security features, performance enhancements, and global coverage.
- Akamai: known for scalability and reliability.
- Amazon CloudFront: integrated with AWS services and suitable for dynamic content.

**Conclusion**

We covered the importance of keeping a WooCommerce site well optimized, discussed the key issues that slow it down, and provided solutions to prevent them. The main culprits are poor hosting, lack of caching, too many plugins, and inefficient code. Resolve these issues and your site should be just fine.
ayeshamehta
1,901,657
Cursos De Análise De Dados Gratuitos: Excel E Power BI
Desenvolva suas habilidades em Análise de Dados e impulsione sua carreira em qualquer empresa com os...
0
2024-06-28T13:38:05
https://guiadeti.com.br/cursos-analise-de-dados-gratuitos-excel-power-bi/
cursogratuito, analisededados, cursosgratuitos, dados
---
title: Cursos De Análise De Dados Gratuitos: Excel E Power BI
published: true
date: 2024-06-26 17:13:45 UTC
tags: CursoGratuito,analisededados,cursosgratuitos,dados
canonical_url: https://guiadeti.com.br/cursos-analise-de-dados-gratuitos-excel-power-bi/
---

Develop your Data Analysis skills and boost your career at any company with the free courses offered by the Preditiva school. Using tools such as Excel and Power BI, you will learn how to strengthen your analyses effectively. Preditiva is a school dedicated to teaching people how to analyze and interpret data with a focus on business.

## Preditiva Courses

Develop your Data Analysis skills and build a successful career at any company with the free courses offered by the Preditiva school.

![](https://guiadeti.com.br/wp-content/uploads/2024/06/image-63-1024x351.png)

_Image from the Preditiva page_

### Essential Tools

Take the opportunity to use tools such as Excel and Power BI, strengthening your analyses and gaining valuable insights for your business. Check out the syllabus:

#### Introduction to the World of Data

- **Section 1 – Revolution 4.0: Big Data and Artificial Intelligence:** Industrial Revolution 4.0 has already begun, and you may be falling behind. Which skills should you develop from this year onward? What are "data", and what role do techniques and tools play in all of this?
- **Section 2 – Data Analysis Techniques:** Understand the main statistical techniques every professional should develop. We will cover Descriptive Statistics, Inference, and Machine Learning.
- **Section 3 – Tools for Data Analysis:** Get to know the main data tools available on the market. We will give an overview of Excel, SQL, Power BI, and Python. Understand the potential of each one and learn to choose which tool to study and use for each project (and career) goal.
- **Section 4 – Data-Driven Culture in Companies:** The importance of methodology in data projects. The so-called data-driven (analytical) culture and the challenges of implementing it. Engineer, Analyst, or Scientist? The role of each professional in the data journey.
- **Section 5 – Be Strategic with Your Career:** Understand the importance of good career management. The process of learning data skills is not linear. Find out more!
- **Section 6 – The Main Questions Answered:** Are "data" relevant to my field? Is this skill for me? "I don't know whether to become a Data Engineer, Analyst, or Scientist: what should I do?", "I want to move into this field right away; can it be done quickly?", "I'm over 40; can I still make the transition at this age?"

#### Excel for Data Analysis

- **Section 1 – Revolution 4.0: Big Data and Artificial Intelligence:** Industrial Revolution 4.0 has already begun, and you may be falling behind. Which skills should you develop from this year onward? What are "data", and what role do techniques and tools play in all of this?
- **Section 2 – Data Analysis Techniques:** Understand the main statistical techniques every professional should develop. We will cover Descriptive Statistics, Inference, and Machine Learning.
- **Section 3 – Tools for Data Analysis:** Get to know the main data tools available on the market. We will give an overview of Excel, SQL, Power BI, and Python. Understand the potential of each one and learn to choose which tool to study and use for each project (and career) goal.
- **Section 4 – Data-Driven Culture in Companies:** The importance of methodology in data projects. The so-called data-driven (analytical) culture and the challenges of implementing it. Engineer, Analyst, or Scientist? The role of each professional in the data journey.
- **Section 5 – Be Strategic with Your Career:** Understand the importance of good career management. The process of learning data skills is not linear. Find out more!
- **Section 6 – The Main Questions Answered:** Are "data" relevant to my field? Is this skill for me? "I don't know whether to become a Data Engineer, Analyst, or Scientist: what should I do?", "I want to move into this field right away; can it be done quickly?", "I'm over 40; can I still make the transition at this age?"

#### Power BI for Data Analysis

- **Session 1 – Introduction to Power BI:** Learn how to install the tool and get a first look at its interface.
- **Session 2 – Your First Dashboard:** Build your first dashboard by importing a dataset, creating your first visuals (charts), and publishing locally.
- **Session 3 – Data Preparation:** Go deeper into handling data of various types using Power Query.
- **Session 4 – Data Visualization:** Learn how to use the many available visuals and filters.
- **Session 5 – Data Modeling:** Relate multiple data sources to expand the possibilities of your analyses.
- **Session 6 – DAX Language:** Get to know the DAX language, which expands your ability to create variables, filters, and tables within Power BI.
- **Session 7 – Telling Stories with Power BI:** Learn how to use advanced Power BI features, as well as themes and styles, to finish your dashboard.
- **Session 8 – Publishing Your Dashboard:** Once your dashboard is built, learn how to publish it online and set up a refresh routine.
- **Session 9 – Live Sessions on Data Viz:** Build truly effective visualizations and dashboards in Power BI.

### Certification and Access

Preditiva's courses include a certificate and have no prerequisites. Free access remains open for up to two (2) months after enrollment, allowing you to finish the course within that period and get the most out of your learning.
Take advantage of this opportunity to upskill and stand out in the job market with the Data Analysis skills offered by Preditiva.

## Data Analysis

Data analysis is a fundamental practice for companies that want to make decisions based on concrete information. The discipline involves collecting, cleaning, transforming, and modeling data in order to discover useful information and support decision-making. Beyond Excel and Power BI, several other powerful tools can be used for effective data analysis.

### Popular Tools for Data Analysis

Tableau is a widely used data visualization tool that lets users create interactive charts and dashboards. It makes analyzing large volumes of data easier by turning complex information into clear visual insights, and it is ideal for exploring data and identifying trends and patterns.

Python is a powerful, versatile programming language widely used in data analysis. It offers tools for data manipulation, statistical analysis, and visualization; it is especially useful for advanced, customized analyses and is a popular choice for machine learning and data science.

R is a programming language and software environment specialized in statistics and graphics. It has a wide range of packages for data analysis, statistical modeling, and visualization, and its ability to handle large datasets and perform complex analyses makes it an indispensable tool for data professionals.

### Advantages of Using Data Analysis Tools

Data analysis tools allow companies to make informed decisions based on concrete evidence. By analyzing trends and patterns in the data, leaders can identify growth opportunities, optimize operations, and improve efficiency.
Tools like Tableau and Qlik Sense help turn complex data into clear, intuitive visualizations. This makes it easier to communicate insights to stakeholders, promoting deeper understanding and faster action. Programming languages like Python and R enable the automation of repetitive tasks and the execution of advanced analyses, which not only saves time but also ensures accuracy and consistency in the results.

### Challenges in Data Analysis

One of the biggest challenges in data analysis is ensuring the quality of the data being used: incomplete, inaccurate, or outdated data can lead to wrong conclusions and misguided decisions. As data volumes grow, complexity and the need for more robust tools also increase; tools like Apache Spark are essential for handling large volumes of data but can require advanced technical skills. Protecting data privacy and keeping information secure is a constant concern, so companies must implement strict security measures and follow best practices to protect data against unauthorized access and breaches.

## Preditiva

Escola Preditiva is an institution dedicated to training professionals in data analysis, offering free, accessible courses that aim to democratize access to this essential skill in today's job market.

### Methodology

The school's methodology focuses on the practical application of knowledge, allowing students to learn how to solve real problems simply and effectively.

### Impact and Access

Preditiva has a significant impact on its students' lives, providing them with valuable skills that are in high demand in the job market. The school is helping to train capable professionals who are prepared for the challenges of the modern market.

## Enrollment Link ⬇️

[Enrollment for the Data Analysis courses](https://www.preditiva.ai/cursos-gratuitos) must be completed on the Preditiva website.

## Share this free training opportunity!

Did you like this content about the free Data Analysis courses? Then share it with everyone!

The post [Cursos De Análise De Dados Gratuitos: Excel E Power BI](https://guiadeti.com.br/cursos-analise-de-dados-gratuitos-excel-power-bi/) first appeared on [Guia de TI](https://guiadeti.com.br).
guiadeti
1,901,635
Understanding Kubernetes Pods and YAML Fundamentals
Hello everyone, welcome back to my blog series on CKA 2024 in this seventh entry of the series, we'll...
0
2024-06-26T17:10:12
https://dev.to/jensen1806/understanding-kubernetes-pods-and-yaml-fundamentals-534g
kubernetes, docker, cka, containers
Hello everyone, welcome back to my blog series on CKA 2024. In this seventh entry of the series, we'll be diving deep into Kubernetes Pods and the basics of YAML, the primary language used in Kubernetes configurations. We will also explore the concept of Pods in Kubernetes, setting the stage for the hands-on exercises that follow.

### Introduction to Kubernetes Architecture

To begin, let's revisit a key diagram from our blog on Kubernetes Architecture. This diagram shows how a user interacts with the Kubernetes API server, which resides on a control plane node (or master node). The interaction happens through client utilities like `kubectl` or cloud consoles in managed services. Using `kubectl`, users can provision, update, delete, or retrieve details from the cluster. For instance, when a user runs the `kubectl run` command to create a Pod, the Pod is scheduled on one of the worker nodes in the cluster.

![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/dc6d4g8yt88hotrbz9ap.png)

#### Creating Pods: Imperative vs. Declarative

There are two main approaches to creating Pods in Kubernetes: imperative and declarative.

**1. Imperative Approach**

In the imperative approach, you run simple commands like `kubectl run nginx-pod --image=nginx`. This method involves directly instructing the API server through the kubectl utility to perform actions. It's quick and useful for troubleshooting or local deployments, but not ideal for production environments.

**2. Declarative Approach**

The declarative approach involves creating a configuration file (in JSON or YAML format) that specifies the desired state of the object. This file includes details like API version, kind, metadata, and spec. You then use commands like `kubectl create -f` or `kubectl apply -f` to create or update resources based on this configuration. YAML is preferred due to its readability and ease of use, and it is widely adopted in Kubernetes configurations.
### Understanding YAML Syntax

YAML (YAML Ain't Markup Language) is a human-readable data serialization standard. Here are some basics:

1. Comments: start with `#`
2. Data types: supports lists, dictionaries, strings, integers, etc.
3. Indentation: uses spaces for indentation (usually 2 spaces)

Example of YAML structure:

```yaml
# This is a comment
spec:
  name: John Doe
  age: 30
  address:
    - old_address: "123 Old St"
    - new_address: "456 New St"
```

In Kubernetes, a typical YAML file for a Pod looks like this:

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: nginx-pod
  labels:
    environment: demo
    type: frontend
spec:
  containers:
    - name: nginx-container
      image: nginx
      ports:
        - containerPort: 80
```

### Hands-On: Creating and Managing Pods

#### Creating a Pod Imperatively

To create an Nginx Pod using the imperative approach, run the following command in your terminal:

```bash
kubectl run nginx-pod --image=nginx
```

Check the Pod status with:

```bash
kubectl get pods
```

#### Creating a Pod Declaratively

Create a YAML file (pod.yaml) with the following content:

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: nginx-pod
spec:
  containers:
    - name: nginx
      image: nginx
      ports:
        - containerPort: 80
```

Apply this configuration with:

```bash
kubectl apply -f pod.yaml
```

Verify the Pod creation:

```bash
kubectl get pods
```

#### Editing a Pod

If you need to update the Pod, you can either edit the YAML file and reapply it or use the `kubectl edit` command to make changes directly:

```bash
kubectl edit pod nginx-pod
```

#### Troubleshooting

If a Pod is not running correctly, use the `kubectl describe pod <pod-name>` command to see detailed information and troubleshoot issues.

Thank you for reading, and I hope you found this post valuable. Happy learning! For further reference, check out the detailed YouTube video here:

{% embed https://www.youtube.com/watch?v=_f9ql2Y5Xcc %}
jensen1806
1,901,630
Day 3 #Shell script(Advance)
Hey Guys! Here is another shell script document for you with advance shell script code. Link ---...
0
2024-06-26T17:06:26
https://dev.to/dev_roy/day-3-shell-scriptadvance-1850
Hey guys! Here is another shell script document for you, with advanced shell script code.

Link: https://medium.com/@devroypartner/shell-script-chapter-3-advance-56418bbf7f98

#Shellscript #Aws #CloudComputing #DevOps
dev_roy
1,901,628
Armored Athletes: The Wonders of Rhinoceros Beetles
In the realm of insects, size and strength reign supreme for some. Enter the rhinoceros beetles, a...
0
2024-06-26T17:02:53
https://dev.to/newmanartdesigns/armored-athletes-the-wonders-of-rhinoceros-beetles-5g5e
rhinocerosbeetles, beetles, rhinobeetles, newmanartdesigns
In the realm of insects, size and strength reign supreme for some. Enter the rhinoceros beetles, a subfamily of scarab beetles boasting impressive physiques and herculean feats. These magnificent creatures are not just captivating to look at; they're living testaments to the wonders of evolution.

Chalcosoma caucasus (a rhinoceros beetle) is a species of beetle in the Scarabaeidae family. It is one of Asia's largest beetles, and with its strong fighting spirit it is often ranked alongside South America's Hercules beetle among the world's most powerful beetles.

**Built for Battle (and Beauty)**

**[Rhinoceros beetles](https://newman-art-designs.com.au/products/three-horned-rhino-beetle-framed-chalcosoma-caucusus)**, aptly named for the prominent horns on their heads (primarily found on males), come in a staggering variety. Over 1,500 species exist, ranging from a mere inch to a jaw-dropping 6 inches in length. Their bodies are often adorned with intricate ridges and bumps, adding to their armored appearance. But these horns are more than just ornaments; they're tools for combat. Males use them in fierce battles for territory and mates, engaging in epic clashes that would make professional wrestlers envious. The victor claims the right to reproduce, ensuring the strongest genes are passed on.

**Strength Beyond Belief**

The power of a rhinoceros beetle is legendary. Proportionally, they are some of the strongest creatures on Earth. Some species can lift objects over 850 times their own body weight! Imagine a human lifting a car – that's the kind of strength we're talking about. This incredible ability allows them to burrow through tough soil, topple obstacles, and even fight off predators.

**A Lifecycle of Transformation**

Rhinoceros beetles undergo a fascinating metamorphosis. They begin life as tiny eggs, hatching into larvae known as grubs. These grubs spend most of their lives underground, feasting on decaying wood and organic matter.
After several molts, they pupate, transforming into the magnificent adults we know and admire. **More Than Just Brawlers** While their fighting prowess is impressive, rhinoceros beetles play a vital role in the ecosystem. As decomposers, their larvae help break down dead wood, returning nutrients to the soil and promoting healthy plant growth. Adults, on the other hand, are pollinators, attracted to the sweet-smelling sap of trees. **Rhinoceros Beetles: Nature's Marvels** From their incredible strength to their captivating shapes and sizes, rhinoceros beetles are a true marvel of nature. Whether you encounter them in the wild or admire them in a museum display, these armored athletes are sure to leave a lasting impression. Considering keeping a pet rhinoceros beetle? While fascinating creatures, they may not be the best choice for everyone. Research their specific needs and ensure you can provide a suitable environment before bringing one home, or opt for a framed rhinoceros beetle in an 8" x 10" 3D Shadow Box Frame by **[Newman Art Designs](https://newman-art-designs.com.au/)**.
newmanartdesigns
1,899,778
🔍 Dive into CSS Anchor Positioning – Let's Hang!
Hey 👋 This weeks newsletter is packed full of great reads and resources here's a quick look: 🔗 No...
0
2024-06-26T17:00:00
https://dev.to/adam/dive-into-css-anchor-positioning-lets-hang-5eob
css, ux, git, html
**Hey** 👋 This week's newsletter is packed full of great reads and resources. Here's a quick look: 🔗 No more 404s – smarter URL handling 🧩 Pure CSS Nesting & BEM Modifiers ⚔️ Master auto transitions in CSS Enjoy & stay inspired 👋 - Adam at Unicorn Club. --- ## 📬 Want More? Subscribe to Our Newsletter! Get the latest edition delivered straight to your inbox every week. By subscribing, you'll: - **Receive the newsletter earlier** than everyone else. - **Access exclusive content** not available to non-subscribers. - Stay updated with the latest trends in design, coding, and innovation. **Don't miss out!** Click the link below to subscribe and be part of our growing community of front-end developers and UX/UI designers. 🔗 [Subscribe Now - It's Free!](https://unicornclub.dev/ref=devto) --- Sponsored by [Webflow](https://go.unicornclub.dev/webflow-no-code) ## [Take control of HTML5, CSS3, and JavaScript in a completely visual canvas](https://go.unicornclub.dev/webflow-no-code) [![](https://unicornclub.dev/wp-content/uploads/2024/06/designer.png)](https://go.unicornclub.dev/webflow-no-code) Let Webflow translate your design into clean, semantic code that’s ready to publish to the web, or hand off to developers. [**Get started — it's free**](https://go.unicornclub.dev/webflow-no-code) ---- ## 💻 Dev UX [**Designing a website to not have 404s**](https://pillser.com/engineering/2024-06-10-website-without-404s?utm_source=unicornclub.dev&utm_medium=newsletter&utm_campaign=unicornclub.dev&ref=unicornclub.dev) I wanted to reduce lock-in to URL architecture as much as possible. One of the ways I've done it is by using fuzzy matching to redirect users to the right place when they make a typo, a product is renamed, or the logic used to generate the URL changes. 
## 🌅 CSS [**BEM Modifiers in Pure CSS Nesting**](https://dev.to/what1s1ove/bem-modifiers-in-pure-css-nesting-4j9k?utm_source=unicornclub.dev&utm_medium=newsletter&utm_campaign=unicornclub.dev&ref=unicornclub.dev) Pure CSS now includes many features that were previously missing, causing people to prefer preprocessors. [**One of the Boss Battles of CSS is Almost Won! Transitioning to Auto**](https://frontendmasters.com/blog/one-of-the-boss-battles-of-css-is-almost-won-transitioning-to-auto/?utm_source=unicornclub.dev&utm_medium=newsletter&utm_campaign=unicornclub.dev&ref=unicornclub.dev) The experimental CSS function calc-size(auto) can be used so that transitions and animations can go from zero (0) to this value. But that is unlikely to be the final syntax! [**Let’s hang! An intro to CSS Anchor Positioning with basic examples**](https://utilitybend.com/blog/lets-hang-an-intro-to-css-anchor-positioning-with-basic-examples/?utm_source=unicornclub.dev&utm_medium=newsletter&utm_campaign=unicornclub.dev&ref=unicornclub.dev) The CSS anchoring API is something that I’ve been following for at least a year now, and I’m happy to see that the first level is fully specced. --- ### **💭 Fun Fact** **Git's First Commit** - The first-ever commit made in Git was not a grand feature but a handful of bare-bones files, including a README, committed by Linus Torvalds on April 7, 2005. 
It marked the humble beginnings of what would become the world's most widely used version control system. ---- ## 🔘 Design + UX [**Is Content Design Dead?**](https://uxplanet.org/is-content-design-dead-01d8832d05c3?utm_source=unicornclub.dev&utm_medium=newsletter&utm_campaign=unicornclub.dev&ref=unicornclub.dev) The chronic undervaluation of content designers and the urgent need for change [**Shaping the Multisensory Experience: How Design Influences Taste, Smell, and Perception in Food and Beverages**](https://uxpsychology.substack.com/p/shaping-the-multisensory-experience?utm_source=unicornclub.dev&utm_medium=newsletter&utm_campaign=unicornclub.dev&ref=unicornclub.dev) In recent years, researchers have increasingly focused on how the design of food and beverage packaging, serving vessels, and dining environments can influence our perception. [**Design outside the computer**](https://eva.town/posts/design-outside-the-computer?utm_source=unicornclub.dev&utm_medium=newsletter&utm_campaign=unicornclub.dev&ref=unicornclub.dev) Your digital UI can come from the physical world. ## 🤖 Git [**git commit -m is a lie! How to commit like a pro**](https://dev.to/andrews1022/git-commit-m-is-a-lie-how-to-commit-like-a-pro-l1?utm_source=unicornclub.dev&utm_medium=newsletter&utm_campaign=unicornclub.dev&ref=unicornclub.dev) When you first started learning git, you probably learned that the way to commit something is by using git commit -m "your message". ## 🗓️ Upcoming Events Check out these events ### [🧠 Turing Fest | Tech Community, Events, & News](https://turingfest.com/?utm_source=unicornclub.dev&utm_medium=newsletter&utm_campaign=unicornclub.dev&ref=unicornclub.dev) _Edinburgh, UK_ Where product, growth, & leadership connect: Connecting founders and leaders in engineering, product, & growth to build better tech. 9-10 July. 
[See event →](https://turingfest.com/?utm_source=unicornclub.dev&utm_medium=newsletter&utm_campaign=unicornclub.dev&ref=unicornclub.dev) ### [🔘 Hatch Conference](https://www.hatchconference.com/?utm_source=unicornclub.dev&utm_medium=newsletter&utm_campaign=unicornclub.dev&ref=unicornclub.dev) _Remote • Berlin_ The event where experienced UX & Design Professionals in Europe meet to learn, get inspired and connect. 4-6 September [See event →](https://www.hatchconference.com/?utm_source=unicornclub.dev&utm_medium=newsletter&utm_campaign=unicornclub.dev&ref=unicornclub.dev) ### [💻 Middlesbrough Front End Conference 2024](https://www.middlesbroughfe.co.uk/?utm_source=unicornclub.dev&utm_medium=newsletter&utm_campaign=unicornclub.dev&ref=unicornclub.dev) _Middlesbrough, UK_ Join us for an action packed day of Front End discussion, demonstrations and networking. 17 July. [See event →](https://www.middlesbroughfe.co.uk/?utm_source=unicornclub.dev&utm_medium=newsletter&utm_campaign=unicornclub.dev&ref=unicornclub.dev) ## 🔥 Promoted Links _Share with 2,500+ readers, book a [classified ad](https://unicornclub.dev/sponsorship#classified-placement)._ [**Join a community of people in tech with side hustles - apply by June 30**](https://go.unicornclub.dev/bcm) BCM is a community of people in tech with side hustles. Founded by Saron Yitbarek, developer and founder of CodeNewbie, membership includes weekly events, masterclasses, chat-based coaching, and small group masterminds. 
#### Support the newsletter If you find Unicorn Club useful and want to support our work, here are a few ways to do that: 🚀 [Forward to a friend](https://preview.mailerlite.io/preview/146509/emails/125031340327307057) 📨 Recommend friends to [subscribe](https://unicornclub.dev/) 📢 [Sponsor](https://unicornclub.dev/sponsorship) or book a [classified ad](https://unicornclub.dev/sponsorship#classified-placement) ☕️ [Buy me a coffee](https://www.buymeacoffee.com/adammarsdenuk) _Thanks for reading ❤️ [@AdamMarsdenUK](https://twitter.com/AdamMarsdenUK) from Unicorn Club_
adam
1,901,626
SEO and Social Media Marketing Services
Introduction In the digital age, having a strong online presence is essential for...
0
2024-06-26T16:59:56
https://dev.to/theranking_geeks/seo-and-social-media-marketing-services-5cgm
seo, searchengineoptimization
## Introduction In the digital age, having a strong online presence is essential for businesses of all sizes. Two of the most effective strategies for achieving this are **[SEO and Social Media Marketing Services](https://therankinggeeks.ai/services/website-seo/)**. Both SEO and SMM play crucial roles in driving traffic, increasing brand awareness, and ultimately generating leads and sales. This blog will delve into the intricacies of these two digital marketing strategies, their benefits, and how they can work together to enhance your online visibility. ## Understanding SEO ### What is SEO? Search Engine Optimization (SEO) is the practice of optimizing a website to improve its visibility on search engine results pages (SERPs). The higher a website ranks on the SERPs, the more likely it is to be visited by users. SEO involves various techniques, including keyword research, on-page optimization, link building, and technical SEO, all aimed at making a website more attractive to search engines. ### The Importance of SEO SEO is crucial because it directly affects the visibility and discoverability of a website. With millions of websites on the internet, standing out can be challenging. SEO helps websites rank higher in search results, leading to increased organic traffic. Moreover, higher rankings on search engines like Google lend credibility to a website, as users tend to trust the top results. ### Key Components of SEO **Keyword Research:** Identifying the keywords and phrases that potential customers use to search for products or services related to your business. **On-Page SEO:** Optimizing individual web pages to rank higher and earn more relevant traffic. This includes optimizing meta tags, headers, content, and images. **Off-Page SEO:** Building backlinks from other reputable websites to increase the authority of your site. 
**Technical SEO:** Ensuring that your website meets the technical requirements of search engines, including site speed, mobile-friendliness, and secure connections (HTTPS). **Content Creation:** Developing high-quality, relevant content that satisfies user intent and encourages engagement and sharing. ## Understanding Social Media Marketing ### What is Social Media Marketing? Social Media Marketing (SMM) involves using social media platforms to promote products, services, or brands. SMM encompasses a range of activities, including content creation, community engagement, paid advertising, and analytics. Popular social media platforms for marketing include Facebook, Instagram, Twitter, LinkedIn, and Pinterest. ### The Importance of SMM SMM is vital for building brand awareness and engaging with your audience on a personal level. Social media platforms provide a direct line of communication between businesses and consumers, allowing for real-time feedback and interaction. Additionally, social media marketing helps drive traffic to your website, enhances customer loyalty, and provides valuable insights through analytics. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ja26ez6rzwdudn3881xi.jpg) ### Key Components of SMM **Content Creation:** Developing engaging and shareable content tailored to the preferences of your target audience. **Community Management:** Interacting with your audience by responding to comments, messages, and reviews to build a loyal community. **Paid Advertising:** Running targeted ads on social media platforms to reach a broader audience and achieve specific marketing goals. **Analytics and Reporting:** Monitoring and analyzing social media performance to understand what works and what doesn’t, allowing for data-driven decisions. **Influencer Marketing:** Partnering with influencers to promote your brand to their followers, leveraging their reach and credibility. 
## The Synergy Between SEO and SMM While SEO and SMM are distinct strategies, they complement each other in many ways. When used together, they can create a powerful digital marketing strategy that maximizes your online presence. ### Enhancing SEO with SMM **Increased Backlinks:** High-quality content shared on social media can attract backlinks from other websites, boosting your SEO efforts. **Content Promotion:** Social media is an excellent platform for promoting your blog posts, articles, and other content, driving traffic to your website. **Brand Signals:** Active social media profiles and engagement can signal to search engines that your brand is credible and authoritative. **User Engagement:** Social media interactions can lead to increased user engagement on your website, reducing bounce rates and increasing time on site, which are positive SEO signals. ### Enhancing SMM with SEO **Optimized Content:** SEO ensures that your content is optimized for search engines, making it easier for users to find your social media profiles and posts through organic search. **Keyword Integration:** Using relevant keywords in your social media posts can increase their visibility and reach. **Cross-Platform Consistency:** Consistent messaging and keywords across your website and social media profiles can strengthen your brand identity and search engine rankings. **Analytics Insights:** SEO tools can provide valuable insights into the keywords and content that drive the most traffic, informing your social media strategy. ## Implementing SEO and SMM Strategies ### Developing an SEO Strategy **Conduct a Website Audit:** Evaluate your website’s current performance and identify areas for improvement. **Perform Keyword Research:** Use tools like Google Keyword Planner, SEMrush, or Ahrefs to find relevant keywords with high search volume and low competition. **Optimize On-Page Elements:** Ensure that your meta tags, headers, content, and images are optimized for your target keywords. 
**Create High-Quality Content:** Develop content that addresses the needs and interests of your target audience, incorporating relevant keywords naturally. **Build Backlinks:** Reach out to reputable websites for guest posting opportunities or collaborations to earn backlinks. **Monitor and Analyze:** Use tools like Google Analytics and Google Search Console to track your SEO performance and make data-driven adjustments. ### Developing an SMM Strategy **Define Your Goals:** Identify what you want to achieve with your social media marketing, such as increasing brand awareness, driving traffic, or generating leads. **Identify Your Target Audience:** Understand the demographics, interests, and behaviors of your target audience to tailor your content and messaging. **Choose the Right Platforms:** Focus on the social media platforms that your target audience uses the most. **Create a Content Calendar:** Plan and schedule your social media posts to ensure consistent and timely content delivery. **Engage with Your Audience:** Respond to comments, messages, and reviews promptly to build a strong community. **Utilize Paid Advertising:** Invest in social media ads to reach a broader audience and achieve specific marketing objectives. **Track and Analyze Performance:** Use social media analytics tools to monitor your performance and make data-driven improvements. ## Measuring the Success of SEO and SMM ### SEO Metrics **Organic Traffic:** The number of visitors who come to your website through organic search results. **Keyword Rankings:** The positions of your target keywords on search engine results pages. **Backlinks:** The number and quality of external websites linking to your site. **Bounce Rate:** The percentage of visitors who leave your site after viewing only one page. **Conversion Rate:** The percentage of visitors who complete a desired action, such as making a purchase or filling out a form. 
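As a quick illustration of how the rate-style metrics above are computed, here is a small sketch; the sample numbers are invented purely for the example and are not real analytics data:

```javascript
// Illustrative only: invented sample numbers, applying the metric
// definitions from the text (rates expressed as percentages).
const visitors = 600;         // total site visitors
const singlePageVisits = 240; // visitors who left after viewing one page
const conversions = 24;       // visitors who completed a desired action

const bounceRate = (singlePageVisits / visitors) * 100;
const conversionRate = (conversions / visitors) * 100;

console.log(`Bounce rate: ${bounceRate.toFixed(1)}%`);         // Bounce rate: 40.0%
console.log(`Conversion rate: ${conversionRate.toFixed(1)}%`); // Conversion rate: 4.0%
```

The same division-then-scale pattern applies to the other rate metrics, such as click-through rate (clicks divided by impressions).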
### SMM Metrics **Reach:** The number of unique users who see your social media posts. **Engagement:** The number of interactions (likes, comments, shares) on your social media posts. **Follower Growth:** The increase in the number of followers on your social media profiles. **Click-Through Rate (CTR):** The percentage of users who click on a link in your social media posts. **Conversion Rate:** The percentage of social media users who complete a desired action on your website. ## Conclusion **[SEO and Social Media Marketing Services](https://therankinggeeks.ai/services/website-seo/)** are indispensable components of a comprehensive digital marketing strategy. By understanding their individual benefits and how they can work together, businesses can significantly enhance their online presence, attract more visitors, and achieve their marketing goals. Implementing effective SEO and SMM strategies requires ongoing effort, analysis, and adaptation to stay ahead in the ever-evolving digital landscape. By leveraging the power of both SEO and SMM, businesses can create a synergistic effect that drives long-term success and growth.
theranking_geeks
1,901,625
Build A Currency Converter in ReactJS | Best Beginner ReactJS Project
Are you new to ReactJS and looking for a project to enhance your skills? Building a currency...
0
2024-06-26T16:58:54
https://www.codingnepalweb.com/build-currency-converter-project-reactjs/
webdev, react, javascript, beginners
![Build A Currency Converter in ReactJS](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/rt89llpscw4axpejalv7.png) Are you new to ReactJS and looking for a project to enhance your skills? Building a currency converter using ReactJS and CSS is an excellent way to do just that. This project reinforces fundamental concepts like components, state management, and [API integration](https://www.codingnepalweb.com/category/api-projects/), while also providing a practical and rewarding application. In this blog post, I’ll guide you through building a Currency Converter using [ReactJS](https://www.codingnepalweb.com/category/reactjs/) and pure [CSS](https://www.codingnepalweb.com/category/html-and-css/). In this currency converter, users can input an amount, choose the currencies for conversion, and click a button to get the exchange rate. Additionally, the converter will display flags representing the currency codes, and users will be able to switch the currencies by clicking a swap button. ## Why Build a Currency Converter in ReactJS? By building this currency converter project using React js and CSS, you can gain the following skills: - **ReactJS Fundamentals:** You’ll get hands-on experience with React components, state management, and hooks. - **API Integration:** Learn how to interact with web services, handle asynchronous operations, and manage data fetching in React. - **CSS Skills:** Learn how to create a responsive and visually appealing UI without relying on pre-built components. - **Practical Application:** This project provides a tangible result that you can showcase in your portfolio or use in real-life scenarios. ## Video Tutorial to Build Currency Converter in ReactJS {% embed https://www.youtube.com/watch?v=0_Lwi5ucGwM %} If you prefer video tutorials, the YouTube video above is a great resource. It explains each line of code and provides comments to make the process of building your ReactJS currency converter easy to follow. 
If you prefer reading or need a step-by-step guide, keep following this post. **Setting Up the Project** Before we start making the [currency converter](https://www.codingnepalweb.com/currency-converter-app-in-html-javascript/) with ReactJS and CSS, make sure you have Node.js installed on your computer. If you don’t have it, you can download and install it from the official [Node.js](https://nodejs.org/en/download/prebuilt-installer) website. **Create a Project Folder** - Make a new folder, for instance, “currency-converter”. - Open this folder in your VS Code editor. **Initialize the Project** Use [Vite](https://vitejs.dev/) to create a new React app with this command: ``` npm create vite@latest ./ -- --template react ``` **Install Necessary Dependencies** ``` npm install ``` **Start the Development Server** ``` npm run dev ``` If your project is up and running in your browser, congratulations! You’ve successfully set up your project. Now, let’s move on to the next step: modifying folders and files. **Modify Folders and CSS Files:** - Remove the default `assets` folder and `App.css` file. - Download the [bg.png](https://codingnepalweb.com/demos/build-currency-converter-project-reactjs/bg.png) image and place it inside the public folder. - Replace the content of `index.css` with the provided code. 
```css /* Import Google Font */ @import url('https://fonts.googleapis.com/css2?family=Montserrat:ital,wght@0,100..900;1,100..900&display=swap'); * { margin: 0; padding: 0; box-sizing: border-box; font-family: "Montserrat", sans-serif; } body { display: flex; align-items: center; justify-content: center; min-height: 100vh; background: url("bg.png") #030728 no-repeat center; } #root { width: 100%; } .currency-converter { max-width: 410px; margin: 0 auto; padding: 40px 30px 50px; border-radius: 8px; backdrop-filter: blur(30px); background: rgba(2, 7, 40, 0.5); border: 1px solid rgba(255, 255, 255, 0.3); box-shadow: 0 8px 32px rgba(0, 0, 0, 0.5); } .currency-converter .converter-title { color: #fff; font-size: 1.65rem; font-weight: 600; text-align: center; } .currency-converter .converter-form { margin-top: 45px; } .converter-form .form-group { display: flex; margin-bottom: 30px; flex-direction: column; } .converter-form .form-group .form-label { color: #fff; font-weight: 500; display: block; margin-bottom: 9px; font-size: 1rem; } .converter-form .form-group .form-input { outline: none; font-size: 1.1rem; padding: 0 15px; color: #fff; font-weight: 500; min-height: 48px; border-radius: 6px; background: rgba(255, 255, 255, 0.1); border: 1px solid rgba(255, 255, 255, 0.5); } .converter-form .form-currency-group { flex-direction: row; align-items: center; justify-content: space-between; } .form-currency-group .currency-select { display: flex; padding: 0 10px; min-height: 45px; align-items: center; border-radius: 6px; background: rgba(255, 255, 255, 0.1); border: 1px solid rgba(255, 255, 255, 0.5); } .form-currency-group .currency-select img { width: 25px; } .form-currency-group .currency-select .currency-dropdown { outline: none; border: none; background: none; color: #fff; font-size: 1rem; font-weight: 500; padding: 0 10px 0 5px; } .form-currency-group .currency-select .currency-dropdown option { color: #000; font-weight: 500; } .form-currency-group .swap-icon { height: 
40px; width: 40px; cursor: pointer; display: flex; margin-top: 25px; align-items: center; justify-content: center; border-radius: 50%; background: rgba(255, 255, 255, 0.1); border: 1px solid rgba(255, 255, 255, 0.5); transition: 0.2s ease; } .form-currency-group .swap-icon:hover { background: rgba(255, 255, 255, 0.3); } .converter-form .submit-button { width: 100%; min-height: 52px; border-radius: 6px; border: none; outline: none; font-size: 1rem; font-weight: 600; cursor: pointer; margin-top: 5px; transition: 0.2s ease; } .converter-form .submit-button.loading { opacity: 0.7; pointer-events: none; } .converter-form .submit-button:hover { background: rgba(255, 255, 255, 0.7); } .converter-form .exchange-rate-result { color: #fff; font-size: 1.1rem; font-weight: 600; text-align: center; padding: 25px 0; margin-top: 25px; border-radius: 6px; letter-spacing: 0.5px; background: rgba(255, 255, 255, 0.15); } @media (max-width: 640px) { body { padding: 0 10px; } .currency-converter { padding: 30px 20px 40px; } } ``` ## Creating the Components Within the **src** directory of your project, organize your files by creating a “components” folder. Inside the components folder, create the following files: - ConverterForm.jsx - CurrencySelect.jsx ## Adding the Codes Add the respective code to each newly created file. These files define the layout and functionality of the currency converter. In `components/ConverterForm.jsx`, add the code for rendering the converter form, and fetch the exchange rate with an API call. 
```javascript import { useEffect, useState } from "react"; import CurrencySelect from "./CurrencySelect"; const ConverterForm = () => { const [amount, setAmount] = useState(100); const [fromCurrency, setFromCurrency] = useState("USD"); const [toCurrency, setToCurrency] = useState("INR"); const [result, setResult] = useState(""); const [isLoading, setIsLoading] = useState(false); // Swap the values of fromCurrency and toCurrency const handleSwapCurrencies = () => { setFromCurrency(toCurrency); setToCurrency(fromCurrency); } // Function to fetch the exchange rate and update the result const getExchangeRate = async () => { const API_KEY = "PASTE-YOUR-API-KEY"; const API_URL = `https://v6.exchangerate-api.com/v6/${API_KEY}/pair/${fromCurrency}/${toCurrency}`; if (isLoading) return; setIsLoading(true); try { const response = await fetch(API_URL); if (!response.ok) throw Error("Something went wrong!"); const data = await response.json(); const rate = (data.conversion_rate * amount).toFixed(2); setResult(`${amount} ${fromCurrency} = ${rate} ${toCurrency}`); } catch (error) { setResult("Something went wrong!"); } finally { setIsLoading(false); } } // Handle form submission const handleFormSubmit = (e) => { e.preventDefault(); getExchangeRate(); } // Fetch exchange rate on initial render (call the function; don't return it as a cleanup) // eslint-disable-next-line react-hooks/exhaustive-deps useEffect(() => { getExchangeRate(); }, []); return ( <form className="converter-form" onSubmit={handleFormSubmit}> <div className="form-group"> <label className="form-label">Enter Amount</label> <input type="number" className="form-input" value={amount} onChange={(e) => setAmount(e.target.value)} required /> </div> <div className="form-group form-currency-group"> <div className="form-section"> <label className="form-label">From</label> <CurrencySelect selectedCurrency={fromCurrency} handleCurrency={e => setFromCurrency(e.target.value)} /> </div> <div className="swap-icon" onClick={handleSwapCurrencies}> <svg width="16" viewBox="0 0 20 19" 
xmlns="http://www.w3.org/2000/svg"> <path d="M19.13 11.66H.22a.22.22 0 0 0-.22.22v1.62a.22.22 0 0 0 .22.22h16.45l-3.92 4.94a.22.22 0 0 0 .17.35h1.97c.13 0 .25-.06.33-.16l4.59-5.78a.9.9 0 0 0-.7-1.43zM19.78 5.29H3.34L7.26.35A.22.22 0 0 0 7.09 0H5.12a.22.22 0 0 0-.34.16L.19 5.94a.9.9 0 0 0 .68 1.4H19.78a.22.22 0 0 0 .22-.22V5.51a.22.22 0 0 0-.22-.22z" fill="#fff" /> </svg> </div> <div className="form-section"> <label className="form-label">To</label> <CurrencySelect selectedCurrency={toCurrency} handleCurrency={e => setToCurrency(e.target.value)} /> </div> </div> <button type="submit" className={`${isLoading ? "loading" : ""} submit-button`}>Get Exchange Rate</button> <p className="exchange-rate-result"> {/* Display the conversion result */} {isLoading ? "Getting exchange rate..." : result} </p> </form> ) } export default ConverterForm ``` Remember to paste your API key into the `API_KEY` variable within the getExchangeRate() function. For security, consider storing your API key in a .env file in your project’s root directory. You can get a free API key from the [ExchangeRate-API](https://www.exchangerate-api.com/) website. In `components/CurrencySelect.jsx`, add the code for rendering the currency selection dropdowns. 
```javascript // Array of currency codes const currencyCodes = [ "AED", "AFN", "ALL", "AMD", "ANG", "AOA", "ARS", "AUD", "AWG", "AZN", "BAM", "BBD", "BDT", "BGN", "BHD", "BIF", "BMD", "BND", "BOB", "BRL", "BSD", "BTN", "BWP", "BYN", "BZD", "CAD", "CDF", "CHF", "CLP", "CNY", "COP", "CRC", "CUP", "CVE", "CZK", "DJF", "DKK", "DOP", "DZD", "EGP", "ERN", "ETB", "EUR", "FJD", "FKP", "FOK", "GBP", "GEL", "GGP", "GHS", "GIP", "GMD", "GNF", "GTQ", "GYD", "HKD", "HNL", "HRK", "HTG", "HUF", "IDR", "ILS", "IMP", "INR", "IQD", "IRR", "ISK", "JEP", "JMD", "JOD", "JPY", "KES", "KGS", "KHR", "KID", "KMF", "KRW", "KWD", "KYD", "KZT", "LAK", "LBP", "LKR", "LRD", "LSL", "LYD", "MAD", "MDL", "MGA", "MKD", "MMK", "MNT", "MOP", "MRU", "MUR", "MVR", "MWK", "MXN", "MYR", "MZN", "NAD", "NGN", "NIO", "NOK", "NPR", "NZD", "OMR", "PAB", "PEN", "PGK", "PHP", "PKR", "PLN", "PYG", "QAR", "RON", "RSD", "RUB", "RWF", "SAR", "SBD", "SCR", "SDG", "SEK", "SGD", "SHP", "SLE", "SLL", "SOS", "SRD", "SSP", "STN", "SYP", "SZL", "THB", "TJS", "TMT", "TND", "TOP", "TRY", "TTD", "TVD", "TWD", "TZS", "UAH", "UGX", "USD", "UYU", "UZS", "VES", "VND", "VUV", "WST", "XAF", "XCD", "XOF", "XPF", "YER", "ZAR", "ZMW", "ZWL" ]; const CurrencySelect = ({ selectedCurrency, handleCurrency }) => { // Extract the country code from the selected currency code const countryCode = selectedCurrency.substring(0, 2); return ( <div className="currency-select"> <img src={`https://flagsapi.com/${countryCode}/flat/64.png`} alt="Flag" /> <select onChange={handleCurrency} className="currency-dropdown" value={selectedCurrency} > {currencyCodes.map(currency => ( <option key={currency} value={currency}>{currency}</option> ))} </select> </div> ) } export default CurrencySelect ``` Finally, replace the content of `src/App.jsx` with the provided code. It imports and renders the ConverterForm component. 
```javascript import ConverterForm from "./components/ConverterForm"; const App = () => { return ( <div className="currency-converter"> <h2 className="converter-title">Currency Converter</h2> <ConverterForm /> </div> ) } export default App; ``` That’s it! If you’ve completed all the steps, you should now see your currency converter project in your browser. You can interact with it by entering an amount, selecting currencies, and clicking the swap button to switch between them. ## Conclusion and final words Building a currency converter with [ReactJS](https://www.codingnepalweb.com/react-vs-angular-which-is-best-for-frontend/) and CSS is a rewarding project for beginners. It enhances your knowledge of React components, state management, and hooks, while also giving you hands-on experience with API integration and CSS styling. Keep experimenting with additional features like historical exchange rates or different themes to continue learning and improving your skills. If you encounter any issues, you can download the source code files for free by clicking the “Download” button and view a live demo by clicking the “View Live” button. After downloading the zip file, unzip it and open the “currency-converter” folder in VS Code. Then, open `components/ConverterForm.jsx` file and paste your API key into the `API_KEY` variable within the getExchangeRate() function. For security, consider storing your API key in a .env file in your project’s root directory. You can get a free API key from the [ExchangeRate-API](https://www.exchangerate-api.com/) website. Finally, open the terminal by pressing Ctrl + J and run these commands to view your currency converter project in the browser: ``` npm install npm run dev ``` [View Live Demo](https://www.codingnepalweb.com/demos/build-currency-converter-project-reactjs/) [Download Code Files](https://www.codingnepalweb.com/build-currency-converter-project-reactjs/)
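As an optional sanity check, the rate math inside getExchangeRate() can be exercised in isolation. The helper below is a hypothetical stand-alone version of that calculation — it is not part of the tutorial's files — useful for verifying the rounding behavior without calling the API:

```javascript
// Hypothetical helper mirroring the calculation in getExchangeRate():
// multiply the amount by the conversion rate and round to 2 decimals.
const formatConversion = (amount, conversionRate, fromCurrency, toCurrency) => {
  const rate = (conversionRate * amount).toFixed(2);
  return `${amount} ${fromCurrency} = ${rate} ${toCurrency}`;
};

console.log(formatConversion(100, 83.5, "USD", "INR")); // 100 USD = 8350.00 INR
```

Because `toFixed(2)` returns a string, the displayed result always shows two decimal places, matching what the component renders.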
codingnepal
1,901,624
Could not locate module in test using @app
I'm trying to work with tests in NestJS now and I couldn't move forward due to a problem with the...
0
2024-06-26T16:58:39
https://dev.to/fgsl/could-not-locate-module-in-test-using-app-2im8
I'm trying to work with tests in NestJS now and I couldn't move forward due to a problem with the location of the modules. The application was already responding; the problems started when I tried to run the tests. The first try gave me several errors like this: ``` Cannot find module 'src/common/dto/paginated-response.dto' from 'applications/applications.service.ts' ``` According to my reading of [Tejas](https://stackoverflow.com/questions/53963007/error-while-running-nestjs-in-production-mode-cannot-find-module), I understood that the problem was the relative path starting with 'src/...'. Then, I read the tutorial by [Amir Mustafa](https://amirmustafaofficial.medium.com/nest-js-converting-relative-path-to-absolute-path-fea5b22dce47) on converting relative paths to absolute paths via configuration. After replacing the relative paths with the `@app` key, I managed to run the application, but the tests failed again. This time, the errors were like this: ``` Cannot find module '@app/users/users.controller' from 'users/users.controller.spec.ts' ``` After reading the question/answer by [dragons0458](https://stackoverflow.com/questions/61356672/error-when-i-try-to-test-with-nestjs-jest-and-graphqlfederationmodule), I understood that I should configure moduleNameMapper inside the jest section of package.json. The section now looks like this: ``` "jest": { "moduleFileExtensions": [ "js", "json", "ts" ], "rootDir": "src", "testRegex": ".*\\.spec\\.ts$", "transform": { "^.+\\.(t|j)s$": "ts-jest" }, "collectCoverageFrom": [ "**/*.(t|j)s" ], "coverageDirectory": "../coverage", "testEnvironment": "node", "moduleNameMapper": { "^@app/(.*)": "<rootDir>./dist/$1" } } ``` The third try gave me the new errors below, but now I don't know what to do. It looks to me like the $1 in moduleNameMapper is not being converted to a path. ``` Configuration error: Could not locate module @app/gateways/gateways.controller mapped as: [MY HOME]/backend_nestjs/src/dist/$1. 
Please check your configuration for these entries: { "moduleNameMapper": { "/^@app\/(.*)/": "[MY HOME]/backend_nestjs/src/dist/$1" }, "resolver": undefined } 1 | import { Test, TestingModule } from '@nestjs/testing'; > 2 | import { GatewaysController } from '@app/gateways/gateways.controller'; | ^ 3 | import { GatewaysService } from '@app/gateways/gateways.service'; 4 | 5 | describe('GatewaysController', () => { at createNoMappedModuleFoundError (../node_modules/jest-resolve/build/resolver.js:759:17) at Object.<anonymous> (gateways/gateways.controller.spec.ts:2:1) ``` The application is answering on port 3000, but the tests (12 tests) fail. What is missing for the correct module configuration? I am pasting two files here that I think can help you understand my configuration: **tsconfig.json** ``` { "compilerOptions": { "module": "commonjs", "declaration": true, "removeComments": true, "emitDecoratorMetadata": true, "experimentalDecorators": true, "allowSyntheticDefaultImports": true, "target": "ES2021", "sourceMap": true, "outDir": "./dist", "baseUrl": "./", "incremental": true, "skipLibCheck": true, "strictNullChecks": false, "noImplicitAny": true, "strictBindCallApply": false, "forceConsistentCasingInFileNames": false, "noFallthroughCasesInSwitch": false, "paths": { "@app/*": ["./src/*"] } } } ``` **jest-e2e.json** ``` { "moduleFileExtensions": ["js", "json", "ts"], "rootDir": ".", "testEnvironment": "node", "testRegex": ".e2e-spec.ts$", "transform": { "^.+\\.(t|j)s$": "ts-jest" } } ``` Thank you very much.
fgsl
1,901,442
How I started to code in 1991 ?
Hey everyone! Amir here from Alexandria, Egypt. Today, I want to take you on a nostalgic journey back...
0
2024-06-26T16:57:29
https://dev.to/bekbrace/atari-800xl-1991-coding-and-father-son-moments-5846
coding, programming, codenewbie, watercooler
Hey everyone! Amir here from Alexandria, Egypt. Today, I want to take you on a nostalgic journey back to my childhood, where the Atari 800XL and my dad, Samuel, played starring roles. This isn't just a story about a computer; it's about the joy of learning, the excitement of discovery, and the special bond between a father and son. ## The Atari 800XL: A Magical Machine Let's start with the Atari 800XL. If you've ever used one, you know it was something special. For those who haven't, picture a sleek, compact machine with a world of possibilities inside. My dad and I would sit in front of it for hours, surrounded by cartridges, cassette tapes, and an old TV that served as our monitor. {% youtube r1zIlWvRj3Y %} ## Gaming Memories Who can forget the thrill of loading up games like Pong and Missile Command? We'd insert those chunky cartridges or wait patiently as the cassette tapes whirred and clicked, bringing our favorite games to life. The graphics were simple, but the excitement was real. Each game was a new adventure, and we loved every minute of it. ## Coding with Dad But it wasn't just about playing games. My dad had this programming book for BASIC, and he'd sit beside me, guiding me through writing my first lines of code. Imagine a little kid, eyes wide with curiosity, typing away on that old keyboard. My dad was always so patient, explaining each concept and encouraging me to keep going, even when I got frustrated. Those coding sessions were magical. They weren't just about learning to program; they were about solving problems, thinking logically, and discovering the power of technology. And most importantly, they were about spending quality time with my dad. ## Technology in Egypt in 1991 Back in 1991, technology in Egypt wasn't very advanced. We had a computer called Sakhr, but I didn't think that programming in Arabic was a good idea. I always believed that programming should be done in English. 
This belief made the experience with the Atari 800XL even more special because it opened up a world beyond our local limitations. ## The Sounds of Nostalgia Our room was filled with the sounds of loading programs, triumphant beeps from the games, and our laughter. It was a cozy, memory-filled atmosphere that I still cherish. The Atari 800XL wasn't just a computer; it was a gateway to a world of learning and fun. ## A Bit of Advice Now, a bit of advice from me, Amir, to all the parents out there or if you're 18 or younger: learn to code. It's incredibly useful and can open up so many opportunities for you. A lot of professions are already disappearing due to automation and advancements in technology. Jobs like data entry clerks, travel agents, and even certain manufacturing roles are becoming obsolete. By learning to code, you equip yourself with a valuable skill that is relevant in almost every industry today. Start with something simple, like HTML and CSS for web development, or Python for general programming. There are plenty of free resources online to get you started. Encourage your kids to explore coding through fun projects and games. It's not just about preparing for the future job market, but also about fostering creativity and problem-solving skills. ## Share Your Stories! I'd love to hear your stories, too. Did you grow up with a retro computer? What were your favorite games or coding memories? Drop a comment below and let's reminisce together! Thanks for joining me on this trip down memory lane. Until next time, keep coding and cherish those nostalgic moments!
bekbrace
1,901,623
CONCEPTS OF CLOUD COMPUTING
virtualization? Virtualization is a technology that makes use of the host OS of a single computer...
0
2024-06-26T16:57:27
https://dev.to/emerivic/concepts-of-cloud-computing-kkn
cloudcomputing, virtualmachine
- Virtualization? Virtualization is a technology that makes use of the host OS of a single computer to create multiple guest OSes on top of it. This is done with the help of virtualization software known as a HYPERVISOR. The guest OSes created can take the form of servers, networks, and storage, depending on the required tasks. To make virtualization easier to understand, here is an example. A cloud engineer, John Doe, working on a Mac laptop (macOS), has a project to carry out that requires the operating system of a Windows laptop. It would be an extra cost for John to purchase a new Windows laptop to complete the project. So what John does is install a virtual machine (through a hypervisor such as VMware) that lets him run a Windows OS on his Mac laptop. This virtualization saves John the extra cost of purchasing another device, and also saves him the time of working on two different computers.  - Scalability? Scalability refers to the ability to scale up (increase) or scale down (reduce) the resources (such as databases and servers) required to meet the demand of an organization or individual using cloud services. You might ask why a company would want to scale its resources up or down; below is a scenario that shows why it is important and what makes scalability one of the driving forces of cloud services. E-commerce companies such as Amazon, eBay, and Walmart, where you can shop online, tend to get more users and more website traffic during special sales such as Black Friday. During this period, a large number of users may visit the website at the same time, which might be beyond the maximum capacity the website can handle. This can reduce the performance of the site or even cause a crash. 
But with the aid of scalability and monitoring of the website's performance, more servers can be deployed during periods of large user inflow, in order to maintain stable performance. And when the site has returned to its normal average traffic, it can be automatically scaled down to fit the current demand. - Agility? This is the ability to respond to customers' changing needs and market conditions, through the use of cloud computing capabilities such as scalability, automation, flexibility, and innovation. In the scenario given above (on scalability), the resources of the website were scaled up due to the increase in traffic; the speed of response to such changes is known as agility. - High availability? High availability makes it possible for a system to function and operate continuously, without failure or shortage, for a specific period of time or for as long as it is needed. This aspect of cloud computing helps to reduce downtime, shorten recovery time (if a failure does occur), and automate the recovery process. It can be achieved through various components such as redundancy (deploying multiple copies of a critical component, so that the next one can take over if the first fails), load balancing (distributing incoming network traffic across different servers to avoid overloading a critical component), data replication, and so on. A scenario here is the case of Amazon during Black Friday: multiple web servers are deployed in various regions to avoid a crash or slow website performance, and with the help of a load balancer, traffic is shared among the multiple web servers. These processes help to keep the site performing well and running for as long as it is needed. - Fault tolerance? Just as high availability keeps the system operating continuously, fault tolerance helps the system operate properly when there is a component failure or shortage. 
This is possible due to the presence of multiple components and servers. A scenario: when there is a component failure on the Amazon site due to heavy traffic, the system is able to switch to the next available critical component by engaging other virtual machines provisioned in the system and available in different zones, in order for it to keep functioning. - Global reach? The ability of cloud computing services to be made available to, and utilized by, both individuals and organizations around the world is known as global reach. Cloud providers such as Microsoft Azure, Amazon Web Services (AWS), and Google Cloud Platform (GCP) make it possible for cloud services to be available all over the world, and their use of content delivery networks (CDNs) makes these services fast to access. A company like Amazon, wanting to expand its services and operations to other countries, uses its cloud service (AWS) to help deploy its applications across those countries. This makes it possible for users in each country to experience low latency, due to the availability of various data centers close by.  - What is the difference between elasticity and scalability? Elasticity is the ability of a system to expand and shrink its resources automatically in response to current demand. Scalability, on the other hand, is the ability of a system to grow in order to accommodate a growing amount of work. 
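The scale-up/scale-down behaviour described above can be sketched in a few lines of JavaScript. This is an illustrative sketch only: the threshold numbers and function names are made up for the example, not any cloud provider's real API.

```javascript
// Decide how many web servers to run based on current traffic.
// Assumption for this sketch: each server handles up to 1,000 concurrent users.
const USERS_PER_SERVER = 1000;
const MIN_SERVERS = 2; // keep some redundancy even at low traffic

function desiredServerCount(currentUsers) {
  const needed = Math.ceil(currentUsers / USERS_PER_SERVER);
  return Math.max(needed, MIN_SERVERS);
}

// Normal traffic: the minimum fleet is enough.
console.log(desiredServerCount(1500)); // 2

// Black Friday spike: scale up to absorb the load...
console.log(desiredServerCount(12500)); // 13

// ...and scale back down when traffic returns to normal.
console.log(desiredServerCount(800)); // 2
```

A real autoscaler applies the same idea continuously, re-evaluating a metric (users, CPU, requests per second) and adjusting the server count to match.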
emerivic
1,901,621
Exploring OpenAI: Revolutionizing Industries with Artificial Intelligence
Artificial Intelligence (AI) has rapidly evolved over the past decade, transforming various sectors...
0
2024-06-26T16:52:31
https://dev.to/secretsauce/exploring-openai-revolutionizing-industries-with-artificial-intelligence-4pgm
Artificial Intelligence (AI) has rapidly evolved over the past decade, transforming various sectors and offering innovative solutions to complex problems. Among the leading organizations at the forefront of AI development is OpenAI. This research lab is dedicated to ensuring that artificial general intelligence (AGI) benefits all of humanity. In this blog, we'll delve into what OpenAI is, its mission, and how it is impacting different industries. We'll also highlight some noteworthy businesses and services that can potentially benefit from AI technologies. ## What is OpenAI? OpenAI is an AI research and deployment company founded in December 2015 by Elon Musk, Sam Altman, Greg Brockman, Ilya Sutskever, John Schulman, and Wojciech Zaremba. The organization aims to ensure that AGI, highly autonomous systems that outperform humans at most economically valuable work, benefits all of humanity. OpenAI conducts cutting-edge research in the field of AI, developing technologies that can be safely and widely used. ## OpenAI's Mission and Vision OpenAI’s mission is to ensure that artificial general intelligence (AGI) benefits all of humanity. They believe that AGI has the potential to help address some of the world's most pressing problems, such as climate change, healthcare, and education. To achieve this mission, OpenAI focuses on the following key areas: - **AI Research:** Conducting fundamental research to advance the field of AI and develop new models and algorithms. - **AI Policy:** Ensuring that AI technologies are developed and used in a way that is safe, ethical, and beneficial to society. - **AI Safety:** Researching methods to make AI systems more robust, secure, and aligned with human values. ## Impact on Various Industries OpenAI's technologies have far-reaching implications across numerous industries. 
Here are some sectors where AI is making a significant impact: - **Manufacturing and Engineering:** Companies like [Plastic Fusion](https://www.plasticfusion.com/) are leveraging AI for improved precision and efficiency in HDPE pipe fusion, enhancing quality control and operational efficiency. - **Waste Management:** [TriHaz Solutions](https://www.trihazsolutions.com/) can utilize AI to optimize waste disposal processes, ensure regulatory compliance, and enhance environmental sustainability. - **Photography and Media:** [Homely Huntsville](https://homelyhuntsville.com/) and [Homely Birmingham](https://homelybirmingham.com/) can benefit from AI-driven tools for image recognition, automated editing, and enhanced customer engagement through personalized content. - **E-commerce and Retail:** [The Trade Table](https://thetradetable.com/) can integrate AI for better inventory management, personalized shopping experiences, and predictive analytics to anticipate market trends. - **Trades and Services:** [Talking Tradesmen](https://talkingtradesmen.com/) and [Tradesmen Agency](https://tradesmenagency.com/) can utilize AI to match skilled tradesmen with job opportunities more effectively, manage projects, and streamline operations. - **Cleaning Services:** [Bear Brothers Cleaning](https://bearbroscleaning.com/) can adopt AI technologies for efficient scheduling, improved cleaning methodologies, and enhanced customer service through AI-powered chatbots. ## Conclusion OpenAI stands at the cutting edge of artificial intelligence research, with a mission to ensure that AGI benefits all of humanity. Its technologies hold the potential to revolutionize various industries, offering innovative solutions and improving operational efficiencies. Companies like Plastic Fusion, TriHaz Solutions, Homely Huntsville, Homely Birmingham, The Trade Table, Talking Tradesmen, Tradesmen Agency, and Bear Brothers Cleaning are prime examples of how diverse sectors can harness the power of AI. 
By staying informed and embracing AI advancements, businesses and individuals alike can be part of a future where technology and human ingenuity work hand in hand to solve some of the world's most pressing challenges. For more information on these companies and their services, check out their respective links and see how they are poised to benefit from AI integration. Explore the possibilities and stay ahead in the ever-evolving landscape of artificial intelligence with OpenAI.
secretsauce
1,901,619
CREATING AN IOT HUB IN AZURE PORTAL
The aim of this walkthrough is to create an IoT hub in Azure portal and configure the hub to...
27,629
2024-06-26T16:48:30
https://dev.to/aizeon/creating-an-iot-hub-in-azure-portal-k02
azure, tutorial, iot, raspberrypi
The aim of this walkthrough is to create an IoT hub in Azure portal and configure the hub to authenticate a connection to an IoT device using the Raspberry Pi device simulator. ## **PREREQUISITE** - Working computer - Internet connection - Microsoft Azure account + active subscription ## **PROCEDURE** ### **LOCATE THE IOT HUB SERVICE** Open the Azure portal and type “IoT” in the search bar at the top. Click on “IoT Hub” under services as seen in the image below. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/63taonmw5pbhw2bms5v8.jpeg) ### **CREATE AN IOT HUB** On the IoT Hub service webpage that loads, click on the “Create” or “Create IoT hub” button as you deem fit. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/js6f9q2vludwmvup53au.jpeg) You will be directed to the “Basics” page. The first part of the “Basics” page is the “Project details” section where you are asked to select the subscription and resource group under which you want to create the IoT hub. _PS: In case you want a new resource group, creating a resource group just requires you to provide a name in the input box provided after clicking on “Create new” beneath the “Resource group” input box._ The next section is “Instance details” where you can input an IoT hub name of choice, select a region, tier and daily message limit as required. Afterwards, click on the “Review + create” button. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/p32yy8lzedsfd398635h.jpeg) A page as shown should appear showing the pricing for the IoT hub specifications selected and the details of the hub. Click on the “Create” button. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/igfly6lhxr57bjh3pt8p.jpeg) There will be a pop-up at the top right showing the status of the deployment. 
![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/89xssk8m9midmppx04nu.jpeg) ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/khge1owzoi5fd55dajdm.jpeg) ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/yqt52if66pgeo2csknpl.jpeg) You will be directed to an IoT hub deployment page which goes through several phases that you might need to be patient for. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/y62cw226mldw5h30xh5k.jpeg) ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/oglvz33ui372ajwndh64.jpeg) When deployment has been completed, click on “Go to resource”. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/c1r6jy83sgz706omw6hp.jpeg) The IoT hub resource page loads. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/h0bqzgy4vxhy9bez1gu2.jpeg) Scroll down to the “Usage” section and take note of “IoT Hub Usage”, which contains messages used and IoT devices—_they should have values of 0 at the moment._ ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ggj8mwnk8p6ctig2lcch.jpeg) ### **ADD IOT DEVICE** On the side menu bar, click on “Device management” and then, “Devices”. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/bh13fog9pf14oy4p7epf.jpeg) On the Devices webpage that loads, click on “Add Device”. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/fk3394rbyx82jg1jalmf.jpeg) “Create a device” webpage loads, where a “Device ID” can be inputted. Then, click on the “Save” button. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/x0z4dpxnfn5o1cijnpye.jpeg) When that is completed, a notification is sent and the newly created device is listed on the “Devices” webpage. 
![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/h9nii4mrl4s42vv3ok4s.jpeg) ### **TEST THE IOT DEVICE** Open the new device and copy the “Primary connection string”—as it will be used to link the device to Raspberry Pi device simulator. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/8750ps74cqovgr5h4fj7.jpeg) Access the Raspberry Pi device simulator on another webpage using this link: `https://azure-samples.github.io/raspberry-pi-web-simulator/` Take note of the unblinking LED which symbolises that a connection to a device has not been established. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/27pp7o5kfwjh7om0tbno.jpeg) In the sample code, locate the line that reads “`const connectionString = ‘[Your IoT hub device connection string]’`” (line 15). Replace the quoted section with the primary connection string that was copied earlier and run the code. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/1szuuchj07xldlp03aop.jpeg) The terminal should have alternating lines of “Sending message” and “Message sent to Azure IoT Hub”. Also, take note of the blinking diode to depict device connection. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/2driwg9sgdblka6j9tm1.jpeg) Heading back to “IoT Hub Usage” on the Azure portal should give new details showing the number of connected IoT devices and a message count that should coincide with the “messageID” in the simulator terminal. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/q26j0at3fktyxrm0eh25.jpeg) _That’s all folks!_
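For reference, the value you paste into line 15 of the simulator is an Azure IoT Hub device connection string: a semicolon-separated list of `key=value` pairs. The values below are placeholders for illustration, not real credentials; use the primary connection string copied from your own device.

```javascript
// Line 15 of the simulator sample, after pasting your device's
// primary connection string (placeholder values shown):
const connectionString =
  'HostName=my-iot-hub.azure-devices.net;DeviceId=my-device;SharedAccessKey=<base64-key>';

// The string is a semicolon-separated list of key=value pairs.
// Split each pair on the FIRST '=' only, since keys can contain '=' padding.
const parts = Object.fromEntries(
  connectionString.split(';').map((pair) => pair.split(/=(.*)/s).slice(0, 2))
);
console.log(parts.HostName); // my-iot-hub.azure-devices.net
console.log(parts.DeviceId); // my-device
```

If the simulator reports an authentication error, checking that all three parts are present and unmodified is a good first step.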
aizeon
1,901,616
Banking management system
Hello guys I'm here to notify that I'm going to create a banking management system with the help of...
0
2024-06-26T16:44:46
https://dev.to/rahul_singh_28e35238348c4/banking-management-system-ij0
java, sql
Hello guys, I'm here to announce that I'm going to create a banking management system with the help of Java and its GUI libraries, i.e., Swing and AWT. For the database I have used SQL.
rahul_singh_28e35238348c4
1,901,612
EVM Reverse Engineering Challenge 0x00
It's been a while since I last blogged, so I think it's time. I've been writing a sandwich bot, mostly...
27,871
2024-06-26T16:43:31
https://gealber.com/evm-reverse-challenge-0x00
ethereum, evm
It's been a while since I last blogged, so I think it's time. I've been writing a sandwich bot, mostly at night after work, which according to my forecast will take me my whole life :). In case you don't know what a sandwich bot is, let me tell you it is not a robot that makes sandwiches; that would be awesome. Just google `MEV sandwich Ethereum`, that should give you a good start. Thanks to this I've learned more and more about the EVM (Ethereum Virtual Machine) in general, learning how to optimize a contract to spend less and less gas. This post is not about [MEV](https://ethereum.org/en/developers/docs/mev/) or explaining what it is; there's a lot of information about it online. Instead, it's more about starting something similar to the [Reverse Engineering challenges](https://challenges.re/) by [Dennis Yurichev](https://yurichev.com/), but focused on the EVM instead of traditional reverse engineering. This will be the first of a series of blog posts that I'll try to keep up, if laziness or another millionaire idea, like a sandwich bot, doesn't distract me. The format of the challenges will be quite simple: I'll share with you one contract address on the Ethereum Mainnet or the TON Mainnet (not sure about this yet), the contract will hold 1 USDT, and your goal will be to exploit the contract and get the 1 USDT for yourself. Given that I'm far from being rich, I'll limit the number of challenges to 100; I hope I'll have enough material for it. After someone exploits a contract, the 1 USDT won't be in it anymore, removing the "incentive" for someone else to exploit the contract again. I know 1 USDT is not such a big incentive either. It's better to be first, though: you'll get a whole 1 USDT, WAOOO. The challenges will go from easier to harder. I'll try by all means to keep it that way, but given that I'll also be learning with this, I cannot assure you of that. 
In order to exploit these contracts, it is recommended that you first simulate your transaction. On Ethereum that can be done quite precisely with [Tenderly](https://tenderly.co/) or [temper](https://github.com/EnsoFinance/temper); on TON I really don't know of a good tool for that. Nevertheless, this first challenge is on Ethereum. Please simulate before submitting a transaction; don't spend gas for nothing. Enough talk, show me the stuff, I want my 1 USDT!!! I'm sure this is what you are thinking at the moment: you want to crack this shit and get the 1 USDT to show off with your friends. Don't worry my friend, here is the contract address: ``` 0xC3fA91C26307EA9c59334c55e23d8bf19f555Ea8 ``` Feel free to exploit it, and earn that 1 USDT. HINT: 2 + 2 = ? PS: possibly the txn will cost you more than 1 USDT, because, well, it is Ethereum...
gealber
1,901,614
PROJECT ENIGMA -> NO. 1 - ZIP-ZAP (Personalized URL file sharing app)
Posting this as an update for Buildspace Night & Weekend S5. Update No. 1 -> Day 1 (defining of...
0
2024-06-26T16:43:09
https://dev.to/star173/project-enigma-no-1-zip-zap-personalized-url-file-sharing-app-3c1d
buildspace, nws5, opensource, discuss
Posting this as an update for Buildspace Night & Weekend S5. Update No. 1 -> Day 1 (defining the problem) & Day 2 (working out the layout) ## Problem I constantly need to move files around my various machines. I often use the cloud, but as my accounts have 2FA, it takes a bit of time to log in. Hence, when in a hurry, I often turn to the file transfer websites available on the web. The problem is, they have a complex way of generating their URLs, and so I have to log in to my WhatsApp, share the link, and then download it. That defeats the purpose of needing less time. Hence, I am building a file sharing app in which you can decide your own URL and which is completely secure. ## How it works - Enter the domain followed by any text you like: `domain.com/{your-personalized-text}` - If the link doesn't exist, you can create a pin to later access the URL securely. - If the link exists, you are required to enter the pin. Hence, if you haven't claimed the site, you should enter a different URL. - Post your files → access the same URL from another device (which shouldn't be a problem since you set the URL yourself) → download the file on the other device. - If no device is connected to the URL, it automatically resets, becoming available for a new user.
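The claim/verify flow described above could be sketched like this. This is just an illustrative in-memory sketch of the idea; the store and function names are my own invention, not the project's actual code.

```javascript
// In-memory store of claimed paths: path -> { pin, files }
const links = new Map();

// Visiting domain.com/{path}: claim it with a pin if it's free,
// or require the existing pin if it is already taken.
function visit(path, pin) {
  if (!links.has(path)) {
    links.set(path, { pin, files: [] });
    return { status: 'claimed' };
  }
  const link = links.get(path);
  return link.pin === pin
    ? { status: 'ok', files: link.files }
    : { status: 'wrong-pin' };
}

// Upload a file to a claimed path (pin must match).
function upload(path, pin, file) {
  const res = visit(path, pin);
  if (res.status === 'wrong-pin') return false;
  links.get(path).files.push(file);
  return true;
}

// First visitor claims the URL with a pin...
console.log(visit('my-files', '1234').status); // claimed
// ...uploads a file, then downloads it from another device with the same pin.
upload('my-files', '1234', 'report.pdf');
console.log(visit('my-files', '1234').files); // ['report.pdf']
// A visitor with the wrong pin is turned away.
console.log(visit('my-files', '0000').status); // wrong-pin
```

A real implementation would also need the auto-reset timer from the last bullet and persistent, hashed pins instead of plain-text comparison.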
star173
1,901,646
Badlucky escaped!
Creating games to promote the js13kGames competition started in 2014 when Triskaidekaphobia was...
0
2024-06-26T17:16:36
https://medium.com/js13kgames/badlucky-escaped-0bb042146106
competition, floodescape, js13k, badlucky
Creating games to promote the [js13kGames](https://js13kgames.com) competition started in 2014 when [Triskaidekaphobia](https://enclavegames.com/games/triskaidekaphobia/) was released. Many years later, in 2021, we were celebrating the tenth edition anniversary with [Triska Reloaded](https://medium.com/js13kgames/triska-reloaded-celebrating-tenth-edition-of-js13k-bfe78a5aa033), and now it’s time to mark the **thirteenth** edition with the Badlucky level in the Flood Escape game. [Flood Escape](https://flood.enclavegames.com/) is our ([Enclave Games](https://enclavegames.com/)) game created in 2017 which is getting regular updates - since the [inception](https://end3r.com/blog/2017/05/sick-game-jam-the-making-of-flood-escape/) we’ve added five extra levels on top of the basic three. This means the new Badlucky level is the ninth one that landed in the game in the past seven years. ![Badlucky level in Flood Escape](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/894xm6k2zh3pax6gt1zd.png) You can unlock the new level by playing the game and collecting coins, or use the cheat code and jump directly into action: [**flood.enclavegames.com/?cheatcode=Badlucky**](https://flood.enclavegames.com/?cheatcode=Badlucky) This level is clearly inspired by Triska Reloaded, but adjusted to the style and gameplay of Flood Escape - check it out and let us know if you like it!
end3r
1,901,611
Beyond Genetics: Exploring Thick Hair Causes and Solutions That Work
Beyond Genetics: Exploring Thick Hair Causes and Solutions That Work...
0
2024-06-26T16:39:40
https://dev.to/amanda_drake/beyond-genetics-exploring-thick-hair-causes-and-solutions-that-work-577d
Beyond Genetics: Exploring Thick Hair Causes and Solutions That Work https://gro.fullyvital.com/dg-serum/#aff=nasr_eldin
amanda_drake
1,901,607
Troubleshooting CORS Issues in Express.js: A Simple Misconfiguration
Recently, while working on a Cross-Origin Resource Sharing (CORS) issue, I encountered a perplexing...
0
2024-06-26T16:36:11
https://dev.to/sepiyush/troubleshooting-cors-issues-in-expressjs-a-simple-misconfiguration-2e4k
cors, backend, node, express
Recently, while working on a Cross-Origin Resource Sharing (CORS) issue, I encountered a perplexing problem. Despite using the `cors` package with Express.js, my API was not sending the `Access-Control-Allow-Origin` header. This is a common issue developers face, so I decided to share my experience and solution. ### The Initial Configuration Here’s what my initial CORS configuration looked like in my Express.js application: ```js app.use(cors({ origin: "https://domain-a.com/" })); ``` ### The Problem Given the above configuration, I expected that requests from `https://domain-a.com` would be allowed. By default, the `cors` package allows all HTTP methods. However, while the API was correctly sending the `Access-Control-Allow-Methods` headers, the `Access-Control-Allow-Origin` header was conspicuously absent. This resulted in the infamous CORS error message in the browser. ### Diagnosis and Troubleshooting Initially, I was confident that my configuration was correct. The domain matched, and there were no apparent issues with the setup. However, the error persisted. After several attempts to debug the issue, including: - Verifying the request headers - Ensuring the server was running correctly - Double-checking the `cors` package documentation I still couldn’t pinpoint the problem. ### The Discovery After much trial and error, I finally discovered the issue. As trivial as it may seem, the problem was with the trailing slash in the `origin` configuration. The `origin` should have been `https://domain-a.com` instead of `https://domain-a.com/`. ```js app.use(cors({ origin: "https://domain-a.com" })); ``` ### The Solution By simply removing the trailing slash, everything started working as expected. The API correctly sent the `Access-Control-Allow-Origin` header, and the CORS error disappeared. ### Conclusion This experience taught me a valuable lesson: sometimes the simplest mistakes can cause the most frustrating issues. 
When dealing with CORS configuration in Express.js (or any framework), ensure that the `origin` parameter is precisely specified without unnecessary characters like trailing slashes. Here’s the corrected CORS configuration for reference: ```js app.use(cors({ origin: "https://domain-a.com" })); ``` This minor adjustment made all the difference, resolving the issue and allowing my application to function correctly across different domains. ### Key Takeaways - Always double-check your CORS configuration, especially the `origin` parameter. - Even small mistakes, such as a trailing slash, can lead to significant issues. - Don’t be afraid to revisit the basics when troubleshooting complex problems. I hope this helps others who might face similar CORS issues. Happy coding!
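One extra safeguard worth mentioning: the standard `URL` API can normalize whatever value you configure, since `URL.origin` contains only the scheme, host, and port and never a trailing slash or path. A small illustrative sketch (the `toOrigin` helper name is my own):

```javascript
// Normalize a configured value to a bare origin (scheme + host + port).
// URL.origin never includes a trailing slash or path.
function toOrigin(value) {
  return new URL(value).origin;
}

console.log(toOrigin('https://domain-a.com/'));     // https://domain-a.com
console.log(toOrigin('https://domain-a.com/app/')); // https://domain-a.com
```

Passing the normalized value to `cors({ origin: toOrigin(configuredValue) })` makes the trailing-slash mistake impossible to reintroduce.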
sepiyush
1,901,605
Recommending an expert
There are a lot of untrue recommendations and its hard to tell who is legit. If you have lost money...
0
2024-06-26T16:34:55
https://dev.to/martha_gillies_d870de848a/recommending-an-expert-18pf
There are a lot of untrue recommendations and it's hard to tell who is legit. If you have lost money to a scam, contact (zattrecoverypro1 ⓐ G mail Dot Com); they will surely help you out. It took me a long time to find them.
martha_gillies_d870de848a
1,901,604
Decoding Demystified : How LLMs Generate Text - III
Welcome back to our series on Generative AI and Large Language Models (LLMs). In the previous blogs,...
0
2024-06-26T16:34:14
https://dev.to/mahakfaheem/decoding-demystified-how-llms-generate-text-iii-3a0d
beginners, learning, ai, community
Welcome back to our series on Generative AI and Large Language Models (LLMs). In the previous blogs, we explored the foundational concepts and architectures behind LLMs, as well as the critical roles of prompting and training. Now, we will delve into the process of generating text with LLMs, commonly referred to as decoding. Understanding decoding is essential for harnessing the full potential of these models in generating coherent and contextually relevant text. **TL;DR for Decoding in LLMs** **_One word at a time._** ## What is Decoding Decoding is the process by which LLMs transform encoded representations of input data into human-readable text. It involves selecting words from the model's vocabulary to construct sentences that are both contextually appropriate and syntactically correct. Decoding is a crucial component of tasks such as text generation, machine translation, and summarization. Decoding happens iteratively, i.e., one word at a time. At each step of decoding, the distribution over the vocabulary is used to select one word and emit it. This selected word is then appended to the input and the decoding process continues... ## Understanding Decoding Strategies Different decoding strategies can be employed to generate text with LLMs, each with its unique advantages and trade-offs. Here are some of the most commonly used techniques: **1. Greedy Decoding** Greedy decoding is the simplest strategy, where the model selects the word with the highest probability at each step. **`Advantages:`** Fast and straightforward to implement. **`Disadvantages:`** Can produce repetitive and suboptimal results, as it doesn't consider future possibilities. **2. Beam Search** Beam search expands on greedy decoding by exploring multiple possible sequences at each step, keeping only the most promising ones and continuously pruning the sequences of low probability. **`Advantages:`** Generates more coherent and higher-quality text compared to greedy decoding. 
**`Disadvantages:`** Computationally more expensive and can still miss the optimal sequence due to limited beam width. **3. Sampling-Based Methods** Sampling methods introduce randomness into the decoding process, selecting words based on their probabilities rather than always choosing the highest-probability word. **`Advantages:`** Can produce more diverse and creative text. **`Disadvantages:`** Risk of generating incoherent or less relevant text. **Variants of Sampling** **`Top-k Sampling:`** Limits the sampling pool to the top k most probable words. **`Top-p (Nucleus) Sampling:`** Limits the sampling pool to the smallest set of words whose cumulative probability exceeds a threshold p. **4. Temperature Scaling** Temperature scaling adjusts the probability distribution of the model's output, making it either more deterministic (lower temperature) or more random (higher temperature). Changing the temperature, however, does not affect the relative ordering of the words. **`Advantages:`** Provides control over the diversity and creativity of the generated text. **`Disadvantages:`** Requires careful tuning to balance coherence and variability. ## Practical Applications of Decoding Decoding techniques are applied across various NLP tasks, enhancing the capabilities of LLMs in generating high-quality text. Here are a few practical applications: **1. Text Generation** LLMs can generate creative and informative content for applications such as story writing, content creation, and chatbot responses. The choice of decoding strategy significantly impacts the quality and creativity of the generated text. Using a low temperature setting is ideal for generating factual text, while a high temperature setting is better suited for producing more creative and diverse outputs. **2. Machine Translation** In machine translation, decoding is used to convert text from one language to another. Beam search is commonly employed to ensure the translated text is coherent and accurate. **3.
Summarization** For summarization tasks, decoding helps in generating concise and relevant summaries of longer texts. Techniques like beam search and sampling can be combined to balance accuracy and readability. ## Challenges in Decoding While decoding is a powerful tool, it comes with its own set of challenges: **`Balancing Coherence and Diversity:`** Ensuring the generated text is both coherent and diverse can be difficult, especially in creative applications. **`Computational Complexity:`** Advanced decoding strategies like beam search can be computationally expensive, requiring significant resources. **`Mitigating Repetitiveness:`** Avoiding repetitive phrases and sentences is crucial for maintaining the quality of the generated text. ## Hallucination in LLMs One of the significant challenges in using LLMs is hallucination, where the model generates text that is plausible but incorrect or nonsensical. This occurs because LLMs predict the next word based on learned patterns rather than factual accuracy. **`Causes:`** Hallucinations can arise from the model's training data, which might contain biases or inaccuracies. The probabilistic nature of decoding strategies like sampling can also contribute to this issue. **`Mitigation:`** To reduce hallucinations, careful prompt engineering and the use of strategies like temperature scaling can be helpful. Additionally, incorporating external knowledge sources or post-processing steps to verify the generated content can improve factual accuracy. ## Groundedness and Accountability Ensuring that LLM-generated text is grounded in factual information and maintaining accountability is crucial for many applications, especially those involving critical decision-making. **`Groundedness:`** This refers to the model's ability to generate text based on verified and reliable information. 
Techniques to enhance groundedness include using external databases, incorporating factual knowledge during training, and employing retrieval-augmented generation (RAG) methods. (Will be covering RAG in detail in the coming blogs). **`Accountability:`** This involves tracing the source of the information and ensuring that the model's outputs can be audited. Transparent reporting of the model's training data, architecture, and any modifications made during fine-tuning helps in maintaining accountability. ## Conclusion Decoding is a fundamental process in generating text with LLMs, playing a critical role in various NLP applications. By understanding and leveraging different decoding strategies—such as greedy decoding, beam search, and sampling-based methods—we can optimize the performance and utility of language models. Addressing challenges like hallucination and ensuring groundedness and accountability further enhances the reliability of LLMs. As we continue our journey through the world of Generative AI and LLMs, we'll further explore advanced techniques and applications, enhancing our understanding to develop, deploy, and contribute to cutting-edge AI technologies. Stay tuned for the next installment in this series, where we'll dive into RAG methods, and explore security aspects in LLMs. Thanks for reading and I look forward to continuing this exciting journey with you!
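As an appendix to the strategies described above, here is an illustrative sketch (my own toy example, not from any LLM library) of temperature-scaled softmax and a greedy pick over an invented next-token distribution:

```typescript
// Softmax with temperature: divide logits by T before normalizing.
// Lower T sharpens the distribution (more deterministic); higher T flattens it.
function softmax(logits: number[], temperature: number): number[] {
  const scaled = logits.map((l) => l / temperature);
  const max = Math.max(...scaled); // subtract the max for numerical stability
  const exps = scaled.map((l) => Math.exp(l - max));
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map((e) => e / sum);
}

// Greedy decoding: always emit the highest-probability token.
function greedyPick(probs: number[]): number {
  return probs.indexOf(Math.max(...probs));
}

const logits = [2.0, 1.0, 0.1];     // toy scores for three candidate tokens
const sharp = softmax(logits, 0.5); // low temperature: near-deterministic
const flat = softmax(logits, 2.0);  // high temperature: closer to uniform
```

Note that, exactly as stated earlier, temperature changes the shape of the distribution but never the ordering, so `greedyPick` returns the same index at any temperature.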
mahakfaheem
1,901,603
Orangeville Tree Service
Orangeville Tree Service employs certified arborists to maintain the health and beauty of your trees....
0
2024-06-26T16:32:45
https://dev.to/orangeville_treeservice_/orangeville-tree-service-4n0o
treeservices, treecare, treeremoval, treetrimming
Orangeville Tree Service employs certified arborists to maintain the health and beauty of your trees. [Tree Services Orangeville ON](https://www.orangevilletreeservice.ca/) offer services like trimming, removal, and stump grinding, ensuring the job is done efficiently and safely, prioritizing customer satisfaction in Orangeville, Ontario. Contact us at +1 (226) 455-8286 or info@orangevilletreeservice.ca Services Tree Removal Tree Trimming Stump Removal Monday - Sunday Open 24 Hours
orangeville_treeservice_
1,901,602
🌟 Project 2: Nike Product Card - Part of My 50 Projects Challenge 🏆
Hello dev.to community! 👋 I'm excited to share the second project of my 50 projects challenge. This...
0
2024-06-26T16:31:06
https://dev.to/bytesage/project-2-nike-product-card-part-of-my-50-projects-challenge-2mje
webdev, javascript, beginners, programming
![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/27l1jac5zw8md544wlun.jpg) ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/aioscrbnhlehknxq88hj.jpg) Hello dev.to community! 👋 I'm excited to share the second project of my 50 projects challenge. This project is a stylish Nike product card built using HTML, CSS, and JavaScript. Let's dive into the details of this project! 🎉 🚀 Project Overview The Nike Product Card is a modern and interactive card design that showcases a Nike shoe. It includes various elements such as the Nike logo, shoe images, product details, and interactive features. 🏃‍♂️👟 🛠️ Technologies Used HTML: For structuring the content CSS: For styling the card and making it visually appealing
bytesage
1,901,601
Concurrency and Parallelism in PHP
Concurrency and parallelism are essential concepts in modern programming, allowing applications to...
0
2024-06-26T16:27:25
https://dev.to/francescoagati/concurrency-and-parallelism-in-php-6fc
php, concurrency, parallelism, thread
Concurrency and parallelism are essential concepts in modern programming, allowing applications to perform multiple tasks simultaneously, either through interleaved execution (concurrency) or simultaneous execution (parallelism). PHP, primarily known for its synchronous execution model, has evolved to support these paradigms through various techniques. ### PHP's Standard Execution Model PHP traditionally follows a synchronous execution model, especially when used with Apache in a typical web server setup. In this model, each HTTP request is handled by a single PHP process. The steps involved in processing a request include: 1. **Apache receives an HTTP request** and forwards it to PHP. 2. **PHP executes the script** from the beginning to the end in a single thread. 3. **PHP returns the output** to Apache, which then sends the response back to the client. This model ensures simplicity and ease of understanding but can become inefficient for tasks requiring parallel execution or handling multiple tasks simultaneously. ### Evolution of Concurrency and Parallelism in PHP As web applications became more complex, the need for concurrent and parallel execution in PHP grew. Let's explore the techniques PHP offers to achieve these paradigms. ### 1. Synchronous Code Synchronous code is the simplest form of execution where tasks are performed one after the other. ```php echo "Synchronous Code Example:\n"; function synchronousFunction() { for ($i = 0; $i < 3; $i++) { echo "Synchronous Loop Iteration: $i\n"; sleep(1); } } synchronousFunction(); ``` In this example, each iteration of the loop executes sequentially, with a one-second delay between iterations. This approach is straightforward but inefficient for I/O-bound or CPU-intensive tasks that could benefit from parallel execution. ### 2. Forking a Process Forking creates a new process (child) that runs concurrently with the original process (parent). This is useful for parallelizing tasks. 
```php echo "\nForking Process Example:\n"; function forkProcess() { $pid = pcntl_fork(); if ($pid == -1) { die('could not fork'); } else if ($pid) { echo "Parent Process: PID $pid\n"; pcntl_wait($status); // Protect against Zombie children } else { echo "Child Process: Hello from the child process!\n"; exit(0); } } forkProcess(); ``` In this code, `pcntl_fork()` creates a child process. The parent and child processes execute concurrently, allowing parallel task execution. The parent process waits for the child process to finish to avoid creating zombie processes. ### 3. Threading PHP's threading capabilities are available through extensions like `pthreads`. Threads are lighter than processes and share the same memory space, making them suitable for tasks requiring shared data. ```php if (!class_exists('Thread')) { die("Threads are not supported in this PHP build\n"); } echo "\nThreading Example:\n"; class MyThread extends Thread { public function run() { for ($i = 0; $i < 3; $i++) { echo "Thread Loop Iteration: $i\n"; sleep(1); } } } $thread = new MyThread(); $thread->start(); $thread->join(); ``` This example defines a `MyThread` class extending `Thread`. The `run` method is executed in a new thread, running concurrently with the main thread. This approach is useful for I/O-bound operations where threads can handle waiting for resources. ### 4. Generators Generators provide a way to implement simple co-routines, allowing functions to yield results iteratively without blocking the entire program. ```php echo "\nGenerators Example:\n"; function simpleGenerator() { yield 'First'; yield 'Second'; yield 'Third'; } $gen = simpleGenerator(); foreach ($gen as $value) { echo "Generator Yielded: $value\n"; } ``` Generators use the `yield` keyword to produce values one at a time, allowing the function to be paused and resumed, facilitating a form of cooperative multitasking. 
PHP has come a long way from its synchronous roots to support various forms of concurrency and parallelism. While synchronous code remains simple and effective for many use cases, techniques like forking processes, threading, and using generators open up new possibilities for handling complex, parallelizable tasks efficiently.
francescoagati
1,901,600
Behind `NoInfer` in TypeScript
Disclaimer: As of now, I have not found documentation that explicitly supports the type inference...
0
2024-06-26T16:23:33
https://dev.to/allieschen/behind-noinfer-in-typescript-319
typescript
**Disclaimer**: As of now, I have not found documentation that explicitly supports the type inference mechanism discussed in this article. This is based on my understanding and experience with TypeScript. Readers are encouraged to explore further and refer to the official TypeScript documentation for additional information. ## TL;DR The `NoInfer<T>` utility type in TypeScript blocks type inference, ensuring types are matched explicitly. It works by using a type parameter T in a way that prevents automatic type inference. ## `NoInfer` - the New Utility Type I recently came across an awesome [video](https://youtu.be/QSIXYMIJkQg?si=bfmy0VaRT5Y2HfrJ) and [article](https://www.totaltypescript.com/noinfer) by Matt introducing the `NoInfer` utility type and its mechanism. According to Matt's article and video, a self-implemented version of `NoInfer` is: ```typescript type NoInfer<T> = [T][T extends any ? 0 : never]; ``` I wondered why this works? ## Dive in with Examples I asked my friend Jiar, and he explained that it has to do with TypeScript's type inference mechanism. What `NoInfer` does is to block the type inference. He gave me [this example](https://www.typescriptlang.org/play/?ts=5.0.4&ssl=16&ssc=2&pln=1&pc=1#code/C4TwDgpgBAcg9gSQHYDMICcA8AVAfFAXigG1sBdUqCAD2AiQBMBnKAQyRCgH4oAGKAFxQkEAG4YyAbgBQ0lAFckAY2ABLOEih0mweMjRZsVWvWZQd6VUgDmuABSt01pkOzEyAGmGJUGAIJOLrA+Bji4AJRCAEZwcAA2EOxQAN7SUOlQ6BDA8uiajs4AdFZKcfIMEEx2SCH+geEyAL6y2rq16HZpGcQA5AnATD1ePdZwPZ5d6QD0AFQzUAACAwC0NJAqq+jocOhQM1OTUD1McAC2EADuABZZPdINLZVt+hidGSQ9kHBgCUNHVxA4mBxh5Dp8IN9fqDwkA): ```typescript type NoInfer<T> = [T][T extends any ? 0 : never]; function testNoInfer<T extends string>(args: T[], noInferArgs: NoInfer<T>): boolean { return args.includes(noInferArgs); } testNoInfer( ['lets', 'go'], /** @ts-expect-error */ 'somewhre' ); testNoInfer( ['people', 'help'], 'people', ) ``` ### Does the Order of the Parameters Matter? 
To understand what "blocking" means, let's change the order of the parameters in the `testNoInfer` function: ```typescript function testNoInfer<T extends string>(noInferArgs: NoInfer<T>, args: T[]): boolean { return args.includes(noInferArgs); } testNoInfer( /** @ts-expect-error */ 'somewhre', ['lets', 'go'], ); testNoInfer( 'people', ['people', 'help'], ) ``` You see, the order doesn't matter; it still works. ### A Flash of Inspiration What if we leave the `noInferArgs` argument alone? ```typescript function testNoInfer<T extends string>(noInferArgs: NoInfer<T>): boolean { return Boolean(noInferArgs); } testNoInfer( 'somewhre' ); testNoInfer( 'people' ) ``` No error! I knew it! The type `NoInfer` is just the type parameter `T` itself! So why does defining the other parameter change everything? ## Conclusion In my opinion, the "block" occurs because TypeScript infers `T` from the parameters typed as plain `T`, and only then resolves the `NoInfer<T>` parameter against that inferred `T`. Therefore, when another parameter is defined with plain `T`, TypeScript insists that the `NoInfer<T>` argument match what `args` received.
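To round this out, here is a small hypothetical example of where the blocking is useful in practice: a `firstOr` helper whose type parameter should be inferred only from the array, never widened by the fallback argument:

```typescript
type NoInfer<T> = [T][T extends any ? 0 : never];

// T is inferred solely from `values`; wrapping `fallback` in NoInfer stops it
// from contributing to (and widening) the inferred union.
function firstOr<T extends string>(values: T[], fallback: NoInfer<T>): T {
  return values.length > 0 ? values[0] : (fallback as T);
}

const color = firstOr(["red", "green"], "red"); // T is "red" | "green"
// firstOr(["red", "green"], "blue");           // error: "blue" is not assignable
```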
allieschen
1,901,570
Rails Migration Tips and Examples (not a full step by step)
Generates the model and migration, but also all of the RCAVs for index, show, create, update, and...
0
2024-06-26T16:21:43
https://dev.to/mayas1111/rails-migration-tips-and-examples-not-a-full-step-by-step-3g1d
migration, ruby, database, generate
Generates the model and migration, but also all of the RCAVs for index, show, create, update, and destroy. - The model name must be singular. - Separate column names and datatypes with colons (NO SPACES). - Separate name:datatype pairs with spaces (NO COMMAS). ``` rails generate draft:resource <MODEL_NAME_SINGULAR> <COLUMN_1_NAME>:<COLUMN_1_DATATYPE> <COLUMN_2_NAME>:<COLUMN_2_DATATYPE> # etc EX: rails generate draft:resource board name:string OR rails generate draft:resource post title:string body:text expires_on:date board_id:integer ``` Execute the generated migrations with: ``` rake db:migrate ``` The following will remove all the files that rails generate draft:resource post ... had previously added, so you can correct your error and re-generate from scratch: ``` rails destroy draft:resource post ``` If you made a mistake when generating your draft:resource AND you’ve already run rake db:migrate, then first you have to run the command: ``` rake db:rollback ``` Adding and removing columns: ``` Ex: rails g migration AddImageToPosts OR rails g migration RemoveExpiresOnFromPosts ``` - If your database gets into a weird state (usually caused by deleting old migration files), your ultimate last resort is rake db:drop. This will destroy your entire database and all the data within it. Then, you can fix your migrations and re-run all from scratch with rake db:migrate, then rake sample_data to recover the sample data. ``` rake db:drop ```
mayas1111
1,889,165
Java Fundamentals (Continued)
Object-oriented programming Object-oriented programming (OOP) is central to Java, which...
0
2024-06-15T02:24:21
https://dev.to/devsjavagirls/continuacao-fundamentos-592n
**Object-Oriented Programming** Object-oriented programming (OOP) is central to Java, which adopts this methodology to handle the growing complexity of programs by organizing them around data rather than just code, and by using key concepts such as encapsulation, polymorphism, and inheritance to create a clearer and more effective programming structure. **Encapsulation** Encapsulation is a mechanism in object-oriented programming that binds code and data together into a single unit protected from outside interference, creating objects that hide their internal details and expose only a controlled interface through classes, which define the structure and behavior of those objects. **Polymorphism** Polymorphism is a feature of object-oriented programming that allows a single interface to access different specific actions depending on the context, facilitating code reuse and reducing complexity by allowing a generic interface to be used for several related operations. **Inheritance** Inheritance is the process that allows one object to acquire the properties of another, enabling hierarchical classifications in which more specific objects inherit characteristics from more general ones, reducing the need to repeatedly define common attributes and allowing each object to focus only on its unique characteristics.
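The three concepts above can be sketched in a few lines of code; TypeScript is used here purely for brevity (the article's context is Java, and the equivalent Java code is nearly identical):

```typescript
// Encapsulation: `name` is private, reachable only through the speak() interface.
class Animal {
  constructor(private name: string) {}
  speak(): string { return `${this.name} makes a sound`; }
}

// Inheritance: Dog acquires Animal's structure and overrides its behavior.
class Dog extends Animal {
  speak(): string { return "Woof"; }
}

// Polymorphism: one interface (speak), different behavior per concrete type.
const animals: Animal[] = [new Animal("Generic"), new Dog("Rex")];
const sounds = animals.map((a) => a.speak());
```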
devsjavagirls
1,900,608
What is a Ledger and Why Are Names Harder?
Portugues Version What is Ledger Series What is a Ledger and why you need to learn about...
0
2024-06-26T16:20:54
https://dev.to/woovi/what-is-a-ledger-and-why-are-names-harder-25jn
webdev, javascript, programming
[Portugues Version](https://daniloab.substack.com/p/o-que-e-ledger-e-por-que-nomes-sao) ## What is Ledger Series 1. [What is a Ledger and why you need to learn about it?](https://dev.to/woovi/what-is-ledger-and-why-does-it-need-idempotence-18n9) 2. [What is Ledger and why does it need Idempotence?](https://dev.to/woovi/what-is-ledger-and-why-does-it-need-idempotence-18n9) 3. [What is a Ledger and Why Floating Points Are Not Recommended?](https://dev.to/woovi/what-is-a-ledger-and-why-floating-points-are-not-recommended-1f4l) 4. [What is a Ledger and Why Are Names Harder?](https://dev.to/woovi/what-is-a-ledger-and-why-are-names-harder-25jn) In software development, choosing appropriate names for variables, functions, and data structures is a crucial skill. Clear and consistent names are essential for code maintainability and for promoting a healthy development culture. Let's explore why names are important and how we can improve the readability and maintenance of our ledger example. ## The Importance of Names 1. **Clarity**: Well-chosen names make the code easier to understand, both for the author and for other developers who will work on the project in the future. 2. **Maintainability**: Readable and clear code facilitates maintenance and updates, reducing the time needed to fix bugs or add new features. 3. **Communication**: Good names act as implicit documentation, helping to communicate the code's intent more effectively. 4. **Consistency**: Consistent names help establish patterns within the codebase, making it more predictable and less prone to errors. ## Example Improvement: Changing transactionId to idempotencyId In our previous example, we used transactionId to ensure transaction idempotency. While this name is clear, idempotencyId can be a more descriptive and precise name, reflecting its exact purpose. 
## Updating the Code Let's update the code example to use idempotencyId instead of transactionId: Defining the ledger structure in MongoDB: ```ts { _id: ObjectId("60c72b2f9b1d8e4d2f507d3a"), date: ISODate("2023-06-13T12:00:00Z"), description: "Deposit", amount: 100000, // Value in cents balance: 100000, // Balance in cents idempotencyId: "abc123" // Idempotency ID } ``` Function to add a new entry to the ledger and calculate the balance using idempotency: ```ts const { MongoClient } = require('mongodb'); async function addTransaction(description, amount, idempotencyId) { const url = 'mongodb://localhost:27017'; const client = new MongoClient(url); try { await client.connect(); const database = client.db('finance'); const ledger = database.collection('ledger'); const existingTransaction = await ledger.findOne({ idempotencyId: idempotencyId }); if (existingTransaction) { console.log('Transaction already exists:', existingTransaction); return; } const lastEntry = await ledger.find().sort({ date: -1 }).limit(1).toArray(); const lastBalance = lastEntry.length > 0 ? lastEntry[0].balance : 0; const newBalance = lastBalance + amount; const newEntry = { date: new Date(), description: description, amount: amount, balance: newBalance, idempotencyId: idempotencyId }; await ledger.insertOne(newEntry); console.log('Transaction successfully added:', newEntry); } finally { await client.close(); } } addTransaction('Deposit', 50000, 'unique-idempotency-id-001'); // Value in cents ``` ## Conclusion Choosing appropriate names is an essential part of software development. Clear, descriptive, and consistent names facilitate code readability, maintenance, and communication. In the example above, we changed transactionId to idempotencyId, making the variable's purpose clearer. In our next post, we'll explore more about the importance of naming best practices and how they can positively impact code quality and development team efficiency. 
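As a closing aside, the idempotency guard inside `addTransaction` can be distilled into a storage-agnostic sketch (a plain `Map` standing in for the MongoDB collection; the entry shape mirrors the ledger documents above):

```typescript
interface Entry {
  description: string;
  amount: number;    // value in cents, as in the article
  balance: number;   // running balance in cents
  idempotencyId: string;
}

// In-memory stand-in for the ledger collection, keyed by idempotencyId.
const ledger = new Map<string, Entry>();

function addTransaction(description: string, amount: number, idempotencyId: string): Entry {
  const existing = ledger.get(idempotencyId);
  if (existing) {
    return existing; // idempotent replay: return the original entry, insert nothing
  }
  let lastBalance = 0;
  for (const entry of ledger.values()) {
    lastBalance = entry.balance; // Map preserves insertion order
  }
  const newEntry = { description, amount, balance: lastBalance + amount, idempotencyId };
  ledger.set(idempotencyId, newEntry);
  return newEntry;
}

const first = addTransaction("Deposit", 50000, "unique-idempotency-id-001");
const replay = addTransaction("Deposit", 50000, "unique-idempotency-id-001"); // no double-credit
```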
Stay tuned for more insights on building reliable financial systems and development best practices! --- Visit us [Woovi](https://woovi.com/)! --- Follow me on [Twitter](https://x.com/daniloab_) If you like and want to support my work, become my [Patreon](https://www.patreon.com/daniloab) Want to boost your career? Start now with my mentorship through the link https://mentor.daniloassis.dev See more at https://linktr.ee/daniloab Photo of <a href="https://unsplash.com/pt-br/@lucaonniboni?utm_content=creditCopyText&utm_medium=referral&utm_source=unsplash">Luca Onniboni</a> in <a href="https://unsplash.com/pt-br/fotografias/maquina-de-escrever-teal-e-preta-4v9Kk01mEbY?utm_content=creditCopyText&utm_medium=referral&utm_source=unsplash">Unsplash</a>
daniloab
1,900,377
Get to know Xperience by Kentico: Unlocking performance insights with MiniProfiler
In today's fast-paced digital world, performance is a critical factor in ensuring a seamless user...
0
2024-06-26T16:19:47
https://dev.to/michael419/get-to-know-xperience-by-kentico-unlocking-performance-insights-with-miniprofiler-2b2c
kentico, xperience, miniprofiler
In today's fast-paced digital world, performance is a critical factor in ensuring a seamless user experience, and as developers, we constantly seek tools that help us optimize and monitor the performance of our applications. One such powerful tool is MiniProfiler, and when integrated with Xperience by Kentico, it offers invaluable insights into your application's performance. In this article, we'll explore the benefits of this integration and how you can set it up to elevate your web development projects. ## 📟 What is MiniProfiler? MiniProfiler is a simple but effective tool for profiling and monitoring the performance of your web applications. It provides detailed insights into the performance of your code, database queries, and HTTP requests, helping you identify bottlenecks and optimize your application's performance. I've personally been using it for over a decade, and I find myself coming back to it again-and-again because: 1. It's just so easy to set up and configure 2. It helps you identify the most evil thing that can slow down your application...SQL queries. ## ✅ Kentico has an integration for MiniProfiler Kentico's dev team, working on the Xperience by Kentico product, were using MiniProfiler internally, so they decided to open-source the integration earlier this year so that developers, like us, can use it. Thanks Kentico dev team 🙌 ## 🔧 How do I integrate MiniProfiler into my Xperience by Kentico web application? 1. Install the `Kentico.Xperience.MiniProfiler` nuget package into your XbyK project. 2. Open your `Program.cs` file and add the following code: ```CSharp using Kentico.Xperience.MiniProfiler; ... builder.Services.AddKentico(features => { // Kentico builder services configuration ... }); ... var env = builder.Environment; if (env.IsDevelopment()) { // After the call to builder.Services.AddKentico( ... ); builder.Services.AddKenticoMiniProfiler(); } ... 
var app = builder.Build(); app.UseKentico(); // After call to UseKentico() if (env.IsDevelopment()) { app.UseMiniProfiler(); } ``` Under the hood, this code sets up and configures MiniProfiler's middleware, along with the script that is added to the HTML `body` tag and renders the profiler UI when you load a page. ## 🏃 You're all set Run your local website and you will see the profiler display its output in the top-left of the page. ![MiniProfiler UI](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/qu6mmiv9q295xv8xyoj3.png) What you are seeing here is, as [MiniProfiler themselves](https://miniprofiler.com/dotnet/HowTo/ProfileCode) put it: > MiniProfiler is generally setup as 1 profiler per “action” (e.g. an HTTP Request, or startup, or some job) of an application. Inside that profiler, there are steps. Inside steps, you can also have custom timings. So, a list of profilers in the top-left of the screen is displayed. Click one to reveal a list of the steps that were executed as part of that profiler. Each step displays the total duration taken to execute the step. If any SQL queries were executed as part of the step, then they are displayed as blue links. Click one and a modal popup displays the SQL query statements that were performed, which enables you to start thinking about how you might optimise the code that creates the query. Great stuff 🙌 ![MiniProfiler SQL ](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/1213xw5uvtg9gn76swqf.png) ## ⏱️ Adding custom timing This is where things get really interesting. A pretty common use case is that you might want to time a specific piece of code, and any resulting SQL queries executed as part of that code.
Say you have a widget and you want to check the duration of a call to a repository method from the widget's `ViewComponent`: 1.&nbsp;Add the MiniProfiler namespace in your using directives in the `ViewComponent` of your widget: ```CSharp using StackExchange.Profiling; ``` 2.&nbsp;Then wrap your code with MiniProfiler's `Step` extension method, giving the step a custom name: ```CSharp IEnumerable<Coffee> products = Enumerable.Empty<Coffee>(); using (MiniProfiler.Current.Step("Custom: CoffeeRepository.GetCoffees()")) { products = await repository.GetCoffees(); } ``` 3.&nbsp;Now build and run the site again, navigate to a page that has that widget placed on it, expand the relevant profiler: ![MiniProfiler custom step](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/dhrr8ldbuvjmgkm9ypg6.png) This is really powerful stuff as it enables you to measure the performance of your code, quickly and easily; then it's up to you to hone your code to increase performance. ## ⚠️ Ensuring you have caching disabled In my example screenshot above, I'm performing these web page loads straight after having built my Dancing Goat sample project, so there won't be any cached data at this point. This means I will see all of the SQL calls profiled, as nothing has been cached yet. Subsequent reloads of the same page will display a reduced list of steps as the resultant data returned from SQL queries are cached by the system, and so the MiniProfiler UI won't show them all.
Kentico's Dancing Goat Xperience by Kentico sample site follows their best practice for [data caching](https://docs.kentico.com/developers-and-admins/development/caching/data-caching) and [output caching](https://docs.kentico.com/developers-and-admins/development/caching/output-caching), where they explicitly recommend disabling caching for pages when viewed in preview mode, so if your project has followed this approach, then you can always load the page in the preview tab when authoring the page in your website channel, and you'll see the MiniProfiler UI. Alternatively, you might have architected your caching approach to utilise an app settings key to disable or enable caching at a global level. ## 🚀 Conclusion Integrating MiniProfiler with Xperience by Kentico is a powerful way to gain deep insights into the performance of your web applications. By following the steps outlined above, you can set up MiniProfiler and start optimizing your application's performance, ensuring a smooth and efficient user experience. ## Reference - [Xperience by Kentico MiniProfiler project on GitHub](https://github.com/Kentico/xperience-by-kentico-miniprofiler) - [MiniProfiler: How-To Profile Code](https://miniprofiler.com/dotnet/HowTo/ProfileCode)
michael419
1,901,510
Advanced Angular Forms: Dynamic Fields & Custom Validators
Introduction Forms are a fundamental part of web applications, allowing users to input and...
0
2024-06-26T16:19:05
https://dev.to/wirefuture/advanced-angular-forms-dynamic-fields-custom-validators-3cn0
webdev, angular, javascript, typescript
## Introduction Forms are a fundamental part of web applications, allowing users to input and submit data. Angular's reactive forms make form handling robust and flexible, but more complex, dynamic forms require advanced techniques. In this article, we'll delve into advanced Angular form handling, focusing on dynamic forms and custom validators. We'll use Angular 18.0.3 and leverage its new features and improvements. This guide will help developers at any [Angular Development](https://wirefuture.com/blog/angular-development) company improve their form handling capabilities. ## Setting Up the Angular Project First, let's set up a new Angular project. Ensure you have Angular CLI installed, then create a new project: ``` ng new advanced-angular-forms cd advanced-angular-forms ng add @angular/forms ``` This setup provides the basic structure and dependencies needed for our application. The Angular CLI facilitates a smooth development experience by scaffolding the project structure and managing dependencies. ## Creating Dynamic Forms [Dynamic forms](https://docs.angular.lat/guide/dynamic-form) are forms whose structure can change at runtime. This is useful when form fields need to be added or removed based on user interactions or data retrieved from a server. These forms provide flexibility and adaptability, crucial for applications with variable data input requirements. ### Step 1: Define the Form Model Start by defining a form model using FormGroup and FormControl in your component. 
Create a new component for our form: ``` ng generate component dynamic-form ``` In dynamic-form.component.ts, define the form model: ``` import { Component, OnInit } from '@angular/core'; import { FormBuilder, FormGroup, FormArray, Validators } from '@angular/forms'; @Component({ selector: 'app-dynamic-form', templateUrl: './dynamic-form.component.html', styleUrls: ['./dynamic-form.component.css'] }) export class DynamicFormComponent implements OnInit { dynamicForm: FormGroup; constructor(private fb: FormBuilder) { this.dynamicForm = this.fb.group({ name: ['', Validators.required], items: this.fb.array([]) }); } ngOnInit(): void { } get items(): FormArray { return this.dynamicForm.get('items') as FormArray; } addItem(): void { this.items.push(this.fb.control('', Validators.required)); } removeItem(index: number): void { this.items.removeAt(index); } onSubmit(): void { console.log(this.dynamicForm.value); } } ``` In this code, we use FormBuilder to simplify the creation of our form model. The form consists of a name control and an items form array. The items array will hold dynamically added form controls. ### Step 2: Build the Template In dynamic-form.component.html, create a form template that allows adding and removing form controls dynamically: ``` <form [formGroup]="dynamicForm" (ngSubmit)="onSubmit()"> <div> <label for="name">Name</label> <input id="name" formControlName="name"> </div> <div formArrayName="items"> <div *ngFor="let item of items.controls; let i = index"> <label for="item-{{ i }}">Item {{ i + 1 }}</label> <input [id]="'item-' + i" [formControlName]="i"> <button type="button" (click)="removeItem(i)">Remove</button> </div> </div> <button type="button" (click)="addItem()">Add Item</button> <button type="submit" [disabled]="!dynamicForm.valid">Submit</button> </form> ``` This template uses Angular's form directives to bind the form model to the view. The *ngFor directive iterates over the items array, creating form controls dynamically. 
The formArrayName directive binds the form array to the template, ensuring synchronization between the model and view. ## Implementing Custom Validators Custom validators are necessary for complex validation logic that goes beyond Angular's built-in validators. Validators ensure that the data entered by users meets specific criteria, enhancing data integrity and user experience. Let's implement a custom validator that ensures no duplicate items are added to the dynamic form. ### Step 1: Create the Validator Function In dynamic-form.component.ts, add a custom validator function: ``` import { AbstractControl, ValidationErrors, ValidatorFn } from '@angular/forms'; export function uniqueItemsValidator(): ValidatorFn { return (formArray: AbstractControl): ValidationErrors | null => { const items = formArray.value; const uniqueItems = new Set(items); return items.length !== uniqueItems.size ? { duplicateItems: true } : null; }; } ``` This function checks for duplicate items by comparing the length of the items array with the length of a Set created from the items array. If they differ, it means there are duplicates, and the validator returns an error object. ### Step 2: Apply the Validator to the Form Array Update the form group definition to include the custom validator: ``` this.dynamicForm = this.fb.group({ name: ['', Validators.required], items: this.fb.array([], uniqueItemsValidator()) }); ``` Here, we attach the uniqueItemsValidator to the items form array, ensuring that our custom validation logic is applied whenever the form array changes. ### Step 3: Display Validation Errors Update the template to display validation errors: ``` <div formArrayName="items"> <div *ngFor="let item of items.controls; let i = index"> <label for="item-{{ i }}">Item {{ i + 1 }}</label> <input [id]="'item-' + i" [formControlName]="i"> <button type="button" (click)="removeItem(i)">Remove</button> </div> <div *ngIf="items.hasError('duplicateItems')"> Duplicate items are not allowed. 
</div> </div> ``` This ensures that users receive immediate feedback when they attempt to add duplicate items, enhancing the form's usability and data integrity. ## Performance Considerations When dealing with dynamic forms, performance can be a concern, especially if forms are large or complex. Here are some tips to keep performance in check: **Debounce Input Changes:** Use debounceTime with valueChanges to reduce the frequency of form updates. This can prevent performance degradation in forms with many controls. ``` import { debounceTime } from 'rxjs'; this.dynamicForm.valueChanges.pipe( debounceTime(300) ).subscribe(value => { console.log(value); }); ``` **OnPush Change Detection:** Use ChangeDetectionStrategy.OnPush to reduce unnecessary change detection cycles. This strategy tells Angular to check the component's view only when its input properties change. ``` import { Component, ChangeDetectionStrategy } from '@angular/core'; @Component({ selector: 'app-dynamic-form', templateUrl: './dynamic-form.component.html', styleUrls: ['./dynamic-form.component.css'], changeDetection: ChangeDetectionStrategy.OnPush }) export class DynamicFormComponent { /*...*/ } ``` **Lazy Loading Form Controls:** Load form controls only when necessary, such as when a user interacts with a particular section of the form. This can significantly reduce initial load times and improve performance. ## Conclusion Advanced form scenarios in Angular call for dynamic forms and custom validators that safeguard data integrity. Using Angular's reactive forms, we can build flexible, robust forms that respond to changing requirements, and with performance taken into account, these techniques produce fast, user-friendly forms in any Angular app. This article demonstrated how to create dynamic forms and custom validators, a foundation on which more complex form-based features can be built. Exploring these concepts will help you fully utilize Angular's form management capabilities. 
As you continue to create applications, these advanced form handling methods will prove invaluable for producing user-friendly, responsive, and dynamic forms.
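As a closing aside (not part of the original tutorial), the Set-based comparison that `uniqueItemsValidator` relies on can be exercised outside Angular. The sketch below is plain TypeScript using a hypothetical `hasDuplicates` helper and a `validate` function that mirrors the validator's return shape:

```typescript
// Standalone sketch of the duplicate check inside uniqueItemsValidator:
// a Set collapses repeated values, so a size mismatch signals duplicates.
function hasDuplicates(items: string[]): boolean {
  return items.length !== new Set(items).size;
}

// Mirrors the validator's return contract: an error object or null.
function validate(items: string[]): { duplicateItems: true } | null {
  return hasDuplicates(items) ? { duplicateItems: true } : null;
}

console.log(validate(['alpha', 'beta']));  // null
console.log(validate(['alpha', 'alpha'])); // { duplicateItems: true }
```

Because the check compares raw values, entries differing only in case or whitespace ('Alpha' vs 'alpha') count as distinct; normalising values before the comparison is a possible refinement.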
tapeshm
1,901,509
How to Choose the Best Game Site for Your Needs
Choosing the best slot site for your needs requires careful consideration of several factors to...
0
2024-06-26T16:16:51
https://dev.to/nehasinghjk/how-to-choose-the-best-game-site-for-your-needs-29dd
games
Choosing the best slot site for your needs requires careful consideration of several factors to ensure a safe, enjoyable, and rewarding experience. With countless online slot sites available, it can be overwhelming to navigate through the options. Here are some essential tips to help you make an informed decision: <h2>Assess the Site’s Reputation and Security</h2> The first aspect to consider is the reputation and security of the slot site. Before getting started, review sites and ensure they are licensed and regulated by reputable authorities, such as the UK Gambling Commission, Malta Gaming Authority, or Gibraltar Regulatory Authority. These licenses ensure that the site operates legally and adheres to strict standards of fairness and security. <span style="font-weight: 400;">Additionally, read reviews and testimonials from other players or experts to gauge the site's reputation. Many players opt to review </span><span style="font-weight: 400;">[슬롯사이트 순위](https://kpkesihatan.com/)</span><span style="font-weight: 400;"> online before choosing a site in order to ensure a good pick. Trusted review sites and online forums can provide valuable insights into the site's reliability, customer service, and payout efficiency. Ensure that the site uses advanced encryption technology to protect your personal and financial information.</span> <h2>Evaluate the Game Selection</h2> A diverse game selection is crucial for an engaging slot experience. The best slot sites offer a wide variety of games from top developers like NetEnt, Microgaming, Playtech, and Evolution Gaming. Check if the site provides different types of slots, such as classic slots, video slots, and progressive jackpot slots. Moreover, consider the site's game library in terms of themes, features, and volatility. A good site should cater to various preferences, whether you enjoy simple three-reel slots or complex multi-line video slots with bonus rounds and free spins. 
The availability of demo versions or free play options can also help you test games before wagering real money. <h2>Check for Bonuses and Promotions</h2> Bonuses and promotions are significant factors that can enhance your gaming experience and extend your playtime. Look for sites that offer generous welcome bonuses, no-deposit bonuses, free spins, and ongoing promotions like reload bonuses, cashback offers, and loyalty programs. Always review the fine print associated with these bonuses. Pay attention to wagering requirements, maximum bet limits, game restrictions, and expiration dates. The best slot sites provide transparent and fair bonus terms that give you a genuine chance to benefit from the offers. <h2>Consider Payment Methods and Withdrawal Times</h2> <span style="font-weight: 400;">Convenient and secure banking options are vital for a smooth gambling experience. The best slot sites offer a variety of payment methods, including credit/debit cards, e-wallets like PayPal and Skrill, bank transfers, and even cryptocurrencies. A growing number of players are starting to use crypto at crypto casinos and </span><span style="font-weight: 400;">[anonymous casinos](https://anonymouscasinos.ltd/)</span><span style="font-weight: 400;"> as they provide an extra layer of privacy when wagering online</span> Evaluate the deposit and withdrawal processes, including the minimum and maximum limits, processing times, and any associated fees. Fast and hassle-free withdrawals are a hallmark of a reputable slot site. Look for sites that process withdrawal requests promptly and offer multiple withdrawal options to suit your preferences. <h2>Look for Mobile Compatibility</h2> With the increasing popularity of mobile gaming, it’s essential to choose a slot site that offers a seamless mobile experience. The best slot sites are fully optimized for mobile devices, providing a responsive design and easy navigation on smartphones and tablets. 
Check if the site offers a dedicated mobile app or a mobile-friendly website that allows you to play your favorite slots on the go. The mobile platform should offer the same level of functionality, game selection, and security as the desktop version. <h2>Evaluate Customer Support</h2> <span style="font-weight: 400;">Reliable </span><span style="font-weight: 400;">[customer support is crucial](https://www.iopex.com/blogs/gaming-customer-support-services-do-we-really-need-it/)</span><span style="font-weight: 400;"> for resolving any issues or answering queries that may arise during your gaming experience. The best slot sites offer multiple support channels, including live chat, email, and phone support, available 24/7</span> Test the responsiveness and helpfulness of the customer support team by asking a few questions before signing up. Efficient and friendly customer service can significantly enhance your overall experience and provide peace of mind knowing that assistance is readily available when needed. <h3>Explore Additional Features</h3> Some slot sites offer additional features that can enhance your experience. These may include: <ul> <li><strong>VIP Programs:</strong> Exclusive rewards, personalized bonuses, and dedicated account managers for loyal players.</li> <li><strong>Tournaments:</strong> Opportunities to compete against other players for prizes and bragging rights.</li> <li><span style="font-weight: 400;">Social Features: Community features like chat rooms and forums where you can </span><span style="font-weight: 400;">[interact with other players](https://digiday.com/sponsored/how-gaming-platforms-are-driving-social-connection/)</span><span style="font-weight: 400;">.</span></li> </ul> <h2>Assess Overall User Experience</h2> The overall user experience of the slot site is another critical factor to consider. The site should have an intuitive interface, easy navigation, and fast loading times. 
The design and layout should be visually appealing and user-friendly, ensuring that you can find your favorite games and features without any hassle. <h2>Conclusion</h2> Choosing the best slot site for your needs involves a combination of thorough research and personal preferences. By assessing the site’s reputation, game selection, bonuses, payment methods, mobile compatibility, customer support, additional features, and overall user experience, you can make an informed decision that aligns with your style and preferences.
nehasinghjk
1,901,502
Improving Efficiency in Defect Resolution - Your Input Needed
👀 Are you or your team spending more time debugging defects than building game-changing...
0
2024-06-26T16:03:53
https://dev.to/ericwood73/improving-efficiency-in-defect-resolution-your-input-needed-50mn
productivity, softwaredevelopment, defectresolution, debugging
## 👀 Are you or your team spending more time debugging defects than building game-changing features? I'm on a mission to **bridge the gaps** holding us back. I’m creating a program to boost efficiency and effectiveness in defect resolution. But I need YOUR insights to ensure it delivers the greatest value. ### 📝 [Join our quick survey and be a catalyst for change](https://lnkd.in/d5VDD3gd): **Reasons to participate:** * Help improve industry practices * Gain insights from the collective data * Be a part of a growing community focused on improvement * **First 20 respondents receive a 50% discount voucher** on my upcoming course! ### 🔗 Help us reach more professionals! If you know someone who would have valuable insights, please share this post with your network. The more perspectives we gather, the better we can tailor our course to meet the needs of the community. ### 🌟 Thank you for your time and valuable input! Your feedback is greatly appreciated and will directly contribute to better strategies and solutions for effective and efficient defect resolution and debugging. Feel free to drop any further comments or questions below.
ericwood73
1,875,668
Join us for the Wix Studio Challenge with Special Guest Judge Ania Kubów: $3,000 in Prizes!
We are so excited to announce our first partnered challenge with Wix. Running through July 07 July...
0
2024-06-26T16:16:40
https://dev.to/devteam/join-us-for-the-wix-studio-challenge-with-special-guest-judge-ania-kubow-3000-in-prizes-3ial
devchallenge, wixstudiochallenge, webdev, javascript
--- title: Join us for the Wix Studio Challenge with Special Guest Judge Ania Kubów: $3,000 in Prizes! published: true description: tags: devchallenge, wixstudiochallenge, webdev, javascript cover_image: https://dev-to-uploads.s3.amazonaws.com/uploads/articles/cz0a4xd2f6x923a6jeyv.png # Use a ratio of 100:42 for best results. # published_at: 2024-06-03 15:39 +0000 --- We are so excited to announce our first partnered challenge with Wix. Running through ~~July 07~~ **July 14**, the [Wix Studio Challenge](https://dev.to/challenges/wix) provides an opportunity to develop with one of the most popular and in-demand website building solutions the internet has to offer. There is one prompt for this challenge, and one way to win the entire $3,000 prize pool. 🙀 Flex your JavaScript skills while leveraging the Wix Studio low-code environment to create a slick user experience. Your submission will be a demonstration of real-world product development you’ll want to add to your portfolio! ## Our Prompt For this challenge, your mandate is to **build an innovative eCommerce experience with Wix Studio.** Wix offers a powerful visual builder as well as a JavaScript-based development platform that enables you to create dynamic and interactive web experiences. You have full control of your site, from frontend design to backend functionality. We just have one requirement for this challenge: **You should leverage [Wix’s set of APIs](https://www.wix.com/velo/reference)** and libraries to enhance your users’ shopping experience. The more features, the merrier. {% card %} ### Heads up Incorporating [Service Plugins](https://dev.wix.com/docs/velo/articles/api-overview/service-plugins-spis) in your app is a powerful way to enable further customization of your shopping experience. _**However**, these will be at the center of a future challenge_ 😉. So you are advised to build primarily around the **other** APIs at your disposal for this challenge. 
Of course, we hope to see you participate in this and all future Wix Challenges! {% endcard %} ## Judging and Prizes For the first time ever, we will have a special guest judge joining us for this challenge! With over [400,000 subscribers on YouTube](https://www.youtube.com/aniakubow), Ania Kubów is a prolific software developer, educator, and course creator. Ania has taught tens of thousands of people how to code, from the comforts of their home. {% embed https://dev.to/ania_kubow %} Not only will our winner receive our biggest cash prize to date, but they'll also walk away knowing they have Ania's seal of approval. Talk about bragging rights! Submissions will be judged on the following: - Use of underlying technology - Usability and User Experience - Accessibility - Creativity The winner of our prompt will receive: - $3,000 USD - Exclusive DEV Badge - A gift from the [DEV Shop](https://shop.forem.com) **All Participants** with a valid submission will receive a completion badge on their DEV profile. {% card %} ## How To Participate <mark>In order to participate, you will need to build a **Wix _Studio_ site** (not just any Wix editor site!) and publish a post using the submission template below. </mark> All submissions must include original code and should not contain custom elements or iFrames. You may use any App on the Wix App Market including those made by Wix (ie. Wix Apps), but you must start your project from a blank template. Please do not use any pre-built visual design templates. **Steps to Creating a Wix Studio Site For New Users<mark>(VERY IMPORTANT)</mark>** 1. Navigate to the [Wix Studio page](https://www.wix.com/studio) 2. Click "Start Creating" 3. Create an account 4. **Select "For a client, as a freelancer or an agency"** and click continue 5. Click "Go to Wix Studio" 6. Click "As a freelancer" 7. Select "Web Development" and click continue 8. Click "Start Creating" Wix offers a free tier for Wix Studio and you do not need a credit card to sign up. 
Please review our full [rules, guidelines, and FAQ page](https://dev.to/challenges/wix) before submitting so you understand our participation guidelines and official contests rules such as eligibility requirements. {% cta https://dev.to/new?prefill=---%0Atitle%3A%20%0Apublished%3A%20%0Atags%3A%20%20devchallenge%2C%20wixstudiochallenge%2C%20webdev%2C%20javascript%0A---%0A%0A*This%20is%20a%20submission%20for%20the%20%5BWix%20Studio%20Challenge%20%5D(https%3A%2F%2Fdev.to%2Fchallenges%2Fwix).*%0A%0A%0A%23%23%20What%20I%20Built%0A%3C!--%20Share%20an%20overview%20about%20your%20project.%20--%3E%0A%0A%23%23%20Demo%0A%3C!--%20Share%20a%20link%20to%20your%20Wix%20Studio%20app%20and%20include%20some%20screenshots%20here.%20--%3E%0A%0A%23%23%20Development%20Journey%0A%3C!--%20Tell%20us%20how%20you%20leveraged%20Wix%20Studio%E2%80%99s%20JavaScript%20development%20capabilities--%3E%0A%0A%3C!--%20Which%20APIs%20and%20Libraries%20did%20you%20utilize%3F%20--%3E%0A%0A%3C!--%20Team%20Submissions%3A%20Please%20pick%20one%20member%20to%20publish%20the%20submission%20and%20credit%20teammates%20by%20listing%20their%20DEV%20usernames%20directly%20in%20the%20body%20of%20the%20post.%20--%3E%0A%0A%0A%3C!--%20Don%27t%20forget%20to%20add%20a%20cover%20image%20(if%20you%20want).%20--%3E%0A%0A%0A%3C!--%20Thanks%20for%20participating!%20%E2%86%92 %} Wix Studio Challenge Submission Template {% endcta %} {% endcard %} ## Community and Resources We encourage everyone who’s curious about the challenge to join the [Devs on Wix Community Discord](http://discord.gg/devs-on-wix) and hop into their #code-challenges channel. This will be the place to ask for technical help and meet community members building with Wix. 
Since our prompt is about creating an eCommerce experience, we’d like to point everyone to their [eCommerce API documentation](https://dev.wix.com/docs/velo/api-reference/wix-ecom-backend/introduction) as well as a [tutorial on building a shopping wishlist](https://dev.wix.com/docs/develop-websites/articles/code-tutorials/wix-e-commerce-stores/adding-a-wishlist-to-a-wix-stores-site) on Wix. 😉 **Additional Resources** - [Wix Development Platform Overview](https://dev.wix.com/docs/develop-websites/articles/getting-started/about-velo-by-wix) - [API Documentation](https://www.wix.com/velo/reference) - [Wix Studio Forum](https://forum.wixstudio.com/) ## Important Dates - June 26: Wix Studio Challenge begins! - <mark>~~July 07~~ **July 14**: Submissions due at 11:59 PM PDT</mark> - ~~July 09~~ **July 16**: Winner Announced We hope you enjoy building with Wix’s flexible dev tools, and we can’t wait to see what you build! Questions about the challenge? Ask them below. Good luck and happy coding!
thepracticaldev
1,901,507
Colima in macOS
Hey there! If you're like me, trying to set up colima on your M1 Mac and getting stuck with these...
0
2024-06-26T16:16:23
https://dev.to/netoht/colima-in-macos-44g
Hey there! If you're like me, trying to set up colima on your M1 Mac and getting stuck with these pesky proxy issues, self-signed certificates and volumes - don't sweat it. I've got a fix for you. Once we’ve wrapped up the setup, we’ll give it a whirl using docker compose and two images: mongodb and redis. These two tend to give me a bit of a headache. 🙃 #### Install packages ```shell brew install \ colima \ docker \ docker-compose \ docker-buildx ``` #### Setup configs ```shell # Linking the Colima socket to the default socket path. Note that this may break other Docker servers. sudo ln -sf ~/.colima/default/docker.sock /var/run/docker.sock # Docker plugins setup mkdir -p ~/.docker/cli-plugins ln -sfn $(which docker-buildx) ~/.docker/cli-plugins/docker-buildx ln -sfn $(which docker-compose) ~/.docker/cli-plugins/docker-compose ``` #### Setup colima and certificates NOTE: vz is the macOS Virtualization framework and requires macOS 13 or later ```shell # For m1 mac colima start --cpu 4 --memory 4 --vm-type vz --vz-rosetta --edit --editor code # For intel mac colima start --cpu 4 --memory 4 --vm-type vz --edit --editor code ``` NOTE: If you need to configure self-signed certificates, add the following provision; if not, skip this step. Save & close the file. 
```yaml provision: - mode: user script: | #!/bin/bash sudo ln -sf /Users/<YOUR_USER>/certs/self_signed_cert.pem /etc/ssl/certs/self_signed_cert.pem ``` #### Check installs ```shell # plugins docker buildx version docker compose version # docker docker ps # docker certificates docker pull hello-world # docker compose volumes mkdir -p ~/colima_test && cd ~/colima_test echo 'try { db.check.insertOne({ _id: 1, test: true }); print(db.check.findOne({ _id: 1 })); } catch (err) { print(`ERROR: ${err}`); } ' > init-mongo.js echo "name: check services: mongodb: image: mongodb/mongodb-community-server:6.0-ubi8 restart: always volumes: - ./docker_volumes/mongodb_data_db:/data/db - ./init-mongo.js:/docker-entrypoint-initdb.d/init-mongo.js:ro ports: - 27017:27017 command: ['--replSet', 'rs0', '--bind_ip_all', '--port', '27017'] healthcheck: # Enabling replica set to execute transactions in mongodb test: echo \"try { rs.status() } catch (err) { rs.initiate({_id:'rs0',members:[{_id:0,host:'127.0.0.1:27017'}]}) }\" | mongosh --port 27017 --quiet interval: 5s timeout: 30s start_period: 0s retries: 10 redis: image: redis restart: always volumes: - ./docker_volumes/redis_data:/data ports: - 6379:6379 command: ['redis-server', '--save', '60', '1', '--loglevel', 'warning'] " > docker-compose.yml docker compose up -d docker ps docker exec -it check-mongodb-1 mongosh --eval 'print(db.check.find())' # You should see this output (item created in collection): # [ { _id: 1, test: true } ] ls docker_volumes/ # You should see this output (new directories created by volumes): # mongodb_data_db redis_data # Clean up the check installs docker compose down -v cd ~ rm -rf ~/colima_test ``` Volume note: Colima default behaviour: $HOME and /tmp/colima are mounted as writable. #### References: - https://github.com/abiosoft/colima - https://github.com/abiosoft/colima/blob/main/docs/FAQ.md
netoht
1,901,508
How to Install Devise
Use Devise for Users Install rails generate devise:install ...
0
2024-06-26T16:15:20
https://dev.to/mayas1111/how-to-install-devise-28pd
devise, users, programming, ruby
Use Devise for Users Install ``` rails generate devise:install ``` Change :delete to :get in the Devise initializer ``` # config/initializers/devise.rb, Line 269 config.sign_out_via = :get ``` Set a root route in routes ``` # config/routes.rb root "boards#index" ``` Use the generator supplied by the gem to create the User model and routes ``` rails generate devise user ``` Then migrate ``` rake db:migrate ```
mayas1111
1,901,505
Buy verified cash app account
https://dmhelpshop.com/product/buy-verified-cash-app-account/ Buy verified cash app account Cash...
0
2024-06-26T16:07:16
https://dev.to/thesexillwr933/buy-verified-cash-app-account-1a5a
webdev, javascript, beginners, programming
ERROR: type should be string, got "https://dmhelpshop.com/product/buy-verified-cash-app-account/\n![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/xukt4zxbtxhf9j1y6f7m.png)\n\nBuy verified cash app account\nCash app has emerged as a dominant force in the realm of mobile banking within the USA, offering unparalleled convenience for digital money transfers, deposits, and trading. As the foremost provider of fully verified cash app accounts, we take pride in our ability to deliver accounts with substantial limits. Bitcoin enablement, and an unmatched level of security.\n\nOur commitment to facilitating seamless transactions and enabling digital currency trades has garnered significant acclaim, as evidenced by the overwhelming response from our satisfied clientele. Those seeking buy verified cash app account with 100% legitimate documentation and unrestricted access need look no further. Get in touch with us promptly to acquire your verified cash app account and take advantage of all the benefits it has to offer.\n\nWhy dmhelpshop is the best place to buy USA cash app accounts?\nIt’s crucial to stay informed about any updates to the platform you’re using. If an update has been released, it’s important to explore alternative options. Contact the platform’s support team to inquire about the status of the cash app service.\n\nClearly communicate your requirements and inquire whether they can meet your needs and provide the buy verified cash app account promptly. 
If they assure you that they can fulfill your requirements within the specified timeframe, proceed with the verification process using the required documents.\n\nOur account verification process includes the submission of the following documents: [List of specific documents required for verification].\n\nGenuine and activated email verified\nRegistered phone number (USA)\nSelfie verified\nSSN (social security number) verified\nDriving license\nBTC enable or not enable (BTC enable best)\n100% replacement guaranteed\n100% customer satisfaction\nWhen it comes to staying on top of the latest platform updates, it’s crucial to act fast and ensure you’re positioned in the best possible place. If you’re considering a switch, reaching out to the right contacts and inquiring about the status of the buy verified cash app account service update is essential.\n\nClearly communicate your requirements and gauge their commitment to fulfilling them promptly. Once you’ve confirmed their capability, proceed with the verification process using genuine and activated email verification, a registered USA phone number, selfie verification, social security number (SSN) verification, and a valid driving license.\n\nAdditionally, assessing whether BTC enablement is available is advisable, buy verified cash app account, with a preference for this feature. It’s important to note that a 100% replacement guarantee and ensuring 100% customer satisfaction are essential benchmarks in this process.\n\nHow to use the Cash Card to make purchases?\nTo activate your Cash Card, open the Cash App on your compatible device, locate the Cash Card icon at the bottom of the screen, and tap on it. Then select “Activate Cash Card” and proceed to scan the QR code on your card. Alternatively, you can manually enter the CVV and expiration date. 
How To Buy Verified Cash App Accounts.\n\nAfter submitting your information, including your registered number, expiration date, and CVV code, you can start making payments by conveniently tapping your card on a contactless-enabled payment terminal. Consider obtaining a buy verified Cash App account for seamless transactions, especially for business purposes. Buy verified cash app account.\n\nWhy we suggest to unchanged the Cash App account username?\nTo activate your Cash Card, open the Cash App on your compatible device, locate the Cash Card icon at the bottom of the screen, and tap on it. Then select “Activate Cash Card” and proceed to scan the QR code on your card.\n\nAlternatively, you can manually enter the CVV and expiration date. After submitting your information, including your registered number, expiration date, and CVV code, you can start making payments by conveniently tapping your card on a contactless-enabled payment terminal. Consider obtaining a verified Cash App account for seamless transactions, especially for business purposes. Buy verified cash app account. Purchase Verified Cash App Accounts.\n\nSelecting a username in an app usually comes with the understanding that it cannot be easily changed within the app’s settings or options. This deliberate control is in place to uphold consistency and minimize potential user confusion, especially for those who have added you as a contact using your username. In addition, purchasing a Cash App account with verified genuine documents already linked to the account ensures a reliable and secure transaction experience.\n\n \n\nBuy verified cash app accounts quickly and easily for all your financial needs.\nAs the user base of our platform continues to grow, the significance of verified accounts cannot be overstated for both businesses and individuals seeking to leverage its full range of features. 
How To Buy Verified Cash App Accounts.\n\nFor entrepreneurs, freelancers, and investors alike, a verified cash app account opens the door to sending, receiving, and withdrawing substantial amounts of money, offering unparalleled convenience and flexibility. Whether you’re conducting business or managing personal finances, the benefits of a verified account are clear, providing a secure and efficient means to transact and manage funds at scale.\n\nWhen it comes to the rising trend of purchasing buy verified cash app account, it’s crucial to tread carefully and opt for reputable providers to steer clear of potential scams and fraudulent activities. How To Buy Verified Cash App Accounts.  With numerous providers offering this service at competitive prices, it is paramount to be diligent in selecting a trusted source.\n\nThis article serves as a comprehensive guide, equipping you with the essential knowledge to navigate the process of procuring buy verified cash app account, ensuring that you are well-informed before making any purchasing decisions. Understanding the fundamentals is key, and by following this guide, you’ll be empowered to make informed choices with confidence.\n\n \n\nIs it safe to buy Cash App Verified Accounts?\nCash App, being a prominent peer-to-peer mobile payment application, is widely utilized by numerous individuals for their transactions. However, concerns regarding its safety have arisen, particularly pertaining to the purchase of “verified” accounts through Cash App. This raises questions about the security of Cash App’s verification process.\n\nUnfortunately, the answer is negative, as buying such verified accounts entails risks and is deemed unsafe. Therefore, it is crucial for everyone to exercise caution and be aware of potential vulnerabilities when using Cash App. 
How To Buy Verified Cash App Accounts.\n\nCash App has emerged as a widely embraced platform for purchasing Instagram Followers using PayPal, catering to a diverse range of users. This convenient application permits individuals possessing a PayPal account to procure authenticated Instagram Followers.\n\nLeveraging the Cash App, users can either opt to procure followers for a predetermined quantity or exercise patience until their account accrues a substantial follower count, subsequently making a bulk purchase. Although the Cash App provides this service, it is crucial to discern between genuine and counterfeit items. If you find yourself in search of counterfeit products such as a Rolex, a Louis Vuitton item, or a Louis Vuitton bag, there are two viable approaches to consider.\n\n \n\nWhy you need to buy verified Cash App accounts personal or business?\nThe Cash App is a versatile digital wallet enabling seamless money transfers among its users. However, it presents a concern as it facilitates transfer to both verified and unverified individuals.\n\nTo address this, the Cash App offers the option to become a verified user, which unlocks a range of advantages. Verified users can enjoy perks such as express payment, immediate issue resolution, and a generous interest-free period of up to two weeks. With its user-friendly interface and enhanced capabilities, the Cash App caters to the needs of a wide audience, ensuring convenient and secure digital transactions for all.\n\nIf you’re a business person seeking additional funds to expand your business, we have a solution for you. Payroll management can often be a challenging task, regardless of whether you’re a small family-run business or a large corporation. How To Buy Verified Cash App Accounts.\n\nImproper payment practices can lead to potential issues with your employees, as they could report you to the government. 
However, worry not, as we offer a reliable and efficient way to ensure proper payroll management, avoiding any potential complications. Our services provide you with the funds you need without compromising your reputation or legal standing. With our assistance, you can focus on growing your business while maintaining a professional and compliant relationship with your employees. Purchase Verified Cash App Accounts.\n\nA Cash App has emerged as a leading peer-to-peer payment method, catering to a wide range of users. With its seamless functionality, individuals can effortlessly send and receive cash in a matter of seconds, bypassing the need for a traditional bank account or social security number. Buy verified cash app account.\n\nThis accessibility makes it particularly appealing to millennials, addressing a common challenge they face in accessing physical currency. As a result, ACash App has established itself as a preferred choice among diverse audiences, enabling swift and hassle-free transactions for everyone. Purchase Verified Cash App Accounts.\n\n \n\nHow to verify Cash App accounts\nTo ensure the verification of your Cash App account, it is essential to securely store all your required documents in your account. This process includes accurately supplying your date of birth and verifying the US or UK phone number linked to your Cash App account.\n\nAs part of the verification process, you will be asked to submit accurate personal details such as your date of birth, the last four digits of your SSN, and your email address. If additional information is requested by the Cash App community to validate your account, be prepared to provide it promptly. Upon successful verification, you will gain full access to managing your account balance, as well as sending and receiving funds seamlessly. 
Buy verified cash app account.\n\n \n\nHow cash used for international transaction?\nExperience the seamless convenience of this innovative platform that simplifies money transfers to the level of sending a text message. It effortlessly connects users within the familiar confines of their respective currency regions, primarily in the United States and the United Kingdom.\n\nNo matter if you’re a freelancer seeking to diversify your clientele or a small business eager to enhance market presence, this solution caters to your financial needs efficiently and securely. Embrace a world of unlimited possibilities while staying connected to your currency domain. Buy verified cash app account.\n\nUnderstanding the currency capabilities of your selected payment application is essential in today’s digital landscape, where versatile financial tools are increasingly sought after. In this era of rapid technological advancements, being well-informed about platforms such as Cash App is crucial.\n\nAs we progress into the digital age, the significance of keeping abreast of such services becomes more pronounced, emphasizing the necessity of staying updated with the evolving financial trends and options available. Buy verified cash app account.\n\nOffers and advantage to buy cash app accounts cheap?\nWith Cash App, the possibilities are endless, offering numerous advantages in online marketing, cryptocurrency trading, and mobile banking while ensuring high security. As a top creator of Cash App accounts, our team possesses unparalleled expertise in navigating the platform.\n\nWe deliver accounts with maximum security and unwavering loyalty at competitive prices unmatched by other agencies. Rest assured, you can trust our services without hesitation, as we prioritize your peace of mind and satisfaction above all else.\n\nEnhance your business operations effortlessly by utilizing the Cash App e-wallet for seamless payment processing, money transfers, and various other essential tasks. 
Amidst a myriad of transaction platforms in existence today, the Cash App e-wallet stands out as a premier choice, offering users a multitude of functions to streamline their financial activities effectively. Buy verified cash app account.\n\nTrustbizs.com stands by the Cash App’s superiority and recommends acquiring your Cash App accounts from this trusted source to optimize your business potential.\n\nHow Customizable are the Payment Options on Cash App for Businesses?\nDiscover the flexible payment options available to businesses on Cash App, enabling a range of customization features to streamline transactions. Business users have the ability to adjust transaction amounts, incorporate tipping options, and leverage robust reporting tools for enhanced financial management.\n\nExplore trustbizs.com to acquire verified Cash App accounts with LD backup at a competitive price, ensuring a secure and efficient payment solution for your business needs. Buy verified cash app account.\n\nDiscover Cash App, an innovative platform ideal for small business owners and entrepreneurs aiming to simplify their financial operations. With its intuitive interface, Cash App empowers businesses to seamlessly receive payments and effectively oversee their finances. Emphasizing customization, this app accommodates a variety of business requirements and preferences, making it a versatile tool for all.\n\nWhere To Buy Verified Cash App Accounts\nWhen considering purchasing a verified Cash App account, it is imperative to carefully scrutinize the seller’s pricing and payment methods. Look for pricing that aligns with the market value, ensuring transparency and legitimacy. Buy verified cash app account.\n\nEqually important is the need to opt for sellers who provide secure payment channels to safeguard your financial data. Trust your intuition; skepticism towards deals that appear overly advantageous or sellers who raise red flags is warranted. 
It is always wise to prioritize caution and explore alternative avenues if uncertainties arise.\n\nThe Importance Of Verified Cash App Accounts\nIn today’s digital age, the significance of verified Cash App accounts cannot be overstated, as they serve as a cornerstone for secure and trustworthy online transactions.\n\nBy acquiring verified Cash App accounts, users not only establish credibility but also instill the confidence required to participate in financial endeavors with peace of mind, thus solidifying its status as an indispensable asset for individuals navigating the digital marketplace.\n\nWhen considering purchasing a verified Cash App account, it is imperative to carefully scrutinize the seller’s pricing and payment methods. Look for pricing that aligns with the market value, ensuring transparency and legitimacy. Buy verified cash app account.\n\nEqually important is the need to opt for sellers who provide secure payment channels to safeguard your financial data. Trust your intuition; skepticism towards deals that appear overly advantageous or sellers who raise red flags is warranted. It is always wise to prioritize caution and explore alternative avenues if uncertainties arise.\n\nConclusion\nEnhance your online financial transactions with verified Cash App accounts, a secure and convenient option for all individuals. By purchasing these accounts, you can access exclusive features, benefit from higher transaction limits, and enjoy enhanced protection against fraudulent activities. Streamline your financial interactions and experience peace of mind knowing your transactions are secure and efficient with verified Cash App accounts.\n\nChoose a trusted provider when acquiring accounts to guarantee legitimacy and reliability. In an era where Cash App is increasingly favored for financial transactions, possessing a verified account offers users peace of mind and ease in managing their finances. 
Make informed decisions to safeguard your financial assets and streamline your personal transactions effectively.\n\nContact Us / 24 Hours Reply\nTelegram:dmhelpshop\nWhatsApp: +1 ‪(980) 277-2786\nSkype:dmhelpshop\nEmail:dmhelpshop@gmail.com"
thesexillwr933
1,901,503
Day 30 of my progress as a vue dev
About today Today was a decent day,I ended up finishing my last landing page and spent a decent...
0
2024-06-26T16:05:42
https://dev.to/zain725342/day-30-of-my-progress-as-a-vue-dev-5d45
webdev, vue, typescript, tailwindcss
**About today** Today was a decent day, I ended up finishing my last landing page and spent a decent amount of time working on my job project, so I think it was satisfactory. **What's next?** I will be ending this series here as it sort of got a little directionless and did not entirely serve the purpose I started it for. It ended up becoming a series on productivity rather than about progressing as a vue dev. Although it wasn't a complete waste, I did learn a lot of new things about development and doing things in a productive manner, and I believe it will help me in the future. I will be starting a new series from tomorrow with the intent of learning new technologies and techniques regarding web development in general, with the aim to learn something new every day and document it over here. **Improvements required** I need to stay consistent and avoid being lazy when it comes to trying new things. I just started my journey and I still have a very long way to go. I learned a lot during this attempt of mine and I believe I will be learning a lot more in the future. Wish me luck!
zain725342
1,897,705
[DAY 63-65] I built a markdown previewer using React
Hi everyone! Welcome back to another blog where I document the things I learned in web development. I...
27,380
2024-06-26T16:00:00
https://dev.to/thomascansino/day-63-65-i-built-a-markdown-previewer-18a4
learning, react, javascript, webdev
Hi everyone! Welcome back to another blog where I document the things I learned in web development. I do this because it helps me retain the information and concepts, acting as a form of active recall. On days 63-65, after completing the first project in the Front End Development Libraries course, I moved on to the second one called “Markdown Previewer”. It is a program that lets you type text into a textarea element and renders it as HTML on the webpage. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/r8t4o8vwswaivzfmvdh8.PNG) First, I wrote the plain HTML of the program. I used React because I want to practice my React skills.  I added 2 div containers: one for the textarea element where you input your text, and another for an element where your code is rendered as HTML. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/sq9lxk1us82p8qxwkip4.PNG) Next, I moved straight to adding the functionalities to fulfill the user stories. I first initialized a state for the input and created functions to handle change events in the textarea, as well as render the inputted text as HTML on the webpage. I used the marked library given by the course in the user stories, so there was no need to write my own parser.  ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/xqkbhqz1zfbd3gyfppcu.PNG) ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/bb6em4fvh2po2n8lr9la.PNG) After that, I set the dangerouslySetInnerHTML attribute on my div container to render the inputted text as HTML. This is paired with the function I made earlier where I used the marked library.  Lastly, I finalized the design using vanilla CSS to make it look good.  ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/luw83vslil2ndel5j3oo.PNG) Overall, the project was short and easier than the previous one named “Random Quote Machine”. 
I still feel good and satisfied with what I made since, at this point, every project, small or large, is good practice for me. So, I'll take this as another win for the week. Anyways, that’s all for now, more updates in my next blog! See you there!
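The article relies on the marked library to do the actual parsing. Purely as an illustration of what a markdown-to-HTML converter does (this is a hypothetical sketch, not the course code and not how marked really works), a tiny converter handling only headings and bold text might look like:

```javascript
// Minimal, hypothetical markdown-to-HTML sketch: handles only
// "#"-style headings and **bold** text. A real project should use
// a proper parser such as the marked library mentioned above.
function miniMarkdown(src) {
  // Inline bold: **text** -> <strong>text</strong>
  const withBold = (s) => s.replace(/\*\*(.+?)\*\*/g, '<strong>$1</strong>');
  return src
    .split('\n')
    .map((line) => {
      const heading = line.match(/^(#{1,6})\s+(.*)$/);
      if (heading) {
        const level = heading[1].length; // number of "#" characters
        return `<h${level}>${withBold(heading[2])}</h${level}>`;
      }
      // Wrap non-empty lines in paragraphs; drop blank lines
      return line.trim() === '' ? '' : `<p>${withBold(line)}</p>`;
    })
    .filter(Boolean)
    .join('\n');
}

console.log(miniMarkdown('# Title\nHello **world**'));
// <h1>Title</h1>
// <p>Hello <strong>world</strong></p>
```

In the real component, marked replaces this function, and the resulting HTML string is what gets passed to dangerouslySetInnerHTML.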
thomascansino
1,901,500
Setting Up a New Angular Project and Installing Dependencies
Created by Google, Angular is a popular, powerful framework for building rich, dynamic, and snappy...
0
2024-06-26T15:59:25
https://dev.to/orases1/setting-up-a-new-angular-project-and-installing-dependencies-1a4o
webdev, programming, angular
Created by Google, [Angular](https://angular.io/) is a popular, powerful framework for building rich, dynamic, and snappy web applications. Angular stands out in modern web development due to its extensive toolset, which includes powerful features such as [two-way data binding](https://angular.io/guide/two-way-binding) and a [modular architecture](https://www.dezeen.com/tag/modular/). Compared to other frameworks, Angular offers a more structured and maintainable approach, making it ideal for large-scale applications. Properly setting up an Angular project is an important step, as it lays the foundation for smooth development, scalability, and maintainability. A well-configured project provides an efficient workflow while minimizing errors and easily supporting future enhancements. ## Installing Angular CLI [Angular CLI](https://angular.io/cli) ([Command Line Interface](https://www.techtarget.com/searchwindowsserver/definition/command-line-interface-CLI)) is an essential tool that simplifies Angular development by providing commands for project setup, building, testing, and deployment. First, make sure that you have Node.js and npm installed before installing Angular CLI on your machine. Open your terminal and run `npm install -g @angular/cli`, which downloads and globally installs Angular CLI, making it accessible from all directories. To verify the installation, type `ng version` in your terminal, which will display the installed Angular CLI version to confirm that it’s set up correctly and ready for you to use in your development projects. ## Creating a New Angular Project To start a new project with Angular CLI, open your terminal and run `ng new project-name`, replacing `project-name` with whatever name you want. This creates a new project with a standard structure and default settings. Your primary directories include `src` for source files, `app` for application components, and `assets` for static files. 
Important files like `angular.json` configure project settings, while `package.json` manages your dependencies. For maintainability and scalability, use clear, descriptive naming conventions for all of your components, services, and modules. Organize your project by feature or functionality, keeping related files together to simplify navigation and future updates. ## Exploring Package.json and Dependencies The [`package.json`](http://nodesource.com/blog/the-basics-of-package-json/) file is important for managing an Angular project's dependencies, scripts, and metadata. It lists all required packages and their versions, guaranteeing consistent environments across different setups. Some common dependencies include `@angular/core` for core Angular functionality, `@angular/cli` for CLI tools, and `rxjs` for reactive programming. To install a dependency, run `npm install package-name` or `yarn add package-name`. To update a dependency, run `npm update package-name` or `yarn upgrade package-name`. To remove a dependency, use `npm uninstall package-name` or `yarn remove package-name`. These commands keep `package.json` and `package-lock.json` (or `yarn.lock`) up to date, which helps maintain project stability. ## Installing Essential Dependencies For a basic Angular project, some essential dependencies include `@angular/core` and `@angular/cli`. `@angular/core` is the foundational package containing core Angular functionalities, while `@angular/cli` provides CLI tools for developing and testing Angular applications. A project generated with `ng new` already includes both; if you need to add them manually, open your terminal and run `npm install @angular/core @angular/cli` or `yarn add @angular/core @angular/cli`. These dependencies are important here as they offer the necessary framework components and command-line tools to simplify development, enforce best practices, and guarantee that your Angular project operates efficiently and effectively from the ground up. 
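To make the role of `package.json` concrete, here is a trimmed, illustrative sketch of the kind of file `ng new` generates. The version numbers are placeholders and will differ depending on your Angular CLI version:

```json
{
  "name": "project-name",
  "version": "0.0.0",
  "scripts": {
    "start": "ng serve",
    "build": "ng build",
    "test": "ng test"
  },
  "dependencies": {
    "@angular/core": "^17.0.0",
    "rxjs": "~7.8.0"
  },
  "devDependencies": {
    "@angular/cli": "^17.0.0"
  }
}
```

Note that the runtime framework lives under `dependencies`, while the CLI tooling goes under `devDependencies`; `npm install` and the commands above update these sections for you.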
## Configuring Angular Project Settings Angular offers [various configuration options](https://stackoverflow.com/questions/50489384/what-is-the-right-way-to-add-an-environment-configuration-to-angular-json) to customize your project to whatever specific requirements you may have. Your main settings are managed in `angular.json` and `src/environments`. To change the port number, modify the `serve` options in `angular.json`. You can adjust the base URL by setting the `baseHref` option under the build target in `angular.json`. Then, go ahead and define environment variables in `src/environments/environment.ts` and `src/environments/environment.prod.ts` for development and production, respectively. These configurations help optimize your [development process](https://orases.com/approach/) and environment-specific behavior. ## Setting Up Development and Production Environments In Angular, development and production environments differ in configuration settings for debugging and optimization efforts. [Angular’s development mode](https://angular.io/api/core/isDevMode) enables detailed error messages and live reloading, while production mode focuses on performance and minification. Configure environment-specific settings in `src/environments/environment.ts` and `src/environments/environment.prod.ts`. Use `fileReplacements` in `angular.json` to switch environments during the build process. Some best practices to remember are maintaining separate configuration files, using environment variables, and automating environment switches for efficient and error-free deployments. ## Integrating Third-Party Libraries and Packages [Third-party libraries and packages](https://angular.io/guide/using-libraries) are essential for extending Angular’s capabilities, offering pre-built solutions for common functionalities. To install them, you can use `npm install library-name` or `yarn add library-name`. 
After the installation, import the library into your Angular modules or components. Select reliable libraries by checking their documentation and how often their developers update them. Integrating well-maintained libraries can significantly enhance your project’s efficiency, functionality, and maintainability, allowing you to focus on core features. ## Testing Project Setup To verify your Angular project setup, run `ng serve` in the terminal to start a local development server. Access the project in your browser at `http://localhost:4200` and check for the functionality you expect from your application. Ensure all components load correctly and there are no errors in the console. Common issues include missing dependencies or incorrect configurations, which can be resolved by reinstalling packages (`npm install` or `yarn`) and verifying settings in `angular.json` and `environment.ts`.
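The development/production split described above can be sketched with a pair of environment files like the following. This reflects the conventional layout that `ng new` scaffolds; the `apiUrl` property is an example of the kind of value you would define yourself, not something Angular provides:

```typescript
// src/environments/environment.ts — active during development (ng serve)
export const environment = {
  production: false,
  apiUrl: 'http://localhost:3000/api', // example property; define your own
};

// src/environments/environment.prod.ts — swapped in for production builds
// by the "fileReplacements" entry in angular.json
export const environment = {
  production: true,
  apiUrl: 'https://example.com/api', // example production value
};
```

Application code imports from `./environments/environment`, and the build-time file replacement silently substitutes the production values, so no conditional logic is needed in your components.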
orases1
1,901,497
Buy verified cash app account
https://dmhelpshop.com/product/buy-verified-cash-app-account/ Buy verified cash app account Cash...
0
2024-06-26T15:56:13
https://dev.to/rcukmiller65/buy-verified-cash-app-account-1kdm
webdev, javascript, beginners, programming
ERROR: type should be string, got "https://dmhelpshop.com/product/buy-verified-cash-app-account/\n![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/wvrmjjxadut9yvn03lfp.png)\n\n\n\nBuy verified cash app account\nCash app has emerged as a dominant force in the realm of mobile banking within the USA, offering unparalleled convenience for digital money transfers, deposits, and trading. As the foremost provider of fully verified cash app accounts, we take pride in our ability to deliver accounts with substantial limits. Bitcoin enablement, and an unmatched level of security.\n\nOur commitment to facilitating seamless transactions and enabling digital currency trades has garnered significant acclaim, as evidenced by the overwhelming response from our satisfied clientele. Those seeking buy verified cash app account with 100% legitimate documentation and unrestricted access need look no further. Get in touch with us promptly to acquire your verified cash app account and take advantage of all the benefits it has to offer.\n\nWhy dmhelpshop is the best place to buy USA cash app accounts?\nIt’s crucial to stay informed about any updates to the platform you’re using. If an update has been released, it’s important to explore alternative options. Contact the platform’s support team to inquire about the status of the cash app service.\n\nClearly communicate your requirements and inquire whether they can meet your needs and provide the buy verified cash app account promptly. 
Amidst a myriad of transaction platforms in existence today, the Cash App e-wallet stands out as a premier choice, offering users a multitude of functions to streamline their financial activities effectively. Buy verified cash app account.\n\nTrustbizs.com stands by the Cash App’s superiority and recommends acquiring your Cash App accounts from this trusted source to optimize your business potential.\n\nHow Customizable are the Payment Options on Cash App for Businesses?\nDiscover the flexible payment options available to businesses on Cash App, enabling a range of customization features to streamline transactions. Business users have the ability to adjust transaction amounts, incorporate tipping options, and leverage robust reporting tools for enhanced financial management.\n\nExplore trustbizs.com to acquire verified Cash App accounts with LD backup at a competitive price, ensuring a secure and efficient payment solution for your business needs. Buy verified cash app account.\n\nDiscover Cash App, an innovative platform ideal for small business owners and entrepreneurs aiming to simplify their financial operations. With its intuitive interface, Cash App empowers businesses to seamlessly receive payments and effectively oversee their finances. Emphasizing customization, this app accommodates a variety of business requirements and preferences, making it a versatile tool for all.\n\nWhere To Buy Verified Cash App Accounts\nWhen considering purchasing a verified Cash App account, it is imperative to carefully scrutinize the seller’s pricing and payment methods. Look for pricing that aligns with the market value, ensuring transparency and legitimacy. Buy verified cash app account.\n\nEqually important is the need to opt for sellers who provide secure payment channels to safeguard your financial data. Trust your intuition; skepticism towards deals that appear overly advantageous or sellers who raise red flags is warranted. 
It is always wise to prioritize caution and explore alternative avenues if uncertainties arise.\n\nThe Importance Of Verified Cash App Accounts\nIn today’s digital age, the significance of verified Cash App accounts cannot be overstated, as they serve as a cornerstone for secure and trustworthy online transactions.\n\nBy acquiring verified Cash App accounts, users not only establish credibility but also instill the confidence required to participate in financial endeavors with peace of mind, thus solidifying its status as an indispensable asset for individuals navigating the digital marketplace.\n\nWhen considering purchasing a verified Cash App account, it is imperative to carefully scrutinize the seller’s pricing and payment methods. Look for pricing that aligns with the market value, ensuring transparency and legitimacy. Buy verified cash app account.\n\nEqually important is the need to opt for sellers who provide secure payment channels to safeguard your financial data. Trust your intuition; skepticism towards deals that appear overly advantageous or sellers who raise red flags is warranted. It is always wise to prioritize caution and explore alternative avenues if uncertainties arise.\n\nConclusion\nEnhance your online financial transactions with verified Cash App accounts, a secure and convenient option for all individuals. By purchasing these accounts, you can access exclusive features, benefit from higher transaction limits, and enjoy enhanced protection against fraudulent activities. Streamline your financial interactions and experience peace of mind knowing your transactions are secure and efficient with verified Cash App accounts.\n\nChoose a trusted provider when acquiring accounts to guarantee legitimacy and reliability. In an era where Cash App is increasingly favored for financial transactions, possessing a verified account offers users peace of mind and ease in managing their finances. 
Make informed decisions to safeguard your financial assets and streamline your personal transactions effectively.\n\nContact Us / 24 Hours Reply\nTelegram:dmhelpshop\nWhatsApp: +1 ‪(980) 277-2786\nSkype:dmhelpshop\nEmail:dmhelpshop@gmail.com
rcukmiller65
1,901,496
TIL: The Basics of GitHub
"Every morning we are born again. What we do today is what matters most." — Buddha When I began...
0
2024-06-26T15:56:05
https://dev.to/ghewitt95/til-the-basics-of-github-307a
webdev, beginners, todayilearned, github
> "Every morning we are born again. What we do today is what matters most." — Buddha When I began writing blog posts earlier this year, I would start off with a quote that reflected my current outlook on life. I think it would be a good idea to continue that tradition here. Today I learned how to store a static webpage on GitHub. GitHub is used to store code and keep track of the revisions made to your files. I feel like this will be very useful for collaborating with others. In addition, I used Codespaces, which is a cloud code editor that is based on VSCode. I was relieved when I saw this because the only code editor I had experience with before this apprenticeship was VSCode, so I felt right at home with Codespaces. After creating a simple "Hello world!" webpage, I deployed the app by using Render. Render is a hosting provider that allows your page to be viewed by anyone. I had to plug this code into my Codespace in a new file called 'render.yaml': ``` services: - type: web name: hello-world env: ruby plan: free buildCommand: "./bin/render-build.sh" startCommand: "./bin/render-start.sh" ``` After doing this and creating a blueprint, I went back to GitHub Pages to deploy my site! I know it's not much, but I feel like this will be a milestone that I will look back on!
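For anyone following along, the Git workflow behind storing a page like this on GitHub can be sketched as below. This is a generic sketch — the repository name, file, and remote URL are placeholders, not taken from the post:

```shell
# Initialise a repository, commit a page, and push it to GitHub.
# All names below (my-site, the remote URL) are placeholders.
git init my-site
cd my-site
echo '<h1>Hello world!</h1>' > index.html
git add index.html
git commit -m "Add hello world page"
# Point the repo at your own GitHub remote before pushing.
git remote add origin https://github.com/<your-username>/my-site.git
git push -u origin main
```

From there, GitHub Pages (or a host like Render) can serve the committed files.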
ghewitt95
1,901,495
Discover Your Path to a Healthier Lifestyle with Happy Dieter
In today's hectic world, maintaining a healthy diet can feel impossible. Between work, family, and...
0
2024-06-26T15:55:36
https://dev.to/alphapik/discover-your-path-to-a-healthier-lifestyle-with-happy-dieter-4ool
<p> In today's hectic world, maintaining a healthy diet can feel impossible. Between work, family, and social commitments, who has the time? That's where <a href="https://www.happydieter.net/" target="_blank">Happy Dieter</a> steps in. </p> <h3>Real Questions, Real Solutions</h3> <p> Ever wondered how to balance a busy schedule and still eat healthily? Struggling with meal planning? Happy Dieter has you covered. </p> <h3>Why Happy Dieter?</h3> <p> Choosing Happy Dieter means choosing a comprehensive, expert-backed resource. Here’s why it stands out: </p> <ul> <li><strong>Comprehensive Articles</strong>: From starting a plant-based diet to intermittent fasting, <a href="https://www.happydieter.net/" target="_blank">Happy Dieter</a> covers it all. Each article is well-researched and written by experts.</li> <li><strong>Delicious Recipes</strong>: Healthy eating doesn’t have to be dull. Find a variety of delicious, easy-to-make recipes that cater to all dietary preferences, including vegan, keto, and gluten-free options.</li> <li><strong>Expert Advice</strong>: Access tips and insights from nutritionists and dietitians to help you make informed decisions about your diet.</li> <li><strong>Community Support</strong>: Join a community of like-minded individuals. Share experiences, ask questions, and find support from others on the same journey.</li> </ul> <h3>How Can Happy Dieter Benefit Developers?</h3> <p> As developers, long hours at the desk can lead to poor eating habits and a sedentary lifestyle. Happy Dieter offers practical solutions to help you eat healthily despite a busy schedule. </p> <p> Quick and nutritious meal prep ideas? Check. Tips on staying active while coding? Got you covered. </p> <h3>FAQs</h3> <p><strong>Q: What makes Happy Dieter different from other diet websites?</strong></p> <p> A: Happy Dieter offers expert advice, delicious recipes, and a supportive community. 
It’s not just about losing weight; it’s about a holistic approach to a healthy lifestyle. </p> <p><strong>Q: Can Happy Dieter help with specific dietary needs?</strong></p> <p> A: Absolutely. Whether you’re vegan, keto, gluten-free, or have other dietary preferences, Happy Dieter has resources tailored for you. </p> <p><strong>Q: Is the advice on Happy Dieter backed by experts?</strong></p> <p> A: Yes. The content is thoroughly researched and written by nutritionists and dietitians. </p> <h3>Join the Happy Dieter Community</h3> <p> Ready to start your journey to a healthier lifestyle? Visit <a href="https://www.happydieter.net/" target="_blank">Happy Dieter</a> today. Explore the resources available and follow them on social media for the latest updates and tips. </p> <p> By integrating healthier habits into your routine, you’ll feel better physically and improve your focus and productivity at work. Let’s code a healthier future together! </p>
alphapik
1,888,505
This Week In Python
Fri, June 14, 2024 This Week in Python is a concise reading list about what happened in the past...
0
2024-06-14T11:42:37
https://bas.codes/posts/this-week-python-078
thisweekinpython, python
**Fri, June 14, 2024** This Week in Python is a concise reading list about what happened in the past week in the Python universe. ## Python Articles - [Optimal SQLite settings for Django](https://gcollazo.com/optimal-sqlite-settings-for-django/) - [How much math can you do in 10 lines of Python](https://wordsandbuttons.online/how_much_math_can_you_do_in_10_lines_of_python.html) - [Python wheel filenames have no canonical form](https://blog.yossarian.net/2024/06/12/Python-wheel-filenames-have-no-canonical-form) - [CPython Garbage Collection: The Internal Mechanics and Algorithms](https://blog.codingconfessions.com/p/cpython-garbage-collection-internals) - [`bytes`: The Lesser-Known Python Built-In Sequence • And Understanding UTF-8 Encoding](https://www.thepythoncodingstack.com/p/bytes-python-built-in-unicode-utf-8-encoding) ## Projects - [teo](https://github.com/teodevgroup/teo) – Schema-driven web server framework (written in Rust, supports Python) - [jupyterlab-desktop](https://github.com/jupyterlab/jupyterlab-desktop) – JupyterLab desktop application, based on Electron - [mesop](https://github.com/google/mesop) – Build delightful web apps quickly in Python - [CardStock](https://github.com/benjie-git/CardStock) – cross-platform tool for quickly and easily building programs. - [cudf](https://github.com/rapidsai/cudf) – GPU DataFrame Library
bascodes
1,901,494
FastAPI Beyond CRUD Part 14 - Model And Schema Relationships (One To Many SQLModel)
In this video, we leverage SQLmodel’s capabilities to demonstrate how to effectively manage one to...
0
2024-06-26T15:54:49
https://dev.to/jod35/fastapi-beyond-crud-part-14-model-and-schema-relationships-one-to-many-sqlmodel-335c
fastapi, api, python, webdev
In this video, we leverage SQLModel's capabilities to demonstrate how to effectively manage one-to-many relationships between tables. Furthermore, we explore serializing data model objects, enabling their transformation into API-friendly formats such as JSON. {%youtube H-7NNblkzrg%}
jod35
1,901,493
Introduction to Azure with .NET Examples
Microsoft Azure is Microsoft's cloud platform offering powerful tools for developers and businesses...
0
2024-06-26T15:47:09
https://dev.to/adrianbailador/introduction-to-azure-with-net-examples-3ccc
azure, dotnet, cloudcomputing, csharp
Microsoft Azure is Microsoft's cloud platform offering powerful tools for developers and businesses to efficiently and securely build, deploy, and manage applications. In this article, we'll explore some key Azure services and provide practical .NET examples to illustrate their use. ## Discovering Azure ### What is Azure? Azure is Microsoft's cloud service platform that provides solutions for computing, storage, networking, databases, artificial intelligence, and more. Designed with robust global infrastructure, it enables organizations to scale their applications globally with high availability and security. ### Benefits of Using Azure 1. **Scalability**: Allows adjusting resources and applications as per changing needs. 2. **High Availability**: Offers built-in redundancy globally. 3. **Advanced Security**: Complies with strict security standards and regulatory requirements. 4. **Integration with Microsoft Products**: Seamlessly integrates with other Microsoft tools and services like Office 365 and Dynamics 365. ## Exploring Serverless with Azure Functions ### What are Azure Functions? Azure Functions is a serverless computing service that allows running code snippets in response to events without worrying about underlying infrastructure. Ideal for tasks like data processing, system integrations, and workflow automation. ### Practical .NET Example with Azure Functions Let's create a simple function that responds to HTTP requests with a personalized greeting. #### Create an Azure Functions Project in Visual Studio 1. Open Visual Studio and select "Create a new project". Search for "Azure Functions" and choose "Azure Functions v2 (.NET Core)". Click "Next". 2. Configure the HTTP trigger by selecting "HTTP trigger" and naming the function `HttpTriggerFunction`. Set the authorization level to "Function". 
#### Implementing the Function ```csharp using Microsoft.AspNetCore.Mvc; using Microsoft.Azure.WebJobs; using Microsoft.Azure.WebJobs.Extensions.Http; using Microsoft.AspNetCore.Http; using Microsoft.Extensions.Logging; using System.IO; using System.Threading.Tasks; public static class HttpTriggerFunction { [FunctionName("HttpTriggerFunction")] public static async Task<IActionResult> Run( [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequest req, ILogger log) { log.LogInformation("Request received."); string name = req.Query["name"]; if (string.IsNullOrEmpty(name)) { return new BadRequestObjectResult("Please provide a name in the query string."); } return new OkObjectResult($"Welcome, {name}!"); } } ``` **Code Explanation**: - `HttpTrigger`: Defines that the function responds to HTTP requests. - `ILogger`: Logs information about the function execution. - `HttpRequest`: Represents the incoming HTTP request. - `IActionResult`: Returns the function's result as an HTTP response. #### Publishing to Azure Right-click on the project and select "Publish". Follow the instructions to configure your Azure account and deploy the function to your resource group. #### Testing the Function on Azure Use tools like Postman or Curl to send an HTTP request: ```bash curl -X GET "https://<your-azure-function>.azurewebsites.net/api/HttpTriggerFunction?name=Codu" ``` You should receive the response: ``` Welcome, Codu! ``` ## Data Storage with Azure Blob Storage ### What is Azure Blob Storage? Azure Blob Storage provides scalable and secure cloud storage for unstructured data such as files and media. Essential for storing and accessing large volumes of information globally. ### .NET Example Usage with Azure Blob Storage Let's upload a file to Azure Blob Storage using .NET. #### Environment Setup 1. Install the `Azure.Storage.Blobs` NuGet package: ```powershell Install-Package Azure.Storage.Blobs ``` 2. 
Configure your Azure storage account and obtain the connection string from the portal. #### Implementation for Uploading a File ```csharp using Azure.Storage.Blobs; using System; using System.IO; using System.Threading.Tasks; public class BlobStorageExample { private const string connectionString = "YourAzureStorageConnectionString"; private const string containerName = "example-container"; private const string blobName = "example-blob.txt"; private const string localFilePath = "path/to/your/file.txt"; public static async Task UploadBlobAsync() { BlobServiceClient blobServiceClient = new BlobServiceClient(connectionString); BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient(containerName); await containerClient.CreateIfNotExistsAsync(); BlobClient blobClient = containerClient.GetBlobClient(blobName); using FileStream uploadFileStream = File.OpenRead(localFilePath); await blobClient.UploadAsync(uploadFileStream, true); uploadFileStream.Close(); Console.WriteLine($"File uploaded to {blobClient.Uri}"); } public static void Main(string[] args) { UploadBlobAsync().GetAwaiter().GetResult(); } } ``` **Code Explanation**: - `BlobServiceClient`: Interacts with the Azure Blob service. - `BlobContainerClient`: Provides access to a specific Blob Storage container. - `BlobClient`: Facilitates manipulation of individual blobs within a container. - `UploadAsync`: Method for uploading a file to the blob. #### Verifying the File in Azure Blob Storage To verify that the file has been successfully uploaded, access the Azure portal, navigate to your storage account, and look for the `example-container` container. You should find `example-blob.txt` inside. Additionally, you can list blobs in the container with additional code to confirm the operation. 
```csharp // Note: listing blobs also requires "using Azure.Storage.Blobs.Models;" for the BlobItem type. public static async Task ListBlobsAsync() { BlobServiceClient blobServiceClient = new BlobServiceClient(connectionString); BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient(containerName); await foreach (BlobItem blobItem in containerClient.GetBlobsAsync()) { Console.WriteLine($"Blob name: {blobItem.Name}"); } } ``` ## Conclusion Azure provides powerful tools for developers and businesses looking to innovate in the cloud. From Azure Functions for serverless code execution to Azure Blob Storage for scalable data storage and management, these solutions enable building robust and flexible applications. For more details and resources, visit the [Azure official documentation](https://docs.microsoft.com/azure).
adrianbailador
1,901,441
alternate way of doing word split/phrase segmentation in python
Doing word split with recursion felt a bit complex to me , so I tried to do it in an easier...
0
2024-06-26T15:43:46
https://dev.to/alexey_27/alternate-way-of-doing-word-splitphrase-segmentation-in-python-pj1
algorithms, recursion, python, learning
## **Doing** word split with recursion felt a bit complex to me, so I tried to do it in an easier way. --- --- ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/5z5h5e8bxf0ng47kufhm.png) --- --- - The Hard Way (recursion) -- ```python def word_split(phrase, list_of_words, output = None): if output is None: # Base case / initial call output = [] for word in list_of_words: if phrase.startswith(word): output.append(word) return word_split(phrase[len(word):], list_of_words, output) # Recursive call return output # Result ``` **_gives_** ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/6238nl02e56j0os8v7cr.PNG) --- - The Easy Way (indexing/for loop) - ```python def word_split_2(phrase, word_list): output = [] for i in word_list: if i in phrase: # Only checks membership, not position output.append(i) return output ``` ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/0o0ymz8428wo8f7yjoc7.PNG) --- # this approach might be wrong, please correct me if it is One caveat: the loop version only checks whether each word appears *somewhere* in the phrase, so it returns words in word-list order, ignores repeated words, and never verifies that the words form a contiguous segmentation of the phrase — the recursive version does not have these problems.
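A middle ground is a backtracking variant that keeps the recursive structure but recovers when the first matching word leads to a dead end (the plain greedy version can get stuck). This is a sketch, not from the original post:

```python
def word_split_backtrack(phrase, word_list):
    # Try every word the remaining phrase starts with; if the rest of the
    # phrase cannot be segmented, fall through to the next candidate word.
    if phrase == "":
        return []
    for word in word_list:
        if phrase.startswith(word):
            rest = word_split_backtrack(phrase[len(word):], word_list)
            if rest is not None:
                return [word] + rest
    return None  # no valid segmentation exists


print(word_split_backtrack("themanran", ["the", "ran", "man"]))  # ['the', 'man', 'ran']
```

Unlike the membership-check version, this guarantees the returned words appear in phrase order and cover the whole phrase, and it handles repeated words.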
alexey_27
1,901,490
EDR vs. XDR vs. SIEM vs. MDR vs. SOAR
The world of Cybersecurity is buzzing with tech jargon and abbreviations. Many enterprises prefer to...
0
2024-06-26T15:43:43
https://dev.to/sachindra149/edr-vs-xdr-vs-siem-vs-mdr-vs-soar-2blp
soar, cybersecurity, technology, detectionandresponse
The world of Cybersecurity is buzzing with tech jargon and abbreviations. Many enterprises prefer to use newer approaches to combat the ever-evolving security risks and attack vectors. To counter threats, there are several tools and solutions including **SIEM** (Security Information and Event Management), **MDR** (Managed Detection & Response), **SOAR** (Security Orchestration, Automation & Response), **EDR** (Endpoint Detection & Response) and **XDR** (Extended Detection & Response). **SIEM**: It is a tool that assists enterprises in identifying, assessing and responding to threats that affect businesses. It is intended to increase the visibility of the IT environment, allowing teams to respond to perceived events and security incidents efficiently through communication and collaboration. This involves identifying threats and taking action. It also offers forensic investigation and compliance reporting capabilities. **MDR**: It typically comprises technology, processes and people that collaborate to detect and respond to cyber threats. It is designed to provide continuous cybersecurity threat protection, detection and response. It makes use of machine learning to investigate, alert and contain cyber threats at scale. As a solution, MDR provides a proactive approach to threat detection and response and also assists enterprises in identifying and mitigating threats faster, providing real-time monitoring, and responding to cyber threats. **SOAR**: This is a solution stack that allows an organization to gather information about security threats and respond to events without any human involvement. This enables task coordination, execution and automation between various individuals and tools within a single platform. It provides a centralized platform for incident management, thereby reducing the need for manual processes and various technologies.
This allows enterprises to easily plan, track and report on incident management activities, which improves incident response times and the overall security posture. It can orchestrate and automate tasks across multiple security tools and systems, allowing businesses to streamline their incident response process. It can automatically invoke investigation path workflows and shorten the time it takes to resolve alerts. According to Gartner, SOAR is a technology that comprises security orchestration and automation (SOA), incident response, and threat intelligence platforms (TIPs). It allows security teams to investigate threats by leveraging automated threat hunting playbooks and to reduce the overall mean-time-to-detect (MTTD) and mean-time-to-respond (MTTR). **EDR**: This helps detect, investigate and respond to advanced endpoint threats. It compensates for the shortcomings of traditional endpoint protection solutions in preventing attacks, and gives customers full visibility into all security-related endpoint activities. It is an advanced version of EPP (Endpoint Protection Platform) and helps thwart threats more completely. **XDR**: This is a security solution that aims to identify, investigate and respond to advanced threats emanating from various sources, like cloud, networks and email. It is a SaaS-based security platform that combines the organization's existing security solutions into a single security system. It collates raw telemetry data from a wide range of sources, including cloud apps, email, identity and access control, and integrates it with the data from multiple security systems to improve threat visibility and reduce the time to detect and respond to an attack. It is an evolution of EDR. XDR's capabilities extend beyond endpoint detection; it offers detection, analytics and response capabilities across endpoints, networks, servers, cloud workloads, SIEMs and many other platforms.
sachindra149
1,901,484
WebSocket : Create your First WebSocket connection
What is Backend Communication? When One Backend system interacts with other Backend systems, Its...
0
2024-06-26T15:32:14
https://dev.to/nirvanjha2004/websocket-create-your-first-websocket-connection-47o2
webdev, javascript, beginners, tutorial
**What is Backend Communication?** When one backend system interacts with other backend systems, it's called backend communication. In the real world, there are various backend systems, not just one. Let's see some examples:- 1. For a website like PayTM, whenever you do a transaction, the following might happen ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/20p7hsgb7l2ag3l905fv.png) * The **Backend1** will be doing the Money Transfer work and will delegate the task of sending notification of money transfer via email and messaging to other backends. 2. For leetcode (Leetcode is a problem solving website for programmers), whenever the user submits a problem, the following might happen - ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/2d2fgelrd4mamh4p3ukg.png) **Types of communication:** 1. Synchronous (Strong coupling):- 1. HTTP (REST/GraphQL) 2. Asynchronous (Weak coupling):- 1. Messaging queues 2. Pub subs 3. Server-Sent Events 3. WebSockets Whether WebSockets are sync or async is still debatable. So what is synchronous? HTTP is considered synchronous due to its request-response model. In this model, a client sends a request to a server and waits for a response. The client sends a request, waits for a response, and only after receiving the response can it send another request. And asynchronous? It refers to a model where the client does not wait for the server to respond before proceeding with other tasks. Instead, the client can send multiple requests and handle responses as they arrive. For example:- ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/i4nfuhmiuaxvwqrofce6.png) ## Websockets Websockets are a way of communication between the client (typically a web browser) and the server. So why are we using websockets and not HTTP? 1. In HTTP, a network handshake happens for every request.
Here the client makes a request and waits for the response, and when it receives the response, the connection gets closed. To make another request, the network handshake happens again. See below:- ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/l47iu2ivanwt20cagesd.png) 2. No way to push server side events. In an HTTP connection you cannot send events or requests from the server to the client (you cannot do fetch("URL") from the server side). Only the client can make requests. (You can use polling, but it is not the best approach.) **WebSockets provide a way to establish a persistent, full-duplex communication channel over a single TCP connection between the client (typically a web browser) and the server, and solve the above two problems.** ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/j9d89wwuvy0oulua72dt.png) **Use Cases for WebSockets:** 1. Real-Time Applications: Chat applications, live sports updates, real-time gaming, and any application requiring instant updates can benefit from WebSockets. 2. Live Feeds: Financial tickers, news feeds, and social media updates are examples where WebSockets can be used to push live data to users. 3. Interactive Services: Collaborative editing tools, live customer support chat, and interactive webinars can use WebSockets to enhance user interaction. **Websockets in Node.js** There are various libraries that let you create a ws server:- https://www.npmjs.com/package/websocket https://github.com/websockets/ws https://socket.io/ We’ll be using the ws library today. Let's dive into the code:- ## <u>Backend</u> **Ws in Node.js (Code)** 1. Initialize an empty Node.js project ``` npm init -y ``` 2. Add tsconfig to it ``` npx tsc --init ``` 3. Update tsconfig ``` "rootDir": "./src", "outDir": "./dist", ``` 3.1. Create a src folder and inside it create an index.ts file. 4. Install ws ``` npm i ws @types/ws ``` If you want to use the standard http library: - 5.
Code using http library ``` import WebSocket, { WebSocketServer } from 'ws'; import http from 'http'; const server = http.createServer(function(request: any, response: any) { console.log((new Date()) + ' Received request for ' + request.url); response.end("hi there"); }); const wss = new WebSocketServer({ server }); wss.on('connection', function connection(ws) { ws.on('error', console.error); ws.on('message', function message(data, isBinary) { wss.clients.forEach(function each(client) { if (client.readyState === WebSocket.OPEN) { client.send(data, { binary: isBinary }); } }); }); ws.send('Hello! Message From Server!!'); }); server.listen(8080, function() { console.log((new Date()) + ' Server is listening on port 8080'); }); ``` If you want to use the Express library:- 5. Code using express 5.1. Install Express ``` npm install express @types/express ``` 5.2. Write this code in index.ts ``` import express from 'express' import WebSocket, { WebSocketServer } from 'ws' // WebSocket import needed for WebSocket.OPEN below const app = express() const httpServer = app.listen(8080) const wss = new WebSocketServer({ server: httpServer }); wss.on('connection', function connection(ws) { ws.on('error', console.error); ws.on('message', function message(data, isBinary) { wss.clients.forEach(function each(client) { if (client.readyState === WebSocket.OPEN) { client.send(data, { binary: isBinary }); } }); }); ws.send('Hello! Message From Server!!'); }); ``` 6. Open the terminal and write:- ``` tsc -b ``` ``` node dist/index.js ``` 7. To check whether the websocket connection is established, you can go to https://hoppscotch.io/realtime/websocket ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/wje6yw4u6xfzdu2kr1hw.png) ## <u>Client side code</u> > Websocket is a browser API that you can access (very similar to fetch). 1. Create a React project ``` npm create vite@latest ``` 2.
Create a websocket connection to the server in App.tsx ``` import { useEffect, useState } from 'react' import './App.css' function App() { const [socket, setSocket] = useState<WebSocket | null>(null); useEffect(() => { const newSocket = new WebSocket('ws://localhost:8080'); newSocket.onopen = () => { console.log('Connection established'); newSocket.send('Hello Server!'); } newSocket.onmessage = (message) => { console.log('Message received:', message.data); } setSocket(newSocket); return () => newSocket.close(); }, []) // Here you can write the code to make a chat application: add an input box, create a state, store the messages in it, and render them on the screen. return ( <> hi there </> ) } export default App ``` > In Next.js, write this code in Page.tsx and make it client side by writing "use client" at the top of Page.tsx 3. Open the terminal to run this project:- ``` npm run dev ``` And now you are good to go.
nirvanjha2004
1,901,488
Blog: Creating, Modifying, and Destroying an EC2 Instance and Hosting a Static Website Using S3 in AWS with Terraform
Introduction Hello Friends! As part of my internship, I’ve been exploring the capabilities of...
0
2024-06-26T15:42:21
https://dev.to/jeshlin_pv_1628a63168e90/blog-creating-modifying-and-destroying-an-ec2-instance-and-hosting-a-static-website-using-s3-in-aws-with-terraform-11md
cloudcomputing, aws, cloudskills, blog
**Introduction**

Hello Friends! As part of my internship, I've been exploring the capabilities of Terraform and AWS. Recently, I worked on creating, modifying, and destroying an EC2 instance, and hosting a static website using S3 with Terraform. This blog will detail the steps I followed, the challenges I faced, and the solutions I found.

**Understanding the tools**

**Terraform**

Terraform is an open-source tool that allows you to define and provision infrastructure as code. Today, I learned some of the basics of Terraform and used it to create an EC2 instance and an S3 bucket.

**Amazon EC2 (Elastic Compute Cloud)**

Amazon EC2 provides scalable virtual servers in the cloud, allowing you to run applications and services with flexible compute power.

**Amazon S3 (Simple Storage Service)**

Amazon S3 is a scalable object storage service used for storing and retrieving any amount of data. It's ideal for hosting static websites due to its simplicity and reliability.

Here's a brief overview of my Terraform journey:

● Installation: I started by installing Terraform on my local machine. This was straightforward, requiring just a few commands to download and install the binary.

● Writing Configuration Files: Terraform uses configuration files written in HashiCorp Configuration Language (HCL). I created a simple configuration file to define an EC2 instance and an S3 bucket.

● Applying the Configuration: Using the "terraform apply" command, I provisioned the resources defined in my configuration file. Terraform handled the creation of the EC2 instance and S3 bucket, showing me the power of infrastructure as code.

Here's how I did it:

To start, I created a new Terraform configuration file and defined the AWS provider to specify the region. Then, I configured an EC2 instance resource with details like the AMI ID and instance type. After initializing Terraform to download the necessary plugins, I applied the configuration to create the EC2 instance.
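The configuration described above can be sketched in a single file; a minimal, hypothetical example (the region, AMI ID, and bucket name are placeholders, not the values I actually used):

```hcl
provider "aws" {
  region = "us-east-1" # placeholder region
}

# EC2 instance — AMI ID and instance type are illustrative
resource "aws_instance" "web" {
  ami           = "ami-0c55b159cbfafe1f0"
  instance_type = "t2.micro"
}

# S3 bucket for the static website — the name must be globally unique
resource "aws_s3_bucket" "site" {
  bucket = "my-unique-static-site-bucket"
}

resource "aws_s3_bucket_website_configuration" "site" {
  bucket = aws_s3_bucket.site.id

  index_document {
    suffix = "index.html"
  }
}
```

Applying it follows the usual cycle: `terraform init`, `terraform plan`, `terraform apply`, and later `terraform destroy` to tear everything down.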
To modify the instance, I updated the instance type in the configuration file and re-applied the changes, which Terraform detected and prompted me to confirm. Finally, to destroy the instance, I used the terraform destroy command, which also required confirmation.

For hosting a static website on S3, I configured an S3 bucket resource in the same Terraform configuration file, specifying the bucket name, access control settings, and website configuration. I included the necessary resources to upload the HTML files to the bucket. After initializing and applying the configuration again, the static website was accessible via the S3 bucket's website endpoint. Throughout this process, I ensured unique bucket names and appropriate IAM permissions.

**Challenges and Solutions**

During this process, I encountered a few challenges:

1. Unique Bucket Names: S3 bucket names must be globally unique. I had to experiment with different names until I found one that was available.

2. IAM Permissions: Ensuring the IAM user had the necessary permissions to create S3 buckets and upload objects was crucial for smooth operations.

**Conclusion**

Working with Terraform to create, modify, and destroy an EC2 instance, and host a static website on S3, was an enlightening experience. This hands-on approach demonstrated the power of IaC in managing cloud resources efficiently and consistently. I'm excited to continue exploring Terraform and AWS, learning more about automating and managing infrastructure.
jeshlin_pv_1628a63168e90
1,901,487
MasterChefV2 staking is Fixed or Flexible staking?
Main difference between is in the unbounding period. This is the amount of time you need to wait to...
0
2024-06-26T15:42:10
https://dev.to/fly_mountain_50e58c32540d/masterchefv2-staking-is-fixed-or-flexible-staking-3o4c
The main difference between fixed and flexible staking is the unbonding period: the amount of time you have to wait before you receive the unstaked amount. So, is MasterChefV2 staking fixed or flexible, and what is the reason?
fly_mountain_50e58c32540d
1,901,486
How to Implement a Data-Driven Marketing Strategy to Improve ROI
In today's digital era, leveraging data to drive marketing strategies has become indispensable for...
0
2024-06-26T15:40:15
https://dev.to/mlpds011/how-to-implement-a-data-driven-marketing-strategy-to-improve-roi-45k9
In today's digital era, leveraging data to drive marketing strategies has become indispensable for businesses aiming to enhance their return on investment (ROI). A data-driven marketing strategy involves using data to guide decision-making processes, optimize marketing efforts, and ultimately achieve better outcomes. This blog will walk you through the steps to implement an effective data-driven marketing strategy and how data engineering solutions, data engineering strategies, and data lakes play pivotal roles in this process.

## Understanding Data-Driven Marketing

Data-driven marketing involves collecting, analyzing, and utilizing data to understand customer behaviors, preferences, and trends. This approach allows marketers to make informed decisions, personalize marketing efforts, and measure campaign effectiveness with greater accuracy. By focusing on data, businesses can identify what works, what doesn't, and where to allocate resources for maximum impact.

## Steps to Implement a Data-Driven Marketing Strategy

**1. Define Clear Objectives**

The first step in implementing a data-driven marketing strategy is to define clear objectives. What are you trying to achieve with your marketing efforts? Whether it's increasing brand awareness, boosting sales, or improving customer retention, having clear goals will guide your data collection and analysis efforts.

**2. Collect and Organize Data**

To make informed decisions, you need access to accurate and relevant data. This is where data engineering solutions come into play. Data engineering involves designing and building systems for collecting, storing, and analyzing data. Implementing robust data engineering strategies ensures that your data is reliable and accessible.

**3. Utilize Data Lakes**

[Data lakes](https://www.techmango.net/data-lake-best-practices-for-aws) are centralized repositories that allow you to store all your structured and unstructured data at any scale.
By leveraging data lakes, you can break down data silos and have a comprehensive view of your data. This holistic view is crucial for gaining insights and making data-driven decisions.

**4. Analyze Data for Insights**

Once your data is organized, it's time to analyze it for actionable insights. Use advanced analytics tools and techniques to uncover patterns, trends, and correlations. These insights will help you understand your audience better, predict future behaviors, and tailor your marketing efforts accordingly.

**5. Implement Data-Driven Campaigns**

With insights in hand, you can now design and implement data-driven marketing campaigns. Personalize your messages, target specific customer segments, and optimize your marketing channels based on the data. Monitor the performance of your campaigns in real time and adjust your strategies as needed to maximize ROI.

**6. Measure and Optimize**

The final step is to measure the effectiveness of your marketing efforts. Use key performance indicators (KPIs) and metrics to assess the success of your campaigns. Continuously optimize your strategies based on the data to ensure ongoing improvement and higher ROI.

## The Role of Data Engineering Solutions

[Data engineering solutions](https://www.techmango.net/data-engineering-services) are the backbone of a data-driven marketing strategy. These solutions involve the development of infrastructure for data collection, storage, processing, and analysis. By investing in robust data engineering solutions, businesses can ensure that their data is accurate, secure, and readily available for analysis.

## Key Components of Data Engineering Solutions

**Data Collection:** Implementing systems to collect data from various sources such as websites, social media, CRM systems, and more.

**Data Storage:** Using data lakes and data warehouses to store vast amounts of data in a structured and organized manner.
**Data Processing:** Utilizing ETL (Extract, Transform, Load) processes to clean, transform, and prepare data for analysis.

**Data Analysis:** Employing advanced analytics tools and techniques to derive insights from the data.

## Building a Robust Data Engineering Strategy

A successful data engineering strategy is essential for the effective implementation of a data-driven marketing strategy. Here are some key considerations:

**Scalability:** Ensure that your data infrastructure can scale with your business needs.

**Data Quality:** Implement measures to maintain high data quality and integrity.

**Security:** Protect your data with robust security measures to prevent unauthorized access.

**Integration:** Integrate data from various sources to get a comprehensive view of your marketing efforts.

## Conclusion

Implementing a data-driven marketing strategy is crucial for businesses looking to improve their ROI. By leveraging data engineering solutions, data lakes, and robust data engineering strategies, businesses can gain valuable insights, optimize their marketing efforts, and achieve better outcomes. Start by defining clear objectives, collecting and organizing data, analyzing it for insights, implementing data-driven campaigns, and continuously measuring and optimizing your strategies. With a solid data-driven approach, you can make informed decisions, personalize your marketing efforts, and ultimately drive better results for your business.
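To make the "measure and optimize" step concrete, campaign ROI itself reduces to simple arithmetic; a tiny sketch with hypothetical figures (not from any real campaign):

```typescript
// ROI = (revenue attributed to the campaign - campaign cost) / campaign cost
function campaignRoi(revenue: number, cost: number): number {
  return (revenue - cost) / cost;
}

// Hypothetical figures: $15,000 attributed revenue on $5,000 spend
console.log(campaignRoi(15000, 5000)); // 2, i.e. a 200% return on spend
```

Tracking this one number per campaign, per channel, is often the simplest KPI to start with before layering on more sophisticated attribution models.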
mlpds011
1,900,344
To-do List Blockchain dApp with NextJS + Solidity + Web3.js - Step-by-Step Tutorial
Introduction In this tutorial, we will build a decentralized application (dApp) for...
0
2024-06-26T15:32:16
https://dev.to/starlabman/to-do-list-blockchain-dapp-avec-nextjs-solidity-web3js-tutoriel-pas-a-pas-212g
blockchain, solidity, nextjs, web3js
### Introduction

In this tutorial, we will build a decentralized to-do list application (dApp) using several key technologies: NextJS for the front end, Solidity for the smart contracts, Web3.js to interact with the blockchain, and Alchemy to connect to Ethereum. We will also use Sepolia, an Ethereum test network, to deploy and test our application without spending real ether (ETH).

### Technologies used

1. **dApp (Decentralized Application)**: A dApp runs on a decentralized network, such as a blockchain. Unlike traditional applications, dApps have no central point of control and use smart contracts to handle application logic securely and transparently.

2. **Front end**: The front end is the visible part of the application that users interact with directly. It includes the user interface (UI) and is usually built with technologies such as HTML, CSS, and JavaScript. In this project, we use **Next.js** for the front end.

3. **Backend**: The backend is the part of the application that handles logic, computation, and database interactions. In a dApp, part of the backend logic is handled by smart contracts on the blockchain. We will use **Hardhat** to develop and test our Solidity backend.

4. **JavaScript**: JavaScript is a programming language used to build both the front end and parts of the backend of web applications. In this project, we will use JavaScript to build and manage our front end with Next.js, and to interact with the blockchain through Web3.js.

5. **Web3.js**: Web3.js is a JavaScript library for interacting with the Ethereum blockchain. It lets web applications send transactions, read data, and interact with smart contracts.

6. **Hardhat**: Hardhat is an Ethereum development environment for compiling, deploying, testing, and debugging smart contracts. It streamlines smart contract development by providing a set of powerful tools.

7. **Alchemy**: Alchemy is a blockchain development platform that provides tooling and infrastructure for interacting with the Ethereum blockchain. It offers enhanced APIs and reliable node infrastructure that make building and deploying dApps easier.

8. **Sepolia**: Sepolia is an Ethereum test network (testnet) used to deploy and test smart contracts without spending real ether (ETH). Testnets like Sepolia let developers exercise their applications in a simulated environment.

### Overview of the steps

1. **Smart contract development**:
   - Define a smart contract to manage the to-do list items.
   - Compile and deploy the contract.

2. **Next.js front end**:
   - Set up a Next.js project.
   - Connect the front end to the Ethereum network using Web3.js.
   - Implement the functionality to create, read, and manage to-do list items.

3. **Running the front end locally**:
   - Run the application locally to test the integration with the smart contract.

### 1. Smart contract development

#### Setting up the backend project

To get started, we need to create a Hardhat project. Hardhat is an Ethereum development environment for compiling, deploying, testing, and debugging smart contracts.
Open your terminal, create or move into a new empty directory, and run the following command:

```sh
npm install ethers hardhat @nomiclabs/hardhat-waffle ethereum-waffle chai @nomiclabs/hardhat-ethers @openzeppelin/contracts dotenv
```

> **Explanation**: This command installs the packages needed to develop and test smart contracts with Hardhat, including `ethers` to interact with Ethereum, `hardhat` for local development, and `dotenv` to manage environment variables.

Next, initialize a new development environment with the Hardhat command:

```sh
npx hardhat
```

> **Explanation**: This command initializes a new Hardhat project in the current directory. Follow the prompts to create a basic project.

After running the command, select the "Create a basic sample project" option and answer "yes" to the remaining prompts. You should see the following files and folders created in your root directory:

- **hardhat.config.js**: Contains the Hardhat configuration.
- **scripts**: Contains a script named sample-script.js that deploys your smart contract when run.
- **test**: Contains a sample test script.
- **contracts**: Contains a sample Solidity smart contract.

#### Getting an Ethereum API key with Alchemy

Alchemy is a blockchain development platform. To create an API key, follow these steps:

![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/1uo81ijxls0ragqpbf1z.png)

1. Sign up at [Alchemy](https://www.alchemy.com/).
2. Go to the Dashboard.
3. Select the "Sepolia" network.
4. Copy the HTTP key after creating the app on Alchemy.

Create a `.env` file in the root of your project and store that HTTP key as follows:

```env
ALCHEMY_SEPOLIA_URL="YOUR_ALCHEMY_HTTP_KEY"
```

> **Explanation**: The `.env` file stores environment variables, such as your Alchemy API URL, securely, so they are not exposed directly in your source code.

#### Getting your account's private key from Metamask

Metamask is a cryptocurrency wallet for interacting with the Ethereum blockchain. Our contract deployment script needs this private key in order to run and to pay gas fees in ether from our wallet.

1. Click the profile icon.
2. Select the account you want to export.
3. Click "Account Details".
4. Click "Export Private Key" and enter your password.
5. Copy and paste that private key into your `.env` file.

![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/s8fm4htowfx3g0r5ubtj.png)

```env
ACCOUNT_PRIVATE_KEY="YOUR_PRIVATE_KEY"
```

> **Explanation**: Your Metamask account's private key is required to sign and send transactions on the Ethereum network. Never share this key, and keep it secure.

#### Getting test tokens for Sepolia

To test your smart contracts on Sepolia, you will need test tokens. You can get test ETH from testnet faucets. Here is one you can use:

![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/9h2gn1twfpteqy3xy2zb.png)

- [Sepolia Faucet](https://sepoliafaucet.io/)

> **Explanation**: Testnet faucets give you free ETH that you can use to pay gas fees while deploying and testing your smart contracts on test networks like Sepolia.
#### Updating hardhat.config.js

Update the configuration in hardhat.config.js as follows:

```js
require("@nomiclabs/hardhat-waffle");
require('dotenv').config();

module.exports = {
  solidity: "0.8.4",
  networks: {
    sepolia: {
      url: process.env.ALCHEMY_SEPOLIA_URL,
      accounts: [process.env.ACCOUNT_PRIVATE_KEY]
    }
  }
};
```

> **Explanation**: This configuration lets Hardhat connect to the Sepolia network through Alchemy, using your API URL and your Metamask account's private key to deploy and test the smart contracts.

#### Writing the smart contract logic

Create a new file in the contracts directory named `TaskContract.sol` and add the following code:

```solidity
// SPDX-License-Identifier: GPL-3.0
pragma solidity 0.8.4;

contract TaskContract {

    event AddTask(address recipient, uint taskId);
    event DeleteTask(uint taskId, bool isDeleted);

    struct Task {
        uint id;
        address username;
        string taskText;
        bool isDeleted;
    }

    Task[] private tasks;
    mapping(uint256 => address) taskToOwner;

    function addTask(string memory taskText, bool isDeleted) external {
        uint taskId = tasks.length;
        tasks.push(Task(taskId, msg.sender, taskText, isDeleted));
        taskToOwner[taskId] = msg.sender;
        emit AddTask(msg.sender, taskId);
    }

    function getMyTasks() external view returns (Task[] memory) {
        Task[] memory temporary = new Task[](tasks.length);
        uint counter = 0;
        for (uint i = 0; i < tasks.length; i++) {
            if (taskToOwner[i] == msg.sender && tasks[i].isDeleted == false) {
                temporary[counter] = tasks[i];
                counter++;
            }
        }

        Task[] memory result = new Task[](counter);
        for (uint i = 0; i < counter; i++) {
            result[i] = temporary[i];
        }
        return result;
    }

    function deleteTask(uint taskId, bool isDeleted) external {
        if (taskToOwner[taskId] == msg.sender) {
            tasks[taskId].isDeleted = isDeleted;
            emit DeleteTask(taskId, isDeleted);
        }
    }
}
```

> **Explanation**: This smart contract manages a task list. It lets you add, fetch, and delete tasks. Each task is associated with the address of the user who created it.

#### Testing the smart contract

To test the smart contract, create and open `test/TaskContractTest.js` and update it with the following code:

```js
const { expect } = require("chai");
const { ethers } = require("hardhat");

describe("Task Contract", function () {
  let TaskContract;
  let taskContract;
  let owner;

  const NUM_TOTAL_TASKS = 5;
  let totalTasks;

  beforeEach(async function () {
    TaskContract = await ethers.getContractFactory("TaskContract");
    [owner] = await ethers.getSigners();
    taskContract = await TaskContract.deploy();

    totalTasks = [];
    for (let i = 0; i < NUM_TOTAL_TASKS; i++) {
      let task = {
        'taskText': 'Task number: ' + i,
        'isDeleted': false
      };
      await taskContract.addTask(task.taskText, task.isDeleted);
      totalTasks.push(task);
    }
  });

  describe("Add Task", function () {
    it("should emit AddTask event", async function () {
      let task = {
        'taskText': 'New Task',
        'isDeleted': false
      };
      await expect(await taskContract.addTask(task.taskText, task.isDeleted))
        .to.emit(taskContract, 'AddTask')
        .withArgs(owner.address, NUM_TOTAL_TASKS);
    });
  });

  describe("Get All Tasks", function () {
    it("should return the correct number of total tasks", async function () {
      const tasksFromChain = await taskContract.getMyTasks();
      expect(tasksFromChain.length).to.equal(NUM_TOTAL_TASKS);
    });
  });

  describe("Delete Task", function () {
    it("should emit delete task event", async function () {
      const TASK_ID = 0;
      const TASK_DELETED = true;
      await expect(taskContract.deleteTask(TASK_ID, TASK_DELETED))
        .to.emit(taskContract, 'DeleteTask')
        .withArgs(TASK_ID, TASK_DELETED);
    });
  });
});
```

> **Explanation**: These tests verify that the smart contract works correctly by adding, fetching, and deleting tasks. They use `chai` for assertions and `ethers` to interact with the contract.
Run the unit tests with the following command:

```sh
npx hardhat test
```

> **Explanation**: This command runs the tests defined in the `test` directory to check that the smart contract behaves as expected.

#### Deploying the smart contract to the Sepolia network

To deploy the contract, create a `deploy.js` file in the `scripts` folder and add the following content:

```js
const main = async () => {
  const contractFactory = await ethers.getContractFactory('TaskContract');
  const contract = await contractFactory.deploy();
  await contract.deployed();

  console.log("Contract deployed to:", contract.address);
};

const runMain = async () => {
  try {
    await main();
    process.exit(0);
  } catch (error) {
    console.log(error);
    process.exit(1);
  }
};

runMain();
```

> **Explanation**: This script deploys the smart contract to the Sepolia network using Hardhat and prints the deployed contract's address once deployment completes.

Run the script with the following command:

```sh
npx hardhat run scripts/deploy.js --network sepolia
```

> **Explanation**: This command runs the deployment script against the Sepolia network, using the configuration provided in `hardhat.config.js`.

### 2. Next.js front end

#### Initializing the Next.js project

```sh
npx create-next-app todo-dapp-frontend
cd todo-dapp-frontend
```

> **Explanation**: This command creates a new Next.js project and moves into the project directory.

#### Installing the required dependencies

Install Web3.js and the other dependencies required for the front end:

```sh
npm install web3 axios @emotion/react @emotion/styled @mui/icons-material @mui/material react-toastify
```

> **Explanation**: This command installs the libraries needed to build the user interface, handle notifications, and interact with the blockchain.

#### Setting up the front end

##### Creating the application structure

1.
**`src` and `pages` folders**: In the `src` folder, create a `pages` folder containing an `index.js` file. Here is the content of `pages/index.js`:

```js
"use client";
import { useState, useEffect } from 'react';
import { Container, Box, Typography, TextField, Button, AppBar, Toolbar, IconButton } from '@mui/material';
import MenuIcon from '@mui/icons-material/Menu';
import { ToastContainer, toast } from 'react-toastify';
import 'react-toastify/dist/ReactToastify.css';
import TaskTable from '../components/Task';
import Web3 from 'web3';
import { TaskContractAddress } from '../config';
import TaskAbi from '../utils/TaskContract.json';

export default function Home() {
  const [tasks, setTasks] = useState([]);
  const [input, setInput] = useState('');
  const [currentAccount, setCurrentAccount] = useState('');
  const [correctNetwork, setCorrectNetwork] = useState(false);

  // Fetch all tasks
  const getAllTasks = async () => {
    try {
      const { ethereum } = window;
      if (ethereum) {
        const web3 = new Web3(ethereum);
        const TaskContract = new web3.eth.Contract(TaskAbi.abi, TaskContractAddress);
        let allTasks = await TaskContract.methods.getMyTasks().call();
        // Keep only the fields the contract actually returns
        allTasks = allTasks.map(task => ({
          id: task.id.toString(),
          username: task.username,
          taskText: task.taskText,
          isDeleted: task.isDeleted
        }));
        setTasks(allTasks);
      } else {
        console.log("Ethereum object doesn't exist");
      }
    } catch (error) {
      console.log(error);
    }
  };

  useEffect(() => {
    getAllTasks();
  }, []);

  // Connect the Metamask wallet
  const connectWallet = async () => {
    try {
      const { ethereum } = window;
      if (!ethereum) {
        toast.error('Metamask not detected');
        return;
      }
      let chainId = await ethereum.request({ method: 'eth_chainId' });
      console.log('Connected to chain:' + chainId);

      const sepoliaChainId = '0xaa36a7';
      if (chainId !== sepoliaChainId) {
        alert('You are not connected to the Sepolia Testnet!');
        return;
      } else {
        setCorrectNetwork(true);
      }

      const accounts = await ethereum.request({ method: 'eth_requestAccounts' });
      console.log('Found account', accounts[0]);
      setCurrentAccount(accounts[0]);
      toast.success('Wallet connected');
    } catch (error) {
      console.log('Error connecting to metamask', error);
    }
  };

  // Add a task
  const addTask = async (e) => {
    e.preventDefault();

    const task = {
      'id': tasks.length + 1,
      'taskText': input,
      'isDeleted': false
    };

    try {
      const { ethereum } = window;
      if (ethereum) {
        const web3 = new Web3(ethereum);
        const TaskContract = new web3.eth.Contract(TaskAbi.abi, TaskContractAddress);
        await TaskContract.methods.addTask(task.taskText, task.isDeleted).send({ from: currentAccount });
        setTasks([...tasks, task]);
        setInput('');
        toast.success('Task added');
      } else {
        console.log("Ethereum object doesn't exist!");
      }
    } catch (error) {
      console.log("Error submitting new Task", error);
      toast.error('Error adding task');
    }
  };

  // Delete a task
  const deleteTask = async (taskId) => {
    try {
      const { ethereum } = window;
      if (ethereum) {
        const web3 = new Web3(ethereum);
        const TaskContract = new web3.eth.Contract(TaskAbi.abi, TaskContractAddress);
        await TaskContract.methods.deleteTask(taskId, true).send({ from: currentAccount });
        const updatedTasks = tasks.map(task =>
          task.id === taskId.toString() ? { ...task, isDeleted: true } : task
        );
        setTasks(updatedTasks);
        toast.success('Task deleted');
      } else {
        console.log("Ethereum object doesn't exist");
      }
    } catch (error) {
      console.log(error);
      toast.error('Error deleting task');
    }
  };

  return (
    <div>
      <ToastContainer />
      <AppBar position="static">
        <Toolbar>
          <IconButton edge="start" color="inherit" aria-label="menu">
            <MenuIcon />
          </IconButton>
          <Typography variant="h6" style={{ flexGrow: 1 }}>
            TodoList DApp
          </Typography>
          {currentAccount === '' ? (
            <Button color="inherit" onClick={connectWallet}>Connect Wallet</Button>
          ) : (
            <Typography variant="h6">{currentAccount}</Typography>
          )}
        </Toolbar>
      </AppBar>
      <Container>
        {currentAccount === '' ? (
          <Box display="flex" justifyContent="center" alignItems="center" height="100vh">
            <Button variant="contained" color="primary" size="large" onClick={connectWallet}>
              Connect Wallet
            </Button>
          </Box>
        ) : correctNetwork ? (
          <Box mt={4}>
            <Typography variant="h4" align="center" gutterBottom>
              TodoList DApp
            </Typography>
            <Box component="form" onSubmit={addTask} display="flex" justifyContent="center" mb={2}>
              <TextField
                id="outlined-basic"
                label="New Task"
                variant="outlined"
                value={input}
                onChange={e => setInput(e.target.value)}
                style={{ marginRight: 8 }}
              />
              <Button variant="contained" color="primary" type="submit">Add Task</Button>
            </Box>
            <TaskTable tasks={tasks} onDelete={deleteTask} />
          </Box>
        ) : (
          <Box display="flex" flexDirection="column" alignItems="center" mt={4}>
            <Typography variant="h6" color="error">Please connect to the Sepolia Testnet</Typography>
            <Typography variant="subtitle1">and reload the page</Typography>
          </Box>
        )}
      </Container>
    </div>
  );
}
```

> **Explanation**: This React component is the main page of our dApp. It connects the Metamask wallet, fetches and displays the tasks from the blockchain, and handles adding and deleting tasks. `useEffect` is used to fetch the tasks when the page loads.

2. **`components` folder**: Create a `components` folder in `src` and add a `Task.js` file.
Here is the content of `Task.js`:

```js
import { Table, TableBody, TableCell, TableContainer, TableHead, TableRow, Paper } from '@mui/material';
import DeleteIcon from '@mui/icons-material/Delete';
import './Task.css';

const TaskTable = ({ tasks, onDelete }) => {
  return (
    <TableContainer component={Paper} className="task-table">
      <Table>
        <TableHead>
          <TableRow>
            <TableCell>ID</TableCell>
            <TableCell>Task</TableCell>
            <TableCell>Action</TableCell>
          </TableRow>
        </TableHead>
        <TableBody>
          {tasks.map((task) => (
            <TableRow key={task.id}>
              <TableCell>{task.id.toString()}</TableCell>
              <TableCell>{task.taskText}</TableCell>
              <TableCell>
                <DeleteIcon
                  fontSize="large"
                  style={{ opacity: 0.7, cursor: 'pointer' }}
                  onClick={() => onDelete(task.id)}
                />
              </TableCell>
            </TableRow>
          ))}
        </TableBody>
      </Table>
    </TableContainer>
  );
};

export default TaskTable;
```

> **Explanation**: This component renders the task table. It displays each task in a table row and includes a delete button for each task. The button calls the `onDelete` function passed as a prop when clicked.

3. **CSS file**: Add a `Task.css` file in `components`.
Here is the content of `Task.css`:

```css
/* Task.css */
.task-table {
  margin-top: 25px;
}

.task-table th {
  background-color: #f5f5f5;
  font-weight: bold;
}

.task-table td, .task-table th {
  padding: 12px 15px;
}

.todo__list {
  display: flex;
  justify-content: space-between;
  align-items: center;
  padding: 10px 20px;
  border-bottom: 1px solid #ddd;
  background-color: #ffffff;
  transition: background-color 0.3s ease;
  border-radius: 4px;
  margin-bottom: 10px;
}

.todo__list:last-child {
  border-bottom: none;
}

.todo__list:hover {
  background-color: #f1f1f1;
}

.MuiSvgIcon-root {
  cursor: pointer;
  transition: opacity 0.3s, transform 0.3s;
  opacity: 0.7;
}

.MuiSvgIcon-root:hover {
  opacity: 1;
  transform: scale(1.1);
  color: #ee0808;
}
```

> **Explanation**: This CSS file styles the task table and the delete button. It defines margins, background colors, and transitions to improve the visual appearance of the task list.

4. **`utils` folder**: Create a `utils` folder in `src` and add the `TaskContract.json` file produced on the backend side in the `artifacts/contracts/TaskContract` folder.
Here is the content of `TaskContract.json`:

```json
{
  "_format": "hh-sol-artifact-1",
  "contractName": "TaskContract",
  "sourceName": "contracts/TaskContract.sol",
  "abi": [
    { "anonymous": false, "inputs": [ { "indexed": false, "internalType": "address", "name": "recipient", "type": "address" }, { "indexed": false, "internalType": "uint256", "name": "taskId", "type": "uint256" } ], "name": "AddTask", "type": "event" },
    { "anonymous": false, "inputs": [ { "indexed": false, "internalType": "uint256", "name": "taskId", "type": "uint256" }, { "indexed": false, "internalType": "bool", "name": "isDeleted", "type": "bool" } ], "name": "DeleteTask", "type": "event" },
    { "inputs": [ { "internalType": "string", "name": "taskText", "type": "string" }, { "internalType": "bool", "name": "isDeleted", "type": "bool" } ], "name": "addTask", "outputs": [], "stateMutability": "nonpayable", "type": "function" },
    { "inputs": [ { "internalType": "uint256", "name": "taskId", "type": "uint256" }, { "internalType": "bool", "name": "isDeleted", "type": "bool" } ], "name": "deleteTask", "outputs": [], "stateMutability": "nonpayable", "type": "function" },
    { "inputs": [], "name": "getMyTasks", "outputs": [ { "components": [ { "internalType": "uint256", "name": "id", "type": "uint256" }, { "internalType": "address", "name": "username", "type": "address" }, { "internalType": "string", "name": "taskText", "type": "string" }, { "internalType": "bool", "name": "isDeleted", "type": "bool" } ], "internalType": "struct TaskContract.Task[]", "name": "", "type": "tuple[]" } ], "stateMutability": "view", "type": "function" }
  ],
  "bytecode":
"0x608060405234801561001057600080fd5b50610e62806100206000396000f3fe608060405234801561001057600080fd5b50600436106100415760003560e01c806320df4581146100465780636e13f81814610062578063aaba290714610080575b600080fd5b610060600480360381019061005b91906108fc565b61009c565b005b61006a610229565b6040516100779190610b34565b60405180910390f35b61009a60048036038101906100959190610950565b61067f565b005b600080805490509050600060405180608001604052808381526020013373ffffffffffffffffffffffffffffffffffffffff168152602001858152602001841515815250908060018154018082558091505060019003906000526020600020906004020160009091909190915060008201518160000155602082 01518160010160006101000a81548173ffffffffffffffffffffffffffffffffffffffff021916908373ffffffffffffffffffffffffffffffffffffffff1602179055506040820151816002019080519060200190610176929190610787565b5060608201518160030160006101000a81548160ff0219169083151502179055505050336001600083815260200190815260200160002060006101000a81548173ffffffffffffffffffffffffffffffffffffffff021916908373ffffffffffffffffffffffffffffffffffffffff1602179055507f1f54e1ba1832d428fbd7e7792beaf62b1fc5a382c207ffd614209c1413e94fda338260405161021c929190610b0b565b60405180910390a1505050565b60606000808054905067ffffffffffffffff811115610271577f4e487b7100000000000000000000000000000000000000000000000000000000600052604160045260246000fd5b6040519080825280602002602001820160405280156102aa57816020015b61029761080d565b81526020019060019003908161028f5790505b5090506000805b600080549050811015610553573373ffffffffffffffffffffffffffffffffffffffff166001600083815260200190815260200160002060009054906101000a900473ffffffffffffffffffffffffffffffffffffffff1673ffffffffffffffffffffffffffffffffffffffff1614801561038857506000151560008281548110610365577f4e487b7100000000000000000000000000000000000000000000000000000000600052603260045260246000fd5b906000526020600020906004020160030160009054906101000a900460ff161515145b1561054057600081815481106103c7577f4e487b7100000000000000000000000000000000000000000000000000000000600052603260045260246000
fd5b9060005260206000209060040201604051806080016040529081600082015481526020016001820160009054906101000a900473ffffffffffffffffffffffffffffffffffffffff1673ffffffffffffffffffffffffffffffffffffffff1673ffffffffffffffffffffffffffffffffffffffff16815260200160028201805461045090610cb4565b80601f016020809104026020016040519081016040528092919081815260200182805461047c90610cb4565b80156104c95780601f1061049e576101008083540402835291602001916104c9565b820191906000526020600020905b8154815290600101906020018083116104ac57829003601f168201915b505050505081526020016003820160009054906101000a900460ff161515151581525050838381518110610526577f4e487b7100000000000000000000000000000000000000000000000000000000600052603260045260246000fd5b6020026020010181905250818061053c90610d17565b9250505b808061054b90610d17565b9150506102b1565b5060008167ffffffffffffffff811115610596577f4e487b7100000000000000000000000000000000000000000000000000000000600052604160045260246000fd5b6040519080825280602002602001820160405280156105cf57816020015b6105bc61080d565b8152602001906001900390816105b45790505b50905060005b8281101561067557838181518110610616577f4e487b7100000000000000000000000000000000000000000000000000000000600052603260045260246000fd5b6020026020010151828281518110610657577f4e487b7100000000000000000000000000000000000000000000000000000000600052603260045260246000fd5b6020026020010181905250808061066d90610d17565b9150506105d5565b5080935050505090565b3373ffffffffffffffffffffffffffffffffffffffff166001600084815260200190815260200160002060009054906101000a900473ffffffffffffffffffffffffffffffffffffffff1673ffffffffffffffffffffffffffffffffffffffff161415610783578060008381548110610721577f4e487b7100000000000000000000000000000000000000000000000000000000600052603260045260246000fd5b906000526020600020906004020160030160006101000a81548160ff0219169083151502179055507ff88a35c3d2016c409a46570b994a17b408dbc83c14a03f521512d50b85386d06828260405161077a929190610b56565b60405180910390a15b5050565b82805461079390610cb4565b90600052602060002090601f0160209004810192826107b55760
0085556107fc565b82601f106107ce57805160ff19168380011785556107fc565b828001600101855582156107fc579182015b828111156107fb5782518255916020019190600101906107e0565b5b509050610809919061084d565b5090565b604051806080016040528060008152602001600073ffffffffffffffffffffffffffffffffffffffff168152602001606081526020016000151581525090565b5b8082111561086657600081600090555060010161084e565b5090565b600061087d61087884610ba4565b610b7f565b90508281526020810184848401111561089557600080fd5b6108a0848285610c72565b509392505050565b6000813590506108b781610dfe565b92915050565b600082601f8301126108ce57600080fd5b81356108de84826020860161086a565b91505092915050565b6000813590506108f681610e15565b92915050565b6000806040838503121561090f57600080fd5b600083013567ffffffffffffffff81111561092957600080fd5b610935858286016108bd565b9250506020610946858286016108a8565b9150509250929050565b6000806040838503121561096357600080fd5b6000610971858286016108e7565b9250506020610982858286016108a8565b9150509250929050565b60006109988383610a8a565b905092915050565b6109a981610c2a565b82525050565b6109b881610c2a565b82525050565b60006109c982610be5565b6109d38185610c08565b9350836020820285016109e585610bd5565b8060005b85811015610a215784840389528151610a02858261098c565b9450610a0d83610bfb565b925060208a019950506001810190506109e9565b50829750879550505050505092915050565b610a3c81610c3c565b82525050565b610a4b81610c3c565b82525050565b6000610a5c82610bf0565b610a668185610c19565b9350610a76818560208601610c81565b610a7f81610ded565b840191505092915050565b6000608083016000830151610aa26000860182610aed565b506020830151610ab560208601826109a0565b5060408301518482036040860152610acd8282610a51565b9150506060830151610ae26060860182610a33565b508091505092915050565b610af681610c68565b82525050565b610b0581610c68565b82525050565b6000604082019050610b2060008301856109af565b610b2d6020830184610afc565b9392505050565b60006020820190508181036000830152610b4e81846109be565b905092915050565b6000604082019050610b6b6000830185610afc565b610b786020830184610a42565b9392505050565b6000610b89610b9a565b9050610b958282610ce6565b
919050565b6000604051905090565b600067ffffffffffffffff821115610bbf57610bbe610dbe565b5b610bc882610ded565b9050602081019050919050565b600081905060208201905 0919050565b600081519050919050565b600081519050919050565b6000602082019050919050565b600082825260208201905092915050565b600082825260208201905092915050565b6000610c3582610c48565b9050919050565b60008115159050919050565b600073ffffffffffffffffffffffffffffffffffffffff82169050919050565b6000819050919050565b82818337600083830152505050565b60005b83811015610c9f578082015181840152602081019050610c84565b83811115610cae576000848401525b50505050565b60006002820490506001821680610ccc57607f821691505b60208210811415610ce057610cdf610d8f565b5b50919050565b610cef82610ded565b810181811067ffffffffffffffff82111715610d0e57610d0d610dbe565b5b80604052505050565b6000610d2282610c68565b91507fffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff821415610d5557610d54610d60565b5b600182019050919050565b7f4e487b7100000000000000000000000000000000000000000000000000000000600052601160045260246000fd5b7f4e487b7100000000000000000000000000000000000000000000000000000000600052602260045260246000fd5b7f4e487b7100000000000000000000000000000000000000000000000000000000600052604160045260246000fd5b6000601f19601f8301169050919050565b610e0781610c3c565b8114610e1257600080fd5b50565b610e1e81610c68565b8114610e2957600080fd5b5056fea26469706673582212209cf2120d1c2331a546dc67e1b144e55ded28e1e2a9bbcd5277ea9cf6eaec679464736f6c63430008040033", "deployedBytecode": 
"0x608060405234801561001057600080fd5b50600436106100415760003560e01c806320df4581146100465780636e13f81814610062578063aaba290714610080575b600080fd5b610060600480360381019061005b91906108fc565b61009c565b005b61006a610229565b6040516100779190610b34565b60405180910390f35b61009a60048036038101906100959190610950565b61067f565b005b600080805490509050600060405180608001604052808381526020013373ffffffffffffffffffffffffffffffffffffffff16815260200185815260200184151581525090806001815401808255809150506001900390600052602060002090600402016000909190919091506000820151816000015560208201518160010160006101000a81548173ffffffffffffffffffffffffffffffffffffffff021916908373ffffffffffffffffffffffffffffffffffffffff1602179055506040820151816002019080519060200190610176929190610787565b5060608201518160030160006101000a81548160ff0219169083151502179055505050336001600083815260200190815260200160002060006101000a81548173ffffffffffffffffffffffffffffffffffffffff021916908373ffffffffffffffffffffffffffffffffffffffff1602179055507f1f54e1ba1832d428fbd7e7792beaf62b1fc5a382c207ffd614209c1413e94fda338260405161021c929190610b0b565b60405180910390a1505050565b60606000808054905067ffffffffffffffff811115610271577f4e487b7100000000000000000000000000000000000000000000000000000000600052604160045260246000fd5b6040519080825280602002602001820160405280156102aa57816020015b61029761080d565b81526020019060019003908161028f5790505b5090506000805b600080549050811015610553573373ffffffffffffffffffffffffffffffffffffffff166001600083815260200190815260200160002060009054906101000a900473ffffffffffffffffffffffffffffffffffffffff1673ffffffffffffffffffffffffffffffffffffffff1614801561038857506000151560008281548110610365577f4e487b7100000000000000000000000000000000000000000000000000000000600052603260045260246000fd5b906000526020600020906004020160030160009054906101000a900460ff161515145b1561054057600081815481106103c7577f4e487b7100000000000000000000000000000000000000000000000000000000600052603260045260246000fd5b9060005260206000209060040201604051806080016040529081600082015
481526020016001820160009054906101000a900473ffffffffffffffffffffffffffffffffffffffff1673ffffffffffffffffffffffffffffffffffffffff1673ffffffffffffffffffffffffffffffffffffffff16815260200160028201805461045090610cb4565b80601f016020809104026020016040519081016040528092919081815260200182805461047c90610cb4565b80156104c95780601f1061049e576101008083540402835291602001916104c9565b820191906000526020600020905b8154815290600101906020018083116104ac57829003601f168201915b505050505081526020016003820160009054906101000a900460ff161515151581525050838381518110610526577f4e487b7100000000000000000000000000000000000000000000000000000000600052603260045260246000fd5b6020026020010181905250818061053c90610d17565b9250505b808061054b90610d17565b9150506102b1565b5060008167ffffffffffffffff811115610596577f4e487b7100000000000000000000000000000000000000000000000000000000600052604160045260246000fd5b6040519080825280602002602001820160405280156105cf57816020015b6105bc61080d565b8152602001906001900390816105b45790505b50905060005b8281101561067557838181518110610616577f4e487b7100000000000000000000000000000000000000000000000000000000600052603260045260246000fd5b6020026020010151828281518110610657577f4e487b7100000000000000000000000000000000000000000000000000000000600052603260045260246000fd5b6020026020010181905250808061066d90610d17565b9150506105d5565b5080935050505090565b3373ffffffffffffffffffffffffffffffffffffffff166001600084815260200190815260200160002060009054906101000a900473ffffffffffffffffffffffffffffffffffffffff1673ffffffffffffffffffffffffffffffffffffffff161415610783578060008381548110610721577f4e487b7100000000000000000000000000000000000000000000000000000000600052603260045260246000fd5b906000526020600020906004020160030160006101000a81548160ff0219169083151502179055507ff88a35c3d2016c409a46570b994a17b408dbc83c14a03f521512d50b85386d06828260405161077a929190610b56565b60405180910390a15b5050565b82805461079390610cb4565b90600052602060002090601f0160209004810192826107b557600085556107fc565b82601f106107ce57805160ff19168380011785556107fc565
b828001600101855582156107fc579182015b828111156107fb5782518255916020019190600101906107e0565b5b509050610809919061084d565b5090565b604051806080016040528060008152602001600073ffffffffffffffffffffffffffffffffffffffff168152602001606081526020016000151581525090565b5b8082111561086657600081600090555060010161084e565b5090565b600061087d61087884610ba4565b610b7f565b905082815260208101 84848401111561089557600080fd5b6108a0848285610c72565b509392505050565b6000813590506108b781610dfe565b92915050565b600082601f8301126108ce57600080fd5b81356108de84826020860161086a565b91505092915050565b6000813590506108f681610e15565b92915050565b6000806040838503121561090f57600080fd5b600083013567ffffffffffffffff81111561092957600080fd5b610935858286016108bd565b9250506020610946858286016108a8565b9150509250929050565b6000806040838503121561096357600080fd5b6000610971858286016108e7565b9250506020610982858286016108a8565b9150509250929050565b60006109988383610a8a565b905092915050565b6109a981610c2a565b82525050565b6109b881610c2a565b82525050565b60006109c982610be5565b6109d38185610c08565b9350836020820285016109e585610bd5565b8060005b85811015610a215784840389528151610a02858261098c565b9450610a0d83610bfb565b925060208a019950506001810190506109e9565b50829750879550505050505092915050565b610a3c81610c3c565b82525050565b610a4b81610c3c565b82525050565b6000610a5c82610bf0565b610a668185610c19565b9350610a76818560208601610c81565b610a7f81610ded565b840191505092915050565b6000608083016000830151610aa26000860182610aed565b506020830151610ab560208601826109a0565b5060408301518482036040860152610acd8282610a51565b9150506060830151610ae26060860182610a33565b508091505092915050565b610af681610c68565b82525050565b610b0581610c68565b82525050565b6000604082019050610b2060008301856109af565b610b2d6020830184610afc565b9392505050565b60006020820190508181036000830152610b4e81846109be565b905092915050565b6000604082019050610b6b6000830185610afc565b610b786020830184610a42565b9392505050565b6000610b89610b9a565b9050610b958282610ce6565b919050565b6000604051905090565b600067ffffffffffffffff821115610bbf
57610bbe610dbe565b5b610bc882610ded565b9050602081019050919050565b6000819050602082019050919050565b600081519050919050565b600081519050919050565b6000602082019050919050565b600082825260208201905092915050565b600082825260208201905092915050565b6000610c3582610c48565b9050919050565b60008115159050919050565b600073ffffffffffffffffffffffffffffffffffffffff82169050919050565b6000819050919050565b82818337600083830152505050565b60005b83811015610c9f578082015181840152602081019050610c84565b83811115610cae576000848401525b50505050565b60006002820490506001821680610ccc57607f821691505b60208210811415610ce057610cdf610d8f565b5b50919050565b610cef82610ded565b810181811067ffffffffffffffff82111715610d0e57610d0d610dbe565b5b80604052505050565b6000610d2282610c68565b91507fffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff821415610d5557610d54610d60565b5b600182019050919050565b7f4e487b7100000000000000000000000000000000000000000000000000000000600052601160045260246000fd5b7f4e487b7100000000000000000000000000000000000000000000000000000000600052602260045260246000fd5b7f4e487b7100000000000000000000000000000000000000000000000000000000600052604160045260246000fd5b6000601f19601f8301169050919050565b610e0781610c3c565b8114610e1257600080fd5b50565b610e1e81610c68565b8114610e2957600080fd5b5056fea26469706673582212209cf2120d1c2331a546dc67e1b144e55ded28e1e2a9bbcd5277ea9cf6eaec679464736f6c63430008040033", "linkReferences": {}, "deployedLinkReferences": {} } ``` > **Explication** : Ce fichier JSON contient l'ABI (Application Binary Interface) et le bytecode compilé du smart contract. L'ABI est utilisée par Web3.js pour interagir avec le smart contract déployé. Ensuite, créez un fichier `config.js` à la racine du dossier `client` et collez l'adresse du contrat déployé : ```js export const TaskContractAddress = "ADRESSE_DU_CONTRAT_DEPLOYE"; ``` > **Explication** : Ce fichier JavaScript exporte l'adresse du smart contract déployé. 
b828001600101855582156107fc579182015b828111156107fb5782518255916020019190600101906107e0565b5b509050610809919061084d565b5090565b604051806080016040528060008152602001600073ffffffffffffffffffffffffffffffffffffffff168152602001606081526020016000151581525090565b5b8082111561086657600081600090555060010161084e565b5090565b600061087d61087884610ba4565b610b7f565b90508281526020810184848401111561089557600080fd5b6108a0848285610c72565b509392505050565b6000813590506108b781610dfe565b92915050565b600082601f8301126108ce57600080fd5b81356108de84826020860161086a565b91505092915050565b6000813590506108f681610e15565b92915050565b6000806040838503121561090f57600080fd5b600083013567ffffffffffffffff81111561092957600080fd5b610935858286016108bd565b9250506020610946858286016108a8565b9150509250929050565b6000806040838503121561096357600080fd5b6000610971858286016108e7565b9250506020610982858286016108a8565b9150509250929050565b60006109988383610a8a565b905092915050565b6109a981610c2a565b82525050565b6109b881610c2a565b82525050565b60006109c982610be5565b6109d38185610c08565b9350836020820285016109e585610bd5565b8060005b85811015610a215784840389528151610a02858261098c565b9450610a0d83610bfb565b925060208a019950506001810190506109e9565b50829750879550505050505092915050565b610a3c81610c3c565b82525050565b610a4b81610c3c565b82525050565b6000610a5c82610bf0565b610a668185610c19565b9350610a76818560208601610c81565b610a7f81610ded565b840191505092915050565b6000608083016000830151610aa26000860182610aed565b506020830151610ab560208601826109a0565b5060408301518482036040860152610acd8282610a51565b9150506060830151610ae26060860182610a33565b508091505092915050565b610af681610c68565b82525050565b610b0581610c68565b82525050565b6000604082019050610b2060008301856109af565b610b2d6020830184610afc565b9392505050565b60006020820190508181036000830152610b4e81846109be565b905092915050565b6000604082019050610b6b6000830185610afc565b610b786020830184610a42565b9392505050565b6000610b89610b9a565b9050610b958282610ce6565b919050565b6000604051905090565b600067ffffffffffffffff821115610bbf
To support me, here are my crypto wallets:

EVM: 0xf249F24182CdE7bAd264B60Ed38727Fd3674FE6A

SOL: Fq9sgX7UHqEEwpVMu7UKjpstQGcf1JD3kPnUTYRbEdcZ

For more information and to see other projects, you can follow me on:

- **LinkedIn**: [AGBETISIASSI KODJO LABORE](https://www.linkedin.com/in/starlabman)
- **GitHub**: [Github Profile](https://github.com/starlabman)
- **Twitter**: [Twitter Profile](https://twitter.com/0xWeb3Devrel)
starlabman
1,901,483
4 Tech Trends For 2024 and Beyond
The world of technology is constantly changing, so here's a breakdown of the top 4 tech trends that...
0
2024-06-26T15:31:29
https://dev.to/devella/4-tech-trends-for-2024-and-beyond-4g87
programming, webdev, beginners, ai
> The world of technology is **constantly changing**, so here's a breakdown of the **top 4 tech trends** that are set to shape our lives in 2024 and beyond:

**1. Artificial Intelligence (AI) _Takes Center Stage_:**

Remember those sci-fi movies with intelligent robots? Well, AI is making that vision a reality. AI uses advanced computer programs to mimic human intelligence, allowing them to learn, analyze data, and even make decisions. In the future, expect AI to become even more integrated into our daily lives, from automating tasks at work to revolutionizing healthcare and transportation.

**2. The Internet of Things (IoT) _Connects Everything_:**

The **Internet of Things** (IoT) refers to everyday objects getting a tech upgrade. These objects, from thermostats to refrigerators, are embedded with sensors and software, allowing them to connect to the internet and collect data. As the IoT grows in 2024, expect our homes and cities to become "smarter," with better automation, energy efficiency, and even personalized living experiences.

**3. Extended Reality (XR) _Reshapes Our Perception_:**

Imagine learning history by walking through ancient Rome in **virtual reality (VR)**, or trying on clothes virtually before you buy them in **augmented reality (AR)**. Extended Reality (XR) encompasses both VR and AR, creating immersive experiences that blend the digital and physical worlds. XR is expected to move beyond just gaming and entertainment: it can be used for training simulations, education, even healthcare procedures.

**4. The Rise of Responsible Technology:**

With all this amazing tech advancement comes a responsibility to use it wisely. **Responsible technology** focuses on developing and using technology in a way that benefits society and minimizes harm. This includes issues like data privacy, security, and the ethical implications of AI. In 2024, expect to see more conversations about using technology for good, ensuring it's inclusive and accessible to everyone.
> **Conclusion:**
> _These are just a taste of the exciting tech trends shaping our future. As technology continues to evolve, the possibilities are endless!_

_**Please remember to like & save this article if you enjoyed it and let me know what topics you'd want to see covered next.**_
devella
1,901,482
From Vite’s Popularity to Selenium’s Legacy: What Defines Open Source Success?
Explore the stories of Vite and Selenium, and learn about the key metrics and community factors that contribute to lasting impact and growth in the open source ecosystem.
0
2024-06-26T15:29:15
https://opensauced.pizza/blog/open-source-success
opensource, javascript, community
---
title: "From Vite’s Popularity to Selenium’s Legacy: What Defines Open Source Success?"
published: true
description: Explore the stories of Vite and Selenium, and learn about the key metrics and community factors that contribute to lasting impact and growth in the open source ecosystem.
tags: opensource, javascript, community
cover_image: https://dev-to-uploads.s3.amazonaws.com/uploads/articles/g2uot3umoithr2g8m2wm.png
# Use a ratio of 100:42 for best results.
# published_at: 2024-06-26 15:25 +0000
canonical_url: https://opensauced.pizza/blog/open-source-success
---

There’s no doubt that in tech, success is assumed based on the loudest voices or the spiciest headlines. But that doesn’t mean that those people and projects are actually the most successful. What truly defines the best open source projects, and why should we care?

Although the recent [State of JavaScript Awards](https://2023.stateofjs.com/en-US/awards/) isn’t specifically for open source, it can still provide more context for this question.

Consider [Vite](https://app.opensauced.pizza/s/vitejs/vite?range=360), a build tool that swept multiple categories in the State of JavaScript awards, including “Most Adopted Technology.” If we’re to call any open source project successful, taking as many awards as Vite did would surely qualify. Beyond that, in the last year alone, it’s seen 860 contributors, 9,232 stars, and 1,258 forks. However, it is worth noting that their top four contributors made 46% of all their commits, and only 20% of their stargazers and forkers come back to make a meaningful contribution.

All of these metrics together tell a story of a healthy, growing project that has captured the attention of the JavaScript community.

> To learn more about why Vite is everywhere, check out [Evan You’s episode of The Secret Sauce](https://youtu.be/4_uYqae42uc?feature=shared)

But the story of top open source projects isn’t always about the most visible or hyped technologies.
Take [Selenium](https://app.opensauced.pizza/s/SeleniumHQ/selenium?range=360), for instance. As Jason Huggins, its creator, recently pointed out:

> “Selenium is the biggest open source success story never told. We’re now at exactly 20 years since the first lines of code were written. Since then, it has launched countless careers and companies. Did you know the Selenium Test Automation User Group on LinkedIn alone has almost 250K users?!”

Selenium’s journey shows that the best open source projects aren’t always the ones grabbing headlines. They can be the ones that demonstrate remarkable longevity, sustaining their relevance over years, and providing paths to success for individual contributors as well.

The long-term success of a project like Selenium, which has remained vital for over two decades, highlights the importance of durability and consistent value in the open source ecosystem. This kind of sustained impact is a testament to the project’s adaptability and the community’s continuous support and improvement efforts, demonstrating that longevity is an important metric of success in open source.

In the last year, we’ve seen 560 total contributors to Selenium, with 3,631 stars and 743 forks as well. It might not have the same explosiveness as Vite, but it clearly shows success with its consistency and staying power.

## What Makes a Top Open Source Project?

In the last couple of weeks, I’ve started most of my mornings checking out the [top new stars](https://app.opensauced.pizza/workspaces/8691fd9d-4cfc-4fa5-8c40-85da76a229b8) and [top new forks](https://app.opensauced.pizza/workspaces/46eb2b34-7622-462c-93a2-55b5fa34aa76) over the last 24 hours and adding them to my (linked above) workspaces.
We know that [starred repositories aren’t a great indicator of project success](https://opensauced.pizza/blog/growth-hacking-killed-github-stars) and a sudden spike in forks often indicates some sort of spam project, but they can still be useful for identifying trends, understanding project engagement, and spotting rising stars.

It’s important to recognize that stars or forks or any other metric alone cannot provide a complete picture of a project’s long-term viability or community support. Understanding these metrics in the context of other qualitative and quantitative data helps provide a more comprehensive view of a project’s health and trajectory. They are pieces of a larger puzzle that, when combined with insights into awareness, retention, and community engagement, can help paint a fuller picture of what makes an open source project successful.

The truth is, the best open source projects often share several key characteristics:

- **Solve Real Problems**: Top open source projects address genuine pain points in the developer community.
- **Active Community**: The best open source projects create engaged communities that contribute to their growth and evolution, with a high [contributor confidence](https://opensauced.pizza/docs/features/repo-pages/#insights-into-contributor-confidence), and a variety of contributors, including recurring, new, and internal.
- **Sustainability**: Long-term viability is key for top open source projects, ensuring they can be trusted for years to come. There should be clear trends that indicate a healthy project, including a variety of committers to the project, reducing the [lottery factor](https://opensauced.pizza/docs/welcome/glossary/#lottery-factor).
- **Innovation**: The best open source projects often push the boundaries of what’s possible, driving the entire industry forward.

### How do the Projects in the State of JS Awards Compare?

So how does the State of JS look, when we get up close?
Let’s take a look at [the workspace I created with all the winners and runners-up](https://app.opensauced.pizza/workspaces/fb774fa9-a4a8-4f49-a0af-a8a955f10069?limit=20&range=360).

[![state of js repo stats](https://cdn.sanity.io/images/r7m53vrk/production/5bfdbb13156e0b6c04b37e419079ff1b69e791c3-1213x280.png?w=450)](https://app.opensauced.pizza/workspaces/fb774fa9-a4a8-4f49-a0af-a8a955f10069?limit=20&range=360)

#### Pull Requests and Issues: Indicators of Activity and Engagement

This workspace shows impressive numbers for all the projects over the last year:

- 14.5k PRs opened
- 14.2k merged
- 7.7k issues opened
- 15.6k closed issues

Every single project has a high activity level. [Next.js](https://app.opensauced.pizza/s/vercel/next.js) alone has an astonishing 3,367 PRs with a velocity of 4 days and a high engagement ratio. This high velocity indicates a well-managed project where contributions are reviewed and merged promptly, reflecting an active and responsive maintainer team.

The velocity of issue closure, averaging 209 days across projects, also provides insight into the project’s responsiveness to user feedback and bugs. While a high number of closed issues indicates healthy problem-solving, the velocity can highlight areas where improvements are needed. Projects with lower closure times are likely more efficient in handling user concerns, contributing to higher user satisfaction and retention.

#### Contributor Diversity and Sustained Engagement

The data reveals significant contributions from across projects. For example, [React](https://app.opensauced.pizza/s/facebook/react) boasts 503 contributors, showcasing a broad base of active participants. This diversity reduces the “lottery factor,” ensuring that the project does not overly depend on a few key individuals.
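The “lottery factor” can be approximated from raw commit counts. The sketch below is a deliberate simplification (not OpenSauced’s actual formula): it reports the smallest number of contributors who together account for at least half of all commits — the fewer, the riskier.

```javascript
// Sketch: how many top contributors does it take to cover >= 50% of commits?
// A small answer suggests the project depends heavily on a few people.
function lotteryFactor(commitCounts) {
  // commitCounts: { author: numberOfCommits }
  const counts = Object.values(commitCounts).sort((a, b) => b - a);
  const total = counts.reduce((sum, c) => sum + c, 0);
  let covered = 0;
  for (let i = 0; i < counts.length; i++) {
    covered += counts[i];
    if (covered * 2 >= total) return i + 1; // first i+1 authors cover half
  }
  return counts.length;
}

// Hypothetical repositories, not real data:
console.log(lotteryFactor({ ana: 120, ben: 40, cam: 25, dee: 15 })); // → 1
console.log(lotteryFactor({ ana: 30, ben: 28, cam: 27, dee: 25, eli: 24 })); // → 3
```

In the first repository a single contributor carries more than half the commits, while the second spreads the load across several people.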
![React stats](https://cdn.sanity.io/images/r7m53vrk/production/ddcd926ba2fd2c234e8785093a4b3eb3ba68a372-935x101.png?w=450)

Additionally, the consistency in contributor activity over the last 360 days shows ongoing interest and sustained participation. Projects like [Vitest](https://app.opensauced.pizza/s/vitest-dev/vitest) and [Astro](https://app.opensauced.pizza/s/withastro/astro) show steady contributor activity, suggesting that these projects have successfully maintained their community’s interest and involvement over time.

#### Engagement Metrics: Stars and Forks

Projects like [Storybook](https://app.opensauced.pizza/s/storybookjs/storybook) and [Bun](https://app.opensauced.pizza/s/oven-sh/bun) also show high levels of new forks and stars, suggesting they are gaining traction and interest. This can be particularly telling of projects that are on the rise, capturing the attention of developers looking for new tools and frameworks to integrate into their workflows.

#### High-Performance and High-Velocity Projects

The workspace highlights several projects with both high performance and high velocity in managing PRs and issues. Projects like [Playwright](https://app.opensauced.pizza/s/microsoft/playwright) and Bun demonstrate a remarkable balance between high PR activity and low velocity, indicating a streamlined contribution process and a responsive maintainer team.

#### Bonus: Going Deeper with the Lottery Factor

I wanted to learn a little more about these projects, so I asked [StarSearch](https://opensauced.pizza/docs/features/star-search/) to tell me more about their Lottery Factor. I was genuinely surprised at what’s happening in some of the repositories.
Here’s what I learned:

![lottery factor](https://cdn.sanity.io/images/r7m53vrk/production/01970ef9d17ad213955876650a179444d8d6afea-464x546.png?w=450)

## Looking Into the Future of Open Source Success

In the conclusion of the State of JavaScript Report, Cassidy Williams notes an interesting shift in sentiment:

“The fact that so many ‘smaller’ libraries like [Preact](https://app.opensauced.pizza/s/preactjs/preact), [Solid](https://app.opensauced.pizza/s/solidjs/solid), and [htmx](https://app.opensauced.pizza/s/bigskysoftware/htmx) are climbing in positive sentiment over something massive like [Next.js](https://app.opensauced.pizza/s/vercel/next.js) is fascinating. We’re starting to see [Angular](https://app.opensauced.pizza/s/angular/angular) make a bit of a comeback, and we’ll see if that trend continues next year. We’re seeing people fall out of love with some of the industry darlings. We’re seeing very Rusty systems grow. [Astro](https://app.opensauced.pizza/s/withastro/astro) feels like it’s off to the races in developer support.”

Cassidy underscores an important observation about the open source ecosystem: success isn’t solely determined by size or initial popularity. Smaller, more focused projects can gain significant traction and developer loyalty by addressing specific needs effectively.

Understanding these metrics helps provide a more nuanced view of what makes an open source project successful. It's not just about the initial burst of popularity but about building a sustainable, engaged community that can support and grow the project over the long term.

In Huggins' words, "Good thing we didn't do it for the fame and glory, huh?" Perhaps it's time we started giving projects like Selenium the glory they deserve.
bekahhw
1,901,480
Solana: The First Web-Scale Blockchain
Introduction Scalability has always been a problem in the rapidly developing field...
27,673
2024-06-26T15:26:34
https://dev.to/rapidinnovation/solana-the-first-web-scale-blockchain-114l
## Introduction

Scalability has always been a problem in the rapidly developing field of blockchain technology. Scalability solutions are becoming increasingly important as blockchain networks strive to handle more users and transactions. Relatively new to the blockchain arena, Solana has gained notoriety fast thanks to its ground-breaking inventions that allow it to function at web scale. In this piece, we explore the major advancements that set Solana apart as the first web-scale blockchain and transformed the field of decentralized networks.

## Proof of History (POH)

Proof of History (POH), a fundamental invention at the core of Solana's revolutionary developments, acts as a globally accessible, permissionless timekeeping system inside the network that functions independently of consensus. Solana's POH offers sub-second granularity, allowing any validator to maintain its own clock independently, negating the need for validators to synchronize and agree on the passage of time. This method improves the scalability and efficiency of Solana's decentralized ecosystem.

## Tower BFT

Tower BFT is the apex of Solana's consensus process, built on top of POH. This PBFT-inspired method allows Solana to achieve throughput and efficiency in decentralized network activities. Tower BFT prioritizes liveness while preserving network integrity, enabling quick decision-making and consensus-building, promoting a smooth transaction flow without jeopardizing the network's stability.

## Turbine

Turbine, Solana's block propagation algorithm, transforms the way blockchain networks attain resilience and scalability. By breaking up blocks into smaller packets, Turbine optimizes data transmission, ensuring redundancy and fault tolerance. This streaming approach to block propagation minimizes latency and greatly increases Solana's throughput capacity.
## Gulf Stream

The Gulf Stream protocol decentralizes transaction caching and forwarding, shifting these responsibilities to the network's edge. This method shortens confirmation times and eases the memory load on validators, ensuring seamless leader transitions and optimizing transaction execution. Gulf Stream's decentralized mempool management model improves network scalability and efficiency.

## Sealevel

Sealevel offers a paradigm-shifting improvement in scalability and performance by allowing transactions to be executed simultaneously across GPUs and SSDs. This hyper-parallelized engine enables Solana to process data at remarkable speeds, opening the door for the development of decentralized finance (DeFi) platforms, non-fungible token (NFT) markets, gaming ecosystems, and other applications.

## Pipelining

Solana's implementation of CPU pipelining and GPU parallelization transforms efficiency and scalability in transaction validation. This clever strategy allows for parallel processing at several stages, significantly improving transaction throughput and overall system performance. Solana's pipeline architecture enables remarkably precise synchronization of the transaction processing phases.

## Cloudbreak

Cloudbreak addresses blockchain scalability by managing simultaneous read and write activities through a horizontally scaled SSD configuration. This architecture ensures effective information retrieval and storage, improving network responsiveness and maximizing resource usage. Cloudbreak is essential to maintaining both performance and scalability in Solana's architecture.

## Archivers

Solana's Archivers method creatively addresses the need for scalable storage by transferring the load of data storage from validators to a dispersed network of nodes. Utilizing erasure coding and lightweight proofs, Solana ensures data security, integrity, and availability, creating a strong basis for the network's expansion and uptake in the decentralized environment.
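The scheduling idea behind Sealevel — run transactions in parallel when they touch disjoint sets of accounts — can be sketched as follows. This is a simplified illustration, not Solana's actual runtime: transactions here declare the accounts they touch up front, and non-conflicting transactions are grouped into batches that could execute concurrently.

```javascript
// Group transactions into batches such that no two transactions in the same
// batch touch the same account — each batch could then run in parallel.
function scheduleBatches(txs) {
  const batches = [];
  for (const tx of txs) {
    // Place the tx in the first batch with no account overlap.
    const batch = batches.find(
      (b) => !b.some((other) => other.accounts.some((a) => tx.accounts.includes(a)))
    );
    if (batch) batch.push(tx);
    else batches.push([tx]);
  }
  return batches;
}

const txs = [
  { id: 't1', accounts: ['alice', 'bob'] },
  { id: 't2', accounts: ['carol'] },       // disjoint from t1 → same batch
  { id: 't3', accounts: ['bob', 'dave'] }, // shares 'bob' with t1 → new batch
];

const batches = scheduleBatches(txs);
console.log(batches.map((b) => b.map((t) => t.id))); // [['t1','t2'], ['t3']]
```

Declaring account access ahead of time is what makes this static scheduling possible; it is the same reason Solana transactions list all accounts they read or write.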
## Conclusion

With its groundbreaking developments, Solana has become the leading web-scale platform and surged to the forefront of blockchain innovation. By combining cutting-edge technologies like Proof of History and Archivers, Solana processes thousands of transactions per second on a globally dispersed network. Solana continues to lead the way, paving the path for the development of decentralized apps and digital economies in the future.

📣📣 Drive innovation with intelligent AI and secure blockchain technology! Check out how we can help your business grow!

[Blockchain App Development](https://www.rapidinnovation.io/service-development/blockchain-app-development-company-in-usa)

[AI Software Development](https://www.rapidinnovation.io/ai-software-development-company-in-usa)

## URLs

* <https://www.rapidinnovation.io/post/leading-web-scale-blockchain-innovations-with-solana>

## Hashtags

#BlockchainInnovation #ScalabilitySolutions #SolanaBlockchain #DecentralizedNetworks #WebScaleTechnology
rapidinnovation
1,901,177
Unlock the Power of JavaScript Map: Don't Confuse it with Array map!
When solving problems that can be optimized with caching, many developers use the Map function....
0
2024-06-26T15:25:54
https://dev.to/rajusaha/unlock-the-power-of-javascript-map-dont-confuse-it-with-array-map-2jhk
webdev, javascript, beginners, programming
When solving problems that can be optimized with caching, many developers reach for JavaScript's built-in `Map`. However, some developers create a `Map` (or even a plain empty object) and then assign properties to it directly. While this approach may seem to work, it can lead to confusion during debugging.

Let's look at an example:

![setting properties on a Map as if it were a plain object](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/rklvv7xfzbpvdtmhuucr.png)

You might expect the map to contain the entries `{ 'fruit' => 'Apple', 'vegetable' => 'Tomato' }`, but instead the Map itself remains empty (its `size` is 0), and the assigned values only appear when the variable is logged as a plain object. This is because the correct way to add key-value pairs to a `Map` is the `set` method. If you try to interact with the map using `has` or `delete`, it will not work as expected:

![checking the entries with Map methods](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/alk7cdprovggcu9h8dq8.png)

The correct approach is to use the `set` and `get` methods:

![correct approach with Map](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ghesgicf20uphmmqei0b.png)

Let's update the map with a new key-value pair and check the output:

![updating an existing entry](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/9czv882uzbt4d4p7bcdt.png)

As you can see, the `set` method correctly updates the existing `fruit` entry. The map now contains two key-value pairs.

To summarize, always use the `set` and `get` methods for adding and retrieving entries in a `Map`. For more information, refer to the [MDN documentation](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Map).
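The behaviour shown in the screenshots above can be reproduced in runnable form (this is standard JavaScript; the updated value `'Mango'` is illustrative, since the exact values in the screenshots are not reproduced here):

```javascript
// Wrong: bracket/dot assignment stores ordinary object properties on the
// Map instance — it does NOT create Map entries.
const wrong = new Map();
wrong['fruit'] = 'Apple';
wrong['vegetable'] = 'Tomato';
console.log(wrong.size);         // 0 — the Map itself is still empty
console.log(wrong.has('fruit')); // false

// Correct: use set/get, which maintain the Map's internal entries.
const right = new Map();
right.set('fruit', 'Apple');
right.set('vegetable', 'Tomato');
console.log(right.size);         // 2
console.log(right.get('fruit')); // 'Apple'

// set() also updates an existing key in place.
right.set('fruit', 'Mango');
console.log(right.get('fruit')); // 'Mango'
console.log(right.size);         // still 2 — no new entry was added
```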
rajusaha
1,901,654
Cursos Harvard Gratuitos: Python, JavaScript, Análise De Dados E Mais!
Harvard está oferecendo mais de 120 cursos gratuitos com certificação, proporcionando uma...
0
2024-06-28T13:38:09
https://guiadeti.com.br/cursos-harvard-gratuitos-python-javascript-outros/
cursogratuito, analisededados, cursosgratuitos, inteligenciaartifici
---
title: "Cursos Harvard Gratuitos: Python, JavaScript, Análise De Dados E Mais!"
published: true
date: 2024-06-26 15:18:22 UTC
tags: CursoGratuito,analisededados,cursosgratuitos,inteligenciaartifici
canonical_url: https://guiadeti.com.br/cursos-harvard-gratuitos-python-javascript-outros/
---

Harvard is offering more than 120 free courses with certification, providing a unique opportunity for personal and professional growth. With a practical, globally accessible methodology, these courses span several fields of knowledge, including 34 courses focused on information technology. Available through the edX platform, the courses cover topics such as Python, JavaScript, Data Analysis, SQL, and artificial intelligence, allowing students to acquire valuable skills at no cost.

## Harvard Courses

Harvard is making more than 120 free certified courses available, providing a unique opportunity for personal and professional growth.

![](https://guiadeti.com.br/wp-content/uploads/2024/06/image-61.png)

_Image from the Harvard page_

Taught with a practical, globally accessible methodology, these courses offer an invaluable platform for anyone looking to expand their knowledge and skills.

### Information Technology Courses

Among the courses offered, 34 are aimed specifically at information technology. These courses are delivered through the edX platform and cover topics such as Python, JavaScript, Data Analysis, SQL, and artificial intelligence. The variety of courses lets students choose the ones that best align with their interests and career goals.
Here is the course list:

- Statistical Inference and Modeling for High-throughput Experiments;
- Introduction to Linear Models and Matrix Algebra;
- Statistics and R;
- Quantitative Methods for Biology;
- Principles, Statistical and Computational Tools for Reproducible Data Science;
- Machine Learning and AI with Python;
- Data Science: R Basics;
- Data Science: Visualization;
- Data Science: Probability;
- Data Science: Inference and Modeling;
- Data Science: Productivity Tools;
- Data Science: Wrangling;
- Data Science: Machine Learning;
- Data Science: Linear Regression;
- Data Science: Capstone;
- CS50: Introduction to Computer Science;
- CS50’s Introduction to Programming with Scratch;
- CS50’s Web Programming with Python and JavaScript;
- CS50’s Introduction to Artificial Intelligence with Python;
- CS50 for Lawyers;
- Introduction to Digital Humanities;
- Using Python for Research;
- CS50’s Introduction to Databases with SQL;
- Digital Humanities in Practice: From Research Questions to Results;
- Introduction to Data Science with Python;
- Case Studies in Functional Genomics;
- Introduction to Bioconductor;
- Advanced Bioconductor;
- High-Dimensional Data Analysis;
- Fundamentals of TinyML;
- Applications of TinyML;
- Deploying TinyML;
- MLOps for Scaling TinyML.

### Flexibility and Variety

The courses are taught in English (both the audio and the subtitles) and vary in length and format depending on the student's choice. Some courses last only a week, while others can extend up to three months, depending on the modules they contain. All courses are 100% online, letting students set their own study schedule and balance learning with other responsibilities.

Take advantage of this opportunity to gain new knowledge and skills with Harvard's free courses, boosting your career and personal development.
## JavaScript

JavaScript is one of the most popular and widely used programming languages in web development. Created in 1995, it lets developers build interactive, dynamic, and responsive web pages. JavaScript is a high-level, interpreted, prototype-based language that has become essential for building modern web applications.

### Advantages of JavaScript

#### Flexibility and Ease of Use

One of JavaScript's main advantages is its flexibility. It can be used on both the client side (frontend) and the server side (backend), thanks to runtimes such as Node.js. This allows developers to use the same language across the whole development stack, simplifying learning and technology integration.

#### Large Community and Support

JavaScript has a huge developer community that contributes a vast amount of resources, libraries, and frameworks, such as React, Angular, and Vue.js, making it easier to solve problems, share knowledge, and access tools that speed up application development.

#### High Performance and Interactivity

JavaScript runs directly in the user's browser, enabling highly interactive and responsive interfaces. The ability to manipulate the DOM (Document Object Model) in real time makes it possible to build dynamic effects, instant form validation, and content updates without reloading the page.

### Challenges of JavaScript

#### Complexity and Scalability

Although JavaScript is a powerful language, it can become complex and hard to maintain as an application grows. The lack of static typing can lead to errors that are difficult to detect, and the language's asynchronous nature can complicate the management of control flow.
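As a small illustration of the prototype-based object model mentioned above (a generic sketch, not taken from any of the courses): objects delegate directly to other objects rather than to classes.

```javascript
// A plain object acting as a prototype: other objects can delegate to it.
const animal = {
  describe() {
    return `${this.name} makes a ${this.sound} sound`;
  }
};

// Object.create builds a new object whose prototype is `animal`,
// so property lookups fall through to it.
const dog = Object.create(animal);
dog.name = 'Rex';
dog.sound = 'barking';

console.log(dog.describe());                        // "Rex makes a barking sound"
console.log(Object.getPrototypeOf(dog) === animal); // true
```

ES2015 `class` syntax is sugar over this same delegation mechanism, which is why `instanceof` and method lookup still walk the prototype chain.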
Tools such as TypeScript and architectural frameworks help mitigate these challenges, but they require additional learning and implementation effort.

#### Security

Because it runs on the client side, JavaScript is exposed to several security vulnerabilities, such as cross-site scripting (XSS) and code-injection attacks. Developers need to stay constantly vigilant and follow security best practices to protect their applications against these threats.

## Harvard

Harvard University, located in Cambridge, Massachusetts, is one of the oldest and most prestigious higher-education institutions in the United States. Founded in 1636, Harvard was named after its first benefactor, John Harvard. Over the centuries, the university has grown and developed into a global icon of academic excellence and research.

### Academic Excellence

Harvard offers a wide range of academic programs across many fields of study, including arts and humanities, social sciences, natural sciences, engineering, and medicine. With 13 schools and institutes, the university provides a high-quality education aimed at training leaders across many disciplines.

### Research and Innovation

Harvard is a cutting-edge research hub, with numerous laboratories and research centers dedicated to solving the world's most pressing problems. The university invests heavily in areas such as biotechnology, artificial intelligence, social sciences, and public policy, fostering a culture of innovation and discovery.

### Impact and Legacy

Harvard has an impressive legacy of alumni who have become influential leaders in many fields, including US presidents, prime ministers, Nobel laureates, successful entrepreneurs, and many others. Harvard's alumni network is one of the most powerful in the world, offering invaluable opportunities for networking and collaboration.
## Enrollment link ⬇️

[Enrollment in Harvard's free courses](https://pll.harvard.edu/catalog) must be completed on the Harvard University website.

## Share and inspire others with the history and global impact of Harvard University!

Enjoyed this content about the free courses? Then share it with everyone!

The post [Cursos Harvard Gratuitos: Python, JavaScript, Análise De Dados E Mais!](https://guiadeti.com.br/cursos-harvard-gratuitos-python-javascript-outros/) appeared first on [Guia de TI](https://guiadeti.com.br).
guiadeti
1,901,440
Spring Boot vs Quarkus: Pick one for Java!
This analysis is based on another one I personally am fond of and will cover their features,...
0
2024-06-26T15:18:02
https://dev.to/zoltan_fehervari_52b16d1d/spring-boot-vs-quarkus-pick-one-for-java-4mgl
java, springboot, quarkus, javaframework
This [analysis](https://bluebirdinternational.com/spring-boot-vs-quarkus/) is based on another one I personally am fond of and will cover their features, performance, and how they fit into modern application development.

**Spring Boot** simplifies Java web development with a host of out-of-the-box features for building microservice applications. It’s widely embraced for its robust ecosystem and extensive community support.

**Quarkus** is designed as a Kubernetes-native Java framework, making it ideal for cloud environments. It’s notable for its efficient performance and rapid startup times, thanks to ahead-of-time compilation.

## Feature Comparison

**Spring Boot Features:**

- Inversion of Control (IoC) container for managing components
- Integration with AspectJ for aspect-oriented programming
- Comprehensive MVC framework for web applications
- Extensive data access and batch processing capabilities
- Strong transaction management and security features

**Quarkus Features:**

- Kubernetes-native functionality for optimized cloud performance
- Smaller runtime footprint and reduced artifact size
- Enhanced startup performance ideal for microservices
- Developer-friendly with live coding and hot reload capabilities

## Performance Metrics

Quarkus often leads in startup time and memory usage, making it suitable for environments where resources are at a premium. Spring Boot provides robustness and is favored for complex, enterprise-level applications due to its mature environment and feature completeness.

## Development Experience

Both frameworks offer dynamic development modes to increase developer productivity:

- Spring Boot excels with its comprehensive development tools and plugins.
- Quarkus offers a good development mode with real-time coding adjustments.

## Pick one please

Opt for Spring Boot if you need a proven framework with strong support for diverse application needs. Choose Quarkus for highly scalable, efficient applications optimized for the cloud.
zoltan_fehervari_52b16d1d
1,901,439
Weaving Your Cloud Together: Serverless Orchestration with AWS EventBridge
Weaving Your Cloud Together: Serverless Orchestration with AWS EventBridge In the...
0
2024-06-26T15:17:45
https://dev.to/virajlakshitha/weaving-your-cloud-together-serverless-orchestration-with-aws-eventbridge-g2i
![usecase_content](https://cdn-images-1.medium.com/proxy/1*zqfBK-ivKOyE5TLv4mHkkA.png)

# Weaving Your Cloud Together: Serverless Orchestration with AWS EventBridge

In the ever-evolving landscape of cloud computing, serverless architectures have emerged as a powerful paradigm, allowing developers to build and run applications without the burden of managing infrastructure. At the heart of this paradigm lies the need for seamless communication and orchestration between various services. This is where AWS EventBridge shines, acting as a versatile nervous system for your serverless applications and beyond.

### What is AWS EventBridge?

AWS EventBridge is a fully managed serverless event bus service that facilitates event-driven architectures on AWS. It provides a centralized platform for routing events from various sources, including AWS services, SaaS applications, and custom applications. EventBridge decouples event producers from consumers, enabling asynchronous and scalable communication.

### Key Components of EventBridge

* **Event Buses:** The core of EventBridge, event buses provide a central pipeline for ingesting, filtering, and routing events.
* **Event Sources:** These represent the origins of events. EventBridge supports a wide range of sources, including AWS services (e.g., S3, DynamoDB, Lambda), Software as a Service (SaaS) applications, and your own custom applications.
* **Rules:** Acting as the brains of the operation, rules determine how events are processed. They match incoming events based on patterns and then route them to designated targets for action.
* **Targets:** Targets represent the destinations where events are sent for processing. EventBridge supports numerous targets like AWS Lambda functions, SNS topics, SQS queues, and even other event buses.

### EventBridge Use Cases: Going Beyond the Basics

Here are five use cases that showcase the power and versatility of AWS EventBridge in real-world scenarios:

**1. Real-time Data Processing and Analytics**

Imagine a scenario where you need to process and analyze streaming data from various sources in real time. EventBridge can act as the central nervous system for this data pipeline.

* **Scenario:** An e-commerce platform needs to track user behavior, analyze product trends, and update inventory levels in real time.
* **Solution:** EventBridge can ingest events from various sources like website clicks, order placements, and inventory updates. Rules can be configured to filter and route events to different targets. For instance, events related to user behavior can be sent to a Kinesis stream for real-time analytics using tools like Amazon Kinesis Data Analytics or Apache Spark, while order events can trigger inventory updates in a DynamoDB table.

**2. Automating Infrastructure Management**

EventBridge excels at automating infrastructure provisioning and management tasks, reducing manual effort and potential errors.

* **Scenario:** A development team requires an automated process for deploying new microservices and their dependencies within your AWS environment.
* **Solution:** Leverage AWS CodePipeline to orchestrate your CI/CD pipeline. When a new code commit is detected, CodePipeline can trigger an event that EventBridge captures. A predefined EventBridge rule can then trigger an AWS Lambda function to automatically provision the necessary resources (EC2 instances, ECS tasks, etc.) based on infrastructure-as-code definitions (e.g., AWS CloudFormation templates).

**3. Building Responsive Serverless Workflows**

EventBridge is a natural fit for orchestrating serverless workflows, enabling you to break down complex processes into smaller, manageable functions.

* **Scenario:** An online image editing application needs to process uploaded images through a series of transformations, such as resizing, format conversion, and watermarking.
* **Solution:** Each image upload event can trigger an AWS Lambda function for the initial processing. Subsequent transformations can be triggered by events published to EventBridge upon successful completion of the previous step. This creates a responsive, event-driven workflow without the need for a monolithic application.

**4. Simplifying Application Integration**

Modern applications often rely on a multitude of third-party services. EventBridge acts as a universal translator, simplifying integration and enabling seamless data flow.

* **Scenario:** A company wants to synchronize customer data between its CRM system (e.g., Salesforce) and its marketing automation platform (e.g., Marketo).
* **Solution:** Utilize a SaaS integration service like Zapier or Workato that integrates with EventBridge. Configure the integration to capture events from the CRM, such as new customer registrations or updates to existing customer profiles. EventBridge can then route these events to the marketing automation platform, ensuring data consistency across both systems.

**5. Building Event-Driven Security and Monitoring**

EventBridge plays a vital role in enhancing security posture and enabling proactive monitoring.

* **Scenario:** An organization wants to implement real-time security monitoring and automated incident response across its AWS environment.
* **Solution:** Configure EventBridge to receive events from various AWS security services, such as AWS CloudTrail, AWS Security Hub, and AWS GuardDuty. Define rules that trigger alerts or automated remediation actions based on specific security events. For example, an event indicating unauthorized API calls can trigger a Lambda function to automatically block the suspicious IP address.

### The Ecosystem: Alternatives and Comparisons

While AWS EventBridge excels in the AWS ecosystem, it's not the only player in the event-driven orchestration space. Here's a look at notable alternatives and their strengths:

* **Azure Event Grid:** Microsoft Azure's eventing service, offering robust integration with Azure services and some external sources.
* **Google Cloud Pub/Sub:** A scalable messaging middleware service that can be adapted for event-driven architectures within the Google Cloud Platform.
* **Kafka:** A powerful open-source distributed streaming platform well-suited for high-throughput, fault-tolerant event streaming.

Each of these options has its own nuances and strengths. The choice often comes down to factors like existing cloud platform commitments, specific feature requirements, and scalability needs.

### Conclusion

AWS EventBridge is a game-changer for building event-driven architectures. Its ability to seamlessly connect services and applications, automate workflows, and provide real-time insights makes it an invaluable tool for modern cloud-native applications. As the serverless landscape continues to mature, EventBridge is poised to play an even more central role in enabling agile, responsive, and scalable cloud solutions.

### Advanced Use Case: Building a Multi-Region Disaster Recovery System with EventBridge

**The Challenge:** Imagine you're running a mission-critical application with stringent uptime requirements. A simple outage in a single AWS region could have significant financial and operational repercussions. You need a disaster recovery (DR) solution that ensures minimal downtime and data loss.

**The EventBridge Solution:** We can build a powerful multi-region DR system leveraging the orchestration capabilities of EventBridge in conjunction with other AWS services.

**Architecture:**

1. **Primary Region:** Your primary application runs in this region, with its data stored in services like Amazon RDS, DynamoDB, or S3.
2. **Secondary (DR) Region:** This region houses a near-real-time replica of your primary application environment.
3. **Data Replication:** Utilize AWS replication features like Amazon RDS cross-region read replicas, DynamoDB Global Tables, or S3 Cross-Region Replication to keep your data synchronized between the primary and DR regions.
4. **EventBridge Monitoring:** Configure EventBridge in your primary region to monitor for events that signal potential disruptions, such as:
   * AWS Health events indicating issues within the primary region
   * Custom application metrics crossing predefined thresholds (e.g., high error rates, latency spikes)
5. **Automated Failover:** Upon detecting critical events, EventBridge triggers a Lambda function. This function initiates the DR process by:
   * Rerouting traffic from the primary to the secondary region using services like AWS Global Accelerator or Route 53.
   * Promoting the standby database replicas in the DR region to become the primary data stores.
   * Scaling up resources in the DR region to handle the increased traffic load using AWS Auto Scaling.
6. **Continuous Replication:** Data replication continues between regions, ensuring minimal data loss.
7. **Automated Recovery:** When the primary region recovers, EventBridge can trigger another set of Lambda functions to:
   * Resynchronize data between regions.
   * Switch traffic back to the primary region when it's stable.

**Benefits:**

* **Reduced Downtime:** Event-driven failover minimizes the time it takes to switch to the DR environment.
* **Data Protection:** Continuous replication minimizes data loss in case of a regional outage.
* **Automation:** EventBridge automates the entire DR process, reducing manual intervention and potential errors.
* **Cost-Effectiveness:** You pay for resources in the DR region only when they are actively being used.

This advanced use case showcases how AWS EventBridge, in conjunction with a well-architected multi-region strategy, provides a robust solution for achieving high availability and disaster recovery for critical applications.
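The rule-and-target model at the heart of EventBridge can be approximated in a few lines of plain JavaScript. This is a deliberately simplified sketch (no AWS SDK; real EventBridge event patterns also support nested fields, prefix and numeric matching, and more): here a rule's pattern maps each field to a list of accepted values, and matching events are delivered to the rule's target.

```javascript
// Minimal in-memory event bus approximating EventBridge semantics:
// rules match events against patterns; matching events go to targets.
class MiniEventBus {
  constructor() {
    this.rules = [];
  }

  // Simplified EventBridge-style pattern: each key maps to an array of
  // accepted values for that field.
  addRule(pattern, target) {
    this.rules.push({ pattern, target });
  }

  matches(pattern, event) {
    return Object.entries(pattern).every(
      ([field, allowed]) => allowed.includes(event[field])
    );
  }

  putEvent(event) {
    for (const { pattern, target } of this.rules) {
      if (this.matches(pattern, event)) target(event);
    }
  }
}

const bus = new MiniEventBus();
const alerts = [];

// Rule: route security findings (hypothetical source name for illustration)
// to an alerting target.
bus.addRule({ source: ['aws.guardduty'] }, (e) => alerts.push(e.detailType));

bus.putEvent({ source: 'aws.guardduty', detailType: 'UnauthorizedAccess' });
bus.putEvent({ source: 'aws.s3', detailType: 'ObjectCreated' });

console.log(alerts); // only the GuardDuty event was routed
```

The value of the real service is everything this sketch omits: durable delivery, retries, cross-account buses, and the large catalog of built-in sources and targets.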
virajlakshitha
1,901,437
How to Add Custom CSS to Plain HTML Generated by ShowdownJS Using ShowdownCSS
How to Add Custom CSS to Plain HTML Generated by ShowdownJS Using...
0
2024-06-26T15:16:18
https://article.shade.cool/p/28?v=2
showdowncss, showdownjs
# How to Add Custom CSS to Plain HTML Generated by ShowdownJS Using ShowdownCSS

> https://article.shade.cool/p/28

Markdown is a popular markup language among developers for its simplicity and ease of use. However, turning Markdown into well-styled HTML often requires additional CSS to make it visually appealing. ShowdownJS is a powerful JavaScript library that converts Markdown to HTML. ShowdownCSS is a CSS library designed to enhance the styling of HTML generated by ShowdownJS. In this article, we'll explore how to add custom CSS to plain HTML generated by ShowdownJS using ShowdownCSS.

## Prerequisites

Before we get started, ensure you have the following:

1. Basic understanding of HTML, CSS, and JavaScript.
2. A text editor or IDE (such as VSCode).
3. A web browser to test your HTML files.

## Step 1: Setting Up Your Project

Create a new directory for your project and open it in your text editor. Inside this directory, create an `index.html` file and a `styles.css` file.

## Step 2: Include ShowdownJS and ShowdownCSS

To use ShowdownJS and ShowdownCSS, you need to include them in your HTML file. You can do this by adding the following CDN links to the `<head>` section of your `index.html` file:

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8">
  <meta name="viewport" content="width=device-width, initial-scale=1.0">
  <title>Markdown to HTML with ShowdownJS and ShowdownCSS</title>
  <link rel="stylesheet" href="https://cdn.jsdelivr.net/gh/SH20RAJ/ShowdownCSS@main/showdown.css">
  <script src="https://cdn.jsdelivr.net/npm/showdown@1.9.1/dist/showdown.min.js"></script>
  <link rel="stylesheet" href="styles.css">
</head>
<body>
  <div class="showdowncontainer" id="content"></div>
  <script src="script.js"></script>
</body>
</html>
```

In this example, we include the CDN for ShowdownJS and ShowdownCSS, as well as our custom `styles.css` for additional styling.

## Step 3: Write Your Markdown Content

Create a new file named `content.md` and add your Markdown content to it. For example:

````markdown
# Hello, Markdown!

This is **bold** text and this is *italic* text.

## Lists

- Item 1
- Item 2
- Item 3

> This is a blockquote.

```javascript
function hello() {
  console.log("Hello, World!");
}
```
````

## Step 4: Convert Markdown to HTML Using ShowdownJS

Create a new file named `script.js` and add the following JavaScript code to convert the Markdown content to HTML using ShowdownJS:

```javascript
document.addEventListener('DOMContentLoaded', function () {
  fetch('content.md')
    .then(response => response.text())
    .then(markdown => {
      const converter = new showdown.Converter();
      const html = converter.makeHtml(markdown);
      document.getElementById('content').innerHTML = html;
    });
});
```

This script fetches the Markdown content from `content.md`, converts it to HTML using ShowdownJS, and inserts the generated HTML into the `div` with the id `content` (which carries the `showdowncontainer` class).

## Step 5: Adding Custom CSS

Open your `styles.css` file and add your custom styles. These styles will be applied on top of the styles provided by ShowdownCSS. For example:

```css
/* Custom styles for Markdown content */
.showdowncontainer {
  font-family: Arial, sans-serif;
  line-height: 1.6;
}

.showdowncontainer h1 {
  color: #3498db;
}

.showdowncontainer blockquote {
  border-left: 4px solid #3498db;
  padding-left: 10px;
  color: #7f8c8d;
}

.showdowncontainer pre {
  background: #f4f4f4;
  padding: 10px;
  border-radius: 5px;
}

.showdowncontainer code {
  font-family: 'Courier New', Courier, monospace;
  background: #f4f4f4;
  padding: 2px 4px;
  border-radius: 3px;
}
```

These custom styles will enhance the appearance of your Markdown content. You can modify these styles or add more to suit your design preferences.

## Step 6: Testing Your Setup

Open the `index.html` file in your web browser. You should see your Markdown content converted to HTML and styled according to both ShowdownCSS and your custom CSS.

## Conclusion

By combining ShowdownJS and ShowdownCSS, you can easily convert Markdown to well-styled HTML. Adding custom CSS allows you to further customize the appearance of your content to match your project's design. This setup provides a flexible and powerful way to manage and style Markdown content on your web pages.
sh20raj
1,901,438
How to Add Custom CSS to Plain HTML Generated by ShowdownJS Using ShowdownCSS
How to Add Custom CSS to Plain HTML Generated by ShowdownJS Using ShowdownCSS Markdown is...
0
2024-06-26T15:15:54
https://article.shade.cool/p/28
showdowncss, showdownjs
# How to Add Custom CSS to Plain HTML Generated by ShowdownJS Using ShowdownCSS Markdown is a popular markup language among developers for its simplicity and ease of use. However, turning Markdown into well-styled HTML often requires additional CSS to make it visually appealing. ShowdownJS is a powerful JavaScript library that converts Markdown to HTML. ShowdownCSS is a CSS library designed to enhance the styling of HTML generated by ShowdownJS. In this article, we'll explore how to add custom CSS to plain HTML generated by ShowdownJS using ShowdownCSS. ## Prerequisites Before we get started, ensure you have the following: 1. Basic understanding of HTML, CSS, and JavaScript. 2. A text editor or IDE (such as VSCode). 3. A web browser to test your HTML files. ## Step 1: Setting Up Your Project Create a new directory for your project and open it in your text editor. Inside this directory, create an `index.html` file and a `styles.css` file. ## Step 2: Include ShowdownJS and ShowdownCSS To use ShowdownJS and ShowdownCSS, you need to include them in your HTML file. You can do this by adding the following CDN links to the `<head>` section of your `index.html` file: ```html <!DOCTYPE html> <html lang="en"> <head> <meta charset="UTF-8"> <meta name="viewport" content="width=device-width, initial-scale=1.0"> <title>Markdown to HTML with ShowdownJS and ShowdownCSS</title> <link rel="stylesheet" href="https://cdn.jsdelivr.net/gh/SH20RAJ/ShowdownCSS@main/showdown.css"> <script src="https://cdn.jsdelivr.net/npm/showdown@1.9.1/dist/showdown.min.js"></script> <link rel="stylesheet" href="styles.css"> </head> <body> <div class="showdowncontainer" id="content"></div> <script src="script.js"></script> </body> </html> ``` In this example, we include the CDN for ShowdownJS and ShowdownCSS, as well as our custom `styles.css` for additional styling. ## Step 3: Write Your Markdown Content Create a new file named `content.md` and add your Markdown content to it. 
For example: ```markdown # Hello, Markdown! This is **bold** text and this is *italic* text. ## Lists - Item 1 - Item 2 - Item 3 > This is a blockquote. ```javascript function hello() { console.log("Hello, World!"); } ``` ## Step 4: Convert Markdown to HTML Using ShowdownJS Create a new file named `script.js` and add the following JavaScript code to convert the Markdown content to HTML using ShowdownJS: ```javascript document.addEventListener('DOMContentLoaded', function () { fetch('content.md') .then(response => response.text()) .then(markdown => { const converter = new showdown.Converter(); const html = converter.makeHtml(markdown); document.getElementById('content').innerHTML = html; }); }); ``` This script fetches the Markdown content from `content.md`, converts it to HTML using ShowdownJS, and inserts the generated HTML into the `div` with the class `.showdowncontainer`. ## Step 5: Adding Custom CSS Open your `styles.css` file and add your custom styles. These styles will be applied on top of the styles provided by ShowdownCSS. For example: ```css /* Custom styles for Markdown content */ .showdowncontainer { font-family: 'Arial, sans-serif'; line-height: 1.6; } .showdowncontainer h1 { color: #3498db; } .showdowncontainer blockquote { border-left: 4px solid #3498db; padding-left: 10px; color: #7f8c8d; } .showdowncontainer pre { background: #f4f4f4; padding: 10px; border-radius: 5px; } .showdowncontainer code { font-family: 'Courier New', Courier, monospace; background: #f4f4f4; padding: 2px 4px; border-radius: 3px; } ``` These custom styles will enhance the appearance of your Markdown content. You can modify these styles or add more to suit your design preferences. ## Step 6: Testing Your Setup Open the `index.html` file in your web browser. You should see your Markdown content converted to HTML and styled according to both ShowdownCSS and your custom CSS. 
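It can help to see, in miniature, the kind of transformation ShowdownJS performs. The sketch below is a deliberately naive illustration of my own — it is **not** Showdown's actual implementation, and it handles only headings, bold, and italic:

```javascript
// Illustration only: a tiny, naive Markdown-to-HTML converter for a few
// constructs, to show the kind of transformation ShowdownJS performs.
// This is NOT Showdown's real implementation.
function miniMarkdown(md) {
  return md
    .split(/\n{2,}/) // split into blocks on blank lines
    .map((block) => {
      // "# Heading" -> <h1>Heading</h1> (up to h6)
      const heading = block.match(/^(#{1,6})\s+(.*)$/);
      if (heading) {
        const level = heading[1].length;
        return `<h${level}>${heading[2]}</h${level}>`;
      }
      // inline bold and italic
      const inline = block
        .replace(/\*\*(.+?)\*\*/g, "<strong>$1</strong>")
        .replace(/\*(.+?)\*/g, "<em>$1</em>");
      return `<p>${inline}</p>`;
    })
    .join("\n");
}

console.log(miniMarkdown("# Hello\n\nThis is **bold** and *italic*."));
// → <h1>Hello</h1>
//   <p>This is <strong>bold</strong> and <em>italic</em>.</p>
```

Showdown itself handles far more (lists, blockquotes, fenced code, nesting, edge cases), which is exactly why you would use the library rather than regexes like these.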
## Conclusion By combining ShowdownJS and ShowdownCSS, you can easily convert Markdown to well-styled HTML. Adding custom CSS allows you to further customize the appearance of your content to match your project's design. This setup provides a flexible and powerful way to manage and style Markdown content on your web pages.
sh20raj
1,901,436
Dive into the Fascinating World of Information Retrieval! 🔍
Comprehensive course covering the core concepts and techniques of information retrieval, including text processing, indexing, ranking, and evaluation. Hands-on experience in building and evaluating information retrieval systems.
27,844
2024-06-26T15:15:28
https://getvm.io/tutorials/information-retrieval-spring-2018-eth-zurich
getvm, programming, freetutorial, universitycourses
As someone deeply passionate about search engines, data mining, and natural language processing, I'm thrilled to share with you an incredible resource that has truly transformed my understanding of these fields. Prepare to embark on a comprehensive journey through the core concepts and techniques of information retrieval! ## Comprehensive Coverage of Information Retrieval 📚 This course covers the fundamental principles and algorithms that power the search engines we rely on every day. From text processing and indexing to ranking and evaluation, you'll gain a solid foundation in the inner workings of information retrieval systems. ## Hands-on Experience 🖥️ But it's not just about the theory – this course also provides ample opportunities to put your knowledge into practice. You'll have the chance to build and evaluate your own information retrieval systems, giving you invaluable hands-on experience that will prepare you for real-world applications. ## Exposure to the Latest Developments 🔬 The field of information retrieval is constantly evolving, and this course ensures you stay ahead of the curve. You'll be introduced to the latest research and developments, keeping you at the forefront of this exciting and rapidly-changing domain. ## Recommended for Aspiring Professionals 👨‍💻 Whether you're a student or a seasoned professional, this course is a must-have for anyone interested in search engines, data mining, and natural language processing. It will provide you with the solid foundation you need to excel in these fields and open up a world of opportunities. So, what are you waiting for? Dive into the fascinating world of information retrieval by checking out the [YouTube playlist](https://www.youtube.com/playlist?list=PLzn6LN6WhlN1ktkDvNurPSDwTQ_oGQisn) and start your journey today! 
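The core ideas the course builds on — indexing and ranking — can be sketched in a few lines. This is my own minimal illustration (an inverted index with plain term-frequency scoring, no tokenization niceties, stemming, or IDF weighting), not code from the course:

```javascript
// Minimal sketch: build an inverted index, then rank documents for a query
// by summed term frequency. Real IR systems add stemming, IDF, and more.
function buildIndex(docs) {
  const index = new Map(); // term -> Map(docId -> count)
  docs.forEach((text, docId) => {
    for (const term of text.toLowerCase().split(/\W+/).filter(Boolean)) {
      if (!index.has(term)) index.set(term, new Map());
      const postings = index.get(term);
      postings.set(docId, (postings.get(docId) || 0) + 1);
    }
  });
  return index;
}

function search(index, query) {
  const scores = new Map(); // docId -> summed term frequency
  for (const term of query.toLowerCase().split(/\W+/).filter(Boolean)) {
    for (const [docId, count] of index.get(term) || []) {
      scores.set(docId, (scores.get(docId) || 0) + count);
    }
  }
  // highest score first
  return [...scores.entries()].sort((a, b) => b[1] - a[1]).map(([d]) => d);
}

const docs = [
  "information retrieval systems rank documents",
  "search engines use inverted indexes",
  "retrieval quality is measured by evaluation metrics",
];
console.log(search(buildIndex(docs), "retrieval evaluation"));
// → [ 2, 0 ]
```

The course covers how production systems refine exactly this skeleton — better tokenization, TF-IDF and beyond for ranking, and evaluation metrics for judging the results.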
🚀 ## Supercharge Your Learning with GetVM Playground 🚀 If you're eager to dive deeper into the world of information retrieval, natural language processing, and search engine optimization, I highly recommend exploring the GetVM Playground. This powerful browser extension for Google Chrome provides an online coding environment that allows you to seamlessly apply the concepts you've learned and put them into practice. The GetVM Playground offers a unique and interactive way to engage with the course content. Rather than just passively watching the videos, you can actively experiment with the techniques and algorithms covered in the lectures. By coding directly in the browser, you'll gain hands-on experience in building and evaluating information retrieval systems, solidifying your understanding and preparing you for real-world applications. The [GetVM Playground for the Information Retrieval course](https://getvm.io/tutorials/information-retrieval-spring-2018-eth-zurich) provides a self-contained environment where you can easily access the course materials, experiment with the code, and receive instant feedback on your progress. This integrated approach to learning ensures that you don't just memorize the theory but truly internalize the practical skills needed to excel in this field. So, why not take your learning to the next level and dive into the GetVM Playground? With its user-friendly interface and seamless integration with the course content, you'll be able to unlock the full potential of this comprehensive information retrieval resource and accelerate your journey towards becoming a search engine expert. 🔍 --- ## Practice Now! 
- 🔗 Visit [Information Retrieval | Search Engine Optimization | Natural Language Processing](https://www.youtube.com/playlist?list=PLzn6LN6WhlN1ktkDvNurPSDwTQ_oGQisn) original website - 🚀 Practice [Information Retrieval | Search Engine Optimization | Natural Language Processing](https://getvm.io/tutorials/information-retrieval-spring-2018-eth-zurich) on GetVM - 📖 Explore More [Free Resources on GetVM](https://getvm.io/explore) Join our [Discord](https://discord.gg/XxKAAFWVNu) or tweet us [@GetVM](https://x.com/getvmio) ! 😄
getvm
1,900,438
Let's make a custom, AI generated React component based on user data!
Imagine visiting a website where every page feels like it was designed just for you, where each piece...
0
2024-06-26T15:15:04
https://dev.to/shiwaforce/lets-make-a-custom-ai-generated-react-component-based-on-user-data-47l2
webdev, react, nextjs, ai
Imagine visiting a website where every page feels like it was designed just for you, where each piece of content speaks to your interests and preferences. In 2024, this dream is becoming a reality as we strive to offer users a unique, personalized experience. After all, who wouldn't love a digital space that seems to understand and cater to their individual tastes and needs? ## What are we going to build? We will create a web application in which a personalized installment payment calculator appears after Github login, based on the Github bio of the person who has just logged in. The appearance of the card and the subject of the installment payment itself will be personalized. All of this is done by using the Vercel AI SDK to stream the component from the server. Let's take a look at what technologies we will use to implement this small example application: - [Next.js](https://nextjs.org/) - [Auth.js](https://authjs.dev/) - [Vercel AI SDK](https://sdk.vercel.ai/docs/introduction) - [Tailwind CSS](https://tailwindcss.com/) - [Biome](https://biomejs.dev/) So, as usual, we will use only the latest web technologies. During the implementation, we will carefully go through all the elements of the technology stack and look at exactly what we need and why we need it. But before we dive into the implementation, let's take a look at what we're going to build. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/2b4h31b2lvb2ctj36239.png) Okay, in the image you can see that this component was probably generated for a Github user who is interested in a sustainable lifestyle (which we all should be interested in anyway). Stay with me and let's go through the steps of making this app. Along the way, we'll learn how to use the individual solutions, so that you can later integrate this functionality into your own products. ## The implementation Let's get into it! First, let's create a Next.js application.
Follow along with me: ``` npx create-next-app@latest ``` Leave the default settings, except for Eslint, because we will use something else instead. We will be using Biomejs instead of Eslint and Prettier, even if [the automatic Tailwind class sorting is not yet fully resolved](https://github.com/biomejs/biome/pull/1362). So, my configuration for starting the project looks like this: ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/o2kj3vlthnk0gwf8fwtf.png) You might have spotted that the project has a slightly familiar name. There are going to be a few buzzwords you might recognize, please don't take these too seriously! Adding a buzzword like BNPL (Buy Now Pay Later) can be a fun touch, even though [Apple has discontinued this service](https://www.cbsnews.com/news/apple-pay-later-service-discontinued-bnpl-plans/). Let's open our project in our favorite IDE. Our first step will be to install Biome. But what is Biome? Biome JS is used to optimize JavaScript and TypeScript code. Its functions include code formatting, linting, and speeding up development processes. It replaces both Eslint and Prettier for us, all faster. There is a very good picture that I came across on X recently: ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ud8ul539uk74cz4kk7uv.png) That sums things up nicely, I think. Let's install Biome in our editor. There is also a simple guide [HERE](https://biomejs.dev/guides/getting-started/). The next step is to run the following command in our project: ``` npm install --save-dev --save-exact @biomejs/biome ``` Then we continue by creating the config file: ``` npx @biomejs/biome init ``` This command creates a file for us in which we can make various configuration settings.
We can leave it at the default settings, but I modified the formatting a bit: ```json { "$schema": "https://biomejs.dev/schemas/1.8.2/schema.json", "organizeImports": { "enabled": true }, "linter": { "enabled": true, "rules": { "recommended": true } }, "formatter": { "indentStyle": "space", "indentWidth": 2 } } ``` After that, it is advisable to configure our IDE to format on every save. [HERE](https://biomejs.dev/guides/getting-started/) we can download the plugin that suits us, then set it up in our IDE so that formatting takes place on the fly. For example, in Webstorm the configuration looks like this: ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/eetzudkkr1lzvecvf0ro.png) If we got this far, great: we can already enjoy fast formatting with Biome, and for this we only had to pull in one dependency. The next step will be to install Auth.js. Here too, we will have a fairly simple task, as the [new Auth.js documentation is very handy](https://authjs.dev/getting-started/installation?framework=next.js). We will use this package to authenticate the user with their Github profile. Of course, other Oauth 2 providers could be used, but Github provides us with one of the simplest APIs out there, and probably every one of us has a Github profile as a developer. Our command is: ``` npm install next-auth@beta ``` And then this: ``` npx auth secret ``` This will be our secret, which will be used by Auth.js. After issuing this command, we will receive the message that the secret is ready, and we copy it into our _.env_ file, which of course we must create first. So .env: ``` AUTH_SECRET=<your_secret> ``` Then we proceed according to the official documentation.
In the root of our project, we need to create an auth.ts file with the following content: ```javascript import NextAuth from "next-auth" export const { handlers, signIn, signOut, auth } = NextAuth({ providers: [], }) ``` Then create this file at the specified location: ``` ./app/api/auth/[...nextauth]/route.ts ``` Fill it with this content: ```javascript import { handlers } from "@/auth" // Referring to the auth.ts we just created export const { GET, POST } = handlers ``` Next, we set up Github as an [OAuth provider](https://authjs.dev/getting-started/authentication/oauth) so that the user can access our application with his or her account. As we have seen, the providers array is currently empty. In order to be able to authenticate our user with Github Oauth, we need to make some settings within Github. To do this (if you are already logged in to Github) go to [THIS](https://github.com/settings/developers) link. Then click on New Oauth App. We are greeted by this screen, which we fill out as shown in the picture: ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/e9gltfh8047ynnlzg4ds.png) After clicking on Register application, we can access the Client ID and generate the Client Secret. Let's do this and copy these into our .env file in the form below: ``` AUTH_GITHUB_ID=<your_client_id> AUTH_GITHUB_SECRET=<your_client_secret> ``` Then, modify the auth.ts file to include the Github login integration: ```javascript import NextAuth from "next-auth" import GitHub from "next-auth/providers/github" export const { handlers, signIn, signOut, auth } = NextAuth({ providers: [GitHub], }) ``` We've had it pretty simple so far, haven't we? Now let's implement this on the application interface as well. We will need a Login button and an action behind the button, where the user authenticates with a button press.
app/page.tsx will look like this: ```javascript import { signIn } from "@/auth"; export default async function Home() { return ( <main className="flex min-h-screen flex-col items-center justify-between p-24"> <div className="z-10 w-full max-w-5xl items-center justify-between text-sm lg:flex"> <form action={async () => { "use server"; await signIn(); }} > <button type="submit">Sign in</button> </form> </div> </main> ); } ``` What is going on here? We simply create a form that, with the "use server" directive of Next.js, calls a login interface provided by Auth.js when the form is submitted. Furthermore, it is advisable to remove the frills inserted by the Next.js basic template from globals.css, so that only these lines remain in that file: ```css @tailwind base; @tailwind components; @tailwind utilities; ``` After making these changes, start the application (if you haven't already done so) with the "dev" script in package.json. Our application is now running on http://localhost:3000 and we see a "Sign in" label (which is actually a button). This is already very good progress. Modify page.tsx as follows: ```javascript import { auth, signIn } from "@/auth"; export default async function Home() { const session = await auth(); return ( <main className="flex min-h-screen flex-col items-center justify-between p-24"> <div className="flex max-w-5xl items-center justify-center text-sm"> {session ? ( <p>User logged in</p> ) : ( <form action={async () => { "use server"; await signIn(); }} > <button className="p-3 border rounded-md hover:bg-gray-50" type="submit" > Sign in </button> </form> )} </div> </main> ); } ``` Here, when the session becomes available from Auth.js, we show that the user is logged in, and if the user is not logged in, we display our login form. Also, we added a bit of styling so our site looks like something, but that's not the main focus here. Don't log in just yet; we will make some modifications to the auth flow later.
So what will be the next step? Let's just remember what our goal is: an interface that uses AI to generate a dynamic installment calculator based on the currently logged-in user's Github bio. What else do we need for this? - We need to be able to request the bio of the currently logged-in user from Github. - We need a component that contains the slider itself, where the user can set how many months he or she wants to pay for the item to be purchased. - We'll need Vercel's AI SDK to stream the component. - We need the package called "ai", with which we can also generate text. - We need an OpenAI API key. Let's start from the end of the list. As usual, we need an API key, which we can generate [HERE](https://platform.openai.com/api-keys). Once you have it, put it in the .env file: ``` OPENAI_API_KEY=<your_api_key> ``` Now let's install the remaining packages: ``` npm i ai ``` and ``` npm i @ai-sdk/openai ``` Next, let's create a small component that we will show until the customized component is ready for the user. Let's create a folder in the root of our project called 'components' and in it create this file: InstallmentLoading.tsx ```javascript export const InstallmentLoading = () => ( <div className="animate-pulse p-4"> Getting your personalised installment offer... 
</div> ); ``` Then we need the component itself, which is basically this: ```javascript "use client"; import { type ChangeEvent, useState } from "react"; interface InstallmentProps { styles: any; productItem: { item: string; price: number; emoji: string }; } export const Installment = (props: InstallmentProps) => { const [months, setMonths] = useState(1); const handleSliderChange = (event: ChangeEvent<HTMLInputElement>) => { setMonths(Number(event.target.value)); }; const monthlyPayment = (props.productItem.price / months).toFixed(2); return ( <div className="p-6 bg-white shadow-md rounded-lg max-w-md mx-auto" style={props.styles} > <div className="flex mb-4 justify-center text-lg space-x-3"> <span className="font-semibold">{props.productItem.item}</span> <span>{props.productItem.emoji}</span> </div> <div className="mb-4 text-lg"> Total Amount:{" "} <span className="font-semibold">${props.productItem.price}</span> USD </div> <div className="mb-4"> <label htmlFor="months" className="block text-sm mb-2"> Installment Duration (in months):{" "} <span className="font-semibold">{months}</span> </label> <input type="range" id="months" name="months" min="1" max="60" value={months} onChange={handleSliderChange} className="w-full" /> </div> <div className="text-lg"> Your monthly payment is:{" "} <span className="font-semibold">${monthlyPayment}</span> USD </div> </div> ); }; ``` Here we can see that there are a lot of variables that we will have to replace. These will be the ones that the AI generates for us based on the user's Github bio. And then comes the core part of things. Let's create an actions.tsx file in the app directory. This is where the server action comes in, which will generate our component for us on the server side using the Vercel AI SDK.
This is what the file will look like: ```javascript "use server"; import { Installment } from "@/components/Installment"; import { InstallmentLoading } from "@/components/InstallmentLoading"; import { openai } from "@ai-sdk/openai"; import { generateObject, generateText } from "ai"; import { streamUI } from "ai/rsc"; import { z } from "zod"; const getInstallmentCardStyle = async (bio: string) => { const { text } = await generateText({ model: openai("gpt-4o"), prompt: `Generate the css for an installment offer component card that would appeal to a user based on this bio in his or her Github profile: ${bio}. Please do not write any comments just write me the pure CSS. Please post only the css key value pairs and make the design modern and elegant! Here is the component div that you should style.` + '<div style="here">Your payment details</div>', }); if (text) { const cssWithoutSelectors = text .replace(/div:hover \{([^}]*)\}/g, "$1") .replace(/div \{([^}]*)\}/g, "$1"); return cssWithoutSelectors.trim(); } return ""; }; const getProductSuggestion = async (bio: string) => { const { object } = await generateObject({ model: openai("gpt-4o"), schema: z.object({ product: z.object({ item: z.string(), price: z.number(), emoji: z.string(), }), }), prompt: `Please write a piece of advice on what a user can buy on credit, whose Github bio is this ${bio}. 
Also write the expected price for the item and a suggested emoji.`, }); return object.product; }; const cssStringToObject = (cssString: string) => { const cssObject: { [key: string]: string } = {}; const cssArray = cssString.split(";"); cssArray.forEach((rule) => { const [property, value] = rule.split(":"); if (property && value) { const formattedProperty = property .trim() .replace(/-./g, (c) => c.toUpperCase()[1]); cssObject[formattedProperty] = value.trim(); } }); return cssObject; }; export async function streamComponent(bio: string) { const result = await streamUI({ model: openai("gpt-4o"), prompt: "Get the loan offer for the user", tools: { getLoanOffer: { description: "Get a loan offer for the user", parameters: z.object({}), generate: async function* () { yield <InstallmentLoading />; const styleString = await getInstallmentCardStyle(bio); const style = cssStringToObject(styleString); const product = await getProductSuggestion(bio); return <Installment styles={style} productItem={product} />; }, }, }, }); return result.value; } ``` The function names speak for themselves, but let's take a look at which one does what more precisely. The function of _getInstallmentCardStyle_ is to return the personalized design of the installment card that the user is likely to be interested in. Here we use generateText from the ai library provided by Vercel, which simply connects to the gpt-4o model, which returns the response based on the given prompt. Then we format the output a bit. _getProductSuggestion_, also with a telling name, calls the AI SDK's _generateObject_ function since here we need the recommended product with a predefined format. Our next function is _cssStringToObject_, which is a helper function we use in the next streamComponent function. This properly formats the css string returned by OpenAI into an object that we can then use in the component. And finally, the previously mentioned _streamComponent_.
This function will be the part where we can stream the client component using the streamUI function of the Vercel AI SDK. You can see that within generate we put together the values ​​needed to create our Installment component. We can also notice that one of the parameters of the streamComponent is the github bio. We have reached the point where we query the Github bio of the logged-in user and call this action with it. So to make it all come together, let's go to the _app/page.tsx_ file. Our goal here would be that if the user has already logged in, the user can see his personalized component. But how do we request the Github bio of the logged-in user? For this, we also need to study the Github API a bit. They have [such an endpoint](https://docs.github.com/en/rest/users/users?apiVersion=2022-11-28#get-the-authenticated-user). With this, we can query the data of the currently logged-in user. This is exactly what we need. However, as we read the documentation, we can see that we also need an access token, which we attach to the API query. To do this, we need to slightly modify our auth.ts file, where after logging in, we need to put the access_token received during the github Oauth login into the session managed by Auth.js. Change the content of the auth.ts file to: ```javascript declare module "next-auth" { interface Session { accessToken?: string; } } import NextAuth from "next-auth"; import Github from "next-auth/providers/github" export const { handlers, signIn, signOut, auth } = NextAuth({ providers: [Github], callbacks: { async jwt({token, user, account}) { if (account) { token.accessToken = account.access_token; } return token; }, async session({ session, token }) { session.accessToken = token.accessToken as string; return session; } } }); ``` It is important that we can save the accessToken in this file in our session so that we can call the Github api from our server component on the main page. 
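Before wiring the page together, a quick aside on actions.tsx: the subtle part of _cssStringToObject_ is the kebab-case-to-camelCase conversion. Here is the same logic as a standalone plain-JavaScript sketch, exercised with a made-up CSS string:

```javascript
// Standalone copy of the cssStringToObject logic from actions.tsx,
// run against a made-up CSS string to show the camelCase conversion.
const cssStringToObject = (cssString) => {
  const cssObject = {};
  for (const rule of cssString.split(";")) {
    const [property, value] = rule.split(":");
    if (property && value) {
      // "-c" in "background-color" matches /-./, "-C"[1] gives "C",
      // so "background-color" becomes "backgroundColor"
      const formattedProperty = property
        .trim()
        .replace(/-./g, (c) => c.toUpperCase()[1]);
      cssObject[formattedProperty] = value.trim();
    }
  }
  return cssObject;
};

console.log(cssStringToObject("background-color: #0f0; border-radius: 8px"));
// { backgroundColor: '#0f0', borderRadius: '8px' }
```

Note that because we split each rule on ":", a value that itself contains a colon (such as a url(...)) would be truncated — acceptable here, since the prompt explicitly asks the model for simple key-value pairs.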
So let's go back to app/page.tsx: ```javascript import { streamComponent } from "@/app/actions"; import { auth, signIn } from "@/auth"; export default async function Home() { const session = await auth(); let accessToken; let responseJson; if (session) { accessToken = session.accessToken; const response = await fetch(`https://api.github.com/user`, { headers: { Authorization: `Bearer ${accessToken}`, }, }); responseJson = await response.json(); } return ( <main className="flex min-h-screen flex-col items-center justify-between p-24"> <div className="flex max-w-5xl items-center justify-center text-sm"> {session ? ( <div>{await streamComponent(responseJson.bio)}</div> ) : ( <form action={async () => { "use server"; await signIn(); }} > <button className="p-3 border rounded-md hover:bg-gray-50" type="submit" > Sign in </button> </form> )} </div> </main> ); } ``` Since we put the accessToken in the session, we can access it here and call Github's API, from which we extract the bio of the logged-in user and pass it to our streamComponent function. ## Testing the application Now all that's left is to test the app we've created. Let's have a look! First of all, let's set up a good Github bio description. I also have a good "fake" idea. Last weekend, I watched (again) the Matrix trilogy and played a lot with my Meta Quest 3 VR headset, so the following description is self-explanatory: ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/7zk25cy4r6e0w7bpvlo4.png) Cool, right? Well, let's see what will be served to me on the main page based on this! On the http://localhost:3000 page, go to the "Sign in" button, then click the "Sign in with Github" button. After logging in, I can already see that I have received my Matrix themed component, which offers me a Meta Quest 2. Which is fair, because I don't have a Quest 2, only a Quest 3.
😁 ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ocpw4gyyi411d1uu6ig1.png) Real Matrix design! Well, it's not a look you'd expect from a modern component, but it fits right into the world of The Matrix, as [their website looked similar in 1999](https://www.webdesignmuseum.org/gallery/the-matrix-1999). ## Conclusion and source code We have reached the point where I can share with you the code of the entire project, which you can find [HERE](https://www.webdesignmuseum.org/gallery/the-matrix-1999) on Github. Of course, this can be further developed in many ways, let's look at a few options: - We can deploy it to Vercel! - Introduction of [AI caching with Vercel KV](https://sdk.vercel.ai/docs/advanced/caching) - Creating several basic components for the AI to choose from - TailwindCSS - Shadcn/ui generation instead of vanilla CSS And if you have any more ideas, write them in the comments or make a Github pull request! :) If you have any questions, feel free to write a comment, e-mail me at gyurmatag@gmail.com or contact me on [Twitter](https://x.com/Gyurmatag), I will be happy to help and answer! Many thanks to [Shiwaforce](https://www.shiwaforce.com) for their support and resources in the creation of this article.
gyurmatag
1,901,434
I have found your Java Developer Roadmap for 2024
I'm not wasting any time. See down below? Essential Skills for Java Developers: Version...
0
2024-06-26T15:13:15
https://dev.to/zoltan_fehervari_52b16d1d/i-have-found-your-java-developer-roadmap-for-2024-26fb
java, javadeveloperroadmap, developerroadmap, careerdevelopment
I'm not wasting any time. See down below? ## Essential Skills for Java Developers: - Version Control: Achieve fluency in Git and GitHub to manage and track code effectively. - Linux Command Line: Essential for server management and script execution. - Data Structures and Algorithms: Core knowledge for crafting efficient code. - HTTP/HTTPS Protocols: Understand server communication for web development. - Design Patterns: Implement sophisticated architectural solutions in software design. ## Key Development Tools: - IDEs: IntelliJ IDEA or Eclipse for code development. - Build Tools: Maven and Gradle for project management. - Containers: Docker and Kubernetes for application deployment. - CI/CD Tools: Jenkins and TeamCity for continuous integration and delivery. ## Frameworks and Libraries: **- Spring:** Comprehensive application development. **- Hibernate:** Simplify database management with ORM. **- Apache Camel:** Integration of different systems with ease. ## Focus Areas: - Web Development: Using Servlets, JSP, and Spring MVC for dynamic web applications. - Desktop Applications: JavaFX for creating cross-platform desktop software. - Android Development: Android Studio for mobile application creation on Android platforms. ## Advanced Practices: - Unit Testing and TDD: Emphasize reliability with tools like JUnit. - Version Control and Collaboration: Leverage Git and GitHub for efficient team collaboration. - Deployment and Cloud: Utilize Docker for deployment and understand cloud database integration. This [Java developer roadmap](https://bluebirdinternational.com/java-developer-roadmap/) is your guide to becoming a proficient Java developer in 2024.
zoltan_fehervari_52b16d1d
1,901,432
How to Use ServBay to Create and Run a CakePHP Project
What is CakePHP? CakePHP is an open-source PHP web framework designed to help developers...
0
2024-06-26T15:10:05
https://dev.to/servbay/how-to-use-servbay-to-create-and-run-a-cakephp-project-ik6
php, webdev, beginners, programming
## What is CakePHP?

[CakePHP](https://cakephp.org/) is an open-source PHP web framework designed to help developers build web applications quickly. It is based on the MVC (Model-View-Controller) architecture and provides a powerful toolkit to simplify common development tasks such as database interactions, form handling, authentication, and session management.

![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/o4o28ti0aiuqgrxpcp76.png)

## Key Features and Advantages of CakePHP

- **Rapid Development**: Provides rich code generation tools to help developers quickly create common code structures.
- **Flexible and Powerful ORM**: Built-in ORM (Object-Relational Mapping) layer simplifies database operations.
- **Security**: Comes with multiple security features like input validation, CSRF protection, and SQL injection prevention.
- **Community Support**: Has an active community and a rich ecosystem of plugins.
- **Good Documentation**: Offers comprehensive documentation and tutorials to help developers get started quickly.

CakePHP is suitable for projects ranging from small applications to large enterprise systems, enabling developers to build high-quality web applications swiftly.

## Creating and Running a CakePHP Project Using ServBay

In this article, we will use the PHP environment provided by ServBay to create and run a CakePHP project. We will utilize ServBay's 'Host' feature to set up a web server and configure the project for access with simple steps.

## Note for NGINX or Apache Users

![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/o1y4cm5i43jsedmqhual.png)

ServBay uses Caddy as the default [web server](https://www.servbay.com). For users migrating from NGINX and Apache to ServBay, there are some key points to note:

1. **Caddy Configuration**

ServBay comes with Caddy pre-configured and optimized.
Developers can manage sites through ServBay's 'Host' feature without manually modifying the Caddy configuration file.

2. **Rewrite Rules and .htaccess**

In NGINX and Apache, developers typically write their own rewrite rules and .htaccess files for URL rewriting and other configurations. However, ServBay comes with pre-configured Caddy rules, so developers usually do not need to write these rules unless there are special requirements.

## Creating a CakePHP Project

ServBay suggests placing websites in the `/Applications/ServBay/www` directory for easy management.

1. **Install Composer**

[ServBay](https://www.servbay.com) has Composer pre-installed, so no separate installation is needed.

2. **Create a CakePHP Project**

Use Composer to create a new CakePHP project:

```bash
cd /Applications/ServBay/www
mkdir servbay-cakephp-app
cd servbay-cakephp-app
composer create-project --prefer-dist cakephp/app .
```

3. **Enter the Project Directory**

Navigate to the newly created CakePHP project directory:

```bash
cd /Applications/ServBay/www/servbay-cakephp-app
```

## Initial Configuration

1. **Configure Environment Variables**

In the `config/app_local.php` file, configure database connection information and other environment variables. Ensure the following configuration is correctly set:

```php
'Datasources' => [
    'default' => [
        'host' => '127.0.0.1',
        'username' => 'root',
        'password' => 'password',
        'database' => 'servbay_cakephp_app',
        'url' => env('DATABASE_URL', null),
    ],
],
```

## Configuring the Web Server

Use ServBay's 'Host' feature to access the CakePHP project via the web server. In ServBay's 'Host' settings, add a new host:

- **Name**: `My First CakePHP Dev Site`
- **Domain**: `servbay-cakephp-test.local`
- **Site Type**: `PHP`
- **PHP Version**: Select `8.3`
- **Site Root Directory**: `/Applications/ServBay/www/servbay-cakephp-app/webroot`

For detailed setup steps, please refer to [[Adding Your First Site]].
## Adding Sample Code

In the `config/routes.php` file, add the following code to output "Hello ServBay!":

```php
$routes->connect('/', ['controller' => 'Pages', 'action' => 'display', 'home']);
```

In the `src/Controller/PagesController.php` file, add the following code:

```php
namespace App\Controller;

use Cake\Http\Response;

class PagesController extends AppController
{
    public function display()
    {
        return new Response(['body' => 'Hello ServBay!']);
    }
}
```

## Accessing the Site

Open a browser and visit `https://servbay-cakephp-test.local`. You should see the page output `Hello ServBay!`.

**If you want more specific examples, you can visit the official [Help Center](https://support.servbay.com/php/frameworks/create-and-run-cakephp-project.html).**

---

Got questions? Check out our [support page](https://support.servbay.com) for assistance. Plus, you're warmly invited to join our [Discord](https://talk.servbay.com) community, where you can connect with fellow devs, share insights, and find support. If you want to get the latest information, follow [X(Twitter)](https://x.com/ServBayDev) and [Facebook](https://www.facebook.com/ServBay.Dev). Let's code, collaborate, and create together!
servbay
1,901,431
Some go for these fine tools: Java GUI Frameworks
If you want to listen to me then please compare the features and performance of Java GUI frameworks...
0
2024-06-26T15:09:47
https://dev.to/zoltan_fehervari_52b16d1d/some-go-for-these-fine-tools-java-gui-frameworks-1e8f
java, javagui, guiframeworks, javaframeworks
If you want to listen to me then please compare the features and performance of Java GUI frameworks for your development projects. Sounds good? Then let's do this.

Java GUI frameworks provide essential components for swift and solid application development. These tools offer a mix of libraries and components crucial for crafting appealing GUIs. Here's an overview of what's available to developers.

## Framework Feature Scorecard

I have assessed popular frameworks on criteria (on a scale of 1–100) like ease of use, compatibility across systems, adaptability, performance, developer community, integration capabilities, and current industry position.

**- Swing** totals 47 points, with a comprehensive component set and a stable background.
**- JavaFX** leads at 56 points, excelling in cross-platform support and modern design features.
**- Apache Pivot** comes in with 45 points, recognized for its wide-ranging component set for rich internet applications.
**- AWT** has 42 points, providing a straightforward approach suitable for foundational GUI tasks.
**- SWT** scores 48, favored for its responsiveness and native interface consistency.

## Key Components of Java GUI Frameworks

These frameworks are built on several key constructs:

- Containers that manage the layout of the user interface elements.
- GUI components that enable user interaction with the application.
- Layout managers that handle the organization of user interface elements.
- Event handling systems that respond to user actions.
- Graphics and drawing capabilities that enhance the aesthetic appeal.

## Why Choose Java GUI Frameworks?

Opting for these frameworks can significantly speed up interface development, offer diverse component libraries for interaction, and provide a uniform experience across different operating systems.

## Detailed look?

**- Swing:** Established and versatile, suitable for a broad array of desktop applications.
**- JavaFX:** Forward-thinking with its UI design, it is the tool of choice for multimedia-rich applications.
**- Apache Pivot:** Aims for uniformity in web and desktop environments.
**- AWT:** Basic yet efficient, integrated closely with Java's primary libraries.
**- SWT:** For complex applications requiring a system-consistent look and feel.

A while back, I found a more detailed comparison of [GUI frameworks](https://bluebirdinternational.com/java-gui-frameworks/) that is worth checking out, but browse away anyway.
zoltan_fehervari_52b16d1d
1,901,430
DAY 17: Docker Project for DevOps Engineers
Understanding the Dockerfile At the heart of our Docker project lies the Dockerfile — a script that...
0
2024-06-26T15:06:32
https://dev.to/oncloud7/day-17-docker-project-for-devops-engineers-45cp
docker, aws, awschallenge, 90daysofdevop
**Understanding the Dockerfile**

At the heart of our Docker project lies the Dockerfile — a script that holds the key to building and running containers. Think of it as a recipe that tells Docker how to assemble and configure your application's container. In simple terms, it's a set of instructions that guides Docker on what base image to use, what commands to execute, and what files to include.

**Crafting the Dockerfile**

For our project, let's consider a simple web application, say a Node.js or Python app. The Dockerfile would be our guiding hand in assembling the container. It could instruct Docker to use an official web server image, copy our web app's files into the container, and kickstart the web server upon launch.

**Building and Running the Container**

Once our Dockerfile is ready, the real magic begins. We build the Docker image using the specified instructions. This process creates a containerized environment tailored to our web application. Running the container is like flicking the switch — it brings our application to life within its own encapsulated space.

**Verifying the Application**

Now, let's put our creation to the test. Open up your favorite web browser and access the web application hosted within the Docker container. Witness the seamless integration as your application runs independently, encapsulated from the underlying system.

**Pushing to the Repository**

Our journey doesn't end here — it's time to share our masterpiece with the world. Whether it's Docker Hub or a private repository, push the Docker image to make it accessible beyond your local environment. This step is crucial for collaboration and deployment on various platforms.

**Today's Challenge: Building a Dockerized Web Application**

**Step 1: Create a Dockerfile**

Let's start by crafting a Dockerfile for a simple web application. Whether it's a Node.js or Python app, your Dockerfile will guide the containerization process.
```dockerfile
# Use an official base image
FROM node:14

# Set the working directory
WORKDIR /app

# Copy package.json and package-lock.json
COPY package*.json ./

# Install dependencies
RUN npm install

# Copy the application files
COPY . .

# Expose the port the app runs on
EXPOSE 3000

# Define the command to run your app
CMD ["npm", "start"]
```

**Step 2: Build and Run the Docker Container**

Open your terminal, navigate to the directory containing your Dockerfile and application code, and execute the following commands:

```bash
# Build the Docker image
docker build -t your-image-name .

# Run the Docker container
docker run -p 3000:3000 -d your-image-name
```

**Step 3: Verify the Application**

Open your web browser and navigate to localhost:3000. If all goes well, you should see your web application up and running.

**Step 4: Push to a Repository**

Now, let's share your Dockerized masterpiece with the world. Push the Docker image to a public or private repository, such as Docker Hub.

```bash
# Log in to Docker Hub (replace with your Docker Hub username)
docker login

# Tag your image
docker tag your-image-name your-dockerhub-username/your-image-name

# Push the image to Docker Hub
docker push your-dockerhub-username/your-image-name
```
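One addition worth considering alongside the Dockerfile: because `COPY . .` copies the entire build context, a `.dockerignore` file keeps local artifacts out of the image. This is a minimal sketch (the entries are typical Node.js choices, not requirements of the challenge):

```
# .dockerignore — keep local artifacts out of the build context
node_modules
npm-debug.log
.git
.env
```

Excluding `node_modules` also ensures dependencies are installed fresh by `RUN npm install` inside the image rather than copied from the host.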
oncloud7
1,901,429
Securing Your Video Streaming: Best Practices and Solutions
In an increasingly digital age, the importance of video streaming is paramount. From businesses...
0
2024-06-26T15:06:28
https://dev.to/jennife05918349/securing-your-video-streaming-best-practices-and-solutions-5bk6
In an increasingly digital age, the importance of video streaming is paramount. From businesses conducting virtual meetings to entertainment channels streaming the latest films, secure media streaming ensures the safety and privacy of the content and its viewers. But how do you ensure that your video streaming is truly secure? This article delves into securing video streams and how specific tools can assist this endeavor.

## The Need for Secure Media Streaming

As the demand for online content rises, so does the necessity for its protection. Secure media streaming isn't just about keeping your videos private; it's about ensuring that your users' data, privacy, and trust are maintained. Streaming security involves protection against unauthorized access, copyright infringement, and ensuring smooth streaming without lags or interruptions.

## Challenges in Video Streaming Security

While the importance is clear, securing video streams is not without its challenges:

- **Unauthorized Access**: There's a constant threat of hackers gaining unauthorized access to your video content. This could result in copyright violations, revenue loss, and reputation damage.
- **Performance Issues**: Users expect high-quality, uninterrupted streaming. Lags, buffering, and poor quality can be as damaging to your brand as a security breach.
- **Device Fragmentation**: With myriad devices available in the market, ensuring consistent streaming quality and security across all platforms is challenging.

## Fortifying Stream Security through Advanced Software Testing Tools

In the digital content arena, protecting the sanctity of video streaming requires more than just traditional safety measures. Advanced [software testing tools](https://www.headspin.io/blog/top-software-testing-tools) have become the sentinels of the streaming world, serving dual purposes – ensuring both security and superior user experience. Let's delve deeper into how these tools fortify streaming security.
- **Robust Authentication and Encryption**: The combination of strong authentication and encryption methods is at the core of streaming security. While authentication ascertains that only authorized users gain access to content, encryption takes it further. By converting data into a code to prevent unauthorized access, encryption ensures that even if intruders bypass authentication, the data remains indecipherable.
- **Regular Security Audits and Vulnerability Assessments**: Advanced software testing tools conduct periodic security audits, scanning for vulnerabilities that cyber-attackers might exploit. Identifying and addressing these vulnerabilities in real-time can prevent potential breaches.
- **Geoblocking and Geofencing**: Providers can exercise granular control over who views their content by restricting content access based on geographical locations. This is especially vital for those who have the rights to broadcast specific content only in certain regions.
- **Dynamic Watermarking**: Beyond just embedding a digital signature, dynamic watermarking subtly alters each viewer's watermark. If the content gets redistributed without authorization, this unique watermark can help trace the leak's origin, allowing providers to take appropriate action.
- **Tokenized Security**: Software testing tools ensure the stream URL is secured by generating a unique token for each session or user. This means that the content remains inaccessible to unauthorized users even if someone tries to share the URL.
- **DRM (Digital Rights Management) Integration**: Modern software testing tools also facilitate the integration of DRM solutions. DRMs control the use, modification, and distribution of copyrighted works. By integrating DRMs, streaming platforms can prevent unauthorized access and distribution of their content.
- **Adaptive Streaming**: While primarily seen as a method to enhance user experience by adjusting video quality based on the user's network, adaptive streaming also adds a layer of security. Since content is made into smaller chunks and streamed individually, it becomes harder for unauthorized entities to download and redistribute the entire content.

Utilizing advanced software testing tools is no longer optional for content providers. With cyber threats becoming more sophisticated and user trust becoming invaluable, these tools provide the robust shield streaming platforms need to ensure content security and integrity. These tools keep unauthorized access, breaches, and potential revenue losses at bay by continuously evolving and adapting to the changing digital landscape.

## Performance Optimization: The Imperative for Seamless Video Streaming

In video streaming, delivering content is only half the battle; ensuring a smooth and uninterrupted viewing experience is equally crucial. Performance optimization takes center stage in this context, acting as the backbone for a high-quality, user-centric streaming service. But what does performance optimization entail, and why is it so pivotal?

**Why Performance Optimization Matters**

At its core, performance optimization is about refining the streaming process, ensuring content is delivered seamlessly, with minimal lags or buffering, regardless of external conditions. In today's digital age, where viewers have many choices, even minor disruptions can lead them to abandon a stream and seek alternatives.

**Load Testing for Scalability**

Load testing is an essential component of optimization. It involves simulating various user loads on your platform to determine its response under different stress levels. Testing how your platform behaves when many users access content simultaneously allows you to anticipate and mitigate issues like server crashes or slowdowns.
**Adaptive Bitrate Streaming**

One effective approach to [optimizing video streaming](https://www.headspin.io/blog/secure-media-streaming-with-automated-test) is adaptive bitrate streaming. This method automatically adjusts the video quality in real time based on the viewer's network conditions. It ensures viewers on slower connections can still watch without constant buffering, while those on faster networks receive high-definition quality.

**Device Compatibility Testing**

The sheer variety of devices – from smartphones to tablets to smart TVs – that viewers use to stream content means your service needs to cater to multiple specifications and screen sizes. By conducting thorough device compatibility testing, you can ensure a consistent, optimized experience for all users, regardless of their device.

**Network Simulation for Real-world Scenarios**

Different users access content under various network conditions, from high-speed broadband to unreliable mobile connections. Network simulation lets platforms mimic these varied conditions, assessing how streams hold up in each scenario. This foresight ensures your content remains stable and high-quality, even in less-than-ideal circumstances.

**Edge Caching to Reduce Latency**

Latency can be a significant deterrent for live-streaming events. By leveraging edge caching, where content is stored closer to the viewer's location on distributed servers, platforms can reduce the time it takes for data to travel. This leads to faster load times and a more synchronized live streaming experience.

## The HeadSpin Advantage

When it comes to secure media streaming and performance optimization, HeadSpin offers a plethora of solutions. Their audio-visual platform is meticulously designed to provide end-to-end visibility into user experience across various devices and networks. With HeadSpin, businesses can ensure security while delivering optimal streaming performance.
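To illustrate the rendition-selection logic at the heart of adaptive bitrate streaming, here is a minimal sketch; the bitrate ladder and thresholds are hypothetical, and real players apply far more sophisticated heuristics:

```shell
# Sketch: pick a rendition from measured bandwidth (hypothetical ladder)
pick_rendition() {
  kbps=$1
  if   [ "$kbps" -ge 5000 ]; then echo "1080p"
  elif [ "$kbps" -ge 2500 ]; then echo "720p"
  elif [ "$kbps" -ge 1000 ]; then echo "480p"
  else                            echo "240p"
  fi
}

pick_rendition 3200   # prints 720p
```

A production player would re-measure throughput per segment and switch renditions continuously rather than once.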
For professionals such as testers, product managers, SREs, DevOps, and QA engineers, tools like those offered by HeadSpin are indispensable. They provide peace of mind regarding security concerns and ensure that user experience remains at the forefront.

## Conclusion

In conclusion, while the digital age offers incredible opportunities for content creators and broadcasters, it also comes with challenges. Ensuring secure and smooth video streaming is no longer a luxury but a necessity. Businesses can leverage sophisticated software and performance testing tools to provide their users with top-notch, secure streaming experiences. Investing in robust streaming security and performance optimization solutions is paramount in a world where user trust can make or break a brand. Be proactive, utilize the right tools, and prioritize security and user experience for successful video streaming.
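To make the tokenized-security idea from the earlier section concrete, here is a minimal sketch of signing a stream URL with an expiring HMAC token; the secret, path, and CDN host are all hypothetical, and real platforms typically use a CDN's signed-URL feature rather than hand-rolled tokens:

```shell
# Sketch: expiring HMAC token for a stream URL (all values hypothetical)
SECRET="demo-secret"
STREAM_PATH="/videos/episode1.m3u8"
EXPIRES=$(( $(date +%s) + 3600 ))   # valid for one hour

# Sign path + expiry; the server recomputes the HMAC and compares before serving
TOKEN=$(printf '%s%s' "$STREAM_PATH" "$EXPIRES" \
  | openssl dgst -sha256 -hmac "$SECRET" -r | cut -d' ' -f1)

echo "https://cdn.example.com${STREAM_PATH}?expires=${EXPIRES}&token=${TOKEN}"
```

Sharing the URL does not help an unauthorized viewer once `expires` has passed, because the recomputed HMAC no longer matches any fresh expiry.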
jennife05918349
1,901,427
Azure AI Studio Documentation
https://learn.microsoft.com/azure/ai-studio/?wt.mc_id=studentamb_360864
0
2024-06-26T15:04:58
https://dev.to/mukuastephen/azure-ai-studio-documentation-517k
azure, azureai, generativeai, webdev
https://learn.microsoft.com/azure/ai-studio/?wt.mc_id=studentamb_360864
mukuastephen
1,901,426
Tidycoder animation text
Check out this Pen I made!
0
2024-06-26T15:03:54
https://dev.to/tidycoder/tidycoder-animation-text-1e00
codepen, css, html, webdev
Check out this Pen I made! {% codepen https://codepen.io/TidyCoder/pen/bGyvBrN %}
tidycoder
1,901,424
Selecting Your LIGHTWEIGHT Python IDE is something you're gonna face!
Finding the ideal Python IDE that aligns with your coding philosophy is keyyyyy. I'm telling you...
0
2024-06-26T15:02:44
https://dev.to/zoltan_fehervari_52b16d1d/selecting-your-lightweight-python-ide-55ok
python, pythonide, lightweightide
**Finding the ideal Python IDE that aligns with your coding philosophy is keyyyyy. I'm telling you :)**

So coding in Python? A lightweight Python IDE can streamline your development process. These IDEs deliver critical functionality without taxing your system — a perfect match for those who appreciate efficiency and ease of use.

## The Essence of Lightweight Python IDEs ([examined lightweight Python IDEs](https://bluebirdinternational.com/lightweight-python-ides/))

Lightweight IDEs offer a sweet spot for developers. They're equipped with essential features such as syntax highlighting, code completion, and debugging capabilities, but they remain light on resource consumption. These environments cater well to developers who need nimble tools for rapid coding sessions.

## Key Considerations for Choosing an IDE

**- Resource Efficiency:** Select an IDE that works smoothly on various hardware without hindering performance.
**- Feature-Rich:** Look for necessary features like code completion and debugging to maintain a fluid coding rhythm.
**- Customization:** An IDE should conform to your preferences, enhancing your productivity.

## Top Lightweight Python IDE Choices

**- PyCharm Community Edition:** Offers a harmonious balance of comprehensive features and a lightweight framework. PyCharm is a favorite for its intelligent code assistance and version control integrations.
**- Visual Studio Code (VS Code):** Renowned for its flexibility, VS Code is bolstered by an extensive range of extensions and is a staple for Python programmers seeking an adaptable coding environment.
**- Atom:** Atom serves those who prioritize a personalized touch, with its extensive theming and package management.
**- PyDev:** With its stellar code analysis and Django framework support, PyDev is an ideal tool for diverse Python project requirements.
zoltan_fehervari_52b16d1d
1,901,422
Mastering The Best Restaurant Advertising Campaigns Detailed Guide
This guide aims to provide detailed insights into running the best restaurant advertising campaigns...
0
2024-06-26T15:01:56
https://www.kopatech.com/blog/mastering-the-best-restaurant-advertising-campaigns
kopatech, foodorderingsystem, multirestaurant
This guide aims to provide detailed insights into running the best restaurant advertising campaigns currently, catering to both physical establishments and online platforms utilizing restaurant delivery software.

In the fast-paced and competitive landscape of the restaurant industry, effective advertising campaigns are vital for attracting and retaining customers. Now, with the ever-growing influence of technology and the rise of food delivery platforms, restaurant owners need to adopt innovative strategies to stand out from the crowd. The increased use of online ordering and [Multi Restaurant Food Delivery App](https://www.kopatech.com/multi-restaurant-delivery-software) platforms in restaurant businesses is another game changer that needs to be considered in advertising for good results.

## **Best Restaurant Advertising Campaigns at a Glance**

### **Understanding Your Audience**

Define your target audience: Analyze the demographics, preferences, and behaviors of your potential customers.

Understanding demographics is crucial for running an upwardly mobile restaurant with an online presence, as it allows restaurant owners to tailor their marketing strategies and offerings to meet the specific needs and preferences of their target audience. Demographics provide valuable insights into the characteristics of potential customers, including age, gender, income level, location, and lifestyle preferences. Here's why demographics are essential:

**1. Targeted Marketing:** By knowing the demographic profile of their customer base, restaurant owners can create targeted marketing campaigns that resonate with their audience. For example, if the majority of their customers are young professionals in urban areas, they can focus their efforts on social media platforms and offer promotions tailored to this demographic.

**2. Menu Development:** Demographic data can inform menu development decisions, such as introducing new dishes or adjusting portion sizes to cater to specific dietary preferences or cultural tastes prevalent among the target audience. For instance, if there is a high demand for plant-based options among millennials in the area, the restaurant can expand its vegetarian or vegan offerings.

**3. Pricing Strategy:** Understanding the income levels of the target demographic helps in setting appropriate pricing strategies. For instance, a restaurant targeting affluent customers may offer premium dishes and fine dining experiences, while a more budget-friendly establishment may focus on value-driven options to appeal to cost-conscious consumers.

**4. Location Selection:** Demographics play a crucial role in determining the ideal location for a restaurant. Analyzing population density, income levels, and lifestyle preferences of the target demographic can help in identifying strategic locations with high foot traffic and demand for dining options.

**5. Customer Experience:** Tailoring the dining experience to match the preferences of the target demographic enhances customer satisfaction and loyalty. Whether it's offering fast and convenient online ordering for busy professionals or creating a cozy ambiance for families, understanding demographics allows restaurants to cater to the unique needs of their customers effectively.

Utilize data analytics: Leverage customer data from online ordering platforms and social media to understand consumer trends and preferences.

Data analytics is indispensable in running the best restaurant advertising campaigns for an upwardly mobile restaurant with an online presence, as it provides valuable insights that drive informed decision-making, optimize operations, and enhance the overall customer experience. Here's why data analytics is crucial:

**1. Understanding Customer Behavior:** By analyzing data from online ordering platforms, website traffic, and social media interactions, restaurants can gain valuable insights into customer behavior, preferences, and trends. This information allows them to tailor their menu offerings, marketing campaigns, and promotions to better meet the needs and preferences of their target audience.

**2. Personalized Marketing:** Data analytics enables restaurants to segment their customer base and create personalized marketing campaigns that resonate with specific demographics or customer segments. By leveraging insights such as purchase history and order frequency, restaurants can deliver targeted promotions and offers that drive customer engagement and loyalty.

**3. Menu Optimization:** Analyzing sales data and customer feedback helps restaurants identify popular menu items, as well as those that may be underperforming. This allows them to optimize their menu offerings by introducing new dishes, adjusting portion sizes, or removing items that are not resonating with customers, ultimately increasing profitability and customer satisfaction.

**4. Operational Efficiency:** Data analytics can help restaurants optimize operational processes such as inventory management, staffing levels, and supply chain logistics. By analyzing sales patterns and demand forecasts, restaurants can better anticipate inventory needs, minimize waste, and ensure efficient staffing levels to meet customer demand during peak hours.

**5. Performance Tracking:** By tracking key performance indicators (KPIs) such as sales revenue, customer satisfaction scores, and online reviews, restaurants can measure the success of their marketing initiatives and operational strategies.
This allows them to identify areas for improvement, for instance when deploying a multi restaurant delivery app, and make data-driven decisions to continuously enhance performance and drive business growth.

In summary, data analytics empowers upwardly mobile restaurants to make strategic decisions that drive business growth, improve operational efficiency, and deliver exceptional customer experiences in an increasingly competitive market. By harnessing the power of data, restaurants can stay ahead of the curve and thrive in the digital age of dining.

### Crafting Compelling Content

1. Visual appeal: Invest in high-quality food photography and videography for social media and online platforms.
2. Storytelling: Share your restaurant's unique story, values, and behind-the-scenes glimpses to create emotional connections with your audience.
3. User-generated content: Encourage customers to share their dining experiences through reviews, photos, and videos, amplifying your brand reach.

### Leveraging Online Platforms

1. Website optimization: Ensure your website is mobile-friendly, optimized for search engines, and features easy online ordering options.
2. Social media marketing: Engage with your audience on platforms like Instagram, Facebook, and Twitter through regular posts, stories, and interactive content.
3. Collaborate with influencers: Partner with influencers and food bloggers to reach new audiences and enhance credibility.

### Harnessing the Power of Online Ordering

1. Seamless integration: Integrate your online food ordering system with popular delivery platforms to streamline the ordering process and reach a wider customer base.
2. Promotions and discounts: Offer exclusive deals and discounts for online orders to incentivize customers to choose your restaurant over competitors.
3. Order tracking and feedback: Utilize features of delivery software to provide real-time order tracking and gather feedback for continuous improvement.
### Embracing Innovative Marketing Strategies

1. Augmented reality (AR) advertising: Experiment with AR technology to offer interactive dining experiences and uniquely showcase menu items.
2. Virtual events and experiences: Host virtual cooking classes, wine tastings, or live chef demonstrations to engage customers beyond the traditional dining experience.
3. Geo-targeted advertising: Utilize location-based targeting to reach potential customers in your vicinity and drive foot traffic to your restaurant.

### Measuring Success and Iterating

**Key performance indicators (KPIs):** Track metrics such as website traffic, conversion rates, online orders, and customer feedback to gauge the effectiveness of your advertising campaigns.

**A/B testing:** Experiment with different ad formats, messaging, and promotions to identify what resonates best with your audience.

**Continuous optimization:** Regularly review and refine your advertising strategies based on insights gathered from data analysis and customer feedback.

## Using Search Engine Ads in the Best Restaurant Advertising Campaigns

Google Pay-Per-Click (PPC) advertising is a powerful tool for running an upwardly mobile restaurant with an online presence, offering targeted reach, measurable results, and the ability to compete effectively in the digital marketplace. Here's why Google PPC advertising is crucial for restaurant success:

### Targeted Reach

It allows restaurants to target their ads to specific geographic locations, ensuring that they reach potential customers in their vicinity. This is particularly beneficial for local restaurants looking to attract nearby diners who are actively searching for dining options in their area.

### Increased Visibility

Restaurants can bid on relevant keywords related to their cuisine, location, or unique selling points.
This allows them to appear prominently in search engine results pages (SERPs) when users search for terms related to their restaurant, increasing visibility and driving traffic to their website or online ordering platform.

### Cost-Effective Advertising

PPC operates on a pay-per-click model, meaning that restaurants only pay when users click on their ads. This makes it a cost-effective advertising option, as businesses can set their budget and control their spending based on their advertising goals and performance metrics.

### Measurable Results

PPC provides comprehensive analytics and reporting tools that allow restaurants to track the performance of their ads in real time. From click-through rates (CTR) to conversion rates and return on investment (ROI), restaurants can measure the effectiveness of their campaigns and make data-driven adjustments to optimize results.

### Ad Customization

PPC offers a range of ad formats and customization options, allowing restaurants to create compelling, visually appealing ads that resonate with their target audience. Whether it's text ads, display ads, or responsive search ads, restaurants can tailor their messaging and creative assets to showcase their unique brand personality and offerings.

### Strategic Targeting

PPC enables restaurants to target specific audience segments based on factors such as demographics, interests, and online behavior. This allows them to reach potential customers who are most likely to be interested in their cuisine or dining experience, increasing the likelihood of conversion and customer acquisition.

### Competitive Advantage

In today's competitive marketplace, one of the best restaurant advertising campaigns is building a strong online presence, which is essential for restaurants to stand out and attract customers.
PPC gives businesses a competitive advantage by allowing them to appear prominently in search results, even ahead of organic listings, and outshine competitors who may not be leveraging paid advertising effectively.

Staying ahead of the curve requires a combination of creativity, technology adoption, and data-driven decision-making. By understanding your audience, crafting compelling restaurant advertising ideas, leveraging online platforms and delivery software, embracing innovative marketing strategies, and measuring success iteratively, restaurant owners can run effective advertising campaigns that drive both online and offline business growth.

_This post originally appeared on [kopatech.com](https://www.kopatech.com/blog/mastering-the-best-restaurant-advertising-campaigns)_
kopatech2000
1,899,186
The Complete Guide to Serverless Apps I - Introduction
'Serverless' as a term first appeared around 2012 but it was only in 2014 when it piqued interest...
27,854
2024-06-26T15:00:00
https://www.fermyon.com/serverless-guide/overview-of-serverless-applications
cloud, serverless, webassembly, cloudnative
'Serverless' as a term [first appeared](https://martinfowler.com/articles/serverless.html#origin) around 2012, but it was only in 2014 that it piqued interest, after the launch of AWS Lambda. The term spiked in usage in 2020 but went into decline soon after. Early in 2022, interest began to climb again. And right now, the term is as popular as it's ever been.

![serverless google trends](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/v77c9civgs2ouk5py94s.png)

There are a few reasons for this increase in popularity (which we will delve into shortly), but I thought I'd take this opportunity to draw on my experience working in the cloud and write a primer for serverless applications. This is intended to be read like an e-book and will cover all the facets of serverless applications. Here's the Complete Developer's Guide to Serverless Apps, which will be serialized over the course of the next few weeks.

### Table of Contents

{%- # TOC start (generated with https://github.com/derlin/bitdowntoc) -%}
- [Defining Serverless 💫](#defining-serverless)
  + [Definition 1: Serverless as SaaS](#definition-1-serverless-as-saas)
  + [Definition 2: Serverless as Hosted Application](#definition-2-serverless-as-hosted-application)
  + [Definition 3: Serverless as a Software Concept](#definition-3-serverless-as-a-software-concept)
- [What is a Serverless App 🤓](#what-is-a-serverless-app)
- [Comparing Serverless Apps and PaaS 🗒️](#comparing-serverless-apps-and-paas)
- [Conclusion 😊](#conclusion)
{%- # TOC end -%}

## Defining Serverless 💫

A friend of mine once argued that there is no such thing as _serverless_ because, of _course_, there are servers underneath every one of the services described that way.
<img width="100%" style="width:100%" src="https://i.giphy.com/media/v1.Y2lkPTc5MGI3NjExOWVlcnl6MTV2OTdkM3hsNDZmNGxzNDR4dnVjYWhiaTh5cHl2NzlvMSZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw/eUrE2DuMKOE0g/giphy.gif">

While he is not wrong that the things described as _serverless_ do indeed have servers running somewhere, he missed the main point: _serverless_ is a statement about what resources you must be concerned with, not so much about the actual presence of physical server hardware. Let's dive into three definitions of serverless, and you'll soon see what I mean. We'll start with the most generic definition and work toward the most specific.

#### Definition 1: Serverless as SaaS

The most generic use of the term "serverless" indicates that some offering is run using the Software-as-a-Service (SaaS) model. In SaaS, an entire application is owned, built, and operated by one organization, and other individuals or organizations create accounts to use that application, typically via the web. Some SaaS providers prefer to use the term "serverless" to describe the SaaS model, particularly when they want to emphasize that you, as the user, don't have to do any management of the cloud resources required to run the SaaS.

If this is server-_less_, then what does "server" mean here? The _server_ that this _serverless_ offering is hiding from you is what used to be called the _application server_ (back in the pre-cloud days): in other words, the machine or machines whose job it was to execute a specific piece of software.

<figure>
<img width="100%" style="width:100%" src="https://i.giphy.com/media/v1.Y2lkPTc5MGI3NjExYjh1d3ZsZHRueTBoZjZha3dqN2d0ZDh0a24ydmV4dzN6N2E4ZGVsdyZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw/3o7524uhfdq7pmiDJe/giphy.gif">
<figcaption>Err..not that Sass</figcaption>
</figure>

> Definition 1: _"Serverless" merely means no management of cloud resources_

You can spot this usage of the term easily.
For example, when _serverless_ is being used as a synonym for _SaaS_ it signals that:

* the offering is targeted toward non-engineers,
* there is no REST API, library, SDK, or CLI to use, and
* the web interface is the only way to interact with the application.

#### Definition 2: Serverless as Hosted Application

Consider the following cloud services, each of which claims to be serverless:

* Serverless database hosting.
* Serverless logging and monitoring.
* Serverless messaging framework.

What do these three have in common? A serverless database is one in which someone else manages the database software (starting and stopping, upgrading, patching security issues) and you simply use the database (creating tables, running queries, inserting data). Similarly, "serverless logging and monitoring" suggests the same: someone else manages a bunch of servers that do log processing and monitoring, and you merely use the APIs provided to attach them to your application.

<img width="100%" style="width:100%" src="https://i.giphy.com/media/v1.Y2lkPTc5MGI3NjExbmg3bmw1dHhqdDl4Zjl0amNwem53ZTY4Zm5zMnlxZnh5MXE0c251ZCZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw/nH6AAUMOMxYhvzcmZU/giphy.gif">

This definition treats the word "server" as referring to the hardware or Virtual Machine (VM) instance. You (the user) may be required to say what kind of Operating System (OS) or architecture you prefer the cloud service to run on, and perhaps also choose your memory and/or storage requirements, but you are not responsible for the day-to-day operations relating to any of this infrastructure.
> Definition 2: _"Serverless" means you (the user) do not have to manage the infrastructure that your cloud service runs on._

You can spot the meaning of "serverless" in this context when the offering:

* is engineering-oriented,
* has a REST API, libraries, SDKs, or a CLI client,
* allows you to create instances of this offering for your usage (but you don't have to manage the day-to-day operations like upgrading, patching, or monitoring), and
* typically does not allow you to gain access directly to the operating system (such as a shell prompt or system administrator account).

#### Definition 3: Serverless as a Software Concept

Consider the process of building a Ruby on Rails, Python Django, or Node.js Express application. One of the first things you must do is write the code that starts a server to listen on a particular port. Or, if you go a level lower than these common frameworks, you might even have to create a socket server, attach a thread pool, and map incoming requests to an HTTP (or other protocol) handler. When you are doing any of these things, you are writing a _software_ server.

In the software world, a server is a long-running process that listens for incoming requests (usually on a network connection) and then handles those requests. A server typically handles hundreds to millions of individual requests over its lifetime, which may span hours, days, months, or even years before the server is restarted.

Contrast this with a program where, instead of standing up an entire server, you merely write a function that starts up (receives a single request), handles that request, and then optionally returns a response before it shuts down. That is, each request executes the program from start to finish. Such a program may run for milliseconds, seconds, or perhaps several minutes. But rarely does it run longer.
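To make the contrast concrete, here is a minimal sketch in Java of the handler-only model. The `Request`, `Response`, and `handle` names are illustrative, not any real serverless platform's API: the point is that the entire program is one function from request to response, with no socket, port, or listen loop.

```java
import java.util.Map;

// A sketch of the serverless programming model: you write one handler
// function; a (hypothetical) platform starts your code per request and
// shuts it down afterward. No listen loop, no thread pool, no socket.
public class HandlerSketch {
    // A request is just data the platform hands you (names are illustrative).
    public record Request(String path, Map<String, String> query) {}
    public record Response(int status, String body) {}

    // The entire "app": one function from request to response.
    public static Response handle(Request req) {
        if (req.path().equals("/hello")) {
            String name = req.query().getOrDefault("name", "world");
            return new Response(200, "Hello, " + name + "!");
        }
        return new Response(404, "not found");
    }

    // A pretend platform invocation: start, handle one request, stop.
    public static void main(String[] args) {
        Response r = handle(new Request("/hello", Map.of("name", "Fermyon")));
        System.out.println(r.status() + " " + r.body());
    }
}
```

Compare this with an Express or Rails app, where your first lines of code create and start the server itself.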
<img width="100%" style="width:100%" src="https://i.giphy.com/media/v1.Y2lkPTc5MGI3NjExNnQxM3dpeGg4NngxaWo3dDB0NmQzZW85NW8xc3ZxaDFubnc3NGk4dyZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw/nn0Kcrd72zzaZb1WxW/giphy.gif">

This is the serverless app model. And this is our third usage of the term "serverless". This model gained the name "serverless function" when Amazon released its Lambda offering.

> Definition 3: _"Serverless" means you do not have to write the software server that will listen for requests, nor do you need to manage the hardware or operating system_

You can spot the meaning of the term "serverless" in this context when:

* you, the developer, write request (event) handlers instead of software servers,
* a variety of SDKs, APIs, or tooling is provided to make it easier for you to write programs,
* you do not need to manage server hardware or virtual machines, and
* you also typically do not have administrator access or shell access to the environment executing your code.

## What is a Serverless App 🤓

A serverless app is a program that runs inside an environment which manages all of the protocol-level and process-level aspects of serving content. Additionally, that serverless environment provides a layer of secure isolation from other serverless apps. In the strongest cases, this allows multitenant hosting, where two different customers or users can run their apps on the same serverless app platform without fear that the other users or customers can tamper with their app.

A serverless app is the piece of software you, as the developer, write and upload to a serverless application platform. The code you write is started when a new request is received. Your code is expected to handle that request and perhaps return a response, at which point your code is shut down again (ready to be started again in the future).

> For the most part, it is acceptable to use the terms "serverless app" and "serverless function" interchangeably.
In this guide, we tend to use _serverless app_ because it is more generic (and also shorter to type). Where it is necessary to distinguish between the two, we do so carefully.

## Comparing Serverless Apps and PaaS 🗒️

It is useful to contrast a serverless app with an app written for a Platform-as-a-Service (PaaS). Heroku and the open-source [Cloud Foundry](https://www.cloudfoundry.org/) are examples of PaaS. In contrast, [Fermyon Cloud](https://fermyon.com/cloud) is an example of a serverless app platform, and [Spin](https://fermyon.com/spin) is a developer tool for building serverless apps.

For starters, serverless apps are more cost-effective than long-running servers in a PaaS environment. A serverless app is simpler, faster, and more resource-efficient than a PaaS. But let's dive into the conceptual details to see why this is the case.

In a PaaS, you (the developer) write an application in any language the PaaS supports, and then you deploy that application to the PaaS service, where it is hosted on your behalf. Developer self-service is a core feature of a PaaS. That means developers can deploy their applications without relying upon an operations (DevOps or platform engineering) team. Most serverless app platforms, including [Fermyon Cloud](https://fermyon.com/cloud), also provide the same kind of developer self-service, including a web dashboard, metrics and monitoring, and a host of tools to assist in developing and debugging.

Where a PaaS differs from serverless apps is, again, in the programming model. A PaaS follows [definition 2 of serverless](#definition-2-serverless-as-hosted-application), where the hardware is managed but the developer must still write a _software server_. Meanwhile, serverless apps follow [the 3rd definition of serverless](#definition-3-serverless-as-a-software-concept), whereby a developer writes only the small program that handles a request and does not have to worry about writing the software server.
If you are looking for a developer self-service platform to run long-running services that are always on, a PaaS is probably the solution you are after. If you are looking for a developer self-service platform where you can quickly write highly efficient apps that need to execute instantly and scale rapidly, you may prefer to take a look at serverless apps.

## Conclusion 😊

That was a comprehensive look at defining serverless and serverless apps. In the rest of the series we will look at the characteristics of a serverless function, the use cases, and of course some code.

<img width="100%" style="width:100%" src="https://i.giphy.com/media/v1.Y2lkPTc5MGI3NjExazlsa212MWZvamt6Y2dwdHQzeXZscjk3bjc0MTVkbjQ3MmZvanJubCZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw/DAzIIpUmSgFXbp1hFV/giphy.gif">

Let us know in the comments if you are using serverless already, and what your use cases are!
sohan26
1,901,421
LeetCode Day18 Binary Tree Part 8
108. Convert Sorted Array to Binary Search Tree Given an integer array nums where the...
0
2024-06-26T14:59:48
https://dev.to/flame_chan_llll/leetcode-day18-binary-tree-part-8-3pcd
leetcode, java, algorithms, datastructures
# 108. Convert Sorted Array to Binary Search Tree

Given an integer array nums where the elements are sorted in ascending order, convert it to a height-balanced binary search tree.

Example 1:
![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/papga47nq098n9l6w7r7.png)
Input: nums = [-10,-3,0,5,9]
Output: [0,-3,9,-10,null,5]
Explanation: [0,-10,5,null,-3,null,9] is also accepted:
![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/vxqffp54191ybduja7gd.png)

Example 2:
![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/29qrrd3ch257qxvqoxyc.png)
Input: nums = [1,3]
Output: [3,1]
Explanation: [1,null,3] and [3,1] are both height-balanced BSTs.

Constraints:
1 <= nums.length <= 10^4
-10^4 <= nums[i] <= 10^4
nums is sorted in a strictly increasing order.

[Original Page](https://leetcode.com/problems/convert-sorted-array-to-binary-search-tree/description/)

![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/6mifpmk9l383ywjqm64f.png)

```
// requires: import java.util.Arrays;
public TreeNode sortedArrayToBST(int[] nums) {
    if (nums.length == 0) {
        return null;
    }
    if (nums.length == 1) {
        return new TreeNode(nums[0]);
    }
    int size = nums.length;
    int mid = size >> 1;
    // the middle element becomes the root of this subtree
    TreeNode root = new TreeNode(nums[mid]);
    root.left = sortedArrayToBST(Arrays.copyOfRange(nums, 0, mid));
    root.right = sortedArrayToBST(Arrays.copyOfRange(nums, mid + 1, size));
    return root;
}
```

We only need to divide the array into two sub-arrays of nearly equal size, make the middle element the root, and build each half recursively.

# 538. Convert BST to Greater Tree

Given the root of a Binary Search Tree (BST), convert it to a Greater Tree such that every key of the original BST is changed to the original key plus the sum of all keys greater than the original key in BST.
As a reminder, a binary search tree is a tree that satisfies these constraints:
The left subtree of a node contains only nodes with keys less than the node's key.
The right subtree of a node contains only nodes with keys greater than the node's key.
Both the left and right subtrees must also be binary search trees.

Example 1:
![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/3px2tbiw5m3q08ih4weh.png)
Input: root = [4,1,6,0,2,5,7,null,null,null,3,null,null,null,8]
Output: [30,36,21,36,35,26,15,null,null,null,33,null,null,null,8]

Example 2:
Input: root = [0,null,1]
Output: [1,null,1]

Constraints:
The number of nodes in the tree is in the range [0, 10^4].
-10^4 <= Node.val <= 10^4
All the values in the tree are unique.
root is guaranteed to be a valid binary search tree.

[Original Page](https://leetcode.com/problems/convert-bst-to-greater-tree/description/)

Note: This question is the same as 1038: https://leetcode.com/problems/binary-search-tree-to-greater-sum-tree/

According to the description, each node's new value is its original value plus the sum of all larger values, and this is a BST. That implies we can process the nodes starting from the rightmost node (largest value), then the middle, then the left.
This is a reversed in-order traversal (right -> middle -> left):

```
public TreeNode convertBST(TreeNode root) {
    if (root == null) {
        return root;
    }
    greaterWork(root, 0);
    return root;
}

public int greaterWork(TreeNode cur, int cumulate) {
    if (cur.left == null && cur.right == null) {
        cur.val += cumulate;
        return cur.val;
    }
    if (cur.right != null) {
        cumulate = greaterWork(cur.right, cumulate);
    }
    cur.val += cumulate;
    cumulate = cur.val;
    if (cur.left != null) {
        cumulate = greaterWork(cur.left, cumulate);
    }
    return cumulate;
}
```

## Method 2

We can refine this code by changing the termination condition: recurse until `node == null` instead of stopping at leaf nodes.

```
public TreeNode convertBST(TreeNode root) {
    if (root == null) {
        return root;
    }
    greaterWork(root, 0);
    return root;
}

public int greaterWork(TreeNode cur, int cumulate) {
    if (cur == null) {
        return cumulate;
    }
    cumulate = greaterWork(cur.right, cumulate);
    cur.val += cumulate;
    cumulate = cur.val;
    cumulate = greaterWork(cur.left, cumulate);
    return cumulate;
}
```

## Method 3

We can also keep the running sum in a field shared across calls instead of passing it as a parameter:

```
private int sum = 0;

public TreeNode convertBST(TreeNode root) {
    if (root == null) {
        return root;
    }
    convertBST(root.right);
    root.val += sum;
    sum = root.val;
    convertBST(root.left);
    return root;
}
```

# 669. Trim a Binary Search Tree

Given the root of a binary search tree and the lowest and highest boundaries as low and high, trim the tree so that all its elements lie in [low, high]. Trimming the tree should not change the relative structure of the elements that will remain in the tree (i.e., any node's descendant should remain a descendant). It can be proven that there is a unique answer.

Return the root of the trimmed binary search tree. Note that the root may change depending on the given bounds.
Example 1:
Input: root = [1,0,2], low = 1, high = 2
Output: [1,null,2]

Example 2:
Input: root = [3,0,4,null,2,null,null,1], low = 1, high = 3
Output: [3,2,null,1]

Constraints:
The number of nodes in the tree is in the range [1, 10^4].
0 <= Node.val <= 10^4
The value of each node in the tree is unique.
root is guaranteed to be a valid binary search tree.
0 <= low <= high <= 10^4

## Wrong Code Here

### Debugging

```
class Solution {
    /**
     find low, find high
     1. parent link
     2. child link
     a. low left child: parent drop
     b. low right child: parent fill in
     a. high left child: fill in
     b. high right child: drop
    */
    // TreeNode cur;
    TreeNode parent;
    TreeNode preParent;
    boolean isLeft;

    public TreeNode trimBST(TreeNode root, int low, int high) {
        TreeNode cur = root;
        parent = root;
        findTarget(cur, low);
        System.out.println(parent.val);
        System.out.println(cur.val);
        if (cur.val == root.val) {
            root.left = null;
        }
        if (cur != null) {
            System.out.println("not null");
            if (cur.right == null) {
                if (isLeft) {
                    preParent.left = null;
                } else {
                    preParent.right = null;
                }
            } else {
                System.out.println("low right not null");
                if (isLeft) {
                    preParent.left = cur.right;
                } else {
                    preParent.right = cur.right;
                }
            }
        }
        cur = root;
        parent = root;
        findTarget(cur, high);
        System.out.println(parent.val);
        System.out.println(cur.val);
        if (cur == root) {
            cur.right = null;
        }
        if (cur != null) {
            if (cur.left == null) {
                if (isLeft) {
                    parent.left = null;
                } else {
                    parent.right = null;
                }
            } else {
                if (isLeft) {
                    parent.left = cur.left;
                } else {
                    parent.right = cur.left;
                }
            }
        }
        return root;
    }

    public TreeNode findTarget(TreeNode cur, int target) {
        if (cur == null) {
            return null;
        }
        TreeNode result = findTarget(cur.left, target);
        if (result != null) {
            return result;
        }
        if (cur.val == target) {
            return cur;
        }
        preParent = parent;
        parent = cur;
        return findTarget(cur.right, target);
    }
}
```
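For reference, the commonly used recursive approach to this problem (not the debugging attempt above) avoids all parent bookkeeping by letting each call return the already-trimmed subtree: when a node's value is out of range, one entire side of it is also out of range and can be skipped in a single step. A sketch, with a minimal `TreeNode` included so it is self-contained:

```java
// Trim a BST to [low, high] by returning the trimmed subtree from each call.
public class TrimBST {
    public static class TreeNode {
        public int val;
        public TreeNode left, right;
        public TreeNode(int v) { val = v; }
    }

    public static TreeNode trimBST(TreeNode root, int low, int high) {
        if (root == null) {
            return null;
        }
        // Root too small: its whole left subtree is also too small,
        // so the answer lives entirely in the right subtree.
        if (root.val < low) {
            return trimBST(root.right, low, high);
        }
        // Root too large: symmetrically, the answer is in the left subtree.
        if (root.val > high) {
            return trimBST(root.left, low, high);
        }
        // Root is in range: keep it and trim both children.
        root.left = trimBST(root.left, low, high);
        root.right = trimBST(root.right, low, high);
        return root;
    }
}
```

On Example 1 (`root = [1,0,2], low = 1, high = 2`), node 0 is below the range, so the root's left link is replaced by the (empty) trimmed right subtree of 0, giving `[1,null,2]`.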
flame_chan_llll