id int64 5 1.93M | title stringlengths 0 128 | description stringlengths 0 25.5k | collection_id int64 0 28.1k | published_timestamp timestamp[s] | canonical_url stringlengths 14 581 | tag_list stringlengths 0 120 | body_markdown stringlengths 0 716k | user_username stringlengths 2 30 |
|---|---|---|---|---|---|---|---|---|
1,863,052 | 2597. The Number of Beautiful Subsets | 2597. The Number of Beautiful Subsets Medium You are given an array nums of positive integers and a... | 27,523 | 2024-05-23T17:06:04 | https://dev.to/mdarifulhaque/2597-the-number-of-beautiful-subsets-ih5 | php, leetcode, algorithms, programming | 2597\. The Number of Beautiful Subsets
Medium
You are given an array nums of positive integers and a positive integer k.
A subset of nums is beautiful if it does not contain two integers with an absolute difference equal to k.
Return the number of non-empty beautiful subsets of the array nums.
A subset of nums is an array that can be obtained by deleting some (possibly none) elements from nums. Two subsets are different if and only if the chosen indices to delete are different.
**Example 1:**
- **Input:** nums = [2,4,6], k = 2
- **Output:** 4
- **Explanation:** The beautiful subsets of the array nums are: [2], [4], [6], [2, 6].
It can be proved that there are only 4 beautiful subsets in the array [2,4,6].
**Example 2:**
- **Input:** nums = [1], k = 1
- **Output:** 1
- **Explanation:** The beautiful subset of the array nums is [1].
It can be proved that there is only 1 beautiful subset in the array [1].
**Example 3:**
- **Input:** nums = [4,2,5,9,10,3], k = 1
- **Output:** 23
**Constraints:**
- <code>1 <= nums.length <= 20</code>
- <code>1 <= nums[i], k <= 1000</code>
**Solution:**
```php
class Solution {
    /**
     * @param Integer[] $nums
     * @param Integer $k
     * @return Integer
     */
    function beautifulSubsets($nums, $k) {
        $countBeautifulSubsets = -1; // start at -1 so the empty subset is not counted
        $countNums = array_fill(0, 1010, 0);
        $size = count($nums);
        $dfs = function($index) use (&$dfs, &$countBeautifulSubsets, &$nums, &$countNums, $size, $k) {
            if ($index >= $size) {
                ++$countBeautifulSubsets;
                return;
            }
            // Branch 1: skip nums[index]
            $dfs($index + 1);
            // Branch 2: take nums[index] only if neither nums[index] + k nor nums[index] - k is already chosen
            $isBeautifulIncrement = $nums[$index] + $k >= 1010 || $countNums[$nums[$index] + $k] == 0;
            $isBeautifulDecrement = $nums[$index] - $k < 0 || $countNums[$nums[$index] - $k] == 0;
            if ($isBeautifulIncrement && $isBeautifulDecrement) {
                ++$countNums[$nums[$index]];
                $dfs($index + 1);
                --$countNums[$nums[$index]];
            }
        };
        $dfs(0);
        return $countBeautifulSubsets;
    }
}
```
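For readers who prefer something other than PHP, here is an illustrative TypeScript port of the same backtracking idea (not part of the original submission; it uses a `Map` instead of a fixed-size counting array):

```typescript
function beautifulSubsets(nums: number[], k: number): number {
  const counts = new Map<number, number>();
  let total = -1; // start at -1 so the empty subset is not counted

  const dfs = (index: number): void => {
    if (index === nums.length) {
      total++;
      return;
    }
    dfs(index + 1); // branch 1: skip nums[index]
    const v = nums[index];
    const conflict =
      (counts.get(v + k) ?? 0) > 0 || (counts.get(v - k) ?? 0) > 0;
    if (!conflict) {
      counts.set(v, (counts.get(v) ?? 0) + 1);
      dfs(index + 1); // branch 2: take nums[index]
      counts.set(v, (counts.get(v) ?? 0) - 1);
    }
  };

  dfs(0);
  return total;
}
```

The structure mirrors the PHP solution: every leaf of the recursion corresponds to one beautiful subset, and the `-1` initialization cancels out the empty one.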
**Contact Links**
- **[LinkedIn](https://www.linkedin.com/in/arifulhaque/)**
- **[GitHub](https://github.com/mah-shamim)** | mdarifulhaque |
1,863,051 | Buy Verified Paxful Account | https://dmhelpshop.com/product/buy-verified-paxful-account/ Buy Verified Paxful Account There are... | 0 | 2024-05-23T17:04:21 | https://dev.to/mcynthiahortonotyu/buy-verified-paxful-account-4e9b | webdev, javascript, beginners, programming | ERROR: type should be string, got "https://dmhelpshop.com/product/buy-verified-paxful-account/\n\n\nBuy Verified Paxful Account\nThere are several compelling reasons to consider purchasing a verified Paxful account. Firstly, a verified account offers enhanced security, providing peace of mind to all users. Additionally, it opens up a wider range of trading opportunities, allowing individuals to partake in various transactions, ultimately expanding their financial horizons.\n\nMoreover, Buy verified Paxful account ensures faster and more streamlined transactions, minimizing any potential delays or inconveniences. Furthermore, by opting for a verified account, users gain access to a trusted and reputable platform, fostering a sense of reliability and confidence.\n\nLastly, Paxful’s verification process is thorough and meticulous, ensuring that only genuine individuals are granted verified status, thereby creating a safer trading environment for all users. Overall, the decision to Buy Verified Paxful account can greatly enhance one’s overall trading experience, offering increased security, access to more opportunities, and a reliable platform to engage with. Buy Verified Paxful Account.\n\nBuy US verified paxful account from the best place dmhelpshop\nWhy we declared this website as the best place to buy US verified paxful account? Because, our company is established for providing the all account services in the USA (our main target) and even in the whole world. With this in mind we create paxful account and customize our accounts as professional with the real documents. Buy Verified Paxful Account.\n\nIf you want to buy US verified paxful account you should have to contact fast with us. 
Because our accounts are-\n\nEmail verified\nPhone number verified\nSelfie and KYC verified\nSSN (social security no.) verified\nTax ID and passport verified\nSometimes driving license verified\nMasterCard attached and verified\nUsed only genuine and real documents\n100% access of the account\nAll documents provided for customer security\nWhat is Verified Paxful Account?\nIn today’s expanding landscape of online transactions, ensuring security and reliability has become paramount. Given this context, Paxful has quickly risen as a prominent peer-to-peer Bitcoin marketplace, catering to individuals and businesses seeking trusted platforms for cryptocurrency trading.\n\nIn light of the prevalent digital scams and frauds, it is only natural for people to exercise caution when partaking in online transactions. As a result, the concept of a verified account has gained immense significance, serving as a critical feature for numerous online platforms. Paxful recognizes this need and provides a safe haven for users, streamlining their cryptocurrency buying and selling experience.\n\nFor individuals and businesses alike, Buy verified Paxful account emerges as an appealing choice, offering a secure and reliable environment in the ever-expanding world of digital transactions. Buy Verified Paxful Account.\n\nVerified Paxful Accounts are essential for establishing credibility and trust among users who want to transact securely on the platform. They serve as evidence that a user is a reliable seller or buyer, verifying their legitimacy.\n\nBut what constitutes a verified account, and how can one obtain this status on Paxful? In this exploration of verified Paxful accounts, we will unravel the significance they hold, why they are crucial, and shed light on the process behind their activation, providing a comprehensive understanding of how they function. 
Buy verified Paxful account.\n\n \n\nWhy should to Buy Verified Paxful Account?\nThere are several compelling reasons to consider purchasing a verified Paxful account. Firstly, a verified account offers enhanced security, providing peace of mind to all users. Additionally, it opens up a wider range of trading opportunities, allowing individuals to partake in various transactions, ultimately expanding their financial horizons.\n\nMoreover, a verified Paxful account ensures faster and more streamlined transactions, minimizing any potential delays or inconveniences. Furthermore, by opting for a verified account, users gain access to a trusted and reputable platform, fostering a sense of reliability and confidence. Buy Verified Paxful Account.\n\nLastly, Paxful’s verification process is thorough and meticulous, ensuring that only genuine individuals are granted verified status, thereby creating a safer trading environment for all users. Overall, the decision to buy a verified Paxful account can greatly enhance one’s overall trading experience, offering increased security, access to more opportunities, and a reliable platform to engage with.\n\n \n\nWhat is a Paxful Account\nPaxful and various other platforms consistently release updates that not only address security vulnerabilities but also enhance usability by introducing new features. Buy Verified Paxful Account.\n\nIn line with this, our old accounts have recently undergone upgrades, ensuring that if you purchase an old buy Verified Paxful account from dmhelpshop.com, you will gain access to an account with an impressive history and advanced features. This ensures a seamless and enhanced experience for all users, making it a worthwhile option for everyone.\n\n \n\nIs it safe to buy Paxful Verified Accounts?\nBuying on Paxful is a secure choice for everyone. However, the level of trust amplifies when purchasing from Paxful verified accounts. 
These accounts belong to sellers who have undergone rigorous scrutiny by Paxful. Buy verified Paxful account, you are automatically designated as a verified account. Hence, purchasing from a Paxful verified account ensures a high level of credibility and utmost reliability. Buy Verified Paxful Account.\n\nPAXFUL, a widely known peer-to-peer cryptocurrency trading platform, has gained significant popularity as a go-to website for purchasing Bitcoin and other cryptocurrencies. It is important to note, however, that while Paxful may not be the most secure option available, its reputation is considerably less problematic compared to many other marketplaces. Buy Verified Paxful Account.\n\nThis brings us to the question: is it safe to purchase Paxful Verified Accounts? Top Paxful reviews offer mixed opinions, suggesting that caution should be exercised. Therefore, users are advised to conduct thorough research and consider all aspects before proceeding with any transactions on Paxful.\n\n \n\nHow Do I Get 100% Real Verified Paxful Accoun?\nPaxful, a renowned peer-to-peer cryptocurrency marketplace, offers users the opportunity to conveniently buy and sell a wide range of cryptocurrencies. Given its growing popularity, both individuals and businesses are seeking to establish verified accounts on this platform.\n\nHowever, the process of creating a verified Paxful account can be intimidating, particularly considering the escalating prevalence of online scams and fraudulent practices. This verification procedure necessitates users to furnish personal information and vital documents, posing potential risks if not conducted meticulously.\n\nIn this comprehensive guide, we will delve into the necessary steps to create a legitimate and verified Paxful account. 
Our discussion will revolve around the verification process and provide valuable tips to safely navigate through it.\n\nMoreover, we will emphasize the utmost importance of maintaining the security of personal information when creating a verified account. Furthermore, we will shed light on common pitfalls to steer clear of, such as using counterfeit documents or attempting to bypass the verification process.\n\nWhether you are new to Paxful or an experienced user, this engaging paragraph aims to equip everyone with the knowledge they need to establish a secure and authentic presence on the platform.\n\nBenefits Of Verified Paxful Accounts\nVerified Paxful accounts offer numerous advantages compared to regular Paxful accounts. One notable advantage is that verified accounts contribute to building trust within the community.\n\nVerification, although a rigorous process, is essential for peer-to-peer transactions. This is why all Paxful accounts undergo verification after registration. When customers within the community possess confidence and trust, they can conveniently and securely exchange cash for Bitcoin or Ethereum instantly. Buy Verified Paxful Account.\n\nPaxful accounts, trusted and verified by sellers globally, serve as a testament to their unwavering commitment towards their business or passion, ensuring exceptional customer service at all times. Headquartered in Africa, Paxful holds the distinction of being the world’s pioneering peer-to-peer bitcoin marketplace. Spearheaded by its founder, Ray Youssef, Paxful continues to lead the way in revolutionizing the digital exchange landscape.\n\nPaxful has emerged as a favored platform for digital currency trading, catering to a diverse audience. One of Paxful’s key features is its direct peer-to-peer trading system, eliminating the need for intermediaries or cryptocurrency exchanges. 
By leveraging Paxful’s escrow system, users can trade securely and confidently.\n\nWhat sets Paxful apart is its commitment to identity verification, ensuring a trustworthy environment for buyers and sellers alike. With these user-centric qualities, Paxful has successfully established itself as a leading platform for hassle-free digital currency transactions, appealing to a wide range of individuals seeking a reliable and convenient trading experience. Buy Verified Paxful Account.\n\n \n\nHow paxful ensure risk-free transaction and trading?\nEngage in safe online financial activities by prioritizing verified accounts to reduce the risk of fraud. Platforms like Paxfu implement stringent identity and address verification measures to protect users from scammers and ensure credibility.\n\nWith verified accounts, users can trade with confidence, knowing they are interacting with legitimate individuals or entities. By fostering trust through verified accounts, Paxful strengthens the integrity of its ecosystem, making it a secure space for financial transactions for all users. Buy Verified Paxful Account.\n\nExperience seamless transactions by obtaining a verified Paxful account. Verification signals a user’s dedication to the platform’s guidelines, leading to the prestigious badge of trust. This trust not only expedites trades but also reduces transaction scrutiny. Additionally, verified users unlock exclusive features enhancing efficiency on Paxful. Elevate your trading experience with Verified Paxful Accounts today.\n\nIn the ever-changing realm of online trading and transactions, selecting a platform with minimal fees is paramount for optimizing returns. This choice not only enhances your financial capabilities but also facilitates more frequent trading while safeguarding gains. Buy Verified Paxful Account.\n\nExamining the details of fee configurations reveals Paxful as a frontrunner in cost-effectiveness. 
Acquire a verified level-3 USA Paxful account from usasmmonline.com for a secure transaction experience. Invest in verified Paxful accounts to take advantage of a leading platform in the online trading landscape.\n\n \n\nHow Old Paxful ensures a lot of Advantages?\n\nExplore the boundless opportunities that Verified Paxful accounts present for businesses looking to venture into the digital currency realm, as companies globally witness heightened profits and expansion. These success stories underline the myriad advantages of Paxful’s user-friendly interface, minimal fees, and robust trading tools, demonstrating its relevance across various sectors.\n\nBusinesses benefit from efficient transaction processing and cost-effective solutions, making Paxful a significant player in facilitating financial operations. Acquire a USA Paxful account effortlessly at a competitive rate from usasmmonline.com and unlock access to a world of possibilities. Buy Verified Paxful Account.\n\nExperience elevated convenience and accessibility through Paxful, where stories of transformation abound. Whether you are an individual seeking seamless transactions or a business eager to tap into a global market, buying old Paxful accounts unveils opportunities for growth.\n\nPaxful’s verified accounts not only offer reliability within the trading community but also serve as a testament to the platform’s ability to empower economic activities worldwide. Join the journey towards expansive possibilities and enhanced financial empowerment with Paxful today. Buy Verified Paxful Account.\n\n \n\nWhy paxful keep the security measures at the top priority?\nIn today’s digital landscape, security stands as a paramount concern for all individuals engaging in online activities, particularly within marketplaces such as Paxful. 
It is essential for account holders to remain informed about the comprehensive security protocols that are in place to safeguard their information.\n\nSafeguarding your Paxful account is imperative to guaranteeing the safety and security of your transactions. Two essential security components, Two-Factor Authentication and Routine Security Audits, serve as the pillars fortifying this shield of protection, ensuring a secure and trustworthy user experience for all. Buy Verified Paxful Account.\n\nConclusion\nInvesting in Bitcoin offers various avenues, and among those, utilizing a Paxful account has emerged as a favored option. Paxful, an esteemed online marketplace, enables users to engage in buying and selling Bitcoin. Buy Verified Paxful Account.\n\nThe initial step involves creating an account on Paxful and completing the verification process to ensure identity authentication. Subsequently, users gain access to a diverse range of offers from fellow users on the platform. Once a suitable proposal captures your interest, you can proceed to initiate a trade with the respective user, opening the doors to a seamless Bitcoin investing experience.\n\nIn conclusion, when considering the option of purchasing verified Paxful accounts, exercising caution and conducting thorough due diligence is of utmost importance. It is highly recommended to seek reputable sources and diligently research the seller’s history and reviews before making any transactions.\n\nMoreover, it is crucial to familiarize oneself with the terms and conditions outlined by Paxful regarding account verification, bearing in mind the potential consequences of violating those terms. By adhering to these guidelines, individuals can ensure a secure and reliable experience when engaging in such transactions. Buy Verified Paxful Account.\n\nContact Us / 24 Hours Reply\nTelegram:dmhelpshop\nWhatsApp: +1 (980) 277-2786\nSkype:dmhelpshop\nEmail:dmhelpshop@gmail.com" | mcynthiahortonotyu |
1,863,050 | TypeScript, Understanding the code you write | TypeScript has given the frontend a powerful chance to deliver reliable and maintainable code. The... | 0 | 2024-05-23T17:02:24 | https://dev.to/zeyadetman/typescript-understanding-the-code-you-write-12eg | typescript, javascript, webdev, frontend |
TypeScript has given the frontend a powerful chance to deliver reliable and maintainable code. The more you correctly type your code, the more you fall into its strengths and importance. In this post, I'll mention some TypeScript tips that you may or may not know about.
## How does TypeScript work?
The first thing you should know is **how TypeScript works**.
<figure style="text-align: center;">

<figcaption>
[Source](https://gist.github.com/zeyadetman/e2435ca1e2e0fe78d3dc981250e42499)
</figcaption>
</figure>
Browsers are designed to only understand JavaScript, just as computers are designed to only understand zeros and ones. In the case of computers, we compile high-level code to machine code and then to zeros and ones so the computer can understand it. Similarly, we transpile or translate TypeScript code to JavaScript code to be understood by browsers and other engines that only know JavaScript.
To do this, we typically use the popular `tsc` compiler. You can also check out the [awesome-typescript-compilers](https://github.com/JohnDeved/awesome-typescript-compilers?tab=readme-ov-file) repository to learn about the alternatives.
For example, code like this:
```ts
type animals = "cat" | "dog";
const getAnimal = (animal: animals) => animal;
getAnimal("cat");
```
Is translated to this JavaScript code, so browsers can understand it:
```js
const getAnimal = (animal) => animal;
getAnimal("cat");
```
**Note: TypeScript code is eliminated in the transpiling process.**
This leads us to an **important conclusion**: once you realize it, you'll notice a lot about TypeScript concepts and why they've been written in their particular way.
**Conclusion: As TypeScript does not run on browsers, it only runs at compile time, not at runtime.**
> **Compile time** is the time when your code is being built, before it runs and becomes interactive. **Run time** is when your code is running and interacting with the client.
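To see this erasure in action, here is a small sketch (the JSON payload and variable name are purely illustrative): a type assertion compiles away entirely, so a wrong shape is never caught at runtime by TypeScript itself.

```typescript
// The `as` assertion exists only at compile time; nothing checks it when the code runs.
const data: unknown = JSON.parse('{"name": 123}');
const userName = (data as { name: string }).name; // compiles fine
// At runtime the annotation is gone and the value is still a number:
// typeof userName === "number"
```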
### Differences in Action
Let's compare a language whose types exist at runtime, like C#, with TypeScript, which checks types only at compile time:
#### Function overloading
> Some JavaScript functions can be called in a variety of argument counts and types. For example, you might write a function to produce a `Date` that takes either a timestamp (one argument) or a month/day/year specification (three arguments).
> [Resource](https://www.typescriptlang.org/docs/handbook/2/functions.html#function-overloads)
In C#, we can achieve function overloading by declaring multiple method signatures, each with its own body. This is possible because C# type information survives compilation: each overload becomes a separate method, so multiple bodies for the same name can coexist:
```c#
void display() { ... }
void display(int a) { ... }
float display(double a) { ... }
float display(int a, float b) { ... }
```
However, in TypeScript the types are erased before runtime, so there are no runtime type checks, and plain JavaScript cannot have multiple bodies for the same function. Instead, we declare several overload signatures followed by a single implementation:
```ts
function makeDate(timestamp: number): Date;
function makeDate(m: number, d: number, y: number): Date;
function makeDate(m: number, d: string, y: number): string;
function makeDate(
  mOrTimestamp: number,
  d?: number | string,
  y?: number
): Date | string {
  if (typeof d === "string") {
    return "hello";
  }
  if (d !== undefined && y !== undefined) {
    return new Date(y, mOrTimestamp, d);
  } else {
    return new Date(mOrTimestamp);
  }
}
const d1 = makeDate(1234);
const d2 = makeDate(5, 5, 5);
```
This will be compiled to this Javascript code:
```js
function makeDate(mOrTimestamp, d, y) {
  if (typeof d === "string") {
    return "hello";
  }
  if (d !== undefined && y !== undefined) {
    return new Date(y, mOrTimestamp, d);
  } else {
    return new Date(mOrTimestamp);
  }
}
const d1 = makeDate(1234);
const d2 = makeDate(5, 5, 5);
```
There are libraries that perform runtime type checks for JavaScript; one of the most popular is [zod](https://zod.dev/), so give it a try.
## Narrowing (make it more specific)
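Narrowing means using a runtime check, such as `typeof`, so that TypeScript can refine a union to a more specific member inside each branch. A minimal sketch (the `describe` helper is just for illustration):

```typescript
function describe(value: string | number): string {
  if (typeof value === "string") {
    // here `value` is narrowed to string
    return value.toUpperCase();
  }
  // here `value` is narrowed to number
  return value.toFixed(2);
}
```

Other narrowing tools include `instanceof`, the `in` operator, and user-defined type predicates (`value is Foo`).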
## `keyof` and `typeof`
- `keyof`, as its name suggests, is used to extract the keys of a type. Since objects are the only data type that contains keys, `keyof` extracts the keys of an object type as a **union**:
```ts
type obj = { [n: number]: string };
type A = keyof obj; // number
```
- `typeof`, as its name suggests, returns the `type` of a variable. Simple!
```ts
let str = "Alia";
type a = typeof str; // string
const str2 = "Alia";
type b = typeof str2; // "Alia"
```
**Important note: I used `let` here instead of `const`. If I used `const`, then the type would be the specific string value itself.**
Let's merge both
```ts
const obj = {
  id: 1,
  name: "Alia",
}; // javascript variable no types
type typeOfObj = typeof obj; // type of the javascript object
type keysOfObj = keyof typeOfObj; // id | name
type keysOfObjOneLine = keyof typeof obj;
```
Notice here
- `keyof` extracts keys of type (object).
- `typeof` extracts the type of variable.
_keep this in your mind for the mapped types section._
## `in` operator and `extends`
Similar to the previous section, the `in` operator can be used to constrain keys to a union of types:
```ts
type keysOfObj = "id" | "name";
type obj = { [key in keysOfObj]: string };
// type obj = {
//   id: string;
//   name: string;
// }
```
But we need to modify the code so that the `id` key is typed as `number`, not `string`.
- `extends`: think of it as the `if/else` of TypeScript's type system.
```ts
type keysOfObj = "id" | "name";
type obj = { [key in keysOfObj]: key extends "id" ? number : string };
// That's it
```
_This is called a mapped type._
## Generics
Think of it as a function in JavaScript: if you have a type that you keep repeating with slight changes, you probably need `generics`.
```ts
interface BasicType<T> {
  input: T;
}
const x: BasicType<string> = { input: "test" };
const y: BasicType<number> = { input: "test" }; // wrong
```
```ts
// Step 1: no types at all
function basic(a, b) {
  return { a, b };
}

// Step 2: declare the type parameters
function basic<A, B>(a, b) {
  return { a, b };
}

// Step 3: use them to type the arguments and the return value
function basic<A, B>(a: A, b: B): { a: A; b: B } {
  return { a, b };
}

basic<string, number>("hello", 2);
```
we can still use `extends` with generics to give a powerful type.
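For example, a constraint like `T extends { length: number }` restricts which types a caller may pass (the `longest` helper below is just an illustrative sketch):

```typescript
// Only values that carry a numeric `length` property are accepted.
function longest<T extends { length: number }>(a: T, b: T): T {
  return a.length >= b.length ? a : b;
}

const word = longest("alice", "bob");   // T inferred as string
const arr = longest([1, 2, 3], [1]);    // T inferred as number[]
// longest(10, 20); // error: number has no `length` property
```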
## Typescript Coverage
In addition to adding TypeScript rules to your project, it's important to track your TypeScript type coverage. Type coverage helps identify variables that are missing types.
You might think it's obvious if a variable has a type, but when using third-party libraries, classes, or functions, it's easy to miss some. Not everything, such as query parameters, will be tracked by the TypeScript coverage library, but it definitely helps.
I suggest using [typescript-coverage-report](https://www.npmjs.com/package/typescript-coverage-report) for this purpose. If you know of a better option, feel free to share it in the comments below.
Resources:
- [TypeScript Documentation](https://www.typescriptlang.org/docs/)
- [type-challenges](https://github.com/type-challenges/type-challenges)
- [typehero.dev](https://typehero.dev/)
[Find more... ](https://notes.zeyadetman.com/programming-languages/typescript)
---
_While writing this post, civilians and children in Gaza are being killed in the ongoing genocide. It's important to be informed about what is happening in Palestine and to consider the humanitarian aspects of the situation. Please take the time to research, and understand the events and their impact on palestinian's lives. [know more](https://blog.paulbiggar.com/i-cant-sleep/)_
_Originally Published on [https://zeyadetman.com/posts/typescript-understand-the-code-you-write](https://zeyadetman.com/posts/typescript-understand-the-code-you-write)_ | zeyadetman |
1,863,049 | The Power of Cryptocurrency: Insights from Cardano’s Founder Charles Hoskinson | In a world where the use of digital currencies is becoming increasingly popular, China is trying to... | 0 | 2024-05-23T17:01:45 | https://36crypto.com/the-power-of-cryptocurrency-insights-from-cardanos-founder-charles-hoskinson/ | cryptocurrency, news | In a world where the use of digital currencies is becoming increasingly popular, China is trying to become a leader by actively introducing the digital yuan. However, citizens are in no hurry to use the currency. Why is this happening? Let’s take a closer look at it.
**Chinese Workers Prefer Fiat Money**
The Chinese government is actively trying to introduce a digital yuan. According to the South China Morning Post, some Chinese cities have started paying CBDC to the country’s civil servants. However, a large number of employees are still hesitant to use them for daily transactions and convert them directly into cash.
Sammy Lin, a customer service manager at a Chinese state-owned bank in Suzhou, explains that she does not keep funds in the e-CNY app because it does not charge interest for storage. She also notes that there are currently not many places online or offline where you can use the digital yuan.
Although China has been a “functionally cashless” society for more than a decade, many citizens are still hesitant to use a purely digital currency due to concerns about traceability and limited use cases.
But despite such concerns, according to Yi Gang, former governor of the People’s Bank of China, more than $250 million worth of transactions were made using the digital yuan in July 2023. He also emphasized that the digital yuan is able to “fully protect privacy” due to the so-called “controlled anonymity,” which means that there is no tracking of small payments, but some tracking of large payments is possible.
Charles Hoskinson, the founder of Cardano, emphasizes the importance of cryptocurrency in the modern world. On May 11, he reminded on his social media page that the fundamental concept of cryptocurrency is the creation of new social contracts. Hoskinson emphasized that it is these contracts that will make the government accountable to the people and urged the industry to focus on promoting this goal.
He also warned that anyone who opposes the spread of cryptocurrencies unknowingly supports the concentration of power in the hands of a few.
_“Crypto gives us our voices, financial freedom, and shared humanity back.”_ Hoskinson [remarked](https://x.com/iohk_charles/status/1789403123804688520?s=46&t=aw6ZR-6aD050XLPVXY8AQA).
**European Crypto Industry Development**
The crypto industry is gaining significant growth in Europe. European countries are actively working to create a legal environment to ensure investor protection and financial system stability. For example, Switzerland is known for its “crypto valley” in Zurich, where numerous blockchain companies and startups are based. Malta has adopted a number of laws aimed at regulating cryptocurrencies and ensuring their legality and transparency.
European countries are constantly working on the development of new blockchain solutions and cryptocurrency payment systems. European countries are also actively exploring the possibilities of integrating cryptocurrencies into their financial systems. For example, Switzerland was one of the first countries to [introduce](https://www.bloomberg.com/news/articles/2023-12-06/taxpayers-in-swiss-city-can-settle-bills-with-bitcoin-tether) the possibility of paying taxes in Bitcoin.
Many countries have long accepted cryptocurrency as a means of payment, and Ukraine is no exception. Ukraine is intensively expanding the crypto sphere, attracting investors, startups, and regulators to create a favorable landscape for this spectrum.
Ukrainians have long had the opportunity to pay for purchases in grocery stores using payment solutions from [Corefy](https://corefy.com/en-ua), [Switchere](https://switchere.com/accept-crypto), [Whitepay](https://whitepay.com/), and many others. Moreover, the first cryptocurrency [transaction](https://gncrypto.news/news/ukraines-first-electric-vehicle-purchase-with-crypto/) for the purchase of an electric car has recently taken place in Ukraine. This step demonstrates the significant development and potential of the industry.
**Summary**
The introduction of the digital yuan in China faces a number of challenges, but despite the difficulties, the government continues to promote this initiative. At the same time, the development of the crypto industry in Europe is showing remarkable progress, creating new opportunities for businesses and citizens. The involvement of cryptocurrencies in everyday financial transactions, including charity and shopping, demonstrates the importance of digital assets for economic development. As Charles Hoskinson has aptly noted, the rise of cryptocurrencies is not only shifting financial paradigms, but also has the potential to redefine social contracts and empower people in unprecedented ways. | hryniv_vlad |
1,863,048 | autollama | From the ollama help discord channel: Docker compose: How do you pull a model automatically with... | 0 | 2024-05-23T17:01:36 | https://dev.to/spara_50/autollama-4mi8 | From the [ollama help discord](https://discord.com/invite/ollama) channel:
> Docker compose: How do you pull a model automatically with container creation?
The conundrum of containerizing Ollama is that it must be running to pull a model. If you run ollama using `docker compose`, it doesn't provide a way to pull a model from the ollama registry. What's a budding generative AI nerd to do?
The answer is more `docker compose`, as in `docker compose exec`, which runs a script to download a model. Add both commands to a startup script, and you have [autollama](https://github.com/spara/autollama).
```bash
# start the ollama service in the background
docker compose up -d
# give the server a moment to come up
sleep 5
# run the pull script inside the running container
docker compose exec autollama sh /root/.ollama/pull_model.sh
```
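For reference, here's a minimal sketch of what the compose file might look like; the service name `autollama`, the volume layout, and the model pulled by `pull_model.sh` are assumptions, so check the repo for the real configuration.

```yaml
# Hypothetical sketch; the actual autollama repo may differ.
services:
  autollama:
    image: ollama/ollama
    ports:
      - "11434:11434"
    volumes:
      # pull_model.sh (e.g. a one-liner like `ollama pull llama3`)
      # lives alongside the model store so `docker compose exec` can find it
      - ./ollama:/root/.ollama
```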
Come and get your ollama love [here](https://github.com/spara/autollama). | spara_50 | |
1,863,002 | sCrypt Hackathon 2024 Winners Announced! | sCrypt Hackathon 2024 Winners Announced! The sCrypt community is thrilled to announce the winners of... | 0 | 2024-05-23T17:00:41 | https://dev.to/bitruslukag/scrypt-hackathon-2024-winners-announced-127i | **sCrypt Hackathon 2024 Winners Announced!**
The sCrypt community is thrilled to announce the winners of the 2024 sCrypt Hackathon, held from March 25 to April 25, 2024!
This year's theme revolved around merging sCrypt (Bitcoin smart contracts) with Ordinals (Bitcoin tokens) to pioneer innovative solutions in the blockchain space. We received a fantastic response from developers worldwide, with a wide range of creative projects submitted.
The winners were chosen through a combination of judge evaluation and community votes. You can explore all the amazing projects on Devfolio: [scrypt.devfolio.co/projects](https://scrypt.devfolio.co/projects)
And the ten best selected projects are...
1. Grand Prize Winner ($5,000): Gassed-Up
2. The Bitcoin CPU
3. Satoshi Dragons
4. Block Trust
5. OneSatRollup
6. mediumSv
7. Smart Ordinals
8. Autochaintrade App
9. Facial-Identity-Verification
10. Auoz
We're especially excited to congratulate [Gassed-Up](https://devfolio.co/projects/gassedup-08ec),
the project taking home the Grand Prize of $5,000! Their innovative approach exemplifies the potential of sCrypt to revolutionize everyday processes.
Congratulations to all the participants for their innovative projects and dedication to advancing the sCrypt ecosystem. We're confident these projects will continue to develop and contribute significantly to the growth of the sCrypt ecosystem.
**Summary of the Ten Best Projects Developed During the sCrypt Hackathon 2024**
1. **Gassed-Up**
**Description:** A gas pump simulator optimizing cost, trust, and customer experience at the pump with a Bitcoin smart contract.

**Project Link:** [Gassed-Up](https://devfolio.co/projects/gassedup-08ec)
2. **The Bitcoin CPU**
**Description:** Executes code with the Everett CPU architecture on Bitcoin.

**Project Link:** [The Bitcoin CPU](https://devfolio.co/projects/the-bitcoin-cpu-cfe7)
3. **Satoshi Dragons**
**Description:** A smart contract-enforced NFT game on Bitcoin.

**Project Link:** [Satoshi Dragons](https://devfolio.co/projects/satoshi-dragons-08f5)
4. **BlockTrust: Real Estate Management System**
**Description:** A Blockchain Based Real Estate Management System on Bitcoin SV Blockchain.

**Project Link:** [BlockTrust](https://devfolio.co/projects/blocktrust-32fc)
5. **OneSatRollup**
**Description:** Zero knowledge rollups for Onesat Ordinals and other transaction chains.

**Project Link:** [OneSatRollup](https://devfolio.co/projects/onesatrollup-cbf0)
6. **mediumSv: Blockchain Content Rights Platform**
**Description:** Ensures content authenticity and ownership, combating plagiarism and unauthorized use while providing transparent, immutable records of creative works.

**Project Link:** [MediumSv](https://devfolio.co/projects/mediumsv-blockchain-content-rights-platform-5946)
7. **Smart Ordinals**
**Description:** Demonstrates the potential of turning a stateless UTXO script into a stateful one to leverage the benefits of stateful UTXO scripts.

**Project Link:** [Smart Ordinals](https://devfolio.co/projects/smart-ordinals-8034)
8. **Autochaintrade App**
**Description:** A decentralized automotive trading platform utilizing Ordinal NFTs for secure and transparent vehicle transactions.

**Project Link:** [Autochaintrade App](https://devfolio.co/projects/autochaintrade-app-2308)
9. **Facial-Identity-Verification**
**Description:** A groundbreaking platform where cutting-edge technology meets the simplicity of facial recognition.

**Project Link:** [Facial-Identity-Verification](https://devfolio.co/projects/facialidentityverificationonblockchainwithma-2edd)
10. **Auoz**
**Description:** Auoz (Au-oz), a fully Bitcoin-backed stablecoin linked to the price of one ounce of gold (Au).

**Project Link:** [Auoz](https://devfolio.co/projects/auoz-13b3)
In conclusion, these projects demonstrated the creativity and technical prowess of the participants, showcasing a diverse array of applications and solutions within the sCrypt ecosystem.
Stay tuned for more updates and future events from sCrypt at [scrypt.devfolio.co/projects](https://scrypt.devfolio.co/projects).
| bitruslukag | |
1,863,047 | Understanding Lawyers in Corpus Christi, TX | Nestled along the Gulf Coast of Texas, Corpus Christi boasts a vibrant community, bustling economy,... | 0 | 2024-05-23T16:56:52 | https://dev.to/backlink_30/understanding-lawyers-in-corpus-christi-tx-5d30 | Nestled along the Gulf Coast of Texas, Corpus Christi boasts a vibrant community, bustling economy, and a rich tapestry of legal needs. Whether it's a maritime dispute, personal injury claim, or family matter, the residents of Corpus Christi often turn to the expertise of seasoned attorneys to navigate the complex terrain of the legal system. In this comprehensive guide, we delve into the role of lawyers in Corpus Christi, TX, exploring their specialties, qualifications, and the crucial services they provide to individuals and businesses alike.
**Legal Landscape of Corpus Christi**
Corpus Christi, with its strategic location and thriving industries, presents a diverse array of legal challenges. From maritime law stemming from its bustling port to civil litigation arising from its vibrant economy, the legal landscape here is multifaceted. Additionally, family law matters, such as divorce and child custody, are common, reflecting the dynamics of the community.
**Specialties of Lawyers in Corpus Christi**
Maritime Law: Given its position as a major port city, Corpus Christi is a hub for maritime activities. Lawyers specializing in maritime law handle a range of issues, including vessel collisions, cargo disputes, and offshore injuries. They possess a deep understanding of admiralty law and strive to protect the rights of seamen, vessel owners, and maritime businesses.
Personal Injury Law: Accidents happen, and when they do, residents of Corpus Christi rely on personal injury attorneys to seek compensation for their injuries. Whether it's a car accident, slip and fall, or workplace injury, these lawyers are adept at negotiating with insurance companies and litigating in court to ensure their clients receive fair compensation for medical expenses, lost wages, and pain and suffering.
Real Estate Law: With a growing real estate market, the need for skilled real estate attorneys in Corpus Christi is paramount. These lawyers assist clients with property transactions, lease agreements, zoning issues, and property disputes. Whether you're buying your dream home or investing in commercial real estate, having a knowledgeable real estate attorney by your side can make all the difference.
Family Law: Navigating the complexities of family law can be emotionally challenging, but experienced family law attorneys in Corpus Christi offer compassionate guidance and steadfast advocacy. From divorce proceedings to child custody disputes and adoption cases, these lawyers work tirelessly to protect the best interests of their clients and their families.
Business Law: Entrepreneurs and businesses in Corpus Christi rely on business law attorneys to navigate the legal intricacies of commerce. Whether it's forming a business entity, drafting contracts, or resolving disputes, these attorneys provide invaluable counsel to ensure their clients' ventures thrive in a competitive market.
**Qualifications and Credentials**
Lawyers in Corpus Christi, like their counterparts across the country, must meet rigorous standards to practice law. They must graduate from an accredited law school, pass the Texas Bar Exam, and meet the ethical requirements set forth by the State Bar of Texas. Additionally, many lawyers pursue specialized certifications or memberships in professional organizations to enhance their expertise and credibility in their respective fields.
**The Role of Lawyers in the Community**
Beyond their legal expertise, **[Lawyers in Corpus Christi Tx](https://perkinsperkinslaw.com/)** play an integral role in their community. They volunteer their time and resources to various charitable organizations, provide pro bono legal services to those in need, and serve as advocates for social justice and equality. Through their dedication to serving others, these lawyers embody the principles of justice and compassion that underpin the legal profession.
**Finding the Right Lawyer for Your Needs**
When facing a legal issue, finding the right lawyer is essential. Residents of Corpus Christi can utilize various resources to identify attorneys who specialize in their particular area of need. Referrals from friends, family, or trusted professionals are often a valuable starting point. Additionally, online directories, such as the State Bar of Texas website, provide comprehensive listings of licensed attorneys in the area, along with their practice areas and contact information.
**Conclusion**
Lawyers in Corpus Christi, TX, play a pivotal role in upholding justice, protecting rights, and ensuring the rule of law in their community. Whether advocating for the injured, guiding businesses through legal challenges, or helping families navigate turbulent times, these attorneys are dedicated professionals committed to serving the needs of their clients with integrity and compassion. In a city as dynamic as Corpus Christi, the expertise and counsel of skilled lawyers are invaluable assets in navigating the complexities of the legal system. | backlink_30 | |
1,863,046 | How to Craft an Effective Tech Resume Without Experience | How to Craft an Effective Tech Resume Without Experience Highlight Relevant... | 0 | 2024-05-23T16:56:01 | https://dev.to/bingecoder89/how-to-craft-an-effective-tech-resume-without-experience-1l1m | beginners, tutorial, career, codenewbie | ### How to Craft an Effective Tech Resume Without Experience
1. **Highlight Relevant Education:**
- Emphasize your degree, coursework, and any academic projects related to tech. Mention specific subjects that align with the job you're applying for.
2. **Showcase Personal Projects:**
- Include any self-initiated tech projects, such as building a website, creating a mobile app, or contributing to open-source projects. Detail your role and the technologies used.
3. **Certifications and Online Courses:**
- List any tech certifications or online courses you've completed, such as those from Coursera, Udemy, or Codecademy. Highlight skills and knowledge gained from these courses.
4. **Technical Skills Section:**
- Create a dedicated section for technical skills. List programming languages, tools, and software you are familiar with. Include any relevant frameworks or libraries.
5. **Soft Skills and Transferable Skills:**
- Emphasize soft skills like problem-solving, analytical thinking, and teamwork. Mention any transferable skills from other jobs or experiences that are relevant to tech roles.
6. **Internships and Volunteer Work:**
- If you've interned or volunteered in any capacity that involved tech, detail these experiences. Focus on the skills learned and any accomplishments during these roles.
7. **Relevant Extracurricular Activities:**
- Mention tech-related clubs, hackathons, or coding bootcamps you've participated in. Highlight leadership roles or significant contributions in these activities.
8. **Customize for Each Job:**
- Tailor your resume to the specific job you're applying for. Use keywords from the job description and align your skills and experiences with the job requirements.
9. **Include a Personal Statement:**
- Write a brief summary or objective at the top of your resume. Focus on your passion for tech, eagerness to learn, and career aspirations.
10. **Professional Format and Presentation:**
- Use a clean, professional resume template. Ensure it's well-organized, free of errors, and easy to read. A polished presentation can make a strong first impression.
Happy Learning 🎉 | bingecoder89 |
1,863,037 | Building Robust Applications in React with TypeScript and Zod for REST API Validation | When building applications with React and TypeScript, leveraging TypeScript's static typing... | 0 | 2024-05-23T16:54:55 | https://dev.to/schead/building-robust-applications-in-react-with-typescript-and-zod-for-rest-api-validation-cl8 | When building applications with React and TypeScript, leveraging TypeScript's static typing capabilities can significantly enhance your code's reliability. However, even with TypeScript, you can't guarantee the shape and type of data coming from external APIs. This potential discrepancy can lead to runtime errors that disrupt your application's functionality. In this blog post, we'll explore how to handle such situations using Zod, a TypeScript-first schema declaration and validation library.
#### Scenario: Fetching User Data with TypeScript
Imagine you have a React application that fetches user data from an API. The `User` object has a `phoneNumber` property, which is expected to be a string. You'll format this phone number to display it in a user-friendly manner. Here's how you might start:
```tsx
// types.ts
export interface User {
id: number;
name: string;
phoneNumber: string;
}
// api.ts
export const fetchUser = async (userId: number): Promise<User> => {
const response = await fetch(`https://api.example.com/users/${userId}`);
const data: User = await response.json();
return data;
};
// UserComponent.tsx
import React, { useEffect, useState } from 'react';
import { User } from './types';
import { fetchUser } from './api';
const UserComponent: React.FC<{ userId: number }> = ({ userId }) => {
const [user, setUser] = useState<User | null>(null);
const [error, setError] = useState<string | null>(null);
useEffect(() => {
fetchUser(userId)
.then(setUser)
.catch(err => setError(err.message));
}, [userId]);
if (error) return <div>Error: {error}</div>;
if (!user) return <div>Loading...</div>;
const formattedPhoneNumber = formatPhoneNumber(user.phoneNumber);
return (
<div>
<h1>{user.name}</h1>
<p>Phone: {formattedPhoneNumber}</p>
</div>
);
};
const formatPhoneNumber = (phoneNumber: string): string => {
// Logic to format the phone number
return phoneNumber.replace(/(\d{3})(\d{3})(\d{4})/, '($1) $2-$3');
};
export default UserComponent;
```
In this example, we expect `phoneNumber` to be a string. However, if the backend changes `phoneNumber` to a number, this will lead to a runtime error in the `formatPhoneNumber` function.
#### Runtime Error Example
Suppose the backend now sends:
```json
{
"id": 1,
"name": "John Doe",
"phoneNumber": 1234567890
}
```
This change will cause a runtime error in the `formatPhoneNumber` function because `replace` is not a method on numbers. The error will look like this:
```text
TypeError: phoneNumber.replace is not a function
```
This error indicates that `phoneNumber` is not a string as expected, but a number, leading to the failure of the `replace` method.
#### Introducing Zod for Schema Validation
To safeguard against such issues, we can use Zod to validate the data we receive from the API. Here's how we can do it step-by-step:
1. **Install Zod** :
```bash
npm install zod
```
2. **Define a Zod Schema** :
```tsx
// schema.ts
import { z } from 'zod';
export const userSchema = z.object({
id: z.number(),
name: z.string(),
phoneNumber: z.string(),
});
export type User = z.infer<typeof userSchema>;
```
3. **Validate API Response and Provide Feedback** :
```tsx
// api.ts
import { userSchema, User } from './schema';
export const fetchUser = async (userId: number): Promise<User> => {
const response = await fetch(`https://api.example.com/users/${userId}`);
const data = await response.json();
const result = userSchema.safeParse(data);
if (!result.success) {
console.error("Validation Error:", result.error.format());
throw new Error('Invalid data format');
}
return result.data;
};
```
4. **Handle Validation in the Component** :
```tsx
// UserComponent.tsx
import React, { useEffect, useState } from 'react';
import { User } from './schema';
import { fetchUser } from './api';
const UserComponent: React.FC<{ userId: number }> = ({ userId }) => {
const [user, setUser] = useState<User | null>(null);
const [error, setError] = useState<string | null>(null);
useEffect(() => {
fetchUser(userId)
.then(setUser)
.catch(err => setError(err.message));
}, [userId]);
if (error) return <div>Error: {error}</div>;
if (!user) return <div>Loading...</div>;
const formattedPhoneNumber = formatPhoneNumber(user.phoneNumber);
return (
<div>
<h1>{user.name}</h1>
<p>Phone: {formattedPhoneNumber}</p>
</div>
);
};
const formatPhoneNumber = (phoneNumber: string): string => {
// Logic to format the phone number
return phoneNumber.replace(/(\d{3})(\d{3})(\d{4})/, '($1) $2-$3');
};
export default UserComponent;
```
In this updated example, the `fetchUser` function uses the Zod schema to validate the API response. If the data doesn't match the expected schema, an error is thrown, and a detailed validation error is logged to the console. This feedback helps developers quickly identify and resolve issues with the API data.
#### Error Message Example
With the Zod validation in place, if the backend sends the `phoneNumber` as a number, the application will catch this discrepancy and throw an error. The console will log a detailed error message like this:
```text
Validation Error: {
"phoneNumber": {
"_errors": ["Expected string, received number"]
}
}
```
In addition to logging the error, the application will throw an error to the user with the message:
```text
Error: Invalid data format
```
This message indicates to the user that there was an issue with the data received from the API.
#### Conclusion
Using Zod for schema validation in your React and TypeScript applications ensures that your application can gracefully handle unexpected changes in the API response structure. This approach helps you catch potential issues early, leading to more robust and reliable applications. By incorporating Zod into your development workflow, you can enhance your application's resilience against API changes and reduce runtime errors.
Remember, while TypeScript provides excellent compile-time type checking, runtime validation with tools like Zod is essential for dealing with external data sources. This combination ensures that your application remains stable and predictable, even when the data it relies on changes unexpectedly.
| schead | |
1,863,043 | Tricky Golang interview questions - Part 2: BigO of len(...) | An example I want to discuss is quite simple and requires knowledge about how some data types are... | 0 | 2024-05-23T16:50:48 | https://dev.to/crusty0gphr/tricky-golang-interview-questions-part-2-bigo-of-len-2om9 | go, interview, tutorial, programming | An example I want to discuss is quite simple and requires knowledge about how some data types are constructed under the hood.
**Question: What is the time complexity of `len(...)` for each data type?**
```go
package main
import (
"fmt"
)
func main() {
_ = len("string")
_ = len([]int{1, 2, 3})
_ = len(map[string]int{"one": 1, "two": 2, "three": 3})
}
```
The answer to this question is very easy:
**The time complexity of the `len(...)` for all cases is O(1).**
If you answer like this, the interviewer will follow up with another question: **why?** To answer it, we need to look under the hood of each type.
### String
According to the Go documentation, the string type can be represented as a struct, a **StringHeader**, which is the runtime representation of a string.
```go
type StringHeader struct {
Data uintptr
Len int
}
```
- `Data` is a pointer to the underlying (backing) byte array that stores the string data
- `Len` is the length of the string in bytes
### Slice
A slice can also be represented as a struct, a **SliceHeader**, which is the runtime representation of a slice.
```go
type SliceHeader struct {
Data uintptr
Len int
Cap int
}
```
- `Data` is a pointer to the underlying (backing) array that stores the slice elements
- `Len` is the number of elements currently in the slice
- `Cap` is the capacity of the underlying array
### Map
A map is a more complex data structure and is implemented in its own runtime [package](https://go.dev/src/runtime/map.go). In this package, we can find the map header struct, `hmap`, with its corresponding fields.
```go
type hmap struct {
count int
flags uint8
B uint8
noverflow uint16
hash0 uint32
buckets unsafe.Pointer
oldbuckets unsafe.Pointer
nevacuate uintptr
extra *mapextra
}
```
For this question we only need the `count` field. According to the source comments:
- `count` is the number of live cells, i.e. the size (length) of the map
Now that we know how these data structures are constructed under the hood, we can answer the interviewer's follow-up question:
**Why are `len(...)` operations O(1)?**
___
`len(...)` simply returns the value of the header field that stores the length (the `Len` field for strings and slices, the `count` field for maps), so the data itself is never traversed.
___
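A quick way to convince yourself: `len` on a ten-million-element slice is exactly as cheap as on a three-entry map, because each call just reads a stored counter:

```go
package main

import "fmt"

func main() {
	big := make([]int, 10_000_000) // ten million elements
	m := map[string]int{"one": 1, "two": 2, "three": 3}

	// None of these calls iterates the data; each reads a stored field.
	fmt.Println(len("string")) // 6
	fmt.Println(len(big))      // 10000000
	fmt.Println(len(m))        // 3
}
```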
It's that easy! | crusty0gphr |
1,863,042 | Buy verified cash app account | https://dmhelpshop.com/product/buy-verified-cash-app-account/ Buy verified cash app account Cash... | 0 | 2024-05-23T16:49:56 | https://dev.to/mcynthiahortonotyu/buy-verified-cash-app-account-3h8c | webdev, javascript, beginners, programming | ERROR: type should be string, got "https://dmhelpshop.com/product/buy-verified-cash-app-account/\n\n\nBuy verified cash app account\nCash app has emerged as a dominant force in the realm of mobile banking within the USA, offering unparalleled convenience for digital money transfers, deposits, and trading. As the foremost provider of fully verified cash app accounts, we take pride in our ability to deliver accounts with substantial limits. Bitcoin enablement, and an unmatched level of security.\n\nOur commitment to facilitating seamless transactions and enabling digital currency trades has garnered significant acclaim, as evidenced by the overwhelming response from our satisfied clientele. Those seeking buy verified cash app account with 100% legitimate documentation and unrestricted access need look no further. Get in touch with us promptly to acquire your verified cash app account and take advantage of all the benefits it has to offer.\n\nWhy dmhelpshop is the best place to buy USA cash app accounts?\nIt’s crucial to stay informed about any updates to the platform you’re using. If an update has been released, it’s important to explore alternative options. Contact the platform’s support team to inquire about the status of the cash app service.\n\nClearly communicate your requirements and inquire whether they can meet your needs and provide the buy verified cash app account promptly. 
If they assure you that they can fulfill your requirements within the specified timeframe, proceed with the verification process using the required documents.\n\nOur account verification process includes the submission of the following documents: [List of specific documents required for verification].\n\nGenuine and activated email verified\nRegistered phone number (USA)\nSelfie verified\nSSN (social security number) verified\nDriving license\nBTC enable or not enable (BTC enable best)\n100% replacement guaranteed\n100% customer satisfaction\nWhen it comes to staying on top of the latest platform updates, it’s crucial to act fast and ensure you’re positioned in the best possible place. If you’re considering a switch, reaching out to the right contacts and inquiring about the status of the buy verified cash app account service update is essential.\n\nClearly communicate your requirements and gauge their commitment to fulfilling them promptly. Once you’ve confirmed their capability, proceed with the verification process using genuine and activated email verification, a registered USA phone number, selfie verification, social security number (SSN) verification, and a valid driving license.\n\nAdditionally, assessing whether BTC enablement is available is advisable, buy verified cash app account, with a preference for this feature. It’s important to note that a 100% replacement guarantee and ensuring 100% customer satisfaction are essential benchmarks in this process.\n\nHow to use the Cash Card to make purchases?\nTo activate your Cash Card, open the Cash App on your compatible device, locate the Cash Card icon at the bottom of the screen, and tap on it. Then select “Activate Cash Card” and proceed to scan the QR code on your card. Alternatively, you can manually enter the CVV and expiration date. 
How To Buy Verified Cash App Accounts.\n\nAfter submitting your information, including your registered number, expiration date, and CVV code, you can start making payments by conveniently tapping your card on a contactless-enabled payment terminal. Consider obtaining a buy verified Cash App account for seamless transactions, especially for business purposes. Buy verified cash app account.\n\nWhy we suggest to unchanged the Cash App account username?\nTo activate your Cash Card, open the Cash App on your compatible device, locate the Cash Card icon at the bottom of the screen, and tap on it. Then select “Activate Cash Card” and proceed to scan the QR code on your card.\n\nAlternatively, you can manually enter the CVV and expiration date. After submitting your information, including your registered number, expiration date, and CVV code, you can start making payments by conveniently tapping your card on a contactless-enabled payment terminal. Consider obtaining a verified Cash App account for seamless transactions, especially for business purposes. Buy verified cash app account. Purchase Verified Cash App Accounts.\n\nSelecting a username in an app usually comes with the understanding that it cannot be easily changed within the app’s settings or options. This deliberate control is in place to uphold consistency and minimize potential user confusion, especially for those who have added you as a contact using your username. In addition, purchasing a Cash App account with verified genuine documents already linked to the account ensures a reliable and secure transaction experience.\n\n \n\nBuy verified cash app accounts quickly and easily for all your financial needs.\nAs the user base of our platform continues to grow, the significance of verified accounts cannot be overstated for both businesses and individuals seeking to leverage its full range of features. 
How To Buy Verified Cash App Accounts.\n\nFor entrepreneurs, freelancers, and investors alike, a verified cash app account opens the door to sending, receiving, and withdrawing substantial amounts of money, offering unparalleled convenience and flexibility. Whether you’re conducting business or managing personal finances, the benefits of a verified account are clear, providing a secure and efficient means to transact and manage funds at scale.\n\nWhen it comes to the rising trend of purchasing buy verified cash app account, it’s crucial to tread carefully and opt for reputable providers to steer clear of potential scams and fraudulent activities. How To Buy Verified Cash App Accounts. With numerous providers offering this service at competitive prices, it is paramount to be diligent in selecting a trusted source.\n\nThis article serves as a comprehensive guide, equipping you with the essential knowledge to navigate the process of procuring buy verified cash app account, ensuring that you are well-informed before making any purchasing decisions. Understanding the fundamentals is key, and by following this guide, you’ll be empowered to make informed choices with confidence.\n\n \n\nIs it safe to buy Cash App Verified Accounts?\nCash App, being a prominent peer-to-peer mobile payment application, is widely utilized by numerous individuals for their transactions. However, concerns regarding its safety have arisen, particularly pertaining to the purchase of “verified” accounts through Cash App. This raises questions about the security of Cash App’s verification process.\n\nUnfortunately, the answer is negative, as buying such verified accounts entails risks and is deemed unsafe. Therefore, it is crucial for everyone to exercise caution and be aware of potential vulnerabilities when using Cash App. 
How To Buy Verified Cash App Accounts.\n\nCash App has emerged as a widely embraced platform for purchasing Instagram Followers using PayPal, catering to a diverse range of users. This convenient application permits individuals possessing a PayPal account to procure authenticated Instagram Followers.\n\nLeveraging the Cash App, users can either opt to procure followers for a predetermined quantity or exercise patience until their account accrues a substantial follower count, subsequently making a bulk purchase. Although the Cash App provides this service, it is crucial to discern between genuine and counterfeit items. If you find yourself in search of counterfeit products such as a Rolex, a Louis Vuitton item, or a Louis Vuitton bag, there are two viable approaches to consider.\n\n \n\nWhy you need to buy verified Cash App accounts personal or business?\nThe Cash App is a versatile digital wallet enabling seamless money transfers among its users. However, it presents a concern as it facilitates transfer to both verified and unverified individuals.\n\nTo address this, the Cash App offers the option to become a verified user, which unlocks a range of advantages. Verified users can enjoy perks such as express payment, immediate issue resolution, and a generous interest-free period of up to two weeks. With its user-friendly interface and enhanced capabilities, the Cash App caters to the needs of a wide audience, ensuring convenient and secure digital transactions for all.\n\nIf you’re a business person seeking additional funds to expand your business, we have a solution for you. Payroll management can often be a challenging task, regardless of whether you’re a small family-run business or a large corporation. How To Buy Verified Cash App Accounts.\n\nImproper payment practices can lead to potential issues with your employees, as they could report you to the government. 
However, worry not, as we offer a reliable and efficient way to ensure proper payroll management, avoiding any potential complications. Our services provide you with the funds you need without compromising your reputation or legal standing. With our assistance, you can focus on growing your business while maintaining a professional and compliant relationship with your employees. Purchase Verified Cash App Accounts.\n\nA Cash App has emerged as a leading peer-to-peer payment method, catering to a wide range of users. With its seamless functionality, individuals can effortlessly send and receive cash in a matter of seconds, bypassing the need for a traditional bank account or social security number. Buy verified cash app account.\n\nThis accessibility makes it particularly appealing to millennials, addressing a common challenge they face in accessing physical currency. As a result, ACash App has established itself as a preferred choice among diverse audiences, enabling swift and hassle-free transactions for everyone. Purchase Verified Cash App Accounts.\n\n \n\nHow to verify Cash App accounts\nTo ensure the verification of your Cash App account, it is essential to securely store all your required documents in your account. This process includes accurately supplying your date of birth and verifying the US or UK phone number linked to your Cash App account.\n\nAs part of the verification process, you will be asked to submit accurate personal details such as your date of birth, the last four digits of your SSN, and your email address. If additional information is requested by the Cash App community to validate your account, be prepared to provide it promptly. Upon successful verification, you will gain full access to managing your account balance, as well as sending and receiving funds seamlessly. 
1,863,040 | ID document verification using MiniAiLive ID Recognition SDK | Miniailive ID Recognition SDK offers efficient ID document verification. It ensures accurate, fast,... | 0 | 2024-05-23T16:48:15 | https://dev.to/miniailive/id-document-verification-using-miniailive-id-recognition-sdk-54hi | webdev, androiddev, biometrics, identity | Miniailive ID Recognition SDK offers efficient ID document verification. It ensures accurate, fast, and secure identity checks.
Miniailive ID Recognition SDK is a reliable solution for verifying identity documents swiftly. Businesses can enhance security and reduce fraud with this advanced technology. It supports various ID types, including passports and driver’s licenses, making it versatile for global use.
The SDK integrates seamlessly into existing systems, providing a user-friendly experience. With its high accuracy and speed, businesses can streamline operations and improve customer satisfaction. Miniailive’s technology uses advanced algorithms to ensure the legitimacy of documents. It is an essential tool for industries requiring stringent identity verification. The SDK helps maintain compliance with regulatory standards, offering peace of mind to both businesses and users.
For more reading:
https://miniai.live/id-document-verification-using-miniailive-id-recognition-sdk/
 | miniailive |
1,862,985 | Agile has fallen. Or has it ? | In today's fast moving world it is not news to hear of companies adapting new ways of working and... | 0 | 2024-05-23T16:45:55 | https://dev.to/polinaeliana/agile-has-fallen-or-has-it--470m | agile, productivity, writing, scrum | In today's fast moving world it is not news to hear of companies adapting new ways of working and shedding old ones. Many are running after the trend only to find themselves stuck in a vicious cycle where money and resources have been spent and now it's too late to get out. So it is to no surprise that so many leaders are raising their voices on the side effects of agile. Here I wanted to share some of my observations from both experience and reading online content on this topic. To start I truly empathise with the trauma endured by those in the tech world who have been forced into the maze of scrum or whatever other framework under the umbrella of agile. As an agile practitioner I cannot negate these horror stories, so before I continue with my 2 cents I wanted to make it clear that I totally understand why people are standing up to express their feelings of frustration and disillusionment.
**The question is this: has agile fallen?** I would say yes and no.
Now let's look at the real-life example of an average corporation that has adopted this fancy new way of working and is rapidly trying to convert its entire company into an agile superhero. They have big dreams, which is commendable, so what's the big deal? Well, everything. Diving deeper, many companies are using this term "agile" as a tool to cover up a waterfallish way of doing business; for them agile is nothing more than just business. They look at it as a means to cut down the budget, lay off onshore and rehire offshore, then demand projects be completed on time and with quality, and when these high expectations are not met, someone has to be blamed, and in our case it's mister agile. This type of stressful environment, with constant changes and unstable teams, not only creates burnout but above all becomes the bottleneck that keeps teams from delivering a better product in a shorter time. So the very thing that was supposed to aid continuous improvement has become the monster that's stopping them from growing.
**Let's ask ourselves: is this really what agility is?**
On the other hand, the issue is the focus on certification over mindset: project managers are undergoing scrum conversion, but instead of carefully leading their teams through the transformation by upholding the 12 principles, they are forcibly imposing this new way on them without allowing room to experiment, make mistakes, learn, and grow together as a unit. Agile will not work but will certainly fail when we use it to conceal our micromanaging ways. It cannot thrive in a rigid climate where the focal point is how many story points you can deliver in 2 weeks and how fast you can make this and that happen. Instead, true agility facilitates a pipeline for both business and tech to have transparent conversations in a safe, fearless environment, with the customer at the center and the teams having the power to share their technical expertise. So any agile framework will expose the weaknesses of the system in place, and it is up to the leadership whether they desire true change or not.
Nevertheless, we rarely see holistic agility happen in real life, not because agile is bad (it's like saying all chocolate is bad when in fact good-quality cacao is a superfood if consumed in wisdom), but because there is misalignment between the leadership and its employees. Real change starts at the top and trickles down; if you are seeing great dissatisfaction with agile at your workplace, ask yourself whether your teams are encouraged to fail fast or are being reprimanded for not meeting the highly favoured velocity. We have to stop using agile as a mantra and instead be open to trying new things; if one way doesn't work for your team, try something else. It is about building healthy teams in the end and amplifying creativity and growth. It is not a one-size-fits-all method but an invitation to a journey of discovering what works for you and not against you. So my encouragement to the struggling scrum masters and teams would be to do what's best for everyone, have empathy, and don't be afraid of uncomfortable conversations.
I would love to hear your opinions on this topic, so don't be shy about leaving a comment. Talk soon!
| polinaeliana |
1,863,036 | Workday Security: Everything You Need to Know | If you're a malicious actor in cyberspace, you could do much worse than targeting a Workday instance... | 0 | 2024-05-23T16:41:38 | https://www.suridata.ai/blog/workday-security/ | cybersecurity, devops | If you're a malicious actor in cyberspace, you could do much worse than targeting a Workday instance at a large corporation. As one of the world's leading Human Capital Management (HCM) applications, Workday holds valuable data from employees and businesses worldwide.
The scale of [Workday's user base](https://techreport.com/statistics/workday-inc-statistics/) hints at the level of risk: 50 million users in over 170 countries. Over a million users access Workday every hour, executing 365 billion transactions annually. The platform's scale and inherent risks prompt a strict security approach.
Workday has rigorous security controls protecting its infrastructure. However, the problem is that much of the risk exposure inherent in Workday exists outside of what the company provides.

[Source](https://www.thomsondata.com/customer-base/companies-that-use-workday-hcm.php)
**What is Workday Security?**
-----------------------------
**Workday security is a series of security steps, protocols, and [information security controls](https://www.suridata.ai/blog/infosec-guide-to-information-security-controls/) to mitigate risks in the complex Workday environment.** Workday runs in highly secure data centers with network and infrastructure security operations. The company enforces strict application security (AppSec) policies, and all data is encrypted and subject to segregation by client to protect the multi-tenant architecture.
Passwords are "native login," always hashed---never stored or transmitted in their actual forms. However, it's worth paying attention to the type of password hashing algorithm used, as older methods may be weak. Modern algorithms like bcrypt or Argon2 are adaptive and more resistant to attacks, making passwords significantly more difficult and time-consuming to crack. A modern, tunable hashing method is crucial to protect your Workday passwords.
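Workday's internal implementation is not public, but the idea of an adaptive, tunable password hash can be sketched with Python's standard-library `hashlib.scrypt` (a memory-hard KDF in the same family as bcrypt and Argon2; those two require third-party packages). The cost parameters and helper names below are illustrative, not anything from Workday:

```python
import hashlib
import hmac
import os

# Illustrative cost parameters: raise n as hardware improves to keep
# brute-force attacks expensive. These values are an assumption, not
# a recommendation from Workday or any standard.
SCRYPT_N, SCRYPT_R, SCRYPT_P = 2**14, 8, 1

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Derive a memory-hard hash of the password with a fresh random salt."""
    salt = os.urandom(16)
    digest = hashlib.scrypt(
        password.encode("utf-8"),
        salt=salt, n=SCRYPT_N, r=SCRYPT_R, p=SCRYPT_P,
    )
    return salt, digest

def verify_password(password: str, salt: bytes, expected: bytes) -> bool:
    """Re-derive the hash and compare in constant time."""
    candidate = hashlib.scrypt(
        password.encode("utf-8"),
        salt=salt, n=SCRYPT_N, r=SCRYPT_R, p=SCRYPT_P,
    )
    return hmac.compare_digest(candidate, expected)

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("wrong guess", salt, digest))                   # False
```

The key property: only the salt and digest are ever stored, and recomputing the digest is deliberately slow and memory-hungry, so a leaked database resists offline cracking far better than one hashed with a fast algorithm like MD5 or SHA-1.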
But Workday offers other crucial security features. The whole application is subject to a single security model, so all user interactions are subject to the same policies and enforcement mechanisms. The platform also offers single-sign-on (SSO) using security assertion markup language (SAML) and multi-factor authentication (MFA).
Consider what's going on inside a typical Workday instance. You might have salary and benefit information for thousands of employees worldwide. Employees can see their salary and benefits, not anyone else's. Administrative users may have greater visibility into this private information and be able to define access controls and security configurations.
This power ensures that no one sees what they're not supposed to see and that no outsider can view, exfiltrate, modify, or delete any data. Still, Workday is a SaaS app like any other, so it suffers from the same SaaS security challenges. Just like you would [secure your Salesforce environment](https://www.suridata.ai/blog/essential-steps-to-a-secure-salesforce-environment/) to protect customer data, you should apply extra security measures to your Workday platform to protect employees' private information.

[Source](https://view.ceros.com/workday/chapter-2-workday-security-mb-update)
**Why Should You Care About Securing Your Workday Platform?**
-------------------------------------------------------------
**Workday's extensive user base and dynamic functionality make it vulnerable to cyber threats.** At most Workday companies, virtually every employee can access the software, subject to a broad and often complex range of permissions. **The software integrates with other systems, too, which further expands the attack surface.**
Despite having its robust security features, Workday still faces various risks. **Most of these risks stem from using and integrating this SaaS app with other apps, but they can be mitigated with the proper [SaaS security best practices](https://www.suridata.ai/blog/saas-security-best-practices/).** As a Workday client, here are the common security threats you should be concerned about:
- **Phishing attacks**---Malicious actors can trick employees into sharing their Workday login credentials or other information that lets them gain unauthorized access to the application.
- **Credential stuffing attacks**---If attackers have stolen passwords, such as those on the dark web, they can try to access Workday by pairing known user emails with possible passwords. This attack can work if MFA is not switched on and users use the same password for personal accounts.
- **Insider threats**---Employees may abuse their access to the application for many reasons, ranging from curiosity to disgruntlement and greed. Privileged users comprise the most severe insider threat.
- **API threats/integrations**---Workday connects with many popular enterprise applications through its application programming interface (API). The same API can be the basis for integration with virtually any software. Depending on how you configure the API, it may be vulnerable to threats like broken access control or injection attacks. Left un-remediated, vulnerable APIs can leak data to unauthorized parties.
- **Supply chain attacks**---Workday offers an integrated development environment (IDE) so developers can build custom add-ons and integrations for the software. By allowing custom code from outsiders, Workday could enable insecure software development processes. This exposes Workday customers to supply chain attacks like malicious code injections, social engineering, and compromised dependencies.

[Source](https://www.hbs.net/blog/how-software-supply-chain-attacks-work/)
**Challenges in Maintaining Workday Security**
-----------------------------------------------
### **1\. Overprivileged accounts**
It is possible to let some Workday accounts become overprivileged, even if accidentally. For example, if a user has an admin role but then switches to a new job that does not require admin access and that access is not revoked, the user has excessive privileges. **Bad actors can exploit these to access and steal sensitive data, manipulate settings, or approve financial transactions.**
### **2\. Complexity of roles, hierarchies, and organizational structures**
Workday allows you to create highly complex user roles with access rights that map to organizational structures and hierarchies. This level of detail is excellent because you can set up your access how you want it. However, **organizations are dynamic, and over time, the role definitions you set up will become obsolete, creating risk exposure in the process.**
### **3\. Lack of visibility over access controls**
The complexity of roles and their respective privilege leads to a situation where security admins may lack visibility into access controls. Workday does provide an interface with a view of access controls, but **when there are thousands of users to manage, with varying and ever-changing permission levels, admins struggle to parse who can do what on the software.**
**6 Steps to Improve Workday Security**
---------------------------------------
### **1\. Monitor security configurations**
Security admins' flexibility and discretion can lead to insecure configurations, such as overly permissive bring-your-own-device (BYOD) policies or a lack of MFA. These misconfigurations allow attackers to enter your systems and can be very challenging to spot. **To monitor security configurations properly, you should use an automated monitoring tool like Suridata, which scans configurations continuously and flags problematic ones.**

[Source](https://www.digitalguardian.com/blog/ultimate-guide-byod-security-overcoming-challenges-creating-effective-policies-and-mitigating)
### 2\. **Create a privileges inventory and regularly update it**
Keep your Workday access model simple and map it clearly to your organizational model and role definitions. The result will be an inventory of privileges that identifies what every type of user can see and do on Workday. **This inventory should include the privileges granted to Workday integration system users (ISUs) and Integration System Security Groups (ISSGs).** Keeping this inventory up to date has to be someone's job, and executive sponsorship may be necessary to ensure the budget to cover this cost.
### **3\. Deploy API security and third-party security countermeasures**
The Workday API is a potential attack surface, so deploying countermeasures that mitigate API risk is a best practice. Firstly, it's worth familiarizing yourself with the top 10 risks for [OWASP API security](https://www.openappsec.io/post/the-developer-s-guide-to-owasp-api-security). This list gives you a comprehensive overview of the threats to look out for and how attackers exploit these gaps. Then, **using a dedicated API security solution, you can scan your environment for "rogue" APIs that may have been abandoned but expose Workday data to external API clients.** These tools can also monitor API security settings and flag insecure APIs.
### **4**. **Employ Secure DevOps practices**
**If your organization uses the Workday IDE and leverages DevOps, you should employ AppSec methods in your development process.** One key component of AppSec is code scanning, for which you can employ SAST and [DAST tools](https://www.jit.io/a/appsec-tools/top-dast-tools-for-2024) to identify vulnerabilities early. Integrating "shift left" security remediation can also help you embed the proper security controls during development, preventing you from having to fix vulnerabilities while your Workday app is running.

[Source](https://xebia.com/blog/how-to-make-your-web-application-more-secure-by-using-static-application-security-testing-part-1-of-5-in-application-security-testing-series/)
### **5\. Continuous compliance monitoring**
**The personnel data on Workday is subject to several laws regarding consumer privacy, depending on where you operate.** If you have employees in California, you must [comply with CCPA](https://www.memcyco.com/home/ccpa-compliance-checklist/), for example. If you're in Europe, it's GDPR. Violations of these laws can be costly and time-consuming to remediate.
You can reduce your compliance risk by continuously monitoring for compliance (for instance, checking that data is in the correct sovereignty and no data about German citizens is stored in France).
### **6\. Continuously scan for insecure data**
Workday is likely just one element in your broader SaaS ecosystem. It's also likely to be connected to other SaaS apps you use, such as online storage solutions. One risk in these conditions is that users will move data from Workday to another app, such as an online storage platform. This data storage can be highly insecure, leading to the risk of breach, exfiltration, and all the expensive remediation actions that follow such an event.
**SaaS security solutions like Suridata can continuously scan for personnel data across the entire SaaS ecosystem, automatically flagging data that needs to be removed and providing detailed insights into the issue so you can activate the proper remediation workflow.**
**Making Workday SaaS Secure**
------------------------------
Workday has many security features, but the software remains vulnerable to threats like phishing attacks, API attacks, and brute force. Complex access rules compound the problem. To make Workday more secure, you must implement countermeasures addressing these risks.\
Suridata can make securing your Workday platform and data much more manageable. The Suridata tool scans your entire SaaS ecosystem (Workday and beyond, no matter how many SaaS apps you have), monitoring Workday's security configurations, spotting any abnormal user behavior, and accurately identifying data that might have been mistakenly or maliciously exported from Workday. **Interested in seeing it in action? [Schedule a demo](https://www.suridata.ai/demo/) with our team.** | yayabobi |
1,863,034 | Exploring Physicians Immediate Care Near Me | In today's fast-paced world, access to immediate medical care is essential. Whether it's a sudden... | 0 | 2024-05-23T16:39:15 | https://dev.to/backlink_30/exploring-physicians-immediate-care-near-me-p3 |
In today's fast-paced world, access to immediate medical care is essential. Whether it's a sudden illness, a minor injury, or the need for routine healthcare services, having a reliable medical facility nearby can make all the difference. This is where Physicians Immediate Care steps in, offering convenient and accessible healthcare services tailored to meet the needs of individuals and families alike. In this article, we'll delve into the importance of immediate care facilities, the services they provide, and how to locate a Physicians Immediate Care center near you.
**Understanding the Significance of Immediate Care Facilities:
**
Immediate care facilities play a crucial role in the healthcare landscape by providing prompt medical attention for non-life-threatening conditions. They bridge the gap between primary care physicians and emergency rooms, offering a middle ground for patients who require urgent care but do not need the resources of a hospital emergency department.
One of the key advantages of immediate care facilities is their convenience. Unlike primary care providers that operate on regular business hours and often require appointments, immediate care centers typically offer extended hours, including evenings and weekends, without the need for an appointment. This accessibility ensures that patients can receive timely medical attention when they need it most, without having to wait for days or weeks for an appointment with their primary care physician.
Moreover, immediate care centers are equipped to handle a wide range of medical issues, from minor injuries and illnesses to preventive services and vaccinations. This versatility makes them a go-to option for individuals seeking prompt healthcare solutions without the hassle of scheduling appointments or enduring long wait times.
**Exploring the Services Offered by Physicians Immediate Care:
**
Physicians Immediate Care is a leading provider of immediate care services, with numerous locations across the United States. Their centers are staffed by board-certified physicians and trained medical professionals who are dedicated to delivering high-quality care in a timely manner.
**Some of the services offered by Physicians Immediate Care include:
**
Treatment of Minor Injuries: Whether it's a sprained ankle, a minor burn, or a laceration that requires stitches, Physicians Immediate Care is equipped to provide prompt treatment for a variety of minor injuries.
Management of Illnesses: From colds and flu to urinary tract infections and strep throat, Physicians Immediate Care offers diagnosis and treatment for a wide range of common illnesses, helping patients get back on their feet quickly.
Diagnostic Services: Physicians Immediate Care centers are equipped with on-site diagnostic capabilities, including X-rays and laboratory testing, allowing for prompt and accurate diagnosis of various medical conditions.
Preventive Care: In addition to treating acute medical issues, Physicians Immediate Care also offers preventive services such as flu shots, vaccinations, and physical exams to help patients stay healthy and proactive about their healthcare needs.
Occupational Health Services: For employers and employees alike, Physicians Immediate Care provides a range of occupational health services, including pre-employment screenings, drug testing, and treatment for work-related injuries.
Telemedicine Options: In some cases, Physicians Immediate Care offers telemedicine appointments, allowing patients to consult with a healthcare provider remotely for certain non-urgent medical issues.
**Locating a Physicians Immediate Care Center Near You:
**
Finding a Physicians Immediate Care center near you is simple and convenient. With multiple locations in various states, chances are there's a center within close proximity to your home or workplace. Here's how you can locate a Physicians Immediate Care center near you:
Online Search: The quickest and easiest way to find a Physicians Immediate Care center near you is by conducting an online search. Simply type "Physicians Immediate Care near me" into your preferred search engine, and you'll be provided with a list of nearby locations, along with their addresses, contact information, and hours of operation.
Official Website: Alternatively, you can visit the official Physicians Immediate Care website, where you'll find a location finder tool that allows you to search for centers by zip code or city. This tool provides detailed information about each center, including services offered and hours of operation.
Mobile App: **[Physicians Immediate Care Near me](accesstotalcare.com/locations/access-urgent-care-kingsville/)** may also have a mobile app available for download, which allows users to easily locate nearby centers, schedule appointments, and access important healthcare resources on the go.
Once you've located a Physicians Immediate Care center near you, simply visit the facility during operating hours for prompt medical attention. No appointment is necessary, making it easy to receive the care you need when you need it most.
**Conclusion:
**
Physicians Immediate Care plays a vital role in providing convenient and accessible healthcare services to individuals and families across the country. With extended hours, prompt medical attention, and a wide range of services, their centers are equipped to handle a variety of non-life-threatening medical issues, from minor injuries and illnesses to preventive care and occupational health services. By understanding the importance of immediate care facilities and knowing how to locate a Physicians Immediate Care center near you, you can ensure that you have access to timely medical attention whenever the need arises. | backlink_30 | |
1,863,005 | Array Expansion in Flink SQL | I’ve recently started my journey with Apache Flink. As I learn certain concepts, I’d like to share... | 0 | 2024-05-23T16:34:41 | https://dev.to/sandonjacobs/array-expansion-in-flink-sql-40c9 | flink, flinksql, sql | I’ve recently started my journey with [Apache Flink](https://flink.apache.org/). As I learn certain concepts, I’d like to share them. One such "learning" is the expansion of array type columns in Flink SQL. Having used ksqlDB in a previous life, I was looking for functionality similar to the EXPLODE function to "flatten" a collection type column into a row per element of the collection. Because Flink SQL is ANSI compliant, it’s no surprise this is covered in the standard.
There are cases where fact data might contain an array column, perhaps with a collection of codes or identifiers that can later be used to join with dimension data. The end goal may be a read-optimized view of these facts, joined to each dimension identified in that array.
## The Why
Let’s lay the foundation with a quick review of the terms “fact data” and “dimension data”. Fact data typically contains quantitative data of a business process. In an event streaming world, we can think of event streams as fact data - detailing actions of users or other upstream systems. For instance, an online shopper just completed a purchase. A fact event could contain the item identifiers and quantities of the purchased shopping cart, along with identifying information about the user, additional shipping information, and so forth.
Notice the use of the terms “identifiers” and “identifying” in the fact event data, thus making this event fairly “flat.” There are various reasons for this pattern - avoiding redundant/superfluous information, which makes the events themselves smaller. This design pattern would also provide flexibility to update the detailed information about those identified entities.
These details are known as dimension data, and when joined with fact data provide just that to our event streams - **_dimensionality_**. When our user dimension data is joined to these purchase event facts, we can then glean the details about the purchasing user. We now know their name, email address, shipping and billing information, and other contact information pertaining to doing business with them. As for the products purchased, this purchase event can now be joined to detailed product dimension data by the inventory system to earmark the purchased quantities. Meanwhile, the inventory system can update a typo in the description of this product because it’s not reliant on anything in the purchase event other than the identifier of the product(s).
## A Sample Use Case
Let’s use the NOAA Weather Service API data model to illustrate this array expansion functionality. The Flink tables described in this example are a result of using Kafka Connect to source data from the REST APIs, apply a set of Single Message Transformations, and write that data to Apache Kafka topics.
## Dimension Data
Weather "happens" in specific locations. The NOAA defines these locations as `zone` entities. A zone describes a geographic location, with multiple forecast offices, radar stations, and (in some cases) time zones. A Flink table holding this data might look something like this:
```
describe `ws_zones`;
```
| Column Name | Data Type | Nullable | Extras |
|:---------------:|:-------------------:|:--------:|:-------------------------:|
| zoneId | STRING | NOT NULL | PRIMARY KEY, BUCKET KEY |
| url | STRING | NULL | |
| name | STRING | NULL | |
| zoneType | STRING | NULL | |
| state | STRING | NULL | |
| cwas | ARRAY<STRING> | NULL | |
| forecastOffices | ARRAY<STRING> | NULL | |
| timeZones | ARRAY<STRING> | NULL | |
| radarStation | STRING | NULL | |
| ts | TIMESTAMP_LTZ(3) | NULL | METADATA FROM 'timestamp' |
## Fact Data
When a weather alert is created, updated, or expires, the API describes that alert entity with the list of affected zones. A Flink table for active alerts might have a schema such as:
```
describe `ws_active_alerts`;
```
| Column Name | Data Type | Nullable | Extras |
|:-----------------:|:-------------------------:|:--------:|:---------------------------------------------------------------------:|
| id | STRING | NOT NULL | PRIMARY KEY, BUCKET KEY |
| areaDesc | STRING | NULL | |
| geocodeSAME | ARRAY<STRING> | NULL | |
| geocodeUGC | ARRAY<STRING> | NULL | |
| affectedZones | ARRAY<STRING> | NULL | |
| sent | TIMESTAMP(3) | NULL | |
| effective | TIMESTAMP(3) | NULL | |
| onset | TIMESTAMP(3) | NULL | |
| expires | TIMESTAMP(3) | NULL | |
| ends | TIMESTAMP(3) | NULL | |
| status | STRING | NULL | |
| messageType | STRING | NULL | |
| category | STRING | NULL | |
| severity | STRING | NULL | |
| certainty | STRING | NULL | |
| urgency | STRING | NULL | |
| event | STRING | NULL | |
| sender | STRING | NULL | |
| senderName | STRING | NULL | |
| headline | STRING | NULL | |
| description | STRING | NULL | |
| instruction | STRING | NULL | |
| response | STRING | NULL | |
| NWSheadline | ARRAY<STRING> | NULL | |
| eventEndingTime | ARRAY<TIMESTAMP(3)> | NULL | |
| expiredReferences | ARRAY<STRING> | NULL | |
| eventTs | TIMESTAMP_LTZ(3) ROWTIME | NULL | METADATA FROM 'timestamp', WATERMARK AS eventTs - INTERVAL '5' MINUTE |
## Array Expansion
In the `ws_active_alerts` table, we see the column `geocodeUGC` of type `ARRAY<STRING>`. These codes correlate to the identifiers for the zone entities from the `ws_zones` dimension table.
| id | geocodeUGC | severity | category | status | onset | effective |
|:---------------------------------------------------------------------:|:--------------------------------:|:--------:|:--------:|:------:|:-----------------------:|:-----------------------:|
| urn:oid:2.49.0.1.840.0.e98d53f97bfcd60fcf971e613b85819ae1ec3cbb.005.1 | [TXC183, TXC423, TXC459, TXC499] | Severe | Met | Actual | 2024-05-15 15:45:00.000 | 2024-05-15 15:45:00.000 |
| urn:oid:2.49.0.1.840.0.e98d53f97bfcd60fcf971e613b85819ae1ec3cbb.001.1 | [TXC183, TXC203, TXC365, TXC401] | Severe | Met | Actual | 2024-05-15 15:45:00.000 | 2024-05-15 15:45:00.000 |
In Flink SQL, a cross join is a type of join that returns the [Cartesian product](https://en.wikipedia.org/wiki/Cartesian_product) of the two tables being joined. The Cartesian product is a combination of every row from the first table with every row from the second table. This feature can be particularly useful when you need to expand an array column into multiple rows.
With that in mind, let's expand the rows of `ws_active_alerts` using `CROSS JOIN UNNEST`. For each row in the `ws_active_alerts` table, `UNNEST` flattens that row's array column `geocodeUGC` into a set of rows. `CROSS JOIN` then joins this new set of rows with the single row from the `ws_active_alerts` table. The result of `UNNEST` can thus be thought of as a temporary table, scoped to this operation.
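As a rough analogy (plain Python with made-up alert identifiers, not Flink SQL), the expansion behaves like flattening each row's array into one output row per element:

```python
# Rough analogy only: each alert row is expanded into one row per UGC code,
# mirroring what CROSS JOIN UNNEST does to the geocodeUGC array column.
alerts = [
    {"id": "alert-005.1", "geocodeUGC": ["TXC183", "TXC423"]},
    {"id": "alert-001.1", "geocodeUGC": ["TXC183"]},
]

expanded = [
    {"alertId": alert["id"], "zoneId": code}
    for alert in alerts
    for code in alert["geocodeUGC"]
]
# Three output rows from two input rows: one per (alert, zone) pair.
```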
```
select
active.`id` as `alertId`,
`ActiveAlertsByUgcCode`.geocodeugc as `zoneId`,
active.`event` as `event`,
active.`status` as `alertStatus`,
active.`severity` as `severity`
from ws_active_alerts active
CROSS JOIN UNNEST(active.geocodeUGC) as `ActiveAlertsByUgcCode`(geocodeugc);
```
The results of this query yield a new row for each zone value of the alert.
| alertId | zoneId | event | alertStatus | severity |
|:---------------------------------------------------------------------:|:------:|:-------------:|:-----------:|:--------:|
| urn:oid:2.49.0.1.840.0.e98d53f97bfcd60fcf971e613b85819ae1ec3cbb.005.1 | TXC183 | Flood Warning | Actual | Severe |
| urn:oid:2.49.0.1.840.0.e98d53f97bfcd60fcf971e613b85819ae1ec3cbb.005.1 | TXC459 | Flood Warning | Actual | Severe |
| urn:oid:2.49.0.1.840.0.e98d53f97bfcd60fcf971e613b85819ae1ec3cbb.005.1 | TXC499 | Flood Warning | Actual | Severe |
| urn:oid:2.49.0.1.840.0.e98d53f97bfcd60fcf971e613b85819ae1ec3cbb.001.1 | TXC183 | Flood Warning | Actual | Severe |
| urn:oid:2.49.0.1.840.0.e98d53f97bfcd60fcf971e613b85819ae1ec3cbb.005.1 | TXC423 | Flood Warning | Actual | Severe |
## Join Facts with Dimensions
The goal here is to answer some location questions with the data on hand.
- What are the states affected by a given alert?
- How many alerts are active for a given state?
If we join our expanded facts about alerts with the dimension data from the zone definitions, we can find the affected states of each alert.
```
select
active.`id` as `alertId`,
`ActiveAlertsByUgcCode`.geocodeugc as `zoneId`,
zone.state as `state`
from ws_active_alerts active
CROSS JOIN UNNEST(active.geocodeUGC) as `ActiveAlertsByUgcCode`(geocodeugc)
LEFT JOIN ws_zones zone ON zone.zoneId = `ActiveAlertsByUgcCode`.geocodeugc;
```
| alertId | zoneId | state |
|:---------------------------------------------------------------------:|:------:|:-----:|
| urn:oid:2.49.0.1.840.0.e98d53f97bfcd60fcf971e613b85819ae1ec3cbb.005.1 | TXC183 | TX |
| urn:oid:2.49.0.1.840.0.e98d53f97bfcd60fcf971e613b85819ae1ec3cbb.005.1 | TXC423 | TX |
| urn:oid:2.49.0.1.840.0.e98d53f97bfcd60fcf971e613b85819ae1ec3cbb.005.1 | TXC459 | TX |
| urn:oid:2.49.0.1.840.0.e98d53f97bfcd60fcf971e613b85819ae1ec3cbb.005.1 | TXC499 | TX |
The results of this join operation could be a new table with the read-optimized data needed by a microservice to answer the question of “give me the active alerts for a given state.”
```
-- create new table
create table alert_zone_state (
`alertId` STRING,
`zoneId` STRING,
`state` STRING,
PRIMARY KEY (`alertId`, `zoneId`) NOT ENFORCED
) with (
'value.format' = 'avro-registry',
'kafka.cleanup-policy' = 'delete',
'kafka.retention.time' = '10 minutes'
);
-- load that table with the results of the join
insert into alert_zone_state select
active.`id` as `alertId`,
`ActiveAlertsByUgcCode`.geocodeugc as `zoneId`,
zone.state as `state`
from ws_active_alerts active
CROSS JOIN UNNEST(active.geocodeUGC) as `ActiveAlertsByUgcCode`(geocodeugc)
LEFT JOIN ws_zones zone ON zone.zoneId = `ActiveAlertsByUgcCode`.geocodeugc;
select * from alert_zone_state where state is not null;
```
Here's a screenshot from my Flink SQL Workspace in Confluent Cloud.

But let's expand on this with a new table to get a count of the distinct active alerts for all states:
```
-- create a new table
create table alert_counts_for_states (
`state` STRING,
`alertCount` INTEGER,
PRIMARY KEY (`state`) NOT ENFORCED
) with (
'value.format' = 'avro-registry',
'kafka.cleanup-policy' = 'compact'
);
-- load the counts into the new table from the previous table we created
insert into alert_counts_for_states
select
`state`,
cast(count(distinct(`alertId`)) as INTEGER) as `alertCount`
from alert_zone_state where `state` is not null
group by `state`;
-- query the table
select * from alert_counts_for_states;
```
The results of this query might look like this in my Flink SQL Workspace.

## Land the Plane
`CROSS JOIN UNNEST` proves to be a useful tool to unpack array types in source datasets. We can't always rely on the provider of source data to normalize these raw datasets to meet our application needs.
I hope you find this helpful in your journey with Flink SQL.
| sandonjacobs |
1,858,799 | Internal Developer Platform vs Internal Developer Portal | In today's highly competitive software development landscape, organizations are constantly seeking... | 0 | 2024-05-23T16:33:00 | https://www.devzero.io/blog/internal-developer-platform-vs-internal-developer-portal | tooling, terraform, automation, devops | In today's highly competitive software development landscape, organizations are constantly seeking ways to enhance their internal processes and empower their development teams. Two commonly used tools in this quest are internal developer platforms and internal developer portals. While both serve as valuable resources, understanding the differences between the two is crucial in determining which tool is best suited for your team's needs.
## What is an Internal Developer Platform?
An [internal developer platform](https://www.devzero.io/blog/what-is-an-internal-developer-platform), often referred to as a platform-as-a-service (PaaS) provides a centralized environment for developers to create, test, and deploy applications. It offers a set of pre-configured tools, frameworks, and infrastructure, allowing developers to focus solely on writing code and accelerating the development process.
By utilizing an internal developer platform, companies can streamline their software development lifecycle, reduce time-to-market, and improve overall efficiency. These platforms enable developers to leverage a wide range of services, such as containerization, automatic scaling, and continuous integration/continuous deployment (CI/CD), without having to worry about the underlying infrastructure.
For instance, renowned companies like Salesforce and Microsoft Azure offer robust internal developer platforms that empower their respective developer communities to build cutting-edge applications with ease. These platforms provide a seamless experience, enabling developers to quickly spin up new environments, collaborate with team members, and leverage built-in monitoring and debugging tools.
## Defining an Internal Developer Portal
An internal developer portal, on the other hand, serves as a repository of resources, documentation, and APIs (Application Programming Interfaces) that support and facilitate the development process. It offers developers a centralized location to access information, collaborate, and gather knowledge to enhance their productivity.
Internal developer portals play a crucial role in promoting knowledge sharing and fostering a culture of collaboration within development teams. They provide developers with access to comprehensive documentation, including coding standards, best practices, and architectural guidelines, ensuring consistency and quality across projects.
Companies such as Google and Amazon have successfully implemented internal developer portals, ensuring their development teams have all the necessary tools and information to deliver top-notch solutions. These portals offer features like API documentation, code samples, and interactive forums, enabling developers to learn from each other, ask questions, and share insights.
Moreover, internal developer portals often integrate with other tools and services, such as version control systems, issue tracking systems, and project management platforms. This integration further enhances the development workflow, allowing developers to seamlessly transition between different stages of the software development lifecycle.
Overall, internal developer portals serve as a valuable resource for developers, providing them with the necessary information and tools to build high-quality software solutions efficiently. They empower developers to stay up-to-date with the latest technologies, collaborate effectively, and deliver exceptional results.
## Key Differences Between Internal Developer Platforms and Internal Developer Portals
## Functionality and Features
One of the primary differences between an internal developer platform and an internal developer portal lies in their core functionality. While an internal developer platform focuses on providing a complete development environment, an internal developer portal primarily acts as a knowledge hub.
For example, a developer platform might include features like automated deployment, built-in quality testing, and continuous integration tools. On the other hand, a developer portal might offer documentation, API reference guides, and community forums to support developers throughout the development lifecycle.
## User Interface and Experience
When it comes to user interface and experience, internal developer platforms often prioritize simplicity and ease of use. They offer intuitive interfaces, streamline workflows, and automate repetitive tasks, enabling developers to quickly build and deploy applications.
In contrast, internal developer portals typically focus on creating a seamless browsing experience. They provide comprehensive search functionalities, categorize resources effectively, and ensure swift access to relevant information. This user-centric approach enhances productivity and knowledge sharing among developers.
## Integration and Compatibility
While both internal developer platforms and internal developer portals aim to integrate seamlessly with existing systems and tools, their approaches differ.
Internal developer platforms typically provide ready-to-use integrations with popular development tools, version control systems, and third-party services. This compatibility ensures a smooth development workflow and eliminates any potential roadblocks.
Internal developer portals, on the other hand, offer extensive API documentation and guidelines to enable developers to integrate their applications with various services. They provide the necessary resources to ensure compatibility, while also enabling developers to leverage the power of external APIs.
## Pros and Cons of Internal Developer Platforms
## Advantages of Internal Developer Platforms
Internal developer platforms present numerous advantages for development teams.
- Increased Productivity: By providing a pre-configured environment and automating processes, internal developer platforms streamline development workflows and significantly boost productivity. For example, XYZ Corporation reported a 30% increase in developer productivity after implementing their internal developer platform.
- Faster Release Cycles: The standardized infrastructure and built-in deployment automation enable organizations to release new features and updates rapidly. ABC Corporation witnessed a 50% reduction in release cycles after adopting an internal developer platform.
## Potential Drawbacks of Internal Developer Platforms
Despite their advantages, internal developer platforms may also have some limitations.
- Learning Curve: Mastering the platform's tools and frameworks might require a learning curve for developers. Companies should allocate adequate resources for onboarding and training.
- Dependency on Platform Provider: Organizations relying heavily on external internal developer platform providers might face challenges if the provider discontinues their services or undergoes substantial changes.
## Pros and Cons of Internal Developer Portals
## Benefits of Internal Developer Portals
Internal developer portals offer distinct benefits to development teams.
- Knowledge Sharing and Collaboration: By providing a centralized hub of information, developer portals foster collaboration, knowledge sharing, and quick access to resources. This accelerates development cycles and reduces time spent searching for information.
- Onboarding New Developers: Developer portals serve as invaluable resources for onboarding new team members, providing them with the necessary documentation and guidelines to get up to speed quickly.
## Possible Limitations of Internal Developer Portals
Internal developer portals may also have some potential limitations.
- Information Overload: Without proper organization and prioritization, developer portals can become overwhelming, making it challenging for developers to find the specific resources they need.
- Outdated or Incomplete Documentation: In dynamic development environments, keeping documentation up to date can be a challenge. Outdated or incomplete documentation can lead to confusion and hinder development processes.
## Choosing the Right Tool for Your Team
## Assessing Your Team's Needs
Before making a decision between an internal developer platform and an internal developer portal, it is crucial to assess your team's unique needs and goals. Consider factors such as project complexity, team size, and preferred development workflows.
For example, if your team is primarily focused on rapid development cycles and requires a standardized environment, an internal developer platform might be the ideal choice. Conversely, if your team values collaboration, knowledge-sharing, and quick access to resources, an internal developer portal could provide the necessary tools and infrastructure.
## Evaluating Your Current Infrastructure
When selecting between an internal developer platform and an internal developer portal, it is essential to evaluate your existing infrastructure. Consider compatibility with your current development tools, architecture, and overall IT ecosystem.
For instance, if your organization heavily relies on specific development tools or incorporates legacy systems, an internal developer platform with flexible integration capabilities might be the way to go. However, if your current infrastructure is diverse and open to external APIs, an internal developer portal can seamlessly integrate and enhance your existing ecosystem.
## Considering Future Growth and Scalability
Lastly, it is essential to consider the scalability and future growth potential of your chosen tool. As your team expands and your projects become more complex, you need a tool that can support your evolving needs.
For example, if you anticipate substantial growth and an increasing number of development projects, an internal developer platform might provide the scalability and automation required. Conversely, if your team is smaller and the focus is on knowledge sharing and collaboration, an internal developer portal can effectively cater to your needs.
Ultimately, the choice between an internal developer platform and an internal developer portal depends on the specific requirements and objectives of your development team. By carefully assessing your team's needs, evaluating your current infrastructure, and considering future growth, you can make an informed decision that maximizes productivity, efficiency, and collaboration within your organization. | shohams |
1,863,031 | everything-ai v2.0.0: more AI power on your Desktop | What is everything-ai? 🤖 everything-ai is natively a multi-tasking agent, 100% local, that... | 0 | 2024-05-23T16:32:56 | https://dev.to/astrabert/everything-ai-v200-more-ai-power-on-your-desktop-2k0b | ai, python, docker, lowcode | ## What is *everything-ai*?
🤖 everything-ai is natively a multi-tasking agent, 100% local, that is able to perform several AI-related tasks
## What's new?
🚀 I am more than thrilled to introduce some new functionalities that have been added since the last release:
- 🎙️🔊 Handle audio files or microphone recordings, classifying or transcribing them with almost every audio-classification and automatic-speech-recognition model on Hugging Face Hub.
- 📽️ Generate video from text prompts with almost every text-to-video model on HuggingFace Hub (original architecture by [Vasiliy Katsyka](https://github.com/Vasiliy-katsyka))
- 🧬 Predict the 3D structure of proteins from their amino-acid sequence, with EsmFold by AI at Meta ([demo](https://huggingface.co/spaces/as-cle-bert/proteinviz))
- 🏋️ Finetune HF models on several downstream tasks with AutoTrain local integration (AutoTrain is developed by [Abhishek Thakur](https://github.com/abhishekkrthakur))
- 🗣️ Unleash powerful LLMs and exploit larger database collections for RAG with the integration of Hugging Face Spaces API and Supabase PostgreSQL databases ([demo](https://huggingface.co/spaces/as-cle-bert/supabase-ai-chat))
## How can you use all of these features?
You just need a `docker compose up`!🐋
## Where can I find everything I need?
Get the source code (and leave a little ⭐ while you're there):
https://github.com/AstraBert/everything-ai
Get a quick-start with the documentation:
https://astrabert.github.io/everything-ai/
## Credits and inspiration
Shout-outs to Hugging Face, Gradio, Docker, AI at Meta, Abhishek Thakur, Qdrant, LangChain and Supabase for making all of this possible!
Inspired by: Jan, Cheshire Cat AI, LM Studio, Ollama and other awesome local AI solutions!
| astrabert |
1,835,249 | "🚀 Streamlining Kubernetes Deployment: Setting Up EKSCTL, Kubectl, and AWS CLI on Amazon Linux 2 🛠️" | Hey! It's Sarvar Nadaf again, a senior developer at Luxoft. I worked on several technologies like... | 0 | 2024-05-23T16:30:47 | https://dev.to/aws-builders/-streamlining-kubernetes-deployment-setting-up-eksctl-kubectl-and-aws-cli-on-amazon-linux-2--1j16 | aws, kubernetes, devops, beginners | Hey! It's Sarvar Nadaf again, a senior developer at Luxoft. I worked on several technologies like Cloud Ops (Azure and AWS), Data Ops, Serverless Analytics, and Dev Ops for various clients across the globe.
Today, we'll look into AWS EKS's fundamentals. This post will assist you in setting up the various command-line interfaces that will enable you to communicate with the AWS EKS cluster from a laptop or other Linux-based device.
Today, we will configure eksctl, kubectl, and the AWS CLI. Each Linux distribution comes with a few minimal requirements and differences; I'm setting up all of these command-line tools on an Amazon Linux 2 machine today. I'll simply grab a free-tier Amazon Linux 2 server, and then we can begin configuring the command-line interfaces.
## **AWS CLI -**
The open-source AWS CLI is a really powerful tool. It enables interaction with every AWS service: with just a few simple steps, we can operate our AWS account from anywhere.
## **Prerequisite -**
- AWS Account
- IAM User
- Access key ID and Secret access key
1. Use the following command to become the root user, so there is no need for an extra sudo on each step.
```
[ec2-user ~]$ sudo su -
```
2. Use the following command to carry out a quick software update, making sure your instance's software packages are up to date:
```
[root ~]$ sudo yum update -y
```
3. Next, download the zip file containing the AWS CLI package. We use the curl command to fetch the package from the URL and, as you can see below, save it as awscli.zip with the -o option.
```
[root ~]$ curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscli.zip"
```
4. Once the command has executed successfully, use the ls command to confirm the awscli.zip file is present. Now unzip the file on the server.
```
[root ~]$ unzip -q awscli.zip
```
5. Unzipping extracts the full awscli package. To install it, run the bundled install script with the following command.
```
[root ~]$ ./aws/install
```
6. After the AWS CLI has been installed successfully, run the command below to check that it is functioning as expected. If the command prints a version string, the AWS CLI is installed correctly on your server.
```
[root ~]$ aws --version
```
7. Now configure your IAM user information in the AWS CLI. For programmatic access to AWS services, we must supply the IAM user's access key ID and secret access key. The aws configure command also prompts for the default region code you want to connect to and the output format (we use json here, so responses come back as JSON). You will then be able to access whichever services the IAM user is permitted to use.
```
[root ~]$ aws configure
```
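As an alternative to the interactive prompt, the same information can be written directly into the AWS CLI's files under ~/.aws. Below is a sketch with placeholder values (the key values shown are purely illustrative; never commit real keys):

```
# ~/.aws/credentials
[default]
aws_access_key_id = AKIAEXAMPLEKEYID
aws_secret_access_key = exampleSecretAccessKey123

# ~/.aws/config
[default]
region = us-east-1
output = json
```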
8. If you can see the result of the following command after setting up the AWS CLI, your server has been successfully configured to use the AWS CLI.
```
[root ~]$ aws s3 ls
```
## **Kubectl -**
The Kubernetes API server can be reached using the command-line program kubectl. kubectl is a big help when we access our AWS EKS cluster from our own machine, giving us access to all Kubernetes resources. The kubectl binary is available in many operating system package managers, and using a package manager is often simpler than downloading and installing it manually.
1. First, set up a Kubernetes yum repository. This requires creating a file with a .repo extension in the /etc/yum.repos.d directory; the file contains the repository's baseurl and gpgkey for the Kubernetes packages.
```
[root ~]$ cat <<EOF | sudo tee /etc/yum.repos.d/kubernetes.repo
[kubernetes]
name=Kubernetes
baseurl=https://packages.cloud.google.com/yum/repos/kubernetes-el7-\$basearch
enabled=1
gpgcheck=1
gpgkey=https://packages.cloud.google.com/yum/doc/rpm-package-key.gpg
EOF
```
2. Using the cat command, verify that the repository file looks as expected.
```
[root ~]$ cat /etc/yum.repos.d/kubernetes.repo
```
3. Install the kubectl CLI with the command below.
```
[root ~]$ sudo yum install -y kubectl
```
4. After the kubectl CLI has been installed successfully, use the version command to confirm that kubectl was installed correctly.
```
[root ~]$ kubectl version
```
## **EKSCTL -**
eksctl is the tool for interacting with an AWS EKS cluster from the command line. With it, we can create, update, manage, and delete EKS clusters. It is a highly useful command-line interface for communicating with AWS EKS.
1. Use the curl command to download the most recent eksctl release as a tarball and extract it into the /tmp directory with the following command.
```
[root ~]$ curl --silent --location "https://github.com/weaveworks/eksctl/releases/latest/download/eksctl_$(uname -s)_amd64.tar.gz" | tar xz -C /tmp
```
2. Move the extracted eksctl binary from /tmp to the /usr/local/bin directory so it is on your PATH.
```
[root ~]$ sudo mv /tmp/eksctl /usr/local/bin
```
3. After the eksctl CLI has been installed successfully, use the version command to confirm that eksctl was installed correctly.
```
[root ~]$ eksctl version
```
Congratulations! You have successfully installed kubectl, eksctl, and the AWS CLI. We have walked through the step-by-step installation process for all three CLIs. These tools will undoubtedly be useful while working with, or learning about, AWS EKS. This has personally helped me a lot: once you learn how to use the CLIs, you won't need a console user interface to see things; a few commands give you access to all the information right in your Linux terminal. If you have any questions, please post a comment below.
— — — — — — — —
**Here is the End!**
Thank you for taking the time to read my article. I hope you found this article informative and helpful. As I continue to explore the latest developments in technology, I look forward to sharing my insights with you. Stay tuned for more articles like this one that break down complex concepts and make them easier to understand.
_Remember, learning is a lifelong journey, and it’s important to keep up with the latest trends and developments to stay ahead of the curve. Thank you again for reading, and I hope to see you in the next article!_
_**Happy Learning!**_ | sarvar_04 |
1,863,030 | Title: The Ultimate Guide to YouTube Video Downloaders: Unlocking High-Quality Content Anytime, Anywhere | Are you tired of being restricted by your internet connection when trying to enjoy your favorite... | 0 | 2024-05-23T16:24:06 | https://dev.to/ssyoutube/title-the-ultimate-guide-to-youtube-video-downloaders-unlocking-high-quality-content-anytime-anywhere-3le2 |
Are you tired of being restricted by your internet connection when trying to enjoy your favorite YouTube videos? Do you wish you could watch them offline or save them for later without compromising on quality? Look no further! In this comprehensive guide, we'll explore the world of YouTube video downloaders, unlocking the ability to access high-quality content anytime, anywhere.
**What are [YouTube Video Downloaders](https://ssyoutube.online/)?**
YouTube video downloaders are tools or software that allow you to save YouTube videos to your device. These tools come in various forms, from online websites to standalone software, each offering different features and capabilities.
**Why Use a YouTube Video Downloader?**
- **Offline Viewing**: Downloading videos allows you to watch them without an internet connection, perfect for long flights, commutes, or areas with poor connectivity.
- **Quality Preservation**: Many downloaders offer options to save videos in high definition, preserving the quality of your favorite content.
- **Convenience**: Save videos for later viewing, creating your own library of content to enjoy at your leisure.
**How to Download YouTube Videos**
1. **Choose a YouTube Video Downloader**: Select a downloader that suits your needs. Popular options include 4K Video Downloader, YTD Video Downloader, and ClipGrab.
2. **Copy the Video URL**: Go to YouTube, find the video you want to download, and copy its URL from the address bar.
3. **Paste the URL**: Paste the URL into the downloader's interface and select your preferred video quality and format.
4. **Download the Video**: Click the download button to start the downloading process. Once complete, the video will be saved to your device.
**Legal Considerations**
While downloading YouTube videos for personal use is generally considered acceptable, it's important to be aware of copyright laws. Downloading videos for commercial use or distribution without permission is illegal.
**Conclusion**
YouTube video downloaders offer a convenient way to access your favorite content offline, without compromising on quality or convenience. By choosing the right downloader and following legal guidelines, you can unlock a world of high-quality content anytime, anywhere. | ssyoutube | |
1,863,029 | How to Find the Perfect App Developer: 10 Essential Traits | When it comes to creating mobile apps that shine, finding the right app developers is key. But with... | 0 | 2024-05-23T16:21:42 | https://dev.to/stevemax237/how-to-find-the-perfect-app-developer-10-essential-traits-5f0k | hiring | When it comes to creating mobile apps that shine, finding the right app developers is key. But with so many options out there, it can feel overwhelming. Fear not! We've got you covered with this guide on [how to hire app developers](https://www.mobileappdaily.com/knowledge-hub/how-to-hire-app-developers?utm_source=dev&utm_medium=hc&utm_campaign=mad). Let's dive into the top 10 qualities you should look for.
**Tech Savvy:**
Seek developers who have a knack for coding in languages like Swift, Kotlin, Java, or JavaScript, depending on your needs.
**A Proven Track Record:**
Take a peek at their portfolio to see if they've built apps that wow.
**Problem-Solving Whizzes:**
Look for developers who love a good challenge and can think outside the box to solve problems.
**Great Communicators:**
Find developers who can explain tech stuff in plain English and work well with others.
**Attention to Detail:**
Look for developers who sweat the small stuff and take pride in their work.
**Quick Learners:**
Seek developers who aren't afraid to dive into new tech and learn on the fly.
**Time Management Pros:**
Find developers who can juggle tasks and meet deadlines without breaking a sweat.
**Passionate About Mobile:**
Look for developers who eat, sleep, and breathe mobile tech.
**Quality-Oriented:**
Seek developers who are all about quality and testing to make sure your app works like a charm.
**Fits Your Team:**
Find developers who vibe with your company culture and share your values.
| stevemax237 |
1,863,028 | Truncating Tables with Foreign Keys in Laravel | When working with databases in Laravel, there are times when you might need to clear data from tables... | 0 | 2024-05-23T16:21:30 | https://dev.to/rafaelogic/truncating-tables-with-foreign-keys-in-laravel-lac | laravel, webdev, programming, tutorial | When working with databases in Laravel, there are times when you might need to clear data from tables while ensuring the integrity of related data. A common task is truncating tables, but it becomes complex when foreign key constraints are involved. Here, I'll walk you through a custom Laravel Artisan command that simplifies truncating tables and removing related foreign keys.
The Command: **`php artisan table:truncate {table}`**
Assuming you already know how to create a command, let's dive straight into the code:
```php
<?php
namespace App\Console\Commands;
use Illuminate\Console\Command;
use Illuminate\Support\Facades\DB;
use Illuminate\Support\Facades\Schema;
class TruncateTableWithForeignKeys extends Command
{
/**
* The name and signature of the console command.
*
* @var string
*/
protected $signature = 'table:truncate {table}';
/**
* The console command description.
*
* @var string
*/
protected $description = 'Truncate a table and delete all related foreign keys';
/**
* Execute the console command.
*
* @return int
*/
public function handle()
{
$table = $this->argument('table');
// Drop foreign keys from other tables referencing this table's uuid column
$this->dropForeignKeysFromReferencingTables($table, 'uuid');
// Drop foreign keys from the specified table
$this->dropForeignKeysFromTable($table);
// Truncate the table
DB::table($table)->truncate();
$this->info("Table {$table} truncated and foreign keys dropped.");
return 0;
}
/**
* Drop foreign keys from the specified table.
*
* @param string $table
* @return void
*/
protected function dropForeignKeysFromTable($table)
{
$foreignKeys = $this->getForeignKeys($table);
Schema::table($table, function ($table) use ($foreignKeys) {
foreach ($foreignKeys as $foreignKey) {
$table->dropForeign($foreignKey);
}
});
}
/**
* Drop foreign keys from tables that reference the specified table's column.
*
* @param string $table
* @param string $column
* @return void
*/
protected function dropForeignKeysFromReferencingTables($table, $column)
{
$schemaManager = DB::getDoctrineSchemaManager();
$databasePlatform = $schemaManager->getDatabasePlatform();
$databasePlatform->registerDoctrineTypeMapping('enum', 'string');
$foreignKeys = [];
foreach ($schemaManager->listTableNames() as $tableName) {
if ($tableName !== $table) {
$foreignKeys[$tableName] = [];
foreach ($schemaManager->listTableForeignKeys($tableName) as $foreignKey) {
if ($foreignKey->getForeignTableName() === $table) {
$foreignKeys[$tableName][] = $foreignKey->getName();
}
}
}
}
foreach ($foreignKeys as $tableName => $keys) {
Schema::table($tableName, function ($table) use ($keys) {
foreach ($keys as $key) {
$table->dropForeign($key);
}
});
}
}
/**
* Get the foreign keys for the specified table.
*
* @param string $table
* @return array
*/
protected function getForeignKeys($table)
{
$schemaManager = DB::getDoctrineSchemaManager();
$keys = $schemaManager->listTableForeignKeys($table);
$foreignKeys = [];
foreach ($keys as $key) {
$foreignKeys[] = $key->getName();
}
return $foreignKeys;
}
}
```
### Scenarios for Using This Command
1. **Development and Testing**
In development or testing environments, you often need to reset the database to a known state. This command ensures that you can truncate tables without worrying about foreign key constraints.
```sh
php artisan table:truncate users
```
2. **Data Migration**
During data migration, you might need to clear tables before importing fresh data. This command ensures that tables are emptied correctly even when foreign key constraints exist.
```sh
php artisan table:truncate orders
```
3. **Cleaning Up Stale Data**
In some applications, certain tables might accumulate a lot of data that needs periodic cleanup. Using this command helps maintain database integrity while performing such operations.
```sh
php artisan table:truncate logs
```
Creating custom Artisan commands in Laravel allows you to encapsulate complex operations into simple, reusable commands. The `TruncateTableWithForeignKeys` command showcases how you can manage table truncation and foreign key constraints efficiently. This command can be particularly useful in development, testing, and data migration scenarios, ensuring your database operations remain smooth and error-free.
Feel free to customize the command further based on your specific needs and scenarios.
### Enjoy! | rafaelogic |
1,863,016 | Lazy Programmer's Guide to Automating Price Calculations | Introduction: Automating Price Calculations for the Lazy (and Efficient) Programmer Let's... | 0 | 2024-05-23T16:17:17 | https://dev.to/jampamatos/lazy-programmers-guide-to-automating-price-calculations-4g93 | python, automation, flask, javascript | ## Introduction: Automating Price Calculations for the Lazy (and Efficient) Programmer
Let's be honest: programmers, whether seasoned or aspiring, are driven by a certain kind of laziness. But this isn't the couch-potato kind; it's the kind that fuels innovation — finding ways to get more done with less effort. Why spend hours on repetitive tasks when a few lines of code can do the job for you?
This mindset has been my guiding star as I navigate my dual roles. In my spare time, I code; from nine to five (seven-thirty to five-thirty, actually), I manage my dad's paint store. Among my various responsibilities, one of the most daunting is calculating the cost and selling prices of our products. Now, if you're familiar with Brazil's tax system, you'll understand the gravity of this task. For those who aren't, let me paint you a picture.
In Brazil, taxes are a labyrinthine mess. Beyond federal tax rates, each state has the liberty to set its own tax rates, creating a patchwork of regulations that would make any sane person’s head spin. Enter 'substituição tributária' (ST) — a regime where the tax burden is transferred from the seller to the manufacturer or importer. The idea is to streamline tax collection, but in practice, it often complicates matters further, especially when products cross state lines.
Calculating these costs manually is not just tedious; it's a soul-crushing endeavor. The repetitive nature of punching numbers into a calculator, cross-referencing tax tables, and double-checking figures is enough to make anyone yearn for a better way. That's where my programmer's "laziness" kicked in. I saw a clear need for automation — a way to take these convoluted calculations and simplify them into a process that’s as effortless as possible.
Thus, the web-based price calculator project was born. Armed with formulas from my accountant and a determination to streamline our pricing workflow, I set out to build a tool that could handle the heavy lifting. This post will take you through the journey of its creation, detailing the code, the challenges, and the solutions. Along the way, I’ll share insights into how you too can leverage programming to simplify complex tasks in your work or business. So, let's dive in and explore how we can turn lazy into efficient.
## The Accountant's Formulas: Transforming Tedious Calculations into Computable Algorithms
In the world of business, especially one as intricate as a paint store, pricing products accurately is crucial. My accountant, the wizard behind our financial curtain, provided me with a set of formulas to calculate our costs and selling prices. These formulas consider various taxes and additional charges that are part and parcel of Brazil's complex tax landscape.
The general formula for calculating the cost is as follows:
```
Cost Price    = Unit Price + ICMS(%) + IPI(%) + Freight(%)
Selling Price = Cost Price + Markup(%) + Additional(%) + Federal Tax (fixed at 7%)
```
It’s important to note that each percentage is applied on top of the previous values, compounding the total cost incrementally.
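Read literally, the formula looks like simple addition, but because each percentage is applied on top of the running total, the arithmetic is multiplicative. A minimal sketch of that compounding (my own illustration of the wording, not code from the app):

```python
def compound(base: float, *rates_pct: float) -> float:
    """Apply each percentage on top of the accumulated total."""
    total = base
    for rate in rates_pct:
        total *= 1 + rate / 100
    return total

# R$ 100 unit price with 12% ICMS, 3.25% IPI and 2% freight:
cost = compound(100.0, 12, 3.25, 2)  # ≈ 117.95, not 100 + 12 + 3.25 + 2
```

The selling-price side stacks the same way: markup, additional charges, and the fixed 7% federal tax each apply on top of the previous total.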
But it's not as straightforward as plugging numbers into a formula. The calculation of ICMS (a state tax) depends on whether the product falls under 'substituição tributária' (ST) or not, and this is where things get interesting.
For products without ST (identified by CFOP codes 5101, 5102, 6101, and 6102):
- If the ICMS rate is 4%, the effective ICMS in the formula becomes 13.27%.
- If the ICMS rate is 12%, the effective ICMS in the formula becomes 7.32%.
For products with ST (identified by CFOP codes 5401, 5403, 5405, 6401, 6403, and 6405):
- If the IPI rate is 0%, the ICMS in the formula is 19.7%.
- If the IPI rate is 1.3%, the ICMS in the formula is 20.3%.
- If the IPI rate is 3.25%, the ICMS in the formula is 21.26%.
- If the IPI rate is 6.5%, the ICMS in the formula is 24%.
I don’t entirely understand the logic behind these values — they’re the magic numbers handed down by my accountant. But I trust her expertise, and these are the values we need to work with.
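Opaque as these magic numbers are, they translate directly into a lookup table. Here is a sketch of how the effective rate could be resolved; the function and constant names are my own illustration, not code from the store's app:

```python
# CFOP codes without substituição tributária vs. with it, per the post.
NO_ST_CFOPS = {"5101", "5102", "6101", "6102"}
ST_CFOPS = {"5401", "5403", "5405", "6401", "6403", "6405"}

def effective_icms(cfop: str, icms_rate: float, ipi_rate: float) -> float:
    """Resolve the invoice's nominal rates to the accountant's effective ICMS (%)."""
    if cfop in NO_ST_CFOPS:
        table, key = {4.0: 13.27, 12.0: 7.32}, icms_rate
    elif cfop in ST_CFOPS:
        table, key = {0.0: 19.7, 1.3: 20.3, 3.25: 21.26, 6.5: 24.0}, ipi_rate
    else:
        raise ValueError(f"Unmapped CFOP: {cfop}")
    if key not in table:
        raise ValueError(f"No effective ICMS for CFOP {cfop} at rate {key}")
    return table[key]
```

Raising on anything outside the accountant's table is deliberate: failing loudly beats silently pricing a product with the wrong rate.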
### Manual Calculations: A Tedious Routine
Before automation, the process was as follows:
1. **Gather Information:** Collect the unit price, ICMS rate, IPI rate, freight percentage (if any), markup percentage, and additional charges.
2. **Apply Formulas:** Using the provided formulas, manually calculate the effective ICMS rate based on the CFOP code and applicable ICMS/IPI rates.
3. **Calculate Cost Price:** Add up all the percentages and apply them to the unit price.
4. **Calculate Selling Price:** Add the calculated cost price to the markup, additional charges, and federal tax.
5. **Repeat:** Perform these steps for each product, using a calculator to ensure accuracy.
This manual process was not only time-consuming but also prone to errors. Every miscalculation could lead to incorrect pricing, impacting our profit margins or making us uncompetitive.
### From Manual to Algorithmic
The beauty of these steps is that they form an algorithm, a series of computable steps. And this is where automation shines. By translating these steps into code, I could create a tool that performs these calculations instantly and accurately. No more tedious number crunching, no more manual errors — just a streamlined, efficient process that ensures we get our pricing right every time.
In the next section, I'll walk you through how I transformed these formulas into a web-based price calculator, turning hours of work into a task that takes mere seconds.
## Implementing the First Part: From Supplier Invoices to a Functional MVP
In our paint store, every supplier sends us invoices in the form of XML files. These XML files are standardized by the Federal Revenue Service of Brazil, ensuring they follow a consistent structure. This consistency makes it feasible to automate the extraction of necessary data for our price calculations.
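The routes below call an `extract_data_from_xml` helper that isn't shown in the post; here is a sketch of what that extraction can look like. A caveat: the tag names (`det`, `prod`, `xProd`, `vUnCom`, `pICMS`, `pIPI`) follow the common NFe 4.0 layout and are my assumption, and real invoices would need more defensive handling:

```python
import xml.etree.ElementTree as ET

NFE_NS = {"nfe": "http://www.portalfiscal.inf.br/nfe"}

def extract_data_from_xml(file, namespaces=NFE_NS):
    """Pull the per-item fields our calculator needs out of an NFe invoice."""
    root = ET.parse(file).getroot()
    products = []
    # Each <det> element is one invoice line item.
    for det in root.iter("{http://www.portalfiscal.inf.br/nfe}det"):
        prod = det.find("nfe:prod", namespaces)
        picms = det.find(".//nfe:pICMS", namespaces)  # nested under the ICMS group
        pipi = det.find(".//nfe:pIPI", namespaces)    # nested under IPITrib
        products.append({
            "Nome do Produto": prod.findtext("nfe:xProd", default="", namespaces=namespaces),
            "CFOP": prod.findtext("nfe:CFOP", default="", namespaces=namespaces),
            "Valor Unitário": float(prod.findtext("nfe:vUnCom", default="0", namespaces=namespaces)),
            "Alíq. ICMS": float(picms.text) if picms is not None else 0.0,
            "Alíq. IPI": float(pipi.text) if pipi is not None else 0.0,
        })
    return products
```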
### Technologies Used
To build this solution, we chose a stack that includes:
- **Flask:** A lightweight web framework for Python that makes it easy to set up routes and handle requests.
- **Jinja2:** The templating engine used by Flask to render HTML pages dynamically.
- **JavaScript (with jQuery):** To handle client-side interactions and make AJAX requests for dynamic updates.
### Setting Up Routes
For the initial implementation, we set up two main routes: `index` and `calculate`.
#### `index` Route
The `index` route serves the main page of our application. It renders the form that allows users to upload an XML file, input the markup percentage, additional charges, and freight costs. We also set default values for these inputs to streamline the user experience.
```python
@app.route('/')
def index():
return render_template('index.html', product_data=[], markup=50, additional=5, freight=0, include_tax=False)
```
#### `calculate` Route
The `calculate` route processes the uploaded XML file. It extracts the necessary data, applies the formulas provided by my accountant, and calculates the cost and selling prices. This route then renders the results back to the `index.html` template.
```python
@app.route('/calculate', methods=['POST'])
def calculate():
if 'xmlFile' not in request.files:
return redirect(url_for('index'))
file = request.files['xmlFile']
if file.filename == '':
return redirect(url_for('index'))
markup = float(request.form['markup'])
additional = float(request.form['additional'])
freight = float(request.form['freight'])
include_tax = 'federalTax' in request.form
namespaces = {'nfe': 'http://www.portalfiscal.inf.br/nfe'}
product_data = extract_data_from_xml(file, namespaces)
for product in product_data:
cost_price = calculate_cost_price(product['Valor Unitário'], product['Alíq. ICMS'], product['Alíq. IPI'], product['CFOP'], freight)
selling_price = calculate_selling_price(cost_price, markup, additional, include_tax)
product['Preço de Custo Final'] = cost_price
product['Preço de Venda'] = selling_price
return render_template('index.html', product_data=product_data, markup=markup, additional=additional, freight=freight, include_tax=include_tax)
```
### Crafting the MVP
The Minimum Viable Product (MVP) focused on getting the basic functionality right. This included:
1. **Form Setup:** An HTML form for uploading XML files and inputting calculation parameters (markup, additional charges, freight, and federal tax inclusion).
2. **Data Extraction:** Using Python's `xml.etree.ElementTree` to parse the XML files and extract relevant product data.
3. **Calculation Logic:** Implementing the cost and selling price formulas in Python.
4. **Result Display:** Dynamically rendering the calculated prices back to the user through the HTML template.
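The routes above lean on `calculate_cost_price` and `calculate_selling_price`, which aren't shown in the post. Here is a plausible reconstruction from the accountant's formulas in the previous section; this is my sketch under those assumptions, not the production code:

```python
# Effective-ICMS tables from the accountant (see the rules earlier in the post).
NO_ST_ICMS = {4.0: 13.27, 12.0: 7.32}                      # CFOP 5101/5102/6101/6102
ST_ICMS = {0.0: 19.7, 1.3: 20.3, 3.25: 21.26, 6.5: 24.0}   # CFOP 54xx/64xx (ST)

def calculate_cost_price(unit_price, icms_rate, ipi_rate, cfop, freight):
    icms_rate, ipi_rate = float(icms_rate), float(ipi_rate)
    if str(cfop) in ("5101", "5102", "6101", "6102"):
        effective_icms = NO_ST_ICMS.get(icms_rate, icms_rate)
    else:  # substituição tributária: effective ICMS is keyed on the IPI rate
        effective_icms = ST_ICMS.get(ipi_rate, icms_rate)
    cost = float(unit_price)
    for pct in (effective_icms, ipi_rate, float(freight)):
        cost *= 1 + pct / 100  # each percentage compounds on the running total
    return cost

def calculate_selling_price(cost_price, markup, additional, include_tax):
    price = cost_price * (1 + float(markup) / 100) * (1 + float(additional) / 100)
    if include_tax:
        price *= 1.07  # fixed 7% federal tax
    return price
```

Note the fallback to the nominal ICMS for unmapped rates; in practice you might prefer to raise an error instead, which is one motivation for the configurable rates discussed in the future-iterations section.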
Here’s the `index.html` template for the form and result display:
```html
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<title>Calculadora de Preço de Venda</title>
<link rel="stylesheet" href="/static/style.css">
<script src="https://code.jquery.com/jquery-3.6.0.min.js"></script>
<script type="application/json" id="productData">
{{ product_data|tojson }}
</script>
<script type="application/json" id="additionalData">
{{ additional }}
</script>
<script type="application/json" id="freightData">
{{ freight }}
</script>
<script type="application/json" id="includeTaxData">
{{ include_tax|tojson }}
</script>
<script src="/static/script.js"></script>
</head>
<body>
<h1>Calculadora de Preço de Venda</h1>
<form action="/calculate" method="post" enctype="multipart/form-data">
<label for="xmlFile">Carregar XML:</label>
<input type="file" id="xmlFile" name="xmlFile" accept=".xml"><br><br>
<label for="markup">Markup (%):</label>
<input type="number" id="markup" name="markup" step="0.1" value="50"><br><br>
<label for="additional">Valor Adicional (%):</label>
<input type="number" id="additional" name="additional" step="0.1" value="5"><br><br>
<label for="freight">Frete (%):</label>
<input type="number" id="freight" name="freight" step="0.1" value="0"><br><br>
<label for="federalTax">Incluir Imposto Federal (7%):</label>
<input type="checkbox" id="federalTax" name="federalTax" checked><br><br>
<button type="submit">Calcular</button>
</form>
{% if product_data %}
<h2>Dados do Formulário:</h2>
<p>
Markup Selecionado: {{ markup }}%<br>
Valor Adicional Selecionado: {{ additional }}%<br>
Frete Adicionado: {{ freight }}%<br>
Incluir Imposto Federal: {{ 'Sim' if include_tax else 'Não' }}
</p>
<h2>Cálculo dos Preços:</h2>
<table border="1">
<tr>
<th>Produto</th>
<th>CFOP</th>
<th>Custo Unit</th>
<th>Aliq. ICMS</th>
<th>Aliq. IPI</th>
<th>Frete</th>
<th>Imposto Federal</th>
<th>Markup (%)</th>
<th>Custo Final</th>
<th>Venda Final</th>
</tr>
{% for product in product_data %}
<tr>
<td>{{ product['Nome do Produto'] }}</td>
<td>{{ product['CFOP'] }}</td>
<td>R$ {{ product['Valor Unitário'] }}</td>
<td>{{ product['Alíq. ICMS'] }}</td>
<td>{{ product['Alíq. IPI'] }}</td>
<td>{{ freight }}%</td>
<td>{{ 'Sim' if include_tax else 'Não' }}</td>
<td><input type="number" class="markup-input" step="0.1" value="{{ markup }}" data-index="{{ loop.index0 }}"></td>
<td>R$ {{ '%.2f' | format(product['Preço de Custo Final']) }}</td>
<td>R$ {{ '%.2f' | format(product['Preço de Venda']) }}</td>
</tr>
{% endfor %}
</table>
{% endif %}
</body>
</html>
```
With this MVP in place, we had a functional tool that could handle the complexities of our pricing calculations, saving us time and reducing the potential for errors. But this was just the beginning.
## Adding Interactivity: Editable Markup Fields for Dynamic Price Updates
In the real world, the need for flexibility in pricing is paramount. Sometimes, different products on the same invoice require different markup percentages. Manually re-uploading the XML file and recalculating prices each time a markup needs to be adjusted is not only inefficient but also prone to errors. To address this, I introduced an interactive, editable markup field that updates the selling price automatically. This is where JavaScript comes into play.
### Enhancing User Experience with JavaScript
To achieve this dynamic behavior, we needed to:
1. Make the markup field editable directly within the product table.
2. Use JavaScript to detect changes in the markup field and trigger recalculations.
3. Send the updated data to the server via AJAX and update the selling price without refreshing the page.
Here's a detailed breakdown of the implementation:
#### 1. Updating the HTML Template
First, we needed to ensure the markup field in our HTML table was editable and had the necessary attributes for our JavaScript to identify and manipulate it.
```html
<td><input type="number" class="markup-input" step="0.1" value="{{ markup }}" data-index="{{ loop.index0 }}"></td>
```
This line creates an input field within the table cell, sets its initial value to the current markup, and adds a `data-index` attribute to keep track of the product's position in the list.
#### 2. JavaScript for Real-Time Updates
We then added a JavaScript file to handle the dynamic updates. This script listens for changes in the markup input fields and makes AJAX requests to update the prices accordingly.
Here’s the content of `static/script.js`:
```javascript
$(document).ready(function() {
const productData = JSON.parse(document.getElementById('productData').textContent);
const additional = JSON.parse(document.getElementById('additionalData').textContent);
const freight = JSON.parse(document.getElementById('freightData').textContent);
const includeTax = JSON.parse(document.getElementById('includeTaxData').textContent);
$('.markup-input').on('input', function() {
let markup = $(this).val();
let index = $(this).data('index');
updatePrices(markup, index);
});
function updatePrices(markup, index) {
$.ajax({
url: '/update_prices',
method: 'POST',
contentType: 'application/json',
data: JSON.stringify({
markup: markup,
index: index,
product_data: productData,
additional: additional,
freight: freight,
include_tax: includeTax
}),
success: function(data) {
$('table').find('tr').eq(index + 1).find('td').eq(9).text('R$ ' + data.new_selling_price.toFixed(2));
}
});
}
});
```
This script does the following:
- Reads the product data and other relevant values from the hidden JSON elements in the HTML.
- Listens for input events on the markup fields.
- Sends an AJAX request to the server whenever a markup value changes, passing the updated markup and necessary data.
- Updates the displayed selling price with the new value returned from the server.
#### 3. Handling AJAX Requests in Flask
We needed a new route in our Flask app to handle the AJAX requests and perform the necessary recalculations.
Here’s the updated `main.py`:
```python
@app.route('/update_prices', methods=['POST'])
def update_prices():
data = request.get_json()
index = data['index']
markup = float(data['markup'])
additional = data['additional']
freight = data['freight']
include_tax = data['include_tax']
product_data = data['product_data']
product = product_data[index]
cost_price = calculate_cost_price(product['Valor Unitário'], product['Alíq. ICMS'], product['Alíq. IPI'], product['CFOP'], freight)
new_selling_price = calculate_selling_price(cost_price, markup, additional, include_tax)
return jsonify({'new_selling_price': new_selling_price})
```
This route:
- Receives the updated markup and product data via a JSON payload.
- Recalculates the cost and selling prices using the same formulas as before.
- Returns the new selling price to the client, which JavaScript then updates in the table.
#### 4. Adjusting the HTML Template
Finally, ensure that the HTML template includes the necessary scripts and hidden JSON data.
Here’s the updated `index.html`:
```html
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<title>Calculadora de Preço de Venda</title>
<link rel="stylesheet" href="/static/style.css">
<script src="https://code.jquery.com/jquery-3.6.0.min.js"></script>
<script type="application/json" id="productData">
{{ product_data|tojson }}
</script>
<script type="application/json" id="additionalData">
{{ additional }}
</script>
<script type="application/json" id="freightData">
{{ freight }}
</script>
<script type="application/json" id="includeTaxData">
{{ include_tax|tojson }}
</script>
<script src="/static/script.js"></script>
</head>
<body>
<h1>Calculadora de Preço de Venda</h1>
<form action="/calculate" method="post" enctype="multipart/form-data">
<label for="xmlFile">Carregar XML:</label>
<input type="file" id="xmlFile" name="xmlFile" accept=".xml"><br><br>
<label for="markup">Markup (%):</label>
<input type="number" id="markup" name="markup" step="0.1" value="50"><br><br>
<label for="additional">Valor Adicional (%):</label>
<input type="number" id="additional" name="additional" step="0.1" value="5"><br><br>
<label for="freight">Frete (%):</label>
<input type="number" id="freight" name="freight" step="0.1" value="0"><br><br>
<label for="federalTax">Incluir Imposto Federal (7%):</label>
<input type="checkbox" id="federalTax" name="federalTax" checked><br><br>
<button type="submit">Calcular</button>
</form>
{% if product_data %}
<h2>Dados do Formulário:</h2>
<p>
Markup Selecionado: {{ markup }}%<br>
Valor Adicional Selecionado: {{ additional }}%<br>
Frete Adicionado: {{ freight }}%<br>
Incluir Imposto Federal: {{ 'Sim' if include_tax else 'Não' }}
</p>
<h2>Cálculo dos Preços:</h2>
<table border="1">
<tr>
<th>Produto</th>
<th>CFOP</th>
<th>Custo Unit</th>
<th>Aliq. ICMS</th>
<th>Aliq. IPI</th>
<th>Frete</th>
<th>Imposto Federal</th>
<th>Markup (%)</th>
<th>Custo Final</th>
<th>Venda Final</th>
</tr>
{% for product in product_data %}
<tr>
<td>{{ product['Nome do Produto'] }}</td>
<td>{{ product['CFOP'] }}</td>
<td>R$ {{ product['Valor Unitário'] }}</td>
<td>{{ product['Alíq. ICMS'] }}</td>
<td>{{ product['Alíq. IPI'] }}</td>
<td>{{ freight }}%</td>
<td>{{ 'Sim' if include_tax else 'Não' }}</td>
<td><input type="number" class="markup-input" step="0.1" value="{{ markup }}" data-index="{{ loop.index0 }}"></td>
<td>R$ {{ '%.2f' | format(product['Preço de Custo Final']) }}</td>
<td>R$ {{ '%.2f' | format(product['Preço de Venda']) }}</td>
</tr>
{% endfor %}
</table>
{% endif %}
</body>
</html>
```
With these enhancements, we transformed a static form into an interactive tool that allows for real-time price adjustments. This not only streamlines our workflow but also provides flexibility in pricing, making the tool much more powerful and user-friendly. In the next section, I'll discuss potential future iterations and improvements to further enhance the functionality and usability of this project.
## Future Iterations: Enhancing Functionality and User Experience
With the basic functionality in place, there are several enhancements that can make the price calculator even more robust and user-friendly. Here are the next steps I'm considering for future iterations:
### 1. Configurations Page
Currently, the tax rates and CFOP codes are hard-coded into the application. This approach works for an MVP, but it's not scalable. Tax rates can change, and new CFOP codes with different tax regimes can emerge. To address this, I plan to implement a configurations page where users can update these values directly within the app.
- **Editable Tax Rates and CFOP Codes:** Allow users to add, edit, and delete tax rates and CFOP codes.
- **Persistent Settings:** Store these configurations in a database to ensure they are retained between sessions.
### 2. Drag-and-Drop XML Upload
To improve the user experience, I want to add drag-and-drop functionality for XML uploads. This would make the process more intuitive and quicker.
- **Drag-and-Drop Interface:** Implement a drag-and-drop zone for users to easily upload XML files.
- **Fallback to Button Upload:** Maintain the current file upload button as a fallback option.
### 3. Hosting on Store’s Domain
Currently, the application is running on a Heroku server. To give it a more professional look and ensure better accessibility, I plan to host it on the store's domain.
- **Custom Domain:** Migrate the app to be hosted on the store's domain.
- **Enhanced Accessibility:** Ensure the app is easily accessible to all users, including employees and customers.
### 4. Improved Aesthetics with CSS
While functionality is king, aesthetics shouldn't be neglected. A well-designed interface can significantly enhance the user experience. I plan to use CSS to make the page more visually appealing.
- **Modern UI Design:** Implement a clean, modern design for the app using CSS.
- **Responsive Design:** Ensure the app works well on different devices, including desktops, tablets, and smartphones.
## Conclusion
The journey of creating this web-based price calculator has been both challenging and rewarding. From automating tedious manual calculations to adding dynamic, real-time updates, each step has made the app more efficient and user-friendly. But the journey doesn't end here. With future iterations focusing on configurability, user experience, and professional hosting, the app will continue to evolve and serve its purpose even better.
Stay tuned for more updates as I continue to enhance and refine this project. If you're on a similar journey or have any suggestions, I'd love to hear from you! Let's keep innovating and making our lives (and jobs) a bit easier, one line of code at a time. | jampamatos |
1,863,015 | Exploring the Treasure Trove of Older OPPO Phones: A Nostalgic Journey and Smart Choices | OPPO, the famous phone brand from China, is known for stylish smartphones that... | 0 | 2024-05-23T16:08:59 | https://dev.to/tatcacacdongdienthoaioppo/kham-pha-kho-tang-dien-thoai-oppo-doi-cu-hanh-trinh-hoai-niem-va-lua-chon-sang-gia-5fh2 | | OPPO, the famous phone brand from China, is known for stylish smartphones, impressive cameras, and affordable prices. Alongside its newest models, however, the older OPPO generations hold an appeal of their own, offering users budget-friendly choices, stable performance, and distinctive designs.
This article will take you back in time through [all the OPPO phone lines](https://vntre.vn/tat-ca-cac-dong-dien-thoai-oppo-tu-truoc-den-nay-a2969.html) of years past, helping you find the one that best fits your needs and budget.
**1. A journey through the older OPPO phone lines**
OPPO's history spans many generations of phones, each leaving its own distinctive mark:
Find series: Opening OPPO's campaign to conquer the mobile market, the Find line stood out with premium, elegant phones and impressive cameras. Highlights include the Find 5, Find 7, and Find X, with unique designs and powerful performance.
OPPO Find 5
N series: Building on that success, OPPO launched the N line with many camera and design improvements. The N1 and N3 stood out with their rotating cameras, offering a unique photography experience.
OPPO N1
R series: The R line targeted the mid-range segment, drawing users in with fashionable designs, impressive selfie cameras, and reasonable prices. The R7, R9, and R11 are the standout names that helped cement OPPO's position in the market.
OPPO R11
A series: Aimed at younger users, the A line delivers budget phones with stable performance and good cameras. The A3, A5, and A7 are typical representatives, well loved by users.
F series: The F line is a perfect choice for students, with affordable prices, youthful designs, and quality selfie cameras. The F1, F3, and F5 are the standout older OPPO phones in this line.
>>> See more: https://insightmaker.com/insight/68SVyzhUhHXHOaroNyyfck/tatcacacdongdienthoaioppo
**2. Benefits of owning an older OPPO phone**
Older OPPO phones bring users some surprising benefits:
Low price: Compared with the newest models, older OPPO phones cost significantly less, helping you save money.
Stable performance: While not as powerful as modern flagships, older OPPO phones still handle everyday needs well, such as calls, texting, web browsing, and watching videos.
Unique design: Many older OPPO lines have distinctive designs that look like nothing else on the market, letting you express your personal style.
Easy to repair: Replacement parts for older OPPO lines are plentiful and cheap, making repairs straightforward when problems arise.
>>> See more: https://www.proarti.fr/account/tatcacacdongdien
**3. Things to keep in mind when buying an older OPPO phone**
To get the best experience from an older OPPO phone, keep the following in mind:
Identify your usage needs: Be clear about what you need so you can choose a suitable model. For example, if you need a phone for gaming, pick lines with strong performance such as the Find X or Reno.
Inspect the device before buying: Carefully check the exterior, screen, camera, and basic functions before purchasing to make sure everything works well.
▶️ Full article: https://vntre.vn/tat-ca-cac-dong-dien-thoai-oppo-tu-truoc-den-nay-a2969.html
#tatcacacdongdienthoaioppo, #vntre, #thongtindientu
| tatcacacdongdienthoaioppo | |
1,863,014 | recently i found this new AI travel chat on whatsapp +1 (305) 224-6621 , been using it for finding new activities | A post by Farah Youssef | 0 | 2024-05-23T16:08:50 | https://dev.to/farah_youssef_6666874d8ed/recently-i-found-this-new-ai-travel-chat-on-whatsapp-1-305-224-6621-been-using-it-for-finding-new-activities-1h33 | discuss | farah_youssef_6666874d8ed | |
1,863,013 | Arcanium.art v1.0 Launch | After what felt like an eternity in Beta, I'm thrilled to roll out Arcanium.art version 1.0 to the... | 0 | 2024-05-23T16:08:47 | https://dev.to/jmkweb/arcaniumart-v10-launch-491d | ai, webdev, beta, javascript | After what felt like an eternity in Beta, I'm thrilled to roll out [Arcanium.art](https://arcanium.art/) version 1.0 to the masses. It's been a rollercoaster of bugs, eureka moments, and a fair share of coffee-fueled late nights. But now, it's ready for prime time!
For those out of the loop—and let's be honest, that’s probably most of you—Arcanium is a Stable Diffusion art generator that prides itself on a slick user experience and foolproof simplicity. It survived a grueling six months in Alpha, where I juggled user feedback, tackled bugs, occasionally threw my hands up in frustration, took some sanity breaks, and then dove back in. Just your typical startup drama.
The loudest piece of feedback was clear: “Let me dip my toes in before I dive into the premium pool!” So, I’ve tweaked a few knobs and buttons to make that happen:
- Enjoy 5 free credits refreshed daily to get your art fix.
- Enter our Community Showcase competition to snag 250 credits—on the house.
- Our new Discord Bot is playing cop, keeping the showcase entries clean and the art generation orderly.
- We've spruced up the app to feel more like, well, an app, and less like a secret society.
- Choose between flaunting your creations publicly or keeping them under wraps.
- Upgraded models and sharper, more curated LoRA’s.
- Plus membership is a steal at just $9 for unlimited access.
As always, I’m here, learning on the fly and tweaking as we go. Is a product ever really finished, or just less beta? | jmkweb |
1,863,012 | Defensive Functions and Input Validation in Python: Ensuring Error-Free Code | Writing defensive functions and validating input are essential for creating efficient and error-free... | 0 | 2024-05-23T16:08:41 | https://dev.to/myexamcloud/defensive-functions-and-input-validation-in-python-ensuring-error-free-code-nmp | python, programming, software, coding | Writing defensive functions and validating input are essential for creating efficient and error-free code. While type-hinting can provide helpful documentation, it is not enough to ensure the correct execution of a function. Developers must manually check and enforce the pre-conditions for a function to work properly and handle edge cases where invalid inputs are passed.
One example of this is calculating interest using the principal, rate, and years as input parameters. While the operation may seem simple, it is important to ensure that the input values are within a valid range for the function to execute correctly. This means that certain conditions must be met for the function to function properly.
To ensure this, we can do the following steps:
1. Use type hints to specify the expected data types for each input parameter.
2. Check the input types using conditional statements and raise type errors if the conditions are not satisfied.
3. Check for valid values of each input parameter and raise value errors if any of them are invalid.
4. Perform the calculation only if all the pre-conditions are met.
Let's take a look at how this would be implemented in Python:
```python
from typing import Union

def calculate_interest(
    principal: Union[int, float],
    rate: float,
    years: int
) -> Union[int, float]:
    if not isinstance(principal, (int, float)):
        raise TypeError("Principal must be an integer or float")
    if not isinstance(rate, float):
        raise TypeError("Rate must be a float")
    if not isinstance(years, int):
        raise TypeError("Years must be an integer")
    if principal <= 0:
        raise ValueError("Principal must be positive")
    if rate <= 0:
        raise ValueError("Rate must be positive")
    if years <= 0:
        raise ValueError("Years must be positive")

    interest = principal * rate * years
    return interest
```
Here, we are using conditional statements to validate the input and raising appropriate errors if any of the conditions are not met. This not only ensures the correct execution of the function but also improves its robustness by handling unexpected inputs.
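As a quick illustration (a usage sketch, not part of the original article), this is how those checks surface at the call site — valid input returns a number, while bad input raises the specific error:

```python
def calculate_interest(principal, rate, years):
    # Same defensive checks as above, condensed for this demo.
    if not isinstance(principal, (int, float)):
        raise TypeError("Principal must be an integer or float")
    if not isinstance(rate, float):
        raise TypeError("Rate must be a float")
    if not isinstance(years, int):
        raise TypeError("Years must be an integer")
    if principal <= 0 or rate <= 0 or years <= 0:
        raise ValueError("Principal, rate, and years must all be positive")
    return principal * rate * years

print(calculate_interest(1000.0, 0.05, 3))  # valid input -> prints 150.0

try:
    calculate_interest("1000", 0.05, 3)     # principal has the wrong type
except TypeError as exc:
    print(f"TypeError: {exc}")

try:
    calculate_interest(1000.0, -0.05, 3)    # rate is out of range
except ValueError as exc:
    print(f"ValueError: {exc}")
```

Callers can then catch `TypeError` and `ValueError` separately and react appropriately, instead of debugging a wrong result later.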
It is worth noting that while Python also has assertion statements that can be used for input validation, they are not recommended for this purpose as they can easily be disabled and lead to unexpected behavior in production. Using explicit conditional statements is a best practice for enforcing pre-conditions, post-conditions, and code invariants.
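To make that point concrete (a small sketch, not from the original article): `assert` statements are stripped entirely when Python runs with the `-O` flag, so validation written this way silently vanishes in optimized runs:

```python
def calculate_interest_unsafe(principal, rate, years):
    # Looks like validation, but `python -O script.py` removes every assert,
    # so bad input would flow straight through in an optimized run.
    assert principal > 0, "Principal must be positive"
    assert rate > 0, "Rate must be positive"
    assert years > 0, "Years must be positive"
    return principal * rate * years

# __debug__ is True only when asserts are active (i.e. -O was NOT used).
print(f"Asserts active: {__debug__}")

try:
    calculate_interest_unsafe(-1000, 0.05, 3)
except AssertionError as exc:
    print(f"Caught only because -O was not used: {exc}")
```

Explicit `raise TypeError(...)` / `raise ValueError(...)` checks, by contrast, run in every mode.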
In conclusion, writing defensive functions and validating input is crucial for creating reliable and efficient code. By following best practices and using proper input validation techniques, we can ensure that our functions work as intended and handle any potential errors seamlessly.
MyExamCloud Study Plans
[Java Certifications Practice Tests](https://www.myexamcloud.com/onlineexam/javacertification.courses) - MyExamCloud Study Plans
[Python Certifications Practice Tests](https://www.myexamcloud.com/onlineexam/python-certification-practice-tests.courses) - MyExamCloud Study Plans
[AWS Certification Practice Tests](https://www.myexamcloud.com/onlineexam/aws-certification-practice-tests.courses) - MyExamCloud Study Plans
[Google Cloud Certification Practice Tests](https://www.myexamcloud.com/onlineexam/google-cloud-certifications.courses) - MyExamCloud Study Plans
[MyExamCloud Aptitude Practice Tests Study Plan](https://www.myexamcloud.com/onlineexam/aptitude-practice-tests.course)
[MyExamCloud AI Exam Generator](https://www.myexamcloud.com/onlineexam/testgenerator.ai) | myexamcloud |
1,863,011 | HOW TO DELETE A PUBLIC IP ADDRESS IN MICROSOFT AZURE | Microsoft Azure provides a robust cloud platform that offers numerous services, including the... | 0 | 2024-05-23T16:04:01 | https://dev.to/atony07/how-to-delete-a-public-ip-address-in-microsoft-azure-5h7p | Microsoft Azure provides a robust cloud platform that offers numerous services, including the allocation of public IP addresses for virtual machines and other resources. However, these public IP addresses can incur charges even when they are not actively in use. For those on a pay-as-you-go subscription, these costs can quickly add up. To manage your expenses efficiently, it is essential to delete any unused public IP addresses. Here’s a step-by-step guide on how to do this.
Steps to Delete a Public IP Address in Microsoft Azure
Step 1: Search for and select Public IP addresses.

Step 2: Select the IP address you want to remove.

Step 3: Click Dissociate first, so that the IP address stops incurring charges from Microsoft.

Step 4: Click Yes to confirm the dissociation.

Step 5: Click Delete.

Step 6: Select Yes to confirm the deletion.
 | atony07 | |
1,852,782 | Pursuing a Master's in Computer Science Without a CS Background: My Journey to the University of York | I've just been accepted into the University of York's online MSc course in Computer Science. During... | 0 | 2024-05-23T16:01:05 | https://dev.to/ken-amenomori/pursuing-a-masters-in-computer-science-without-a-cs-background-my-journey-to-the-university-of-york-5a45 | computerscience | I've just been accepted into the University of York's online MSc course in Computer Science. During my application process, I found many articles and blogs about postgraduate options helpful. However, there was scant information about the University of York, which made my decision-making process challenging. As I considered several schools, I want to share my experiences and the reasons behind my choice of York. I hope this article will assist you in making an informed decision, just as the other blogs helped me.
## My Background
I've been working as a software engineer while completing my undergraduate degree in law. Despite successfully managing in my role without a formal computer science background, I decided to pursue a master's degree to solidify my knowledge base and align with my peers professionally.
## Criteria
Given my background, I had a specific set of criteria for selecting a university. It’s no secret that not many schools are open to candidates without a traditional computer science background. If you’re looking to break into the field like I am, you might relate to the challenges I faced. Here’s what I looked for in a program:
* **No requirement for a CS bachelor’s degree**
* Availability of an online course (or a campus located in Japan)
* Reasonable tuition fees
* Ability to start studying as soon as possible
* Offers research opportunities (no specific topic in mind, but I was eager to explore this aspect of academia)
These criteria guided my search and ultimately influenced my decision on where to apply.
## School I chose
I chose [the University of York][1] for my online MSc course. While it may not be as widely recognized internationally as some other institutions in computer science, the University of York offers an excellent program for students without a traditional CS background. They provide online courses at a reasonable tuition fee of £9,000. The university requires a grade higher than 2:2, which is equivalent to a GPA between 2.8 and 4.0. Although my GPA was on the lower side, it still met their requirements. As a non-native English speaker, I had to submit my TOEFL score.
A significant advantage of York's program is the minimal documentation required for admission: just my alma mater's transcript and certification to prove my English proficiency. This streamlined the application process, eliminating the need for personal statements or letters of recommendation from my employer. Moreover, the start date was ideal; I applied in May, and the class begins on June 24th, allowing me to quickly focus on my studies.
Although it may not carry the same immediate name recognition in specific circles as other schools, the University of York is a member of the Russell Group, which comprises 24 leading UK universities. Given my specific needs and circumstances, it was the most suitable choice.
## Other options I considered
* [Georgia Tech][2]
Georgia Tech also offers online courses known as OMSCS. While a CS-related degree is preferable, it's not mandatory. Applicants need a 3.0 or higher GPA and a TOEFL score above 100 (IELTS is also acceptable). Three references are required, one from someone with an academic background. The course costs only $7,000. Georgia Tech is highly prestigious in the US. Although a CS degree is typically required, I've seen some students admitted without one, so it is worth a shot even if you don't have one.
However, I chose not to pursue this due to the lengthy preparation time, which didn't align with my goal to start studying as soon as possible.
* [University of Pennsylvania][3]
Penn offers the online Master of Computer and Information Technology (MCIT) degree. This program doesn't require a CS degree at all. Applicants should have a GPA of 3.0 or higher and a TOEFL score of 100 or more (IELTS is not accepted). It requires at least two letters of recommendation, multiple personal statements, and a resume. Penn is a member of the Ivy League, which is impressive until you see the tuition fees; they're also Ivy League, but my bank account isn't. It costs more than $26,000. Plus, they don't exactly offer a computer science degree; it's more of a computer and information technology degree.
* [Japan Advanced Institute of Science and Technology][4]
Technically, they don't offer an online course, but they have attractive part-time graduate programs. Although it was one of the best options, the courses are primarily conducted in Japanese, and I was keen on improving my English. FYI, if you choose this school, you really need to devote yourself to research.
* University of the People
This is actually an undergraduate institution, offering tuition-free education, which could have been an appealing option. However, since it's an undergraduate program, it would require committing four years or more, a longer duration than I was willing to invest.
* University of London
Similar to the University of the People, the University of London would have required spending three years, which was not feasible for my plans.
## Finally...
I have chosen the University of York for my MSc in Computer Science and am eager to see how it unfolds. I will provide updates after completing the first module. I can't wait to start the course!
[1]:https://online.york.ac.uk/study-online/msc-computer-science-online/
[2]:https://omscs.gatech.edu/deadlines-decisions-requirements-and-guidelines
[3]:https://online.seas.upenn.edu/degrees/mcit-online/master-of-computer-and-information-technology/
[4]:https://www.jaist.ac.jp/english/ | ken-amenomori |
1,857,847 | talkd.ai got accepted into the Github Accelerator! (also our first official release) | Talkd.ai Dialog First official release and GitHub Accelerator announcement | 0 | 2024-05-23T16:00:00 | https://dev.to/vmesel/talkdai-got-accepted-into-the-github-accelerator-also-our-first-official-release-1ofc | chatgpt, gpt4o, talkdai | ---
title: talkd.ai got accepted into the Github Accelerator! (also our first official release)
published: true
description: Talkd.ai Dialog First official release and GitHub Accelerator announcement
tags: chatgpt, gpt4o, talkdai
cover_image: https://github.blog/wp-content/uploads/2024/05/github-accelerator-2024.jpeg?resize=1484%2C679
published_at: 2024-05-23 13:00 -0300
---
Hey there!
If you still don’t know us, we are [talkd.ai](https://talkd.ai), an open-source organization that is maintaining Dialog, a project focused on letting you easily deploy any LLM that you want (currently any of those available in the Langchain and partner libraries - we will cover more on that later).
Today, we are very grateful for announcing two things:
1. This is our first official public release, our "Numero Uno" and a starting point to this adventure that will be long and fun.
2. Now it's official: We got accepted in the GitHub Accelerator 2024 AI Cohort - a Cohort full of amazing people that started on April 22nd.
## What is the GitHub Accelerator?

The GitHub Accelerator is running its second cohort with a slightly different approach from the usual: it focuses on making open source sustainable and helping maintainers find sustainable ways to fund full-time work on their projects.
During the accelerator, you will be connected to references from fields such as AI, InfoSec, Successful Open Source maintainers, investment funds, and many other professionals who will guide you through the many possibilities of open-source funding.
The program also provides a stipend for 10 weeks, allowing you and your team to focus full-time on your project's development and communications, plus credits for OpenAI and Microsoft Azure.
## Back to the project's history: How did the project start?

The project started to help us ([Thiago Avelino](https://github.com/avelino) and I) create chat experiences that resembled human behavior in answering frequently asked questions inside our contexts (Avelino inside Buser and my context of wanting to learn more about LLM deployments and maintenance).
Still, as the project grew, our contexts changed a lot, and the need for different techniques, retrievers, optimizations, and plugins grew with it.
Nowadays, the project allows you to deploy any model that respects Langchain LCELs or our libraries' chain model, you can choose which one to maintain and set up from there.
The process of getting to where we are right now involved lots of people, but I would like to give special thanks to:
- [Luan Fernandes](https://github.com/lgabs/) - our Langchain specialist and long-time contributor
- [Walison Filipe](https://github.com/walison17/) - our FastAPI master and testing guru
- Gregg and Kevin from the GitHub Accelerator - for helping us improve our software through communications, mentorship, invaluable resources, and connections
- Andreas, Alicia, Namee, and Jurgen - you have amazing projects and invaluable lessons on improving pitching.
## What can I do with talkd.ai/dialog?
[talkd.ai/dialog](https://github.com/talkdai/dialog) lets you deploy any LLM that you want (with the code already adapted to the LCEL or AbstractLLM models) in 5 minutes.
With a simple `docker-compose up` you have sample data and a sample prompt up and running.
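For readers who haven't used Compose before, the pattern looks roughly like the sketch below. This is a hypothetical minimal file, not dialog's actual `docker-compose.yml` — the service names, image, ports, and environment variables are all assumptions; check the repository for the real one:

```yaml
# Illustrative only -- not talkd/dialog's real compose file.
services:
  db:
    image: pgvector/pgvector:pg16        # Postgres with the pgvector extension
    environment:
      POSTGRES_PASSWORD: talkdai
  app:
    build: .                             # the LLM API service
    environment:
      OPENAI_API_KEY: ${OPENAI_API_KEY}
      DATABASE_URL: postgresql://postgres:talkdai@db:5432/postgres
    ports:
      - "8000:8000"
    depends_on:
      - db
```

With a file like this in place, `docker-compose up` starts both containers and wires the app to the vector database.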
If you want to customize, you can use our library: [dialog-lib](https://github.com/talkdai/dialog-lib), to implement custom RAGs and Retrievers. We are fully integrated with SQLAlchemy, PGVector, Anthropic, and OpenAI.
Here is our quick product demo to show you how simple it is to deploy our software:
{% embed https://www.youtube.com/watch?v=XUm5iqPyAYo %}
Here is the [official release link from GitHub](https://github.blog/2024-05-23-2024-github-accelerator-meet-the-11-projects-shaping-open-source-ai/). | vmesel |
1,863,010 | How to create a Windows Virtual Machine using a QuickStart Templates | Azure QuickStart Templates are pre-configured templates provided by Microsoft and the community to... | 0 | 2024-05-23T15:59:06 | https://dev.to/atony07/how-to-create-a-windows-virtual-machine-using-a-quickstart-templates-3lj2 | Azure QuickStart Templates are pre-configured templates provided by Microsoft and the community to facilitate the deployment of various Azure resources and solutions quickly and easily. These templates are built using Azure Resource Manager (ARM) templates. They are one of the fastest ways to create an app on Azure.
Benefits of Using QuickStart Templates
• Timesaving: Quickly deploy complex solutions without having to configure each component manually.
• Reliability: Deployments are consistent and reliable, following best practices.
• Ease of Use: Suitable for both beginners and experienced users, providing a straightforward way to deploy Azure resources.
• Scalability: Easily scalable configurations that can be adjusted to meet changing requirements.
How to use a QuickStart Template
Step 1: Log in to your Microsoft Azure account.
Step 2: Search for “Deploy a custom template”.

Step 3: Select a common template to create; in this case, I selected “Create a Windows virtual machine”.

Step 4: Choose a resource group, username, and password of your choice.

Step 5: Select Review + create, and wait for validation to complete.

Step 6: Once deployment is complete, select “Go to resource group”.

Step 7: The virtual machine created is named Simple-vm.

| atony07 | |
1,863,008 | Buy verified cash app account | https://dmhelpshop.com/product/buy-verified-cash-app-account/ Buy verified cash app account Cash... | 0 | 2024-05-23T15:57:16 | https://dev.to/kennethsmithhjyfdr/buy-verified-cash-app-account-41fl | webdev, javascript, beginners, programming | ERROR: type should be string, got "https://dmhelpshop.com/product/buy-verified-cash-app-account/\n\n\n\n\nBuy verified cash app account\nCash app has emerged as a dominant force in the realm of mobile banking within the USA, offering unparalleled convenience for digital money transfers, deposits, and trading. As the foremost provider of fully verified cash app accounts, we take pride in our ability to deliver accounts with substantial limits. Bitcoin enablement, and an unmatched level of security.\n\nOur commitment to facilitating seamless transactions and enabling digital currency trades has garnered significant acclaim, as evidenced by the overwhelming response from our satisfied clientele. Those seeking buy verified cash app account with 100% legitimate documentation and unrestricted access need look no further. Get in touch with us promptly to acquire your verified cash app account and take advantage of all the benefits it has to offer.\n\nWhy dmhelpshop is the best place to buy USA cash app accounts?\nIt’s crucial to stay informed about any updates to the platform you’re using. If an update has been released, it’s important to explore alternative options. Contact the platform’s support team to inquire about the status of the cash app service.\n\nClearly communicate your requirements and inquire whether they can meet your needs and provide the buy verified cash app account promptly. 
If they assure you that they can fulfill your requirements within the specified timeframe, proceed with the verification process using the required documents.\n\nOur account verification process includes the submission of the following documents: [List of specific documents required for verification].\n\nGenuine and activated email verified\nRegistered phone number (USA)\nSelfie verified\nSSN (social security number) verified\nDriving license\nBTC enable or not enable (BTC enable best)\n100% replacement guaranteed\n100% customer satisfaction\nWhen it comes to staying on top of the latest platform updates, it’s crucial to act fast and ensure you’re positioned in the best possible place. If you’re considering a switch, reaching out to the right contacts and inquiring about the status of the buy verified cash app account service update is essential.\n\nClearly communicate your requirements and gauge their commitment to fulfilling them promptly. Once you’ve confirmed their capability, proceed with the verification process using genuine and activated email verification, a registered USA phone number, selfie verification, social security number (SSN) verification, and a valid driving license.\n\nAdditionally, assessing whether BTC enablement is available is advisable, buy verified cash app account, with a preference for this feature. It’s important to note that a 100% replacement guarantee and ensuring 100% customer satisfaction are essential benchmarks in this process.\n\nHow to use the Cash Card to make purchases?\nTo activate your Cash Card, open the Cash App on your compatible device, locate the Cash Card icon at the bottom of the screen, and tap on it. Then select “Activate Cash Card” and proceed to scan the QR code on your card. Alternatively, you can manually enter the CVV and expiration date. 
How To Buy Verified Cash App Accounts.\n\nAfter submitting your information, including your registered number, expiration date, and CVV code, you can start making payments by conveniently tapping your card on a contactless-enabled payment terminal. Consider obtaining a buy verified Cash App account for seamless transactions, especially for business purposes. Buy verified cash app account.\n\nWhy we suggest to unchanged the Cash App account username?\nTo activate your Cash Card, open the Cash App on your compatible device, locate the Cash Card icon at the bottom of the screen, and tap on it. Then select “Activate Cash Card” and proceed to scan the QR code on your card.\n\nAlternatively, you can manually enter the CVV and expiration date. After submitting your information, including your registered number, expiration date, and CVV code, you can start making payments by conveniently tapping your card on a contactless-enabled payment terminal. Consider obtaining a verified Cash App account for seamless transactions, especially for business purposes. Buy verified cash app account. Purchase Verified Cash App Accounts.\n\nSelecting a username in an app usually comes with the understanding that it cannot be easily changed within the app’s settings or options. This deliberate control is in place to uphold consistency and minimize potential user confusion, especially for those who have added you as a contact using your username. In addition, purchasing a Cash App account with verified genuine documents already linked to the account ensures a reliable and secure transaction experience.\n\n \n\nBuy verified cash app accounts quickly and easily for all your financial needs.\nAs the user base of our platform continues to grow, the significance of verified accounts cannot be overstated for both businesses and individuals seeking to leverage its full range of features. 
How To Buy Verified Cash App Accounts.\n\nFor entrepreneurs, freelancers, and investors alike, a verified cash app account opens the door to sending, receiving, and withdrawing substantial amounts of money, offering unparalleled convenience and flexibility. Whether you’re conducting business or managing personal finances, the benefits of a verified account are clear, providing a secure and efficient means to transact and manage funds at scale.\n\nWhen it comes to the rising trend of purchasing buy verified cash app account, it’s crucial to tread carefully and opt for reputable providers to steer clear of potential scams and fraudulent activities. How To Buy Verified Cash App Accounts. With numerous providers offering this service at competitive prices, it is paramount to be diligent in selecting a trusted source.\n\nThis article serves as a comprehensive guide, equipping you with the essential knowledge to navigate the process of procuring buy verified cash app account, ensuring that you are well-informed before making any purchasing decisions. Understanding the fundamentals is key, and by following this guide, you’ll be empowered to make informed choices with confidence.\n\n \n\nIs it safe to buy Cash App Verified Accounts?\nCash App, being a prominent peer-to-peer mobile payment application, is widely utilized by numerous individuals for their transactions. However, concerns regarding its safety have arisen, particularly pertaining to the purchase of “verified” accounts through Cash App. This raises questions about the security of Cash App’s verification process.\n\nUnfortunately, the answer is negative, as buying such verified accounts entails risks and is deemed unsafe. Therefore, it is crucial for everyone to exercise caution and be aware of potential vulnerabilities when using Cash App. 
How To Buy Verified Cash App Accounts.\n\nCash App has emerged as a widely embraced platform for purchasing Instagram Followers using PayPal, catering to a diverse range of users. This convenient application permits individuals possessing a PayPal account to procure authenticated Instagram Followers.\n\nLeveraging the Cash App, users can either opt to procure followers for a predetermined quantity or exercise patience until their account accrues a substantial follower count, subsequently making a bulk purchase. Although the Cash App provides this service, it is crucial to discern between genuine and counterfeit items. If you find yourself in search of counterfeit products such as a Rolex, a Louis Vuitton item, or a Louis Vuitton bag, there are two viable approaches to consider.\n\n \n\nWhy you need to buy verified Cash App accounts personal or business?\nThe Cash App is a versatile digital wallet enabling seamless money transfers among its users. However, it presents a concern as it facilitates transfer to both verified and unverified individuals.\n\nTo address this, the Cash App offers the option to become a verified user, which unlocks a range of advantages. Verified users can enjoy perks such as express payment, immediate issue resolution, and a generous interest-free period of up to two weeks. With its user-friendly interface and enhanced capabilities, the Cash App caters to the needs of a wide audience, ensuring convenient and secure digital transactions for all.\n\nIf you’re a business person seeking additional funds to expand your business, we have a solution for you. Payroll management can often be a challenging task, regardless of whether you’re a small family-run business or a large corporation. How To Buy Verified Cash App Accounts.\n\nImproper payment practices can lead to potential issues with your employees, as they could report you to the government. 
However, worry not, as we offer a reliable and efficient way to ensure proper payroll management, avoiding any potential complications. Our services provide you with the funds you need without compromising your reputation or legal standing. With our assistance, you can focus on growing your business while maintaining a professional and compliant relationship with your employees. Purchase Verified Cash App Accounts.\n\nA Cash App has emerged as a leading peer-to-peer payment method, catering to a wide range of users. With its seamless functionality, individuals can effortlessly send and receive cash in a matter of seconds, bypassing the need for a traditional bank account or social security number. Buy verified cash app account.\n\nThis accessibility makes it particularly appealing to millennials, addressing a common challenge they face in accessing physical currency. As a result, ACash App has established itself as a preferred choice among diverse audiences, enabling swift and hassle-free transactions for everyone. Purchase Verified Cash App Accounts.\n\n \n\nHow to verify Cash App accounts\nTo ensure the verification of your Cash App account, it is essential to securely store all your required documents in your account. This process includes accurately supplying your date of birth and verifying the US or UK phone number linked to your Cash App account.\n\nAs part of the verification process, you will be asked to submit accurate personal details such as your date of birth, the last four digits of your SSN, and your email address. If additional information is requested by the Cash App community to validate your account, be prepared to provide it promptly. Upon successful verification, you will gain full access to managing your account balance, as well as sending and receiving funds seamlessly. 
Buy verified cash app account.\n\n \n\nHow cash used for international transaction?\nExperience the seamless convenience of this innovative platform that simplifies money transfers to the level of sending a text message. It effortlessly connects users within the familiar confines of their respective currency regions, primarily in the United States and the United Kingdom.\n\nNo matter if you’re a freelancer seeking to diversify your clientele or a small business eager to enhance market presence, this solution caters to your financial needs efficiently and securely. Embrace a world of unlimited possibilities while staying connected to your currency domain. Buy verified cash app account.\n\nUnderstanding the currency capabilities of your selected payment application is essential in today’s digital landscape, where versatile financial tools are increasingly sought after. In this era of rapid technological advancements, being well-informed about platforms such as Cash App is crucial.\n\nAs we progress into the digital age, the significance of keeping abreast of such services becomes more pronounced, emphasizing the necessity of staying updated with the evolving financial trends and options available. Buy verified cash app account.\n\nOffers and advantage to buy cash app accounts cheap?\nWith Cash App, the possibilities are endless, offering numerous advantages in online marketing, cryptocurrency trading, and mobile banking while ensuring high security. As a top creator of Cash App accounts, our team possesses unparalleled expertise in navigating the platform.\n\nWe deliver accounts with maximum security and unwavering loyalty at competitive prices unmatched by other agencies. Rest assured, you can trust our services without hesitation, as we prioritize your peace of mind and satisfaction above all else.\n\nEnhance your business operations effortlessly by utilizing the Cash App e-wallet for seamless payment processing, money transfers, and various other essential tasks. 
1,863,001 | The Mold Guys in USA | The Mold Guys are Florida's premier mold removal and water damage restoration experts, boasting over... | 0 | 2024-05-23T15:45:50 | https://dev.to/moldguys_71df4e120ccad39c/the-mold-guys-in-usa-4dof | mold, moldguys, moldremoval, waterdamage | [The Mold Guys](https://moldguys.us/services/mold-removal/) are Florida's premier mold removal and water damage restoration experts, boasting over 25 years of industry experience. Our licensed professionals provide comprehensive services, including mold inspection, testing, remediation, and prevention, ensuring your home remains safe and healthy. We are committed to customer satisfaction, offering 24/7 emergency services and utilizing state-of-the-art equipment for effective and efficient mold removal. Trust The Mold Guys for reliable, prompt, and professional service. | moldguys_71df4e120ccad39c |
1,862,673 | HTML Tags You Might Not Know About | Hello Devs👋 In this post, I will share some new and helpful html tags which are added in HTML5 to... | 0 | 2024-05-23T15:44:50 | https://dev.to/dev_kiran/html-tags-you-might-not-know-about-3gk7 | webdev, html, beginners, programming | Hello Devs👋
In this post, I will share some new and helpful HTML tags added in HTML5 that let you write cleaner, faster code for complex, dynamic, engaging, and effective websites.
Let's get started🚀
## dialog
➡ Now you can easily create a dialog box or popup window with the `<dialog>` tag. It’s a great way to create custom modal dialogs without relying heavily on **JavaScript**.
```html
<dialog id="myDialog">
<p>This is a dialog box</p>
<button onclick="document.getElementById('myDialog').close()">Close
</button>
</dialog>
<button onclick="document.getElementById('myDialog').showModal()">Open Dialog
</button>
```
## template
➡ The `<template>` tag is used as a container for holding client-side content that you don’t want to display when the page loads. This content can be cloned and inserted into the document using **JavaScript**.
```html
<button onclick="showContent()">Show hidden content</button>
<template>
<h2>Hello, This is Kiran</h2>
<p>Thanks for reading this</p>
</template>
<script>
function showContent() {
let temp = document.getElementsByTagName("template")[0];
let clon = temp.content.cloneNode(true);
document.body.appendChild(clon);
}
</script>
```
## picture
➡ With the `<picture>` tag you can define multiple sources for an image, and the browser chooses the best one based on screen size and resolution. This is particularly useful for **responsive** design.
```html
<picture>
<source media="(min-width:650px)" srcset="img_pink_flowers.jpg">
<source media="(min-width:465px)" srcset="img_white_flower.jpg">
<img src="img_orange_flowers.jpg" alt="Flowers" style="width:auto;">
</picture>
```
## meter
➡ The `<meter>` tag can be used for representing a scalar measurement within a known range, like disk usage or the relevance of a query result. It helps to **visually** display values within a range.
```html
<label for="diskUsage">Disk Usage:</label>
<meter id="diskUsage" value="0.6">60%</meter>
```
## output
➡ The `<output>` tag represents the result of a calculation. It can be used with **JavaScript** to display computed **results** directly in the page.
```html
<form oninput="result.value=parseInt(a.value)+parseInt(b.value)">
<input type="number" id="a" value="50"> +
<input type="number" id="b" value="25"> =
<output name="result" for="a b">75</output>
</form>
```
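The inline `oninput` handler above is just integer addition on the two input values; as a standalone sketch in plain JavaScript (outside the DOM), the same calculation looks like this:

```javascript
// The same calculation the inline handler performs: parse the two
// <input> string values as base-10 integers and add them.
function addInputs(aValue, bValue) {
  return parseInt(aValue, 10) + parseInt(bValue, 10);
}

console.log(addInputs("50", "25")); // 75, the <output> element's default
```

Passing the explicit radix `10` is a small safety improvement over the inline handler, which omits it for brevity.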
## progress
➡ The `<progress>` tag represents the completion progress of a **task**, such as a download or file upload.
```html
<label for="fileProgress">File upload progress:</label>
<progress id="fileProgress" value="70" max="100">70%</progress>
```
## mark
➡ The `<mark>` tag is used to highlight text. It’s particularly useful for search result pages where you want to **highlight** the matching text.
```html
<p>The word <mark>highlighted</mark> is important.</p>
```
## abbr
➡ The `<abbr>` tag is used to define an abbreviation or acronym, providing a full **description** in the **title** attribute.
```html
<p>I'm a true <abbr title="Marvel Cinematic Universe">MCU</abbr> fan.</p>
```
## time
➡ The `<time>` tag is used to represent dates, times, or durations. It’s useful for making your time-related data machine-readable.
```html
<p>The concert starts at <time datetime="20:00">8 PM</time>.</p>
```
## bdi
➡ The `<bdi>` tag is used to isolate a part of text that might be formatted in a different direction from other text outside it. It ensures that your web content remains **consistent** and readable, no matter what languages or text **directions** are involved.
```html
<ul>
<li>Product: <bdi>ABC1234</bdi></li>
<li>Product: <bdi>مرحبا5678</bdi></li>
</ul>
```
## wbr
➡ The `<wbr>` tag specifies where the text can break into a **new line**, if necessary. This is useful for long words or URLs.
```html
<p>Thisisaverylongword<wbr>thatmightneedbreaking.</p>
```
## main
➡ The `<main>` tag is used to specify the **main content** of the document. It should only be used once per page, and it excludes content that is repeated across documents like headers, footers, navigation, and sidebars.
```html
<main>
<h1>Welcome to my blog post</h1>
<p>Today we will learn some new html tags</p>
</main>
```
## figcaption
➡ The `<figcaption>` tag is used for providing a **caption** to the figure.
```html
<figure>
<img src="Thanos.jpg" alt="Thanos image">
<figcaption>Thanos snapping his fingers</figcaption>
</figure>
```
That's it for this article.👍
Thank you for reading❤
Find Me on 👉 {% cta https://x.com/kiran__a__n %} X {% endcta %} {% cta https://github.com/Kiran1689 %} GitHub {% endcta %}
{% embed https://dev.to/dev_kiran %}
| dev_kiran |
1,862,997 | Enhancing PHP Projects With SQL Server | PHP is one of the most widely used scripting languages in the web development industry, enabling... | 0 | 2024-05-23T15:35:39 | https://dev.to/devartteam/enhancing-php-projects-with-sql-server-3la7 | sql, sqlserver, devart, dbforge | PHP is one of the most widely used scripting languages in the web development industry, enabling programmers to easily create dynamic and interactive web applications. Data management is essential to many PHP applications' functioning, and it frequently calls for the integration of strong database solutions. Now, let's examine how to link a PHP project to SQL Server, which is a well-regarded RDBMS due to its extensive feature set, scalability, and dependability.
We'll look at how to combine an SQL Server database with a PHP project in this article. Additionally, we'll show you how to use dbForge Studio, a potent GUI tool, to properly utilize SQL Server's capabilities: https://blog.devart.com/php-connect-to-sql-server.html
| devartteam |
1,862,995 | Navigating MSN Customer Service: msnemail.net.in | Understanding MSN's Services and Product Offerings MSN, or Microsoft Network, is a web portal and... | 0 | 2024-05-23T15:24:17 | https://dev.to/msn_support/navigating-msn-customer-service-msnemailnetin-4p49 | **Understanding MSN's Services and Product Offerings**

MSN, or Microsoft Network, is a web portal and collection of online services provided by Microsoft. As one of the leading technology companies in the world, MSN offers a wide range of services and product offerings to cater to the diverse needs of its users.
At its core, MSN provides access to news, entertainment, and information from various sources. Users can stay up-to-date with the latest headlines, read in-depth articles, and explore a variety of topics, including sports, finance, lifestyle, and more.
Beyond news and content, MSN also offers a suite of productivity tools and services. This includes email, calendar, and contact management through Outlook.com, as well as cloud storage and file-sharing capabilities with OneDrive. MSN also integrates seamlessly with other Microsoft products, such as Office 365, allowing users to access and manage their work and personal documents from a centralized platform.
For those seeking more advanced features, MSN provides subscription-based plans that unlock additional benefits. These premium offerings may include ad-free browsing, exclusive content, enhanced security features, and priority customer support, among other perks.
Overall, MSN's comprehensive services and product offerings make it a valuable resource for individuals and businesses alike, providing a one-stop-shop for a wide range of digital needs.
**Common MSN Customer Service Issues and Troubleshooting Tips**
MSN, the popular web portal and internet service, can sometimes encounter technical issues that can be frustrating for users. In this blog section, we'll explore some of the most common MSN customer service problems and provide helpful troubleshooting tips.
One of the most frequent MSN-related issues is trouble accessing your account. This could be due to forgotten login credentials, account lockouts, or other authentication problems. If you're having trouble signing in, reach out to MSN customer service for assistance resetting your password or verifying your account information.
MSN email is another area where users often encounter difficulties. Email delivery failures, inbox issues, and syncing problems are all common email-related complaints. MSN's technical support team can help diagnose and resolve email-specific concerns.
For any other MSN-related technical issues, the MSN customer service phone number is the best resource. Their trained support representatives can walk you through troubleshooting steps, provide guidance on MSN features and functionality, and escalate more complex problems as needed.
By being aware of these common **[MSN customer service](url)** problems and knowing how to access the right support resources, you can get your MSN account and services back up and running smoothly.
**Contacting MSN Customer Service: Channels and Response Times**
When you need assistance with your MSN account or have questions about MSN services, there are several channels available to contact MSN customer service. Understanding the different options and typical response times can help you get the support you need efficiently.
Phone Support: MSN offers a customer service phone number that you can call to speak with a representative. The phone line is available 24/7, and representatives are typically able to assist with account-related inquiries and troubleshooting.
Live Chat: MSN provides a live chat feature on their website, allowing you to connect with a support agent in real-time. Live chat is a convenient option for getting quick answers to your questions.
Email Support: You can also reach out to MSN via email. While email support may take longer to receive a response compared to phone or live chat, it can be a good option for more complex issues that require detailed explanations or documentation.
Social Media Support: MSN maintains active social media accounts, such as Twitter and Facebook, where you can contact their support team. This can be useful for general inquiries or to escalate an issue.
Response times can vary depending on the channel and the complexity of your request. Phone and live chat support typically provide the fastest response, while email and social media support may take up to a few business days. Knowing these options can help you choose the most appropriate way to **contact MSN customer service** for your needs.
**Resolving MSN Billing and Subscription Inquiries**
When it comes to managing your MSN account and subscriptions, it's important to have a clear understanding of the billing process and available support options. This blog section aims to provide informative guidance on resolving common MSN billing and subscription-related inquiries.
**MSN Billing Support**
For any issues related to your MSN billing, such as incorrect charges, failed payments, or subscription management, you can reach out to the MSN customer support team. They have dedicated representatives who can assist you in understanding your billing statements, processing refunds, or making changes to your subscription plan.
**MSN Subscription Management**
Managing your MSN subscriptions can be done through your online account portal. Here, you can view your active subscriptions, renew or cancel them, and update your payment information as needed. If you encounter any difficulties with your subscription, the support team can help you navigate the process.
**MSN Refund Policy**
MSN has a clear refund policy in place for eligible situations, such as accidental purchases or dissatisfaction with the service. The support team can guide you through the refund request process and ensure that your concerns are addressed in a timely manner.
By understanding these key aspects of MSN billing and subscriptions, you can effectively resolve any issues that may arise and maintain a seamless experience with your MSN account.
**Navigating MSN's Self-Help Resources and Community Support**
MSN offers a comprehensive suite of self-help resources and community support to assist users with a wide range of questions and issues. The MSN Help Center provides detailed user guides, troubleshooting articles, and step-by-step instructions on how to make the most of the platform's various features and functionalities.
For users seeking additional support or wanting to connect with others, the MSN Community Forums offer a vibrant space to share experiences, ask questions, and find solutions. The forums are monitored by MSN moderators and populated by knowledgeable users who are eager to lend a helping hand.
Whether you need guidance on setting up your MSN account, troubleshooting a technical problem, or simply looking to engage with the broader MSN community, the available self-help resources and community support can prove invaluable in navigating the platform with confidence and ease.
**Empowering Yourself with MSN Customer Service Knowledge**
As the digital landscape continues to evolve, understanding how to effectively navigate customer service channels has become increasingly important. One such platform that has proven invaluable is MSN, which offers a wealth of resources to empower users and resolve issues efficiently.
By familiarizing yourself with the **MSN customer service** ecosystem, you can take control of your online experiences and find the support you need, when you need it. From troubleshooting technical problems to exploring account management options, the knowledge gained can be a game-changer in ensuring a seamless and satisfactory user experience.
Whether you're a seasoned MSN user or new to the platform, investing time to understand the available customer service tools and protocols can pay dividends. By leveraging this information, you'll be better equipped to advocate for your needs, identify the right channels for assistance, and ultimately, find resolutions to your queries in a timely and effective manner.
| msn_support | |
1,862,990 | Deliver your Glitch site through Fastly | We’ve been working on a series of Glitch projects to help you learn how to make your websites faster... | 0 | 2024-05-23T15:24:04 | https://blog.glitch.com/post/deliver-your-site-through-fastly/ | cdn, domains, webdev | We’ve been working on a series of Glitch projects to help you learn how to make your websites faster and more reliable using Fastly. In the two years since Glitch became part of Fastly, we’ve been finding ways to unlock opportunities for our community through Fastly’s powerful platform. We’ve also been exploring how to make technologies like CDNs and edge computing accessible to a wider audience using Glitch!
## A new learning project
**The new [~learn-website-delivery](https://glitch.com/~learn-website-delivery) app walks you through the process of setting up Fastly CDN for your Glitch site and putting your own domain in front of it with free TLS.** Using Fastly with your site gives you a performance boost beyond what you can achieve through Glitch alone. In a few short steps you’ll have your site traffic running through Fastly at your own domain, and you can customize your service config in the Fastly control panel at any time.
We developed this project for inclusion in the Fastly documentation site, so it has a [companion tutorial](http://fastly.com/documentation/solutions/tutorials/deliver-your-site/) walking you through the same steps. We’re taking inspiration from the many developer experience exemplars around the industry who use Glitch projects in their docs to onboard users!
## ~~Dogfooding~~ Fishfooding?
We developed the website delivery learning resources through employee training within Fastly. We used Glitch to enable our coworkers inside the company to get started using Fastly, by giving them a Glitch website to use with the CDN. Glitch empowered people across teams to get stuck into our technical product, engaging hands-on regardless of their role or background. Using the project for employee training allowed us to test and refine the learning material, hopefully making it more effective for everyone.
The steps in the guide also show you how to get started measuring performance for your websites. Using your browser tools, you can gain insights into what’s happening in the network when someone visits your site. You’ll also be able to see the detail of how Fastly delivers your Glitch project assets like images automatically through the CDN.

## Try it now!
If you want to try Fastly but without using a Glitch website you can do that! The steps in the [tutorial](http://fastly.com/documentation/solutions/tutorials/deliver-your-site/) will guide you through the flow for your own setup. And if you don’t have a domain you want to use, that’s fine too! You can use a test domain courtesy of Fastly, to discover the performance benefits of caching your site at the edge.
🗞️ **Stay tuned for more apps showing you the many ways in which you can enhance your Glitch website UX with Fastly.**
📣 **If you have feedback on the project or the experience of using Fastly with Glitch, [please share in the forum](https://support.glitch.com)!**
| suesmith |
1,862,993 | Introduction to Devin - The First AI Software Developer | Software is everywhere these days, from your phone to your self-driving cars! But building it can... | 0 | 2024-05-23T15:21:43 | https://dev.to/azubuikeduru/introduction-to-devin-the-first-ai-software-developer-59cm | ai, developers |

Software is everywhere these days, from your phone to self-driving cars! But building it can be slow and complex, like putting together a complicated puzzle. Programmers have to write tons of code by hand, which takes a long time and can lead to mistakes.
Imagine a world where you tell your computer what program you want, and it writes the code all by itself! No more needing to know those complicated coding languages or spending ages fixing tiny mistakes. That's the kind of future [Devin](https://www.cognition-labs.com/introducing-devin), an amazing new AI, could bring about.
Devin isn't just a simple tool, it's like a super smart teammate for programmers. It can understand what you want a program to do, write the code to make it happen, and even find and fix any problems in the code. This isn't just a dream, Devin is a real AI that's already here, changing the way we make computer programs. Let's delve into the exciting world of AI-powered software development, exploring its potential to revolutionize the way we build applications in fields ranging from healthcare and finance to entertainment and beyond.
## The Problems of Traditional Software Development
While current software development processes have served us well, they are not without their challenges:
- **Time-consuming:** The manual nature of coding can be a significant bottleneck, hindering project timelines and innovation. Repetitive tasks and debugging can eat up valuable developer resources.
- **Limited Scalability:** Creating complex software usually needs big teams and specialized skills. This makes it hard for smaller companies or niche projects to keep up.
- **Human Error:** Even the most skilled programmers can make mistakes, leading to bugs and security vulnerabilities.
These problems are a pain, but AI can help us fix them. By automating tasks, making sure the code is written correctly, and opening the door for more people to build software, AI promises to make things faster, better, and easier for everyone.
## Unveiling Devin: The Prodigy of AI Software Development
The story of [Devin](https://www.cognition-labs.com/introducing-devin) doesn't begin with a sudden flash of insight but with the tireless dedication of a team of passionate engineers at [Cognition](https://www.cognition-labs.com). Driven by the dream of revolutionizing software development, they poured their expertise into building an AI unlike any other. Countless trials, setbacks, and late nights fueled by coffee and determination eventually led to a breakthrough – Devin, the world's first AI software engineer, was born.
Devin isn't like other coding assistants such as [Copilot](https://github.com/features/copilot). While Copilot suggests lines of code, Devin can actually create entire programs by itself based on instructions. It's a game-changer. Imagine a world where intricate software applications materialize at the speed of thought. Devin possesses this very ability. It can churn out lines of clean, efficient code at an astonishing rate, leaving human programmers free to focus on the bigger picture. But Devin's brilliance extends beyond code generation. It's a debugging whiz, able to identify and rectify errors with an almost unnerving precision. Those head-scratching bugs that plague projects for weeks? Devin can sniff them out and eliminate them in a fraction of the time.

[Source](https://youtu.be/fjHtjT7GO1c?si=-2MD9UGm-NnilbSB)
Imagine this scenario: David, a lead developer, is working on a complex application. He outlines the core functionality to Devin, and within minutes, the AI has generated a skeletal framework. David reviews the code, impressed by its efficiency and clarity. He then throws a curveball – a new feature needs integration. Devin takes this in stride, researching the necessary libraries and seamlessly incorporating the feature into the existing codebase. As David delves deeper, he encounters a logical error. No problem – Devin pinpoints the issue, suggests a fix, and even writes unit tests to ensure the bug is truly squashed. By the end of the day, David has made significant progress, all thanks to Devin's silent yet powerful collaboration.
Devin isn't just an AI; it's a testament to human creativity. It signals a new phase in software development, where humans and machines team up to shape the future, step by step. The potential is vast, and Devin is ready to lead the charge into a new era of innovation.
While the claim of being the "first" AI software engineer is debated, Devin's ability to tackle real-world coding challenges is undeniable. It aced tests where it fixed issues in projects like Django, solving way more problems (almost 14%) than previous AI models like [GPT-4](https://openai.com/research/gpt-4). Devin even surpasses them when they receive hints (like which files to change)! This shows Devin excels at independently understanding and solving complex coding challenges. However, Devin's experience is limited. It's been tested on a small set of projects, so we need to see how it performs with broader coding tasks.

[Source](https://www.cognition.ai/introducing-devin)
Integrating Devin with existing workflows and alongside human coders will also be crucial to determining its true impact.
## The Impact of Devin
So, how will Devin change things? Here are some ways it can make building software awesome:
- **Turbocharged Productivity:** Imagine development cycles slashed in half! Devin's ability to automate repetitive tasks and generate code at lightning speed will free up developers' time, allowing them to focus on more complex problems and innovative solutions.
- **Unleashing Creativity:** With Devin handling the heavy lifting, developers can shift their focus from the mundane to the magnificent. They can delve deeper into the creative aspects of software design, user experience, and groundbreaking features.
- **Power to the People:** Devin has the potential to democratize software development. Individuals with limited coding experience can leverage Devin's capabilities to bring their ideas to life. This opens doors for citizen developers, entrepreneurs, and anyone with a spark of innovation.
However, with any groundbreaking technology, there are potential concerns to consider:
* **Job Displacement:** While Devin automates tasks, it probably won't replace programmers entirely. Instead, programmers will work with Devin, using its speed and their expertise to build even better things.
* **Ethical Considerations:** Devin's power necessitates responsible use. Programmers and developers need to be careful when building and using Devin to ensure it benefits everyone.
## Future Predictions of Devin
Let's explore how Devin could shape the foreseeable future:
### Devin in Design
Devin isn't just about coding faster. It's about changing how we work. Imagine Devin understanding what a program needs to do and then generating design mockups based on that, turning the program's purpose directly into a user-friendly, good-looking interface with no separate design teams and no long back-and-forth between designers and developers.
Early design mockups are super important for getting feedback and refining ideas. If Devin could produce visuals based on what the program should do, communication would speed up and the whole development process would move quicker. Everyone involved could see what the program will look like right from the start, which means faster decisions and less need for changes later on.
Imagine cutting development time in half because design and development happen at the same time. Devin's design skills could mean we no longer need a separate design phase, saving loads of time and getting products out faster. But don't worry, designers won't lose their jobs. Devin would just help set things up, so designers can focus on making things perfect for users.
### Devin in Future AI Development
Devin's not just about changing software development; it could also shake up Artificial Intelligence (AI) itself. Here's how:
- Training the Next Generation: Devin's really good at understanding how algorithms work and finding patterns. It could analyze tons of data to figure out what current AI models are good at and where they need improvement. Then, it could create specific training plans to make future AI models even better.
- Making Training Easier: Training AI models can be really complicated and take a long time. But Devin could make it faster by figuring out the best data to use for training and adjusting settings as needed. This means new AI applications could be developed quicker, leaving more time and resources for researchers and developers.
- Getting Better Results: As AI models get more complex, training them becomes harder. But with Devin's help, the process could become much smoother. This could lead to even more powerful AI applications in different fields. And don't worry, Devin won't replace human AI developers; it'll just work alongside them to make things easier and faster.
### Devin Going Beyond Programming Languages
Devin's potential goes way beyond just coding or training AI models. It could become a versatile AI that adapts to different technical tasks. Picture this: what if Devin could understand complex systems like networks, security, and hardware? It could automate tasks and optimize configurations across various technical areas, saving time and resources.
Imagine Devin seamlessly navigating through complex technical integrations, analyzing different systems and APIs to find the best connections and even generating code for integration. Its versatility wouldn't just benefit software developers or AI trainers; it could help network engineers, system administrators, and professionals in various industries. This collaboration between Devin and human expertise promises to make technical tasks easier and faster, pushing innovation forward.
### To Gain Access to Devin
While Devin AI is still under development and unavailable for public use, Cognition Labs, the company behind Devin, might offer early access in the future. This would likely involve an initiative where potential users would need to meet specific criteria to be considered.
To gain access to Devin, you can either [join the waitlist](https://docs.google.com/forms/d/e/1FAIpQLScHG0Kuxf9rVLR2Ceamr9qq85YLxKPx8fxdQeBr5TwvYEsPUg/viewform) or reach out to cognition lab via email at info@cognition-labs.com.
## Conclusion
Devin's arrival marks a significant advancement in software development. It's not just about speeding up coding; it's about human-AI collaboration. Imagine programmers and Devin as partners: Devin handles the mundane tasks, while programmers use their creativity to come up with new ideas. This teamwork will redefine how we build software, making it faster and better than ever before.
Of course, with any new invention, there are challenges. But by developing AI responsibly and ethically, Devin can become a true force for good. The future of software development is bright! With Devin on our side, we can create amazing new things, from innovative apps to a deeper understanding of how humans and AI can work together. Let's get excited about this future, together!
## Resources
Below are additional resources to enrich your understanding of Devin AI:
* [For further insights into Devin, consider reading this resource](https://medium.com/@kristiyan.velkov/meet-devin-the-worlds-first-ai-software-engineer-f0c35f221bdd).
* [Explore how Devin AI will revolutionize our work dynamics by watching this video](https://youtu.be/or1xia6epeM?si=YYzytd2TRA9XrcD0).
* [Delve deeper into the potential of human-AI collaboration through this insightful read](https://inclusioncloud.com/insights/blog/ai-human-collaboration/). | azubuikeduru |
1,862,991 | SHB | The variety on the SHB menu(see here) is truly impressive. Their classic cheeseburger is a hit,... | 0 | 2024-05-23T15:19:29 | https://dev.to/palman24/shb-23f9 | The variety on the SHB menu([see here](https://www.smashhouseburgers.com/menu/)) is truly impressive. Their classic cheeseburger is a hit, topped with melted cheddar, fresh lettuce, and tomato. For a unique twist, try the pineapple teriyaki burger, which combines savory and sweet flavors perfectly. Vegetarians will appreciate the black bean burger, packed with flavor and topped with creamy avocado. For those who can't decide, the burger sampler platter is a great way to taste several different offerings. And don’t forget to pair your meal with their signature loaded fries. | palman24 | |
1,862,989 | ASEPTOTO > Daftar Situs Slot Gacor Dan Bandar Togel Online Terpercaya No.1 Di Indonesia | Aseptoto adalah salah satu situs terbaik untuk tempat bermain slot gacor dan togel online yang... | 0 | 2024-05-23T15:15:59 | https://dev.to/aseptoto/aseptoto-daftar-situs-slot-gacor-dan-bandar-togel-online-terpercaya-no1-di-indonesia-1d85 | Aseptoto adalah salah satu situs terbaik untuk tempat bermain slot gacor dan togel online yang memiliki kelebihan gampang menang dan pastinya hadiah togel yang sangat luar biasa besar. Jika kalian ingin mencari situs yang baik dan bisa dipercaya [aseptoto](https://aseptoto-gold.tumblr.com/) adalah tempatnya, selain itu aseptoto juga menjamin keamanan akun pemain dan tidak akan menyalahgunakannya untuk hal hal yang negatif. | aseptoto | |
1,862,987 | Network Error when send file with formData(axios or fetch) on Android. Works fine on IOS | When I try to send some post request sending a file with formData I receive a Network Error. Most of... | 0 | 2024-05-23T15:14:04 | https://dev.to/felipe_lopesavila_ec738d/network-error-when-send-file-with-formdataaxios-or-fetch-on-android-works-fine-on-ios-3013 | reactnative | When I try to send a POST request with a file via formData, I receive a Network Error. Most of the time the request works fine, but sometimes I receive this error. The problem occurs on Android; it works fine on iOS.
Recently I updated my application to react-native 0.74.1 (latest); before, the version was 0.64.4. On the older version everything worked fine.
Current versions:
React native: 0.74.1
Node: >= 18
React: 18.2.0

| felipe_lopesavila_ec738d |
1,862,986 | The Best Time Tracking Apps of 2024: Optimize Your Productivity | Nowadays, managing your time effectively is more critical than ever. Whether you're a freelancer, a... | 0 | 2024-05-23T15:14:03 | https://blog.productivity.directory/the-best-time-tracking-apps-d577a660e151 | timetracking, timemanagement, productivity, productivityapps | Nowadays, managing your time effectively is more critical than ever. Whether you're a freelancer, a remote team member, or a manager overseeing multiple projects, [time tracking apps](https://productivity.directory/category/time-tracking) can be invaluable tools for boosting your [productivity](https://blog.productivity.directory/what-is-productivity-bdd6bc399f6f) and ensuring accountability. As we navigate 2024, let's explore some of the top time tracking apps available to help you keep on top of your professional and personal goals.
Toggl Track
===========
[Toggl Track](https://productivity.directory/toggl-track) remains a leader in the time tracking space, renowned for its user-friendly interface and versatile functionality. Suitable for both individual freelancers and large teams, Toggl allows users to track time across various devices seamlessly. Key features include detailed reporting, calendar integration, and automated reminders that make time management hassle-free.
Clockify
========
[Clockify](https://productivity.directory/clockify) continues to stand out as a completely free yet powerful tool for unlimited users, making it especially attractive for startups and non-profits. It offers straightforward time tracking, extensive reporting options, and a customizable dashboard to suit various business needs. Clockify also supports project budgets and billable rates, allowing teams to monitor project profitability.
Harvest
=======
[Harvest](https://productivity.directory/harvest) excels in simplicity and integration capabilities, ideal for those who need to couple time tracking with invoicing and resource planning. Its intuitive design helps users track time without disruption, and the app provides insightful reports that help analyze project health and team productivity. Harvest's integration with numerous third-party tools ensures it fits seamlessly into any workflow.
RescueTime
==========
For those focused on personal productivity and managing distractions, [RescueTime](https://productivity.directory/rescuetime) offers automated time tracking by monitoring the applications and websites you use. Its powerful analytics help you understand your daily habits so you can make better time management decisions. RescueTime also includes goal setting features and distraction blocking to keep you focused on your priorities.
Time Doctor
===========
[Time Doctor](https://productivity.directory/time-doctor) is a robust tool designed for remote teams looking to improve productivity while ensuring transparency. Features such as screenshots, website and application tracking, and detailed analytics make Time Doctor a favorite for managers needing to maintain team efficiency. Additionally, its payroll integration and client billing features streamline administrative tasks.
MyHours
=======
[MyHours](https://productivity.directory/myhours) is perfect for those who prefer a visually engaging and straightforward time tracking experience. Its timeline interface makes it easy to see the whole day at a glance, adjust blocks of time, and switch between tasks efficiently. With real-time tracking, reminders, and reporting features, MyHours simplifies time management for freelancers and small business owners.
Hubstaff
========
[Hubstaff](https://productivity.directory/hubstaff) offers comprehensive time tracking solutions geared towards location-based tracking and project management. With features like GPS tracking, online timesheets, automated payroll, and project budgeting, Hubstaff is ideal for businesses that operate both remotely and in the field. Its strong focus on automation helps reduce administrative overhead while maintaining team accountability.
Conclusion
==========
As we look to maximize efficiency and productivity in 2024, these time tracking apps offer various features to meet the diverse needs of professionals across industries. Whether you need detailed project tracking, help managing distractions, or tools for remote team management, there's an app on this list that can help streamline your processes and boost productivity.
> Consider your specific needs and the unique challenges of your work environment to choose the best time tracking app for you. Try out a few of these options --- many offer free trials or versions --- to find the perfect fit that will help you take control of your time and accomplish more.
Ready to take your workflows to the next level? Explore a vast array of [Time Tracking Apps](https://productivity.directory/category/time-tracking), along with their alternatives, at [Productivity Directory](https://productivity.directory/) and Read more about them on [The Productivity Blog](https://blog.productivity.directory/) and Find Weekly [Productivity tools](https://productivity.directory/) on [The Productivity Newsletter](https://newsletter.productivity.directory/). Find the perfect fit for your workflow needs today! | stan8086 |
1,862,984 | Task 18 Python Selenium | The Selenium architecture for Python is structured to facilitate robust and flexible web automation... | 0 | 2024-05-23T15:08:30 | https://dev.to/vigneshpm/task-18-python-selenium-5cpn | The Selenium architecture for Python is structured to facilitate robust and flexible web automation and testing. It involves several key components working together seamlessly:
1. Selenium WebDriver
WebDriver is the core component that interacts directly with web browsers. It provides a programming interface to write and execute test scripts.
2. Language Bindings
Selenium supports multiple programming languages, including Python. These language bindings allow you to write test scripts using Python's syntax. The `selenium` package can be installed via pip:
```sh
pip install selenium
```
3. JSON Wire Protocol
WebDriver uses the JSON Wire Protocol to communicate with browser drivers. This protocol defines a RESTful web service for performing operations like navigating to a page, clicking elements, and entering text.
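As a rough illustration, here is a hedged sketch in Python of what such a command looks like on the wire. The endpoint path and payload shape mimic the style of WebDriver navigation commands; the session ID and field names below are made up, not an authoritative spec reference:

```python
import json

# Hypothetical sketch of the command WebDriver sends to a browser driver
# when a script calls driver.get("https://example.com"). A real session ID
# is issued by the driver; "abc123" here is illustrative only.
session_id = "abc123"
endpoint = f"/session/{session_id}/url"          # RESTful command endpoint
wire_payload = json.dumps({"url": "https://example.com"})

print(endpoint)      # /session/abc123/url
print(wire_payload)  # {"url": "https://example.com"}
```

The browser driver receives this HTTP request, performs the navigation, and replies with a JSON response that WebDriver hands back to the script.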
4. Browser Drivers
Each browser has a specific driver that acts as a bridge between WebDriver and the browser:
- ChromeDriver for Google Chrome
- GeckoDriver for Mozilla Firefox
- IEDriverServer for Internet Explorer
- SafariDriver for Safari
5. Browsers
Selenium can automate any browser that is compatible with the corresponding browser driver, such as Chrome, Firefox, Safari, Edge, and Internet Explorer.
Detailed Flow of Selenium Python Architecture
1. Script Execution: The Selenium script is executed, containing commands like opening a browser, navigating to a URL, and interacting with elements.
2. Python Bindings: These bindings translate Selenium commands written in Python into HTTP requests using the JSON Wire Protocol.
3. WebDriver: Communicates with the appropriate browser driver by sending these HTTP requests.
4. Browser Driver: Converts JSON Wire Protocol commands into browser-specific actions and executes them.
5. Browser Interaction: The browser performs the actions, such as loading a webpage or clicking a button.
6. Response Handling: The browser sends responses back to the browser driver, which then sends them to the WebDriver.
7. Result Return: WebDriver returns the results to the Selenium script through the Python bindings, allowing the script to handle the outcomes and perform further actions.
Integration with Testing Frameworks and CI/CD
Selenium is often integrated with testing frameworks like PyTest or Unittest for better test management and reporting. Additionally, integrating Selenium tests with CI/CD tools like Jenkins or GitHub Actions allows for automated test execution as part of the build process, ensuring new code changes do not introduce bugs.
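A sketch of what that integration looks like in code (hedged: a fake driver object stands in for `webdriver.Chrome()` so the structure runs without a browser or ChromeDriver installed):

```python
import unittest

class FakeDriver:
    """Stand-in for selenium.webdriver.Chrome so this sketch runs anywhere."""
    def __init__(self):
        self.current_url = None

    def get(self, url):
        self.current_url = url

    def quit(self):
        pass

class HomePageTest(unittest.TestCase):
    def setUp(self):
        # With real Selenium this would be: self.driver = webdriver.Chrome()
        self.driver = FakeDriver()

    def test_open_home_page(self):
        self.driver.get("https://example.com")
        self.assertEqual(self.driver.current_url, "https://example.com")

    def tearDown(self):
        self.driver.quit()

suite = unittest.defaultTestLoader.loadTestsFromTestCase(HomePageTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

In a CI pipeline the same test class would run against a real browser driver (often headless Chrome) instead of the fake, with the framework reporting pass/fail for each build.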
This architecture allows for efficient and effective web browser automation, making Selenium a powerful tool for web application testing.
A Python virtual environment is a tool that helps to keep dependencies required by different projects in separate places, by creating isolated environments for each of them. This is especially significant for maintaining project dependencies and avoiding conflicts. Here are the key benefits and examples illustrating the significance of using Python virtual environments:
Key Benefits
1. Dependency Management: Different projects might require different versions of the same library. Virtual environments help manage these dependencies without conflicts.
2. Isolation: Each virtual environment is isolated from the others, ensuring that the libraries and dependencies in one environment do not affect another.
3. Portability: Virtual environments can be easily recreated on different machines, ensuring consistent development environments.
4. Clean System: Prevents global installation of packages, keeping the system Python environment clean and uncluttered.
5. Reproducibility: Ensures that the same dependencies are used across different stages of development, testing, and production, aiding in reproducibility of code execution.
Examples
Example 1: Different Django Versions
Imagine you have two projects: Project A and Project B. Project A uses Django 2.2, while Project B uses Django 3.2. Installing these different versions globally would cause conflicts. Here’s how you can handle this with virtual environments:
1. Creating Virtual Environments:
```sh
python -m venv projectA_env
python -m venv projectB_env
```
2. Activating Virtual Environments:
- On Windows:
```sh
projectA_env\Scripts\activate
projectB_env\Scripts\activate
```
- On Unix or MacOS:
```sh
source projectA_env/bin/activate
source projectB_env/bin/activate
```
3. Installing Dependencies:
- For Project A:
```sh
pip install django==2.2
```
- For Project B:
```sh
pip install django==3.2
```
4. Deactivating Virtual Environments:
```sh
deactivate
``` | vigneshpm | |
1,862,983 | How to build your Developer Portfolio with MindsDB: The symbiotic relationship between developers and Opensource in 2024. | What can you as a developer do to showcase your potential and expertise? One of the many challenges... | 0 | 2024-05-23T15:07:17 | https://dev.to/chandrevdw31/how-to-build-your-developer-portfolio-with-mindsdb-the-symbiotic-relationship-between-developers-and-opensource-in-2024-4e1a | developers, opensource, mindsdb, ai |

What can you as a developer do to showcase your potential and expertise? One of the many challenges developers face is finding employment, as most companies don't recruit developers without work experience.
Employers need to see proof of work showing that developers are capable and efficient. But where does that leave freshly graduated students who have the qualifications but no experience, and those who have been in the workforce but struggle to find employment? Well, open source has changed the game and is leading developers into a new era where they can gain experience working on different projects and enhance their skill set.
Open source is a great way to hone your developer skills and simultaneously build a proof-of-work portfolio, whether employed or unemployed, by contributing to projects. Developers all over the world seek projects they can contribute to, especially GitHub users looking for ways to fill those gray boxes on their profiles with green 😉
From beginners to advanced coders seeking challenges, MindsDB offers developers a wide range of ways to contribute to open source while building their developer portfolio.
## What is MindsDB?
MindsDB is the platform for customizing AI from enterprise data, enabling smarter organizations. It allows developers to build AI/ML applications fast. MindsDB integrates with numerous data sources, including databases, vector stores, and applications, and popular AI/ML frameworks, including AutoML and LLMs. In summary, MindsDB brings data and AI together, enabling intuitive implementation and automation of AI systems.
You can gain experience working with the following:
- Automated Fine-Tuning
- AI Agents
- AI-Powered Data Retrieval
- Data Enrichment
- Predictive Analytics
- In-Database Machine Learning
- AI Workflow Automation
## How to build your developer portfolio
Today, it is essential for developers to have a GitHub profile they can use to contribute to open-source projects. As MindsDB is open source, you will have no trouble using GitHub to hone your skills. MindsDB has its own GitHub repository where issues are listed for developers to solve. From no-code and low-code to full-code technical contributions, developers can choose the level at which they want to contribute according to their capabilities.
We will explore no-code, low-code to code contributions to help build your portfolio.
No-Code, Low-Code:
This is for every type of developer at any skill level, meaning you will be able to do this with little to no experience, as it involves no-code to low-code skills, including SQL.
#### [Reporting bugs and fixing typo’s.](https://docs.mindsdb.com/contribute/issues)
This kind of contribution is for beginners looking for a start in contributing to open source. You can report any bugs found in MindsDB's platform on their GitHub repository issues page, as well as fix any typos in their documentation. This showcases your eye for detail and your skill at finding bugs in an application, something that is critical for a company looking to enhance its product.
#### Technical writing
This involves writing a blog or creating a video tutorial on how to use MindsDB.
Here are a few ways you can contribute with writing:
- [Write a README file](https://docs.mindsdb.com/contribute/integrations-readme) for existing handlers - MindsDB handlers require a clear, comprehensive, and user-friendly guide on how to contribute to or use the handler
- [Write a Blog tutorial](https://docs.mindsdb.com/contribute/tutorials) - Write about how to use the MindsDB product with its different features.
- [Write MindsDB Documentation](https://docs.mindsdb.com/contribute/docs) - Contribute to MindsDB’s documentation and enhance the quality.
Technical writing and content creation are a great way to demonstrate your ability to convey information and write guides that help others understand a product. This matters to companies: approachable documentation makes it easy for users to understand and trust their platforms.
#### QA/UX Testing
With MindsDB, you can QA test their platform, with its integrations and user interface, and report any issues found or provide feedback on what is working. This involves running SQL syntax that performs tasks and testing whether the code, as well as the platform and its handlers, are functional. This improves the quality of the product. Companies value this skill set, as it enhances user experience and adds value; their product or release cannot go to market without being signed off by QA.
### High Code
This involves using the developers coding skills. Your skillset can include Python to build data, AI/ML and application integrations.
#### Fix Bugs/Issues
Developers are able to check for issues to fix on [MindsDB’s Github Issues Page](https://github.com/mindsdb/mindsdb/issues). The issues are marked with labels that indicate what you can work on, which you can find [here](https://docs.mindsdb.com/contribute/issue-labels). Fixing bugs showcases that you are a problem solver and capable of resolving issues. Companies find this capability very valuable as it has an impact on the quality of their product and user experience.
#### Building Integrations
Developers can showcase their Python skills by building integrations with existing [Data handlers](https://docs.mindsdb.com/contribute/data-handlers), [AI/ML handlers](https://docs.mindsdb.com/contribute/ml-handlers) and, [Applications](https://docs.mindsdb.com/contribute/app-handlers). You can also enhance existing handlers with building features for integrations that have not been implemented yet. This contribution demonstrates that developers are capable of assisting companies in Python-related tasks, especially in building integrations that will assist with improving the quality of their product by elevating systems to work together efficiently.
#### Build AI/ML apps
This is where people can exhibit their full potential as a developer. MindsDB allows developers to build AI/ML applications fast and also gives apps access to a wide range of data sources, LLMs, and other applications which enhances the capabilities of their applications. With MindsDB’s [Python SDK](https://docs.mindsdb.com/sdks/python/overview) and [JavaScript SDK](https://docs.mindsdb.com/sdks/javascript/overview), you are able to build applications to your heart’s content with MindsDB as the back-end. Companies seek those who are able to build applications as creating one for their own company can increase brand value, generate more sales and revenue, and benefit customer experience which leads to better customer retention. You as a developer might also want to build an application that is fit to profit from and create your own business around.
## Conclusion
Developers and Open Source are in symbiosis- just as contributors improve a project, they also are provided the opportunity to demonstrate their skills and creativity. As much as it is enjoyable to create cool trendy projects, developers would be wise to use it as an opportunity to design meaningful projects that can be applied to real-world use cases with impact.
MindsDB is a great way for you to interact with different data sources, model management, LLMs, AI integrations, and Automation. All these types of contributions can serve as proof of work for your developer portfolio, showcasing that you are a prudent developer capable of applying your skills and gaining experience.
_For information on how to contribute to [MindsDB](https://github.com/mindsdb/mindsdb), check out this [Contribution guide](https://docs.mindsdb.com/contribute/contribute).
If you have any questions or would like to share your thoughts, you can join MindsDB’s vibrant community of developers and AI enthusiasts on [Slack](https://mindsdb.com/joincommunity). Explore their [Github Repository](https://github.com/mindsdb/mindsdb) and feel free to give it a star!_
**_Twitter: [@Chan_vdw](https://twitter.com/Chan_vdw)
Github: [chandrevdw31](https://github.com/chandrevdw31)_** | chandrevdw31 |
1,862,982 | day 04 | date:- 23 May, 2024. Randomization:- Computers are deterministic(we are not considering malfunctions... | 0 | 2024-05-23T15:03:18 | https://dev.to/lordronjuyal/day-04-31m8 | python | date:- 23 May, 2024.
**Randomization**:- Computers are deterministic (we are not considering malfunctions, etc.). They will do what we ask them to do. They can't generate truly random outputs, not even AI. However, we can produce pseudo-random outputs from them by using large data and complex algorithms. Pseudo, because we can still find a pattern in the data, but it would take significant time to do so. To generate a random number in Python, we need the 'random' module.
**Module**:- Python projects often get so big that it's not practical to maintain all the code in one file. To solve this, we create different files for different functionality. In a project, main.py is the main module. This file executes first, and other modules are called from it.
In Python, we are provided with many in-built modules. We save memory by including only those which are required in our program.
To include a Python module, we have to import it into the file where we want to use it.
**List**:- It's a data structure(way of arranging data). Its syntax is: list_name = [value1, value2]. Each value in it is arranged in order and provided a unique number called an index. The index starts from 0 and is assigned to the leftmost value. We add +1 as we move to the right.
To access the value from the list we write:
list_name[index_of_value] >>value
eg list=[a,b,c,d]
list[0] = a , list[2] = c , list[-1] = d
We can also have lists inside a list. eg list=[[a,b,c],[1,2,3]]
list[0][1] = b
String:- Like lists, in a string each character is also assigned a unique index.
eg string = "ABC" string[0] >>"A"
Functions I learned today:-
1. random.randint(a,b) - This will generate a random number between a and b, including a & b.
2. random.choice(list) - This will return a random value from the list.
3. list.append(x) - This will add value x at the end of the list. list[-1] = x
4. list.insert(i,x) - This will insert value x at index i and shift the values from that index to the right (their indices also change).
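A small sketch trying these four functions out (the random values differ on each run, so only their ranges are predictable):

```python
import random

# 1. random number between 1 and 6, inclusive (like a dice roll)
roll = random.randint(1, 6)

# 2. random value from a list
moves = ["rock", "paper", "scissors"]
pick = random.choice(moves)
print(roll, pick)

# 3. append adds a value at the end of the list
moves.append("lizard")
print(moves[-1])  # lizard

# 4. insert puts a value at index i and shifts later values to the right
moves.insert(1, "spock")
print(moves)  # ['rock', 'spock', 'paper', 'scissors', 'lizard']
```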
Program_for_today: Rock, paper and scissors

| lordronjuyal |
1,862,981 | Vue & Nuxt Resource of the week | Audio... | 0 | 2024-05-23T15:02:35 | https://dev.to/leamsigc/vue-nuxt-resource-of-the-week-jb0 | nuxt, vue, tips, podcast |
Audio Version:
https://podcasters.spotify.com/pod/show/the-loop-vuejs/episodes/Resource-of-the-week-e2k18jl
# The Resource of the week is:
[vuejstips](https://vuejstips.com)
They provide tips related to:
- Vuejs
- Nuxt
- Vite
You can see small examples of each tip and how to use it, based on the filter you provide.
For me, one of the tips I'm currently using is the Nuxt Content module's experimental search feature.
It gives you the ability to search in any of your documents:
With some simple configuration and a composable, you can add search to your Nuxt Content site.
> If anyone has a better way, please comment below and let's learn together
{% gist https://gist.github.com/leamsigc/68d3f891d9298c35de273caa2b21e453 %}
| leamsigc |
1,862,980 | How to build a Content Hub with a headless CMS | Why would you listen to music on Spotify instead of a tape recorder? Well, because Spotify enables... | 0 | 2024-05-23T15:00:09 | https://dev.to/momciloo/how-to-build-a-content-hub-with-a-headless-cms-125e | Why would you listen to music on Spotify instead of a tape recorder? Well, because Spotify enables you to manage music, make playlists, and share them across multiple platforms with one system. The same goes for building a content hub with a headless CMS.
What is a [content hub](https://thebcms.com/blog/content-hub-guide)? A collection of content focused on one topic.
What is a [headless CMS](https://thebcms.com/blog/headless-cms-101)? A content management system that serves as a content repository.
You can do the math yourself: collection of content + content repository = content database ready to be published and reused.
In other words, a headless CMS is a Spotify (or better said, digital content hub) for your content, brand, teams, strategies, communication, and management.
Ok, I set up your mindset, now, the only thing left is to set up a content hub with a headless CMS.
## Why build a Content Hub with a headless CMS
Content management systems have long been recognized as an integral part of a digital strategy. Is it even possible today to successfully run a quality website without a CMS? The same philosophy can be applied to content hubs. Here are the reasons why your CMS can be the central hub for your digital content:
### Flexibility in content distribution
A headless CMS separates the content from its presentation, meaning the content is stored as data and delivered via APIs. This makes it easier to distribute the same content across different platforms (websites, mobile apps, IoT devices, etc.) without having to separately write it for each one.
### Centralized content management
The goal of a content hub is to break down the silos of content that exist across multiple components. Organizations can have complete visibility and control over all digital assets by centralizing their content management efforts.
### Better security
With content assets centrally located, organizations can more easily secure one or a group of platforms than handle an isolated system that includes content items they’re not sure about.
Using headless as a central hub to manage security protocols within a single system is smoother and safer than trying to manage multiple users across multiple systems which will result in a system with a secure gateway.
Learn more: [Is Headless CMS a secure CMS? A Comprehensive Analysis](https://thebcms.com/blog/headless-cms-secure-cms)
### Integration support is out-of-the-box
Integrating new systems and channels is easiest with an API-first CMS, where they don't have to be built from scratch. This is particularly useful for organizations that use a diverse tech stack or need to connect their content management with CRM systems, e-commerce platforms, job boards, or other digital marketing tools.
Another option that you can use is to build the systems you need within a CMS.
Example: [How to use Headless CMS as your job board CMS](https://thebcms.com/blog/headless-cms-as-job-board-cms)
### Better developer experience and independence
Everything in CMS revolves around balancing the requirements of both editors and developers. Developers can build and maintain different frontends (for web, mobile, kiosks, etc.) using the best technologies suited to each platform. The headless CMS will deliver content via APIs, ensuring that each frontend displays it appropriately.
The great balance between developers' and other teams' needs in an organization you can see in this example: [Composable architecture example: Go headless (best practices)](https://thebcms.com/blog/composable-architecture-example)
### Benefits of having Content Hub with a headless CMS
Before I go to the "how-to" part of the article, here is the list that outlines the key benefits of using a headless CMS for a Content Hub, each explained clearly and simply so that whenever you need it you can take a 👀.
- Flexibility in content distribution: Easily share content across various platforms.
- Centralized content management: Manage all content from one central location.
- Better security: Enhanced security from separating content management from delivery.
- Better developer experience and independence: Developers use their favorite tools, boosting efficiency.
- Scalability: Handles more content traffic easily as needs grow.
- Cost efficiency: Saves money by simplifying backend needs and infrastructure
- Performance: Speeds up content delivery and improves UX.
- Content modeling: Easily define and structure types of content, to your specific needs.
- Multi-language support: Supports delivering content in multiple languages from the same platform.
- Content reusability: Reuse the same content in different contexts without additional platforms.
- Advanced permissions settings: Control who can view, edit, or manage content. Enhancing security and workflow management.
Ok, the quick reminder is done. So let’s go to some practical steps and use cases.
## How to set up a content hub with a headless CMS
Setting up a content hub with a headless CMS involves several key steps to ensure you can manage and deliver your content across multiple platforms. Below is a detailed guide on how to set up such a system with BCMS:
### Step 1: Define and design your content model
- **Identify content types**: Determine the kinds of content you need, such as a home page, team member’s permissions and roles, blog, articles, product pages, user profiles, and more.
Here’s how to do that in BCMS:

[BCMS Templates](https://docs.thebcms.com/inside-bcms/templates) are a pre-defined content structure. So when you determine your content needs you are ready to make templates for each content piece.

### Step 2: Structure your content
Design the structure for each content piece with fields that represent different content elements.
For example:
1. **Articles:**
- Title
- Author
- Body
- Published date
- Tags
2. **Authors:**
- Name
- Bio
- Profile picture
Here’s how to do that in BCMS:

As you can see from the video above, BCMS entry is a feature that helps you structure (define) properties for every template created.
The structure of an entry depends on the properties of its template. Every entry will display the properties defined in its template. Aside from the pre-defined meta, each entry has a content area.
Let’s see an example, I’ll use my profile in BCMS to show how content is structured.

In this example, this entry is structured this way:
Pre-defined meta:
- **Title:** The title is pre-defined for a reason: it makes entries easier to find (in this case, my name). Just by searching my name in the CMS, anyone in the organization can see which projects I worked on and track all my actions within the system.
Other entries can benefit from a pre-defined title to make them more structured, so you will be able to easily find the entries you need among 10,000 entries.
- Role ( old and new members, can have insights what is my job, or for what questions they need to contact me).
- Profile photo ( so the whole team can be able to see how handsome I am 😎).
Content area:
In this example, the content area serves to customize this entry. Pre-defined metadata is structured the same for all team members, but a content area, in this case, can be used to write my bio or work experience and give more insights about me.
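The template/entry split described above can be pictured as plain data. This is a hedged sketch with made-up field names, not the actual BCMS schema:

```python
# A template defines the pre-defined meta fields; an entry fills them in
# and adds its free-form content area. Field names here are illustrative.
template = {
    "name": "team-member",
    "meta_fields": ["title", "role", "profile_photo"],
}

entry = {
    "template": "team-member",
    "meta": {
        "title": "Momcilo",
        "role": "Content writer",
        "profile_photo": "/media/momcilo.png",
    },
    "content": "Bio and work experience go here.",
}

# An entry is well-formed only if it provides every field its template defines.
missing = [f for f in template["meta_fields"] if f not in entry["meta"]]
print("valid entry" if not missing else f"missing fields: {missing}")
```

This is the payoff of structured content: every team-member entry is guaranteed to carry the same searchable meta, while the content area stays free-form per entry.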
### Categories/Tags Pages
This feature helps organize the content and makes it easy for users to find related topics. Use categories, tags, and other taxonomies to organize your content effectively.

### **Implement search functionality**
Content hubs need search functionality because it improves UX: it gives users quick access to relevant information, simplifies navigation, improves content discoverability, and provides valuable insights into user behavior and content performance.
Kinds of search functionality in CMSs:
1. **Basic search:**
Most CMS platforms provide basic search functionality. Configure this to meet your needs.
2. **Advanced search:**
Consider integrating a third-party search service like Algolia or ElasticSearch for more advanced search capabilities.
3. **Faceted search:**
Implement faceted search to allow users to filter content based on different criteria (e.g., date, category, tags).
Search functionality in BCMS:

### Step 2: Integrate the CMS with existing systems
- **Identify integration points**: Determine which systems need to connect with the BCMS, such as CRM systems, E-commerce platforms, or marketing automation tools.
- **Use APIs**: Leverage the [BCMS’s API](https://docs.thebcms.com/inside-bcms/key-manager) to integrate with these systems.

### Step 3: Develop a content delivery infrastructure
- **Select frontend technologies**: Choose technologies for developing the frontend(s) that will consume the CMS content.
BCMS is frontend-agnostic, so you can let the developers in your organization choose your stack. There are also some [official integrations](https://docs.thebcms.com/quickstart) with popular frontend frameworks that can make their decision easier.

- **Build frontends**: Develop the frontend applications separate from the CMS, ensuring they can consume the API endpoints provided by the CMS.
- **Ensure performance**: Optimize content delivery with caching, content delivery networks (CDNs), and API calls.
### Step 4: Set up content governance
- **Define workflows:** Create workflows that dictate how content is created, approved, and published. Include roles for content creators, editors, and administrators.

- **Implement roles and permissions**: Configure the CMS to enforce these workflows by setting up appropriate roles and permissions for all users.

Roles and permissions are essential for a content hub because they control who can create, edit, and publish content. This ensures that only authorized users make changes, maintaining content quality and security. Clear roles and permissions also help with teamwork, as everyone knows their responsibilities, making content management more efficient and organized.
### Step 5: Go live
Before going live, conduct testing of the CMS and frontend integrations, including user acceptance testing (UAT) to ensure everything works as expected and after that, you are ready to go live.
## Get prepared for BCMS as your Content Hub
You've seen the reasons, you learned how to set up (B)CMS as a content hub. The only thing left is to give it a try.
So at the very end, I'll leave you with sources that you and other members of your organization may find useful:
- [BCMS documentation](https://docs.thebcms.com/)
- [OpenAPI documentation](https://rest-apis.thebcms.com/bcms-backend/3-0-0)
- [Code source](https://github.com/bcms)
- [BCMS starters](https://thebcms.com/starters) (in case you want to build additional services)
Now you have a list of materials that help you get started and build a content hub for your enterprise, so every member of your team can enjoy the advantages of a centralized content database. | momciloo | |
1,862,198 | Why CSS-in-JS Doesn't Solve Problems | CSS-in-JS has gained significant traction in the web development community over the past few years.... | 0 | 2024-05-23T15:00:00 | https://dev.to/m_midas/why-css-in-js-doesnt-solve-problems-14i9 | webdev, javascript, beginners, css | CSS-in-JS has gained significant traction in the web development community over the past few years. Proponents argue that it offers a modular, scoped, and dynamic way to style components, especially within the context of JavaScript-heavy frameworks like React. However, despite its popularity, CSS-in-JS introduces a range of challenges and complexities that suggest it may not be the silver bullet solution for managing styles in web applications. Here, we'll explore some of the fundamental reasons why CSS-in-JS doesn't necessarily solve the problems it sets out to address.
<img src="https://media.giphy.com/media/v1.Y2lkPTc5MGI3NjExdGdlOTcxMDZseTZlNjZvZGt5aTljdGljaGQxb3hjN2M0M2ZtaGhpMyZlcD12MV9pbnRlcm5hbF9naWZfYnlfaWQmY3Q9Zw/g697zr42aee76RqD89/giphy.gif">
## Performance Overheads
One of the primary criticisms of CSS-in-JS is the performance overhead it introduces. Traditional CSS is parsed and applied by the browser very efficiently. CSS-in-JS, however, requires styles to be generated and injected into the DOM at runtime, which can lead to significant performance hits, particularly in large applications. This runtime cost includes parsing JavaScript, generating styles, and manipulating the DOM, all of which can contribute to slower page load times and a less responsive user experience.
## Increased Bundle Size
With CSS-in-JS, styles are typically included in the JavaScript bundle. This means that every component with its associated styles contributes to the overall size of the JavaScript bundle. As a result, applications can become bloated, leading to longer download and parse times. This is a stark contrast to traditional CSS, which can be minified and cached separately, reducing the amount of data transferred and improving overall load times.
## Complexity and Learning Curve
CSS-in-JS introduces a new syntax and set of concepts that developers need to learn. While traditional CSS has a well-established and straightforward syntax, CSS-in-JS solutions often require familiarity with JavaScript template literals, tagged templates, and sometimes even CSS preprocessors like Sass. This added complexity can steepen the learning curve for new developers and increase the cognitive load for experienced developers, making the development process more cumbersome.
## Poor Separation of Concerns
One of the foundational principles of web development is the separation of concerns: HTML for structure, CSS for presentation, and JavaScript for behavior. CSS-in-JS blurs these lines by embedding styles directly within JavaScript files. This can lead to harder-to-maintain codebases, as styles are no longer isolated from the logic and structure of the application. When styling rules are mixed with component logic, it can become more challenging to manage and debug issues, especially in larger applications.
## Inconsistent Styling Approaches
CSS-in-JS can lead to inconsistent styling approaches within a codebase. Different libraries and frameworks offer various ways to implement CSS-in-JS, each with its own syntax and conventions. This inconsistency can result in a fragmented codebase where styles are applied in multiple ways, making it harder to enforce a unified design system. Traditional CSS, especially when combined with methodologies like BEM (Block, Element, Modifier) or utility-first CSS frameworks, provides a more consistent and predictable approach to styling.
## Lack of Tooling and Ecosystem Support
While CSS-in-JS solutions have matured, they still lack the extensive tooling and ecosystem support available for traditional CSS. Tools like PostCSS, Autoprefixer, and CSS linting utilities have been developed and refined over years to enhance the CSS development workflow. Although some CSS-in-JS libraries offer plugins and integrations, they often fall short of the comprehensive support and flexibility provided by the broader CSS ecosystem.
## Conclusion
While CSS-in-JS offers some compelling features, such as scoped styles and dynamic theming, it introduces a range of challenges that can negate its benefits. Performance overheads, increased bundle sizes, added complexity, poor separation of concerns, inconsistent styling approaches, and limited tooling support are significant drawbacks that make CSS-in-JS less appealing for many projects.
In many cases, traditional CSS, possibly enhanced with modern methodologies like BEM, utility-first frameworks like Tailwind CSS, or preprocessors like Sass, can provide a more efficient, maintainable, and performant solution. Developers should carefully weigh the pros and cons of CSS-in-JS and consider their specific project requirements before adopting this approach. | m_midas |
1,840,765 | Redirect YAML From Ghost Data Using JQ | A bit ago I realized I needed to generate redirect text for each of my current blog posts on my... | 27,249 | 2024-05-23T14:57:00 | https://dev.to/simplykyra/how-to-find-the-redirect-section-on-your-ghost-account-67 | ghost, yaml, json, jq | A bit ago I realized I needed to generate redirect text for each of my current blog posts on my Casper-themed Ghost website. This seemed to be an overwhelming task, but after talking it through, my husband and I came up with two potential ideas. Here is his idea, which uses the data file, jq, and visual filtering; I implemented it (while tired) and am sharing it in case it helps you.
Quick aside: To go over how to find the redirect section through [the Ghost](https://ghost.org) interface you can check out my earlier post in this series `How to Find the Redirect Section on Your Ghost Account`. After this one I'll share an alternative way to do this. If you are looking for my original full post you can check out [Oh No, I Need to Create Redirect Text for All My Posts!](https://www.simplykyra.com/blog/oh-no-i-need-to-create-redirect-text-for-all-my-posts/).
Word of warning: I was tired when doing this, it was all new to me, things weren't working, and I easily got frustrated. As such it's a bit of a journey, but I figured I'd share my process in case something in it, something I might otherwise have omitted, helps you out.
## Get the Data
To use this method you will need your website data; I got mine by exporting it from my website through the migration portion of my settings page. I found it by entering my domain and then typing in `/ghost/#/settings/migration`.

## Parse the Data
Once I had my data I needed to extract each of my posts' slugs. We decided to use [`jq`, a lightweight and flexible command-line JSON processor,](https://jqlang.github.io/jq/?ref=simplykyra.com) to get that information, but first we needed to figure out what the parsing command itself should be. To do that we opened the data file, used `ctrl+f` to find a slug, and worked our way back up, resulting in the following structure. This is the `json` code from the Migration Export part of the Settings with only the information needed extracted:
```
{
  "db": [
    {
      "data": {
        "offers": [],
        "posts": [
          {
            "slug": "foo"
          }
        ]
      }
    }
  ]
}
```
With the json structure found, we next turned to [ChatGPT](https://chat.openai.com/?ref=simplykyra.com) and asked it to create the `jq` query I needed, which gave me the command: `jq '.db[].data.posts[].slug' data.json`. I used that to test out the results and iterated through versions, hoping I was getting closer to what I wanted.

## Downhill
The resulting `yaml` from the command worked when uploaded to localhost, but it didn't work once uploaded to my main website, as I had way too many redirects for the website page to load there. I next went back and forth with ChatGPT, constantly trying to fix the `yaml`, which at times resulted in decreasing the number of redirects and a 404 error. In the process I got frustrated and switched over to `json` so I could use the pattern matching mentioned in [Ghost's Implementing redirects](https://ghost.org/tutorials/implementing-redirects//?ref=simplykyra.com#using-regular-expressions), and thus I now wanted the `json` output to look something more like:
```
{"from":"^\/(slug)\/$","to":"/blog/$1","permanent":true},
```
## A Command
During this process I was getting frustrated, but looking back, at some point I was using this command, where each section's results were piped (`|`) as input into the next section.
```
cat simply-kyra.ghost.date.json | jq '.db[0].data.posts[] | {"from":"^/(.+)/$","to":"/blog/\(.slug)","permanent":true}' | pbcopy
```
Explained below is each section:
- `cat simply-kyra.ghost.date.json`: Displays the contents of the downloaded file from Ghost.
- `jq '.db[0].data.posts[]`: Takes the file's contents and selects each post object.
- `{"from":"^/(.+)/$","to":"/blog/\(.slug)","permanent":true}'`: Converts each post into the content `string` I want (note that `\(.slug)` is jq's string interpolation syntax, which fills in each post's slug).
- `pbcopy`: Sends the output to my clipboard so I can paste it into my actual redirects file and upload back up to my website. Just a heads up: this is a Mac-only command, so you'll need to adjust it if you're using a different computer.
## Got It
I kept having trouble with mismatched brackets and the like, so I decided to simplify the automated part of this process by switching out the complicated special `string` I wanted output and instead padding each slug with several dashes on the left and lots of zeros on the right. I had the idea that I could simply do a find-and-replace-all on the results without worrying about the brackets themselves. With this change the new ChatGPT reply went something like this:

Which resulted in the following command:
```
cat simply-kyra.ghost.date.json | jq -r '.db[].data.posts[] | "---\(.slug)000000"' | pbcopy
```
This command extracts the `slug` field from each post and appends `000000` after it while also prefixing it with a triple dash. For example, if the input JSON contains:
```
{
"db": [
{
"data": {
"offers": [],
"posts": [
{
"slug": "foo"
},
{
"slug": "bar"
}
]
}
}
]
}
```
This command would output:
```
---foo000000
---bar000000
```
And then I'd do two find and replace all commands so the `000000` is replaced with `)\/$","to":"/blog/$1","permanent":true},` and the triple dashes are replaced with `{"from":"^\/(` leading to these results:
```
{"from":"^\/(foo)\/$","to":"/blog/$1","permanent":true},
{"from":"^\/(bar)\/$","to":"/blog/$1","permanent":true},
```
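As an aside, if jq isn't available, the same slug-to-redirect transformation can be sketched in a few lines of plain Node. The sample data below mirrors the export structure shown earlier; in practice you would `JSON.parse` the downloaded export file instead.

```javascript
// Sample data in the shape of the Ghost export (see the structure above).
const data = {
  db: [{ data: { posts: [{ slug: "foo" }, { slug: "bar" }] } }],
};

const redirects = data.db
  .flatMap((entry) => entry.data.posts) // like jq's .db[].data.posts[]
  .map(
    (post) =>
      `{"from":"^\\/(${post.slug})\\/$","to":"/blog/$1","permanent":true},`
  );

console.log(redirects.join("\n"));
```

This produces the same two redirect lines as the find-and-replace result above, without the intermediate padding step.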
## Warning
During the find-and-replace portion I happened to notice a page title and realized the posts section in my data file didn't just include posts; it also included all of my pages. That meant I needed to further mine my file to remove all of the page-related lines that didn't need to be redirected. Luckily I didn't have too many pages, but after uploading the new file I realized I had missed my about page, and it was now redirecting to a location that didn't exist. That said, I'm happy it was just the one page, and I was able to fix it by simply renaming the page URL (and then fixing the linked menu button), and now it's all good to go... though while writing this I realized any previous links to my about post no longer work, but at least the menu offers a simple correction for this one.
1,862,978 | KubeIP v2: Assigning Static Public IPs to Kubernetes Nodes Across Cloud Providers | Kubernetes nodes can benefit from having dedicated static public IP addresses in certain scenarios.... | 0 | 2024-05-23T14:55:38 | https://dev.to/alexeiled/kubeip-v2-assigning-static-public-ips-to-kubernetes-nodes-across-cloud-providers-1a57 | kubernetes, aws, devops, googlecloud | Kubernetes nodes can benefit from having dedicated static public IP addresses in certain scenarios. [KubeIP](https://github.com/doitintl/kubeip), an open-source utility, fulfills this need by assigning static public IPs to Kubernetes nodes. The latest version, KubeIP v2, extends support from Google Cloud's GKE to Amazon's EKS, with a design ready to accommodate other cloud providers. It operates as a DaemonSet, offering improved reliability, configuration flexibility, and user-friendliness over the previous Kubernetes controller method. KubeIP v2 supports assigning both IPv4 and IPv6 addresses.
## Use Cases
### Gaming Applications
In gaming scenarios, a console may need to connect directly to a cloud VM to minimize network hops and latency. Assigning a dedicated public IP to the gaming server's node allows the console to connect directly, improving the gaming experience by reducing latency and packet loss.
### Whitelisting Agent IPs
If you have multiple agents or services running on Kubernetes that require direct connections to an external server and that server needs to whitelist the agents' IP addresses, using KubeIP to assign stable public IPs to the nodes makes this easier to manage than allowing broader CIDR ranges. This is particularly useful when the external server has strict IP-based access controls.
### Avoiding SNAT for Select Pods
By default, pods are assigned private IPs from the VPC CIDR range. When they communicate with external IPv4 addresses, the Amazon VPC CNI plugin translates the pod's IP to the primary private IP of the node's network interface using SNAT (source network address translation). Sometimes, you may want to avoid SNAT for certain pods so that external services see the actual pod IPs. Assigning public IPs to nodes with KubeIP and setting `hostNetwork: true` on the pod spec achieves this. The pod can communicate directly with external services using the node's public IP.
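For illustration, a minimal pod spec with that setting; these are standard Kubernetes fields (the image name is just an example), not KubeIP-specific configuration:

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: direct-egress
spec:
  # Share the node's network namespace so outbound traffic
  # leaves with the node's (KubeIP-assigned) public IP.
  hostNetwork: true
  containers:
    - name: app
      image: nginx
```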
### Direct Inbound Connections and Custom Networking Scenarios
Assigning public IPs to nodes with KubeIP enables a variety of networking scenarios. For instance, you can forward traffic directly to pods running on those nodes, which is useful when you need to expose services on the node to the internet without using a traditional load balancer. An example would be running a web server on a pod and forwarding traffic to it using the node's public IP.
In addition, KubeIP can be used to implement custom networking scenarios that require public IPs on nodes. For example, you could create a custom load balancer that forwards traffic to specific nodes based on the public IP. This flexibility makes KubeIP a powerful tool for testing or deploying custom networking solutions in Kubernetes.
### IPv6 Support
KubeIP extends its functionality beyond IPv4 by supporting the assignment of static public IPv6 addresses to nodes. This feature is increasingly important as the internet continues transitioning towards IPv6 due to the exhaustion of IPv4 addresses. With KubeIP's IPv6 support, you can assign static public IPv6 addresses to your Kubernetes nodes, enabling them to communicate directly with external services over IPv6. This is particularly beneficial for applications that require IPv6 connectivity.
## Conclusion
KubeIP v2 is a powerful tool for assigning static public IPs to Kubernetes nodes across cloud providers. It enables a wide range of use cases, from gaming applications to custom networking scenarios, and supports both IPv4 and IPv6 addresses. The extensible design and simplified DaemonSet model make it easy to deploy and manage in your environment.
## Get Involved
As an open-source [project](https://github.com/doitintl/kubeip), we welcome contributions! Submit pull requests, open issues, help with documentation, or spread the word.
For more details, check out the original [Medium post](https://engineering.doit.com/kubeip-v2-assigning-static-public-ips-to-kubernetes-nodes-across-cloud-providers-0616f684ef28). | alexeiled |
1,862,975 | Exploring the Dynamics of Cryptocurrency: Navigating Crypto Wallets and Crafting Your Own Blockchain | In the fast-paced realm of cryptocurrency, two essential elements stand out: Crypto Wallet and... | 0 | 2024-05-23T14:51:16 | https://dev.to/machik99/exploring-the-dynamics-of-cryptocurrency-navigating-crypto-wallets-and-crafting-your-own-blockchain-4pi3 | In the fast-paced realm of cryptocurrency, two essential elements stand out: [Crypto Wallet](https://www.kryptowallet.dev/) and Building Your Own Blockchain. These components form the backbone of the digital currency ecosystem, offering users both security and innovation. Let's delve deeper into these aspects to understand their significance and potential impact.
**Deciphering Crypto Wallets:**
Imagine your cryptocurrency holdings as a treasure trove, safeguarded by a digital fortress known as the crypto wallet. These wallets serve as your gateway to the world of digital assets, offering a secure haven for storing, sending, and receiving cryptocurrencies. Here's a closer look at the diverse landscape of crypto wallets:
**1. Hot Wallets:** These wallets are connected to the internet, providing easy access for frequent transactions. While convenient, they are more susceptible to hacking and security breaches. Examples include mobile wallets and desktop wallets.
**2. Cold Wallets:** Cold wallets, on the other hand, store cryptocurrencies offline, away from the reach of cyber threats. Hardware wallets and paper wallets fall into this category, offering enhanced security for long-term storage.
**3. Multi-Signature Wallets:** Multi-signature wallets require multiple private keys to authorize a transaction, adding an extra layer of security and mitigating the risk of unauthorized access.
**4. Custodial vs. Non-Custodial Wallets:** Custodial wallets are managed by third-party service providers, while non-custodial wallets give users full control over their private keys. Each option has its advantages and drawbacks, depending on individual preferences and risk tolerance.
Choosing the right crypto wallet involves evaluating factors such as security, convenience, and accessibility. Whether you opt for a user-friendly mobile wallet or a robust hardware wallet, prioritizing the safety of your digital assets is paramount in the ever-evolving landscape of cryptocurrency.
**Crafting Your Own Blockchain:**
Beyond the realm of cryptocurrency lies the revolutionary technology of blockchain, a decentralized ledger with limitless potential. Building your own blockchain opens doors to innovation, enabling you to create custom solutions tailored to specific needs. Here's a roadmap to guide you through the process:
**1. Selecting a Framework:** Choose a blockchain framework or platform that aligns with your project requirements. Ethereum, with its robust smart contract capabilities, is a popular choice for developing decentralized applications (dApps). Alternatively, explore options like Hyperledger Fabric or Binance Smart Chain for specialized use cases.
**2. Defining Consensus Mechanism:** Determine the consensus mechanism that governs your blockchain network, ensuring agreement and validity of transactions among network participants. Popular mechanisms include Proof of Work (PoW), Proof of Stake (PoS), and Delegated Proof of Stake (DPoS).
**3. Implementing Smart Contracts:** If your project involves executing programmable logic on the blockchain, smart contracts play a pivotal role. These self-executing contracts automate agreements and transactions, eliminating the need for intermediaries and enhancing transparency.
**4. Testing and Deployment:** Thoroughly test your blockchain network to identify and address any vulnerabilities or inefficiencies. Once satisfied with its performance, deploy the blockchain to the desired network, whether it's a public, private, or consortium blockchain.
**5. Community Engagement:** Foster a vibrant community around your blockchain project, engaging developers, enthusiasts, and stakeholders to contribute ideas, feedback, and support. Building a strong community is essential for fostering adoption and driving innovation.
Crafting your own blockchain empowers you to redefine the boundaries of technology and finance, paving the way for decentralized solutions that empower individuals and organizations worldwide.
**Conclusion:**
Cryptocurrency and blockchain technology represent a paradigm shift in the way we perceive and interact with financial systems. By understanding the dynamics of crypto wallets and embarking on the journey of [build your own blockchain](https://www.kryptowallet.dev/part-one-developing-your-own-blockchain/) , you can harness the transformative power of decentralized innovation. Whether you're safeguarding your digital assets or pioneering groundbreaking solutions, the world of cryptocurrency awaits exploration and discovery.
| machik99 | |
1,862,973 | From Frustration to Fix: Conquering Vercel Errors Like a Pro | For students like us, Vercel is one of the most reliable options for project deployment. But while deploying a frontend... | 0 | 2024-05-23T14:48:37 | https://dev.to/ehtisamhaq/from-frustration-to-fix-conquering-vercel-errors-like-a-pro-4mdo | vercel, deployment, deploy, express | For students like us, Vercel is one of the most reliable options for project deployment. But while the process of deploying a frontend project on Vercel is simple, we face a lot of problems deploying backend projects. Facing problem after problem, we get frustrated, and sometimes we even miss assignment deadlines!
So in this blog, we'll try to learn the right way to use Vercel, along with some common errors and their solutions.
**In what ways can Vercel be used? What are the advantages of each?**
Vercel can essentially be used in three different ways:
- **Vercel CLI:** The Vercel CLI is a command-line interface. With it, you can deploy a project very easily using a few commands.
- **Vercel Dashboard:** Here you deploy a project by following a few steps in the web-based (graphical) interface on Vercel's website. This method gives you access to more features than the CLI.
- **Vercel API:** This is a powerful and controllable way to perform more complex interactions with your projects.
**Some common errors when deploying a backend project with the Vercel CLI, and their solutions:**
- **Build error:** This error happens because of a fault in the project's code. Most of the time, the error message mentions what the problem is. To fix it, check the code and correct the faults.
- **Deployment error:** This error happens because of a problem with the project's settings or environment. This kind of error also often happens because `vercel.json` isn't configured properly. To fix it, check the project's settings and environment.
- **Runtime error:** This error happens because of a problem at the project's runtime. It occurs when a process in the project stops unexpectedly, the database doesn't connect properly, or there is an unhandled rejection. To fix it, check the code, make the necessary changes, and handle errors properly.
Whenever you face an error on Vercel, the first thing to check is the `Logs`. If you read the `Logs` carefully, you will get an idea of the `Error`. The `Logs` usually contain information about the cause of the error and where it occurred.
**Some important parts of the Logs are:**
- **Error message:** The error message contains a brief description of the error.
- **Stack trace:** The stack trace contains information about the error's `path`; you can go to that path and fix the error.
- **Environment variables:** If there is a problem with the environment variables, it is shown here. You can view and set environment variables by clicking the "Environment" tab under your project's `Settings`.
**Some specific errors and solutions:**
- **`[Error: "Invalid project name"]`** This error happens when the project name isn't set correctly. To fix it, set the project name correctly.
- **`[Error: "No build script found"]`** This error happens because the project doesn't have a build script. To fix it, add a build script to the project.
- **`[Error: "Missing dependency"]`** This error happens because a particular dependency is missing from the project. To fix it, add the required dependency to the project.
- **`[Vercel Serverless Functions timing out]`** This error can happen for various reasons; it usually appears when a request doesn't resolve even after 10 seconds. That said, we've seen it happen for many people because `Current IP` is set in MongoDB; in that case, delete the `Current IP` entry and set `allow access from anywhere`.
**How do you set up vercel.json?**
To run an Express application on Vercel, you need some configuration in the `vercel.json` file. You can use this as a template:
```json
{
"version": 2,
"builds": [
{
      "src": "index.js", // path to the project's main file,
      // in our case usually the `dist/server.js` file
"use": "@vercel/node"
}
],
"routes": [
{
// Specify which paths will route to a destination using a regex
"src": "/(.*)",
// Specify the paths' destination
      "dest": "index.js" // path to the project's main file,
      // in our case usually the `dist/server.js` file
}
]
}
```
**Some commonly made mistakes:**
- **Deploying without building:** This problem shows up most often when deploying with the CLI. Because the project is deployed without being built, it doesn't work properly or doesn't update. So build the project before every deploy.
- **Not setting up env:** When you deploy a project through the CLI, the env is usually set automatically, but sometimes problems appear. In that case you have to manually set the Environment Variables from the project settings and deploy again.
- **Vercel Functions timing out?** Timeouts can happen on Vercel for various reasons. Often, if a function takes more than 10 seconds to execute, it throws this error. However, we can extend this to 60 seconds for free.
- **Hobby:** 10s (default) - [configurable](https://vercel.com/docs/functions/configuring-functions/duration) up to 60s
- **Pro:** 15s (default) - [configurable](https://vercel.com/docs/functions/configuring-functions/duration) up to 300s
- **Enterprise:** 15s (default) - [configurable](https://vercel.com/docs/functions/configuring-functions/duration) up to 900s
- https://vercel.com/docs/functions/configuring-functions/duration.
Sometimes, even after doing everything correctly, a project deployed to Vercel still doesn't work. In that case, delete the `.vercel` folder and deploy again from scratch.
**A few more tips:**
- **To troubleshoot, read the Vercel documentation or get help from the support center.**
- **Join the Vercel community forums and Discord server. This will help you stay up to date on Vercel's features.**
- **Before deploying your projects, test them thoroughly so there are no errors.**
I hope this blog will be helpful to you. I've discussed some common problems here; if you face any other issues, let me know in the comments section.
**Vercel Alternatives:**
- [Render](https://render.com/)
- [Railway](https://railway.app/)
- [Cyclick.sh](https://www.cyclic.sh/)
- Netlify
- Cloudflare Pages
- Codesphere etc
**Resource:** [Errors | Vercel Docs](https://vercel.com/docs/errors)
| ehtisamhaq |
1,862,972 | Is Flutter Right For Your Mobile App? | Unlocking Success: The Power of Choosing the Right Framework In the bustling realm of mobile app... | 0 | 2024-05-23T14:44:45 | https://dev.to/n968941/is-flutter-right-for-your-mobile-app-480b | flutter, beginners, programming, firebase |
**Unlocking Success: The Power of Choosing the Right Framework**
In the bustling realm of mobile app development, the stakes are high, and every decision can shape the destiny of your creation. Amidst a plethora of options, one name shines brightly – Flutter. Developed by tech titan Google, Flutter isn't just another toolkit; it's a gateway to crafting masterpieces that transcend platforms with ease.
[read full article](https://flutters.in/is-flutter-right-for-your-mobile-app/)
**Discovering Flutter: A Revolution in UI Development**
At its core, Flutter is an open-source UI software development kit, but it's so much more than that. With Flutter, developers harness the power to sculpt native applications for mobile, web, and desktop – all from a single, harmonious codebase. Say goodbye to the days of juggling multiple frameworks; with Flutter, simplicity reigns supreme.
**Unveiling the Arsenal: Key Features of Flutter**
What sets Flutter apart from the crowd? Let's unravel its arsenal:
- **Single Codebase**: Wave goodbye to redundant code and embrace efficiency with Flutter's single codebase approach.
- **Hot Reload**: Experience the magic of real-time changes without the hassle of restarting your app, turbocharging your development cycle.
- **Rich Widget Library**: Dive into a treasure trove of customizable widgets, meticulously crafted to ensure a seamless user experience across platforms.
- **High Performance**: With Flutter, performance isn't just a goal; it's a guarantee. Compiled to native ARM code, your apps will soar to new heights.
- **Strong Community and Support**: Join forces with a vibrant community and tap into a wealth of resources, ensuring you're never alone on your Flutter journey.
**Embracing the Flutter Advantage: Benefits Galore**
Why should you choose Flutter for your next project? Let's count the ways:
- **Cross-Platform Development**: Break free from platform constraints and conquer multiple domains with ease.
- **Fast Development Cycle**: Time is of the essence, and with Flutter's hot reload feature, every moment counts.
- **Customizable Widgets**: From the mundane to the magnificent, Flutter empowers you to bring your wildest UI dreams to life.
- **Strong Performance**: Speed, reliability, and efficiency – Flutter delivers on all fronts, ensuring your apps run like a well-oiled machine.
- **Reduced Testing Efforts**: Streamline your testing process and bid farewell to platform-specific headaches.
- **Consistent UI Across Platforms**: With Flutter, uniformity reigns supreme, providing users with a seamless experience, no matter their device of choice.
**Navigating Potential Challenges: The Drawbacks of Flutter**
But wait, is Flutter flawless? Not quite:
- **Large App Size**: Brace yourself for larger file sizes, which might give storage-conscious users pause.
- **Limited Native Features**: While Flutter boasts impressive capabilities, it might fall short in certain niche functionalities.
- **Learning Curve**: Mastery takes time, and Flutter is no exception. Prepare for a learning curve, especially for those new to Dart and Flutter's architecture.
**The Flutter Dilemma: To Choose or Not to Choose**
So, when does Flutter shine brightest?
- **Cross-Platform Needs**: If versatility is your game, Flutter is your ace in the hole.
- **Rapid Development**: Tight deadlines? Flutter thrives under pressure, thanks to its lightning-fast development cycle.
- **Budget Constraints**: Save time, save money – Flutter is the budget-friendly solution you've been searching for.
But tread carefully, for there are moments when Flutter might not be your best bet:
- **Platform-Specific Features**: If your app demands platform-specific wizardry, Flutter might not have the magic you seek.
- **High-Performance Requirements**: For apps that demand nothing short of perfection, native development might hold the key.
**The Grand Finale: Flutter in Perspective**
In the grand tapestry of mobile app development, Flutter is but one thread, albeit a mighty one. As you chart your course, weigh the pros and cons, and remember – the perfect framework is not a destination but a journey, and with Flutter, the adventure has only just begun.
---
| n968941 |
1,862,971 | An Absolute Beginner's Guide to package.json | As a general rule, any project that's using Node.js will need to have a package.json file. What is a... | 0 | 2024-05-23T14:44:20 | https://dev.to/briann_bn/an-absolute-beginners-guide-to-packagejson-2c6h | node, npm, javascript, webdev | As a general rule, any project that's using Node.js will need to have a `package.json` file. What is a `package.json` file?
At its simplest, a `package.json` file can be described as a manifest of your project that includes the packages and applications it depends on, information about its unique source control, and specific metadata like the project's name, description, and author.
Let's break down the core parts of a typical `package.json` file:
#### _Specific Metadata: name, version, description, license, and keywords_
Inside a `package.json`, you'll almost always find metadata specific to the project - no matter if it's a web application, Node.js module, or even just a plain JavaScript library. This metadata helps identify the project and acts as a baseline for users and contributors to get information about the project.
Here's an example of how these fields would look in a `package.json` file:
```
{
"name": "metaverse", // The name of your project
"version": "0.92.12", // The version of your project
"description": "The Metaverse virtual reality. The final outcome of all virtual worlds, augmented reality, and the Internet.", // The description of your project
  "main": "index.js", // The entry point of your project
"license": "MIT" // The license of your project
}
```
A `package.json` file is always structured in the `JSON` format, which allows it to be easily read as metadata and parsed by machines.
If formatting a `package.json` file manually to get your project up and running seems a bit daunting, there's a handy command that will automatically generate a base `package.json` file for you - if you'd like to learn how to use it, take a peek at the `npm init` instructions below!
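As a minimal sketch of that workflow (the `my-project` directory name is just a placeholder):

```shell
# Create a project folder and generate a base package.json inside it.
mkdir my-project && cd my-project

# -y (or --yes) skips the interactive prompts and accepts every default;
# run plain `npm init` instead if you want to answer each question.
npm init -y

# Inspect the generated manifest
cat package.json
```

Running plain `npm init` walks you through the same fields - name, version, description, license - interactively before writing the file.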
#### _Understanding and Managing Your Project's Dependencies: dependencies and devDependencies in your package.json_
The other critically important aspect of a `package.json` is that it contains the project's dependencies. These dependencies are the modules that the project relies on to function properly.
Having dependencies in your project's `package.json` allows the project to install the versions of the modules it depends on. By running an install command (see the instructions for `npm install` below) inside of a project, you can install all of the dependencies that are listed in the project's `package.json` - meaning they don't have to be (and almost never should be) bundled with the project itself.
It also allows the separation of dependencies that are needed for production from dependencies that are needed for development. In production, you're likely not going to need a tool to watch your CSS files for changes and refresh the app when they change. But in both production and development, you'll want to have the modules that enable what you're trying to accomplish with your project - things like your web framework, API tools, and code utilities.
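In practice, that separation happens at install time. A sketch using the standard npm CLI flags (the package names here are only examples):

```shell
# Needed at runtime: recorded under "dependencies"
npm install express

# Needed only while developing/testing: recorded under "devDependencies"
npm install --save-dev mocha

# On a production machine, install everything except devDependencies
npm install --omit=dev
```

`--omit=dev` is the modern spelling; older npm versions used `--production` for the same effect.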
What would a project's `package.json` look like with dependencies and `devDependencies`? Let's expand on the previous example of a `package.json` to include some.
```
{
"name": "metaverse",
"version": "0.92.12",
"description": "The Metaverse virtual reality. The final outcome of all virtual worlds, augmented reality, and the Internet.",
  "main": "index.js",
"license": "MIT",
"devDependencies": {
"mocha": "~3.1",
"native-hello-world": "^1.0.0",
"should": "~3.3",
"sinon": "~1.9"
},
"dependencies": {
"fill-keys": "^1.0.2",
"module-not-found-error": "^1.0.0",
"resolve": "~1.1.7"
}
}
```
One key difference between the dependencies and the other common parts of a `package.json` is that both are objects containing multiple key/value pairs. Every key in both `dependencies` and `devDependencies` is the name of a package, and every value is the version range that's acceptable to install.
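Those range prefixes are worth spelling out. A sketch of what they mean (assumption: standard npm semver semantics), plus a quick sanity check of version ordering with `sort -V`:

```shell
# What the range operators in package.json mean:
#   "^1.0.2" -> >=1.0.2 and <2.0.0  (caret: any release within the same major)
#   "~1.1.7" -> >=1.1.7 and <1.2.0  (tilde: any patch within the same major.minor)
#   "1.1.7"  -> exactly 1.1.7
# sort -V orders version strings numerically, so you can eyeball
# that 1.1.9 falls inside the tilde range [1.1.7, 1.2.0):
printf '1.1.7\n1.1.9\n1.2.0\n' | sort -V
```

So `"resolve": "~1.1.7"` in the example above would accept 1.1.9 but not 1.2.0.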
| briann_bn |
1,862,970 | Artificial Intelligence Act: All You Need to Know About the European Council’s First Worldwide Rules on AI | On May 21, a seismic shift rippled through the tech realm as the Council of the European Union... | 0 | 2024-05-23T14:43:06 | https://dev.to/n968941/artificial-intelligence-act-all-you-need-to-know-about-the-european-councils-first-worldwide-rules-on-ai-4a2p | flutter, firebase, beginners, programming | On May 21, a seismic shift rippled through the tech realm as the Council of the European Union endorsed the groundbreaking Artificial Intelligence Act. This legislation, poised to become the gold standard for AI regulation worldwide, is set to revolutionize how we approach innovation and safety in the AI landscape.
At its core, the AI Act is a beacon of balance, threading the needle between fostering groundbreaking advancements and safeguarding the rights of citizens. Embracing a 'risk-based' philosophy, it pledges to usher in a new era of trustworthy AI systems while serving as a bulwark against potential pitfalls.
But what precisely does this heralded Act aim to achieve? Picture a future where safe and reliable AI systems flourish within the EU's single market, nurturing both public and private sectors to spearhead innovation. This isn't merely about technological evolution; it's about nurturing an ecosystem where European ingenuity thrives while respecting fundamental rights.
[read full article](https://flutters.in/artificial-intelligence-act/)
Let's delve deeper into the core objectives of the AI Act:
First and foremost, it champions the development of AI systems that are not just innovative but inherently safe and trustworthy. Imagine a world where every AI algorithm is meticulously crafted to prioritize the well-being of users.
Moreover, the Act stands as a staunch defender of citizens' rights, ensuring that the ethical ramifications of AI remain at the forefront of technological progress. It's a clarion call to safeguard privacy, dignity, and autonomy in an increasingly digitized world.
But the AI Act doesn't just set lofty aspirations; it's a catalyst for tangible change. By fostering an environment conducive to investment and innovation, it lays the groundwork for a future where Europe leads the charge in AI research and development.
One of the most intriguing facets of this legislation is its adaptability. Built upon a framework that embraces regulatory learning, the AI Act is designed to evolve in tandem with technological advancements, ensuring that regulations remain effective and relevant.
Crucially, the Act introduces the concept of AI regulatory sandboxes, providing a controlled environment for testing and validating new AI systems. It's a proactive approach that empowers developers to innovate while mitigating potential risks.
Now, let's address the elephant in the room: how will the AI Act differentiate between the myriad risks posed by AI systems? Through meticulous categorization. From AI systems with minimal risk to those deemed high risk, the Act establishes clear guidelines to ensure that the level of scrutiny matches the potential for harm.
But who falls under the purview of this transformative legislation? While it primarily targets the 27 EU member states, its implications extend far beyond European borders. Any company utilizing EU customer data in their AI systems must heed its directives, signaling a seismic shift in global AI governance.
Enforcing such sweeping regulations demands a robust infrastructure. Enter the AI Office within the European Commission, bolstered by a scientific panel, AI board, and advisory forum. Together, these bodies form a formidable bulwark against AI malpractice, ensuring compliance and accountability.
And what of those who dare to flout the rules? The consequences are severe. From hefty fines to proportional administrative penalties, the AI Act leaves no room for negligence or misconduct.
As for implementation, the countdown has begun. With the ink barely dry, the AI Act is poised to take effect, ushering in a new era of AI governance. From publication in the EU's Official Journal to its enforcement two years hence, the wheels of change are already in motion.
In summary, the AI Act isn't just legislation; it's a manifesto for responsible innovation. With its passage, Europe takes a monumental stride towards shaping the future of AI—one defined by progress, integrity, and, above all, trust. Join us as we embark on this transformative journey into the heart of the AI revolution.
| n968941 |
1,862,968 | New Diploma Course in Artificial Intelligence and Machine Learning at GTTC: Belagavi | Unveiling the Future: GTTC Introduces Diploma in AI & ML In the heart of North Karnataka, a... | 0 | 2024-05-23T14:40:49 | https://dev.to/n968941/new-diploma-course-in-artificial-intelligence-and-machine-learning-at-gttc-belagavi-2j6i | flutter, firebase, beginners, programming | **Unveiling the Future: GTTC Introduces Diploma in AI & ML**
In the heart of North Karnataka, a beacon of innovation shines bright as Belagavi Government Tooling and Training Center (GTTC) unveils its latest breakthrough: a pioneering Diploma in Artificial Intelligence and Machine Learning (AI & ML). This visionary initiative is a response to the region's escalating demand for cutting-edge technical education, setting the stage for an exciting journey into the realms of AI.
**Embark on a Journey of Innovation: Admissions Open for 2024-25**
GTTC's commitment to excellence is echoed in the launch of its AI & ML diploma course, where Python programming, software development, and robotics take center stage. Designed to equip students with the skills needed to thrive in the dynamic landscape of artificial intelligence, this program promises a gateway to a world of boundless opportunities. And the best part? Admissions for the academic year 2024-25 are now open, inviting passionate learners to join the ranks of tomorrow's AI pioneers.
[read full article](https://flutters.in/new-diploma-course-in-artificial-intelligence-and-machine-learning-at-gttc-belagavi/)
**Delving Deeper: Explore the Course Details**
Step into the realm of AI & ML as GTTC unveils a curriculum tailored to nurture the next generation of tech enthusiasts. From mastering Python's intricate syntax to delving into the intricacies of software development and robotics, this diploma course offers a comprehensive dive into the core facets of artificial intelligence.
**Backed by Excellence: Recognitions and Facilities**
With state-of-the-art lab facilities and recognition from the esteemed All India Council for Technical Education (AICTE), GTTC's AI & ML diploma course stands as a testament to unparalleled excellence. Here, students aren't just participants; they're pioneers in a groundbreaking journey towards technological advancement.
**Beyond Boundaries: Additional Offerings**
But GTTC's commitment to innovation doesn't stop there. Alongside the AI & ML course, the center presents a diverse array of diploma programs, including Tool and Die Making, Precision Manufacturing, and Automation and Robotics, catering to a wide spectrum of aspiring learners.
**Charting Your Path: Duration and Entry Options**
Embark on a four-year odyssey into the world of AI & ML, where PU science graduates have the unique opportunity to leapfrog into the second year through lateral entry, ensuring a seamless transition into the realm of artificial intelligence.
**Empowering Learners: Skill Development Initiatives**
GTTC goes above and beyond traditional education, introducing short-term and skill development courses to empower learners from all walks of life. Here, accessibility and inclusivity reign supreme, ensuring that educational opportunities are within reach for everyone.
**A Promise of Excellence: Commitment to Quality Education**
With branches spanning across Belagavi, Chikkodi, and Gokak, GTTC stands as a beacon of excellence in the realm of technical education. Here, quality isn't just a standard; it's a promise, upheld with unwavering dedication and passion.
Join us on this extraordinary journey as we pave the way for a future powered by innovation. Enroll in GTTC's Diploma in AI & ML and become a part of something truly remarkable. The future awaits, and it's brimming with endless possibilities.
| n968941 |
1,862,967 | Nvidia's Potential Entry into ARM-based Laptops | Nvidia's potential entry into ARM-based laptops isn't just another industry update; it's the herald... | 0 | 2024-05-23T14:38:23 | https://dev.to/n968941/nvidias-potential-entry-into-arm-based-laptops-3e5c | flutter, beginners, programming, ai | Nvidia's potential entry into ARM-based laptops isn't just another industry update; it's the herald of a paradigm shift in computing as we know it. Imagine a laptop seamlessly blending Nvidia's ARM-based CPU with its powerful RTX graphics card, all empowered by AI. This once-distant dream is now on the brink of becoming a reality.
Insights from a recent Bloomberg interview with Nvidia CEO Jensen Huang and Dell CEO Michael Dell have set the tech world abuzz. Their hints at Nvidia's imminent foray into AI-PC integration, possibly launching as soon as next year, indicate a seismic shift on the horizon.
[read full article](https://flutters.in/nvidias-potential-entry-into-arm-based-laptops/)
But this move isn't just about Nvidia making headlines. It's about understanding the broader market dynamics. Reports from October 2023 revealed Microsoft's push for Nvidia, alongside AMD, to enter the ARM-based CPU market. The goal? To challenge Qualcomm's dominance and rival Apple's M-series chips in the realm of ARM-based Windows laptops.
Nvidia isn't stepping into this arena blindly. Their track record speaks volumes. From innovations like DLSS and ray tracing, exclusive to their RTX graphics cards, to advancements in computer vision and natural language processing driven by their AI research division, Nvidia has long been at the forefront of AI-driven technologies.
So, what does Nvidia's potential entry mean for consumers and professionals alike? Picture low-powered SoCs powering handheld consoles or sleek gaming laptops. The broader trend towards AI-enhanced computing underscores the significance of this move. It's not just about performance; it's about unlocking new features and catering to diverse needs.
For Nvidia, this isn't just an opportunity—it's a gateway to redefining the gaming experience. With greater CPU control, they could pioneer a unified AI-gaming ecosystem, transcending the boundaries between laptops, consoles, and desktops.
As we look ahead to 2025, the excitement is palpable. While specifics remain shrouded in mystery, one thing is certain: Nvidia's venture into laptops promises an exhilarating journey into the future of computing.
But don't just take our word for it. Join the conversation. Dive into our FAQs and explore the possibilities. Nvidia's potential entry into ARM-based laptops isn't just a news flash—it's a glimpse into a world where AI reigns supreme, and the possibilities are endless.
| n968941 |
1,862,965 | Nvidia's Profit Soars: Highlighting Its AI Chip Dominance | Nvidia's recent financial triumph is nothing short of extraordinary, painting a vivid picture of its... | 0 | 2024-05-23T14:36:03 | https://dev.to/n968941/nvidias-profit-soars-highlighting-its-ai-chip-dominance-17d1 | flutter, firebase, beginners, programming | Nvidia's recent financial triumph is nothing short of extraordinary, painting a vivid picture of its indispensable role in the ever-evolving AI sector. With profits soaring to unprecedented heights and CEO Jensen Huang casting an innovative vision for the future, the tech giant stands at the forefront of what promises to be the next industrial revolution. Yet, as the euphoria of success settles, whispers of doubt emerge, questioning the sustainability of Nvidia's meteoric rise.
[read full article](https://flutters.in/nvidias-profit-soars-highlighting-its-ai-chip-dominance/)
Record-Breaking Earnings
In a resounding declaration of its dominance, Nvidia shattered Wall Street's expectations, with profits rocketing over sevenfold to a staggering $14.88 billion in its first quarter ending April 28. This astronomical surge, coupled with revenue tripling to $26.04 billion, firmly establishes Nvidia as a titan in the realm of artificial intelligence.
The AI Revolution
CEO Jensen Huang's words reverberate with the weight of prophecy as he paints a portrait of a world transformed by Nvidia's technology. In Huang's vision, companies will harness Nvidia's chips to construct "AI factories," where the alchemy of data produces artificial intelligence as a tangible commodity. The evolution of AI models into "multimodal" systems, capable of processing a myriad of data types, heralds a new era of innovation, empowering AI to reason and plan with unparalleled depth and precision.
Exceeding Expectations
Nvidia's triumph extends beyond financial figures, with the company defying even the most optimistic forecasts. Adjusted earnings per share soared to $6.12, surpassing Wall Street's predictions with a resounding roar. This triumph was further underscored by a bold move—a 10-for-1 stock split and a substantial increase in dividends—sending Nvidia's stock price soaring by 6% in after-hours trading.
Strategic Vision and Market Position
Nestled in the heart of Silicon Valley, Nvidia has meticulously crafted a stronghold in both AI hardware and software under the visionary guidance of CEO Jensen Huang. This strategic foresight, honed over a decade, has propelled Nvidia to the summit of the tech industry, trailing only Microsoft and Apple in market value. From gaming to automotive industries, Nvidia's influence pervades, casting a long shadow over its competitors.
Market Reaction and Future Prospects
Analysts marvel at Nvidia's meteoric ascent, heralding it as a beacon of resilience in an ever-shifting landscape. Despite attempts by some to diminish reliance on Nvidia's AI hardware, the appetite for its specialized chips remains insatiable. Behemoths like Amazon, Google, Meta, and Microsoft are doubling down on their investments in Nvidia's AI chips and data centers, signaling a future of boundless potential.
Challenges Ahead
Yet, amidst the jubilation, shadows of uncertainty loom on the horizon. Analysts ponder the sustainability of Nvidia's explosive growth, wary of the looming specter of saturation. As AI workloads shift from training to inference, the demand for Nvidia's high-powered chips may wane, opening the door for competitors offering more affordable alternatives. In this crucible of innovation, Nvidia's market supremacy hangs in the balance, poised on the precipice of evolution.
As the curtain rises on the next act of the AI saga, Nvidia's strategic prowess and unwavering innovation will serve as guiding stars, illuminating the path to a future defined by limitless possibilities.
| n968941 |
1,862,963 | 8 Reasons Why Flutter is the Future of App Development | The mobile app landscape has reshaped how we interact with the world, granting us easy access to an... | 0 | 2024-05-23T14:33:55 | https://dev.to/n968941/8-reasons-why-flutter-is-the-future-of-app-development-2hnk | flutter, firebase, beginners, programming | The mobile app landscape has reshaped how we interact with the world, granting us easy access to an array of services through our smartphones. But in this fast-paced industry, where competition is fierce and innovation is key, selecting the right development platform is paramount. Enter Flutter, the brainchild of Google, swiftly emerging as the beacon of the future for app development.
**What Exactly is Flutter?**
Flutter is not just another framework; it's a game-changer. Developed by the tech juggernaut Google, Flutter is an open-source framework designed to create stunning user interfaces across various platforms with ease. From web to mobile to desktop, Flutter empowers developers to craft seamless experiences using a single codebase.
**Why Flutter Spells the Future of App Development**
1. **One Codebase, Infinite Possibilities:** With Flutter, developers bid farewell to the hassle of juggling multiple codebases. Write once, deploy anywhere—Android, iOS, desktop, you name it. This streamlined approach slashes development time, cuts costs, and gets your app to market faster than ever.
2. **Speed and Performance Redefined:** Powered by Dart, Flutter delivers top-notch performance by compiling directly to native code. Say goodbye to sluggish apps; Flutter's AOT compilation and Skia graphics engine ensure lightning-fast execution and buttery-smooth animations.
3. **From Idea to MVP in No Time:** Startups, pay attention! Flutter simplifies the journey from concept to MVP with its extensive widget library and powerful toolkit. Test the waters, gather feedback, iterate—all at warp speed.
[read full article](https://flutters.in/8-reasons-why-flutter-is-the-future-of-app-development/)
4. **Developer Bliss:** Flutter doesn't just speak to end-users; it's a dream come true for developers too. With comprehensive documentation, a vibrant community, and a treasure trove of plugins, Flutter ensures a delightful coding experience from start to finish.
5. **Testing Made Effortless:** Bid adieu to endless testing cycles. Flutter's "Hot Reload" feature lets developers see changes in real-time, streamlining bug fixes and enhancing overall app reliability.
6. **Crafting Beautiful UIs:** Flutter's widget-based architecture empowers developers to create visually stunning and consistent UIs effortlessly. Thanks to "Hot Reload," tweaking designs is a breeze, ensuring your app looks and feels exactly how you envision it.
7. **Seamless Firebase Integration:** Need real-time data syncing or push notifications? Flutter and Firebase make the perfect pair. Integrate Firebase effortlessly and elevate your app with interactive features that keep users engaged.
8. **Cost-Effective Brilliance:** Developing native apps for multiple platforms can burn through budgets. Flutter changes the game by allowing a single codebase for all platforms, saving time and money without compromising quality.
**Final Thoughts: A Glimpse into the Future**
In a world where user experience reigns supreme, Flutter shines as a beacon of promise for app developers worldwide. With its unparalleled performance, developer-friendly ecosystem, and cost-effective approach, Flutter is more than just a framework—it's a paradigm shift. Embrace Flutter, and usher your app into the future.
**FAQs: Unveiling the Power of Flutter**
Wondering what makes Flutter tick? Dive into our FAQs for a deeper understanding:
- **What is Flutter?**
- **Why Flutter for App Development?**
- **How does Flutter's single codebase benefit development?**
- **What sets Flutter's performance apart?**
- **How does Flutter expedite MVP creation?**
- **Why is Flutter a developer's dream?**
- **How does Flutter streamline testing efforts?**
- **What's the secret behind Flutter's expressive UI development?**
- **How does Firebase integration amplify Flutter apps?**
- **Why is Flutter a cost-effective choice?**
- **What lies ahead for Flutter in the realm of app development?**
Ready to embark on your Flutter journey? Join our community and witness the future of app development unfold before your eyes!
| n968941 |
1,862,962 | Ocultar Slides ou Criar Novas Derivações: Qual a Melhor Abordagem para Apresentações? | Quando se trata de preparar apresentações para diferentes públicos, eu sempre busco a maneira mais... | 0 | 2024-05-23T14:33:34 | https://dev.to/biosbug/ocultar-slides-ou-criar-novas-derivacoes-qual-a-melhor-abordagem-para-apresentacoes-emc | beginners, devrel | When it comes to preparing presentations for different audiences, I always look for the most efficient way to reduce cognitive load and focus on higher-value activities. In this post, I will explore two main strategies: hiding slides or creating new derived versions of the presentation. Which one is less costly and more productive?
Hiding Slides: Simplicity and Speed
Advantages
1. Quick and Easy: Marking slides as hidden is a simple, fast task. I don't need to create new versions of the file, which saves time.
2. Less Work: Maintaining a single presentation means that all updates and improvements are applied uniformly, avoiding duplicated effort.
3. Consistency: With a single presentation, it is easier to ensure that all the important information is up to date at all times.
Disadvantages
1. Management Complexity: Managing which slides should be hidden for different audiences can become confusing, especially when there are many variations.
2. Risk of Error: There is a higher risk of forgetting to hide or show specific slides, resulting in an inconsistent and potentially confusing presentation.
Creating New Derived Versions: Personalization and Clarity
Advantages
1. Personalization: Creating derived presentations lets me tailor each one to the specific needs of each audience. This ensures the message is clear and relevant.
2. Clarity: Each version of the presentation is clear and specific, with no need to manage hidden slides. This makes preparation and delivery easier.
3. Preparation: It helps me prepare better, since each version of the presentation is adjusted for a specific audience.
Disadvantages
1. Time Consumed: Creating and maintaining multiple versions of the presentation can be time-consuming. Each new derived version takes time and effort to develop.
2. Maintenance: Updates need to be applied to every version, which can be laborious and increases the chance of inconsistencies.
My Recommendation
To decide which approach is more productive and less costly, I consider the following factors:
- Number of Variations: If the variations between presentations for different audiences are minimal, hiding slides may be the best option.
- Frequency of Updates: If the presentation needs to be updated frequently, maintaining a single version with hidden slides may be more efficient.
- Complexity of Personalization: If the personalization for each audience is significant, creating new derived versions may be more productive in the long run.
In my experience, for presentations that require substantial personalization for different audiences, creating new derived versions is more productive.
For presentations with only small variations, hiding slides is less costly and more efficient. Weighing these variables can help you choose the strategy that best fits your needs, letting you focus on higher-value activities. | biosbug |
1,862,960 | F# For Dummys - Day 13 Collections Array | Today we introduce another collection Array Array is a type of collection that stores a fixed-size... | 0 | 2024-05-23T14:27:57 | https://dev.to/pythonzhu/f-for-dummys-day-13-collections-array-3f5p | fsharp | Today we introduce another collection Array</br>
An Array is a collection that stores a fixed-size sequence of elements of the same type.<br>
Arrays are mutable: you can change the elements after the array has been created.<br>
#### Create Array
- Explicitly specifying elements
```f#
let array1 = [| 1; 2; 3 |]
```
The `;` can be omitted between elements if you write them on separate lines
```f#
let array1 =
[|
1
2
3
|]
```
All elements must be of the same type
```f#
let array1 = [| 1; 2; "3"|]
```
The first element is an int and the last is a string, so you will get a compile error:<br>
All elements of an array must be implicitly convertible to the type of the first element, which here is 'int'. This element has type 'string'
- Using comprehensions
```f#
let array1 = [| for i in 1 .. 5 -> i * i |] // [| 1; 4; 9; 16; 25 |]
```
- Array.empty<br>
Returns an empty array of the given type
```f#
let myEmptyArray = Array.empty // Evaluates to [| |]
```
- Array.init<br>
Creates an array given the dimension and a generator function to compute the elements<br>
syntax: Array.init count initializer
```f#
let array0 = Array.init 5 (fun i -> i) // [| 0; 1; 2; 3; 4 |]
```
- Array.create<br>
syntax: Array.create count value<br>
Creates an array whose elements are all initially the given value
```f#
let zeroes = Array.create 5 0 // zeroes = [| 0; 0; 0; 0; 0 |]
```
- Array.zeroCreate<br>
Creates an array whose elements are initially set to the default value (zero for int)
```f#
let arrayOfFiveZeroes : int array = Array.zeroCreate 5 // [| 0; 0; 0; 0; 0 |]
```
#### Get element of Array
- Using index
```f#
let array0 = Array.init 5 (fun i -> i)
printfn "first element %A" array0.[0] // first element 0
```
- Using slice
```f#
let array0 = Array.init 5 (fun i -> i)
printfn "elements from 0 to 2: %A" array0.[0..2] // elements from 0 to 2: 0,1,2
printfn "elements from 2 to the end: %A" array0.[2..] // elements from 2 to the end: 2,3,4
```
- Using the built-in function get
```f#
let inputs = [| "a"; "b"; "c" |]
let secondElement = Array.get inputs 1
printfn "secondElement: %s" secondElement // secondElement: b
```
#### Loop an Array
- for...in
```f#
let array1 = [| for i in 1 .. 5 -> i * i |]
for number in array1 do
printfn "Number: %d" number
```
- Array.iter<br>
syntax: Array.iter action array
```f#
let array1 = [| for i in 1 .. 5 -> i * i |]
Array.iter (printfn "Number: %d") array1
```
#### Modify element of Array
- Using index to set value
```f#
let inputs = [| "a"; "b"; "c" |]
inputs.[0] <- "d"
printfn "inputs: %A" inputs // d,b,c
```
- Using the built-in function set<br>
syntax: Array.set array index value
```f#
let array1 = [| for i in 1 .. 5 -> i * i |]
printfn "array1 before set: %A" array1 // [| 1; 4; 9; 16; 25 |]
for i in 0 .. array1.Length - 1 do
Array.set array1 i (i * 2)
printfn "array1 after set: %A" array1 // [| 0; 2; 4; 6; 8 |]
```
#### Pattern Matching with Array
```f#
let sumFirstTwoElements arr =
match arr with
| [| x; y |] -> x + y
| [| x |] -> x
| [| |] -> 0
| _ -> failwith "Array has more than two elements."
let sum1 = sumFirstTwoElements [| 1; 2 |]
let sum2 = sumFirstTwoElements [| 5 |]
let sum3 = sumFirstTwoElements [| |]
printfn "Sum of first two elements: %d" sum1 // 3
printfn "Sum of first element: %d" sum2 // 5
printfn "Sum of empty array: %d" sum3 // 0
```
#### Operate Array
- Concat</br>
Builds a new array that contains the elements of each of the given sequence of arrays</br>
syntax: Array.concat arrays
```f#
let inputs = [ [| 1; 2 |]; [| 3 |]; [| 4; 5 |] ]
let newArray = inputs |> Array.concat
printfn "newArray: %A" newArray
```
- Append</br>
Builds a new array that contains the elements of the first array followed by the elements of the second array</br>
syntax: Array.append array1 array2
```f#
let results = Array.append [| 1; 2 |] [| 3; 4 |]
printfn "results: %A" results
```
- Filter
```f#
let originalArray = [| 1; 2; 3; 4; 5; 6 |]
let evenNumbers = Array.filter (fun x -> x % 2 = 0) originalArray
printfn "evenNumbers: %A" evenNumbers // [|2; 4; 6|]
```
- Map
```f#
let originalArray = [| 1; 2; 3 |]
let doubleArray = Array.map (fun x -> 2 * x) originalArray
printfn "doubleArray: %A" doubleArray // doubleArray: [|2; 4; 6|]
```
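Filter and map also compose naturally with F#'s pipe operator; here is a small illustrative sketch (the names are new, not from the examples above):
```f#
let evenSquares =
    [| 1 .. 10 |]
    |> Array.filter (fun x -> x % 2 = 0) // [| 2; 4; 6; 8; 10 |]
    |> Array.map (fun x -> x * x)        // [| 4; 16; 36; 64; 100 |]
printfn "evenSquares: %A" evenSquares
```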
- Fold
```f#
let numbers = [| 1; 2; 3; 4; 5 |]
let sum = Array.fold (fun acc x -> acc + x) 0 numbers
printfn "sum: %i" sum // sum: 15
``` | pythonzhu |
1,862,959 | How Nonprofits Are Using AI Software to Drive Decision-Making? | Decision-making is not just about profits in the realm of nonprofit organizations; it’s about... | 0 | 2024-05-23T14:26:42 | https://dev.to/amitesh_mondal/how-nonprofits-are-using-ai-software-to-drive-decision-making-cg | nonprofit, ai, software, technology | Decision-making is not just about profits in the realm of nonprofit organizations; it’s about maximizing impact and serving communities effectively. With the rapid advancements in technology, nonprofits are increasingly turning to artificial intelligence (AI) software to inform and drive their decision-making processes. From resource allocation to program evaluation, AI is revolutionizing how nonprofits operate and make critical choices. In this blog, we'll explore how nonprofits are harnessing the power of AI software to drive decision-making and create positive change.
## **Understanding the Role of Nonprofit AI Software**
Before delving into specific examples, let's first understand what AI entails in the context of nonprofits. AI refers to the simulation of human intelligence processes by machines, particularly computer systems.
In this case, **[nonprofit AI software](https://www.liveimpact.org/)** analyzes vast amounts of data to identify patterns, trends, and insights that humans might overlook. This enables organizations to make data-driven decisions that are more accurate, efficient, and impactful.
### **Enhancing Program Effectiveness**
One of the primary ways nonprofits are leveraging the best AI software is to enhance the effectiveness of their programs. By analyzing data related to program outcomes, beneficiary demographics, and environmental factors, AI can provide valuable insights into what works and what doesn't.
For example, a nonprofit focused on education may use artificial intelligence to analyze student performance data and identify areas where intervention is most needed. By understanding these insights, organizations can tailor their programs to better meet the needs of their target populations.
### **Optimizing Resource Allocation**

Resource allocation is a crucial aspect of nonprofit management, and nonprofit artificial intelligence software is helping organizations optimize this process. By analyzing historical data on fundraising, expenses, and program outcomes, AI can recommend the most effective allocation of resources.
For instance, AI algorithms can identify which fundraising strategies yield the highest returns or which programs have the greatest impact per dollar spent. This enables nonprofits to allocate their limited resources more efficiently, maximizing their impact on the communities they serve.
### **Improving Donor Engagement**
Donors are the lifeblood of many nonprofits, and AI systems are transforming how organizations engage with their supporters. AI-powered analytics can segment donors based on their preferences, behavior, and giving history, allowing nonprofits to personalize their communications and outreach efforts.
For example, artificial intelligence tools can predict which donors are most likely to respond to a particular campaign or which communication channels are most effective for different donor segments. By tailoring their outreach strategies, nonprofits can cultivate stronger relationships with donors and increase their fundraising effectiveness.
### **Strengthening Advocacy Efforts**
Advocacy is a cornerstone of many nonprofit missions, and AI is playing an increasingly important role in shaping advocacy strategies. **[The best AI software can analyze social media trends, news articles, and public sentiment to identify key issues and influencers relevant to a nonprofit's cause](https://medium.com/p/1a1e54097e6b)**.
This information enables organizations to craft more targeted advocacy campaigns and engage with stakeholders more effectively. For example, AI-powered sentiment analysis can gauge public opinion on a particular issue, helping nonprofits tailor their messaging to resonate with their audience.
### **Addressing Ethical Considerations**
While the potential benefits of AI in nonprofits are significant, it's essential to consider the ethical implications of using nonprofit artificial intelligence software in decision-making. Nonprofits must ensure that AI algorithms are transparent, accountable, and free from bias.
Additionally, organizations must prioritize data privacy and security to protect the sensitive information of their beneficiaries and donors. By adopting ethical AI principles and practices, nonprofits can harness the power of AI while upholding their core values and commitments to social justice.
## **Conclusion**

In conclusion, AI systems are transforming decision-making in nonprofit organizations, enabling them to operate more effectively, efficiently, and ethically. From enhancing program effectiveness to optimizing resource allocation, AI is revolutionizing how nonprofits serve their communities and create positive change.
By embracing AI software and ethical principles, nonprofits can unlock new opportunities for innovation and impact, ultimately advancing their missions and improving the lives of those they serve. As we continue to navigate the complexities of the nonprofit landscape, artificial intelligence will undoubtedly play an increasingly important role in shaping the future of decision-making and social impact.
| amitesh_mondal |
1,862,957 | what is the benefit of c++ language? | A post by Abdullah Esmail | 0 | 2024-05-23T14:23:58 | https://dev.to/esmailabdullah/what-is-the-benefit-of-c-language-5and | esmailabdullah | ||
1,862,914 | The 3 best AI tools not talked about enough | A comprehensive suite of online design tools, ranging from photo editors to wireframing applications... | 0 | 2024-05-23T14:19:44 | https://dev.to/seren/the-3-best-ai-tools-not-talked-about-enough-3c59 | ai |
A comprehensive suite of online design tools, ranging from photo editors to wireframing applications and beyond.
1. [Bgrem](https://bgrem.ai/) - A multifunctional website with a wide range of tools including image generators, AI filters, interior redesign, background removal, and many others - free generations and affordable prices.

2. [D-ID](https://www.d-id.com/), the premier AI Video Creation Platform, empowers you to craft photorealistic videos with generative AI, whether through D-ID's API or the Creative Reality™ Studio. Transform your vision into lifelike videos effortlessly.

3. [QuillBot](https://quillbot.com/)
Quillbot AI's premium feature includes a powerful plagiarism checker, seamlessly integrated to ensure your content's originality without the need for external tools. Say goodbye to plagiarism worries and hello to effortless authenticity.

| seren |
1,862,913 | Everything You Need to Know About Business Contract Law | Contract law is considered one of the most important legal documents in business. This contract... | 0 | 2024-05-23T14:18:38 | https://dev.to/newsinfo/everything-you-need-to-know-about-business-contract-law-2o91 | law | Contract law is considered one of the most important legal documents in business. This contract determines how the employees are hired, how companies work together, and other details. Contract law is fundamental to understanding the business segment.
Business contract law involves many details; in this article, we will examine these contracts closely. If you want to [learn more](https://clearwaterbusinessattorney.com/contact/tampa-fl-business-law-attorney/) about contract laws, this article is for you. Let's begin our discussion of business contract laws.
**What is a Contract?**
A contract is a legally binding agreement between two parties designed to create mutual obligations that businesses and individuals use to protect their interests. Some contracts outline the specific terms of engagement for a transaction. These agreements can also dictate the legal consequences if an agreement is violated.
These contracts are either written or verbal. Some businesses use written tracts because it is easier for them to refer to later. Written agreements are more straightforward and less ambiguous to enforce, so it is better to have a written contract.
**What is a Contract Law?**
Contract law is the body of law that specifically governs how contracts are created and enforced. It deals with how contracts are formed, what a document must contain to be considered a contract, who is eligible to enter a contract, and what the consequences of exiting or violating a contract are. These laws explain when a contract exists and when it is enforceable.
**What are the Essential Components of a Contract?**
● Offer: There must be a clear and specific offer from one party to another. There is an offeror (the offering party) and an offeree, and the offer sets particular terms for the offeree.
● Acceptance: There must be clear acceptance of the offer to avoid confusion between the parties at later stages. Acceptance can take three forms: words, actions, and performance.
● Consideration: Each party must provide something of value under the contract. Financial consideration can include loans, property, or other services.
These are some of the significant details of the business contract. The contract law mandates that a contract is only valid when there is proper structure and fulfillment of all requirements, such as public policy, consent from the parties, agreement to all terms and conditions, and various such details. | newsinfo |
1,862,911 | Vasectomy Recovery: Tips for a Smooth Healing Process | Considering a vasectomy as a permanent birth control, Vasectomy Brooklyn, New York, is a good option.... | 0 | 2024-05-23T14:16:20 | https://dev.to/newsinfo/vasectomy-recovery-tips-for-a-smooth-healing-process-10gp | vasetomy | If you are considering a vasectomy as permanent birth control, [Vasectomy Brooklyn, New York](https://drjonlazare.com/conditions/vasectomy/) is a good option. A vasectomy is a minor surgical procedure that offers a highly effective way to prevent pregnancy. While it's a relatively simple and effective procedure, proper aftercare is crucial for a smooth and comfortable recovery. This short blog explores essential tips to ensure your body heals optimally after a vasectomy.
**What to Expect After Your Vasectomy?**
Following your vasectomy, it's normal to experience some soreness, swelling, and bruising around the scrotum. This discomfort usually peaks within the first 48 hours and gradually subsides over the next week or two. Your doctor will provide specific instructions on caring for the incision site and managing pain.
**Some Critical Questions for a Smooth Recovery:**
1. **How Much Rest Should I Get?**
Rest is vital for Healing. Plan to take it easy for at least 24-48 hours after your procedure. Avoid strenuous activities or lifting heavy objects. As you feel better, gradually increase your activity level, but listen to your body and avoid pushing yourself too hard.
2. **How Can I Manage Pain and Discomfort?**
Your doctor will likely recommend over-the-counter pain relievers like acetaminophen (Tylenol) to manage discomfort. However, avoid medications like ibuprofen or aspirin that can increase bleeding risk. Apply ice packs to the scrotum for short intervals (20 minutes on, 20 minutes off) to reduce swelling and ease pain.
3. **How Do I Care for the Incision Site?**
Maintaining good hygiene is crucial to prevent infection. Follow your doctor's instructions regarding showering or bathing, typically after 24 hours. Keep the incision site clean and dry. Avoid using harsh soaps or lotions in the area. Wear loose-fitting clothing and supportive underwear, like a jockstrap, to minimize irritation and provide support.
4. **When Can I Resume Sexual Activity?**
While you may feel up for intimacy sooner, it's essential to wait for your doctor's approval. Typically, a waiting period of 7-10 days is recommended to allow for Healing. It's necessary to use alternative contraception methods until a semen analysis confirms the absence of sperm in the ejaculate, signifying a successful vasectomy.
5. **When Should I See a Doctor?**
While some discomfort is expected, be alert for signs of infection, such as increased pain, redness, swelling, or fever. If you experience any unusual bleeding, difficulty urinating, or persistent vomiting, consult your doctor immediately.
**Conclusion**
A vasectomy can be a life-changing decision for men seeking permanent birth control. Following these recovery tips and maintaining open communication with your doctor can ensure a smooth and comfortable healing process. Remember, recovery is an individual journey; listen to your body, prioritize rest, and don't hesitate to seek medical advice if needed. | newsinfo |
1,862,910 | Good Morning Everybody. I m Independent Distribuitor Agent From Herbalife. | Good Morning Developer, today is my 4 years to continue to keep my branch from Herbalife. I believe... | 0 | 2024-05-23T14:14:36 | https://dev.to/githubmario2020/good-morning-everybody-i-m-independent-distribuitor-agent-from-herbalife-2con | Good morning, developers. Today marks 4 years of keeping my Herbalife branch going. I believe Herbalife has a lot to offer the health industry and has always given me a good source to complement my life. I continue in the wellness field because the most important asset in my life is health; no matter where you start, you can grow your business. I started in my home country, Peru, in 2020, and I still continue and pursue my small business today, using the same recipe as a complement in my daily life. If you want a suggestion, you can start as a preferred client and then continue as a distributor agent like me. Good luck on your path. Mario Aguilar, Business Intelligence and Data Analytics; you can write to mariobluestock2022@gmail.com. I am still looking for the next entrepreneur in Herbalife Wellness. | githubmario2020
1,862,909 | The Dangers of DIY Pest Control in College Station: When to Call in the Professionals? | Pests, the unwelcome guests that invade our homes causing absolute havoc! They start small, perhaps... | 0 | 2024-05-23T14:14:34 | https://dev.to/newsinfo/the-dangers-of-diy-pest-control-in-college-station-when-to-call-in-the-professionals-3inj | diy | Pests, the unwelcome guests that invade our homes causing absolute havoc!
They start small, perhaps with a few ants scurrying across the kitchen floor or a lone cockroach darting out from under the refrigerator. But before you know it, they've multiplied, spreading their presence throughout your living spaces.
[Click here](https://backyardcomfortandpestcontrol.com/bryan_pest_control.asp) to learn the steps to take when faced with such a situation. Many homeowners resort to the quick fix of DIY pest control. While this may seem like a convenient solution at first glance, it often comes with its own set of difficulties and dangers.
● One of the primary dangers of DIY pest control is that it provides a temporary fix, addressing the symptoms rather than the root cause of the infestation.
● Many off-the-shelf products contain harsh chemicals that not only pose health risks to humans and pets but also fail to eradicate the problem entirely.
● Additionally, misidentification of pests and improper treatment methods can exacerbate the issue, leading to the worsening of infestations.
● Ultimately, DIY solutions can result in a waste of both time and money, leaving you back at square one.
This is where professional pest control services come into play. Unlike DIY methods, pest control professionals are trained to accurately identify pests and implement targeted treatment plans.
But how do you know it's time to call the professionals? Here are its tell-tale signs:
1. Large Infestations: If you notice a significant increase in pest activity or the presence of large numbers of pests, it's likely time to seek professional help.
2. Persistent Infestations: Despite your best efforts with DIY methods, if the infestation persists or returns shortly after the treatment, it's a sign that professional intervention is necessary.
3. Structural Damage: Pests such as termites and rodents can cause extensive damage to your home's structure over time. If you notice signs of structural damage, it's essential to address the problem promptly with professional assistance.
4. Venomous Pests: Dealing with venomous pests such as spiders or scorpions requires specialized knowledge and equipment. In such cases, it's safest to leave the job to trained professionals.
All in all, while DIY pest control may seem like a cost-effective solution, it often falls short in effectively addressing pest infestations. So by recognizing the early signs of pest infestation and calling the professionals right away, you can ensure the swift and thorough eradication of pests from your home. | newsinfo |
1,862,907 | optimize query in laravel and mysql | Optimizing queries for large datasets in Laravel involves several strategies to improve performance... | 0 | 2024-05-23T14:12:56 | https://dev.to/abir_hossen_aurnob/optimize-query-in-laravel-and-mysql-5fab | laravel, mysql, query, largedata |
Optimizing queries for large datasets in Laravel involves several strategies to improve performance and efficiency. Here are some key techniques you can use:
1. Use Eloquent Efficiently
Select Specific Columns: Only select the columns you need to minimize the amount of data being retrieved.
```
$users = User::select('id', 'name', 'email')->get();
```
Eager Loading: Use eager loading to prevent the N+1 query problem.
```
$users = User::with('posts', 'comments')->get();
```
2. Use Query Builder
For complex queries, the Query Builder can be more efficient than Eloquent.
```
$users = DB::table('users')->where('status', 'active')->get();
```
3. Pagination
Instead of retrieving all records at once, use pagination to load data in chunks.
```
$users = User::paginate(50);
```
4. Indexing
Ensure that your database tables have proper indexes on columns that are frequently queried.
```
Schema::table('users', function (Blueprint $table) {
$table->index('email');
});
```
5. Chunking
For processing large datasets, use chunking to handle records in smaller pieces.
```
User::chunk(100, function ($users) {
foreach ($users as $user) {
// Process each user
}
});
```
6. Caching
Cache the results of frequently run queries to reduce database load.
```
$users = Cache::remember('active_users', 60, function () {
return User::where('status', 'active')->get();
});
```
7. Use Raw Queries for Complex Operations
For very complex queries, using raw SQL can sometimes be more efficient.
```
$users = DB::select('SELECT * FROM users WHERE status = ?', ['active']);
```
8. Optimize Database Configuration
Ensure your database is configured for optimal performance:
- Increase memory limits.
- Tune the buffer/cache sizes.
- Use appropriate storage engines.
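As a rough illustration of what such tuning can look like for MySQL, a few commonly adjusted settings (the values below are placeholders for illustration only, not recommendations; size them to your workload and hardware):
```
[mysqld]
# Placeholder values for illustration only.
innodb_buffer_pool_size = 1G   # cache more data and index pages in memory
innodb_log_file_size = 256M    # larger redo log helps write-heavy workloads
table_open_cache = 2000        # keep frequently used tables open
```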
9. Profiling and Analyzing Queries
Use Laravel's query log to analyze and profile your queries.
```
DB::enableQueryLog();
// Run your query
$users = User::all();
$queries = DB::getQueryLog();
dd($queries);
```
10. Avoid N+1 Problem
Ensure you are not making additional queries in loops.
```
// Bad: N+1 problem
$users = User::all();
foreach ($users as $user) {
echo $user->profile->bio;
}
// Good: Eager loading
$users = User::with('profile')->get();
foreach ($users as $user) {
echo $user->profile->bio;
}
```
Optimizing a Complex Query
Suppose you need to fetch users with their posts and comments, and you want to optimize this operation:
```
$users = User::select('id', 'name', 'email')
->with(['posts' => function ($query) {
$query->select('id', 'user_id', 'title')
->with(['comments' => function ($query) {
$query->select('id', 'post_id', 'content');
}]);
}])
->paginate(50);
```
| abir_hossen_aurnob |
1,862,906 | What to Expect When Facing a Court-Martial | The court-martial is the most complicated and severe legal proceeding any military professional has... | 0 | 2024-05-23T14:12:29 | https://dev.to/newsinfo/what-to-expect-when-facing-a-court-martial-3md3 | court | The court-martial is the most complicated and severe legal proceeding any military professional has ever experienced. As a military person, if you are facing court-martial, you must be in a tough spot. Safeguarding your rights and reputation is essential; thus, learning more about court-martials in detail is vital. Knowing what to expect will prepare you since it is a severe legal proceeding. If you are in such a situation, we have got you covered. This article will discuss the top 5 things you must expect during a court-martial. So, [learn more about court-martials](https://www.defendyourservice.com/practice-areas/military-criminal-court-martial-defense) here!
Things to expect during a court-martial
1. **Formal proceedings**
The formal framework of court-martial proceedings is similar to civil trials. However, it is governed by particular military regulations. The proceedings include:
● Opening remarks
● Evidence presentation
● Witness cross-examination
● Legal arguments
● Closing remarks
A military judge presides over the court martial. The accused can select between having a civilian or military defense attorney defend them.
2. **Charges and specifications**
The accused is aware of the allegations and specifications against them before the court-martial. These charges list the claimed transgressions and infractions against military law, including misbehavior, neglect of duty, and order violation. The defense raises objections to the evidence and makes its case to disprove the accusations, while the prosecution provides evidence to support the charges.
3. **Panel or jury**
Under specific court-martial procedures, the accused may face trial by a jury composed of military personnel. This is referred to as a "jury trial." The accused can undergo the "judge-alone" trial, in which the military judge issues the verdict. Depending on the nature and seriousness of the charges, the jury or panel's makeup changes.
4. **Legal representation**
The accused is entitled to legal counsel during the court-martial. They can hire a civilian attorney from a military defense attorney provided by the military. To ensure that all rights are safeguarded. Further, they provide a defense plan and refute the prosecution's case.
5. **Potential penalties**
Depending on the seriousness of the offenses, the accused suffers punishments if found guilty at a court-martial. Some examples of penalties are:
● Incarceration
● Dishonorable discharge
● Rank reduction
● Fines
The type of court-martial, the magnitude of the charges, and any aggravating or mitigating circumstances mentioned during the hearings all influence the severity of the sanctions.
**Wrapping up**
These are some significant things that you may expect during a court-martial. Ensure you hire a professional lawyer to safeguard your rights. | newsinfo |
1,862,904 | What To Do When Injured in a Ridesharing Accident | Rideshare services like Uber have become extremely common worldwide. These come with great benefits... | 0 | 2024-05-23T14:10:08 | https://dev.to/newsinfo/what-to-do-when-injured-in-a-ridesharing-accident-17k5 | accident | Rideshare services like Uber have become extremely common worldwide. These come with great benefits and convenience. They also carry a great risk of accidents. Whether you are a driver or a third party, ridesharing accidents can be a daunting process. These can lead to physical injuries and emotional trauma. Navigating what to do next is complex. Thus, you must know certain things to do in advance. In this article, we will list the top 5 things you must do when injured in a ridesharing accident. [Learn more](https://tuitelaw.com/rockford-illinois-personal-injury-attorney/rockford-auto-accident-attorneys/)!
Things to do when injured in a ridesharing accident
**1. Seek medical attention**
After the accident, physical and emotional harm is common. You may suffer cuts, bruises, strains, fractures, head injuries, and more in such accidents. Therefore, it is important to get medical attention immediately. Even if you feel fine and have no visible injuries, go for a thorough checkup. A medical evaluation also creates a record that is important when claiming a fair settlement.
**2. Report about the accident**
Once you feel medically fit, you must report the accident to the police, regardless of its severity. The police report should document all the details of the accident, including what exactly happened, who was involved, the vehicles, and more. While filing a complaint, state only facts. Avoid embellishing, as it may negatively impact your compensation.
**3. Document the scene**
If possible, take pictures of the accident scene. You must also take videos and photos of your bruises and medical reports. Additionally, contact information should be exchanged with the insurance provider and eyewitnesses. You must also collect information about the ridesharing company and service providers.
**4. Notify the ridesharing company**
Use the rideshare company's app or customer service to report the accident. When it comes to managing collisions involving their drivers and passengers, they might have particular procedures in place. Be ready to share information about the event, such as the date, time, and type of injuries you sustained.
**5. Hire an attorney **
The last and most important step after a ridesharing accident is hiring a lawyer. A lawyer is a professional expert who assesses your case closely and helps you understand your legal rights. Further, they seek compensation for your losses and injuries.
**Wrapping up**
After meeting a ridesharing accident, follow these top 5 steps to get a solution! Make sure you hire an experienced lawyer. | newsinfo |
1,862,908 | deck.gl for Google Maps API | In my last post, I talked about how to optimize GeoJSON in Google Maps API by using the Data Layer... | 27,491 | 2024-05-23T14:10:00 | https://www.joshgracie.com/blog/deckgl/ | deckgl, googlemaps, javascript, gis | In my last post, I talked about how to optimize GeoJSON in Google Maps API by using the Data Layer and event listeners. This time, I want to talk about how to use deck.gl to render large datasets in Google Maps. deck.gl is a WebGL-powered framework for visual exploratory data analysis of large datasets. It is (mostly) agnostic to the mapping library you use, so it can be used with Google Maps API.
## What is deck.gl?
Per the [deck.gl](https://deck.gl/) website, deck.gl is a GPU-powered framework for visual exploratory data analysis of large datasets. It makes use of WebGL to render large datasets quickly and efficiently. deck.gl is a great tool for visualizing large datasets in a performant way. It is (mostly) agnostic to the mapping library you use, so it can be used with Google Maps API.
In fact, deck.gl has a Google Maps overlay that allows you to render deck.gl layers on top of a Google Map. There are a few steps to get this set up, but it is relatively straightforward.
## Getting Started
To get started with deck.gl and Google Maps API, you will need to install the deck.gl library. You can do this by running the following command:
```bash
npm install deck.gl
```
Once you have installed deck.gl, you can create a new deck.gl layer and add it to your Google Map. Here is an example of how to do this:
```javascript
import { GeoJsonLayer } from '@deck.gl/layers';
...
// Create a new deck.gl layer
// This example creates a GeoJsonLayer that renders a GeoJSON dataset on top of a Google Map
let newLayer = new GeoJsonLayer({
id: 'geojson',
data,
opacity: 0.8,
stroked: true,
filled: false,
extruded: false,
wireframe: true,
getLineColor: [255, 255, 255],
pickable: true
});
```
After creating the layer, you can add it to a GoogleMapsOverlay object and add that overlay to your Google Map. Here is an example of how to do this:
```javascript
import { GoogleMapsOverlay } from '@deck.gl/google-maps';
...
let overlay = new GoogleMapsOverlay({});
overlay.setProps({
layers: [newLayer],
});
overlay.setMap(map);
```
This will add the deck.gl layer to your Google Map. You can customize the appearance of the layer by changing the properties of the GeoJsonLayer object. For example, you can change the color of the lines by changing the getLineColor property.
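Accessors like `getLineColor` can also be per-feature functions rather than constants. As a hedged sketch, the snippet below colors features by a numeric property; the `risk` property is hypothetical and only for illustration:

```javascript
// Data-driven accessor: color a feature red when a (hypothetical)
// numeric `risk` property crosses a threshold, otherwise white.
const getLineColor = (feature) =>
  feature.properties && feature.properties.risk > 0.5
    ? [255, 0, 0]      // high risk
    : [255, 255, 255]; // default

// Example: evaluate against a plain GeoJSON-style feature object.
console.log(getLineColor({ properties: { risk: 0.9 } }));
```

Passing a function like this as the layer's `getLineColor` prop makes deck.gl evaluate it once per feature.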
## Why use deck.gl with Google Maps API?
There are a few reasons why you might want to use deck.gl with Google Maps API. One reason is that deck.gl is optimized for rendering large datasets quickly and efficiently. If you have a large dataset that you want to visualize on a Google Map, deck.gl can help you do this in a performant way.
As we saw in my last post, rendering large datasets in Google Maps API can be slow and inefficient, even with the optimizations we made using the Data Layer. deck.gl can help you render large datasets more quickly and efficiently by using WebGL to render the data on the GPU.
Another reason to use deck.gl with Google Maps API is that deck.gl provides a lot of flexibility and customization options. You can customize the appearance of your deck.gl layers in a variety of ways, such as changing the color of the lines or adding extrusion to the data. This can help you create more visually appealing and informative visualizations of your data.
## Conclusion
In this post, I talked about how to use deck.gl to render large datasets in Google Maps API. deck.gl is a powerful tool for visualizing large datasets quickly and beautifully. By using deck.gl with Google Maps API, you can create visually appealing and informative visualizations of your data. If you have a large dataset that you want to visualize on a Google Map, I would recommend giving deck.gl a try.
I hope you found this post helpful. If you have any questions or comments, please feel free to leave them below. Thanks for reading! | jgracie52 |
296,535 | Top 404 html page in codepen | https://youtu.be/HdqMTaBJaDc | 0 | 2020-04-01T14:05:18 | https://dev.to/uiswarup/top-404-html-page-in-codepen-ak2 | html, css, webdev, codepen | https://youtu.be/HdqMTaBJaDc | uiswarup |
1,862,902 | Deep Dive In Rails Callbacks: Part-I | Deep Dive In Rails Callbacks: Part-I | by RubyBlaze | May, 2024... | 0 | 2024-05-23T14:08:01 | https://dev.to/rubyblaze/deep-dive-in-rails-callbacks-part-i-36nj | ruby, rails, webdev, beginners | {% embed https://medium.com/@rubyblaze/deep-dive-in-rails-callbacks-part-i-5cde53f38d78 %} | rubyblaze |
1,862,901 | What Should I Do If My Partner Wants Me To Sign A Prenup? | Prenuptial agreements, more commonly known as prenups, have gained a bad reputation mainly due to... | 0 | 2024-05-23T14:07:04 | https://dev.to/newsinfo/what-should-i-do-if-my-partner-wants-me-to-sign-a-prenup-3gpd | prenups | Prenuptial agreements, more commonly known as prenups, have gained a bad reputation mainly due to media content. However, much of it is unwarranted. Many associate the document with a lack of trust and commitment to the marriage. If your partner has asked you to sign a prenup before the wedding, you may feel shocked and even betrayed.
However, there is no reason to feel so. As [stated here](https://rodriguezfamilylawyer.com/divorce/), Prenups can benefit both you and your children. They guarantee a healthier and happier married life and fewer complexities in case things do not go the right way. You can have a family lawyer review the prenup to ensure it works in your family's best interest.
**What to do if your partner wants to sign you a prenup?**
**1. Separate your emotions from practicality.**
It is normal to experience a range of emotions when facing a prenuptial agreement. You may feel misled, insulted, or hurt. However, it is important to separate your feelings from the practicality of the situation.
Instead of seeing a prenuptial agreement as a reflection of your relationship's strength, see it as a tool for financial planning. The goal is to safeguard both parties and maintain equity in unforeseen circumstances.
**2. Determine its benefits.**
When your partner brings up the prenuptial issue, they probably have the partnership's best interests at heart. Even though some people see a prenuptial agreement as divorce preparation, this agreement has several advantages.
First, it promotes earlier financial conversations. It also protects participating parties from one another's debt. Additionally, having a prenuptial agreement may help a couple avoid arguments during a divorce.
**3. Understand their reasons.**
Have an open discussion about your partner's reasons for wanting a prenuptial agreement. Do they worry about safeguarding their assets or company? Do they worry about more debts in the future? Understanding their viewpoint could reduce your anxieties and help make the conversation more productive.
**4. Communicate with your partner.**
It is the time to speak with your spouse once you have gathered sufficient information. If you are willing to sign it, let your partner know, and if not, discuss your concerns with them. You can work out an arrangement with your partner. Seek legal counsel to protect your future if your spouse-to-be asks you to sign a prenuptial agreement.
**5. Wait to sign.**
Never sign a legal document under pressure, or one that you do not understand or agree with. Give it a thorough read, and ask questions if anything needs to be clarified. It is essential to know that the prenup's terms can also be negotiated.
| newsinfo |
1,862,899 | Environment Output Variables: Easy and Secure Output Piping | Sharing information across environments has become even easier and more secure with the introduction... | 0 | 2024-05-23T14:05:03 | https://www.env0.com/blog/environment-output-variables-easy-and-secure-output-piping | infrastructureascode, devops, cloudcomputing, cloudpractitioner | Sharing information across environments has become even easier and more secure with the introduction of Environment Output Variables. This new feature simplifies sharing outputs of one environment with another in the same project or workflow, storing them securely on the [env0 platform](https://www.env0.com/).
Using environment outputs enhance our existing [Workflows](https://docs.env0.com/docs/workflows) capability, making it even easier to define and manage complex dependencies, enabling you to:
* Pipe outputs to dependent environments
* Securely share sensitive values as [outputs](https://www.env0.com/blog/terraform-output-variables-in-depth-guide-with-examples)
* Avoid complex scripting or data sources
In this post, we'll dig into what options you had before Environment Output Variables, how the new feature works, and how you can get started today.
What’s New?
-----------
A common practice with Infrastructure-as-Code is to break large configurations into smaller, easier-to-manage environments. However, there will be dependencies between the environments and a need to share information across environments.
For example, an application deployment might require a subnet ID from a network environment and a database connection string from a database environment.

Prior to the introduction of Environment Output Variables, you could share the outputs of one environment with another through a few options. Each option has some potential downsides to consider.

The [Import Variable Plugin](https://docs.env0.com/docs/import-variable-plugin) in particular was the preferred solution on env0 for accessing the outputs of other environments.
However, using the plugin requires the creation and maintenance of an API key. Also, using the plugin requires writing a custom flow to add the step into the automation for an environment. Finally, and perhaps most importantly, the plugin couldn't support sensitive data outputs, meaning you had to select a different solution for sensitive data or mark them as insensitive.
Environment outputs offer a better and more elegant solution, native to env0, which does not require the addition of a custom flow or provisioning an API key.
Moreover, here the output values are stored securely using encryption and secrets management, making support for sensitive data values possible.
Let's dig into how Environment Output Variables work on env0.
How it works
------------
As an example, let's say we have two environments. One deploys a Virtual Network and the other deploys an AKS cluster.
The AKS cluster will need a subnet ID from the network environment. To share this information, we would first define an output in the VPC configuration that has the subnet ID for the AKS cluster to use:
```hcl
output "aks_subnet_id" {
  value = module.network.vnet_subnets_name_id["aks"]
}
```
In the AKS environment configuration, there would be an input variable that accepts the subnet ID:
```hcl
variable "vnet_subnet_id" {
  type = string
}
```
When setting up the variables for the AKS environment, we can define the value for the **vnet\_subnet\_id** input variable as the Environment Output **aks\_subnet\_id** from the network environment.

Once the Network environment has been provisioned, the output values will be available, and the AKS environment can be deployed without having to update any variable values.
The output values from the source environment are stored on the env0 platform and secured using AWS Secrets Manager with KMS encryption. This makes Environment Output Variables a safe and secure option even for sensitive values.
Using Outputs with Workflows
----------------------------
Environment Output Variables can be used to share configuration data across environments in a project, but they are especially useful for Workflows. To understand why, let's quickly review what Workflows are.
### What are env0 Workflows
[Workflows](https://docs.env0.com/docs/workflows) in env0 are a declarative approach to describing the relationships between different environments. Each sub-environment in the workflow can reference an existing template or VCS repository housing a configuration.
The workflow itself is stored as a template that projects can use as a golden path to streamline platform creation.
Workflows vastly simplify expressing the sequence, creation, and maintenance of each sub-environment.
Using the declarative nature of Workflows, you can describe the complex relationships between deployments and initiate partial or full runs to create and update sub-environments.

Inside the workflow, the relationship between sub-environments is described using a **needs** block, referencing the sub-environments that the current entry is dependent on for deployment.
Not only does this help with the initial deployment of resources in the proper sequence, but subsequent changes to any sub-environment can initiate a workflow run to update any dependent components.
Now, let’s get back to our earlier example of a Virtual Network and AKS cluster.
Rather than having them as two separate environments in a project, workflows enable you to describe the relationship between each environment and the order in which they are deployed.
```yaml
environments:
  network: # the 'Virtual Network' environment
    name: 'AKS Network'
    templateName: 'AppNetwork'
  aks:
    name: 'AKS Cluster'
    templateName: 'AKSCluster'
    needs:
      - network
```
When provisioning the Workflow, env0 creates a graph showing the relationship between sub-environments in the Workflow.

Workflows are truly a massive step forward in describing and managing complex deployments, but until now they did not share configuration data across the sub-environments. That is where Environment Output Variables come in!
### Enter: Environment Output Variables
Workflows define the relationship between sub-environments, codifying what you would have previously expressed through custom scripts or tribal knowledge.
Before Environment Output Variables, passing outputs from one sub-environment needed to be handled by another solution.
This was often accomplished through the Import Variable Plugin or with the **terraform\_remote\_state** data source. Environment Outputs replace the need for either of those solutions with a native approach that is simpler and more secure.
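For comparison, the `terraform_remote_state` route might look like the sketch below. The backend type, bucket, and key are illustrative assumptions, not values from this post:

```hcl
# Hypothetical example of the pre-existing approach: pulling another
# configuration's outputs through a terraform_remote_state data source.
data "terraform_remote_state" "network" {
  backend = "s3" # assumed backend; could be any supported backend
  config = {
    bucket = "example-tf-state" # assumed bucket name
    key    = "network/terraform.tfstate"
    region = "us-east-1"
  }
}

locals {
  # Consumes the aks_subnet_id output defined in the network configuration.
  vnet_subnet_id = data.terraform_remote_state.network.outputs.aks_subnet_id
}
```

Note how this couples the consumer to the producer's backend configuration and state layout, which is part of what the native feature avoids.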
So how might Environment Output Variables be used in a workflow?
Here are a few examples:
* Sharing subnet and environment information from a network deployment to an AKS deployment
* Passing a database connection string from a database deployment to an application deployment
* Passing the Kubernetes connection information and credentials to a Helm configuration
One especially tricky deployment is the bootstrapping of a Kubernetes cluster with a tool like ArgoCD or Flux after the cluster has been created. Workflows can assist with sequencing tasks and handling dependencies, and output variables can be used to pass the Kubernetes connection information.
What’s next
-----------
Output variables are a great way to pass configuration data between environments in the same project, but there's more goodness to come.
Importantly, the feature capabilities are not limited to OpenTofu and Terraform. Terragrunt, Pulumi, and CloudFormation are all supported as output sources and all IaC Frameworks on env0 can make use of Environment Output Variables as inputs. Support for Helm and Kubernetes as output sources is coming soon.
For shared infrastructure scenarios, you may want to share output values with environments in different projects or Workflows. For instance, sharing Kubernetes cluster information with multiple applications in separate projects.
Coming soon, we will add the ability for Environment Output Variables to reference outputs in other projects.
We're excited for you to try out Environment Output Variables and listen to your feedback!
We invite you to [schedule a demo](https://www.env0.com/demo-request) to get a better idea of how Workflows and Environment Output Variables - and env0 in general - could streamline and improve your IaC journey. | env0team |
1,862,896 | When I add tableView in second view controller I got an error thread 1: signal SIGABRT | This is the link of my project. I created a second view controller. I connect the second view... | 0 | 2024-05-23T14:04:13 | https://dev.to/sarmad_nawaz_031619957008/when-i-add-tableview-in-second-view-controller-i-got-an-error-thread-1-signal-sigabrt-3ib1 | [](https://drive.google.com/drive/folders/1RZKl0wIc4dEE4W-pNe8jBOfDeGz7aoT4?usp=drive_link)
This is the link to my project. I created a second view controller and connected it to an IBAction button in the first view controller. At that point, the app was running and the second view controller was opening. Then I tried to add a tableView to the second view controller, and the error appeared; Xcode points to the line where I dequeue the cell with the identifier.
 | sarmad_nawaz_031619957008 | |
1,862,895 | 5 Biggest Ethical Challenges In AI Development | Artificial intelligence (AI) is everywhere these days, from your phone's virtual assistant to the... | 0 | 2024-05-23T14:04:04 | https://www.techdogs.com/td-articles/trending-stories/5-biggest-ethical-challenges-in-ai-development | ai, aidevelopment, technology | [Artificial intelligence (AI)](https://www.techdogs.com/category/ai) is everywhere these days, from your phone's virtual assistant to the fancy filters on your social media. But hold on there, – with great power comes great responsibility, as Uncle Ben (from Spiderman, not Toy Story) almost definitely said.
AI is like a super-powered toddler – it can be a real game-changer, but it needs some serious guidance. That's where ethical AI comes in – a fancy way of saying making sure AI is used for good and not, well, like a super-powered toddler with a box of matches.
Here's the gist of some of the biggest challenges ethical AI is facing:

[Source](https://tenor.com/view/toy-story-buzz-lightyear-toy-fly-to-infinity-and-beyond-gif-17474152)
1. **Bias in AI**: Imagine an AI accidentally skipping over your resume because your name is Aisha instead of Ashley. Yikes. We need to make sure AI is fair and unbiased, like a good friend who wouldn't judge you for accidentally calling them by their ex's name.
2. **Black Box AI**: Sometimes, AI algorithms are like mysterious fortune cookies – you get the answer, but you have no idea how it came to that conclusion. This lack of transparency can be a problem. We need to understand how AI makes decisions, like figuring out why your friend keeps getting recommended polka music playlists.
3. **Privacy Issues with AI**: Imagine Big Brother watching your every move, but way creepier because it's a computer program. That's the concern with AI and privacy. We need to strike a balance between cool new tech and keeping our personal information, well, personal.
So, how do we fix these AI woes? There's no magic wand here, but there are ways to make AI development more ethical. Think diverse teams creating the AI, algorithms that can explain themselves better than your grumpy grandpa, and data privacy rules stricter than your mom about your room.
The future of AI is bright, but we need to make sure it's a future that benefits everyone, not just robots who can finally take over those pesky CAPTCHAs.
For further details, please read the full article [[here](https://www.techdogs.com/td-articles/trending-stories/5-biggest-ethical-challenges-in-ai-development)].
Dive into our content repository of the latest [tech news](https://www.techdogs.com/resource/tech-news), a diverse range of [articles ](https://www.techdogs.com/resource/td-articles)spanning [introductory guides](https://www.techdogs.com/resource/td-articles/curtain-raisers), product reviews, [trends ](https://www.techdogs.com/resource/td-articles/techno-trends)and more, along with engaging interviews, up-to-date [AI blogs](https://www.techdogs.com/category/ai) and hilarious [tech memes](https://www.techdogs.com/resource/td-articles/tech-memes)!
Also explore our collection of [branded insights](https://www.techdogs.com/resource/branded-insights) via informative [white papers](https://www.techdogs.com/resource/white-papers), enlightening case studies, in-depth [reports](https://www.techdogs.com/resource/reports), educational [videos ](https://www.techdogs.com/resource/videos)and exciting [events and webinars](https://www.techdogs.com/resource/events) from leading global brands.
**Head to the [TechDogs](https://www.techdogs.com/) homepage to Know Your World of technology today!** | td_inc |
1,861,148 | How to Target XHR Errors in Cypress | Introduction Hello, I am Miyazawa, an engineer at WESEEK, Inc. I usually develop GROWI, an... | 0 | 2024-05-23T14:00:00 | https://dev.to/weseek-inc/how-to-target-xhr-errors-in-cypress-59cl | cypress, xhr, programming | ## Introduction
Hello, I am [Miyazawa](https://github.com/miya), an engineer at WESEEK, Inc. I usually develop [GROWI](https://growi.org/en/?utm_source=dev+community&utm_medium=referral&utm_campaign=How_to_Target_XHR_Errors_in_Cypress), an **open-source wiki**. Today, I will explain how to deal with XHR errors that occur in [Cypress](https://www.cypress.io/), which is used for visual regression testing of GROWI.
## Error Details
The error content introduced here is circled in red below. The screenshot shows the actual GROWI login screen.

The actual Cypress code in which the error occurred is shown below. Specifically, it is a Cypress [custom command](https://docs.cypress.io/api/cypress-api/custom-commands) that logs in by going to `/login`, entering a username and password registered in advance, and pressing the login button. You can call `cy.login(username, password)` to use it.

## Error Cause
I believe the cause is that the user's session data was not saved properly in the browser, because the test tried to move to a page that requires login without waiting for the response from `/api/v3/login`, which runs when the login button is pressed.
As a test, after pressing the login button (`cy.get('.btnSubmitForLogin').click()`), I added `cy.wait(10000)`, and the login process was successfully performed without any XHR errors.
From the above, I realized that waiting for the response from `/api/v3/login` seems to be a good idea, so I will modify the code accordingly.
## Solution
Cypress has an API called [intercept](https://docs.cypress.io/api/commands/intercept). This API allows spying and stubbing on HTTP requests made within an application.
Since I only need to spy in this case, the code follows the documentation. The first argument is the method name and the second argument is the URL. You can create an alias using `as`. If you pass the alias you created to `cy.wait`, it will wait for `/api/v3/login` to complete.
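In text form, the spy-and-wait pattern described above might look like the sketch below. It runs inside a Cypress spec, not standalone; `.btnSubmitForLogin` and `/api/v3/login` come from this post, the rest is illustrative:

```
// Spy on the login request and give it an alias.
cy.intercept('POST', '/api/v3/login').as('login');
// Submit the login form.
cy.get('.btnSubmitForLogin').click();
// Wait for the response so session data is saved before navigating further.
cy.wait('@login');
```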

I put the above code into the custom command causing the error and it worked correctly.

---
## About Us💡
In addition, I want to introduce a little more about GROWI, an open software developed by us **WESEEK, Inc**.
**GROWI** is a wiki service with features-rich support for efficient information storage within the company. It also boasts high security and various authentication methods are available to simplify authentication management, including **LDAP/OAuth/SAML**.
**GROWI** originated in Japan and GROWI OSS is **FREE** for anyone to [download](https://docs.growi.org/en/admin-guide/?utm_source=dev+community&utm_medium=referral&utm_campaign=How_to_Target_XHR_Errors_in_Cypress) and use **in English**.
For more information, go to [GROWI.org](https://growi.org/en/?utm_source=dev+community&utm_medium=referral&utm_campaign=How_to_Target_XHR_Errors_in_Cypress) to learn more about us. You can also follow our [Facebook](https://www.facebook.com/people/GROWIcloud/100089272547238/) to see updates about our service.
 | weseek-inc |
1,862,894 | SQL Complete Beginner Course 2024 | Master SQL with the SQL Complete Beginner Course 2024 Are you eager to master SQL and dive... | 0 | 2024-05-23T13:58:32 | https://dev.to/shubhadip_bhowmik/sql-complete-beginner-course-2024-2omg | sql, mysql, dbms, database | ### Master SQL with the SQL Complete Beginner Course 2024
Are you eager to master SQL and dive into the world of databases? Look no further! I’m excited to announce the release of my latest YouTube video, "SQL Complete Beginner Course 2024." This comprehensive course is designed specifically for beginners, making it the perfect starting point for anyone looking to understand and use SQL effectively.
{% youtube PRtvv7t2iqA %}
#### Why Learn SQL?
SQL (Structured Query Language) is the standard language for managing and manipulating databases. Whether you're aiming for a career in data science, software development, or any field that requires data management, SQL is an essential skill. Here’s why you should consider learning SQL:
1. **High Demand:** SQL is one of the most sought-after skills in the tech industry.
2. **Versatility:** It’s used in various applications, from web development to business intelligence.
3. **Data Management:** Helps in organizing, retrieving, and analyzing data efficiently.
#### Course Overview
The "SQL Complete Beginner Course 2024" is structured to take you from a complete novice to a confident SQL user. Here’s what you can expect:
1. **Introduction to Databases:** Understanding what databases are and why they are essential.
2. **Basic SQL Commands:** Learn how to create, read, update, and delete data (CRUD operations).
3. **Advanced Queries:** Dive into more complex queries, joins, subqueries, and set operations.
4. **Real-World Examples:** Apply your knowledge with practical examples and exercises.
5. **Best Practices:** Learn tips and techniques for writing efficient and maintainable SQL code.
#### Key Features
- **Step-by-Step Learning:** Each section builds on the previous one, ensuring a smooth learning curve.
- **Hands-On Practice:** Interactive exercises and real-world examples to solidify your understanding.
- **Comprehensive Coverage:** From basic commands to advanced topics, everything you need to know is covered.
- **Free Resources:** Accompanying resources such as cheat sheets and SQL script files are provided.
#### Get Started Today!
The "SQL Complete Beginner Course 2024" is now available on YouTube for free. I believe in providing accessible education to everyone, and this course is designed with that philosophy in mind. Whether you’re a student, a professional looking to upskill, or simply someone curious about databases, this course is for you.
To watch the course, simply visit [SQL Complete Beginner Course 2024 on YouTube](https://youtu.be/PRtvv7t2iqA). Don't forget to subscribe to my channel for more tutorials and updates!
---
Thank you for reading and happy coding! 🚀

Feel free to leave comments and feedback on the video. Your input helps me improve and provide better content for future learners.
Happy learning, and I look forward to seeing you master SQL! | shubhadip_bhowmik |
1,862,893 | TASK - 14 | Q-1-> What is the difference between automation and manual testing in software... | 0 | 2024-05-23T13:58:11 | https://dev.to/hariprasath03/task-14-1hhl | **Q-1-> What is the difference between automation and manual testing in software development?**
**<u>Answer</u>**
**Automation Testing**
- In automation testing, the test cases are executed by automated tools.
- Automation testing is much faster than manual testing.
- It requires automation tools and trained employees who have coding knowledge.
- It is more reliable than manual testing because the tests are executed by tools.
- Sometimes automation testing needs more investment than manual testing because of the cost of automation tools.
- Automation testing uses frameworks like data-driven, TestNG, Cucumber, etc.
- It is suitable for regression testing, load testing, and performance testing.
**Manual Testing**
- In manual testing, the test cases are executed by human resources.
- Manual testing consumes more time.
- It requires only human resources to execute the test cases.
- It is less reliable than automation testing because humans might make mistakes during continuous work.
- It also needs investment to do testing.
- Manual testing doesn't need frameworks.
- It is suitable for exploratory testing, usability testing, and ad-hoc testing.
================================================
**Q-2-> Explore some of the most common automation testing tool available on the market?**
**<u>Answer</u>**
There are many automation tools are running in the market. They are
- Selenium
- Appium
- TestComplete
- Katalon Studio
- Cypress
- Ranorex Studio
- Perfecto
- LambdaTest
- Postman
- SoapUI
- Tricentis Tosca
- Apache JMeter
- Robot Framework
- Applitools
<u>**Selenium**</u>
Selenium is still the number one choice among automation testers for web applications. It is open source and consists of Selenium WebDriver, Selenium Grid, and Selenium IDE.
**Important Features:**
- It supports all programming languages like java, javascript, python, c, c++ etc.
- It supports all browser as well so we can perform cross-browser testing.
- Integration with other frameworks like Testng, Cucumber, Junit etc.
<u>**Katalon Studio**</u>
Katalon Studio is a low-code, scalable automation testing tool for web, mobile, API, and desktop applications. It is available in both free and paid versions.
**Important Features:**
- Automatic retry failed tests, smart wait, and self-healing mechanisms.
- Flexible methods for test design: record & playback, manual, and scripting mode.
- Smart debugging UI and test reporting to troubleshoot failures quickly.
**<u>Postman</u>**
Postman is one of the most widely used automation testing tools for APIs. It allows users to write different kinds of tests, from functional and integration to regression tests, and execute them automatically in CI/CD pipelines via the command line.
**Important Features:**
- Friendly and easy-to-use interface equipped with code snippets.
- Support for multiple HTML methods, Swagger, and RAML formats.
- Test suite creation, executions with parameterization, and debugging.
==================================================
**Q-3-> What is the Cross Browser Testing?**
**<u>Answer</u>**
Cross-browser testing is a type of compatibility testing that comes under non-functional testing. The aim of this testing is to ensure the application or software works as intended when accessed through any browser.
**Different Browser:**
Using different browsers like Google Chrome, Firefox, Microsoft Edge, etc., we should ensure the software works properly in most browsers.
**Different OS Combination:**
On desktop, there are many different operating systems available, so we have to test the software on most desktop OSes.
**Different Device:**
In the current market, a huge variety of mobile devices compete with each other, so the software should work on most devices. That is why we do cross-browser testing.
================================================
**Q-4-> Write a Blog on TDD and BDD?**
**<u>Answer</u>**
<u>**Test Driven Development**</u>
Test-Driven Development is a testing methodology, or a programming practice, implemented from the developer's perspective. A developer writes an automated test case based on the requirements specified in the documents.

These tests are executed, and in some cases they fail, as they are developed before the actual feature. The development team then refactors the code until the test passes. TDD can be done by a single developer, writing both tests and application code side by side to complete a feature.
**Benefits of TDD:**
- Reduces the amount of time required for rework.
- Explores the bugs or errors very quickly.
- Encourages the development of cleaner and better design.
- Results in the creation of extensive code that is flexible and easy to maintain.
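The red-green-refactor loop described above can be sketched with Python's built-in `unittest`. The `slugify` function and its test are invented purely for illustration:

```python
import unittest

# Red: the test below was written first and failed, because slugify
# did not exist yet.
# Green: this minimal implementation makes the test pass.
def slugify(title):
    """Lowercase a title and join its words with hyphens."""
    return "-".join(title.lower().split())

class TestSlugify(unittest.TestCase):
    def test_spaces_become_hyphens(self):
        self.assertEqual(slugify("Deep Dive In Rails"), "deep-dive-in-rails")

    def test_already_lowercase(self):
        self.assertEqual(slugify("hello"), "hello")

# Refactor: with the tests green, the implementation can be cleaned up safely.
result = unittest.TextTestRunner().run(
    unittest.defaultTestLoader.loadTestsFromTestCase(TestSlugify))
print(result.wasSuccessful())  # True
```

In a real TDD cycle, the test file would be run (and fail) before any implementation code is written.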
**<u>Behavior-Driven Development</u>**

Behavior-Driven Development is a testing approach derived from the Test-Driven Development (TDD) methodology. In BDD, tests are mainly based on the system's behavior. This approach defines various ways to develop a feature based on its behavior.
**Benefits of BDD:**
- Helps to reach a wider audience through the non-technical language.
- Focuses on how the system should behave from the customer's and the developer's perspective.
- It is cost effective technique.
- Reduces the effort needed to verify any post-development defects.
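BDD scenarios are typically written in a plain-language format (such as Gherkin) that both technical and non-technical stakeholders can read. The feature and step names below are invented for illustration:

```gherkin
Feature: Student login
  Scenario: Registered user logs in successfully
    Given a registered user "alice" with password "secret"
    When she submits the login form
    Then she should see her dashboard
```

Each step is then bound to automation code by a tool like Cucumber, keeping the behavior description readable for the whole team.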
| hariprasath03 | |
1,862,891 | Georgelis, Larsen & Sabatino Injury Law Firm, P.C. | Guided by integrity, hard work and a thorough knowledge of the law and court system, we provide... | 0 | 2024-05-23T13:54:39 | https://dev.to/tony_georgelismarketing_/georgelis-larsen-sabatino-injury-law-firm-pc-3jbb | Guided by integrity, hard work and a thorough knowledge of the law and court system, we provide powerful and effective representation to accident victims who are injured because of someone else’s negligence and recklessness. Our Lancaster PA Law Firm has been voted the #1 Personal Injury Law Firm year-after-year. With deep ties in the Lancaster County community, this is something we are very proud of.
**Phone No:** 7173943004
**Address:** 2168 Embassy Dr, Lancaster, PA 17603, United States
**Web URL:** https://www.georgelislaw.com/ | tony_georgelismarketing_ | |
1,862,889 | Technical Documentation: Student Attendance Tracking with API Integration for Parental Notifications | Overview This project is designed to track student attendance and integrate with an API to... | 0 | 2024-05-23T13:48:34 | https://dev.to/emmanuelj/technical-documentation-student-attendance-tracking-with-api-integration-for-parental-notifications-1a2 | webdev, programming, python | #### Overview
This project is designed to track student attendance and integrate with an API to notify parents or guardians about their child's absence from school. The system leverages Python, with SQLite for database management, Flask for the web interface, and the Twilio API for SMS notifications. The primary goal is to ensure that parents are promptly informed about their child's attendance status, enhancing communication and accountability.
#### Prerequisite
- **Python**: The main programming language used for the entire project.
- **SQLite**: A lightweight, serverless database engine to store student and attendance records.
- **Flask**: A micro web framework for building the web interface.
- **Twilio API**: A communication API used to send SMS notifications to parents/guardians.
#### Project Structure
1. **Database Design**:
- **Students Table**: Stores student details including ID, name, class, and parent's contact information.
- **Attendance Table**: Records daily attendance status for each student.
2. **Web Interface**:
- Built using Flask to manage student data and mark attendance.
- HTML templates for user interaction.
3. **API Integration**:
- Twilio API for sending SMS notifications to parents/guardians.
4. **Business Logic**:
- Functions for managing database operations.
- Logic to determine absent students and trigger notifications.
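The "determine absent students" logic can be sketched as a single join over the two tables defined in the next section. The `absent_students` helper below is hypothetical (not part of the tutorial's files), written against that schema and demonstrated on an in-memory copy:

```python
import sqlite3

def absent_students(conn, date):
    """Return (name, parent_contact) pairs for students marked absent on `date`."""
    return conn.execute(
        "SELECT s.name, s.parent_contact FROM students s "
        "JOIN attendance a ON a.student_id = s.id "
        "WHERE a.date = ? AND a.status = 'Absent'", (date,)).fetchall()

# Quick demonstration against an in-memory copy of the schema.
conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE students (id INTEGER PRIMARY KEY, name TEXT, '
             'class TEXT, parent_contact TEXT)')
conn.execute('CREATE TABLE attendance (id INTEGER PRIMARY KEY, '
             'student_id INTEGER, date TEXT, status TEXT)')
conn.execute("INSERT INTO students VALUES (1, 'Ada', 'JSS1', '+15550001111')")
conn.execute("INSERT INTO attendance VALUES (1, 1, '2024-05-23', 'Absent')")
print(absent_students(conn, '2024-05-23'))  # [('Ada', '+15550001111')]
```

A function like this could drive the notification step for a whole day's register, rather than notifying one student at a time.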
#### Implementation Steps
##### 1. Setting Up the Project
Initialize the project directory and create a virtual environment:
```sh
mkdir student_attendance
cd student_attendance
python3 -m venv venv
source venv/bin/activate
pip install flask twilio pandas
```
##### 2. Database Setup
Create an SQLite database and define the schema for students and attendance records.
```python
import sqlite3
def create_database():
conn = sqlite3.connect('attendance.db')
cursor = conn.cursor()
cursor.execute('''
CREATE TABLE IF NOT EXISTS students (
id INTEGER PRIMARY KEY,
name TEXT NOT NULL,
class TEXT NOT NULL,
parent_contact TEXT NOT NULL
)
''')
cursor.execute('''
CREATE TABLE IF NOT EXISTS attendance (
id INTEGER PRIMARY KEY,
student_id INTEGER,
date TEXT,
status TEXT,
FOREIGN KEY(student_id) REFERENCES students(id)
)
''')
conn.commit()
conn.close()
create_database()
```
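Since `sqlite3` ships with Python, the schema logic above can be sanity-checked against a throwaway in-memory database. This quick check is an addition for illustration, not one of the tutorial's files:

```python
import sqlite3

# Same DDL as create_database(), but against an in-memory database
# so no attendance.db file is touched.
conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE students (id INTEGER PRIMARY KEY, name TEXT NOT NULL, '
             'class TEXT NOT NULL, parent_contact TEXT NOT NULL)')
conn.execute('CREATE TABLE attendance (id INTEGER PRIMARY KEY, student_id INTEGER, '
             'date TEXT, status TEXT, '
             'FOREIGN KEY(student_id) REFERENCES students(id))')

tables = sorted(row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table'"))
print(tables)  # ['attendance', 'students']
```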
##### 3. Web Interface with Flask
Develop the Flask application to manage student records and mark attendance.
```python
from flask import Flask, request, render_template
import sqlite3
app = Flask(__name__)
def query_db(query, args=(), one=False):
    conn = sqlite3.connect('attendance.db')
    cursor = conn.cursor()
    cursor.execute(query, args)
    rv = cursor.fetchall()
    conn.commit()  # commit so the INSERT statements below actually persist
    conn.close()
    return (rv[0] if rv else None) if one else rv
@app.route('/')
def index():
students = query_db('SELECT * FROM students')
return render_template('index.html', students=students)
@app.route('/add_student', methods=['POST'])
def add_student():
name = request.form['name']
class_name = request.form['class']
parent_contact = request.form['parent_contact']
query_db('INSERT INTO students (name, class, parent_contact) VALUES (?, ?, ?)', (name, class_name, parent_contact))
return 'Student added!'
@app.route('/mark_attendance', methods=['POST'])
def mark_attendance():
student_id = request.form['student_id']
date = request.form['date']
status = request.form['status']
query_db('INSERT INTO attendance (student_id, date, status) VALUES (?, ?, ?)', (student_id, date, status))
if status.lower() == "absent":
notify_parent(student_id)
return 'Attendance marked!'
if __name__ == '__main__':
app.run(debug=True)
```
Create the HTML template (`templates/index.html`):
```html
<!doctype html>
<html lang="en">
<head>
<meta charset="utf-8">
<title>Student Attendance</title>
</head>
<body>
<h1>Student Attendance</h1>
<form method="post" action="/add_student">
<input type="text" name="name" placeholder="Student Name" required>
<input type="text" name="class" placeholder="Class" required>
<input type="text" name="parent_contact" placeholder="Parent Contact" required>
<button type="submit">Add Student</button>
</form>
<h2>Mark Attendance</h2>
<form method="post" action="/mark_attendance">
<input type="number" name="student_id" placeholder="Student ID" required>
<input type="date" name="date" required>
<select name="status">
<option value="Present">Present</option>
<option value="Absent">Absent</option>
</select>
<button type="submit">Mark Attendance</button>
</form>
</body>
</html>
```
##### 4. Twilio API Integration
Install and set up the Twilio client to send SMS notifications.
```sh
pip install twilio
```
Integrate Twilio into your Flask application.
```python
from twilio.rest import Client
# Twilio configuration
account_sid = 'your_account_sid'
auth_token = 'your_auth_token'
twilio_number = 'your_twilio_number'
client = Client(account_sid, auth_token)
def notify_parent(student_id):
conn = sqlite3.connect('attendance.db')
cursor = conn.cursor()
cursor.execute('SELECT name, parent_contact FROM students WHERE id = ?', (student_id,))
student = cursor.fetchone()
conn.close()
if student:
name, parent_contact = student
message = client.messages.create(
body=f"Dear Parent, your child {name} was absent today.",
from_=twilio_number,
to=parent_contact
)
print(f"Notification sent to {parent_contact}")
```
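A quick aside on the hardcoded credentials above: for anything beyond a local demo it is safer to read them from environment variables. The variable names in this sketch (`TWILIO_SID`, `TWILIO_TOKEN`, `TWILIO_NUMBER`) are my own choice, not something Twilio mandates:

```python
import os

# Load Twilio credentials from the environment instead of hardcoding them.
# The placeholder fallbacks keep the demo runnable when the variables are unset.
account_sid = os.environ.get('TWILIO_SID', 'your_account_sid')
auth_token = os.environ.get('TWILIO_TOKEN', 'your_auth_token')
twilio_number = os.environ.get('TWILIO_NUMBER', 'your_twilio_number')
```

This keeps real credentials out of version control; you can export the variables in your shell or load them from a `.env` file at startup.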
##### 5. Business Logic
Add logic to manage the database and handle attendance recording and notifications.
```python
def add_student(name, class_name, parent_contact):
conn = sqlite3.connect('attendance.db')
cursor = conn.cursor()
cursor.execute('INSERT INTO students (name, class, parent_contact) VALUES (?, ?, ?)', (name, class_name, parent_contact))
conn.commit()
conn.close()
def mark_attendance(student_id, date, status):
conn = sqlite3.connect('attendance.db')
cursor = conn.cursor()
cursor.execute('INSERT INTO attendance (student_id, date, status) VALUES (?, ?, ?)', (student_id, date, status))
conn.commit()
conn.close()
if status.lower() == "absent":
notify_parent(student_id)
```
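The "logic to determine absent students" mentioned in the overview boils down to a single join over the two tables. The sketch below is self-contained, using an in-memory database and made-up rows so it runs standalone; the real app would connect to `attendance.db` instead:

```python
import sqlite3

def get_absent_students(conn, date):
    # Join attendance with students to get names and contacts of absentees.
    cursor = conn.cursor()
    cursor.execute('''
        SELECT s.id, s.name, s.parent_contact
        FROM students s
        JOIN attendance a ON a.student_id = s.id
        WHERE a.date = ? AND a.status = 'Absent'
    ''', (date,))
    return cursor.fetchall()

# Demo data, entirely made up for illustration.
conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE students (id INTEGER PRIMARY KEY, name TEXT, class TEXT, parent_contact TEXT)')
conn.execute('CREATE TABLE attendance (id INTEGER PRIMARY KEY, student_id INTEGER, date TEXT, status TEXT)')
conn.execute("INSERT INTO students VALUES (1, 'Ada', '5A', '+10000000000')")
conn.execute("INSERT INTO attendance (student_id, date, status) VALUES (1, '2024-05-23', 'Absent')")
print(get_absent_students(conn, '2024-05-23'))  # [(1, 'Ada', '+10000000000')]
```

A daily report could iterate over these rows and call `notify_parent` for each student id.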
### Discussion and Opinion
#### Benefits
1. **Enhanced Communication**: The integration of SMS notifications ensures that parents or guardians are promptly informed about their child's attendance status. This real-time communication can help parents take immediate action if necessary.
2. **Accountability**: By automating the notification process, the system ensures that no absences go unnoticed. This can significantly improve student accountability and attendance rates.
3. **Scalability**: The use of a web-based interface and an API for notifications means the system can be easily scaled to accommodate a larger number of students and additional functionalities in the future.
4. **Ease of Use**: The simple web interface built with Flask allows for easy interaction, making it accessible for school administrators without requiring extensive technical knowledge.
#### Potential Challenges
1. **Data Privacy**: Handling and storing personal information, especially contact details, requires strict adherence to data privacy regulations. Ensuring secure storage and transmission of data is crucial.
2. **Reliability**: The system's reliance on external APIs like Twilio means that its functionality depends on these services' availability and reliability. Any downtime or issues with the API can disrupt notifications.
3. **Initial Setup**: The project involves several steps, including configuring the database, setting up the Flask application, and integrating with the Twilio API. This initial setup might require technical expertise.
#### In Summary
The project to track student attendance with API integration for parental notifications provides a robust solution to enhance school-parent communication and student accountability. By leveraging modern web technologies and communication APIs, the system offers a scalable and efficient way to manage attendance and ensure that parents are always informed about their child's attendance status.
However, it is essential to address potential data privacy challenges and external services' reliability. With careful planning and implementation, this project can significantly benefit educational institutions by improving attendance tracking and fostering better communication between schools and parents. | emmanuelj |
1,862,888 | Swift 101: Basic Operators | Hola Mundo! Welcome to a new article in a series of Swift 101 notes 📝 I did while... | 27,019 | 2024-05-23T13:48:07 | https://dev.to/silviaespanagil/swift-101-basic-operators-a59 | learning, swift, mobile, beginners | # Hola Mundo!
Welcome to a new article in a series of [Swift 101 notes](https://dev.to/silviaespanagil/swift-101-getting-into-ios-development-gji) 📝 I took while learning the language and decided to share them, because why not?
If you're new to Swift or interested in learning more about this language, I invite you to follow my series!🙊
In this chapter, I'll be sharing a little bit about `Basic Operators in Swift`.
___
In the [last chapter](https://dev.to/silviaespanagil/swift-101-understanding-types-variables-and-constants-l9h), we discussed constants and variables, which store values for us to use in our code.
When working with these values, we often need them to interact with each other. So, how do we accomplish this?
✨ Enter operators✨
Operators are special symbols that enable us to perform various operations on our variables. These operations can assign or change the value of a variable, allow us to compare one variable with another, or apply logic in our code.
___
### Arithmetic operators ➕
Even if the "Arithmetic" name sounds kinda scary, these operators are pretty simple and easy to understand.
Arithmetic operators are basically symbols that allow us to create mathematical calculations, additions, subtractions, multiplications, and divisions.
| Operation | Action | Example | Result |
| --- | --- | --- | --- |
| Addition + | Addition between two values | var addition = 34 + 4 | print(addition) Output: 38 |
| Subtraction - | Subtract operation between values | var subtraction = 34 - 4 | print(subtraction) Output: 30 |
| Multiplication * | Multiplication of values | var multiplication = 30 * 2 | print(multiplication) Output: 60 |
| Division / | Division calculation | var division = 30 / 2 | print(division) Output: 15 |
And yes, the logic says that this kind of operator only works with numeric types. However, Swift also allows us to do an addition operation with String-type variables. This is called **string concatenation** and it will put together in order different strings.
```swift
var magicSpell = "Accio 🪄 "
var object = "Nimbus 2000"
var completeSpell = magicSpell + object
print(completeSpell)
// Output: "Accio 🪄 Nimbus 2000"
```
Please notice that ✨this is not the only way✨ to put two strings together but as this is a `Basic Operators` article we will not get in-depth with other ways to do so.
___
### Modulo operator
Modulo or remainder operator `%` returns the value that is left after you divide two numbers.
```swift
var moduleA = 10 % 3
var moduleB = 10 % 5
print(moduleA) // 1
print(moduleB) // 0
```
Understanding modulo. If we divide `10 / 3` the quotient is `3` with a remainder of `1`. This means that if we do `10 % 3`, the result is `1`.
Similarly, if we divide `10 / 5` the quotient is `2` which is already a whole number, meaning that there is no remainder from the operation. Therefore, 10 % 5 results in 0.
For me, this was one of the hardest operators to understand🤦🏽♀️, so I wrote down an example that helped me:
Imagine we have ten apples 🍎🍎🍎🍎🍎🍎🍎🍎🍎🍎 and there are three very hungry people, so we want to share the apples between all of us.
🧑🎤 - 🍎🍎🍎
👨🎤 - 🍎🍎🍎
👩🎤 - 🍎🍎🍎
If we want all of us to have the same amount of apples, we will each eat three apples. And there will be an extra apple 🍎 all lonely without a human to eat it. Well, that lonely apple is the modulo of the original ten! It is the remainder of `10 % 3`
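In code, the apples example looks like this:

```swift
let apples = 10
let people = 3

let applesPerPerson = apples / people // 3, integer division drops the remainder
let lonelyApple = apples % people // 1, the leftover apple

print(applesPerPerson) // Output: 3
print(lonelyApple) // Output: 1
```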
___
### Assignment and compound operators
This kind of operator will assign a value to a variable or constant.
The mother of the assignment operators is the **equal operator** 🟰. We have already used it in the previous chapter; it is the one that tells Swift that a value should be assigned to a given variable or constant.
```swift
var variableName: <Variable Type> = <Initial value>
var anotherVariable: <Variable Type>
```
However, the equal operator can be mixed with the arithmetic operators so that it does a mathematical calculation and assigns the value at the same time.
#### Equal `=`
The value from the right will be assigned to the one on the left.
```swift
var a = 10
var b = 100
a = b
print(a) // Output: 100
```
#### Addition `+=`
The value from the right will be added and assigned to the one on the left.
```swift
var a = 10
var b = 5
a += b
print(a) // Output: 15
```
#### Subtraction `-=`
The value from the right will be subtracted and assigned to the one on the left.
```swift
var a = 10
var b = 5
a -= b
print(a) // Output: 5
```
#### Multiplication `*=`
The value from the right will be multiplied and assigned to the one on the left.
```swift
var a = 10
var b = 5
a *= b
print(a) // Output: 50
```
#### Division `/=`
The value from the right will be divided and assigned to the one on the left.
```swift
var a = 10
var b = 5
a /= b
print(a) // Output: 2
```
#### Modulo `%=`
It will calculate the modulo and assign it to the variable on the left.
```swift
var a = 10
var b = 5
a %= b
print(a) // Output: 0
```
___
### Comparison operators
These operators allow us, as the name states, to compare the value from the left side to the one on the right to check if the condition is true or false.
#### Equal to `==`
The value on the right is equal to the one on the left.
```swift
let a = 10
let b = 100
let c = (a == b)
print(c) // Output: false
```
#### Not equal to `!=`
The value on the right is **not equal** to the one on the left.
```swift
let a = 10
let b = 100
let c = (a != b)
print(c) // Output: true
```
#### Greater than `>`
The value on the left is **greater** than the one on the right.
```swift
let a = 10
let b = 100
let c = (a > b)
print(c) // Output: false
```
#### Less than `<`
The value on the left is **smaller** than the one on the right.
```swift
let a = 10
let b = 100
let c = (a < b)
print(c) // Output: true
```
#### Greater or equal to `>=`
The value on the left is **greater than or equal to** the one on the right.
```swift
let a = 100
let b = 100
let c = (a >= b)
print(c) // Output: true
```
#### Less or equal to `<=`
The value on the left is **smaller than or equal to** the one on the right.
```swift
let a = 100
let b = 10
let c = (a <= b)
print(c) // Output: false
```
___
### Logical operators
Logical operators allow us to check or set some rules based on logic. They compare **two different conditions** and tell us if the comparison between them is true or false.
Logical operators are particularly useful for making decisions in our code, as they enable more complex condition checking.
There are three Logical Operators:
- **And `&&`**: Check if both conditions are true → This **and** that are true
```swift
let a = 10
let b = 10
let c = ((a <= b) && (a == b))
print(c) // Output: true
```
In the example, we check if `a`(10) is less than or equal to `b`(10) **and** if `a`(10) is the same as `b`(10). As both conditions are true, the output is `true`.
- **Or `||`**: Check if at least one of the conditions are true → This **or** that are true
```swift
let a = 10
let b = 100
let c = ((a <= b) || (a == b))
print(c) // Output: true
```
In the example, we check if `a`(10) is less than or equal to `b`(100) **or** if `a`(10) is the same as `b`(100). In this case, the first condition is true, but the second is false. Since we only need one of them to be true, the output is `true`.
- **Not `!`**: Returns true if the operand is false and vice versa
```swift
print(!true) // Output: false
```
In the example, we print the negation of `true`, which is `false`.
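To see why these operators are useful together, imagine a (completely made-up) cinema app that decides whether someone can watch a movie by combining two conditions:

```swift
let age = 20
let hasTicket = true

let canWatchMovie = (age >= 18) && hasTicket
print(canWatchMovie) // Output: true
```

Swapping `&&` for `||` would instead let anyone in who meets at least one of the conditions.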
___
### Want to keep learning about Swift?
So far, we've looked at what Swift is, explored types, variables, and constants, and now we've seen how basic operators can help us perform calculations, compare values, and make decisions in our code. But... there's more to come! 👩🏽💻
This is a full series on [Swift 101](https://dev.to/silviaespanagil/swift-101-getting-into-ios-development-gji). The next chapter will be the first part of Collections and Collection Types, so I hope to see you there 🫶!
If you enjoyed this, please share, like, and comment. I hope this can be useful to someone and that it will inspire more people to learn and code with Swift. | silviaespanagil |
1,861,332 | Financial Services: Navigating the Digital Frontier | The financial services industry is at a pivotal juncture, driven by rapid technological advancements... | 0 | 2024-05-23T13:45:00 | https://dev.to/brainboard/financial-services-navigating-the-digital-frontier-4bdd | finops, infrastructureascode, digitalworkplace, devops | The financial services industry is at a pivotal juncture, driven by rapid technological advancements and increasing regulatory complexities. Digital transformation is not just beneficial; it's imperative for survival. Financial institutions must innovate to enhance efficiency, secure customer data, and remain competitive.
> "Innovation distinguishes between a leader and a follower." - Steve Jobs
## **How Brainboard Accelerates Digital Transformation in Financial Services**
### **Seamless Cloud Integration**
Brainboard simplifies the migration and management of financial services to the cloud, ensuring that institutions can scale resources dynamically, manage costs effectively, and enhance service delivery without compromising security.
### **Enhanced Compliance and Security**
With features designed to automate and visualize infrastructure deployments, Brainboard helps financial institutions comply with stringent regulations by maintaining an accurate, real-time view of their cloud environments, thus mitigating risks associated with data breaches and cyber threats.
### **Streamlined Operations**
Brainboard's drag-and-drop interface allows for quick setup and deployment of infrastructure, significantly reducing the manual effort required. This enables IT teams to focus on innovation and strategic initiatives rather than routine tasks.
### **Cost Management and Optimization**
By providing tools for detailed cost analysis and projections before deployment, Brainboard helps financial organizations manage their budgets more effectively, ensuring that every dollar spent on cloud infrastructure is fully optimized.
## **Features and Use Cases**
### **Visual Infrastructure Design**

Use Case: Financial analysts and IT teams can collaboratively design and visualize their entire cloud architecture, ensuring all stakeholders understand the workflows and can contribute to the infrastructure planning process.
### **Automatic Code Generation**

Use Case: Convert visual designs directly into executable Terraform code, speeding up the deployment process and reducing the potential for human error, crucial for maintaining operational integrity in financial services.
### **Comprehensive Integration Capabilities**

Use Case: Seamlessly integrates with existing tools like GitHub and Jenkins, facilitating continuous integration and continuous deployment (CI/CD) pipelines that are essential for the fast-paced financial services market.
By adopting [Brainboard](https://app.brainboard.co/)'s solutions, financial institutions can lead in efficiency and security, turning challenges into opportunities for growth and innovation.
| miketysonofthecloud |
1,862,885 | Ready to revolutionise the software development process? | Join us on 11 June for an online event – Code-Less Creations Meetup 💡 The meetup will... | 0 | 2024-05-23T13:44:35 | https://dev.to/daryna_soia/ready-to-revolutionise-the-software-development-process-32fd | lowcode, nocode, meetup, programming | ## Join us on 11 June for an online event – Code-Less Creations Meetup 💡
The meetup will immerse you in Low/No Code practices, making it accessible and useful for everyone from experienced developers to business professionals:
🔸 trends and innovations in Low/No Code development.
🔸 practical cases from industry experts.
🔸 analysis of modern tools and platforms.
**Speakers:**
🎙 [Marcin Szlachcic](https://www.linkedin.com/in/marcin-szlachcic-a28104226/), Implementation Specialist at Archman.
Topic: ‘Automation – Just a Catchy Trend or Money-making Machine’.
🎙 [Oleksandr Skachkov](https://www.linkedin.com/in/alexskachkov/), Head of Technology Consulting at Itera.
Topic: ‘Current State of Low Code or No Code Platforms’.
Invited moderator: [Yana Mikhailenko](https://www.linkedin.com/in/yanamykhailenko/), Director of Engineering at Turnitin.
🗓 11 June at 17:00 (CET)
📍 Online broadcast
💬 Language: English
**Registration is required [by the link](https://forms.gle/YXxZbfvG7gPF63hg7).** | daryna_soia |
1,862,884 | Exhibition Stand Contractor Geneva | Going to exhibit in Geneva? Contact us and get 5 best proposals from exhibition booth builders in... | 0 | 2024-05-23T13:40:41 | https://dev.to/expostandzoness/exhibition-stand-contractor-geneva-3i50 | Going to exhibit in Geneva? Contact us and get 5 best proposals from [exhibition booth builders in Geneva](https://www.expostandzone.com/exhibition-stands/switzerland/geneva).
| expostandzoness | |
1,861,076 | Game Development Diary #3 : Still GameDev.tv Course | 23/05/2024 - Thursday Day Three Today's Progress: -> Make the level out of... | 27,527 | 2024-05-23T13:34:36 | https://dev.to/hizrawandwioka/game-development-diary-3-50n8 | godot, gamedev, game, newbie | 23/05/2024 - Thursday
# Day Three
## Today's Progress:
-> Make the level out of CSGShapes to give a basic environment to work with.

You can choose a color for the material using this tool

-> Add the player to the blockout level and finish setting up the camera and environment.

-> Convert the player into a RigidBody3D and control it with forces.
A RigidBody3D is a 3D physics body that is moved by a physics simulation.
```
extends Node3D

# Called every frame. 'delta' is the elapsed time since the previous frame.
func _process(delta: float) -> void:
	if Input.is_action_pressed("ui_accept"):
		# Move the character position upward
		position.y += delta
	if Input.is_action_pressed("ui_left"):
		rotate_z(delta)
	if Input.is_action_pressed("ui_right"):
		rotate_z(-delta)
```
this is the previous code
```
extends RigidBody3D

# Called every frame. 'delta' is the elapsed time since the previous frame.
func _process(delta: float) -> void:
	if Input.is_action_pressed("ui_accept"):
		# Push the body along its local up axis, fighting gravity
		apply_central_force(basis.y * delta * 1000.0)
	if Input.is_action_pressed("ui_right"):
		apply_torque(Vector3(0.0, 0.0, 100.0 * delta))
	if Input.is_action_pressed("ui_left"):
		apply_torque(Vector3(0.0, 0.0, -100.0 * delta))
```
this is the new code
This code makes the character move according to the user input. Instead of transforming the position manually like the previous code, we use `apply_central_force` to push the character against gravity and `apply_torque` to rotate it.
-> Remap some of the inputs to more intuitive controls.
In the Project Settings there is an Input Map tab; this menu allows us to set all the inputs we need for the game.



-> Detect RigidBody collisions with signals and identify bodies with Groups.
Signals let game objects communicate with each other.

## Resource:
Complete Godot 3D: Code Your Own 3D Games In Godot 4! - GameDevTv Courses.
## Next Steps:
-> Learn about tween and implement it in the tutorial's game
-> Adding audio
-> Controlling Audio with script
-> Learn about particle and implement it
| hizrawandwioka |
1,862,865 | Function Calling Agent using OpenAI Assistant | What is “function calling” in LLMs? “Function calling” in LLMs refer to its capability to... | 0 | 2024-05-23T13:34:20 | https://dev.to/dheerajgopi/function-calling-agent-using-openai-assistant-2g6m | llm, openai, ai, python | ## What is “function calling” in LLMs?
“Function calling” in LLMs refers to an LLM's capability to accept a list of user-defined functions (aka tools) and to intelligently choose which “tool” to use based on the prompt provided by the user.
For example, let’s say we have a function `get_weather(location: str)` which provides the current weather based on the location argument. If we pass the get_weather function to the LLM as a tool and ask `“What’s the current weather in Budapest?”`, the LLM can decide to use the get_weather tool instead of responding with incorrect data. In this case, the LLM will simply return the tool name, which is get_weather, and the arguments to be passed, which here are `location = "Budapest"`. Now, we can call the get_weather function using the arguments provided by the LLM (note that the LLM cannot call the function for us; we have to do that ourselves). After executing the function, we pass its return value back to the LLM, and the LLM will provide us with a sensible response.
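To make the weather example concrete, here is a sketch of the tool declaration and the app-side dispatch. The schema follows OpenAI's function-calling format, but the weather data is canned and the function body is purely illustrative:

```python
# A stand-in implementation: returns canned data for illustration only.
def get_weather(location: str) -> str:
    fake_weather = {"Budapest": "18°C, partly cloudy"}
    return fake_weather.get(location, "no data")

# What the LLM sees: a name, a description, and a JSON Schema for arguments.
get_weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Returns the current weather for a location.",
        "parameters": {
            "type": "object",
            "properties": {"location": {"type": "string"}},
            "required": ["location"],
        },
    },
}

# Suppose the LLM replied with name="get_weather", args={"location": "Budapest"}.
# The application (not the LLM) performs the actual call:
dispatch = {"get_weather": get_weather}
result = dispatch["get_weather"](**{"location": "Budapest"})
print(result)  # 18°C, partly cloudy
```

In a real application, `result` would then be sent back to the LLM so it can phrase the final answer.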
## So what is a function calling agent?
The general idea of a function calling agent is that we pass the user query and a list of tools to the LLM, and then call the LLM in a loop until we get the desired response. I know it's a vague description :) So, let’s consider an example for better understanding.
Let’s consider a mathematical query for example, since LLMs are generally bad at math.
**Tools provided to the LLM**
- `add(a: float, b: float)`
- `subtract(a: float, b: float)`
- `multiply(a: float, b: float)`
**Query:**
`calculate sum of 1 and 5 and multiply it with the difference of 6 and 3`
## The Agent Loop
The below image will do a better job in explaining the agent loop than writing a long explanation.

You can see that a “conversation” is going on between the app and the LLM inside the agent loop until the LLM can find the result for the user’s query.
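Stripped of any particular SDK, the loop itself is small. The sketch below swaps the real model for a scripted stand-in so it runs offline; the reply format (`type`, `name`, `args`) is invented for this demo and is not an OpenAI structure:

```python
def run_agent(query, tools, llm_call, max_turns=5):
    messages = [{"role": "user", "content": query}]
    for _ in range(max_turns):
        reply = llm_call(messages)
        if reply["type"] == "final":  # the LLM produced an answer
            return reply["content"]
        # Otherwise the LLM asked for a tool call: the app runs it itself
        result = tools[reply["name"]](**reply["args"])
        messages.append({"role": "tool", "name": reply["name"], "content": str(result)})
    raise RuntimeError("max turns reached")

# Scripted "LLM" replaying the math example above, turn by turn.
scripted = iter([
    {"type": "tool", "name": "add", "args": {"a": 1, "b": 5}},
    {"type": "tool", "name": "subtract", "args": {"a": 6, "b": 3}},
    {"type": "tool", "name": "multiply", "args": {"a": 6, "b": 3}},
    {"type": "final", "content": "18"},
])
tools = {"add": lambda a, b: a + b,
         "subtract": lambda a, b: a - b,
         "multiply": lambda a, b: a * b}

answer = run_agent("calculate sum of 1 and 5 and multiply it with the difference of 6 and 3",
                   tools, lambda msgs: next(scripted))
print(answer)  # 18
```

The OpenAI implementation in the next section follows this same shape, with Threads and Runs carrying the conversation state.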
## Implementation using OpenAI assistant
Below is the complete code for the OpenAI function calling agent. The explanation for each step has been provided as comments in the code.
Before executing this script, install the required libraries using the below command.
`pip install openai python-dotenv`
Also, store your OpenAI API key in a `.env` file (`OPENAI_API_KEY=your-api-key`).
```python
import os
import sys
from dotenv import load_dotenv
import openai
from openai.types.beta import Assistant, Thread
from openai.types.beta.threads import Run
import json
import logging
# Load env variables (OPENAI_API_KEY)
load_dotenv()
# set log handlers
log_handler = logging.StreamHandler(sys.stdout)
log = logging.getLogger(__name__)
log.addHandler(log_handler)
log.setLevel(logging.INFO)
# initialize OpenAI client
client = openai.OpenAI(api_key=os.getenv("OPENAI_API_KEY"))
# Define the functions which will be the part of the LLM toolkit
def add(a: float, b: float) -> float:
return a + b
def subtract(a: float, b: float) -> float:
return a - b
def multiply(a: float, b: float) -> float:
return a * b
tool_callables = {
"add": add,
"subtract": subtract,
"multiply": multiply
}
# declaration of tools (functions) to be passed into the OpenAI assistant
math_tools = [
{
"function": {
"name": "add",
"description": "Returns the sum of two numbers.",
"parameters": {
"type": "object",
"properties": {"a": {"type": "number"}, "b": {"type": "number"}},
"required": ["a", "b"],
},
},
"type": "function",
},
{
"function": {
"name": "subtract",
"description": "Returns the difference of two numbers.",
"parameters": {
"type": "object",
"properties": {"a": {"type": "number"}, "b": {"type": "number"}},
"required": ["a", "b"],
},
},
"type": "function",
},
{
"function": {
"name": "multiply",
"description": "Returns the product of two numbers.",
"parameters": {
"type": "object",
"properties": {"a": {"type": "number"}, "b": {"type": "number"}},
"required": ["a", "b"],
},
},
"type": "function",
}
]
def run_math_agent(query: str, max_turns: int = 5) -> str:
# Initialize the OpenAI assistant. The assistant will have its own unique id.
openai_assistant = client.beta.assistants.create(
model="gpt-4-turbo",
instructions="you are a math tutor, who explains the solutions to math problems",
name="math-tutor",
tools=math_tools
)
# Create a Thread. In OpenAI lingo, a `Thread` can be considered like a conversation thread (not the multithreading one)
# The to and fro communication between the script and the LLM will be stored against this thread id.
thread: Thread = client.beta.threads.create()
# Send the user query as part of the newly created thread
client.beta.threads.messages.create(
thread_id=thread.id, role="user", content=query
)
# The user query is now part of the thread. Now call the LLM (or "run" the thread in OpenAI lingo).
# `create_and_poll` is just a helper method which polls the LLM until a terminal state is reached.
#
# The terminal states are given below:
# "requires_action" - A function call is required. Execute the function and submit the response back.
# The results should be submitted back before the `expires_at` timestamp.
# "completed" - The Run is completed successfully.
# "cancelled" - The run was cancelled (its possible to cancel an in-progress Run).
# "failed" - Failed due to some error.
# "expired" - Run can get expired if we fail to submit function call results before `expires_at` timestamp.
run: Run = client.beta.threads.runs.create_and_poll(
thread_id=thread.id,
assistant_id=openai_assistant.id,
)
# The agent loop. `max_turns` will set a limit on the number of LLM calls made inside the agent loop.
# Its better to set a limit since LLM calls are costly.
for turn in range(max_turns):
# Fetch the last message from the thread
messages = client.beta.threads.messages.list(
thread_id=thread.id,
run_id=run.id,
order="desc",
limit=1,
)
# Check for the terminal state of the Run.
# If state is "completed", exit agent loop and return the LLM response.
if run.status == "completed":
assistant_res: str = next(
(
content.text.value
for content in messages.data[0].content
if content.type == "text"
),
None,
)
return assistant_res
# If state is "requires_action", function calls are required. Execute the functions and send their outputs to the LLM.
if run.status == "requires_action":
func_tool_outputs = []
# LLM can ask for multiple functions to be executed. Execute all function calls in loop and
# append the results into `func_tool_outputs` list.
for tool in run.required_action.submit_tool_outputs.tool_calls:
# parse the arguments required for the function call from the LLM response
args = (
json.loads(tool.function.arguments)
if tool.function.arguments
else {}
)
func_output = tool_callables[tool.function.name](**args)
# OpenAI needs the output of the function call against the tool_call_id
func_tool_outputs.append(
{"tool_call_id": tool.id, "output": str(func_output)}
)
# Submit the function call outputs back to OpenAI
run = client.beta.threads.runs.submit_tool_outputs_and_poll(
thread_id=thread.id, run_id=run.id, tool_outputs=func_tool_outputs
)
# Continue the agent loop.
# Agent will check the output of the function output submission as part of next iteration.
continue
# Handle errors if terminal state is "failed"
else:
if run.status == "failed":
log.error(
f"OpenAIFunctionAgent turn-{turn+1} | Run failure reason: {run.last_error}"
)
raise Exception(
f"Failed to generate text due to: {run.last_error}"
)
# Raise error if turn-limit is reached.
raise MaxTurnsReachedException()
class MaxTurnsReachedException(Exception):
def __init__(self):
super().__init__("Reached maximum number of turns")
if __name__ == "__main__":
log.info(run_math_agent("calculate sum of 1 and 5 and multiply it with difference of 6 and 3"))
```
On executing the above script the LLM will respond with the correct result, which is 18. The LLM will also explain (like a math tutor) each step it took to find the result.
## Shameless plug :)
You can use function calling agents for your application in a much easier way by just adding a python library called [LLMSmith](https://github.com/dheerajgopi/llmsmith) to your list of dependencies. As you might have guessed, I’m the author of that library :).
You can install the library using the following command.
`pip install "llmsmith[openai]"`
Here’s the code for using a function calling agent using `llmsmith`.
```python
import asyncio
import os
from dotenv import load_dotenv
import openai
from llmsmith.agent.function.openai import OpenAIFunctionAgent
from llmsmith.agent.function.options.openai import OpenAIAssistantOptions
from llmsmith.agent.tool.openai import OpenAIAssistantTool
from llmsmith.task.models import TaskInput
# load env vars for getting OPENAI_API_KEY
load_dotenv()
# Define the functions which will be the part of the LLM toolkit
def add(a: float, b: float) -> float:
return a + b
async def run():
# initialize OpenAI client
llm = openai.AsyncOpenAI(api_key=os.getenv("OPENAI_API_KEY"))
# declaration of tools (functions) to be passed into the OpenAIFunctionAgent
add_tool = OpenAIAssistantTool(
declaration={
"function": {
"name": "add",
"description": "Returns the sum of two numbers.",
"parameters": {
"type": "object",
"properties": {"a": {"type": "number"}, "b": {"type": "number"}},
"required": ["a", "b"],
},
},
"type": "function",
},
callable=add,
)
# create the agent
task: OpenAIFunctionAgent = await OpenAIFunctionAgent.create(
name="testfunc",
llm=llm,
assistant_options=OpenAIAssistantOptions(
model="gpt-4-turbo",
system_prompt="you are a math tutor, who explains the solutions to math problems"
),
tools=[add_tool],
max_turns=5,
)
# run the agent
res = await task.execute(TaskInput("Add sum of 1 and 2 to the sum of 5 and 6"))
print(f"\n\nAgent response: {res.content}")
if __name__ == "__main__":
loop = asyncio.get_event_loop()
loop.run_until_complete(run())
```
`llmsmith` can be used for building all sorts of LLM based functionalities (not just agents). Refer the [documentation](https://llmsmith.readthedocs.io/en/latest/index.html) for getting a better idea about the library. The [Examples](https://llmsmith.readthedocs.io/en/latest/examples.html) section has sample codes for implementing RAG using llmsmith and more. | dheerajgopi |
1,862,882 | Exhibition Stand Builder Basel | Going to exhibit in Basel? Contact us and get 5 best proposals from top exhibition stand builder in... | 0 | 2024-05-23T13:33:21 | https://dev.to/expostandzoness/exhibition-stand-builder-basel-3kl5 | Going to exhibit in Basel? Contact us and get 5 best proposals from top [exhibition stand builder in Basel](https://www.expostandzone.com/exhibition-stands/switzerland/basel
).
| expostandzoness | |
1,862,879 | Your containerized application with IAC on AWS — Pt.3 | Hi Folks! This will be the final post in our series on infrastructure and containers. We will utilize... | 27,490 | 2024-05-23T13:24:24 | https://medium.com/@shescloud_/your-containerized-application-with-iac-on-aws-pt-3-e6e15829e510 | Hi Folks! This will be the final post in our series on infrastructure and containers. We will utilize Terragrunt and our infrastructure in this section, and at the conclusion, we will have our application operating on Fargate on AWS.
The Docker image I'll be using in this lesson comes from Sonic, an old game that many people associate with their early years. [You may use this image from my Docker Hub, or substitute your own.](https://hub.docker.com/r/shescloud/sonic-the-hedgehog)
## **DIRECTORIES**
Again, I’ll leave our directory structure here so you can guide yourself:
```
app
modules
├── amazon_vpc
├── aws_loadbalancer
├── aws_fargate
├── aws_roles
├── aws_ecs_cluster
└── aws_targetgroup
└── aws_certificate_manager
terragrunt
└── dev
└── us-east-1
├── aws_ecs
│ ├── cluster
│ └── service
├── aws_loadbalancer
├── amazon_vpc
├── aws_targetgroup
├── aws_roles
├── aws_certificate_manager
└── terragrunt.hcl
```
## **TERRAGRUNT**
First, let’s look at our terragrunt.hcl, located in us-east-1. It will be used for all common variables in our code, as well as for creating our backend settings and the lock in the dynamodb database.
Typical variables are going to be region, project_name, domain_name, env, host_headers and container_port.
terragrunt.hcl
```
remote_state {
backend = "s3"
generate = {
path = "backend.tf"
if_exists = "overwrite"
}
config = {
bucket = "sonic-iac-series"
key = "dev/${path_relative_to_include()}/terraform.tfstate"
region = "us-east-1"
encrypt = true
dynamodb_table = "terraform-state-lock"
}
}
inputs = {
region = "us-east-1"
project_name = "sonic-iac"
env = "dev"
domain_name = "your domain"
host_headers = "sonic.your domain"
container_port = "8080"
tags = {
ambiente = "dev"
projeto = "sonic-iac"
plataforma = "aws"
gerenciado = "terraform/terragrunt"
}
}
generate "provider" {
path = "provider.tf"
if_exists = "overwrite"
contents = <<EOF
provider "aws" {
profile = "default"
region = "us-east-1"
}
EOF
}
```
## **VPC**
The first resource to be created will be the VPC, as it will be needed for most of our resources.
```
terragrunt
└── dev
└── us-east-1
└── amazon_vpc
└── terragrunt.hcl
```
We will use a /25 range, starting with IP 172.35.0.128, to construct our VPC. Four subnets — two public and two private — will be created inside it.
- VPC: 172.35.0.128/25
- Public Subnet 1: 172.35.0.128/27
- Public Subnet 2: 172.35.0.160/27
- Private Subnet 1: 172.35.0.192/27
- Private Subnet 2: 172.35.0.224/27
These code files will be created within:
terragrunt.hcl
```
include {
path = find_in_parent_folders()
}
inputs = {
vpc_cidr_block = "172.35.0.128/25"
public_subnet1_cidr_block = "172.35.0.128/27"
public_subnet2_cidr_block = "172.35.0.160/27"
private_subnet1_cidr_block = "172.35.0.192/27"
private_subnet2_cidr_block = "172.35.0.224/27"
availability_zone1 = "us-east-1a"
availability_zone2 = "us-east-1b"
}
terraform {
source = "../../../../modules/amazon_vpc"
extra_arguments "custom_vars" {
commands = [
"apply",
"plan",
"import",
"push",
"refresh"
]
}
}
```
## **IAM PERMISSIONS**
The next thing to be created will be permissions for our resources.
```
terragrunt
└── dev
└── us-east-1
└── aws_roles
└── terragrunt.hcl
```
terragrunt.hcl
```
include {
path = find_in_parent_folders()
}
terraform {
source = "../../../../modules/aws_roles"
extra_arguments "custom_vars" {
commands = [
"apply",
"plan",
"import",
"push",
"refresh"
]
}
}
```
## **AWS CERTIFICATE MANAGER**
These are the configurations for applying our certificate; we will generate the certificate and use our domain to validate it.
```
terragrunt
└── dev
└── us-east-1
└── aws_certificate_manager
└── terragrunt.hcl
```
terragrunt.hcl
```
include {
path = find_in_parent_folders()
}
terraform {
source = "../../../../modules/aws_certificate_manager"
extra_arguments "custom_vars" {
commands = [
"apply",
"plan",
"import",
"push",
"refresh"
]
}
}
```
## **AWS LOAD BALANCER**
Let’s set up our load balancer using Terragrunt now. This will help distribute our traffic and keep our application highly available.
```
terragrunt
└── dev
└── us-east-1
└── aws_loadbalancer
└── terragrunt.hcl
```
Our Terragrunt setup looks like this. It’s important to note that we use dependencies between modules to keep the configuration dynamic.
terragrunt.hcl
```
include {
path = find_in_parent_folders()
}
dependency "vpc" {
config_path = "../amazon_vpc"
}
dependency "acm" {
config_path = "../aws_certificate_manager"
}
inputs = {
vpc_id = dependency.vpc.outputs.vpc_id
subnet_id_1 = dependency.vpc.outputs.public_subnet1_id
subnet_id_2 = dependency.vpc.outputs.public_subnet2_id
alb_internal = false
certificate_arn = dependency.acm.outputs.acm_arn
priority_listener_rule = "1"
}
terraform {
source = "../../../../modules/aws_loadbalancer"
extra_arguments "custom_vars" {
commands = [
"apply",
"plan",
"import",
"push",
"refresh"
]
}
}
```
## **AWS TARGET GROUP**
Here we will configure our Target Group with Terragrunt; it is essential for directing traffic to the correct targets for our application.
```
terragrunt
└── dev
└── us-east-1
└── aws_targetgroup
└── terragrunt.hcl
```
terragrunt.hcl
```
include {
path = find_in_parent_folders()
}
dependency "loadbalancer" {
config_path = "../aws_loadbalancer"
}
dependency "vpc" {
config_path = "../amazon_vpc"
}
dependency "acm" {
config_path = "../aws_certificate_manager"
}
inputs = {
vpc_id = dependency.vpc.outputs.vpc_id
subnet_id_1 = dependency.vpc.outputs.public_subnet1_id
subnet_id_2 = dependency.vpc.outputs.public_subnet2_id
certificate_arn = dependency.acm.outputs.acm_arn
listener_ssl_arn = dependency.loadbalancer.outputs.listener_ssl_arn
priority_listener_rule = "2"
health_check_path = "/"
}
terraform {
source = "../../../../modules/aws_targetgroup"
extra_arguments "custom_vars" {
commands = [
"apply",
"plan",
"import",
"push",
"refresh"
]
}
}
```
## **ECS CLUSTER**
In this step we will create our ECS cluster that will host our application.
```
terragrunt
└── dev
└── us-east-1
└── aws_ecs
└── cluster
└── terragrunt.hcl
```
terragrunt.hcl
```
include {
path = find_in_parent_folders()
}
terraform {
source = "../../../../../modules/aws_ecs_cluster"
extra_arguments "custom_vars" {
commands = [
"apply",
"plan",
"import",
"push",
"refresh"
]
}
}
```
## **FARGATE AND ECR**
In the final configuration file, we will create our Fargate service, the ECR repository, and a record on our domain.
```
terragrunt
└── dev
└── us-east-1
└── aws_ecs
└── service
└── terragrunt.hcl
```
terragrunt.hcl
```
include {
path = find_in_parent_folders()
}
dependency "loadbalancer" {
config_path = "../../aws_loadbalancer"
}
dependency "vpc" {
config_path = "../../amazon_vpc"
}
dependency "role" {
config_path = "../../aws_roles"
}
dependency "targetgroup" {
config_path = "../../aws_targetgroup"
}
dependency "cluster" {
config_path = "../cluster"
}
inputs = {
vpc_id = dependency.vpc.outputs.vpc_id
subnet_id_1 = dependency.vpc.outputs.private_subnet1_id
subnet_id_2 = dependency.vpc.outputs.private_subnet2_id
alb_dns_name = dependency.loadbalancer.outputs.alb_dns_name
sg_alb = dependency.loadbalancer.outputs.alb_secgrp_id
target_group_arn = dependency.targetgroup.outputs.tg_alb_arn
cluster_arn = dependency.cluster.outputs.cluster_arn
ecs_role_arn = dependency.role.outputs.ecs_role_arn
instance_count = "1"
container_vcpu = "512"
container_memory = "1024"
aws_account_id = "your account number"
}
terraform {
source = "../../../../../modules/aws_fargate"
extra_arguments "custom_vars" {
commands = [
"apply",
"plan",
"import",
"push",
"refresh"
]
}
}
```
---
## **APPLY**
After the entire structure has been created, you must run `terragrunt apply` in every directory that contains a terragrunt.hcl, in the following order.
1. terragrunt/dev/us-east-1
2. terragrunt/dev/us-east-1/amazon_vpc
3. terragrunt/dev/us-east-1/aws_roles
4. terragrunt/dev/us-east-1/aws_certificate_manager
5. terragrunt/dev/us-east-1/aws_loadbalancer
6. terragrunt/dev/us-east-1/aws_targetgroup
7. terragrunt/dev/us-east-1/aws_ecs/cluster
8. terragrunt/dev/us-east-1/aws_ecs/service
Run this command in each directory to apply:
`terragrunt apply`
or, from the root folder, use:
`terragrunt run-all apply`
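If you prefer applying module by module instead of `run-all`, the order can be scripted with a small shell loop (a sketch; directory names are taken from the tree earlier, and `-auto-approve` skips the interactive prompt):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Apply each module in dependency order (the Fargate service last)
for dir in \
  amazon_vpc \
  aws_roles \
  aws_certificate_manager \
  aws_loadbalancer \
  aws_targetgroup \
  aws_ecs/cluster \
  aws_ecs/service
do
  (cd "terragrunt/dev/us-east-1/$dir" && terragrunt apply -auto-approve)
done
```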
---
## **ECR**
Now that we have applied all our infrastructure and the ECR repository has been created, we must upload our image for use in our container.
As an initial step, the image must be downloaded from Docker Hub. You can use another image if you prefer, or your own application's image.
Use this command to download my Sonic image:
`docker pull shescloud/sonic-the-hedgehog`



## **TESTING**
I used a domain I had and our application was temporarily hosted at sonic.shescloud.tech.

## **DESTROY**
If you are using it for study, or as a way to complete a test, don’t forget to destroy all resources at the end to avoid unnecessary costs. To delete everything, we will follow a process similar to apply, but in reverse order.
Before deleting everything via terragrunt, you need to access your AWS account, go to the ECR service and delete the image from the repository. After completing this step, you can proceed with destroying each of the repositories.

Now, you must run destroy in all directories that contain terragrunt.hcl, in the following order.
1. terragrunt/dev/us-east-1/aws_ecs/service
2. terragrunt/dev/us-east-1/aws_ecs/cluster
3. terragrunt/dev/us-east-1/aws_targetgroup
4. terragrunt/dev/us-east-1/aws_loadbalancer
5. terragrunt/dev/us-east-1/aws_roles
6. terragrunt/dev/us-east-1/aws_certificate_manager
7. terragrunt/dev/us-east-1/amazon_vpc
8. terragrunt/dev/us-east-1
Run this command in each directory to destroy:
`terragrunt destroy`
or, from the root folder, use:
`terragrunt run-all destroy`
---
## **GITHUB**
You can check the repository with the code on my GitHub:
https://github.com/shescloud/terraform-terragrunt-fargate
---
And that’s it folks! I hope you enjoyed it and get a lot out of this code. See u soon!
| shescloud_ | |
1,862,878 | What is LMS for Employee Onboarding? | An employee onboarding LMS (Learning Management System) is a software platform that helps... | 0 | 2024-05-23T13:23:03 | https://dev.to/academyocean/what-is-lms-for-employee-onboarding-85h | An [employee onboarding LMS](https://academyocean.com/employee-onboarding) (Learning Management System) is a software platform that helps organisations manage, deliver and track training programmes for new employees. It streamlines the onboarding process by providing a centralised system where new employees can access training materials, complete courses and track their progress. Here are some of the key features and benefits of an LMS for employee onboarding:
**Key Features:**
- **Course management:** Easily create, manage and deliver training courses tailored to new employees.
- **Progress tracking:** Monitor new hires' progress through courses and training modules to ensure they meet required milestones.
- **Content delivery:** Deliver multiple types of content, including videos, presentations, quizzes, and interactive modules.
- **Assessments and quizzes:** Evaluate new employees' understanding and retention of training materials through tests and quizzes.
- **Certification and compliance:** Ensure new hires complete mandatory training and certifications that are critical for compliance in regulated industries.
- **Onboarding checklists:** Create checklists to ensure all necessary onboarding tasks are completed.
- **Reporting and analytics:** Generate reports on training progress and effectiveness to identify areas for improvement.
- **Integration:** Seamlessly integrate with other HR and business systems for a smooth onboarding experience.
- **Mobile accessibility:** Allow new hires to access training materials from any device, increasing flexibility and convenience.
**Benefits:**
- **Consistency:** Ensure that all new employees receive the same quality and content of training.
- **Efficiency:** Streamline the onboarding process, reducing the time and resources required to train new employees.
- **Engagement:** Use interactive and engaging training materials to enhance the learning experience for new employees.
- **Scalability:** Easily scale onboarding programmes to accommodate any number of new hires.
- **Tracking and compliance:** Maintain accurate records of completed training for compliance and reporting purposes.
Persona | academyocean | |
1,862,877 | Lado Okhotnikov Launched the Uniteverse Program Within the Meta Force Metaverse | Meta Force is a promising project created by a team of techies and cryptocurrency fans. These guys... | 0 | 2024-05-23T13:22:56 | https://dev.to/ali_nasir_62c60a7afa3833b/lado-okhotnikov-launched-the-uniteverse-program-within-the-meta-force-metaverse-14bk | Meta Force is a promising project created by a team of techies and cryptocurrency fans. These guys decided to try their hand at GameFi in 2021. And within a year, the platform united more than a million participants from all over the world. The main principles are transparency, distribution, variety of gameplay and maximum use of GameFi's potential.
The first steps are encouraging. There is already a working prototype of the game, and the community is actively growing. The team has attracted strong partners who help with development.
As for the system itself, it is built on the foundation of the Uniteverse software module. The program ensures the viability of the entire platform and in the near future will allow, for example, the digitization of things that the user owns in real life. This solution means that it is possible to combine the possibilities of the gaming universe with the monetization of the gaming experience.
Uniteverse is not just a software module, but a separate ecosystem, the user of which will be able to receive a complete solution to personal requests. This direction will allow you to completely immerse yourself in virtual reality.
“...At Universe, everyone has the right to create, play, learn and improve financial literacy. We adhere to strict principles of complete freedom for all members of our community,” this is how <b> <a href="https://www.cryptowisser.com/nft-marketplace-launched-in-lado-okhotnikovs-meta-force-metaverse//">Lado Okhotnikov </a> </b> sees the mission of his project.
According to the general director of the company, the project is aimed at achieving a fundamentally new level of reality simulation. The developers plan to build a separate world, where users will be able to enjoy high-quality gameplay and take advantage of the maximum benefits offered by the GameFi industry.
The main task that the ecosystem developers set themselves was to create a new format for user interaction with virtual reality.
“In the Meta Force metaverse, we take part in exciting quests, run a business, and collect unique NFTs. We are looking forward to being able to use the full range of opportunities offered by RWA (Real World Assets) technology,” project participants share their impressions.
Particular attention is paid to the problems of blockchain security and transparency. Mr. Lado is a strong proponent of decentralization and anonymity of transactions. According to him, Collection Authentication technology embedded in the Uniteverse platform will help ensure compliance with these ambitious goals.
In addition, the ecosystem architecture is built on the Polygon blockchain. The structure of the metaverse is based on a native token, the capabilities of which are comparable to the functions of the SAND coin of The Sandbox project.
“Forcecoin is the same deflationary instrument as Bitcoin. Its emission is strictly limited by software algorithms and cannot be changed. Over 80% of the issued tokens go to improve the design base of the ecosystem, to develop the Uniteverse platform,” says Lado Okhotnikov.
About Meta Force
The various elements of the ecosystem (tokenomics, an NFT marketplace, and a metaverse) are actively developed in Meta Force. A native token will be launched soon. There are plans to present a RWA-based project. The team intends to build a full-fledged virtual reality with unsurpassed graphics on the platform.
Based on Dan Michael materials
The head of Meta Force Press Center
press@meta-force.space
#lado_okhotnikov
#metaverse
#meta_force
#forcecoin
| ali_nasir_62c60a7afa3833b | |
1,862,876 | Crafting Perfect Layouts: Mastering CSS Layout Techniques.🚀 | 1. CSS Layout Basics Understanding the basics of CSS layout is crucial for creating structured and... | 0 | 2024-05-23T13:21:02 | https://dev.to/dharamgfx/crafting-perfect-layouts-mastering-css-layout-techniques-3c1h | css, webdev, beginners, programming | **1. CSS Layout Basics**
Understanding the basics of CSS layout is crucial for creating structured and visually appealing web pages. CSS provides various properties and techniques to arrange elements on a page.
**Block and Inline Elements**
- **Block Elements**: Take up the full width available (`<div>`, `<p>`, `<h1>`).
- **Inline Elements**: Take up only as much width as necessary (`<span>`, `<a>`, `<img>`).
*Example:*
```html
<div>This is a block element.</div>
<span>This is an inline element.</span>
```
**Display Property**
- Controls how elements are displayed.
*Example:*
```css
.block {
display: block;
}
.inline {
display: inline;
}
.inline-block {
display: inline-block;
}
```
**2. Floats**
Floats are used to position elements to the left or right, allowing text and other elements to wrap around them.
**Floating Elements**
- Use the `float` property to move elements to the left or right.
*Example:*
```css
.float-left {
float: left;
width: 50%;
}
.float-right {
float: right;
width: 50%;
}
```
**Clearing Floats**
- Use the `clear` property to control the behavior of floating elements.
*Example:*
```css
.clear {
clear: both;
}
```
**3. Positioning**
CSS positioning allows you to place elements precisely on the page.
**Static Positioning**
- The default position of elements.
*Example:*
```css
.static {
position: static;
}
```
**Relative Positioning**
- Position relative to its normal position.
*Example:*
```css
.relative {
position: relative;
top: 10px;
left: 10px;
}
```
**Absolute Positioning**
- Positioned relative to the nearest positioned ancestor.
*Example:*
```css
.absolute {
position: absolute;
top: 20px;
left: 20px;
}
```
**Fixed Positioning**
- Positioned relative to the viewport.
*Example:*
```css
.fixed {
position: fixed;
top: 0;
left: 0;
}
```
**Sticky Positioning**
- Switches between relative and fixed, depending on the scroll position.
*Example:*
```css
.sticky {
position: sticky;
top: 0;
}
```
**4. Modern Layout**
Modern CSS layout techniques like Flexbox and Grid provide powerful tools for designing responsive and complex layouts.
**Flexbox**
- A layout model for arranging elements in a single direction (row or column).
*Example:*
```css
.flex-container {
display: flex;
justify-content: space-around;
}
.flex-item {
flex: 1;
}
```
**CSS Grid**
- A two-dimensional layout system for creating grid-based designs.
*Example:*
```css
.grid-container {
display: grid;
grid-template-columns: repeat(3, 1fr);
grid-gap: 10px;
}
.grid-item {
background-color: lightblue;
}
```
**5. Responsive Design**
Responsive design ensures your web pages look great on all devices by using media queries and flexible layouts.
**Media Queries**
- Apply different styles based on screen size.
*Example:*
```css
@media (max-width: 600px) {
.responsive {
flex-direction: column;
}
}
```
**Flexible Units**
- Use relative units like percentages and `em` for responsive sizing.
*Example:*
```css
.container {
width: 80%;
padding: 2em;
}
```
**Responsive Images**
- Ensure images resize within their containers.
*Example:*
```css
img {
max-width: 100%;
height: auto;
}
```
---
**Conclusion**
Mastering CSS layout techniques is essential for creating structured, responsive, and visually appealing web pages. By understanding the basics of layout, floats, positioning, modern layout techniques like Flexbox and Grid, and responsive design principles, you can build versatile and adaptive web designs. Embrace these skills to enhance the user experience across all devices. | dharamgfx |
1,850,496 | How to Set Up GitHub Actions for Continuous Integration | Introduction In the fast-paced world of software development, ensuring that code is... | 0 | 2024-05-23T13:20:37 | https://dev.to/angelotheman/how-to-set-up-github-actions-for-continuous-integration-2054 | devops, tutorial, github, architecture | ## Introduction
In the fast-paced world of software development, ensuring that code is consistently tested and integrated can be a challenging task. This is where **Continuous Integration (CI)** comes into play.
> CI is a development practice that requires developers to integrate code into a shared repository frequently.
CI automates the process of building, testing and integrating code changes regularly, helping developers to catch issues early and maintain a stable codebase.
GitHub Actions, a powerful feature introduced by GitHub, has revolutionized the way developers automate their workflows. We will explore how to use this tool for setting up CI in your projects.
Here's what we will cover
* Understanding CI and it's benefits
* Introduction to GitHub Actions
* Building CI workflows with Github Actions
* Advanced Techniques and Integration with External Tools
## Benefits of Continuous Integration
There are several benefits to Continuous Integration. These include
* **Early Detection of Bugs**
Continuous Integration facilitates the early detection of bugs by automatically running tests with each code integration, allowing developers to identify and fix issues before they escalate, resulting in more stable software releases.
* **Improved Code Quality**
By ensuring consistent coding standards and practices through automated testing and integration, code quality is checked. This reduces technical debt and ensures the maintainability of the codebase
* **Faster Feedback Loop**
With Continuous Integration, developers receive rapid feedback on their code, enabling them to iterate quickly and address issues promptly leading to faster development cycles.
These are some of the benefits derived from Continuous Integration. Now let's look at using GitHub Actions for CI.
## Introduction to GitHub Actions
GitHub Actions serves as a platform for continuous integration and continuous delivery (CI/CD), enabling you to automate the process of building, testing, and deploying your software pipeline.
Here are some uses of GitHub Actions
* You can set up workflows that automatically build and test each pull request submitted to your repository.
* Deploy successfully merged pull requests to your production environment.
* Actions can also be used to package or release projects into the pipeline.
GitHub Actions goes beyond just DevOps and lets you run workflows when other events happen in your repository. For example, you can run a workflow to automatically add the appropriate labels whenever someone creates a new issue in your repository. It has gained popularity for its simplicity and the fact that it is integrated directly into your repository.
Now let's look at the components that make up a GitHub workflow
## Understanding a Github Action Workflow
A workflow is a configurable automated process that will run one or more jobs. A GitHub Actions workflow is defined by a YAML file typically located in the `.github/workflows` directory within your repository.
### Components of a workflow
**Triggers**
These define the events that trigger the workflow. For example, the workflow can start when a user:
* Creates a pull request
* Opens an issue
* Pushes a commit to the repository
etc
**Jobs**
These specify the individual tasks to be executed as part of the workflow. Each job runs sequentially or in parallel and can include multiple steps.
**Steps**
Define the individual actions or commands to be executed within a job. Steps can include tasks like checking out code, running tests, or deploying artifacts.
**Actions**
Actions are reusable units of code that perform specific tasks within a workflow. They can be authored by GitHub or by the community and can be easily integrated into workflows. For example, an action can check out your Git repository from GitHub or set up the correct toolchain for your build environment.
**Runners**
A runner is a server that runs your workflows when they're triggered. Each runner can run a single job at a time. GitHub provides runners for three operating systems: _Ubuntu Linux_, _Microsoft Windows_ and _macOS_.
## Build an example workflow
In this section we would set up a simple workflow that **_runs linters_** for Python using pycodestyle. This will help ensure that your Python code adheres to PEP 8 style guidelines and is free of common errors.
### Step-By-Step Guide
1. Create the `.github/workflows` directory at the root of your repository.
2. Inside the directory, create a file `lint.yml`. Go to **Add file** and enter the full path, including the directory, as shown in the image below

After this, commit the changes for it to take effect.
3. Open the `lint.yml` file and add the following code
```yaml
name: Lint Python Code
on: [push, pull_request]
jobs:
lint:
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v3
- name: Set up Python
uses: actions/setup-python@v4
with:
python-version: '3.x'
- name: Install Pycodestyle
run: |
pip install pycodestyle
- name: Lint Python files with Pycodestyle
run: |
pycodestyle .
```
### Explanation of the code
* `name`: Defines the name of the workflow as "Lint Python Code".
* `on`: Triggers the workflow on push and pull request events. Hence, any push to the repository or any pull request would trigger this workflow.
* `jobs`: Defines a single job named **_lint_** that executes the linting steps.
* `runs-on`: Specifies the environment which would run the workflow, in our case, `ubuntu-latest`.
* `steps`: This section defines the sequential steps within the job:
* `Checkout Code`: Uses the _actions/checkout_ action to check
out the repository code.
* `Set up Python`: Uses the actions/setup-python action to set up
a Python environment.
* `Install Pycodestyle`: Installs the Pycodestyle linter for
Python files.
* `Lint Python Files`: Runs Pycodestyle on the entire repository
to check for PEP 8 compliance.
This workflow ensures that every time code is pushed or a pull request is created, Pycodestyle is automatically run to check for style issues in Python code.
Now let's create a simple python file `hello.py` in our repository with the following code
```python
#!/usr/bin/python3
"""
This is a function file
"""
def hello():
print("hello World")
if __name__ == '__main__':
hello()
```
Now commit and push this code to your repository. The workflow would automatically be triggered since we specified a **push** in the trigger. You would see something equivalent to the image below in the actions tab

Now when there's an error in the workflow (which normally reflects an error in the codebase), it would be marked **red**; otherwise, the check passes and you'll get something like the image below.

## Debugging a workflow
Here's another Python file that should fail in the workflow. We will then learn how to debug, or troubleshoot, the workflow.
```python
#!/usr/bin/python3
"""
This is a function file
"""
def hello():
print("hello World")
def new():
print("This is me")
if __name__ == '__main__':
hello()
new()
```
This is what we get from the workflow

Now click on the failed workflow to troubleshoot

The logs clearly show what needs to be fixed for the linter check to pass.
```
./python_with_error.py:6:1: E302 expected 2 blank lines, found 1
./python_with_error.py:9:1: E302 expected 2 blank lines, found 1
./python_with_error.py:10:3: E111 indentation is not a multiple of 4
./python_with_error.py:12:1: E305 expected 2 blank lines after class or function definition, found 1
```
As shown above, blank lines need to be added and the indentation fixed. This pinpoints exactly what caused the code to fail in the pipeline.
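For reference, here is the same file rewritten to satisfy pycodestyle: two blank lines around each top-level function definition and 4-space indentation.

```python
#!/usr/bin/python3
"""
This is a function file
"""


def hello():
    print("hello World")


def new():
    print("This is me")


if __name__ == '__main__':
    hello()
    new()
```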
After fixing the issues with pycodestyle, you should have the equivalent of these


## Advanced CI Workflow Techniques
There are other useful configurations when building a workflow. Let's look at three of them.
### Variables
Variables store non-sensitive information that can change depending on the environment or the context of the workflow.
To add a new variable, navigate to **Settings > Secrets and Variables > Actions**


**Use Cases**
* Setting environment-specific configurations, such as the target
environment (development, staging, production).
* Storing file paths or URLs that might differ between environments.
### Secrets
Secrets store sensitive information securely, such as API keys, passwords, and tokens. They ensure that this information is not exposed in the workflow configuration or logs.
To add a new secret, navigate to **Settings > Secrets and Variables > Actions**


**Use Cases**
* Stores access tokens for third-party services like AWS, Docker Hub,
or email services.
* Database passwords or credentials for accessing secure services.
### Webhooks
Webhooks trigger actions in response to specific events, such as sending notifications to external services.
To add a new webhook, navigate to **Settings > Webhooks**


**Use Cases**
* Sending notifications to Slack, email, or SMS services when a build
fails or succeeds.
* Initiating deployments to environments such as staging or production
when certain conditions are met.
Now let's add a notification to our workflow so that an SMS is automatically sent to us when the pipeline fails.
Edit the `yml` file with the following code snippet
```yaml
name: CI Pipeline
on: [push, pull_request]
jobs:
build:
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v3
- name: Set up Python
uses: actions/setup-python@v4
with:
python-version: '3.x'
- name: Install dependencies
run: |
pip install requests pycodestyle
- name: Lint Python files with Pycodestyle
run: |
pycodestyle .
- name: Send SMS Notification
if: failure()
run: |
python - <<EOF
import requests
import json
quicksend_url = "https://uellosend.com/quicksend/"
data = {
'api_key': '${{ secrets.SMS_API_KEY }}',
'sender_id': 'GitHub',
'message': 'CI Pipeline has failed, check the details',
'recipient': '${{ secrets.SMS_RECIPIENT }}'
}
headers = {'Content-type': 'application/json'}
response = requests.post(quicksend_url, headers=headers, json=data)
with open('sms_response.json', 'w') as f:
json.dump(response.json(), f, indent=4)
EOF
```
### Explanation of Code
* We check whether the job failed, and if so, send an SMS to the user.
* We use [UelloSend](https://uellosend.com/index.php) for the SMS service.
* Secrets are configured for the services in the secrets file within the repository.
Below is the SMS that would be sent should the run or job fail

## Helpful links
- [Learn Continuous Integration](https://www.atlassian.com/continuous-delivery/continuous-integration)
- [Learn more about GitHub Actions](https://docs.github.com/en/actions/learn-github-actions)
- [UelloSend API Documentation](https://uellosend.com/developer.php)
## Conclusion
In this article, we explored how GitHub Actions can streamline and enhance your Continuous Integration (CI) workflows. By leveraging GitHub Actions, you can automate various aspects of your development process, ensuring code quality and improving collaboration among team members.
By implementing these techniques, you can create robust CI workflows that not only automate code checks and tests but also keep you informed about the status of your builds in real-time. This enhances the overall development process, making it more efficient and reliable.
What did you think? Share your thoughts and feedback in the comments below!
Your input helps me deliver insightful and informative content in the future. Let's keep the conversation going!
Connect with me:
* [GitHub](https://github.com/angelotheman)
* [X](https://x.com/_angelotheman)
* [LinkedIn](https://linkedin.com/in/angelotheman)
Happy learning 🚀 | angelotheman |
1,862,875 | Digital Marketing Interview Questions and Answers For Freshers | Digital marketing interview understanding the potential questions and formulating strong answers can... | 0 | 2024-05-23T13:20:37 | https://dev.to/lalyadav/digital-marketing-interview-questions-and-answers-for-freshers-3apk | seo, smo, digital, digitalmarketing | When preparing for a digital marketing interview, understanding the potential questions and formulating strong answers can give you a competitive edge. Here are the [top digital marketing interview questions](https://www.onlineinterviewquestions.com/digital-marketing-interview-questions) you should be ready to answer in 2024.

**Q1. What is Digital Marketing, and why is it important?**
Ans: Digital marketing refers to the promotion of products or services using digital channels such as search engines, social media, email, and websites. It’s important because it allows businesses to reach a larger audience more efficiently and cost-effectively compared to traditional marketing methods. Digital marketing provides measurable results and enables personalized marketing strategies.
**Q2. Can you explain the difference between SEO and SEM?**
Ans: SEO (Search Engine Optimization) is the process of optimizing website content to rank higher in organic search engine results. SEM (Search Engine Marketing), on the other hand, involves paid strategies to increase search visibility, such as pay-per-click (PPC) advertising. While SEO focuses on improving organic traffic, SEM includes both paid and organic methods.
**Q3. What are the key components of a successful digital marketing strategy?**
Ans: A successful digital marketing strategy includes several key components:
- **SEO (Search Engine Optimization):** Enhancing website visibility.
- **Content Marketing:** Creating and distributing valuable content.
- **Social Media Marketing:** Engaging with audiences on social platforms.
- **Email Marketing:** Nurturing leads and customer relationships through targeted emails.
- **PPC Advertising:** Using paid ads to drive traffic.
- **Analytics and Reporting:** Measuring and analyzing performance to refine strategies.
**Q4. How do you stay updated with the latest digital marketing trends and tools?**
Ans: Staying updated requires continuous learning and engagement with the industry. This can be achieved through:
- Following industry blogs and news sites (e.g., Moz, HubSpot, Search Engine Journal).
- Participating in webinars and online courses.
- Attending industry conferences and networking events.
- Engaging in professional social media groups and forums.
- Experimenting with new tools and techniques.
**Q5. What is the significance of keyword research in digital marketing?**
Ans: Keyword research is crucial because it helps identify the terms and phrases that potential customers use to search for products or services. Understanding these keywords allows marketers to optimize content, improve search engine rankings, and attract the right audience. Effective keyword research drives more targeted traffic and increases conversion rates.
**Q6. Can you explain the concept of content marketing and its benefits?**
Ans: Content marketing involves creating and sharing valuable, relevant, and consistent content to attract and retain a clearly defined audience. The benefits of content marketing include:
- Building brand awareness and authority.
- Engaging and educating the audience.
- Improving SEO efforts.
- Driving organic traffic.
- Generating leads and conversions.
- Enhancing customer relationships.
**Q7. What are some common metrics used to measure digital marketing success?**
Ans: Common metrics include:
- **Website Traffic:** The number of visitors to a site.
- **Conversion Rate:** The percentage of visitors who take a desired action.
- **Click-Through Rate (CTR):** The ratio of clicks on a link to the number of times it’s viewed.
- **Bounce Rate:** The percentage of visitors who leave a site after viewing only one page.
- **Engagement Rate:** Interaction metrics such as likes, comments, and shares on social media.
- **ROI (Return on Investment):** The profitability of marketing efforts. | lalyadav |
1,862,864 | 5 Essential Magento 2 Support and Maintenance Best Practices | Magento 2 is a powerful eCommerce platform that can help businesses of all sizes grow their online... | 0 | 2024-05-23T13:18:10 | https://dev.to/chandrasekhar121/5-essential-magento-2-support-and-maintenance-best-practices-2kg2 | magento, webdev, services, programming | <p>Magento 2 is a powerful eCommerce platform that can help businesses of all sizes grow their online sales. </p>
<p>However, like any software, it requires regular <a href="https://webkul.com/magento-2-support-and-maintenance-services/">Magento 2 support and maintenance</a> to keep it running smoothly and securely.</p>

<h2><strong>Here are five essential Magento 2 support and maintenance best practices:</strong></h2>
<ul>
<li>
<p><strong>Keep your Magento 2 software up to date.</strong></p>
</li>
</ul>
<p>Magento 2 releases regular security patches and updates that fix bugs and improve performance.</p>
<p>It's important to keep your Magento 2 software up to date to protect your store from security vulnerabilities and ensure that it's running at its best.</p>
<ul>
<li>
<p><strong>Back up your Magento 2 store regularly.</strong></p>
</li>
</ul>
<p>In the event of a hardware failure, software glitch, or security breach, having a recent backup of your Magento 2 store can save you a lot of time and hassle.</p>
<p>It's a good idea to back up your store regularly, such as daily or weekly.</p>
<ul>
<li>
<p><strong>Monitor your Magento 2 store for performance issues.</strong></p>
</li>
</ul>
<p>Slow loading times, errors, and other performance issues can hurt your sales and damage your brand's reputation.</p>
<p>It's important to monitor your Magento 2 store for performance issues and address them promptly.</p>
<ul>
<li>
<p><strong>Secure your Magento 2 store.</strong></p>
</li>
</ul>
<p>Magento 2 includes several security features, but it's important to take additional steps to secure your store, such as using a strong password, enabling two-factor authentication, and installing a firewall.</p>
<ul>
<li>
<p><strong>Partner with a Magento 2 support provider.</strong></p>
</li>
</ul>
<p>If you don't have the time or expertise to manage <a href="https://webkul.com/magento-2-support-and-maintenance-services/">Magento 2 support services</a> yourself, you can partner with a Magento 2 support provider.</p>
<p>A support provider can help you with tasks such as keeping your software up to date, backing up your store, monitoring for performance issues, and securing your store.</p>
<h2>Conclusion</h2>
<p>By following these best practices, you can keep your Magento 2 store running smoothly and securely, and focus on growing your business.</p> | chandrasekhar121 |
1,862,874 | Efficiently Handle CRUD Actions in Syncfusion ASP.NET MVC DataGrid with Fetch Request | TL;DR: Learn to handle CRUD actions in Syncfusion ASP.NET MVC DataGrid using Fetch requests. This... | 0 | 2024-05-30T04:32:59 | https://www.syncfusion.com/blogs/post/crud-aspdotnet-mvc-grid-using-fetch | aspnetmvc, datagrid, development, syncfusion | ---
title: Efficiently Handle CRUD Actions in Syncfusion ASP.NET MVC DataGrid with Fetch Request
published: true
date: 2024-05-23 13:12:28 UTC
tags: aspnetmvc, datagrid, development, syncfusion
canonical_url: https://www.syncfusion.com/blogs/post/crud-aspdotnet-mvc-grid-using-fetch
cover_image: https://dev-to-uploads.s3.amazonaws.com/uploads/articles/isx0cvd9nhdhxf1b2l7c.png
---
**TL;DR:** Learn to handle CRUD actions in Syncfusion ASP.NET MVC DataGrid using Fetch requests. This blog covers binding data and performing CRUD actions using Fetch for server-side updates. It includes examples of adding, editing, and deleting records and handling Fetch success and failure events for smooth execution and real-time data consistency.
Fetch is a robust method that is crucial in modern web development. It allows for the asynchronous sending of data to a server for database updates and the retrieval of data from a server without the need to refresh the entire webpage. This makes for a smoother and more efficient user experience.
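Before looking at the Syncfusion-specific wrapper, it can help to see the idea in plain JavaScript. The sketch below is framework-agnostic and the endpoint URL, payload shape, and the `buildPostRequest` helper name are all hypothetical examples, not part of the Syncfusion API:

```javascript
// Build the options object for a JSON POST request.
// The endpoint '/Home/Update' and payload shape below are hypothetical examples.
function buildPostRequest(payload) {
  return {
    method: 'POST',
    headers: { 'Content-Type': 'application/json; charset=utf-8' },
    body: JSON.stringify(payload)
  };
}

// Usage with the browser's native fetch (commented out — requires a live endpoint):
// fetch('/Home/Update', buildPostRequest({ OrderID: 10001 }))
//   .then(response => response.json())
//   .then(data => console.log('Server replied:', data));

console.log(buildPostRequest({ OrderID: 10001 }).body); // → {"OrderID":10001}
```

The same request/response cycle runs asynchronously, which is what lets the grid update without a full page reload.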
The Syncfusion [ASP.NET MVC DataGrid](https://www.syncfusion.com/aspnet-mvc-ui-controls/grid "ASP.NET MVC DataGrid"), a feature-rich component designed to handle vast amounts of data, has built-in support for handling CRUD (Create, Read, Update, Delete) operations. These operations are fundamental to any app that involves data manipulation.
However, recognizing our users’ diverse needs, we have also provided an option for users to execute these CRUD operations in the DataGrid using their own Fetch commands. This means that users can interact with their database in a way that aligns with their specific requirements and preferences.
This feature is particularly beneficial as it allows users to seamlessly integrate their server logic with the Syncfusion ASP.NET MVC DataGrid during CRUD operations. As a result, any changes made during these operations can be immediately and accurately reflected in the Grid.
Let’s see how to bind and perform CRUD operations using Fetch request in the ASP.NET MVC DataGrid.
## Render Syncfusion ASP.NET MVC DataGrid
Syncfusion [ASP.NET MVC DataGrid](https://ej2.syncfusion.com/aspnetmvc/documentation/grid/getting-started-mvc "Getting started with ASP.NET MVC DataGrid") is a feature-rich control for displaying data in a tabular format. Its functionalities include data binding, editing, Excel-like filtering, and selection. It also supports exporting data to Excel, CSV, and PDF formats.
Now, let’s see how to render the ASP.NET MVC DataGrid control. Here, we’ve enabled the [paging](https://ej2.syncfusion.com/aspnetmvc/documentation/grid/paging "Paging in ASP.NET MVC DataGrid") and [editing](https://ej2.syncfusion.com/aspnetmvc/documentation/grid/editing/edit "Editing in ASP.NET MVC DataGrid") features for a more interactive user experience. Refer to the following code example.
```cshtml
@Html.EJS().Grid("Grid")
.EditSettings(e => { e.AllowAdding(true).AllowEditing(true).AllowDeleting(true); })
.Columns(col =>{
col.Field("OrderID").HeaderText("Order ID").IsPrimaryKey(true).Width("130").Add();
col.Field("EmployeeID").HeaderText("Employee ID").Width("150").Add();
col.Field("CustomerID").HeaderText("CustomerID").Width("70").Add();
col.Field("ShipCity").HeaderText("Ship City").Width("70").Add()
})
.AllowPaging(true)
.AllowSorting(true)
.ActionComplete("actionComplete")
.ActionBegin("actionBegin")
.Toolbar(new List<string>() { "Add", "Edit", "Delete", "Update", "Cancel" })
.Render()
```
Previously, the **DataSource** was not bound to the DataGrid. However, now we will utilize Fetch requests to bind data to the DataGrid. On the server side, the **GetData** method within the **HomeController** contains the grid’s data source. When the button is clicked, a Fetch request is sent to fetch the data from the server and bind it to the DataGrid control.
```csharp
public class HomeController : Controller
{
public ActionResult Getdata()
{
IEnumerable DataSource = OrdersDetails.GetAllRecords();
return Json(DataSource);
}
//Create a model class and define the properties.
public class OrdersDetails
{
public OrdersDetails()
{
}
public OrdersDetails(int OrderID, string CustomerId, int EmployeeId, double Freight, bool Verified, DateTime OrderDate, string ShipCity, string ShipName, string ShipCountry, DateTime ShippedDate, string ShipAddress)
{
this.OrderID = OrderID;
this.CustomerID = CustomerId;
this.EmployeeID = EmployeeId;
this.Freight = Freight;
this.ShipCity = ShipCity;
this.Verified = Verified;
this.OrderDate = OrderDate;
this.ShipName = ShipName;
this.ShipCountry = ShipCountry;
this.ShippedDate = ShippedDate;
this.ShipAddress = ShipAddress;
}
//Render data in this method.
public static List<OrdersDetails> GetAllRecords()
{
List<OrdersDetails> order = new List<OrdersDetails>();
int code = 10000;
for (int i = 1; i < 10; i++)
{
order.Add(new OrdersDetails(code + 1, "ALFKI", i + 0, 2.3 * i, false, new DateTime(1991, 05, 15), "Berlin", "Simons bistro", "Denmark", new DateTime(1996, 7, 16), "Kirchgasse 6"));
order.Add(new OrdersDetails(code + 2, "ANATR", i + 2, 3.3 * i, true, new DateTime(1990, 04, 04), "Madrid", "Queen Cozinha", "Brazil", new DateTime(1996, 9, 11), "Avda. Azteca 123"));
order.Add(new OrdersDetails(code + 3, "ANTON", i + 1, 4.3 * i, true, new DateTime(1957, 11, 30), "Cholchester", "Frankenversand", "Germany", new DateTime(1996, 10, 7), "Carrera 52 con Ave. Bolívar #65-98 Llano Largo"));
order.Add(new OrdersDetails(code + 4, "BLONP", i + 3, 5.3 * i, false, new DateTime(1930, 10, 22), "Marseille", "Ernst Handel", "Austria", new DateTime(1996, 12, 30), "Magazinweg 7"));
order.Add(new OrdersDetails(code + 5, "BOLID", i + 4, 6.3 * i, true, new DateTime(1953, 02, 18), "Tsawassen", "Hanari Carnes", "Switzerland", new DateTime(1997, 12, 3), "1029 - 12th Ave. S."));
code += 5;
}
return order;
}
public int? OrderID { get; set; }
public string CustomerID { get; set; }
public int? EmployeeID { get; set; }
public double? Freight { get; set; }
public string ShipCity { get; set; }
public bool Verified { get; set; }
public DateTime OrderDate { get; set; }
public string ShipName { get; set; }
public string ShipCountry { get; set; }
public DateTime ShippedDate { get; set; }
public string ShipAddress { get; set; }
}
}
```
## Retrieving data via Fetch request
We can utilize the **dataSource** property to fetch data from an external source using the Fetch requests and bind it to the ASP.NET MVC DataGrid.
In the following code example, we’ve demonstrated how to fetch data from the server using a Fetch request. Upon successful retrieval, we’ll bind the data to the **dataSource** property within the button click event, using the **onSuccess** event of the Fetch request.
```js
<script>
let button = document.getElementById('btn');
button.addEventListener("click", function (e) {
let fetch= new ej2.base.Fetch("/Home/Getdata", "POST");
fetch.send();
fetch.onSuccess = function (data) {
var grid = document.getElementById('Grid').ej2_instances[0];
grid.dataSource = JSON.parse(data);
};
});
</script>
```
## Performing CRUD actions with Fetch requests
In addition to binding data, you can utilize Fetch requests to handle CRUD (Create, Read, Update, Delete) actions and update your data on the server side. When any grid action is performed, the [actionBegin](https://help.syncfusion.com/cr/aspnetmvc-js2/Syncfusion.EJ2.Grids.Grid.html#Syncfusion_EJ2_Grids_Grid_ActionBegin "actionBegin property of the ASP.NET MVC DataGrid") event is triggered before the action occurs in the grid.
By leveraging the actionBegin event, you can cancel the default CRUD operations by utilizing the **cancel** argument provided by this event. This allows you to call your server-side method dynamically using Fetch and the relevant data received from the actionBegin event to update your server data accordingly.
## Adding a new record with Fetch request
To create a new record using Fetch requests, you can follow these steps:
1. Click on the **Add** icon located in the grid’s toolbar. This action will generate a form within the grid, enabling you to input the necessary details.
2. After entering the details, click the **Update** icon in the toolbar to commit the changes.
3. Throughout this process, the **actionBegin** event is activated. In this event, you can retrieve the **requestType** as **save** and the **action** value as **add** from the argument.
4. Armed with this information, you can cancel the default action and send a Fetch request to execute the add action on the server side.
Refer to the following code example.
```csharp
//Insert the record.
public ActionResult Insert(OrdersDetails value)
{
OrdersDetails.GetAllRecords().Insert(0, value);
return Json(value);
}
```
Now, we are going to call the **Insert** method from the **actionBegin** event through a Fetch call.
```js
<script>
var flag = false;
function actionBegin(e) {
// Initially the flag needs to be false in order to enter this condition.
if (!flag) {
var grid = document.getElementById('Grid').ej2_instances[0];
// Add and edit operations.
if (e.requestType == 'save' && (e.action == 'add')) {
var editedData = e.data;
// The default edit operation is canceled.
e.cancel = true;
// Here, you can send the updated data to your server using a fetch call.
var fetch= new ej.base.Fetch({
url: '/Home/Insert',
type: 'POST',
contentType: 'application/json; charset=utf-8',
data: JSON.stringify({ value: editedData })
});
fetch.onSuccess = (args) => {
// Flag is enabled to skip this execution when grid ends add/edit action.
flag = true;
// The added/edited data will be saved in the Grid.
grid.endEdit();
}
fetch.onFailure = (args) => {
// Add/edit failed.
// The flag is disabled if the operation fails so that it can enter the condition on the next execution.
flag = false;
}
fetch.send();
}
}
}
</script>
```
In the Fetch success event, you can utilize the Grid’s [endEdit](https://ej2.syncfusion.com/documentation/api/grid/#endedit "endEdit property for the ASP.NET MVC - EJ2 Grid") method for adding and editing, and the [deleteRecord](https://ej2.syncfusion.com/documentation/api/grid/#deleterecord "deleteRecord method property for the ASP.NET MVC - EJ2") method to delete the corresponding data in the Grid. However, it’s worth noting that invoking these methods triggers the **actionBegin** event again to save the changes in the DataGrid.
To prevent this behavior and control the execution flow, you can employ a flag variable and manage it within the [actionComplete](https://help.syncfusion.com/cr/aspnetmvc-js2/Syncfusion.EJ2.Grids.Grid.html#Syncfusion_EJ2_Grids_Grid_ActionComplete "actionComplete property of the ASP.NET MVC DataGrid") event and Fetch failure events.
Refer to the following code example.
```js
function actionComplete(e) {
if (e.requestType === 'save' || e.requestType === 'delete') {
// The flag is disabled after the operation is successfully performed so that it can enter the condition on the next execution.
flag = false;
}
}
```
## Updating and saving a record using a Fetch request
To edit and save a record using a Fetch request, follow these steps:
1. Select the desired record in the Grid by clicking or using the **Edit** icon in the toolbar. Alternatively, double-click on a row to initiate the editing process for that specific record.
2. In the edit form, make the necessary modifications to the record’s details.
3. Choose the **Update** icon in the toolbar to save the changes.
4. During this process, the **actionBegin** event is triggered. Within this event, retrieve the **requestType** and **action** values from the argument.
5. Check if the **requestType** is **save** and the **action** is **edit** to identify the specific scenario of editing a record.
6. If the conditions are met, cancel the default action by utilizing the appropriate mechanism provided by your DataGrid library. This ensures that the grid’s default behavior for the edit action is bypassed.
7. Finally, construct a Fetch request to call the update method in the controller.
Refer to the following code example.
```csharp
//Update the record.
public ActionResult Update(OrdersDetails value)
{
var ord = value;
OrdersDetails val = OrdersDetails.GetAllRecords().Where(or => or.OrderID == ord.OrderID).FirstOrDefault();
val.OrderID = ord.OrderID;
val.EmployeeID = ord.EmployeeID;
val.CustomerID = ord.CustomerID;
return Json(value);
}
```
Now, we are going to call the **Update** method from the **actionBegin** event through a Fetch call.
```js
<script>
var flag = false;
function actionBegin(e) {
    // Initially, the flag needs to be false in order to enter this condition.
    if (!flag && e.requestType == 'save' && (e.action == 'edit')) {
        var grid = document.getElementById('Grid').ej2_instances[0];
var editedData = e.data;
// The default edit operation is canceled.
e.cancel = true;
// Here, you can send the updated data to your server using a Fetch call.
var fetch= new ej.base.Fetch ({
url: '/Home/Update',
type: 'POST',
contentType: 'application/json; charset=utf-8',
data: JSON.stringify({ value: editedData })
});
fetch.onSuccess = (args) => {
// Flag is enabled to skip this execution when the DataGrid ends add/edit action.
flag = true;
// The added/edited data will be saved in the Grid.
grid.endEdit();
}
fetch.onFailure = (args) => {
// Add/edit failed.
// The flag is disabled if operation is failed so that it can enter the condition on next execution.
flag = false;
}
fetch.send();
}
}
</script>
```
## Deleting a record using Fetch request
To delete a record using a Fetch request, follow these steps:
1. Select the record you wish to delete in the Grid by clicking on it or using the **Delete** icon in the toolbar.
2. When the record is selected for deletion, the **actionBegin** event is triggered. Within this event, retrieve the **requestType** value from the argument.
3. Check if the **requestType** is **delete** to identify the delete action.
4. If the condition is met, cancel the default action using the appropriate mechanism available in your DataGrid library. This prevents the default behavior of the grid for the delete action.
5. Construct a Fetch request to call the **delete** method in the controller.
6. Configure the Fetch settings, such as the **URL**, **data**, and **success**/**error** handling, based on your specific requirements.
Refer to the following code example.
```csharp
//Delete the record.
public ActionResult Delete(int key)
{
OrdersDetails.GetAllRecords().Remove(OrdersDetails.GetAllRecords().Where(or => or.OrderID == key).FirstOrDefault());
var data = OrdersDetails.GetAllRecords();
return Json(data);
}
```
Now, we are going to call the **Delete** method from the **actionBegin** event through a Fetch call.
```js
<script>
var flag = false;
function actionBegin(e) {
    if (!flag && e.requestType == 'delete') {
        var grid = document.getElementById('Grid').ej2_instances[0];
var editedData = e.data;
// The default delete operation is canceled.
e.cancel = true;
// Here, you can send the deleted data to your server using a Fetch call.
var fetch= new ej.base.Fetch ({
url: '/Home/Delete',
type: 'POST',
contentType: 'application/json; charset=utf-8',
data: JSON.stringify({ key: editedData[0][grid.getPrimaryKeyFieldNames()[0]] })
})
fetch.onSuccess = (args) => {
// Flag is enabled to skip this execution when grid deletes a record.
flag = true;
// The deleted data will be removed from the Grid.
grid.deleteRecord();
}
fetch.onFailure = (args) => {
// Delete failed.
// The flag is disabled if the operation fails so that it can enter the condition on the next execution.
flag = false;
}
fetch.send();
}
}
</script>
```
Refer to the following output image:

![CRUD operations in the Syncfusion ASP.NET MVC DataGrid using Fetch requests](https://www.syncfusion.com/blogs/wp-content/uploads/2024/05/Syncfusion-DataGrid-CRUD-with-Fetch.gif)
## Conclusion
Thanks for reading! In this blog, we’ve explored how to handle CRUD actions in Syncfusion [ASP.NET MVC DataGrid](https://ej2.syncfusion.com/aspnetmvc/documentation/grid/getting-started-mvc "Getting started with ASP.NET MVC DataGrid") efficiently using Fetch requests. By leveraging Fetch, we can dynamically fetch and update data from the server without the need to refresh the entire page.
The new version of Essential Studio is available for download from the [License and Downloads](https://www.syncfusion.com/account/downloads "Essential Studio License and Downloads page") page for existing customers. If you are not a Syncfusion customer, try our 30-day [free trial](https://www.syncfusion.com/downloads "Get free evaluation of the Essential Studio products") to explore our available features.
You can contact us through our [support forum](https://www.syncfusion.com/forums "Syncfusion Support Forum"), [support portal](https://support.syncfusion.com/?_gl=1*1f52z3y*_ga*MTc3NTk2OTU1MC4xNjQ1MjA1MTE2*_ga_WC4JKKPHH0*MTY0ODU1MjUxNi4xMzguMS4xNjQ4NTUyNTM0LjA. "Syncfusion Support Portal"), or [feedback portal](https://www.syncfusion.com/feedback "Syncfusion Feedback Portal"). We are here to help you succeed!
## Related blogs
- [Easy Steps to Migrate an ASP.NET MVC Project to an ASP.NET Core Project (syncfusion.com)](https://www.syncfusion.com/blogs/post/migrate-asp-net-mvc-project-to-asp-net-core "Blog: Easy Steps to Migrate an ASP.NET MVC Project to an ASP.NET Core Project (syncfusion.com)")
- [Easy Steps to Use Dropdown Tree with Organization Structures in ASP.NET MVC Applications | Syncfusion Blogs](https://www.syncfusion.com/blogs/post/easy-steps-to-use-dropdown-tree-with-organization-structures-in-asp-net-mvc-applications "Blog: Easy Steps to Use Dropdown Tree with Organization Structures in ASP.NET MVC Applications | Syncfusion Blogs")
- [How to Migrate SQL Server in an ASP.NET MVC Application to MySQL and PostgreSQL | Syncfusion Blogs](https://www.syncfusion.com/blogs/post/how-to-migrate-sql-server-in-an-asp-net-mvc-application-to-mysql-and-postgresql "Blog: How to Migrate SQL Server in an ASP.NET MVC Application to MySQL and PostgreSQL | Syncfusion Blogs")
- [Shield Your ASP.NET MVC Web Applications with Content Security Policy (CSP) (syncfusion.com)](https://www.syncfusion.com/blogs/post/asp-dotnet-mvc-content-security-policy "Blog: Shield Your ASP.NET MVC Web Applications with Content Security Policy (CSP) (syncfusion.com)") | jollenmoyani |
1,859,689 | Vyper beginner's tutorial: Variables. | In our previous tutorial, we managed to write a very simple smart contract code and explained the... | 0 | 2024-05-23T13:09:51 | https://dev.to/mosesmuwawu/vyper-beginners-tutorial-variables-1eo2 | web3, vyper, smartcontracts, blockchain | In our [previous tutorial](https://dev.to/mosesmuwawu/vyper-beginners-tutorial-syntax-and-structure-3ajk), we managed to write a very simple smart contract code and explained the syntax and structures involved. So now, let's extend our learning to a broader spectrum by going deep into **variables**.
## What are Variables?
In Vyper, all [variables](https://docs.vyperlang.org/en/stable/scoping-and-declarations.html?highlight=public#scoping-and-declarations) must be explicitly declared with a type. The type of a variable determines the kind of data that it can store. For example, you can declare a variable to store an unsigned integer, a string, or a boolean value.
## Variable types
In Vyper, variables can be broadly categorized into two main types based on their scope and storage:
- Local Variables
- State Variables
## Local variables:
These are temporary variables that exist only within the scope of a function. They are not stored on the blockchain and only exist in memory during the execution of the function.
Local variables are declared within functions and the variables are only accessible within the functions they are declared in.
## Examples:
To better understand how variables are declared in Vyper, we shall use a series of examples. If you look closely at the examples provided below, all variables are declared within the function. In the following examples, our aim is to write a smart contract function that adds two numbers:
```python
@external
def addTwo(x: int128, y: int128) -> int128:
result: int128 = x + y
return result
```
To be more concise and idiomatic:
```python
# a simple function to add two numbers x and y
@external
def addTwo(x: int128, y: int128) -> int128:
return x + y
```
We are going to divide the above code into two parts, i.e.:
1. ```python
@external
```
- `@external`: This is a decorator that specifies the visibility of the function. It indicates that the function `addTwo` can be called from outside the contract. In other words, other smart contracts and external users can invoke this function. This decorator ensures that the function is part of the contract's public interface.
2. ```python
def addTwo(x: int128, y: int128) -> int128:
```
- `def`: This keyword is used to define a function in Vyper (similar to Python).
- `addTwo`: This is the name of the function. It is a descriptive name indicating that the function will add two numbers.
- `x: int128`: This declares a parameter x for the function. The type of x is int128, which means it is a signed 128-bit integer. In Vyper, specifying the type of each variable is mandatory to ensure type safety and prevent overflow/underflow errors.
- `int128`: This type can hold values from `-2^127` to `(2^127) - 1`. It is used for variables that need to store integer values within this range.
- `y: int128`: This declares another parameter y for the function. Like x, y is also of type int128.
- `-> int128`: This part specifies the return type of the function. It indicates that the function will return a value of type int128.
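To make the `int128` range above concrete, here is a quick sketch in plain Python (not Vyper) that computes the bounds and checks whether a value would fit; the `fits_int128` helper is just an illustration:

```python
# Plain Python (not Vyper): compute the signed 128-bit integer bounds
# that Vyper's int128 type enforces at compile/run time.
INT128_MIN = -(2 ** 127)
INT128_MAX = 2 ** 127 - 1

def fits_int128(value: int) -> bool:
    """Return True if `value` would fit in Vyper's int128 type."""
    return INT128_MIN <= value <= INT128_MAX

print(fits_int128(42))        # True
print(fits_int128(2 ** 127))  # False: one past the maximum
```

In Vyper itself, an arithmetic result outside this range reverts the transaction rather than silently wrapping around.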
The truth is, smart contracts on Ethereum are designed to be **deterministic** and do not interact with users in the same way traditional programs do. Therefore, to be able to interact with our contract, we need another tool, and that is web3.py. If you don't know how to use the web3.py library, please visit my tutorial [here](https://dev.to/mosesmuwawu/how-to-interact-with-smart-contracts-locally-using-web3py-2896) and it will be helpful.
`my_contract.py` file:
```python
from web3 import Web3
# Connect to a local or remote Ethereum node
web3 = Web3(Web3.HTTPProvider('https://data-seed-prebsc-1-s1.binance.org:8545/')) # or your Infura endpoint, etc.
# ABI of the compiled Vyper contract
abi = [
{
"type": "function",
"name": "addTwo",
"stateMutability": "nonpayable",
"inputs": [
{
"name": "x",
"type": "int128"
},
{
"name": "y",
"type": "int128"
}
],
"outputs": [
{
"name": "",
"type": "int128"
}
]
}
]
# Address of the deployed contract
contract_address = web3.to_checksum_address('YOUR_CONTRACT_ADDRESS') # replace with your contract's address
# Create a contract instance
contract = web3.eth.contract(address=contract_address, abi=abi)
# Call the addTwo function
def call_addtwo():
try:
x = int(input('Enter X value: '))
y = int(input('Enter Y value: '))
result = contract.functions.addTwo(x, y).call()
print(f"The result of {x} + {y} is: {result}")
except ValueError:
print('Invalid input. Please enter valid integers for X and Y.')
except Exception as e:
print(f'Error during addition: {e}')
# Call the function to get user input and add the numbers
call_addtwo()
```
And then, run your file and enter two integers when prompted; you should see output like `The result of 3 + 4 is: 7`.
## State Variables
These are stored on the blockchain. They are part of the contract's state and are persistent across function calls and transactions.
State variables are declared outside of functions, typically at the top of the contract. They are accessible by all functions within the contract and are stored in the contract's storage. Let's have an example to better understand this:
```python
# Define a public variable to store the greeting
greeting: public(String[100])
# Constructor to initialize the greeting
@external
def __init__():
self.greeting = "Hello, World"
# Function to set a new greeting
@external
def set_greeting(new_greeting: String[100]):
self.greeting = new_greeting
# Function to get the current greeting
@external
@view
def get_greeting() -> String[100]:
return self.greeting
```
Here is the detailed explanation of our code:
```python
# Define a public variable to store the greeting
greeting: public(String[100])
```
- `greeting`: This is a state variable defined with the type String[100].
- `State Variable`: This variable is stored on the blockchain, which means its value persists across transactions.
- `public`: The public keyword makes this variable accessible from outside the contract, creating an automatic **getter** function.
- `String[100]`: As already mentioned in previous examples, this specifies that the variable is a string with a maximum length of 100 characters.
```python
@external
def __init__():
self.greeting = "Hello, World"
```
- `__init__()`: This is the constructor function of the contract, called once when the contract is deployed.
- `self.greeting = "Hello, World"`: This initializes the greeting state variable with the string "Hello, World".
```python
@external
def set_greeting(new_greeting: String[100]):
self.greeting = new_greeting
```
- `set_greeting(new_greeting: String[100])`: This function allows the greeting to be updated.
- `new_greeting: String[100]`: The function takes one parameter, new_greeting, which must be a string with a maximum length of 100 characters.
- `self.greeting = new_greeting`: This updates the greeting state variable with the new value provided.
```python
@external
@view
def get_greeting() -> String[100]:
return self.greeting
```
- `get_greeting() -> String[100]`: This function returns the current value of the greeting variable.
- `@view`: This is one of the available but optional decorators. It indicates that the function does not modify the state of the contract (i.e., it is a read-only function).
- `return self.greeting`: This returns the value of the greeting state variable.
I'm confident you now have a solid grasp of how variables work in Vyper. Let's finish with one last part: interacting with our smart contract using the web3.py library.
> Please note that in the following example, we didn't use environment variables for the sake of brevity. However, I recommend keeping secrets out of your source code by storing them in a `.env` file instead.
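If you do go the `.env` route, here is what a minimal loader could look like. This is a hedged stdlib-only sketch: the `PRIVATE_KEY` variable name is my assumption, and in a real project you would normally reach for the `python-dotenv` package rather than rolling your own.

```python
import os

def load_private_key(env_path=".env", var_name="PRIVATE_KEY"):
    """Minimal .env reader: parses KEY=value lines, skipping blanks and
    comments. The process environment wins over the file, mirroring
    python-dotenv's default behaviour. Returns None if the key is unset."""
    if var_name in os.environ:
        return os.environ[var_name]
    if os.path.exists(env_path):
        with open(env_path) as fh:
            for line in fh:
                line = line.strip()
                if line and not line.startswith("#") and "=" in line:
                    key, _, value = line.partition("=")
                    if key.strip() == var_name:
                        return value.strip()
    return None
```

With that in place, the hardcoded `private_key = 'YOUR_PRIVATE_KEY_HERE'` line below could become `private_key = load_private_key()`, and the `.env` file stays out of version control.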
```python
import sys
from web3 import Web3

# Connect to BSC node (Binance Smart Chain)
bsc_node_url = 'https://data-seed-prebsc-1-s1.binance.org:8545/'  # Replace with your BSC node URL
web3 = Web3(Web3.HTTPProvider(bsc_node_url))

# Set the private key directly (for demonstration purposes only, do not hardcode in production)
private_key = 'YOUR_PRIVATE_KEY_HERE'  # Replace with your actual private key
account = web3.eth.account.from_key(private_key)

# Contract ABI
contract_abi = [
    {
        "type": "constructor",
        "stateMutability": "nonpayable",
        "inputs": []
    },
    {
        "type": "function",
        "name": "set_greeting",
        "stateMutability": "nonpayable",
        "inputs": [
            {
                "name": "new_greeting",
                "type": "string"
            }
        ],
        "outputs": []
    },
    {
        "type": "function",
        "name": "get_greeting",
        "stateMutability": "view",
        "inputs": [],
        "outputs": [
            {
                "name": "",
                "type": "string"
            }
        ]
    },
    {
        "type": "function",
        "name": "greeting",
        "stateMutability": "view",
        "inputs": [],
        "outputs": [
            {
                "name": "",
                "type": "string"
            }
        ]
    }
]

# Contract address
contract_address = web3.to_checksum_address('YOUR_CONTRACT_ADDRESS')  # Replace with your contract's address

# Create contract instance
contract = web3.eth.contract(address=contract_address, abi=contract_abi)

# Function to get the current greeting
def get_greeting():
    return contract.functions.get_greeting().call()

# Function to set a new greeting
def set_greeting(new_greeting):
    nonce = web3.eth.get_transaction_count(account.address)
    tx = contract.functions.set_greeting(new_greeting).build_transaction({
        'chainId': 97,  # BSC testnet
        'gas': 3000000,
        'gasPrice': web3.to_wei('5', 'gwei'),
        'nonce': nonce,
    })
    signed_tx = web3.eth.account.sign_transaction(tx, private_key)
    tx_hash = web3.eth.send_raw_transaction(signed_tx.rawTransaction)
    receipt = web3.eth.wait_for_transaction_receipt(tx_hash)
    return receipt

# Main execution
if __name__ == "__main__":
    if len(sys.argv) < 2:
        print("Usage: python3 interact_with_greeting_contract.py <action> [<value>]")
        print("Actions: get, set")
        sys.exit(1)

    action = sys.argv[1]

    if action == "get":
        print("Current greeting:", get_greeting())
    elif action == "set":
        if len(sys.argv) < 3:
            print("Usage: python3 interact_with_greeting_contract.py set <value>")
            sys.exit(1)
        new_greeting = ' '.join(sys.argv[2:])
        receipt = set_greeting(new_greeting)
        print("Transaction receipt:", receipt)
    else:
        print(f"Unknown action: {action}")
        sys.exit(1)
```
## Results

From the result image above, you can see that when we initially ran the get command, we got the expected `Current greeting: Hello, World`.
Then, after running the set command with the string "Good Morning Everyone!", running the get command again returns the updated value: `Current greeting: Good Morning Everyone!`
You can check out my next [tutorial on vyper functions](https://dev.to/mosesmuwawu/mastering-vyper-functionspart1--1144).
If you found this article helpful, please give me a heart, follow and I will be happy for any interactions in the comments section. Thank you!
| mosesmuwawu |
1,862,861 | 3 Reasons You’re Not Closing Deals Like a Pro | You’ve been working hard, putting out content, building your brand, and finally, it happens. A... | 0 | 2024-05-23T13:08:29 | https://dev.to/devdiscove19083/3-reasons-youre-not-closing-deals-like-a-pro-1f0n | sales, career | You’ve been working hard, putting out content, building your brand, and finally, it happens. A prospect who has been following your posts for some time finally reaches out, expressing interest in your product or service. This is the moment you’ve been waiting for, right? The moment when all your efforts pay off.
Excitedly, you do what any eager entrepreneur would do: you immediately send them your prices, list all the features of your product, and even include your payment details.
And then… nothing. The prospect reads your message but never responds. EVER AGAIN.
You’re left scratching your head, wondering what went wrong. How could you lose the sale when they seemed so interested?
Let me tell you exactly what happened.
## 1. You Skipped the Rapport-Building Phase
Here’s the truth: sales aren’t just about transactions; they’re about relationships. When a prospect reaches out to you, it’s crucial to build rapport and establish a connection first. Diving straight into prices and features might seem efficient, but it often backfires.
Think about it this way: would you propose marriage on the first date? No, because that would be rushing things and not taking the time to build a genuine connection. The same principle applies to sales. When you focus solely on the transaction, you miss out on the opportunity to understand your prospect’s needs, desires, and pain points.
Instead of immediately sending prices, steer the conversation toward understanding their situation. Ask questions like:
- What motivated you to reach out to me?
- What challenges are you currently facing?
- How have you tried to address these challenges in the past?
These questions show that you care about their specific needs and that you’re not just looking to make a quick sale. Building rapport isn’t just about being friendly; it’s about showing genuine interest in your prospect’s problems and demonstrating that you have the expertise to solve them.
## 2. You Failed to Position Yourself as an Advisor
As an entrepreneur, you’re not just selling a product or service; you’re selling your expertise. When a prospect reaches out to you, they’re looking for guidance, not just a price list. They want to know that you understand their unique situation and that you can offer a solution tailored to their needs.
Imagine going to a doctor with a health concern, and instead of examining you and asking questions, the doctor just hands you a list of medications. Would you trust that doctor? Probably not. You’d want the doctor to diagnose your issue and recommend the best course of action based on their expertise.
The same applies to sales. When prospects reach out, don’t just present them with options. Instead, dig deeper into their pain points and offer tailored solutions. Position yourself as their trusted advisor, not just a salesperson.
Here’s how you can do this:
- **Understand their problem**: Ask detailed questions to get a clear picture of their situation.
- **Provide insights**: Share your expertise and offer insights that demonstrate your understanding of their industry and challenges.
- **Recommend a solution**: Based on your understanding, recommend the best course of action. Don't just give them choices; give them your expert opinion.
When you position yourself as an advisor, prospects will see you as an expert they can trust, and they’ll be more likely to follow your recommendations.
## 3. You Focused on the Transaction, Not the Relationship
One of the biggest mistakes you can make in sales is focusing too much on the transaction and not enough on the relationship. Sending your prices too early in the conversation can signal to the prospect that you’re only interested in the money, not in helping them solve their problems.
Think about it from the prospect’s perspective. If you don’t take the time to understand their needs and provide value before discussing prices, they might think you’re just in it for the sale. This can make them hesitant to move forward.
Instead, focus on building a relationship and providing value upfront. Here’s how:
- **Delay the price discussion**: Move the conversation away from prices and toward understanding their needs. Provide value first by offering insights and advice.
- **Showcase your expertise**: Share case studies, testimonials, and success stories that demonstrate your ability to solve similar problems.
- **Provide a consultation**: Offer a free consultation or strategy session to help them see the value you can provide before discussing prices.
By focusing on the relationship and providing value upfront, you build trust and demonstrate that you’re genuinely interested in helping them succeed.
## Conclusion: Slow Down to Speed Up
In the fast-paced world of sales, it’s easy to get caught up in the excitement of closing a deal. But remember, sales aren’t just about transactions; they’re about relationships. When you take the time to build rapport, position yourself as an advisor, and focus on the relationship rather than the transaction, you’ll close more deals and build long-term, loyal clients.
So, the next time a prospect reaches out to you, slow down. Take the time to understand their needs, provide value, and build a relationship. Don’t rush the process.
If you need help and clarity on improving your sales approach, don’t hesitate to reach out. I’m here to help you succeed.
Before you go, I need you to do me a quick favor
• Your task is to follow my [Rise & Shine](https://www.youtube.com/channel/UCCGLroX5cJO-deYu2_xOqog) Motivational Channel for daily doses of motivation and inspiration, and strive to become 1% better every day. | devdiscove19083 |