Dataset columns: id (int64), title (string), description (string), collection_id (int64), published_timestamp (timestamp[s]), canonical_url (string), tag_list (string), body_markdown (string), user_username (string)
1,700,430
open re: playtest! playtest! playtest!
Fellow list member Kamran Ayub (who maintains and works on Excalibur.js and runs Keep Track of My...
0
2023-12-24T16:20:52
https://onwriting.games/daily/20231124-1046-open-re-playtest-playtest-playtest/
gamedev, writing
--- title: open re: playtest! playtest! playtest! published: true date: 2023-11-24 00:00:00 UTC tags: [ 'gamedev', 'writing' ] canonical_url: https://onwriting.games/daily/20231124-1046-open-re-playtest-playtest-playtest/ --- Fellow list member Kamran Ayub (who maintains and works on [Excalibur.js](https://excaliburjs.com/) and runs [Keep Track of My Games](https://keeptrackofmygames.com/)) shared an interesting thought about yesterday's email: > One thing it seems like narrative games would benefit from playtesting is for inclusivity. Your game could be played from all types of people from different cultures different to yours, so you may unintentionally introduce language or elements that don't land how you intended. This. **So. Much. This.** Inclusion should be a **priority for us**. It is practically impossible to have a complete view of where we are being non-inclusive, at least not without **many years** of experience in the subject. Clearly, playtesting is a good tool to **identify inclusivity issues**. I'm not going to get into the whole cultural Marxism rant; what matters to me is making the world accessible and inclusive, which is why I was excited about the response. Although Kamran is focusing on content issues (e.g., whether one word or another might be offensive), I believe this is also true in **matters of form**. For instance, a children's book has less text than a book for adults. This is true in video games as well: **the length of the text is greatly reduced**. We will talk about this in an upcoming email. And not to mention those of us with ADHD. There are games I cannot play because of the amount of information that needs to be managed at the same time without getting distracted. This speaks to the importance of playtesting with a **variety of players**. It's clear that this extends to all types of playtesting, not just narrative, but it's good to emphasize it. Thank you, Kamran, for your comment! 
Question: Do you know any games that would have benefited from narrative playtesting of something in particular? Why do you think that?
onwritinggames
1,700,437
frustrating results
Something's been on my mind lately, summarizing some of the things we've been talking about: When...
0
2023-12-24T16:21:19
https://onwriting.games/daily/20231201-2349-frustrating-results/
gamedev, writing
--- title: frustrating results published: true date: 2023-12-01 00:00:00 UTC tags: [ 'gamedev', 'writing' ] canonical_url: https://onwriting.games/daily/20231201-2349-frustrating-results/ --- Something's been on my mind lately, summarizing some of the things we've been talking about: When writing narrative games, we have a unique opportunity to **explore the intricacies of the real world**. We can work complex moral and ethical dilemmas into our stories and create experiences that are thought-provoking and reflective of the nuanced world we live in. Or not. For instance, think about the tough choices some people face in resource allocation in real life, like in healthcare or during a natural disaster. That's something we don't experience daily unless we work in those areas, and many people might find it interesting. Many video games default to violence as the primary conflict. It's the easiest thing to do, obviously. Two opposing sides, war, and that's enough. Many of us are tired of this and are looking for different ways of interpreting conflict. The selection of options opens up the possibility of generating a type of conflict that does not exist in linear media but is very strong in human experience. And I'm not referring so much to **moral dilemmas**, which are great to explore. I'm referring more to the conflict that arises between making a decision and the outcome that follows. This is slightly related to Brian Upton's concepts of the horizon of intention and the horizon of action. The intention the player has when choosing an option and the action (or result) that follows can differ, and that gap is a great source of conflict. Obviously, we need to be careful and subtle with this: it's not about removing the player's agency, but about exploring, as a design goal, the frustration that arises when things turn out to be more complex than one imagines. Have you played any games that do this? I think I have not, but I'd love to...
onwritinggames
1,700,598
Experimenting around FP in PHP
Warn: in this Post, it's tried to use as much typing as possible also if sometimes this means to...
0
2023-12-18T13:23:01
https://dev.to/stefanofago73/experimenting-around-fp-in-php-5ek6
> **Warning**: this post tries to use as much typing as possible, even if that sometimes means taking some distance from the official documentation. The evolution of the PHP language has benefited from some powerful specifications: - the [Callable](https://www.php.net/manual/en/language.types.callable.php) - the [Anonymous Function](https://www.php.net/manual/en/functions.anonymous.php) - the [Closure](https://www.php.net/manual/en/class.closure.php) These are now enriched by newer RFCs: - the [Anonymous Classes](https://www.php.net/manual/en/language.oop5.anonymous.php) - the [Arrow Functions](https://www.php.net/manual/en/functions.arrow.php) - the [First Class Callable Syntax](https://www.php.net/manual/en/functions.first_class_callable_syntax.php) > **Note**: these RFCs make possible many different constructs that go further than what this post covers: remember that the "White Rabbit hole" is deep! ## Callable, Anonymous Function, Closure, and FP! ***Callables*** are generally used as the main type hint for defining callbacks: a summary of how they can be defined and used can be found in the [official documentation](https://www.php.net/manual/en/language.types.callable.php#118032). Introducing the ***__invoke*** method, even at the interface or anonymous-class level, makes it possible to use an object as a function, turning it into a Callable. ```php class Butterfly { public function __invoke():void {echo "flying...";} } $butterfly = new Butterfly(); $butterfly(); // flying... 
``` ***Closure*** is the class used to represent anonymous functions: this kind of construct, also in the form of an ***arrow*** function, is useful not only as a callback in place of a callable, but in many more situations, thanks to its ability to capture a different scope or to simulate FP constructs such as [currying](https://en.wikipedia.org/wiki/Currying), like in the following code: ```php class Person{ private function __construct( public readonly string $name, public readonly string $familyName, public readonly int $age){} public static function create():\Closure { return static fn(string $name):\Closure => static fn(string $familyName):\Closure => static fn(int $age):Person => new Person($name,$familyName,$age); } } $person = Person::create()('Frank')('Jamison')(50); ``` The introduction of ***First-Class Callable Syntax*** makes it possible to create a Closure from any kind of function, defining a sort of ***function reference*** usable in places other than the definition (the following reuses the example seen above): ```php $personBuilder = Person::create(...); $frank = $personBuilder()('Frank')('Jamison')(45); $john = $personBuilder()('John')('Smith')(40); ``` ## Arrays, array functions, and anonymous functions The use of anonymous functions is particularly useful with the native array functions: the potential is notable and can make [loops superfluous](https://szymonkrajewski.pl/why-you-should-use-array-functions-instead-of-basic-loops/), as in the following code. 
```php $stock = [ ['name' => 'Phone', 'qty' => 0], ['name' => 'Notebook', 'qty' => 1], ['name' => 'SDD Drive', 'qty' => 0], ['name' => 'HDD Drive', 'qty' => 3] ]; $inStock = array_filter($stock, static fn($item) => $item['qty'] > 0); ``` Native array functions bring out the functional world most clearly in the ***fold operation*** (the [array_reduce](https://www.php.net/manual/en/function.array-reduce.php) function), which gives us a point on which to build other fundamental functions such as ***map*** and ***filter***: ```php /** * *@template T *@template R *@param array<T> $data *@param callable(T):R $mapper *@return array<R> * */ function map(array $data, callable $mapper):array { return array_reduce($data, /** @param T $element */ function(array $accumulator, $element) use ($mapper) { $accumulator[] = $mapper($element); return $accumulator; }, []); } /** * *@template T *@param array<T> $data *@param callable(T):bool $predicate *@return array<T> * */ function filter(array $data, callable $predicate):array { return array_reduce($data, /** @param T $element */ function(array $accumulator, $element) use ($predicate) { if($predicate($element)){ $accumulator[] = $element; } return $accumulator; }, []); } ``` The improved performance of arrays starting from PHP 7.x, together with the handling of anonymous functions, is convenient from both a performance and a memory-consumption point of view, although the habit of using only arrays sometimes keeps you from considering the benefits of alternative data structures. Anonymous functions and closures are easy to distinguish thanks to the ***use clause***, which allows a closure to capture variables from outside its body. The value of a captured variable is fixed when the closure is declared and cannot be changed (unless it is captured by reference with `&`). ```php class Product{ ... 
} final class PriceRange { private function __construct(private float $minimumValue, private float $maximumValue){} public static function of( float $minimumValue, float $maximumValue): PriceRange{ return new PriceRange($minimumValue,$maximumValue); } public function priceFilter():\Closure{ return fn(Product $value)=> $value->price() >= $this->minimumValue && $value->price() < $this->maximumValue; } } /** @var array<Product> $products */ $products = [ ... ]; $range = PriceRange::of(5.0,10.0); array_filter($products, $range->priceFilter()); ``` Closures are also handy in designs where you want to isolate the visibility of data, always in the light of the FP paradigm, as in the following example. ```php /** @return \Closure():int */ function rndNumber():\Closure{ return fn()=> rand(0,getrandmax()-1); } /** @return \Closure():int */ function rndConstant():\Closure{ $constant = rand(0,getrandmax()-1); return fn()=> $constant; } $rnd1 = rndNumber(); $c1 = rndConstant(); $c2 = rndConstant(); echo $rnd1(); // e.g. 505035335 echo $rnd1(); // e.g. 1353685165 echo $c1(); // e.g. 2030702172 echo $c1(); // same value again: 2030702172 ``` These possibilities are also useful when approaching internal domain-specific languages (***DSLs***) or adopting ***higher-order functions*** (in FP slang): the [Loan](https://www.oreilly.com/library/view/design-patterns-and/9781786463593/56585929-d828-45ce-9a91-648ad0dd4823.xhtml) pattern and the [Execute Around](https://www.dontpanicblog.co.uk/2020/11/28/execute-around-idiom-in-java/) pattern are examples of this concept. ```php // // Example of Execute-Around... // ...also used in DSL definition for nested DSLs // MailSender::Send( fn(Mail $config):Mail=> $config ->From("john.black@kmail.com") ->To("jack.white@jmail.com") ->Subject("Test message") ->Body("Hello World!") ); // // Example of Loan Pattern // class ToyBox{ ... 
/** * * @param \Closure(ToyBox):ToyBox $boxFillerPolicy * @param \Closure(ToyBox):ToyBox $playActions * @return \Closure():void */ public static final function play(\Closure $boxFillerPolicy, \Closure $playActions):\Closure { return function()use($boxFillerPolicy,$playActions):void{ $box = new ToyBox(); $box->Open(); try{ $box = $boxFillerPolicy($box); $box = $playActions($box); }finally{ $box->Close() ->CleanUpToys(); } }; } ... public function nextToy():ToyBox{ ... } public function addToy(string $toyName):ToyBox{ ... } } function Usage():void{ $refill = fn(ToyBox $box):ToyBox => $box ->addToy("lego") ->addToy("mechano") ->addToy("laser"); $play = fn(ToyBox $box):ToyBox => $box ->nextToy() ->nextToy() ->nextToy(); $playSession = ToyBox::play($refill, $play); $playSession(); } ``` ## The Scope element in Closures Closures can refer to scopes different from those in which they were created, thanks to the ***bind*** functions. The possibilities of closures become considerable: they could even replace the use of ***Reflection***, since they can access the private state of objects. This, however, has limitations in the presence of static-analysis tools such as PHPStan and PSALM, which flag *abuses of closures*, as in the following example. ```php class SimpleClass { private int $privateData = 2; } $simpleClosure = function():int {return $this->privateData;}; $instance = new SimpleClass(); $resultClosure = \Closure::bind( $simpleClosure, $instance, SimpleClass::class); echo $resultClosure==false?-1:$resultClosure(); ... // //PSALM Output // // ERROR: InvalidScope - Invalid reference to $this in a // non-class context // INFO: MixedInferredReturnType - Could not verify the return // type 'int' for... // ``` It's nevertheless possible to overcome these problems and appropriately exploit the potential of closures from outside a class. 
A possible trick is to define a ***bridge operation*** (a protected method, also *static* if needed) to use when the Closure is created, as follows: ```php class SimpleClass { private int $privateData = 2; protected final function data():int{ return $this->privateData; } } $instance = new SimpleClass(); // @phpstan-ignore return.unusedType $accessTo = fn (string $bridgeOperation): ?\Closure => \Closure::bind( function(SimpleClass $instance) use ($bridgeOperation): int{ /** @var callable():int $operation */ $operation = [$instance,$bridgeOperation]; return $operation(); }, NULL, SimpleClass::class); /** @var null|\Closure(SimpleClass):int $resultClosure */ $resultClosure = $accessTo("data"); echo ($resultClosure===null?-1:$resultClosure($instance)); // 2 ``` This approach is more verbose, but it can be made reusable, and it passes static analysis thanks to the stronger typing. It's also an example of how we can approach the FP concept of [Optics](https://medium.com/@gcanti/introduction-to-optics-lenses-and-prisms-3230e73bfcfe). ## Conclusion The importance of the specifications presented here lies not only in having introduced types for the different elements but also in the new design possibilities. They have brought PHP even closer to natively supporting Functional Programming: so it's important to study and experiment, also with static analysis enabled! It is worth noticing that with PHP you can approach FP in either a more function-oriented or a more object-oriented way, using generics and advanced techniques, thanks also to PHPStan or PSALM. ## To Go Further More elements deserve consideration that are not covered here, such as the design consequences of the ***read-only*** constructs or the ***Generators***; even if not adopted, it's useful to consider these concepts to enrich our ability to design Clean Software! Other resources allow the adoption of functional idioms/constructs and help in studying the concepts and approaches presented in this post: so... 
Happy learning! - https://www.infoq.com/articles/php7-function-improvements/ - https://github.com/fp4php/functional - https://github.com/haskellcamargo/php-partial-function-application - https://github.com/marcosh/lamphpda - https://github.com/phunkie/phunkie - https://github.com/functional-php - https://github.com/widmogrod/php-functional - https://github.com/loophp/collection - https://leanpub.com/thinking-functionally-in-php - https://www.phparch.com/books/functional-programming-in-php/ - https://link.springer.com/book/10.1007/978-1-4842-2958-3
stefanofago73
1,700,785
Eager loading vs lazy loading
Basically, do you know when you need to study for a test? You don't have much time: eager loading....
0
2023-12-18T03:39:56
https://dev.to/juanpinheirx/eager-loading-vs-lazy-loading-90o
javascript, node, programming, webdev
Basically, do you know when you need to study for a test? If you don't have much time: eager loading. Whatever is in your sight is pretty much all you get. But if you need to study one specific thing: lazy loading. That's when you have time to study specific concepts and, therefore, get specific results. Let's try another example: our solar system. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/6fuqfsacpmd8m1vj7pln.png) Asking a general question about our solar system gets us a general answer. Let's call this concept a macroquery. We get information about the planets (enough to know what they're made of, for example). Now let's say we want more information about Mars. Let's call this inquiry a microquery. Mars has its own specific details, just as Earth does. When we want nothing in particular from a query: eager loading (macroquery). When we want specifics: lazy loading (microquery). Think about atoms and how they were discovered well after stars. In terms of mass the two are completely different; however, they share the same matter. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/4g6ed0cxvxwfwguabjg7.png) Doesn't it look like a planet with its moons? Let's talk a bit more about space and the difference between being on a planet, watching it through a telescope, and looking up at the sky. We have more information about Mars now than before, thanks to exploration technologies. We can call this action of mankind toward knowledge a microquery: we want to know the details of Mars. Now imagine you wake up in some remote period of ancient history, where all you can do to learn about space is look up. Seeing little stars with different colors, or constellations, gives you coordinates or some kind of information: the North Star, whether you're in the western or eastern part of the Earth, and so on. We'll call that a macroquery. We have little to work with, and much to do if we want specifics. 
Now imagine you woke up next to Copernicus, looking at the stars. You could see with your own eyes the studies that led to our general answer today: the Sun does not revolve around the Earth. Is this a micro or a macroquery? The answer for us is: _it depends._ If you need specific data from your database: lazy loading. Remember the microquery. Otherwise, use eager loading and remember the macroquery.
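The analogy maps back to code roughly like this. Below is a minimal sketch with a fake in-memory database; the function names are illustrative, not from any particular ORM (real ones such as Sequelize or TypeORM expose eager vs. lazy loading through query options and relation settings):

```typescript
// Fake "database" of planet details, keyed by planet name.
const planetDetails: Record<string, string> = {
  Mars: "Fourth planet, two moons: Phobos and Deimos.",
  Earth: "Third planet, one moon.",
};

// Eager loading (macroquery): materialize everything up front,
// whether or not the caller ends up needing all of it.
function loadSolarSystemEager(): Record<string, string> {
  return { ...planetDetails };
}

// Lazy loading (microquery): return a handle; a planet's details
// are fetched only when (and if) that planet is asked for.
function loadSolarSystemLazy(): (name: string) => string | undefined {
  return (name) => planetDetails[name];
}

const eager = loadSolarSystemEager(); // everything is already loaded
const lazy = loadSolarSystemLazy();   // nothing fetched yet
console.log(eager["Mars"]);           // available immediately
console.log(lazy("Mars"));            // fetched at call time
```

The trade-off is exactly the one from the study analogy: eager loading pays the full cost once, up front; lazy loading defers the cost until a specific detail is actually needed.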
juanpinheirx
1,700,923
On School: Top defensemen to view this period
A few of the most effective defensemen in the NHL are graduates of college hockey. Cale Makar of the...
0
2023-12-18T07:14:28
https://dev.to/brito11/on-school-top-defensemen-to-view-this-period-11og
A few of the most effective defensemen in the NHL are graduates of college hockey. Cale Makar of the Colorado Avalanche won the Norris Trophy in 2021-22, one year after Adam Fox of the New York Rangers won it. Look at the NHL's top scorers and, together with Makar, you'll see Quinn Hughes of the Vancouver Canucks, who leads the League with 37 points and went to the University of Michigan. The list goes on, including Charlie McAvoy of the Boston Bruins, Brandon Montour of the Florida Panthers, Jaccob Slavin of the Carolina Hurricanes and Devon Toews of the Avalanche. Luke Hughes of the New Jersey Devils, the younger brother of Quinn, is among the early favorites for the Calder Trophy as the League's best rookie. And many more are on the way. It won't be long before the next wave of college standouts moves on to the NHL. Here are 10 of the top NCAA defensemen to watch this season. Sean Behrens, University of Denver: Selected by the Colorado Avalanche in the second round of the 2021 NHL Draft, the skilled puck-mover was part of Denver's 2022 championship team. He has 14 points in 14 games this season. The 20-year-old junior played for the United States at the 2023 World Junior Championship, where he had three points to help them win a bronze medal. Zeev Buium, University of Denver: A 17-year-old freshman and a remarkable skater, Buium is rated by NHL Central Scouting as an A prospect for the 2024 NHL Draft. He plays with Behrens on Denver's top pair. He earned National Collegiate Hockey Conference Defenseman of the Week honors after scoring a goal in a 5-0 win on Nov. 24 and recording four assists in a 9-0 win on Nov. 25 as Denver swept Yale University. Buium has 18 points and is plus-16 in 14 games this season, with 14 points in his previous eight games. 
His brother, Shai, who also plays at Denver, was picked by the Detroit Red Wings in the second round of the 2021 draft. Seamus Casey, University of Michigan: After an exceptional freshman year with 29 points in 36 games, Casey is off to a fast start as a sophomore. He is the top-scoring defenseman in the nation and is tied for third among all players with 22 points in 16 games. The 19-year-old was picked by the New Jersey Devils in the second round of the 2022 NHL Draft. Ryan Chesley, University of Minnesota: Selected by the Washington Capitals in the second round of the 2022 draft, Chesley competes hard, a trait that figures to earn him a spot on the United States roster at the 2024 World Junior Championship for the second straight year. The 19-year-old has five points in 14 games this season. Lane Hutson, Boston University: The 19-year-old was a Hobey Baker Award finalist and led Hockey East in scoring as a freshman last season with 48 points in 39 games. This season, he has 16 points in 13 games. Chosen by the Montreal Canadiens in the second round of the 2022 draft, he figures to be a key player for the United States at the World Junior Championship and possesses a high hockey IQ. Artyom Levshunov, Michigan State University: After a solid season with Green Bay of the USHL last year, Levshunov has been a standout in his first college season and is expected to be a high pick in the 2024 draft. The 18-year-old has 15 points in 16 games this season. Scott Morrow, University of Massachusetts: Selected by the Carolina Hurricanes in the second round of the 2021 draft, Morrow led his team in scoring last season and is doing it again in his junior year with 15 points in 13 games. The 21-year-old had 31 points in 35 games last season.
brito11
1,700,947
Skyflow's privacy vault for building LLMs
Skyflow's Privacy Vault empowers organizations to build and deploy LLMs responsibly, ensuring data privacy and compliance throughout the entire LLM lifecycle.
21,488
2023-12-18T07:31:25
https://codingcat.dev/podcast/3-24-skyflow-privacy-vault
webdev, javascript, beginners, podcast
Original: https://codingcat.dev/podcast/3-24-skyflow-privacy-vault {% youtube https://youtu.be/f_gNOK8cpwI %} In the rapidly evolving realm of artificial intelligence, large language models (LLMs) have emerged as powerful tools for natural language processing and generation. These models, trained on massive datasets of text and code, have demonstrated remarkable capabilities in tasks such as machine translation, text summarization, and creative writing. However, the development and utilization of LLMs raise significant privacy concerns, particularly with regard to the handling of sensitive personal information. Skyflow's Privacy Vault offers a groundbreaking solution to address these concerns, enabling organizations to build and deploy LLMs while upholding the highest standards of data privacy and security. This innovative privacy vault provides a secure environment for sensitive data, ensuring that it remains protected throughout the entire LLM lifecycle, from data collection and preparation to model training and deployment. ## Safeguarding Sensitive Data Throughout the LLM Lifecycle The LLM lifecycle encompasses various stages, each presenting unique data privacy challenges. Skyflow's Privacy Vault effectively addresses these challenges, ensuring that sensitive data is safeguarded at every step. 1. **Data Collection and Preparation:** During data collection, Skyflow's Privacy Vault allows organizations to identify and redact sensitive data before it is used for training or inference. This process helps prevent the inadvertent exposure of personally identifiable information (PII) or other sensitive information. 2. **Model Training:** The Privacy Vault maintains its protection during model training, ensuring that sensitive data remains encrypted and inaccessible to unauthorized parties. This encryption safeguards sensitive data from potential breaches or unauthorized access during the training process. 3. 
**Model Deployment and Inference:** When deployed for inference, LLMs interact with user-provided data. Skyflow's Privacy Vault extends its protection to this stage, ensuring that sensitive data is redacted or anonymized before being exposed to the LLM. This protection prevents the LLM from learning or disclosing sensitive information during inference. ## Enhancing Data Privacy with Granular Controls and Compliance Skyflow's Privacy Vault goes beyond basic data protection by providing organizations with granular controls over data access and usage. These controls enable organizations to define who can access sensitive data and for what purposes, ensuring that sensitive information is only used for authorized purposes. Furthermore, the Privacy Vault facilitates compliance with various data privacy regulations, such as the General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA). The vault's data residency capabilities ensure that sensitive data remains within the specified geographic regions, complying with data localization requirements. ## Unlocking the Potential of LLMs with Confidence Skyflow's Privacy Vault empowers organizations to harness the power of LLMs without compromising data privacy. By providing comprehensive data protection and compliance capabilities, the Privacy Vault enables organizations to build and deploy LLMs responsibly, fostering trust and transparency among stakeholders. 
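To make the redaction step concrete, here's a minimal sketch of the general technique (plain TypeScript; the patterns and names are illustrative only and are not Skyflow's actual API — real detection pipelines are far more sophisticated):

```typescript
// Replace matches of simple PII patterns with typed placeholder tokens
// before the text ever reaches a training set or an LLM prompt.
const PII_PATTERNS: Record<string, RegExp> = {
  EMAIL: /[\w.+-]+@[\w-]+\.[\w.]+/g,
  SSN: /\b\d{3}-\d{2}-\d{4}\b/g,
  PHONE: /\b\d{3}[-.]\d{3}[-.]\d{4}\b/g,
};

function redact(text: string): string {
  let result = text;
  for (const [label, pattern] of Object.entries(PII_PATTERNS)) {
    result = result.replace(pattern, `[${label}]`);
  }
  return result;
}

const sample = "Contact jane.doe@example.com or 555-867-5309, SSN 123-45-6789.";
console.log(redact(sample));
// Contact [EMAIL] or [PHONE], SSN [SSN].
```

A vault-style system goes further: instead of discarding the original values, it stores them encrypted and keyed to the tokens, so authorized workflows can later re-identify them under granular access controls.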
**Key Benefits of Skyflow's Privacy Vault** * **Protect sensitive data throughout the LLM lifecycle** * **Implement granular access controls for sensitive data** * **Comply with data privacy regulations, such as GDPR and CCPA** * **Maintain data residency in specified geographic regions** * **Build and deploy LLMs responsibly and ethically** ## Conclusion As LLMs continue to revolutionize various industries, Skyflow's Privacy Vault plays a crucial role in ensuring that these powerful models are developed and deployed in a privacy-conscious manner. By safeguarding sensitive data and enabling compliance with data privacy regulations, the Privacy Vault empowers organizations to leverage the full potential of LLMs while upholding the highest standards of data protection.
codercatdev
1,701,077
Simplify and Stabilize Your Playwright Locators
Who controls the locators, controls the web. Much of Playwright's power comes from its ability to...
25,769
2023-12-19T12:00:00
https://www.browsercat.com/post/strengthen-selectors-and-locators-in-playwright?canonical=2
playwright, javascript, automation, testing
Who controls the locators, controls the web. Much of Playwright's power comes from its ability to target and interact with elements on a webpage. But as you well know, the web is a finicky place. Elements come and go, and the slightest change in HTML can break your automation scripts. In this article, we'll explore strategies for creating simple and stable Playwright locators that can withstand the test of time and changes in web applications. We'll also look at how to optimize your selectors for speed and efficiency. For a more in-depth look at these concepts, check out my complete deep dive on [optimizing Playwright locators](https://www.browsercat.com/post/strengthen-selectors-and-locators-in-playwright). Let's go! ## Optimize Selector Specificity Refining selector specificity involves finding an ideal equilibrium: you need selectors that are precise enough to unambiguously identify target elements across different page states, yet not so intricate that they shatter with the slightest DOM adjustments. Steer clear of overly rigid selectors; such fragility means that a trivial HTML modification could render them useless. On the flip side, overly loose selectors might lead you down a path of position-dependent logic (like indices) or unwarranted reliance on the web page's current implementation. We're looking for the Goldilocks zone of search queries. 
Here are a few anti-patterns and how to fix them: ```ts // 🔴 BAD: Selectors based on fragile structural assumptions const $link = page.locator('li:nth-child(3) > a'); // 🟢 GOOD: Selectors grounded in distinguishing features const $link = page.locator('a[href*="privacy"]'); // 🔴 BAD: Selectors anchored to mutable text const $button = page.locator('button', {hasText: 'Sign up'}); // 🟢 GOOD: Robust, pattern-based selectors const $button = page.locator('button', { hasText: /(sign|start|subscribe|launch)/i, }); // 🔴 BAD: Selectors tied to changeable style-specific classes const $input = page.locator('input.email'); // 🟢 GOOD: Selectors focused on functional attributes const $input = page.locator('input[type="email"]'); // 🔴 BAD: Excessively specific hierarchical selectors const $list = page.locator('footer > ul.inline-links > li'); // 🟢 GOOD: Shallow, adaptive nesting const $list = page.locator('footer:last-of-type li', { has: page.locator('a'), }); ``` To sum up: - Prioritize attributes that capture the essence of the target element, rather than arbitrary qualities. - If your target elements have no stable defining characteristics, anchor your queries to parents or children who do have such qualities. - Keep chain selectors short, and try to only use stable elements within them. - Leverage regular expressions for text matching to accommodate variations. ## Prefer Semantic Locator Methods When it comes to selecting elements, it's always better to prioritize semantic attributes over stylistic or functional attributes. While an element's classes or its location in the DOM are typically the easiest to target, they're also the most prone to change. On the other hand, semantic attributes (such as role, label, or title) change infrequently. And when they do, they tend to change in ways that can be accounted for ahead of time. Given the benefit of targeting semantic attributes, Playwright provides utility methods for directly accessing them. Use them whenever possible. 
Here's why: - Semantic methods encourage best practices. As you'll see in the passages that follow, the most durable selector patterns have helper methods. - Semantic methods are typed, improving your IDE experience and alerting you to errors. Plain selector strings can only be debugged at runtime, resulting in more bugs. - Semantic methods are chainable. This dramatically simplifies the queries themselves. It also results in better error messages when queries fail. Let's run through each method that's currently available: ```ts // Target Test IDs when you have control of the HTML // E.G. `<section data-testid="delete-modal" />` const $modal = page.getByTestId('delete-modal'); // Target explicit or implied element roles // "button" matches `<button>`, `<input type="button">`, or `<div aria-role="button">` const $button = page.getByRole('button', {hasText: 'Buy'}); // Target text HTML attributes that are unlikely to change const $input = page.getByLabel('Email'); const $search = page.getByPlaceholder(/^search/i); const $image = page.getByAltText('Profile Picture'); const $icon = page.getByTitle('Info', {exact: false}); // Or target elements by `innerText`, when you can be sure it's stable const $dialog = page.getByText(/^confirm/i); ``` I cannot stress enough how valuable it is to target based on `role` and `data-testid` in particular. When you have control of the DOM, `data-testid` is a fantastic [convention](/glossary/data-testid) to implement across dev teams. Unlike every other HTML attribute, there is never any reason for `data-testid` to change, making it the least brittle of all possible selectors. That said, given that you won't always have control over the HTML of target pages, `role` is an excellent fallback. As shown in the example code above, `role` is an inherently forgiving selector. It can continue working even across substantial DOM changes. ## Chain Locators, Not Selectors As described above, it's best to avoid long selector query strings. 
They're inherently difficult to debug, and they result in less descriptive error messages.

Of course, you're not always going to be able to avoid chaining. Sometimes, you need to target particularly evasive DOM nodes. More often, you simply want to break a page up into subtrees and drill down from an intermediate HTML element. (For example, a list of articles in a blog feed.)

Here are two strategies for designing locators that are both easy to read and easy to debug...

### Drill Down Within Subtrees

Chaining locators is like adding layers to a sketch; each additional stroke refines the image. Begin with a broad locator and use methods like `.filter()`, `.first()`, `.last()`, and conditional parameters to progressively narrow down to your target element.

```ts
// Find the first article about BrowserCat
const $articles = page.locator('article');
const $aboutBrowserCat = $articles.filter({
  hasText: /BrowserCat/i,
});
const $firstArticle = $aboutBrowserCat.first();
```

This approach separates concerns, ties the locators to logical entities, and keeps the code readable and adaptable. If your layout changes, you might only need to adjust the parent locator instead of unraveling multiple complex strings.

### Filter by Content with `{has}` and `{hasNot}`

Sometimes you need to select an element not only by its properties but also by its relation to others. With the `{has}` and `{hasNot}` parameters, you can define these relationships clearly, creating a robust context for your selectors.

```ts
// Find the first article with an image
const $articles = page.locator('article');
const $withImages = $articles.filter({
  has: page.locator('img'),
});
const $firstArticle = $withImages.first();
```

These parameters act as assertions about the presence of certain elements within a parent. This increases the number of stable, unique attributes you can leverage in creating good query patterns.
## Next steps

We've touched on several key practices for creating simple and stable Playwright locators that can withstand the test of time and changes in web applications.

I go much deeper on performance, error handling, and refactoring your selectors in my [Playwright locators deep-dive](https://www.browsercat.com/post/strengthen-selectors-and-locators-in-playwright). Check it out.

Until next time, happy automating!
mikestopcontinues
1,701,183
Unit test in Laravel by example
Testing is fun! Don't take my word for it. You should probably try it yourself. To get familiar with...
0
2023-12-18T11:21:41
https://dev.to/amirsabahi/learn-unit-test-in-laravel-by-example-1k5p
laravel, php, unittest, filesystem
Testing is fun! Don't take my word for it. You should probably try it yourself.

To get familiar with testing, or even get better at writing unit tests, you can learn from prominent open-source projects. Just open the project and go to the test folder. You might see things like feature tests, integration tests, or unit tests. Choose a unit test and go through the test files.

Here, we'll go through the Filesystem unit tests in the Laravel framework. You can find them at [tests/Filesystem/FilesystemTest.php](https://github.com/laravel/framework/blob/10.x/tests/Filesystem/FilesystemTest.php).

## Setting the Stage

Before delving into the details of the unit test, let's set the stage by briefly introducing the Illuminate Filesystem component. This component provides a unified interface for interacting with file systems, allowing developers to perform tasks such as reading, writing, and manipulating files and directories.

## Test Class Overview

The unit test is written in PHP using the PHPUnit testing framework. It is part of the `Illuminate\Tests\Filesystem` namespace and extends the `PHPUnit\Framework\TestCase` class. The test class covers a wide range of scenarios, including file retrieval, storage, line operations, permissions, directory manipulation, and more.

### Mocks

The test file uses the Mockery framework. Sometimes we do not really want to call a piece of code, or send real emails or SMS. For instance, in file system testing, we do not want to create or delete a real file. Instead, we use a mock object to call the delete method.

A mock is a stand-in for a real object. It pretends to be the real thing in a controlled way, allowing developers to test specific parts of their code independently.
Take a look at the code below:

```php
$files = m::mock(Filesystem::class)->makePartial();
$files->shouldReceive('deleteDirectory')->once()->andReturn(false);
```

We use Mockery to create a partial test double* and fake the call to `deleteDirectory`. (We do not actually call it, and we do not delete anything; rather, we return what we expect the `deleteDirectory` method to return.)

Now open the code and walk through it along with the following explanations.

### Setting up the Temporary Directory

The test class employs a temporary directory for creating and testing files. The `setupTempDir()` method, annotated with `@beforeClass`, creates a temporary directory for testing, while `tearDownTempDir()` (annotated with `@afterClass`) cleans up the temporary directory after all tests have been executed.

![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/yz9invrmfrqsr4yslkep.jpeg)

### Filesystem Initialization

In various test methods, an instance of the `Filesystem` class is created to simulate real-world usage. This includes scenarios where files need to be created, modified, or deleted.

### Testing File Retrieval

The `testGetRetrievesFiles` method ensures that the `get` method retrieves the correct content from a file, asserting that the retrieved content matches the expected value.

![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/69af3h7nfb8xh0dhsc4z.jpeg)

### Testing File Storage

The `testPutStoresFiles` method examines the `put` method's ability to store content in a file, with an assertion to validate that the file content matches the expected value.

![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/kmdgksv130hwaf1yikfg.jpeg)

### Lazy Collection for Lines

The `testLines` method demonstrates the use of `LazyCollection` for working with lines in a file. It ensures that lines are read correctly and returned as a `LazyCollection`.
![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/nqyhmhbw8lgxgiuta58v.jpeg)

### File Replacement Operations

The `testReplaceCreatesFile` and `testReplaceInFileCorrectlyReplaces` methods test the `replace` method, both for creating a file and for replacing content within an existing file.

![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/c5m9hhpd096ikvc41ygk.jpeg)

### Testing File Permissions

The `testSetChmod` and `testGetChmod` methods focus on testing the `chmod` method for setting and retrieving file permissions.

![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/7patppsyse2l1w8scz52.jpeg)

### Directory and File Deletion

The `testDeleteRemovesFiles` method checks the `delete` method's ability to remove both single files and arrays of files.

![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/q4vf40bj8o9yp6clu19o.jpeg)

### Prepending to Files

The `testPrependExistingFiles` and `testPrependNewFiles` methods examine the `prepend` method's functionality for adding content to the beginning of a file.

![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/vdksiz0ck1nq66y4caci.jpeg)

### File Existence Checks

The `testMissingFile` method checks the `missing` method for detecting the absence of a file.

![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/gw5fkvkhbrjnm6y2bpct.jpeg)

### Directory Manipulation

Methods like `testDeleteDirectory`, `testDeleteDirectoryReturnFalseWhenNotADirectory`, and `testCleanDirectory` focus on testing directory-related operations.

![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/hciklp64vqwtits10rxj.jpeg)

### Macro Functionality

The `testMacro` method demonstrates the ability to add macros dynamically to the `Filesystem` class.
![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/hk0aps6q6tx0le7ccn8v.jpeg)

### Additional File and Directory Operations

The `testFilesMethod`, `testCopyDirectory`, `testMoveDirectory`, and other methods cover a range of file and directory operations, ensuring correct behavior in various scenarios.

### Exception Handling

Methods like `testGetThrowsExceptionNonexisitingFile` and `testGetRequireThrowsExceptionNonExistingFile` test the handling of `FileNotFoundException`s.

![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/4u6tknqf7w2b7udfa23l.jpeg)

I hope this post helps you take your testing skills even further.

> \*Test Doubles: In computer programming and computer science, programmers employ a technique called automated unit testing to reduce the likelihood of bugs occurring in the software. Frequently, the final release software consists of a complex set of objects or procedures interacting together to create the final result. In automated unit testing, it may be necessary to use objects or procedures that look and behave like their release-intended counterparts but are simplified versions that reduce the complexity and facilitate testing. A test double is a generic term used for these objects or procedures.
> — Wikipedia
amirsabahi
1,701,533
Modern GL + Google Colab = triangle animation
Rendered not in my PC image: *i mean not this gif here in your browser but in colab cloud? Link to...
0
2023-12-18T16:54:50
https://dev.to/fakelaboratory/modern-gl-google-colab-triangle-animation-38bc
moderngl, colab, python
Rendered, but not on my PC:

![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/du0g30ahrv2x5vgqizo1.gif)

*I mean, not this gif here in your browser, but in the Colab cloud.

Link to Colab: https://colab.research.google.com/drive/10Ig1Nqbwqd1lF7HAF0rJuxpZpThgjHcB?usp=sharing

This notebook is restored from an old (2018) Google notebook, with some changes in the code. Links to references are in the comments of the notebook.

```python
!pip install moderngl
!pip install moviepy

import moderngl as gl
import numpy as np
from PIL import Image
import moviepy.editor as mpy

ctx = gl.create_context(standalone=True, backend='egl')
print(ctx.info)
```

```python
prog = ctx.program(
    vertex_shader="""
        #version 330
        in vec2 in_vert;
        in vec3 in_color;
        out vec3 v_color;
        void main() {
            v_color = in_color;
            gl_Position = vec4(in_vert, 0.0, 1.0);
        }
    """,
    fragment_shader="""
        #version 330
        in vec3 v_color;
        out vec3 f_color;
        void main() {
            f_color = v_color;
        }
    """,
)

vertices = np.asarray([
    -0.75, -0.75,  1, 0, 0,
     0.75, -0.75,  0, 1, 0,
     0.0,   0.649, 0, 0, 1,
], dtype='f4')
```

```python
def render_frame(time):
    vbo = ctx.buffer(vertices.tobytes())
    vao = ctx.vertex_array(prog, vbo, "in_vert", "in_color")
    fbo = ctx.framebuffer(
        color_attachments=[ctx.texture((512, 512), 3)]
    )
    fbo.use()
    fbo.clear(0.0 + time, 0.0 + time, 0.0 + time, 1.0)
    vao.render()
    return np.array(Image.frombytes(
        "RGB", fbo.size, fbo.color_attachments[0].read(),
        "raw", "RGB", 0, -1
    ))
```

```python
clip = mpy.VideoClip(render_frame, duration=2)  # 2 seconds
clip.write_gif("anim.gif", fps=15)  # anim.gif now appears in Files
```

---

_May the new year bless you with health, wealth, and happiness._

— from "65 Happy New Year Wishes for Friends and Family 2024", a Country Living article
fakelaboratory
1,701,941
What is Event Bubbling and how we can handle this?
Hi Devs, Have you all heard about Event Bubbling? Recently, someone asked me if I knew about this...
0
2023-12-18T20:46:25
https://dev.to/jaelynlee/what-is-event-bubbling-and-how-we-can-handle-this-501c
webdev, programming, beginners, discuss
Hi Devs,

Have you all heard about **Event Bubbling**? Recently, someone asked me if I knew about this concept, and I had no idea what it was! They were also asked about it in an interview and couldn't answer. So, I decided to research and summarize the definition, as well as how we can use it in real code. Shall we start our deep dive into it?

## What is Event Bubbling?

**Event Bubbling** is a concept in the DOM (Document Object Model). It occurs when an element receives an event, and that event bubbles up to its parent and ancestor elements in the DOM tree until it reaches the root element.

![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ihicya37b5cou18huvv8.png)

As you can see from this diagram, if the `button` is clicked, the event will propagate until it reaches the root element. Event bubbling is the default behavior for events but can be prevented in certain cases.

Let me give you some examples to help you understand better. Of course, I am too lazy to write code, so I asked ChatGPT to generate it for me, and it did a great job, to be honest! So, just like the diagram above, the elements are stacked on top of each other like a set of Russian dolls.

![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/9tnd4h0bn3jc1xs4n2w2.png)

![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/z8gtfgib2uzbrcbau01d.png)

By clicking the `Inner Button`, the handlers on `innerButton`, `middleDiv`, and `outerDiv` will execute sequentially. And this is event bubbling! A very straightforward concept, right?

## Importance of understanding Event bubbling in Web development

Think of event bubbling like throwing a pebble into a pond. When you toss the pebble, ripples spread outward from where it landed. Event bubbling works similarly on web pages.
When something happens, like a click on a button, that "event" ripples through the different parts of a webpage, starting from where it happened and moving up to the main part of the page.

1. **Controlling how things flow**: Just like you can predict where ripples go in a pond, understanding event bubbling lets developers decide how things react when you click or interact with something on a webpage.
2. **Smart handling for many things at once**: Imagine if you could clap once and have everyone in a room who heard it react in a certain way. Event bubbling lets developers set up reactions for many things (like buttons) all at once, without having to do it for each one individually.
3. **Building pieces that work everywhere**: It's like creating LEGO pieces that fit together perfectly. With event bubbling, developers can make parts (components) that know how to handle their own events. Then, they can use these pieces in different parts of a website without having to rebuild them each time.
4. **Making websites fun to use**: Event bubbling helps in creating web pages that respond when you do something, like clicking a button or dragging an item. It's what makes websites feel interactive and enjoyable to use.

In short, understanding event bubbling helps developers control how events move around a webpage, handle lots of things efficiently, make reusable parts, and create fun and interactive websites.

## How can we handle Event bubbling?

What if you want to stop this from happening? There are several ways to handle event bubbling, but I am going to explain two methods today.

1. **`stopPropagation()`**: The `stopPropagation()` method prevents further propagation of the current event. However, if an element has multiple event handlers for a single event, then even if one of them stops the bubbling, the other ones still execute. In other words, `event.stopPropagation()` stops the upward movement, but all other handlers on the current element will still run.
2.
**`preventDefault()`**: If you want to stop the browser's default behavior for an event, you should use the `preventDefault()` method.

The `preventDefault()` method is used to stop a specific default action from happening when an event occurs. Normally, when you interact with something on a webpage (like clicking a link), the browser does something automatically (like navigating to a new page). But with `preventDefault()`, you can tell the browser not to do that automatic action. It's like saying, "Hey browser, don't do the default thing, I want to do something else instead."

Just keep in mind that using `preventDefault()` should be done carefully and only when necessary, because it can change how a webpage or application behaves.

Let's have a look at a simple scenario to better understand.

![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/2w27app90i83s05e6axm.png)

There are three functions here to illustrate the event-handling methods. So what do you think will happen if the button is clicked? Because the button stops the propagation from going up to the parent elements, the alert on the outer container will not be executed.

What about `preventDefault()`? It will prevent us from navigating to the link, while the rest of the handler still executes.

The difference between `e.preventDefault()` and `e.stopPropagation()` is that `e.preventDefault()` is used to stop a specific default action from happening when an event occurs, such as preventing a link from navigating to a new page. On the other hand, `e.stopPropagation()` stops the propagation of the event to parent and ancestor elements, while other event handlers on the current element still execute.

I hope this post was helpful for your coding. Happy coding, everyone!
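Bonus: bubbling is easy to model outside the browser, too. Below is a minimal, DOM-free sketch in TypeScript — the `Node` class, node names, and handler wiring are made up for illustration, not the real DOM API — that simulates how an event travels from the "clicked" element up through its ancestors, and how a `stopPropagation`-style flag halts the climb:

```typescript
// Toy simulation of event bubbling (NOT the real DOM API).
type Handler = (event: { stopped: boolean; stop: () => void }) => void;

class Node {
  private handlers: Handler[] = [];
  constructor(public name: string, public parent: Node | null = null) {}

  on(handler: Handler): void {
    this.handlers.push(handler);
  }

  // Dispatch an event on this node, then bubble it up the parent chain.
  // Returns the names of the nodes the event visited, in order.
  dispatch(): string[] {
    const visited: string[] = [];
    const event = { stopped: false, stop() { this.stopped = true; } };
    let current: Node | null = this;
    while (current !== null) {
      // All handlers on the current node run, even if one calls stop()...
      for (const handler of current.handlers) handler(event);
      visited.push(current.name);
      if (event.stopped) break; // ...but bubbling to the parent halts.
      current = current.parent;
    }
    return visited;
  }
}

// Build the "Russian dolls": outerDiv > middleDiv > innerButton
const outerDiv = new Node("outerDiv");
const middleDiv = new Node("middleDiv", outerDiv);
const innerButton = new Node("innerButton", middleDiv);

console.log(innerButton.dispatch()); // visits innerButton → middleDiv → outerDiv

// Now stop the bubbling at middleDiv, like e.stopPropagation() would:
middleDiv.on((e) => e.stop());
console.log(innerButton.dispatch()); // visits innerButton → middleDiv, then stops
```

The second dispatch never reaches `outerDiv`, which mirrors why the outer container's alert in the screenshot above never fires.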
jaelynlee
1,702,071
Discuss the current job market?
Yes, I have also experienced job loss like you. We can indeed discuss the current job market and...
0
2023-12-19T02:34:32
https://dev.to/jialudev/discuss-the-current-job-market-58g
discuss
Yes, I have also experienced job loss like you. We can indeed discuss the current job market and explore the new roles that are in high demand that a programmer can transition into.
jialudev
1,702,125
The Evolving Role of Property Managers in the Modern Real Estate Market
In an ever-changing real estate landscape, the role of property managers has evolved significantly to...
0
2023-12-19T05:00:50
https://dev.to/acceleratemarketing03/the-evolving-role-of-property-managers-in-the-modern-real-estate-market-jii
In an ever-changing real estate landscape, the role of property managers has evolved significantly to meet the dynamic needs of both landlords and tenants. As technology, demographics, and market dynamics reshape the way we approach property management, professionals in this field are embracing new strategies and adopting innovative tools to stay ahead. Let's delve into the key aspects of the evolving role of property managers in the modern real estate market.

![best property management](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/0mt7owycexcu37kwxkd3.png)

## Embracing Technological Advancements

Gone are the days of manual paperwork and traditional communication methods. Modern property managers leverage cutting-edge technology to streamline their operations. From property management software that automates tasks like rent collection and maintenance requests to online portals that enhance communication between landlords and tenants, embracing technology is a hallmark of the contemporary property manager.

## Data-Driven Decision Making

In the digital age, data is a powerful tool for informed decision-making. Property managers now analyze market trends, tenant preferences, and property performance metrics to make strategic decisions. This data-driven approach allows for better forecasting, proactive issue resolution, and the optimization of property portfolios.

## Enhanced Communication Strategies

Effective communication has always been a cornerstone of successful property management, but the methods have evolved. Property managers are now utilizing various communication channels, including email, text messaging, and social media, to stay connected with both landlords and tenants. Proactive and transparent communication helps build trust and fosters positive relationships.

## Focus on Tenant Experience

Modern property managers recognize the importance of tenant satisfaction.
With an increased focus on providing exceptional tenant experiences, property managers are implementing amenities, services, and community-building initiatives that go beyond the basic requirements. This emphasis on tenant satisfaction contributes to tenant retention and positive word-of-mouth referrals.

## Adaptation to Regulatory Changes

The legal landscape of real estate is ever-changing, with new regulations and compliance requirements emerging regularly. Property managers stay abreast of these changes, ensuring that landlords remain compliant with local, state, and federal laws. Their understanding of legal nuances protects both landlords and tenants and contributes to the overall stability of the rental market.

## Green and Sustainable Practices

Sustainability is no longer a mere buzzword; it's a driving force in property management. Modern property managers are implementing eco-friendly practices, from energy-efficient property upgrades to waste reduction initiatives. These practices not only align with broader environmental goals but also appeal to environmentally conscious tenants.

## Strategic Marketing and Branding

Property managers today understand the significance of effective marketing. Whether it's showcasing properties through high-quality online listings, utilizing social media for targeted advertising, or implementing SEO strategies, modern property managers take a proactive approach to attract quality tenants and maximize property exposure.

## Financial Expertise and Budgeting

Beyond rent collection, property managers now play a more active role in financial management. They work closely with landlords to develop comprehensive budgets, ensuring that properties remain profitable and well-maintained. Financial expertise is particularly crucial in navigating economic uncertainties and unexpected market shifts.
## Crisis Management and Emergency Response

Recent global events have highlighted the importance of crisis management and emergency response in property management. Modern property managers are well-prepared to handle unexpected challenges, whether they be natural disasters, public health crises, or other emergencies. Their ability to adapt swiftly ensures the safety and well-being of tenants and the preservation of property value.

## Professional Development and Industry Certifications

Recognizing the complexity of their evolving role, property managers invest in continuous professional development. Many pursue industry certifications, attend workshops, and participate in networking events to stay informed about the latest trends and best practices. This commitment to ongoing education enhances their expertise and benefits the clients they serve.

## Conclusion

In the modern real estate market, the [**best property managers**](https://www.johnpye.com.au/) are no longer simply rent collectors and maintenance coordinators. Their role has transformed into a multifaceted, dynamic profession that requires adaptability, technological savvy, and a holistic understanding of the real estate landscape. As property managers continue to evolve, their ability to navigate complexities, embrace innovation, and prioritize the needs of landlords and tenants positions them as indispensable players in the ever-changing world of real estate.
acceleratemarketing03
1,702,416
Schedule regular Security Audits
You may not know, but MIDAS includes a built-in “Security Audit” tool. This allows you to perform a...
0
2023-12-19T10:21:00
https://mid.as/blog/schedule-regular-security-audits/
development, scheduledtasks, security, v433
---
title: Schedule regular Security Audits
published: true
date: 2023-04-17 19:23:39 UTC
tags: Development,scheduledtasks,security,v433
canonical_url: https://mid.as/blog/schedule-regular-security-audits/
---

You may not know, but MIDAS includes a built-in “Security Audit” tool. This allows you to perform a quick and on-demand security analysis of your MIDAS system.

![Perform a detailed Security Audit of your MIDAS room booking system](https://mid.as/blog/wp-content/uploads/2023/04/midas-security-audit.png)
_Perform a detailed Security Audit of your MIDAS room booking system_

[First introduced with the release of MIDAS v4.13 in 2016](https://mid.as/blog/security-enhancements-in-v4-13/), the “Security Audit” tool tests a number of key metrics of your MIDAS booking system. The audit checks your MySQL / MariaDB setup, MIDAS files, and recommended MIDAS security settings. It provides a detailed report with appropriate advisories for hardening the security of your MIDAS system.

When the Security Audit was first introduced, it analyzed 15 metrics. Today, that number has increased to over 20. For MIDAS v4.33, the audit now additionally…

- Indicates the number of recently failed login attempts to your MIDAS system.
- Checks whether [Geofenced logins](https://mid.as/blog/geolocation-and-geofencing/) have been enabled.

But the biggest improvement to Security Audits for MIDAS v4.33 is the ability to schedule regular automated security audits. Until now, a Security Audit could only be manually initiated (via MIDAS Admin Options → Manage MIDAS → Security → Perform a Security Audit).

From MIDAS v4.33, you can now use [Scheduled Tasks](https://mid.as/help/manage-scheduled-tasks) to automatically run a Security Audit and email you the results. Audits can be configured to run every 7, 14, 30, 60, or 90 days.
![Schedule automated security audits of your MIDAS booking system](https://mid.as/blog/wp-content/uploads/2023/04/scheduled-security-audit.png)
_Schedule automated security audits of your MIDAS booking system_

The post [Schedule regular Security Audits](https://mid.as/blog/schedule-regular-security-audits/) appeared first on [MIDAS - Room Booking System | Blog](https://mid.as/blog).
midas
1,702,897
Unlocking Entertainment: Watching the Best Korean Dramas Online for Free
In the vast landscape of online streaming, Korean dramas have emerged as a global entertainment...
0
2023-12-19T19:11:23
https://dev.to/blogul/unlocking-entertainment-watching-the-best-korean-dramas-online-for-free-20ig
serialecoreene, blogulluiatanase, koreandramas
In the vast landscape of online streaming, Korean dramas have emerged as a global entertainment phenomenon, captivating audiences with their compelling stories and cultural richness. The good news is that you don't need to break the bank to enjoy the best of K-dramas. Here's a guide on how to be entertained by watching the finest Korean dramas online for free.

**1. Explore Free Streaming Platforms:** Numerous streaming platforms offer a selection of Korean dramas for free. Platforms like Viki, Rakuten Viki, and Tubi provide access to a variety of K-dramas without requiring a subscription fee. While there may be some ads, the content is accessible, making it an excellent starting point for budget-conscious viewers.

**2. Utilize Ad-Supported Streaming:** Some premium streaming platforms, including Viu and Kocowa, offer ad-supported viewing options. By enduring occasional ads, you can access a wide range of K-dramas without paying a subscription fee. Keep an eye out for free trials or limited-time promotional periods that platforms may offer.

**3. YouTube Channels and Playlists:** YouTube is a treasure trove for free K-drama content. Many official channels or content creators upload dramas legally, and you can find playlists with full episodes. Be cautious to choose content from legitimate sources to ensure quality and legality.

**4. Free Trials on Premium Platforms:** Take advantage of free trial periods offered by premium streaming services. Platforms like [https://blogulluiatanase.net/](https://blogulluiatanase.net/) often provide a limited trial period for new subscribers. During this time, you can binge-watch your favorite K-dramas without incurring any cost. Just remember to cancel the subscription before the trial period ends if you choose not to continue.

**5. Check Local Broadcasting Services:** In some regions, local broadcasting services or streaming platforms may offer Korean dramas for free as part of their content library.
Explore the options available in your region to discover if any platforms provide free access to K-dramas.

**6. Engage with Apps:** Certain mobile apps offer free K-drama content. Apps like KOCOWA, OnDemandKorea, and Viu allow users to watch dramas without a subscription. While these apps may have limitations on the available content, they still provide an opportunity to enjoy K-dramas without a financial commitment.

**7. Community Recommendations:** Engage with online communities and forums where K-drama enthusiasts share recommendations and links to free streaming sites. Websites like MyDramaList, Reddit, or dedicated K-drama forums often have threads where users discuss and share legal sources for free streaming.

**8. Public Libraries and Educational Platforms:** Some public libraries offer streaming services with a selection of K-dramas. Additionally, educational platforms may host cultural content, including dramas, as part of their outreach programs. Check with your local library or educational institutions for potential free access to K-dramas.

**9. Stay Updated on Promotions:** Streaming platforms occasionally run promotional events or partnerships that allow users to access premium content for free during specific periods. Stay informed about these promotions by following the social media accounts of streaming services or subscribing to their newsletters.

**10. Support Legal Channels:** While enjoying K-dramas for free is fantastic, consider supporting legal channels when possible. Legal streaming services ensure the quality of content, provide a smoother viewing experience, and contribute to the continued production of captivating dramas.

In conclusion, the world of Korean dramas is at your fingertips, and you can embark on an entertainment journey without spending a dime.
By exploring free streaming platforms, utilizing trial periods, engaging with online communities, and staying informed about promotions, you can unlock the magic of K-dramas without breaking the bank. Enjoy the stories, emotions, and cultural richness of the best Korean dramas from the comfort of your screen, all while staying within your budget.
blogul
1,703,056
Blackjack Terminal Game
This is my game for a portfolio project on a Computer Science course at Codecademy. I would...
0
2023-12-19T22:00:46
https://dev.to/matpluta99/blackjack-terminal-game-16nd
codecademy, beginners, codenewbie
#### This is my game for a portfolio project on a Computer Science course at Codecademy. I would be very grateful if you reviewed it and left feedback on what I could improve :D

#### https://github.com/mat-pluta99/Blackjack

### What were the goals of the project?

The first goal of the project was to create a simple program for me and friends or family to play with. I decided to create a Blackjack game, since I'd spent too much time on the Strip in Fallout: New Vegas. The last goal was to create a blog post about the project. So here I am, creating my first blog post on DEV :D

### Lady Luck

I've managed to create a Blackjack game for 1-5 players. You can play alone or gather friends and try your luck in this classic card game. If you earn enough chips and quit before your luck runs out, your score will be saved in the Top 10 leaderboard.
matpluta99
1,703,110
The 'Serendipity Effect' of Brainstory
A first time Brainstory user shared this story in our community Discord. I originally was going in...
0
2023-12-20T00:14:46
https://blog.brainstory.ai/blog/brainstory-serendipity-effect/
psychology, questioning, culture
---
title: The 'Serendipity Effect' of Brainstory
published: true
date: 2023-12-19 00:00:00 UTC
tags: psychology, questioning, culture
canonical_url: https://blog.brainstory.ai/blog/brainstory-serendipity-effect/
---

A first-time Brainstory user shared this story in our [community Discord](https://brainstory.ai/discord).

> I originally was going in the direction of trying to give myself criteria to make my movie ratings more objective, but it turned into me [talking] about what makes a movie important to me and helped me visualize that for my ratings instead. First time using Brainstory, I like how it pulled me from my first line of thinking (making an objective scale) to a different train of thought (Making me defend and elaborate on my categories of subjective ranking) very seamlessly, very cool!

Let's talk about what this experience is and why!

## What does Brainstory _do_?

Brainstory asks you **questions**. It does not give advice or suggestions. While you're talking, it may seem like Brainstory is "pulling" you into a different train of thought, but that's actually _you_ pulling yourself in a different direction. All Brainstory does is ask you questions about your thinking.

## Why did we make Brainstory ask questions?

Asking questions is good, but there are a lot of reasons humans don't want to ask other humans questions. Maybe you feel like you're wasting someone's time, or you think the question is too sensitive. [Research](https://www.sciencedirect.com/science/article/abs/pii/S0749597820304003) has shown that the benefits of asking questions outweigh the perceived consequences, but tell that to your emotions! That's why we made Brainstory: to be a safe place where you can always have your best thoughts.

## Are you reading this in December of 2023?

Join our [giveaway](https://brainstory.ai/giveaway) and win awesome cash prizes! See you in the Discord!
lilchenzo
1,703,249
Beginner's TypeScript #6
⭐ Constraining Value Types ⭐ We have a User interface below: interface User { id: number; ...
25,770
2023-12-23T15:00:00
https://dev.to/nhannguyendevjs/beginners-typescript-6-jn5
programming, beginners, javascript, typescript
![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/aiikg4nuqjp41qgi9wl7.png) ⭐ **Constraining Value Types** ⭐ We have a **User** interface below: ```ts interface User { id: number; firstName: string; lastName: string; role: string; } ``` We would not want to use a **freeform string**, because a User can have only a set number of roles: **admin**, **user**, or **super-admin**. Consider this **defaultUser**: ```ts export const defaultUser: User = { id: 1, firstName: "Uri", lastName: "Pilot", role: "I_SHOULD_NOT_BE_ALLOWED", } ``` It is being defined with a role that is not one of our options. We will update the **User interface** to restrict the role property to one of the set options. The **I_SHOULD_NOT_BE_ALLOWED** role should cause an error. 👉 **Solution:** The solution is to update **role** to be a [Union type](https://www.typescriptlang.org/docs/handbook/2/everyday-types.html#union-types). The syntax uses **|** to delineate the possible values a key can hold: ```ts interface User { id: number; firstName: string; lastName: string; role: "admin" | "user" | "super-admin"; } ``` ✍️ **About Union Types:** Any type can be a member of a union. For example, we can make a new **SuperAdmin** type and add it to the role: ```ts type SuperAdmin = "super-admin" interface User { // ...other stuff role: "admin" | "user" | SuperAdmin; } ``` Union types are everywhere within TypeScript. They give our code safety and allow us to be really sure of the types of data flowing through our app. --- I hope you found it useful. Thanks for reading. 🙏 Let's get connected! You can find me on: - **Medium:** https://medium.com/@nhannguyendevjs/ - **Dev**: https://dev.to/nhannguyendevjs/ - **Hashnode**: [https://nhannguyen.hashnode.dev](https://nhannguyen.hashnode.dev/) - **Linkedin:** https://www.linkedin.com/in/nhannguyendevjs/ - **X (formerly Twitter)**: https://twitter.com/nhannguyendevjs/
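To make the constraint concrete, here is a small sketch (the names `Role`, `ROLES`, and `isRole` are mine, not from the lesson) showing how the union type above can also back a runtime check via a user-defined type guard:

```typescript
// A Role union mirroring the interface above. isRole narrows a plain
// string down to the Role union at runtime, so bad values like
// "I_SHOULD_NOT_BE_ALLOWED" can be rejected before they reach a User.
type Role = "admin" | "user" | "super-admin";

const ROLES: readonly Role[] = ["admin", "user", "super-admin"];

function isRole(value: string): value is Role {
  return (ROLES as readonly string[]).includes(value);
}
```

Inside an `if (isRole(input))` branch, TypeScript treats `input` as `Role`, so it can be assigned to `user.role` without a cast.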
nhannguyendevjs
1,703,393
hehe_game
Check out this Pen I made!
0
2023-12-20T08:05:12
https://dev.to/rishitagarg/hehegame-e33
codepen
Check out this Pen I made! {% codepen https://codepen.io/Rishita-Garg/pen/MWxgvVm %}
rishitagarg
1,784,948
Blockchain Apps
Does anyone understand programming to make blockchain apps?
0
2024-03-09T01:26:40
https://dev.to/anonimus/blockchain-apps-51bd
Does anyone understand programming to make blockchain apps?
anonimus
1,703,434
Building an AI powered WebGL experience with Supabase and React Three Fiber
TL;DR: Baldur's Gate 3 inspired WebGL experience called Mirror of Loss Demo video GitHub...
0
2023-12-21T19:59:59
https://dev.to/laznic/building-an-ai-powered-webgl-experience-with-supabase-and-react-three-fiber-3dfn
webgl, webdev, supabase, react
> TL;DR: > - Baldur's Gate 3 inspired WebGL experience called [Mirror of Loss](https://mirrorofloss.com) > - [Demo video](https://drive.google.com/file/d/1t9DJvSedcPtayNZb4--8g1-sm4P8X1wf/preview) > - [GitHub repo](https://github.com/laznic/mirror-of-loss) > React Three Fiber with [Drei](https://github.com/pmndrs/drei) makes things simple > Tech stack used: React, React Three Fiber, Vite, Supabase, OpenAI, Stable Diffusion, Stable Audio I recently participated in the [Supabase Launch Week X Hackathon](https://supabase.com/blog/supabase-hackathon-lwx) where I ended up dabbling in WebGL and created a project called [Mirror of Loss](https://mirrorofloss.com). I had a lot of fun working on it, and figured it would be nice to share a bit about it here, too. I've participated in three different Launch Week Hackathons before, and I've always tried to do something a bit outside my regular web dev work. Usually the projects turn out to be more of an experience than an app. The hackathons run for a week, so it's a good time to focus on and learn something new and cool! > **Note**: this article will not go into every detail of how to build a similar kind of WebGL app with React Three Fiber and Supabase. For example, installation instructions can be found on the libraries' own websites instead of being repeated here. > > This article just provides the bigger picture of how you can build WebGL apps & experiences by sharing my experience during the Supabase hackathon. > > Not all the code will be displayed here, as the article would grow too long. It is, however, open source, so you can find all the little details in the [GitHub repo](https://github.com/laznic/mirror-of-loss). ## The idea Recently I've been indulging in [Baldur's Gate 3](https://baldursgate3.game/), and what better way to show your appreciation as a fan than to create something (close) from it! 
In the game they introduced a [Mirror of Loss (spoiler warning)](https://bg3.wiki/wiki/Mirror_of_Loss), and, being very intrigued by the aesthetics (and the whole story/concept) of [Shar](https://forgottenrealms.fandom.com/wiki/Shar) in the game, I thought it would be nice to do a representation of it myself. And in 3D/WebGL! However, I wasn't originally planning on doing this during the hackathon: I just wanted to make it a new art project for myself. In the end I decided to try and create it within a week, since it seemed like a good time to do it. ## Preparation So before the hackathon started, around one or two weeks prior, I started wondering how to build this thing in WebGL. I was aware of [Three.js](https://threejs.org/), which I had dabbled in a bit in the past, however it seemed a bit intimidating. No way I'd have time to learn the vanilla way of creating WebGL experiences. Luckily I had heard about [React Three Fiber](https://docs.pmnd.rs/react-three-fiber/getting-started/introduction) before, although I hadn't paid a lot of attention to it. And boy, I was really happy with what I read in their documentation! Everything seemed to abstract the tedious bits of Three.js into easy-to-use React components, and I can do React for sure. They provide a lot of additional helper libraries, such as [Drei](https://github.com/pmndrs/drei) and [Ecctrl](https://github.com/pmndrs/ecctrl), to make developing a lot easier. Drei is a collection of various ready-made abstractions for React Three Fiber, and includes things like [trails](https://drei.pmnd.rs/?path=/docs/misc-trail--docs), [making something always face the camera, called Billboard](https://drei.pmnd.rs/?path=/docs/abstractions-billboard--docs), and [animated distort material](https://drei.pmnd.rs/?path=/docs/shaders-meshdistortmaterial--docs), for example. Ecctrl, on the other hand, allows you to set up a character controller very quickly. I recommend checking out both of these if you are planning to do any React Three Fiber work. 
> **Tip**: The easiest way to get started is to set up your project with [Vite](https://docs.pmnd.rs/react-three-fiber/getting-started/installation#vite.js). Other than that, my plan was to do two different scenes: one with a mirror the user can give a memory to, and one "inside" the mirror where you can see every single memory the mirror holds. I thought this would be a pretty cool concept, and with it in mind, I started experimenting a bit. I started playing around to see if I could create some elements I'd like to use in the scene. I was mostly obsessed with creating a brazier, because you gotta have braziers! Below is what I came up with. It's very simple, however it has a nice vibe to it. I didn't have time to study how to create a realistic-looking flame via shaders, so I had to be quite creative here. Basically it is just two stacked Cone geometries with some transparency. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/xevwsme4kg0dsyh8rbep.gif) It came together nicely in React Three Fiber with the `Lathe`, `Cone`, and `MeshDistortMaterial` components. ```javascript const lathePoints = []; for (let i = 0; i < 11; i++) { lathePoints.push(new Vector2(Math.sin(i * 0.2) * 2, (i - 5) * 0.2)); } return ( <> <Lathe args={[lathePoints]} rotation={[0, 0, 0]} position={[0, 1.2, 0]}> <meshLambertMaterial color="#777" /> </Lathe> <pointLight color="#fff" intensity={10} position={[0, 5, 0]} /> <Cone args={[1.5, 2, undefined, 50]} position={[0, 2.75, 0]}> <MeshDistortMaterial distort={0.5} speed={10} color="#baa8ff" transparent roughness={0.01} transmission={4.25} ior={1} /> </Cone> <Cone args={[1, 1.5, undefined, 50]} position={[0, 2.25, 0]}> <MeshDistortMaterial distort={0.75} speed={10} color="#fff" roughness={0.8} ior={7} /> </Cone> </> ) ``` At this point I really had no idea what I was doing, however I felt surprisingly confident that I could do some cool stuff for the hackathon. 
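The profile loop above can also be pulled out into a tiny helper so the bowl shape is easy to tweak and inspect outside of three.js. A quick sketch (the helper name is mine; plain `[x, y]` pairs stand in for `Vector2` so it runs anywhere):

```javascript
// Generate the 2D profile for the brazier bowl. Each entry is an [x, y]
// pair; in the scene these would become the Vector2 points fed to Lathe.
function latheProfile(segments = 11) {
  const points = [];
  for (let i = 0; i < segments; i++) {
    points.push([Math.sin(i * 0.2) * 2, (i - 5) * 0.2]);
  }
  return points;
}

// In the component: latheProfile().map(([x, y]) => new Vector2(x, y))
```

Keeping the profile as data like this makes it simple to experiment with different bowl shapes without touching the JSX.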
## Starting the project, and the Mirror scene When the hackathon kicked off, I thought that creating a 3D model of the mirror would play an important part in this, as I could quickly generate textures for other, simpler objects (e.g. walls, pillars, etc.) via an AI. So I fired up [Spline](https://spline.design) and got to work. After spending several hours on the model, I now just needed to import it into the project and add some materials to it. Before you can use GLTF files in your project, you'll need to tell Vite to include them in the assets like so: ```javascript // https://vitejs.dev/config/ export default defineConfig({ assetsInclude: ["**/*.gltf"], plugins: [react()], }); ``` It's a bit of a tedious process to create meshes for the GLTF model by hand, and luckily the Poimandres collective have also created the [GLTFJSX](https://github.com/pmndrs/gltfjsx) library to help in that regard. They even have [a website to test it out](https://gltf.pmnd.rs/), which I just ended up using directly. [It prints out nicely grouped meshes](https://github.com/laznic/mirror-of-loss/blob/57dfb1702e106808f5ebfeae2d65d0b05d20ac00/src/scenes/mirror/components/Mirror.tsx), which can be [altered individually](https://github.com/laznic/mirror-of-loss/commit/eb9501a8fe7f9788c74de9ae836cac4fe9ba6307). At this point, after adding some initial materials to the model as you can see in the latter commit link, I realized that this was going to look very bad if I couldn't nail the materials perfectly. The alternative route would be creating something with a more old-school vibe, mixing 2D with 3D. Basically using sprites in a 3D environment, like what they did in old games such as Wolfenstein, Duke Nukem, and Hexen. I have always especially liked the aesthetics of the last game mentioned, so I decided to try it out quickly to see if I could make it work. Here is where [DreamStudio by Stability AI](https://dreamstudio.ai/), or any current AI solution really, helps out a lot. 
With just a few prompts, you are able to generate pixel art textures that you can use anywhere. Below are a few examples of what I ended up generating and using. ![Pillar textures](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/js0lfoe1k7dyh5npe2up.png) ![Mirror sprites](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/6pniq4m6r4hskppsm4xo.png) You of course need to edit the images a bit, since you want transparent images when using sprites for objects. For some repeating textures, however, you might get away with just using the generated images directly. I myself used [Piskel](https://piskelapp.com/) to remove the backgrounds, and later to create animations. For example, the braziers needed to be animated, as it would be a bit boring to have them just sit around with a static flame. Creating an animated sprite basically just means having each state (frame) of the animation lined up next to the others in one big, transparent file. Below is what the brazier file looks like. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/d3z3l73p5cnu70cw76d3.png) Drei was super helpful again, as it comes with a `SpriteAnimator` component to handle the animation without needing to do it [manually](https://github.com/laznic/mirror-of-loss/blob/932a7a39f18e8484c4fc20e2b95eb41662c9478c/src/scenes/mirror/components/Brazier.tsx#L21-L34). You just give it some props: where to start the animation from, how many frames there are, and what texture to use. ```javascript export default function Brazier({ position }: BrazierProps) { return ( <Billboard position={position}> <pointLight intensity={50} position={[0, 1, 0.1]} castShadow /> <SpriteAnimator autoPlay startFrame={0} loop numberOfFrames={5} scale={3} textureImageURL={brazier} /> </Billboard> ); } ``` The mirror itself is just a `Circle` component with a double-sided mesh material. 
`Circle` differs from `Sphere` in that `Circle` is two-dimensional and `Sphere` three-dimensional. The same goes for `Plane` and `Box`, the former of which I used for a bunch of other elements in the scenes. And since I'm working with sprites, I don't want to use textures on 3D objects, as the texture would wrap around the object instead of displaying as intended in the _image_. ```javascript <Circle args={[5, 100]}> <MeshReflectorMaterial map={texture} mirror={0.1} alphaTest={0.1} side={DoubleSide} /> </Circle> ``` I didn't wrap it in the earlier mentioned `Billboard` component because, even though working with sprites is more in the 2D realm, you can get a 3D feel by keeping some 2D objects stationary. For this it's important to have a _backside_ for the 2D object, too. If you didn't, and the camera moved behind the object, it would disappear because there is nothing to render in that direction of the 3D space: the item is facing forward, and since it is a 2D object, it does not have any points to draw in the opposite direction. Using the `side` prop with e.g. `DoubleSide` as the value on the mesh renders the given texture on both sides of the 2D object, making it visible from all angles. Note that `DoubleSide` makes the object look exactly the same from the front and the back. If you want a different-looking backside, you'll need to create a separate 2D object with a backside texture, and `BackSide` as the `side` prop value. > **Tip:** Use the `alphaTest` prop to make the mesh material transparent! If you don't set it, and you are using a texture with a transparent background, it will get a black background instead. With some initial objects created, I set up my scene a bit further to see how it would look. And it was looking pretty decent! 
![Initial scene](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/iorvyoiepwuks7bry8ad.gif) In the GIF above you can see that I initially used 3D pillars (a `Box` component with the pillar texture slapped on it), however I later decided that they didn't fit in once I added more elements to the scene. Next up I wanted to work on the "memories" you give to the mirror: how they appear, what they look like, and how they are passed to the mirror. First I started with the styling. I wanted to make them look like they were "extracted" from you: blobs that would just float around, a bit blurry, with some imagery visible inside them. And luckily for us, Drei supports these with some premade components! There is a [Float](https://codesandbox.io/p/sandbox/backdrop-and-cables-2ij9u) component which allows you to wrap any geometry and make it float: no manual calculations needed! You can adjust the speed, rotation, and floating intensities. Then there is also a [MeshTransmissionMaterial](https://codesandbox.io/p/sandbox/meshtransmissionmaterial-hmgdjq) component that allows you to create see-through materials. These see-through materials can also warp or distort the objects/imagery behind or inside them based on lighting, etc. This allows you to create some pretty good-looking things! With a lot of trial and error I ended up with something like the picture below. Don't mind the different-looking scene, we'll get to that in a bit. 
![Initial blob styles](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/0lrx43x9mvymzgqlshge.png) ```javascript function Blob({ imageUrl, position, visible }: BlobProps) { const texture = useTexture(imageUrl); return ( <group visible={visible}> <Float speed={5} rotationIntensity={0.05} floatIntensity={1}> <group position={position}> <Sphere args={[0.33, 48, 48]} castShadow> <MeshTransmissionMaterial distortionScale={1} temporalDistortion={0.1} transmission={0.95} color={"#fbd9ff"} roughness={0} thickness={0.2} chromaticAberration={0.2} anisotropicBlur={0.5} distortion={1.2} /> </Sphere> <Sphere args={[0.2, 48, 48]}> <meshPhysicalMaterial map={texture} roughness={0.1} /> </Sphere> </group> </Float> </group> ); } ``` So my blobs are basically just two `Sphere` geometries on top of each other. One uses the see-through material, and one is a sphere with the image texture. The see-through material on top of the image gives it a nice look, as if the image is actually "living" inside a sphere, and it allows me to achieve that "memory-like" feel. > **Tip:** The layering order of the materials and geometries matters! > Play around to see what gets you the best results. At this point we're just moving our camera with basic controls (mainly just zooming and rotating with the mouse), however to make it more immersive, we'll need to add a character. Here we'll be using the previously mentioned Ecctrl library. In addition to installing Ecctrl, it needs [Rapier, a physics engine](https://github.com/pmndrs/react-three-rapier). You'll also need to set up [KeyboardControls](https://github.com/pmndrs/drei?tab=readme-ov-file#keyboardcontrols), which can be imported from Drei. Setting up keyboard controls is easy: you just add `KeyboardControls` inside your `Canvas` component and make it wrap your scene. Then give it a map of keys to use, and you're good to go! 
Mine looks something like this: ```javascript const keyboardMap = [ { name: "forward", keys: ["ArrowUp", "w", "W"] }, { name: "backward", keys: ["ArrowDown", "s", "S"] }, { name: "leftward", keys: ["ArrowLeft", "a", "A"] }, { name: "rightward", keys: ["ArrowRight", "d", "D"] }, { name: "jump", keys: ["Space"] }, { name: "run", keys: ["Shift"] }, { name: "crouch", keys: ["c", "C"] }, ]; <Canvas> <KeyboardControls map={keyboardMap}> <ambientLight color={"#fff"} /> <Suspense fallback={null}> <SceneContextProvider> <MainScene /> </SceneContextProvider> </Suspense> </KeyboardControls> </Canvas> ``` Then you'll need to add physics so that your character can move properly in the environment. The Controller needs to wrap some sort of character model in order to work, so a simple `Sphere` suffices in this case. I set it to be invisible, so you don't see it in any material reflections: you just float around. I adjusted the movement speed up from the defaults a bit, since my scene ended up being pretty big and without sprinting it would take a long while to get to the mirror. ```javascript import { CuboidCollider, Physics, RigidBody } from "@react-three/rapier"; import Controller from "ecctrl"; <Physics gravity={[0, -30, 0]}> {/* @ts-expect-error the export is slightly broken in TypeScript so just disabling the TS check here */} <Controller characterInitDir={9.5} camInitDir={{ x: 0, y: 9.5, z: 0 }} camInitDis={-0.01} camMinDis={-0.01} camFollowMult={100} autoBalance={false} camMaxDis={-0.01} sprintMult={2} maxVelLimit={askForMemories ? 0 : 15} jumpVel={askForMemories ? 0 : undefined}> <Sphere> <meshStandardMaterial transparent opacity={0} /> </Sphere> </Controller> {/* other stuff.. 
 */} {/* floor */} <RigidBody type="fixed" colliders={false}> <mesh receiveShadow position={[0, 0, 0]} rotation-x={-Math.PI / 2}> <planeGeometry args={[22, 100]} /> <meshStandardMaterial map={flooring} side={DoubleSide} /> </mesh> <CuboidCollider args={[1000, 2, 50]} position={[0, -2, 0]} /> </RigidBody> {/* other stuff.. */} </Physics> ``` > **Tip:** To get the first-person view, you'll need the following props. > > ```javascript <Ecctrl camInitDis={-0.01} // camera initial position camMinDis={-0.01} // camera zoom in closest position camFollowMult={100} // give any big number here, so the camera follows the character instantly autoBalance={false} // turn off auto balance since it's not useful for the first-person view >``` The cool thing about these physics and colliders is that you can use them as sensors, too. Say you want to trigger some event when the player enters a specific area. You can just define a `RigidBody` element with a geometry, and a collider (e.g. `CuboidCollider`) which you mark as a sensor. Then for that collider, you also give an `onIntersectionEnter` prop a function that will be triggered once the player is inside that collider. For example, in my case I wanted to make the player unable to move, and to focus an input field so that they can just type without moving the character. I ended up with this simple thing: ```javascript <RigidBody type={"fixed"} position={[0, -3.5, -85.5]} rotation={[-Math.PI / 2, 0, 0]} > <Plane args={[4, 3]}> <meshStandardMaterial transparent opacity={0} /> </Plane> <CuboidCollider args={[2, 2, 1]} sensor onIntersectionEnter={() => setAskForMemories(true)} /> </RigidBody> ``` So whenever the player enters that 2x2x1 collider, it will update the state and freeze the character. > **Tip:** If you don't set `type={"fixed"}` for the RigidBody, > the collider will collide with other physics and cause some weird behavior on mount. 
> > For example, my collider is slightly inside the mirror stand, and it would just fly high up into the sky when the scene loaded. Setting it as fixed keeps it static, fixed in its place. Next up would be generating images via an Edge Function (as I wanted this to work without a login, and so that not _everyone_ can insert stuff into the database with the anon key), and then displaying the generated images as blobs in realtime. Here I of course used the wonderful Supabase Realtime feature. Below is an example of the wrapping component that displays the blobs as they appear. ```javascript export default function MemoryBlobs() { const [memories, setMemories] = useState([]); useEffect(() => { const channel = supabase .channel("memories") .on( "postgres_changes", { event: "INSERT", schema: "public", table: "memories", filter: `player_id=eq.${localStorage.getItem("uuid")}`, }, (payload) => { setMemories((prev) => prev.concat([payload.new])); } ) .subscribe(); return () => { channel.unsubscribe(); }; }, []); return memories.map((memory: { id: number; image: string }) => ( <Blob key={memory.id} position={[0, -2, -84]} imageUrl={memory.image} visible /> )); } ``` A very simple setup, as you can see. I will not show the whole Edge Function code here as it's a bit long, however you can check [the source on GitHub](https://github.com/laznic/mirror-of-loss/blob/main/supabase/functions/generate-memories/index.ts) to see how these images are generated. In short: we take in user input, pass it to OpenAI, pass its response to the Stable Diffusion API, upload the images generated by SD to Supabase Storage, and then insert them into the database. The database insertion then triggers the realtime updates, and the images appear as blobs on the screen. If you take a look at the Edge Function code, you'll notice that I'm using the Fetch API to call OpenAI instead of the OpenAI SDK. This is for a reason: the SDK does not seem to work in these Edge Functions. 
It all works locally, however when you deploy the function to production and invoke it, it will crash with a `FilesAPI undefined` error. I'm not sure if my setup is a bit outdated, or if this is something that can be fixed by Supabase (or Deno?). In order to take in user input, we need to have an input field, of course. It would be a task of its own to build something like that in WebGL, so it's easier to throw HTML into the mix. This can be done easily with the [HTML](https://drei.pmnd.rs/?path=/docs/misc-html--docs) component from Drei. With it, I could add an HTML form element to my scene and start taking in user input to generate images for the memories. ```javascript <Html center position={[0, -1.95, -88]}> <form onSubmit={async (e) => { e.preventDefault(); setGenerating(true); const data = new FormData(e.target); const input = data.get("memory"); const { data: responseData } = await supabase.functions.invoke("generate-memories", { body: { input, playerId: localStorage.getItem("uuid"), }, } ); setGenerated(true); if (responseData) { localStorage.setItem( "memoryGroupId", JSON.stringify(responseData.memoryGroupId) ); setTimeout(() => { setTransitionToVoid(true); }, 8000); } }} > <input className="memory-input" name="memory" style={{ display: !askForMemories || generating ? "none" : undefined, }} ref={inputRef} type="text" placeholder="Think of a memory..." /> </form> </Html> ``` With all this functionality in place, we can move to the Void scene. ## Enter the Void In this scene I wanted the user to be in a zero-gravity environment and see little blobs of memories floating around. Then when you click one, you zoom in to see the name of the memory and its date. In this view you would also have access to the generated images. 
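A quick aside on the realtime handlers shown earlier: they simply concatenate `payload.new` onto React state. Since change events can occasionally be redelivered, a small guard against duplicate ids doesn't hurt. A sketch (the helper name and the dedup behavior are my addition, not from the actual repo):

```javascript
// Append a newly inserted memory row to the current list, skipping rows
// whose id is already present. This mirrors the
// setMemories((prev) => prev.concat([payload.new])) pattern used in the
// subscription callbacks, with a duplicate guard added.
function appendMemory(memories, row) {
  if (memories.some((m) => m.id === row.id)) return memories;
  return memories.concat([row]);
}

// Usage inside the handler:
// setMemories((prev) => appendMemory(prev, payload.new));
```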
For the zooming part, I needed to create a separate camera controller that would allow me to animate things smoothly, and after googling for a while how to do this, I found a nice library called [camera-controls](https://github.com/yomotsu/camera-controls). You hook it into React Three Fiber's `useFrame` hook, and update the camera position and where it's looking based on some given coordinates. You can see the implementation in the [Controls](https://github.com/laznic/mirror-of-loss/blob/main/src/scenes/void/components/Controls.tsx) component. It's hooked to a context, which stores the current camera position and look-at values from the blob click event, and then it does a nice transition when it needs to update from the previous position. The component also contains keyboard event handling to give that zero-gravity feel when moving the camera. You update the camera by sending new three-dimensional vector values to the context in some component, like in [MemoryGroups](https://github.com/laznic/mirror-of-loss/blob/main/src/scenes/void/components/MemoryGroup.tsx#L62-L63) for example. ```javascript setCamPos(new Vector3(position[0], position[1], position[2] + 20)); setLookAt(new Vector3(position[0], position[1], position[2])); ``` Positioning the groups of memories was next, and it turned out to be one of the coolest things about this project in my opinion. If you've tried the app, you'll notice that the group spheres are spread out quite evenly in the space, and they don't really overlap each other. This is thanks to the pre-installed [PostGIS](https://postgis.net/) plugin on the database. Even before realizing that I could use PostGIS for the locations, I wanted to make the groups appear in random locations without overlapping each other. My initial idea was that I'd store XYZ coordinates in their own database columns, and I would just check in my code if there were any overlaps with any of the database rows within a given range. Doable? Sure. Reasonable? 
Maybe not. And here is where I realized that since I'm _actually_ working with coordinates, I could use PostGIS directly to handle these spatial coordinates. PostGIS comes with a built-in function to check if coordinates overlap each other, which made this whole thing a lot simpler: I could just let the database handle everything! The only thing I needed to do was send in the given text for the memory and the player ID, and the group would automatically get assigned to some random place in the 3D environment. Of course, since this was a hackathon and it was my first time using PostGIS, I actually asked the [AI Assistant](https://supabase.com/blog/studio-introducing-assistant) to generate a database function for me! It didn't work straight out of the box, however it was 99.9% there. Very cool and impressive, so kudos to the Supabase team for this feature. ```javascript const { data: memoryGroupId } = await supabaseClient.rpc( "insert_memory_group", { memory: input, player_id: playerId } ); ``` ```sql CREATE OR REPLACE FUNCTION public.insert_memory_group(memory text, player_id uuid) RETURNS bigint LANGUAGE plpgsql AS $function$ DECLARE random_coordinates geometry; id bigint; BEGIN LOOP -- Generate random coordinates random_coordinates := ST_MakePoint( random() * 180 - 90, random() * 180 - 90, random() * 180 - 90 ); -- Check for intersecting geometries IF NOT EXISTS ( SELECT 1 FROM memory_groups WHERE ST_Intersects(position, random_coordinates) ) THEN -- Insert and return data INSERT INTO memory_groups (memory, position, player_id) VALUES (memory, random_coordinates, player_id) RETURNING memory_groups.id into id; RETURN id; END IF; END LOOP; END; $function$ ; ``` > **Tip:** Use Supabase's AI Assistant; it's amazing, and will only get better the more you use it. However, while these coordinates (or geometries) are now stored _properly_, you cannot really use them as-is in the code. 
This is because the stored format isn't a regular float for each axis: it's a mix of numbers and letters, for example `01010000A0E6100000404C76755AFF35C03C48167A39544940DCF3805DD9663240`. So in order to use these [Points](https://postgis.net/workshops/postgis-intro/geometries.html#points) in our app, we'll need to convert them to floats. Here I used another database function to do the conversion: ```sql CREATE OR REPLACE FUNCTION public.memory_groups_with_position() RETURNS TABLE(id integer, memory text, created_at date, x double precision, y double precision, z double precision) LANGUAGE sql AS $function$ select id, memory, created_at, st_x(position::geometry) as x, st_y(position::geometry) as y, st_z(position::geometry) as z from public.memory_groups; $function$ ; ``` In the code, when the scene loads, I just fetch the groups with an RPC call via Supabase. I also hooked it up to the realtime feature, so the scene automatically updates with the latest added memory group if you happen to be there when someone else gives the mirror a memory. ```javascript useEffect(() => { async function fetchMemoryGroups() { const { data } = await supabase .rpc("memory_groups_with_position") .limit(1000); if (data) { setMemoryGroups((prev) => prev.concat(data)); } } fetchMemoryGroups(); }, []); useEffect(() => { const channel = supabase .channel("memory_groups") .on( "postgres_changes", { event: "INSERT", schema: "public", table: "memory_groups" }, async (payload) => { const { data } = await supabase .rpc("memory_groups_with_position") .eq("id", payload.new.id) .single(); setMemoryGroups((prev) => prev.concat([data])); } ) .subscribe(); return () => { channel.unsubscribe(); }; }, []); ``` You'll notice that I do another fetch for the newly added `memory_group` after getting notified by the INSERT event in the database. This is because of what I mentioned earlier: since the group position is a geometry, I cannot use it directly to position the group in the 3D space. 
Instead, I just use the RPC call to fetch the newly added group, which works perfectly in this case. After all this, with a bunch of little tweaks here and there, adding "transitions" between the scenes, adjusting functionality, [generating music with an AI](https://www.stableaudio.com/generate), and despairing at 7 in the morning over why the Edge Function wasn't working after a whole night on the project, the end result turned out something like what you see in this [demo video](https://drive.google.com/file/d/1t9DJvSedcPtayNZb4--8g1-sm4P8X1wf/preview). It shows a somewhat smoother experience since it's running locally, however I'm really happy with how it turned out. I had a vision and managed to complete it to my liking, which is always amazing. I've left out quite a lot of details since there is a lot of code and things going on, so make sure to check out the [GitHub repo](https://github.com/laznic/mirror-of-loss) for all the missing parts. If you got this far, thanks for reading! Feel free to add comments if you have anything in mind that you wanna say.
laznic
1,703,539
Oracle Patch Update: Enhancing Performance and Security
Oracle provides robust enterprise software solutions to businesses across the globe. One of the...
0
2023-12-20T10:21:26
https://www.crestreports.com/oracle-patch-update-enhancing-performance-and-security/
oracle, patch, update
![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/78b6b2fs68uv6iy6t9bu.jpg)

Oracle provides robust enterprise software solutions to businesses across the globe. One of the cornerstones of Oracle's commitment to excellence is its Oracle patch update mechanism. Whether you're managing an on-premises Oracle E-Business Suite or a dynamic Cloud application environment, patching has become an indispensable practice for maintaining optimal system performance, resolving issues, and incorporating cutting-edge features.

**The Essence of Patching: Evolution Beyond Maintenance**

Patching, in the realm of Oracle, has transcended its traditional connotation of simple maintenance fixes. It has evolved into a dynamic tool that empowers organizations with new functionality and fortifies the security and reliability of their systems. Be it an intricate on-premises setup or a sophisticated Cloud ecosystem, the necessity of patching remains universal. The roles of patch applications extend far beyond mundane issue resolution. Let's delve into some key functions that underscore the significance of Oracle patch updates:

**Resolving Unresolved Problems**: At the core of patching lies the swift resolution of issues. Whether it's a minor glitch disrupting a workflow or a critical vulnerability compromising data security, patches address these concerns promptly. This ensures uninterrupted operations and safeguards sensitive information.

**Introducing New Features and Functions**: Patches are Oracle's simplest method of injecting innovation into their systems. These updates usher in new features and functionality that improve user experience and streamline processes. Businesses can stay ahead of the curve without the need for disruptive overhauls.

**Upgrades to Maintenance Levels**: Keeping up with evolving maintenance requirements is a perpetual challenge. Patch updates encompass these upgrades, ensuring that your system remains aligned with the latest maintenance standards. This translates to enhanced reliability and longevity for your Oracle ecosystem.

**Putting Product Upgrades in Place**: Traditional large-scale upgrades can be daunting, but patches make the transition smoother. They enable incremental upgrades that are easier to manage, reducing the complexity associated with massive overhauls while still keeping the system up to date.

**Technological Stacks' Interoperability**: In a heterogeneous technological landscape, compatibility issues can hinder productivity. Oracle patch updates address these concerns by ensuring seamless interoperability among the various technological components, fostering a harmonious digital environment.

**Finding the Sources of the Issues**: Patches not only offer solutions but also facilitate problem diagnosis. By analyzing the components affected by a patch, businesses can gain insight into the root causes of issues. This proactive approach empowers organizations to take preventative measures against future disruptions.

**Making Use of Online Assistance**: Oracle's patch updates are supported by comprehensive documentation and online resources. This extensive knowledge base helps administrators and IT teams understand the changes introduced by patches, and serves as a valuable tool for effective implementation and issue resolution.

**Conclusion**

As we've explored the significance of Oracle patch updates and Cloud testing in enhancing system performance, it's equally crucial to address the challenges associated with testing. This is where Opkey steps in as a game-changing solution, easing concerns such as slow test creation and execution and inconvenient test maintenance. Opkey's unique selling point lies in its codeless automation capabilities and accelerated patching and customization. This empowers testers of varying technical backgrounds to contribute to the testing process, reducing dependency on specialized scripting skills.
rohitbhandari102
1,703,605
Wednesday Links - Edition 2023-12-20
The Big Cloud Exit FAQ (10 min)☁️ https://world.hey.com/dhh/the-big-cloud-exit-faq-20274010 Compact...
6,965
2023-12-20T11:50:08
https://dev.to/0xkkocel/wednesday-links-edition-2023-12-20-3134
java, jvm, cloud, architecture
The Big Cloud Exit FAQ (10 min)☁️
https://world.hey.com/dhh/the-big-cloud-exit-faq-20274010

Compact Java Bean validation (30 sec)🫘
https://twitter.com/NiestrojRobert/status/1736662080231727346

Tracking Java Native Memory With JDK Flight Recorder (6 min)✈️
https://www.morling.dev/blog/tracking-java-native-memory-with-jdk-flight-recorder/

Announcing jox: Fast and Scalable Channels in Java (5 min)🎚️
https://softwaremill.com/announcing-jox-fast-and-scalable-channels-in-java/

archUnit Spring 0.2.0 released (1 min)🏛️
https://github.com/rweisleder/archunit-spring/releases/tag/v0.2.0
0xkkocel
1,703,671
Free Node.js hosting?
Does anyone have a cheap or free Node.js hosting site? Just to host small applications. It should be...
0
2023-12-20T12:38:31
https://dev.to/hazush/free-nodejs-hosting-43bo
Does anyone have a cheap or free Node.js hosting site? Just to host small applications. It should be better than Replit, because Replit is just the worst.
hazush
1,703,922
Top 10 CIAM Software Solutions
CIAM is like a guard at the digital gate. It's not just about creating barriers to prevent...
0
2023-12-20T16:24:27
https://www.rezonate.io/blog/top-ciam-software-solutions/
devops, cybersecurity
CIAM is like a guard at the digital gate. It's not just about creating barriers to prevent unauthorized access to your systems -- it's also about making your online experience smooth and stress-free. Here's the cool part: According to Gartner, using CIAM with extra safety features like fraud detection and passwordless logins can cut customer churn by more than [50%](https://www.computerweekly.com/blog/Taking-stock-of-retail-tech/Going-passwordless-in-online-shopping#:~:text=Indeed%2C%20according%20to%20Gartner%20research,will%20be%20only%20those%20online) by 2025. CIAM is a custodian of your customer data and a facilitator for great customer experiences.  What Are CIAM Software Solutions? --------------------------------- ![AIM vs CIAM](https://www.rezonate.io/wp-content/uploads/2023/12/AIM-vs-CIAM-1024x291.png) [https://transmitsecurity.com/blog/iam-or-ciam-why-you-need-purpose-built-ciam-to-meet-the-needs-of-customers](https://www.google.com/url?q=https://transmitsecurity.com/blog/iam-or-ciam-why-you-need-purpose-built-ciam-to-meet-the-needs-of-customers&sa=D&source=docs&ust=1702640330283061&usg=AOvVaw15a8ZrX4AljUqG0TKb21Sa) Customer Identity and Access Management (CIAM) software solutions manage and secure digital user identities. CIAM goes beyond basic authentication by incorporating features like social logins, MFA, and consent management. The distinction lies in CIAM's customer-centric approach, ensuring a user-friendly experience while maintaining robust security measures. CIAM software solutions enable you to streamline user registration, manage customer profiles, and implement secure authentication methods. They play a critical role in balancing user convenience and data protection. Benefits Of CIAM Software Solutions ----------------------------------- - Streamlined customer experience: Simplifies and enhances the customer journey by offering streamlined registration processes, personalized interactions, and easy service access. 
- Enhanced security measures: Ensures a secure environment with techniques like MFA, risk-based access controls, and [real-time security monitoring](https://www.memcyco.com/home/real-time-security-monitoring/). - Scalability and flexibility: Provides scalability to accommodate increasing user numbers and diverse needs. - Builds customer trust: You can build customer trust by demonstrating to your customers that you provide safe and secure services.  - Compliance management: Provides tools for consent management, helping organizations obtain and track customer consent. - Unified customer view: Consolidates customer data into a unified view, allowing you to gain insights into customer behavior, preferences, and interactions. ![CIAM Core principles](https://www.rezonate.io/wp-content/uploads/2023/12/CIAM-Core-principles.png) [https://www.wallarm.com/what/caim-customer-identity-and-access-management](https://www.google.com/url?q=https://www.wallarm.com/what/caim-customer-identity-and-access-management&sa=D&source=docs&ust=1702640330283653&usg=AOvVaw2VjnBENu8S1usAX_XFDfby) Key Features To Look For In A CIAM Software Solution ---------------------------------------------------- When evaluating a CIAM software solution, you must ensure it includes all the features required for your business, such as: - Multi-factor authentication (MFA): Strong MFA adds an extra layer of security by verifying users through multiple methods. - Social identity integration: CIAM that integrates with [social media platforms](https://www.oktopost.com/blog/b2b-social-media-trends/) makes user registration easier with familiar social credentials. - Consent management: Tools for efficient tracking, managing user consent, and staying compliant with privacy rules are a must-have. - Scalability: Opt for a scalable CIAM solution that accommodates growth without compromising performance. - Automated provisioning: Select a solution that automates user provisioning and de-provisioning across all assets.  
Top 10 CIAM Software Solutions ------------------------------ Let's review the top CIAM solutions on the market. ### 1\. [Okta](https://www.okta.com/solutions/secure-ciam/) ![Okta](https://www.rezonate.io/wp-content/uploads/2023/12/Okta-1024x501.png) [https://www.okta.com/resources/whitepaper/top-developer-benefits-of-modern-ciam/](https://www.google.com/url?q=https://www.okta.com/resources/whitepaper/top-developer-benefits-of-modern-ciam/&sa=D&source=docs&ust=1702640330276082&usg=AOvVaw3SVMEG8WnbDBjpG9ZXayUy) Okta CIAM is an identity management solution with a developer-friendly approach and minimal custom code requirements to help you implement [Okta best practices](https://www.rezonate.io/blog/okta-security-best-practices/). It offers flexibility for various applications and platforms, including building new apps to enhance account takeover protection. Key Features: - The social authentication feature allows users to sign in with login information from social networking services like Facebook and Google. - Universal Directory feature provides a centralized view of all user identities.  - Single Sign-On and pre-built integrations with various apps.  - Adaptive MFA. Best For: Creating a consistent user experience across multiple platforms.  Price: By inquiry.  [Review:](https://www.gartner.com/reviews/market/access-management/vendor/okta) "Okta's SSO is a great and simple product. For the administrators, the managing capabilities are excellent." ### 2\. [Rezonate](https://www.rezonate.io/) ![Rezonate](https://www.rezonate.io/wp-content/uploads/2023/12/Rezonate.png) While not a CIAM solution, Rezonate is an identity security platform that complements CIAM and IAM strategies. Rezonate addresses the complexity, lack of visibility, and alert fatigue of traditional IAM solutions.  
By prioritizing continuous protection and risk management, Rezonate helps you reduce risks at scale, providing insights into security gaps and activities across identity structures and attack surfaces. Key Features: - Easy one-click deployment, ensuring swift setup within fifteen to sixty minutes for large organizations. - Simple and highly scalable. - Provides comprehensive visibility over accounts, assets, and identity levels. - Utilizes real-time risk scores to identify and address security gaps. - Proactively enforces real-world least privileged access. - Detects malicious impersonation, access rights issues, and excessive privileges before any potential damage occurs. Best For: Continuous protection and risk prioritization. Price: Contact [Rezonate customer support](https://www.rezonate.io/contact-us/) for pricing details. Review: "Rezonate solves our extended identity attack surface, enhancing our security process without slowing us down, keeping us vigilant against threats." ### 3\. [IBM Security Verify](https://www.ibm.com/products/verify-identity) ![IBM Security Verify](https://www.rezonate.io/wp-content/uploads/2023/12/IBM-Security-Verify-1024x576.png) [https://www.g2.com/products/ibm-security-verify/reviews](https://www.google.com/url?q=https://www.g2.com/products/ibm-security-verify/reviews&sa=D&source=docs&ust=1702640330276591&usg=AOvVaw2N5YClsdeBCaPRocySXMiZ) IBM Security Verify provides AI-powered context for consumer and workforce identity and access management (IAM). While Security Verify is designed for enterprises that leverage the cloud, IBM offers a Verify Access version for legacy and on-premises apps.  Key Features: - Single Sign-On for centralized control.  - Advanced authentication, including passwordless and MFA. - AI and ML analyze access patterns and risks, adjusting authentication requirements in real time.  - Consent management templates.  
Best For: Businesses in the process of migrating from [on-prem to the cloud](https://controlplane.com/blog/post/migrating-on-prem-deployments-to-the-cloud).  Price: Pricing for IBM Security Verify software is based on actual usage. You can calculate the cost based on usage [here](https://www.ibm.com/products/verify-identity/pricing). [Review:](https://www.g2.com/products/ibm-security-verify/reviews) "We utilize Verify to validate user identities with our own bespoke multi-factor authentication mechanisms." ### 4\. [Google Cloud Identity](https://cloud.google.com/identity?hl=en) ![Google Cloud Identity](https://www.rezonate.io/wp-content/uploads/2023/12/Google-Cloud-Identity-1024x507.png) [https://cloud.google.com/blog/products/identity-security/getting-started-with-identity-platform](https://www.google.com/url?q=https://cloud.google.com/blog/products/identity-security/getting-started-with-identity-platform&sa=D&source=docs&ust=1702640330277229&usg=AOvVaw088_MsVeVoDqTouawvWREI) Google Cloud Identity is a robust CIAM unified service that offers identity, access, app, and endpoint management. It has out-of-the-box integration with hundreds of cloud applications and also works with many pre-integrated apps.  Key Features: - Multiple MFA methods, including push notifications and Google Authenticator.  - Compatible with Android, iOS, and Windows devices and applications.  - Extensive documentation is available online.  - Enterprise-grade support and SLA. Best For: Highly scalable cloud-hosted applications and businesses with a very large customer base.  Price: Google Cloud Identity follows a pay-as-you-go pricing structure with $300 free credits for new users.  [Review:](https://www.g2.com/products/google-cloud-identity/reviews) "I have had a positive experience with the Google Cloud Identity so far. It protects my user account not just with Google apps but with all browsing across Chrome." ### 5\. 
[CyberArk](https://www.cyberark.com/products/customer-identity/) ![CyberArk](https://www.rezonate.io/wp-content/uploads/2023/12/CyberArk.png) [https://www.cyberark.com/resources/cyberark-identity/upcoming-cyberark-identity-user-interface-enhancements](https://www.google.com/url?q=https://www.cyberark.com/resources/cyberark-identity/upcoming-cyberark-identity-user-interface-enhancements&sa=D&source=docs&ust=1702640330279494&usg=AOvVaw2C42_U0eRlhhSd0Sm45Hgh) CyberArk Customer Identity provides a secure platform for managing application access, endpoints, and network infrastructure. It emphasizes strong security features like privacy, consent, and identity verification.  Key Features: - Pre-built security widgets and open APIs to support developers.  - AI-powered and passwordless MFA.  - Secure and [granular access control](https://www.rezonate.io/blog/essential-user-access-review-template/) for human and machine identities within the DevOps pipeline.  - Secure Single Sign-On experience and AI-powered, password-free authentication. Best For: Development and integration support features.  Price: Offers a free trial, then the price is by inquiry.  [Review:](https://www.g2.com/products/cyberark-identity/reviews) "It manages user's passwords as well as monitors activity to prevent inside threats in our organization." ### 6\. 
[ForgeRock Identity Platform](https://www.forgerock.com/identity-and-access-management-platform) ![ForgeRock Identity Platform](https://www.rezonate.io/wp-content/uploads/2023/12/ForgeRock-Identity-Platform-1024x741.png) [https://backstage.forgerock.com/docs/idcloud/latest/admin-uis.html](https://www.google.com/url?q=https://backstage.forgerock.com/docs/idcloud/latest/admin-uis.html&sa=D&source=docs&ust=1702640330275445&usg=AOvVaw1t336Ia0Y7kb9Dr9zAGxqW) ForgeRock Identity Platform is a versatile solution that supports over 120 integrations, facilitating the integration of solutions for authentication, risk and fraud management, identity proofing, behavioral biometric authentication, and more.  Key Features: - Low-code/no-code interface for easy-to-customize user journeys.  - Automates human and machine identity lifecycle management.  - Provides SDKs for embedding identity into web and mobile apps quickly. - Provides a profile and privacy management dashboard for controlling privacy preferences.  Best For: Hybrid enterprise environments. It supports on-prem, cloud, legacy, and mobile applications.  Price: By inquiry.  [Review:](https://www.g2.com/products/forgerock-forgerock/reviews) "ForgeRock can support legacy systems meanwhile delivering the most cutting-edge IAM solutions with the latest modules." ### 7\. [Auth0](https://auth0.com/ciam) ![Auth0](https://www.rezonate.io/wp-content/uploads/2023/12/Auth0-1024x626.png) [https://auth0.com/docs/get-started/auth0-overview/dashboard](https://www.google.com/url?q=https://auth0.com/docs/get-started/auth0-overview/dashboard&sa=D&source=docs&ust=1702640330280165&usg=AOvVaw1UtEWg6FA6hux_q1vCmGvz) Auth0 is a flexible, drop-in solution that makes it easy to add authentication and authorization to your applications. It is a highly customizable solution, allowing you to unify identity systems across all platforms.  Key Features: - The search feature makes it easy to find, update, and manage user information.  
- Diverse libraries for integration with any programming language. - Gathers insights from consumer applications for product and marketing departments.  - Provides extensive learning resources and documentation about authentication, security, and growth. Best For: Customization and branding.  Price: Offer four plans: Free, Essentials ($35 per month), Professional ($240 per month), and Enterprise. [Review:](https://www.g2.com/products/auth0/reviews) "It has a lot of customization potential so that you can set the level of security according to you." ### 8\. [Microsoft Entra](https://www.microsoft.com/en-us/security/business/microsoft-entra) ![Microsoft Entra](https://www.rezonate.io/wp-content/uploads/2023/12/Microsoft-Entra-1024x802.png) [https://learn.microsoft.com/en-us/entra/identity/devices/manage-device-identities](https://www.google.com/url?q=https://learn.microsoft.com/en-us/entra/identity/devices/manage-device-identities&sa=D&source=docs&ust=1702640330280903&usg=AOvVaw0e-1rMjO1YoFVlxYu3WJ39) Microsoft Entra ID is the new name for [Azure AD](https://www.rezonate.io/blog/defending-azure-active-directory/). It facilitates secure access to a wide array of resources, including Microsoft 365, Azure portal, and numerous SaaS applications. Key Features: - Password synchronization. - Customizable single sign-on (SSO) portals for each user. - Authentication support for on-premises applications - Support SMS codes, phone calls, mobile app notifications, and biometrics. Best For: Organizations with cloud-based infrastructure that utilize Microsoft services such as Office 365 and Azure. Price: Offers four pricing plans: Free, P1 ($6.00 user/month), P2 ($9.00 user/month), and Governance ($7.00 user/month).  [Review:](https://www.gartner.com/reviews/market/decentralized-identity-solutions/vendor/microsoft/product/microsoft-entra-verified-id) "MS Entra verified ID can scale up or scale down the access based on the credentials of the employees, which is very useful." 
### 9\. [Ping Identity](https://www.pingidentity.com/en/platform/solutions/pingone-for-customers.html) ![Ping Identity](https://www.rezonate.io/wp-content/uploads/2023/12/Ping-Identity-1024x1024.png) [https://www.pingidentity.com/en/platform/capabilities/authentication-authority/pingfederate.html](https://www.google.com/url?q=https://www.pingidentity.com/en/platform/capabilities/authentication-authority/pingfederate.html&sa=D&source=docs&ust=1702640330281705&usg=AOvVaw3aSBmYDwoiShBgtRY2bX-H) PingOne for Customers is a cloud-based identity solution combining no-code identity orchestration, user management, and multi-factor authentication. It's easy to set up and integrate with other identity providers.  Key Features: - Single Sign On and MFA authentication.  - Creates a single view of the customer by unifying customer profiles across identity silos.  - Enforces safe data sharing in line with data privacy best practices.  - Low-code/no-code orchestration capabilities. Best For: Integration with other Ping products and services.  Price: By inquiry.  [Review:](https://www.g2.com/products/ping-identity/reviews) "Ping Identity is super easy to use and configure; their support team has helped us with any issues we've had during setup." ### 10\. [One Login](https://www.onelogin.com/solutions/ciam) ![One Login](https://www.rezonate.io/wp-content/uploads/2023/12/One-Login-1024x675.png) [https://onelogin.service-now.com/kb_view_customer.do?sysparm_article=KB0010427](https://www.google.com/url?q=https://onelogin.service-now.com/kb_view_customer.do?sysparm_article%3DKB0010427&sa=D&source=docs&ust=1702640330282301&usg=AOvVaw2wXOugaNpeNDjRAL6uRP7h) OneLogin provides a single CIAM solution for managing customer identities across multiple platforms. OneLogin facilitates integration with custom apps and third-party systems, plus it allows you to add custom branding and colors.  
Key Features: - Provides one-click access for all devices, to all enterprise cloud and on-prem applications.  - Session management architecture requires re-authentication after a pre-defined period of inactivity.  - Single Sign On and social sign on authentication methods for streamlined customer experience.  - Multi-factor authentication, biometrics, and risk-based authentication. Best For: Managing customer identities across multiple channels Price: By inquiry.  [Review:](https://www.g2.com/products/onelogin/reviews) "OneLogin is like a one-stop shop. I can access different apps/websites in just one place." CIAM + IAM = Continuous Protection And Risk Reduction ----------------------------------------------------- Pair a CIAM solution with Rezonate's identity security platform to protect your customers, partners, and employees from identity breaches.  Rezonate simplifies the complexity of IAM by visualizing and profiling your identity and access weak spots. Rezonate makes real time security simple and easy to understand, so you can start finding, prioritizing, and mitigating risks within the first hour of deployment.  [See Rezonate in action](https://www.rezonate.io/demo/) today.
yayabobi
1,704,018
Day 6 - The 12 Days of DEV: June 2023
Greetings on Day 6 of "The 12 Days of DEV"! 🎄🎉 As we approach the year's end, let's journey through...
25,788
2023-12-25T08:00:00
https://dev.to/devteam/day-6-the-12-days-of-dev-june-2023-3kam
bestofdev, devimpact2023
_Greetings on Day 6 of "The 12 Days of DEV"! 🎄🎉 As we approach the year's end, let's journey through the top two articles of June 2023._ _This compilation showcases the community's favorites, determined by a mix of comments, reactions, and page views (with just a very subtle touch of editorial curation). And while these articles shine as the most popular, it's worth noting that they vary from the DEV team's chosen Weekly Top 7 (though there might be some shared gems.) 🌟_ --- {% embed https://dev.to/aralroca/say-goodbye-to-spread-operator-use-default-composer-3c2j %} A game-changer in JavaScript object manipulation. [@aralroca](https://dev.to/aralroca) instructs how the "default-composer" library, at just ~300B, simplifies setting default values for nested objects. Say goodbye to the complexity of spread operators and Object.assign(). Dive into cleaner, more maintainable code and explore the possibilities --- {% embed https://dev.to/shnai0/how-i-build-my-first-open-source-project-with-chatgpt-and-nextjs-10k-users-in-24-hours-2m7n %} [@shnai0](https://dev.to/shnai0) conquered coding challenges in their debut Open Source project using ChatGPT and Next.js. With 10k users in just 24 hours, the LinkedIn Post Generator became a viral sensation. Revisit the details of their process, from initial setup to navigating open source projects, and learn more about code comprehension with ChatGPT. --- _Stay tuned for the next installment of "The 12 Days of DEV" as we continue to unwrap the community's favorite articles. Happy reading, and we look forward to sharing more highlights with you! 📚🎁_
thepracticaldev
1,704,154
Test
This is a test
0
2023-12-20T21:21:04
https://dev.to/admclamb/test-1jg9
webdev
This is a test
admclamb
1,704,207
📌 Azure Monitoring with Linux Web App
I. Description 📝 This scenario addresses the monitoring services you can use and describes...
0
2024-01-06T08:00:00
https://dev.to/brainboard/azure-monitoring-with-linux-web-app-l9g
## I. Description 📝

This scenario addresses the monitoring services you can use and describes a dataflow model for use with multiple data sources. When it comes to monitoring, many tools and services work with Azure deployments. In this scenario, we choose readily available services precisely because they are easy to consume.

## II. Architecture components 🏛️

Let's break down each resource:

- azurerm_resource_group: This resource creates a Resource Group named "rg-main". Resource Groups in Azure are a fundamental entity used to group related resources for an application, making management, deployment, and monitoring easier.
- azurerm_virtual_network: Defines a Virtual Network (VNet) named "vnet-kube". VNets enable Azure resources to securely communicate with each other, the internet, and on-premises networks.
- azurerm_subnet: There are three subnet resources defined: subnet_webapp, subnet_db, and subnet_monitoring. Each subnet is a range of IP addresses in the VNet. They enable you to segment the network, improving security and performance. The subnets are named according to their intended use (web apps, databases, and monitoring).
- azurerm_public_ip: Creates a public IP address named "pip-kubernetes". Public IP addresses allow Azure resources to communicate with the internet and other Azure services.
- azurerm_mariadb_server: This resource sets up a MariaDB server, which is a fully managed database service. The configuration includes version, storage size, administrator login details, and other settings.
- azurerm_mariadb_database: Defines a MariaDB database named "mariadb_database" within the above MariaDB server. It specifies the character set and collation for the database.
- azurerm_application_insights: Creates an Application Insights resource for monitoring the performance and usage of your apps. It's essential for diagnostics and telemetry.
- azurerm_log_analytics_workspace: Sets up a Log Analytics workspace named "acctest-01". This workspace is used for managing and analyzing data logs collected by Azure services.
- azurerm_linux_web_app: Creates a Linux-based web app service. It's part of the Azure App Service platform, which is used for hosting web applications.
- azurerm_service_plan: Defines a service plan named "serviceplan", which specifies the hosting tier for the Azure web app. Service plans determine the location, features, cost, and compute resources associated with your web app.
- azurerm_storage_account: Sets up a storage account named "storageaccountname". Azure Storage Accounts provide scalable cloud storage for data objects, file systems, messaging stores, and NoSQL stores.
- azurerm_monitor_action_group: Creates an action group for Azure Monitor, which is used to define a set of actions to be executed when an alert is triggered.
- azurerm_portal_dashboard: Establishes a custom dashboard in the Azure Portal named "my-dashboard". Dashboards are used for monitoring resources and data visualization.
- azurerm_monitor_diagnostic_setting: Configures diagnostic settings for the MariaDB server. It specifies how metrics and logs are collected and stored, including integration with Log Analytics and a storage account.

## III. Variables

In Terraform, a variable is a way to store and reuse values throughout your Terraform code. Variables are defined using the variable block and can be used to parameterize your Terraform code, making it more flexible and reusable.

![Variables](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/49p6idxni9v6xcpv0jm4.png)

## IV. Readme

The readme file refers to a text file that provides information about the architecture, its features, requirements, installation instructions, and usage instructions.

![Readme](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/xpa3rr8uasqtmrxjhg3v.png)

- The readme file will be displayed in the template's description when you publish your architecture.
- The readme file will be pushed to git when you are using git as your repository.
- The readme file will be cloned along with the design of your architecture.

## V. How to use the architecture

To use this architecture, clone it within your project and change the following components: first, change the configuration of the cloud provider. In order to use the architecture, you need to have a Kubernetes cluster in place and change the resource group and name of the Kubernetes cluster inside the configuration. Then change the variables:

![How to use the architecture](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/rcd456pt8c5h7lghm5gd.png)

## VI. CI/CD 😍

You also have a complete CI/CD engine that allows you to check the security posture, estimate the cost of the infrastructure before deploying it, and make sure that it respects your requirements.

![CI/CD](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/0f0br19afpsk1cjqep61.png)

👉 You can use the template here: https://app.brainboard.co
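As an illustration of the variable block mentioned in section III, a Terraform variable declaration looks like the sketch below. The names and defaults here are hypothetical, not taken from the actual template:

```hcl
# Hypothetical variables, shown only to illustrate the syntax;
# the real template defines its own set.
variable "location" {
  type        = string
  description = "Azure region to deploy into"
  default     = "westeurope"
}

variable "resource_group_name" {
  type        = string
  description = "Name of the resource group, e.g. rg-main"
}
```

A variable is then referenced elsewhere as `var.location`, and can be overridden with `-var` flags or a `terraform.tfvars` file.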
tarakbrainboard
1,704,239
Data Scientist turned Dev Advocate
Hi Dev community! Are there people here who were data scientists or machine learning engineers and...
0
2023-12-21T01:17:17
https://dev.to/sonamgupta1105/data-scientist-turned-dev-advocate-733
ai, llm, machinelearning
Hi Dev community! Are there people here who were data scientists or machine learning engineers and now pivoting or already pivoted to being a developer advocate? I'd love to chat. :)
sonamgupta1105
1,704,375
Building a FAQ Bot: A React ChatBotify Guide (Part 3)
Introduction Welcome to the third installment of our comprehensive guide, "Building a...
24,084
2023-12-21T05:50:35
https://tjtanjin.medium.com/building-a-faq-bot-a-react-chatbotify-guide-part-3-7ce13d09933e
react, javascript, typescript, npm
## Introduction

![Demo GIF](https://cdn-images-1.medium.com/v2/resize:fit:800/1*cDX9C7JuNHM0QNgmWngehg.gif)

Welcome to the third installment of our comprehensive guide, "Building a FAQ Bot: A React ChatBotify Guide." If you've ever found yourself grappling with repetitive user queries, you've likely pondered the benefits of an FAQ chatbot. In this segment, we will explore how we can build a customized FAQ chatbot to relay commonly requested information to your users!

## A Quick Recap

In the previous parts of this series, we equipped you with a setup guide in [Part 1](https://dev.to/tjtanjin/how-to-setup-a-chatbot-with-react-chatbotify-a-step-by-step-tutorial-56d6) and guided you through establishing the basic appearance and conversation structure of your chatbot in [Part 2](https://dev.to/tjtanjin/tailoring-a-chat-bot-a-react-chatbotify-guide-part-2-2f3j). If you're just joining us, or if you need a refresher on these fundamental steps, be sure to check out the earlier installments. Note that this segment assumes you already have a [React ChatBotify](https://react-chatbotify.tjtanjin.com) chatbot setup. If you have not, then do visit [this guide](https://dev.to/tjtanjin/how-to-setup-a-chatbot-with-react-chatbotify-a-step-by-step-tutorial-56d6) first.

As we venture into Part 3, we'll dive into one of the most common use cases of a chatbot: creating an FAQ bot that not only streamlines responses but also enhances user satisfaction and operational efficiency. By the end of this guide, you'll have the knowledge and tools to implement an intelligent FAQ bot using [React ChatBotify](https://react-chatbotify.tjtanjin.com).

## Crafting Options

Building a chatbot to answer FAQs is very easily done with [React ChatBotify](https://react-chatbotify.tjtanjin.com). In fact, if you already have the chatbot setup from the [previous guide](https://dev.to/tjtanjin/tailoring-a-chat-bot-a-react-chatbotify-guide-part-2-2f3j), then you can easily build off it!
However, for the purpose of making this tutorial complete, let's assume we have a clean setup with a bot that greets the user! This can be achieved with the following code snippet:

```
// MyChatBot.js
import React from "react";
import ChatBot from "react-chatbotify";

const MyChatBot = () => {
  const flow = {
    start: {
      message: "Hello, I am a FAQ Bot!"
    }
  }
  return (
    <ChatBot flow={flow}/>
  );
};

export default MyChatBot;
```

At this point, we have a very dull chatbot that does nothing more than greet the user. We can easily have it show options to users by adding an `options` attribute along with an array of options. Let's quickly include three options (Examples, Github & Discord) in the chatbot below:

```
// MyChatBot.js
import React from "react";
import ChatBot from "react-chatbotify";

const MyChatBot = () => {
  const flow = {
    start: {
      message: "Hello, I am a FAQ Bot!",
      options: ["Examples", "Github", "Discord"]
    }
  }
  return (
    <ChatBot flow={flow}/>
  );
};

export default MyChatBot;
```

With one simple change, we have now instructed the chatbot to provide options for the users to click. That said, if you click on the options, you will notice that you are not getting any responses from the chatbot. Let us now take a look at how we can craft our chatbot responses!

## Crafting Responses

In order to provide responses, we first need to process the user-selected option. Create a second `process_options` block in which we will do our processing, and ensure that the path to it is specified from the first block.
The code snippet below shows the complete second block, but not to worry, we will break it down in detail:

```
// MyChatBot.js
import React from "react";
import ChatBot from "react-chatbotify";

const MyChatBot = () => {
  const flow = {
    start: {
      message: "Hello, I am a FAQ Bot!",
      options: ["Examples", "Github", "Discord"],
      path: "process_options"
    },
    process_options: {
      message: (params) => {
        let link = "";
        switch (params.userInput) {
          case "Examples":
            link = "https://react-chatbotify.tjtanjin.com/docs/examples/basic_form";
            break;
          case "Github":
            link = "https://github.com/tjtanjin/react-chatbotify/";
            break;
          case "Discord":
            link = "https://discord.gg/6R4DK4G5Zh";
            break;
          default:
            return "unknown_input";
        }
        setTimeout(() => {
          window.open(link);
        }, 1000)
        return `Sit tight! I'll send you to ${params.userInput}!`;
      },
    }
  }
  return (
    <ChatBot flow={flow}/>
  );
};

export default MyChatBot;
```

In the `process_options` block above, we define a `message` attribute. However, the `message` is determined dynamically by the user's choice of input in the `start` block. Notice first that the `message` attribute takes in [`params`](https://react-chatbotify.tjtanjin.com/docs/api/params/), which contains values that may be used in application logic. A full list of accessible values is found in the [API documentation](https://react-chatbotify.tjtanjin.com/docs/api/params/), and for the purpose of this example, we are using the `userInput` value, which contains the option selected by the user from the `start` block. The `userInput` is used in the example above to determine which link to visit before informing the user to sit tight and sending the user to said link after a 1-second delay.
You can furthermore have the chatbot loop by making a quick addition of a `loop` block:

```
// MyChatBot.js
import React from "react";
import ChatBot from "react-chatbotify";

const MyChatBot = () => {
  const flow = {
    start: {
      message: "Hello, I am a FAQ Bot!",
      options: ["Examples", "Github", "Discord"],
      path: "process_options"
    },
    process_options: {
      message: (params) => {
        let link = "";
        switch (params.userInput) {
          case "Examples":
            link = "https://react-chatbotify.tjtanjin.com/docs/examples/basic_form";
            break;
          case "Github":
            link = "https://github.com/tjtanjin/react-chatbotify/";
            break;
          case "Discord":
            link = "https://discord.gg/6R4DK4G5Zh";
            break;
          default:
            return "unknown_input";
        }
        setTimeout(() => {
          window.open(link);
        }, 1000)
        return `Sit tight! I'll send you to ${params.userInput}!`;
      },
      transition: {duration: 1},
      path: "loop"
    },
    loop: {
      message: "Do you need any more help?",
      options: ["Examples", "Github", "Discord"],
      path: "process_options"
    },
  }
  return (
    <ChatBot flow={flow}/>
  );
};

export default MyChatBot;
```

Just to give a quick reminder: apart from the `start` block, you can name the other blocks however you want. That said, you are still strongly encouraged to give descriptive names, such as the `process_options` and `loop` blocks shown above.

All that said, if you have followed the guide closely up till now, you will have ended up with a FAQ bot similar to the one shown in the [live example](https://react-chatbotify.tjtanjin.com/docs/examples/faq_bot/):

![Demo ChatBot](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/fahlkuaefxc3s8p6ypy2.png)

You will notice differences from the live example, but that's because the example is more extensive and there are multiple ways to go about achieving the same outcome. So, go ahead and use the playground for your exploration and experimentation to see what suits you best!
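One loose end worth flagging: the `default` case in the snippets above returns the string `"unknown_input"`, yet the flow defines no block with that name, and a string returned from a `message` function is shown to the user as a message rather than used for routing. A hedged sketch of one way to close the gap is to route unrecognized input through a dedicated block that slots in alongside the existing `process_options` block. Note that the `unknown_input` block name and the dynamic `path` function here are my own additions, not part of the original tutorial - check the React ChatBotify docs for the exact attribute semantics in your version:

```javascript
// Sketch of routing unrecognized options to a dedicated block.
// Assumes the library accepts a function for `path` that returns the
// name of the next block; adjust to the API version you are using.
const knownOptions = ["Examples", "Github", "Discord"];

const flow = {
  start: {
    message: "Hello, I am a FAQ Bot!",
    options: knownOptions,
    // route known options to processing, everything else to a fallback block
    path: (params) =>
      knownOptions.includes(params.userInput) ? "process_options" : "unknown_input",
  },
  unknown_input: {
    message: "Sorry, I don't recognize that option. Please pick one below!",
    options: knownOptions,
    path: "process_options",
  },
};

// The routing logic can be exercised outside the chatbot:
console.log(flow.start.path({ userInput: "Github" })); // "process_options"
console.log(flow.start.path({ userInput: "banana" })); // "unknown_input"
```

Because the `path` function is plain JavaScript, it is easy to unit-test in isolation before wiring it into the chatbot.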
## Conclusion

In this guide, we took a quick look at how we can build an FAQ chatbot with [React ChatBotify](https://react-chatbotify.tjtanjin.com/). In the [next guide](https://dev.to/tjtanjin/how-to-build-and-integrate-a-react-chatbot-with-llms-a-react-chatbotify-guide-part-4-3gbk), we will take a look at another exciting use case - building dynamic conversations with your chatbot via integration with [Gemini](https://ai.google.dev)!
tjtanjin
1,704,383
Serverless Sucks: How to Deploy your Next.js App to a VPS and Setup a CI/CD Pipeline
In the vibrant world of web development, Next.js has emerged as a standout framework for crafting...
0
2023-12-21T06:05:02
https://blog.designly.biz/serverless-sucks-how-to-deploy-your-next-js-app-to-a-vps-and-setup-a-ci-cd
nextjs, cicd, vps, nginx
In the vibrant world of web development, Next.js has emerged as a standout framework for crafting sophisticated web applications. While the convenience and simplicity of serverless platforms like Vercel are appealing, they don't always align with every project's needs. This article is dedicated to exploring the advantages of using a Virtual Private Server (VPS) and guiding you through the process of deploying a Next.js application on it.

### Pros and Cons: Serverless Computing vs. Virtual Private Servers (VPS)

When considering serverless computing, its benefits include scalable architecture, cost-effectiveness for fluctuating workloads, reduced operational overhead, quicker deployment times, and built-in high availability. However, challenges such as cold start issues, limited environmental control, and potential vendor lock-in are notable.

In contrast, a VPS offers complete control over the server environment, predictable costs, consistent performance, and enhanced security control. The downsides include the need for regular maintenance, complexities in scaling, potentially higher costs for small or variable workloads, the requirement for technical expertise, and the risk of resource underutilization.

Despite the perceived complexity of using a VPS, this guide aims to simplify the process. From choosing the right VPS provider to configuring your server and setting up a CI/CD pipeline, I'll provide comprehensive, step-by-step guidance. By the end, you'll not only have a functioning Next.js application on a VPS but also the confidence to manage it with ease.

### Selecting the Right VPS Provider

Choosing an appropriate VPS provider is a pivotal step. Considerations include balancing scalability with cost-effectiveness. Recommended providers are:

1. **Amazon Web Services (AWS)**: Ideal for scalable needs, AWS offers a free tier for starters, easy upgrade paths, and robust backup solutions.
2. **Linode**: Known for its simplicity and customer support, Linode provides straightforward pricing and reliable service, suitable for various project sizes.
3. **Hostinger**: An affordable option, Hostinger is user-friendly and well-suited for smaller to medium-sized projects.

Each provider caters to different requirements. Your selection should align with your project's traffic expectations, budget, and technical demands. For personal projects, I prefer Hostinger, while AWS serves most of my larger clients. If you're inclined towards Hostinger, using my referral link at the article's end supports me!

---

### Preparing Your VPS: Essential Software Installation

Once your VPS is up and running, the next step is installing key applications:

1. **NGINX**: A fast, configurable HTTP server, NGINX will manage all requests to your Next.js app, including HTTPS connections.
2. **Certbot**: This tool automates obtaining free SSL certificates from LetsEncrypt.org, crucial for secure communication.
3. **GitHub CLI (gh)**: Essential for CI/CD pipeline integration, it enables efficient interaction with GitHub repositories.
4. **Node.js**: We'll need it for building and running our Next.js app.
First, make sure the system is up to date:

```
sudo apt update; sudo apt -y dist-upgrade
```

Restart, if needed, then install our apps:

```
sudo apt -y install nginx gh
sudo snap install --classic certbot
```

We'll install Node.js from NodeSource:

```
sudo apt-get install -y ca-certificates curl gnupg
sudo mkdir -p /etc/apt/keyrings
curl -fsSL https://deb.nodesource.com/gpgkey/nodesource-repo.gpg.key | sudo gpg --dearmor -o /etc/apt/keyrings/nodesource.gpg
# Add the NodeSource repo (pick the Node.js major version you need):
echo "deb [signed-by=/etc/apt/keyrings/nodesource.gpg] https://deb.nodesource.com/node_20.x nodistro main" | sudo tee /etc/apt/sources.list.d/nodesource.list
sudo apt-get update
sudo apt-get install nodejs -y
```

Lastly, we'll install the latest NPM and PNPM:

```
sudo npm i -g npm pnpm
```

---

## Setting Up the Build Environment

We'll need a place to store our Git repository, so we'll create a system user called `web`, and we'll keep our repo in the home directory:

```
sudo adduser web
```

You can fill out the finger information if you want, or just leave it all blank. Be sure to choose a secure password. You'll never need to remember it, so you can randomly generate a strong one.

Now you can switch to that user by running `sudo su web`. Next, let's pull our Next.js app from GitHub for the first time:

```
cd
gh auth login
```

GitHub will ask you if you want to log in to GitHub.com or an enterprise server; choose GitHub.com. Next, choose SSH as your preferred protocol, and an SSH key will be generated automatically. Then choose "login with web browser." There's no X server on a VPS, so you'll need to manually go to `https://github.com/login/device` and enter the code displayed in the terminal. Once you enter the code into the web browser, you should be logged in.

Now we can pull our repo like this:

```
gh repo clone myusername/myapp
```

You should now have a "myapp" directory in the home directory of the web user. Next, we're going to rename it to "myapp-a" and then copy the directory to "myapp-b":

```
mv myapp myapp-a
cp -r myapp-a myapp-b
```

The reason we're doing this is so we can have a directory for our live running server and one for building when we run our CI/CD script.
That way there will be no interruption during the build. So what we'll do is create a symlink that points to `myapp-a` to start:

```
ln -s ./myapp-a myapp-live
```

---

## Creating our CI/CD Scripts

Let's write a few scripts that will allow us to build our app with a single command. So go ahead and fire up pico:

```
pico /usr/local/bin/myapp-build
```

Now enter the following code, changing it to suit your needs:

```bash
#!/bin/bash

# add the following line to sudoers:
# web ALL=(root) NOPASSWD: /usr/sbin/service myapp restart

# Define the paths
symlink_path="/home/web/myapp-live"
path_a="/home/web/myapp-a"
path_b="/home/web/myapp-b"
service_name="myapp"

# Determine the current target of the symlink
current_target=$(readlink -f "$symlink_path")

# Decide which path is not in use
if [ "$current_target" = "$path_a" ]; then
    unused_path="$path_b"
    switch_to="b"
else
    unused_path="$path_a"
    switch_to="a"
fi

# Change to the unused path
echo "Changing directory"
cd "$unused_path" || { echo "Failed to change directory to $unused_path"; exit 1; }

# Pull from GitHub
echo "Pulling from GitHub..."
git pull

# Run pnpm install
echo "Installing dependencies..."
pnpm install || { echo "Dependency installation failed"; exit 1; }

# Run npm run build
echo "Running Next.js build..."
npm run build

# Check if the build succeeded
if [ $? -eq 0 ]; then
    echo "Build succeeded. Proceeding with the switch and service restart."

    # Switch to the newly built version
    myapp-switch "$switch_to" || { echo "Failed to switch to $switch_to"; exit 1; }

    echo "Restarting myapp service..."
    sudo service "$service_name" restart

    echo "Successfully switched to $unused_path"
else
    echo "Build failed. Aborting deployment."
    exit 1
fi
```

Now let's create another script, `/usr/local/bin/myapp-switch`:

```bash
#!/bin/bash

# Check if the argument is either "a" or "b"
if [ "$1" = "a" ] || [ "$1" = "b" ]; then
    # Assign the link path
    linkPath=/home/web/myapp-live

    # Remove the existing symlink if it exists
    rm -f "$linkPath"

    # Create the symbolic link
    ln -s "/home/web/myapp-$1" "$linkPath"
else
    # Print an error message if the argument is not "a" or "b"
    echo "Error: Argument must be 'a' or 'b'."
    exit 1
fi
```

Now we'll create another script that will allow us to quickly roll back to the previous deployment if something goes wrong with the current one. We'll put it at `/usr/local/bin/myapp-rollback`:

```bash
#!/bin/bash

# Define the paths
symlink_path="/home/web/myapp-live"
path_a="/home/web/myapp-a"
path_b="/home/web/myapp-b"
service_name="myapp"

# Function to switch the symlink
switch_symlink() {
    target=$1
    echo "Switching to $target"
    rm -f "$symlink_path"
    ln -s "$target" "$symlink_path"
    echo "Switched symlink to $target"
}

# Determine the current target of the symlink
current_target=$(readlink -f "$symlink_path")

# Decide which path to switch to
if [ "$current_target" = "$path_a" ]; then
    switch_symlink "$path_b"
elif [ "$current_target" = "$path_b" ]; then
    switch_symlink "$path_a"
else
    echo "Current symlink target is not recognized."
    exit 1
fi

sudo service "$service_name" restart
echo "Rollback completed."
```

Lastly, we need to make those scripts executable, so just run:

```
chmod +x /usr/local/bin/*
```

---

## Setting Up The Service

Ok, we're getting close! We're going to need to run our Next.js app as a `systemd` service. This will take care of starting up our app at system boot, restarting on failure, and logging. To do that, we'll need to create a config file at `/etc/systemd/system/myapp.service`:

```
[Unit]
Description=An Awesome Next.js App!
After=network.target

[Service]
Type=simple
User=web
Group=web
WorkingDirectory=/home/web/myapp-live
ExecStart=/bin/sh -c '/usr/bin/npm start >> /var/log/myapp/myapp.log 2>&1'
Restart=on-failure
StandardOutput=null
StandardError=null

[Install]
WantedBy=multi-user.target
```

What this does is run `npm start` as the web user and redirect `stderr` and `stdout` to `/var/log/myapp/myapp.log`. It also directs `systemd` to restart the app on failure, and it makes sure that the network is up and reachable before starting (particularly important if you have remote API calls at build time).

Now we need to create the log directory and make sure it's writable by the web user:

```
mkdir /var/log/myapp
chown web:web /var/log/myapp
```

TIP: you can watch the log live once the server is up by typing: `tail -f /var/log/myapp/myapp.log`.

Now we need to reload `systemd` and enable the service:

```
systemctl daemon-reload
systemctl enable myapp
```

Now we don't want the log file to just grow and grow forever, so we'll need to leverage `logrotate` to manage archiving our log file. To do that, we'll create a config file at `/etc/logrotate.d/myapp`:

```
/var/log/myapp/myapp.log {
    daily
    missingok
    rotate 7
    compress
    delaycompress
    notifempty
    create 640 web web
    sharedscripts
    postrotate
        systemctl restart myapp
    endscript
}
```

What this does is: every day, `logrotate` will compress the current log file to `gz` format and then restart the service so a new log file is created. It keeps seven days of logs. It also makes the log file readable only by the web user.

---

## Setting Up NGINX

We're going to use NGINX to create a reverse proxy to our Next.js app. NGINX will handle the multitude of incoming connections and also handle our SSL. To get started, navigate to `/etc/nginx/sites-enabled` and delete the default site config file there. Then back out and go to `../sites-available`.
We'll create a config file called `myapp.com`:

```
server {
    server_name myapp.com;

    error_page 502 =200 /fallback.html;

    location / {
        proxy_pass http://localhost:3000;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
        proxy_intercept_errors on;
    }

    location = /fallback.html {
        root /home/web/myapp-live/static/;
        internal;
    }
}
```

So this will forward any incoming connection to `http://localhost:3000`. If Next.js is down, it will serve `fallback.html`. You'll need to create your own static HTML file for this. You can say something like "We're currently in maintenance mode."

To enable this config, just run `ln -s /etc/nginx/sites-available/myapp.com /etc/nginx/sites-enabled/.` Then type `systemctl reload nginx`.

---

## Setting Up SSL via Certbot

Before we run certbot, you'll need to make sure that both ports 80 and 443 are open on your VPS. You'll need to keep port 80 open for certbot to run automatically to renew your cert. Certbot will automatically update your NGINX config file for the SSL config and create a rewrite rule to redirect non-SSL connections to SSL.

To run certbot, all you need to do is run:

```
certbot --nginx
```

You'll need to agree to the terms and enter an admin email address to get notifications about certs that are expiring soon. Next, you should see your `myapp.com` domain listed. Select your domain and hit enter.

TIP: If you have multiple domains, you can just hit enter and it will create a combined certificate for all of the listed domains.

Hopefully you see the "congratulations" message, which means your certificate is installed and NGINX is already reloaded and ready to go! If you get an error, you most likely have either a DNS or firewall problem.
Make sure your DNS points to the *public* IP address of your VPS and that you have ports 80 and 443 open on your firewall.

## The Last Step: Building Our App

We need to do one last thing before we build our app: allow the web user to restart the service as root once the build is complete. To do this, run the command `visudo` and enter the following line:

```
web ALL=(root) NOPASSWD: /usr/sbin/service myapp restart
```

Now switch back to the web user and navigate to your home directory (`cd`). Then run the build command `myapp-build`. PNPM will install the dependencies and the app should start building. If the build completes successfully, the service should start. Check your domain name in the browser. Hopefully you see your site up and running!

If I missed anything in this guide, if you have anything to add, or just want to express your thanks, please feel free to leave a comment! 😊

## Resources

- [Setting up an AWS EC2 instance](https://towardsdatascience.com/setting-up-aws-ec2-instance-for-beginners-e34fa71a4758)
- [Setting up a Linode VPS](https://www.linode.com/docs/products/compute/compute-instances/guides/set-up-and-secure/)
- [Setting up a Hostinger VPS](https://www.hostinger.com/tutorials/getting-started-with-vps-hosting)

Also, if you decide to go with Hostinger, please [follow this link](https://hostinger.com?REFERRALCODE=1J11864) so I can get a little credit. Thanks!

---

Thank you for taking the time to read my article and I hope you found it useful (or at the very least, mildly entertaining). For more great information about web dev, systems administration and cloud computing, please read the [Designly Blog](https://blog.designly.biz). Also, please leave your comments! I love to hear thoughts from my readers.

If you want to support me, please follow me on [Spotify](https://open.spotify.com/album/2fq9S51ULwPmRM6EdCJAaJ?si=USeZDsmYSKSaGpcrSJJsGg)!

Looking for a web developer? I'm available for hire!
To inquire, please fill out a [contact form](https://designly.biz/contact).
designly
1,704,416
RAID Server commands | tw_cli | MegaCli | mdadm
This post covers the important commands of the RAID server tools MegaCli, tw_cli, and mdadm
0
2023-12-21T06:30:26
https://dev.to/hasone/raid-server-commands-twcli-megacli-mdadm-3apk
linux, raid, sysadmin, server
---
title: RAID Server commands | tw_cli | MegaCli | mdadm
published: true
description: This post covers the important commands of the RAID server tools MegaCli, tw_cli, and mdadm
tags: #linux #raid #sysadmin #server
cover_image: https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ttd2w776rmrjhuzs5mpt.png
# Use a ratio of 100:42 for best results.
# published_at: 2023-12-21 06:10 +0000
---

#### Basic terms

- RAID: Redundant Array of Independent Disks. It's a data storage virtualization technology.
- Controller: The hardware that manages the RAID array.
- Unit: A virtual drive that consists of one or more physical drives.
- Drive: A physical hard drive or SSD.
- Rebuild: The process of recreating lost data on a new drive.
- Parity: A mathematical method to recreate data lost from a disk in the array.
- Striping: Data is split across multiple disks.
- Mirroring: Data is duplicated on two or more disks.

#### Debugging

When debugging, it's important to check the status of your RAID controller and units regularly. If you see a status of `DEGRADED`, this means one or more drives have failed. If you see a status of `REBUILDING`, this means the RAID is currently rebuilding data on a new drive.
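If you script your status checks, a tiny helper that normalizes those status strings into coarse health states can be handy. This is a hedged sketch: the exact wording varies by tool and firmware (e.g. `tw_cli` reports `DEGRADED`/`REBUILDING`, while MegaCli output uses forms like `Degraded` and `Rebuild`), so verify the patterns against your controller's actual output before relying on them:

```shell
# Classify a RAID status string into one of three coarse health states.
# The matched substrings are illustrative; adjust them to what your
# controller actually prints.
raid_health() {
    case "$1" in
        *DEGRADED*|*Degraded*|*degraded*) echo "DEGRADED" ;;
        *REBUILD*|*Rebuild*|*rebuild*)    echo "REBUILDING" ;;
        *OK*|*Optimal*|*clean*|*active*)  echo "OK" ;;
        *)                                echo "UNKNOWN" ;;
    esac
}

raid_health "u0 RAID-5 DEGRADED"   # prints DEGRADED
raid_health "u0 RAID-5 OK"         # prints OK
```

A helper like this makes it straightforward to feed controller output into a cron job or monitoring dashboard that only alerts on `DEGRADED` or `UNKNOWN`.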
## MegaCli Commands

- **View Adapter Information**:

```bash
MegaCli -AdpAllInfo -aALL
```

- **View the Configuration of the Controller**:

```bash
MegaCli -CfgDsply -aALL
```

- **List All Drives**:

```bash
MegaCli -PDList -aALL
```

- **View the Physical Disk Information**:

```bash
MegaCli -PDInfo -PhysDrv [Enclosure:Slot] -aALL
```

- **Check the Virtual Drive (Logical Drive) Information**:

```bash
MegaCli -LDInfo -Lall -aALL
```

- **View the Battery Backup Unit (BBU) Status**:

```bash
MegaCli -AdpBbuCmd -aALL
```

- **Check for Consistency of RAID Configuration**:

```bash
MegaCli -LDCC -aALL
```

- **Rebuild a Drive**:

```bash
MegaCli -PDRbld -Start -PhysDrv [Enclosure:Slot] -aALL
MegaCli -PDRbld -Stop -PhysDrv [Enclosure:Slot] -aALL
MegaCli -PDRbld -ShowProg -PhysDrv [Enclosure:Slot] -aALL
```

- **Identify a Failed Drive**:

```bash
MegaCli -PDList -aALL | grep -E 'Slot Number|Firmware state'
# OR
MegaCli -CfgDsply -aALL | awk '/Physical Disk:/{print ""; print; next} /Slot Number|Firmware state|Enclosure Device ID:|Inquiry Data|Device Id/{print}'
```

- **Replace a Failed Drive**:

```bash
MegaCli -PDOffline -PhysDrv [Enclosure:Slot] -aALL
MegaCli -PDMarkMissing -PhysDrv [Enclosure:Slot] -aALL
MegaCli -PDPrpRmv -PhysDrv [Enclosure:Slot] -aALL
```

- **View Event Log**:

```bash
MegaCli -AdpEventLog -GetEvents -f events.log -aALL
```

- **Silence Alarm**:

```bash
MegaCli -AdpSetProp AlarmSilence -aALL
```

## mdadm commands

- To create a RAID 5 array:

```bash
mdadm --create --verbose /dev/md0 --level=5 --raid-devices=3 /dev/sda1 /dev/sdb1 /dev/sdc1
```

- To view the status of the RAID:

```bash
cat /proc/mdstat
```

- To stop a RAID array:

```bash
mdadm --stop /dev/md0
```

- To remove a failed disk:

```bash
mdadm --manage /dev/md0 --remove /dev/sda1
```

- To add a new disk to the array:

```bash
mdadm --manage /dev/md0 --add /dev/sdd1
```

Please note these commands are general and your RAID setup and commands may differ.
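For `mdadm` arrays specifically, `/proc/mdstat` includes a per-array member status string such as `[UU]` (all members up) or `[U_]` (one member failed), which makes a cron-friendly degradation check easy to script. Below is a minimal sketch; the `check_mdstat` function name and the idea of passing an alternate file path (useful for testing) are my own additions:

```shell
# check_mdstat: report whether any array in an mdstat-format file has a
# failed member, i.e. an underscore inside the [..] status brackets.
# On a real system you would pass /proc/mdstat.
check_mdstat() {
    if grep -q '\[[U_]*_[U_]*\]' "$1"; then
        echo "DEGRADED"
        return 1
    fi
    echo "healthy"
}

# Example cron usage (assumed paths):
#   check_mdstat /proc/mdstat || mail -s "RAID degraded" admin@example.com < /dev/null
```

Because the function exits non-zero on degradation, it composes cleanly with cron, `||` alert commands, or monitoring agents.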
## tw_cli Commands

`tw_cli` is a command line tool for managing 3ware RAID controllers. Here are some basic commands:

- To check the status of the RAID controller:

```bash
tw_cli info c0
```

- To check the status of a unit:

```bash
tw_cli /c0/u0 show
```

- To check the status of a drive:

```bash
tw_cli /c0/p0 show
```

- To silence an alarm:

```bash
tw_cli /c0 silence
```

- To rebuild a unit:

```bash
tw_cli /c0/u0 start rebuild disk=1
```

- To delete a unit:

```bash
tw_cli /c7/u2 del
```

- To disable a drive:

```bash
tw_cli /c0/p1 remove
```

- To enable a drive:

```bash
tw_cli /c0/p1 add
```

- To show events:

```bash
tw_cli /c0 show events
```

- To show all physical drives:

```bash
tw_cli /c0 show phy
```

## Conclusion

Understanding RAID servers and the different command-line tools is important for managing, maintaining, and monitoring RAID controllers and units to prevent data loss and ensure smooth operation.

### About me

I was recently hired as a System Administrator to handle a variety of RAID servers. My job involves a mix of analyzing and setting up these servers, keeping an eye on their performance, writing scripts to make things run smoother, and putting together dashboards to keep everything easy to track and manage.

My Email: hi@ericgit.me
hasone
1,704,697
How Generative AI Can Augment Human Creativity
In the rapidly evolving landscape of technology, one innovation stands out as a...
0
2023-12-21T11:30:18
https://dev.to/perryellysa/how-generative-ai-can-augment-human-creativity-39go
augment, ai
In the rapidly evolving landscape of technology, one innovation stands out as a game-changer—Generative Artificial Intelligence (AI). This groundbreaking technology has not only transformed industries but also sparked a revolution in how we approach creativity. In this blog, we explore the profound impact of generative AI on augmenting human creativity and how businesses, especially those invested in AI app development, AI chatbot development, and generative AI development, are leveraging this synergy. ## What is Generative AI? Generative Artificial Intelligence (AI) is a cutting-edge subset of artificial intelligence that focuses on enabling machines to autonomously produce creative and novel outputs. Unlike traditional AI, which relies on predefined rules and patterns, generative AI leverages advanced algorithms, often based on deep neural networks, to learn from vast datasets and then generate original content. This content can span various forms, including text, images, and even music. The distinguishing feature of generative AI is its ability to create new, unique content that wasn't explicitly programmed into the system. This technology has found applications in diverse fields, ranging from art and design to content creation and conversational interfaces, making it a powerful tool for augmenting human creativity and pushing the boundaries of what's achievable in the realm of artificial intelligence. **Text Generation** Generative AI has revolutionized text generation by autonomously creating human-like written content. Whether it's generating articles, stories, or even code snippets, the technology excels at producing coherent and contextually relevant textual output. This innovation not only streamlines content creation processes but also opens up new possibilities for automating various writing tasks. **Art Generation** In the realm of visual arts, generative AI has become a powerful tool for creating stunning and often avant-garde artworks. 
Artists and designers leverage algorithms to generate visuals that challenge traditional notions of creativity. This fusion of technology and art has given rise to unique pieces that explore the intersection of human imagination and machine learning. **Music Composition** Generative AI extends its influence to the world of music composition, where it can analyze patterns from vast musical databases and generate original compositions. This technology has proven invaluable for musicians seeking inspiration, helping to compose melodies and harmonies that push the boundaries of conventional music. **Video Game Content** The gaming industry has embraced generative AI to dynamically generate content within video games. From procedurally generated landscapes to adaptive storylines, generative AI enhances the gaming experience by creating immersive and ever-evolving virtual worlds. This not only reduces development time but also ensures a more dynamic and engaging gameplay environment. **Drug Discovery** Generative AI plays a transformative role in drug discovery by rapidly analyzing chemical compounds and predicting potential drug candidates. This accelerates the drug development process, making it more efficient and cost-effective. The technology's ability to explore vast chemical spaces contributes to the discovery of novel treatments and therapies. **Language Translation** Generative AI has significantly improved language translation services by providing more contextually accurate and natural-sounding translations. With advanced language models, these systems can grasp nuances and idiomatic expressions, resulting in more fluent and context-aware translations that bridge communication gaps across diverse linguistic landscapes. **Content Personalization** Businesses leverage generative AI to personalize content for users, tailoring recommendations and experiences based on individual preferences. 
This level of personalization enhances user engagement, whether in e-commerce, content streaming, or social media, creating a more user-centric digital environment.

**Healthcare Imaging**

Generative AI has made significant strides in healthcare imaging, aiding in the analysis and interpretation of medical images. From detecting anomalies in X-rays to segmenting organs in MRI scans, the technology enhances diagnostic accuracy and expedites the process of medical image interpretation, ultimately contributing to improved patient care.

## The Rise of Generative AI

Generative AI, a subset of artificial intelligence, focuses on creating content autonomously. It goes beyond traditional AI systems by producing new and unique outputs, ranging from text to images and even music. This capability is powered by advanced algorithms, including deep neural networks, enabling machines to learn and replicate creative processes.

## The Interaction Between Human Creativity and Generative AI

The partnership between human creativity and generative AI is a transformative collaboration. Generative AI catalyzes innovation, learning from and inspiring creative expressions. This synergy is particularly evident in industries involving AI app development, AI chatbot development, and generative AI development, where the amalgamation of human creativity and AI capabilities leads to groundbreaking advancements.

## How Generative AI Can Serve as a Creative Tool?

Generative AI acts as a potent creative tool, amplifying the creative process across various domains. In AI app development, businesses leverage generative AI to streamline the creation of engaging applications. Similarly, in AI chatbot development, generative AI empowers chatbots to understand and respond more naturally. Generative AI development provides developers with a canvas to explore innovative algorithms and applications, shaping the landscape of artificial intelligence.
## How Generative AI Can Enhance Human Creativity Generative AI enhances human creativity by serving as a collaborative partner, expanding the boundaries of what's achievable. In an [AI app development company](https://www.quytech.com/ai-development-company.php), generative AI tools contribute to the creation of dynamic applications, fostering an environment for experimentation and ideation. In AI chatbot development companies, generative AI elevates conversational interactions, enabling chatbots to adapt and respond creatively to user queries. Generative AI development companies lead the evolution of AI capabilities, ensuring that the technology continues to augment human creativity across diverse industries. ## Revolutionizing Content Creation AI app development companies are at the forefront of incorporating generative AI into their workflows. This powerful technology is not only streamlining the content creation process but also pushing the boundaries of what's possible. From generating compelling narratives for mobile apps to creating visually stunning graphics, the applications are limitless. Businesses seeking to enhance user experiences are increasingly turning to AI app development companies to integrate generative AI into their products. **The Conversational Revolution** [AI chatbot development company](https://www.quytech.com/chatbot-development-company.php) is harnessing the potential of generative AI to revolutionize communication. Chatbots powered by generative AI can understand and respond to natural language, offering a more human-like interaction. This technology is not just about automating responses; it's about creating intelligent, context-aware chatbots that can adapt to users' needs in real time. As a result, businesses are redefining customer engagement by providing personalized and dynamic conversations through AI chatbots. 
## The Role of Generative AI Development Companies Generative AI development companies play a pivotal role in shaping the future of AI. These entities are dedicated to pushing the boundaries of what generative AI can achieve, constantly refining algorithms and exploring new possibilities. Businesses looking to integrate generative AI into their operations turn to specialized generative AI development companies to ensure they are at the forefront of innovation. ## Conclusion In conclusion, the marriage of generative AI and human creativity is a testament to the potential for innovation in the digital age. AI app development companies, AI chatbot development companies, and a [generative AI development company](https://www.quytech.com/ai-development-company/generative-ai.php) are instrumental in harnessing the power of this transformative technology. As we continue to explore the synergies between AI and human ingenuity, the possibilities for groundbreaking advancements in creativity are truly boundless. Embracing this collaboration is not just a step into the future; it's a leap into a realm of endless possibilities.
perryellysa
1,705,093
PRmovies APK 2023 Latest v2.5 (NO ADs) For Android Free
Introduction to PRmovies APK PRmovies APK There are many movie downloading options on the market. But...
0
2023-12-21T19:39:29
https://dev.to/prmoviesapk/prmovies-apk-2023-latest-v25-no-ads-for-android-free-3mc
webdev
## Introduction to PRmovies APK

There are many movie downloading options on the market, but the majority of them have some sort of security or content restriction, which completely ruins the experience. However, it would be completely safe to say that PRMovies Apk is a solution to all your content problems. This application allows you to enjoy endless content on one platform. It was developed by WPRMovies for all Android setups 4.0 or above.

The good thing about this application is that it gives you unlimited access to the latest movies, TV channels, shows, and series in multiple national and regional languages. In addition, it allows you to download your favorite content, even when there is no internet connection. Similarly, the application is tested for malware and viruses, so there is no danger to your security. And you are not bothered by unnecessary ads and glitches.

Finally, if you want to know more about this fun application, give this guide a quick read and check out how you can download it on your mobile phone. So, let’s get in. https://apkmolo.com/prmovies-apk/

![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/p97ssw84z8ejdw6ysve9.png)
prmoviesapk
1,705,279
Bundling Go Lambda Functions with the AWS CDK
Recently the Lambda Go runtime has changed from using the Go 1.x managed runtime to using the...
0
2023-12-22T01:38:03
https://bacchi.org/posts/cdk-bundling-golang-functions
aws, cdk, lambda, go
---
title: "Bundling Go Lambda Functions with the AWS CDK"
published: true
tags:
  - aws
  - cdk
  - lambda
  - golang
canonical_url: https://bacchi.org/posts/cdk-bundling-golang-functions
cover_image: https://bacchi.org/images/wind-turbines.jpg
---

Recently the Lambda Go runtime has changed from using the Go 1.x managed runtime to using the [provided runtimes](https://docs.aws.amazon.com/lambda/latest/dg/lambda-golang.html#golang-al1) which have historically been used for custom runtimes (e.g., Rust). The former `go1.x` runtime is being deprecated on January 8, 2024 (quite soon) and the new runtimes `provided.al2023` or `provided.al2` are expected to be used.

With the introduction of these new runtimes, all of our Go binaries must [now be called](https://docs.aws.amazon.com/lambda/latest/dg/golang-handler.html#golang-handler-naming) `bootstrap` and be located at the root of the zip file used to deploy the function. If we have two functions built from separate function code, in the `go1.x` runtime the naming convention would allow us to name them differently (i.e. `create` vs. `list`). Now we must name both function handlers `bootstrap`, potentially causing confusion during the bundling process.

In today's blog post, we'll describe how to bundle two different Go functions with the same name in a single CDK stack.

## AWS CDK Deployment Using Typescript

I'll be using Typescript as the language for the AWS CDK deployment. It might be strange to mix languages for the CDK deployment (Typescript) and the Lambda function (Go), but CDK uses Typescript as its main language and most examples are written in it. The reason for using Go in this example is that many deployments need to use compiled languages for task duration and performance (not necessarily to improve cold start time).

## Source Organization

As described earlier, in the past we could have kept the Go function source files in the same directory and compiled them separately to different file names. Now that the handler file name has to be the same (`bootstrap`), they can't be output to the same directory any longer. The layout of the files now looks like the following, with each function in its own subdirectory in the `lib/functions` directory:

![Source code layout](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/6eegcesf1ika7qppsphk.png)

## Bundling Go Functions

Now that we have the function code in separate directories, we can bundle them using the Alpha CDK construct for a [Go Function](https://docs.aws.amazon.com/cdk/api/v2/docs/aws-lambda-go-alpha-readme.html), shown below.

```typescript
export class BasicGoLambdaStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);

    const table = new TableV2(this, 'Table', {
      partitionKey: { name: 'id', type: AttributeType.STRING },
      removalPolicy: RemovalPolicy.DESTROY,
    });

    const createFunction = new GoFunction(this, 'CreateFunction', {
      entry: 'lib/functions/create',
      environment: { "DYNAMODB_TABLE": table.tableName },
      logRetention: RetentionDays.ONE_WEEK,
    });

    const listFunction = new GoFunction(this, 'ListFunction', {
      entry: 'lib/functions/list',
      environment: { "DYNAMODB_TABLE": table.tableName },
      logRetention: RetentionDays.ONE_WEEK,
    });
    ...
```

The above configuration is fairly straightforward. But before I found the [aws-lambda-go-alpha](https://docs.aws.amazon.com/cdk/api/v2/docs/aws-lambda-go-alpha-readme.html) module, I attempted (mostly unsuccessfully) to use the [aws-lambda-function Function construct](https://docs.aws.amazon.com/cdk/api/v2/docs/aws-cdk-lib.aws_lambda.Function.html), which required using the [Code.fromAsset](https://docs.aws.amazon.com/cdk/api/v2/docs/aws-cdk-lib.aws_lambda.Code.html#static-fromwbrassetpath-options) method with the [ILocalBundling interface](https://docs.aws.amazon.com/cdk/api/v2/docs/aws-cdk-lib.ILocalBundling.html).

This was extremely uncooperative: I found [multiple blogs](https://github.com/aws-samples/cdk-lambda-bundling/blob/main/lib/cdk-bundling-lambda-stack.ts) and suggestions on how to configure this, and in the end I did get it working, but the solution I came up with was ugly and actually packaged the Go source file in the zip file with the bootstrap handler. Not cool.

It's great that the CDK now supports Go Lambda functions as a full-blown construct. While [NodeJS functions](https://docs.aws.amazon.com/cdk/api/v2/docs/aws-cdk-lib.aws_lambda_nodejs.NodejsFunction.html) have been supported for much longer, it makes sense to have Go supported the same way, because the aggravation I endured packaging these Go handlers with the [standard Lambda Function construct](https://docs.aws.amazon.com/cdk/api/v2/docs/aws-cdk-lib.aws_lambda.Function.html) was a serious pain in the ass.

## Verifying the Output

Now that we have these defined, we can package the artifacts using `cdk synth` and verify that the `bootstrap` binary was created for both functions. We can see here that the first messages output from the `cdk synth` command are the two functions being bundled:

```shell
$ cdk synth
Bundling asset BasicGoLambdaStack/CreateFunction/Code/Stage...
Bundling asset BasicGoLambdaStack/ListFunction/Code/Stage...
Resources:
...
```

And when we look in the `cdk.out` directory, which is where our intermediate CDK files are placed during packaging, we see that the two Lambda Function output directories itemized in the stack asset JSON file (called `BasicGoLambdaStack.assets.json`) have binaries in them (ignore the third .js asset, that's for the cloudwatch logs):

```shell
$ tree cdk.out/
cdk.out/
├── asset.4e26bf2d0a26f2097fb2b261f22bb51e3f6b4b52635777b1e54edbd8e2d58c35
│   └── index.js
├── asset.b0bae29696f98febe5e6a655c4f61466ad71ebfa0071b46a15e917a1d307333c
│   └── bootstrap
├── asset.f153e459ffb70033083e7507aaa06c00f4b13a792fad71433c11b5f050078e74
│   └── bootstrap
├── BasicGoLambdaStack.assets.json
├── BasicGoLambdaStack.template.json
├── cdk.out
├── manifest.json
└── tree.json

4 directories, 8 files
```

After running `cdk synth` to verify things, we can deploy with `cdk deploy`.

## Success

Now we can execute the functions and see them working as expected:

```shell
$ curl -X POST -H "Content-Type: application/json" https://someurl.execute-api.us-west-2.amazonaws.com/todos --data '{ "title": "stuff", "details": "really some stuff"}'
{"id":"80cf566a-937b-4aef-98ef-48b84ee75c01","title":"stuff","details":"really some stuff"}

$ curl -X GET https://someurl.execute-api.us-west-2.amazonaws.com/todos
[{"id":"80cf566a-937b-4aef-98ef-48b84ee75c01","title":"stuff","details":"really some stuff"}]
```

## Summary

The new requirement to use the provided runtimes for Go Lambda Functions poses some minor challenges (especially if you have to migrate already-running Go 1.x Lambda Functions), but thankfully the [aws-lambda-go-alpha](https://docs.aws.amazon.com/cdk/api/v2/docs/aws-lambda-go-alpha-readme.html) module makes it much more straightforward than bundling these manually.

I hope you've gotten a good understanding of how to bundle Go Lambda functions by reading this. I know I learned a lot, and this will come in handy as I use Go for Lambdas going forward.
--- Cover photo by [Jesse De Meulenaere](https://unsplash.com/@jessefotograaf) on [Unsplash](https://unsplash.com)
mbacchi
1,705,428
View in Relational Database
Simply put, Views are stored queries that when invoked produce some results. Proper use of views in...
0
2023-12-22T06:16:53
https://dev.to/the_infinity/view-in-relational-database-43h8
database, tutorial, beginners, learning
**_Simply put, [Views](https://en.wikipedia.org/wiki/View_(SQL)#:~:text=In%20a%20database%2C%20a%20view,kept%20in%20the%20data%20dictionary.) are stored queries that when invoked produce some results._**

Proper use of views in relational databases is a key aspect of good SQL database design. In this article, we will take a deep dive into the topic of Views. We will understand their utilities and look at some of their common usage.

## What are Views?

A View in a relational database is referred to as the result set (set of rows) of a stored query. A view can be treated similarly to any other database object (for example, a table).

_Let us take an example._

> _You can follow along using a standalone Postgres server on your PC or use this online playground provided by CrunchyData._

Starting from an empty database, we will create a table named student. This table will hold details about students of some university. To keep things simple, our table will only hold the following details —

* ID: Number that uniquely identifies a student
* Full Name
* Phone Number: Text field holding student phone numbers
* GPA
* Age

```
postgres=# CREATE TABLE student (
    id bigint,
    full_name text,
    phone_number text,
    gpa int,
    age int
);
CREATE TABLE
```

Consider a use case where a number of database users would frequently want to select the ID and the GPA of students. To achieve this, each user will have to independently write a query like -

```
SELECT id, gpa FROM student WHERE ...;
```

While the ‘**WHERE**’ portion of the query may change from user to user, the columns to select and the table to select from remain the same. Moreover, if any of the queried columns happen to be renamed, all of its users would have to update their queries. Creating a view would make things a lot simpler, and here is how the database administrator will achieve it.

```
postgres=# CREATE VIEW student_id_gpa AS
    SELECT id AS id, gpa AS gpa
    FROM student;
CREATE VIEW
```

Now, the users can rely on the constant interface that the student_id_gpa view provides without worrying about any changes to the underlying table.

```
SELECT id, gpa FROM student_id_gpa WHERE ...;
```

## Use Cases for Views

Views can be useful in many cases. Let us deep dive into some of the major ones.

★ **Views can hide complexity**

Parts of a complex query can be broken down and moved behind views. This allows for simpler and more readable SQL queries. It also helps in better management and debugging. A query that brings data from multiple tables, requiring several joins, can be moved behind a view to make it easier to follow and understand.

★ **Views can help implement ACLs on a table**

With views, a user’s access to a table’s rows and columns can be restricted. A view with only permissible parts of the table can be created. The user can then be given access to use the view but not the actual table. Let us see this in action. First we will need to create a new user and give it login permission.

```
postgres=# CREATE ROLE restricted_user;
CREATE ROLE
postgres=# GRANT CONNECT ON DATABASE postgres TO restricted_user;
GRANT
```

We want the ‘**restricted user**’ to be able to see the name and ID of students who are below the age of 21. Nothing more, nothing less! We create a view for this and give permission to the ‘restricted user’.

```
postgres=# CREATE VIEW student_below_21 AS
    SELECT id AS id, full_name AS full_name
    FROM student
    WHERE age < 21;
CREATE VIEW
postgres=# GRANT SELECT ON student_below_21 TO restricted_user;
GRANT
```

Now, the new user can only view those parts of the student table that our view allows!

```
postgres=> SELECT * FROM student_below_21;
 id | full_name
----+-----------
  3 | Adam
(1 row)

postgres=> SELECT * FROM student;
ERROR:  permission denied for table student
```

★ **Views can simplify supporting legacy code or schema evolution**

Often the schema of a table changes over the course of time. Breaking changes in schema would break a lot of code. Views can be useful in performing incremental updates to the database schema. The original table can be replaced with a view of the same name. The original table can then be split into several tables, each of which can have its schema evolved independently! This keeps the legacy code functional, as it would now start using the view.

## How are Views stored in Databases?

Databases do not store the result set of a view; they store the queries. This makes views space efficient. A large number of views can be created without worrying about the space taken on the database.

All the views (in Postgres) are kept in a table named views inside information_schema. Here is how you can view it.

```
SELECT * FROM information_schema.views;
```

Starting with Postgres 9.3, materialised views are supported. They work like normal views but persist the result set in a database table!

## Performance?

Performance is not an issue with Views. They behave like any other SQL query, and it is up to the query planner to optimise database accesses. Even with added conditions and joins, views tend to perform on par with database tables!

---

Views in RDBMS are powerful tools that offer a convenient way to present data, simplify complex queries, and enhance security. Having a strong understanding of this concept allows us to make more informed design decisions. As technology evolves, the strategic use of views will continue to be a fundamental aspect of relational database management, shaping the way we interact with and harness the power of data!
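The core idea — a view is a stored query that re-runs against the live base table — can be tried out in a few lines using Python's built-in `sqlite3` module. This is an illustrative sketch only: SQLite supports plain views but not Postgres roles or materialised views, and the table here is a simplified version of the article's `student` table.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# A simplified version of the student table from the article
cur.execute("CREATE TABLE student (id INTEGER, full_name TEXT, gpa INTEGER, age INTEGER)")
cur.executemany(
    "INSERT INTO student VALUES (?, ?, ?, ?)",
    [(1, "Alice", 4, 22), (2, "Bob", 3, 23), (3, "Adam", 3, 19)],
)

# A view is a stored query: no result rows are persisted
cur.execute(
    "CREATE VIEW student_below_21 AS "
    "SELECT id, full_name FROM student WHERE age < 21"
)

# Querying the view re-runs the stored query against the base table
rows = cur.execute("SELECT * FROM student_below_21").fetchall()
print(rows)  # [(3, 'Adam')]

# Updates to the base table are visible through the view immediately
cur.execute("INSERT INTO student VALUES (4, 'Ada', 4, 18)")
count = cur.execute("SELECT count(*) FROM student_below_21").fetchone()[0]
print(count)  # 2
```

Because only the query is stored, dropping and recreating the view is cheap, and the view always reflects the current contents of `student`.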
the_infinity
1,705,536
Best PostgreSQL Tool - pgAdmin or dbForge Studio for PostgreSQL?
Explore dbForge Studio for PostgreSQL: A Superior Alternative to pgAdmin. In this detailed tutorial,...
0
2023-12-22T08:30:47
https://dev.to/dbajamey/best-postgresql-tool-pgadmin-or-dbforge-studio-for-postgresql-4c9c
postgres, database
Explore dbForge Studio for PostgreSQL: A Superior Alternative to pgAdmin. In this detailed tutorial, we dive into the advanced features of dbForge Studio for PostgreSQL, showcasing its enhanced capabilities in SQL coding, query optimization, data editing, database comparison, and more. Discover how it stands out against pgAdmin and learn about unique tools like Data Generator. Perfect for both beginners and professionals in database management. Tap the link to watch in full: https://youtu.be/V7mYZJOexRQ?si=NzTLdT1i6BW3tIUb
dbajamey
1,705,543
Svelte journey | Templating, Reactivity
Hello! Svelte has been in my vision scope for quite some time, but it wasn't until today that I...
25,826
2023-12-22T08:58:32
https://dev.to/chillyhill/svelte-journey-part-1-38pa
svelte, frontend, javascript, tutorial
Hello! Svelte has been in my vision scope for quite some time, but it wasn't until today that I actually started exploring it. Finally, I decided to dig into it further, triggered by another encounter with yet another post titled "Svelte changed my life". I bet a lot of people have heard about it but didn’t start for some reason. Well, this article may be the one!

I may draw parallels to well-known approaches/tooling (mostly React or Angular) when applicable and interject with my own commentary. Of course, I highly recommend referring to [the official tutorial](https://learn.svelte.dev/tutorial), it is great. However, my aim is to distill the most relevant information and incorporate some thoughts that crossed my mind; you may find this also helpful.

In the context of this series, our goal is to grasp the fundamentals of Svelte as a UI library and SvelteKit as a comprehensive toolkit that equips us with a mature set of tools to construct outstanding applications. We'll commence with the Basics and delve into Reactivity in this article. You might have already heard about reactivity in Svelte, and it's truly amazing. Still, the Svelte team is preparing some updates regarding that ([Runes, signals](https://svelte.dev/blog/runes)).

In this chapter, we embark on our journey together. So, let's delve in.

### Templating and basic stuff

```jsx
<script>
	import Nested from './Nested.svelte';

	let count = 0;

	function increment() {
		count += 1;
	}
</script>

<p>This is a paragraph.</p>
<Nested />

<button on:click={increment}>
	Clicked {count} {count === 1 ? 'time' : 'times'}
</button>

<style>
	p {
		color: goldenrod;
		font-family: 'Comic Sans MS', cursive;
		font-size: 2em;
	}
</style>
```

It looks pretty much the same as the well-known JSX, so don't expect anything too spicy at this point. However, there are some perks that I would like to outline:

**Shorthand attributes**

`<img {src} alt="{name} dances." />` — no need for `src={src}`, just `{src}` is enough.

**Code placement**

Styles, code, and template in a single file. Code should be wrapped in `<script />`, and styles in `<style />`.

**Handlers notation**

`<button on:click={increment}>` — not `onClick`.

**Plain string HTML embed**

If you need to render HTML stored in a plain string, then use `<p>{@html string}</p>`. Keep in mind that **there is no sanitization**. Dangerous HTML stays dangerous 🔫.

## Reactivity 💲

As mentioned at the beginning (and as everyone constantly emphasises), reactivity in Svelte is pure gold. Still, some updates are coming [in Svelte 5](https://svelte.dev/blog/runes).

```jsx
// Example of basic reactivity usage
<script>
	let count = 0;

	$: doubled = count * 2; // Kind of like a selector

	$: if (count >= 10) { // Kind of a Conditional Effect
		alert('Count is dangerously high!');
		count = 0;
	}

	$: console.log(`The count is ${count}`); // Kind of a one-line Effect

	function increment() {
		count += 1;
	}
</script>

<button on:click={increment}>
	Clicked {count} {count === 1 ? 'time' : 'times'}
</button>

<p>{count} doubled is {doubled}</p>
```

Crucial parts, in my opinion, are as follows:

`$` **Reactive declarations (something like selectors)**

```jsx
let count = 0;
$: doubled = count * 2;
// This code `$: doubled = count * 2` re-runs whenever any of the referenced values change.
```

**Why not use plain variables for the same purpose?**

Variables (`let`, `const`) keep their initial values if you try to use them like `$`. For example:

```jsx
let a = 1;
let b = a * 2;
$: c = a * 2;

function foo() {
	a++;
}
```

When `foo` is called: `a` and `c` change, but `b` remains the same. You can do an assignment of `b` inside the function to change it, but that is another story, which we’ll get to in a minute.

**Calling order**

Reactive declarations and *statements* run **after other script code** and **before** component markup is **rendered**.

`$` **Reactive statements (looks like effects)**

Automatic calls happen when the dependency is updated, which feels really good.

```jsx
// one liner
$: console.log(`The count is ${count}`);

// group multiple lines
$: {
	console.log(`The count is ${count}`);
	console.log(`This will also be logged whenever count changes`);
}

// with operator (if)
$: if (count >= 10) {
	alert('Count is dangerously high!');
	count = 0;
}
```

**`$` Reactive stuff and referential types**

**Mutations**

`push`, `splice`, and similar mutation operations won't automatically cause updates. If you want to change something, then an **assignment should happen**.

```jsx
// Ugly way
function addNumber() {
	numbers.push(numbers.length + 1);
	numbers = numbers; // Apply update via assignment
}

// [...pretty, way]
function addNumber() {
	numbers = [...numbers, numbers.length + 1];
}
```

Assignments to *properties* of arrays and objects, e.g., `obj.foo += 1` or `array[i] = x`, work the same way as assignments to the values themselves. So, as with plain old assignment for `let` or `const`, values inside properties can be modified without reassignment.

```jsx
<script>
	let foo = { bar: 2 };
	let oof = foo;

	function handleClick() {
		foo.bar = foo.bar + 1; // This will update {foo.bar}
		foo.bar = oof.bar + 1; // This will also update {foo.bar}
		// Uncomment next line to make {oof.bar} be updated
		// oof = oof;
	}
</script>

<button on:click={handleClick}>
	foo.bar: {foo.bar} | oof.bar: {oof.bar}
</button>
```

A simple rule of thumb: **the name of the updated variable must appear on the left** hand side of the assignment. For example:

```jsx
// Won't trigger reactivity on obj.foo.bar
// unless you follow it up with obj = obj.
const obj = { foo: { bar: 1 } };
const foo = obj.foo;
foo.bar = 2;
```

By the way, you may be in the same situation as me: when I came across this code sample [in the tutorial](https://learn.svelte.dev/tutorial/updating-arrays-and-objects), I interpreted it as something like this:

```jsx
<script>
	const obj = { foo: { bar: 1 } };
	const foo = obj.foo;
	foo.bar = 2;

	$: pluckedBar = obj.foo.bar;

	obj.foo.bar = 100; // This still triggers reactivity
</script>

{pluckedBar} // Shows 100, but I had a feeling that it should stay
```

After spending some time, I figured out that it is actually this:

```jsx
<script>
	let obj = { foo: { bar: 1 } };
	let foo = obj.foo;

	$: pluckedBarBeforeReassign = foo.bar;
	foo.bar = 2;
	$: pluckedBarAfterReassign = foo.bar;

	function doit() {
		foo.bar = foo.bar + 1;
		/*
		 * If commented, then {obj.foo.bar} in the template would never update.
		 * The rest, {foo.bar} {pluckedBarBeforeReassign} {pluckedBarAfterReassign}
		 * act as expected.
		 */
		// obj = obj
	}
</script>

<button on:click={doit}>
	{obj.foo.bar} <!--This guy you should pay attention to-->
	{foo.bar} {pluckedBarBeforeReassign} {pluckedBarAfterReassign}
</button>
```

### Summary

In this part, we've covered the basics of templating and reactivity. We've followed the Svelte official tutorial, so if you feel excited about Svelte and want to have a broader look, it is a great idea to visit their [interactive tutorial](https://learn.svelte.dev/tutorial). My goal here is to share with you the most crucial information with extended comments on tricky parts, and I hope that I've accomplished that at least to some extent.

Take care, go Svelte!
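The assignment-triggers-updates rule discussed above can be caricatured outside Svelte. Below is a toy Python analogy (my own sketch, not how Svelte's compiler actually works): derived values are re-run only when a top-level name is *assigned* through `assign()`, so in-place mutation of a stored object goes unnoticed, just like `numbers.push(...)` without a follow-up assignment.

```python
class Store:
    """Toy analogy of Svelte's `$:` declarations: re-run on assignment."""

    def __init__(self, **values):
        self.values = dict(values)
        self.derived = {}   # name -> fn(values) -> value
        self.cache = {}     # last computed derived values

    def reactive(self, name, fn):
        # like `$: name = fn(...)` — registered, then kept up to date
        self.derived[name] = fn
        self._recompute()

    def assign(self, name, value):
        # only assignment triggers recomputation; mutating an object
        # already stored in self.values would NOT be noticed
        self.values[name] = value
        self._recompute()

    def _recompute(self):
        for name, fn in self.derived.items():
            self.cache[name] = fn(self.values)


store = Store(count=0)
store.reactive("doubled", lambda v: v["count"] * 2)
store.assign("count", 5)
print(store.cache["doubled"])  # 10
```

This also mirrors the "ugly way" fix: after mutating a stored list in place you would have to call `store.assign(name, same_list)` to force the derived values to re-run.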
chillyhill
1,705,587
Capturing Love in the Windy City: Chicago Wedding Photographers Unveiled
In the heart of the Windy City, where love stories unfold against the stunning backdrop of Chicago's...
0
2023-12-22T09:49:41
https://dev.to/youmephotography/capturing-love-in-the-windy-city-chicago-wedding-photographers-unveiled-43jf
In the heart of the Windy City, where love stories unfold against the stunning backdrop of Chicago's iconic skyline, a cadre of talented photographers unveils the artistry of capturing love. [Chicago wedding photographers](https://www.youmephotography.com/weddings/) are not just documentarians; they are visual storytellers, each click of the shutter weaving together moments of joy, romance, and celebration. Let's delve into the enchanting world of Chicago wedding photographers and discover the magic they bring to love stories in this bustling metropolis.

A Symphony of Styles: Chicago wedding photographers are a diverse group, each bringing a unique style to the table. From classic elegance to contemporary chic, they tailor their approach to reflect the individuality of each couple. The result is a symphony of styles, a visual tapestry that captures the essence of love in myriad ways.

Cityscapes as Love Stories: The allure of Chicago's cityscape provides an exquisite canvas for these photographers. Skyscrapers, historic landmarks, and urban parks become integral components of the love story. Chicago wedding photographers skillfully blend the city's architectural beauty with the intimate moments of weddings, creating images that resonate with both grandeur and emotion.

Moments, Candid and True: Chicago wedding photographers excel at capturing candid moments that reveal the authenticity of the couple's connection. Whether it's the exchange of vows, stolen glances, or spontaneous laughter, these photographers have an innate ability to freeze the genuine emotions that make each wedding unique.

Seamless Integration: One of the hallmarks of Chicago wedding photographers is their seamless integration into the rhythm of the wedding day. They understand the ebb and flow of events, ensuring that they are present for pivotal moments while remaining unobtrusive during intimate exchanges. This ability to blend into the background allows couples to relive their day authentically through the lens.

The Essence of Seasons: Chicago experiences the beauty of all four seasons, and wedding photographers in the city leverage this to their advantage. From the blooming flowers of spring to the snowy wonderland of winter, these photographers skillfully utilize the city's seasonal palette to create visually stunning and diverse wedding albums.

Cultural Celebrations: In a city known for its cultural diversity, Chicago wedding photographers embrace and celebrate the richness of different traditions. Whether incorporating cultural elements into ceremonies or capturing the fusion of diverse backgrounds, these photographers ensure that each wedding is a reflection of the couple's unique heritage.

Client-Centric Excellence: Chicago wedding photographers prioritize their clients, fostering open communication and understanding. From the initial consultation to the final delivery of photographs, these professionals go the extra mile to make the entire process enjoyable and stress-free. Their client-centric approach ensures that the couple's vision is not just realized but exceeded.

Conclusion: In the enchanting world of [Chicago wedding photographers](https://www.youmephotography.com/weddings/), love is not just documented; it is celebrated, immortalized, and unveiled in every frame. Each click of the shutter tells a story of a couple's journey, against the backdrop of the vibrant and dynamic Windy City. If you're seeking professionals who can weave magic into your wedding album, Chicago's wedding photographers are ready to capture your love story in a way that's as unique and extraordinary as the love you share.
youmephotography
1,705,776
STO Development: The Future of Investment Opportunities
Outline Definition of STO (Security Token Offering) Complete Overview of STO...
0
2023-12-22T12:54:29
https://dev.to/matthewthamos/sto-development-the-future-of-investment-opportunities-ipg
cryptocurrency, web3, blockchain, sto
## Outline - **Definition of STO (Security Token Offering)** - **Complete Overview of STO Development** - **Increased liquidity** - **Global accessibility** - **Compliance with regulations** - **Fractional ownership** - **Integration with blockchain technology** - **Tokenization of traditional assets** - **Democratization of investment opportunities** - **Building a secure and compliant platform** - **Token creation and issuance** - **The future potential of STOs as investment opportunities** - **Importance of STO development in enabling this future** ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/cdngrj31qt6bhnim6k3w.jpg) ## Definition of STO (Security Token Offering) STO, which stands for Security Token Offering, is a relatively new concept in investment opportunities. It is a fundraising method that allows companies to issue security tokens to investors in exchange for capital. These security tokens are digital assets that represent ownership or stake in a company, similar to traditional securities like stocks or bonds. ## Complete Overview of STO Development ### Increased liquidity One of the key advantages of STO development is the increased liquidity it offers to investors. Traditionally, investing in assets such as real estate or private companies has been a complex and illiquid process. However, with STOs, these assets can be tokenized and traded on blockchain platforms, making them much more accessible and liquid. By tokenizing assets, investors can easily buy and sell their shares or tokens on secondary markets, providing them with greater flexibility and the ability to exit their investments whenever they choose. This increased liquidity opens up a whole new world of investment opportunities for individuals who may have previously been excluded from the traditional investment market. 
Furthermore, the use of smart contracts in STOs ensures that the transactions are executed automatically and transparently, reducing the need for intermediaries and potentially lowering transaction costs. This further enhances the liquidity of STO investments, making it an attractive option for investors looking for more liquid and efficient investment opportunities. ### Global accessibility With STOs, investors no longer have to be restricted by the boundaries of their own country. They can find investment opportunities from different industries and sectors, regardless of their location. This allows for diversification of investment portfolios and the ability to tap into emerging markets and industries. Furthermore, businesses looking to raise capital through STOs can attract a global pool of investors. This increases the chances of securing funding and can help businesses grow and expand on a global scale. It also provides an opportunity for businesses in developing countries to gain access to international investors who may be interested in supporting their growth. Thanks to the advancements in technology and the blockchain, STOs have made it possible for anyone with an internet connection to participate in investment opportunities that were previously limited to a select few. This democratization of investment is revolutionizing the way people invest and is creating a more inclusive and accessible financial ecosystem. ### Compliance with regulations Compliance with regulations is a crucial aspect of STO development. Unlike traditional fundraising methods, Security Token Offerings are subject to various regulatory requirements to ensure investor protection and prevent fraudulent activities. By adhering to these regulations, STOs provide a more secure and transparent investment environment. 
### Fractional ownership

Fractional ownership is a concept that is gaining a lot of traction in the investment world, and it is one of the key features of Security Token Offerings (STOs). With fractional ownership, investors can purchase a percentage of an asset, such as real estate, artwork, or even businesses. This opens up a whole new world of investment opportunities, especially for those who may not have the financial means to buy these assets outright.

### Integration with blockchain technology

Integration with blockchain technology is one of the key aspects of STO development. Blockchain, as a decentralized and transparent technology, provides a secure and efficient platform for issuing and trading security tokens. By leveraging blockchain, STOs eliminate the need for intermediaries, such as banks or brokers, and allow for direct peer-to-peer transactions.

### Tokenization of traditional assets

The tokenization of traditional assets is a key concept in STO development. In simple terms, tokenization refers to the process of converting physical assets, such as real estate, art, or commodities, into digital tokens on a blockchain. This allows investors to buy and trade fractional ownership of these assets, opening up investment opportunities that were previously inaccessible.

### Democratization of investment opportunities

Democratization of investment opportunities is one of the key benefits of Security Token Offerings (STOs). In traditional financial markets, access to investment opportunities is often limited to wealthy individuals or institutional investors. However, STOs aim to change this by allowing anyone to participate in the investment process.

### Building a secure and compliant platform

Building a secure and compliant platform is crucial when it comes to STO development.
The nature of security tokens requires a high level of trust and transparency, which means that the underlying technology must be robust and resistant to any potential vulnerabilities. Additionally, STOs must comply with the regulatory framework in the jurisdictions in which they operate.

### Token creation and issuance

Token creation and issuance is a crucial aspect of STO development. Unlike traditional investment opportunities, STOs allow the creation and issuance of digital tokens representing ownership or investment in a company or project. These tokens leverage blockchain technology, ensuring security, transparency, and effortless transferability.

## The future potential of STOs as investment opportunities

One of the key advantages of STOs is their ability to provide fractional ownership of assets. This means that investors can now own a portion of high-value assets such as real estate, art, or even businesses. Previously, these investment opportunities were only available to a select few, but with STOs, anyone can participate and benefit from the potential returns.

Another exciting aspect of STOs is their potential to increase liquidity in traditionally illiquid markets. By tokenizing assets and creating a secondary market for trading these tokens, STOs have the potential to unlock value and enable investors to easily buy and sell their investments. This increased liquidity can make investing in traditionally illiquid assets much more attractive and accessible.

Furthermore, STOs offer enhanced transparency and security compared to traditional investment options. The use of blockchain technology ensures that each transaction is recorded and cannot be tampered with, providing a higher level of trust and accountability. Additionally, smart contracts can be programmed to automatically enforce the terms of the investment, reducing the need for intermediaries and streamlining the investment process.
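To make the ideas of tokenization, fractional ownership, and contract-enforced transfer rules concrete, here is a deliberately simplified Python sketch. It is not a real security token: the class, the asset, and the investor names are all invented for illustration, and the whitelist stands in for the KYC/compliance checks a real smart contract would enforce on-chain.

```python
class FractionalAsset:
    """Toy model of a tokenized asset: a fixed supply of units
    and a whitelist standing in for on-chain compliance checks."""

    def __init__(self, name, total_units, issuer):
        self.name = name
        self.balances = {issuer: total_units}  # issuer starts with full ownership
        self.whitelist = {issuer}              # approved (e.g. KYC-verified) investors

    def approve(self, investor):
        self.whitelist.add(investor)

    def transfer(self, sender, recipient, units):
        # "Smart contract" rule: only approved investors may hold units.
        if recipient not in self.whitelist:
            raise PermissionError(f"{recipient} is not an approved investor")
        if self.balances.get(sender, 0) < units:
            raise ValueError("insufficient units")
        self.balances[sender] -= units
        self.balances[recipient] = self.balances.get(recipient, 0) + units

    def ownership_share(self, holder):
        total = sum(self.balances.values())
        return self.balances.get(holder, 0) / total


# A building tokenized into 1,000 units; an approved investor buys 5%.
building = FractionalAsset("Office Tower", 1000, issuer="issuer")
building.approve("alice")
building.transfer("issuer", "alice", 50)
```

After the transfer, `building.ownership_share("alice")` is `0.05`, while a transfer to an unapproved party raises `PermissionError` — mirroring how token contracts enforce investment terms without an intermediary.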
The global nature of STOs also opens up investment opportunities beyond geographical boundaries. Investors can now participate in projects from around the world, allowing for a more diverse and well-rounded investment portfolio. This global reach can also benefit businesses seeking funding, as they can tap into a larger pool of potential investors.

In conclusion, STOs have the potential to revolutionize the investment landscape by providing access to previously inaccessible assets, increasing liquidity in illiquid markets, and offering enhanced transparency and security. As the adoption of blockchain technology continues to grow, so does the potential for STOs to become a mainstream investment option. So, keep an eye out for the exciting developments in the world of STO development, as they shape the future of investment opportunities.

## Importance of STO development in enabling this future

The future of investment opportunities is rapidly evolving, and one of the most promising developments in this field is Security Token Offerings (STOs). Unlike traditional investments, which are often limited to high-net-worth individuals or institutional investors, STOs offer a new way for anyone to invest in a wide range of assets, such as real estate, startups, or even works of art.

STOs are made possible through blockchain technology, which provides a secure and transparent platform for issuing and trading digital tokens that represent ownership of an asset. These tokens are backed by real-world assets, making them more reliable and less volatile than cryptocurrencies like Bitcoin or Ethereum.

The importance of STO development lies in its ability to democratize investment opportunities. By removing barriers to entry, such as high minimum investment amounts or complex accreditation requirements, STOs open up the playing field to a much wider audience.
This means that individuals who were previously excluded from traditional investment opportunities can now participate and potentially benefit from the growth of various asset classes.

Furthermore, STOs offer increased liquidity compared to traditional investments. Since these tokens can be easily traded on secondary markets, investors can buy or sell their holdings at any time, giving them more control over their investments.

Another key advantage of STOs is the enhanced security they offer. Blockchain technology ensures that transactions are recorded and verified on a decentralized network, reducing the risk of fraud or manipulation. Additionally, smart contract functionality can be incorporated into STOs, automating compliance procedures and ensuring that all parties involved adhere to the established rules and regulations.

The future of investment opportunities is bright, and [STO development companies](https://beleaftechnologies.com/sto-development-company) are paving the way for a more inclusive and efficient financial ecosystem. As more companies and individuals recognize the advantages of STOs, we can expect to see a significant increase in the number and variety of assets that are tokenized. This will not only create new investment opportunities but also foster innovation and economic growth.
matthewthamos
1,705,904
How to Launch a PHP Project in VS Code Dev Container
As a designer and frontend developer, I often find myself working on various projects. Recently,...
0
2023-12-22T15:57:29
https://dev.to/lsvs/how-to-launch-a-php-project-in-vs-code-dev-container-2mp0
webdev, devcontainer, php, docker
As a designer and frontend developer, I often find myself working on various projects. Recently, while working on one of my legacy projects, I needed to make updates to a website that was built with PHP. After conducting some research, I discovered that the best option for creating a PHP development environment for my case would be to utilize the VS Code Dev Container feature.

The Dev Container feature in VS Code allows developers to define and configure a development environment using containers. You can define the runtime, tools, and dependencies required for your project, ensuring consistency and reproducibility across different development machines.

### Prerequisites:

1. [Visual Studio Code](https://code.visualstudio.com/Download)
2. [Dev Containers](https://marketplace.visualstudio.com/items?itemName=ms-vscode-remote.remote-containers) extension
3. [Docker](https://www.docker.com/products/docker-desktop/)

To get started with the Dev Container feature in VS Code, you need to create a `.devcontainer` folder in your project's root directory. Inside this folder, you can define a `devcontainer.json` file that specifies the Docker image, runtime settings, and any additional configurations required for your PHP project.

```json
{
  "name": "PHP",
  "image": "mcr.microsoft.com/devcontainers/php",
  "forwardPorts": [ 8080 ]
}
```

So, essentially, that’s all 🙃. We can now use `php -S` inside the container to launch the dev server. But the container also already includes Apache, which is used in my project's production environment. Consequently, I've opted to utilize it and automate the configuration and launch of Apache as well.

In my case, I extended the default configuration by adding some scripts to the `postCreateCommand` field in the `devcontainer.json` file.
```json
{
  "postCreateCommand": "sudo chmod a+x ./.devcontainer/postCreateCommand.sh && ./.devcontainer/postCreateCommand.sh"
}
```

Here is the shell script that I added:

```sh
sudo chmod a+x "$(pwd)/public_html"
sudo rm -rf /var/www/html
sudo ln -s "$(pwd)/public_html" /var/www/html
sudo a2enmod rewrite
echo 'error_reporting=0' | sudo tee /usr/local/etc/php/conf.d/no-warn.ini
sudo apache2ctl start
```

These scripts perform the following actions:

1. Grant execute permission to the `public_html` directory, where my PHP files are located.
2. Remove the default files in the `/var/www/html` directory.
3. Create a symbolic link from the `public_html` directory to the `/var/www/html` directory.
4. Activate the rewrite module in the Apache configuration.
5. Turn off PHP error reporting by adding a configuration file `no-warn.ini`.
6. Start the Apache server.

Great, everything is ready. Now, to seamlessly integrate our configured development environment, run Docker and navigate to the bottom left corner of VS Code, where you’ll find the “Remote Window” button. Click on it, and choose the “Reopen in Container” option.

![VS Code screenshot with highlighted Dev Container button](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/naqokvh9rydx5znl6zuf.png)

Selecting “Reopen in Container” triggers VS Code to rebuild the container based on the configurations we’ve set in the `.devcontainer` folder. Once the container rebuild is complete, you'll find yourself working inside the containerized environment, seamlessly incorporating the specified PHP runtime and the configured Apache server.

Now you can open your web browser and navigate to [localhost:8080](http://localhost:8080). Here, you’ll see your PHP application running within the container.

---

In conclusion, the VS Code Dev Container feature proves to be a convenient and efficient tool for setting up and managing PHP development environments.
Its ability to streamline the development process ensures consistency across various machines, making it an excellent choice for PHP project work. --- Originally posted at [Medium](https://lisovskiy.medium.com/how-to-launch-a-php-project-in-vs-code-dev-container-4896fd55663c)
lsvs
1,706,443
Run MySQL and phpMyAdmin with Docker for Database Management!
To run MySQL and phpMyAdmin containers with Docker and connect them, you can follow these steps: ...
0
2023-12-23T12:46:24
https://dev.to/elaurichenickson/run-mysql-and-phpmyadmin-with-docker-for-database-management-5dob
To run MySQL and phpMyAdmin containers with Docker and connect them, you can follow these steps:

## Step 1 - Install Docker

Make sure you have Docker installed on your machine. You can download and install Docker from the official website: https://www.docker.com/.

## Step 2 - Create a Docker Network

Create a Docker network to allow the containers to communicate with each other. This step is optional but helps in connecting containers easily.

```bash
docker network create mynetwork
```

## Step 3 - Run the MySQL Container

Run a MySQL container with the desired configuration. Replace `<password>` with your preferred MySQL root password.

```bash
docker run -d --name mysql-container --network=mynetwork -e MYSQL_ROOT_PASSWORD=<password> -p 3306:3306 mysql:latest
```

## Step 4 - Run the phpMyAdmin Container

Run a phpMyAdmin container and link it to the MySQL container. You will log in with the MySQL root password you set in **Step 3**.

```bash
docker run -d --name phpmyadmin-container --network=mynetwork -e PMA_HOST=mysql-container -e PMA_PORT=3306 -p 8080:80 phpmyadmin/phpmyadmin:latest
```

## Step 5 - Access phpMyAdmin

Open your browser and navigate to http://localhost:8080. Log in with the MySQL credentials you provided earlier.

Notes:

- Make sure to replace `<password>` with a secure password of your choice.
- You can choose different ports for MySQL and phpMyAdmin if needed.
- The `--network` flag is used to connect containers to the same network, enabling them to communicate with each other using container names.
- Ensure that the MySQL container is running before starting the phpMyAdmin container.

By following these steps, you should have a running MySQL container and a phpMyAdmin container connected to it. You can manage your MySQL databases using phpMyAdmin through the web interface.
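The last note — waiting for MySQL before starting phpMyAdmin — can be automated. Below is a small, hypothetical Python helper (the container name `mysql-container` follows the steps above) that polls the MySQL container with `mysqladmin ping` via `docker exec`. The command runner is injectable so the sketch can be exercised without Docker; whether `mysqladmin ping` needs credentials depends on your MySQL image and version.

```python
import subprocess
import time


def wait_for_mysql(run=subprocess.run, retries=30, delay=2.0):
    """Poll the MySQL container until `mysqladmin ping` succeeds.

    `run` defaults to subprocess.run but is injectable for testing.
    Returns True once the server answers, False after `retries` attempts.
    """
    cmd = ["docker", "exec", "mysql-container", "mysqladmin", "ping", "--silent"]
    for _ in range(retries):
        if run(cmd).returncode == 0:
            return True
        time.sleep(delay)
    return False
```

With Docker running, calling `wait_for_mysql()` between Step 3 and Step 4 blocks until the database is reachable, so the phpMyAdmin container never starts against a cold server.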
elaurichenickson
1,706,587
Vision Premium Reviews: Complaints, Ingredients, Side Effects, & More…
The eye is a vital organ that enables you to see. It is made up of several components that work...
0
2023-12-23T13:02:59
https://dev.to/visionpremium/vision-premium-reviews-complaints-ingredients-side-effects-more-4554
The eye is a vital organ that enables you to see. It is made up of several components that work together to allow you to focus on the images and send information to your brain. Several conditions can impair the ability of the eye to see. Myopia, astigmatism, and eye injuries are all common eye problems. These conditions impair eye function, resulting in blurred vision, night blindness, and eyestrain. <ul> <li> <h3><span style="color: #008080;"><b><span style="color: #ff0000;">➣</span><span style="color: #808000;">Product Name —</span> <a href="https://supplementfits.com/get-vision-premium">Vision Premium </a></b></span></h3> </li> <li> <h3><span style="color: #008080;"><b><span style="color: #ff0000;">➤</span><span style="color: #808000;">Main Benefits — <a href="https://supplementfits.com/get-vision-premium">Eye Health &amp; Vision Support</a></span></b></span></h3> </li> <li> <h3><span style="color: #008080;"><b><span style="color: #ff0000;">➤</span><span style="color: #808000;">Composition — </span></b></span><b><a href="https://supplementfits.com/get-vision-premium" target="_blank" rel="nofollow noopener" data-saferedirecturl="https://www.google.com/url?hl=en-GB&amp;q=https://supplementfits.com/get-all-natural-leaf-cbd-gummies&amp;source=gmail&amp;ust=1703066382294000&amp;usg=AOvVaw0Psq9dQE6E-cCGs6Ut3YSo">Natural</a></b></h3> </li> <li> <h3><span style="color: #008080;"><b><span style="color: #ff0000;">➤</span><span style="color: #808000;">Side-Effects — </span></b></span><b><a href="https://supplementfits.com/get-vision-premium" target="_blank" rel="nofollow noopener" data-saferedirecturl="https://www.google.com/url?hl=en-GB&amp;q=https://supplementfits.com/get-all-natural-leaf-cbd-gummies&amp;source=gmail&amp;ust=1703066382294000&amp;usg=AOvVaw0Psq9dQE6E-cCGs6Ut3YSo">NA</a></b></h3> </li> <li> <h3><span style="color: #008080;"><b><span style="color: #ff0000;">➤</span><span style="color: #808000;">Rating: — </span></b></span><b><a 
href="https://supplementfits.com/get-vision-premium" target="_blank" rel="nofollow noopener">5 Stars</a></b></h3> </li> <li> <h3><span style="color: #008080;"><b><span style="color: #ff0000;">➤</span><span style="color: #808000;">Availability — </span></b></span><b><a href="https://supplementfits.com/get-vision-premium" target="_blank" rel="nofollow noopener">Online</a></b></h3> </li> <li> <h3><b><span style="color: #008080;"><span style="color: #ff0000;">➣</span> <span style="color: #808000;">SHOP NOW –</span> <a href="https://supplementfits.com/get-vision-premium">https://supplementfits.com/get-vision-premium</a></span></b></h3> </li> </ul> <div> <div> <div> <div> <p style="text-align: center;"><span style="color: #33cccc;"><a style="color: #33cccc;" href="https://supplementfits.com/get-vision-premium"><b><u>✅Click Here To Visit – “OFFICIAL WEBSITE”✅</u></b></a></span></p> </div> </div> </div> <div> <div> Natural Eye Health and Vision Support Vitamins are a type of supplement that provides consumers with a variety of beneficial nutrients that help their eyes function optimally at any age. 
The formula is simple to use every day, though it is best avoided around mealtime. <a href="https://supplementfits.com/get-vision-premium"><img class="aligncenter wp-image-9119" src="https://allnutrapro.com/wp-content/uploads/2023/12/Vision-Premium.png" alt="" width="616" height="347" /></a> <h2 style="text-align: center;"><strong>What is Eye Health and Vision Support by Vision Premium?</strong></h2> Many of the tasks that consumers perform during the day are directly related to their ability to see. Consumers rely on their ability to see clearly in every step they take, whether they are spending time with a family member, washing dishes, or even opening a door with a key. When consumers' vision begins to deteriorate, they frequently visit their optometrists to find a pair of glasses, but this may not be the best course of action. <h3 style="text-align: center;"><strong><span style="color: #ff0000;"><a style="color: #ff0000;" href="https://supplementfits.com/get-vision-premium">➤ <span style="color: #008080;">HURRY</span>! Order your "Vision Premium " From The Official Website Before Stock Runs Out!</a></span></strong></h3> Wearing glasses tells the eyes that they don't need to do any more work. It does away with the need for proper nutrition to repair the eyes, except that they must simply look through a different lens. Nourishing the body with the right vitamins and minerals can help to restore and heal the long-term damage to the eyes. While there are a few different ingredients in supplements that are extremely beneficial today, the makers of Vision Premium have created a new formula that prioritizes the top nutrients that eyes require. Natural Eye Health and Vision Support Vitamins is based entirely on clinical research and helps to fill any gaps in the user's diet. Even with all of the necessary nutrients, supplementing this support with additional nourishment can only benefit users. 
According to the creators, this formula can be used to improve inflammation and the immune system's inflammatory response. It also improves users' vision by utilizing ingredients that have been shown to improve vision. The nutrients in this formula can help the user maintain clarity at night, when looking long distances, or in any other situation. A pair of glasses will not heal your eyes, but the right nutrients may help. <h2 style="text-align: center;"><strong>How Does Vision Premium Work?</strong></h2> Vision Premium contains 13 clinically proven and effective ingredients that nourish the eye and improve vision. Free radicals cause the majority of eye problems. These radicals build up in the eye, causing oxidative stress. Prolonged stress causes eye tissue damage and macular degeneration. <h3 style="text-align: center;"><strong><span style="color: #ff0000;"><a style="color: #ff0000;" href="https://supplementfits.com/get-vision-premium">➣<span style="color: #00ccff;">BEWARE</span>!! <em>Do Not Forget: Buy It Only From the <span style="color: #00ccff;">Official Website</span></em></a></span></strong></h3> Maintaining a healthy diet provides the necessary nutrients to the eye. Diets, on the other hand, cannot provide enough nutrients to support a healthy eye. Vision Premium provides the necessary ingredients to the eye. It contains powerful antioxidant compounds that eliminate free radicals and toxins that can harm eye tissues. <a href="https://supplementfits.com/get-vision-premium"><img class="aligncenter wp-image-9120" src="https://allnutrapro.com/wp-content/uploads/2023/12/best-eyesight-image-1.png" alt="" width="543" height="447" /></a> <h2 style="text-align: center;"><strong>What Ingredients Go into Eye Health and Vision Support?</strong></h2> When developing this formula, the creators offer many essential vitamins and minerals that already support the user's wellness. However, these nutrients can be found in any multivitamin. 
To create something truly unique, consumers get access to a proprietary blend that only Vision Premium can offer. The proprietary blend includes: <ul> <li><strong>Lutein</strong></li> <li><strong>Bilberry extract</strong></li> <li><strong>Alpha lipoic acid</strong></li> <li><strong>Eyebright</strong></li> <li><strong>Zeaxanthin</strong></li> <li><strong>Quercetin</strong></li> <li><strong>L-taurine</strong></li> <li><strong>Grape seed extract</strong></li> <li><strong>Lycopene</strong></li> </ul> <h3 style="text-align: center;"><strong><span style="color: #008080;"><a style="color: #008080;" href="https://supplementfits.com/get-vision-premium"><em>✔</em><em>Get</em><em><span style="color: #ff0000;"> Vision Premium</span> <span style="color: #ff0000;">Reviews</span> At The Maximum Discounted Price!</em></a></span></strong></h3> Users will need two capsules each day to get the desired effects, using them about 30 minutes before they plan to eat. Read on below to learn about how each of the ingredients in the proprietary blend can make a substantial difference. <h2 style="text-align: center;"><strong>Benefits of Vision Premium</strong></h2> <ul> <li><strong>It helps ease painful headaches caused by eyestrain and prolonged computer use</strong></li> <li><strong>It relieves eye muscle tension</strong></li> <li><strong>It provides complete mind and body relaxation</strong></li> <li><strong>It boosts mental performance and concentration</strong></li> <li><strong>It enhances the eye’s ability to focus on near and far objects</strong></li> <li><strong>It helps relieve migraines</strong></li> <li><strong>It provides relief from itchy eyes</strong></li> </ul> <h2 style="text-align: center;"><strong>How to Use Vision Premium</strong></h2> Each bottle of the supplement contains 60 capsules. Users need to take two capsules daily with a glass of water. It is essential to follow the recommended dosage to avoid any adverse health effects. 
The supplement is safe for use for everyone above 18. However, people with underlying medical conditions should seek clearance from a health practitioner before using the supplement. <a href="https://supplementfits.com/get-vision-premium"><img class="aligncenter wp-image-9121" src="https://allnutrapro.com/wp-content/uploads/2023/12/eyesight.png" alt="" width="601" height="406" /></a> <h2 style="text-align: center;"><strong>Are There Any Side Effects Related To This Eye Health Supplement?</strong></h2> Vision Premium has no negative side effects. Many people have used the natural formula to improve their vision on multiple levels. So far, the supplement has only yielded positive results, which explains the positive Vision Premium reviews and ratings on the internet. <h3 style="text-align: center;"><strong><span style="color: #33cccc;"><a style="color: #33cccc;" href="https://supplementfits.com/get-vision-premium"><span style="color: #ff0000;">➛➛</span> Click Here and Secure Your "<span style="color: #ff0000;">Vision Premium</span>" From The Official Website!</a></span></strong></h3> However, keep in mind that this is a supplement, not a miracle cure. To get the best results, you must be patient. Also, if you have an underlying health condition, do not take this supplement without first consulting a doctor. <h2 style="text-align: center;"><strong>From Where Can You Get Vision Premium At Discounted Price?</strong></h2> You can get Vision Premium at a very reasonable price from its official website. Visit the official website of the supplement and scroll down to get to know all about the pricing details of the products and the discounts available. <h2 style="text-align: center;"><strong>Final Words</strong></h2> Vision Premium provides Natural Eye Health and Vision Support nutrients to assist people in improving their eye health. 
This simple daily supplement can support your night vision, clarity, and other eye health factors by providing essential key vitamins and minerals. That, too, without jeopardizing your overall health. <a href="https://supplementfits.com/get-vision-premium"><img class="aligncenter wp-image-9118" src="https://allnutrapro.com/wp-content/uploads/2023/12/Vision-Premium-2.png" alt="" width="609" height="343" /></a> Vision Premium's Natural Eye Health and Vision Support Vitamins offer consumers a way to improve their eye health without interfering with their daily lives. This formula is simple to use on a daily basis, providing nutrients in a proprietary blend that no other brand can duplicate. Users' vision may gradually improve to the point where their glasses no longer meet their needs, but this transition should be supervised by their optometrist. </div> </div> </div>
visionpremium
1,706,809
Dance through the Cybersecurity Tango: Unmasking MITM Attacks and Keeping Your Data Groove On!
🔒 Decoding Cyber Intrigue: Man-in-the-Middle Attacks Unveiled! 🕵️‍♂️ Hey fam! 👋 Ever heard...
0
2023-12-23T20:18:22
https://dev.to/farazul/dance-through-the-cybersecurity-tango-unmasking-mitm-attacks-and-keeping-your-data-groove-on-fh4
cybersecurity
## 🔒 Decoding Cyber Intrigue: Man-in-the-Middle Attacks Unveiled! 🕵️‍♂️ Hey fam! 👋 Ever heard of the cyber ninja move called _"Man-in-the-Middle" (MITM) attack_? It's the covert operation shaking up the digital playground. Picture Alice and Bob chatting online, blissfully unaware of the uninvited guest – the "attacker" – silently slipping in between. Sneaky, right? **Unmasking the Cyber Shenanigans:** - Interception of Communication: The attacker stealthily infiltrates the chat, exploiting network glitches, hacking Wi-Fi, or infiltrating devices. - Eavesdropping: Our cyber ninja eavesdrops on the conversation, snatching sensitive info like login details or financials while Alice and Bob remain clueless. - Modification of Data: Feeling mischievous? The attacker might inject humor, redirect to cat memes, or worse, plant malware into the conversation. - Resuming Communication: After the cyber acrobatics, the attacker seamlessly allows the conversation to continue, leaving no trace of their digital mischief. **Cyber Tricks 101: Techniques Employed by Sneaky Hackers:** - Packet Sniffing: The hacker's radar captures data packets, like intercepting digital postcards without detection. - DNS Spoofing: Redirecting to shady sites by tampering with the address book. - Wi-Fi Eavesdropping: Unsecured Wi-Fi becomes a playground for MITM attackers, snooping and snatching digital secrets. - SSL Stripping: Cyber's cloak of invisibility – downgrading a secure connection exposes protected info. **Battling the Cyber Ninjas:** - Encryption: Lock it up with strong encryption like HTTPS to keep messages unreadable without the key. - Secure Wi-Fi Practices: Treat Wi-Fi like a secret hideout, lock it down with a strong password, and add a VPN for extra protection. - Multi-Factor Authentication (MFA): MFA adds an extra layer – even if they get the password, they need the secret dance moves. 
- Regular Software Updates: Superhero upgrades fixing vulnerabilities that MITM attackers might exploit. Outsmart the cyber tricksters, stay savvy, stay sassy, and let's keep the cyber dance floor secure! 💃🔒🚀
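The "Encryption: lock it up with HTTPS" defense above can be made concrete. As a small, hedged sketch (not part of the original post): Python's standard `ssl` module builds client contexts that, by default, verify the server's certificate chain and hostname — exactly the checks that defeat a MITM attacker presenting a forged certificate.

```python
import ssl

# A default client-side SSL context refuses connections where the server's
# certificate chain or hostname doesn't check out. The forged certificate a
# MITM attacker would have to present fails both checks.
ctx = ssl.create_default_context()

print(ctx.check_hostname)                    # True: hostname verification is on
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True: a valid cert chain is required
```

The takeaway: the dangerous move is *disabling* these defaults (e.g. turning off verification to silence an error), which reopens the door to the cyber ninja.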
farazul
1,706,888
How to profile your multi-threaded python production code programmatically
Profiling multi-threaded Python applications can be challenging, especially in production...
0
2023-12-24T01:20:28
https://dev.to/oryaacov/how-to-profile-your-multi-threaded-running-python-code-programmatically-collect-periodically-and-analyze-the-results-5fh3
python, cprofile, multithreading, javascript
Profiling multi-threaded Python applications can be challenging, especially in production environments. Unfortunately, 90% of the guides don't explain how to really do it — especially for remote, production, multi-threaded code with no CLI access — and they will lead you to something like `python -m cProfile -s time my_app.py <args>`. Don't worry, this is not another one of those.

So why is it "hard" to do? You can't always run your program in production via the command-line interface (CLI), and cProfile only offers limited support for multi-threaded programs.

## So, How to do Multi-Threading profiling?

To perform multi-threading profiling, we can use one of the following approaches:

1. Sum the different threads' stats together into one set of statistics using `stats.add()`.
2. Create a separate profiler for each thread.

Since our program is built from 3 different engine threads, I will cover only the second approach in this guide. So, I created the following utility that allows me to create my threads:

```python
import cProfile
import logging
import threading
import time


class ProfilerManager:
    def __init__(self, is_enabled, dump_interval_seconds, profiler_dir_path):
        self._is_enabled = is_enabled
        self._logger = logging.getLogger(__name__)  # replace with your own logger
        self._profiler_dir_path = profiler_dir_path
        self._profiler_dump_interval_seconds = dump_interval_seconds
        if self._profiler_dump_interval_seconds <= 0:
            self._logger.warning('invalid _profiler_dump_interval_seconds value {}, setting default'.format(self._profiler_dump_interval_seconds))
            self._profiler_dump_interval_seconds = 60
        self._profilers = {}
        if self._is_enabled:
            # dump the collected stats to disk periodically, in the background
            threading.Thread(target=self._dump_periodic_profilers, daemon=True).start()

    def thread_with_profiler(self, target, tag):
        if not self._is_enabled:
            return threading.Thread(target=target)
        self._disable_profiler_if_exists(tag)
        profiler = cProfile.Profile()
        self._profilers[tag] = profiler
        return threading.Thread(target=self._target_with_profiler(target, profiler, tag))

    def run(self, target, tag):
        if not self._is_enabled:
            return target
        profiler = cProfile.Profile()
        self._profilers[tag] = profiler
        return self._target_with_profiler(target, profiler, tag)

    def _disable_profiler_if_exists(self, tag):
        if tag in self._profilers:
            self._logger.warning('tag {} already exists, disabling, and overriding with new target'.format(tag))
            existing_profiler = self._profilers[tag]
            self._disable_profiler(existing_profiler, tag)

    def _target_with_profiler(self, target, profiler, tag):
        def target_with_profiler():
            self._enable_profiler(profiler, tag)
            target()
            self._disable_profiler(profiler, tag)
        return target_with_profiler

    def _enable_profiler(self, profiler, tag):
        self._logger.debug('enabling profiler {}'.format(tag))
        profiler.enable()

    def _disable_profiler(self, profiler, tag):
        self._logger.debug('disabling profiler {}'.format(tag))
        profiler.disable()

    def _dump_periodic_profilers(self):
        while True:
            time.sleep(self._profiler_dump_interval_seconds)
            self._logger.debug("dumping profiles")  # std logging has no trace level
            for tag, profiler in self._profilers.items():
                dump_file_path = '{}/{}.dump'.format(self._profiler_dir_path, tag)
                self._logger.debug("dumping profile, {}".format(dump_file_path))
                profiler.dump_stats(dump_file_path)
```

Let's take a look at how `thread_with_profiler` is used, which is similar to `threading.Thread`:

```python
ip_manager_thread = profiler_manager.thread_with_profiler(target=self._run_ip_monitor, tag='ip_manager')
ip_manager_thread.start()
```

Let's do a quick dive into `thread_with_profiler`:

```python
def thread_with_profiler(self, target, tag):
    if not self._is_enabled:
        return threading.Thread(target=target)
    self._disable_profiler_if_exists(tag)
    profiler = cProfile.Profile()
    self._profilers[tag] = profiler
    return threading.Thread(target=self._target_with_profiler(target, profiler, tag))
```

All that this function does is disable the profiler with the same tag, in case it already exists. It then creates a new profiler and a new thread that runs `_target_with_profiler`, which is the real magic behind the scenes:

```python
def _target_with_profiler(self, target, profiler, tag):
    def target_with_profiler():
        self._enable_profiler(profiler, tag)
        target()
        self._disable_profiler(profiler, tag)
    return target_with_profiler
```

As we can see, we create and return a new function wrapper that our thread will execute. All the wrapper does is enable the profiler (start it), call your actual function, and disable the profiler when/if it ends. Then we have the following function, which actually dumps the current stats to per-tag files every X seconds:

```python
def _dump_periodic_profilers(self):
    while True:
        time.sleep(self._profiler_dump_interval_seconds)
        self._logger.debug("dumping profiles")
        for tag, profiler in self._profilers.items():
            dump_file_path = '{}/{}.dump'.format(self._profiler_dir_path, tag)
            self._logger.debug("dumping profile, {}".format(dump_file_path))
            profiler.dump_stats(dump_file_path)
```

So, now that we understand how the whole thing works, let's see a full usage example:

```python
def async_worker1():
    pass  # work


def async_worker2():
    pass  # work


def main_worker():
    pass  # work


def main():
    profiler_manager = ProfilerManager(is_enabled=True, dump_interval_seconds=10, profiler_dir_path="/var/logs")
    worker1_thread = profiler_manager.thread_with_profiler(target=async_worker1, tag='async_worker1')
    worker2_thread = profiler_manager.thread_with_profiler(target=async_worker2, tag='async_worker2')
    worker1_thread.start()
    worker2_thread.start()
    # run the main worker, wrapped with its own profiler, on the current thread
    profiler_manager.run(target=main_worker, tag='main_worker')()


if __name__ == '__main__':
    main()
```

happy profiling! :)
oryaacov
1,706,972
Rust: Where Innovation Meets Stability
In the vast landscape of programming languages, Rust emerges as a beacon that captures the hearts of...
0
2023-12-24T06:07:34
https://dev.to/trixtec/rust-where-innovation-meets-stability-55al
rust, programming, backend, security
In the vast landscape of programming languages, Rust emerges as a beacon that captures the hearts of developers globally. Its ascendancy is not just about code; it's a narrative that intertwines innovation seamlessly with the core tenets of stability and security. **Why is Rust better than other programming languages?** Unlike some programming languages, Rust does not employ garbage collection. Instead, its ownership and borrowing rules manage memory, giving developers precise control over memory allocation and deallocation for efficient resource management. It also ships with the Cargo package manager. Rust and C++ are comparable in terms of overall speed and performance, but when we take unbiased benchmarking into account, there are many instances in which Rust will perform even better than its counterpart. Rust is usually used in the back end of web applications; Rust backend frameworks and libraries have made it possible to develop web solutions in a smart and intuitive manner. Rust is safer than C++ because it prevents data races at compile time with its ownership system, making it easier to avoid memory safety issues. Rust would be an excellent language to learn if you want to develop web applications due to its security and concurrency features. 
![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/qsp3pu01k96fclo6wo23.png) **Advantages of Rust over high-performance programming languages like C/C++** - General purpose language that can be used for any purpose - Memory safety without using garbage collection - Rust can be written faster than C++ - Rust is as fast as C/C++ while it is far safer - Less complexity - Fast and high performance - Low overhead makes it ideal for embedded programming - Facilitates powerful web application development - Ease of use - Code validation - Great Community support - Concurrency - Rust has an inbuilt dependency and build manager known as Cargo **Rust at the Core: Powering Infrastructure** Witness Rust's prowess in action as it shapes the fundamental infrastructure of projects like Polkadot. The very essence of blockchain's rules and behavior finds a home in Rust, showcased vividly through the components congregated in the awesome-blockchain-rust repository. **Crafting Efficiency: Command Line Mastery** Explore Rust's ability to compile efficient machine code and wield expressive syntax to craft command line tools. More than a utilitarian endeavor, building a CLI application becomes a journey of discovery, exemplified by a guide that promises results in just 15 minutes. **Compact Powerhouse: Embedded Systems and IoT** Delve into Rust's minimalist runtime and meticulous memory control, making it the go-to language for embedded systems and IoT development. Its knack for eradicating memory-related bugs and generating compact, efficient binaries aligns seamlessly with the demands of the IoT landscape. **The Rust Devotion: Speed, Safety, Performance** While Rust's user base might not rival giants like Java or Python, it consistently finds itself on the podium of most-admired languages. Developers resonate with Rust for its trinity of virtues—speed, safety, and performance. 
It's a language that thrives, propelled by a community dedicated to its continual growth. **Dynamic Evolution: The Rust Ecosystem** Rust's journey is a dynamic evolution, pulsating with frameworks, tools, and resources. The awesome-rust repository is a living testament to the expanding universe of Rust code and resources and to the language's adaptability and vibrancy. **Pro Tip: Rust Exploration with GitHub Copilot** Embark on your Rust journey with GitHub Copilot, your AI-powered ally in coding. It's not just about mastering documentation; Copilot offers a hands-on, immersive learning experience, letting you sharpen your Rust skills as you navigate the coding landscape. **Rust is a Mindset** Admiring Rust goes beyond adopting a language—it's about embracing a mindset that balances innovation with unwavering commitments to stability and security. As you set forth on your coding journey with Rust, remember, it's not just about code; it's a journey of adopting a philosophy that resonates with developers worldwide. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/x7wbciljdaoxzwaxyx1p.jpg)
trixtec
1,707,379
Yet Another Newsletter LOL: Time to Recharge
This will be the last newsletter until the new year. I'll be taking some time off to recharge. The...
20,787
2023-12-24T14:18:58
https://buttondown.email/nickytonline/archive/yet-another-newsletter-lol-time-to-recharge/
react, javascript, a11y
<p>This will be the last newsletter until the new year. I'll be taking some time off to recharge. The next newsletter will be landing January 14th. With that, another week, another newsletter. Let's get to it!</p> <h2>Around the Web</h2> <ul> <li>It's been another wild year in JavaScript frameworks land. Ryan Carniato gives us a great summary and what to expect in 2024. Check out <a href="https://dev.to/this-is-learning/javascript-frameworks-heading-into-2024-i3l?utm_source=nickytonline&amp;utm_medium=email&amp;utm_campaign=yet-another-newsletter-lol-time-to-recharge" target="_blank">JavaScript Frameworks - Heading into 2024</a>.</li> <li>If you're still new to <a href="https://servercomponents.dev/what-are-react-server-components?utm_source=nickytonline&amp;utm_medium=email&amp;utm_campaign=yet-another-newsletter-lol-time-to-recharge" target="_blank">React Server Components</a> (RSCs), this is a great explainer from <a href="https://bholmes.dev/?utm_source=nickytonline&amp;utm_medium=email&amp;utm_campaign=yet-another-newsletter-lol-time-to-recharge" target="_blank">Ben Holmes</a>, <a href="https://www.youtube.com/watch?v=MaebEqhZR84&amp;utm_source=nickytonline&amp;utm_medium=email&amp;utm_campaign=yet-another-newsletter-lol-time-to-recharge" target="_blank">React server components from scratch!</a></li> <li><a href="https://react-spectrum.adobe.com/react-aria/index.html?utm_source=nickytonline&amp;utm_medium=email&amp;utm_campaign=yet-another-newsletter-lol-time-to-recharge" target="_blank">React Aria Components 1.0</a> dropped this week. This is a huge win for the React ecosystem. 
Amazing work <a href="https://x.com/devongovett/status/1737516074663362843?utm_source=nickytonline&amp;utm_medium=email&amp;utm_campaign=yet-another-newsletter-lol-time-to-recharge" target="_blank">Devon Govett</a> and team!</li> </ul> <blockquote> <p>Over 40 components with built-in behavior, adaptive interactions, top-tier accessibility, and internationalization out of the box, ready for your styles.</p> </blockquote> <h2>Fun Stuff</h2> <p>I really enjoyed <a href="https://www.imdb.com/title/tt9288030/?utm_source=nickytonline&amp;utm_medium=email&amp;utm_campaign=yet-another-newsletter-lol-time-to-recharge" target="_blank">Reacher</a> season one, and I have to say, season two is living up to the hype. If you're looking for action and drama, with a splash of comedy, look no further.</p> <h2>Shameless Plugs</h2> <p>Monday I got to hang with <a href="https://twitter.com/saronyitbarek?utm_source=nickytonline&amp;utm_medium=email&amp;utm_campaign=yet-another-newsletter-lol-time-to-recharge" target="_blank">Saron Yitbarek</a> (@saronyitbarek). She's the founder of <a href="https://www.codenewbie.org/?utm_source=nickytonline&amp;utm_medium=email&amp;utm_campaign=yet-another-newsletter-lol-time-to-recharge" target="_blank">CodeNewbie</a> where a bunch of you might know her from, but her latest venture is <a href="https://notadesigner.io/subscribe?ref=oiDwhLbAda&amp;utm_source=nickytonline&amp;utm_medium=email&amp;utm_campaign=yet-another-newsletter-lol-time-to-recharge" target="_blank">Not a Designer</a>! We dug into her journey into tech all the way to Not a Designer.</p> What a great conversation. Thanks for hanging, Saron! {% embed https://www.youtube.com/watch?v=gHnlfLrpq9s %} <p>Wednesday, I hung out with my awesome coworker @bekahhw for the last <a href="https://opensauced.pizza/?utm_source=nickytonline&amp;utm_medium=email&amp;utm_campaign=yet-another-newsletter-lol-time-to-recharge" target="_blank">OpenSauced</a> Contributors Shout-outs stream of the year. 
Thanks to everyone who has been contributing to the OpenSauced repositories! 💜🍕</p> {% embed https://www.twitch.tv/videos/2009270701?utm_source=nickytonline&amp;utm_medium=email&amp;utm_campaign=yet-another-newsletter-lol-time-to-recharge %} <p>And lastly, Thursday was my monthly live stream on the <a href="https://cfe.dev/?utm_source=nickytonline&amp;utm_medium=email&amp;utm_campaign=yet-another-newsletter-lol-time-to-recharge" target="_blank">CFE.dev</a> YouTube channel. We dug into <a href="https://fresh.deno.dev/?utm_source=nickytonline&amp;utm_medium=email&amp;utm_campaign=yet-another-newsletter-lol-time-to-recharge" target="_blank">Fresh</a>, a full-stack web framework from the folks at Deno. Lots of great questions and chat during the live stream. Check out the recording!</p> {% embed https://www.youtube.com/watch?v=XtIrc6aUPJU %} <p>I'm off until January 8th, so no live-streaming until then. Check out the <a href="https://www.nickyt.co/pages/stream-schedule/?utm_source=nickytonline&amp;utm_medium=email&amp;utm_campaign=yet-another-newsletter-lol-time-to-recharge" target="_blank">schedule</a> for all the wonderful guests coming on in the new year.</p> <p>That's a wrap! Until the next one and until next year!</p> <p>If you're looking for another friendly nook of the internet, head to <a href="https://discord.iamdeveloper.com?utm_source=nickytonline&amp;utm_medium=email&amp;utm_campaign=yet-another-newsletter-lol-time-to-recharge" target="_blank">discord.iamdeveloper.com</a> and join our community!</p> If you liked this newsletter, you can [subscribe](https://www.nickyt.co/pages/newsletter/) or if RSS is your jam, you can also [subscribe via RSS](https://buttondown.email/nickytonline/rss). <!-- my newsletter -->
nickytonline
1,707,695
Luckywin
Luckywin is a top-notch prize-exchange gaming portal with many great games and big rewards. Visit the game portal link...
0
2023-12-25T06:24:13
https://dev.to/luckywinac/luckywin-i6a
Luckywin is a top-notch prize-exchange gaming portal with many great games and big rewards. Visit the game portal link and try the games right away. Address: 2644 QL2, Vân Phú, Thành phố Việt Trì, Phú Thọ, Việt Nam Zip code: 290000 Email: luckywinac@gmail.com Phone: 0383 081 543 Website: https://luckywin.ac/ #luckywin #luckywinac #luckywincasino #conggameluckywin https://twitter.com/Luckywin1377664 https://www.pinterest.com/luckywinac/ https://vimeo.com/luckywinac https://www.tumblr.com/luckywinac https://www.twitch.tv/luckywinac/about https://www.reddit.com/user/luckywinac https://www.youtube.com/@luckywinac https://500px.com/p/luckywinac?view=photos https://www.flickr.com/people/199831738@N08/ https://linkfly.to/51225lZIYtL https://joy.bio/luckywinac https://micro.blog/luckywinac https://dubo.tribe.so/user/luckywinac https://app.zintro.com/profile/zie8ba5818?ref= https://skitterphoto.com/photographers/80701/luckywin https://coub.com/luckywinac
luckywinac
1,716,161
Understanding CORS
Hello there! Happy New Year! I hope you had an opportunity to get some rest during the winter...
0
2024-01-03T16:42:34
https://n0rdy.foo/posts/20240103/understanding-cors/
beginners, tutorial, webdev, programming
Hello there! Happy New Year! I hope you had an opportunity to get some rest during the winter holidays and maybe even made a snowman or two =) Several days ago, I had a dialog with a friend of mine (let's call her Eowyn) who has recently started her path in software engineering: **Eowyn:** Hey, buddy! I'm building a web project that has the frontend and backend parts. Whenever I click a button on the UI (that triggers a DELETE request to the server), my browser blows up with these errors: ![Image](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/dp3ynbzgeh1usfanykjc.png) I tried calling the same endpoint with the `curl` command, and it works. I did the same via Postman, and it also works. It makes no sense! What the heck? **Me:** Hehe, congrats, this is a historic moment in your career - the day you discovered CORS! =) To her credit, she didn't want my help in fixing the issue, she wanted to understand it. That's why I spent the next half an hour explaining the topic of CORS. And since that was not the first time I had to do that, I realized that it is an excellent opportunity to write a post on this subject and, next time, share the link to it instead of explaining it once again. **A disclaimer**: this post is called `Understanding CORS`, and that's exactly the goal we are going to pursue today. The target audience is the folks who know nothing or very little about CORS and would like to learn what's happening on a high level behind errors like the one above. We won't dive deep and won't cover all the use-/edge-cases of this topic - if you are looking for more advanced reading, here is [an excellent longread from Mozilla](https://developer.mozilla.org/en-US/docs/Web/HTTP/CORS). ## Real-life example If you are following my blog, you might have noticed that I like to explain technical aspects via real-life scenarios/examples. I believe this is the shortest and easiest path to understanding. We'll do the same today. 
Let me introduce you to Geralt, a witcher, one of the best in his craft. Due to his skills and due to the number of dangerous beasts out there, Geralt's services are in high demand. People contact him so often that he has to hire Bob, a secretary, to manage the calls he receives. Thus, once somebody calls Geralt's office, it is Bob who manages the call. It usually works like this: - once somebody calls Geralt's office, Bob picks up the phone, gathers the info about the call (who, why, and where they are calling from), and asks them to wait ![Image](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/kfw7zi587nxlx80devwy.jpg) - Bob calls Geralt, shares the info about the caller, and asks whether Geralt is willing to talk to them ![Image](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/xn5ld1adj7z21z2wrm3s.jpg) As we can see, Geralt replied that if, during the next 2 hours, people were calling from the place called Ingenheim about hunting, Bob should connect them with him right away. - Bob gets back to the caller and connects them with Geralt ![Image](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/53bxvh5o175hb6owlbi7.jpg) - once there is another call within the next 2 hours coming from a different place than Ingenheim, Bob rejects the request to connect them with Geralt: ![Image](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/vjn7p8fdhvm82gf3u01y.jpg) However, if there is someone (like the witcher's buddy Yarpen) who knows Geralt's direct phone number, they can still call him regardless of the place and reason for making a call: ![Image](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/qtz2ajgkv87n50v5opq9.jpg) I think this should make perfect sense. But how is this related to CORS? 
## From the real life to software engineering If we try to map the real-life example we used to the software engineering concepts, it will look like this: - the person who calls the office - frontend application - Bob - browser - Geralt - backend application And the step-by-step sequence of events is the following: - once the frontend app tries to send a request to the backend API, a browser makes a so-called pre-flight request to the backend API and asks about the allowed options: who is allowed to call the API and what type of requests can be sent - the API sends a response with such options and (optionally) includes the duration for which the browser should cache those settings and rely on them instead of making another pre-flight request - if the frontend app and the request it tries to make are within the allow list, the browser lets it through - otherwise, the request is rejected with the error we saw at the very beginning of this post However, this mechanism is easy to bypass by skipping the browser and sending a request directly (like via `curl`, `Postman`, or any other HTTP client) - that's exactly what Yarpen did above by calling Geralt's direct phone number instead of the office's one. Shall we assume that CORS is quite a poor security mechanism, as we can easily bypass it? The answer is "it depends". If we'd like to secure our API in a way that only the allowed services can call it, CORS is not a good idea as a standalone solution, as it doesn't apply to server-to-server communications. CORS' primary use case is CSRF (Cross-Site Request Forgery) attack prevention. Let's discuss what it is. 
And, all of a sudden, there is another "bang": someone has posted a link with the description that there is a big sale on your favorite pickles - that's your lucky day, isn't it? You followed the link, but there are no pickles behind it, just a blank page. "That's unfair! How could someone make such jokes?" was your thought, after which you closed that page. Suppose you have monitored your network traffic while visiting the "pickles scam" page. In that case, you'd notice that even though the page was blank, it actually contained a small JavaScript code that made a request to the API of your bank and requested a money transfer to the unknown (for you) account. You've been logged in to your online bank account, so as far as the bank is concerned, it was you who made this transfer - without even knowing it. But if the bank has CORS configs enabled, the browser will verify whether this "pickles scam" domain is allowed to call the bank's API, and since it's not allowed, the request will be rejected. Even though you are left without pickles, your money is safe. I asked ChatGPT to provide me with more examples of CSRF attacks, and here is what I got: **Example 1: Changing Email Address** 1. ***Scenario***: Alice is logged into her email account on `emailservice.com`. 2. ***Attack***: She then visits a malicious website, `malicioussite.com`, which contains a hidden form that is automatically submitted by JavaScript. This form is crafted to send a POST request to `emailservice.com` to change her email settings (like her recovery email address). 3. ***Result***: If `emailservice.com` doesn't have proper CSRF protections, it might process this request as if Alice intentionally submitted it, leading to her recovery email being changed without her knowledge. **Example 2: Social Media Post** 1. ***Scenario***: Bob is logged into a social media platform. 2. ***Attack***: He clicks on a link that leads him to a malicious site. 
This site contains a script that makes a request to the social media platform to post a message or send a message to all his contacts. 3. ***Result***: If the social media platform doesn't verify the authenticity of the request, it could result in spam or malicious messages being sent from Bob's account. **Example 3: Changing Password** 1. ***Scenario***: Dana is logged into a forum. 2. ***Attack***: She receives an email with a link to an interesting article. Clicking the link takes her to a website that secretly contains a form that sends a request to the forum to change her password. 3. ***Result***: Without CSRF protection, Dana’s password could be changed without her consent, potentially locking her out of her account. As you can see, all of this could have been avoided if the server had CORS configured. How can we do that, though? Let's finally see some code! ## Some code As we have already determined, there are 3 types of actors here: - browser - frontend - backend In this example, we'll use Brave, a Chromium-based browser, for UI simplicity purposes. Let's jump into the code, then. ### Backend We are going to build a tiny books API that has 3 endpoints: - get all books - add a new book - delete all books It's a dummy application, and that's why we'll store all the data in the memory. 
Here is the entire Go code for our backend:

```go
package main

import (
	"encoding/json"
	"errors"
	"fmt"
	"net/http"

	"github.com/go-chi/chi/v5"
)

var books = []string{"The Lord of the Rings", "The Hobbit", "The Silmarillion"}

type Book struct {
	Title string `json:"title"`
}

func main() {
	err := runServer()
	if err != nil {
		if errors.Is(err, http.ErrServerClosed) {
			fmt.Println("server shutdown")
		} else {
			fmt.Println("server failed", err)
		}
	}
}

func runServer() error {
	httpRouter := chi.NewRouter()
	httpRouter.Route("/api/v1", func(r chi.Router) {
		r.Get("/books", getAllBooks)
		r.Post("/books", addBook)
		r.Delete("/books", deleteAllBooks)
	})

	server := &http.Server{Addr: "localhost:8888", Handler: httpRouter}
	return server.ListenAndServe()
}

func getAllBooks(w http.ResponseWriter, req *http.Request) {
	respBody, err := json.Marshal(books)
	if err != nil {
		w.WriteHeader(http.StatusInternalServerError)
		return
	}
	w.Header().Set("Content-Type", "application/json")
	w.WriteHeader(http.StatusOK)
	w.Write(respBody)
}

func addBook(w http.ResponseWriter, req *http.Request) {
	var book Book
	err := json.NewDecoder(req.Body).Decode(&book)
	if err != nil {
		w.WriteHeader(http.StatusBadRequest)
		return
	}
	books = append(books, book.Title)
	w.WriteHeader(http.StatusCreated)
}

func deleteAllBooks(w http.ResponseWriter, req *http.Request) {
	books = []string{}
	w.WriteHeader(http.StatusNoContent)
}
```

You can find it in [this GitHub repo](https://github.com/n0rdy/n0rdy-blog-code-samples/tree/main/20240103-cors). As you can see, I used an external dependency, `github.com/go-chi/chi/v5`, for the API. It is entirely possible to achieve the same with pure Go, but I used chi to improve the readability of the code. Other than that, the code is pretty simple: it reads, writes, or deletes the data from/into the slice of books and sends a successful response. 
Let's run it: the server will be running as `http://localhost:8888` It's time to define a frontend now: ### Frontend Here we need a simple HTML page with JS scripts to make requests to the backend API, and a tiny Go server to serve the page. Here is the HTML code: ```html <!DOCTYPE html> <html lang="en"> <head> <meta charset="UTF-8"> <meta name="viewport" content="width=device-width, initial-scale=1.0"> <title>Books</title> <link href="https://cdn.jsdelivr.net/npm/bootstrap@5.3.2/dist/css/bootstrap.min.css" rel="stylesheet" integrity="sha384-T3c6CoIi6uLrA9TneNEoa7RxnatzjcDSCmG1MXxSR1GAsXEV/Dwwykc2MPK8M2HN" crossorigin="anonymous"> <script src="https://cdn.jsdelivr.net/npm/bootstrap@5.3.2/dist/js/bootstrap.bundle.min.js" integrity="sha384-C6RzsynM9kWDrMNeT87bh95OGNyZPhcTNXj1NW7RuBCsyN/o0jlpcV8Qyq46cDfL" crossorigin="anonymous"></script> </head> <body> <div class="container p-3"> <button type="button" class="btn btn-primary" id="getBooks">Get books</button> <button type="button" class="btn btn-danger" id="deleteAllBooks">Delete all books</button> <br> <br> <form> <div class="mb-3"> <label for="inputBookTitle" class="form-label">Book title</label> <input type="text" class="form-control" id="inputBookTitle" aria-describedby="emailHelp"> </div> <button type="submit" class="btn btn-primary">Add</button> </form> </div> <script> function getBooks () { fetch('http://localhost:8888/api/v1/books') .then(response => response.json()) .then(data => { const booksList = document.querySelector('.books-list') if (booksList) { booksList.remove() } const ul = document.createElement('ul') ul.classList.add('books-list') data.forEach(book => { const li = document.createElement('li') li.innerText = book ul.appendChild(li) }) document.body.appendChild(ul) }) } function deleteAllBooks () { fetch('http://localhost:8888/api/v1/books', { method: 'DELETE' }) .then(response => { if (response.status === 204) { getBooks() } else { const div = document.createElement('div') div.innerText = 
'Something went wrong'
            document.body.appendChild(div)
          }
        })
    }

    const getBooksButton = document.getElementById('getBooks')
    const deleteAllBooksButton = document.getElementById('deleteAllBooks')
    const input = document.querySelector('input')
    const form = document.querySelector('form')

    getBooksButton.addEventListener('click', () => getBooks())
    deleteAllBooksButton.addEventListener('click', () => deleteAllBooks())
    form.addEventListener('submit', (event) => {
      event.preventDefault()
      const title = input.value
      fetch('http://localhost:8888/api/v1/books', {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json'
        },
        body: JSON.stringify({ title })
      })
        .then(response => {
          if (response.status === 201) {
            input.value = ''
            getBooks()
          } else {
            const div = document.createElement('div')
            div.innerText = 'Something went wrong'
            document.body.appendChild(div)
          }
        })
    })
  </script>
</body>
</html>
```

And a Go server:

```go
package main

import (
	"errors"
	"fmt"
	"github.com/go-chi/chi/v5"
	"net/http"
)

func main() {
	err := runServer()
	if err != nil {
		if errors.Is(err, http.ErrServerClosed) {
			fmt.Println("client server shutdown")
		} else {
			fmt.Println("client server failed", err)
		}
	}
}

func runServer() error {
	httpRouter := chi.NewRouter()
	httpRouter.Get("/", serveIndex)

	server := &http.Server{Addr: "localhost:3333", Handler: httpRouter}
	return server.ListenAndServe()
}

func serveIndex(w http.ResponseWriter, req *http.Request) {
	http.ServeFile(w, req, "20240103-cors/01-no-cors/client/index.html")
}
```

If we run the Go code, it will serve the HTML page at `http://localhost:3333`.

Based on my drawings, you might have already noticed that I have an exceptional talent for design, so the UI is another masterpiece of mine:

![Image](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/75jdc6f23cadeoythf33.png)

Let's test what we have now.
### Testing time

Navigate to http://localhost:3333/ in your browser and open DevTools there: it's `Option+Command+I` on macOS, or `View -> Developer -> Developer Tools`. Ideally, we should be able to see the `Network` tab and the `Console` like this:

![Image](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/zx3qmk3kxqmwjya78ptv.png)

Let's try to add a new book - "Harry Potter" - and click "Add". Boom!

![Image](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/vvsh3tyiksz58i8gmj0o.png)

The error looks somewhat familiar, doesn't it? It's not exactly the same as at the beginning of this post, but it's very similar. Let's try to understand what it actually means.

You might have noticed that our backend code doesn't mention CORS at all. That's true: we haven't implemented any CORS config so far. But that doesn't matter to the browser: it tried to make a preflight request anyway.

![Image](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/hlx528a46okbtsii3m6m.png)

If we click on it, it will expand some details, and we can see that the browser tried to make an OPTIONS request to the same path as the add-book endpoint, and received a `405 Method Not Allowed` response, which makes sense, as we haven't defined an OPTIONS endpoint in our backend.

![Image](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/11222mhmf47b2jrfvv47.png)

If we get back to our real-life example for a moment, what happened here is the following:

- somebody calls Geralt's office, so Bob picks up the phone, gathers the info about the call (who, why, and where they are calling from), asks them to wait, and calls Geralt to double-check
- but "Houston, we have a problem": Geralt's phone is off, so there's no way Bob can get Geralt's preferences for today

Will Bob connect the caller with Geralt somehow? Or will he agree on the services required without talking to the witcher? Of course not: Bob has to reject the customer for now.
The same happens in our case: since the browser has no idea about the backend API CORS settings, it simply refuses to make any requests there - safety first! Let's fix that! ### Fixing time The frontend app remains the same, but as for the backend, we need to make a few changes: - introduce a new function to enable CORS: ```go func enableCors(w http.ResponseWriter) { // specifies which domains are allowed to access this API w.Header().Set("Access-Control-Allow-Origin", "http://localhost:3333") // specifies which methods are allowed to access this API w.Header().Set("Access-Control-Allow-Methods", "GET, POST, DELETE") // specifies which headers are allowed to access this API w.Header().Set("Access-Control-Allow-Headers", "Accept, Content-Type") // specifies for how long the browser can cache the results of a preflight request (in seconds) w.Header().Set("Access-Control-Max-Age", strconv.Itoa(60*60*2)) } ``` As you can see, we have introduced 4 CORS settings: - domain from which it is allowed to call our API - `http://localhost:3333` - HTTP methods that are allowed to be used with our API - `GET, POST, DELETE` - request headers that are allowed to be passed to our API - `Accept, Content-Type` - time for which the browser can remember and cache these settings - 2 hours in seconds A short disclaimer: there are more CORS-related headers, but these are enough for understanding. I'll share the links to dive deeper at the end of this post. - introduce an OPTIONS endpoint alongside the existing ones and a function to handle it: ```go ... httpRouter.Route("/api/v1", func(r chi.Router) { r.Options("/books", corsOptions) r.Get("/books", getAllBooks) r.Post("/books", addBook) r.Delete("/books", deleteAllBooks) }) ... 
func corsOptions(w http.ResponseWriter, req *http.Request) { enableCors(w) w.WriteHeader(http.StatusOK) } ``` - add `enableCors` invocation to the existing functions of other endpoints, for example: ```go func getAllBooks(w http.ResponseWriter, req *http.Request) { respBody, err := json.Marshal(books) if err != nil { w.WriteHeader(http.StatusInternalServerError) return } enableCors(w) w.Header().Set("Content-Type", "application/json") w.WriteHeader(http.StatusOK) w.Write(respBody) } ``` The final code looks like this: ```go package main import ( "encoding/json" "errors" "fmt" "github.com/go-chi/chi/v5" "net/http" "strconv" ) var books = []string{"The Lord of the Rings", "The Hobbit", "The Silmarillion"} type Book struct { Title string `json:"title"` } func main() { err := runServer() if err != nil { if errors.Is(err, http.ErrServerClosed) { fmt.Println("server shutdown") } else { fmt.Println("server failed", err) } } } func runServer() error { httpRouter := chi.NewRouter() httpRouter.Route("/api/v1", func(r chi.Router) { r.Options("/books", corsOptions) r.Get("/books", getAllBooks) r.Post("/books", addBook) r.Delete("/books", deleteAllBooks) }) server := &http.Server{Addr: "localhost:8888", Handler: httpRouter} return server.ListenAndServe() } func corsOptions(w http.ResponseWriter, req *http.Request) { enableCors(w) w.WriteHeader(http.StatusOK) } func getAllBooks(w http.ResponseWriter, req *http.Request) { respBody, err := json.Marshal(books) if err != nil { w.WriteHeader(http.StatusInternalServerError) return } enableCors(w) w.Header().Set("Content-Type", "application/json") w.WriteHeader(http.StatusOK) w.Write(respBody) } func addBook(w http.ResponseWriter, req *http.Request) { var book Book err := json.NewDecoder(req.Body).Decode(&book) if err != nil { w.WriteHeader(http.StatusBadRequest) return } books = append(books, book.Title) enableCors(w) w.WriteHeader(http.StatusCreated) } func deleteAllBooks(w http.ResponseWriter, req *http.Request) { books = []string{} 
enableCors(w) w.WriteHeader(http.StatusNoContent) } func enableCors(w http.ResponseWriter) { // specifies which domains are allowed to access this API w.Header().Set("Access-Control-Allow-Origin", "http://localhost:3333") // specifies which methods are allowed to access this API (GET is allowed by default) w.Header().Set("Access-Control-Allow-Methods", "POST, DELETE") // specifies which headers are allowed to access this API w.Header().Set("Access-Control-Allow-Headers", "Content-Type") // specifies for how long the browser can cache the results of a preflight request (in seconds) w.Header().Set("Access-Control-Max-Age", strconv.Itoa(60*60*2)) } ``` Let's run this code to see whether it works. It should - if you experience any issues, try to restart both frontend and backend apps, and even access http://localhost:3333 via incognito mode, as there might be some issues due to browser caching. If you click all the buttons a few times by trying to add some books and get/delete all of them, you'll see that it works as expected. Even more, if we take a look at the `Network` tab, we'll see that there is only 1 preflight request in total: ![Image](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/phkucwh6z5md5jzog3l8.png) Of course, we are not surprised about that, as we have just configured the `Access-Control-Max-Age` header for those purposes. I believe you should already have a good understanding of CORS. But let's take a final step and try to mess with CORS configs to see how it breaks the flow again. ### Breaking (purposely) time We'll do that in a one-by-one fashion by breaking (and reverting) each of the CORS configs. Please remember to restart both frontend and backend apps once we apply a change, as otherwise, we won't see any effect. Let's gooooo! #### Access-Control-Allow-Origin Our frontend app runs on `http://localhost:3333`, and that's exactly the value we have within the `Access-Control-Allow-Origin` header. 
Let's change it to something else: ```go func enableCors(w http.ResponseWriter) { // specifies which domains are allowed to access this API w.Header().Set("Access-Control-Allow-Origin", "http://example.com") // specifies which methods are allowed to access this API w.Header().Set("Access-Control-Allow-Methods", "GET, POST, DELETE") // specifies which headers are allowed to access this API w.Header().Set("Access-Control-Allow-Headers", "Accept, Content-Type") // specifies for how long the browser can cache the results of a preflight request (in seconds) w.Header().Set("Access-Control-Max-Age", strconv.Itoa(60*60*2)) } ``` Let's restart the apps and see what will happen if we try to use them. ![Image](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/852i9s5twx9r0d7a813c.png) Well, the outcome is as expected: only `http://example.com` is allowed to call the API. What's even more interesting is the fact that if we put the localhost value there but with a different port, we'll still get an error: ![Image](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/9nx5g7uyhagmcvsyqnw8.png) The same applies for `http` vs `https` - CORS is very strict: ![Image](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/kcebu34z067amam44q33.png) There is a possibility, though, of allowing anyone to call your API - by using `*` as a value for the `Access-Control-Allow-Origin` header: ```go func enableCors(w http.ResponseWriter) { // specifies which domains are allowed to access this API w.Header().Set("Access-Control-Allow-Origin", "*") // specifies which methods are allowed to access this API w.Header().Set("Access-Control-Allow-Methods", "GET, POST, DELETE") // specifies which headers are allowed to access this API w.Header().Set("Access-Control-Allow-Headers", "Accept, Content-Type") // specifies for how long the browser can cache the results of a preflight request (in seconds) w.Header().Set("Access-Control-Max-Age", strconv.Itoa(60*60*2)) } ``` Even if we try to run 
our frontend app on different ports, it will pass the CORS step successfully. Use this only if you know what you are doing!

Ok, let's change the `Access-Control-Allow-Origin` back to its original value and jump to the next header.

#### Access-Control-Allow-Methods

Before we start playing with this header, let me share an important gotcha with you: the GET and POST methods are allowed by default, regardless of the settings. It means that `w.Header().Set("Access-Control-Allow-Methods", "GET, POST, DELETE")` has the same effect as `w.Header().Set("Access-Control-Allow-Methods", "DELETE")`

That's why there is no need to delete them and then wonder why the application still works. However, let's try to get rid of the `DELETE` one:

```go
func enableCors(w http.ResponseWriter) {
	// specifies which domains are allowed to access this API
	w.Header().Set("Access-Control-Allow-Origin", "http://localhost:3333")
	// specifies which methods are allowed to access this API
	w.Header().Set("Access-Control-Allow-Methods", "GET, POST")
	// specifies which headers are allowed to access this API
	w.Header().Set("Access-Control-Allow-Headers", "Accept, Content-Type")
	// specifies for how long the browser can cache the results of a preflight request (in seconds)
	w.Header().Set("Access-Control-Max-Age", strconv.Itoa(60*60*2))
}
```

Once we restart both apps, we'll see that there are no issues with getting the books or adding a new one, but trying to delete them fires an error.

![Image](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/nzc5s579et0my0vrce26.png)

If the flow works for you with no issues, it is the browser caching ([the 2nd hardest thing](https://martinfowler.com/bliki/TwoHardThings.html) in computer science) that spoils the fun.
To fix that, try either:

- ticking the `Disable cache` box under the `Network` tab
- opening a new incognito window

As we can see, the error clearly states `Method DELETE is not allowed by Access-Control-Allow-Methods in preflight response` - we knew that already, didn't we?

Let's revert the values and proceed to the next header.

#### Access-Control-Allow-Headers

We don't explicitly use the `Accept` header in our code, so let's keep it there. But we do use the `Content-Type` one when we make a POST request. You know what to do with it =)

```go
func enableCors(w http.ResponseWriter) {
	// specifies which domains are allowed to access this API
	w.Header().Set("Access-Control-Allow-Origin", "http://localhost:3333")
	// specifies which methods are allowed to access this API
	w.Header().Set("Access-Control-Allow-Methods", "GET, POST, DELETE")
	// specifies which headers are allowed to access this API
	w.Header().Set("Access-Control-Allow-Headers", "Accept")
	// specifies for how long the browser can cache the results of a preflight request (in seconds)
	w.Header().Set("Access-Control-Max-Age", strconv.Itoa(60*60*2))
}
```

If we click all the buttons we have, we'll see that getting and deleting books work, but adding a new one fails as expected:

![Image](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/5ixylf0kqy5z3m452b8n.png)

`Request header field content-type is not allowed by Access-Control-Allow-Headers in preflight response.` makes sense. Production applications use way more headers, so review them carefully in order to configure CORS properly.

Time to revert the changes and proceed to the last header in our list.

#### Access-Control-Max-Age

If not set, the default value is `0`, which means that the browser shouldn't cache the preflight request data at all.
Let's comment the header out and see what happens next:

```go
func enableCors(w http.ResponseWriter) {
	// specifies which domains are allowed to access this API
	w.Header().Set("Access-Control-Allow-Origin", "http://localhost:3333")
	// specifies which methods are allowed to access this API
	w.Header().Set("Access-Control-Allow-Methods", "GET, POST, DELETE")
	// specifies which headers are allowed to access this API
	w.Header().Set("Access-Control-Allow-Headers", "Accept, Content-Type")
	// specifies for how long the browser can cache the results of a preflight request (in seconds)
	//w.Header().Set("Access-Control-Max-Age", strconv.Itoa(60*60*2))
}
```

Observe the `Network` tab after restarting the app:

![Image](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/213lswozgyjcrl7b403s.png)

The no-cache policy forces the browser to make a preflight request each time there is a POST or DELETE call (there are no preflight requests for GET calls - it's a rule). That's acceptable while testing, but it leads to undesired load on your servers if no `Access-Control-Max-Age` header is provided, which is why the rule of thumb is to have it. There is no ideal value for it, though; it depends on your situation and requirements.

And that's basically all I wanted to show you today. I'm sure you have a good understanding of CORS now, and can dive deeper on your own to learn even more on that topic.

## Where to go from here

As I mentioned in the disclaimer section, there is [an excellent longread from Mozilla](https://developer.mozilla.org/en-US/docs/Web/HTTP/CORS) on that topic - I know that now you are well-equipped for it.

If you are more into the RFC type of read, [here is the one that covers CORS as well](https://fetch.spec.whatwg.org/#http-cors-protocol) - the link leads exactly to the CORS section, but feel free to read all of it if you have time and inspiration.

Anyway, it's getting late in my time zone, so I need to call it a day and get some sleep.
I hope you learned something new today and got a good grasp of what CORS is and how it works under the hood.

See you in the next posts! Have fun =)

P.S. Receive an email once I publish a new post - [subscribe here](https://mail.n0rdy.foo/subscription/form)

P.P.S. I have created [a Twitter account](https://twitter.com/_n0rdy_) lately, so if you'd like to save my feed from ads, "funny" videos, and posts by Elon Musk, let's follow each other there =)
_n0rdy_
1,716,259
Form Validation with React-Hook-Form and Zod
Introduction Managing forms in React applications can be challenging, especially...
27,691
2024-01-05T12:28:37
https://dev.to/vitorrios1001/validacao-de-formularios-com-react-hook-form-e-zod-a6k
javascript, react, typescript, form
## Introduction

Managing forms in React applications can be challenging, especially when it comes to validation and state management. The combination of React-Hook-Form, TypeScript, and Zod offers an elegant and efficient solution to these challenges. This article explores the benefits of this integration, demonstrating how it simplifies form management compared to the traditional approach using `useState`.

## Traditional Form Management with `useState`

First, let's look at an example of form management with `useState`:

```tsx
import React, { useState } from 'react';

const TraditionalForm = () => {
  const [name, setName] = useState('');
  const [email, setEmail] = useState('');
  const [errors, setErrors] = useState({ name: '', email: '' });

  const validate = () => {
    let isValid = true;
    const errors = { name: '', email: '' };

    if (!name) {
      errors.name = 'Name is required.';
      isValid = false;
    } else if (name.length < 3) {
      errors.name = 'Name must be longer than 3 characters.';
      isValid = false;
    }

    if (!email) {
      errors.email = 'Email is required.';
      isValid = false;
    } else if (!/\S+@\S+\.\S+/.test(email)) {
      errors.email = 'Invalid email.';
      isValid = false;
    }

    setErrors(errors);
    return isValid;
  };

  const handleSubmit = (event) => {
    event.preventDefault();
    if (validate()) {
      // Form submission logic
      console.log('Form submitted:', { name, email });
    }
  };

  return (
    <form onSubmit={handleSubmit}>
      <div>
        <input
          value={name}
          onChange={(e) => setName(e.target.value)}
          placeholder="Name"
        />
        {errors.name && <p style={{ color: 'red' }}>{errors.name}</p>}
      </div>
      <div>
        <input
          value={email}
          onChange={(e) => setEmail(e.target.value)}
          placeholder="Email"
        />
        {errors.email && <p style={{ color: 'red' }}>{errors.email}</p>}
      </div>
      {/* Other fields and submit button */}
      <button type="submit">Submit</button>
    </form>
  );
};

export default TraditionalForm;
```

This approach requires maintaining separate state for each field, plus manual validation logic, which can make the component long and hard to maintain, especially in complex forms.

Before diving into the examples, we need to set up our environment with the required libraries. Here are the steps to configure React-Hook-Form and Zod in your React project.

### Installing the Dependencies

Run the following command to install React-Hook-Form and Zod:

```bash
npm install react-hook-form zod @hookform/resolvers
```

Now, let's introduce React-Hook-Form and Zod, using TypeScript for a type-safe integration.

### Setting Up the Form with React-Hook-Form and Zod

```tsx
import React from 'react';
import { useForm } from 'react-hook-form';
import { z } from 'zod';
import { zodResolver } from '@hookform/resolvers/zod';

const schema = z.object({
  name: z.string().min(3, 'Name is required'),
  email: z.string().email('Invalid email')
});

type FormData = z.infer<typeof schema>;

const ModernForm = () => {
  const { register, handleSubmit, formState: { errors } } = useForm<FormData>({
    resolver: zodResolver(schema)
  });

  const onSubmit = (data: FormData) => {
    console.log(data);
  };

  return (
    <form onSubmit={handleSubmit(onSubmit)}>
      <input {...register('name')} />
      {errors.name && <p>{errors.name.message}</p>}

      <input {...register('email')} />
      {errors.email && <p>{errors.email.message}</p>}

      {/* Other fields and submit button */}
    </form>
  );
};
```

### Benefits of the Modern Approach

- **Less Boilerplate**: React-Hook-Form significantly reduces the code needed to manage form state and events.
- **Built-in Validation**: Zod offers a declarative and powerful way to define validation schemas, simplifying the validation logic.
- **Managed Error Messages**: Error messages are handled in a centralized, automated way.
- **Strong Typing with TypeScript**: The TypeScript integration ensures that form data always matches the defined schema, improving the safety and predictability of the code.

## Comparison and Conclusion

Comparing the two approaches, it is clear that React-Hook-Form, combined with Zod and TypeScript, offers a cleaner, more organized, and more efficient solution for form management in React. The code becomes more readable and easier to maintain, while validation and error handling become simpler and more robust. This stack is particularly useful for large, complex forms, where the traditional approach can quickly become unsustainable.

In short, adopting React-Hook-Form, TypeScript, and Zod in React projects not only improves code quality but also speeds up development, allowing developers to focus more on business logic and less on the mechanics of form management.
vitorrios1001
1,716,364
How To Get Social Media Previews Right on Astro blog with OpenGraph Meta Tags
So, you’ve got this fantastic website, and you’re ready to share it with the world. But wait, have...
0
2024-01-05T14:51:43
https://lirantal.com/blog/getting-social-media-previews-right-with-opengraph-meta-tags/
opengraph, social, astro
---
title: How To Get Social Media Previews Right on Astro blog with OpenGraph Meta Tags
published: true
date: 2024-01-03 00:00:00 UTC
tags: opengraph, social, astro
canonical_url: https://lirantal.com/blog/getting-social-media-previews-right-with-opengraph-meta-tags/
---

So, you’ve got this fantastic website, and you’re ready to share it with the world. But wait, have you considered how it looks when someone shares it on Facebook, Twitter, LinkedIn, or even WhatsApp? This is where OpenGraph comes into play – the unsung hero of social media previews.

## What is OpenGraph?

OpenGraph is not some mysterious concept from a distant galaxy; it’s a set of metadata tags that you can embed in your website’s HTML. These tags provide information to social media platforms about how your content should be presented when shared. Think of them as the backstage passes for your website’s appearance on social media.

## Why is OpenGraph important?

Imagine sharing a link, and instead of a beautiful preview with an eye-catching image, you get a bland, generic snippet. OpenGraph ensures that when your content is shared, it grabs attention with engaging visuals and relevant information. It’s your website’s chance to make a memorable first impression.

## Example of OpenGraph Meta Tags

Here’s a sneak peek at what OpenGraph meta tags look like for Facebook, Twitter, and LinkedIn:

```
<meta property="og:title" content="Your Title Here">
<meta property="og:description" content="Your engaging description here. Keep it concise and intriguing!">
<meta property="og:image" content="https://yourdomain.com/path/to/your/image.jpg">
<meta property="og:url" content="https://yourdomain.com">
```

## Using OpenGraph Meta Tags in Astro

Astro is a fantastic static site generator that I’ve been using for this blog. It’s a great way to build fast, modern websites with a minimal footprint.
Astro also has a handy [SEO component](https://docs.astro.build/core-concepts/seo) that makes it easy to add OpenGraph meta tags to your website. To get started, you’ll need to install the `astro-seo` package: ``` npm install astro-seo ``` Then, you can add the `SEO` component to your Astro project’s `src/components/seo.astro` file. Here’s an example of how I’ve set up the SEO component for my [Node.js Secure Coding](https://www.nodejs-security.com/) blog: ``` --- import { SEO } from 'astro-seo'; import { SITE } from '~/config.mjs'; import defaultImageFile from '~/assets/images/nodejs-secure-coding-website-og-lightmode-v3.png'; const { title = SITE.name, description = '', canonical, noindex = false, nofollow = false, ogTitle = title, ogType = 'website', } = Astro.props; const siteBaseURL = new URL(Astro.url); const defaultImage = new URL(defaultImageFile.src, siteBaseURL); let { image: _image } = Astro.props; _image = _image || defaultImage; let image = null; if (typeof _image === 'string') { image = new URL(_image, siteBaseURL); } else if (_image && typeof _image['href'] !== 'undefined') { image = new URL(_image['href'], siteBaseURL); } else { image = defaultImage; } --- <meta charset="UTF-8" /> <meta name="viewport" content="width=device-width, initial-scale=1.0" /> <SEO title={title} description={description} canonical={canonical} noindex={noindex} nofollow={nofollow} openGraph={{ basic: { url: canonical, title: ogTitle, type: ogType, image: _image?.src ? _image.src : defaultImage.toString(), }, image: { url: image.toString(), secureUrl: image.toString(), alt: ogTitle, height: _image?.height, width: _image?.width, type: _image?.format && `image/${_image.format}`, }, }} twitter={{ creator: '@liran_tal', image: image ? image.toString() : undefined, imageAlt: ogTitle, title: ogTitle, site: '@liran_tal', description: description, card: image ? 
'summary_large_image' : 'summary', }} extend={{ meta: [ { name: 'og:locale', content: 'en_US', }, { name: 'og:description', content: description, }, { name: 'og:site_name', content: SITE.name, } ] }} /> ``` ## Key Properties for Ideal OpenGraph Configuration When setting up your OpenGraph, keep these key properties in mind: - **`og:image` Dimensions:** Optimal size is 1200x630 pixels for Facebook and LinkedIn. Twitter prefers 1200x675 pixels. - **Absolute URLs:** Ensure that your URLs are absolute, not relative. It’s the difference between leading someone to your front door and leaving them in the middle of nowhere. ## Important WhatsApp OpenGraph Configuration WhatsApp has its quirks, so make sure to include `og:site_name` as follows: ``` <meta property="og:site_name" content="Your Site Name"> ``` Another thing to get right with WhatsApp previews when it comes to OpenGraph meta tags is the image size. WhatsApp prefers images under 300kb, so keep those visuals lightweight yet impactful. ## Resources for Developing and Testing OpenGraph Meta Directives Feeling a bit overwhelmed? Don’t worry; I’ve got your back with some OpenGraph generator and testing resources. Here are some tools I’ve used myself for this blog to make the OpenGraph work smoother: - [Pika](https://pika.style/templates/open-graph-generator) - Design and generate visually stunning OpenGraph previews effortlessly. - [Uneed](https://www.og-image-generator.com) - Create captivating OpenGraph images with ease. - [opengraph.xyz](https://www.opengraph.xyz) - Test and preview your OpenGraph configuration before unveiling it to the social media world. In conclusion, OpenGraph is your website’s passport to the social media elite. With the right configuration and a touch of creativity, you can turn every share into a visual masterpiece. So, go ahead, spruce up your OpenGraph, and let your website shine in the social media spotlight!
lirantal
1,716,628
Exploring the Impact of verticalScroll() on fillMaxHeight() Modifier in Android Compose
Encountering a peculiar situation in Android Compose where the Modifier.verticalScroll() seems to...
0
2024-01-13T08:06:42
https://dev.to/atsuko/exploring-the-impact-of-verticalscroll-on-fillmaxheight-modifier-in-android-compose-cgo
android
I encountered a peculiar situation in Android Compose where applying `Modifier.verticalScroll()` to a parent composable causes a child's `Modifier.fillMaxHeight()` to be ignored, and it sparked my curiosity. Let me share my notes on what I discovered while digging into the internal workings.

## What Caught My Eye

Consider this scenario: without `verticalScroll()` on the parent, the child's `fillMaxHeight()` behaves as expected.

```kotlin
@Composable
fun Sample(modifier: Modifier = Modifier) {
    Column(
        modifier = modifier
    ) {
        Text(
            modifier = Modifier.fillMaxHeight(),
            text = "Without verticalScroll",
        )
    }
}
```

![screenshot without verticalScroll()](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/xqqxtqesiw4kit74ni68.png)

However, once you add `verticalScroll()` to the parent, suddenly `fillMaxHeight()` is overlooked.

```kotlin
@Composable
fun Sample(modifier: Modifier = Modifier) {
    Column(
        modifier = modifier.verticalScroll(rememberScrollState())
    ) {
        Text(
            modifier = Modifier.fillMaxHeight(),
            text = "With verticalScroll",
        )
    }
}
```

![screenshot with verticalScroll()](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/b4ywnn3thfu0z8ates23.png)

I'm eager to understand what exactly `verticalScroll()` is doing to cause this shift in behavior.

## Insight from the Official Documentation

[The documentation](https://developer.android.com/reference/kotlin/androidx/compose/foundation/package-summary#(androidx.compose.ui.Modifier).verticalScroll(androidx.compose.foundation.ScrollState,kotlin.Boolean,androidx.compose.foundation.gestures.FlingBehavior,kotlin.Boolean)) sheds some light: "Modify element to allow to scroll vertically when height of the content is bigger than max constraints allow."

This suggests that if the content's size relies on the parent's size, determining whether to scroll or not becomes ambiguous, which would explain the observed behavior of the modifier being ignored.
## Peeking into the Internal Mechanism

Let's take a closer look at what's happening inside `Modifier.verticalScroll()`. Skipping over some details, the crucial point is the use of `ScrollingLayoutModifier`.

```kotlin
fun Modifier.verticalScroll(
    state: ScrollState,
    enabled: Boolean = true,
    flingBehavior: FlingBehavior? = null,
    reverseScrolling: Boolean = false
) = scroll(
    state = state,
    isScrollable = enabled,
    reverseScrolling = reverseScrolling,
    flingBehavior = flingBehavior,
    isVertical = true
)
```

```kotlin
@OptIn(ExperimentalFoundationApi::class)
private fun Modifier.scroll(
    state: ScrollState,
    reverseScrolling: Boolean,
    flingBehavior: FlingBehavior?,
    isScrollable: Boolean,
    isVertical: Boolean
) = composed(
    factory = {
        val layout = ScrollingLayoutModifier(state, reverseScrolling, isVertical)
        // (other locals such as `semantics`, `orientation`, `overscrollEffect`,
        //  and `scrolling` are defined in code elided from this excerpt)
        semantics
            .clipScrollableContainer(orientation)
            .overscroll(overscrollEffect)
            .then(scrolling)
            .then(layout)
    },
```

`ScrollingLayoutModifier` turns out to be the culprit: it overwrites the child's constraints, setting `maxHeight` to `Infinity` regardless of the original value.
```kotlin
private data class ScrollingLayoutModifier(
    val scrollerState: ScrollState,
    val isReversed: Boolean,
    val isVertical: Boolean
) : LayoutModifier {
    override fun MeasureScope.measure(
        measurable: Measurable,
        constraints: Constraints
    ): MeasureResult {
        checkScrollableContainerConstraints(
            constraints,
            if (isVertical) Orientation.Vertical else Orientation.Horizontal
        )
        val childConstraints = constraints.copy(
            maxHeight = if (isVertical) Constraints.Infinity else constraints.maxHeight,
            maxWidth = if (isVertical) constraints.maxWidth else Constraints.Infinity
        )
        val placeable = measurable.measure(childConstraints)
        val width = placeable.width.coerceAtMost(constraints.maxWidth)
        val height = placeable.height.coerceAtMost(constraints.maxHeight)
        return layout(width, height) {
            // `side` (computed in code elided from this excerpt) is the maximum scroll extent
            val scroll = scrollerState.value.coerceIn(0, side)
            val absScroll = if (isReversed) scroll - side else -scroll
            val xOffset = if (isVertical) 0 else absScroll
            val yOffset = if (isVertical) absScroll else 0
            placeable.placeRelativeWithLayer(xOffset, yOffset)
        }
    }
```

With the child's `maxHeight` set to `Constraints.Infinity`, the child's height becomes unbounded and is no longer influenced by the parent. As a result, it simply adopts the intrinsic height of its content (here, the `Text`).

For more insights into Compose's Layout and Constraints, you might find [this reference](https://developer.android.com/jetpack/compose/layouts/constraints-modifiers) useful.
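To see why an infinite `maxHeight` makes `fillMaxHeight()` a no-op, here is a tiny model of the measurement step, written in Python purely as my own simplification (it is not Compose code): "fill max height" only means something when the incoming max height is bounded; with `Infinity` the child falls back to its content size.

```python
import math

def measure_child(content_height, parent_max_height, fill_max_height):
    """Simplified model of how a child resolves its height.

    fillMaxHeight() effectively means "use the incoming maxHeight,
    but only if it is bounded"; otherwise wrap the content.
    """
    if fill_max_height and math.isfinite(parent_max_height):
        return parent_max_height
    return content_height  # unbounded max height: fall back to content size

# Without verticalScroll(): the parent passes its real max height down.
print(measure_child(content_height=50, parent_max_height=800, fill_max_height=True))  # 800

# With verticalScroll(): ScrollingLayoutModifier rewrites maxHeight to Infinity,
# so fillMaxHeight() has no bounded height to fill.
print(measure_child(content_height=50, parent_max_height=math.inf, fill_max_height=True))  # 50
```

Under this model, the screenshots above fall out directly: the scrollable parent always hands its child an unbounded height, so the `Text` wraps its content no matter what the child requests.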
atsuko
1,716,645
BUY GOOGLE REVIEWS
Buy Google Reviews Why Do You Need to Buy Google Business Reviews? Are you want to buy Google...
0
2024-01-04T06:09:24
https://dev.to/sathikhatun32002/buy-google-reviews-4d92
[Buy Google Reviews](https://mangocityit.com/service/buy-google-reviews/ ) Why Do You Need to Buy Google Business Reviews? Are you want to buy Google Reviews? The present business community is more inclined to use online services to promote their business firms. And the review scheme provided by these online services operates as a system that has the power to create a massive influence on various business firms. Google is the biggest search engine, and one of the most effective advertising websites and [Google Reviews](https://mangocityit.com/service/buy-google-reviews/ ) are an important of the virtual business world that helps consumers meet their right needs and the business firms to achieve a good and a wide customer base. Hence buying Google Reviews has the power to create a positive influence on any business firm that seeks the online platform to promote their firm and the products. What are Google Reviews? Google allows users to write reviews directly on the business’s Google or Google map listing. These third-party search engine results will present a series of stars underneath the main headings. These are a result of the ratings by the previous users. It is basically a simple way to present the overall star rating from a range of sources and gives the user an idea of your company’s credibility before they even visit the website of your business firm. Why is it important to Buy Google Business Reviews? Buying Google Reviews are important because these comments and ratings from different individuals in your review area on your business page can compel the future clients to choose your service over the other thousands of business firms that offer the same service as your company does. Since these reviews will be visible for the users and the ratings will be visible at the top of the heading of your company it is important to have a strong system of positive [Google Reviews](https://mangocityit.com/service/buy-google-reviews/ ). 
The mere existence of a page will not provide the necessary power to compete with the other competitive business firms. Because[ Google Reviews](https://mangocityit.com/service/buy-google-reviews/ ) builds up the credibility that your company needs in order to create a wide customer base. Positive reviews on Google Plus can help attract new clients to your business firm. And Google places reviews helps to get excellent rankings for local businesses. Hence buy Google Reviews will be a significant step taken towards the success within the business community. Why should you Buy Google Business Reviews through MangoCityit? How do you get positive reviews from your customers? Verify your business information, because only verified local Google Plus pages can respond to Google Reviews. Respond to the reviews from your customers actively. Respond to the negative reviews in a neutral manner and promise the customers that your company will overcome any negativity that the customers witnessed in the future services you provide. When you buy Google Reviews via MangoCityit all the aforementioned facts will be taken into consideration in order to generate Google Reviews with maximum quality. But one of the challenges that users face is the recognition of purchased Google Reviews. The features that MangoCityit provides to its customers include the following: Complete Customer Satisfaction Best services Affordable Prices Reviews of High Standard Constant customer support Also has a team of highly experienced professionals who are willing to help its users in all the problems they encounter when buying Google Reviews. Any user who wishes to increase the number of quality Google Reviews can buy high-quality Google reviews through . All you have to do is use the packages provided that offers you the lowest prices for its quality services. With Mangocityit you will receive the most genuine reviews, and your success in the business world will be a reality.. 
Why Importance To Buy Google Reviews? While Google reviews are vastly important for a business to grow big, the biggest problem is that customers don’t care. If you have 100 customers to your store or website, very few of them will review your Google business listing. On the other hand, almost all the new customers are looking for good reviews to become your sales leads. As you can see, google reviews are important, but you’re not getting them naturally. That’s why we get you the reviews in the most natural way and give your potential customers a reason to covert. Why Our Service Is the Best for Buy Google Reviews? There are plenty of service providers online who do the same job; so, why choose us? How do we differ from others? Well, quality talks for us and we’re here to get your business a push that nobody could. Here is how we do it: Permanent Reviews We ensure each of our Google reviews are permanent on where you want them. You can buy Google business reviews and get recommended from there for a lifetime; no issues will be there. Or, you can buy Google Maps Reviews, stay on top of your competitors and direct your customers to your place. Expert Team Our expert team has been in the business for a long while now and you get the highest quality from them. They know exactly how to make the reviews more reliable and trustworthy to your potential customers. Your customers will think they’re going for a business where others go with satisfaction. Country Variation If you own an online business with global reach, google reviews from different countries get you more traffic juice. We’re here to get you exactly what you need; you can talk to us and get reviews from different countries. Our team does reviews in the USA, UK, CA, AU, NJ, and so many other countries to ensure maximum variation and safety. Custom Name And Gender Not every business works with the same gender; we understand that and give you the freedom to choose what you need. 
You can talk to us about the gender variation and which names customers you love to get reviewed from. We will provide you the reviews with the right variation of male and female google users for your business. Secure Payment Options Security is top priority for us and we ensure the maximum of it; both for you and ourselves. When you buy google reviews from us, you’ll have the option to pay us securely without zapratizing your financial information. We accept some of the most secure payment methods like paypal, Mastercard, Visa, skrill, even bitcoin. Lifetime Refill As mentioned earlier, we ensure the google reviews are there for a lifetime where you want it. However, if any of the reviews get vanished or removed, we will be here to assist you on that. We’ll provide you a brand-new review as a placeholder to that removed review to ensure the gap is refilled. Can I Get Banned for Buying Google Reviews? Let’s put, Google doesn’t want you to get reviews that aren’t from your real customers and that’s understandable. However, with our Google review service, you don’t have to worry about that because we know how it works. Most of the time, businesses get banned because of robotic reviews that don’t seem authentic. But our expert team has mastered the process of producing real-life reviews that don’t make Google skeptic. SEO Importance of Google Reviews When it comes to operating SEO campaigns, especially for local businesses, Google reviews have the most importance. Here are the things you’ll get when you buy positive Google reviews for your local business: Better Local exposure Google reviews get you a better local visibility online for the audience of your local business. You can drive the traffic to your local business doorsteps using local SEO which requires Google reviews. You can buy Google Places Reviews and My business reviews and get a higher traffic flow with them. 
Improved interaction Our Google reviews will improve your customer interaction for sure because we include conversational reviews. New and potential customers will talk about it online which eventually gets you a positive result on the traffic. More interactions with your customers means more conversion and a better revenue stream. Increased Business Relevancy When you buy google reviews from us, we take the time to understand what’s the best relevant to your business. We design the reviews according to the business and make sure the review is relevant to what you sell. Therefore, your local searches will find the reviews more relatable to their needs and situations. Improved CTR Our target is getting you an improved click through ratio so that you can succeed with your business a lot easier. The best part about the Google reviews we get you is the user-centric design and write-ups. Our reviewers are experienced and know what you need, what your business needs for a better CRT. OUR SERVICE FEATURES How We Offer Google Reviews? Our main goal is to ensure the highest quality for our customers who’re trying to climb higher than their competitors. Here is how we offer our Google reviews and ensure that they have the highest quality: Home-7-Search-Engine-Optimization-Image 100% Original Reviews The thing that sets us apart is not only we have the best team in the business, but also separate teams for each job. Each team performs one specific job and that’s the reviews on the site they handle. A team works for customer service around the clock so that you don’t go unattended. Home-7-Real-Time-Analytics-Image 24/7 Customer Services All our reviews are 100% safe and secure since we use only handmade, researched, and drip feed reviews. Our reviews will surely elevate your traffic and get some sustainable spikes on your sales leads. If you’re here to thrive, we’re always there to have your back. 
Home-7-Content-Marketing-Image Quick Delivery Although we never compromise on the quality of the service, we don’t make it as expensive that people have to bounce. Rather, we’ve kept our profit margins minimal and made a competitive price range while our service is actually dominating. Home-7-Link-Building-Service-Image Drip Feed Review Our teams are ready to take over your job right after you place the order on our site. As soon as you order the reviews, our team gets its hands on it. They do a thorough research to come up with the most optimal reviews and let you know. How to Buy Google Reviews? If you have a matching service for your facebook page that meets your needs perfectly, go for direct order. If you need a custom order that we don’t have described on the site, we have options for you too. Here are two ways you can follow to buy facebook reviews from us and get started: Order Directly First, select the right package and click on the “Order now” button to go to the cart for review. Recheck the service descriptions, you’ll see the total amount you have to pay in the bottom right corner. Click on the “Proceed to checkout” button, then fill out the billing details, select your favorite payment method and pay to complete the order process. Contact Live Chat If our described services don’t match your needs exactly, you can also contact us for a custom service. Please contact us through the live chat button on our website and describe what services you need. Our service team will promptly answer you and get on the matter in no time. You can discuss payment methods, service essentials, or anything that can make it better for you. Questions You Want To Know Here are the most frequently asked questions that our customers ask us about buying Google reviews from us: Is it Safe to Buy Our Google Reviews? Can I Buy Targeted Reviews? Can I Buy Negative Reviews? Will the Reviews Get Banned? Will You Require My Logins? How Much Time Usually it Takes? 
Can I Get a Discount in Bulk? Will the Reviews Drop? Will the Reviews Be Posted from a Single Account? What Are The Payment Methods? 24 Hours Reply/Contact Email: mangocityit@gmail.com Skype: live:mangocityit Related products SALE!Buy 5 Star Reviews BUY 5 STAR REVIEWS $7.00 $5.00 Add to cartSALE!Buy Negative Google Reviews BUY NEGATIVE GOOGLE REVIEWS Rated 5.00 out of 5 $5.00 – $450.00 Select optionsSALE!BUY GOOGLE 5 STAR REVIEWS BUY GOOGLE 5 STAR REVIEWS $5.00 – $500.00 Select optionsSALE!Buy Remove Negative Reviews From Google BUY REMOVE NEGATIVE REVIEWS FROM GOOGLE Rated 5.00 out of 5 $30.00 – $300.00 Select options WHY CHOOSE US? Welcome to MangoCityit Website. Mangocityit is one of the Best Quality, Reliable Social Media Marketing and SEO Services Provider. We give 100% money back guarantee. Our only demand is to gain customer satisfaction through good and reliable services. If there arise any problem, you could continually contact us and we would be happy to aid you out. Here you can get all kinds of SMM and SEO service at the cheapest price. You can also visit Our Product to know about SMM & SEO Services. Express Delivery And Drip Feed We have maintained an excellent reputation over the past few years by delivering Facebook Reviews, Google Reviews, TrustPilot Reviews, Yelp Reviews & all others SMM & SEO service through high quality profiles. For this reason, there is no scope to cause a negative impact on your business. All profiles have profile pictures, posts, bio & others information. They look realistic and probably nobody will identify that you have bought Accounts & reviews. 100% SAFE AND SECURE Safe and Secure is the most important thing to Us. We want to enjoy our relationship with customer through quality working. If you have any issues with our service, you can contact us without any type of hesitation. We will provide the best possible solution to you. Our team is available nearly 24/7, so you can expect a reply within just a few minutes/hours. 
Custom orders or special requests are also welcome. We’re looking forward to hearing from you! 24/7 SUPPORT AND HELP You can get quick working experience through our website. In other words, when you buy TrustPilot Reviews or any other service from us, we will start that task within hour after the payment. The best thing about our services is that we are 24 hours available to you for whole week. Email- mangocityit@gmail.com & Skype: Mangocityit , Telegram: Mangocityit ,Just place the order and the task will be completed within few Time. ALL RIGHTS RESERVED BY MANGOCITYIT ↓ Contact Us Contact Form Name Name* Phone Phone* Email Email* Message Message*
sathikhatun32002
1,716,674
Revolutionizing Software Dynamics: A Technical Dive into the Marvel of Browser Extensions
In the intricate world of software development, the emergence of browser extensions has sparked a...
0
2024-01-04T06:34:18
https://dev.to/coditude_pvt_ltd/revolutionizing-software-dynamics-a-technical-dive-into-the-marvel-of-browser-extensions-h60
In the intricate world of software development, the emergence of browser extensions has sparked a revolution, offering a dynamic and technically robust platform for customizing existing software. This post explores the technical facets that make browser extensions an exceptional tool for transforming digital landscapes.

**Web Technologies Empowerment:** Browser extensions leverage familiar web technologies such as HTML, CSS, and JavaScript. This lets developers apply their existing web development skills to create feature-rich extensions that integrate seamlessly with existing software.

**Extension APIs for Unprecedented Control:** Modern browsers provide Extension APIs (Application Programming Interfaces) that give developers fine-grained control over browser functionality. These APIs are the gateway for interacting with browser internals, customizing user interfaces, and manipulating web content.

**Event-Driven Architecture:** Browser extensions typically follow an event-driven architecture. Developers register callbacks for specific events such as page loading, user interactions, or button clicks, allowing precise and responsive customization tailored to user actions.

**Content Scripts for Web Page Interaction:** Content scripts are a powerful feature of browser extensions that enables interaction with the content of web pages. Through content scripts, developers can inject custom scripts into pages, modifying or extracting information dynamically to enhance user interactions.

**Storage and Synchronization:** Browser extensions offer local storage mechanisms for saving user preferences and settings. Synchronization capabilities let users carry their customized experience across multiple devices, keeping their software environment consistent.

**Security Measures:** Security is paramount in the realm of browser extensions. Developers adhere to strict security practices to prevent malicious activity: permissions are granted explicitly, and extension updates undergo rigorous review processes to maintain a secure browsing experience.

**Cross-Browser Compatibility:** A standout feature of browser extensions is their cross-browser compatibility. Developers can create extensions that work across various browsers, expanding the reach and impact of their customized software solutions.

**Versioning and Updates:** Browser extensions are designed with versioning capabilities, allowing developers to release updates seamlessly. Users benefit from bug fixes, feature enhancements, and security patches, keeping their customized software cutting-edge and reliable.

In conclusion, the technical prowess of browser extensions lies in their use of familiar web technologies, robust APIs, and innovative features like content scripts. As developers navigate this dynamic landscape, they unlock new possibilities for software customization, providing users with a technically sophisticated and personalized digital experience. The world of browser extensions continues to evolve, offering a glimpse into the future of customizable software solutions.
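The event-driven architecture described above boils down to a simple pattern: handlers register for named events, and the host fires them when something happens. The following is a deliberately language-agnostic sketch in Python (not a real browser API; the event names and `EventBus` class are my own illustration):

```python
class EventBus:
    """Minimal model of an event-driven extension host:
    callbacks register for named events and run when the event fires."""

    def __init__(self):
        self.handlers = {}

    def add_listener(self, event, callback):
        """Register a callback for a named event."""
        self.handlers.setdefault(event, []).append(callback)

    def emit(self, event, payload=None):
        """Fire an event, invoking every registered callback in order."""
        for callback in self.handlers.get(event, []):
            callback(payload)

bus = EventBus()
log = []

# An "extension" registers callbacks for browser-like events.
bus.add_listener("page_loaded", lambda url: log.append(f"inject content script into {url}"))
bus.add_listener("button_clicked", lambda _: log.append("open extension popup"))

# The "browser" fires the events; the extension reacts.
bus.emit("page_loaded", "https://example.com")
bus.emit("button_clicked")
print(log)
```

Real extension APIs follow the same shape: you attach listeners to events exposed by the browser rather than polling for changes, which is what makes extensions responsive without wasting resources.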
coditude_pvt_ltd
1,716,679
Buy Verified Transferwise Account
If You Want To More Information just Contact Now: Email: localsmmshop@gmail.com Skype:...
0
2024-01-04T06:41:43
https://dev.to/buybinance210/buy-verified-transferwise-account-13fm
If You Want To More Information just Contact Now: Email: localsmmshop@gmail.com Skype: LocalSMMShop Telegram: @localsmmshop WhatsApp: +1 (801) 410-0772 Buy Verified TransferWise Account 100% USA, UK, CA Any Country Verified TransferWise Account .If you are looking for Buy verified TransferWise account to use for personal and business purposes then localsmmshop.com is the best place to be. We always strive to achieve customer satisfaction by providing best service with reliability. Our Wise Account Features- ➤100% Customers Satisfaction Guaranteed ➤ Phone Verified ➤ Bank Verified ➤ Date of Birth Provided ➤ Driving License Scan Copy ➤ Photo ID Provided ➤Verified with a valid USA phone number. ➤ Bank verification is done with reputed and dependable USA banks. ➤ USA, UK, CAN, AUS, KHM, COL, DEU other countries TransferWise Looking to buy verified TransferWise account? Contact trusted sellers for a secure and hassle-free transaction. With a verified TransferWise account, you can easily send and receive money worldwide, making your international transactions swift and efficient. Save time and effort by purchasing a verified TransferWise account today. Why Buy Verified TransferWise Account? Buy verified TransferWise account to enjoy enhanced security for your online payments. With a verified account, you can have peace of mind knowing that your financial transactions are protected from potential fraud and unauthorized access. Using a verified TransferWise account provides added layers of security measures, such as two-factor authentication and encryption protocols. This ensures that your personal and financial information is safeguarded throughout the payment process. Additionally, a verified account allows for seamless and hassle-free international money transfers, enabling you to send and receive money from around the world with ease. 
By opting for a verified TransferWise account, you can trust that your online payments are secure and your financial transactions are conducted safely. Enjoy the benefits of enhanced security by buying verified TransferWise account today. Buy Verified TransferWise Account Understanding Transferwise And Its Features TransferWise is a popular online payment platform that offers various features to its users. It allows individuals and businesses to send and receive money globally at low fees. With a TransferWise account, you can enjoy the convenience of quick and secure transactions. One of its key features is the ability to convert money at mid-market exchange rates, which means you get a fair exchange rate with no hidden fees. Additionally, TransferWise provides multi-currency wallets, making it easier to manage funds in different currencies. It also offers a borderless account, allowing you to hold and send money in multiple currencies, making it perfect for international businesses and freelancers. By buying a verified TransferWise account, you gain access to all these features and can start transacting seamlessly across borders. How To Buy Verified TransferWise Account For Buying TransferWise you need to follow some rules. First you need to find a best provider, who provide fully verified account. LocalSMMShop Provide Fully verified wise account. For Buy Verified TransferWise Account Follow this step: Select Package: Select which package do you need, and which one reliable for you. Click Cart: After selecting Package click add to cart. Checkout and Payment: Then go to checkout page and complete payment. Deliver: LocalSMMShop will be deliver account in 24 Hours. Ensuring Security Of Your Online Payments In an increasingly digital world, ensuring the security of your online payments is crucial. One way to safeguard your transactions is by buying verified TransferWise account. But that’s not the only precaution you can take. 
Best practices for securing your TransferWise account include enabling two-factor authentication, which adds an extra layer of security. Additionally, it’s important to protect your account from phishing and scams by being vigilant and cautious. By following these steps and taking proactive measures, you can enhance the security of your TransferWise account and have peace of mind when making online payments. Maximizing The Benefits Of A Verified TransferWise Account Maximizing the benefits of a verified TransferWise account involves leveraging their low transaction fees. With TransferWise, cross-border transactions are streamlined, making the process efficient and cost-effective. The platform’s secure online payment system is made possible through the importance of having a verified account. Buy Verified TransferWise Account Frequently Asked Questions For Buy Verified TransferWise Account What Are The Benefits Of Buying Verified TransferWise Account? Buying verified TransferWise account offers convenience and reliability. It allows you to securely transfer money internationally, with no hidden fees, and get access to a fast and transparent payment system. How Do I Know If A TransferWise Account Is Verified? Buy verified TransferWise account will have a blue checkmark badge next to the user’s name, indicating that the account has undergone a verification process. This verification adds an extra layer of security and trust, giving you peace of mind when using the account. Is It Legal To Buy Verified TransferWise Account? Buying a verified TransferWise account is legal and does not violate any laws or terms of service. It is a common practice for individuals who need a reliable and verified account for their international money transfers. Can I Use A Verified TransferWise Account For Personal And Business Transactions? Yes, a verified TransferWise account can be used for both personal and business transactions. 
It offers a range of services for individuals and businesses, including sending and receiving money, managing currencies, and making international payments. Conclusion Investing in a verified TransferWise account is a smart move for anyone looking to streamline their financial transactions and make international transfers with ease. With its high level of security and reliability, a verified TransferWise account provides peace of mind, ensuring that your funds are protected at all times. By bypassing traditional banking systems and their associated fees, you can save money and time, allowing you to focus on what matters most – growing your business or simply enjoying life. The convenience and flexibility offered by TransferWise make it an ideal choice for individuals and businesses alike, enabling seamless cross-border transactions without the hassle of lengthy processing times. So why wait? Take advantage of all the benefits that a verified TransferWise account has to offer and elevate your financial experience to new heights. Experience the future of global banking today.
buybinance210
1,716,691
What is a UML Use Case Diagram?
A UML Use Case Diagram is a powerful visual representation that helps depict the functional aspects...
0
2024-01-04T06:59:22
https://dev.to/manojsharmajtp2/what-is-a-uml-use-case-diagram-e0b
javascript, webdev, tutorial, machinelearning
A [UML Use Case Diagram](https://www.javatpoint.com/uml-use-case-diagram) is a powerful visual representation that depicts the functional aspects of a system from a user's perspective. It's like a roadmap that shows how users or external systems interact with a particular software application.

**Key Components of a Use Case Diagram:**

- **Actors:** The participants in the system, typically users or external entities. Actors are crucial because they initiate and participate in use cases.
- **Use Cases:** The specific functionalities or actions that the system provides to its users. Each oval in the diagram represents one use case, making the different features easy to understand at a glance.
- **Associations:** The lines connecting actors to use cases illustrate the interactions between them and help identify which actors are involved in which use cases.
- **System Boundary:** A box enclosing all the use cases, clearly defining the scope of the system under consideration.

**Why Use UML Use Case Diagrams?**

![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/leeooxf9dyxcznjvp5fj.png)

- **Clarity in Communication:** Use case diagrams serve as a common language between developers, designers, and stakeholders. They provide a clear, visual way to communicate the functionalities of a system.
- **User-Centric Design:** By focusing on how users interact with the system, use case diagrams help teams design software from a user's perspective. This user-centric approach enhances the overall user experience.
- **Requirements Analysis:** These diagrams aid in identifying and analyzing system requirements. By visually mapping out use cases, development teams can ensure that all necessary functionality is considered.

**Creating an Example Use Case Diagram:**

Imagine an online shopping system. The customer is the primary actor, and key use cases include browsing products, adding items to the cart, and the checkout process. Associations between the customer and these use cases illustrate the user's journey through the system.
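The online shopping example can also be captured as a small data model before drawing anything. The sketch below is my own illustration in Python (not the output of any UML tool): a system boundary, a set of actors and use cases, and the associations between them.

```python
from dataclasses import dataclass, field

@dataclass
class UseCaseModel:
    system: str                                     # the system boundary
    actors: set = field(default_factory=set)
    use_cases: set = field(default_factory=set)
    associations: set = field(default_factory=set)  # (actor, use case) pairs

    def associate(self, actor, use_case):
        """Record that an actor participates in a use case."""
        self.actors.add(actor)
        self.use_cases.add(use_case)
        self.associations.add((actor, use_case))

shop = UseCaseModel(system="Online Shopping System")
for uc in ["Browse products", "Add item to cart", "Checkout"]:
    shop.associate("Customer", uc)

print(sorted(shop.use_cases))
```

Enumerating the model first makes the diagram itself almost mechanical: each actor becomes a stick figure, each use case an oval inside the system box, and each association a connecting line.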
manojsharmajtp2
1,716,794
Decoding Bitcoin Transactions: A Step-by-Step Process
Understanding the intricacies of Bitcoin transactions is crucial in the ever-evolving world of...
0
2024-01-04T08:30:35
https://dev.to/rkatejo/decoding-bitcoin-transactions-a-step-by-step-process-2308
beginners, bitcoin
Understanding the intricacies of Bitcoin transactions is crucial in the ever-evolving world of cryptocurrency. This article walks through the step-by-step process of decoding Bitcoin transactions, shedding light on the underlying mechanisms that make digital currency transactions secure and transparent.

![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/qj37dnuw9dl1y989945x.jpg)

## Basics of Bitcoin Transactions

Bitcoin transactions involve several key components, including transaction inputs and outputs. The roles of private and public keys in securing these transactions form the foundation of the entire process.

## Step-by-Step Guide to Decoding Bitcoin Transactions

To decode a Bitcoin transaction, start by obtaining the transaction ID, then [explore the details](https://maple-investments.ca) on the blockchain. Understanding transaction scripts is vital to comprehending the flow of funds between addresses. Confirmations play a crucial role in validating Bitcoin transactions, ensuring the security and reliability of the digital currency network.

## Common Challenges in Decoding Bitcoin Transactions

Despite the robustness of the Bitcoin network, challenges such as transaction malleability, double-spending risks, and unconfirmed transactions persist, requiring ongoing attention.

## Tools for Decoding

Blockchain explorers, wallet software, and open-source decoding tools empower users to decode Bitcoin transactions effectively, providing transparency and traceability.

## Real-world Applications

Businesses leverage decoded Bitcoin transactions to enhance transparency in financial transactions, fostering trust and accountability in the digital economy. Protecting private keys and adopting best practices for secure transactions are essential steps in ensuring the integrity of Bitcoin transactions.
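Two concrete pieces of the decoding process can be shown in a few lines of Python: a transaction ID is the double SHA-256 of the raw transaction bytes, displayed in reverse byte order, and every serialized transaction begins with a 4-byte little-endian version field. The raw bytes below are a synthetic stand-in, not a valid Bitcoin transaction:

```python
import hashlib

def txid(raw_tx: bytes) -> str:
    """Transaction ID: double SHA-256 of the raw bytes, hex-encoded byte-reversed."""
    digest = hashlib.sha256(hashlib.sha256(raw_tx).digest()).digest()
    return digest[::-1].hex()

def version(raw_tx: bytes) -> int:
    """First 4 bytes of a serialized transaction: the little-endian version number."""
    return int.from_bytes(raw_tx[:4], "little")

# Synthetic stand-in for a raw transaction (NOT a real, valid transaction):
raw = bytes.fromhex("01000000") + b"\x00" * 16

print(version(raw))    # 1
print(len(txid(raw)))  # 64 hex characters
```

Running `txid` on the hex of a real transaction (as fetched from a node or a block explorer) yields exactly the ID those explorers display, which is a quick way to verify that the bytes you are decoding are the bytes the network saw.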
## Future Trends in Bitcoin Transactions As technology advances and regulatory frameworks evolve, the future of Bitcoin transactions holds promising developments, shaping the landscape of digital finance. ## Conclusion In conclusion, decoding Bitcoin transactions is a valuable skill that contributes to the broader understanding of the cryptocurrency ecosystem. Encouraging users to explore and comprehend these transactions fosters a more informed and secure digital financial environment. ## FAQs **Why is decoding Bitcoin transactions important?** Decoding Bitcoin transactions enhances transparency and security in digital financial transactions. **What are the common challenges in Bitcoin transaction decoding?** Challenges include transaction malleability, double-spending risks, and unconfirmed transactions. **How do businesses benefit from decoding Bitcoin transactions?** Decoded transactions provide businesses with transparency, fostering trust and accountability. **What security measures should one take in Bitcoin transactions?** Protecting private keys and adopting best practices ensure the security of Bitcoin transactions. **What are the future trends in Bitcoin transactions?** Advancements in technology and evolving regulatory landscapes shape the future of Bitcoin transactions.
rkatejo
1,716,802
Choosing Between Python and TypeScript: A Guide for Developers
Introduction In the ever-expanding world of programming languages, developers are often...
0
2024-01-04T08:42:51
https://dev.to/pranta/choosing-between-python-and-typescript-a-guide-for-developers-4e31
python, typescript, developersguide, programming
### Introduction In the ever-expanding world of programming languages, developers are often faced with the dilemma of choosing the right tool for the job. Python and TypeScript are two popular languages, each with its own strengths and use cases. In this blog post, we'll explore when it's appropriate to use Python and when TypeScript might be the better choice. ### Python: The Swiss Army Knife of Programming Languages Python has gained immense popularity due to its readability, simplicity, and versatility. Here are some scenarios where Python shines: 1. **Web Development:** - Python, coupled with frameworks like Django or Flask, is an excellent choice for building robust and scalable web applications. Its clean syntax and extensive libraries make development efficient. 2. **Data Science and Machine Learning:** - Python is the go-to language for data scientists and machine learning engineers. Libraries such as NumPy, Pandas, and TensorFlow offer powerful tools for data manipulation, analysis, and machine learning. 3. **Automation and Scripting:** - Python's ease of use and cross-platform compatibility make it ideal for automation and scripting tasks. It is widely used for writing scripts to automate repetitive processes. 4. **Prototyping:** - Python's quick development cycle and dynamic typing make it an excellent choice for prototyping. Developers can rapidly test ideas and concepts before committing to a more rigid implementation. ### TypeScript: Bringing Type Safety to JavaScript TypeScript is a superset of JavaScript that adds static typing to the language. It's particularly beneficial in the following scenarios: 1. **Large-Scale Applications:** - TypeScript shines in large-scale applications where the benefits of static typing become more apparent. The compiler catches type-related errors early in the development process, leading to more maintainable code. 2. 
**Collaborative Development:** - In projects with multiple developers, TypeScript helps enhance collaboration by providing a clear contract for function and variable inputs and outputs. This reduces the chances of miscommunication and integration issues. 3. **Front-End Development:** - For building modern and complex front-end applications, TypeScript integrates seamlessly with popular frameworks like Angular. It offers a structured approach to building scalable and maintainable codebases. 4. **Codebase Refactoring:** - TypeScript is advantageous when refactoring or maintaining existing codebases. The addition of static types allows developers to make changes with more confidence, knowing that the compiler will catch any potential issues. ### Conclusion: Ultimately, the choice between Python and TypeScript depends on the nature of the project, the team's expertise, and the specific requirements. Python's simplicity and versatility make it an excellent choice for a wide range of applications, while TypeScript's static typing adds a layer of safety and structure that is valuable in large-scale projects and collaborative environments. As a developer, being familiar with the strengths of each language will empower you to make informed decisions based on the unique needs of your project.
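As a footnote to the comparison above: Python's optional type hints (checked by external tools such as mypy) can recover some of the safety TypeScript adds, without giving up Python's quick development cycle. A minimal sketch, with a made-up example function:

```python
def total_price(prices: list[float], tax_rate: float = 0.2) -> float:
    """Sum the prices and apply a flat tax rate.

    A static checker such as mypy flags a call like total_price("10")
    before runtime, much as the TypeScript compiler would.
    """
    return sum(prices) * (1 + tax_rate)

print(total_price([10.0, 20.0], tax_rate=0.1))  # approximately 33.0
```

The key difference remains that Python's hints are opt-in and ignored at runtime, while TypeScript's types are enforced at compile time across the whole codebase.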
pranta
1,716,807
Simplified: Smart Cover Letter Generator for Business Systems Analysts
Simplified's smart cover letter generator helps you create a compelling cover letter with...
0
2024-01-04T08:54:20
https://dev.to/business-systems-analyst/simplified-penjana-penutup-surat-pintar-untuk-penganalisis-sistem-perniagaan-27f
Simplified's smart cover letter generator helps you create a compelling cover letter with the help of AI. Get the advantage of AI technology in writing an effective and engaging cover letter. Introducing the AI cover letter generator for **[business systems analysts](https://simplified.com/ms-ai-cover-letter-generator/business-systems-analyst)**. Get AI assistance to produce a professional cover letter and improve your chances in job applications. Simplified introduces the AI cover letter generator for business systems analysts. Get a quality cover letter that catches employers' interest and improves your chances in job applications. **[https://simplified.com/ms-ai-cover-letter-generator/business-systems-analyst](https://simplified.com/ms-ai-cover-letter-generator/business-systems-analyst)**
business-systems-analyst
1,716,832
Noise reduction
I have the distro https://neon.kde.org/ installed (iso: neon-user-20231221-0716.iso), and one new thing that...
25,943
2024-01-04T13:43:48
https://dev.to/elferrer/noise-reduction-5gef
bash, pipewire, tutorial
I have the distro https://neon.kde.org/ installed (iso: neon-user-20231221-0716.iso), and one change I found at the end of 2023, after a few months without touching my computer, was the switch from *pulseaudio* to *pipewire*. This means the sound configuration I had in place to remove the background noise from my microphone no longer works. So I got down to work to set up noise removal again, this time for *pipewire*. I have to say I have loved the switch to *pipewire*. --- ## Documentation Everything described here is based on the official documentation. You can find the *pipewire* documentation at [Pipewire](https://github.com/mikeroyal/PipeWire-Guide) We will also enable noise reduction with *easyeffects*; its documentation is at [Easyeffects](https://github.com/wwmm/easyeffects/wiki/Community-presets) --- ## Index<a name="index"></a> - [Check the *pipewire* installation](#chapter-1) - [Install libraries](#chapter-1a) - [Replace *pipewire-media-session* with *wireplumber*](#chapter-1b) - [Configure *pipewire* for *alsa* clients](#chapter-1c) - [Disable *pulseaudio*](#chapter-1d) - [Check and activate](#chapter-1e) - [Install *Easyeffects*](#chapter-2) - [Enable at startup (*autostart*)](#chapter-3) --- [-^-](#index) ## Check the *pipewire* installation<a name="chapter-1"></a> > All the code you see here is meant to be run from a *bash* script or directly in a *bash* console. Remember that if you create a script file you must put `#!/bin/bash` on its first line so it runs under that *shell*. 
[-^-](#index) ### Install libraries<a name="chapter-1a"></a> The first step is to check the installation of the client libraries: ``` sudo apt install pipewire-audio-client-libraries libspa-0.2-bluetooth libspa-0.2-jack ``` [-^-](#index) ### Replace *pipewire-media-session* with *wireplumber*<a name="chapter-1b"></a> Next we need to install *wireplumber* and remove *pipewire-media-session*, since the latter has been discontinued and replaced by the former: ``` sudo apt purge pipewire-media-session sudo apt install wireplumber ``` [-^-](#index) ### Configure *pipewire* for *alsa* clients<a name="chapter-1c"></a> Now we copy the configuration file for *alsa*: ``` sudo cp /usr/share/doc/pipewire/examples/alsa.conf.d/99-pipewire-default.conf /etc/alsa/conf.d/ ``` [-^-](#index) ### Disable *pulseaudio*<a name="chapter-1d"></a> It is time to disable *pulseaudio*; to do so we disable the service and silence it by masking it (linking it to `/dev/null`): ``` systemctl --user --now disable pulseaudio.service pulseaudio.socket systemctl --user mask pulseaudio ``` [-^-](#index) ### Check and activate<a name="chapter-1e"></a> We refresh the configuration by running: ``` sudo ldconfig ``` We enable *wireplumber*: ``` systemctl --user --now enable wireplumber.service ``` And, if necessary, we restart the services: ``` systemctl --user restart pipewire ``` We check their status: ``` systemctl --user status pipewire pipewire-session-manager ``` and query the info: ``` pactl info ``` You will see a summary. In the info that `pactl` prints there is a line similar to "**Server Name: PulseAudio (on PipeWire x.x.x.x)**". If we see *on PipeWire*, everything is fine. > We could have used a `grep` to show only the line we care about, but at this point it is more useful to see all the info. You can adapt your script so that it raises an error if 'PipeWire' does not appear in the output. 
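As the note above suggests, the `pactl info` check can be scripted. A minimal, illustrative Python sketch (function names are my own, and it assumes the `Server Name:` line format shown above) that inspects only the server line rather than the whole output:

```python
import subprocess

def server_is_pipewire(info_text: str) -> bool:
    """Return True if pactl's 'Server Name:' line mentions PipeWire."""
    for line in info_text.splitlines():
        if line.startswith("Server Name:"):
            return "PipeWire" in line
    return False

def check_current_server() -> bool:
    """Run `pactl info` and inspect its output (requires pactl on PATH)."""
    info = subprocess.run(["pactl", "info"], capture_output=True, text=True).stdout
    return server_is_pipewire(info)
```

Calling `check_current_server()` from a setup script lets you abort early if PipeWire is not the active sound server.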
--- [-^-](#index) ## Install *Easyeffects*<a name="chapter-2"></a> After checking *pipewire*, it is *easyeffects*' turn. With *easyeffects* we are going to add the *noise-reduction* effect. We install the *easyeffects* module with `flatpak` (obviously you need `flatpak` installed): ``` flatpak install flathub com.github.wwmm.easyeffects ``` > You can launch the graphical interface with `flatpak run com.github.wwmm.easyeffects`. Let's start by defining some variables to improve readability and avoid mistakes: ``` noiseReduction="noise-reduction" folderEasyEffects="$HOME/.var/app/com.github.wwmm.easyeffects/config/easyeffects" easyEffectsConfigInput="autoload/input" configureNoiseReduction="${noiseReduction}.json" noiseReductionJson="${folderEasyEffects}/${easyEffectsConfigInput}/${configureNoiseReduction}" ``` > You may have noticed that we store the filter inside the `autoload/input` folder, so it is always loaded when easyeffects starts. If, on the contrary, you don't want it loaded automatically, store it in the `input` folder instead of `autoload/input`. Now let's create the `noise-reduction.json` file: ``` read -r -d '' configureContentNoiseReduction <<EOF { "input": { "blocklist": [], "plugins_order": [ "rnnoise#0" ], "rnnoise#0": { "bypass": false, "enable-vad": true, "input-gain": 0.0, "model-path": "", "output-gain": 0.0, "release": 20.0, "vad-thres": 50.0, "wet": 0.0 } } } EOF echo "${configureContentNoiseReduction}" >| "${noiseReductionJson}" ``` > On each computer you will need to experiment a little to find the filter values that work best. Once the code has run, we check that the file was created correctly: ``` [ -f "${noiseReductionJson}" ] fileExist=$? ``` > If the `fileExist` variable holds a `0`, the file exists; any other value is an error. 
With these steps the configuration is ready, and we can rerun the script whenever we need it. --- [-^-](#index) ## Enable at startup (*autostart*)<a name="chapter-3"></a> In this last step we will make *easyeffects* run at startup. > If you don't want the profile to start automatically, you can launch it with `flatpak run com.github.wwmm.easyeffects --load-preset profile_name --gapplication-service &`. First we create our variables to simplify the next actions: ``` profileFolder="/etc/profile.d" initEasyEffects="99-autostart-easyeffects.sh" ``` We create the file that will be run at startup: ``` read -r -d '' easyeffectsContent <<EOF #!/bin/bash flatpak run com.github.wwmm.easyeffects --gapplication-service & EOF echo "${easyeffectsContent}" >| "${initEasyEffects}.temp" ``` We move the file to its destination: ``` sudo mv "${initEasyEffects}.temp" "${profileFolder}/${initEasyEffects}" ``` We check that it was moved correctly: ``` [ -f "${profileFolder}/${initEasyEffects}" ] fileExist=$? ``` And we change its permissions so it can be executed (note the full path: the file now lives in `${profileFolder}`): ``` sudo chmod +x "${profileFolder}/${initEasyEffects}" ``` --- I hope this has been helpful. :)
elferrer
1,716,872
A Look at Sonya Natanzon's Experience at GSAS
In October 2023, Apiumhub proudly organized the Global Software Architecture Summit in Barcelona, an...
0
2024-01-22T11:40:05
https://apiumhub.com/tech-blog-barcelona/sonya-natanzon-experience-gsas/
architecture
--- title: A Look at Sonya Natanzon's Experience at GSAS published: true date: 2024-01-04 08:00:46 UTC tags: Softwarearchitecture canonical_url: https://apiumhub.com/tech-blog-barcelona/sonya-natanzon-experience-gsas/ --- In October 2023, [Apiumhub](https://apiumhub.com/) proudly organized the [Global Software Architecture Summit](https://gsas.io/) in Barcelona, an event that brought together over 550 attendees from 37 countries. GSAS served as a hub for cutting-edge discussions, insights, and innovations in software architecture. Attendees had the unique opportunity to learn from industry leaders like Sonya Natanzon, explore emerging trends, and gain invaluable perspectives that will shape the future of software development. Not to mention, there were plenty of networking opportunities! Many renowned industry professionals attended the conference to present their insights and ideas, among them Sonya Natanzon, an engineering leader and software architect with many years of experience. Sonya Natanzon presented her talk “[It’s a Feature, Not a Bug: A Step-by-step Guide to Architectural Decisions](https://www.youtube.com/watch?v=chrjl9ALtKQ)”, which generated a lot of discussion among attendees, who valued it as one of the most interesting talks. During the event, Apiumhub had the opportunity to interview Sonya Natanzon to learn more about her experience at the conference, her interests, and current essential practices in software architecture. Want to know more about her? Keep reading! ## Sonya Natanzon's Experience at GSAS 2023 ### What are your thoughts on this year’s Global Software Architecture Summit? It’s huge! Lots of amazing speakers, and I am looking forward to the social part of it and the community part of it, interacting with people and learning from them. That has always been my favorite part and it’s not disappointing this year. ### What are the current essential practices in software architecture this year? 
Some of the essential practices that I hear surface are socio-technical concerns, team topologies, empowered teams, and independent teams. I think we’re coming to realize that technologies are great and we should know and understand them, but having teams that can work with them, support them, innovate, and drive them forward is the key. ### Which areas in software architecture are you interested in exploring next year? In the upcoming year, I am looking to work with the sociotechnical aspect, maybe in a little bit of a different way, at a different angle. While we all strive to empower teams, the reality is frequently a lot messier than we would like it to be, and we can’t always impact how the team structure is created and how it’s changed. So, if we’re dealing with the team structures as they are and we can’t change them, what can we change to be able to deliver faster and to have effective project spaces? That’s the area that I’m interested in. <iframe title="GSAS 2023: Interview with Sonya Natanzon #GSAS23" width="1140" height="641" src="https://www.youtube.com/embed/UvBvhRyi_QQ?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe> Are you interested in watching more interviews like this one on Sonya Natanzon's experience? Visit our [YouTube channel](https://www.youtube.com/@Apiumhub)! ### GSAS 2024: Join us this year! Did you miss our event last year? We already have the dates for GSAS 2024! The Global Software Architecture Summit will take place on October 14-16, 2024, at the Axa Auditorium in sunny Barcelona. Early bird tickets will be available from January 15th [here](https://gsas.io/). Don't miss the opportunity to learn best practices from top leaders in the industry like Neal Ford, Eoin Woods, Vlad Khononov, Andrew Harmel-Law, Luca Mezzalira, and Christian Ciceri. This year’s conference will focus on AI in software architecture. 
If you have something to contribute, would you like to share your cutting-edge research, innovative insights, or real-world experience with the global community of software architects? We would love to hear from you [here](https://docs.google.com/forms/d/e/1FAIpQLSf2PeISgsRvU1G3Q0zOAsxlGqO017lpK_Dp0EO-k0xsAYijlg/viewform). Please feel free to contact us if you have any questions or concerns!
apium_hub
1,716,918
2024 Unveiled: Top Cyber Trends in a Crystal Ball
According to Forbes, by the end of the coming year, the cost of cyber attacks on the global economy...
0
2024-01-04T10:30:03
https://dev.to/cnatsopoulou/2024-unveiled-top-cyber-trends-in-a-crystal-ball-5328
_According to [Forbes](https://www.forbes.com/?sh=79404b432254), by the end of the coming year, the cost of cyber attacks on the global economy is predicted to top $10.5 trillion._ As we step into the new year, fresh trends emerge on the cybersecurity horizon! Today, we'll delve into the anticipated trends of 2024. Given that many of these trends are projections rooted in the developments of 2023 or earlier, it's essential to take a moment to look back and examine the challenges encountered by cyber experts in the past year. This retrospective glance will provide us with valuable insights to navigate the evolving landscape of cybersecurity in the upcoming year. ## 2023: A Year in Review In 2023, several major companies experienced **data breaches**. [Okta](https://www.okta.com/), for instance, fell victim to a breach linked to a compromised personal Gmail account. The hacker used it to sneak into the support case management system and swiped customer session tokens that could be used to break into the networks of Okta customers. Another affected entity was [23andMe](https://www.23andme.com/en-int/), a consumer genetics and research company headquartered in California, US, which encountered a credential stuffing attack, exposing the information of over 5 million users. [UPS](https://www.ups.com/us/en/Home.page) faced a security incident involving attackers using obtained information to conduct SMS phishing attempts, falsely claiming to provide delivery details. Lastly, the [MGM](https://www.mgm.com/) ransomware attack targeted the company's employee password reset workflow. Certainly, ransomware attacks were among the negative trends of the previous year. In 2023, we observed some of the most notable instances of **ransomware attacks**. This surge of digital attacks centered on exploiting a weakness in managed file transfer software called [MOVEit](https://www.progress.com/moveit). 
The vulnerability, targeted by the Clop ransomware group linked to Russia, was exploited for data theft, particularly of personally identifiable information (PII) from customer databases. Moreover, one of the largest US dental health insurers, [Managed Care of North America](https://www.mcna.net/en/home) (MCNA) Dental, was targeted by a ransomware attack that compromised the personal data of about 9 million individuals. Nevertheless, 2023 also ushered in positive trends that enhanced global security. **Multi-factor Authentication** (MFA) emerged as a key player in this landscape. It mandates individuals to authenticate not only with something they know, like a password, but also with something they are or something they have. This extra layer of verification significantly bolsters security by adding multiple checkpoints, making unauthorized access more challenging. MFA's widespread adoption marks a crucial step forward in fortifying digital defenses against evolving threats. Furthermore, let's delve into the realm of **quantum computers**. This trend carries both positive and negative implications. On the positive side, their remarkable speed enables the swift resolution of problems, such as optimization problems or machine learning optimization, that would traditionally take years. However, on the flip side, this very capability raises concerns about the potential to crack current encryption. If that happens, our existing ways of keeping data safe would be outdated, and we would need new methods to guard against these super-powerful computers. **Artificial Intelligence** (AI), too, emerges as a trend with dual implications, encompassing both positive and negative aspects. On one hand, AI in the hands of ethical practitioners enables robust analysis and investigation. On the other hand, malicious actors utilizing AI can devise sophisticated attacks. 
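The "something you have" factor usually surfaces as a six-digit one-time code from an authenticator app. To make that concrete, here is a minimal, illustrative sketch of the standard HOTP/TOTP algorithms behind those codes (RFC 4226 / RFC 6238), using only Python's standard library; the function names are my own:

```python
import hashlib
import hmac
import struct
import time

def hotp(key: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HMAC-based one-time password with dynamic truncation."""
    msg = struct.pack(">Q", counter)                 # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation offset
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(key: bytes, period: int = 30) -> str:
    """RFC 6238 time-based OTP: HOTP over the current time window."""
    return hotp(key, int(time.time()) // period)

# RFC 4226 test vector: ASCII secret "12345678901234567890", counter 0
print(hotp(b"12345678901234567890", 0))  # 755224
```

Because the app and the server derive the same code from a shared secret and the current time window, no network round-trip is needed to verify the second factor.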
This dual nature underscores the need for responsible development and deployment of AI technologies, as they wield the power to both enhance and potentially undermine various aspects of our digital landscape. ## 10 Top Cyber Trends in 2024 And now let's turn our attention to what's happening in 2024. **1. Moving from passwords to passkeys.** Remembering passwords can be a hassle and poses a risk if they fall into the wrong hands. That's why there's a shift towards using passkeys, a new technology based on FIDO2 authentication, for secure logins. With passkeys, you don't need to remember a traditional password; instead, you can use a simpler, more user-friendly, and secure method. Passkeys enable users to access apps and websites employing features like fingerprint recognition, facial scans, or screen lock PINs. Unlike passwords, passkeys offer resistance against online threats such as phishing, enhancing security compared to methods like SMS one-time codes. According to Blair Cohen, founder and president of AuthenticID, _"I applaud it and think it's great for everyday consumer use, but don't think FIDO2 will be the choice of enterprises, large-scale banks, etc. There are just too many vulnerabilities,"_ he said, specifically highlighting its vulnerability to first-party fraud. Jack Poller, analyst at TechTarget's Enterprise Strategy Group (ESG), disagreed, saying "_FIDO2 is going to win in the consumer marketplace since many enterprise organizations, such as Google, Amazon and Apple, currently support it because it's phishing-resistant._" **2. Increase of IoT attacks.** Today an increasing number of individuals opt for smart devices that seamlessly communicate with each other and connect to the internet. These devices are often designed for ease of use and convenience rather than secure operation, so home consumer IoT devices may be at risk due to weak security protocols and passwords. 
Nevertheless, this growing reliance on interconnected devices also presents additional opportunities for cyber attackers to infiltrate and exploit potential vulnerabilities. Additionally, with the ongoing work-from-home revolution, the risks associated with employees connecting or sharing data over inadequately secured devices persist as a significant threat in 2024. > In the realm of IoT, every device transforms into a computer, and as we understand, every computer is susceptible to hacking. Thus, each smart device is potentially vulnerable to hacking. **3. Increased focus on AI.** 2024 is AI-heavy. _AI phishing:_ People have become educated and can often identify phishing emails due to the presence of poor grammar and spelling mistakes. With AI phishing, bad actors can use Large Language Models (LLMs) to remove these idiosyncrasies and sound more like a native speaker, luring victims into a false sense of security. So, organizations will need a more advanced tool to effectively prevent fraud, one that examines additional factors about the fraudster, like their IP address, location, and user ID, rather than just focusing on the content they generate. _Increase of deepfake social engineering attempts:_ This kind of attack, where someone's voice and face are faked to make others believe and trust a misleading video, is on the rise. These deepfakes can be used for social engineering attacks, impersonation, and spreading false information. It's crucial for people to establish security measures that don't rely solely on the information in the deepfake itself. In 2024, deepfake technology, creating realistic yet fake audio and video, is becoming a more significant concern. As the threat of deepfakes grows, organizations need to invest in tools and strategies to detect deepfakes and protect their reputation and data integrity. Raising awareness and providing education are vital in addressing this emerging threat. 
_AI Hallucination threat:_ This is a phenomenon wherein LLMs (often generative AI chatbots) provide people with information that is not always right and thus contribute to the spread of misinformation. AI models can also be vulnerable to adversarial attacks, in which bad actors manipulate the output of an AI model by subtly tweaking the input data. The best way to mitigate the impact of AI hallucinations is to train the AI models on diverse, balanced, and well-structured data, to test the system continually, and to make sure a human being is validating and reviewing AI outputs. > If cyber attack and defense in 2024 is a game of chess, then AI is the queen – with the ability to create powerful strategic advantages for whoever plays it best. **4. Ransomware attacks.** In 2024, ransomware attacks are anticipated to evolve into more sophisticated threats, impacting both individuals and organizations. A strong defense against such attacks lies in having a resilient backup and recovery plan. Consistently backing up data, educating staff about phishing risks, and deploying effective security measures are crucial components of this strategy. The ongoing battle against ransomware remains a paramount focus within the realm of cybersecurity. **5. Regulatory Compliance and Privacy.** As organizations increasingly leverage data, there is a growing emphasis on ensuring data privacy and protection. Consequently, regulatory frameworks are constantly evolving. In 2024, organizations will prioritize compliance with rigorous data protection regulations, including acts such as the [Data Act and Data Governance Act](https://dev.to/cnatsopoulou/decoding-data-compliance-a-dive-into-the-data-act-data-governance-act-3a75). The emphasis will be on transparent data practices, the implementation of robust security measures, and the demonstration of accountability in the handling of sensitive information. **6. 
Less than Zero trust.** In 2024, Zero Trust cybersecurity will continue to grow as a priority for many organizations amid intensifying cyberthreats, and many will keep implementing a cybersecurity strategy based on zero trust principles, as they did the previous year. Expect more widespread adoption of Zero Trust principles, under which trust is never assumed: there will be more ways to verify that users really are who they claim to be, along with measures to ensure malicious actors won’t get far even if they thwart initial defenses. **7. Explosion of BYOD and mobile devices.** In 2024, we can expect a sustained surge in the prevalence of BYOD (Bring Your Own Device) and increased reliance on mobile devices. To keep important company information safe on these devices, organizations will have to use strong Mobile Device Management (MDM) solutions and make sure strict security rules are followed. Balancing the need for employees to be productive with making sure all data is protected will be a big challenge for companies in this changing environment. **8. Cybersecurity skills gap and education.** The demand for skilled cybersecurity professionals is higher than ever, and there is a growing gap between the demand and the available talent. According to Forbes, research indicates that a majority (54%) of cyber security professionals believe that the impact of the skills shortage on their organization has worsened over the past two years. In 2024, this IT skills gap will persist, making it challenging for organizations to find qualified experts to manage their cybersecurity needs. Organizations will need to invest in training and development programs to upskill their existing staff and attract new talent. The shortage of cybersecurity experts is a pressing issue that can’t be ignored. **9. 
Quantum cryptography.** As previously noted, the emergence of quantum computing presents a challenge to traditional cryptography and this year we are closer to this threat. That's why in 2024, there will be a substantial uptick in the use of cryptographic algorithms designed to resist quantum attacks. Businesses will prioritize the enhancement of their cryptographic techniques to protect sensitive data from potential threats posed by quantum computing. This dynamic landscape offers a considerable growth opportunity within the realm of security. **10. Platform engineering**. This trend holds significance in 2024 as it marks a pivotal evolution in the development of software and applications. In the early stages of organizational growth, a single development team typically handles all responsibilities. However, as the organization expands, it becomes common to have separate security and development teams. In this scenario, cybersecurity plays a crucial role in orchestrating effective collaboration between these two teams to enhance productivity and ensure seamless operations. In summary, there has been a concerning increase in data breaches up to the present moment. The looming threat of ransomware attacks remains persistent. MFA is now being provided as an option by numerous websites. Additionally, there has been a staggering 400% rise in IoT threats. As we navigate these challenges, it is crucial for organizations to prioritize cybersecurity measures, adapt to evolving threats, and stay proactive in safeguarding sensitive data and systems this year as well. And, as always, let's wrap up with a few quotes... > “It takes 20 years to build a reputation and few minutes of cyber-incident to ruin it.” > **Stephane Nappo, Vice President. Global CISO, Groupe SEB** > “We discovered in our research that insider threats are not viewed as seriously as external threats, like a cyberattack. 
But when companies had an insider threat, in general, they were much more costly than external incidents. This was largely because the insider that is smart has the skills to hide the crime, for months, for years, sometimes forever.” > **Dr. Larry Ponemon, Chairman, Ponemon Institute, at SecureWorld Boston**
cnatsopoulou
1,717,073
Things to Consider While Developing On Demand Video Streaming App Like Netflix
Do you recall that frantic flick through the cable channels, only to discover infomercials and...
0
2024-01-04T12:59:44
https://dev.to/developersdev_/things-to-consider-while-developing-on-demand-video-streaming-app-like-netflix-2537
videostreamingapp, appdevelopment
Do you recall that frantic flick through the cable channels, only to discover infomercials and reruns? Oh no. Everybody has been there. Hey, that's why streaming apps are so popular, isn't that right? With so many captivating series, Netflix raised the bar, and now everyone is aiming to create the next big video streaming app. Hold on, though, cowgirls and cowboys. Entering this virtual rodeo is not for the timid. The competition is intense, so it takes careful preparation to stand out from the throng. So before you get on your horse and start coding like a mad genius, let's talk about the fundamentals of creating a streaming app that will make users exclaim, "This ain't your mama's cable TV!"

**Conquering the Stream: A Guide to Video App Development**

Before diving into the intricacies of development, it's crucial to understand the dynamics of the video streaming app development landscape. Choosing the right video streaming app development company is the first step towards success. Collaborating with seasoned professionals who understand the nuances of the industry can significantly impact the outcome of your project. A reliable video streaming app development company will guide you through the entire process, from conceptualization to deployment, ensuring a smooth and successful journey. Their expertise in Android video streaming app development and [live video streaming app development](https://semidotinfotech.com/video-streaming-app-development) can prove instrumental in creating a robust and scalable platform.

**User-Centric Design and Intuitive Interface**

The cornerstone of any successful on-demand video streaming app lies in its user interface and design. Users should be able to navigate the app effortlessly, finding the content they desire without unnecessary complexity. A user-centric design enhances the overall user experience, making it more likely that users will spend more time on your platform.
Consider incorporating features like personalized recommendations, intuitive search functionality, and user-friendly menus. This not only adds to the app's attractiveness but also increases user engagement and retention rates. The goal is to create an environment where users feel comfortable and excited to explore the vast array of content your app offers.

Transform your streaming app vision into reality! Explore unmatched development excellence. Take the first step towards success – start your project now!

**Content Library and Quality**

Imagine yourself curled up on the couch, popcorn and phone in hand, ready to disappear into a pixelated world. All of a sudden, you're overcome with a familiar sense of dread: the scroll of doom, where nothing piques your curiosity. Oh no. Nobody desires that. Netflix and its team became binge-watching royalty because they created a content library so rich and varied it kept us glued to our screens like buttered popcorn. And content, glorious content, is the golden nugget when it comes to creating your own streaming app. We're talking documentaries that will blow your mind wider than an IMAX screen, TV shows you can't resist binge-watching, and movies that will make you laugh out loud (or just gasp, no judgment). Give the public something they'll want like their next caffeine fix, and forget about the typical rerun suspects and reality-TV yawn-fests!

But here's the kicker: in this streaming showdown, you gotta stand out from the crowd like a sequin-covered unicorn. Invest in quality that's sharper than a Hollywood smile. We're talking exclusive titles they can't find anywhere else, shows so good they'll ditch their Netflix subscription (gasp!), and maybe even some original content that'll have them singing your praises from the rooftops. Partner with talented creators, snag those must-watch movies, and don't be afraid to roll up your sleeves and cook up your own content masterpiece.
Trust me, in the land of streaming apps, unique flavor wins every time. Remember, it's not just about throwing up any old video – it's about curating a feast for the eyes and ears, a smorgasbord of stories that leave them wanting more. Because at the end of the day, what's a streaming app without a content library that keeps them coming back for seconds (and thirds, and fourths…)? So go forth, content crusader! Build a library that's bursting with brilliance, and watch your app become the talk of the town (and maybe even the world).

**Scalability and Performance**

The success of a video streaming app often leads to an influx of users. Scalability, therefore, is a critical consideration during the development phase. Your app should be capable of handling a growing user base without compromising on performance or speed. Collaborate with your development team to implement scalable architecture and leverage cloud services to manage increased server loads. This ensures that your app remains responsive and reliable, even during peak usage times. A seamless streaming experience contributes significantly to user satisfaction and loyalty.

**Making Money While Making Movies Sing: Monetization Strategies for Your App**

Okay, so you've built a streaming app so addictive, users are practically living in their pajamas. Awesome! But here's the reality check: building a sustainable business requires more than just killer content. You gotta figure out how to turn all those binge-watching hours into sweet, sweet revenue. That's where your monetization strategy comes in, and planning it early is like packing a lunch for your entrepreneurial picnic. Let's talk options:

• **Subscription Model:** This is like the Netflix buffet – users pay a monthly fee for unlimited access to your entire content library. Think loyalty programs, exclusive perks, and maybe even early access to new shows to keep them hooked. Remember, the content needs to be so good, they wouldn't dream of canceling! (P.S. Research shows subscription services with three-tier models, like basic, standard, and premium, tend to thrive.)

• **Pay-Per-View:** It's like buying a movie ticket online. Users pay a one-time fee to watch specific content, like blockbuster movies or highly anticipated episodes. This can be a great way to boost revenue for those extra-special offerings. Bonus points if you offer bundle deals for multiple views or limited-time discounts.

• **Freemium Model:** Picture a popcorn stand with free samples and a gourmet cart for fancy kernels. This model lets users access basic content for free, but they'll have to shell out for the premium stuff – think ad-free viewing, exclusive shows, and early access. It's all about giving them a taste of the good life and making them crave more.

But don't stop there! Explore other avenues like:

• **Advertising:** Partner with brands for targeted ads that seamlessly integrate into the viewing experience without annoying your audience. Think less "car insurance jingle" and more "subtle product placement in your original series, sponsored by a trendy clothing brand."

• **Partnerships and Collaborations:** Team up with other companies, like mobile carriers or streaming services, to offer bundled packages or exclusive content. It's like a win-win-win for you, your partner, and the viewers.

Remember, the key is to diversify your revenue streams and offer options that cater to different user preferences. Think of it as building a delicious snack bar for your viewers, with something for everyone – from the popcorn purists to the gourmet pretzel connoisseurs. So, go forth, monetization maestro! Craft a strategy that's as creative as your content, and watch your app become the next big thing, both for viewers and your bank account.
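The tier structure described above can be expressed as a simple entitlement table that the player consults at playback time. The sketch below is a hypothetical illustration in Python (the tier names and feature flags are invented for the example), not a reference implementation of any real billing system:

```python
# Hypothetical entitlement table for a free / standard / premium tier split.
TIER_FEATURES = {
    "free":     {"ads": True,  "max_quality": "720p",  "offline_downloads": False},
    "standard": {"ads": False, "max_quality": "1080p", "offline_downloads": False},
    "premium":  {"ads": False, "max_quality": "4K",    "offline_downloads": True},
}

def playback_profile(tier: str) -> dict:
    """Return the entitlements the player should enforce for a subscriber tier."""
    try:
        return TIER_FEATURES[tier]
    except KeyError:
        # Unknown tiers fall back to the free experience rather than failing playback.
        return TIER_FEATURES["free"]

def can_download(tier: str) -> bool:
    """Offline downloads are gated as a premium-only perk in this sketch."""
    return playback_profile(tier)["offline_downloads"]

profile = playback_profile("standard")
print(profile["max_quality"])    # 1080p
print(can_download("standard"))  # False
print(can_download("premium"))   # True
```

Centralizing entitlements like this keeps the freemium upsell logic in one place, so adding or re-pricing a tier becomes a data change rather than a code change.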
**Security and Digital Rights Management (DRM)**

Protecting intellectual property and ensuring secure content delivery are non-negotiable aspects of [video streaming app development](https://semidotinfotech.com/video-streaming-app-development). Incorporate robust security measures to safeguard against piracy, unauthorized access, and content theft. Digital Rights Management (DRM) technologies play a crucial role in preventing unauthorized distribution and ensuring content is accessed only by legitimate users. Collaborate with security experts to implement encryption, secure authentication methods, and other measures to fortify your app against potential threats. Building trust with both content creators and users is essential for the long-term success of your platform.

**Cross-Platform Compatibility**

In an era where users access content on various devices, cross-platform compatibility is a key consideration. Your app should be accessible on different operating systems and devices, including Android, iOS, smart TVs, and web browsers. This not only broadens your user base but also enhances overall accessibility and convenience for your audience. Collaborate with your development team to ensure a seamless user experience across platforms. Consistent design elements, features, and functionality contribute to a cohesive brand experience, regardless of the device users choose.

**Analytics and User Feedback Integration**

Continuous improvement is the hallmark of successful video streaming apps. Integrating analytics tools and user feedback mechanisms provides valuable insights into user behavior, preferences, and app performance. Leverage this data to make informed decisions and implement updates that enhance the overall user experience. Regularly solicit user feedback through surveys, reviews, and ratings. Understand user preferences, pain points, and expectations to refine your app continually. Analytics-driven insights empower you to stay ahead of market trends and deliver a streaming experience that evolves with user needs.

**Legal Compliance and Licensing**

Navigating the legal landscape of content distribution is critical for the success and sustainability of your video streaming app. Ensure that your platform adheres to copyright laws and licensing agreements. Obtaining the necessary rights for the content you offer is paramount to avoid legal complications and potential copyright infringement issues. Collaborate with legal experts to navigate the complex world of licensing agreements and regional restrictions. By ensuring legal compliance, you not only protect your platform but also foster positive relationships with content creators and copyright holders.

Consult with the masters of video streaming app development. Your vision, our expertise – let's make history. Ignite the revolution now!
developersdev_
1,717,154
What is a spam score? How do I reduce spam scores?
A spam score is a metric used to determine the likelihood of an email being perceived as spam by spam...
0
2024-01-04T14:17:15
https://dev.to/dgihost/what-is-a-spam-score-how-do-i-reduce-spam-scores-3ji1
beginners, programming, webdev, seo
A spam score is a metric used to estimate the likelihood that an email will be flagged as spam by spam filters. It's calculated from factors such as the content of the email, sender reputation, use of spammy keywords, domain reputation, formatting, and more. Reducing your spam score involves a set of practices aimed at improving the quality and legitimacy of your emails. Here are some tips to help:

**Quality Content:** Ensure your emails have relevant, valuable, and well-written content. Avoid excessive capitalization, misleading subject lines, or overly promotional language.

**Authenticate Your Emails:** Implement authentication methods like SPF (Sender Policy Framework), DKIM (DomainKeys Identified Mail), and DMARC (Domain-based Message Authentication, Reporting, and Conformance) to verify your email's legitimacy.

**Clean Email List:** Regularly clean your email list by removing inactive or incorrect addresses. Sending emails to invalid or inactive addresses can increase your spam score.

**Avoid Spam Trigger Words:** Refrain from using words or phrases that commonly trigger spam filters, such as "Act Now," "Free," and "Limited Time Offer," as well as excessive exclamation marks or symbols.

**Opt-in Subscriptions:** Ensure recipients have opted in to receive your emails, and provide clear, easy options for unsubscribing.

**Consistent Sending Patterns:** Maintain a consistent sending schedule and avoid sudden spikes in email volume, which can trigger spam filters.

**Monitor Feedback Loops:** Keep an eye on feedback loops from ISPs (Internet Service Providers) or email providers to identify and address any complaints or issues with your emails.

**Check HTML Formatting:** Ensure proper HTML coding in your emails. Poorly formatted HTML or excessive use of code can trigger spam filters.

**Use Reliable Email Service Providers (ESPs):** Consider using reputable ESPs that have good deliverability rates and provide tools to help manage and monitor your email campaigns.

**Test Before Sending:** Use spam-checking tools or email testing services to analyze your emails and identify potential issues before sending them out.

Reducing spam scores requires a combination of best practices, adherence to email standards, and ongoing monitoring and adjustment to maintain good email deliverability.

Thanks for reading, [Dgi Host.com](https://www.dgihost.com)
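As a concrete footnote to the authentication tip above: SPF and DMARC policies are published as DNS TXT records on the sending domain. The records below are illustrative only; `example.com`, the ESP `include:` mechanism, and the report address are placeholders, and DKIM additionally publishes a public key under a selector name such as `selector1._domainkey.example.com`.

```
example.com.          IN TXT "v=spf1 include:_spf.example-esp.com ~all"
_dmarc.example.com.   IN TXT "v=DMARC1; p=quarantine; rua=mailto:dmarc-reports@example.com"
```

Here `~all` soft-fails mail from unlisted senders, and `p=quarantine` asks receivers to treat messages that fail DMARC alignment as suspicious while sending aggregate reports to the `rua` address.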
dgihost
1,717,290
Automate message queue deployment on JBoss EAP
For decades now, software projects have relied on messaging APIs to exchange data. In the Java/Java...
0
2024-01-04T18:00:00
https://developers.redhat.com/articles/2023/09/01/automate-message-queue-deployment-jboss-eap#3_deploying_jms_queues_on_jboss_eap_using_ansible
For decades now, software projects have relied on messaging APIs to exchange data. In the Java/Java EE ecosystem, this method of asynchronous communication has been standardized by the JMS specification. In many cases, individuals and organizations leverage [Red Hat JBoss Enterprise Application Platform (JBoss EAP)](https://developers.redhat.com/products/eap/overview) to act as message-oriented middleware (MOM), which facilitates the management of message queues and topics. Messaging ensures that no messages are lost as they are transmitted from the client and delivered to interested parties. On top of the management functions, JBoss EAP also provides authentication and other security-focused capabilities.

In this article, we'll show how to fully automate the setup of JBoss EAP and a JMS queue using Ansible so that we can easily make this service available.

# Prerequisites and installation

## Install Ansible

First, we'll set up our Ansible control machine, which is where the automation will be executed. On this system, we need to install Ansible as the first step:

```
$ sudo dnf install -y ansible-core
```

Note that the package name has changed recently from `ansible` to `ansible-core`.

## Configure Ansible to use Red Hat Automation Hub

An extension to Ansible, an [Ansible collection](https://docs.ansible.com/ansible/latest/collections_guide/index.html) dedicated to Red Hat JBoss EAP, is available from Automation Hub. Red Hat customers need to add credentials and the location of [Red Hat Automation Hub](https://www.ansible.com/products/automation-hub) to their Ansible configuration file (`ansible.cfg`) to be able to install the content using the `ansible-galaxy` command-line tool. Be sure to replace the `<paste-your-token-here>` placeholder with the API token you retrieved from Automation Hub.
[For more information about using Red Hat Automation Hub, please refer to the associated documentation](https://access.redhat.com/documentation/en-us/red_hat_ansible_automation_platform/1.2/html/getting_started_with_red_hat_ansible_automation_hub/index).

```
# ansible.cfg:
[defaults]
host_key_checking = False
retry_files_enabled = False
nocows = 1

[inventory]
# fail more helpfully when the inventory file does not parse (Ansible 2.4+)
unparsed_is_failed=true

[galaxy]
server_list = automation_hub, galaxy

[galaxy_server.galaxy]
url=https://galaxy.ansible.com/

[galaxy_server.automation_hub]
url=https://cloud.redhat.com/api/automation-hub/
auth_url=https://sso.redhat.com/auth/realms/redhat-external/protocol/openid-connect/token
token=<paste-your-token-here>
```

## Install the Ansible collection for JBoss EAP

With this configuration, we can now install the Ansible collection for JBoss EAP (`redhat.eap`), available on Red Hat Ansible Automation Hub:

```
$ ansible-galaxy collection install redhat.eap
Starting galaxy collection install process
Process install dependency map
Starting collection install process
Downloading https://console.redhat.com/api/automation-hub/v3/plugin/ansible/content/published/collections/artifacts/redhat-eap-1.3.4.tar.gz to /root/.ansible/tmp/ansible-local-2529rs7zh7/tmps_4n2eyj/redhat-eap-1.3.4-lr8dvcxo
Installing 'redhat.eap:1.3.4' to '/root/.ansible/collections/ansible_collections/redhat/eap'
Downloading https://console.redhat.com/api/automation-hub/v3/plugin/ansible/content/published/collections/artifacts/redhat-runtimes_common-1.1.0.tar.gz to /root/.ansible/tmp/ansible-local-2529rs7zh7/tmps_4n2eyj/redhat-runtimes_common-1.1.0-o6qfkgju
redhat.eap:1.3.4 was installed successfully
Installing 'redhat.runtimes_common:1.1.0' to '/root/.ansible/collections/ansible_collections/redhat/runtimes_common'
Downloading https://console.redhat.com/api/automation-hub/v3/plugin/ansible/content/published/collections/artifacts/ansible-posix-1.5.4.tar.gz to /root/.ansible/tmp/ansible-local-2529rs7zh7/tmps_4n2eyj/ansible-posix-1.5.4-4pgukpuo
redhat.runtimes_common:1.1.0 was installed successfully
Installing 'ansible.posix:1.5.4' to '/root/.ansible/collections/ansible_collections/ansible/posix'
ansible.posix:1.5.4 was installed successfully
```

As we will describe a little later on, this extension for Ansible will manage the entire installation and configuration of the Java application server on the target systems.

## Inventory file

Before we can start using our collection, we need to provide the [inventory of targets to Ansible](https://access.redhat.com/documentation/en-us/red_hat_ansible_automation_platform/1.2/html/getting_started_with_red_hat_ansible_automation_hub/index). There are several ways to provide this information to the automation tool, but for the purposes of this article, we elected to use a simple ini-formatted inventory file.

To easily reproduce this article's demonstration, you can use the same control node as the target. This also removes the need to deploy the required SSH key on all the systems involved. To do so, simply create a file called `inventory` with the following content:

```
[all]
localhost ansible_connection=local

[messaging_servers]
localhost ansible_connection=local
```

# Deploying JBoss EAP

## JBoss EAP installation

Before we configure the JMS queues with Ansible, we'll first deploy JBoss EAP. Once the server is successfully running on the target system, we'll adjust the automation to add the required configuration for the messaging layer. (This split is purely for didactic purposes.) Since we can leverage the content of the `redhat.eap` collection, the playbook to install EAP and set it up as a systemd service on the target system is minimal.
Create a file called `eap_jms.yml` with the following content:

```
---
- name: "Deploy a JBoss EAP"
  hosts: messaging_servers
  vars:
    eap_apply_cp: true
    eap_version: 7.4.0
    eap_offline_install: false
    eap_config_base: 'standalone-full.xml'
  collections:
    - redhat.eap
  roles:
    - eap_install
    - eap_systemd
```

Note that the Ansible collection for JBoss EAP will also take care of downloading the required assets from the Red Hat Customer Portal (the archive containing the Java app server files). However, one does need to provide the credentials associated with a service account. A Red Hat customer can manage service accounts using the hybrid cloud console. Within this portal, on the [service accounts tab](https://console.redhat.com/application-services/service-accounts), you can create a new service account if one does not already exist.

**Note:** The values obtained from the hybrid cloud console are sensitive and should be managed accordingly. For the purpose of this article, the values are passed on the ansible-playbook command line. Alternatively, [ansible-vault](https://docs.ansible.com/ansible/latest/vault_guide/index.html) could be used to enforce additional defense mechanisms:

```
$ ansible-playbook -i inventory -e rhn_username=<client_id> -e rhn_password=<client_secret> eap_jms.yml

PLAY [Deploy a JBoss EAP] ******************************************************

TASK [Gathering Facts] *********************************************************
ok: [localhost]

TASK [redhat.eap.eap_install : Validating arguments against arg spec 'main'] ***
ok: [localhost]

TASK [redhat.eap.eap_install : Ensure prerequirements are fullfilled.]
********* included: /root/.ansible/collections/ansible_collections/redhat/eap/roles/eap_install/tasks/prereqs.yml for localhost TASK [redhat.eap.eap_install : Validate credentials] *************************** ok: [localhost] TASK [redhat.eap.eap_install : Validate existing zipfiles for offline installs] *** skipping: [localhost] TASK [redhat.eap.eap_install : Validate existing zipfiles for offline installs] *** skipping: [localhost] TASK [redhat.eap.eap_install : Check that required packages list has been provided.] *** ok: [localhost] TASK [redhat.eap.eap_install : Prepare packages list] ************************** skipping: [localhost] TASK [redhat.eap.eap_install : Add JDK package java-11-openjdk-headless to packages list] *** ok: [localhost] TASK [redhat.eap.eap_install : Install required packages (4)] ****************** ok: [localhost] TASK [redhat.eap.eap_install : Ensure required local user exists.] ************* included: /root/.ansible/collections/ansible_collections/redhat/eap/roles/eap_install/tasks/user.yml for localhost TASK [redhat.eap.eap_install : Check arguments] ******************************** ok: [localhost] TASK [redhat.eap.eap_install : Set eap group] ********************************** ok: [localhost] TASK [redhat.eap.eap_install : Ensure group eap exists.] *********************** ok: [localhost] TASK [redhat.eap.eap_install : Ensure user eap exists.] ************************ ok: [localhost] TASK [redhat.eap.eap_install : Ensure workdir /opt/jboss_eap/ exists.] ********* ok: [localhost] TASK [redhat.eap.eap_install : Ensure archive_dir /opt/jboss_eap/ exists.] 
***** ok: [localhost] TASK [redhat.eap.eap_install : Ensure server is installed] ********************* included: /root/.ansible/collections/ansible_collections/redhat/eap/roles/eap_install/tasks/install.yml for localhost TASK [redhat.eap.eap_install : Check arguments] ******************************** ok: [localhost] TASK [redhat.eap.eap_install : Check local download archive path] ************** ok: [localhost] TASK [redhat.eap.eap_install : Set download paths] ***************************** ok: [localhost] TASK [redhat.eap.eap_install : Check target archive: /opt/jboss_eap//jboss-eap-7.4.0.zip] *** ok: [localhost] TASK [redhat.eap.eap_install : Retrieve archive from website: https://github.com/eap/eap/releases/download] *** skipping: [localhost] TASK [redhat.eap.eap_install : Retrieve archive from RHN] ********************** included: /root/.ansible/collections/ansible_collections/redhat/eap/roles/eap_install/tasks/install/rhn.yml for localhost TASK [redhat.eap.eap_install : Check arguments] ******************************** ok: [localhost] TASK [Download JBoss EAP from CSP] ********************************************* TASK [redhat.eap.eap_utils : Check arguments] ********************************** ok: [localhost] TASK [redhat.eap.eap_utils : Retrieve product download using JBoss Network API] *** ok: [localhost] TASK [redhat.eap.eap_utils : Determine install zipfile from search results] **** ok: [localhost] TASK [redhat.eap.eap_utils : Download Red Hat Single Sign-On] ****************** ok: [localhost] TASK [redhat.eap.eap_install : Install server using RPM] *********************** skipping: [localhost] TASK [redhat.eap.eap_install : Check downloaded archive] *********************** ok: [localhost] TASK [redhat.eap.eap_install : Copy archive to target nodes] ******************* changed: [localhost] TASK [redhat.eap.eap_install : Check target archive: /opt/jboss_eap//jboss-eap-7.4.0.zip] *** ok: [localhost] TASK [redhat.eap.eap_install : Verify target archive state: 
/opt/jboss_eap//jboss-eap-7.4.0.zip] *** ok: [localhost] TASK [redhat.eap.eap_install : Read target directory information: /opt/jboss_eap/jboss-eap-7.4/] *** ok: [localhost] TASK [redhat.eap.eap_install : Extract files from /opt/jboss_eap//jboss-eap-7.4.0.zip into /opt/jboss_eap/.] *** changed: [localhost] TASK [redhat.eap.eap_install : Note: decompression was not executed] *********** skipping: [localhost] TASK [redhat.eap.eap_install : Read information on server home directory: /opt/jboss_eap/jboss-eap-7.4/] *** ok: [localhost] TASK [redhat.eap.eap_install : Check state of server home directory: /opt/jboss_eap/jboss-eap-7.4/] *** ok: [localhost] TASK [redhat.eap.eap_install : Set instance name] ****************************** ok: [localhost] TASK [redhat.eap.eap_install : Deploy custom configuration] ******************** skipping: [localhost] TASK [redhat.eap.eap_install : Deploy configuration] *************************** changed: [localhost] TASK [redhat.eap.eap_install : Ensure required parameters for cumulative patch application are provided.] *** skipping: [localhost] TASK [Apply latest cumulative patch] ******************************************* skipping: [localhost] TASK [redhat.eap.eap_install : Ensure required parameters for elytron adapter are provided.] *** skipping: [localhost] TASK [Install elytron adapter] ************************************************* skipping: [localhost] TASK [redhat.eap.eap_install : Install server using Prospero] ****************** skipping: [localhost] TASK [redhat.eap.eap_install : Check eap install directory state] ************** ok: [localhost] TASK [redhat.eap.eap_install : Validate conditions] **************************** ok: [localhost] TASK [Ensure firewalld configuration allows server port (if enabled).] 
********* skipping: [localhost] TASK [redhat.eap.eap_systemd : Validating arguments against arg spec 'main'] *** ok: [localhost] TASK [redhat.eap.eap_systemd : Check arguments] ******************************** ok: [localhost] TASK [redhat.eap.eap_systemd : Check current EAP patch installed] ************** skipping: [localhost] TASK [redhat.eap.eap_systemd : Check arguments for yaml configuration] ********* skipping: [localhost] TASK [Ensure required local user and group exists.] **************************** TASK [redhat.eap.eap_install : Check arguments] ******************************** ok: [localhost] TASK [redhat.eap.eap_install : Set eap group] ********************************** ok: [localhost] TASK [redhat.eap.eap_install : Ensure group eap exists.] *********************** ok: [localhost] TASK [redhat.eap.eap_install : Ensure user eap exists.] ************************ ok: [localhost] TASK [redhat.eap.eap_systemd : Set destination directory for configuration] **** ok: [localhost] TASK [redhat.eap.eap_systemd : Set instance destination directory for configuration] *** ok: [localhost] TASK [redhat.eap.eap_systemd : Check arguments] ******************************** skipping: [localhost] TASK [redhat.eap.eap_systemd : Set base directory for instance] **************** skipping: [localhost] TASK [redhat.eap.eap_systemd : Check arguments] ******************************** skipping: [localhost] TASK [redhat.eap.eap_systemd : Set instance name] ****************************** skipping: [localhost] TASK [redhat.eap.eap_systemd : Set instance name] ****************************** skipping: [localhost] TASK [redhat.eap.eap_systemd : Set bind address] ******************************* ok: [localhost] TASK [redhat.eap.eap_systemd : Create basedir /opt/jboss_eap/jboss-eap-7.4//standalone for instance: eap] *** ok: [localhost] TASK [redhat.eap.eap_systemd : Create deployment directories for instance: eap] *** ok: [localhost] TASK [redhat.eap.eap_systemd : Deploy custom 
configuration] ********************
skipping: [localhost]

TASK [redhat.eap.eap_systemd : Deploy configuration] ***************************
ok: [localhost]

TASK [redhat.eap.eap_systemd : Include YAML configuration extension] ***********
skipping: [localhost]

TASK [redhat.eap.eap_systemd : Check YAML configuration is disabled] ***********
ok: [localhost]

TASK [redhat.eap.eap_systemd : Set systemd envfile destination] ****************
ok: [localhost]

TASK [redhat.eap.eap_systemd : Determine JAVA_HOME for selected JVM RPM] *******
ok: [localhost]

TASK [redhat.eap.eap_systemd : Set systemd unit file destination] **************
ok: [localhost]

TASK [redhat.eap.eap_systemd : Deploy service instance configuration: /etc//eap.conf] ***
changed: [localhost]

TASK [redhat.eap.eap_systemd : Deploy Systemd configuration for service: /usr/lib/systemd/system/eap.service] ***
changed: [localhost]

TASK [redhat.eap.eap_systemd : Perform daemon-reload to ensure the changes are picked up] ***
ok: [localhost]

TASK [redhat.eap.eap_systemd : Ensure service is started] **********************
included: /root/.ansible/collections/ansible_collections/redhat/eap/roles/eap_systemd/tasks/service.yml for localhost

TASK [redhat.eap.eap_systemd : Check arguments] ********************************
ok: [localhost]

TASK [redhat.eap.eap_systemd : Set instance eap state to started] **************
changed: [localhost]

PLAY RECAP *********************************************************************
localhost : ok=59 changed=6 unreachable=0 failed=0 skipped=22 rescued=0 ignored=0
```

## Validating the installation

Before going any further with our automation, we will be thorough and add a validation step to double-check that the application server is not only running but also functional. This will ensure, down the road, that any JMS-related issue only affects this subsystem.
The Ansible collection for JBoss EAP comes with a handy role, called `eap_validation`, for this purpose, so it's fairly easy to add this step to our playbook:

```
---
- name: "Deploy a JBoss EAP"
  hosts: messaging_servers
  vars:
    eap_apply_cp: true
    eap_version: 7.4.0
    eap_offline_install: false
    eap_config_base: 'standalone-full.xml'
  collections:
    - redhat.eap
  roles:
    - eap_install
    - eap_systemd
    - eap_validation
```

Let's execute our playbook once again and observe the execution of this validation step:

```
$ ansible-playbook -i inventory -e rhn_username=<client_id> -e rhn_password=<client_secret> eap_jms.yml

PLAY [Deploy a JBoss EAP] ******************************************************

TASK [Gathering Facts] *********************************************************
ok: [localhost]

TASK [redhat.eap.eap_install : Validating arguments against arg spec 'main'] ***
ok: [localhost]

TASK [redhat.eap.eap_install : Ensure prerequirements are fullfilled.] *********
included: /root/.ansible/collections/ansible_collections/redhat/eap/roles/eap_install/tasks/prereqs.yml for localhost

TASK [redhat.eap.eap_install : Validate credentials] ***************************
ok: [localhost]

TASK [redhat.eap.eap_install : Validate existing zipfiles for offline installs] ***
skipping: [localhost]

TASK [redhat.eap.eap_install : Validate existing zipfiles for offline installs] ***
skipping: [localhost]

TASK [redhat.eap.eap_install : Check that required packages list has been provided.] ***
ok: [localhost]

TASK [redhat.eap.eap_install : Prepare packages list] **************************
skipping: [localhost]

TASK [redhat.eap.eap_install : Add JDK package java-11-openjdk-headless to packages list] ***
ok: [localhost]

TASK [redhat.eap.eap_install : Install required packages (4)] ******************
ok: [localhost]

TASK [redhat.eap.eap_install : Ensure required local user exists.]
************* included: /root/.ansible/collections/ansible_collections/redhat/eap/roles/eap_install/tasks/user.yml for localhost TASK [redhat.eap.eap_install : Check arguments] ******************************** ok: [localhost] TASK [redhat.eap.eap_install : Set eap group] ********************************** ok: [localhost] TASK [redhat.eap.eap_install : Ensure group eap exists.] *********************** changed: [localhost] TASK [redhat.eap.eap_install : Ensure user eap exists.] ************************ changed: [localhost] TASK [redhat.eap.eap_install : Ensure workdir /opt/jboss_eap/ exists.] ********* changed: [localhost] TASK [redhat.eap.eap_install : Ensure archive_dir /opt/jboss_eap/ exists.] ***** ok: [localhost] TASK [redhat.eap.eap_install : Ensure server is installed] ********************* included: /root/.ansible/collections/ansible_collections/redhat/eap/roles/eap_install/tasks/install.yml for localhost TASK [redhat.eap.eap_install : Check arguments] ******************************** ok: [localhost] TASK [redhat.eap.eap_install : Check local download archive path] ************** ok: [localhost] TASK [redhat.eap.eap_install : Set download paths] ***************************** ok: [localhost] TASK [redhat.eap.eap_install : Check target archive: /opt/jboss_eap//jboss-eap-7.4.0.zip] *** ok: [localhost] TASK [redhat.eap.eap_install : Retrieve archive from website: https://github.com/eap/eap/releases/download] *** skipping: [localhost] TASK [redhat.eap.eap_install : Retrieve archive from RHN] ********************** included: /root/.ansible/collections/ansible_collections/redhat/eap/roles/eap_install/tasks/install/rhn.yml for localhost TASK [redhat.eap.eap_install : Check arguments] ******************************** ok: [localhost] TASK [Download JBoss EAP from CSP] ********************************************* TASK [redhat.eap.eap_utils : Check arguments] ********************************** ok: [localhost] TASK [redhat.eap.eap_utils : Retrieve product download using 
JBoss Network API] *** ok: [localhost] TASK [redhat.eap.eap_utils : Determine install zipfile from search results] **** ok: [localhost] TASK [redhat.eap.eap_utils : Download Red Hat Single Sign-On] ****************** ok: [localhost] TASK [redhat.eap.eap_install : Install server using RPM] *********************** skipping: [localhost] TASK [redhat.eap.eap_install : Check downloaded archive] *********************** ok: [localhost] TASK [redhat.eap.eap_install : Copy archive to target nodes] ******************* changed: [localhost] TASK [redhat.eap.eap_install : Check target archive: /opt/jboss_eap//jboss-eap-7.4.0.zip] *** ok: [localhost] TASK [redhat.eap.eap_install : Verify target archive state: /opt/jboss_eap//jboss-eap-7.4.0.zip] *** ok: [localhost] TASK [redhat.eap.eap_install : Read target directory information: /opt/jboss_eap/jboss-eap-7.4/] *** ok: [localhost] TASK [redhat.eap.eap_install : Extract files from /opt/jboss_eap//jboss-eap-7.4.0.zip into /opt/jboss_eap/.] *** changed: [localhost] TASK [redhat.eap.eap_install : Note: decompression was not executed] *********** skipping: [localhost] TASK [redhat.eap.eap_install : Read information on server home directory: /opt/jboss_eap/jboss-eap-7.4/] *** ok: [localhost] TASK [redhat.eap.eap_install : Check state of server home directory: /opt/jboss_eap/jboss-eap-7.4/] *** ok: [localhost] TASK [redhat.eap.eap_install : Set instance name] ****************************** ok: [localhost] TASK [redhat.eap.eap_install : Deploy custom configuration] ******************** skipping: [localhost] TASK [redhat.eap.eap_install : Deploy configuration] *************************** changed: [localhost] TASK [redhat.eap.eap_install : Ensure required parameters for cumulative patch application are provided.] *** skipping: [localhost] TASK [Apply latest cumulative patch] ******************************************* skipping: [localhost] TASK [redhat.eap.eap_install : Ensure required parameters for elytron adapter are provided.] 
*** skipping: [localhost] TASK [Install elytron adapter] ************************************************* skipping: [localhost] TASK [redhat.eap.eap_install : Install server using Prospero] ****************** skipping: [localhost] TASK [redhat.eap.eap_install : Check eap install directory state] ************** ok: [localhost] TASK [redhat.eap.eap_install : Validate conditions] **************************** ok: [localhost] TASK [Ensure firewalld configuration allows server port (if enabled).] ********* skipping: [localhost] TASK [redhat.eap.eap_systemd : Validating arguments against arg spec 'main'] *** ok: [localhost] TASK [redhat.eap.eap_systemd : Check arguments] ******************************** ok: [localhost] TASK [redhat.eap.eap_systemd : Check current EAP patch installed] ************** skipping: [localhost] TASK [redhat.eap.eap_systemd : Check arguments for yaml configuration] ********* skipping: [localhost] TASK [Ensure required local user and group exists.] **************************** TASK [redhat.eap.eap_install : Check arguments] ******************************** ok: [localhost] TASK [redhat.eap.eap_install : Set eap group] ********************************** ok: [localhost] TASK [redhat.eap.eap_install : Ensure group eap exists.] *********************** ok: [localhost] TASK [redhat.eap.eap_install : Ensure user eap exists.] 
************************ ok: [localhost] TASK [redhat.eap.eap_systemd : Set destination directory for configuration] **** ok: [localhost] TASK [redhat.eap.eap_systemd : Set instance destination directory for configuration] *** ok: [localhost] TASK [redhat.eap.eap_systemd : Check arguments] ******************************** skipping: [localhost] TASK [redhat.eap.eap_systemd : Set base directory for instance] **************** skipping: [localhost] TASK [redhat.eap.eap_systemd : Check arguments] ******************************** skipping: [localhost] TASK [redhat.eap.eap_systemd : Set instance name] ****************************** skipping: [localhost] TASK [redhat.eap.eap_systemd : Set instance name] ****************************** skipping: [localhost] TASK [redhat.eap.eap_systemd : Set bind address] ******************************* ok: [localhost] TASK [redhat.eap.eap_systemd : Create basedir /opt/jboss_eap/jboss-eap-7.4//standalone for instance: eap] *** ok: [localhost] TASK [redhat.eap.eap_systemd : Create deployment directories for instance: eap] *** ok: [localhost] TASK [redhat.eap.eap_systemd : Deploy custom configuration] ******************** skipping: [localhost] TASK [redhat.eap.eap_systemd : Deploy configuration] *************************** ok: [localhost] TASK [redhat.eap.eap_systemd : Include YAML configuration extension] *********** skipping: [localhost] TASK [redhat.eap.eap_systemd : Check YAML configuration is disabled] *********** ok: [localhost] TASK [redhat.eap.eap_systemd : Set systemd envfile destination] **************** ok: [localhost] TASK [redhat.eap.eap_systemd : Determine JAVA_HOME for selected JVM RPM] ******* ok: [localhost] TASK [redhat.eap.eap_systemd : Set systemd unit file destination] ************** ok: [localhost] TASK [redhat.eap.eap_systemd : Deploy service instance configuration: /etc//eap.conf] *** changed: [localhost] TASK [redhat.eap.eap_systemd : Deploy Systemd configuration for service: /usr/lib/systemd/system/eap.service] *** 
changed: [localhost] TASK [redhat.eap.eap_systemd : Perform daemon-reload to ensure the changes are picked up] *** ok: [localhost] TASK [redhat.eap.eap_systemd : Ensure service is started] ********************** included: /root/.ansible/collections/ansible_collections/redhat/eap/roles/eap_systemd/tasks/service.yml for localhost TASK [redhat.eap.eap_systemd : Check arguments] ******************************** ok: [localhost] TASK [redhat.eap.eap_systemd : Set instance eap state to started] ************** changed: [localhost] TASK [redhat.eap.eap_validation : Validating arguments against arg spec 'main'] *** ok: [localhost] TASK [redhat.eap.eap_validation : Ensure required parameters are provided.] **** ok: [localhost] TASK [redhat.eap.eap_validation : Ensure user eap were created.] *************** ok: [localhost] TASK [redhat.eap.eap_validation : Validate state of user: eap] ***************** ok: [localhost] TASK [redhat.eap.eap_validation : Ensure user eap were created.] *************** ok: [localhost] TASK [redhat.eap.eap_validation : Validate state of group: eap.] *************** ok: [localhost] TASK [redhat.eap.eap_validation : Wait for HTTP port 8080 to become available.] 
*** ok: [localhost] TASK [redhat.eap.eap_validation : Check if web connector is accessible] ******** ok: [localhost] TASK [redhat.eap.eap_validation : Populate service facts] ********************** ok: [localhost] TASK [redhat.eap.eap_validation : Check if service is running] ***************** ok: [localhost] => { "changed": false, "msg": "All assertions passed" } TASK [redhat.eap.eap_validation : Verify server's internal configuration] ****** included: /root/.ansible/collections/ansible_collections/redhat/eap/roles/eap_validation/tasks/verify_with_cli_queries.yml for localhost => (item={'query': '/core-service=server-environment:read-attribute(name=start-gracefully)'}) included: /root/.ansible/collections/ansible_collections/redhat/eap/roles/eap_validation/tasks/verify_with_cli_queries.yml for localhost => (item={'query': '/subsystem=undertow/server=default-server/http-listener=default:read-attribute(name=enabled)'}) TASK [redhat.eap.eap_validation : Ensure required parameters are provided.] **** ok: [localhost] TASK [Use CLI query to validate service state: /core-service=server-environment:read-attribute(name=start-gracefully)] *** TASK [redhat.eap.eap_utils : Ensure required params for JBoss CLI have been provided] *** ok: [localhost] TASK [redhat.eap.eap_utils : Ensure server's management interface is reachable] *** ok: [localhost] TASK [redhat.eap.eap_utils : Execute CLI query '/core-service=server-environment:read-attribute(name=start-gracefully)'] *** ok: [localhost] TASK [redhat.eap.eap_validation : Validate CLI query was successful] *********** ok: [localhost] TASK [redhat.eap.eap_validation : Transform output to JSON] ******************** ok: [localhost] TASK [redhat.eap.eap_validation : Display transformed result] ****************** skipping: [localhost] TASK [redhat.eap.eap_validation : Check that query was successfully performed.] *** ok: [localhost] TASK [redhat.eap.eap_validation : Ensure required parameters are provided.] 
**** ok: [localhost] TASK [Use CLI query to validate service state: /subsystem=undertow/server=default-server/http-listener=default:read-attribute(name=enabled)] *** TASK [redhat.eap.eap_utils : Ensure required params for JBoss CLI have been provided] *** ok: [localhost] TASK [redhat.eap.eap_utils : Ensure server's management interface is reachable] *** ok: [localhost] TASK [redhat.eap.eap_utils : Execute CLI query '/subsystem=undertow/server=default-server/http-listener=default:read-attribute(name=enabled)'] *** ok: [localhost] TASK [redhat.eap.eap_validation : Validate CLI query was successful] *********** ok: [localhost] TASK [redhat.eap.eap_validation : Transform output to JSON] ******************** ok: [localhost] TASK [redhat.eap.eap_validation : Display transformed result] ****************** skipping: [localhost] TASK [redhat.eap.eap_validation : Check that query was successfully performed.] *** ok: [localhost] TASK [redhat.eap.eap_validation : Ensure yaml setup] *************************** included: /root/.ansible/collections/ansible_collections/redhat/eap/roles/eap_validation/tasks/yaml_setup.yml for localhost TASK [Check standard-sockets configuration settings] *************************** TASK [redhat.eap.eap_utils : Ensure required params for JBoss CLI have been provided] *** ok: [localhost] TASK [redhat.eap.eap_utils : Ensure server's management interface is reachable] *** ok: [localhost] TASK [redhat.eap.eap_utils : Execute CLI query /socket-binding-group=standard-sockets/remote-destination-outbound-socket-binding=mail-smtp:read-attribute(name=host)] *** ok: [localhost] TASK [redhat.eap.eap_validation : Display result of standard-sockets configuration settings] *** ok: [localhost] TASK [Check ejb configuration settings] **************************************** TASK [redhat.eap.eap_utils : Ensure required params for JBoss CLI have been provided] *** ok: [localhost] TASK [redhat.eap.eap_utils : Ensure server's management interface is reachable] *** ok: 
[localhost]

TASK [redhat.eap.eap_utils : Execute CLI query /subsystem=ejb3:read-attribute(name=default-resource-adapter-name)] ***
ok: [localhost]

TASK [redhat.eap.eap_validation : Display result of ejb configuration settings] ***
ok: [localhost]

TASK [Check ee configuration settings] *****************************************

TASK [redhat.eap.eap_utils : Ensure required params for JBoss CLI have been provided] ***
ok: [localhost]

TASK [redhat.eap.eap_utils : Ensure server's management interface is reachable] ***
ok: [localhost]

TASK [redhat.eap.eap_utils : Execute CLI query /subsystem=ee/service=default-bindings:read-attribute(name=jms-connection-factory)] ***
ok: [localhost]

TASK [redhat.eap.eap_validation : Display result of ee configuration settings] ***
ok: [localhost]

PLAY RECAP *********************************************************************
localhost                  : ok=98   changed=9    unreachable=0    failed=0    skipped=24   rescued=0    ignored=0
```

If the playbook execution completed without error, the application server passed validation.

# Deploying JMS queues on JBoss EAP using Ansible

## Changing EAP configuration

Because the JMS subsystem is not part of the default JBoss EAP server configuration (`standalone.xml`), we need to use a different profile (`standalone-full.xml`). This is why, in the playbook below, we specify the required configuration profile:

```
---
- name: "Deploy a JBoss EAP"
  hosts: messaging_servers
  vars:
    eap_apply_cp: true
    eap_version: 7.4.0
    eap_offline_install: false
    eap_config_base: 'standalone-full.xml'
  collections:
    - redhat.eap
  roles:
    - eap_install
    - eap_systemd
    - eap_validation
```

# Leveraging the YAML config feature of EAP using Ansible

In the previous section, JBoss EAP was installed and configured as a systemd service on the target systems. Now, we will update this automation to change the configuration of the app server and ensure a JMS queue is deployed and made available.
In order to accomplish this goal, we just need to provide a YAML definition with the appropriate configuration for the JMS subsystem of JBoss EAP. The app server uses this file, on boot, to update its configuration. To achieve this, we add another file to our project, named `jms_configuration.yml.j2`. While the content of the file itself is YAML, the extension is `.j2` because it is a Jinja2 template, which allows us to take advantage of the advanced, dynamic capabilities provided by Ansible:

```
# jms_configuration.yml.j2
wildfly-configuration:
  subsystem:
    messaging-activemq:
      server:
        default:
          jms-queue:
            {{ queue.name }}:
              entries:
                - '{{ queue.entry }}'
```

Below is the playbook updated with all the required parameters to deploy the JMS queue on JBoss EAP:

```
---
- name: "Deploy a Red Hat JBoss EAP server and set up a JMS Queue"
  hosts: messaging_servers
  vars:
    eap_apply_cp: true
    eap_version: 7.4.0
    eap_offline_install: false
    eap_config_base: 'standalone-full.xml'
    eap_enable_yml_config: True
    queue:
      name: MyQueue
      entry: 'java:/jms/queue/MyQueue'
    eap_yml_configs:
      - jms_configuration.yml.j2
  collections:
    - redhat.eap
  roles:
    - eap_install
    - eap_systemd
    - eap_validation
```

Let's execute this playbook again:

```
$ ansible-playbook -i inventory -e rhn_username=<client_id> -e rhn_password=<client_secret> eap_jms.yml

PLAY [Deploy a Red Hat JBoss EAP server and set up a JMS Queue] ****************

TASK [Gathering Facts] *********************************************************
ok: [localhost]

TASK [redhat.eap.eap_install : Validating arguments against arg spec 'main'] ***
ok: [localhost]

TASK [redhat.eap.eap_install : Ensure prerequirements are fullfilled.]
********* included: /root/.ansible/collections/ansible_collections/redhat/eap/roles/eap_install/tasks/prereqs.yml for localhost TASK [redhat.eap.eap_install : Validate credentials] *************************** ok: [localhost] TASK [redhat.eap.eap_install : Validate existing zipfiles for offline installs] *** skipping: [localhost] TASK [redhat.eap.eap_install : Validate existing zipfiles for offline installs] *** skipping: [localhost] TASK [redhat.eap.eap_install : Check that required packages list has been provided.] *** ok: [localhost] TASK [redhat.eap.eap_install : Prepare packages list] ************************** skipping: [localhost] TASK [redhat.eap.eap_install : Add JDK package java-11-openjdk-headless to packages list] *** ok: [localhost] TASK [redhat.eap.eap_install : Install required packages (4)] ****************** ok: [localhost] TASK [redhat.eap.eap_install : Ensure required local user exists.] ************* included: /root/.ansible/collections/ansible_collections/redhat/eap/roles/eap_install/tasks/user.yml for localhost TASK [redhat.eap.eap_install : Check arguments] ******************************** ok: [localhost] TASK [redhat.eap.eap_install : Set eap group] ********************************** ok: [localhost] TASK [redhat.eap.eap_install : Ensure group eap exists.] *********************** ok: [localhost] TASK [redhat.eap.eap_install : Ensure user eap exists.] ************************ ok: [localhost] TASK [redhat.eap.eap_install : Ensure workdir /opt/jboss_eap/ exists.] ********* ok: [localhost] TASK [redhat.eap.eap_install : Ensure archive_dir /opt/jboss_eap/ exists.] 
***** ok: [localhost] TASK [redhat.eap.eap_install : Ensure server is installed] ********************* included: /root/.ansible/collections/ansible_collections/redhat/eap/roles/eap_install/tasks/install.yml for localhost TASK [redhat.eap.eap_install : Check arguments] ******************************** ok: [localhost] TASK [redhat.eap.eap_install : Check local download archive path] ************** ok: [localhost] TASK [redhat.eap.eap_install : Set download paths] ***************************** ok: [localhost] TASK [redhat.eap.eap_install : Check target archive: /opt/jboss_eap//jboss-eap-7.4.0.zip] *** ok: [localhost] TASK [redhat.eap.eap_install : Retrieve archive from website: https://github.com/eap/eap/releases/download] *** skipping: [localhost] TASK [redhat.eap.eap_install : Retrieve archive from RHN] ********************** skipping: [localhost] TASK [redhat.eap.eap_install : Install server using RPM] *********************** skipping: [localhost] TASK [redhat.eap.eap_install : Check downloaded archive] *********************** ok: [localhost] TASK [redhat.eap.eap_install : Copy archive to target nodes] ******************* skipping: [localhost] TASK [redhat.eap.eap_install : Check target archive: /opt/jboss_eap//jboss-eap-7.4.0.zip] *** ok: [localhost] TASK [redhat.eap.eap_install : Verify target archive state: /opt/jboss_eap//jboss-eap-7.4.0.zip] *** ok: [localhost] TASK [redhat.eap.eap_install : Read target directory information: /opt/jboss_eap/jboss-eap-7.4/] *** ok: [localhost] TASK [redhat.eap.eap_install : Extract files from /opt/jboss_eap//jboss-eap-7.4.0.zip into /opt/jboss_eap/.] 
*** skipping: [localhost] TASK [redhat.eap.eap_install : Note: decompression was not executed] *********** ok: [localhost] => { "msg": "/opt/jboss_eap/jboss-eap-7.4/ already exists and version unchanged, skipping decompression" } TASK [redhat.eap.eap_install : Read information on server home directory: /opt/jboss_eap/jboss-eap-7.4/] *** ok: [localhost] TASK [redhat.eap.eap_install : Check state of server home directory: /opt/jboss_eap/jboss-eap-7.4/] *** ok: [localhost] TASK [redhat.eap.eap_install : Set instance name] ****************************** ok: [localhost] TASK [redhat.eap.eap_install : Deploy custom configuration] ******************** skipping: [localhost] TASK [redhat.eap.eap_install : Deploy configuration] *************************** ok: [localhost] TASK [redhat.eap.eap_install : Ensure required parameters for cumulative patch application are provided.] *** ok: [localhost] => { "changed": false, "msg": "All assertions passed" } TASK [Apply latest cumulative patch] ******************************************* TASK [redhat.eap.eap_utils : Check installation] ******************************* ok: [localhost] TASK [redhat.eap.eap_utils : Set patch directory] ****************************** ok: [localhost] TASK [redhat.eap.eap_utils : Set download patch archive path] ****************** ok: [localhost] TASK [redhat.eap.eap_utils : Set patch destination directory] ****************** ok: [localhost] TASK [redhat.eap.eap_utils : Check download patch archive path] **************** ok: [localhost] TASK [redhat.eap.eap_utils : Check local download archive path] **************** ok: [localhost] TASK [redhat.eap.eap_utils : Check local downloaded archive: jboss-eap-7.4.9-patch.zip] *** ok: [localhost] TASK [redhat.eap.eap_utils : Retrieve product download using JBossNetwork API] *** skipping: [localhost] TASK [redhat.eap.eap_utils : Determine patch versions list] ******************** skipping: [localhost] TASK [redhat.eap.eap_utils : Determine latest version] 
************************* skipping: [localhost] TASK [redhat.eap.eap_utils : Determine install zipfile from search results] **** skipping: [localhost] TASK [redhat.eap.eap_utils : Determine selected patch from supplied version: 7.4.9] *** skipping: [localhost] TASK [redhat.eap.eap_utils : Check remote downloaded archive: /opt/jboss-eap-7.4.9-patch.zip] *** skipping: [localhost] TASK [redhat.eap.eap_utils : Download Red Hat EAP patch] *********************** skipping: [localhost] TASK [redhat.eap.eap_utils : Set download patch archive path] ****************** ok: [localhost] TASK [redhat.eap.eap_utils : Check remote download patch archive path] ********* ok: [localhost] TASK [redhat.eap.eap_utils : Copy patch archive to target nodes] *************** changed: [localhost] TASK [redhat.eap.eap_utils : Check patch state] ******************************** ok: [localhost] TASK [redhat.eap.eap_utils : Set checksum file path for patch] ***************** ok: [localhost] TASK [redhat.eap.eap_utils : Check /opt/jboss_eap/jboss-eap-7.4//.applied_patch_checksum_f641b6de2807fac18d2a56de7a27c1ea3611e5f3.txt state] *** ok: [localhost] TASK [redhat.eap.eap_utils : Print when patch has been applied already] ******** skipping: [localhost] TASK [redhat.eap.eap_utils : Check if management interface is reachable] ******* ok: [localhost] TASK [redhat.eap.eap_utils : Set apply CP conflict default strategy to default (if not defined): --override-all] *** ok: [localhost] TASK [redhat.eap.eap_utils : Apply patch /opt/jboss-eap-7.4.9-patch.zip to server installed in /opt/jboss_eap/jboss-eap-7.4/] *** included: /root/.ansible/collections/ansible_collections/redhat/eap/roles/eap_utils/tasks/jboss_cli.yml for localhost TASK [redhat.eap.eap_utils : Ensure required params for JBoss CLI have been provided] *** ok: [localhost] TASK [redhat.eap.eap_utils : Ensure server's management interface is reachable] *** ok: [localhost] TASK [redhat.eap.eap_utils : Execute CLI query 'patch apply --override-all 
/opt/jboss-eap-7.4.9-patch.zip'] *** ok: [localhost] TASK [redhat.eap.eap_utils : Display patching result] ************************** ok: [localhost] => { "msg": "Apply patch operation result: {\n \"outcome\" : \"success\",\n \"response-headers\" : {\n \"operation-requires-restart\" : true,\n \"process-state\" : \"restart-required\"\n }\n}" } TASK [redhat.eap.eap_utils : Set checksum file] ******************************** changed: [localhost] TASK [redhat.eap.eap_utils : Set latest patch file] **************************** changed: [localhost] TASK [redhat.eap.eap_utils : Restart server to ensure patch content is running] *** included: /root/.ansible/collections/ansible_collections/redhat/eap/roles/eap_utils/tasks/jboss_cli.yml for localhost TASK [redhat.eap.eap_utils : Ensure required params for JBoss CLI have been provided] *** ok: [localhost] TASK [redhat.eap.eap_utils : Ensure server's management interface is reachable] *** ok: [localhost] TASK [redhat.eap.eap_utils : Execute CLI query 'shutdown --restart'] *********** ok: [localhost] TASK [redhat.eap.eap_utils : Wait for management interface is reachable] ******* ok: [localhost] TASK [redhat.eap.eap_utils : Stop service if it was started for patching] ****** skipping: [localhost] TASK [redhat.eap.eap_utils : Display resulting output] ************************* skipping: [localhost] TASK [redhat.eap.eap_install : Ensure required parameters for elytron adapter are provided.] *** skipping: [localhost] TASK [Install elytron adapter] ************************************************* skipping: [localhost] TASK [redhat.eap.eap_install : Install server using Prospero] ****************** skipping: [localhost] TASK [redhat.eap.eap_install : Check eap install directory state] ************** ok: [localhost] TASK [redhat.eap.eap_install : Validate conditions] **************************** ok: [localhost] TASK [Ensure firewalld configuration allows server port (if enabled).] 
********* skipping: [localhost] TASK [redhat.eap.eap_systemd : Validating arguments against arg spec 'main'] *** ok: [localhost] TASK [redhat.eap.eap_systemd : Check arguments] ******************************** ok: [localhost] TASK [redhat.eap.eap_systemd : Check current EAP patch installed] ************** ok: [localhost] TASK [redhat.eap.eap_systemd : Check arguments for yaml configuration] ********* ok: [localhost] TASK [Ensure required local user and group exists.] **************************** TASK [redhat.eap.eap_install : Check arguments] ******************************** ok: [localhost] TASK [redhat.eap.eap_install : Set eap group] ********************************** ok: [localhost] TASK [redhat.eap.eap_install : Ensure group eap exists.] *********************** ok: [localhost] TASK [redhat.eap.eap_install : Ensure user eap exists.] ************************ ok: [localhost] TASK [redhat.eap.eap_systemd : Set destination directory for configuration] **** ok: [localhost] TASK [redhat.eap.eap_systemd : Set instance destination directory for configuration] *** ok: [localhost] TASK [redhat.eap.eap_systemd : Check arguments] ******************************** skipping: [localhost] TASK [redhat.eap.eap_systemd : Set base directory for instance] **************** skipping: [localhost] TASK [redhat.eap.eap_systemd : Check arguments] ******************************** skipping: [localhost] TASK [redhat.eap.eap_systemd : Set instance name] ****************************** skipping: [localhost] TASK [redhat.eap.eap_systemd : Set instance name] ****************************** skipping: [localhost] TASK [redhat.eap.eap_systemd : Set bind address] ******************************* ok: [localhost] TASK [redhat.eap.eap_systemd : Create basedir /opt/jboss_eap/jboss-eap-7.4//standalone for instance: eap] *** ok: [localhost] TASK [redhat.eap.eap_systemd : Create deployment directories for instance: eap] *** ok: [localhost] TASK [redhat.eap.eap_systemd : Deploy custom configuration] 
******************** skipping: [localhost] TASK [redhat.eap.eap_systemd : Deploy configuration] *************************** ok: [localhost] TASK [redhat.eap.eap_systemd : Include YAML configuration extension] *********** included: /root/.ansible/collections/ansible_collections/redhat/eap/roles/eap_systemd/tasks/yml_config.yml for localhost TASK [redhat.eap.eap_systemd : Create YAML configuration directory] ************ skipping: [localhost] TASK [redhat.eap.eap_systemd : Enable YAML configuration extension] ************ skipping: [localhost] TASK [redhat.eap.eap_systemd : Create YAML configuration directory] ************ changed: [localhost] TASK [redhat.eap.eap_systemd : Enable YAML configuration extension] ************ changed: [localhost] TASK [redhat.eap.eap_systemd : Deploy YAML configuration files] **************** changed: [localhost] => (item=jms_configuration.yml.j2) TASK [redhat.eap.eap_systemd : Check YAML configuration is disabled] *********** skipping: [localhost] TASK [redhat.eap.eap_systemd : Set systemd envfile destination] **************** ok: [localhost] TASK [redhat.eap.eap_systemd : Determine JAVA_HOME for selected JVM RPM] ******* ok: [localhost] TASK [redhat.eap.eap_systemd : Set systemd unit file destination] ************** ok: [localhost] TASK [redhat.eap.eap_systemd : Deploy service instance configuration: /etc//eap.conf] *** changed: [localhost] TASK [redhat.eap.eap_systemd : Deploy Systemd configuration for service: /usr/lib/systemd/system/eap.service] *** ok: [localhost] TASK [redhat.eap.eap_systemd : Perform daemon-reload to ensure the changes are picked up] *** ok: [localhost] TASK [redhat.eap.eap_systemd : Ensure service is started] ********************** included: /root/.ansible/collections/ansible_collections/redhat/eap/roles/eap_systemd/tasks/service.yml for localhost TASK [redhat.eap.eap_systemd : Check arguments] ******************************** ok: [localhost] TASK [redhat.eap.eap_systemd : Set instance eap state to started] 
************** ok: [localhost] TASK [redhat.eap.eap_validation : Validating arguments against arg spec 'main'] *** ok: [localhost] TASK [redhat.eap.eap_validation : Ensure required parameters are provided.] **** ok: [localhost] TASK [redhat.eap.eap_validation : Ensure user eap were created.] *************** ok: [localhost] TASK [redhat.eap.eap_validation : Validate state of user: eap] ***************** ok: [localhost] TASK [redhat.eap.eap_validation : Ensure user eap were created.] *************** ok: [localhost] TASK [redhat.eap.eap_validation : Validate state of group: eap.] *************** ok: [localhost] TASK [redhat.eap.eap_validation : Wait for HTTP port 8080 to become available.] *** ok: [localhost] TASK [redhat.eap.eap_validation : Check if web connector is accessible] ******** ok: [localhost] TASK [redhat.eap.eap_validation : Populate service facts] ********************** ok: [localhost] TASK [redhat.eap.eap_validation : Check if service is running] ***************** ok: [localhost] => { "changed": false, "msg": "All assertions passed" } TASK [redhat.eap.eap_validation : Verify server's internal configuration] ****** included: /root/.ansible/collections/ansible_collections/redhat/eap/roles/eap_validation/tasks/verify_with_cli_queries.yml for localhost => (item={'query': '/core-service=server-environment:read-attribute(name=start-gracefully)'}) included: /root/.ansible/collections/ansible_collections/redhat/eap/roles/eap_validation/tasks/verify_with_cli_queries.yml for localhost => (item={'query': '/subsystem=undertow/server=default-server/http-listener=default:read-attribute(name=enabled)'}) TASK [redhat.eap.eap_validation : Ensure required parameters are provided.] 
**** ok: [localhost] TASK [Use CLI query to validate service state: /core-service=server-environment:read-attribute(name=start-gracefully)] *** TASK [redhat.eap.eap_utils : Ensure required params for JBoss CLI have been provided] *** ok: [localhost] TASK [redhat.eap.eap_utils : Ensure server's management interface is reachable] *** ok: [localhost] TASK [redhat.eap.eap_utils : Execute CLI query '/core-service=server-environment:read-attribute(name=start-gracefully)'] *** ok: [localhost] TASK [redhat.eap.eap_validation : Validate CLI query was successful] *********** ok: [localhost] TASK [redhat.eap.eap_validation : Transform output to JSON] ******************** ok: [localhost] TASK [redhat.eap.eap_validation : Display transformed result] ****************** skipping: [localhost] TASK [redhat.eap.eap_validation : Check that query was successfully performed.] *** ok: [localhost] TASK [redhat.eap.eap_validation : Ensure required parameters are provided.] **** ok: [localhost] TASK [Use CLI query to validate service state: /subsystem=undertow/server=default-server/http-listener=default:read-attribute(name=enabled)] *** TASK [redhat.eap.eap_utils : Ensure required params for JBoss CLI have been provided] *** ok: [localhost] TASK [redhat.eap.eap_utils : Ensure server's management interface is reachable] *** ok: [localhost] TASK [redhat.eap.eap_utils : Execute CLI query '/subsystem=undertow/server=default-server/http-listener=default:read-attribute(name=enabled)'] *** ok: [localhost] TASK [redhat.eap.eap_validation : Validate CLI query was successful] *********** ok: [localhost] TASK [redhat.eap.eap_validation : Transform output to JSON] ******************** ok: [localhost] TASK [redhat.eap.eap_validation : Display transformed result] ****************** skipping: [localhost] TASK [redhat.eap.eap_validation : Check that query was successfully performed.] 
*** ok: [localhost] TASK [redhat.eap.eap_validation : Ensure yaml setup] *************************** included: /root/.ansible/collections/ansible_collections/redhat/eap/roles/eap_validation/tasks/yaml_setup.yml for localhost TASK [Check standard-sockets configuration settings] *************************** TASK [redhat.eap.eap_utils : Ensure required params for JBoss CLI have been provided] *** ok: [localhost] TASK [redhat.eap.eap_utils : Ensure server's management interface is reachable] *** ok: [localhost] TASK [redhat.eap.eap_utils : Execute CLI query /socket-binding-group=standard-sockets/remote-destination-outbound-socket-binding=mail-smtp:read-attribute(name=host)] *** ok: [localhost] TASK [redhat.eap.eap_validation : Display result of standard-sockets configuration settings] *** ok: [localhost] TASK [Check ejb configuration settings] **************************************** TASK [redhat.eap.eap_utils : Ensure required params for JBoss CLI have been provided] *** ok: [localhost] TASK [redhat.eap.eap_utils : Ensure server's management interface is reachable] *** ok: [localhost] TASK [redhat.eap.eap_utils : Execute CLI query /subsystem=ejb3:read-attribute(name=default-resource-adapter-name)] *** ok: [localhost] TASK [redhat.eap.eap_validation : Display result of ejb configuration settings] *** ok: [localhost] TASK [Check ee configuration settings] ***************************************** TASK [redhat.eap.eap_utils : Ensure required params for JBoss CLI have been provided] *** ok: [localhost] TASK [redhat.eap.eap_utils : Ensure server's management interface is reachable] *** ok: [localhost] TASK [redhat.eap.eap_utils : Execute CLI query /subsystem=ee/service=default-bindings:read-attribute(name=jms-connection-factory)] *** ok: [localhost] TASK [redhat.eap.eap_validation : Display result of ee configuration settings] *** ok: [localhost] RUNNING HANDLER [redhat.eap.eap_systemd : Restart Wildfly] ********************* included: 
/root/.ansible/collections/ansible_collections/redhat/eap/roles/eap_systemd/tasks/service.yml for localhost RUNNING HANDLER [redhat.eap.eap_systemd : Check arguments] ********************* ok: [localhost] RUNNING HANDLER [redhat.eap.eap_systemd : Set instance eap state to restarted] *** changed: [localhost] PLAY RECAP ********************************************************************* localhost : ok=127 changed=8 unreachable=0 failed=0 skipped=34 rescued=0 ignored=0 ``` As illustrated in the output above, the YAML definition is now enabled and the configuration of the JBoss EAP running on the target host has been updated. ## Validate the JMS queue deployment As always, we are going to be thorough and verify that the playbook execution has, indeed, properly set up a JMS queue. To do so, we can simply use the JBoss CLI provided with JBoss EAP to confirm: ``` $ /opt/jboss_eap/jboss-eap-7.4/bin/jboss-cli.sh --connect --command="/subsystem=messaging-activemq/server=default/jms-queue=MyQueue:read-resource" { "outcome" => "success", "result" => { "durable" => true, "entries" => ["queues/MyQueue"], "legacy-entries" => undefined, "selector" => undefined } } ``` The output, as shown above, confirms that the server configuration has indeed been updated and that a brand new JMS queue is now available. Since this verification is fairly easy to automate, we will also add it to our playbook. The Ansible collection for JBoss EAP comes with a handy wrapper allowing for the execution of the JBoss CLI within a playbook. So, all that is needed is the inclusion of the task and the desired command, as shown below: ``` post_tasks: - name: "Check that Queue {{ queue.name }} is available." 
ansible.builtin.include_role: name: eap_utils tasks_from: jboss_cli.yml vars: jboss_home: "{{ eap_home }}" jboss_cli_query: "/subsystem=messaging-activemq/server=default/jms-queue={{ queue.name }}:read-resource" ``` # Conclusion Thanks to the Ansible collection for JBoss EAP, we have a minimalistic playbook, spared of all the heavy lifting of managing the Java application server, fulfilling the role of MOM. All the configuration required by the automation concerns only the use case we tried to implement, not the inner workings of the solution (JBoss EAP). The resulting playbook is safely repeatable and can be used to install the software on any number of target systems. Using the collection for JBoss EAP also makes it easy to keep the deployment up to date.
rpelisse
829,369
My SSGNode and his jun-ssg
I looked for people in slack and was suggested a person who also needed a partner. I then found out...
14,641
2021-09-17T08:59:45
https://dev.to/sirinoks/my-ssgnode-and-his-jun-ssg-fbl
opensource
I looked for people in slack and was suggested a person who also needed a partner. I then found out they used Java. So.. here's my problem. I really don't like Java. I tried working with it, since we had a course, I was somewhat excited cause Minecraft, but oh was I wrong. So I tried to figure out that code... And then I saw another person using Java on slack looking for a partner, so I put the two people together and found someone also using node for myself. Maybe it's a problem that I chose to go for a familiar language. However, I would be also okay to read someone's... idk, Go? C++ maybe. I just personally don't like Java. Reviewing someone else's project and having mine reviewed was.... weird. I found it more weird because I am a little ahead of my class, where we would generally have the same courses, but there are none of the people I know in here. So I talked to people I didn't know at all. Sometimes it was hard to communicate, since I wouldn't understand what they meant. But really, that usually happens to me with anyone.. We communicated on problems mostly in slack, but forgot to make issues for them. So we had solved them before even creating an issue. We had to make issues later just to show that we actually did the work. I found his code pretty different. He used packages to solve problems, where I preferred to stay with "vanilla" and stick to what I know. I remember googling and finding the same packages as a solution that I saw in his code, but decided not to use those because of unfamiliarity. It wasn't that I didn't understand his code, but it was rather.. loaded. And [without comments](https://github.com/juuuuuuun/jun-ssg/issues/3) at all. They used like only three functions, grouping more things into a single pack. I instead made functions for every task. I'm not sure if it's a good or a bad thing. I also left comments everywhere, cause... 
I was used to it from my other classes, and I think it generally helps not only other people to read my code, but also myself. We had a bunch of issues. [One](https://github.com/juuuuuuun/jun-ssg/issues/1) of them was the same. Instead of putting text into paragraphs, we instead had everything in one big `<h1>` tag. We had no clue why. It was also funny how two of us randos, with different code, had the exact same issue. I asked on Slack if anyone else did, cause maybe it was that common. I was pointed towards the .split method. So I remembered... I decided to use my own solution instead of the straight answer given to me in slack in the very beginning. So maybe I could go back and use that answer? I still don't really understand it. But it solved the problem! Lucky I guess. We also had one where I couldn't launch my partner's code. I'm not sure what they did, but it got fixed. I have also asked my partner to fix [some](https://github.com/sirinoks/SSGNode/issues/2) [things](https://github.com/sirinoks/SSGNode/issues/3) I knew were problematic, but I didn't know how to do them myself. I had to add him as a collaborator so he could add his own branch. Had some readme issues, [both](https://github.com/sirinoks/SSGNode/issues/1) of [us](https://github.com/juuuuuuun/jun-ssg/issues/2). The final problem was a kind of random [error](https://github.com/sirinoks/SSGNode/issues/4). Google said this might be because of the node version. We checked and mine was older. So I changed it and updated, but it turned out the problem was actually because of a really long console.log which I forgot to remove. I learned how to use data from a json that I saw used in my partner's code. I got better at the whole "issues" thing on github. It was interesting to look at someone else's way to tackle the same problem. I'm not sure what I learned there. I learned to handle unknown problems better. Making a list, not focusing on the far tasks and actually doing the thing. 
I will probably struggle with this again, but it's a step towards getting better at it.
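A minimal sketch of the `.split` fix for the one-big-`<h1>` bug described above — assuming plain-text input where paragraphs are separated by blank lines. The function and sample names are illustrative, not from either of our SSGs:

```javascript
// Split raw text on blank lines, treat the first chunk as the title,
// and wrap the remaining chunks in <p> tags instead of dumping the
// whole file into a single <h1>.
function textToHtml(raw) {
  const chunks = raw
    .split(/\r?\n\r?\n/) // paragraphs are separated by blank lines
    .map((chunk) => chunk.trim())
    .filter((chunk) => chunk.length > 0);

  const [title, ...paragraphs] = chunks;
  const body = paragraphs.map((p) => `<p>${p}</p>`).join("\n");
  return `<h1>${title}</h1>\n${body}`;
}
```

Called on something like `"Title\n\nFirst paragraph.\n\nSecond paragraph."`, this yields one heading plus one `<p>` per paragraph instead of a single `<h1>` containing everything.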
sirinoks
1,717,423
Journey into 2024: Setting Goals for Professional Development and Community Engagement
👋 Hello DEV Community! I'm Mykhailo Sukhostavets, but feel free to call me Misha. I'm a backend...
0
2024-01-04T18:13:32
https://dev.to/mishatech/journey-into-2024-setting-goals-for-professional-development-and-community-engagement-36ad
👋 Hello DEV Community! I'm Mykhailo Sukhostavets, but feel free to call me Misha. I'm a backend developer, and I've recently embarked on an exciting journey with Kraken Technologies, part of the Octopus Energy Group. 🐙 My daily work revolves around TypeScript and Node.js, crafting efficient and scalable backend solutions. Last year, I ventured into open-source for the first time, an experience I found incredibly rewarding! This year's goals: - **Professional Mastery:** I'm diving into Vim motions and Zig language to sharpen my coding skills. 🚀 - **Community Building:** Growing my LinkedIn network and starting to write articles to share knowledge and experiences. 📝 - **Mentorship:** After successfully mentoring a colleague last year, I'm eager to guide another aspiring tech enthusiast towards their dream job. 👨‍💻 I'm thrilled to join this vibrant community, where I look forward to exchanging ideas, learning, and growing together. Let's make 2024 a year of remarkable achievements and meaningful connections! What are your goals for this year? Let's start a conversation! 🌟 Cheers, Misha
mishatech
1,717,456
[PITCH] I created an app that will help you with your NEW YEAR'S RESOLUTIONS
Hello devs, everything ok? Last year, for me, was the year of achieving almost all the goals I had...
0
2024-01-04T18:39:20
https://dev.to/joaolandino/pitch-i-created-an-app-that-will-help-you-with-your-new-years-resolutions-3pj9
Hello devs, everything ok? Last year, for me, was the year of achieving almost all the goals I had for 2023! It was very gratifying to reach the end of the year and see that I accomplished almost all of them, and the ones that I didn't accomplish were just a short time away. But I confess that I missed something that I could use to track these goals, and as a good dev I went and developed my own, haha. I created an app that, in addition to letting you record what your goal is, lets you break it down into smaller habits that will help you reach it. For example: you want to read 15 books in a year. That's your goal. But to do this you need to have the habit of reading at least 4 times a week. That's the idea behind the app 😎 Available for Android: https://play.google.com/store/apps/details?id=com.landino.habits iOS: https://apps.apple.com/us/app/h%C3%A1bitos-e-metas/id6474440359 Hope you like it :) Follow me on this journey: https://twitter.com/landino_tech https://www.instagram.com/landino.tech
joaolandino
1,717,493
REACT: What exactly is a single-page application?
A Single Page Application (SPA) is a type of web application that loads once and dynamically updates...
25,939
2024-01-05T08:30:00
https://dev.to/yashrai01/reactwhat-is-single-page-application-i1
A **Single Page Application** (SPA) is a type of **web application that loads once and dynamically updates content as users interact, eliminating the need for full page reloads**. It provides a smoother, more responsive user experience by leveraging JavaScript for seamless navigation. In traditional websites, clicking a link triggers the browser to fetch a new page from the server, resulting in a full reload. If you've experienced early versions of websites like Facebook, you might recall that actions such as 'liking' required a reload to see the impact. - SPAs load once, and subsequent interactions dynamically update content. - JavaScript is used to manipulate the page's content without requiring a complete reload. - SPAs provide a smoother and more interactive user experience by eliminating the need for frequent page refreshes. - Actions like 'liking' become instantaneous, enhancing the overall flow of the application. Popular SPAs: Gmail, Google Maps, Airbnb, Netflix, Pinterest, PayPal, and many more. Frameworks for building SPAs: React, Angular, Vue.js
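The mechanic described above — JavaScript swapping content in place instead of the browser fetching a new document — can be sketched with a tiny hash-based route table. This is an illustrative, framework-free sketch (all names are made up); React, Angular, and Vue.js ship far more capable routers:

```javascript
// A route table maps URL hashes to the content of each "view".
const routes = {
  "#/": "<h1>Home</h1>",
  "#/inbox": "<h1>Inbox</h1>",
};

// Pick the content for the current hash; unknown hashes fall back to a
// not-found view. No request for a new page is ever made.
function resolveRoute(routes, hash) {
  return routes[hash] ?? "<h1>Not found</h1>";
}

// In a browser you would wire this to navigation, e.g.:
// window.addEventListener("hashchange", () => {
//   document.getElementById("app").innerHTML =
//     resolveRoute(routes, location.hash);
// });
```

Clicking a link that changes only the hash fires `hashchange`, JavaScript swaps the content, and the page never reloads — which is why actions like 'liking' feel instantaneous.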
yashrai01
1,717,498
Patch-Package to the Rescue! 🛠️✨
👋 Hello everyone! I'm back after a long time haha. This time I'm going to tell you something that I...
0
2024-01-04T20:04:52
https://dev.to/antoomartini/patch-package-to-the-rescue-5flf
javascript, reactnative, mobile, beginners
👋 Hello everyone! I'm back after a long time haha. This time I'm going to tell you something that I apply in my day-to-day work. As part of the React Native developer community, I often face challenges when working with third-party libraries, especially when aligning them with the latest updates or changes in React Native. In such cases, modifications become necessary. This article presents patch-package as a very valuable tool 🛠️, which allows React Native developers to work smoothly and overcome these obstacles when modifications to third-party dependencies are needed. And I will do so by telling a real case 📖. I was developing with React Native and incorporated the **react-native-swipe-gestures** library to detect user gestures for expanding or collapsing a customizable modal I created. But I found a problem during testing on iOS: after implementing swipe gestures, the vertical scrolling within the modal, which had a ScrollView component as a child, stopped working as intended. The goal was to have a modal that expanded or collapsed based on the user's swipe direction. However, the unintended consequence was the disruption of the vertical scrolling functionality within the ScrollView nested inside the modal. This made it impossible for users to scroll the content vertically, impacting the overall user experience. So, let's do some magic ✨ I found a temporary solution on [GitHub with patch-package](https://github.com/glepur/react-native-swipe-gestures/issues/78). How did I solve it? Firstly, ensure you have patch-package installed in your project. You can install it via npm: ``` npm install patch-package --save-dev ``` _Creating a Patch_ First thing to do: identify the issue. You have to locate the bug or the change you want to make within your dependency. 
_Make Changes:_ **Once you've identified the issue, make the necessary changes directly in the dependency's code within node_modules.** I made this change in the dependency: ![Changes in the dependency](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/hclubp4gheb8zzifzr6t.png) _Generate Patch_ After making your changes, run the following command to generate a patch file: ``` npx patch-package <package-name> ``` Replace `<package-name>` with the name of the package you modified. In my case, I ran: ``` npx patch-package react-native-swipe-gestures ``` This command will create a .patch file inside a patches directory in your project's root folder. In my case, the results were: ![Result of applying the patch command](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/1bt49jfjcy9gnsrb8lsg.png) _Applying the Patch_ To apply the patch during installation or deployment, add a script to the _scripts_ section of your _package.json_ that applies patches after installing or before building your project: ``` "scripts": { "postinstall": "patch-package" } ``` _Run Postinstall Script_ Whenever you run npm install, the postinstall script will automatically apply the patches located in the patches directory. Easy, isn't it? patch-package streamlines the process of applying temporary fixes to third-party modules, allowing developers to overcome obstacles without waiting for official updates. 🛠️ On the other hand, I would like to stress the importance of thorough testing on different environments and devices: in my case, things worked perfectly on Android but not on iOS 😞
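For reference, the file that `npx patch-package` writes under `patches/` is an ordinary unified diff against the installed copy of the package, named like `react-native-swipe-gestures+<version>.patch`. A rough, illustrative sketch — the file path and the hunk below are placeholders, not the actual fix from the issue thread:

```diff
--- a/node_modules/react-native-swipe-gestures/index.js
+++ b/node_modules/react-native-swipe-gestures/index.js
@@ -1,4 +1,4 @@
-// the original line as published by the package
+// your edited line, exactly as you changed it inside node_modules
```

Because it is plain text, the patch can be committed to your repo and code-reviewed like any other change, which is what makes this approach safer than hand-editing node_modules on each machine.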
antoomartini
1,717,544
Getting to Know You - Speeding up Developer Onboarding with LLMs and Unblocked
As anyone who has hired new developers onto an existing software team can tell you, onboarding new...
0
2024-01-08T14:03:55
https://dzone.com/articles/getting-to-know-you-speeding-up-developer-onboardi
ai, coding, productivity, github
As anyone who has hired new developers onto an existing software team can tell you, onboarding new developers is one of the most expensive things you can do. One of the most difficult things about onboarding junior developers is that it takes your senior developers away from their work. Even the best hires might get Imposter Syndrome, since they feel like they need to know more than they do and need to depend on their peers. You might have the best documentation, but it can be difficult to figure out where to start with onboarding. Onboarding senior developers takes time and resources as well. With the rise of LLMs, it seems like putting one on your code, documentation, chats, and ticketing systems would make sense. The ability to converse with an LLM trained on the right dataset would be like adding a team member who can make sure no one gets bogged down with sharing something that’s already documented. I thought I’d check out a new service called [Unblocked](https://docs.getunblocked.com/) that does just this. In this article, we will take a spin through a code base I was completely unfamiliar with and see what it would be like to get going on a new team with this tool. ### Data Sources If you’ve been following conversations around LLM development, then you know that they are only as good as the data they have access to. Fortunately, Unblocked allows you to connect a bunch of data sources to train your LLM. Additionally, because this LLM will be working on your specific code base and documentation, it wouldn’t even be possible to train it on another organization’s data. Unblocked isn’t trying to build a generic code advice bot. It’s personalized to your environment, so you don’t need to worry about data leaking to someone else. Setting up is pretty straightforward, thanks to lots of integrations with developer tools. After signing up for an account, you’ll be prompted to connect to the sources Unblocked supports. 
![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/14fd757lqj69ohxed8nw.png) You'll need to wait a few minutes or longer depending on the size of your team while Unblocked ingests your content and trains the model. ### Getting started I tried exploring some of the features of Unblocked. While there’s [a web dashboard](https://docs.getunblocked.com/productGuides/dashboard.html) that you’ll interact with most of the time, I recommend you install [the Unblocked Mac app](https://docs.getunblocked.com/productGuides/mac.html), also. The app will run in your menu bar and allow you to ask Unblocked a question from anywhere. There are a bunch of other features for teammates interacting with Unblocked. I may write about those later, but for now, I just like that it gives me a universal shortcut (Command+Shift+U) to access Unblocked at any time. Another feature of the macOS menu bar app is that it provides a quick way to install [the IDE Plugins](https://docs.getunblocked.com/productGuides/ide.html) based on what I have installed on my machine. Of course, you don’t have to install them this way (Unblocked does this install for you), but it takes some of the thinking out of it. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/dbq83llsw2fmw6jbo8dp.png) ### Asking Questions Since I am working on a codebase that is already in Unblocked, I don’t need to wait for anything after getting my account set up on the platform. If you set up your code and documentation, then you won’t need your new developers to wait either. Let’s take this for a spin and look at what questions a new developer might ask the bot. I started by asking a question about setting up the frontend. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/crvmdys1ots795odzrle.png) This answer looks pretty good! It’s enough to get me going in a local environment without contacting anyone else on my team. 
Unblocked kept everyone else “unblocked” on their work and pointed me in the right direction all on its own. I decided to ask about how to get a development environment set up locally. Let’s see what Unblocked says if I ask about that. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/qdjrwb9q40p5e6q15g1w.png) This answer isn’t what I was hoping for … but I can click on the README link and find that this is not really Unblocked’s fault. My team just hasn’t updated the README for the backend app, and Unblocked found the incorrect boilerplate setup instructions. Now that I know where to go to get the code, I’ll just update it after I have finished setting up the backend on my own. In the meantime, though, I will let Unblocked know that it didn’t give me the answer I hoped for. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/0vt8a2bs8mno612daxwd.png) Since it isn’t really the bot’s fault that it’s wrong, I made sure to explain that in my feedback. I had a good start, but I wanted some more answers to my architectural questions. Let’s try something a little more complicated than reading the setup instructions from a README. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/9y04gvkj01mt574rniva.png) This is a pretty good high-level overview, especially considering that I didn’t have to do anything, other than type them in. Unblocked generated these answers with links to the relevant resources for me to investigate more as needed. ### Browse the code I actually cloned the repos for the frontend and backend of my app to my machine and opened them in VS Code. Let’s take a look at how Unblocked works with the repos there. As soon as I open the Unblocked plugin while viewing the backend repository, I’m presented with recommended insights asked by other members of my team. 
There are also some references to pull requests, Slack conversations, and Jira tasks that the bot thinks are relevant before I open a single file. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/k6c2ql19w0j781j05hwg.png) This is useful. As I open various files, the suggestions change with the context, too. ### Browse components The VS Code plugin also called out some topics that it discovered about the app I'm trying out. I clicked on the Backend topic, and it took me to the following page: ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/yy2dhoyetb8jsx3932mz.png) All of this is automatically generated, as Unblocked determines the experts for each particular part of the codebase. However, experts can also update their expertise when they configure their profiles in your organization. Now, in addition to having many questions I can look at about the backend application, I also know which of my colleagues to go to for questions. If I go to the Components page on the Web Dashboard, I can see a list of everything Unblocked thinks is important about this app. It also gives me a quick view of who I can talk to about these topics. Clicking on any one of them provides me with a little overview, and the experts on the system can manage these as needed. Again, all of this was automatically generated. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/eu7kiy1krxx4niwgxa4f.png)
mbogan
1,717,600
Recover your stolen assets or scammed cryptocurrency
My heart pounded like a jackhammer against my ribs as I stared at the empty wallet on my screen. My...
0
2024-01-04T22:49:12
https://dev.to/victoriasideris20/recover-your-stolen-assets-or-scammed-cryptocurrency-3c5c
My heart pounded like a jackhammer against my ribs as I stared at the empty wallet on my screen. My life savings, meticulously converted into Bitcoin, had vanished. Gone. Eradicated. Like a ghost in the machine, my crypto had slipped into the digital abyss, leaving me with nothing but a gnawing emptiness and a rising tide of panic. Days bled into nights as I scoured the internet, desperately clinging to any shred of hope. Forums offered conflicting advice, recovery companies demanded exorbitant fees, and the official channels proved frustratingly unhelpful. Just as despair threatened to engulf me, I stumbled upon a beacon in the darkness: Pro Wizard Gilbert Recovery. Their website (https://prowizardgilbertrecovery.xy), infused with a reassuring air of professionalism, spoke of cutting-edge techniques and a compassionate approach. Hesitantly, I reached out, pouring my tale of woe into their digital ear. To my surprise, a prompt and empathetic response arrived, outlining their process and offering a glimmer of hope. The initial consultation was a revelation. Unlike the robotic interactions of other companies, Pro Wizard Gilbert Recovery engaged with me as a fellow human, understanding the emotional toll of my loss. They patiently explained their methods, never once resorting to technical jargon or inflated promises. Their recovery process itself was a masterclass in digital sleuthing. They employed a combination of advanced blockchain analysis, forensic tracing techniques, and, as they jokingly called it, "good old-fashioned detective work." Each step was meticulously documented, and I was kept informed at every turn, the fear gradually giving way to a cautious optimism. A few days later, the news I had been fervently hoping for finally reached me: my crypto had been discovered! I cannot describe the immense delight that swept over me. It was like discovering a missing limb—a portion of my soul came back. 
Pro Wizard Gilbert Recovery not only helped me get my cryptocurrency back, but they also gave me new hope for the internet. Their fees, while not insignificant, were transparent and fair, a testament to their expertise and dedication. But more importantly, it was their unwavering support and genuine empathy that truly set them apart. They weren't just retrieving lost coins; they were mending broken trust and restoring hope. My experience with Pro Wizard Gilbert Recovery is a testament to the power of human ingenuity and compassion in the face of digital adversity. If you, like me, have ever found yourself lost in the labyrinthine depths of a crypto crisis, I urge you to reach out. These digital wizards might just be the heroes you've been searching for. Write them On; prowizardgilbertrecovery(@)engineer.com & Telegram: @Pro_Wizard_Gilbert_Recovery
victoriasideris20
1,717,608
2023 Clock Out: My Top Lessons Learned
Was is just me or did the vibes this holiday season just feel a little off? Based on my completely...
0
2024-01-04T23:09:13
https://dev.to/tdesseyn/2023-clock-out-my-top-lessons-learned-55d0
Was it just me or did the vibes this holiday season just feel a little off? Based on my completely non-scientific way of asking people, "Did 2023 kick your ass, too?" I've come to the conclusion that we all over-stressed and worked way too much, all the way down to the wire in December. So here we are in January— still exhausted, confused, and somehow having the longest short workweek of our career. Now I could start in on a whole list of resolutions to make 2024 #myyear, but instead let's talk about the things 2023 taught us… besides using "Yes, chef" as a default response and that Swifties could, if mobilized, take over the world.  To those of you who missed seeing my face over the past month, you can check out my solo show [here](https://www.linkedin.com/posts/taylordesseyn_what-i-learned-in-2023-solo-show-activity-7148334667920838657-aMkj?utm_source=share&utm_medium=member_desktop). If you want to get right to my list of fails that led to "learning experiences" in 2023, here you go: **Trust your gut.**  As some of you already know, this year I left my job of 12 years and moved to a new position at gun.io. But if I'm being honest with myself, like really honest, that move was around three years overdue. I'd been thinking for a while it was time to leave, but fear and uncertainty can sometimes make you stay in situations that are no longer serving you. So when I say to trust your gut, I mean that at the end of the day only you know what's best for yourself. And if the vibes are off, trust yourself and your capabilities that it's time to quit wasting time and move on.  **I need to prioritize healthy time management.**  I went into 2023 as most of you did with a moderately strict daily work schedule. I could tell you what I should be doing or working on down to the minute. But every day I was just hitting that 2:00 afternoon wall or, even worse, I was leaving work already fully mentally spent and unable to enjoy the rest of my day. 
So I started planning rest and breaks into my day. Currently, I'm taking a longer lunch to give myself time to slow down and reset so that I can give my full attention at work and at home. Maybe this will work for you too, or maybe you can find another spot in your day to incorporate a sizable break. But I know that if you don't make an effort to protect it and your mental health, you're just going to be continually dealing with the biggest workplace villain besides the sound of someone unexpectedly calling you on Teams… burnout.  **Community is everything.** I'm not going to be pedantic by defining community and how it can help you. I will say that I've leaned on my community so much this year and it's led to only great things and great connections. So use that information how you will.  **Have more confidence in yourself.** In 2023, I realized that I'm confident in the places you normally see me— in front of a camera, with a mic, in your inbox. But I've had to own that I'm not so confident when it comes to things at home. And especially when it comes to blending my work and home life. It seems counterintuitive to say, find the confidence to talk about your lack of confidence. Maybe for you that looks like a tweet or talking to someone or reading a couple books. Whatever it takes to confront the things that are holding you back from being your best self.  **I've got anxiety.**  I mean don't we all. I think it comes free with your subscription to being human. But when it comes to work, feeling like you're always in this fluttery panic of "get it done" or "I need to push harder" or "I need to miss out on something to handle this" isn't really doing me any favors. Work is always going to be there. No matter if you stay up all night working on something or slam your laptop shut until Monday. If your work anxiety is affecting your happiness, find a counselor to talk to. Can confirm from personal experience that it helps.  
**There's a lot of people who know how to talk, but not many who know how to execute.** The easiest way to stand out at your job is to just get things done. We all know plenty of people who get lost in the sauce of strategizing, pontificating, making diagrams, pulling numbers, and scheduling meetings to talk about scheduling the next meeting. Be the person that actually gets something done. You'll stand out, I promise.  **Life goes by like hella fast.** I'm not going to slap in a Ferris Bueller quote here, but y'all (looking at myself in the mirror) have got to slow down. Slow down, show up, and be present in your life. If you don't stop and look around once in a while, you could miss it. **You're being chased by demons. Stop running.** Not like actual demons, more like a giant rolling boulder of all your previous life burdens that's trailing behind you at increasing speed because you've never addressed them or worked to resolve them in any way. 2023 me was just the result of a life constantly performing the version of myself that other people wanted me to be. How'd I work on that? Counseling, my friend. Go make an appointment now.  **Invest in your mental, physical, and spiritual health.**  My wife and I were making our 2024 budget and we realized two things: 1) we spend a lot of money on the constant upkeep of these things and 2) we're going to have to cut back on the iced coffee budget in 2024 to be able to afford them. Totally worth it to not feel like a human slug with anxiety.  **You know more than you think.** Even if you've been doing the same job for 12 years and you're really crazy nervous about making the career leap to the next thing— do it. You might just surprise yourself with your skills that have never been appreciated or noticed before.  **Define what's important at the end of the day.**  For me, that's the support of my family. It's my wife being proud of me and my daughter being happy and giving me a hug after a day of work. 
I’ve been making it so complicated for years, but I’m starting to realize how simple it really is. 
tdesseyn
1,717,634
Day 888 : Come Close
liner notes: Professional : Whew.. what a day! Got up super early to catch a meeting that was set...
0
2024-01-05T00:04:27
https://dev.to/dwane/day-888-come-close-5c1c
_liner notes_: - Professional : Whew.. what a day! Got up super early to catch a meeting that was set so folks in the Asia timezones could attend. Went back to sleep. Woke up in time to attend another meeting. Worked on my proof of concept. Had another meeting. Then went back to working on my POC. I've come close a couple of times to getting this one feature done, but one thing is messing it up. I'll figure it out. - Personal : Last night, I got the keyboard interaction for the front page of my side project working the way I want. I also went through and picked up some tracks on Bandcamp and created the social media posts I'll post tomorrow. Ended the night watching an episode of "Zom 100". ![A close up of a bunch of dark blue grapes hanging off a twig in Bourgogne, France](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/hs026fch82hx57wbx5zm.jpg) Going to spend like another hour to see if I can get the bug fixed for my POC so that I can finish it up tomorrow. Then I want to put together the tracks for the radio show. Keep my one day lead for working on the radio show going. Want to refactor my side project's front end carousel to be in the format that I want so I can start to do some scroll driven animation on the cards. Got like one more episode of "Zom 100" before I catch up, so I'll probably end the night with that. Going to eat dinner. Have a great night! peace piece Dwane / conshus https://dwane.io / https://HIPHOPandCODE.com {% youtube K-cIoM8qx4U %}
dwane
1,717,780
The impact of 3D animation company in 2024
The animation sector is ever-evolving, and staying informed about the most recent trends is essential...
0
2024-01-05T04:27:06
https://dev.to/ankitaacadereality/the-impact-of-3d-animation-company-in-2024-4j
The animation sector is ever-evolving, and staying informed about the most recent trends is essential for both enthusiasts and professionals. The increasing affordability of 3D animation services and advancements in 3D animation technology are key elements contributing to the growth of the 3D animation marketplace. As more experts in 3D animation companies emerge at an international level, and solutions are sold across borders, the usage and applications of **[3D animation services](https://www.acadereality.com/animation-services/)** are expected to surge in the coming years. Rising customer expectations and the growing penetration of the internet are leading more businesses across sectors to adopt 3D animation for overall impact, aesthetic appeal, and improved visibility. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/0lt2ftn1txwkmjxdia7y.jpg) Methods such as 3D modeling are rising in demand because they create objects modeled on real-world references, allowing greater maneuverability of those objects in virtual spaces. **3D Animation Market Demand** The growing integration of live-action footage with 3D animation to make standard animation bolder, livelier, and more colorful is enhancing market growth for 3D animation. The advent of streaming services such as Amazon Prime Video and Netflix is expanding the animation sector, and the demand for cutting-edge animation is surging, further driving market growth. The rising acceptance of animated characters is increasing the demand for 3D animation in brand campaigns and marketing, especially on social media platforms, which is contributing considerably to the international 3D animation market. The growing appeal of hand-made and aesthetic products, particularly among the Gen-Z population and millennials, is leading to a surge in demand for emotive visual content. This in turn is growing the demand for 3D animation services, which is expected to boost market growth. 
**3D Animation Market Growth** The growing demand for 3D animation in academics and education is helping the overall 3D animation market grow. The adoption of 3D animation on eLearning platforms, enabling the use of visual representation for learning, is boosting the market for 3D animation. The application of 3D animation supports learners' understanding, which is fuelling the progress of the market. Also, the growing demand for affordable animation is increasing the demand for whiteboard animation. The growing use of 3D animation in certified training is also assisting the growth of the 3D animation market. As virtual learning platforms such as Udemy, edX, and Coursera gradually become more prevalent, motion graphic animated videos are being used widely, which offers a strong impetus to market growth for 3D animation services. **The Increasing Popularity of 3D Animation** 3D animation services have gained significant popularity recently, remaining a key trend in the animation sector. Their capability to create dynamic environments and lifelike characters has garnered an international audience. Technological advancements have made 3D animation more accessible, allowing animators to produce top-quality content that pushes creative boundaries. The demand for visually stunning and immersive animated content across video games, TV shows, and films fuels the popularity of 3D animation services. Viewers are drawn to the detailed and realistic visuals, which improve their overall viewing experience. Moreover, 3D animation has found applications beyond entertainment in sectors such as medical visualization, engineering, and architecture. **International 3D Animation Market Key Players** Top 3D animation companies are listed below, considering their market shares, capacity, and latest developments such as mergers and acquisitions, plant turnarounds, and capacity expansions:

- Adobe Inc.
- Autodesk Inc.
- Corel Corporation
- Nvidia Corporation
- Zco Corporation
- NewTek Inc.
- The Foundry Visionmongers Ltd
- Maxon Computer GmbH
- Anifex
- Rip Media Group

**Benefits of 3D Animation**

1. Lifelike Characters and Realistic Renderings: 3D animation services allow for a level of visual immersion and realism previously unattainable.
2. Superior Storytelling Capabilities: The flexibility of 3D animation enables visually stunning and complex environments, enhancing storytelling.

Despite challenges such as high production costs and a steep learning curve, 3D animation services have become a vital tool for storytellers, artists, and various sectors. **Wrapping up** The animation sector is in a constant state of evolution, and in 2024, major trends such as the rise of visual effects and digital animation, the revival of stop-motion animation, and the increasing popularity of 3D animation services are shaping the landscape. Visual effects and digital animation provide extraordinary creative possibilities, while 3D animation fascinates users with lifelike characters. The rebirth of stop-motion animation celebrates authenticity and craftsmanship in storytelling. Whether through traditional stop-motion, motion graphics, or computer animation, the animation sector continues to reshape visual experiences and captivate audiences. Stay informed, witness the ever-expanding realm of animated creativity, and embrace evolving technology.
ankitaacadereality
1,717,815
Practice Optimizing Prompts
The objective here is to begin your journey and build confidence in your ability to master AI in the...
0
2024-01-05T06:04:02
https://dev.to/mitul3737/practice-optimizing-prompts-10ho
ai, promptengineering
The objective here is to begin your journey and build confidence in your ability to master AI in the future. **_Step 1: Choose Your Scenario_** The following are two scenarios where you might want the support of AI. Choose the one that best suits your interest, or think of your own real-world scenario. **Scenario A: Content Creation** You, as a member of our marketing team, are tasked with creating a blog post about our company's [latest product release]. The blog post is due in three days. Your mission is to provide essential information and product details, defining your target audience. To help expedite the process, you plan to generate an initial draft of the post. **Scenario B: Meeting Preparation** You are a project manager at [company/department] and you have a crucial [project update] meeting with the executive team scheduled for tomorrow. To help create a concise and impactful presentation that showcases the project's success and addresses any potential issues, you would like to prepare using support from AI. You plan to prompt the AI chatbot to generate the following: a summary of the project's progress, key milestones, and potential challenges. _**Step 2: Trial with your initial prompt**_ Once you have signed up for one of the AI chatbot platforms (e.g., ChatGPT, Google Bard, or Microsoft Bing), start with your selected scenario and think about how you would pose your first prompt. This is your chance to start experimenting! You might need to customize the scenario and add more details in the placeholders. Just remember to consider data privacy and avoid sharing sensitive information. Enter your prompt into the chatbox. Pay attention to the AI response and make note of how it responded to your initial input. Was it clear? On-point? Did it meet your expectations? _**Step 3: Revise your prompt**_ Based on the AI response, review and revise your prompt. You might need to clarify or add additional context to improve the AI's responses. 
Enhance your prompt by specifying your intent and defining the required tone or formatting. Moreover, you can provide examples to delineate the desired structures or outcomes. In other words, you are teaching the AI to tailor its responses according to your question or instruction. Repeat this step as many times as needed. Example: ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/4ir483tv9764whkk07za.png) _**Step 4: Compare and reflect**_ After multiple rounds of interaction, compile all the prompts you have used and the responses you have received (AI chatbots may archive the conversations automatically for you). Analyze the changes made and observe how the AI has responded differently. Reflect on your learning throughout this process. How has each revision improved the outcome? Is the final result more precise, relevant, and in line with what you anticipated? Document your observations, as this will be an important resource in learning to optimize AI interactions in your future tasks, and you may share your thoughts in the discussion forum.
mitul3737
1,718,176
"Day 4: Excel Essentials Unveiled - Sharing Today's Insights on My Learning Adventure! 📊🚀 #ExcelSkills #LearningJourney"
EXCEL - 3 OVERVIEW FORMULA &amp; FUNCTION FORMULA is an equation which we create for our...
0
2024-01-05T12:47:45
https://dev.to/nitinbhatt46/day-4-excel-essentials-unveiled-sharing-todays-insights-on-my-learning-adventure-excelskills-learningjourney-9mk
EXCEL - 3 OVERVIEW

**FORMULA & FUNCTION**

- FORMULA: an equation which we create for our calculation.
- FUNCTION: a predefined formula.

EXCEL: If you are not getting the correct answer from a formula, try to evaluate it with the help of Evaluate Formula in the Formulas tab.

Google Sheets: In Google Sheets there is no equivalent option.

There are 12 categories of Excel functions. We will only learn those which are important for DATA ANALYSIS (our specific domain).

General:

- SUM
- AVERAGE
- COUNT
- IF
- VLOOKUP
- HLOOKUP
- INDEX and MATCH
- SUMIF and SUMIFS
- COUNTIF and COUNTIFS
- IFERROR
- CONCATENATE or CONCAT
- TEXT
- LEFT, RIGHT, MID
- DATEDIF

Data cleaning formulas:

- TRIM
- PROPER
- LOWER and UPPER
- CLEAN
- SUBSTITUTE
- LEFT, RIGHT, MID
- LEN
- CLEAN and TRIM combination
- IF and ISNUMBER
- IF and ISTEXT
- IF and ISBLANK
- TEXT and DATEVALUE

Data manipulation:

- CONCATENATE or CONCAT
- LEFT, RIGHT, MID
- LEN
- FIND and SEARCH
- SUBSTITUTE
- IF
- IFERROR
- TEXT
- CONVERT
- TRANSPOSE
- SUMIF and SUMIFS
- INDEX and MATCH
- VLOOKUP
- IF, ISNUMBER, and SEARCH combination
- DATE and TIME functions
- CONVERT function
- RANK
- ROUND
- TEXTJOIN
- IF, ISBLANK, and IFERROR combination

Statistics-based formulas:

- AVERAGE
- MEDIAN
- MODE
- STDEV.P and STDEV.S
- VAR.P and VAR.S
- CORREL
- COVAR
- RANK.EQ and RANK.AVG
- PERCENTILE.INC and PERCENTILE.EXC
- QUARTILE.INC and QUARTILE.EXC
- Z.TEST
- T.TEST
- F.TEST
- NORM.S.DIST and NORM.DIST
- NORM.INV

EXCEL: If you don’t know the name of an option, go to the Insert Function option and write a description of the formula inside the box; you will get the name of the formula.

Google Sheets: In Google Sheets there is no equivalent option.

Function explained:

- Step 1: Every formula starts with “=”.
- Step 2: Write the formula name, like SUM, MAX, etc.
- Step 3: Open brackets (), which take input as per the formula. While writing a formula you will get suggestions, in which square brackets [ ] mean that the parameter is optional to write. 
Other parameters are compulsory to give in order to get the desired output. Example: `=SUM(value1, [value2], …)` — the bracketed parameters are optional. LEARNING IS AN EVERYDAY PROCESS, so I will be updating previous learning with new tips and tricks. Follow me here, where something new will be added every day if I learn more about it: https://dev.to/nitinbhatt46 THANK YOU
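To illustrate the syntax described above, a few formulas might look like this (the cell ranges here are made-up examples, not from the original post):

```
=SUM(A1:A10)                    required first argument; adds A1 through A10
=SUM(A1, B1, C1)                the extra arguments fill the optional [value] slots
=VLOOKUP(D2, A1:B10, 2, FALSE)  the fourth argument, range_lookup, is one of the optional [ ] parameters
```

Typing `=SUM(` in a cell shows the same `[value2], …` hint described in Step 3, so you can check which parameters are optional directly in the suggestion tooltip.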
nitinbhatt46
1,717,820
Afzal Hosen Mandal on
&lt;div class="container"&gt; &lt;h1&gt;স্টাইলিশ এইচটিএমএল এবং সিএসএস কোড&lt;/h1&gt; &lt;p&gt;এই...
0
2024-01-05T06:14:22
https://dev.to/afzalqwe/afzal-hosen-mandal-on-2jlk
javascript, webdev
```html
<div class="container">
  <h1>Stylish HTML and CSS Code</h1>
  <p>This webpage was built with HTML and CSS code.</p>
  <button class="btn">Learn more</button>
</div>
```
afzalqwe
1,717,844
How to Use ChatGPT to Improve Workflow Efficiency
ChatGPT has been making waves since its launch. Its exceptionally advanced natural language...
0
2024-01-05T06:27:43
https://dev.to/rafikke_lion/how-to-use-chatgpt-to-improve-workflow-efficiency-4ec8
chatgpt, ai, workflow, workplace
ChatGPT has been making waves since its launch. Its exceptionally advanced natural language capabilities have ignited excitement around AI in just about every field imaginable, including project management. Personally, I have found [ChatGPT](https://openai.com/blog/chatgpt) to be a game-changer as it greatly enhances my productivity across various tasks such as: - Writing content - Summarizing complex information - Generating images - Writing code However, despite its many benefits as a generative AI model, I don’t actually recommend using ChatGPT for work purposes. **Some of the limitations of ChatGPT are:** - Anything entered into its chat interface can become part of its future training data. This means you could inadvertently expose confidential business information, trade secrets, and other sensitive content through your ChatGPT interactions. - You could accidentally expose customer data which could violate laws such as the GDPR, HIPAA, and more. This is especially dangerous for companies in industries like finance and healthcare. Another reason why we don’t use ChatGPT all that much is because you have to provide it with a lot of information and context. You have to explain a lot of things about your business before it can give you a quality result. This can be time-consuming and frustrating. **To overcome these shortcomings of ChatGPT, we use [monday dev](https://monday.com/dev/?utm_source=devto&utm_medium=organic&utm_campaign=how_to_use_ChatGPT_to_improve_workflow_efficiency&utm_term=queueVTA_link_1).** It is a project management tool created by monday.com for software development teams – but it also has [monday AI](https://monday.com/ap/ai/?utm_source=devto&utm_medium=organic&utm_campaign=how_to_use_ChatGPT_to_improve_workflow_efficiency&utm_term=queueVTA_link_2) built in! 
![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/fytdjpm8v3466sosgu8u.png) monday AI's versatility as an assistant is unmatched because it has access to all your workspace data - it can quickly generate content like blog posts or emails while seamlessly integrating with existing data. And it’s secure – which is super important for enterprise teams (and any company really) dealing with sensitive data, and that needs to remain compliant with data handling laws. **Below are some of the ways monday AI is improving our workflow…** - **Operations:** Writing internal knowledge base articles, composing incident summaries, and calculating yearly budgets. Assisting in creating compelling content, including blog articles and ad copy. - **Product management:** Creating requirement templates, summarizing retrospective notes, and preparing release plans. Summarizing kickoff meetings, breaking down projects into tasks, and creating meeting agendas. - **Marketing:** Writing blog posts, preparing campaign strategies, and creating content briefs. - **Human resources:** Generating ideas for interview questions, summarizing HR calls, and creating onboarding plans. - **Sales:** Composing emails to prospects, preparing Go-To-Market (GTM) strategies, and summarizing call transcripts. - **Task generation:** Quickly turning project ideas into actionable tasks. - **Summarization:** Providing quick summaries of lengthy topics. - **Formula builder:** Assists in crafting complex formulas that are suitable for both experts and beginners. **Unlike ChatGPT, which requires some training or input to function optimally, monday AI effortlessly works with the data you already have.** I definitely recommend **incorporating AI tools like this into your product and project management.** In my eyes, the power of AI is unlocked when you integrate it into the applications you already use, not from prompting and prodding in ChatGPT. 
Has ChatGPT improved your productivity and workflow? What about a tool you already use that has been upgraded with AI capabilities? If yes, how? Let me know in the comments below.
rafikke_lion
1,717,902
Nginx and Keycloak: A Perfect Pair for Gateway Security
In today’s fast-paced digital landscape, ensuring robust security at every point of user interaction...
0
2024-01-05T07:38:04
https://dev.to/sergey-dudik/nginx-and-keycloak-a-perfect-pair-for-gateway-security-3ief
In today’s fast-paced digital landscape, ensuring robust security at every point of user interaction is paramount. While there are numerous tools available to fortify our applications, finding the perfect synergy between them can be a challenge. Enter the dynamic duo: Nginx and Keycloak. When paired together, these powerful technologies provide an unparalleled security solution for your gateway. Nginx, known for its high-performance and scalability, combined with the robust authentication and authorization mechanisms of Keycloak, creates a fortress, safeguarding your applications from unauthorized access. In this article, we’ll explore the ins and outs of this compelling combination, demonstrating how you can harness their collective strengths to build a fortified, yet user-friendly, gateway for your applications. Before diving into Nginx and Keycloak, let’s revisit some foundational concepts of security. ## Understanding the Difference: Authentication vs. Authorization In the realm of security, the terms “authentication” and “authorization” often come up. Although they might sound similar and are sometimes used interchangeably, they have distinct meanings and functions. Let’s delve into each of these terms to understand their differences and importance. **1. Authentication** Definition: Authentication refers to the process of verifying the identity of a user, system, or application. It answers the question, “Are you who you say you are?” How it Works: The most common form of authentication is a username and password combination. When a user enters these credentials, the system compares them against stored data to verify their identity. Other methods include biometrics (like fingerprint or facial recognition), OTPs (one-time passwords), and hardware tokens. Examples: - Entering a password to log into your email account. - Using a fingerprint to unlock your smartphone. - Receiving an SMS code to confirm your identity on a banking website. **2. 
Authorization** Definition: Once authentication is established, authorization determines what that user, system, or application is permitted to do. It answers the question, “Do you have permission to perform this action?” How it Works: Authorization is typically managed by setting permissions or roles. For instance, a user might be granted read-only access to a database, while an admin has the rights to both read and modify it. Examples: A standard employee might access a company portal but can’t make changes to certain critical documents. An administrator, on the other hand, can modify, delete, or even share those documents. In a file-sharing app, you might grant some users the ability to view a file, while others can edit it. While both authentication and authorization play critical roles in security, they serve distinct purposes: Authentication ensures you are communicating with the right entity by validating their identity. Authorization ensures that entity has the correct permissions to perform certain actions. **What is a Gateway?** Also known as an API Gateway, it is a service that acts as an intermediary for requests from clients seeking resources from other servers or services. Many organizations use API Gateways in microservice architectures to manage and secure the complex interactions between microservices. Popular API Gateways include Amazon API Gateway, Kong, Apigee, and WSO2. There is a great article if you want to know more about API Gateways and their usage: [https://medium.com/buildpiper/how-do-api-gateways-work-3b989fdcd751](https://medium.com/buildpiper/how-do-api-gateways-work-3b989fdcd751) **How can you secure your backend?** 
Imagine we are developing a web application comprised of three components: - Single-page application (SPA) built with frameworks like React or Angular - Data Service that handles all CRUD operations related to our domain entities and manages the connection to the database - Report Service that fetches data from the Data Service and encapsulates the logic for generating custom reports ![Typical architecture](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/y7czjvewdk7u3htgvtq9.png) When it comes to securing the backend, there are three primary strategies to consider: - Each microservice handles its own authentication and authorization. - The gateway manages authentication while individual services are responsible for authorization. - The gateway takes care of both authentication and authorization. Each approach has its unique strengths and limitations. For the sake of brevity, this article won’t delve into which strategy is superior. Truthfully, determining the best fit requires a comprehensive understanding of the system in question. In the following sections, we will explore the second and third strategies in-depth, focusing on how NGINX and Keycloak can be effectively employed for these purposes. **What is Keycloak?** Keycloak is an open-source Identity and Access Management (IAM) tool developed by Red Hat. It provides advanced features such as Single Sign-On (SSO), Identity Brokering, and Social Login without the need for deep, specialized security expertise. Keycloak’s blend of flexibility, comprehensive features, and active community support has solidified its reputation in the identity and access management space. As organizations continue to seek efficient ways to manage identities without compromising security, tools like Keycloak remain indispensable. **What is Nginx?** Nginx was created by Igor Sysoev in 2002, with its first public release in 2004. 
Originally developed to address the C10K problem (handling 10,000 simultaneous connections), Nginx was built from the ground up to be highly efficient and scalable. At its core, Nginx is a web server. But over the years, it has evolved into so much more. Today, Nginx can also function as a reverse proxy server, load balancer, mail proxy server, and even an HTTP cache. **Nginx Plus issue** Nginx offers a free version of its software, but there’s also a premium paid version known as Nginx Plus. While Nginx Plus supports Single Sign-On with Keycloak, the free version unfortunately does not. It’s a bit disappointing, given the popularity of both Nginx and Keycloak. At [TargPatrol](https://targpatrol.com), we utilize both tools, so we’ve had to devise a method for them to effectively communicate with each other. **Modernized architecture** Let’s make some adjustments to our architecture. The updated version is depicted in the image below. ![Modernized architecture](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/931fe3g9ofcxp5o357lm.png) As illustrated, the Nginx service now functions as an API Gateway. Its primary role is to handle both authentication and authorization. Meanwhile, the Keycloak service acts as our Single Sign-On (SSO) server. The Data Service and Report Service process requests coming from Nginx, but they no longer manage authentication or authorization for these requests. I won’t delve into the integration of SPA with Keycloak in this article, as there are numerous comprehensive resources available on this topic. For instance, you can refer to this well-written guide: https://medium.hexadefence.com/securing-a-react-app-using-keycloak-ac0ee5dd4bfc. How should we handle authentication in this scenario? We can use nginx authentication proxy. Now, let’s examine the nginx configuration: ``` http { ... 
location /auth {
        proxy_ssl_server_name on;
        proxy_pass https://targpatrol-keycloak.local/realms/targpatrol-dev/protocol/openid-connect/userinfo;
        proxy_pass_request_body off;
        proxy_set_header Content-Length "";
        proxy_set_header X-Original-URI $request_uri;
    }

    location /data {
        auth_request /auth;
        auth_request_set $auth_status $upstream_status;
        error_page 401 = @handle_unauthorized;
        proxy_pass http://data-service.local;
        include /etc/nginx/common/ssl-headers.conf;
        js_content authService.authorize;
    }

    location /report {
        auth_request /auth;
        auth_request_set $auth_status $upstream_status;
        error_page 401 = @handle_unauthorized;
        proxy_pass http://report-service.local;
        include /etc/nginx/common/ssl-headers.conf;
        js_content authService.authorize;
    }
}
```

What’s happening here? First, we’ve defined the /auth route that validates our request using Keycloak. We simply send the request with headers only to Keycloak, requesting user information. If we possess a valid token in our header, Keycloak will respond with a 200 OK, returning the current user’s data. The routes for the Data and Report services contain the ‘auth_request’ instruction. Every time we attempt to access them, a request will be sent to Keycloak first. Alright, we understand the authentication process, but what about authorization? For this, we can leverage a functionality in nginx called ngx_http_js_module. This module permits the execution of JavaScript code on a request. 
Let’s delve into the ‘js_content’:

```
function extractPayload(token) {
    const tokenParts = token.split('.');
    const encodedPayload = tokenParts[1];
    const decodedPayload = Buffer.from(encodedPayload, 'base64').toString('utf-8');
    return JSON.parse(decodedPayload);
}

function authorize(request) {
    const token = request.headersIn.Authorization;
    if (!token || !(token.slice(0, 7) === 'Bearer ')) {
        return false;
    }
    const payload = extractPayload(token);
    const roles = payload['roles'];
    // request url
    const url = request.uri;
    // here we can compare url and roles
    // to allow or deny access
    return false;
}
```

This file is named authService.js. It should contain a function named authorize since, in our js_content instruction, we reference it as authService.authorize (following the format fileName.functionName). Plain JavaScript can be utilized here. Initially, we parse the Authorization header to extract the Bearer token, which was generated by Keycloak, into an object form. Subsequently, we can match the roles with the request URL to either grant or deny the request. It’s all quite straightforward! One challenge with this approach is that every request is directed to Keycloak. A possible solution is to transition from nginx’s js_content to a Node.js service (or another suitable language). This service would have server-side integration with Keycloak. It’s worth noting that only Nginx Plus supports this, not the free version. For more details, you can refer to Keycloak’s documentation.

## Conclusion

In summary, the synergy between Nginx and Keycloak offers a compelling solution for gateway security. While Nginx efficiently manages and routes web traffic, Keycloak ensures robust authentication and authorization. Their combined capabilities create a fortified layer of defense, enhancing both user experience and system security. By seamlessly integrating these tools, businesses can achieve not only heightened protection but also streamlined operations. 
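To make that role comparison concrete, here is a small self-contained sketch, runnable in plain Node.js, of how the decoded token payload could be matched against a route. The route-to-role map and the role names are hypothetical examples, not part of the article's actual setup:

```javascript
// Hypothetical map of URL prefixes to the roles allowed to access them.
const routeRoles = {
  '/data': ['data-user', 'admin'],
  '/report': ['report-user', 'admin'],
};

// Decode the payload (second dot-separated segment) of a JWT without
// verifying its signature, mirroring extractPayload from the article.
function extractPayload(token) {
  const encodedPayload = token.split('.')[1];
  return JSON.parse(Buffer.from(encodedPayload, 'base64').toString('utf-8'));
}

// Return true when the token's roles permit access to the given URI.
function isAllowed(token, uri) {
  const roles = extractPayload(token).roles || [];
  // Find the first route prefix matching the requested URI.
  const prefix = Object.keys(routeRoles).find((p) => uri.startsWith(p));
  if (!prefix) return false; // unknown route: deny by default
  return routeRoles[prefix].some((role) => roles.includes(role));
}
```

In the real njs handler, `request.headersIn.Authorization` and `request.uri` would supply the token and URI, and the boolean result would decide whether the request is forwarded or rejected.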
As the digital landscape continues to evolve, tools like Nginx and Keycloak are proving indispensable for those seeking a balanced combination of performance, flexibility, and security. Source: [https://medium.com/p/41a801e741f9](https://medium.com/p/41a801e741f9)
sergey-dudik
1,717,987
Top VSCode Extensions for Web Development in 2023
Visual Studio Code is hugely popular among web developers due to its great editing features,...
0
2024-01-05T09:07:49
https://dev.to/matinmollapur0101/top-vscode-extensions-for-web-development-in-2023-37g5
Visual Studio Code is hugely popular among web developers due to its great editing features, extension ecosystem, and cross-platform availability. Here are some of the best VSCode extensions that can enhance productivity for web development. ## Live Server The Live Server extension launches a local development server with live reload capability. It updates code changes instantly without requiring a manual refresh. This is extremely useful when building web apps with HTML, CSS, and JavaScript. ## ES7 React/Redux/React-Native Snippets This extension provides shortcuts for React and Redux workflows, like creating components, actions, etc., automatically. For example, `rfc` creates a React functional component. The time saved from boilerplate code really adds up. ## Sass This must-have extension brings support for Sass/SCSS syntax highlighting, autocompletion, and linting directly within the editor. It compiles SCSS to CSS on the fly. ## Prettier & ESLint Prettier auto-formats code to follow consistent styling rules. ESLint helps catch bugs and enforce code quality. These tools integrate seamlessly in VS Code to produce clean code. ## CSS Peek CSS Peek lets you trace CSS rules directly in the editor by clicking on a class/id in the HTML. This is useful for understanding cascading styles. ## Code Spell Checker As the name suggests, this checks spelling within code comments and strings. It helps avoid silly typos. ## GitLens GitLens supercharges VSCode's built-in Git features. It lets you view commits linked to lines of code, easily access Git commands, see code contributions, and more. ## Thunder Client This extension lets you make API calls right within VSCode through an intuitive GUI. It's great for testing web services without having to use tools like Postman. There are many more great extensions, but these provide a terrific boost in productivity for common web development workflows. Try them out to level up your VSCode game!
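As one common way to wire the Prettier and ESLint extensions together, a workspace `.vscode/settings.json` along these lines is often used. The extension ID `esbenp.prettier-vscode` and these setting keys are the widely used ones, but verify them against the versions you have installed:

```json
{
  "editor.formatOnSave": true,
  "editor.defaultFormatter": "esbenp.prettier-vscode",
  "editor.codeActionsOnSave": {
    "source.fixAll.eslint": true
  }
}
```

With this in place, saving a file formats it with Prettier and applies ESLint auto-fixes in a single step.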
matinmollapur0101
1,718,131
Deploy a solid-start app on github pages
Solid-start is the long awaited meta-framework for Solid.js. Since it entered its second beta after a...
0
2024-01-11T16:53:49
https://dev.to/lexlohr/deploy-a-solid-start-app-on-github-pages-2l2l
[Solid-start](https://start.solidjs.com) is the long-awaited meta-framework for [Solid.js](https://solidjs.com). Since it entered its second beta after a rewrite, there were a few breaking changes, so the [previous article on this one's topic](https://dev.to/lexlohr/using-solid-start-with-github-pages-3iok) was no longer valid. With solid-start, you can deploy your application with client-side rendering, server-side rendering, or islands architecture basically [anywhere thanks to nitro](https://nitro.unjs.io/deploy). One of these presets to deploy pages is "static", which creates a basic server for the server-rendered pages and then uses it to render static versions of them, to be deployed on github pages or wherever else you want. There is also "github_pages", but I cannot see that it does anything different from "static", so let us stick with that. ## Creating your project ```sh npm init solid@latest my-app # or pnpm create solid@latest my-app ``` Instead of `my-app`, use whatever name you want. Select a version with SSR and whatever other configuration you want. Make sure you are using `@solidjs/start@0.4.8` or newer, since that fixes an issue with the hydration of pages with a different base url. ## Install the dependencies Once your package is set up, install the dependencies: ```sh npm i # or pnpm i ``` ## Configure vite You can add whatever configuration you want; the important parts are `ssr: true` and the `server` config. 
```ts import { defineConfig } from "@solidjs/start/config"; export default defineConfig({ start: { ssr: true, server: { baseURL: process.env.BASE_PATH, preset: "static" } } }); ``` ## Configure the Router You need to make the `Router` in `src/app.tsx` aware of the base path set in the vite configuration, so add the following `base` property: ```ts <Router base={import.meta.env.SERVER_BASE_URL} root={...} > ``` ## Create a github action to deploy the page We can use `JamesIves/github-pages-deploy-action` to deploy the output from our build on github pages; however, we need to create an extra file `.nojekyll` so directories starting with an underscore (like `_build`) will be served and our page will receive its assets. Add `.github/workflows/main.yml`: ```yaml name: CI on: push: branches: [ main ] pull_request: branches: [ main ] workflow_dispatch: jobs: build: runs-on: ubuntu-latest steps: - uses: actions/checkout@v3 - name: install run: npm i --legacy-peer-deps - name: build run: npm run build env: BASE_PATH: /my-app/ - name: create .nojekyll run: touch .output/public/.nojekyll - name: deploy pages uses: JamesIves/github-pages-deploy-action@v4.5.0 with: branch: gh-pages folder: .output/public ``` Instead of `/my-app/`, insert your github repository name again. ## Enable GitHub pages for your project Once the action has finished, * Go to your project's GitHub page and from there to the settings tab * In the left menu, select pages * For branch, select gh-pages and / (Root) It may take anywhere from a few seconds to two minutes until the pages are set up for the first time – subsequent deployments are usually a lot faster. After that, you can take a look at your freshly deployed GitHub page with your solid-start project. Happy hacking!
lexlohr
1,718,324
Your Chance to get $750 to your Cash Account!
A post by Sagor
0
2024-01-05T15:02:33
https://dev.to/sagorrajuyt/your-chance-to-get-750-to-your-cash-account-3d2e
[ ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/n5718x2d1vxnxx15cc93.jpeg)](https://smrturl.co/o/503514/53177516?s1=)
sagorrajuyt
1,718,357
Empower Team Collaboration: Free Online Video Collaboration Tool
Empower Team Collaboration: Free Online Video Collaboration Tool Empower your team collaboration with...
0
2024-01-05T15:56:54
https://dev.to/video-collaboration/empower-team-collaboration-free-online-video-collaboration-tool-2d1m
[Empower Team Collaboration: Free Online Video Collaboration Tool ](https://simplified.com/video-editor/collaboration)Empower your team collaboration with our free online video collaboration tool. Seamlessly connect and collaborate with team members through high-quality video communication.
video-collaboration
1,718,575
Deploy React App to Google App Engine with Github Actions CI/CD - A Complete Guide
This guide provides you a step-by-step process to deploy your React app efficiently to Google App...
0
2024-01-09T12:02:11
https://dev.to/rushi-patel/deploy-react-app-to-google-app-engine-with-github-actions-cicd-a-complete-guide-67h
googlecloud, react, cicd, githubactions
This guide provides a step-by-step process to **deploy your React app efficiently to Google App Engine**. Each section offers actionable steps, ensuring a smooth deployment journey. <br /> ### In this guide, we'll cover the following sections: 1. [**Pre-requisite**](#pre-requisite) 2. [**React Project Config**](#react-project-config) 3. [**GCP Account & GCP Project Setup**](#gcp-account-and-project-setup) 4. [**App.yaml and Explanation**](#app-yaml) 5. [**Github Actions CI/CD**](#github-actions-cicd) 6. [**Testing**](#testing) Whether you're a seasoned developer seeking a detailed deployment workflow or an enthusiast eager to take your React project live, this guide will equip you with the necessary knowledge and steps for a successful deployment journey. Let's start with the pre-requisites. <br /> <a name="pre-requisite"></a> # Section 1: Pre-requisites Before diving into deploying a React app to Google App Engine, it's essential to ensure that you have the necessary tools and knowledge in place. The minimum requirements are: - **Node & npm Installed:** Make sure Node.js and npm are installed on your machine. Node 18 is used throughout this guide. - **React JS Setup:** Familiarize yourself with creating a React project and running it locally. - **Google Cloud Account:** If you don't have one yet, don't worry; we'll create it in a later step. <br /> <hr> <a name="react-project-config"></a> # Section 2: React Project Config ### Initialize & Run a React App If you haven't already set up a React project, follow these steps: - Open your terminal. - Run the following command to create a new React app: ```bash npx create-react-app my-react-app ``` Replace `my-react-app` with your own project name. - Run the React app: ```bash npm start ``` This will start a development server, allowing you to view your React app in a browser at http://localhost:3000. 
- Prepare a build of your React app: ```bash npm run build ``` This command creates a build folder at the root of your project, containing the compiled version of your React app. - Exclude build folder from Git: <br> Add `build/` to your `.gitignore` file to prevent it from being tracked and pushed to your Git repository. - React Scripts Configuration: Ensure your `package.json` includes the following scripts: ```json "scripts": { "start": "react-scripts start", "build": "react-scripts build", "test": "react-scripts test", "eject": "react-scripts eject" } ``` Make sure your scripts align with the provided commands for starting, building, testing, and ejecting your React app. <br/> <hr> <a name="gcp-account-and-project-setup"></a> # Section 3: GCP Account & GCP Project Setup ### Step 1: Create a New Project 1. Go to the [Google Cloud Console](https://console.cloud.google.com/) and sign in. 2. Click on "Select a project" at the top and then click on "New Project." 3. Choose a unique project name and select the desired organization if applicable. 4. Once the project is created, note down the Project ID for future reference. 5. Link this project to an appropriate billing account. ### Step 2: Enable Required APIs 1. Navigate to the **APIs & Services** section in the Google Cloud Console. 2. Click on **Library** in the left sidebar. 3. Search for **Cloud Storage** and enable the **Cloud Storage API.** 4. Search for **App Engine Admin** and enable the **App Engine Admin API.** ### Step 3: Set Up Cloud Storage 1. Go to the **Cloud Storage** section in the Google Cloud Console. 2. Click on **Create Bucket** to make a new bucket. 3. Choose a unique name and select the desired region. 4. Leave other settings as default and create the bucket. ### Step 4: Install GCP CLI 1. Navigate to the root directory of your React app in the terminal. 2. Install the Google Cloud SDK following the instructions at [Google Cloud SDK Installation Guide](https://cloud.google.com/sdk/docs/install). 
3. Authenticate the Google Cloud SDK by running `gcloud auth login` and following the on-screen instructions. 4. Set the project ID by running `gcloud config set project PROJECT_ID`, replacing PROJECT_ID with your actual Project ID. <br /> <hr> <a name="service-account-setup"></a> # Service Account Setup #### Generate & Download a Service Account To set up a Service Account for Google App Engine deployment, follow these steps: 1. Go to the [Google Cloud Console](https://console.cloud.google.com/). 2. Navigate to **IAM & Admin** → **Service Accounts**. 3. Click on **Create Service Account**. 4. Provide an appropriate name and description for the service account. For instance, use `github-ci-cd` as it will be utilized for Github CI/CD. 5. Assign the following roles: 1. App Engine Admin 2. Cloud Build Service Account 3. Service Account User 4. Storage Object Admin 6. Click the three dots and select **Manage keys**. 7. Click on **Add Key → Create New Key**. 8. Choose the **JSON key** type and securely download it. Remember, this key grants access to Google Cloud resources, so keep it safe. <br /> <hr > <a name="app-yaml"></a> # Section 4: Create app.yaml and Explanation #### What is app.yaml? - The `app.yaml` file configures settings and routing for Google App Engine applications. #### Placement: - Keep the `app.yaml` in your project's root directory alongside your source code. 
#### Example Configuration: ```yaml # [START app_yaml] runtime: nodejs18 service: my-react-app # prefix/subdomain of the bucket specific to project and environment handlers: - url: /static/js/(.*) static_files: build/static/js/\1 upload: build/static/js/(.*) secure: always - url: /static/css/(.*) static_files: build/static/css/\1 upload: build/static/css/(.*) secure: always - url: /static/media/(.*) static_files: build/static/media/\1 upload: build/static/media/(.*) secure: always - url: /(.*\.(json|ico|png|jpg|svg))$ static_files: build/\1 upload: build/.*\.(json|ico|png|jpg|svg)$ secure: always - url: / static_files: build/index.html upload: build/index.html secure: always - url: /.* static_files: build/index.html upload: build/index.html secure: always # [END app_yaml] ``` #### Explanation of Configuration: - **runtime:** Specifies the runtime environment (e.g., `nodejs18`). - **service:** Defines the service name, typically a project-specific prefix or subdomain. The `handlers` section defines how incoming requests are handled: - Each `url` pattern specifies a path or file type. - `static_files` points to the location of the static file in your project. - `upload` indicates where the file should be uploaded within the Google App Engine environment. - `secure` ensures that requests are served over HTTPS. This configuration example directs incoming requests to the appropriate static files within the `build` directory, ensuring proper handling and security for various file types and paths. #### Deploy your App Locally: Deploy your React app by executing the following command in the root directory of your project: ```bash gcloud app deploy ``` This command will package and upload your compiled build to Google Cloud. The deployment process may take a few minutes. > **Note:** When you are deploying for the **first time**, comment out line 4, where you specify the service name, as the first deployment must be the default service. 
After deployment completion, you'll receive a URL where your React app is hosted. Open the URL to check the deployed React App. <br /> <hr > <a name="github-actions-cicd"></a> # Section 5: CI/CD using GitHub Actions ### CI/CD and GitHub Actions: - **CI/CD:** CI (Continuous Integration) and CD (Continuous Deployment) automate the software delivery process. CI involves merging code changes into a shared repository frequently, automatically running tests, and validating changes. CD automates the deployment of validated code changes to production or staging environments. - **CI/CD in GitHub Actions:** GitHub Actions provide workflows to automate tasks in your repository, including CI/CD processes. These workflows are defined in YAML files and can be triggered by various events like code pushes, pull requests, etc. ### Storing Service Account Key and Project ID in GitHub Secrets: - Store your Service Account Key and Project ID as secrets in your GitHub repository to securely access them during the CI/CD process. - You can create Github secrets by **Github Repository → Settings → Secrets & Variables → Actions → Secrets tab → New Repository Secret** - Secrets: 1. `GCP_SA_KEY`: The entire JSON of the Service Account Key generated in the previous step. 2. `GCP_PROJECT_ID`: Your GCP Project ID. ### Workflow File Name & Placement: - The workflow file should be named `gcp-deploy.yml`. - Place this file in the `.github/workflows` directory in your project. Refer to the repository linked below in case of any confusion. - Paste the configuration below into the newly created YAML file. 
### Configuration ```yaml name: Deploy to GCP on: push: branches: - main pull_request: branches: - main jobs: build-and-deploy: runs-on: ubuntu-latest steps: - name: Checkout repository uses: actions/checkout@v2 - name: Setup Node.js and yarn uses: actions/setup-node@v2 with: node-version: '18' - name: Install dependencies run: yarn install - name: Build React app run: yarn run build - name: Create temp folder run: mkdir temp_folder # This will create a temporary folder # It will have build folder and app.yaml # app.yaml will use the relative build folder to deploy to GCP - name: Copy build to temp folder run: cp -r build/ temp_folder/build/ - name: Copy app.yaml to temp folder run: cp app.yaml temp_folder/app.yaml - name: List contents of temp folder for verification run: ls -l temp_folder/ - name: Google Cloud Auth uses: 'google-github-actions/auth@v2' with: credentials_json: '${{ secrets.GCP_SA_KEY }}' project_id: ${{ secrets.GCP_PROJECT_ID }} - name: Set up Cloud SDK uses: 'google-github-actions/setup-gcloud@v2' - name: Deploy to Google App Engine run: | cd temp_folder gcloud app deploy app.yaml --quiet ``` <br > This workflow triggers on pushes or pull requests to the `main` branch. It uses **GitHub Actions** to perform various steps such as building the React app, authenticating with Google Cloud, setting up the Cloud SDK, and deploying the app to Google Cloud Platform. <br /> <hr > <a name="testing"></a> # Section 6: Testing the CI/CD **Push Your Changes to GitHub (on Main Branch):** Ensure your latest changes are pushed to the main branch of your GitHub repository. **Navigate to the 'Actions' Tab in GitHub Repository:** Visit your GitHub repository and go to the 'Actions' tab. **Check Workflow Status:** Verify the status of your React app deployment workflow using GitHub Actions. **Open the Deployed App URL:** Access the provided URL after a successful deployment in your browser to confirm your React application runs smoothly. 
> **Note:** In GitHub Actions logs, the URL might contain masked text (`****`) representing your GCP project ID. Replace `****` with your project ID and open the URL in your browser. **Access Google App Engine Services:** Go to Google Cloud, locate and click on App Engine, then navigate to **Services** in the sidebar. Find your recent deployments and click on the service to open your React App in a new tab. <br /> <hr > # GitHub Repository Link Access the complete code reference for this guide by visiting the [GitHub Repository](https://github.com/rushi-2001/react-app-engine-setup). Feel free to clone the repository and experiment with it on your own! <br /> <hr > # Conclusion 🚀 Congratulations on mastering the deployment of your React app to Google App Engine and GitHub Actions! 🎉 Embrace this automated workflow to streamline your development journey and propel your projects to new heights. You can refer back to the **repository** link above as and when required. If you have any queries or need guidance, feel free to chat or **drop a comment.** Keep coding, exploring, and sharing the joy of efficient deployments! Don't forget to **like and share this guide** to inspire others on their deployment adventures! 👍 **Happy coding! 💻✨**
rushi-patel
1,718,909
Convert JSON Data To YAML in Python
Hello everyone, in this session i would like to share a simple tutorial about how to convert json...
0
2024-01-06T08:48:10
https://dev.to/aliftech/convert-json-data-to-yaml-eec
python, opensource, programming, tutorial
Hello everyone! In this session I would like to share a simple tutorial about **how to convert JSON data into YAML**. In this tutorial I am going to use Python as the programming language, together with a library called FormatFusion. FormatFusion is a powerful data converter library for Python; you can check the details at [https://pypi.org/project/FormatFusion/](https://pypi.org/project/FormatFusion/). Before we start the tutorial, it is important to know what JSON and YAML are. **JSON** JSON stands for JavaScript Object Notation. It is a lightweight data-interchange format that's widely used for transmitting data between computers. Think of it as a language for computers to talk to each other in a way that's both easy to understand and efficient to transmit. For further information about JSON, you can visit [JSON: The Lightweight Data-Interchange Format](https://wahyounote.wordpress.com/2024/01/10/json-the-lightweight-data-interchange-format/). Here are some key things to know about JSON: - **human-readable:** JSON is written in plain text, so it is easy for humans to read and understand, even if they don't know how to program. This makes it a great choice for things like configuration files and API responses. - **language independent:** even though it is based on JavaScript syntax, JSON can be used with any programming language. There are libraries and frameworks available for all major languages that can read and write JSON data. - **lightweight:** JSON files are small and compact, which makes them ideal for transmitting data over the internet. This is why JSON is often used for web APIs and web applications. - **structured:** JSON data is organized into key-value pairs and arrays. This makes it easy to access and manipulate specific pieces of data. Here are some common uses of JSON: - **web APIs:** Many web APIs use JSON to send and receive data. For example, when you use a weather API to get the current temperature, the API will return data in JSON format. 
- **configuration files:** Many applications use JSON files to store configuration settings. This makes it easy to change settings without having to modify the code itself. - **data storage:** JSON can be used to store data in a file or database. This is a common way to store data for web applications. Here is an example of JSON data: ```json { "person": { "name": "John Doe", "address": { "street": "123 Main St", "city": "Anytown", "state": "CA" } } } ``` **YAML** YAML stands for "YAML Ain't Markup Language", though originally it meant "Yet Another Markup Language". It is another data serialization language similar to JSON, often used for configuration files and data exchange. However, unlike JSON, it prioritizes human readability with a more intuitive and concise syntax. For further information about YAML, you can check my article [YAML: The Friendly Face of Data – Unveiling The Depth of Human-Readable Code](https://wahyounote.wordpress.com/2024/01/10/yaml-the-friendly-face-of-data-unveiling-the-depth-of-human-readable-code/). Here are some key points about YAML: - **Human-readable** 1. Uses indentation to define structure, similar to Python code, making it easier to understand the hierarchy of the data. 2. No need for closing tags or square brackets as in JSON, simplifying the visual layout. 3. Supports comments for annotation, further clarifying the meaning of the data. - **Powerful and Flexible** 1. A superset of JSON, meaning any valid JSON file is also a valid YAML file. 2. Supports various data types including strings, numbers, booleans, lists, dictionaries, and often custom types. 3. Allows anchors and aliases to reference data elsewhere in the document, reducing redundancy. - **Common Uses** 1. Configuration files for applications and servers. 2. Data exchange between programs and services. 3. Storing application state and settings. 4. Defining automation scripts and playbooks. 
Here is an example of YAML data: ```yaml name: My Application version: 1.0.0 settings: debug: true port: 8080 data: users: - name: John Doe email: john.doe@example.com ``` We have already talked about both JSON and YAML. Now it is time to get to our main topic. First of all, we need to install the FormatFusion library using the following command: ```bash pip install FormatFusion ``` After that we can use the library to convert our JSON data into YAML format. Here is an example: ```python from FormatFusion.json_to_yaml import json_to_yaml json_data = """{ "name": "John Doe", "age": 30, "city": "New York" }""" yaml_data = json_to_yaml(json_data) print(yaml_data) ``` We can also convert YAML data into JSON format using the `yaml_to_json` function, just like the example below. ```python from FormatFusion.yaml_to_json import yaml_to_json yaml_data = """ name: John Doe age: 30 city: New York """ json_data = yaml_to_json(yaml_data) print(json_data) ``` And that's it. How was it? It was easy, right? Converting JSON to YAML with the FormatFusion library is a breeze. In addition to converting YAML and JSON data, FormatFusion can also be used to convert other data formats. For more information, please visit the FormatFusion documentation in the [GitHub repository](https://github.com/aliftech/FormatFusion/blob/master/DOCUMENTATION.md). That's all for this tutorial. Thank you for reading, and I hope it was helpful. Here is an image of the FormatFusion library documentation: ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/i4z2cl8dkhjgs0jf0k8s.png) Here are some additional tips for converting JSON to YAML: - If your JSON data is nested, you can use the indent property to control the indentation level of the YAML output. - If your JSON data contains comments, you can use the comment property to specify how to handle them in the YAML output. 
- If your JSON data contains special characters, you can use the escape property to specify how to escape them in the YAML output. I hope this helps!
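As a footnote: if you cannot install FormatFusion, the core idea behind `json_to_yaml` can be sketched with only the Python standard library. This is my own minimal helper for flat (one-level) objects, not part of FormatFusion:

```python
import json


def flat_json_to_yaml(json_text: str) -> str:
    """Convert a flat (non-nested) JSON object into simple YAML lines."""
    data = json.loads(json_text)
    lines = []
    for key, value in data.items():
        # Numbers are emitted bare; everything else goes through str().
        # Real YAML emitters also handle quoting, nesting, lists, etc.
        rendered = value if isinstance(value, (int, float)) else str(value)
        lines.append(f"{key}: {rendered}")
    return "\n".join(lines)


print(flat_json_to_yaml('{"name": "John Doe", "age": 30, "city": "New York"}'))
# name: John Doe
# age: 30
# city: New York
```

This mirrors the article's first FormatFusion example, but only for the simplest case; for anything nested, reach for a real library.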
aliftech
1,718,987
Java Command Line Arguments
Imagine you're teaching your dog a new trick. You wouldn't just yell "Fetch!" and hope for the best,...
0
2024-01-06T10:53:17
https://dev.to/manojsharmajtp2/java-command-line-arguments-2mca
javascript, python, tutorial, beginners
Imagine you're teaching your dog a new trick. You wouldn't just yell "Fetch!" and hope for the best, right? You'd give clear instructions, maybe even offer a treat for understanding. That's how you communicate with your Java program using command line arguments: you give it specific instructions at runtime to customize its behavior. But let's ditch the dog analogy and dive into what command line arguments actually are. They're like whispers you tell your program before it starts running. You pass them as strings after the program name when you run it from the command prompt. These whispers can be anything from file names to specific actions, telling your program exactly what you want it to do. Think of it like ordering a pizza. You wouldn't just say "Food!" You'd specify the size, crust, toppings, and maybe even a delivery time. Command line arguments are like your pizza order, giving your program all the details it needs to do its job well. **Why Use Command Line Arguments?** There are many reasons to use [command line arguments](https://www.javatpoint.com/command-line-argument). Here are a few: - **Make your program flexible:** Instead of hardcoding everything into the program itself, you can let users adjust settings or input data through arguments. This makes your program more versatile and adaptable. - **Automate tasks:** You can use arguments to automate repetitive tasks, like processing multiple files or running your program with different configurations. No more clicking and typing! - **Simplify testing:** Different arguments can trigger different program behaviors, making it easier to test various scenarios and functionalities. Debugging becomes a breeze. **Getting Started with Arguments:** Now, let's get your hands dirty with some actual code. We'll write a simple program that takes a file name as an argument and prints its contents. Here's the basic structure:
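The post ends before showing its code, so here is a minimal sketch of such a program — the class name `ShowFile` and the helper `firstArgOr` are my own, not from the original article:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class ShowFile {
    // Return the first command line argument, or a fallback when none was given.
    static String firstArgOr(String[] args, String fallback) {
        return args.length > 0 ? args[0] : fallback;
    }

    public static void main(String[] args) throws IOException {
        String fileName = firstArgOr(args, null);
        if (fileName == null) {
            System.err.println("Usage: java ShowFile <file>");
            return;
        }
        // Print the contents of the file named by the first argument.
        System.out.print(Files.readString(Path.of(fileName)));
    }
}
```

Run it as `java ShowFile notes.txt`: everything typed after the class name arrives in the `args` array, one string per argument.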
manojsharmajtp2
1,719,031
Fix "Unknown at rule @tailwind" errors in VS Code
TailwindCSS is something I've only just come into recently, while working on an existing project...
0
2024-01-06T14:00:00
https://wheresbaldo.dev/tech/vscode/fix-vs-code-unknown-at-rule-tailwind
vscode, tailwindcss, webdev, howtofix
TailwindCSS is something I've only just come into recently, while working on an existing project started by someone else. So I'm a complete beginner when it comes to Tailwind. Initially I couldn't really see the point of Tailwind. I mean it really just looked to me like writing inline-style rules, but in the `class` attribute instead of the `style` attribute. But since the project was already using it, I had to at least understand the basics. Well, after using it for a while, and reading through some of the Tailwind docs, I can now see the benefits, and I'm actually starting to really like it. So when I started bootstrapping a new project a few weeks ago, I thought "Hey, why not use Tailwind?"! So after bootstrapping my app, and since I'm building the app using Next.JS, I started by installing TailwindCSS following the [Tailwind docs for a Next JS installation](https://tailwindcss.com/docs/guides/nextjs). ## The Problem After getting Tailwind installed, I checked out a few of the files, and I noticed an issue that had been bothering me in my last project ... Visual Studio Code's IntelliSense wasn't working for the `@tailwind` and `@apply` directives in my global styles file! 😡 So while the directives should have looked something like this, with no errors: ```CSS @tailwind base; @tailwind components; @tailwind utilities; ``` Visual Studio Code was underlining them with the yellow squiggly line, indicating a problem: ![Unknown at rule @tailwind](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/6me80tk4wzqmo99fwcrn.jpg) Specifically, Visual Studio Code was throwing the errors "**Unknown at rule @tailwind**" or "**Unknown at rule @apply**", depending on which one(s) I was using. On the last project, laziness got the better of me and I just ignored it, and it didn't seem to be causing any issues. But this time, starting a new project, I had to do something about it or I knew it would just drive me nuts! 
😒 So after some quick digging, I found a few stackoverflow posts that addressed the issue a few different ways, but this [more recent one](https://stackoverflow.com/questions/76970920/how-to-make-vscode-recognize-tailwinds-apply) was the most up to date, and to the point. However, like the other answers I found, it was missing a crucial step I needed to get it working. So after doing some more digging, and a bit of trial and error, I was finally able to get things working. Here's what I did to fix the issue: ## Fix VS Code's IntelliSense for TailwindCSS 1. First off, if you haven't already, start by installing the TailwindCSS package in your project: ```bash # Using npm npm install tailwindcss # Using Yarn yarn add tailwindcss ``` Or, for a Next.JS project, like I'm doing: ```bash # Using npm npm install -D tailwindcss postcss autoprefixer npx tailwindcss init -p # Using Yarn yarn add -D tailwindcss postcss autoprefixer yarn dlx tailwindcss init -p ``` 2. Next, in your `tailwind.config.js`, which was just created with the `tailwindcss init -p` command, add the following to the **module.exports** `content` key: ```js content: [ "./app/**/*.{js,ts,jsx,tsx,mdx}", "./pages/**/*.{js,ts,jsx,tsx,mdx}", "./components/**/*.{js,ts,jsx,tsx,mdx}", ], ``` Or, if you're using the 'src' folder in your project: ```js content: [ "./src/**/*.{js,ts,jsx,tsx,mdx}", ], ``` 3. Once that's done, and this was the crucial missing step in the answers I found online: 1. Go into your extensions in Visual Studio Code, and 2. Install the **Tailwind CSS IntelliSense** extension. ![VS Code TailwindCSS IntelliSense extension](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/m3pdajrrrzoz420eo1ys.jpg) This is what will allow Visual Studio Code to actually recognize the directives. 4. And then finally, after the TailwindCSS IntelliSense extension is installed, you'll need to add a file association. To do this: 1. 
Go into your Visual Studio Code settings: ![VS Code Settings](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/z43m39jqc4028nmzofas.jpg) - On a Mac, you can do this by going to **Code > Settings... > Settings**, or by using the keyboard shortcut <kbd>&#8984;</kbd> + <kbd>,</kbd>. - On a PC, you can do this by going to **File > Preferences > Settings**, or by using the keyboard shortcut <kbd>Ctrl</kbd> + <kbd>,</kbd>. 2. Then, in the search bar, type in "files associations", and click on the **Add Item** button under the 'Files: **Associations**' section. ![VS Code Settings](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/g3arvf8ya6q1j82hdnah.jpg) 3. Under the **Item** column, add "\*.css" (without the quotes), and under the **Value** column, add "tailwindcss" (again, without the quotes). Click the **Ok** button to save the changes. ![VS Code Settings](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/kx5iekdo6cxgf59046lm.jpg) And that's all there is to it! Simple eh? 😁 Now, when you use the `@tailwind` and `@apply` directives in your CSS files, Visual Studio Code will no longer throw the "Unknown at rule @tailwind" or "Unknown at rule @apply" errors, and you'll get the IntelliSense you're used to with CSS! ![Tailwind CSS IntelliSense installed](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/vf4gzffh7p8gs5w7owtd.jpg) ## Notes If you've followed posts online that miss the installation of the TailwindCSS IntelliSense extension, you'll end up with broken IntelliSense for CSS, and your CSS files will look like regular text files without any highlighting: ![Tailwind CSS IntelliSense installed](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/b8124deyfz0k5t3pzeqw.jpg) So if you've followed other instructions online that left you with non-highlighted CSS files and broken IntelliSense, you're just missing the TailwindCSS IntelliSense extension to fix things up! Hope this post was able to help if you were in the same boat as me!
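One last footnote on step 4: clicking through the Settings UI ends up writing an entry to your `settings.json`, so if you prefer editing that file directly, the equivalent entry looks like this (the `tailwindcss` value is the language id the IntelliSense extension registers):

```json
{
  "files.associations": {
    "*.css": "tailwindcss"
  }
}
```

Either route produces the same result; the UI steps above are just the click-based way of adding this entry.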
mlaposta
1,719,051
Problems with Parallel Routes in Next.js
The Next.js 13.3 brought a new feature that calls the Parallel Routes which allows you to...
0
2024-01-06T15:10:55
https://dev.to/cookiemonsterdev/problems-with-parallel-routes-in-nextjs-241p
learning, typescript, nextjs, javascript
Next.js 13.3 brought a new feature called Parallel Routes, which allows you to simultaneously or conditionally render one or more pages in the same layout. That is a really cool feature that may reduce the usage of React `<Suspense>` and improve the logic of data fetching in the app. **BUT**, I found that there are at least two problems with it, and I feel like sharing them --- For now, let's take a look at what the code looks like without parallel routes. For example, I have a dashboard route at `app/projects/page.tsx`: ```tsx import { Suspense } from "react"; const ProjectsPage = async () => { return ( <div> <Suspense fallback={<div>Loading Users...</div>}> <Users /> </Suspense> <Suspense fallback={<div>Loading Projects List...</div>}> <ProjectsList /> </Suspense> </div> ); }; export default ProjectsPage; const Users = async () => { await new Promise((res) => setTimeout(res, 1000)); return <h1>Users</h1>; }; const ProjectsList = async () => { await new Promise((res) => setTimeout(res, 3000)); return <h1>Projects</h1>; }; ``` The code is pretty obvious: we see the fallbacks until the promises in the components are resolved. Now let's rewrite it using parallel routes. We will create two more folders inside `app/projects`, called `@users` and `@projectslist`. Each will contain `page.tsx` and `loading.tsx`, which are the component itself and its loading fallback. And also the layout where we can use these components. In general, the structure will look like: ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/7bgd1497bc7tata0bdk4.png) ```tsx type LayoutProps = { children: React.ReactNode; projectslist: React.ReactNode; users: React.ReactNode; }; const Layout = ({ children, users, projectslist }: LayoutProps) => { return ( <div> {users} {projectslist} {children} </div> ); }; export default Layout; ``` That works pretty well, huh? We have the same logic, don't we? Then what are the `children` in our case? 
The `children` is our `page.tsx` inside the projects folder, and here is the first problem.

---

## First Problem

If we take a look at the original code, we will see two components inside the projects page; however, those components now live inside our layout. Soooo, what do I have inside the page if the components are not there? The answer is null!

```tsx
const ProjectsPage = () => null;

export default ProjectsPage;
```

Next.js requires a page file inside a route in order to render the page; otherwise we get an error. Sometimes it is annoying to create pages like this, but what can you do...

---

## Second Problem

Parallel routes can be controlled by nested and dynamic routes, which is also a cool feature. But sometimes it is not the best option...

The components from the parallel routes live inside the root layout for the projects route. Let's say we want a page that is rendered depending on the project id. For this we create a new page at `app/projects/[projectId]/page.tsx`. When we try to reach this page, we receive an error. The answer to why is: _Next.js will first try to render the unmatched slot's default.js file. If that's not available, a 404 gets rendered_.

For us, it means that if we do not want to render these parallel components for our page, we need to create new blank pages for the parallel routes where we return null: `app/projects/@projectslist/[projectId]/page.tsx` and `app/projects/@users/[projectId]/page.tsx`:

```tsx
const Page = () => null;

export default Page;
```

Just imagine what folder hell awaits you when you have more than 2 parallel routes))

---

## Conclusion

Nonetheless, Parallel Routes is a very cool feature, and there are definitely cases where it is better than just wrapping a component in React `<Suspense>`. But for now, my choice is to use `<Suspense>` when I have nested routes)))

Hope this was informative for you!
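One more option for the second problem, as an aside beyond the original article: the quote above already mentions that Next.js looks for an unmatched slot's `default.js` file, so instead of a blank page per dynamic segment you can give each slot a single fallback. A minimal sketch, assuming a hypothetical file at `app/projects/@users/default.tsx`:

```typescript
// Hypothetical app/projects/@users/default.tsx:
// Next.js renders this for the @users slot on any segment
// the slot doesn't explicitly match, instead of a 404.
const Default = (): null => null;

export default Default;
```

One `default.tsx` per slot folder can replace the pile of blank `[projectId]/page.tsx` files, though it is worth checking how it behaves on hard navigations in your Next.js version.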
See ya) --- ## Sources - [Next.js documentation](https://nextjs.org/docs/app/building-your-application/routing/parallel-routes)
cookiemonsterdev
1,719,175
Year summary - books I recommend
Year summary These days are a good time for a year summary. I would like to recommend books...
0
2024-01-06T17:07:09
https://dev.to/grzegorzgrzegorz/year-summary-books-i-recommend-26n9
learning, programming
## Year summary

These days are a good time for a year summary. I would like to recommend the books I read in 2023 which I believe made me a better engineer. I had more books in my hands, but these are the ones I find most valuable from my point of view.

## "Five lines of code" by Christian Clausen

[Book link](https://www.manning.com/books/five-lines-of-code)

This is a great book about refactoring. It gives very clear tips on how to start and where to go with the refactoring process. I think the most valuable for me are simple tips like "do not use switch statements", "do not use abstract classes", and my favourite, "avoid inheritance". All these tips come with TypeScript code showing the practical application of the refactoring rules. Actually, as a side effect, I got interested in TypeScript after reading this book. I very much like the description of the cognitive constraints that apply to every human and the way they affect our understanding of code. In the past I often had trouble understanding code spread across many classes, say with 4 levels of inheritance. Now I know this is not down to my low level of expertise but to badly written code.

## "Effective software testing" by Mauricio Aniche

[Book link](https://www.manning.com/books/effective-software-testing)

A systematic testing approach. One can use this book as a checklist of which test design techniques to use and in which order. This dispels the uncertainty we often have when dealing with a new testing problem. Requirement-based testing, structural testing, and contract-based testing are all explained. Property-based testing, which is also covered, was new to me; I was somehow not aware of this kind of approach before. I very much like and agree with the author's encouragement to be systematic in this area.

## "Functional thinking" by Neal Ford

[Book link](https://www.oreilly.com/library/view/functional-thinking/9781449365509/)

Functional programming doesn't seem to be common these days.
At least not for everyone, just as it was not familiar to me. I found out about the set of basic functions present in all languages that support functional programming: filter, map, and reduce. Currying and memoization, as well as the optimization behind built-in functions, are explained. The author is a supporter of applying a specific programming paradigm in certain areas, which I find very useful and which I often see in other books: write functional code in the core of your application and be more imperative (OO) when going to the edges of the system (controllers, UI). This turned out to be a very important book for me: I started to think about functions as something apart from object-oriented programming, and I have used them consciously since then. Finally, I know exactly what I have in my functional toolbox and when to lean towards the functional paradigm.

## "Code that fits in your head" by Mark Seemann

[Book link](https://www.pearson.com/en-us/subject-catalog/p/code-that-fits-in-your-head-heuristics-for-software-engineering/P200000009593/9780137464357)

This book is the most recent one on my virtual bookshelf. It turned out to be a kind of summary of the three previous ones. It talks about cognitive constraints and a proper testing approach, and it also mentions the functional paradigm for coding the core of the application. However, it has an additional advantage, at least for me: the author has the ability to express his thoughts very clearly, so it is like listening to an expert in the office at his desk, getting answers on how to deal with specific problems properly. Numerous such problems are addressed there: how to start an application from scratch, when to send a feature branch to review, what to write in a commit message, and how to write a class or method so that it is clear. I very much like the fractal architecture proposed by the author. Teamwork aspects like code reviews are also covered.
There is a great piece of information about human short-term and long-term memory and how it affects our understanding of code. All of these aspects are explained using the code of a restaurant application. This is the most important book I read last year, so I very much recommend it.

## What next?

I am now looking more towards DevOps-related topics; I plan to take a look at the much-recommended ["The DevOps Handbook" by Gene Kim and others](https://www.barnesandnoble.com/w/the-devops-handbook-gene-kim/1138718562). However, maybe some of you also have recommendations for important sources of information in the IT area?
grzegorzgrzegorz
1,719,263
Programming Logic Assignment: Developing Business Solutions in Python
Systems Analysis and Development - Uninter Introduction This is an academic assignment,...
0
2024-01-06T18:39:12
https://dev.to/kalianny20/trabalho-de-logica-de-programacao-desenvolvendo-solucoes-empresariais-em-python-272k
_Systems Analysis and Development - Uninter_

## Introduction

This is an academic assignment in which I had the opportunity to apply the knowledge I acquired in programming logic using the Python language. Four projects stand out in this work, each offering a unique perspective on how the language can be used to meet business demands. Let's explore these projects, which range from discount calculations to inventory control for a bicycle shop.

## 1. Discount Calculation System for Wholesale Sales

In the first exercise, we dive into the development of a Python program designed to make it easy to calculate discounts on wholesale sales. The progressive discount strategy adopted by company X becomes automated, bringing efficiency and precision to the process. This project highlights the practical application of programming-logic concepts and variable manipulation in Python.

## 2. Ordering System for a Snack Bar

Shifting our focus to the customer-business interaction, the second exercise proposes the implementation of a sales application for a snack bar. With a friendly interface, the program lets the customer easily choose products, accumulate an order, and see the total amount to be paid. This project highlights the importance of usability and user experience when building practical systems.

## 3. Shipping Cost Calculation System for a Logistics Company

In a logistics context, the third exercise addresses the development of a system for a newly created company specializing in parcel delivery. The program calculates the shipping cost taking into account the object's dimensions, weight, and route, following the company's rules. This project highlights the practical application of calculation algorithms in Python to solve sector-specific challenges.

## 4. Inventory Control System for a Bicycle Shop

Closing out our journey, the fourth exercise proposes the creation of an inventory control system for a bicycle shop. With features such as registering, querying, and removing parts, the program provides a clear view of the stock's status. This project highlights the importance of organization and efficient resource management in business environments.

# 1 - Solution: Discount Calculation System for Wholesale Sales

## - Context

Imagine yourself as one of the programmers responsible for building a sales application for company X, which specializes in wholesale. The goal is to calculate the total purchase amount before and after applying discounts, according to the established table.

## - Discount Strategy

Company X adopts the following discount table:

- Up to 9 units: 0% discount per unit.
- Between 10 and 99 units: 5% discount per unit.
- Between 100 and 999 units: 10% discount per unit.
- 1000 units or more: 15% discount per unit.

## - Implementation in Python

The Python program was written to interact with the user and perform the necessary calculations. Conditional structures (if, elif, and else) were used to determine the appropriate discount based on the quantity of products purchased.
```python
print('Bem Vindo a Loja')

# Data input
valor_uni = float(input('Entre com o valor do produto: R$ '))
quant = int(input('Entre com a quantidade do produto: '))

# Discount calculation
if 0 <= quant <= 9:
    desconto = 0
    porcento = '0%'
elif 10 <= quant <= 99:
    desconto = 0.05
    porcento = '5%'
elif 100 <= quant <= 999:
    desconto = 0.10
    porcento = '10%'
else:
    desconto = 0.15
    porcento = '15%'

# Value calculations
semdesconto = valor_uni * quant
calc_descon = semdesconto * desconto
comdesconto = semdesconto - calc_descon

# Output
print('O valor sem desconto é de: R${:.2f}'.format(semdesconto))
print('O valor com desconto é de: R${:.2f} (Desconto de {})'.format(comdesconto, porcento))
```

# 2 - Solution: Ordering System for a Snack Bar

## - Context

As part of a team of programmers, I took on the responsibility of creating the customer-facing interface of the snack bar's sales application. The system has a product table, with each product identified by a unique code, a description, and the corresponding price.

## - Implementation Strategy

The program uses control structures such as if, elif, else, while, break, and continue to offer an interactive experience to the customer. In addition, checks are performed to ensure the customer enters valid product codes.

## - Implementation in Python

The code receives the code of the desired product, accumulates the corresponding order total, and lets the customer add more items or finish the purchase. Informative messages are shown according to the customer's choice.

```python
acumulador = 0
print('Seja bem-vindo à lanchonete')
print('******************Cardápio**********************')
print('| Código | Descrição | Valor(R$) |')
print('| 100 | Cachorro-Quente | R$ 9,00 |')
# ... (other products listed)

while True:
    codigo = input('Entre com o código desejado: ')
    if codigo == '100':
        acumulador += 9
        print('Você pediu um Cachorro-Quente no valor de R$9,00!')
    elif codigo == '101':
        acumulador += 11
        print('Você pediu um Cachorro-Quente Duplo no valor de R$11,00!')
    # ... (other conditions for different codes)
    elif codigo == '201':
        acumulador += 4
        print('Você pediu um Chá Gelado no valor de R$4,00!')
    else:
        print('Opção inválida!')
        continue
    resposta = input('Deseja pedir mais alguma coisa? \n 1 - Sim \n 0 - Não ')
    if resposta == '1':
        continue
    else:
        print('O total a ser pago é: R$ {:.2f}'.format(acumulador))  # total to pay
        break
```

# 3 - Solution: Shipping Cost Calculation System for a Logistics Company

## - Context

The logistics company operates between three cities, offering transport services for parcels. The shipping cost is calculated with an equation that takes into account the object's dimensions, its weight, and the chosen route.

## - Implementation in Python

The code is structured into functions for the different parts of the shipping calculation: the object's dimensions, its weight, and the route. Logic was also implemented to handle possible user input errors.

```python
# ------- dimensoesObjeto FUNCTION --------------
def dimensoesObjeto():
    while True:
        try:
            altura = float(input('Digite a altura do objeto (em cm): '))
            largura = float(input('Digite a largura do objeto (em cm): '))
            comprimento = float(input('Digite o comprimento do objeto (em cm): '))
            volume = altura * largura * comprimento
            print('O volume do objeto é (em cm³): {}'.format(volume))
            if volume < 1000:
                return 10.00
            elif 1000 <= volume < 10000:
                return 20.00
            elif 10000 <= volume < 30000:
                return 30.00
            elif 30000 <= volume < 100000:
                return 50.00
            else:
                print('Não aceitamos objetos com dimensões tão grandes. \n Entre com as dimensões desejadas novamente.')
        except ValueError:
            print('Você digitou alguma dimensão do objeto com valor não numérico. \n Entre com as dimensões desejadas novamente.')


# ------- pesoObjeto FUNCTION --------------
def pesoObjeto():
    while True:
        try:
            peso = float(input('Digite o peso do objeto (em Kg): '))
            if peso <= 0.1:
                return 1
            elif 0.1 < peso <= 1:
                return 1.5
            elif 1 < peso <= 10:
                return 2
            elif 10 < peso <= 30:
                return 3
            else:
                print('Não aceitamos objetos tão pesados.\n Entre com o peso desejado novamente.')
        except ValueError:
            print('Você digitou o peso do objeto com valor não numérico. \n Por favor, entre com o peso desejado novamente.')


# ------- rotaObjeto FUNCTION --------------
def rotaObjeto():
    while True:
        rota1 = input('Selecione a rota:\nRS - De Rio de Janeiro até São Paulo\nSR - De São Paulo até Rio de Janeiro\nBS - De Brasília até São Paulo\nSB - De São Paulo até Brasília\nBR - De Brasília até Rio de Janeiro\nRB - Rio de Janeiro até Brasília ')
        if rota1 in ['RS', 'SR', 'BS', 'SB', 'BR', 'RB']:
            if rota1 in ['RS', 'SR']:
                return 1
            elif rota1 in ['BS', 'SB']:
                return 1.2
            elif rota1 in ['BR', 'RB']:
                return 1.5
        else:
            print('Você digitou uma rota que não existe.\n Por favor, entre com a rota desejada novamente.')


# ----------- MAIN ---------------
print('Bem-vindo à companhia de logística')
dimensoes = dimensoesObjeto()
peso = pesoObjeto()
rota = rotaObjeto()
total = dimensoes * peso * rota
print('Total a pagar (R$): {} (Dimensões: {} * Peso: {} * Rota: {})'.format(total, dimensoes, peso, rota))
```

# **4 - Solution: Inventory Control System for a Bicycle Shop**

**- Context:**

The project aims to create inventory control software for a bicycle shop. The system offers features such as registering parts, querying parts by different criteria, and removing parts from stock.

**- Implementation in Python:**

The code is structured into one function per menu option, making the program easier to maintain and understand.

**- cadastrarPeca function:**

- Receives as a parameter a unique code for each registered part.
- Asks for the part's name, manufacturer, and price.
- Stores each part's data in a dictionary.
- Appends the dictionary to the list of parts.

```python
def cadastrarPeca(cdg):
    print('Você selecionou a opção de cadastrar peça.')
    print('Código da peça: {}'.format(cdg))
    nome = input('Por favor, entre com o NOME da peça: ')
    fabricante = input('Por favor, entre com o FABRICANTE da peça: ')
    valor = float(input('Por favor, entre com o VALOR(R$) da peça: '))
    dicionarioPeca = {'código': cdg, 'nome': nome, 'fabricante': fabricante, 'valor': valor}
    listaPeca.append(dicionarioPeca.copy())
```

**consultarPeca function:**

Shows a menu with options to query all parts, parts by code, parts by manufacturer, or return to the main menu. Displays the parts' information according to the chosen option.

```python
def consultarPeca():
    while True:
        try:
            print('Você selecionou a opção de consultar peças')
            opConsultar = int(input('Entre com a opção desejada:\n1- Consultar todas as peças\n2- Consultar peças por código\n3- Consultar peças por fabricante\n4- Retornar\n>>'))
            if opConsultar == 1:
                for peca in listaPeca:
                    for key, value in peca.items():
                        print('{} : {}'.format(key, value))
            elif opConsultar == 2:
                entrada = int(input('Digite o CÓDIGO da peça: '))
                for peca in listaPeca:
                    if peca['código'] == entrada:
                        for key, value in peca.items():
                            print('{} : {}'.format(key, value))
            elif opConsultar == 3:
                entrada = input('Digite o FABRICANTE da peça: ')
                for peca in listaPeca:
                    if peca['fabricante'] == entrada:
                        for key, value in peca.items():
                            print('{} : {}'.format(key, value))
            elif opConsultar == 4:
                return
            else:
                print('Essa opção não existe')
        except ValueError:
            print('Pare de digitar valores não inteiros')
```

**removerPeca function:**

Asks for the code of the product to be removed from the registry. Removes the part from the list of parts.
```python
def removerPeca():
    print('Você selecionou a opção de remover peça.')
    entrada = int(input('Digite o CÓDIGO desejado: '))
    for peca in listaPeca:
        if peca['código'] == entrada:
            listaPeca.remove(peca)
```

**Main function:**

Implements the input commands and the main menu loop. Offers the options to register parts, query parts, remove parts, or exit the program.

```python
print('Bem-vindo ao Controle de Estoque da Bicicletaria')
listaPeca = []  # list of registered parts, shared by the functions above
codigo = 0
while True:
    try:
        opcao = int(input('Escolha a opção desejada:\n'
                          '1- Cadastrar Peças\n'
                          '2- Consultar Peças\n'
                          '3- Remover Peças\n'
                          '4- Sair\n>>'))
        if opcao == 1:
            codigo = codigo + 1
            cadastrarPeca(codigo)
        elif opcao == 2:
            consultarPeca()
        elif opcao == 3:
            removerPeca()
        elif opcao == 4:
            break
        else:
            print('Essa opção não existe')
            continue
    except ValueError:
        print('Pare de digitar valores não inteiros')
```

## Conclusion

Throughout this assignment, we explored the practical application of programming-logic and algorithm concepts in the Python language through four distinct projects. Each project presented specific challenges, providing a broad view of the language's capabilities for developing solutions to real-world problems.

The first project focused on an efficient discount-calculation system for wholesale sales, highlighting the importance of conditional logic in applying tiered discounts. The second brought the customer-business interaction to the foreground, implementing a sales application for a snack bar and demonstrating the relevance of usability and user experience. The third project explored the logistics area, presenting a shipping-cost calculation system for a parcel transport company; here, the emphasis fell on code modularity and the handling of user input, underscoring the importance of organization and algorithmic efficiency.

Finally, the fourth project focused on developing an inventory control system for a bicycle shop. The modular structure and the implementation of specific functions for each system feature highlighted the importance of efficient resource management in business environments.

In summary, this assignment not only provided the opportunity to apply theoretical knowledge of programming logic and algorithms, but also demonstrated the versatility and power of the Python language for solving real-world challenges across several sectors. Developing these projects contributed significantly to improving the practical skills of the programmers involved, preparing them to face future challenges in the vast field of programming.
kalianny20
1,719,266
🚀 Mastering Frontend Development: A Comprehensive Roadmap 🌐💻
Hello there ambitious frontend developers! Are you ready to become a web development rockstar? Look...
0
2024-01-06T18:49:32
https://dev.to/aajinkya/mastering-frontend-development-a-comprehensive-roadmap-30b6
Hello there, ambitious frontend developers! Are you ready to become a web development rockstar? Look no further than this comprehensive roadmap, which will guide you every step of the way. This blog is your friendly companion as you embark on your journey to mastering web development. We'll take you through all the essential skills and latest technologies with easy-to-understand explanations, practical examples, and fun exercises. Whether you're just starting out or you're a seasoned developer, we're here to help you achieve your goals and become the best you can be. So let's dive into the exciting world of web development together and make some magic happen!

🌈 **HTML & CSS Mastery: Building a Strong Foundation**
Start your journey with the basics! Become a pro in HTML for structuring content and CSS for styling. Dive deep into flexbox and grid for advanced layout design. Responsive design is the key to captivating user experiences.

💻 **JavaScript Fundamentals: Elevate Your Skills**
Level up with JavaScript. Learn the language's core concepts: variables, data types, loops, and functions. Familiarize yourself with ES6+ features for modern coding practices.

📚 **Git Mastery: Version Control and Collaboration**
Git is a non-negotiable skill. Master version control, branching strategies, and collaborative workflows. Platforms like GitHub will be your coding playground for seamless collaboration.

🔧 **Responsive Web Design: Creating Adaptive User Experiences**
Explore responsive design frameworks like Bootstrap and CSS frameworks for mobile-first designs. Make media queries your best friends in creating adaptive layouts.

🎨 **CSS Preprocessors: Enhance Your Styling Game**
Elevate your styling with Sass or Less. Master variables, nesting, and mixins to create cleaner and more maintainable stylesheets.

⚙️ **jQuery: Mastering DOM Manipulation**
Although not as trendy as it once was, jQuery is still valuable for DOM manipulation and event handling. A solid understanding can enhance your frontend skills.
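To make the ES6+ point from the JavaScript fundamentals section concrete, here is a tiny illustrative sketch (the names are made up) showing destructuring, spread, arrow functions, and template literals together:

```typescript
// A few ES6+ features in one place (illustrative example)
const user = { name: "Ada", skills: ["HTML", "CSS"] };

const { name, skills } = user;            // object destructuring
const withJs = [...skills, "JavaScript"]; // spread into a new array
const summary = withJs.map(
  (s) => `${name} knows ${s}`,            // arrow function + template literal
);

console.log(summary.join(", "));
// → Ada knows HTML, Ada knows CSS, Ada knows JavaScript
```

Each of these features replaces a more verbose ES5 idiom, which is why they come up so often in modern codebases and interviews.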
📡 **AJAX and APIs: Harnessing Asynchronous Power**
Learn to make asynchronous requests, understand APIs, and fetch data effortlessly. JSON will become an indispensable companion in your coding journey.

🚀 **Frontend Frameworks: Choose Your Powerhouse**
Dive into a frontend framework, whether it's React, Vue, or Angular. Understanding component-based architecture is the cornerstone of modern web development.

🔍 **State Management Mastery: React or Vuex for Vue**
Become proficient in state management, mastering concepts like props, state, and lifting state up in React, or Vuex for Vue.

🎨 **CSS-in-JS: Revolutionizing Styling with JavaScript**
Explore the paradigm shift of styling in JavaScript using libraries like styled-components in React. It's a game-changer worth exploring.

🚚 **Webpack and Module Bundlers: Streamlining Your Workflow**
Demystify module bundlers like Webpack. Learn to bundle, minify, and optimize your code for seamless production deployment.

🔧 **Build Tools: npm and yarn - Your Project Command Center**
Become adept at package managers. npm and yarn are your essential tools for managing project dependencies and scripts efficiently.

🌐 **APIs and RESTful Services: Mastering Integration**
Gain expertise in consuming APIs and working with RESTful services. Postman will be your go-to testing playground for seamless integration.

🚢 **Introduction to Docker: Navigating Containerization**
Delve into containerization with Docker. Understand the basics of containerized applications and unlock their benefits for smoother development.

📊 **Testing: Jest, Enzyme, or Cypress - Ensuring Quality**
Testing is paramount! Explore testing libraries like Jest for unit testing, Enzyme for React, and Cypress for end-to-end testing to ensure top-notch quality.

🌈 **GraphQL Basics: A Modern Approach to Data**
Explore the world of GraphQL as an alternative to REST. Understand its advantages and leverage its capabilities for efficient data management.
📱 **Mobile-First Development: Crafting User-Centric Experiences**
Familiarize yourself with mobile-first development principles. Learn responsive design with a mobile-first mindset for superior user experiences.

🌍 **Progressive Web Apps (PWAs): Future-Proofing Your Applications**
Uncover the concept of PWAs. Learn how to make your web apps more reliable and faster, even in offline mode, ensuring a seamless user experience.

🚀 **Performance Optimization: Maximizing User Delight**
Master the essentials of web performance optimization. Harness browser developer tools for profiling and debugging to deliver a delightful user experience.

📈 **Continuous Integration/Continuous Deployment (CI/CD): Streamlining Deployment**
Embrace CI/CD practices. Automation tools like Jenkins or GitHub Actions can streamline your deployment workflow for efficient releases.

🛠️ **Web Accessibility (A11y): Building Inclusive Experiences**
Ensure your websites are accessible to everyone. Dive into ARIA roles, semantic HTML, and best practices for creating inclusive web experiences.

🔐 **Basic Security Principles: Safeguarding Your Projects**
Understand the basics of web security. Implement HTTPS, follow secure coding practices, and safeguard against common security threats.

💡 **Stay Updated: Follow Industry Trends for Success**
Frontend development evolves rapidly. Stay ahead by following blogs, podcasts, and communities to remain updated on the latest trends and technologies.

Remember, it's not just about the destination but the journey. Enjoy the process of learning and building amazing things! Feel free to ask questions or share your own tips along the way. Happy coding! 👩‍💻👨‍💻

#FrontendDevelopment #WebDevelopment #CodingJourney
aajinkya
1,719,492
NestJS with RabbitMQ in a Monorepo: Building a Scalable Credit Card Payment System with Decoupled API and Consumers
Introduction In this article, we will explore the creation of a credit card payment...
0
2024-01-07T18:17:37
https://dev.to/eduardoconti/nestjs-with-rabbitmq-in-a-monorepo-building-a-scalable-credit-card-payment-system-with-decoupled-api-and-consumers-58bb
## Introduction

In this article, we will explore the creation of a credit card payment application using NestJS and RabbitMQ to handle billing generation with a Payment Service Provider (PSP). Additionally, we will incorporate Docker and Docker Compose to streamline container management.

Initially, we will create an endpoint simulating the billing creation process, performing a database insert and an HTTP request. We will observe that the response time for this endpoint is unacceptably high, around 1100 ms. Next, we will introduce a message broker service to asynchronously process the HTTP request for billing generation, resulting in a significant reduction in the endpoint's response time. To conclude, we will address the separation of the consumer from the API in a monorepo, enabling both to scale independently.

I want to highlight that I developed this project using Clean Architecture, but you have the flexibility to choose the architecture that best suits your needs.

## Why should I use asynchronous processing and RabbitMQ?

- **Asynchronous Processing for Improved Performance**: The introduction reveals that the initial response time for the billing creation process is unacceptably high, around 1100 ms. Introducing a message broker allows you to shift to asynchronous processing, significantly reducing the endpoint's response time. RabbitMQ excels at handling asynchronous communication, enabling your application to scale more efficiently.
- **Enhanced Scalability and Responsiveness**: By leveraging RabbitMQ, you can decouple the billing generation process from the API endpoint. This decoupling facilitates improved scalability, as both the API and the billing generation service can scale independently. This flexibility ensures that your application remains responsive, even under increased load.
- **Reliable Message Delivery**: RabbitMQ provides reliable message delivery mechanisms, ensuring that messages are successfully delivered even in the event of system failures or network issues. This reliability is crucial in financial applications like credit card payment processing, where data integrity and consistency are paramount.

In summary, utilizing RabbitMQ in conjunction with NestJS for a credit card payment application brings tangible benefits such as improved performance, scalability, reliability, and flexibility, making it a well-rounded choice for handling asynchronous communication and optimizing your application's overall architecture.

## 1. Start a new NestJS app

`$ nest new nestjs-rabbitmq-example`

_See these changes in the_ [commit](https://github.com/eduardoconti/nestjs-rabitmq-example/commit/197cdfda14b31a34b8a1b9d87e6da00a35d52e6e).

After installation, I removed app.controller.ts and app.service.ts.

_See these changes in the_ [commit](https://github.com/eduardoconti/nestjs-rabitmq-example/commit/616ff73936f7f2415a7529c91d62dac85ca9c015).

## 2. Include an entity, controller, and services to simulate the creation of a charge using a payment service provider such as Pagarme

`$ nest g module credit-card`

_You can see the classes created in this_ [commit](https://github.com/eduardoconti/nestjs-rabitmq-example/commit/6233d66380eb6153a12181ebf0f4adaf06101d8b).

![folder structure](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/6mn3lqbmp9xenvpphg5z.png)

Now we have an endpoint that **simulates** the creation of a credit card charge, as if inserting into the database (100 ms) and then making an HTTP request to _Pagarme_ (1000 ms).

![postman](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/80ur3weg042oegvdxfxa.png)

In this scenario, the client should not be forced to linger on the HTTP request, awaiting a response from Pagarme; we can handle this process asynchronously instead.
To achieve this, we will leverage **RabbitMQ** to queue the creation requests for charges. ##3. Add the necessary libs to implement rabbitmq `$ yarn add @nestjs/microservices amqplib amqp-connection-manager` _See these changes in the_ [commit](https://github.com/eduardoconti/nestjs-rabitmq-example/commit/6ac0d9b411a9b2505f0fb4dfffcfb23cd5142855). ##4. Dockerize application Dockerfile ``` FROM node:16 WORKDIR /usr/src/app COPY package*.json ./ RUN yarn COPY . . RUN yarn build EXPOSE 3000 CMD [ "yarn", "start:prod" ] ``` docker-compose.yml ``` version: '3.7' services: credit-card-api: container_name: credit-card-api restart: on-failure build: context: . volumes: - .:/usr/src/app ports: - 3000:3000 command: yarn start:dev depends_on: - rabbitmq rabbitmq: image: rabbitmq:3.9-management container_name: rabbitmq restart: always hostname: rabbitmq ports: - 5672:5672 - 15672:15672 volumes: - rabbitmq_data:/var/lib/rabbitmq volumes: rabbitmq_data: ``` _See these changes in the_ [commit](https://github.com/eduardoconti/nestjs-rabitmq-example/commit/0d3b5bddf812e3cd36194585554cadbae8bb2ee2). ##4. Connect application with rmq update file main.ts to use _app.connectMicroservice()_ and _app.startAllMicroservices()_ I'm not going to delve into rabbitmq settings. 
```
import { NestFactory } from '@nestjs/core';
import { AppModule } from './app.module';
import { RmqOptions, Transport } from '@nestjs/microservices';

async function bootstrap() {
  const app = await NestFactory.create(AppModule);
  app.connectMicroservice<RmqOptions>({
    transport: Transport.RMQ,
    options: {
      urls: [`amqp://rabbitmq:5672`],
      queue: 'create_charge_psp',
      prefetchCount: 1,
      persistent: true,
      noAck: false,
      queueOptions: {
        durable: true,
      },
      socketOptions: {
        heartbeatIntervalInSeconds: 60,
        reconnectTimeInSeconds: 5,
      },
    },
  });
  await app.startAllMicroservices();
  await app.listen(3000);
}
bootstrap();
```

_See these changes in the_ [commit](https://github.com/eduardoconti/nestjs-rabitmq-example/commit/6f0b6796042cbd34cdfdc3b1db6d087540246cf4).

## 5. Start the application

`$ docker-compose up --build`

![app running console](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/x5jkkc7fmzpwnz0lptbf.png)

Access the RabbitMQ panel at [http://localhost:15672](http://localhost:15672) with the default credentials (login: **guest**, password: **guest**).

![rmq panel](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/mnnjnmn17ys6urzertl8.png)

## 6. Create the publisher and consumer

create-charge.publisher.ts

```
import { Inject } from '@nestjs/common';
import { ClientProxy } from '@nestjs/microservices';
import { catchError, firstValueFrom, throwError } from 'rxjs';
import { CreateChargeInputProps } from 'src/credit-card/domain/contracts/psp-service.interface';

export class CreateChargePublisher {
  constructor(
    @Inject('create_charge_publisher')
    private readonly clientProxy: ClientProxy,
  ) {}

  async publish(data: CreateChargeInputProps): Promise<void> {
    await firstValueFrom(
      this.clientProxy.emit('CREATE_CHARGE_PSP', data).pipe(
        catchError((exception: Error) => {
          return throwError(() => new Error(exception.message));
        }),
      ),
    );
  }
}
```

create-charge-on-psp.event-handler.ts

```
import { Controller, Inject } from '@nestjs/common';
import { Ctx, EventPattern, Payload, RmqContext } from '@nestjs/microservices';
import {
  CreateChargeInputProps,
  ICreateCharge,
} from 'src/credit-card/domain/contracts/psp-service.interface';
import { Pagarme } from 'src/credit-card/infra/psp/pagarme/pagarme.service';

@Controller()
export class CreateChargeOnPSPEventHandler {
  constructor(
    @Inject(Pagarme)
    private readonly pspService: ICreateCharge,
  ) {}

  @EventPattern('CREATE_CHARGE_PSP')
  async handle(
    @Payload() payload: CreateChargeInputProps,
    @Ctx() context: RmqContext,
  ): Promise<void> {
    console.log(payload);
    const channel = context.getChannelRef();
    const originalMsg = context.getMessage();
    try {
      await this.pspService.createCharge(payload);
    } catch (error) {
      console.log(error);
    }
    channel.ack(originalMsg);
  }
}
```

credit-card.module.ts

```
import { Module } from '@nestjs/common';
import { Pagarme } from './infra/psp/pagarme/pagarme.service';
import { CreateChargeUseCase } from './app/use-cases/create-charge.use-case';
import { CreateChargeController } from './presentation/controllers/create-charge.controller';
import { CreateChargeOnPSPEventHandler } from './presentation/event-handler/create-charge-on-psp.event-handler';
import { CreateChargePublisher } from './infra/rmq/publisher/create-charge.publisher';
import {
  ClientProxy,
  ClientProxyFactory,
  Transport,
} from '@nestjs/microservices';
import { ICreateCharge } from './domain/contracts/psp-service.interface';

@Module({
  providers: [
    Pagarme,
    {
      provide: CreateChargeUseCase,
      useFactory: (pspService: ICreateCharge) => {
        return new CreateChargeUseCase(pspService);
      },
      inject: [Pagarme],
    },
    CreateChargePublisher,
    {
      provide: 'create_charge_publisher',
      useFactory: (): ClientProxy => {
        return ClientProxyFactory.create({
          transport: Transport.RMQ,
          options: {
            urls: [`amqp://rabbitmq:5672`],
            queue: 'create_charge_psp',
            prefetchCount: 1,
            persistent: true,
            noAck: true,
            queueOptions: {
              durable: true,
            },
            socketOptions: {
              heartbeatIntervalInSeconds: 60,
              reconnectTimeInSeconds: 5,
            },
          },
        });
      },
    },
  ],
  controllers: [CreateChargeController, CreateChargeOnPSPEventHandler],
})
export class CreditCardModule {}
```

_See these changes in the_ [commit](https://github.com/eduardoconti/nestjs-rabitmq-example/commit/558df94b3b896f7728f71fbde440597b0e893b1b).

## 7. Send charge data to the RabbitMQ queue

Change the use case to send the charge data to the RabbitMQ queue instead of calling Pagarme directly.

create-charge.use-case.ts

```
import {
  CreateChargeInputProps,
  CreateChargeOutputProps,
} from 'src/credit-card/domain/contracts/psp-service.interface';
import { CreditCardChargeEntity } from 'src/credit-card/domain/entities/credit-card-charge.entity';
import { ICreateChargeUseCase } from 'src/credit-card/domain/use-cases/create-charge.use-case';
import { IPublisherCreateCharge } from 'src/credit-card/infra/rmq/publisher/create-charge.publisher';

export class CreateChargeUseCase implements ICreateChargeUseCase {
  constructor(private readonly publisher: IPublisherCreateCharge) {}

  async execute(
    props: CreateChargeInputProps,
  ): Promise<Omit<CreateChargeOutputProps, 'pspId' | 'value'>> {
    const entity = CreditCardChargeEntity.newCharge(props);
    console.log(entity);
    await new Promise((resolve) => setTimeout(resolve, 100)); // simulate database access
    await this.publisher.publish(props);
    return { ...props, status: 'PENDING' };
  }
}
```

_See these changes in the_ [commit](https://github.com/eduardoconti/nestjs-rabitmq-example/commit/51b862d9d44dc0cace531567f75378e24a3c5531).

Now our endpoint for creating a credit card charge has an acceptable response time.

![postman](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/9bnuqcz62vlj3mgo1u4q.png)

Many would stop here, and I've seen products in production following this approach. While it's not inherently incorrect, it prevents us from scaling the consumer and the API independently, both horizontally and vertically. Additionally, the processing load on the consumer can adversely affect API response times. To address this, we will decouple startAllMicroservices() from listen() by shifting to a monorepo structure and developing a dedicated app for the consumer, which improves scalability and performance.

## 8. Switch from standard mode to monorepo mode

Let's change the project structure to a monorepo.

`$ nest generate app credit-card-consumer`

_See these changes in the_ [commit](https://github.com/eduardoconti/nestjs-rabitmq-example/commit/142c3f2027d721d0d72bbf23c47427044ef6d46a).

We also need to modify the scripts responsible for building and launching the application, craft a Dockerfile tailored for the consumer, configure it to spawn three replicas, and fine-tune the docker-compose.yml configuration accordingly.

_See these changes in the_ [commit](https://github.com/eduardoconti/nestjs-rabitmq-example/commit/add7851d733339ccbc781efe35e7db1bfd5be813).

## 9. Start the application again

`$ docker-compose up --build`

Now, running the `$ docker ps` command, we can see three instances of the consumer and one of the API running.

![containers](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/2eluf984eazwqxsw8611.png)

## 10. Decouple the consumer from the API

Currently, the consumer remains tightly coupled to the API, as we've only created the new credit-card-consumer app without implementing any modifications. The next step involves decoupling these components by transferring the responsibility of starting the consumer to the credit-card-consumer application.
credit-card-consumer/src/main.ts

```
import { NestFactory } from '@nestjs/core';
import { CreditCardConsumerModule } from './credit-card-consumer.module';
import { RmqOptions, Transport } from '@nestjs/microservices';

async function bootstrap() {
  const app = await NestFactory.create(CreditCardConsumerModule);
  app.connectMicroservice<RmqOptions>({
    transport: Transport.RMQ,
    options: {
      urls: [`amqp://rabbitmq:5672`],
      queue: 'create_charge_psp',
      prefetchCount: 1,
      persistent: true,
      noAck: false,
      queueOptions: {
        durable: true,
      },
      socketOptions: {
        heartbeatIntervalInSeconds: 60,
        reconnectTimeInSeconds: 5,
      },
    },
  });
  await app.startAllMicroservices();
}
bootstrap();
```

Note that here we do not need to invoke the app.listen() method.

nestjs-rabbitmq-example/src/main.ts

```
import { NestFactory } from '@nestjs/core';
import { AppModule } from './app.module';

async function bootstrap() {
  const app = await NestFactory.create(AppModule);
  await app.listen(3000);
}
bootstrap();
```

Here we keep only the app.listen() method. I have also removed the services and controllers generated by the `$ nest generate app credit-card-consumer` command. You can see those changes in this [commit](https://github.com/eduardoconti/nestjs-rabitmq-example/commit/bed1ea5cc07d77616b3b1f87889f6f094ed54560).

To finalize the decoupling, we need to relocate the event handler to the credit-card-consumer, since it is responsible for both removing and processing messages from the queue. However, a challenge arises: the event handler relies on the Pagarme class inside the nestjs-rabbitmq-example structure. To resolve this, let's move all components shared between the API and the consumer to the 'libs' folder, using the 'nest generate library' resource. Importantly, we must never import a class from the API into the consumer, under any circumstances.
I moved all the shared code to the credit-card lib; basically, I kept only the _presentation_ layer in the API and the consumer. Now our API is completely decoupled from the consumer, allowing us to scale each component independently.

_See these changes in the_ [commit](https://github.com/eduardoconti/nestjs-rabitmq-example/commit/51d89b624a1c4c37959057db8e76e3166a95177b).

## 11. Limit CPU and memory for the containers

Setting resource limits gives us a controlled environment by restricting how much RAM and CPU each container can use.

_See these changes in the_ [commit](https://github.com/eduardoconti/nestjs-rabitmq-example/commit/0214f4c86066394f391f736e413dfc0c80ebfe3e).

## 12. Conclusion

This article outlined the journey of creating a credit card payment application, emphasizing the integration of technologies such as NestJS, RabbitMQ, Docker, Docker Compose, and a monorepo. To address the initial challenge of poor endpoint response time, we introduced a message broker so HTTP requests could be processed asynchronously, resulting in significant improvements in efficiency and responsiveness. We then separated the consumer from the API within a monorepo, providing flexibility and individual scalability for both parts of the system.

It is worth noting that, while I opted for Clean Architecture, the choice of architecture and implementation within a monorepo remains at the developer's discretion. This approach not only enhances application performance but also provides a solid foundation for future adaptations as needs evolve. I hope this article serves as a useful guide for those seeking to optimize efficiency and scalability in their payment applications.
repo: [nestjs-rabitmq-example](https://github.com/eduardoconti/nestjs-rabitmq-example)

links:

- [NestJS monorepo docs](https://docs.nestjs.com/cli/monorepo)
- [NestJS RabbitMQ docs](https://docs.nestjs.com/microservices/rabbitmq)
- [RabbitMQ docs](https://www.rabbitmq.com/#getstarted)
eduardoconti
1,719,579
Best 5 CSS Frameworks
CSS frameworks have revolutionized the way websites are designed and developed. These frameworks...
0
2024-01-09T10:20:20
https://dev.to/wahidmaster00/best-5-css-frameworks-4j3h
CSS frameworks have revolutionized the way websites are designed and developed. These frameworks provide a set of pre-built components and styles that make it easier for web developers to create beautiful and responsive websites. In this section, we will explore a few CSS frameworks that every web developer should be familiar with.

## 1. Chota

Chota is a key component of the [Blaze UI](https://www.blazeui.com/) framework, providing a lightweight and flexible solution for building user interfaces. In this section, we will explore the features and benefits of Chota, and how it can enhance the development process.

One of the main advantages of Chota is its simplicity. With a minimalistic approach, Chota focuses on providing only the essential styles and components, making it easy to understand and use. This simplicity not only makes the framework lightweight but also allows for faster loading times, improving the overall performance of the application.

Another benefit of Chota is its modular nature. The framework is divided into small, reusable components that can be easily combined to create complex user interfaces. This modularity promotes code reusability and maintainability, as developers can easily update or replace individual components without affecting the entire application.

Chota also offers a [responsive grid system](https://en.wikipedia.org/wiki/CSS_grid_layout), which is crucial for building modern, mobile-friendly interfaces. The grid system allows developers to create flexible layouts that adapt to different screen sizes and devices. This ensures a consistent user experience across various platforms and improves accessibility.

In addition to the grid system, Chota provides a wide range of pre-styled components, such as buttons, forms, and navigation menus. These components are designed to be customizable, allowing developers to easily modify their appearance to match the branding and design requirements of the application.
This saves development time and effort, as developers can focus on building the core functionality of the application instead of styling every individual component from scratch.

Chota also supports CSS variables, which enable developers to define and reuse custom styles throughout the application. This feature promotes consistency and makes it easier to maintain a unified design language across different pages and components.

Furthermore, Chota is compatible with modern web development tools and workflows. It can be easily integrated with popular build systems like webpack or Parcel, allowing developers to take advantage of features such as hot module replacement and code splitting. This enhances the development experience and enables faster iteration and deployment of changes.

In summary, Chota is a lightweight, modular, and responsive framework that simplifies the process of building user interfaces. Its simplicity, modularity, and compatibility with modern web development tools make it an excellent choice for developers looking to create efficient and visually appealing applications. By leveraging the features and benefits of Chota, developers can streamline their workflow, save time, and deliver high-quality user experiences.

## 2. UIkit

UIkit is a lightweight and modular front-end framework that provides a comprehensive set of CSS and JavaScript components. It is designed to be easy to use, flexible, and customizable, making it a popular choice for building modern and responsive user interfaces.

One of the key features of UIkit is its modular architecture. It allows developers to pick and choose the components they need for their project, reducing the overall file size and improving performance. This modular approach also makes it easy to customize and extend UIkit to fit specific design requirements.

UIkit provides a wide range of components that cover everything from basic elements like buttons and forms to more complex components like sliders and modals.
Each component is designed to be highly customizable, with a variety of options and styles available out of the box. This allows developers to create unique and visually appealing interfaces without having to write a lot of custom CSS.

Another advantage of UIkit is its responsive grid system. The grid system is based on flexbox, which makes it easy to create responsive layouts that adapt to different screen sizes. The grid system also includes a variety of utility classes that can be used to control the visibility and alignment of elements on different devices.

In addition to its CSS components, UIkit also provides a set of JavaScript plugins that enhance the functionality of the framework. These plugins include features like dropdown menus, tooltips, and modals, which can be easily integrated into any UIkit project. The JavaScript plugins are designed to be lightweight and efficient, ensuring optimal performance.

UIkit also includes a number of built-in themes that can be used as a starting point for designing a website or application. These themes provide a consistent and professional look and feel, and can be easily customized to match the branding and style of a project.

Overall, UIkit is a powerful and versatile front-end framework that offers a wide range of components and features for building modern and responsive user interfaces. Its modular architecture, customizable design, and extensive documentation make it a popular choice among developers. Whether you are a beginner or an experienced developer, UIkit provides the tools and resources you need to create stunning and functional UIs.

## 3. Ant Design

Ant Design is a popular UI framework that is widely used in web development. It provides a comprehensive set of components and design principles that make it easy to create visually appealing and user-friendly interfaces. In this section, we will explore the key features and benefits of Ant Design.
One of the main advantages of Ant Design is its extensive library of pre-built components. These components cover a wide range of UI elements, including buttons, forms, tables, modals, and more. By using these components, developers can save time and effort in building their user interfaces from scratch. The components are also highly customizable, allowing developers to easily modify their appearance and behavior to suit their specific needs.

Another notable feature of Ant Design is its adherence to design principles. The framework follows the principles of the Ant Design System, which emphasizes simplicity, consistency, and usability. This ensures that the user interface remains intuitive and easy to navigate, enhancing the overall user experience. The design principles also promote accessibility, making the UI accessible to users with disabilities.

In addition to its components and design principles, Ant Design also offers a range of tools and resources to aid developers in their workflow. The framework provides a command-line interface (CLI) tool that simplifies the setup and configuration process. It also offers a comprehensive documentation website, which includes detailed guides, examples, and API references. This documentation makes it easy for developers to get started with Ant Design and find answers to their questions.

One of the key benefits of using Ant Design is its compatibility with different platforms and frameworks. The framework is built on top of React, a popular JavaScript library for building user interfaces. This means that developers can easily integrate Ant Design into their React projects. Ant Design also provides support for other frameworks, such as Angular and Vue.js, allowing developers to leverage its components and design principles in their preferred framework.

Another advantage of Ant Design is its active and supportive community.
The framework has a large and vibrant community of developers who actively contribute to its development and provide support to fellow developers. This community-driven approach ensures that Ant Design remains up-to-date and responsive to the needs of its users. It also means that developers can easily find help and resources when they encounter issues or have questions.

## 4. Semantic UI

Semantic UI is another CSS framework that emphasizes the use of semantic HTML and expressive classes. It provides a wide range of UI components and a responsive grid system. What sets Semantic UI apart is its intuitive and human-friendly class names, which make it easier to understand and use. With Semantic UI, you can quickly build modern and visually appealing websites without sacrificing accessibility and maintainability.

## 5. Materialize

Materialize CSS is a CSS framework based on Google's Material Design guidelines. It offers a comprehensive set of UI components and styles that follow the Material Design principles. Materialize CSS provides a responsive grid system, typography, and a wide range of ready-to-use components. With its clean and modern design, Materialize CSS is a great choice for developers who want to create visually stunning websites that are consistent with the Material Design language.

## Conclusion

We have explored various CSS frameworks that can greatly enhance the development process and improve the overall user experience. These frameworks offer a wide range of features and functionalities, making it easier for developers to create visually appealing and responsive websites. Let's recap some of the key frameworks we discussed.

One of the frameworks we explored is Chota. Chota is a lightweight CSS framework that focuses on simplicity and minimalism. It provides a set of utility classes and components that can be easily customized to fit any project's needs.
With Chota, developers can quickly build responsive layouts and create consistent designs across different devices.

Another framework we looked at is UIkit. UIkit is a comprehensive CSS framework that offers a wide range of components, such as grids, forms, and navigation menus. It also includes a powerful JavaScript library that enhances the functionality of these components. UIkit is known for its modular approach, allowing developers to pick and choose the components they need for their projects.

Ant Design is another popular CSS framework that we discussed. It is primarily designed for building enterprise-level applications with a focus on usability and accessibility. Ant Design provides a set of well-designed components that can be easily customized to match the branding of any application. It also offers a rich set of documentation and resources, making it easier for developers to get started with the framework.

Semantic UI is a framework that emphasizes the use of semantic HTML and intuitive class names. It provides a wide range of UI components and themes that can be easily integrated into any project. Semantic UI also offers a powerful theming system, allowing developers to customize the look and feel of their applications with ease.

Lastly, we explored Materialize, a modern CSS framework that follows Google's Material Design guidelines. Materialize provides a set of responsive components and utilities that enable developers to create visually stunning and interactive websites. It also includes a robust JavaScript library that adds additional functionality to these components.

Overall, these CSS frameworks offer a wealth of resources and tools that can greatly simplify the development process and enhance the user experience. Whether you're looking for a lightweight framework like Chota or a comprehensive one like UIkit, there is a CSS framework out there to suit your needs.
By leveraging these frameworks, developers can save time, improve productivity, and create visually appealing websites that engage and delight users.
wahidmaster00
1,719,629
How to Write Scalable and Testable Laravel Production App
I have been working full-time as a software engineer for a little over three years. During this...
0
2024-01-07T08:53:37
https://dev.to/ikechukwu/how-to-write-scalable-and-testable-laravel-production-app-5d78
I have been working full-time as a software engineer for a little over three years. During this period I have noticed a couple of things:

1. Microservices are overkill in most instances where folks use them.
2. Most monolith code bases aren't scalable or maintainable.

But what does it mean for your code to be scalable? It involves a couple of things; for this article I will give a simplified but important definition: scalability is the ability of your code to accommodate more requirements and features without breaking or boxing the developer into a corner.

The insights I will share in this article can help you write more maintainable and scalable monolith code, and even better microservices if you must write one. While I will use Laravel as an example, this article applies to any MVC-based framework. Shall we?

Writing scalable and maintainable code begins with understanding the limitations inherent in your framework's software architecture, and that would be MVC in the case of Laravel. To understand the limitations of MVC, read this article from Uber about their engineering challenges: https://www.uber.com/en-NG/blog/new-rider-app-architecture/

Here is the catch:

> First, matured MVC architectures often face the struggles of massive view controllers. For instance, the RequestViewController, which started off as 300 lines of code, is over 3,000 lines today due to handling too many responsibilities: business logic, data manipulation, data verification, networking logic, routing logic, etc. It has become hard to read and modify.

This whole thing is about making your controller as lean as possible. What are the things that mess up your controller?

* Validation logic
* Business logic
* Data manipulation and redirection
* API vs web request handling
* Non-CRUD controllers

## Validation Logic

For every form submission, create a new Request class for form validation.
```
<?php

namespace App\Http\Requests;

use Illuminate\Foundation\Http\FormRequest;

class StoreReviewRequest extends FormRequest
{
    /**
     * Determine if the user is authorized to make this request.
     */
    public function authorize(): bool
    {
        return true;
    }

    /**
     * Get the validation rules that apply to the request.
     *
     * @return array<string, \Illuminate\Contracts\Validation\ValidationRule|array|string>
     */
    public function rules(): array
    {
        return [
            'body' => 'required|string',
            'rating' => 'required|string',
        ];
    }
}
```

Use it as follows inside the reviews controller:

```
public function store(StoreReviewRequest $request)
{
    return DB::transaction(function () use ($request) {
        $this->service->storeUserRestaurantReview($request->all());

        Session::flash('success', 'Thank you for leaving a review');

        return redirect()->back();
    });
}
```

The magic above may not seem like a lot until the number of fields you want to validate grows. Get more context [here](https://laracasts.com/discuss/channels/laravel/what-is-the-cleanest-way-to-split-validation-logic-into-multiple-files).

## Business Logic

You will write lots of logic and manipulate lots of data; in the interest of maintainability and scalability, you should do this in a service class. This is called the service pattern. Create a folder named "Services", subfolder it if necessary, or even version it for the API. Inside this folder, write your services. Now take a second look at the code above and you will see where I used the ReviewService class. Inside the class, you can write all the methods you need for reviews.

Most of your business logic should be in service classes. However, logic that deals strictly with data retrieval or storage can live in your models. By following this, you have small pieces of code that can be tested via unit tests and integration tests. That, my friend, is your code becoming maintainable.

If necessary you could also use a repository pattern.
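The service class itself is not shown in the original code. As a rough sketch (the namespace, model, and column names here are illustrative assumptions, not code from the actual project), a `ReviewService` might look like this:

```
<?php

namespace App\Services;

use App\Models\Review;

class ReviewService
{
    /**
     * Persist a restaurant review. All business rules for reviews
     * live here, keeping the controller lean.
     * Note: the Review model and its columns are hypothetical.
     */
    public function storeUserRestaurantReview(array $data): Review
    {
        return Review::create([
            'body' => $data['body'],
            'rating' => $data['rating'],
        ]);
    }
}
```

The controller then receives this service via constructor injection and simply delegates to it, which is what makes the `store` method above so short.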
Create a folder in the App folder called "Repository" and, inside it, a file named "EloquentRepositoryInterface.php":

```
<?php

namespace App\Repository;

use Illuminate\Database\Eloquent\Model;
use Illuminate\Database\Eloquent\Collection;

interface EloquentRepositoryInterface
{
    public function all(array $column = ['*'], array $relations = []): Collection;

    public function findById(
        int $modelId,
        array $column = ['*'],
        array $relations = [],
        array $appends = []
    ): ?Model;

    public function create(array $payload): ?Model;

    public function update(int $modelId, array $payload): bool;
}
```

Inside the Repository folder, create a folder called Eloquent and, inside it, a file called BaseRepository.php:

```
<?php

namespace App\Repository\Eloquent;

use App\Repository\EloquentRepositoryInterface;
use Illuminate\Database\Eloquent\Collection;
use Illuminate\Database\Eloquent\Model;

class BaseRepository implements EloquentRepositoryInterface
{
    protected $model;

    public function __construct(Model $model)
    {
        $this->model = $model;
    }

    public function all(array $column = ['*'], array $relations = []): Collection
    {
        return $this->model->with($relations)->get($column);
    }

    public function findById(
        int $modelId,
        array $column = ['*'],
        array $relations = [],
        array $appends = []
    ): ?Model {
        return $this->model
            ->select($column)
            ->with($relations)
            ->findOrFail($modelId)
            ->append($appends);
    }

    public function create(array $payload): ?Model
    {
        $model = $this->model->create($payload);

        return $model->fresh();
    }

    public function update(int $modelId, array $payload): bool
    {
        $model = $this->findById($modelId);

        return $model->update($payload);
    }
}
```

Now we have a MovieRepository file inside Repository/Eloquent:

```
<?php

namespace App\Repository\Eloquent;

use App\Models\Movie;
use App\Repository\MovieRepositoryInterface;

class MovieRepository extends BaseRepository implements MovieRepositoryInterface
{
    protected $model;

    public function __construct(Movie $model)
    {
        $this->model = $model;
    }
}
```

You can see it implements an interface inside the Repository folder; here is the code:

```
<?php

namespace App\Repository;

interface MovieRepositoryInterface extends EloquentRepositoryInterface
{
}
```

Now let us see how to use this code:

```
<?php

namespace App\Http\Controllers;

use Illuminate\Http\Request;
use App\Repository\MovieRepositoryInterface;
use App\Repository\LocationRepositoryInterface;
use Illuminate\Support\Facades\Session;

class MoviesController extends Controller
{
    private $movieRepository;
    private $locationRepository;

    public function __construct(MovieRepositoryInterface $movieRepository)
    {
        $this->movieRepository = $movieRepository;
    }

    public function create(LocationRepositoryInterface $locationRepository)
    {
        $this->locationRepository = $locationRepository;
        $locations = $this->locationRepository->all();

        return view('pages.movies.create')->with('locations', $locations);
    }

    public function store(Request $request)
    {
        // I AM SKIPPING THE FIELD VALIDATION AND PURIFICATION STEP HERE.
        $id = auth()->user()->id;

        $payload = [
            'name' => $request->title,
            'descript' => $request->description,
            'poster' => $request->poster,
            'showtime' => $request->showtime,
            'location' => $request->location,
            'userid' => $id,
        ];

        $this->movieRepository->create($payload);

        Session::flash('success', 'New movie created');

        return redirect()->route('new_movie');
    }
}
```

For the above to work, you must register a RepositoryServiceProvider as follows:

```
<?php

namespace App\Providers;

use Illuminate\Support\ServiceProvider;
use App\Repository\EloquentRepositoryInterface;
use App\Repository\Eloquent\BaseRepository;
use App\Repository\MovieRepositoryInterface;
use App\Repository\Eloquent\MovieRepository;
use App\Repository\LocationRepositoryInterface;
use App\Repository\Eloquent\LocationRepository;

class RepositoryServiceProvider extends ServiceProvider
{
    /**
     * Register services.
     *
     * @return void
     */
    public function register()
    {
        $this->app->bind(EloquentRepositoryInterface::class, BaseRepository::class);
        $this->app->bind(MovieRepositoryInterface::class, MovieRepository::class);
        $this->app->bind(LocationRepositoryInterface::class, LocationRepository::class);
    }

    /**
     * Bootstrap services.
     *
     * @return void
     */
    public function boot()
    {
        //
    }
}
```

Then, inside AppServiceProvider, you must register the RepositoryServiceProvider as follows:

```
<?php

namespace App\Providers;

use Illuminate\Support\ServiceProvider;
use Illuminate\Support\Facades\Schema;

class AppServiceProvider extends ServiceProvider
{
    /**
     * Register any application services.
     *
     * @return void
     */
    public function register()
    {
        $this->app->register(RepositoryServiceProvider::class);
    }

    /**
     * Bootstrap any application services.
     *
     * @return void
     */
    public function boot()
    {
        // Fix for MySQL < 5.7.7 and MariaDB < 10.2.2
        Schema::defaultStringLength(191);
    }
}
```

The repository pattern introduces more abstraction into your code; as such, I think you should not use it unless it's truly necessary.
## API vs Web Request Handling

You can use the same controllers for web and API, but I recommend using separate controllers for web and API that share code through the service class. The service class then needs to be versioned. This keeps things DRY.

## Non-CRUD Controllers

Ideally, your controllers should be pure CRUD (create, read, update, delete), but some devs define other kinds of methods in there. Any method that cannot fit into these should live in an invokable controller. Keep your controllers CRUD or invokable. Invokable controllers are controllers with a single `__invoke()` method that executes when the controller is called.

## Beyond Framework Limitations

I mentioned earlier the importance of understanding the limitations of the software architecture your framework of choice uses. Beyond that, you also need to understand the functionality available in your framework so you don't have to reinvent the wheel while using it. For instance, Laravel has many ways of authorizing who sees and acts on a resource. A deep understanding of options like gates, policies, and Spatie permissions will save tons of lines of unnecessary code and gnashing of teeth.

## Tests

With the few points above, you can proceed to add unit and integration tests to your code to keep track of things each time you make a change.

## Database Integrity

For things not to blow apart, and to avoid funny edge-case malfunctions in your application, I strictly recommend you enforce data integrity with military precision. First, use foreign keys in your relationships so it becomes illegal to delete resources that are still referenced. Use database transactions whenever you are writing to more than one table; this keeps your data consistent across tables.

## Conclusion

With the few points we have discussed in this article, your code will be very maintainable and scalable. It will be easy to modify existing features or add new ones without breaking things.
This approach will make your monolith application incredibly maintainable and scalable. If you are building microservices, it will help keep things a lot saner.

_**I am Ikechukwu Vincent, a Full Stack Software Engineer proficient in PHP, Python, Node.js, and React. I specialize in building B2B multi-tenancy applications.**_

Connect with me on socials:

* LinkedIn: [Vincent](https://www.linkedin.com/in/ikechukwu-vincent-002934176/)
* Facebook: [Vincent](https://web.facebook.com/ikechukwu.unegbu.14)
* Twitter: [Vincent](https://twitter.com/TheV_Exe)
ikechukwu
1,719,722
Map Concept in JavaScript
A post by Emin Altan
25,917
2024-01-07T12:06:06
https://dev.to/eminaltan/map-concept-in-javascript-b5c
webdev, javascript, tutorial, series
{% embed https://gist.github.com/eminaltan/b5a1bbd6c48ee620ce968ad0dc33235a %}
eminaltan
1,719,922
Day 7: Unleashing Functions in Rust 🚀
Greetings, code enthusiasts! On Day 7 of my #100DaysOfCode journey with Rust, let's dive into the...
25,971
2024-01-07T14:46:58
https://dev.to/aniket_botre/day-7-unleashing-functions-in-rust-2d9c
rust, 100daysofcode, programming, self
> Greetings, code enthusiasts! On Day 7 of my #100DaysOfCode journey with Rust, let's dive into the world of functions - the building blocks of any great code composition.

## Declaring Functions: The Rustic Way 🛠️

In Rust, functions start with the `fn` keyword, followed by a name, parentheses, and curly braces. You've already met the star of the show, the **main** function - the entry point of every Rust program:

```rust
fn main() {
    greet();
}

fn greet() {
    println!("Greetings, Rustacean!");
}
```

---

### Quick Notes on Rustic Etiquette 📜

- Function names in Rust embrace `snake_case`: all lowercase with underscores between words.
- Choose names that are both descriptive and expressive to convey the function's purpose clearly.
- Function names cannot start with a number; Rust identifiers must begin with a letter or underscore.

---

## Adding Flavor: Functions with Parameters 🌶️

Let's spice things up with functions that take parameters:

```rust
fn main() {
    greet("John Doe");
}

fn greet(name: &str) {
    println!("Greetings, {name}!!");
}
```

> **Note:** Parameter types are mandatory in Rust function signatures, which also adds a dash of clarity.

---

## Return of the Function: Functions with Return Values 🔄

Witness the grand return of functions with values:

```rust
fn main() {
    let result = add(10, 20);
    println!("The sum is {}", result);
}

fn add(a: i32, b: i32) -> i32 {
    a + b
}
```

> **Note:** Indicate the return type after an arrow `->` for a spectacular finale. The last expression in the body, written without a trailing semicolon, is the value that gets returned.

As the symphony of functions continues to play in Rust, I find joy in crafting code with clarity and purpose. Join me on [Github](https://github.com/Aniket200-ind/100dayscoding) for more updates! 💻🌐✨ #RustLang #Programming #LearningToCode #CodeNewbie #TechJourney #Day7
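One detail worth spelling out from the return-value example above: a function body returns its final expression, while the `return` keyword exits early. A small sketch of my own (the function name is illustrative, not from the post):

```rust
// Returns the absolute difference of two integers.
// The early `return` handles one branch; the final expression
// (no trailing semicolon) is the value for the other branch.
fn abs_diff(a: i32, b: i32) -> i32 {
    if a > b {
        return a - b;
    }
    b - a
}

fn main() {
    println!("{}", abs_diff(3, 10)); // prints 7
    println!("{}", abs_diff(10, 3)); // prints 7
}
```

Both styles compile to the same thing; idiomatic Rust tends to prefer the expression form for the last value.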
aniket_botre
1,719,929
Decoding Git and GitHub: An Introductory Handbook on Version Control and Collaborative Coding
In the fast-paced world of software development, effective collaboration and version control are...
0
2024-01-07T14:58:16
https://seracoder.com/decoding-git-and-github-an-introductory-handbook-on-version-control-and-collaborative-coding/
git, github, versioncontrol
In the fast-paced world of software development, effective collaboration and version control are pivotal to success. As coding projects evolve and multiple contributors work simultaneously, chaos can ensue without a robust system in place. This is where version control systems like Git and collaborative platforms like GitHub play a transformative role.

## Table of Contents

- 1. Introduction
  - 1.1 Why Version Control is Essential
  - 1.2 Overview of Git and GitHub
  - 1.3 Importance of Version Control in Collaborative Coding
- 2. Getting Started with Git
  - 2.1 Installing Git
  - 2.2 Configuring Git for the First Time
  - 2.3 Basic Git Commands and Workflow
- 3. Understanding Version Control Concepts
  - 3.1 Commits, Branches, and Merging
  - 3.2 Repository Structure
  - 3.3 Forking and Cloning
- 4. GitHub Essentials
  - 4.1 Creating a GitHub Account
  - 4.2 Creating and Managing Repositories
  - 4.3 Pull Requests and Code Reviews
- 5. Collaborative Workflows
  - 5.1 Branching Strategies for Teams
  - 5.2 Resolving Conflicts and Handling Merge Issues
  - 5.3 Integrating Git into Daily Development Workflow
- 6. Best Practices and Tips
  - 6.1 Writing Meaningful Commit Messages
  - 6.2 Using .gitignore to Manage Unwanted Files
  - 6.3 Security Considerations in Version Control
- 7. Conclusion
  - 7.1 Recap of Key Concepts
  - 7.2 Next Steps in Git Mastery

## 1. Introduction

### 1.1 Why Version Control is Essential

[Version control](https://en.wikipedia.org/wiki/Version_control) is the backbone of modern software development, offering a systematic approach to tracking changes, managing collaborative efforts, and safeguarding project integrity. By demystifying the complexities of version control, developers gain the power to seamlessly collaborate, experiment, and innovate without the fear of losing crucial work.

### 1.2 Overview of Git and GitHub

[Git](https://git-scm.com/), a distributed version control system, is the industry standard for tracking changes in source code during software development. Paired with [GitHub](https://seracoder.com/github/?swcfpc=1), a web-based platform that enhances collaboration and facilitates code sharing, Git becomes a dynamic force, empowering teams to work cohesively, regardless of geographical distances.

### 1.3 Importance of Version Control in Collaborative Coding

This article serves as a beginner's guide to Git and GitHub, aiming to unravel the intricacies of version control for those venturing into the coding realm. By understanding the fundamental concepts and workflows of Git and GitHub, developers, both novice and experienced, can foster a collaborative environment that enhances productivity, promotes accountability, and ensures the seamless progression of software projects. Let's embark on a journey to demystify Git and GitHub, unlocking the potential for efficient version control and code collaboration.

## 2. Getting Started with Git

To kickstart your journey with Git, we'll guide you through the essential steps of installation, configuration, and basic commands. Let's ensure you have Git set up correctly for a seamless version control experience.

### 2.1 Installing Git

Begin by installing Git on your local machine. The process varies depending on your operating system:

- **Windows:** Download the Git installer from the official [Git](https://git-scm.com/download/win) website. Run the installer and follow the on-screen instructions. Ensure you select the option to "Use Git from the Windows Command Prompt" during installation.
- **macOS:** Git is often pre-installed on macOS. You can check by opening the Terminal and typing `git --version`. If Git is not installed, you can install it using Homebrew by running `brew install git`.
- **Linux (Ubuntu):** Use the package manager to install Git. Run `sudo apt-get update` followed by `sudo apt-get install git`.

### 2.2 Configuring Git for the First Time

After installing Git, configure it with your identity:

- Open a terminal or command prompt.
- Set your name using the command:

```Bash
git config --global user.name "Your Name"
```

- Set your email address using:

```Bash
git config --global user.email "your.email@example.com"
```

This information will be associated with your commits, providing clarity about the authorship.

### 2.3 Basic Git Commands and Workflow

Now, let's explore some fundamental Git commands:

- **Initialize a new repository:**

```Bash
git init
```

- **Add changes to the staging area:**

```Bash
git add <filename>
```

- **Commit changes:**

```Bash
git commit -m "Your commit message"
```

- **Check the status of your repository:**

```Bash
git status
```

- **View commit history:**

```Bash
git log
```

With Git successfully installed and configured, and a grasp of basic commands, you're ready to commence your version control journey.

## 3. Understanding Version Control Concepts

Now that you've dipped your toes into the Git waters, it's time to deepen your understanding of key version control concepts. This section will explore commits, branches, and merging, providing a comprehensive view of how Git tracks changes and manages project history.

### 3.1 Commits, Branches, and Merging

**Commits:** At the core of Git is the concept of commits, which represent a snapshot of your project at a specific point in time. This section will delve into creating meaningful commits, understanding commit messages, and using `git log` to navigate through your project's history.

Example command:

```Bash
git commit -m "Add initial implementation of feature X"
```

**Branches:** Git's branching system allows developers to work on isolated features or bug fixes without affecting the main codebase. Learn how to create, switch between, and delete branches to streamline your development process.

Example commands:

```Bash
git branch feature-X     # Create a new branch named 'feature-X'
git checkout feature-X   # Switch to the 'feature-X' branch
git branch -d feature-X  # Delete the 'feature-X' branch
```

**Merging:** As features are developed in separate branches, merging becomes crucial. Explore the `git merge` command to combine changes from one branch into another, ensuring a smooth integration of new features.

Example command:

```Bash
git merge feature-X  # Merge changes from 'feature-X' into the current branch
```

### 3.2 Repository Structure

Understand how Git organizes project files and directories. Explore the working directory, the staging area, and the repository itself. Grasp how to use `git status` to check the status of your files and the `git add` command to stage changes for the next commit.

Example commands:

```Bash
git status        # Check the status of your working directory
git add filename  # Stage changes in the file for the next commit
```

### 3.3 Forking and Cloning

Discover the concepts of forking and cloning, essential for collaborative development on platforms like GitHub. Forking creates a personal copy of a repository, while cloning brings that copy to your local machine.

Example command:

```Bash
git clone <repository_url>  # Clone a repository to your local machine
```

By mastering these version control concepts and associated commands, you'll gain a solid foundation for navigating the complexities of Git and GitHub. These skills are vital for efficient collaboration and maintaining a well-organized, versioned codebase. Let's continue our journey to demystify Git's core functionalities.

## 4. GitHub Essentials

As you become comfortable with Git's local version control capabilities, it's time to elevate your collaboration game by exploring [GitHub](https://github.com/), a powerful platform that enhances code sharing, review, and project management. In this section, we'll cover the essentials of GitHub, from creating an account to managing repositories and collaborating with others.

### 4.1 Creating a GitHub Account

If you haven't already, [creating a GitHub account](https://github.com/signup) is your first step towards unlocking the full potential of collaborative coding. This section will guide you through the account creation process, helping you set up your profile and configure basic settings.

### 4.2 Creating and Managing Repositories

Explore the process of creating a new repository on GitHub to host your projects. Learn about repository settings, including **README**, **licenses**, and **.gitignore** files. This section also covers creating branches on GitHub and managing repository access.

Example commands:

```Bash
git remote add origin <repository_url>  # Connect your local repository to the GitHub repository
git push -u origin master               # Push your changes to the GitHub repository
```

### 4.3 Pull Requests and Code Reviews

The heart of collaborative coding on GitHub lies in **pull requests** (PRs) and code reviews. Understand how to propose changes, submit a pull request, and initiate and participate in code reviews. Dive into the GitHub interface to explore discussions, inline comments, and the overall review process.

Example command:

```Bash
git pull origin master  # Pull changes from the main repository to your local branch
```

By mastering these GitHub essentials, you'll be equipped to seamlessly collaborate with others, propose changes, and maintain a streamlined development workflow. GitHub's features go beyond version control, providing a comprehensive platform for hosting, reviewing, and enhancing your code. Let's continue our journey into collaborative coding by exploring these key GitHub functionalities.

## 5. Collaborative Workflows

Now that you've grasped the basics of **Git and GitHub** individually, let's combine these tools to explore effective collaborative workflows. Whether you're working on a team project, contributing to open source, or collaborating with fellow developers, understanding collaborative workflows is essential for a smooth and productive development process.

### 5.1 Branching Strategies for Teams

Discover best practices for branching strategies that enable seamless collaboration within a team. Learn about long-lived branches, feature branches, and release branches. Understand how branching strategies can enhance code stability, facilitate parallel development, and simplify the integration of new features.

Example commands:

```Bash
git checkout -b feature-X  # Create and switch to a new feature branch
git merge feature-X        # Merge changes from a feature branch into the main branch
```

### 5.2 Resolving Conflicts and Handling Merge Issues

Conflict resolution is a common challenge in collaborative development. Explore strategies for handling conflicts that arise when merging branches. Understand how to use visual tools, such as `git diff` and merge tools, to resolve conflicts and ensure a smooth integration of changes.

Example commands:

```Bash
git diff       # View the differences between branches or commits
git mergetool  # Launch a visual merge tool to resolve conflicts
```

### 5.3 Integrating Git into Daily Development Workflow

Discover how Git can seamlessly integrate into your daily development routine. Explore the use of branches for feature development and bug fixing, commit squashing for cleaner history, and rebasing to maintain a linear and organized commit history.

Example commands:

```Bash
git rebase -i HEAD~3  # Interactively rebase the last 3 commits
git push --force      # Force-push changes after rebasing
```

By mastering collaborative workflows, you'll not only enhance the efficiency of your team but also contribute to a more organized and scalable codebase. These strategies and commands will empower you to navigate the complexities of collaborative development with confidence.
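Putting the pieces together, here is a minimal end-to-end session that exercises the basic commands from the earlier sections: init, config, commit, branch, and merge. It assumes `git` is installed, runs entirely in a throwaway directory, and all file and branch names are illustrative:

```shell
# Work in a scratch directory so nothing touches your real projects
tmp=$(mktemp -d)
cd "$tmp"

# Initialize a repository and identify yourself (local to this repo only)
git init -q
git config user.name "Demo User"
git config user.email "demo@example.com"

# Stage and commit a first file
echo "Hello, Git" > README.md
git add README.md
git commit -q -m "Add README"

# Branch, change, and merge back
git checkout -q -b feature-X
echo "A new feature" >> README.md
git commit -q -am "Describe feature X"
git checkout -q -        # '-' switches back to the previous branch
git merge -q feature-X   # fast-forward merge
git log --oneline        # shows both commits
```

Because `feature-X` is strictly ahead of the starting branch, the merge here is a fast-forward; with divergent histories you would get a merge commit (or a conflict to resolve) instead.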
Let's dive deeper into the collaborative aspect of version control and code collaboration.

## 6. Best Practices and Tips

As you progress in your Git and GitHub journey, adopting best practices becomes crucial for maintaining a clean, efficient, and collaborative development environment. In this section, we'll explore essential best practices and tips that will help you make the most out of version control and ensure a streamlined coding experience.

### 6.1 Writing Meaningful Commit Messages

Effective communication through commit messages is an art that enhances collaboration and project understanding. Learn how to craft concise, informative, and meaningful commit messages that provide context and clarity to your collaborators and future self.

Example:

```Bash
git commit -m "Fix issue #123: Resolve bug in login validation"
```

### 6.2 Using .gitignore to Manage Unwanted Files

Keep your repositories clean by utilizing the .gitignore file. Understand how to specify files or directories that Git should ignore, preventing them from being tracked and included in commits. This is particularly useful for excluding build artifacts, temporary files, or sensitive information.

Example .gitignore file:

```
# Ignore compiled binaries
*.exe
*.o

# Ignore log and temporary files
*.log
tmp/
```

### 6.3 Security Considerations in Version Control

Explore security best practices to safeguard your repositories and sensitive information. Learn about using authentication tokens, setting up secure connections, and avoiding the unintentional inclusion of sensitive data in commits.

Example:

```Bash
# Use HTTPS with an authentication token for secure remote operations
git remote set-url origin https://<username>:<token>@github.com/<username>/<repository>.git
```

Adopting these best practices and tips will not only improve your individual workflow but also contribute to a more efficient and secure collaborative coding environment. As you continue to refine your skills, integrating these practices will become second nature, fostering a positive and productive coding experience. Let's conclude our exploration of Git and GitHub with a recap of key concepts.

## 7. Conclusion

Congratulations on completing this beginner's guide to Git and GitHub! Throughout this journey, you've demystified the fundamental concepts of version control, learned how to navigate Git commands, and explored the collaborative power of GitHub. Let's recap the key takeaways and encourage your continued exploration of these invaluable tools.

### 7.1 Recap of Key Concepts

- **Version Control Significance:** Understand the importance of version control in tracking changes, managing collaborative efforts, and safeguarding project integrity.
- **Git and GitHub Overview:** Familiarize yourself with Git, a distributed version control system, and GitHub, a web-based platform that enhances collaboration and code sharing.
- **Essentials of Git Usage:** Learn the basics of installing Git, configuring it for the first time, and using essential commands for version control.
- **Version Control Concepts:** Deepen your understanding of commits, branches, merging, and repository structure, along with practical examples.
- **GitHub Essentials:** Explore creating a GitHub account, managing repositories, and engaging in pull requests and code reviews to enhance collaboration.
- **Collaborative Workflows:** Grasp effective branching strategies, conflict resolution techniques, and integrating Git into your daily development workflow for collaborative success.
- **Best Practices and Tips:** Adopt practices like meaningful commit messages, using `.gitignore`, and considering security measures to maintain a clean, secure, and efficient development environment.

### 7.2 Next Steps in Git Mastery

As you embark on your Git and GitHub journey, consider advancing your skills by exploring more advanced topics such as Git hooks, continuous integration, and mastering Git workflows for larger projects. Engage with the vibrant developer community on GitHub, participate in open-source projects, and continue refining your collaboration skills.

Remember, proficiency in Git and GitHub is a continuous learning process. Embrace challenges, seek out additional resources, and apply your knowledge to real-world projects. The skills you've acquired here will undoubtedly serve as a solid foundation for your coding endeavors.

Thank you for joining us on this exploration of version control and collaborative coding. Happy coding, and may your Git repositories always remain conflict-free!
seracoder
1,720,093
20 TIPS AND TRICKS JAVASCRIPT
JavaScript, the programming language that makes websites interactive, has some neat tricks that...
25,974
2024-01-07T19:45:26
https://dev.to/mhdhanif/20-tips-and-tricks-javascript-2d8l
JavaScript, the programming language that makes websites interactive, has some neat tricks that can make your programming journey smoother and more enjoyable. In this post, we'll explore 20 JavaScript tips and tricks, each explained with an easy-to-understand example. Let's get started and level up your JavaScript skills!

## 1. The Magic of Destructuring: Extract Values with Ease

Destructuring lets you unpack values from arrays or objects easily. For example:

```javascript
const person = { name: 'Alice', age: 30 };
const { name, age } = person;

console.log(name); // Output: Alice
console.log(age);  // Output: 30
```

## 2. Spread the Love: Clone Arrays and Merge Objects

The spread operator (`...`) lets you make copies of arrays and merge objects effortlessly:

```javascript
const originalArray = [1, 2, 3];
const clonedArray = [...originalArray];
console.log(clonedArray); // Output: [1, 2, 3]
```

Merging objects:

```javascript
const obj1 = { a: 1, b: 2 };
const obj2 = { b: 3, c: 4 };
const merged = { ...obj1, ...obj2 };
console.log(merged); // Output: { a: 1, b: 3, c: 4 }
```

## 3. The Power of `map()`: Transform with Ease

The `map()` method is your secret weapon for transforming data:

```javascript
const numbers = [1, 2, 3];
const squared = numbers.map(num => num * num);
console.log(squared); // Output: [1, 4, 9]
```

## 4. Short-circuiting with `&&` and `||`: Elegant Conditionals

Use `&&` and `||` to write clean, concise conditionals. Here `user.name` is undefined, so the fallback wins:

```javascript
const user = {};
const name = user.name || 'Guest';
console.log(name); // Output: Guest
```

## 5. Chaining `setTimeout()`: Scheduling Delays

Chaining `setTimeout()` creates a sequence of delayed actions:

```javascript
function delayedLog(message, time) {
  setTimeout(() => {
    console.log(message);
  }, time);
}

delayedLog('Hello', 1000); // Output (after 1 second): Hello
```

## 6. Arrow Functions: Concise and Powerful

Arrow functions (`() => {}`) are not only concise, they also keep the surrounding `this` value:

```javascript
const greet = name => `Hello, ${name}!`;
console.log(greet('Alice')); // Output: Hello, Alice!
```

## 7. Mastering `Promise.all()`: Handling Multiple Promises

Combine several promises and handle them collectively using `Promise.all()`:

```javascript
const promise1 = fetch('url1');
const promise2 = fetch('url2');

Promise.all([promise1, promise2])
  .then(responses => console.log(responses))
  .catch(error => console.error(error));
```

## 8. Dynamic Property Names: Versatile Object Keys

You can use variables as object property names with square brackets:

```javascript
const key = 'name';
const person = { [key]: 'Alice' };
console.log(person.name); // Output: Alice
```

## 9. The Magic of Template Literals: String Formatting

Template literals (`${}`) let you embed expressions inside strings:

```javascript
const name = 'Alice';
const greeting = `Hello, ${name}!`;
console.log(greeting); // Output: Hello, Alice!
```

## 10. NaN Checks: A Safer Alternative

Use `Number.isNaN()` to check accurately whether a value is `NaN`. Unlike the global `isNaN()`, it does not coerce its argument, so a non-numeric string is not reported as `NaN`:

```javascript
const notANumber = 'Not a number';
console.log(Number.isNaN(notANumber)); // Output: false
console.log(Number.isNaN(NaN));        // Output: true
```

## 11. Optional Chaining (`?.`): Taming Undefined Values

Avoid errors with optional chaining when dealing with nested properties:

```javascript
const user = { info: { name: 'Alice' } };
console.log(user.info?.age); // Output: undefined
```

## 12. The Regex Revival: Mastering Patterns

Regular expressions (`RegExp`) are a powerful tool for pattern matching:

```javascript
const text = 'Hello, world!';
const pola = /Hello/g;
console.log(text.match(pola)); // Output: ['Hello']
```

## 13. `JSON.parse()` Reviver: Transforming Parsed Data

The `reviver` parameter of `JSON.parse()` lets you transform the parsed JSON:

```javascript
const data = '{"age":"30"}';
const diuraikan = JSON.parse(data, (key, value) => {
  if (key === 'age') return Number(value);
  return value;
});
console.log(diuraikan.age); // Output: 30
```

## 14. Cool Console Tricks: Debugging Delights

Beyond `console.log()`, try `console.table()` and `console.groupCollapsed()`:

```javascript
const users = [{ name: 'Alice' }, { name: 'Bob' }];
console.table(users);

console.groupCollapsed('Details');
console.log('Name: Alice');
console.log('Age: 30');
console.groupEnd();
```

## 15. Fetch with `async`/`await`: Asynchronous Simplicity

`async`/`await` with `fetch()` simplifies handling asynchronous requests:

```javascript
async function fetchData() {
  try {
    const response = await fetch('url');
    const data = await response.json();
    console.log(data);
  } catch (error) {
    console.error(error);
  }
}

fetchData();
```

## 16. Closures Unveiled: Data Privacy

Closures let you create private variables inside functions:

```javascript
function createCounter() {
  let count = 0;
  return function () {
    count++;
    console.log(count);
  };
}

const counter = createCounter();
counter(); // Output: 1
counter(); // Output: 2
```

## 17. Memoization for Speed: Efficient Recomputation

Memoization stores function results for better performance:

```javascript
function fibonacci(n, memo = {}) {
  if (n in memo) return memo[n];
  if (n <= 2) return 1;
  memo[n] = fibonacci(n - 1, memo) + fibonacci(n - 2, memo);
  return memo[n];
}

console.log(fibonacci(10)); // Output: 55
```

## 18. Meet the Intersection Observer: Effortless Scroll Effects

Use the Intersection Observer API for lazy loading and scroll animations:

```javascript
const observer = new IntersectionObserver(entries => {
  entries.forEach(entry => {
    if (entry.isIntersecting) {
      entry.target.classList.add('fade-in');
      observer.unobserve(entry.target);
    }
  });
});

const elements = document.querySelectorAll('.animate');
elements.forEach(element => observer.observe(element));
```

## 19. ES6 Modules for Clean Code: Organized and Modular

Use ES6 modules for clean, modular code:

```javascript
// math.js
export function add(a, b) {
  return a + b;
}

// app.js
import { add } from './math.js';
console.log(add(2, 3)); // Output: 5
```

## 20. Proxies: Beyond Plain Objects

Proxies let you intercept and customize object operations:

```javascript
const handler = {
  get(target, prop) {
    return `Property "${prop}" does not exist.`;
  }
};

const proxy = new Proxy({}, handler);
console.log(proxy.name); // Output: Property "name" does not exist.
```

With these 20 JavaScript tips and tricks in your toolkit, you're ready to take your programming skills to the next level. Keep exploring, experimenting, and building amazing things with JavaScript!
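Tip 17's memoization idea can be generalized into a reusable wrapper. This helper is my own sketch (not part of the original list), and for simplicity it assumes the wrapped function's arguments serialize cleanly with `JSON.stringify`:

```javascript
// Wrap any function so repeated calls with the same arguments
// reuse the cached result instead of recomputing it.
function memoize(fn) {
  const cache = new Map();
  return function (...args) {
    const key = JSON.stringify(args); // assumption: args are JSON-serializable
    if (!cache.has(key)) {
      cache.set(key, fn.apply(this, args));
    }
    return cache.get(key);
  };
}

// Usage: an intentionally slow function becomes cheap on repeat calls.
const slowSquare = n => {
  for (let i = 0; i < 1e6; i++); // simulated work
  return n * n;
};

const fastSquare = memoize(slowSquare);
console.log(fastSquare(12)); // 144 (computed)
console.log(fastSquare(12)); // 144 (served from the cache)
```

The `Map` keyed on serialized arguments keeps the wrapper generic; for hot paths with a single primitive argument, using the argument itself as the key avoids the `JSON.stringify` cost.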
mhdhanif
1,720,116
A 504 Gateway Time-out error in Laravel
*A 504 Gateway Time-out error in Laravel web app which is running on Kubernetes * A 504 Gateway...
0
2024-01-07T20:18:47
https://dev.to/dgihost/a-504-gateway-time-out-error-in-laravel-1kcl
**A 504 Gateway Time-out error in a Laravel web app running on Kubernetes**

A 504 Gateway Time-out error usually occurs when a server acting as a gateway or proxy does not receive a timely response from an upstream server it needs to access to complete the request. In the context of your Laravel web app running on Kubernetes, there could be several reasons causing this issue:

**Resource allocation:** Ensure that your Kubernetes cluster has enough resources (CPU, memory) allocated to handle the incoming requests. Monitor resource usage metrics in Kubernetes to check if there are any bottlenecks.

**Networking:** Kubernetes networking configuration might be causing delays or timeouts. Check the network policies, service configurations, and DNS settings in your Kubernetes cluster. Sometimes issues with DNS resolution can lead to timeouts.

**Load balancing:** If you're using Kubernetes Services with load balancing, they might be misconfigured. Verify your Service configuration to ensure it distributes traffic evenly among the pods.

**Health checks:** Check that your Kubernetes readiness and liveness probes are configured correctly. If a pod is not ready to accept traffic, it won't receive requests, potentially leading to timeouts.

**Application code:** It's also possible that your Laravel application has code or database queries causing delays. Ensure your application code is optimized and your database queries are efficient.

**External dependencies:** If your application relies on external services (database, APIs, etc.), connectivity issues or slow responses from those services could cause delays.

**Here are some steps you can take to troubleshoot:**

Check the logs of your Laravel application pods in Kubernetes for any errors or warnings that might provide insight into the cause of the timeouts. Use monitoring tools within Kubernetes to track resource usage, pod health, and network activity.
Review your Kubernetes configurations, including Service configurations, Ingress, Network Policies, etc. Use tools like `kubectl exec` to access the pods and run diagnostic commands (such as `curl` or `traceroute`) to test connectivity and response times from within the cluster.

You might need several diagnostic tools to pinpoint the exact cause of the issue. It's essential to analyze logs, metrics, and configurations thoroughly to identify and resolve the problem.

Thanks for reading, DGI Host.com
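Following up on the health-check point above: probe and timeout settings are a common place where 504s originate, because a pod that fails its readiness probe is removed from the Service's endpoints. Below is a sketch of the relevant Deployment fragment; the container name, image, health-check path, port, and timing values are all illustrative assumptions, not taken from this article:

```yaml
# Hypothetical Laravel pod spec fragment: probe timeouts generous enough
# that a slow-but-healthy pod is not marked unready (a common 504 source).
containers:
  - name: laravel-app
    image: example/laravel-app:latest   # illustrative image name
    readinessProbe:
      httpGet:
        path: /healthz                  # assumed health endpoint
        port: 8080
      initialDelaySeconds: 10
      periodSeconds: 5
      timeoutSeconds: 3
      failureThreshold: 3
    livenessProbe:
      httpGet:
        path: /healthz
        port: 8080
      initialDelaySeconds: 30
      periodSeconds: 10
      timeoutSeconds: 3
```

If the probes are correct but requests still time out, compare the gateway's upstream timeout (e.g. in your Ingress controller) against the slowest legitimate Laravel response.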
dgihost
1,720,346
Simplified: Create Custom Beating Heart Emojis Online with our Emoji Maker
Express your emotions with beating heart emojis using Simplified online emoji maker. Create emojis...
0
2024-01-08T05:01:47
https://dev.to/beatingheart/simplified-create-custom-beating-heart-emojis-online-with-our-emoji-maker-5019
**[Express your emotions with beating heart emojis using Simplified](https://simplified.com/ai-emoji-maker/beating-heart)'s** online emoji maker. Create emojis that represent a beating heart and make your conversations more expressive. Our easy-to-use tool allows you to design and generate beating heart emojis for online messaging.
beatingheart